International Journal of Informatics and Communication Technology (IJ-ICT)
Vol. 6, No. 3, December 2017, pp. 155~165
ISSN: 2252-8776, DOI: 10.11591/ijict.v6i3.pp155-165
Journal homepage: http://iaesjournal.com/online/index.php/IJICT
Markovian Segmentation of Brain Tumor MRI Images
Meryem Ameur*, Cherki Daoui, and Najlae Idrissi
Laboratory of Information Processing and Decision Support, Faculty of Sciences and Technics, Sultan Moulay Slimane University, Beni Mellal, Morocco
Article Info
Article history: Received Aug 4, 2017; Revised Oct 25, 2017; Accepted Nov 7, 2017

ABSTRACT
Image segmentation is a fundamental operation in image processing, which consists of dividing an image into homogeneous regions in order to help a human analyse the image, diagnose a disease and make a decision. In this work, we present a comparative study between two iterative estimation algorithms, EM (Expectation-Maximization) and ICE (Iterative Conditional Estimation), according to their complexity, the PSNR index, the SSIM index, the error rate and the convergence. These algorithms are used to segment brain tumor Magnetic Resonance Imaging (MRI) images under the Hidden Markov Chain with Independent Noise (HMC-IN) model. We apply the final Bayesian decision criterion MPM (Marginal Posterior Mode) to estimate a final configuration of the resulting image X. The experimental results show that ICE and EM give the same results in terms of the PSNR index, the SSIM index and the error rate, but ICE converges to a solution faster than EM. On the other hand, ICE is more complex than EM.
Keywords: Brain Tumor, MRI Images, EM, HMC-IN, ICE, MPM
Copyright © 2017 Institute of Advanced Engineering and Science. All rights reserved.
Corresponding Author:
Meryem Ameur,
Laboratory of Information Processing and Decision Support,
Faculty of Sciences and Technics, Sultan Moulay Slimane University,
BP 325, Beni Mellal, Morocco.
Email: m.ameur@usms.ma
1. INTRODUCTION
The Hidden Markov model [2] is widely explored in many fields such as finance [4], imagery [14],[12], the medical field [16],[26] and chemistry [29]; it has an important place in image processing [28], precisely in image segmentation. Markovian segmentation is an unsupervised statistical method of segmentation. It can be used to estimate an image result X = {x1, ..., xN} from the observed image Y = {y1, ..., yN} ∈ R, where N is the number of pixels composing the image. There exist three basic Markovian models of segmentation [30]: fields [12],[24], chains [25],[26] and trees [23],[27]. Each model has its own principle to model the image Y to be segmented. The advantage of the field is to take the contextual information in the image into account. To model an image with this model, we divide the image into cliques, each clique containing at least four neighboring pixels; this modelling lowers the computing speed and increases the execution time compared to the other Markovian models [14].
To transform the image into a Markov chain, we can use either the Hilbert-Peano transform [18],[19], a zigzag scan, a line-by-line scan or a column-by-column scan. These scans transform the image while taking the neighborhood between two pixels in the image into account. Each pixel yn in the Markov chain depends only on its neighbor yn+1 in the image, so it respects the Markov property. This model is much faster than the tree and the field.
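As a minimal illustrative sketch (not part of the original method description), the line-by-line scan and its inverse can be written in Python/NumPy as follows; the Hilbert-Peano transform [18],[19] would only replace the flattening step by a space-filling-curve ordering.

    import numpy as np

    def image_to_chain(image):
        # Line-by-line scan: the 2D image becomes a 1D sequence y_1, ..., y_N
        # in which each pixel is followed by its right-hand neighbour.
        return image.reshape(-1)

    def chain_to_image(chain, shape):
        # Inverse scan: restore the 2D layout after segmentation.
        return chain.reshape(shape)

    img = np.arange(12).reshape(3, 4)
    y = image_to_chain(img)
    assert np.array_equal(chain_to_image(y, img.shape), img)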
The Markovian tree is a general case of the chain; it consists of transforming the image into a bitree [20] or a quadtree [21],[22]. It is organized hierarchically in T hierarchical levels S such that S1 < S2 < ... < ST; each child pixel ys+ in the tree depends only on its parent pixel ys. The tree is a competitor to the field because it is characterized by its speed to estimate the parameters and it adapts well to multiresolution image segmentation; however, the spatial relation of the neighborhood is not respected by the tree, contrary to the field. We can consider the tree as a directed graph and the field as a non-directed graph. These models are called classical hidden Markovian models.
Other Markovian models have appeared recently in the literature [30]: pairwise Markov models [36] and triplet Markov models. The pairwise Markov model is a generalization of the classical model. The triplet Markov model [37] is also a generalization of the pairwise model; it is composed of three processes (observed process, auxiliary process, hidden process) and it handles non-stationary data. Our study focuses on a classical hidden Markov chain model to segment brain tumor MRI images.
The Hidden Markov model models the image Y according to the selected model (field, chain, tree). It uses the Bayesian theorem to calculate the a posteriori probability P(X|Y) in order to find a final configuration of the segmented image X ∈ Ω = {ω1, ..., ωK}, where K is the number of membership classes, initialized by the user.

P(X|Y) = P(Y|X) P(X) / P(Y)    (1)

where:
1. P(X|Y): is the probability of the a posteriori law of X knowing Y.
2. P(X): is the probability of the a priori law.
3. P(Y|X): is the probability of the attached data law.
4. P(Y): is a normalization constant, P(Y) = 1.
To estimate these probabilities, one should apply iterative parameter estimators such as EM [13], ICE [6] or MCEM (Monte Carlo Expectation-Maximization) [13].
In this work, we have limited our study to two iterative estimators, EM and ICE. We use these algorithms to estimate the parameters of the Hidden Markov Chain with Independent Noise (HMC-IN) model in order to segment brain tumor MRI images [11], and we have carried out a comparative study between ICE and EM. We use the MPM algorithm [5] to estimate a final configuration of X. We also extract the brain tumor using a thresholding technique [11].
The structure of this paper is organised as follows: Section 2 presents the Hidden Markov Chain with Independent Noise model. Section 3 describes the EM algorithm, the ICE algorithm, the complexity of these estimators and the MPM algorithm. Section 4 illustrates the experimental results. Finally, we give a conclusion and some open questions.
2. HIDDEN MARKOV CHAIN WITH INDEPENDENT NOISE
Now, we present the Hidden Markov Chain with Independent Noise (HMC-IN). This model is a classical Markovian model; it contains two processes: the hidden Markovian process X and the observed process Y. It is called Hidden Markov Chain with Independent Noise (HMC-IN) because it ignores the noise correlations contained in the observed image Y [17]. Let the process Z = (X, Y), where X = (Xn)N ∈ Ω and Y = (Yn)N ∈ R. The process Z = (X, Y) is an HMC-IN if and only if:
1. The process X is a Markov chain; it is homogeneous and stationary, and its law is as follows:

P(X) = P(X1 = x1) ∏_{n=1}^{N-1} P(Xn+1 = xn+1 | Xn = xn)    (2)
2. The observations Y are conditionally independent given X.
3. Each observation yn, ∀n ∈ N, depends only on its hidden class xn:

P(Yn = yn | X) = P(Yn = yn | Xn = xn)    (3)
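As a minimal sketch of these three conditions (assuming Gaussian class-conditional noise, as in the rest of the paper), the following Python/NumPy fragment simulates an HMC-IN with initial law PI, transition matrix A and class-dependent Gaussian noise:

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_hmc_in(N, PI, A, mu, sigma):
        # Simulate Z = (X, Y): X is a Markov chain (condition 1, eq. 2),
        # and each y_n depends only on x_n through a Gaussian (conditions 2-3, eq. 3).
        K = len(PI)
        x = np.empty(N, dtype=int)
        x[0] = rng.choice(K, p=PI)
        for n in range(1, N):
            x[n] = rng.choice(K, p=A[x[n - 1]])
        y = rng.normal(mu[x], sigma[x])
        return x, y

    PI = np.array([0.5, 0.5])
    A = np.array([[0.9, 0.1], [0.2, 0.8]])
    x, y = simulate_hmc_in(1000, PI, A, mu=np.array([0.0, 3.0]), sigma=np.array([1.0, 1.0]))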
Each process of a hidden Markov chain has its own parameters: the hidden Markovian process X has its initial law PI and its transition matrix A, while the observed process Y also has its parameters, which depend on the probability law followed by this process. To estimate these parameters, we apply three phases:
1. Initialization phase.
2. Iterative estimation phase.
3. Final decision phase.
a. In the initialization phase, we initialize the parameters θ0 = (PI0, A0, µ0, (σ0)²) of each law. It is an important phase. For the a priori law parameters θ0 = (PI0, A0), we have:
1. The initial law PI0(i) = p(x1 = i), ∀i ∈ Ω, of size K.
2. The transition matrix A0(i, j) = p(xn+1 = ωj | xn = ωi) between the classes i and j, ∀i, j ∈ Ω, of size K × K.
For the attached data law parameters θ0_(y|x), if we assume that the observations follow the Gaussian law p(yn | xn = ωt), we initialize θ0_(y|x) = (µ0, (σ0)²); for each class ∀i ∈ Ω we have:
1. The mean µ0 of size K.
2. The variance (σ0)² of size K.
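As a hedged sketch of this initialization (the configuration X0 is obtained by K-means as described in Section 4; deriving PI0, A0, µ0 and (σ0)² from X0 by empirical counts, as below, is an assumption about the exact procedure):

    import numpy as np
    from sklearn.cluster import KMeans

    def initialize_parameters(y, K):
        # Rough labelling X0 of the chain y by K-means on the grey levels.
        x0 = KMeans(n_clusters=K, n_init=10, random_state=0).fit_predict(y.reshape(-1, 1))
        PI0 = np.bincount(x0, minlength=K) / len(x0)       # empirical initial law (assumption)
        A0 = np.zeros((K, K))
        for a, b in zip(x0[:-1], x0[1:]):                  # empirical transition counts
            A0[a, b] += 1
        A0 /= A0.sum(axis=1, keepdims=True)
        mu0 = np.array([y[x0 == i].mean() for i in range(K)])
        var0 = np.array([y[x0 == i].var() for i in range(K)])
        return x0, PI0, A0, mu0, var0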
b. In the iterative estimation phase, we calculate the parameters θq = (θq_x, θq_(y|x)) of each law at each iteration q ∈ Q, using estimator algorithms such as EM [31], ICE [32] or SEM (Stochastic Expectation-Maximization) [13],[33].
c. In the final decision phase, we estimate a final configuration of the hidden process X (the resulting image), using the MPM or MAP [38] Bayesian criteria.
The HMC-IN model estimates K² + 3K parameters in each iteration q.
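For example, with K = 3 classes this amounts to 3² + 3·3 = 18 parameters per iteration: 3 values for the initial law PI, 9 for the transition matrix A, 3 means and 3 variances.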
3. EM AND ICE ALGORITHMS
In this section, we present the EM and ICE estimators, their complexity and the MPM algorithm. They are based on the Baum-Welch algorithm [1]. EM uses a deterministic strategy to calculate the parameters; it is based on maximizing the likelihood P(x, y | θ) and it has many difficulties to converge [3]. ICE is an iterative algorithm based on the principle of SIP [3] and on the Monte Carlo method [7],[10],[34]. It uses a hybrid strategy (deterministic + stochastic) to estimate the parameters.
3.1. EM Algorithm
EM proceeds in two steps, Expectation (E) and Maximization (M). For each iteration q ∈ Q:
1. Step (E): we calculate the a posteriori probabilities (see the Baum-Welch algorithm, Section 3.3).
2. Step (M): we calculate the parameters of each law of HMC-IN:
– Concerning the a priori law parameters, we calculate:

(4)

(5)
– Concerning the attached data law parameters, we calculate:

(6)

(7)
– After calculating the attached data parameters, we calculate the Gaussian density f [2], ∀i ∈ Ω, ∀n ∈ N, in each iteration q ∈ Q:

(8)
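As a minimal sketch, assuming the updates (4)-(7) take the usual Baum-Welch form for a Gaussian HMC-IN (an assumption about their exact expressions), the M-step can be written with the marginal posteriors ξ and joint posteriors γ defined in Section 3.3:

    import numpy as np

    def em_m_step(y, xi, gamma):
        # xi:    (N, K)      xi_n(i)      = P(X_n = i | Y)                  (eq. 17)
        # gamma: (N-1, K, K) gamma_n(i,j) = P(X_n = i, X_{n+1} = j | Y)     (eq. 18)
        PI = xi[0]                                              # assumed form of eq. (4)
        A = gamma.sum(axis=0) / xi[:-1].sum(axis=0)[:, None]    # assumed form of eq. (5)
        w = xi.sum(axis=0)
        mu = (xi * y[:, None]).sum(axis=0) / w                  # assumed form of eq. (6)
        var = (xi * (y[:, None] - mu) ** 2).sum(axis=0) / w     # assumed form of eq. (7)
        return PI, A, mu, var

    def gaussian_density(y, mu, var):
        # Class-conditional Gaussian densities f_i(y_n), cf. eq. (8).
        return np.exp(-(y[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)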
3.2. ICE Algorithm
This estimator also proceeds in two steps. For each iteration q ∈ Q:
1. We calculate the a posteriori probabilities, and we simulate a sample of Xq by one random simulation using the parameters of iteration q [9].
2. We calculate the a priori law and attached data law parameters:

(9)

(10)

(11)

(12)
We also calculate a density fq using equation (8) of Section 3.1.
EM and ICE use a deterministic strategy to calculate the a priori law parameters. To estimate the attached data parameters, EM also uses a deterministic strategy, whereas ICE uses a stochastic strategy.
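As a hedged sketch of this stochastic part (backward sampling from the forward probabilities is one standard way of simulating Xq from P(X|Y); the exact simulation scheme is an assumption), the ICE-specific step could look like:

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_posterior_x(alpha, A):
        # Draw one realization of X from P(X | Y): forward filtering, backward sampling.
        # alpha: (N, K) forward probabilities of Section 3.3, scaled per index n.
        N, K = alpha.shape
        x = np.empty(N, dtype=int)
        x[-1] = rng.choice(K, p=alpha[-1] / alpha[-1].sum())
        for n in range(N - 2, -1, -1):
            w = alpha[n] * A[:, x[n + 1]]
            x[n] = rng.choice(K, p=w / w.sum())
        return x

    def ice_data_step(y, x_sim, K):
        # Stochastic re-estimation of the attached data parameters on the simulated X^q.
        mu = np.array([y[x_sim == i].mean() for i in range(K)])
        var = np.array([y[x_sim == i].var() for i in range(K)])
        return mu, var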
3.3. Baum-Welch Algorithm
The calculation of the parameters by EM or ICE is based on the Baum-Welch algorithm. This algorithm [1] proceeds by calculating:
1. The Forward probabilities α.
2. The Backward probabilities β.
3. The marginal a posteriori probability ξ.
4. The joint a posteriori probability γ.
In the Forward-Backward algorithm, we calculate the Forward and the Backward probabilities. The Forward algorithm αn(i) = p(y1, ..., yn, xn) proceeds in two steps:
1. Initialization (n = 1):

(13)

2. Induction (n > 1):

(14)
The Backward algorithm βn(i) = p(yn+1, ..., yN | xn) also proceeds in two steps, in the opposite direction, starting with n = N:
1. Initialization (n = N):

(15)

2. Induction (n < N):

(16)
We also calculate two probabilities for two laws, the marginal a posteriori law ξn(i) and the joint a posteriori law γn(i, j), where:

(17)

and

(18)
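As a minimal implementation sketch of this Forward-Backward pass (scaled recursions, assuming the Gaussian densities f of eq. (8); the per-step scaling is an implementation choice, not part of the original formulas):

    import numpy as np

    def forward_backward(PI, A, f):
        # f[n, i] = f_i(y_n). Returns alpha (13)-(14), beta (15)-(16),
        # the marginal posteriors xi (17) and the joint posteriors gamma (18).
        N, K = f.shape
        alpha = np.zeros((N, K))
        beta = np.ones((N, K))
        alpha[0] = PI * f[0]
        alpha[0] /= alpha[0].sum()
        for n in range(1, N):
            alpha[n] = (alpha[n - 1] @ A) * f[n]
            alpha[n] /= alpha[n].sum()
        for n in range(N - 2, -1, -1):
            beta[n] = A @ (f[n + 1] * beta[n + 1])
            beta[n] /= beta[n].sum()
        xi = alpha * beta
        xi /= xi.sum(axis=1, keepdims=True)
        gamma = alpha[:-1, :, None] * A[None, :, :] * (f[1:] * beta[1:])[:, None, :]
        gamma /= gamma.sum(axis=(1, 2), keepdims=True)
        return alpha, beta, xi, gamma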
3.4. Complexity of ICE and EM Algorithms
The aim of this section is to compare the complexity of the ICE and EM algorithms. For this reason, we calculate the complexity of each task executed by these estimators: the complexity of the Forward algorithm αn(i), the complexity of the Backward algorithm βn(i), the complexity of calculating the marginal a posteriori probability ξn(i) and the joint a posteriori probability γn(i, j), then the complexity of calculating the parameters PI(i), A(i, j), µi, σi², and of the simulation of X by the ICE algorithm in each iteration q ∈ Q.
We have N observations (the size of Y) and K states (the number of classes). We summarize the complexity of each task executed by EM and ICE in Table 1.
Table 1. Complexity of EM and ICE algorithms

Task                                  EM                                 ICE
Forward                               O(K²N)                             O(K²N)
Backward                              O(K²N)                             O(K²N)
Joint a posteriori probability        O(K²N)                             O(K²N)
Marginal a posteriori probability     O(KN)                              O(KN)
Initial law PI                        O(K)                               O(K)
Matrix of transition A                O(K²N)                             O(K²N)
Mean µ                                O(KN)                              O(KN)
Variance σ²                           O(KN)                              O(KN)
Simulation of X                       not executable by this algorithm   O(KN)
From this table, we notice that the complexity of ICE is higher than the complexity of EM, because ICE simulates the hidden process X [35] once in each iteration q ∈ Q. This task makes ICE more complex than EM.
3.5. MPM Algorithm
To find a final configuration of X, this estimator maximizes, for each pixel yn, ∀n ∈ N, the marginal a posteriori probability [5]:

(19)

We use this mathematical formula to estimate a membership class x̄n^mpm for each pixel yn, ∀n ∈ N:

(20)

By this approach, we estimate a final configuration of the process X. MPM has a complexity of O(KN).
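As a minimal sketch (assuming equation (20) is the usual per-pixel argmax of the marginal posterior ξ computed in Section 3.3):

    import numpy as np

    def mpm_decision(xi):
        # For each pixel y_n, keep the class that maximizes the marginal posterior xi_n(i).
        # One pass over N pixels and K classes, hence the O(KN) complexity.
        return np.argmax(xi, axis=1)

The resulting label chain can then be mapped back to the image grid with the inverse scan sketched in Section 1 (chain_to_image).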
4. EXPERIMENTAL RESULTS
4.1. Experiments
We segment brain MRI images into three regions. We compare the EM and ICE algorithms in terms of quality, namely the PSNR index, the SSIM index, the error rate and the convergence. Then, we extract a region of interest using thresholding techniques [11],[15].
We use the K-means algorithm [8] to initialize the configuration of X0. Concerning the initial law PI0 we have:

PI0 =

Concerning the matrix of transition A0, we have:

A0 =

The mean µ0 and the variance (σ0)² are initialized by K-means from the configuration of X0.
We use a number of iterations Q = 30. We have used this type of parameter initialization in all the experiments presented in this work. We have carried out ten experiments on ten MRI images. We assume that the MRI images used in this computation are filtered.
After the segmentation phase, we have taken the image result of segmentation X obtained by ICE in each experiment and we have extracted from this image the region of interest (the tumor), using the thresholding technique. This technique consists of eliminating all the regions of the image and keeping only the region of interest, which has to be extracted from the cerebral image X. To facilitate the diagnosis of the type of tumor (benign or malignant), we take the original image Y and we mark the position of the tumor in white. We have also surrounded the tumor with a red contour.
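As a hedged sketch of this extraction step (the exact thresholding rule comes from [11],[15] and is not reproduced here; the choice of the tumor label is hypothetical):

    import numpy as np

    def extract_interest_region(x_seg, y_img, tumor_label):
        # Keep only the region of interest in the segmented image X and
        # mark the tumor position in white on the original image Y.
        mask = (x_seg == tumor_label)
        marked = y_img.copy()
        marked[mask] = marked.max()
        return mask, marked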
In particular, we present the results obtained in each experiment; they are shown in the following figures.
Figure 1. Experiment 1: (a) Original image Y, (b) Configuration X0, (c) EM, (d) ICE, (e) Indexed Regions, (f) Interest Region, (g) Final Result

Figure 2. Experiment 2: (a) Original image Y, (b) Configuration X0, (c) EM, (d) ICE, (e) Indexed Regions, (f) Interest Region, (g) Final Result
Figure 3. Experiment 3: (a) Original image Y, (b) Configuration X0, (c) EM, (d) ICE, (e) Indexed Regions, (f) Interest Region, (g) Final Result

Figure 4. Experiment 4: (a) Original image Y, (b) Configuration X0, (c) EM, (d) ICE, (e) Indexed Regions, (f) Interest Region, (g) Final Result

Figure 5. Experiment 5: (a) Original image Y, (b) Configuration X0, (c) EM, (d) ICE, (e) Indexed Regions, (f) Interest Region, (g) Final Result

Figure 6. Experiment 6: (a) Original image Y, (b) Configuration X0, (c) EM, (d) ICE, (e) Indexed Regions, (f) Interest Region, (g) Final Result

Figure 7. Experiment 7: (a) Original image Y, (b) Configuration X0, (c) EM, (d) ICE, (e) Indexed Regions, (f) Interest Region, (g) Final Result
Figure 8. Experiment 8: (a) Original image Y, (b) Configuration X0, (c) EM, (d) ICE, (e) Indexed Regions, (f) Interest Region, (g) Final Result

Figure 9. Experiment 9: (a) Original image Y, (b) Configuration X0, (c) EM, (d) ICE, (e) Indexed Regions, (f) Interest Region, (g) Final Result

Figure 10. Experiment 10: (a) Original image Y, (b) Configuration X0, (c) EM, (d) ICE, (e) Indexed Regions, (f) Interest Region, (g) Final Result
From these figures, we notice that HMC-IN divides the image into three regions; among these regions we find the region containing the brain tumor. Visually, the ICE and EM methods capture the same details of the real image in these experiments.
4.2. The Results
We summarize the obtained results in the following tables. We have compared these estimators in ten experiments in terms of the PSNR index, the SSIM index, the error rate and the convergence.
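As a hedged sketch of how two of these quality measures could be computed (the exact formulas used in the experiments are not reproduced here; the SSIM index would typically come from a library routine such as skimage.metrics.structural_similarity):

    import numpy as np

    def psnr(reference, result, peak=255.0):
        # PSNR index in dB between a reference image and the segmentation result.
        mse = np.mean((reference.astype(float) - result.astype(float)) ** 2)
        return 10 * np.log10(peak ** 2 / mse)

    def error_rate(reference_labels, result_labels):
        # Percentage of misclassified pixels.
        return 100.0 * np.mean(reference_labels != result_labels)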
Table 2. PSNR index and SSIM index

Experiments      PSNR ICE    SSIM ICE    PSNR EM     SSIM EM
Experiment 1     21.9594     0.5390      21.9500     0.5397
Experiment 2     24.0672     0.5710      24.0672     0.5697
Experiment 3     19.9323     0.4821      19.9322     0.4847
Experiment 4     22.1529     0.4990      22.1529     0.4977
Experiment 5     18.4713     0.4773      18.4713     0.4784
Experiment 6     21.8050     0.5157      21.8058     0.5150
Experiment 7     20.3738     0.3908      20.3738     0.3922
Experiment 8     19.0083     0.3488      19.0083     0.3506
Experiment 9     18.0636     0.3845      18.0631     0.3843
Experiment 10    21.7587     0.3574      21.7587     0.3572
Table 3. Error rate

Experiments      Error rate ICE    Error rate EM
Experiment 1     9.2127            9.2127
Experiment 2     8.1357            8.1357
Experiment 3     7.9766            7.9766
Experiment 4     11.4270           11.4270
Experiment 5     9.6075            9.6075
Experiment 6     10.0450           10.0450
Experiment 7     9.4128            9.4128
Experiment 8     7.0975            7.0975
Experiment 9     13.0872           13.0872
Experiment 10    12.9558           12.9558
Table 4. The convergence of the ICE and EM algorithms

Experiments      ICE              EM
Experiment 1     7 iterations     8 iterations
Experiment 2     6 iterations     7 iterations
Experiment 3     9 iterations     12 iterations
Experiment 4     10 iterations    13 iterations
Experiment 5     6 iterations     8 iterations
Experiment 6     7 iterations     6 iterations
Experiment 7     9 iterations     11 iterations
Experiment 8     7 iterations     9 iterations
Experiment 9     9 iterations     8 iterations
Experiment 10    10 iterations    9 iterations
From these tables, we notice that the values of the PSNR index, the SSIM index and the error rate obtained in each experiment by EM and ICE are similar: EM and ICE give the same result in these ten experiments, despite the fact that they use different strategies to estimate the parameters. The quality of segmentation is comparable for both algorithms; we observe no difference in terms of quality. From the convergence values, EM and ICE are both quick to converge, but ICE generally converges more quickly than EM, because the convergence of EM has some difficulties: it depends on its initial parameters.
5. CONCLUSION
In this paper, we have carried out a comparative study between two iterative estimators, EM and ICE, used to estimate the HMC-IN parameters, combined with the final Bayesian decision criterion MPM, in order to segment ten medical brain tumor MRI images. We have used the thresholding technique to extract the region of interest (the tumor position) from the image result of segmentation. Generally, ICE and EM give the same results in terms of the PSNR index, the SSIM index and the error rate, but the experimental results show that ICE converges to a solution faster than EM, while EM is less complex than ICE. This work raises many open questions. In particular, it is possible to:
1. Use these estimators to segment color textured images.
2. Program these estimators to estimate the parameters of pairwise or triplet Markov chain models.
3. Segment the MRI images using the triplet Markov chain, considering that X is non-stationary.
REFERENCES
[1] P. Devijver, "Baum's forward backward algorithm revisited," Pattern Recognition Letters, 3, pp. 369–373, 1985.
[2] R. Van Hadel, "Hidden Markov models," pp. 51-64, July 28, 2008.
[3] J. C. Biscarat, G. Celeux, J. Diebolt, "Stochastic versions of the EM algorithm," 1985.
[4] Y. Zhang, "Prediction of financial times series with Hidden Markov Chain," thesis, Shandong University, China, 2011.
[5] M. L. Comer and E. J. Delp, "The EM/MPM algorithm for segmentation of textured images," October 2000, pp. 1731-1744.
[6] W. Pieczynski, "Sur la convergence de l'estimation conditionnelle itérative," C. R. Acad. Sci. Paris, ser. 1346 (2008), pp. 457-460.
[7] D. P. Kroese, "Monte Carlo Methods," course, University of Queensland, 2011.
[8] S. Tatiraju, et al., "Image segmentation using K-means clustering, EM and normalized cuts," 2008.
[9] W. Pieczynski, "Convergence of the iterative conditional estimation and application on the mixture proportion identification," IEEE Statistical Signal Workshop, SSP (2007), Madison, WI, USA, August 26-29, 2007.
[10] S. Paltani, "Monte Carlo Method," statistics course for Astrophysicists, University of Geneva, 2011.
[11] N. Idrissi, F. E. Ajmi, "A Hybrid Segmentation Approach for Brain Tumor Extraction and Detection," conference paper, DOI: 10.1109/ICMCS.2014.6911131, April 2014.
[12] Y. Zhang, et al., "Segmentation of brain MR images through a hidden Markov field model and the expectation-maximization algorithm," IEEE Transactions on Medical Imaging, vol. 20, no. 1, January 2001.
[13] G. Celeux and J. Diebolt, "Stochastic approximation type EM algorithm for the mixture," Rapport de recherche INRIA, 1383, 1991.
[14] N. Rechid, et al., "Segmentation non supervisée d'images basée sur les modèles de Markov cachés," Courrier de savoir, no. 12, octobre 2011, pp. 34-39.
[15] S. S. Al-amri, N. V. Kalyankar, S. D. Khamitkar, "Image Segmentation by Using Threshold Techniques."
[16] T. Mackel, et al., "Application of Hidden Markov Modeling of Objective Medical Skill Evaluation," Medicine Meets Virtual Reality 15, Long Beach, CA, February 2007.
[17] M. Ameur, N. Idrissi, C. Daoui, "Markovian Segmentation of Color and Gray Level Images," conference paper, DOI: 10.1109/CGIV.2016.57, March 2016.
[18] N. J. Rose, "Hilbert-Type Space-Filling Curves," 2000.
[19] H. Sagan, "Space Filling Curves," Springer-Verlag, New York, 1994.
[20] P. F. Felzenszwalb, D. P. Huttenlocher, "Efficient Graph-Based Image Segmentation," 1999.
[21] G. R. C. Marquez, H. J. Escalante, L. E. Sucar, "Simplified Quadtree Image Segmentation for Image Annotation," AIAR 2010: Proceedings of the 1st Automatic Image Annotation and Retrieval Workshop 2010, vol. 1, issue 1, pp. 24-34.
[22] B. G. H. Gorte, "Multi-Spectral Quadtree based Image Segmentation," International Archives of Photogrammetry and Remote Sensing, vol. XXXI, Part B3, Vienna, 1996.
[23] P. Lanchantin, F. Salzenstein, "Segmentation d'images multispectrales par arbres de Markov cachés flous," 2005.
[24] W. Pieczynski, D. Benboudjema, P. Lanchantin, "Statistical image segmentation using triplet Markov fields," in: International Symposium on Remote Sensing, SPIE, Crete, Greece, 2002, pp. 22–27.
[25] N. Giordana, W. Pieczynski, "Estimation of generalized multisensor hidden Markov chains and unsupervised image segmentation," IEEE Trans. Pattern Anal. Machine Intell., vol. 19, no. 5, pp. 465–475, May 1997.
[26] S. Bricq, Ch. Collet, J. P. Armspach, "Triplet Markov chains for 3D MRI brain segmentation using a probabilistic atlas," IEEE International Symposium on Biomedical Imaging, April 2006.
[27] E. Monfrini, et al., "Image and Signal Restoration using Pairwise Markov Trees," IEEE Workshop on Statistical Signal Processing (SSP 2003), Saint Louis, Missouri, Sep.-Oct. 2003.
[28] S. Saini, K. Arora, "A Study Analysis on the Different Image Segmentation Techniques," International Journal of Information and Computation Technology, ISSN 0974-2239, vol. 4, no. 14 (2014), pp. 1445-1452.
[29] A. Nakib, H. Oulhadj and P. Siarry, "Microscopic image segmentation with two-dimensional exponential entropy based on hybrid microcanonical annealing," MVA2007 IAPR Conference on Machine Vision Applications, May 16-18, 2007, Tokyo, Japan.
[30] W. Pieczynski, "Modèles de Markov en traitement des images," Traitement du signal, vol. 20, no. 3, pp. 255-278.
[31] A. Dempster, et al., "Maximum likelihood from incomplete data via the EM algorithm," Journal of the Royal Statistical Society, series B (Methodological), 1977.
[32] W. Pieczynski, "EM and ICE in hidden and triplet Markov models," Stochastic Modeling Techniques and Analysis, International Conference, June 8-11, 2010.
[33] G. Celeux and J. Diebolt, "Stochastic approximation type EM algorithm for the mixture," Rapport de recherche INRIA, 1383, 1991.
[34] Wei and Tanner, "A Monte Carlo implementation of the EM algorithm and the poor man's data augmentation algorithm," Journal of the American Statistical Association, 85, pp. 699-704, 1987.
[35] J. P. Delmas, "Relations entre les algorithmes d'estimation itératives ICE et EM avec exemple d'application," Quinzième Colloque GRETSI, Juan-Les-Pins, 18-21 September 1995.
[36] S. Derrode, W. Pieczynski, "Unsupervised data classification using pairwise Markov chains with automatic copulas selection," Computational Statistics and Data Analysis, 63 (2013), pp. 81–98.