TELKOMNIKA, Vol.12, No.2, June 2014, pp. 357~366
ISSN: 1693-6930, accredited A by DIKTI, Decree No: 58/DIKTI/Kep/2013
DOI: 10.12928/TELKOMNIKA.v12i2.2022
Received February 13, 2014; Revised April 10, 2014; Accepted April 25, 2014
Multi-focus Image Fusion with Sparse Feature Based
Pulse Coupled Neural Network
Yongxin Zhang 1,2, Li Chen* 1, Zhihua Zhao 1, Jian Jia 3
1 School of Information Science and Technology, Northwest University, Xi'an 710127, Shaanxi, China
2 Luoyang Normal University, Luoyang 471022, He'nan, China
3 Department of Mathematics, Northwest University, Xi'an 710127, Shaanxi, China
*Corresponding author, e-mail: tabo126@126.com
Abstract

In order to better extract the focused regions and effectively improve the quality of the fused image, a novel multi-focus image fusion scheme with sparse feature based pulse coupled neural network (PCNN) is proposed. The registered source images are decomposed into principal matrices and sparse matrices by robust principal component analysis (RPCA). The salient features of the sparse matrices construct the sparse feature space of the source images. The sparse features are used to motivate the PCNN neurons. The focused regions of the source images are detected by the output of the PCNN and integrated to construct the final fused image. Experimental results show that the proposed scheme works better in extracting the focused regions and improving the fusion quality compared to the other existing fusion methods in both the spatial and the transform domain.
Keywords: image fusion, robust principal component analysis, pulse-coupled neural network, sparse feature, firing times
1. Introduction

Multi-focus image fusion is a process in which different images with different focus settings are integrated to produce a new image that contains all relevant objects in focus, which is very useful for human or machine perception [1]. In general, image fusion methods can be categorized into two groups: spatial domain fusion and transform domain fusion [2]. The spatial domain fusion methods are easy to implement and have low computational complexity, but they may produce blocking artifacts and compromise the quality of the final fused image. Different from the spatial domain fusion, the transform domain fusion methods can achieve improved contrast, better signal-to-noise ratio and better fusion quality [3], but they are time- and space-consuming to implement.
Pulse coupled neural network (PCNN) is a novel visual cortex-inspired neural network characterized by the global coupling and pulse synchronization of neurons, which was developed by Eckhorn et al. [4] from the experimental observation of synchronization bursts in cat and monkey visual cortex in 1990. Broussard et al. [5] first applied PCNN to image fusion for object detection, and Johnson et al. [6] pointed out the great potential of PCNN in the field of data fusion in the same year.
It has been observed that PCNN based image fusion methods outperform the conventional methods [7]. So far, many multi-focus image fusion methods based on PCNN have been proposed [8]-[12]. However, most of them suffer from various problems. Miao et al. [8] have proposed a fusion method using the sharpness of a small neighborhood of each pixel as the linking strength of PCNN. It works better in preserving edge and texture information, but suffers from contrast reduction. Huang et al. [9] have proposed a fast fusion method based on energy of the image Laplacian (EOL) motivated PCNN in the spatial domain. It improves the fusion speed but produces blocking artifacts in the fused image. Qu et al. [10] have developed a fusion method based on spatial frequency (SF) motivated PCNN in the nonsubsampled contourlet transform (NSCT) domain. It works well for multi-focus and visible/infrared images, but the absence of directional information in SF and the use of the same fusion rule for all the sub-bands lead to contrast reduction and the loss of image details. Wang et al. [11] have proposed a fusion scheme based on dual-channel PCNN. This scheme motivates the dual-channel PCNN by using the EOL of the pixel's neighborhood and achieves better fusion results. It improves the quality of the fused image, but consumes more time. Geng et al. [12] have developed a fusion method based on PCNN in the shearlet domain. It improves the quality of the fused image, but the absence of shift invariance in the shearlet transformation leads to unwanted image degradations.
Different from the fusion methods mentioned above, in this paper a new method of multi-focus image fusion with sparse feature based PCNN is proposed. Robust principal component analysis (RPCA) [13] is an important method of low-rank matrix recovery, which decomposes an image into a low-rank matrix that corresponds to the background, and a sparse one that links to salient objects [14]. Wan et al. [15] have investigated the potential application of RPCA in multi-focus image fusion and achieved a consistently good fusion result, but their method requires longer computational time. Different from Wan's method, the main contribution of this paper is that the sparse features of the source images are used to motivate the PCNN neurons for image fusion. The sparse matrices of the source images are obtained by using RPCA decomposition. The sparse features computed from the sparse matrices are used to motivate the PCNN neurons. The focused regions are detected by comparing the firing times of the PCNN neurons. The proposed method can efficiently extract the focused region details from the source images and improve the visual quality of the fused image.
The rest of the paper is organized as follows. In Section 2, the basic idea of RPCA and PCNN is briefly described, followed by the new method with RPCA and PCNN for image fusion in Section 3. In Section 4, extensive simulations are performed to evaluate the performance of the proposed method. In addition, several experimental results are presented and discussed. Finally, concluding remarks are drawn in Section 5.
2. Related Work
2.1. Robust Principal Component Analysis
RPCA is an effective way to recover both low-rank and sparse components exactly from high dimensional data by solving the principal component pursuit [13], in which an input data matrix $D \in \mathbb{R}^{M \times N}$ is subject to a low-rank property. In order to recover the low-rank structure of $D$, $D$ can be decomposed as:

$$\min_{A,E} \ \mathrm{rank}(A) \quad \text{s.t.} \quad D = A + E \qquad (1)$$
where matrix $A$ is a principal matrix and $E$ is a sparse matrix. It is obvious that this problem is difficult to solve. Recently, Wright et al. [16] have demonstrated that when the sparse matrix $E$ is sufficiently sparse (relative to the rank of $A$), one can accurately recover the principal matrix $A$ from $D$ by solving the following convex optimization problem [17]:

$$\min_{A,E} \ \|A\|_* + \lambda \|E\|_1 \quad \text{s.t.} \quad A + E = D \qquad (2)$$

where $\|\cdot\|_*$ denotes the nuclear norm of matrix $A$, $\lambda$ is a positive weighting parameter, and $\|\cdot\|_1$ denotes the $\ell_1$ norm of the matrix $E$.
Candes et al. [13] have extended RPCA to background modelling from surveillance video. They correctly identified the moving pedestrians in the foreground by using the sparse component of the surveillance video. The sparse matrix $E$ represents the salient features of the foreground objects effectively. As is known, the salient objects in the foreground are very important for multi-focus image fusion. Motivated by Candes's idea, this paper tries to extract the sparse features of the source images by using RPCA decomposition. Figure 1(a) shows the multi-focus source images 'Book'. Figures 1(b) and 1(c) show the corresponding images of the principal matrix $A$ and the sparse matrix $E$, respectively. It is obvious that the salient features of the sparse matrix $E$ agree well with the local features of the focused objects in the source images. In this paper, the sparse features computed from the sparse matrix $E$ are used to motivate the PCNN neurons, which will be introduced in the following subsection.
Figure 1. Decomposition of multi-focus images 'Book' using RPCA: (a) Source images $D$; (b) Principal matrix $A$; (c) Sparse matrix $E$
2.2. Pulse Coupled Neural Network
PCNN is a feedback network and belongs to the third generation of artificial neural networks. In image processing, PCNN is a single layered, two-dimensional, laterally connected network whose neurons correspond one-to-one with the image pixels. Each PCNN neuron consists of three parts: the receptive field, the modulation field and the pulse generator. The PCNN neuron's specific structure is shown in Figure 2. The neuron can be described as [7]:
$$\begin{aligned}
F_{ij}(n) &= e^{-\alpha_F} F_{ij}(n-1) + S_{ij} + V_F \sum_{kl} M_{ij,kl} Y_{kl}(n-1) \\
L_{ij}(n) &= e^{-\alpha_L} L_{ij}(n-1) + V_L \sum_{kl} W_{ij,kl} Y_{kl}(n-1) \\
U_{ij}(n) &= F_{ij}(n)\left[1 + \beta L_{ij}(n)\right] \\
Y_{ij}(n) &= \begin{cases} 1 & U_{ij}(n) > \theta_{ij}(n) \\ 0 & \text{otherwise} \end{cases} \\
\theta_{ij}(n) &= e^{-\alpha_\theta} \theta_{ij}(n-1) + V_\theta Y_{ij}(n-1)
\end{aligned} \qquad (3)$$
where the indexes $i$ and $j$ refer to the pixel location in the image, and $k$ and $l$ refer to the dislocation in a symmetric neighborhood around that pixel. $n$ denotes the current iteration and $S_{ij}$ denotes the input stimulus, such as the normalized gray level of the image pixels. $\alpha_F$, $\alpha_L$ and $\alpha_\theta$ are the decay constants of the PCNN neuron. $V_F$, $V_L$ and $V_\theta$ are the magnitude scaling terms. The constant $\beta$ is the linking strength. $F_{ij}$ is the primary input from the neuron's receptive field. $L_{ij}$ is the secondary input of lateral connections with neighboring neurons. The inter-connections $M$ and $W$ are the constant synaptic weight matrices for $F_{ij}$ and $L_{ij}$, respectively. $\theta_{ij}$ is a dynamic neuron threshold. The neuron generates a pulse when $U_{ij}(n) > \theta_{ij}(n)$. This pulse is also called one firing time. The sum of $Y_{ij}$ over $n$ iterations is called the firing times, used to represent image information, which is defined as [7]:
$$T_{ij}(n) = T_{ij}(n-1) + Y_{ij}(n) \qquad (4)$$
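The iteration in Eqs. (3)-(4) can be sketched in a few lines of NumPy/SciPy. The linking kernel, the parameter values and the initial threshold below are illustrative defaults, not the paper's settings (those are given in Section 4); the structure of the update follows Eq. (3) with $M = W$.

```python
import numpy as np
from scipy.ndimage import convolve

def pcnn_firing_times(S, n_iter=200, alpha_F=0.1, alpha_L=1.0,
                      alpha_T=0.2, V_F=0.5, V_L=0.2, V_T=20.0, beta=0.2):
    """Run the PCNN of Eq. (3) on a stimulus S and accumulate the
    firing times T of Eq. (4). Parameter values are illustrative."""
    W = np.array([[0.5, 1.0, 0.5],
                  [1.0, 0.0, 1.0],
                  [0.5, 1.0, 0.5]])              # synaptic weights, M = W
    F = np.zeros_like(S); L = np.zeros_like(S)
    Y = np.zeros_like(S); T = np.zeros_like(S)
    Theta = np.ones_like(S)                      # dynamic threshold
    for _ in range(n_iter):
        link = convolve(Y, W, mode='constant')   # neighborhood pulses
        F = np.exp(-alpha_F) * F + S + V_F * link    # feeding input
        L = np.exp(-alpha_L) * L + V_L * link        # linking input
        U = F * (1.0 + beta * L)                     # modulation
        Y = (U > Theta).astype(float)                # pulse output
        Theta = np.exp(-alpha_T) * Theta + V_T * Y   # decay / reset
        T += Y                                       # Eq. (4)
    return T
```

Neurons receiving a stronger stimulus cross the decaying threshold earlier and more often, so regions with larger input accumulate larger firing times, which is what the fusion rule in Section 3 exploits.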
Figure 2. A PCNN neuron model
The advantage of PCNN in image fusion lies in its global coupling and pulse synchronization of neurons. In this paper, the focused regions are detected by comparing the firing times of the PCNN neurons.
3. Multi-focus Image Fusion with Sparse Feature Based PCNN
3.1. Fusion Algorithm
In this subsection, a novel algorithm for multi-focus image fusion is proposed and the fusion framework is depicted in Figure 3. For simplicity, this paper assumes that there are only two source images, namely $I_A$ and $I_B$. The rationale behind the proposed algorithm applies equally to the fusion of more than two multi-focus images. The source images are assumed to be pre-registered, and image registration is not included in the framework. The fusion algorithm consists of the following 4 steps:
Figure 3. Block diagram of the proposed multi-focus image fusion framework
Step 1: Construct the data matrix $D$. The source images $I_A, I_B \in \mathbb{R}^{M \times N}$ are converted into column vectors $I_A^c, I_B^c \in \mathbb{R}^{MN \times 1}$, respectively. The data matrix $D$ is defined as:

$$D = [\,I_A^c \ \ I_B^c\,] \qquad (5)$$
Step 2: Perform RPCA decomposition on $D$ to obtain a principal matrix $A \in \mathbb{R}^{MN \times 2}$ and a sparse matrix $E \in \mathbb{R}^{MN \times 2}$, respectively. The sparse matrix $E$ is computed through the inexact augmented Lagrange multipliers (IALM) algorithm of RPCA [13], which is a fast implementation for recovering low-rank matrices. The sparse matrix $E$ is then converted into matrices $E_A, E_B \in \mathbb{R}^{M \times N}$ corresponding to the source images $I_A$ and $I_B$, respectively.
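The reshaping in Steps 1 and 2 can be sketched as follows. Only the bookkeeping around the RPCA solve is shown (any RPCA implementation can produce the sparse matrix $E$); the function names are illustrative.

```python
import numpy as np

def build_data_matrix(I_A, I_B):
    """Step 1: stack the two registered source images as the columns
    of D, each image flattened to an MN x 1 vector (Eq. 5)."""
    assert I_A.shape == I_B.shape
    return np.column_stack([I_A.ravel(), I_B.ravel()])

def split_sparse_matrix(E, shape):
    """End of Step 2: reshape the two columns of the MN x 2 sparse
    matrix E back into image-sized matrices E_A and E_B."""
    E_A = E[:, 0].reshape(shape)
    E_B = E[:, 1].reshape(shape)
    return E_A, E_B
```

Round-tripping an image pair through these two helpers reproduces the originals exactly, which confirms the column/reshape conventions are consistent.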
Step 3: Construct the PCNN model with the sparse features computed from the sparse matrices $E_A, E_B \in \mathbb{R}^{M \times N}$, respectively.

Step 4: According to the fusion rules, the focused regions of the source images are integrated to obtain the fused image.
3.2. Fusion Rules
There are two key issues [19] for the fusion rules. One is how to measure the activity level of the focused regions, which reflects the sharpness of the source images. Figure 1 shows that the salient features of the sparse matrix $E$ agree well with the local features of the focused objects in the source images. The salient features represent the sparse features of the source images. Moreover, the advantage of PCNN in image fusion is the global coupling and pulse synchronization of neurons. Thus, we use the firing times of the PCNN neurons to measure the activity level. The PCNN neurons are motivated by the sparse features computed from the sparse matrices.
The sparse matrices $E_A$ and $E_B$ are divided into blocks with fixed block size, respectively. Let $E_A(k)$ and $E_B(k)$ denote the $k$th block of the sparse matrices $E_A$ and $E_B$, respectively. The EOL of each block is used as the sparse feature of the source images, which can be calculated as [18]:

$$EOL = \sum_i \sum_j \left( E_{ii} + E_{jj} \right)^2 \qquad (6)$$
$$\begin{aligned}
E_{ii} + E_{jj} = {} & -E(i-1, j-1) - 4E(i-1, j) - E(i-1, j+1) - 4E(i, j-1) + 20E(i, j) \\
& - 4E(i, j+1) - E(i+1, j-1) - 4E(i+1, j) - E(i+1, j+1)
\end{aligned} \qquad (7)$$
where $E(i, j)$ indicates the value of the element at position $(i, j)$ in the sparse matrix block. Let $EOL_{E_A}(k)$ and $EOL_{E_B}(k)$ be the EOL of $E_A(k)$ and $E_B(k)$, respectively. The EOL of each block of the sparse matrices constructs the feature maps $F_A$ and $F_B$, respectively. $F_A$ and $F_B$ are input to the PCNN to motivate the neurons to generate pulses with Equation (3), and the firing times of the neurons are calculated with Equation (4).
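Eqs. (6)-(7) amount to convolving each block with a modified-Laplacian kernel and summing the squared response. A compact sketch, computing the per-block EOL over the whole sparse matrix at once (the 8x8 block size mirrors the setting used later in Section 4, and the edge handling is an illustrative choice):

```python
import numpy as np
from scipy.ndimage import convolve

# Modified-Laplacian kernel matching the weights of Eq. (7).
LAP = np.array([[-1.0, -4.0, -1.0],
                [-4.0, 20.0, -4.0],
                [-1.0, -4.0, -1.0]])

def block_eol(E, block=8):
    """Energy of image Laplacian (Eq. 6) computed per block of the
    sparse matrix E; returns one EOL value for each block."""
    lap = convolve(E, LAP, mode='nearest')
    sq = lap ** 2
    m, n = sq.shape
    mb, nb = m // block, n // block
    # Sum the squared response inside each (block x block) tile.
    tiles = sq[:mb * block, :nb * block].reshape(mb, block, nb, block)
    return tiles.sum(axis=(1, 3))
```

Blocks containing sharp detail (large Laplacian response) receive large EOL values, so the resulting map plays the role of the feature maps $F_A$ and $F_B$ above.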
The other issue is how to integrate the focused pixels or regions of the source images into the counterparts of the fused image. The firing times of the corresponding blocks are compared to determine which block is in focus. A decision matrix $H \in \mathbb{R}^{M \times N}$ is constructed for recording the comparison results according to the selection rule as follows:

$$H(i, j) = \begin{cases} 1 & T_A^k(n) > T_B^k(n) \\ 0 & \text{otherwise} \end{cases} \qquad (8)$$
where '1' in $H$ indicates that the pixel $(i, j)$ of the $k$th block of image $I_A$ is in focus, and '0' in $H$ indicates that the pixel $(i, j)$ of the $k$th block of image $I_B$ is in focus. However, judging by the firing times of the PCNN neurons alone is not sufficient to detect all the focused blocks. There are thin protrusions, narrow breaks, thin gulfs and small holes in $H$. To overcome these disadvantages, morphological operations [20] are performed on $H$. Opening, denoted as $H \circ Z$, is simply erosion of $H$ by the structure element $Z$, followed by dilation of the result by $Z$. This process can remove thin gulfs and thin protrusions. Closing, denoted as $H \bullet Z$, is dilation followed by erosion. It can join narrow breaks and thin gulfs. To correctly judge the small holes, a threshold is set to remove the holes smaller than the threshold. In this paper, the structure element $Z$ of the proposed method is an $8 \times 8$ matrix with logical 1's and the threshold is set to 1000. Thus, the final fused image $F$ is constructed according to the rule as follows:
$$F(i, j) = \begin{cases} I_A(i, j) & H(i, j) = 1 \\ I_B(i, j) & H(i, j) = 0 \end{cases} \qquad (9)$$
where $I_A(i, j)$ and $I_B(i, j)$ are the values of the pixels at $(i, j)$ in the source images $I_A$ and $I_B$, respectively.
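The decision and composition stage of Eqs. (8)-(9) can be sketched with SciPy's morphology operators. This is a simplified stand-in for the paper's Matlab processing: the hole removal below relabels any small connected 0-region rather than holes in the strict topological sense, and the structure element size and area threshold follow the settings stated above.

```python
import numpy as np
from scipy.ndimage import binary_opening, binary_closing, label

def fuse_with_decision_map(I_A, I_B, T_A, T_B,
                           se_size=8, hole_thresh=1000):
    """Eq. (8): pick the image whose firing times are larger, clean
    the binary map with opening/closing using a structure element Z
    of ones, remove small holes, then compose the fusion per Eq. (9)."""
    H = (T_A > T_B)                                # Eq. (8)
    Z = np.ones((se_size, se_size), dtype=bool)    # structure element Z
    H = binary_opening(H, structure=Z)             # removes thin protrusions
    H = binary_closing(H, structure=Z)             # joins narrow breaks
    # Simplification: flip small connected 0-regions below the threshold.
    holes, n = label(~H)
    for k in range(1, n + 1):
        mask = holes == k
        if mask.sum() < hole_thresh:
            H[mask] = True
    return np.where(H, I_A, I_B), H                # Eq. (9)
```

With a clean, block-shaped firing-time difference, the fused output simply copies each half from the image that fired more.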
4. Experimental Results

In order to evaluate the performance of the proposed method, several experiments are performed on two pairs of multi-focus images [21, 22] varying in content and texture, as shown in Figure 4. The two pairs are grayscale images with sizes of 512x384 pixels and 640x480 pixels, respectively. In general, image registration should be performed before image fusion. In this paper, all the source images are assumed to have been registered. Experiments are conducted with Matlab in a Windows environment on a computer with an Intel Xeon X5570 and 48G memory.
For comparison, besides the proposed method, some existing multi-focus image fusion methods are also implemented on the same set of source images. These methods are discrete wavelet transform (DWT), SF (Li's method [23]), PCNN1 (Huang's method [9]), PCNN2 (Miao's method [8]) and RPCA (Wan's method [15]). Due to the lack of original source code, this paper uses Eduardo Fernandez Canga's Matlab image fusion toolbox [24] as a reference for DWT and SF. Specifically, the Daubechies wavelet function 'bi97' is used in the DWT and the decomposition level of DWT is 4. The RPCA toolbox [25] is used as the reference for RPCA decomposition. The PCNN toolbox [26] is used as a reference for PCNN1, PCNN2 and the proposed method, respectively. The parameters of PCNN1 are set as $k \times l = 13 \times 13$, $\alpha_L = 1.0$, $\beta = 5.0$, $V_L = 0.2$, $V_\theta = 20.0$ and $N = 300$. The parameters of Miao's method are set as $k \times l = 3 \times 3$, $\alpha_L = 0.9$, $\beta = 2.5$, $V_L = 0.2$, $V_\theta = 20.0$ and $N = 200$. The parameters of the proposed method are set the same as those of Huang's method and the block size is $8 \times 8$.
In order to quantitatively compare the performance of the proposed method with that of the other fusion methods mentioned above, two metrics are used to evaluate the fusion performance. They are: (i) mutual information (MI) [27], which measures the degree of dependence of the source image and the fused image; (ii) $Q^{AB/F}$ [28], which reflects the amount of edge information transferred from the source images to the fused image. A larger value for both means a better fusion result.
Figure 4. Multi-focus source images: (a) Near focused image 'Rose'; (b) Far focused image 'Rose'; (c) Near focused image 'Lab'; (d) Far focused image 'Lab'
4.1. Qualitative Analysis
For visual comparison, the fused images 'Rose' and 'Lab' obtained by the different methods are shown in Figure 5 and Figure 6, respectively. The difference images between the far focused source image 'Lab' and the corresponding fused images obtained by the different methods are shown in Figure 7.
Figure 5. The fused images 'Rose' obtained by different fusion methods: (a) DWT; (b) SF; (c) RPCA; (d) PCNN1; (e) PCNN2; (f) Proposed
Figure 6. The fused images 'Lab' obtained by different fusion methods: (a) DWT; (b) SF; (c) RPCA; (d) PCNN1; (e) PCNN2; (f) Proposed
Figure 7. The difference images between the far focused source image 'Lab' and the corresponding fused images obtained by different fusion methods: (a) DWT; (b) SF; (c) RPCA; (d) PCNN1; (e) PCNN2; (f) Proposed
Inspecting the rose and the wall in Figure 5, the contrast of the fused image of DWT is worse than that of SF, RPCA and the proposed method, and the contrast of the fused image of the proposed method is better than that of the fused images of the other fusion methods mentioned above. There are some blurry regions on the wall in the fused images of PCNN1 and PCNN2, respectively. Moreover, obvious blocking artifacts and small blurry regions appear on the door frame in the fused images of SF and RPCA, respectively. Inspecting the student and the clock in Figure 6, the student's head in the fused image of DWT shows obvious artifacts. A narrow protrusion appears on the upper edge of the student's head in the fused image of RPCA. Blocking artifacts appear on the left and right edges of the student's head in the fused images of PCNN1 and SF, respectively. An obvious artifact appears on the right edge of the student's head in the fused image of PCNN2. In Figure 7, mis-registration and distortion are obviously observed in the difference image of DWT. There are some obvious blocking artifacts in the difference images of SF and PCNN1, respectively. There are some obvious image residuals in the right part of the difference images of RPCA and PCNN2, respectively. Thus, the fused image of the proposed method achieves superior visual performance by containing all the focused contents from the source images. But it should be noted that there are also slight blocking artifacts on the edge of the clock in Figure 7(f). We attribute this to the fixed size of the structure element $Z$. To eliminate the thin protrusions, narrow breaks, thin gulfs, small holes, etc. in the decision matrix $H$, the morphological operations are performed on the decision matrix $H$ by using the structure element $Z$ with fixed size. The morphological operations lack adaptability because of the fixed size of the structure element $Z$, and cannot completely eliminate the thin protrusions, narrow breaks, thin gulfs, small holes, etc. in the decision matrix $H$.
4.2. Quantitative Analysis
For quantitative comparison, the results in the two quality measures are shown in Table 1. The proposed method gains the highest MI [27] and $Q^{AB/F}$ [28] values compared to the other fusion methods. The running times are also shown in Table 1. The proposed method requires shorter computational time than Wan's method. Because the sliding window technique is applied for the detection of the focused regions, the computation of the standard deviation of each sliding window in Wan's method [15] requires longer computational time than the block division used in the proposed method. But the proposed method still incurs a longer computational cost than the DWT-based and SF-based fusion methods, and the matrix decomposition accounts for the majority of the computational load.
Table 1. The performance of different fusion methods

                     Rose                           Lab
Method      MI    Q^AB/F   Run-time(s)   MI    Q^AB/F   Run-time(s)
DWT        4.78    0.67       0.45      6.47    0.69       0.59
SF         6.78    0.72       0.66      7.94    0.72       1.03
RPCA       7.75    0.71      39.28      8.50    0.75      60.80
PCNN1      7.45    0.64       0.51      8.86    0.71       0.55
PCNN2      6.33    0.65      20.64      8.78    0.68      32.51
Proposed   7.85    0.74       0.84      8.90    0.76       1.08
5. Conclusion

In this paper, a novel fusion method is proposed to effectively extract the focused regions and improve the quality of the fused image. The qualitative and quantitative analysis shows that the proposed method achieves superior fusion results compared to some existing fusion methods and significantly improves the quality of the fused image. In the future, we will consider optimizing the proposed method to reduce its time consumption and improve its adaptivity.
Acknowledgements

The work was supported by the National Key Technology Science and Technique Support Program (No. 2013BAH49F03), the Key Technologies R&D Program of Henan Province (No. 142102210637), the National Nature Science Foundation of China (No. 61379010), and the Natural Science Basic Research Plan in Shaanxi Province of China (No. 2012JQ1012).
References

[1] HJ Zhao, ZW Shang, YY Tang, B Fang. Multi-focus image fusion based on the neighbor distance. Pattern Recognition. 2013; 46(3): 1002-1011.
[2] ST Li, XD Kang, JW Hu, B Yang. Image matting for fusion of multi-focus images in dynamic scenes. Information Fusion. 2013; 14: 147-162.
[3] H Hariharan. Extending Depth of Field via Multi-focus Fusion. PhD Thesis. University of Tennessee, Knoxville. 2011.
[4] R Eckhorn, HJ Reitboeck, M Arndt, PW Dicke. Feature linking via synchronization among distributed assemblies: Simulation of results from cat cortex. Neural Computation. 1990; 2: 293-307.
[5] RP Broussard, SK Rogers, ME Oxley, GL Tarr. Physiologically motivated image fusion for object detection using a pulse coupled neural network. IEEE Transactions on Neural Networks. 1999; 10: 554-563.
[6] JL Johnson, HS Ranganath, G Kuntimad, HJ Caulfield. Pulse coupled neural networks. Neural Networks and Pattern Recognition. 1998: 1-56.
[7] ZB Wang, YD Ma, FY Cheng, LZ Yang. Review of pulse-coupled neural networks. Image and Vision Computing. 2010; 28(1): 5-13.
[8] QG Miao, BS Wang. A Novel Adaptive Multi-focus Image Fusion Algorithm Based on PCNN and Sharpness. SPIE 2005: Sensors, and Command, Control, Communications, and Intelligence (C3I) Technologies for Homeland Security and Homeland Defense IV, Edward M. Carapezza, Editor. 2005: 704-712.
[9] W Huang, ZL Jing. Multi-focus image fusion using pulse coupled neural network. Pattern Recognition Letters. 2007; 28(9): 1123-1132.
[10] XB Qu, JW Yan, HZ Xiao, ZQ Zhu. Image Fusion Algorithm Based on Spatial Frequency-Motivated Pulse Coupled Neural Networks in Nonsubsampled Contourlet Transform Domain. Acta Automatica Sinica. 2008; 34(2): 1508-1514.
[11] ZB Wang, YD Ma, J Gu. Multi-focus image fusion using PCNN. Pattern Recognition: The Journal of the Pattern Recognition Society. 2010; 43(6): 2003-2016.
[12] P Geng, X Zheng, ZG Zhang, YJ Shi, SQ Yan. Multifocus Image Fusion with PCNN in Shearlet Domain. Research Journal of Applied Sciences, Engineering and Technology. 2012; 4(15): 2283-2290.
[13] E Candes, X Li, Y Ma, J Wright. Robust principal component analysis?. Journal of the ACM. 2011; 58(3): 1-37.
[14] W Zou, K Kpalma, Z Liu, et al. Segmentation Driven Low-rank Matrix Recovery for Saliency Detection. 24th British Machine Vision Conference (BMVC). Bristol. 2013: 1-13.
[15] T Wan, CC Zhu, ZC Qin. Multifocus Image Fusion Based on Robust Principal Component Analysis. Pattern Recognition Letters. 2013; 34(9): 1001-1008.
[16] J Wright, A Ganesh, S Rao, Y Ma. Robust principal component analysis: Exact recovery of corrupted low-rank matrices via convex optimization. Proceedings of Advances in Neural Information Processing Systems. 2009: 2080-2088.
[17] Z Lin, M Chen, L Wu, Y Ma. The augmented Lagrange multiplier method for exact recovery of corrupted low-rank matrices. UIUC Technical Report UILU-ENG-09-2215. 2009.
[18] W Huang, Z Jing. Evaluation of focus measures in multi-focus image fusion. Pattern Recognition Letters. 2007; 28(4): 493-500.
[19] Y Jiang, M Wang. Image fusion with morphological component analysis. Information Fusion. 2014; 18: 107-118.
[20] B Yang, ST Li. Multi-focus image fusion based on spatial frequency and morphological operators. Chinese Optics Letters. 2007; 5(8): 452-453.
[21] http://www.ece.lehigh.edu/spcrl. 2005.
[22] http://www.imgfsr.com/sitebuilder/images. 2009.
[23] S Li, JT Kwok, Y Wang. Combination of images with diverse focuses using the spatial frequency. Information Fusion. 2001; 2(3): 169-176.
[24] Image fusion toolbox: http://www.imagefusion.org/.
[25] RPCA toolbox: http://perception.csl.illinois.edu/matrix-rank/sample_code.html.
[26] PCNN toolbox: http://quxiaobo.go.8866.org/project/PCNN/PCNN_toolbox.rar.
[27] DJC MacKay. Information theory, inference and learning algorithms. Cambridge University Press. 2003.
[28] CS Xydeas, V Petrovic. Objective image fusion performance measure. Electronics Letters. 2000; 36: 308-309.