TELKOMNIKA, Vol. 13, No. 2, June 2015, pp. 614~623
ISSN: 1693-6930, accredited A by DIKTI, Decree No: 58/DIKTI/Kep/2013
DOI: 10.12928/TELKOMNIKA.v13i2.1433
Received January 16, 2015; Revised March 28, 2015; Accepted April 12, 2015
Image Denoising Based on Artificial Bee Colony and BP Neural Network
Junping Wang, Dapeng Zhang*
Information Engineer Department, Henan Vocational and Technical Institute, Zhengzhou 450046, Henan, China
*Corresponding author, e-mail: 55410445@qq.com
Abstract
Images are often subject to noise pollution during collection, acquisition and transmission. Noise is a major factor affecting image quality, and it greatly impedes people from extracting information from the image. The purpose of image denoising is to restore the original noise-free image from the noisy image while maintaining as much of the detailed information of the image as possible. By combining the artificial bee colony algorithm and the BP neural network, this paper proposes an image denoising method based on artificial bee colony and BP neural network (ABC-BPNN). ABC-BPNN adopts a "double circulation" structure during the training process: after the expected convergence speed and precision are specified, it adjusts the rules according to the structure and automatically adjusts the number of neurons, while the weights of the neurons and the relevant parameters are determined through bee colony optimization. The simulation results show that the algorithm proposed in this paper can maintain image edges and other important features while removing noise, so as to obtain a better denoising effect.
Keywords: Image Denoising, Artificial Bee Colony, BP Neural Network
1. Introduction
The main media for information transmission are voice and image. The quantity of information and the intuitiveness contained in an image are unmatched by sound and words. However, an image is easily interfered with by all kinds of noise in the process of generation and transmission, and the image quality is damaged, which is very unfavorable to subsequent higher-level image processing [1]. Therefore, at the pre-processing stage it is quite necessary to conduct image denoising, in order to improve the SNR (signal-to-noise ratio) of the image and highlight its desired features [2].
Today, the study of the theory and application of image denoising is still a very active research direction in the image processing field. Many researchers at home and abroad have analyzed the statistical models and frequency distribution of the noise signal according to the signal characteristics, and have summarized and put forward many methods for image denoising. Divided according to the application domain, image denoising algorithms can be grouped into spatial domain methods and transform domain methods [3]. Divided according to the concrete applied theory, image denoising algorithms can be grouped into algorithms based on multi-resolution analysis, algorithms based on probability statistics theory, methods based on nonlinear filtering theory, methods based on partial differential equation theory, etc. A dilemma of denoising is how to maintain image details as much as possible while lowering image noise, and the research focus is to explore whether the algorithm can sparsely represent the image information effectively by dealing with different image features while denoising [4].
Artificial neural networks are a forefront subject as well as a hot research topic in image processing and have gained international recognition. They have attracted more and more attention with their applications in fields such as image denoising, image compression, image edge detection, image integration, etc. In the field of swarm intelligence, the artificial bee colony is a random optimization search algorithm based on the heuristic method of population, specializing in finding solutions to spatial optimization problems [5]. By combining the artificial bee colony and the BP neural network, this paper proposes the ABC-BPNN image denoising method. In this paper, the principle of image denoising is first described; then, on the basis of the analysis of the bee colony algorithm and the BP neural network, the BP neural network training process optimized by the
artificial bee colony is put forward, and finally the simulation experiment and analysis are carried out.
2. Source of Noise
The quality of a digital image will be affected by noise, and the noise is mainly generated during two processes: acquisition and transmission. In the image acquisition process, the imaging sensor is affected by various factors, such as the external environment and the quality of the sensor itself. Figure 1 shows the digitization process of the simulation image. A grayscale image can be represented by a two-dimensional function $f(x, y)$ with the size of $m \times n$ after being digitized, wherein $(x, y)$ represents the coordinate of a pixel in the original image and $f(x, y)$ represents the grayscale value of the pixel at that coordinate; at this point, $x$, $y$ and the amplitude $f$ are all discrete. The noise mainly comes from the collection, transmission and management stages during the image digitization process.
Figure 1. The digitization process of the simulation image
(1) Image information collection stage: at the image collection phase, the state of the system sensor is affected by factors such as the quality of components, the working environment, etc. The noise at this stage is mainly salt & pepper noise and bipolar noise.
(2) Image information transmission stage: at the image transmission stage, the transmission channel is interfered with and noise is thus generated. Bipolar noise and Gaussian noise are common at this stage.
(3) Image information management stage: image management includes storage, deletion, copying, etc. Due to the aging of components, self-excitation of circuits and poor filtering in the management system, resistance thermal noise, thunder-ball noise, etc. are generated [6]. A small simulation sketch of these common noise types is given below.
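As referenced above, the following is a minimal sketch (not code from the paper, which uses Matlab) of how the salt & pepper and Gaussian noise mentioned in the stages can be simulated on a grayscale image with NumPy; the density and variance values are illustrative assumptions only.

import numpy as np

def add_salt_pepper(img, density=0.05, rng=None):
    # Collection-stage impulse noise: flip a random fraction of pixels to black or white
    rng = np.random.default_rng(0) if rng is None else rng
    noisy = img.copy()
    mask = rng.random(img.shape)
    noisy[mask < density / 2] = 0.0        # pepper
    noisy[mask > 1.0 - density / 2] = 1.0  # salt
    return noisy

def add_gaussian(img, mean=0.0, var=0.02, rng=None):
    # Transmission-stage Gaussian noise with the given mean and variance
    rng = np.random.default_rng(0) if rng is None else rng
    noise = rng.normal(mean, np.sqrt(var), img.shape)
    return np.clip(img + noise, 0.0, 1.0)

# Example on a synthetic m x n grayscale image f(x, y) scaled to [0, 1]
f = np.linspace(0.0, 1.0, 256 * 256).reshape(256, 256)
noisy = add_gaussian(add_salt_pepper(f))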
3. Artificial Bee Colony
The artificial bee colony simulates the actual honey-gathering mechanism of bees to process the function optimization problem, and divides the artificial bee colony into three categories: leading bees, followed bees and scouters. The basic idea of the algorithm starts from one randomly generated initial population. It searches around the half of the individuals with the best fitness values and preferentially retains each individual by adopting a one-to-one competition survival strategy; such an operation is named the leading bee search. Then it selects the optimal individual by using the roulette selection method and conducts the greedy search around it to generate the other half of the individuals; such a process is named the followed bee search. A new population is formed by the individuals generated by the leading bees and the followed bees, to avoid the loss of population diversity. The analogous scouter variation search is then conducted to form the iterative population. By constant iterative calculation, the algorithm retains excellent individuals, eliminates inferior individuals, and gets closer to the global optimal solution. In the following, the solution of a nonlinear function minimization problem is taken as an example to elaborate the concrete operation process of the ABC algorithm in detail.
[Figure 1 block labels: analog image, image coding, channel coding, noise, storage or processing, noise, image decoding, channel coding, digital image, user]
The nonlinear function minimization problem is expressed as $\min f(X)$, $X_L \le X \le X_U$, where $X_L$ and $X_U$ are respectively the lower and upper bounds of the variable $X = (x_1, x_2, \ldots, x_n)$, and $n$ is the variable dimension.
When solving the nonlinear function minimum with the ABC algorithm, first of all, an initial population including $NP$ individuals is generated within the value range. Each individual corresponds to a candidate solution in the feasible solution space, and the dimension $D$ of the individual variable equals the dimension $n$ of the decision variable $X$ of the objective function. Suppose the algorithm's maximum iteration number is $G$; the $i$-th individual in the $t$-th population can be expressed as $X_i^t = (x_i^t(1), \ldots, x_i^t(n))$, $i = 1, 2, \ldots, NP/2$. The following is a description of the key steps of the ABC algorithm:
(i) Population initialization
Set the initial evolution generation $t = 0$. The initial population, consisting of $NP$ individuals $X_i^0$ that are randomly generated by formula (1) and satisfy the constraint conditions within the feasible optimization solution space, is then formed.
$$X_i^0 = X_i^L + rand() \cdot (X_i^U - X_i^L), \quad i = 1, 2, \ldots, NP$$ (1)
In this formula, $X_i^0$ denotes the $i$-th individual in the 0-th population, and $rand()$ is a random number uniformly distributed in $[0, 1]$.
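As an illustration only (not code from the paper), formula (1) can be realized in a few lines of NumPy; NP and the bound vectors X_L and X_U are assumed to be supplied by the caller.

import numpy as np

def init_population(NP, X_L, X_U, rng=None):
    # Formula (1): X_i^0 = X^L + rand() * (X^U - X^L), one row per individual
    rng = np.random.default_rng(0) if rng is None else rng
    X_L, X_U = np.asarray(X_L, float), np.asarray(X_U, float)
    return X_L + rng.random((NP, X_L.size)) * (X_U - X_L)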
(ii) Leading bee search
The half of the individuals with the smaller (better) fitness values forms the leading bee population, and the other half forms the followed bee population.
For one goal individual $X_i^t$ of the current $t$-th leading bee population, randomly select an individual $r_1 \in [1, 2, \ldots, NP/2]$ to conduct the crossover search by dimension and generate the new individual $V$, as shown in formula (2).
$$V_i^t(j) = x_i^t(j) + (2 \, rand() - 1) \cdot \big( x_i^t(j) - x_{r_1}^t(j) \big)$$ (2)
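A minimal sketch of the dimension-wise crossover search of formula (2), under the reconstruction above (the random factor is assumed to lie in [-1, 1]); x_i and x_r1 stand for the goal individual and the randomly selected individual.

import numpy as np

def crossover_search(x_i, x_r1, rng=None):
    # Formula (2): perturb the goal individual along the difference vector
    rng = np.random.default_rng(0) if rng is None else rng
    phi = 2.0 * rng.random(x_i.shape) - 1.0   # per-dimension random factor in [-1, 1]
    return x_i + phi * (x_i - x_r1)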
Figure 2 shows the crossover search diagram when the dimension of the objective function is two.
From the crossover search diagram in Figure 2, we can see that the generated difference vector is uncertain in both direction and size. The goal leading bee and the randomly selected leading bee add such a difference vector to the base vector, which is equivalent to adding a random disturbance within a regulated scope to the base vector; thus the population diversity is enlarged [7].
Figure 2. Illustration of the crossover process with D=2 (labels: leading bee, goal leading bee, new position, and the area of the residual vector resulting from the goal leading bee and the randomly selected leading bee)
Like other evolutionary algorithms, the ABC algorithm adopts the "survival of the fittest" idea of Darwin's evolution theory to retain the better individual, so as to ensure that the algorithm
evolves constantly toward the global optimal solution.
Evaluate the fitness of the newly generated individual $V$ and the goal individual $X_i^t$, compare their fitness values, and select the individual with the better fitness value by formula (3) to form the leading bee population.
$$X_i^{t+1} = \begin{cases} V_i^t, & f(V_i^t) \le f(X_i^t) \\ X_i^t, & f(X_i^t) < f(V_i^t) \end{cases}$$ (3)
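The greedy selection of formula (3) for a minimization objective f amounts to a one-line comparison; this is an illustrative sketch rather than the paper's code.

def greedy_select(x_old, v_new, f):
    # Formula (3): keep whichever individual has the smaller objective value
    return v_new if f(v_new) <= f(x_old) else x_old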
Figure 3. Flow chart of the standard ABC algorithm
(iii) Followed bee search
The followed bee selects the optimal goal individual $X_k^t$, $k \in [1, \ldots, NP/2]$, in the new leading bee population according to the probability formula (4) and the roulette selection mode, and
conducts the search together with a randomly selected individual based on formula (2) to generate the new individual $X_k^t$, $k \in [NP/2+1, \ldots, NP]$, and form the followed bee population.
$$P_i = \frac{fit_i}{\sum_{i=1}^{NP/2} fit_i}$$ (4)
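A sketch of the roulette selection driven by formula (4); fit is assumed to be an array of fitness values for the NP/2 leading bees (larger fitness corresponding to a smaller objective value in the usual ABC convention), which is an assumption for illustration.

import numpy as np

def roulette_select(fit, rng=None):
    # Formula (4): pick a leading bee index with probability fit_i / sum(fit)
    rng = np.random.default_rng(0) if rng is None else rng
    fit = np.asarray(fit, float)
    return rng.choice(fit.size, p=fit / fit.sum())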
The search mode of the followed bee population is the key point that distinguishes the artificial bee colony from other evolutionary algorithms. Its essence is to preferentially select individuals to conduct the greedy search, which is the key factor behind the algorithm's fast convergence. At the same time, this search mode itself introduces some random information, and thus it does not reduce the population diversity to a large extent [8].
(iv) Scouter search
After the combined search by the leading bee population and the followed bee population, a new population of the same size as the initial population is formed. In order to avoid the excessive loss of population diversity as the population evolves, the artificial bee colony simulates the biological behavior of the scouter searching for potential honey sources and puts forward a specific scouter search mode. If a certain individual does not change for "limit" consecutive generations, the corresponding individual transfers into a scouter, generates a new individual by searching according to formula (1), and makes a one-to-one comparison with the original individual according to formula (3) so as to preferentially retain the individual with better fitness.
Through the above searches of the leading bee population, the followed bee population and the scouters, the population evolves to the next generation, and the cycle repeats until the algorithm iteration number $t$ reaches the preset maximum iteration number $G$ or the population's optimal solution reaches the preset error accuracy.
In order to further understand the principle of the ABC algorithm, Figure 3 shows its operation flow chart.
4. Back-Propagation Network (BP Network)
The Back-Propagation Network (BP network) is a multilayer network that generalizes the W-H (Widrow-Hoff) learning rule and conducts weight training for nonlinear differentiable functions. The adjustment of the weights adopts the back-propagation learning algorithm. It is a kind of multilayer feed-forward neural network whose neuron transformation function is the S (sigmoid) function; its output is a continuous quantity between 0 and 1, and it can achieve any nonlinear mapping from input to output.
4.1. BP Network Features
(i) The input and output are parallel analog quantities.
(ii) The input-output relationship of the network is determined by the weight factors connecting each layer, and there is no fixed algorithm for it.
(iii) The weight factors are adjusted by learning from signals; the more the network learns, the smarter it becomes.
(iv) The more hidden layers there are, the higher the network output precision is, and the damage of some individual weight factors will not exert a large impact on the network output. Only when the output of the network needs to be limited, for example between 0 and 1, should an S-type activation function be used in the output layer. In general, the S-type activation function is usually adopted in the hidden layer, while the output layer adopts a linear activation function [9],[10].
4.2. Introduction to the Multilayer BP Network
A multilayer BP network is a multilayer neural network with three or more layers, and each layer is composed of a number of neurons, as shown in Figure 4. Each neuron achieves full connection with the neurons of its left and right layers; namely, each neuron in the
left layer is connected to each neuron in the right layer, but there is no connection between the upper and lower neurons within a layer [11].
Figure 4. Multilayer BP network
The BP network conducts its training according to the supervised ("with teacher") learning mode. When a pair of learning patterns is offered to the network, the activation values of its neurons spread from the input layer through the middle layer to the output layer, and each neuron in the output layer yields the network response to the input pattern. Then, according to the principle of reducing the error between the expected output and the actual output, the error passes back from the output layer through the middle layer and finally returns to the input layer, correcting the connection weights layer by layer. Because this correction process is conducted from the output level to the input level, it is called the "error back propagation algorithm". As this kind of error back-propagation training proceeds, the response accuracy of the network to the input patterns is unceasingly enhanced [12].
Because the BP network has a hidden layer in the middle position, and there are corresponding learning rules to follow when training such a network, it can be made able to identify nonlinear models.
4.3. Three-layer BP Network
In order to endow the BP network with a certain function to complete a task, the interlayer connection weights and node threshold values must be adjusted to ensure that the errors between the actual and expected outputs of all samples stabilize within a small value. In the process of training the BP network, the error back-propagation algorithm is one of the most effective and commonly used methods [13]. Figure 5 shows the three-layer BP network structure chart.
The learning process of the BP network mainly has the following four parts, sketched in code after the list:
Input pattern forward propagation (the input pattern calculation is spread from the input layer through the middle layer to the output layer).
Output error back propagation (the output error is spread from the output layer through the middle layer back to the input layer).
Cyclic memory training (the calculation processes of pattern forward propagation and error back propagation are repeated alternately).
Learning result judgment (determine whether the global error is close enough to the minimum).
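As referenced above, the following is a minimal sketch of one training cycle for a three-layer network, assuming a sigmoid middle layer and a linear output layer as described in Section 4.1; the learning rate and the weight matrices W1, b1, W2, b2 are illustrative assumptions, not quantities specified in the paper.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bp_train_step(x, d, W1, b1, W2, b2, lr=0.1):
    # (1) input pattern forward propagation: input -> middle -> output
    h = sigmoid(W1 @ x + b1)                  # middle layer, S-type activation
    y = W2 @ h + b2                           # output layer, linear activation

    # (2) output error back propagation: output -> middle -> input
    e_out = y - d                             # output-layer error
    e_hid = (W2.T @ e_out) * h * (1.0 - h)    # error passed back to the middle layer

    # (3) correct the connection weights layer by layer
    W2 -= lr * np.outer(e_out, h)
    b2 -= lr * e_out
    W1 -= lr * np.outer(e_hid, x)
    b1 -= lr * e_hid

    # (4) global error used for the learning-result judgment
    return 0.5 * float(e_out @ e_out)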
Figure 5. Three-layer BP network structure chart (input layer, middle layer and output layer; inputs $R_{i1}, R_{i2}, \ldots, R_{im}$ and outputs $O_1, O_2, \ldots, O_m$)
5. Artificial Bee Colony Optimized BP Neural Network and Training
The artificial bee colony algorithm does not need the derivative but only the value of the objective function, has excellent optimization effects, and places low requirements on software and hardware; therefore, it has obvious advantages when used to train the BP neural network. The artificial bee colony algorithm and the BP neural network are integrated to form ABC-BPNN.
5.1. Pixel Pollution Judgment Based on ABC-BPNN
Firstly, judge whether the image pixels have been polluted by impulse noise, and divide the pixels into two types, polluted and un-polluted, according to the judgment result. Preprocess the data before it enters ABC-BPNN, including median filtering, eigenvalue extraction and normalization. After the normalized data enters ABC-BPNN, train it with the training method indicated in Figure 6. Since ABC-BPNN is used to classify the polluted and unpolluted pixels, its expected output is defined by the following formula:
$$S(i,j) = \begin{cases} 1, & g(i,j) - f(i,j) \ne 0 \\ 0, & g(i,j) - f(i,j) = 0 \end{cases}$$ (5)
In this formula, $g$ stands for the original, unpolluted image while $f$ represents the image polluted by the impulse noise. If pixel $(i, j)$ has been polluted, namely $g(i,j) - f(i,j) \ne 0$, then the expected output of ABC-BPNN is 1; otherwise, it is 0. Accordingly, if the output of the trained ABC-BPNN is close to 1, the corresponding pixel may have been polluted; if it is close to 0, the pixel may not be polluted.
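For training, the expected outputs of formula (5) can be produced directly from a clean/noisy image pair, as in the short sketch below; this is illustrative only, and the exact feature extraction (median filter value and eigenvalue) used by the paper is not reproduced here.

def expected_output(g, f):
    # Formula (5): label a pixel 1 where the clean image g and the noisy image f differ
    return (g != f).astype(float)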
5.2. ABC-BPNN Training Process
The number of neurons in the BP neural network has a relatively significant impact on the approximation accuracy: the more neurons there are, the higher the approximation accuracy is. In practical application, however, more neurons are not always better, since the computational complexity needs to be taken into consideration. Therefore, for the BP neural network, dynamically adjusting the number of neurons greatly increases its application scope and achieves the optimal cost performance under a given condition.
In order to adjust the network structure automatically during training, the training of ABC-BPNN adds an update module for the neural network structure, which together with the update module for the neural network coefficients constitutes the "dual cycle" training process of ABC-BPNN. The neural network structure update mainly refers to the increase of neurons, while the neural network coefficients are mainly updated by the bee colony algorithm. The following is a further description of ABC-BPNN training:
(i) Abstract and define the individual attributes of the bee colony.
Every neuron in the neural network has 4 variables, so there are 4K coefficients to be determined in a BP neural network with K neurons, and every coefficient corresponds to one dimension of the search space. In other words, the search space of the bee colony is 4K-dimensional.
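A sketch of how a 4K-dimensional bee colony individual could be split back into per-neuron coefficients. The paper does not spell out which four variables each neuron carries, so the grouping below is purely an assumption for illustration.

import numpy as np

def decode_individual(theta, K):
    # Split a 4K-dimensional individual into four length-K coefficient groups,
    # one group of four values per neuron; which network quantity each group
    # represents is an assumption made only for this illustration.
    theta = np.asarray(theta, float).reshape(4, K)
    return theta[0], theta[1], theta[2], theta[3]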
(ii) Initialize the bee colony.
Initialize the bee colony before the training. Additionally, re-initialize the bee colony whenever new neurons are added, since the dimensionality of the search space has changed. In the latter case, if the number of neurons increases from K to K+1, the individual dimensionality of the bee colony also increases from 4K to 4K+4. A random assignment is adopted in the initialization.
(iii) Calculate the neural network output.
Calculate the output value according to formula (6).
$$o = net = \sum_{i=1}^{q} w_i^k y_i + a^k = \sum_{i=1}^{q} w_i^k \Big( \sum_{j=1}^{M} w_{ij} x_j + a_i \Big) + a^k$$ (6)
(iv) Training stop conditions.
There are two conditions: a) the maximum number of iterations is reached; b) the expected output error is reached. The iteration stops if either of the two conditions is satisfied.
(v) Adjust the network structure.
Judge whether adjustments need to be made to the network structure. Increasing the neurons requires the preset neural network adjustment rules to be investigated in advance. The number of neurons is increased if the rules are satisfied, and the adjustment rules are as follows:
a) The number of initial neurons is 1.
b) Set the error threshold for accepting the existing structure and the maximum tolerance generation number $G_p$ for achieving this threshold.
Structure adjustment rule: accept the current neuron setting if, after $G_p$ generations of evolution since the initialization, the average error over approximately $G_p$ generations is smaller than the threshold; otherwise, increase the number of neurons.
Actually, the error threshold gives the expected convergence accuracy of ABC-BPNN in the current structure, while $G_p$ gives the expected convergence speed. If the current structure fails to converge at the set accuracy and speed, the only option is to adjust the ABC-BPNN structure and increase the neurons.
(vi) Implement the update of the bee colony individuals according to the established artificial bee colony optimization.
The ABC-BPNN training process is shown in Figure 6.
Figure 6. ABC-BPNN training process
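Putting steps (i)–(vi) together, the "dual cycle" can be sketched as an outer loop that grows the network and an inner loop in which the bee colony optimizes the 4K coefficients. This is an illustrative outline, not the paper's implementation: it reuses the hypothetical abc_minimize sketch from Section 3, train_error is assumed to be a user-supplied function that decodes a coefficient vector into the network and returns its average training error, and the coefficient bounds are assumptions.

import numpy as np

def train_abc_bpnn(train_error, K_max=20, Gp=30, err_threshold=1e-3):
    # Dual cycle: the bee colony tunes the 4K coefficients (inner cycle),
    # and neurons are added while the accuracy/speed targets are not met (outer cycle).
    K = 1                                          # rule (a): start with one neuron
    best = None
    while K <= K_max:
        dim = 4 * K                                # step (i): 4 coefficients per neuron
        X_L, X_U = -np.ones(dim), np.ones(dim)     # assumed coefficient bounds
        # steps (ii), (iii), (iv), (vi): initialize and update individuals with the
        # bee colony; the objective is the network's training error on the given data
        best = abc_minimize(train_error, X_L, X_U, G=Gp)
        if train_error(best) < err_threshold:      # structure accepted
            return best, K
        K += 1                                     # step (v): otherwise add a neuron
    return best, K_max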
6. Simulation and Experimental Analysis
In order to test the performance of the algorithm proposed in this paper, the common 256*256 original grayscale image Cameraman is taken as an example. In the Matlab test environment, Gaussian noise with $\mu = 0$, $\sigma^2 = 0.02$ is added, and a comparison is made with traditional denoising methods such as mean filtering, global threshold filtering and median filtering. The experimental result is shown in Figure 7.
It can be seen from Figure 7 that, after adding intense Gaussian noise to the original Cameraman image, the visual effect of the image filtered by the method of this paper is significantly improved: the edges and image details are clearer and the image quality is improved, showing prominent advantages compared with the traditional Gaussian noise filtering methods. Thus, compared with the traditional algorithms and the global threshold algorithm, the algorithm adopted in this paper has great advantages; the SNR of the image is increased to a large extent and the quality of the image is greatly improved.
(a) Original image; (b) Noise-added image; (c) Mean filtering; (d) Global threshold filtering; (e) Median filtering; (f) This algorithm
Figure 7. The denoising effect comparison on the Cameraman image
7. Conclusion
In order to improve the image quality and meet the requirements of subsequent higher-level processing, image denoising has become important work in the pre-processing of images. The conventional image denoising methods tend to blur the image edges, while the methods that maintain and enhance the image edges tend to degrade the denoising effect.
This paper, with its focus on image denoising, has realized the image denoising method based on artificial bee colony and BP neural network (ABC-BPNN). By comparing it with other traditional denoising methods through simulation experiments, better image quality is restored, which has verified the performance of the algorithm proposed in this paper.
References
[1] Hamid AJ, Rabha WI. Fractional Alexander Polynomials for Image Denoising. Signal Processing. 2015; 107(2): 340-354.
[2] PR Hill, AM Achim, DR Bull, ME Al-Mualla. Dual-tree Complex Wavelet Coefficient Magnitude Modelling using the Bivariate Cauchy–Rayleigh Distribution for Image Denoising. Signal Processing. 2014; 105(12): 464-472.
[3] Fatma L. A Novel Approach to Speckle Noise Filtering Based on Artificial Bee Colony Algorithm: An Ultrasound Image Application. Computer Methods and Programs in Biomedicine. 2013; 111(3): 561-569.
[4] Hanshan L, Zhiyong L. Research on Infrared Special Facula View Measurement Method Based on Image Processing Technology. TELKOMNIKA Indonesian Journal of Electrical Engineering. 2012; 10(6): 1422-1429.
[5] Hamid AJ, Rabha WI. Fractional Alexander Polynomials for Image Denoising. Signal Processing. 2015; 107(2): 340-354.
[6] PR Hill, AM Achim, DR Bull, ME Al-Mualla. Dual-tree Complex Wavelet Coefficient Magnitude Modelling using the Bivariate Cauchy–Rayleigh Distribution for Image Denoising. Signal Processing. 2014; 105(12): 464-472.
[7] Fatma L. A Novel Approach to Speckle Noise Filtering Based on Artificial Bee Colony Algorithm: An Ultrasound Image Application. Computer Methods and Programs in Biomedicine. 2013; 111(3): 561-569.
[8] Kazim H, M Fatih T. Segmentation of SAR Images using Improved Artificial Bee Colony Algorithm and Neutrosophic Set. Applied Soft Computing. 2014; 21(8): 433-443.
[9] G Rosline N, S Maruthuperumal. Normalized Image Watermarking Scheme Using Chaotic System. International Journal of Information and Network Security (IJINS). 2012; 1(4): 255-264.
[10] Bhargav V, Biswarup D, Rudra PM. An Improved Scheme for Identifying Fault Zone in a Series Compensated Transmission Line using Undecimated Wavelet Transform and Chebyshev Neural Network. International Journal of Electrical Power & Energy Systems. 2014; 63(12): 760-768.
[11] Zhen L, Bingang X, Zheru C, Dagan F. Intelligent Characterization and Evaluation of Yarn Surface Appearance using Saliency Map Analysis, Wavelet Transform and Fuzzy ARTMAP Neural Network. Expert Systems with Applications. 2012; 39(4): 4201-4212.
[12] Chia-Nan K. Identification of Nonlinear Systems with Outliers using Wavelet Neural Networks Based on Annealing Dynamical Learning Algorithm. Engineering Applications of Artificial Intelligence. 2012; 25(3): 533-543.
[13] Alexander AF, Dusan H, Pavel YP, Vaclav S. New BFA Method Based on Attractor Neural Network and Likelihood Maximization. Neurocomputing. 2014; 132(20): 14-29.