TELKOMNIKA Indonesian Journal of Electrical Engineering
Vol. 12, No. 4, April 2014, pp. 2890 ~ 2897
DOI: http://dx.doi.org/10.11591/telkomnika.v12i4.4244
Received August 6, 2013; Revised November 1, 2013; Accepted November 22, 2013
A Research on the Application of Quantum Neural Network Optimization
Su Heng-yang
Experimental Training Management Center, Guangdong Industry Technical College, Guangzhou 510300, China
email: shy_lovey@163.com
Abstract
Fixed working hours tables are characterized by a large amount of input data and an uncertain number of input parameters. These characteristics mean that the encoding method has a great influence on the design of a traditional BP network combined with a genetic algorithm. This paper analyzes the search abilities of the traditional BP network and the genetic algorithm, and on that basis combines quantum computation with the neural model to compose quantum neurons. The quantum neurons are then expanded into a quantum neural network that replaces the traditional neural network. To overcome the drawbacks of the traditional genetic algorithm, the paper adopts a variation clamping mechanism. The mechanism gradually narrows the genetic operation space by fixing insensitive single gene loci in the population, so that the gene loci that do not yet meet the requirements are more likely to participate in crossover and mutation. This accelerates genetic algorithm optimization and prevents it from falling into local extreme values. Finally, on a fixed table of mechanical standard working hours, the improved algorithm shows better performance than a variety of commonly used methods.

Keywords: quantum neural network, improved genetic algorithm, fixed working hours table

Copyright © 2014 Institute of Advanced Engineering and Science. All rights reserved.
1. Introduction
Fixed working hours management is an important foundation for enterprise management and enterprise project management, and it is also an important basis for economic accounting, production schedule control, cost control and product pricing. The quality of fixed working hours formulation not only directly affects labor hours, equipment utilization, production cycles and the labor remuneration of employees, but also promotes the increase of labor productivity [1, 2]. Commonly used methods for calculating fixed working hours are the table method, the empirical estimate method, the analogy method, the learning curve method, etc. In addition, artificial intelligence has also been introduced in order to obtain better results. Currently, only a little work has been done in our country on the calculation of fixed working hours with neural networks, for example by Shujuan at Xi'an University of Technology [3], Zhong Hongcai at Shanghai Jiaotong University [4-7], Zhu Lixin at Northwestern Polytechnical University, Liu Shuhong at Jiangnan University, and Zhu Qiaoqiao at Southeast University. The results they achieved prove that neural networks are feasible for the calculation of fixed working hours. They use BP neural networks, which have shortcomings in convergence and easily fall into local minima, to train fixed working hours. Even if the neural network structure is reasonable, there is no guarantee of quickly reaching the optimal value.
The biological-evolution-based genetic algorithm can obtain the global optimal solution with larger probability, and has strong robustness, adaptability and high parallelism, so it is widely used in optimization problems. Genetic algorithms (GA) have parallel global search ability and the BP neural network (BP) has local search ability, so the combination of the two can mutually compensate for their deficiencies [8, 9].
In general, a task time table has a large quantity of data and the input parameters are not fixed. These characteristics lead to BP networks with different structures, so when applying GA to optimize BP weights and threshold values, the encoding length varies over a wide range, it is difficult to encode the initial group, and the operational efficiency of population initialization is low. How to choose an appropriate encoding method for GA-BP optimization, how to quickly and accurately train the fixed working hours table, and how to establish a model that meets the requirements are therefore three problems that need to be solved when training fixed hour standards. This paper uses fixed working hours tables as the training samples, optimizes each of them separately, and analyzes the optimization effects in order to improve the training effect of the genetic neural network on fixed working hours quota tables [10-12].
2. Quantum BP Neural Network Model
Quantum neural networks are neural network models that combine quantum computation with the traditional BP artificial neural network [13-15]. The model used here belongs to the class of quantum neural network models based on quantum gates. The quantum BP neural network model conveys information through changes in the angle of quantum states and achieves quantum phase operations by shifting and rotating general quantum gates, which is different from traditional neural networks. Quantum gates are the basis of quantum computing, and the quantum BP neural network mainly uses universal quantum gates instead of activation functions to operate on the input vectors.
The formulation of quantum bits is as follows. For a quantum state |φ⟩, we have:
$$|\varphi\rangle = \alpha|0\rangle + \beta|1\rangle \qquad (1)$$

where $\alpha$ and $\beta$ satisfy the normalization condition $|\alpha|^2 + |\beta|^2 = 1$.
The above quantum state can equivalently be expressed in complex form:

$$f(\theta) = e^{i\theta} = \cos\theta + i\sin\theta \qquad (2)$$
Comparing the two equations, we see that |0⟩ corresponds to the cosine term and |1⟩ corresponds to the sine term, i.e., the imaginary part, while θ indicates the phase angle.
According to the above complex representation of quantum states, the one-bit phase gate and the two-bit controlled NOT gate can be expressed as follows.

One-bit phase gate:

$$f(\theta_1)\,f(\theta_2) = e^{i(\theta_1 + \theta_2)} \qquad (3)$$
Two-bit controlled NOT gate:

$$C(\gamma)\,f(\theta) = f\!\left(\frac{\pi}{2}\gamma - \theta\right) =
\begin{cases}
\sin\theta + i\cos\theta, & \gamma = 1 \\
\cos\theta - i\sin\theta, & \gamma = 0
\end{cases} \qquad (4)$$
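As a quick numerical check of the phase-gate algebra in Equations (2) and (3), ordinary complex numbers suffice; the angles below are arbitrary illustrations.

```python
import numpy as np

def f(theta):
    # one-bit phase gate representation of Eq. (2): f(theta) = e^{i*theta}
    return np.cos(theta) + 1j * np.sin(theta)

t1, t2 = 0.4, 1.1                                           # arbitrary phase angles
print(np.isclose(f(t1) * f(t2), np.exp(1j * (t1 + t2))))    # Eq. (3): phases add under composition
# the real part carries the |0> (cosine) component, the imaginary part the |1> (sine) component
```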
From the formulas above, the function of the one-bit phase shift gate is to shift the phase of the quantum state. For the control parameter γ of the two-bit controlled NOT gate, when γ = 1 the quantum state is rotated, while when γ = 0 the phase changes but the observed probability amplitude does not, so we regard the state as unchanged. When γ takes other values, the quantum state can be changed freely. According to the description above, the quantum neuron model composed of the one-bit phase shift gate and the two-bit controlled NOT gate can be expressed as follows:
Figure 1. Quantum Neuron Model
In Figure 1, $X = (x_1, x_2, \ldots, x_n)$ represents the angle inputs corresponding to the actual problem, $\theta_1, \theta_2, \ldots, \theta_n$ are the phase shifts, also called weights, $\lambda$ is the offset or threshold, $\delta$ is the control factor for the angle, $\arg(u)$ represents the phase of $u$, i.e. $\arg(u) = \arctan\big(\mathrm{Im}(u)/\mathrm{Re}(u)\big)$, and $z$ is the output of the quantum neuron. $g(x)$ is the sigmoid function. Suppose that $I_i$ is the $i$-th input of the quantum neuron; then the quantum neuron above can be expressed by the following formula:
$$u = \sum_{i=1}^{n} f(\theta_i)\, I_i - f(\lambda), \qquad
y = \frac{\pi}{2}\, g(\delta) - \arg(u), \qquad
O = f(y) \qquad (5)$$
The quantum neuron model is mainly adjusted through $\theta_i$, $\lambda$ and $\delta$. The processing of a quantum neuron contains three steps. First, the input quantum states are phase shifted. Then the offset angle is applied to correct the phase-shift result. Finally, the corrected result is processed by the controlled NOT gate to obtain the quantum neuron output.
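As an illustration only, the following Python sketch evaluates Equation (5) with ordinary complex arithmetic. The real-valued readout at the end and all numeric values are assumptions for demonstration, not parameters from the paper.

```python
import numpy as np

def f(theta):
    """Qubit phase representation of Eq. (2): f(theta) = cos(theta) + i*sin(theta)."""
    return np.exp(1j * theta)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def quantum_neuron(x, theta, lam, delta):
    """Quantum neuron of Eq. (5) and Figure 1.

    x     : input angles (the actual-problem inputs mapped to phases)
    theta : phase-shift weights; lam: offset/threshold; delta: angle control factor
    """
    u = np.sum(f(theta) * f(x)) - f(lam)             # steps 1-2: phase-shift inputs, apply offset
    y = 0.5 * np.pi * sigmoid(delta) - np.angle(u)   # controlled-NOT stage: (pi/2)*g(delta) - arg(u)
    z = f(y)                                         # complex output O = f(y)
    return z, np.imag(z) ** 2                        # sin^2(y) as a real-valued readout (assumed convention)

# arbitrary demonstration values
x = np.array([0.2, 0.6, 0.9]) * np.pi / 2
theta = np.array([0.3, -0.1, 0.5])
print(quantum_neuron(x, theta, lam=0.2, delta=0.0))
```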
3. Improved Genetic Algorithm
The improved genetic algorithm finds an approximately optimal solution through good gene bits. This is equivalent to decomposing a large individual into a large number of small individuals, collecting the good gene fragments that appear, and then combining them into new individuals with high fitness. When a gene segment meets the optimization accuracy, we fix it in order to avoid the damage caused by crossover and mutation in the following selections and to narrow the scope of optimization. Finally, we apply the method to a simple binary-sum optimization function; the resulting effect is far better than that of the simple genetic algorithm.
Assume that the size of the population is P, the individual length is L, crossover is uniform, the mutation rate is M, and the maximum number of generations is G. $x_i^k$ represents the $i$-th gene locus in the $k$-th individual, with $k \in [1, P]$ and $i \in [1, L]$. The evaluation function is $f_k$. In a simple genetic algorithm, as the number of generations increases, the optimization curve gradually levels off and ultimately reaches an extreme value, then stays unchanged. The mutation probability of a single gene locus of an individual is M, and the probability that the gene bit remains unchanged over n generations is $(1 - M)^n$.
These two observations indicate that, as the number of generations increases, the probability that a gene locus has mutated also increases. For the entire population, each individual changes along the direction favored by the fitness evaluation function (i.e. individuals with high fitness are retained and individuals with low fitness are eliminated). We can also consider that the sensitivity of individuals to the fitness function is decreasing, which is the reason why the optimization curve constantly flattens, a phenomenon called mutation relaxation.
This paper presents an improved method, called variation clamping, which improves the performance of genetic algorithm optimization by fixing loci whose values concentrate near the larger or smaller extreme. Each time a gene is fixed, the optimization space of the genetic algorithm changes, and the new space is likely to contain information that could not be observed in the old space. Each space optimization brings a certain improvement to the average fitness of the population. As new gene loci are constantly being fixed, the fitness of the population keeps rising.
The mechanism is adjusted by the following parameters: the mark parameter $bj \in (0, 0.5)$, the unmark parameter $unbj \in (bj, 0.5)$, and the mark time $t_{bj}$. For each locus the population average is computed:

$$t_i = \frac{1}{P}\sum_{k=1}^{P} x_i^k, \qquad i \in [1, L], \quad k \in [1, P] \qquad (6)$$

If in some generation $t_i \le bj$ or $t_i \ge 1 - bj$, the $i$-th gene segment of all individuals is marked. In the following evolution, if $unbj \le t_i \le 1 - unbj$, the $i$-th segment exits the marking process; otherwise the marking continues. If a gene segment has stayed marked for $t_{bj}$ generations, it is fixed and no longer participates in the genetic optimization operations; this is called variation clamping.
The initial mark vector is $Z = (z_1, z_2, \ldots, z_n) = (0, 0, \ldots, 0)$. If in the $(h-1)$-th generation $unbj \le t_i \le 1 - unbj$ and in the $h$-th generation $t_i \le bj$ or $t_i \ge 1 - bj$, then $z_i$ is increased by 1. If in each of the next $t_{bj}$ generations the condition $t_i \le bj$ or $t_i \ge 1 - bj$ still holds, the $i$-th gene segment of all individuals is fixed in the $(h + t_{bj})$-th generation. Otherwise, if any of the next $t_{bj}$ generations satisfies $unbj \le t_i \le 1 - unbj$, then $z_i = 0$, that is, the locus goes back to the original state.
Assume that the $t_1, t_2, \ldots, t_n$-th loci in the $m$-th generation meet the fixing requirement. Then the population of the $m$-th generation is:

$$x^k = \big(x_1^k, x_2^k, \ldots, x_{t_1}^k, \ldots, x_{t_2}^k, \ldots, x_{t_n}^k, \ldots, x_n^k\big), \qquad k = 1, 2, 3, \ldots, P \qquad (7)$$

and the population of the $(m+u)$-th generation ($u \in [1, G-m]$) is:

$$y^k = \big(y_1^k, y_2^k, \ldots, x_{t_1}^k, \ldots, x_{t_2}^k, \ldots, x_{t_n}^k, \ldots, y_n^k\big), \qquad k = 1, 2, 3, \ldots, P \qquad (8)$$

that is, the fixed loci keep the values they had when clamped, while the remaining loci continue to evolve. In the absence of mutation on the clamped loci, the gene loci that have been fixed stay strictly in the vicinity of the larger or smaller extreme value.
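To make the bookkeeping above concrete, here is a minimal Python sketch of the marking rules around Equation (6). The parameter names mirror bj, unbj and t_bj from the text; the default values in the constructor and the handling of loci between the two bands are illustrative assumptions, not settings from the paper.

```python
import numpy as np

class ClampTracker:
    """Variation-clamping bookkeeping for a binary population (illustrative sketch)."""

    def __init__(self, length, bj=0.1, unbj=0.3, t_bj=5):
        self.bj, self.unbj, self.t_bj = bj, unbj, t_bj
        self.z = np.zeros(length, dtype=int)      # mark counters z_i
        self.fixed = np.full(length, -1)          # -1 means the locus is still free

    def update(self, pop):
        """pop: (P, L) array of 0/1 genes; returns the indices of all loci fixed so far."""
        t = pop.mean(axis=0)                      # Eq. (6): t_i = (1/P) * sum_k x_i^k
        marked = (t <= self.bj) | (t >= 1 - self.bj)
        unmarked = (t >= self.unbj) & (t <= 1 - self.unbj)
        self.z = np.where(marked, self.z + 1, self.z)
        self.z = np.where(unmarked, 0, self.z)    # locus exits the marking process
        hit = (self.z >= self.t_bj) & (self.fixed < 0)
        self.fixed[hit] = np.round(t[hit])        # clamp to the dominant allele
        return np.flatnonzero(self.fixed >= 0)

# usage: after each generation, overwrite the clamped loci and exclude them from
# crossover and mutation
# tracker = ClampTracker(length=100)
# clamped = tracker.update(pop)
# pop[:, clamped] = tracker.fixed[clamped]
```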
The operating steps of the improved genetic algorithm are as follows:
1) Representation of the problem: encode the required problem and define the objective function (fitness function).
2) Initialize the population parameters, including the population size, the crossover probability and the mutation probability, and randomly generate an initial population.
3) Repeat the following operations on the chromosomes in the population until the termination condition is met.
a. Calculate the population fitness value for each gene locus.
b. If the fitness value of a gene locus satisfies the set threshold value (i.e. the optimum identified above), mark it and take it out of the genetic manipulation, so that the gene locus is retained.
c. Retain the optimal value: the resulting optimal gene segments are combined into a new individual, which replaces the individual with the worst fitness value in the generation.
d. For the gene loci that meet the fixing requirements, all genes in that position of the individuals in the population are replaced by the fixed gene, and they no longer take part in the crossover and mutation operations.
e. According to the size of the fitness values, select the crossover probability and mutation probability, then cross over and mutate the chromosomes to produce a new generation.
A simple example is selected to illustrate that the algorithm is superior to the simple genetic algorithm. Consider the binary sum function:

$$y = x_1 + x_2 + \cdots + x_n, \qquad x_i \in \{0, 1\}, \quad n = 10000 \qquad (9)$$

The selected parameters are as follows: the individual length is L = 10000, the population size is P = 500, the maximum number of generations is MaxG = 600, the chromosomes are encoded in binary, crossover is uniform, and the mutation probability is 0.003.
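Under the stated setup (at a reduced problem size so it runs quickly), a variation-clamping GA for the binary sum function might be sketched as follows. The selection scheme, the mark threshold and the streak-based clamping rule are simplified assumptions rather than the paper's exact settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# scaled-down version of the experiment around Eq. (9); the paper uses
# L = 10000, P = 500, MaxG = 600 and mutation probability 0.003
L, P, MAX_G, PM = 200, 50, 300, 0.003
BJ, T_BJ = 0.1, 5                     # assumed mark threshold and mark time

pop = rng.integers(0, 2, size=(P, L))
fixed = np.full(L, -1)                # -1: free locus, otherwise the clamped allele
streak = np.zeros(L, dtype=int)       # consecutive generations a locus stayed marked

def fitness(pop):
    return pop.sum(axis=1)            # binary sum function, Eq. (9)

for g in range(MAX_G):
    fit = fitness(pop)
    # pairwise tournament selection (assumed; the paper does not specify selection)
    a, b = rng.integers(0, P, P), rng.integers(0, P, P)
    parents = pop[np.where(fit[a] >= fit[b], a, b)]
    # uniform crossover with a shuffled mate, then bit-flip mutation
    mates = parents[rng.permutation(P)]
    mask = rng.random((P, L)) < 0.5
    children = np.where(mask, parents, mates)
    children = children ^ (rng.random((P, L)) < PM)
    # variation clamping: freeze loci whose population average stays extreme
    t = children.mean(axis=0)
    marked = (t <= BJ) | (t >= 1 - BJ)
    streak = np.where(marked, streak + 1, 0)
    newly = (streak >= T_BJ) & (fixed < 0)
    fixed[newly] = np.round(t[newly])
    clamped = fixed >= 0
    children[:, clamped] = fixed[clamped]   # clamped loci skip further crossover/mutation
    pop = children

print("best fitness:", fitness(pop).max(), "out of", L)
```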
Figure 2. SGA Optimization Results
Figure 3. Optimization Results of the Variation Clamping Genetic Algorithm
Figure 2 shows the optimization curve of the simple genetic algorithm. As can be seen from Figure 2, the convergence curve gradually flattens as the number of generations increases, so it is difficult to reach the maximum. Figure 3 shows the optimization curve of the improved genetic algorithm, which maintains a rapid convergence speed and eventually converges to the maximum. From these simulation experiments we conclude that the variation clamping genetic algorithm can effectively improve the optimization rate of the genetic algorithm and prevent it from falling into a local extremum. Later in this paper, we apply the variation clamping genetic neural network algorithm to the optimization of the initial weights and thresholds, also with good results.
4. Simulation Experiment and Analysis
Three production quota tables are taken as the training samples to test the effect of the three coding methods compared below. Table 1 lists part of the data in the tables, in which T is the standard man-hour quota (the output) and the remaining parameters are the inputs. The meaning of each symbol is: D - outside diameter, L - length, S - across-flats distance, B - width, t - depth.
Table 1. Standard Task Time List

End mill square comprehensive time schedule
  S/mm    16     16     20     20     24     24     30
  T/min   3.5    3.9    3.8    4.3    5.2    6.5    85
  D/mm    22.5   22.5   26.5   26.5   33.9   33.9   42.4
  L/mm    10     17     13     22     28     47     60

Cutter groove comprehensive time schedule
  D/mm    63     63     63     63     80     80     80
  t/mm    3      6      10     7      4      6      8
  B/mm    6      6      10     10     8      12     16
  T/min   3      3.2    3.7    3.9    4      5.1    6.5
  L/mm    20     26     34     45     59     78     102
From the task time tables it can be seen that the numbers of inputs are respectively 3 and 4, and the output is 1. According to the Kolmogorov theorem, the numbers of hidden layer nodes are determined to be 7 and 9, so BP models with structures 3-7-1 and 4-9-1 are established. GA is adopted to optimize the BP initial weights and threshold values; the optimization results are assigned to BP, and the training is then carried out again. The program flow chart is shown in Figure 4.
Figure 4. Program Flow Chart
According to the structure of the BP network, the total number of weights and thresholds is determined, and each parameter is encoded with eight binary bits. The BP structures being trained are 3-7-1 and 4-9-1, whose numbers of parameters are 36 and 55; after encoding there are 36 * 8 = 288 and 55 * 8 = 440 binary digits. The genetic algorithm (GA) adopts a standard genetic algorithm. The numbers of chromosome genes under variation clamping coding are 36 and 55. For the real-coded genetic algorithm, the mutation operation is the key. In this paper, the following scheme is adopted to implement mutation: x is the gene position, high and low are the upper and lower limits of x, and rand is a random number between 0 and 1. Since it is very unlikely that rand produces the same random number twice, x can be used directly as the mutated value.
$$x = \mathrm{rand} \times (\mathrm{high} - \mathrm{low}) + \mathrm{low} \qquad (10)$$
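As a sketch of the arithmetic above and of the mutation scheme in Equation (10): the helper below counts BP weights and thresholds for the stated structures, and mutate_gene draws the replacement value; the weight bounds in the example call are placeholders, not values from the paper.

```python
import numpy as np

def bp_param_count(n_in, n_hidden, n_out=1):
    """Weights plus thresholds of a single-hidden-layer BP network."""
    return n_in * n_hidden + n_hidden + n_hidden * n_out + n_out

print(bp_param_count(3, 7), bp_param_count(4, 9))            # 36 and 55 parameters
print(bp_param_count(3, 7) * 8, bp_param_count(4, 9) * 8)    # 288 and 440 bits at 8 bits each

def mutate_gene(low, high, rng=np.random.default_rng()):
    """Real-coded mutation of Eq. (10): x = rand * (high - low) + low."""
    return rng.random() * (high - low) + low

print(mutate_gene(-1.0, 1.0))    # placeholder bounds for a network weight
```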
Because there is no direct conversion algorithm between real numbers and Gray code, in Gray coding the real number is first transformed into a 01 code and then the 01 code is transformed into Gray code; the reverse sequence is followed in decoding. The other GA operations are the same as with the 01 code.
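The real → 01 → Gray pipeline described above might look like the sketch below; the 8-bit uniform quantization is an assumption chosen to match the eight-bits-per-parameter encoding used earlier.

```python
def real_to_level(x, low, high, bits=8):
    """Quantize a real parameter to an unsigned integer code (assumed scheme)."""
    return round((x - low) / (high - low) * (2 ** bits - 1))

def level_to_real(level, low, high, bits=8):
    return low + level / (2 ** bits - 1) * (high - low)

def binary_to_gray(n):
    return n ^ (n >> 1)

def gray_to_binary(g):
    n = 0
    while g:            # XOR of all right shifts recovers the binary code
        n ^= g
        g >>= 1
    return n

# encode: real -> 01 code -> Gray code; decoding follows the reverse sequence
level = real_to_level(0.37, low=-1.0, high=1.0)
gray = binary_to_gray(level)
decoded = level_to_real(gray_to_binary(gray), low=-1.0, high=1.0)
print(format(level, "08b"), format(gray, "08b"), round(decoded, 3))
```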
The genetic algorithms with the above three codings all adopt Formula (11) as the fitness function, in which E is the mean square error obtained in the neural network pre-training of each chromosome.
$$\mathrm{fitness} = 1/(1 + E) \qquad (11)$$
The above three coding modes are used to optimize the BP parameters. The GA parameters are selected as follows: population size 80, iteration upper limit 2000, mutation probability 0.01, and crossover probability 0.65.
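As a hedged sketch of how one chromosome is scored under Equation (11): the decoded parameters are loaded into a 3-7-1 network, a forward pass on the training sample gives the mean square error E, and the fitness is 1/(1+E). The sigmoid activation and the random placeholder data are assumptions for illustration, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bp_forward(params, X, n_in=3, n_hidden=7):
    """Unpack a 36-element chromosome into 3-7-1 weights/thresholds and run a forward pass."""
    i = 0
    W1 = params[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = params[i:i + n_hidden]; i += n_hidden
    W2 = params[i:i + n_hidden].reshape(n_hidden, 1); i += n_hidden
    b2 = params[i:i + 1]
    h = sigmoid(X @ W1 + b1)
    return h @ W2 + b2

def fitness(params, X, T):
    E = np.mean((bp_forward(params, X).ravel() - T) ** 2)   # pre-training mean square error
    return 1.0 / (1.0 + E)                                  # Eq. (11)

# illustrative call on made-up normalized samples (placeholder data, not Table 1)
X = rng.random((7, 3)); T = rng.random(7)
chromosome = rng.uniform(-1, 1, size=36)
print(fitness(chromosome, X, T))
```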
The running times and the largest fitness values obtained by the three methods after optimization are shown in Table 2. The optimized parameter values are assigned to BP for training. With the number of training epochs set to 5000, Table 3 lists the training accuracy, the number of training steps, and the maximum and minimum relative generalization errors. It can be seen that the relative errors of the training results of all three coding methods meet the 5% requirement.
Table 2. Genetic Algorithm Training Correlation Table

                                                      01 coding      Gray code      Variation clamping coding
End mill square comprehensive time schedule
  Program time-consuming (s)                          359.391        1.7256E+03     318.015
  Maximum fitness value                               9.9854E-01     8.6617E-01     0.99966
Cutter groove comprehensive time schedule
  Program time-consuming (s)                          809.94         3.5624E+03     718.00
  Maximum fitness value                               9.9865E-01     9.9492E-01     9.9875E-01
Table 3. Neural Network Training Contrast Table

                                                      Training accuracy   Training steps   Maximum relative error   Minimum relative error
End mill square comprehensive time schedule
  01 coding                                           1.00E-04            5                4.73E-02                 6.9633E-04
  Gray code                                           1.00E-04            5                3.63E-02                 3.6905E-04
  Variation clamping coding                           1.00E-04            5                3.61E-02                 1.20E-03
Cutter groove comprehensive time schedule
  01 coding                                           1.00E-05            5000             4.95E-02                 6.7488E-06
  Gray code                                           1.00E-04            87               4.96E-02                 1.0548E-04
  Variation clamping coding                           1.00E-05            58               4.97E-02                 1.2711E-04
From the data in the tables above, it can be seen that the final optimization results of the three coding methods are close to each other. The variation clamping method has the shortest encoding length and the least coding time: each bit of the variation clamping coding represents one parameter and does not need decoding, so the result is directly the optimized solution. In contrast, the per-parameter coding length of the 01 code and the Gray code is determined by the required parameter accuracy; the 01 coding adds a decoding step from binary to real numbers, and the Gray code must first be decoded to binary and then from binary to real numbers. In addition, it can be calculated from the fitness values that, after genetic algorithm optimization, the obtained neural network parameters bring the training sample error down to the order of 10^-4. If the optimized parameters are loaded into the neural network, it can quickly reach the specified convergence precision.
5. Conclusion
This paper discusses the efficiency of different coding methods in training a genetic neural network on quota formulation tables, and uses an example to illustrate their advantages and disadvantages: the 01 code is easy to realize and its crossover and mutation operations are simple; variation clamping coding has a short length and low time consumption, but its genetic operators are a little more complicated; the Gray code overcomes the Hamming cliff shortcoming of the 01 coding but consumes a lot of time. Although after genetic algorithm optimization the neural network converges rapidly and the probability of reaching the optimal solution increases greatly, if the neural network structure is not reasonable or the parameter settings are improper, the neural network may still fall into a local minimum or fail to reach the specified generalization error. Therefore, besides the coding problem, the structure design and parameter setting of the neural network are also important problems that need to be solved in the future.
References
[1] Takeshi Hashimoto, Abdullah S Alaraimi, Chenggao Han. ICI Mitigation Schemes for Uncoded OFDM over Channels with Doppler Spreads and Frequency Offsets – Part II: Asymptotic Analysis. Journal of Communications. 2012; 7(9): 686-700.
[2] Xiao Hong, Cao Maojun, Li Panchi. Quantum-inspired Neural Network with Application to Image Restoration. AISS. 2013; 5(10): 1198-1207.
[3] Manju Mathew, A Benjamin Premkumar, Chiew-Tong Lau. Multi-user Interference and Cross Correlation Effects of Spline Multiwavelet based Cognitive Radio Network. Journal of Communications. 2012; 7(9): 701-711.
[4] Qi Ren, Jianhui Wu, Weijun Guan, Yun Li, Lihua Cui. Research on Happiness based on Analysis of Variance. IJACT. 2013; 5(5): 232-239.
[5] Gilbert Micallef, Louai Saker, Salah E Elayoubi, Hans-Otto Sceck. Realistic Energy Saving Potential of Sleep Mode for Existing and Future Mobile Networks. Journal of Communications. 2012; 7(10): 740-748.
[6] Peng Jiang, Bo Xu, Huali Cai, Yuexiang Yang. The Aircraft Life Cycle Cost Estimation based on Generalized Virtual Economy. JCIT. 2013; 8(5): 128-137.
[7] Masahiro Sasabe, Hirotaka Nakano. Perfect Cell Partitioning Scheme for Micro-Cellular Networks. Journal of Communications. 2012; 7(10): 749-757.
[8] Han Xiaodong. Fault Positioning Technology for Power Grid Based on Wavelet Neural Network. Bulletin of Science and Technology. 2013; 29(6): 59-61.
[9] Cho See Jung. Working-time Reductions and Changes in Employment: The Case of Korea. JCIT. 2013; 8(12): 429-435.
[10] Se-Hee Jung, Yunsuk Cha. Wellness in the Workplace: Health Promotion Programs. JCIT. 2013; 8(13): 440-446.
[11] Zhifeng Zhang, Su Wang, Xingang Miao. Realization of Robot-assisted Laser Scanning Measurement System based on Assembly Platform. JDCTA. 2013; 7(1): 153-158.
[12] Liangming Li, Ai-hua Yang, Hui Tang. The effects of GLUT4 and IL-6 on the development of insulin resistance inhibited by exercise. JDCTA. 2012; 6(21): 382-389.
[13] Hongjuan Zhang, Long Quan. The Kinetic Characteristics of the Clamping Unit in Injection Molding Machine Driven by New Pump Control System. AISS. 2012; 4(21): 468-475.
[14] Nagaraj Mudukpla Shadakshrappa. Optimum Generation Scheduling for Thermal Power Plants using Artificial Neural Network. International Journal of Electrical and Computer Engineering. 2011; 1(2): 134-139.
[15] YU Li-yang, WANG Neng, ZHANG Wei. Neural-network Based Aggregation Framework for Wireless Sensor Networks. Computer Science. 2008; 35(12): 43-37.