TELKOMNIKA Indonesian Journal of Electrical Engineering
Vol. 13, No. 1, January 2015, pp. 159 ~ 165
DOI: 10.11591/telkomnika.v13i1.6692

Received September 16, 2014; Revised November 14, 2014; Accepted December 12, 2014
Nonlinear Classifier Design Research Based on SVM and Genetic Algorithm
Wenjuan Zeng*, Haibo Gao
School of Information Science and Engineering, Hunan International Economics University,
Changsha, 410205, China
*Corresponding author, e-mail: matlab_bysj@126.com
Abstract

This paper presents a support vector machine (SVM) model structure together with a genetic algorithm model for the combinatorial optimization of its parameters, and applies the combination to nonlinear pattern recognition. The method is effective not only for linear problems but also for nonlinear ones; it is simple and easy to apply, and performs better than multi-segment linear classifier design methods and the error back-propagation (BP) network algorithm. Examples show a recognition rate of 100%.

Keywords: genetic algorithm, support vector machine, pattern recognition, nonlinear, combinatorial optimization
Copyright © 2015 Institute of Advanced Engineering and Science. All rights reserved.
1. Introduction

Support vector machines (SVMs) are the youngest and most useful part of statistical learning theory. Their core content was developed between 1992 and 1995 [5], mainly for pattern recognition problems. SVMs grew out of the optimal separating hyperplane for the linearly separable case: such a classification hyperplane must not only classify all training samples correctly, but also maximize the distance from the classification surface to the training samples closest to it. By maximizing this margin the complexity of the classifier is controlled, and good generalization ability is achieved. When the samples cannot be separated linearly, a nonlinear transformation maps the input space into a high-dimensional feature space, and the optimal linear classification surface is sought in this new space. To avoid complex nonlinear operations in the high-dimensional feature space, SVMs adopt the kernel function method: the inner product in the high-dimensional space is replaced by a kernel function evaluated in the original space.

The genetic algorithm (GA) was proposed in 1962 by Professor Holland of the University of Michigan. It is a parallel random search optimization method that simulates natural genetic mechanisms and biological evolution. Following the biological evolution principle of "survival of the fittest", the parameters to be optimized are encoded as a group of strings; according to the value of a chosen fitness function, individuals are screened through inheritance, crossover and mutation, so that individuals with high fitness are retained and replicated to form a new group. The new group inherits the information of the previous generation and is also better than it. This cycle continues, with the fitness of the individuals in the group ever increasing, until certain conditions are met. The algorithm is simple, allows parallel processing, and can obtain the global optimal solution.

Linear pattern recognition methods and their applications are mature. For nonlinear problems, researchers have probed approaches built from several linear segments; [1] is such a method. Although it can often achieve the purpose of separating the samples, the method has a complex structure, and the "surgery" needed to handle each problem is cumbersome.
2. Support Vector Machines for Classifier

Assume training data {x_i, y_i} (i = 1, …, l; x_i ∈ R^n; y_i ∈ {-1, +1}) that can be separated without error by a hyperplane (w·x) + b = 0. The separating hyperplane that separates the two classes with the largest distance will achieve the best generalization ability.
The optimal hyperplane is determined only by the small number of samples nearest to it (called support vectors) and has nothing to do with the other samples.
The separating hyperplane and the classification of the samples with margin Δ can be described in the following form:

(w·x) + b = 0,   ||w|| = 1                                            (1)

y = 1,  if (w·x) + b ≥ Δ                                              (2)

y = -1, if (w·x) + b ≤ -Δ                                             (3)
To form the SVM optimization problem, the separating hyperplane is normalized as follows: let Δ = 1 (w and b can be scaled accordingly). The sample points nearest to the hyperplane (the support vectors) then satisfy:

if y = 1:   (w·x_i) + b = 1                                           (4)

if y = -1:  (w·x_i) + b = -1                                          (5)
The distance from a support vector to the hyperplane is 1/||w||. Thus the mathematical optimization problem is formulated as:

min ½||w||²    s.t.  y_i((w·x_i) + b) - 1 ≥ 0   (i = 1, …, l)         (6)
According to quadratic programming theory, the problem can be transformed into its Wolfe dual to be solved. The Lagrange function is constructed as:

L(w, α, b) = ½||w||² - Σ_{i=1}^{l} α_i y_i((w·x_i) + b) + Σ_{i=1}^{l} α_i     (α_i ≥ 0; i = 1, 2, …, l)        (7)
where α_i is a Lagrange multiplier. According to optimization theory:

∂L(w, α, b)/∂w = 0                                                    (8)

∂L(w, α, b)/∂b = 0                                                    (9)

which give:

w = Σ_{i=1}^{l} α_i y_i x_i                                           (10)

Σ_{i=1}^{l} α_i y_i = 0                                               (11)
Substituting (10) and (11) back into the Lagrange function and eliminating w and b turns the original optimization problem into the Wolfe dual problem:

max W(α) = Σ_{i=1}^{l} α_i - ½ Σ_{i,j=1}^{l} α_i α_j y_i y_j (x_i · x_j)

s.t.  Σ_{i=1}^{l} α_i y_i = 0      (α_i ≥ 0; i = 1, 2, …, l)          (12)
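As an illustration, the dual problem (12) can be solved numerically with a general-purpose constrained optimizer. The following is a minimal sketch, not the paper's implementation; it assumes NumPy arrays X (samples by rows) and y (labels ±1), and a box constraint C (the same C that appears in Equation (13)) whose default value here is only illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def solve_linear_svm_dual(X, y, C=100.0):
    """Sketch: solve the Wolfe dual (12) of a linear SVM with a generic optimizer."""
    l = len(y)
    K = X @ X.T                                    # inner products (x_i . x_j)
    Q = (y[:, None] * y[None, :]) * K

    def neg_dual(a):                               # minimize -W(alpha)
        return 0.5 * a @ Q @ a - a.sum()

    cons = {"type": "eq", "fun": lambda a: a @ y}  # sum_i alpha_i y_i = 0, eq. (11)
    bounds = [(0.0, C)] * l                        # 0 <= alpha_i <= C
    alpha = minimize(neg_dual, np.zeros(l), bounds=bounds, constraints=cons).x

    w = (alpha * y) @ X                            # eq. (10)
    margin_sv = (alpha > 1e-6) & (alpha < C - 1e-6)
    b = float(np.mean(y[margin_sv] - X[margin_sv] @ w))   # eq. (13)
    return alpha, w, b
```

For large sample sets a dedicated QP or SMO solver would be preferred; the sketch only shows how the dual, the constraint (11), and the recovery of w and b fit together.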
The solution of this dual problem is the global optimal solution of the original optimization problem, and it can be found with a standard optimization algorithm. The parameter b is obtained from the Karush-Kuhn-Tucker conditions:

b = y_i - w^T x_i       (α_i ∈ (0, C))                                (13)
Then the optimal hyperplane is:

f(x) = (w·x) + b = Σ_{i=1}^{l} α_i y_i (x_i · x) + b                  (14)
For classification problems that cannot be separated linearly, the input x can be mapped through a nonlinear function Φ into a high-dimensional feature space Φ(x), and the
linear classification is then carried out in this new space. The end result is that a kernel function K(x_i, x) replaces (x_i · x) in Equation (14). The optimal separating surface is:

f(x) = Σ_{i=1}^{l} α_i y_i K(x_i, x) + b

and the discriminant function is:

f(x) = sgn{ Σ_{i=1}^{l} α_i y_i K(x_i, x) }                           (15)

The kernel function method plays an important role in support vector machines; any symmetric function satisfying the Mercer conditions can be chosen as a kernel.
3. The Basic Operation of the Genetic Algorithm

1) Copy (Reproduction Operator): selects the individuals with strong vitality from the old population and generates new populations of strings from them. Bit strings are copied according to their fitness values: a bit string whose fitness is above the mean is more likely to produce one or more children in the next generation. This mimics the natural phenomenon described by Darwin's theory of survival of the fittest. The copy operation can be implemented by a random method. In a computer program, uniformly distributed random numbers between 0 and 1 are generated first; if a string has a replication probability of 40%, it is copied whenever the generated random number falls between 0 and 0.4, and is otherwise eliminated.
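A minimal sketch of this fitness-proportional (roulette-wheel) reproduction, assuming a list of strings and precomputed positive fitness values:

```python
import numpy as np

def reproduce(population, fitness, rng=np.random.default_rng()):
    """Copy operator: resample the population with probability proportional to fitness."""
    p = np.asarray(fitness, dtype=float)
    p = p / p.sum()                                   # selection probability of each string
    idx = rng.choice(len(population), size=len(population), p=p)
    return [population[i] for i in idx]
```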
2) Crossover (Crossover Operator): the copy operation only chooses good individuals from the old population; it does not create new chromosomes. Crossover imitates the cross-breeding process of biological evolution: by exchanging parts of two chromosomes, new varieties are created. The process is as follows. Choose two chromosomes from the mating pool, randomly select one or more crossover positions, and exchange the parts of the parent chromosomes to the right of those points; this yields two new chromosome strings. The exchange reflects the idea of information exchange in nature. There are single-point crossover, multi-point crossover, uniform crossover, order crossover and cycle crossover. Single-point crossover, in which the chromosomes have one cut-off point, is the most basic method and is widely used.
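A minimal sketch of single-point crossover on two bit strings of equal length:

```python
import numpy as np

def single_point_crossover(parent1, parent2, rng=np.random.default_rng()):
    """Single-point crossover: swap everything after a random cut point."""
    point = int(rng.integers(1, len(parent1)))        # cut point between 1 and len-1
    return (parent1[:point] + parent2[point:],
            parent2[:point] + parent1[point:])
```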
3) Mutation (Mutation Operator): the mutation operation simulates accidental genetic mutation caused by various factors in the natural biological environment; it is a random genetic change applied with a very small probability. With only selection and crossover and no mutation, gene combinations cannot move outside the initial search space, so the evolution may converge to a local solution at an early stage and terminate, degrading the solution quality. To obtain an optimal solution of the highest possible quality, the mutation operation must be used.
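A minimal sketch of bit-flip mutation on a binary string; the default probability pm is only illustrative:

```python
import numpy as np

def mutate(bits, pm=0.01, rng=np.random.default_rng()):
    """Mutation operator: flip each bit independently with a small probability pm."""
    return "".join(("1" if b == "0" else "0") if rng.random() < pm else b for b in bits)
```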
4. Principle of Model Parameter Optimization Based on the Genetic Algorithm
4.1. The Parameters and Their Encoding

First determine the range of each parameter, which is generally given by the user, and then encode it according to the accuracy requirements. A binary string is selected to represent each parameter, and the relationship between the parameters and the strings is established. The binary strings are then connected to form one long binary string, which is the object that the genetic algorithm operates on.
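A minimal sketch of such binary encoding and decoding, assuming a user-given range [lo, hi] and a chosen bit length per parameter:

```python
def encode(value, lo, hi, n_bits):
    """Map a parameter value in [lo, hi] onto an n_bits binary string."""
    level = round((value - lo) / (hi - lo) * (2 ** n_bits - 1))
    return format(level, "0{}b".format(n_bits))

def decode(bits, lo, hi):
    """Map a binary string back to a parameter value in [lo, hi]."""
    return lo + int(bits, 2) / (2 ** len(bits) - 1) * (hi - lo)

# A chromosome is the concatenation of all encoded parameters, e.g.:
# chromosome = "".join(encode(v, lo, hi, 10) for v, (lo, hi) in zip(params, ranges))
```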
4.2. Select the Initial Population

Because the process is implemented by a program, the initial population is randomly generated by computer. For binary encoding, uniformly distributed random numbers between 0 and 1 are generated first; a random number between 0 and 0.5 represents 0, and one between 0.5 and 1 represents 1. Of course, real random numbers can also be used directly. In addition, the size of the population is specified taking the complexity of the computation into account.
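A minimal sketch of this thresholding scheme for generating a random binary initial population:

```python
import numpy as np

def initial_population(pop_size, chrom_len, rng=np.random.default_rng()):
    """Random binary initial population: uniform numbers in [0, 1) thresholded at 0.5."""
    u = rng.random((pop_size, chrom_len))
    return ["".join("1" if x >= 0.5 else "0" for x in row) for row in u]
```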
4.3. Determination of the Fitness Function

As in general constrained optimization methods, the fitness function evaluates each set of design parameters that satisfies the conditions, so that the best set can be found from the group.
4.4. Genetic Operations

First, the fitness value of each string is obtained from the fitness function, and the copy probability corresponding to each string is found. The number of copies of a string in the next generation is the product of its copy probability and the population size. Strings with a large copy probability will have more offspring in the next generation; on the contrary, strings with a small probability will be eliminated.
Second, single-point crossover is applied with crossover probability Pc. After copying, members are selected with probability Pc to form the mating pool; members of the pool are then matched at random, and a random crossover position is determined for each pair. Finally, mutation is applied with mutation probability Pm.
The initial population passes through reproduction, crossover and mutation to become the new generation of the population. The new generation is decoded and its fitness is evaluated to check whether the end condition is met; if not, the operations are repeated until it is satisfied. The end condition is set by the specific problem: as long as the target parameters are within the prescribed limits, the calculation terminates. The above operations are shown in Figure 1.

Figure 1. Flowchart of GA
4.5. Specific Steps for Optimizing the Model Parameters with the Genetic Algorithm

1) Determine the approximate range of each parameter and the code length, and encode the parameters;
2) Randomly generate an initial population of n individuals to form P(0);
3) Decode each individual of the population into the corresponding parameter values, and use these values to compute the cost function value J and the fitness function value f, taking f = 1/J;
4) Apply the reproduction, crossover and mutation operators to the population P(t) to produce the next generation population P(t+1);
5) Repeat steps (3) and (4) until the parameters converge or the desired targets are achieved.

A minimal sketch of this loop is given below.
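The sketch ties steps 1)-5) together and reuses the helper functions sketched in Sections 3, 4.1 and 4.2 (initial_population, decode, reproduce, single_point_crossover, mutate). The cost function, the parameter ranges, and the default GA settings are placeholders to be supplied by the user; the sketch assumes cost(params) > 0 so that f = 1/J is a valid selection weight.

```python
import numpy as np

def ga_optimize(cost, ranges, n_bits=10, pop_size=50, n_gen=40,
                pc=0.9, pm=0.033, rng=np.random.default_rng()):
    """Steps 1)-5): binary-coded GA minimizing cost(params), with fitness f = 1/J."""
    pop = initial_population(pop_size, n_bits * len(ranges), rng)      # step 2)

    def to_params(chrom):                                              # step 3): decode
        return [decode(chrom[i * n_bits:(i + 1) * n_bits], lo, hi)
                for i, (lo, hi) in enumerate(ranges)]

    best = pop[0]
    for _ in range(n_gen):                                             # step 5): iterate
        fitness = [1.0 / (cost(to_params(c)) + 1e-12) for c in pop]    # step 3): f = 1/J
        best = pop[int(np.argmax(fitness))]
        pool = reproduce(pop, fitness, rng)                            # step 4): selection
        nxt = []
        for i in range(0, pop_size - 1, 2):                            # step 4): crossover
            a, b = pool[i], pool[i + 1]
            if rng.random() < pc:
                a, b = single_point_crossover(a, b, rng)
            nxt += [mutate(a, pm, rng), mutate(b, pm, rng)]            # step 4): mutation
        if pop_size % 2:
            nxt.append(mutate(pool[-1], pm, rng))
        pop = nxt
    return to_params(best)
```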
5. Example Implementation Process
5.1. Identify the Sample Data with the Support Vector Machine Model

The nonlinearly separable two-class example of [1] is selected. The case is a two-dimensional problem with 45 sets of samples: 24 groups of samples belong to class 1 and 21 groups belong to class 2. Their raw data are shown in Figure 2 (C1 is the class 1 data, plotted as "1" on the graph; C2 is the class 2 data, plotted as "2"; the first column i is the serial order of the samples). Figure 2 also shows, for comparison, the multi-segment linear classification subdivision of [1]. The figure and data are taken from the literature [1].

Figure 2. Original data distribution and multi-segment linear classification
As can be seen from Figure 2, this case is a nonlinear pattern recognition problem. We take a kernel built on the sigmoid (tanh) function:

tanh(x) = (e^x - e^{-x}) / (e^x + e^{-x})

Kernel function:  K(x, x_i) = tanh(-||x - x_i||² / 2)                 (16)
In (12), the kernel function K(x, x_i) is used instead of (x_i · x). With the penalty function method, (12) becomes the optimization problem:

min W(α) = ½ Σ_{i,j=1}^{l} α_i α_j y_i y_j K(x_i, x_j) - Σ_{i=1}^{l} α_i + M·| Σ_{i=1}^{l} α_i y_i |     (α_i ≥ 0; i = 1, 2, …, l)        (17)
where M is the penalty coefficient, taken as M = 100, and l is the sample number, l = 45 in this case. The discriminant function for classification is still (15), with:

b = y_i - Σ_{j=1}^{l} α_j y_j K(x_i, x_j)       (α_i ∈ (0, C))        (18)
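A minimal sketch of the kernel (16), the penalized objective (17) used as the GA cost J = W(α), and the bias (18); it assumes the 45 samples are held in NumPy arrays X (45 by 2) and y (labels ±1), and alpha is the current parameter vector:

```python
import numpy as np

def kernel(x, xi):
    """Equation (16): K(x, x_i) = tanh(-||x - x_i||^2 / 2)."""
    return np.tanh(-np.sum((np.asarray(x) - np.asarray(xi)) ** 2) / 2.0)

def gram(X):
    """Kernel matrix K[i, j] = K(x_i, x_j) over all samples."""
    return np.array([[kernel(a, b) for b in X] for a in X])

def W(alpha, K, y, M=100.0):
    """Equation (17): penalized dual objective, usable as the GA cost J = W(alpha)."""
    ay = alpha * y
    return 0.5 * ay @ K @ ay - alpha.sum() + M * abs(ay.sum())

def bias(alpha, K, y, i):
    """Equation (18): b computed from a support vector index i with 0 < alpha_i < C."""
    return y[i] - np.sum(alpha * y * K[i])
```

Typical use would be K = gram(X) once, then cost = lambda a: W(np.asarray(a), K, y) handed to the GA.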
5.2. Parameter Optimization by the Genetic Algorithm

Here a real-coded genetic algorithm is used on the 45 samples of the case to find the optimal solution of model (17), in which there are 45 model parameters α_i. For each α_i, a uniform random number in (0, 1) is selected as its initial value; each set of α_i forms one individual, and the initial population size is 50. The cost function value J is determined by the rule J = W(α). The crossover probability and mutation probability are Pc = 0.9 and Pm = 0.033, respectively. With real-number encoding, after 40 generations of evolution, the optimal parameters α_i in Table 1 are obtained, with b = 0.1133.
Table 1. Optimization of the model parameters α_i (listed in sample order i = 1, …, 45, five values per row)

0.3588   0.4310   0.2240   0.6072   0.3543
0.2073   0.6209   0.3925   0.3772   0.7272
0.3270   0.7003   0.3892   0.2211   0.6315
0.4330   0.1120   0.3426   0.6120   0.1918
0.2388   0.0294   0.5087   0.7849   0.7468
0.4133   0.6692   0.7216   0.1697   0.4502
0.6859   0.3936   0.5534   0.5187   0.4502
0.4622   0.4290   0.3320   0.2657   0.3054
0.5697   0.1456   0.4303   0.6336   0.4735
Using Equation (15) to identify this nonlinear case proves practical and effective: the recognition rate is 100%. The identified classes and the actual outputs of the samples are shown in Figure 3 (the recognition function is y = Σ_{i=1}^{l} α_i y_i K(x_i, x) + b; for the first 24 samples, belonging to class 1, the recognized value satisfies y > b; for the 21 samples of class 2, the recognized value satisfies y < b), and an open "zone" separates the two classes, so the recognition results are obvious.
In [1], five linear classifiers were used, the recognition results were not entirely satisfactory (see Figure 2), and the procedure and algorithm are complex.
The method described here, combined with the effective optimization algorithm put forward, has good versatility and can be used in many areas.
Figure 3. SVM and genetic algorithm nonlinear pattern recognition results
6. Conclusion

As a high-profile field of machine learning, support vector machines (SVMs) show great advantages. Support vector machines are based on the structural risk minimization principle, which maximizes the generalization ability of the learning machine: a small error obtained from the limited training samples still guarantees a small error on an independent test set. The support vector machine algorithm is a convex optimization problem, so a local optimal solution is also the global optimal solution. By using the kernel function as an implicit nonlinear transformation, the support vector machine algorithm cleverly avoids the curse of dimensionality. Combining the genetic algorithm with the support vector machine gives effective, high-speed nonlinear pattern recognition, superior to other methods. The genetic algorithm has the following main features:
1) The genetic algorithm operates on an encoding of the parameters rather than on the parameters themselves;
2) Genetic algorithms search from many points in parallel, not just from a single point;
3) The genetic algorithm only evaluates the fitness value of the objective function and requires no derivatives or other auxiliary information, so it depends less on the problem;
4) The rules of genetic algorithm optimization are determined by probability, not by certainty;
5) The genetic algorithm performs an efficient heuristic search of the solution space, rather than a blind exhaustive or completely random search;
6) The genetic algorithm places essentially no restrictions on the function to be optimized: it requires neither continuity nor differentiability, nor an analytic mathematical expression; the function can even be the implicit mapping of a neural network, which gives the algorithm a wider range of applications;
7) The genetic algorithm has the characteristics of parallel computing and can use massively parallel computing to increase computing speed;
8) The genetic algorithm is well suited to large-scale optimization of complex problems;
9) The genetic algorithm is simple and powerful.
References

[1] Leon Bobrowski. Design of piecewise linear classifiers from formal neurons by a basis exchange technique. Pattern Recognition. 1991; 24(9): 863-870.
[2] Chang-qun, Wang Xiaolong. Support vector classification and the multi-width Gaussian kernel. Journal of Electronics. 2007; 35(3): 484-487.
[3] Abhijit S Pandya, Robert B Macy. Feedforward neural network pattern recognition and its implementation. Xu Yong, Jing Tao, translation. Beijing: Publishing House of Electronics Industry. 1999; 6.
[4] Wang Xiaoping, Cao Liming. Genetic algorithms: theory, application and software implementation. Xi'an: Xi'an Jiaotong University Press. 2002; 6.
[5] Cortes C, Vapnik V. Support Vector Networks. Machine Learning. 1995; 20: 273-295.
[6] V Vapnik. Statistical Learning Theory. New York: Wiley. 1998.
[7] V Vapnik. The Nature of Statistical Learning Theory. New York: Springer-Verlag. 1995.
[8] Wang Peng, Zhu Xiaoyan. RBF kernel based SVM model selection and its application. Computer Engineering and Applications. 2003; 39(24): 72-73.
[9] Li Lin, Zhang Xiao. SVM based on RBF kernel optimization learning algorithm. Computer Engineering and Applications. 2006; 42(29): 190-192.