TELKOMNIKA Indonesian Journal of Electrical Engineering
Vol. 12, No. 5, May 2014, pp. 4056 ~ 4062
DOI: http://dx.doi.org/10.11591/telkomnika.v12i5.4364
Received October 17, 2013; Revised December 28, 2013; Accepted January 15, 2014
Application of Support Vector Machine Model in Mine
Gas Safety Level Prediction
Huaping Zhou¹, Ruixin Zhang²
¹School of Computer Science and Engineering, Anhui University of Science and Technology, Huainan, China
²Faculty of Resources & Safety Engineering, China University of Mining & Technology, Beijing, China
Abstract
To address the limitations of traditional information fusion technology in mine gas safety class prediction, an intelligent algorithm is proposed in which genetic algorithms are adopted to optimize the parameters of the least squares support vector machine, establishing a multi-sensor information fusion model, GA-LSSVM, that overcomes the subjectivity and blindness of parameter selection and thus improves classification accuracy and convergence speed. Experimental results show that, compared with the least squares support vector machine model without optimization and the least squares support vector machine model optimized by the grid searching algorithm, the GA-LSSVM model is a good solution to the high-dimensional, nonlinear, small-sample uncertainty of coal mine underground environment level evaluation.

Keywords: information fusion, genetic algorithms, least squares support vector machine, parameter optimization, cross validation
Copyright © 2014 Institute of Advanced Engineering and Science. All rights reserved.
1. Introduction
The information fusion methods in mine safety mainly involve Bayes estimation theory [1], fuzzy information fusion [2, 3], Vague sets information fusion [4], adaptive estimation methods in batches [5], D-S evidence theory [6], rough sets [7], and neural networks or a combination of both methods [8], which respectively carry out single-level fusion and decision-level fusion.
Decision-level information fusion methods have both advantages and disadvantages. For D-S evidence theory it is difficult to find a reasonable basic probability assignment for specific circumstances. The membership value in fuzzy information fusion is a single value, which cannot indicate both the supporting and the opposing evidence, so it is not the best theory for dealing with uncertainty. Vague sets consider both membership and non-membership information, but the target selection method is more difficult to determine. Neural network information fusion algorithms have shortcomings such as slower training, more difficult parameter selection, and a tendency to over-fit.
These methods are therefore difficult to adapt to the mine. In this paper, based on the above analysis, a genetic algorithm optimized least squares support vector machine (GA-LSSVM) information fusion model is proposed. The support vector machine solves the problem by quadratic optimization, so the solution is the global optimum, avoiding local minima. The least squares support vector machine is a variant of the model which improves its training speed and classification speed. The parameters of the support vector machine have a great impact on the model, so this article adopts genetic algorithms to optimize the least squares support vector machine model parameters.
2. Least Squares Support Vector Machine
The Support Vector Machine (SVM) [9] constructs the optimal separating hyperplane and makes the points of the training set as far away from it as possible. A nonlinear problem is solved through the introduction of a nonlinear mapping into a high-dimensional feature space, where it is transformed into a linear problem. The construction of the optimal separating hyperplane is divided into the linearly separable and the linearly inseparable case. The support vector machine was initially presented for the linearly separable case.
TELKOMNIKA ISSN: 2302-4046
Assume the training sample set $T = \{(x_1, y_1), \ldots, (x_l, y_l)\}$, where $x_i \in R^n$, $y_i \in \{-1, +1\}$, $i = 1, \ldots, l$. Constructing and solving the optimization problem for the variables $w$ and $b$, the objective function is:

$$\min_{w,b} \frac{1}{2}\|w\|^2 \qquad \text{s.t. } y_i[(w \cdot x_i) + b] \ge 1, \; i = 1, \ldots, l \tag{1}$$
Let $w^*$ and $b^*$ be the optimum solution. The optimal separating hyperplane constructed is $(w^* \cdot x) + b^* = 0$, which gives the following decision function:

$$f(x) = \mathrm{sgn}[(w^* \cdot x) + b^*] \tag{2}$$
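As a minimal illustration of the separable case, the decision function (2) can be evaluated directly once an optimum $(w^*, b^*)$ is known; the hyperplane values below are hypothetical, not fitted from real mine data:

```python
# Sketch of the linear decision function f(x) = sgn[(w* . x) + b*];
# w_star and b_star are hypothetical values for illustration only.

def decide(x, w_star, b_star):
    """Return +1 or -1 according to the separating hyperplane."""
    s = sum(wi * xi for wi, xi in zip(w_star, x)) + b_star
    return 1 if s >= 0 else -1

w_star, b_star = [2.0, -1.0], 0.5
print(decide([1.0, 0.0], w_star, b_star))   # point on the +1 side -> 1
print(decide([-1.0, 1.0], w_star, b_star))  # point on the -1 side -> -1
```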
When the training set is linearly inseparable, classification errors must exist. According to the structural risk minimization principle, slack variables $\xi_i \ge 0$ are introduced and the constraints are relaxed to $y_i[(w \cdot x_i) + b] \ge 1 - \xi_i$. The sum $\sum_{i=1}^{l} \xi_i$ is adopted as a measure describing the degree of misclassification of the training set, while $2/\|w\|^2$ should still be maximized. Therefore a penalty parameter $C$ is introduced to weight the combination of these two targets, and the objective function becomes the following form:

$$\min_{w,b,\xi} \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{l}\xi_i \tag{3}$$
Introducing Lagrange multipliers $\alpha_i \ge 0$ and $\mu_i \ge 0$, the Lagrange function is established:

$$L(w, b, \xi, \alpha, \mu) = \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{l}\xi_i - \sum_{i=1}^{l}\alpha_i\big\{y_i[(w \cdot x_i) + b] - 1 + \xi_i\big\} - \sum_{i=1}^{l}\mu_i\xi_i \tag{4}$$
Taking partial derivatives with respect to $w$, $b$ and $\xi_i$ and setting them to zero, obtain the following form:

$$w = \sum_{i=1}^{l}\alpha_i y_i x_i, \qquad \sum_{i=1}^{l}\alpha_i y_i = 0, \qquad C - \alpha_i - \mu_i = 0 \tag{5}$$
The original problem is transformed into its dual form:

$$\min_{\alpha} \frac{1}{2}\sum_{i,j=1}^{l}\alpha_i\alpha_j y_i y_j (x_i \cdot x_j) - \sum_{i=1}^{l}\alpha_i \qquad \text{s.t. } \sum_{i=1}^{l}\alpha_i y_i = 0, \; 0 \le \alpha_i \le C \tag{6}$$
Solving the above convex quadratic programming problem gives the optimal classification equation:

$$f(x) = \mathrm{sgn}\Big[\sum_{i=1}^{l}\alpha_i^* y_i (x_i \cdot x) + b^*\Big] \tag{7}$$
For the nonlinearly separable case, kernel functions meeting the Mercer condition are introduced; through this nonlinear action the problem is transformed into a linear one in a high-dimensional space, where the optimal separating hyperplane is sought. The kernel functions that mainly meet the conditions are listed in Table 1:
Table 1. Kernel Functions

Polynomial kernel function: $k(x,z) = ((x \cdot z) + c)^d$, $d \in Z^+$, $c \ge 0$
Gaussian kernel function: $k(x,z) = \exp(-\|x-z\|^2 / 2\sigma^2)$, $\sigma > 0$
Exponential radial basis kernel: $k(x,z) = \exp(-\|x-z\| / 2\sigma^2)$, $\sigma > 0$
B-spline kernel: $k(x,z) = k(x,z; t_1, \ldots, t_m) = \sum_{i=1}^{m}(x-t_i)_+^p (z-t_i)_+^p$, $x, z \in R$
Fourier kernel: $k(x,z) = \dfrac{1-q^2}{2(1-2q\cos(x-z)+q^2)}$, $x, z \in R$, $0 < q < 1$
RBF kernel: $k(x,z) = \exp(-\|x-z\|^2 / \sigma^2)$, $\sigma > 0$
Here $\sigma$ is the width of the radial basis function. The radial basis kernel has fewer parameters and a simple calculation, its performance is better, and it is the most commonly applied; so this paper uses the RBF kernel function as the support vector machine kernel. After the kernel function is introduced, the decision function is:
$$f(x) = \mathrm{sgn}\Big[\sum_{i=1}^{l}\alpha_i^* y_i K(x_i, x) + b^*\Big] \tag{8}$$
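The RBF kernel and the kernelized decision function (8) can be sketched together; the support vectors, dual coefficients and bias below are hypothetical stand-ins, not output of a trained model:

```python
import math

# Sketch of the RBF kernel k(x,z) = exp(-||x-z||^2 / sigma^2) and the
# kernelized decision f(x) = sgn[sum_i alpha_i* y_i K(x_i, x) + b*].
# support, ys, alphas, b are hypothetical values for illustration.

def rbf(x, z, sigma=1.0):
    d2 = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-d2 / sigma ** 2)

def decision(x, support, ys, alphas, b, sigma=1.0):
    s = sum(a * y * rbf(sx, x, sigma)
            for a, y, sx in zip(alphas, ys, support)) + b
    return 1 if s >= 0 else -1

support = [[0.0, 0.0], [2.0, 2.0]]
ys, alphas, b = [1, -1], [1.0, 1.0], 0.0
print(decision([0.1, 0.1], support, ys, alphas, b))  # near the +1 vector -> 1
```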
Suykens [10] announced the least squares support vector machine model for the first time in the last century. The standard support vector machine model uses the inequality constraints $y_i[(w \cdot x_i) + b] \ge 1 - \xi_i$, $i = 1, \ldots, l$, but the least squares support vector machine uses the equality constraints $y_i[(w \cdot x_i) + b] = 1 - e_i$, $i = 1, \ldots, l$. Thus linear equations are solved instead of a quadratic programming problem, thereby reducing the computational complexity of the support vector machine model and speeding up the solving [11, 12].
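Because of the equality constraints, LSSVM training reduces to one linear system. The sketch below is a minimal illustration of Suykens' classifier formulation (solving $\begin{bmatrix}0 & y^T\\ y & \Omega + I/C\end{bmatrix}\begin{bmatrix}b\\ \alpha\end{bmatrix} = \begin{bmatrix}0\\ \mathbf{1}\end{bmatrix}$ with $\Omega_{ij} = y_i y_j K(x_i, x_j)$) using numpy, an RBF kernel and a toy dataset; it is not the paper's implementation:

```python
import numpy as np

# LSSVM classifier sketch: equality constraints turn the QP into one
# linear system  [[0, y^T], [y, Omega + I/C]] [b; alpha] = [0; 1],
# where Omega_ij = y_i y_j K(x_i, x_j).  Toy data, RBF kernel.

def rbf(X, Z, sigma=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / sigma ** 2)

def lssvm_train(X, y, C=10.0, sigma=1.0):
    n = len(y)
    Omega = np.outer(y, y) * rbf(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / C
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]              # bias b, dual weights alpha

def lssvm_predict(Xnew, X, y, b, alpha, sigma=1.0):
    return np.sign(rbf(Xnew, X, sigma) @ (alpha * y) + b)

X = np.array([[0.0, 0.0], [0.2, 0.1], [2.0, 2.0], [2.1, 1.9]])
y = np.array([1.0, 1.0, -1.0, -1.0])
b, alpha = lssvm_train(X, y)
print(lssvm_predict(X, X, y, b, alpha))   # recovers the training labels
```

Note that, unlike the standard SVM, every training point receives a nonzero coefficient, which is the price paid for replacing the QP with a linear solve.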
For SVM multi-class classification problems, this paper uses a pairwise classification algorithm, namely the one-against-one algorithm (abbreviated 1-a-1 SVM). A classifier is trained for each pair of classes, so for an n-class problem there are n(n−1)/2 decision functions, each classifier being trained on the data of its two categories [13, 14].
For the training between class i and class j, the following two-class classification problem needs to be solved:

$$\min_{w^{ij}, b^{ij}, \xi^{ij}} \frac{1}{2}(w^{ij})^T w^{ij} + C\sum_{t}\xi_t^{ij}$$
$$\text{s.t. } (w^{ij})^T x_t + b^{ij} \ge 1 - \xi_t^{ij} \text{ if } y_t = i; \quad (w^{ij})^T x_t + b^{ij} \le -1 + \xi_t^{ij} \text{ if } y_t = j; \quad \xi_t^{ij} \ge 0 \tag{9}$$
In this paper, a max-votes strategy is adopted to determine which category a sample belongs to: every pairwise classifier judges the category of the sample, and the class with the most votes is the class assigned to the unknown sample. To classify an unknown sample x, each pairwise decision function is:

$$f(x) = \mathrm{sgn}[(w^{ij})^T x + b^{ij}] \tag{10}$$
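The one-against-one max-votes rule can be sketched as follows; the pairwise decider below is a hypothetical stand-in for the trained $(w^{ij}, b^{ij})$ classifiers of (9)-(10):

```python
from collections import Counter
from itertools import combinations

# One-against-one voting sketch: n(n-1)/2 pairwise classifiers each
# vote for one of their two classes; the class with most votes wins.
# pair_decider is a hypothetical stand-in for trained classifiers.

def vote(x, classes, pair_decider):
    votes = Counter()
    for i, j in combinations(classes, 2):
        votes[pair_decider(i, j, x)] += 1
    return votes.most_common(1)[0][0]

# toy stand-in: each pairwise classifier picks the class id closer
# to the (scalar) sample value x
decider = lambda i, j, x: i if abs(x - i) <= abs(x - j) else j
print(vote(2.2, [0, 1, 2, 3, 4], decider))  # class 2 wins the vote
```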
3. GA-LSSVM Prediction Model
Genetic algorithms (GA) were first proposed by John Holland in the 1960s. The intelligent search of the genetic algorithm is adopted in the parameter selection process of the support vector machine algorithm in this paper to find the optimal parameters, looking for the optimal support vector machine model for the coal mine samples. Compared with the grid search algorithm, the genetic search algorithm can quickly obtain satisfactory parameters.
The SVM model involves qualitative options and quantitative options. The former includes how to select the specific support vector machine algorithm and kernel; the latter is support vector parameter selection. The LSSVM parameter choice includes the kernel function parameter and the error penalty parameter. The error penalty parameter of different SVM variants is named differently, and the names of different kernel function parameters are not the same, but their role and significance are the same. For convenience of description herein, the penalty parameter and kernel parameter are expressed as γ and σ.
Figure 1. Optimization Process Chart of GA-LSSVM
The kernel parameter selection of the least squares support vector machine is directly related to its learning performance and generalization ability. The parameter selection methods commonly used are mainly the cross-validation method and kernel calibration. The cross-validation method requires a lot of computation to determine the
the
begin
Initiali
ze
individ
u
als and
produce populations(
γ
,
σ
)
compute ind
i
vid
u
al f
itness b
y
LS
SVM training
algorithm
Se
le
ct
Find the
larg
e f
i
t
n
ess
individua
ls to
joi
n
the
next
g
ener
ation
of
g
rou
p
s
Crossover
Mutation
compute th
e
individual'
s f
itness b
y
LSSVM training
algorithm
W
h
ether th
e
term
inat
ion cond
ition
is
satisfied
Decode,
regr
essi
on predi
c
tion
wi
th th
e r
e
sulting
p
a
ram
e
ters
(
γ
,
σ
)
end
yes
no
optimum parameters; especially when the number of parameters is large, it will take a lot of time to strike the optimal solution. The kernel calibration method involves much knowledge of and research on the kernel matrix, so it is more difficult to achieve. To compensate for the insufficiency of existing parameter selection algorithms, the SVM model in this paper predicts the gas level with the genetic algorithm. The algorithm is not only able to achieve a global search, but its search speed can also be guaranteed. Practical application shows that the improved support vector machine parameter selection algorithm based on the genetic algorithm can obtain optimal operating parameters for non-stationary time series and nonlinear prediction models. It is a proven method of selecting SVM kernel parameters. The optimizing process is shown in Figure 1.
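The GA loop of Figure 1 can be sketched as below. The fitness function here is a hypothetical stand-in for the cross-validated LSSVM accuracy the paper optimizes, with an assumed optimum near (16, 12); selection keeps the fitter half, crossover averages two parents, and mutation adds Gaussian noise:

```python
import random

# GA sketch for (gamma, sigma) selection: selection, crossover, mutation.
# fitness() is a hypothetical stand-in for cross-validated LSSVM accuracy;
# we assume the best parameters lie near (16, 12) for illustration.

random.seed(0)

def fitness(gamma, sigma):
    return -((gamma - 16) ** 2 + (sigma - 12) ** 2)

def evolve(pop, generations=60, mut=0.5):
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(*p), reverse=True)
        parents = pop[: len(pop) // 2]                    # selection
        children = []
        while len(children) < len(pop) - len(parents):
            a, b = random.sample(parents, 2)
            child = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)  # crossover
            child = (child[0] + random.gauss(0, mut),       # mutation
                     child[1] + random.gauss(0, mut))
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda p: fitness(*p))

pop = [(random.uniform(0, 100), random.uniform(0, 50)) for _ in range(20)]
best = evolve(pop)
print(best)   # converges near the assumed optimum (16, 12)
```

In the actual model the stand-in fitness would be replaced by the classification accuracy of an LSSVM trained with the candidate (γ, σ).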
4. Simulation Results
4.1. Mine Gas Level Evaluation Model Data
200 samples from more than 20 coal mines in China are selected in this paper, of which 125 samples serve as SVM training samples for establishing and training the model, and 75 samples as test samples. Synthesizing literature [1] and literature [2], four characteristic parameters most relevant to gas accidents are extracted, namely dust, temperature, wind speed and gas concentration, as the input dimensions of the support vector machine. According to the mine safety rules, the gas state safety classes are divided into safer, more secure, general safety, more dangerous and hazardous, taking the values 0, 1, 2, 3 and 4 respectively.
The GA-LSSVM classification algorithm and the training samples are adopted to establish the predictive model. In order to test the correctness and the generalization performance of the model, the training samples and test samples selected are disjoint, and it is ensured that the test samples contain all grades of coal mine safety. Then, for given characteristic parameters, the model can make an intelligent decision about the environmental conditions underground in the coal mine.
Since the data sample values of the 4 feature vectors vary greatly, in order to minimize the impact of the different dimensions and magnitudes of the feature vectors on the prediction model and thus ensure the accuracy of the SVM prediction model, the data samples should be normalized to a comparable scale; the mean method is adopted in this paper.
For a data series, every value is divided by the average of the whole series; the result is the new sequence after treatment. Suppose the original series is denoted as $x_0 = (x_0(1), x_0(2), \ldots, x_0(n))$ and its average as $\bar{x}_0$. The original data sequence $x_0$, after averaging, yields the data sequence $y_0$, calculated as follows:

$$y_0 = \{y_0(1), y_0(2), \ldots, y_0(n)\} = \Big\{\frac{x_0(1)}{\bar{x}_0}, \frac{x_0(2)}{\bar{x}_0}, \ldots, \frac{x_0(n)}{\bar{x}_0}\Big\} \tag{11}$$
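The mean method of Eq. (11) can be sketched in a few lines; the series below is a toy example, not mine data:

```python
# Mean-method normalization (Eq. 11): each value of the original
# series x0 is divided by the series average to give the series y0.

def mean_normalize(x0):
    avg = sum(x0) / len(x0)
    return [v / avg for v in x0]

x0 = [2.0, 4.0, 6.0]            # toy series; average = 4
print(mean_normalize(x0))       # [0.5, 1.0, 1.5]
```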
4.2. Analysis of Experimental Results
Table 2. Comparative Analysis

Run  Prediction Model   γ        σ        Training time (s)  Optimal iterations  Classification accuracy (%)
1    LSSVM              0.125    0.5      18.78              -                   75.33
     GS-LSSVM           1        0.5      15.47              50                  82.67
     GA-LSSVM           16.125   12.625   10.343             20                  85.33
2    LSSVM              0.5      0.25     18.87              -                   75.33
     GS-LSSVM           8.37     6.725    19.59              100                 74.67
     GA-LSSVM           42.25    1.37     11.75              20                  89.33
3    LSSVM              25.25    10.45    17.47              -                   78.67
     GS-LSSVM           62.35    12.345   16.34              140                 89.33
     GA-LSSVM           60.26    60.675   9.56               25                  91.33
4    LSSVM              60.75    0.125    16.48              -                   75.33
     GS-LSSVM           80.67    25.38    18.27              155                 83.33
     GA-LSSVM           66.57    0.0093   10.56              20                  89.33
LIBSVM is adopted as the training and testing tool of the support vector classification model on the Matlab software platform in this paper. In order to better verify the validity of the predictive
model GA-LSSVM and remove chance from the prediction results, 125 training samples were selected and used to obtain the optimal parameters γ and σ for the least squares support vector machine without parameter optimization (LSSVM), the least squares support vector machine optimized by the meshing (grid search) algorithm (GS-LSSVM), and the genetic algorithm least squares support vector machine (GA-LSSVM) adopted by this paper; each prediction model was trained three times. The GS-LSSVM and GA-LSSVM models respectively obtain the optimal parameters γ and σ, the training time, and the optimal number of iterations; the LSSVM model parameters are assigned randomly by hand. Each model is then validated and the classification accuracy is obtained with the 75 test samples. Results are shown in Table 2.
It can be seen from Table 2 that the average training time of LSSVM, GS-LSSVM and GA-LSSVM is 16.48 s, 18.27 s and 10.56 s respectively; the average number of iterations of GS-LSSVM and GA-LSSVM is 155 and 20 respectively; and the average classification accuracy is 75.33%, 83.33% and 89.33%. The classification accuracy of the LSSVM model without parameter optimization is the lowest; the parameters γ and σ have a great impact on the classification performance of the support vector machine. The training time of GS-LSSVM is the longest, so it is bound to affect the classification efficiency in the case of a large number of training samples. All individuals of the genetic algorithm converge quickly to the optimal solution, and its classification accuracy is the highest, reaching 89.33%, 12.666% higher than the LSSVM model and 6.532% higher than the GS-LSSVM model. Therefore, the genetic algorithm optimized least squares support vector machine classifier prediction model proposed in this paper has better generalization ability and higher classification capability.
5. Conclusion
Four characteristic parameters most relevant to gas accidents, namely dust, temperature, wind speed and gas concentration, are extracted as the factors of coal mine environment grade evaluation. The environment grade is divided into 5 grades: safer, more secure, general safety, more dangerous and hazardous. Genetic algorithms are adopted to optimize the parameters of the least squares support vector machine model, establishing the GA-LSSVM model of coal mine environment classification. Compared with the model without parameter optimization (LSSVM) and the least squares support vector machine model optimized by the grid search algorithm (GS-LSSVM), this model has a higher processing speed and higher classification accuracy.
Acknowledgements
This work is supported by the Anhui Provincial Natural Science Foundation of Universities Key Projects (Grant No. KJ2010A083) and the Natural Science Foundation of the Anhui Higher Education Institutions of China (KJ2012A099), and is also supported by the National Natural Science Foundation of China (Grant No. 51174257).
References
[1] Fu Hua, Zhao Dan, Zhou Fang. Research on application of RS-RBF information fusion in gas monitoring. Transducer and Microsystem Technologies. 2009; 28(12): 30-32.
[2] Liu Bing, Li Hui, Xing Gang. Fuzzy Information Fusion Target Recognition Based on Weighted Evidence Theory. Computer Engineering. 2012; 38(15): 172-174.
[3] Jin Hai. Application of Multisensor Fuzzy Information Fusion Algorithm in Coal Mine Gas Monitoring. Coal Technology. 2012; 31(8): 82-84.
[4] Fu Hua, Gao Ting, Yang Xin. Application of vague set information fusion theory in mine safety monitoring. Application Research of Computers. 2009; 26(6): 2282-2284.
[5] Sun Ke-lei, Qin Ru-xiang. Study of multi-sensor data fusion based on adaptive batch estimation algorithm for gas monitoring. Transducer and Microsystem Technologies. 2011; 30(10): 47-49.
[6] Fu Hua, Li Bo, Xue Yong-cun. Analysis of underground monitor method based on D-S decision-making data fusion. Transducer and Microsystem Technologies. 2007; 26(1): 27-29.
[7] Du Xiao-kun, Chen Feng. Application of information fusion in coal mine monitoring system. Transducer and Microsystem Technologies. 2010; 29(7): 124-126.
[8] Vapnik VN. The Nature of Statistical Learning Theory. Berlin: Springer-Verlag. 1995.
[9] Suykens JAK, Vandewalle J. Least squares support vector machine classifiers. Neural Processing Letters. 1999; 9(3): 293-300.
[10] Sana M Vieira, Luís F Mendonça, Gonçalo J. Modified binary PSO for feature selection using SVM applied to mortality prediction of septic patients. Applied Soft Computing. 2013; 13(8): 3494-3504.
[11] Nghe Wang, Xinyi Zhao, Baotian Wang. LS-SVM and Monte Carlo methods based reliability analysis for settlement of soft clayey foundation. Journal of Rock Mechanics and Geotechnical Engineering. 2013; 5(4): 312-317.
[12] En Yongqi. LS_SVM Parameters Selection Based on Hybrid Complex Particle Swarm Optimization. Energy Procedia. 2012; 17(Part A): 706-710.
[13] Heng-Lung Huang, Jian-Fan Dun. A multiple kernel framework for inductive semi-supervised SVM learning. Neurocomputing. 2012; 90(1): 46-58.
[14] Ehua Liu, Hui Qian, Guang Dai, Zhihua Zhang. An iterative SVM approach to feature selection and classification in high-dimensional datasets. Pattern Recognition. 2012; 46(9): 2531-2537.