International Journal of Electrical and Computer Engineering (IJECE)
Vol. 11, No. 3, June 2021, pp. 2414~2422
ISSN: 2088-8708, DOI: 10.11591/ijece.v11i3.pp2414-2422
Journal homepage: http://ijece.iaescore.com
Hybrid feature selection method based on particle swarm optimization and adaptive local search method

Malek Alzaqebah1, Sana Jawarneh2, Rami Mustafa A. Mohammad3, Mutasem K. Alsmadi4, Ibrahim Al-marashdeh5, Eman A. E. Ahmed6, Nashat Alrefai7, Fahad A. Alghamdi8
1,6,7 Department of Mathematics, College of Science, Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia
1,6,7 Basic and Applied Scientific Research Center, Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia
2 Computer Science Department, Community College Dammam, Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia
3 Computer Information Systems Department, College of Computer Science and Information Technology, Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia
4,5,8 Department of MIS, College of Applied Studies and Community Service, Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia
Article Info

Article history:
Received Sep 8, 2020
Revised Sep 21, 2020
Accepted Oct 6, 2020

ABSTRACT

Machine learning has been extensively examined, with data classification as the most popularly researched subject. The accuracy of prediction is impacted by the data provided to the classification algorithm. Meanwhile, utilizing a large amount of data may incur costs, especially in data collection and preprocessing. Studies on feature selection have mainly sought to establish techniques that can decrease the number of utilized features (attributes) in classification; using data that generate accurate predictions is also important. Hence, a particle swarm optimization (PSO) algorithm is suggested in the current article for selecting the ideal set of features. The PSO algorithm has proven superior in different domains in exploring the search space, while local search algorithms are good at exploiting the search regions. Thus, we propose hybridizing the PSO algorithm with an adaptive local search technique that works based on the current PSO search state and is used for accepting the candidate solution. This combination balances the local intensification as well as the global diversification of the searching process. Hence, the suggested algorithm surpasses the original PSO algorithm and other comparable approaches in terms of performance.

Keywords:
Adaptive local search method
Feature selection
Particle swarm optimization algorithm

This is an open access article under the CC BY-SA license.
Corresponding Author:
Malek Alzaqebah
Basic and Applied Scientific Research Center
Imam Abdulrahman Bin Faisal University
P.O. Box 1982, Dammam, Saudi Arabia
Email: maafehaid@iau.edu.sa
NOMENCLATURE

Acronyms
PSO : particle swarm optimization
BCO : bee colony optimization
SSA : salp swarm algorithm
LAHC : late acceptance hill-climbing
FFA : firefly algorithm
ALS : adaptive local search
FS : feature selection
GA : genetic algorithm
CT : computed tomography
MFO : moth-flame optimization
TS : tabu search
ACO : ant colony optimization
BPSO : binary particle swarm optimization
NF : number of selected features
1. INTRODUCTION

Machine learning has become more prominent recently in many research fields, due to the fast growth of data and the need to use them meaningfully. Machine learning concerns discovering useful information from huge data using techniques including anomaly detection, classification, and clustering [1, 2]. Accordingly, dimensionality can impede the machine learning process, as it incurs high computational cost. Dimensionality is a major issue in machine learning, especially concerning datasets. A dataset comprises a set of examples representing information on a particular case in feature form, and a dataset can have substantial dimensionality, aside from carrying features that are irrelevant and redundant, and a high level of noise. Such a huge number of features cannot be handled by traditional machine learning methods. Feature selection is therefore vital as a preprocessing phase, as it decreases data dimensionality while also removing duplicated and useless features from the dataset [2-4].
The feature selection process aims to obtain the optimal set of useful features while maintaining good accuracy in representing the initial features of the dataset. In this regard, classification involves determining the class value of each sample from the available class pool [5, 6]. Feature selection techniques are divided into three categories according to the strategy of selection: filter, wrapper, and embedded techniques. Filter approaches do not require subsequent learning algorithms [7, 8], while wrapper techniques require the use of a learning algorithm [9, 10]. Compared with the filter approach, the wrapper approach incurs more computational cost, aside from showing an over-fitting risk. In embedded techniques, the feature selection method is embedded within the model's training process [2, 4, 11], followed by the generation of an ideal group of features through the optimization of the objective function. Among the three mentioned types of feature selection, wrapper methods are chosen in this paper.
Metaheuristic optimization algorithms have shown good performance in the search for an optimal solution. These algorithms are also easy to implement and can solve a wide range of problems [12]. Among these metaheuristic algorithms are algorithms based on swarm intelligence. Swarm intelligence algorithms study the behavior of a collection of agents in self-organized societies, i.e., bees, ants, birds, and moths [13-17]. Techniques based on swarm intelligence have been widely used as a wrapper method for feature selection [18], for instance, the bees algorithm [19], ant colony optimization (ACO) [20], the butterfly optimization algorithm [21] and the moth optimization algorithm [22]. Particle swarm optimization (PSO) was devised by Kennedy and Eberhart [23]. The algorithm relies on the behavior of social organisms that live in groups, as exemplified by birds and fish. PSO mimics the interaction between members in information sharing, and the application of PSO can be observed in various optimization domains and also together with other algorithms. To combine their advantages, a filter-wrapper method grounded upon the PSO feature selection technique was introduced in [24]. The filtering measure is applied in encoding the location of every particle, while the classification accuracy is utilized for the fitness purpose. As shown by the experiments, the suggested method was marginally better than the binary PSO-based filtering method. On the other hand, the suggested method was yet to be compared with any wrapper algorithm, and compared to the filter algorithm, the wrapper algorithm is generally superior in terms of classification performance.
In dealing with the FS problem, Ibrahim et al. [25] suggested a hybrid optimization technique that comprises a combination of a salp swarm algorithm (SSA) and PSO. This combined method was called SSAPSO. The authors reported that this method improved the effectiveness of the exploitation and exploration phases [25]. PSO and firefly (FF) techniques were hybridized and called PSO-FF in [26] for the FS problem in the examination of childhood's normal "teratoid/rhabdoid" tumor (AT/RT) in brain MRI images and "hemochromatosis" in computed tomography (CT) images of the liver. Meanwhile, in [27], the authors demonstrated the application of a hybrid bio-inspired technique to the FS process. This proposed method is grounded upon two swarm intelligence techniques, namely PSO and ACO. For the FS problem also, tabu search (TS) was combined with binary particle swarm optimization (BPSO) in [28]. In that study, BPSO functions as a local optimizer, whenever TS is executed for a particular generation, for cancer classification based on gene expression. However, the use of this approach is based on the smallest number of features, which means that it may not be representative of the entire dataset. As such, the problem of the solution being stuck in local points may occur. Relevantly, in [29], the application of a hybrid method comprising ACO, bee colony optimization (BCO), genetic algorithm (GA) and fuzzy C-means was demonstrated, with the aim of feature selection from mammogram images.
In the current research, the PSO technique is combined with an adaptive local search approach to quickly attain suitable solutions for the problem, by combining the advantage of the exploration provided by the PSO algorithm and the exploitation ability provided by the local search method. The PSO algorithm ensures the diversity of the solutions, while the adaptive local search method exploits the solutions to obtain the possible ideal solution. This combination increases the flexibility of the PSO algorithm, enhancing the capability to exploit the solutions in the search space so that the possible ideal solution can be quickly found. The proposed adaptive local search method relies on the late acceptance hill-climbing algorithm [30, 31], and this method is free from parameter tuning, whereby the parameters are tuned through the search of the PSO algorithm, which makes the algorithm more flexible. The PSO algorithm sends the solution to be exploited together with the current iteration and the number of tries used to improve the solutions. Among the most significant features of the proposed algorithm is that it takes advantage of population-based algorithms, which preserve the diversity of the solutions, and local search algorithms, which exploit the solution very fast. The results generated by the suggested algorithm are contrasted against the traditional PSO algorithm and other contemporary approaches.
The structure of the article is as follows: Section 2 details the standard particle swarm optimization (PSO), followed by section 3, which examines and elaborates the suggested algorithm, namely "particle swarm optimization with adaptive local search method". Then, section 4 reviews the empirical results, and section 5 reviews the study conclusions along with several suggestions to be considered in future studies.
2. PARTICLE SWARM OPTIMIZATION ALGORITHM

The PSO algorithm was created by Eberhart and Kennedy in [23], and this algorithm mimics the communication behavior of a group of agents, for instance, birds flocking and fish schooling. In the PSO algorithm, a group of agents denotes the solutions (particles) of the problem and the swarm represents a population of solutions. The PSO algorithm begins by generating random solutions for each particle and assigning them an initial velocity. Particles travel within the searching space in order to search for the ideal solution. Here, the location of every particle is updated based on its own knowledge and its adjoining particles. As the particles are moving, the current position of particle i is symbolized by a vector xi = (xi1, xi2, …, xiD), whereby D denotes the search space's dimensionality. Meanwhile, the velocity of particle i is symbolized by vi = (vi1, vi2, …, viD). A predefined maximum velocity vmax confines the velocity of the particles, whereby vid ∈ [−vmax, vmax]. Further, the best past position of a particle is documented as the personal best and is symbolized as pbest. Accordingly, the best location achieved by the swarm is called the "global best" or "gbest". PSO searches and finds the ideal solution by updating the particles using (1), and (2) is used to calculate the moving velocity, as follows [23]:

x_id^(t+1) = x_id^t + v_id^(t+1)    (1)

v_id^(t+1) = w × v_id^t + c1 × r1i × (p_id − x_id^t) + c2 × r2i × (p_gd − x_id^t)    (2)

in which t symbolizes the t-th iteration in the evolutionary process; d ∈ D symbolizes the d-th dimension within the searching space; w signifies the inertia weight; c1 and c2 denote the acceleration constants; r1i and r2i are random values dispersed homogeneously in [0, 1]; and p_id and p_gd symbolize the elements of "pbest" and "gbest" in the d-th dimension. Figure 1 depicts the pseudo-code of PSO.

Figure 1. Particle swarm optimization (PSO) algorithm's pseudo-code
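As a concrete illustration, the update rules (1) and (2) above can be sketched in a few lines of Python. The parameter values below (inertia weight, acceleration constants, velocity limit) are assumed demonstration values, not the settings used in the paper's experiments.

```python
import random

# Assumed demonstration values for w, c1, c2 and vmax (see eq. (1)-(2)).
W, C1, C2, VMAX = 0.5, 2.0, 2.0, 4.0

def update_particle(x, v, pbest, gbest):
    """One PSO step: returns the new position and velocity vectors."""
    new_x, new_v = [], []
    for d in range(len(x)):
        r1, r2 = random.random(), random.random()
        vd = (W * v[d]
              + C1 * r1 * (pbest[d] - x[d])    # pull toward personal best, eq. (2)
              + C2 * r2 * (gbest[d] - x[d]))   # pull toward global best, eq. (2)
        vd = max(-VMAX, min(VMAX, vd))         # confine velocity to [-vmax, vmax]
        new_v.append(vd)
        new_x.append(x[d] + vd)                # position update, eq. (1)
    return new_x, new_v
```

Iterating this update over every particle, while tracking pbest per particle and gbest per swarm, yields the loop that Figure 1 describes.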
3. PROPOSED PSO ALGORITHM WITH ADAPTIVE LOCAL SEARCH

3.1. Solution depiction

The solution of feature selection is depicted as a vector of N features (the number of features within a dataset), and the contents of this vector are either 0 or 1, whereby 0 means an unselected feature and 1 means a selected feature. The PSO algorithm changes the values in the vector to improve classification accuracy; this study uses classification accuracy as an objective function to be maximized. Accordingly, the classification algorithm used in the present study is discussed in the ensuing section. The following figure shows the representation of the PSO algorithm for feature selection. For demonstration purposes, suppose that we have a solution for a dataset with 5 features; the selected features are the first and third, and hence, the solution will be [1,0,1,0,0].
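The binary encoding described above can be sketched as follows. The `evaluate_accuracy` callback is a hypothetical stand-in for whatever wrapper classifier computes accuracy on a feature subset; it is not part of the original paper.

```python
def selected_indices(solution):
    """Indices of the features marked 1 in the 0/1 solution vector."""
    return [i for i, bit in enumerate(solution) if bit == 1]

def fitness(solution, evaluate_accuracy):
    """Objective to maximize: classification accuracy on the selected subset.
    `evaluate_accuracy` is an assumed callback (subset indices -> accuracy)."""
    subset = selected_indices(solution)
    if not subset:              # an empty subset cannot be classified
        return 0.0
    return evaluate_accuracy(subset)

# A 5-feature dataset with the first and third features selected:
solution = [1, 0, 1, 0, 0]
print(selected_indices(solution))   # -> [0, 2]
```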
3.2. Adaptive local search

The local searching method works based on "late acceptance hill-climbing" (LAHC) [16, 30]. The LAHC algorithm works based on a memory (list) of length L used to save the objective values of the solutions produced during the search. The acceptance of any newly arrived solution depends on the assessment of the new solution against the one saved within the list L steps earlier. A worse solution is accepted providing that the value of the candidate solution is equal to or better than the value within the list L at index v (the virtual starting point of the list). v is computed by dividing the current number of iterations I by the length of L (e.g., see Figure 2, line #12), and after that the value in L at index v becomes the candidate solution's value. Otherwise, the worse solution will not be accepted. In this regard, the "physical" list stays static. However, its "virtual" beginning v is dynamically computed as the division remainder of the number of iterations I by the length of list L (v = I mod L). The full pseudocode of late acceptance hill climbing is presented in Figure 2 [30].

Figure 2. The pseudo-code of late acceptance hill climbing [30]
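A minimal Python sketch of the LAHC acceptance rule described above, written in maximization form to match the paper's accuracy objective. The `perturb` neighbor generator is an assumed helper, not part of the original pseudo-code.

```python
import random

def lahc(initial, objective, perturb, list_len, num_iters):
    """Late acceptance hill-climbing (maximization form).
    A candidate is accepted if it is not worse than the current solution OR
    not worse than the objective value stored list_len steps ago -- the
    "late" comparison that lets occasional worse moves through."""
    current, f_cur = initial, objective(initial)
    best, f_best = current, f_cur
    history = [f_cur] * list_len        # the length-L list of past objective values
    for i in range(num_iters):
        cand = perturb(current)
        f_cand = objective(cand)
        v = i % list_len                # virtual beginning of the list: v = I mod L
        if f_cand >= f_cur or f_cand >= history[v]:
            current, f_cur = cand, f_cand
            if f_cur > f_best:
                best, f_best = current, f_cur
        history[v] = f_cur              # overwrite the list entry at index v
    return best, f_best
```

With `list_len = 1` this degenerates to ordinary hill climbing; longer lists delay the acceptance threshold and allow escapes from local optima.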
The pseudocode of the offered algorithm is depicted in Figure 3. In our proposed algorithm, the stopping condition is met by counting the idle steps (idlesteps) or when the maximum iteration number is attained, where the idle steps counter is increased by one if the algorithm could not further improve the local best solution (Pbest), see Figure 3 line #7. The adaptive local search (ALS) is performed if a random number between 0 and 1 is greater than 0.75. This percentage was chosen experimentally to avoid applying the local search to every solution and to avoid long processing time. Another condition is applied to ensure that ALS is applied when the solution is getting stuck in local optima, or when no further improvement is possible; a worse solution is accepted at this stage to escape from getting stuck in local optima, see Figure 3 line #15. Three parameters should be provided to the LAHC algorithm: the iterations number (NumOfIte), the list size (L), and the solution (xi), see Figure 3 line #17. The adaptive local search method uses an adaptive method to set these parameters as follows. First, the number of iterations (NumOfIte) is calculated by multiplying the idle steps number (idlesteps) with the current number of the PSO iteration (PsoIter). This provides more iterations in local search toward the end of a PSO search, promoting more iterations at the final stage. Secondly, in terms of list size, a list from the PSO algorithm (PSOList, similar to the LAHC list) that keeps the objective value of each solution is provided to the local search, see Figure 3 lines #7 and #19. Hence, the history of the solutions' objective values will be saved in the PSOList. This benefits from the previous experiments and saves the initialization time of the ALS list.
Figure 3. The pseudo-code of the proposed PSO algorithm with adaptive local search
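The adaptive parameter setting described above can be sketched as follows. The 0.75 trigger, NumOfIte = idlesteps × PsoIter, and the reuse of the PSO objective-value history follow the text; the function signature and the `run_lahc` callable are assumptions for illustration.

```python
import random

ALS_TRIGGER = 0.75   # experimentally chosen threshold from the paper

def maybe_run_als(pso_iter, idle_steps, pso_list, solution, run_lahc):
    """Decide whether to hand `solution` to the adaptive local search and,
    if so, derive the LAHC parameters adaptively. `run_lahc` is an assumed
    callable: (solution, num_of_ite, history_list) -> improved solution."""
    if random.random() <= ALS_TRIGGER:
        return solution                  # skip ALS for most solutions
    # More local-search iterations late in the PSO run:
    num_of_ite = idle_steps * pso_iter
    # Reuse PSO's own objective-value history as the LAHC list, saving
    # the list-initialization step of plain LAHC:
    return run_lahc(solution, num_of_ite, list(pso_list))
```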
4. EMPIRICAL EVALUATION RESULTS AND DISCUSSION

This part of the article looks into the effectiveness and strength of the suggested PSO with the adaptive local search algorithm (PSO_ALS). Further, this study compared PSO_ALS with other population-based algorithms; the algorithms compared in this study include GA, MFO and FFA. Accordingly, the tests carried out in this study involved the use of 8 datasets comprising various characteristics. Table 1 presents the eight datasets utilized in this study. These commonly used standard datasets were obtained from the UCI data source [32], and they have been used in several well-confirmed studies. The primary attributes of these datasets are as follows: number of attributes (features), number of examples (instances), and the number of possible class values. Table 1 shows the details.

For the purpose of this work, the instances in the datasets were split into two groups for testing and training. Specifically, 80% of the instances were utilized in training, while the other 20% were used in testing. The use of this division was proposed in Friedman et al. [33]. The runs and experiments were performed using a system with the following specifications: Intel CPU i5-5200U 2.2 GHz and a RAM of 8.0 GB. The parameter values of the suggested algorithm are depicted in Table 2. Accordingly, these values have been identified based on the results obtained from 10 warming-up experimental runs, and as can be observed, Table 2 shows the better settings of the algorithm's parameters, which generate better accuracy.
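The 80%/20% holdout described above can be reproduced with a simple shuffle-and-slice; the fixed seed below is an assumption added for reproducibility and is not stated in the paper.

```python
import random

def train_test_split(instances, train_frac=0.8, seed=42):
    """Shuffle and split instances into training and testing groups,
    mirroring the 80%/20% holdout used in the experiments."""
    rng = random.Random(seed)            # assumed seed, for reproducibility
    idx = list(range(len(instances)))
    rng.shuffle(idx)
    cut = int(len(instances) * train_frac)
    train = [instances[i] for i in idx[:cut]]
    test = [instances[i] for i in idx[cut:]]
    return train, test
```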
Table 1. The employed datasets

Dataset name   Features   Instances   Class
German         20         1000        2
WBC            10         699         2
SpectF         44         267         2
Sonar          60         208         2
Ionosphere     34         351         2
Heart          13         270         2
WDBC           31         569         2
Parkinsons     23         197         2
Table 2. Better settings of the algorithm's parameters

Parameter                    Value
move rate (w1)               0.5
population size              20
Max idle steps               50
Max number of iterations     200
Limit search range (vmax)    4
Table 3 displays the number of selected features (NF) and the best-attained accuracy (ACC) utilized in the comparison between the PSO_ALS algorithm and other state-of-the-art algorithms, namely GA, MFO, FFA, and PSO. Average accuracy results of GA, MFO, FFA, and PSO were compared with those of PSO_ALS. The results in Table 3 demonstrate the superiority of the PSO_ALS algorithm when contrasted with other techniques in terms of attained accuracy in 75% of the cases, and comparable (same accuracy) in 12.5%. Also, GA failed to obtain a superior accuracy result in comparison with the MFO and PSO algorithms, which obtained one equal accuracy result on the WDBC dataset. Table 3 also shows that the performance of the PSO_ALS algorithm supersedes other algorithms with respect to the number of features in just 2 datasets; the MFO algorithm attains the best outcomes in 4 datasets.
Table 3. Comparison of the best accuracy of GA, MFO, FFA, PSO and PSO_ALS

             GA                  MFO                 FFA                 PSO                 PSO_ALS
Dataset      NF  ACC    Avg-ACC NF  ACC    Avg-ACC  NF  ACC    Avg-ACC  NF  ACC    Avg-ACC  NF  ACC    Avg-ACC
German       11  78.00  77.26   12  78.63  77.66     8  77.88  77.03    13  78.38  77.88    15  81.50  79.87
Heart         6  88.03  84.81    5  88.42  88.42     5  88.42  87.33     5  88.42  87.29     6  90.74  89.02
Ionosphere   20  87.86  87.21   15  89.64  88.75     9  88.21  87.75    19  89.64  89.18    13  90.35  88.95
Parkinsons   14  88.42  86.94   10  88.42  87.46     7  88.50  87.16    10  89.00  87.20     6  92.30  89.39
SpectF       33  83.20  79.16   14  85.97  80.94    14  84.55  80.35    24  84.13  80.91    20  87.38  85.92
Sonar        36  84.52  82.03   12  86.32  85.26    17  83.90  82.81    33  86.28  84.64    24  85.00  84.51
WDBC         15  98.75  95.22    8  98.96  96.95    12  98.75  96.59    13  98.96  96.70    10  98.96  97.98
WBC           6  98.17  97.86    7  98.17  98.17     6  98.17  97.86     6  98.17  97.88     5  99.27  99.01
Results of the average accuracy achieved by the same algorithms are also displayed in Table 3. As can be observed, the PSO_ALS algorithm proposed in this study achieved six best average results, particularly on the German, Heart, WDBC, Parkinsons, SpectF and WBC datasets. Meanwhile, PSO shows the best average on the Ionosphere dataset, while MFO shows the best average on the Sonar dataset. The details are shown in Table 3, whereby the accuracies with the highest average are bolded in the table above.
The significance of the obtained results can be determined through the Mann-Whitney test, as was demonstrated by McKnight and Najab [22]. Table 4 accordingly shows the Mann-Whitney statistical test's p-values according to the values of suitability. From these statistical tests, the spotted differences and improvements are proven to be meaningful. The excellence of PSO_ALS regarding average accuracy over other comparable algorithms is proven in Table 4. As the table shows, the proposed algorithm is statistically significant for most of the cases, excepting some cases.
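For reference, the significance test behind Tables 4 and 5 can be sketched in a self-contained form. The block below computes the two-sided Mann-Whitney U p-value via the normal approximation without tie correction; it is a simplified stand-in for the statistical procedure, not the authors' exact implementation.

```python
import math

def mann_whitney_u(a, b):
    """Mann-Whitney U statistic and two-sided p-value (normal approximation,
    no tie correction) for two independent samples a and b."""
    n1, n2 = len(a), len(b)
    pooled = sorted(a + b)
    # Average 1-based rank for each distinct value (handles ties).
    ranks, i = {}, 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2.0
        i = j
    r1 = sum(ranks[x] for x in a)            # rank sum of sample a
    u1 = r1 - n1 * (n1 + 1) / 2.0
    u = min(u1, n1 * n2 - u1)                # smaller of the two U statistics
    mu = n1 * n2 / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (u - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2.0))   # two-sided p-value
    return u, p
```

For the paper-scale comparisons one would feed the per-run accuracies of two algorithms into `mann_whitney_u` and read off the p-value against the 0.05 threshold.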
Table 4. The Mann-Whitney test p-values for the results of average accuracy

Datasets     GA      MFO     FFA     PSO
WBC          0.100   0.485   0.200   0.315
SpectF       0.100   0.012   0.002   0.210
German       0.002   0.057   0.028   0.057
WDBC         0.000   0.013   0.022   0.012
Heart        0.200   0.281   0.200   0.015
Sonar        0.551   0.125   0.465   0.001
Parkinsons   0.013   0.342   0.012   0.231
Ionosphere   0.013   0.001   0.114   0.485
* p ≥ 0.05
Table 5 presents the Mann-Whitney test's levels of marginal significance (p-values) based on the number of features. As can be seen from the table, the observed differences between the PSO_ALS and GA algorithms are statistically significant for all datasets, and for most other competitor methods they are statistically significant, excluding the PSO algorithm. Figure 4 shows a comparison of the best accuracy between the GA, FFA, MFO, PSO, and PSO_ALS algorithms, and it can be seen that PSO_ALS clearly achieves the best results on most datasets.
Table 5. The Mann-Whitney test p-values for the results of selected features

Datasets     GA      FFA     MFO     PSO
WBC          0.001   0.012   0.436   0.912
SpectF       0.001   0.853   0.019   0.063
German       0.004   1.000   0.105   0.143
WDBC         0.000   0.105   0.002   0.853
Heart        0.001   0.003   0.000   0.631
Sonar        0.002   0.000   0.315   0.631
Parkinsons   0.000   0.218   0.002   0.089
Ionosphere   0.002   0.009   0.143   0.002
* p ≥ 0.05
Figure 4. Accuracy comparison between PSO, GA, MFO, FFA, and PSO_ALS
Figure 5(a) shows that the PSO algorithm at some points keeps not improving the solution over several iterations, which means it gets stuck in local optima at some points. This is exemplified by the first graph shown in Figure 5, which demonstrates the behavior of PSO and PSO_ALS on the Heart dataset. As shown in Figure 5(a), the solution is not improved from iteration number 110 to 180. Contrariwise, in the Figure 5(b) graph for PSO_ALS applied to the same dataset, the algorithm converges smoothly and generates superior results, because the behavior of the adaptive local search accepts worse solutions to move the algorithm out of local optima. Further, the PSO algorithm at iteration #120 demonstrates its ability to generate an accuracy of 87%; in PSO_ALS, similar accuracy is produced after iteration #170. Thus, the observed behavior can be attributed to the application of the adaptive local search.
Figure 5. The convergence behavior of the (a) PSO and (b) PSO_ALS algorithms on the Heart dataset
Accepting a candidate solution that has less accuracy in order to jump out of local optima balances the local intensification and global diversification of the search, as can be seen from the final accuracy produced by both algorithms. For most of the datasets considered in this research, the PSO_ALS technique yielded equal or improved results in accuracy and the number of selected features. However, it should be noted that not all the differences observed are statistically significant when compared to other competitors. Accordingly, Table 6 shows a comparison of the best results obtained by the PSO_ALS algorithm against some solutions reported in the literature, involving the use of the eight tested datasets. For the purpose of this study, accuracy is utilized as the main goal when comparing the performance of the algorithms. In Table 6, the highest accuracy is shown in bold. Table 6 shows that PSO_ALS proposed in this study achieved values that are highly comparable to those of most competitors in terms of accuracy. Moreover, PSO_ALS shows superior performance on some datasets when compared to other algorithms.
Table 6. Comparing PSO_ALS with contemporary approaches

Dataset      PSO_ALS   Best-known result   Source
German       81.50     78.63               Alzaqebah et al. [22]
Heart        90.74     88.42               Alzaqebah et al. [22]
Ionosphere   90.35     89.90               Mafarja et al. [34]
Parkinsons   92.30     92.00               Kumar and Kumar [35]
SpectF       87.38     86.38               Alzaqebah et al. [22]
Sonar        85.00     91.20               Mafarja et al. [34]
WDBC         98.96     98.96               Alzaqebah et al. [22]
WBC          99.27     98.35               Alzaqebah et al. [22]
5. CONCLUSION AND FUTURE WORKS
In the current article, the application of the PSO algorithm with the adaptive local search method (the PSO_ALS algorithm) was demonstrated for the feature selection problem. In the current work, the algorithms are applied to a benchmark of eight standard UCI datasets. The PSO_ALS results were contrasted against those generated by four approaches found in the literature. The method proposed in this work demonstrated superior performance when compared with other equivalent methods, by balancing local intensification and global diversification of the search: the PSO algorithm finds the best global solution within the search space, while the adaptive local search method explores the local search space. The utilization of the adaptive local search technique improves the results of the suggested algorithm. PSO_ALS shows performance that is superior to other comparable approaches, and also to the basic PSO algorithm.
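The interplay described above, global diversification by PSO over binary feature masks and local intensification around the best solution, can be sketched as follows. This is a minimal illustration, not the paper's exact implementation: the fitness function is a toy stand-in for classifier accuracy (with hypothetical "informative" features), and a simple bit-flip hill climber stands in for the adaptive local search.

```python
import math
import random

random.seed(42)

N_FEATURES = 8
INFORMATIVE = {1, 4, 6}  # hypothetical ground truth for the toy fitness only

def fitness(mask):
    # Stand-in for classifier accuracy on the selected feature subset:
    # reward selecting informative features, penalize superfluous ones.
    selected = {i for i, b in enumerate(mask) if b}
    hits = len(selected & INFORMATIVE)
    return hits - 0.2 * len(selected - INFORMATIVE)

def local_search(mask, tries=10):
    # Bit-flip hill climbing as a stand-in for the adaptive local search:
    # flip one feature bit at a time, keep the flip only if fitness improves.
    best, best_f = list(mask), fitness(mask)
    for _ in range(tries):
        cand = list(best)
        i = random.randrange(N_FEATURES)
        cand[i] = 1 - cand[i]
        if fitness(cand) > best_f:
            best, best_f = cand, fitness(cand)
    return best

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def binary_pso(n_particles=12, iters=40, w=0.7, c1=1.5, c2=1.5):
    # Binary PSO: positions are 0/1 feature masks; a sigmoid transfer
    # function turns each velocity component into a bit probability.
    pos = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(n_particles)]
    vel = [[0.0] * N_FEATURES for _ in range(n_particles)]
    pbest = [list(p) for p in pos]
    gbest = max(pbest, key=fitness)
    for _ in range(iters):
        for k in range(n_particles):
            for d in range(N_FEATURES):
                r1, r2 = random.random(), random.random()
                vel[k][d] = (w * vel[k][d]
                             + c1 * r1 * (pbest[k][d] - pos[k][d])
                             + c2 * r2 * (gbest[d] - pos[k][d]))
                pos[k][d] = 1 if random.random() < sigmoid(vel[k][d]) else 0
            if fitness(pos[k]) > fitness(pbest[k]):
                pbest[k] = list(pos[k])
        # Intensification step: refine the best-known mask locally.
        gbest = local_search(max(pbest, key=fitness))
    return gbest

best = binary_pso()
print(best, fitness(best))
```

The PSO update supplies the global exploration, while the hill-climbing pass after each iteration intensifies the search around the current global best, which is the division of labor the PSO_ALS design relies on.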
REFERENCES
[1] M. Bkassiny, Y. Li, and S. K. Jayaweera, "A survey on machine-learning techniques in cognitive radios," IEEE Communications Surveys & Tutorials, vol. 15, no. 3, pp. 1136-1159, 2012.
[2] P. Birkle, et al., "Machine Learning-based Approach for Automated Identification of Produced Water Types from Conventional and Unconventional Reservoirs," Petroleum Geostatistics, vol. 2019, no. 1, pp. 1-5, 2019.
[3] J. Huang, G. Li, Q. Huang, and X. Wu, "Joint feature selection and classification for multilabel learning," IEEE Transactions on Cybernetics, vol. 48, no. 3, pp. 876-889, 2017.
[4] P. Zhu, Q. Xu, Q. Hu, C. Zhang, and H. Zhao, "Multi-label feature selection with missing labels," Pattern Recognition, vol. 74, pp. 488-502, 2018.
[5] A. Tripathy, A. Agrawal, and S. K. Rath, "Classification of sentiment reviews using n-gram machine learning approach," Expert Systems with Applications, vol. 57, pp. 117-126, 2016.
[6] M. Al-Batah, S. Mrayyen, and M. Alzaqebah, "Arabic Sentiment Classification using MLP Network Hybrid with Naive Bayes Algorithm," Journal of Computer Science, vol. 14, no. 8, pp. 1104-1114, 2018.
[7] V. Bolón-Canedo, N. Sánchez-Marono, A. Alonso-Betanzos, J. M. Benítez, and F. Herrera, "A review of microarray datasets and applied feature selection methods," Information Sciences, vol. 282, pp. 111-135, 2014.
[8] W. Gao, L. Hu, P. Zhang, and F. Wang, "Feature selection by integrating two groups of feature evaluation criteria," Expert Systems with Applications, vol. 110, pp. 11-19, 2018.
[9] D. Rodrigues, et al., "A wrapper approach for feature selection based on bat algorithm and optimum-path forest," Expert Systems with Applications, vol. 41, no. 5, pp. 2250-2258, 2014.
[10] M. Mafarja and S. Mirjalili, "Whale optimization approaches for wrapper feature selection," Applied Soft Computing, vol. 62, pp. 441-453, 2018.
[11] L. Jian, J. Li, K. Shu, and H. Liu, "Multi-label informed feature selection," IJCAI, pp. 1627-1633, 2016.
[12] K. Hussain, M. N. M. Salleh, S. Cheng, and Y. Shi, "Metaheuristic research: a comprehensive survey," Artificial Intelligence Review, vol. 52, pp. 2191-2233, 2019.
[13] A. E. Hassanien and E. Emary, "Swarm intelligence: principles, advances, and applications," CRC Press, 2018.
[14] M. Alzaqebah, S. Jawarneh, H. M. Sarim, and S. Abdullah, "Bees Algorithm for Vehicle Routing Problems with Time Windows," International Journal of Machine Learning and Computing, vol. 8, pp. 234-240, 2018.
[15] T. Zhang, S. Wang, W. Tian, and Y. Zhang, "ACO-VRPTWRV: A new algorithm for the vehicle routing problems with time windows and re-used vehicles based on ant colony optimization," in Sixth International Conference on Intelligent Systems Design and Applications, 2006, pp. 390-395.
[16] M. Alzaqebah and S. Abdullah, "An adaptive artificial bee colony and late-acceptance hill-climbing algorithm for examination timetabling," Journal of Scheduling, vol. 17, no. 3, pp. 249-262, 2014.
[17] M. Alzaqebah and S. Abdullah, "Hybrid bee colony optimization for examination timetabling problems," Computers & Operations Research, vol. 54, pp. 142-154, 2015.
[18] L. Brezočnik, I. Fister, and V. Podgorelec, "Swarm intelligence algorithms for feature selection: a review," Applied Sciences, vol. 8, p. 1521, 2018.
[19] O. Alomari and Z. A. Othman, "Bees algorithm for feature selection in network anomaly detection," Journal of Applied Sciences Research, vol. 8, pp. 1748-1756, 2012.
[20] Y. Wan, M. Wang, Z. Ye, and X. Lai, "A feature selection method based on modified binary coded ant colony optimization algorithm," Applied Soft Computing, vol. 49, pp. 248-258, 2016.
[21] M. Alweshah, et al., "The monarch butterfly optimization algorithm for solving feature selection problems," Neural Computing and Applications, pp. 1-15, 2020.
[22] M. A. Alzaqebah, N. Alrefai, E. Ahmed, S. Jawarneh, and M. Alsmadi, "Neighborhood search methods with Moth Optimization algorithm as a wrapper method for feature selection problems," International Journal of Electrical and Computer Engineering (IJECE), vol. 10, no. 4, pp. 3672-3684, 2020.
[23] J. Kennedy and R. Eberhart, "Particle swarm optimization," Proceedings of ICNN'95 - International Conference on Neural Networks, 1995, pp. 1942-1948.
[24] M. A. Esseghir, G. Goncalves, and Y. Slimani, "Adaptive particle swarm optimizer for feature selection," International Conference on Intelligent Data Engineering and Automated Learning, 2010, pp. 226-233.
[25] R. A. Ibrahim, et al., "Improved salp swarm algorithm based on particle swarm optimization for feature selection," Journal of Ambient Intelligence and Humanized Computing, vol. 10, pp. 3155-3169, 2019.
[26] B. Thamaraichelvi and G. Yamuna, "Hybrid firefly swarm intelligence based feature selection for medical data classification and segmentation in SVD-NSCT domain," International Journal of Advanced Research, vol. 4, pp. 744-760, 2016.
[27] K. Menghour and L. Souici-Meslati, "Hybrid ACO-PSO based approaches for feature selection," International Journal of Intelligent Engineering and Systems, vol. 9, no. 3, pp. 65-79, 2016.
[28] L. Y. Chuang, C. H. Yang, and C. H. Yang, "Tabu search and binary particle swarm optimization for feature selection using microarray data," Journal of Computational Biology, vol. 16, no. 12, pp. 1689-1703, 2009.
[29] K. Thangavel and C. Velayutham, "Mammogram image analysis: Bio-inspired computational approach," Proc. of the International Conference on Soft Computing for Problem Solving (SocProS 2011), 2012, pp. 941-955.
[30] E. K. Burke and Y. Bykov, "The late acceptance hill-climbing heuristic," University of Stirling, Tech. Rep., 2012.
[31] M. Alzaqebah, et al., "Self-Adaptive Bee Colony Optimisation Algorithm for the Flexible Job Shop Scheduling Problem," International Journal of Operational Research, 2020.
[32] A. Frank and A. Asuncion, "UCI machine learning repository," vol. 15, p. 22, 2011. [Online]. Available: http://archive.ics.uci.edu/ml.
[33] J. Friedman, T. Hastie, and R. Tibshirani, "The elements of statistical learning," Springer Series in Statistics, New York, vol. 1, no. 10, 2001.
[34] M. Mafarja, et al., "Evolutionary population dynamics and grasshopper optimization approaches for feature selection problems," Knowledge-Based Systems, vol. 145, pp. 25-45, 2018.
[35] R. N. Kumar and M. A. Kumar, "A novel feature selection algorithm with dempster shafer fusion information for medical data sets," International Journal of Applied Engineering Research, vol. 12, no. 14, pp. 4205-4212, 2017.