International Journal of Electrical and Computer Engineering (IJECE)
Vol. 5, No. 5, October 2015, pp. 1027~1034
ISSN: 2088-8708

Journal homepage: http://iaesjournal.com/online/index.php/IJECE
A New Approach Based on Quantum Clustering and Wavelet Transform for Breast Cancer Classification: Comparative Study

Nezha Hamdi¹, Khalid Auhmani², Moha. M'Rabet Hassani¹
¹ Department of Physics, Faculty of Sciences Semlalia, Cadi Ayyad University, Marrakech, Morocco
Nezha_hamdi@yahoo.com
² Department of Industrial Engineering, National School of Applied Sciences, Cadi Ayyad University, Safi, Morocco
k.auhmani@uca.ma
Article Info

Article history:
Received Apr 17, 2015
Revised Jun 5, 2015
Accepted Jun 20, 2015

ABSTRACT
Feature selection involves identifying a subset of the most useful features that produces the same results as the original set of features. In this paper, we present a new approach for improving classification accuracy. This approach is based on quantum clustering for feature subset selection and on the wavelet transform for feature extraction. The feature selection is performed in three steps. First, the mammographic image undergoes a wavelet transform and some features are extracted. In the second step, the original feature space is partitioned into clusters in order to group similar features. This operation is performed using the quantum clustering algorithm. The third step deals with the selection of a representative feature for each cluster. This selection is based on similarity measures such as the correlation coefficient (CC) and the mutual information (MI). The feature that maximizes this information (CC or MI) is chosen by the algorithm. This approach is applied to breast cancer classification. The K-nearest neighbors (KNN) classifier is used to achieve the classification. We present classification accuracy versus feature type, wavelet transform, and the number of neighbors K in the KNN classifier. An accuracy of 100% was reached in some cases.
Keyword:
Classification
Correlation coefficient
Feature selection
Mutual information
Quantum clustering
Wavelet transform
Copyright © 2015 Institute of Advanced Engineering and Science. All rights reserved.
Corresponding Author:
Khalid Auhmani
Department of Industrial Engineering, National School of Applied Sciences, Cadi Ayyad University, BP 63, Safi, Morocco
Email: k.auhmani@uca.ma
1. INTRODUCTION
The high-dimensional nature of many data sets in bioinformatics has given rise to a wealth of feature subset selection techniques. Feature selection aims at identifying a subset of the most useful features that allows the same results as the original set of features. Feature subset selection is an effective method for removing irrelevant features and for improving learning accuracy and classification accuracy. Many methods have been studied for different applications. They are generally classified into three categories: Wrapper, Filter, and Embedded [1].
Wrapper approaches use the classification error rate as an evaluation criterion [2]. They incorporate the classification algorithm in the search and selection of attributes. These methods achieve high performance. However, they require performing a classification for each subspace of attributes, which can become costly in computation time, especially when the dimension d of the input space is large. These methods are also very dependent on the classification algorithm used.
Filter approaches use an evaluation function based on the characteristics of all data, independently of any classification algorithm [3-7]. These methods are fast, general, and less expensive in computation time, which allows them to operate more easily on databases of very large dimensions. However, as they are independent of the classification stage, they do not guarantee reaching the best classification accuracy.
In order to combine the advantages of both methods, hybrid "embedded" algorithms have been proposed. The feature selection process is performed in conjunction with the classification process. A filter-type evaluation function is first used to screen the most discriminating feature subspaces. Then the misclassification error rates obtained on each previously selected discriminant subspace are compared in order to determine the final subspace [8, 9].
Due to their computational efficiency and their independence of any classification algorithm, the "filter" approaches are more popular and commonly used. With respect to filter feature selection methods, the application of cluster analysis has been demonstrated to be more effective than traditional feature selection algorithms [1].
In contrast with feature selection for supervised learning systems, relatively few approaches have been proposed for unsupervised learning (automatic classification or clustering). Indeed, the feature selection problem for automatic classification is much more difficult than in supervised cases (discrimination), where the data are labeled [10]. Another important problem associated with classification concerns the automatic determination of the number of clusters, which is clearly influenced by the outcome of the feature selection.
In cluster analysis, data are divided into groups (clusters). The goal is that the objects within a group be similar (or related) to one another and different from (or unrelated to) the objects in other groups. This step is useful for feature selection purposes.
In our study we propose a new method for feature selection based on clustering. This method works in two steps. In the first, features are grouped into clusters. We apply the quantum clustering algorithm (QC) presented in [11] for this purpose. In the second step, the most representative feature, i.e. the one that is strongly related to the target classes, is selected from each cluster to form the final subset of features. This selection is performed by using similarity measures such as the correlation coefficient (CC) and the mutual information (MI). The feature that maximizes this information is chosen by the algorithm.
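As an illustration of the second step, the sketch below assumes the cluster assignment is already given (a stand-in for the quantum clustering output) and uses the absolute correlation coefficient as the similarity measure; `select_representatives` and the toy data are illustrative, not the authors' implementation:

```python
import numpy as np

def select_representatives(X, y, cluster_of):
    # From each cluster of features, keep the one with the highest |CC| with the labels y.
    selected = []
    for c in np.unique(cluster_of):
        members = np.where(cluster_of == c)[0]
        scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in members]
        selected.append(int(members[int(np.argmax(scores))]))
    return sorted(selected)

# Toy data: 100 samples, 4 features; features {0, 1} form one cluster, {2, 3} another.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 100).astype(float)
X = rng.normal(size=(100, 4))
X[:, 1] += 2.0 * y                    # feature 1 carries the class information
cluster_of = np.array([0, 0, 1, 1])   # stand-in for the quantum clustering output
print(select_representatives(X, y, cluster_of))
```

One feature survives per cluster, so the informative feature 1 is kept while its cluster-mate 0 is discarded as redundant.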
2. MATERIALS AND METHODS
2.1 Framework
Mammography is currently the most effective imaging modality for breast cancer screening. However, the sensitivity of mammography is highly challenged by the presence of dense breast parenchyma, which deteriorates both detection and characterization tasks [12-13].
Computer Aided Diagnosis (CAD) systems have been developed to aid radiologists in detecting mammographic lesions, characterized by promising performance [14-15]. CAD systems for aiding the decision concerning biopsy and follow-up are still under development. Various CAD diagnosis algorithms have been proposed.
These algorithms are based on extracting image features from regions of interest (ROIs) and estimating the probability of malignancy. A variety of computer-extracted features and classification schemes have been used to automatically discriminate between benign and malignant images.
The majority of these studies have followed three steps. The first step deals with two problems: suppressing the noise and enhancing the contrast between the region of interest (ROI) and the background; this is the task of preprocessing. The second step deals with the extraction of features. This task can be done on the image without any transformation, or on the image after an adequate transformation. The third step deals with the selection of features and classification.
Figure 1 presents the framework we realized to improve the classification accuracy of mammographic images through the proposed method. The proposed algorithm was tested on the original mammographic images of the MIAS database [16].
Preprocessing is an important issue in low-level image processing. The underlying principle of preprocessing is to enlarge the intensity difference between objects and background and to produce reliable representations of breast tissue structures. More details of the proposed preprocessing method can be found in [17, 18].
Before extracting features, we decided to transform the input image by a wavelet transform. This choice is imposed by the fact that microcalcifications are characterized by high-frequency behavior. Microcalcifications are a sign of the possible presence of cancer in the breast.
The basic idea behind the wavelet transform is to analyze different frequencies of an image using different scales. High frequencies are analyzed using low scales whilst low frequencies are analyzed at high scales. This is a far more flexible approach than the Fourier transform, enabling analysis of both local and global features [19].
In this work we use the discrete wavelet transform (DWT), based on the concept of multiresolution analysis (MRA) introduced by Mallat [20], and the double density wavelet transform (DDWT) introduced by Ivan Selesnick [21, 22]. Images are decomposed into three levels, then features are extracted.
Figure 1. Framework of the proposed feature subset selection algorithm
2.2 Discrete Wavelet Transform
For the DWT, an image is first transformed by a filter in the horizontal direction. The high-pass filter and the low-pass filter are finite impulse response filters. The filtered outputs are downsampled by a factor of 2 in the horizontal direction. The signals are then filtered by an identical filter in the vertical direction and downsampled by a factor of 2 again. The result is a decomposition of the image into four sub-bands: an approximation sub-band and three detail sub-bands. The three detail sub-bands represent the finest-scale wavelet coefficients, while the approximation sub-band corresponds to coarse-level coefficients. To obtain the next coarse level of wavelet coefficients, the approximation sub-band is further decomposed and critically sampled. This results in a two-level wavelet decomposition, as shown in Figure 2.
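The filter-and-downsample scheme described above can be sketched with simple averaging/differencing (Haar-like) filters; this is a minimal illustration of the row-then-column decomposition, not the actual filter bank used in the paper:

```python
import numpy as np

def haar_dwt2(img):
    # One-level 2D DWT: filter and downsample rows, then columns.
    lo = lambda a: (a[..., 0::2] + a[..., 1::2]) / 2.0   # low-pass + downsample by 2
    hi = lambda a: (a[..., 0::2] - a[..., 1::2]) / 2.0   # high-pass + downsample by 2
    L, H = lo(img), hi(img)          # horizontal direction
    LL, LH = lo(L.T).T, hi(L.T).T    # vertical direction on the low band
    HL, HH = lo(H.T).T, hi(H.T).T    # vertical direction on the high band
    return LL, LH, HL, HH            # approximation + three detail sub-bands

img = np.arange(16, dtype=float).reshape(4, 4)
LL, LH, HL, HH = haar_dwt2(img)
print(LL.shape)  # (2, 2): each sub-band is a quarter of the image
```

Recursing on `LL` yields the next coarse level, exactly as in the two-level decomposition of Figure 2.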
Figure 2. Discrete wavelet transform filter bank for one-level image decomposition [23]
[Figure 1 blocks: Mammographic Images → Preprocessing → Wavelet Transform → Feature Extraction → Quantum Clustering → Feature Selection (Correlation coefficient (CC) / Mutual Information (MI)) → Selected Features → Classification Accuracy]
2.3 Double Density Wavelet Transform
Although the DWT is a powerful signal processing tool, it has two severe disadvantages [24]:
a. Lack of shift-invariance, which means that minor shifts in the input signal can cause major variations in the distribution of energy between wavelet coefficients at different scales.
b. Since the wavelet filters are separable and real, the DWT has poor directional selectivity for diagonal features.
The double density wavelet transform provides higher directional selectivity, better peak signal-to-noise ratio, and better visual perception than spatial domain methods and other frequency domain methods [23].
The DDWT is a shift-insensitive, directional, complex wavelet transform with a very low redundancy factor of two, regardless of the number of scales. Double density wavelets have a single scaling function and two wavelet functions. Figure 3 shows the DDWT scheme [23].
Figure 3. Double density wavelet transform scheme [23]
2.4 Features Extraction
Features are extracted from a set of two-class labeled images (normal and abnormal). The following features are extracted:
- A vector of 24 texture descriptors is formed from a multi-level histogram of 3, 5, 7, and 9 bins [25].
- A vector of descriptors is calculated from the first-order statistical moments [26].
- Three of the six parameters introduced by Tamura are used [27, 28].
- Radon's characteristics are calculated [29, 30].
- Zernike's moments of order n = 12 are calculated, corresponding to 49 features [31].
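The first descriptor above can be sketched as follows, assuming the 24 values are simply the concatenated normalized histograms at 3, 5, 7, and 9 bins (3 + 5 + 7 + 9 = 24); the exact construction in [25] may differ:

```python
import numpy as np

def multilevel_histogram(img, bin_counts=(3, 5, 7, 9)):
    # Concatenate normalized gray-level histograms computed at several bin counts.
    lo, hi = float(img.min()), float(img.max())
    parts = []
    for b in bin_counts:
        h, _ = np.histogram(img, bins=b, range=(lo, hi))
        parts.append(h / h.sum())
    return np.concatenate(parts)   # 3 + 5 + 7 + 9 = 24 descriptors

img = np.random.default_rng(1).integers(0, 256, size=(32, 32))
desc = multilevel_histogram(img)
print(desc.size)  # 24
```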
If the used wavelet transform is the double density discrete wavelet transform (DDWT), the image is divided into nine sub-images, so the number of features is 9 × 139 = 1251. If the transformation is based on the discrete wavelet transform (DWT), the original image is decomposed into four sub-images, and the number of calculated features is 4 × 139 = 556. Each original image will thus be represented by a group of 556 or 1251 features.
2.5 Feature Selection
Extra features can increase computation time and can impact the accuracy of the detection system. Feature selection improves classification by searching for the subset of features which best classifies the training data [32]. Feature selection reduces the dimensionality of data by selecting only a subset of measured features (predictor variables) to create a model [33].
Feature subset selection is a process of identifying and removing irrelevant and redundant features. Irrelevant features, along with redundant features, severely affect the accuracy of the learning machines [34, 35]. Thus, feature subset selection should be able to identify and remove as much of the irrelevant and redundant information as possible.
Many feature subset selection algorithms have been developed. Some of them can effectively eliminate irrelevant features but fail to handle redundant features, yet some others can eliminate the irrelevant features while taking care of the redundant ones [1, 35]. We propose a method that falls into the second group.
Quantum Clustering
Clustering occurs in unsupervised learning tasks, where the labels of the training examples are not known a priori. The goal is to find an organization of the point cloud corresponding to the training examples into M areas, called clusters. The result is a forest, and each tree in the forest represents a cluster [1]. We will use clustering in a particular context in which it is applied directly to the feature vectors: we need to reorganize these features by grouping those that are "closer" to each other in clusters (classes), forming superclasses. We will use a particular clustering approach: quantum clustering.
Horn [11] estimated probability densities locally at a point x by observing the training set around this point. These estimates are implemented by the Parzen windows technique [36].
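As a minimal sketch of the quantities quantum clustering works with, following [11]: a Parzen-window wavefunction built from Gaussians on the training points, and the associated quantum potential V(x), whose minima mark the cluster centers. The data and probe points below are illustrative; the full algorithm also performs gradient descent on V to assign points to clusters:

```python
import numpy as np

def quantum_potential(x, X, sigma):
    # Horn & Gottlieb's potential (up to an additive constant):
    # V(x) = -d/2 + sum_i ||x - x_i||^2 w_i / (2 sigma^2 sum_i w_i),
    # with w_i = exp(-||x - x_i||^2 / (2 sigma^2)), the Parzen-window weights.
    d2 = ((x - X) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2 * sigma**2))
    return -X.shape[1] / 2 + (d2 * w).sum() / (2 * sigma**2 * w.sum())

# Two well-separated blobs; V has a deep well near each blob center.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(5, 0.3, (20, 2))])
v_center = quantum_potential(np.zeros(2), X, sigma=1.0)
v_between = quantum_potential(np.array([2.5, 2.5]), X, sigma=1.0)
print(v_center < v_between)  # True: the blob center sits in a potential well
```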
Bravais-Pearson Correlation Coefficient (CC)
The Bravais-Pearson correlation coefficient is a statistical index that expresses the intensity and direction (positive or negative) of the linear relationship between two quantitative variables. It is a measure of the linear link, i.e. the ability to predict a variable x from another variable y using a linear model. It is therefore an important parameter in (single or multiple) linear regression analysis. However, this coefficient is zero (r = 0) when there is no linear relationship between the variables (which does not exclude the existence of a nonlinear relationship). Moreover, the coefficient has a positive sign if the relationship is positive (direct, increasing) and a negative sign if the relationship is negative (inverse, decreasing).
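For reference, the coefficient can be computed directly from its definition; `pearson_cc` below is an illustrative helper, equivalent to `np.corrcoef` for two variables:

```python
import numpy as np

def pearson_cc(x, y):
    # Bravais-Pearson coefficient: covariance normalized by the standard deviations.
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc * yc).sum() / np.sqrt((xc**2).sum() * (yc**2).sum()))

x = np.arange(10, dtype=float)
print(round(pearson_cc(x, 2 * x + 1), 3))   # 1.0  (perfect increasing linear link)
print(round(pearson_cc(x, -x), 3))          # -1.0 (perfect decreasing linear link)
```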
Mutual Information (MI)
The mutual information of a pair of variables (X, Y) represents their degree of dependence in the probabilistic sense. It measures the amount of information brought by one random variable about another. It is the reduction of uncertainty about a random variable due to the knowledge of another.
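The definition above can be sketched with a histogram (plug-in) estimator of MI; the bin count and data are illustrative choices, not the paper's settings:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    # Plug-in estimate of I(X;Y) = sum p(x,y) * log( p(x,y) / (p(x) p(y)) ).
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y)
    nz = pxy > 0                          # skip empty cells (0 * log 0 = 0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(3)
x = rng.normal(size=5000)
noise = rng.normal(size=5000)
mi_self, mi_noise = mutual_information(x, x), mutual_information(x, noise)
print(mi_self > mi_noise)  # True: x determines itself but is independent of noise
```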
The mutual information and correlation coefficient procedures are not based on a particular model. The construction of a K-nearest neighbors (KNN) model is feasible on the d selected variables. The obtained results for the QC-CC and QC-MI models will be presented.
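The classification stage can be sketched with a plain majority-vote KNN; the paper's experiments were run under Matlab, so this numpy version is only an illustration:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    # Majority vote among the k nearest (Euclidean) training samples.
    preds = []
    for x in X_test:
        d = ((X_train - x) ** 2).sum(axis=1)      # squared distances to training set
        nearest = y_train[np.argsort(d)[:k]]      # labels of the k nearest samples
        vals, counts = np.unique(nearest, return_counts=True)
        preds.append(vals[np.argmax(counts)])
    return np.array(preds)

# Two separable classes on the selected features.
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.9]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([[0.2, 0.1], [5.1, 5.1]]), k=3))  # [0 1]
```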
3. RESULTS AND DISCUSSIONS
In order to measure the performance of the QC-FS algorithm, we have used the KNN classifier. Experiments are performed under Matlab software. The experiment carried out deals with the effect of the wavelet transform, in association with a similarity measure, on the classification accuracy.
Figures 4 and 5 show results corresponding to the double density and discrete wavelet transforms associated with the correlation coefficient and mutual information, respectively.
First, we note the superiority of DDWT independently of the used features and the employed similarity measure for feature selection. Indeed, the best classification accuracies are obtained with the Zernike moments (100%), followed by the statistical moments (91.5%). However, associated with the discrete wavelet transform, Zernike moments give the least satisfactory results, although they are the most successful with DDWT. Here, we show a relation between the feature type and the used transformation. For the discrete wavelet transform (DWT) we reach low performance (61.6%). Concerning the contribution of the CC and MI similarity measures to classification accuracy, we can deduce that MI provides better results than CC. Indeed, we find that the average classification accuracy for MI is about 99%, while this average is around 85% for CC.
Figure 4. Classification accuracy versus wavelet transform and feature type, associated with the similarity measure based on the correlation coefficient (CC)
Figure 5. Classification accuracy versus wavelet transform and feature type, associated with the similarity measure based on mutual information (MI)
4. CONCLUSION
In this paper we have presented a new approach to improve classification accuracy. This approach is based first on the wavelet transform for feature extraction and second on quantum clustering to select relevant features. We have applied a double density wavelet transform on the mammographic image and then extracted some features to be used in classification. The subset feature selection algorithm achieves the selection task in two steps. The first step deals with the clustering of data; this clustering is carried out by the quantum clustering algorithm. The second step deals with the selection of a representative feature of each cluster. This selection is based on mutual information or on the correlation coefficient. Classification accuracy of mammographic images was computed versus the feature type and the used wavelet transform. We noticed that the DDWT provides good accuracies independently of the used features and the employed similarity measure. Indeed, the best classification accuracies were reached with the Zernike moments (100%), followed by the statistical moments (91.5%). However, associated with the discrete wavelet transform, Zernike moments give the least satisfactory results, although they are the most successful with DDWT. For the discrete wavelet transform (DWT) we reach low performance (61.6%). Concerning the contribution of the CC and MI similarity measures to classification accuracy, we can deduce that MI provides better results than CC. Indeed, we find that the average classification accuracy for MI is about 99%, while this average is around 85% for CC.
REFERENCES
[1] Qinbao Song, Jingjie Ni and Guangtao Wang, "A Fast Clustering-Based Feature Subset Selection Algorithm for High-Dimensional Data", IEEE Transactions on Knowledge and Data Engineering, Volume 25, Issue 1, 2011.
[2] R. Kohavi and G. John, "Wrappers for feature subset selection", Artificial Intelligence, pages 273-324, 1997.
[3] Dash M. and Liu H., "Feature Selection for Classification", Intelligent Data Analysis, 1(3), pp 131-156, 1997.
[4] Langley P., "Selection of relevant features in machine learning", In Proceedings of the AAAI Fall Symposium on Relevance, pp 1-5, 1994.
[5] Souza J., "Feature selection with a general hybrid algorithm", Ph.D, University of Ottawa, Ottawa, Ontario, Canada, 2004.
[6] X. He, D. Cai, and P. Niyogi, "Laplacian score for feature selection", In Proceedings of the Advances in Neural Information Processing Systems 'NIPS 05', pages 507-514, Vancouver, Canada, December 2005.
[7] L. Talavera, "Feature selection as a preprocessing step for hierarchical clustering", In Proceedings of the 16th International Conference on Machine Learning 'ICML 99', pages 433-443, Bled, Slovenia, 1999.
[8] S. Das, "Filters, wrappers and a boosting-based hybrid for feature selection", In Proceedings of the 18th International Conference on Machine Learning 'ICML 01', pages 74-81, Williamstown, MA, USA, June 2001.
[9] Guyon I. and Elisseeff A., "An introduction to variable and feature selection", Journal of Machine Learning Research, 3, pp 1157-1182, 2003.
[10] Guyon, I., S. Gunn, M. Nikravesh, and L. Zadeh, "Feature Extraction: Foundations and Applications", Series Studies in Fuzziness and Soft Computing, Physica-Verlag/Springer, 2006.
[11] Horn David and Gottlieb Assaf, "Algorithm for data clustering in pattern recognition problems based on quantum mechanics", Physical Review Letters, 2002, vol. 88, no 1, pp. 018702.1-018702.4.
[12] P.M. Sampat, M.K. Markey, and A.C. Bovik, "Computer-aided detection and diagnosis in mammography", in Handbook of Image and Video Processing, 2nd ed., A.C. Bovik Ed., Academic Press, 2005, pp. 1195-1217.
[13] D.D. Adler and M.A. Helvie, "Mammographic biopsy recommendations", Curr. Opin. Radiol., vol. 4, pp. 123-129, 1992.
[14] M.L. Giger, N. Karssemeijer and S.G. Armato, "Computer-aided diagnosis in medical imaging", IEEE Trans. on Med. Imaging, vol. 20, pp. 1205-1208, 2001.
[15] C.J. Vyborny, M.L. Giger and R.M. Nishikawa, "Computer-aided detection and diagnosis of breast cancer", Radiologic Clinics of North America, vol. 38, pp. 725-740, 2000.
[16] http://peipa.essex.ac.uk/info/mias.html
[17] N. Hamdi, K. Auhmani and M.M. Hassani, "Design of a High-Accuracy Classifier Based on Fisher Discriminant Analysis: Application to Computer-Aided Diagnosis of Microcalcifications", Proceeding of The 2008 International Conference on Computational Science and Its Applications, Perugia, Italy; IEEE Computer Society, pp. 267-273, 2008.
[18] N. Hamdi, K. Auhmani, M.M. Hassani, "Computer Aided Diagnosis: Automated detection and enhancement of microcalcifications in digitized mammograms using wavelet decomposition and local gray thresholding", PCN Journal, 47, 75-78, May 2009.
[19] Misiti, M., Misiti, Y., Oppenheim, G., Poggi, J.M., "Wavelet Toolbox User's Guide", MathWorks Inc., Massachusetts, 1996.
[20] S.G. Mallat, "A Theory for Multiresolution Signal Decomposition: The Wavelet Representation", IEEE Transactions on Pattern Analysis and Machine Intelligence, Volume 11, Issue 7, July 1989, pages 674-693.
[21] W. Selesnick, "The double-density dual-tree DWT", IEEE Trans. on Signal Processing, 52(5):1304-1314, May 2004.
[22] W. Selesnick, "The double density DWT", in A. Petrosian and F.G. Meyer, editors, Wavelets in Signal and Image Analysis: From Theory to Practice, Kluwer.
[23] S. Arivazhagan, L. Ganesan and C.N. Savithri, "Effective multi-resolution transform identification for characterization and classification of texture groups", ICTACT Journal on Image and Video Processing, Volume 02, Issue 02, November 2011.
[24] Varun P. Gopi, V. Suresh Babu, Dilna C, "Image Resolution Enhancement Using Undecimated Double Density Wavelet Transform", Signal Processing: An International Journal (SPIJ), Volume (8), Issue (5), 2014.
[25] Hadjidemetriou, E., Grossberg, M.D., and Nayar, S.K., "Multiresolution histograms and their use for recognition", IEEE Transactions on Pattern Analysis and Machine Intelligence, 26(7), 831-847, 2004.
[26] S. Herlidou, "Caractérisation tissulaire en IRM par l'analyse de texture. Étude du tissu musculaire et de tumeurs intracrâniennes", Université de Rennes 1, 1999.
[27] H. Tamura, S. Mori and T. Yamawaki, "Texture features corresponding to visual perception", IEEE Transactions on Systems, Man and Cybernetics, SMC-8(6): 460-473, 1978.
[28] P. Howarth and S. Rüger, "Evaluation of texture features for content-based image retrieval", In Proceedings of the International Conference on Image and Video Retrieval (CIVR'04), volume LNCS 3115, pages 326-334, Dublin, Ireland, July 2004.
[29] Deans, S.R., "Hough Transform from the Radon Transform", IEEE Trans. on Patt. Anal. and Mach. Intell., Vol. PAMI-3, No. 2, pp. 185-188, 1981.
[30] Murphy, L.M., "Linear feature detection and enhancement in noisy images via the Radon transform", Pattern Recognition Letters, No. 4, pp. 279-284, 1986.
[31] C.W. Chong, P. Raveendran, and R. Mukundan, "A comparative analysis of algorithms for fast computation of Zernike moments", Pattern Recognition, 36:731-742, 2003.
[32] Thuzar Hlaing, "Feature Selection and Fuzzy Decision Tree for Network Intrusion Detection", International Journal of Informatics and Communication Technology (IJ-ICT), Vol. 1, No. 2, pp. 109~118, December 2012.
[33] Lichen Xun, Gang Zheng, "ECG Signal Feature Selection for Emotion Recognition", TELKOMNIKA, Vol. 11, No. 3, pp. 1363~1370, March 2013.
[34] Hall M.A., "Correlation-Based Feature Selection for Discrete and Numeric Class Machine Learning", In Proceedings of the 17th International Conference on Machine Learning, pp 359-366, 2000.
[35] Kohavi R. and John G.H., "Wrappers for feature subset selection", Artif. Intell., 97(1-2), pp 273-324, 1997.
[36] R.O. Duda, P.E. Hart and D.G. Stork, "Pattern Classification", Wiley-Interscience, 2nd ed., 2001.