International Journal of Electrical and Computer Engineering (IJECE)
Vol. 6, No. 3, June 2016, pp. 1176~1182
ISSN: 2088-8708, DOI: 10.11591/ijece.v6i3.8922
Journal homepage: http://iaesjournal.com/online/index.php/IJECE
Walsh Transform Based Feature Vector Generation for Image Database Classification

Tanuja Sarode 1, Jagruti Save 2
1 Department of Computer Engineering, Thadomal Shahani Engineering College, Mumbai
2 MPSTME, NMIMS University, Mumbai, India
Article Info

Article history:
Received Oct 28, 2015
Revised May 4, 2016
Accepted May 20, 2016

ABSTRACT
Thousands of images are generated every day, which implies the need to build an easy, faster, automated classifier to classify and organize these images. Classification means selecting an appropriate class for a given image from a set of pre-defined classes. The main objective of this work is to explore feature vector generation using the Walsh transform for classification. In the first method, we applied the Walsh transform on the columns of an image to generate feature vectors. In the second method, the Walsh wavelet matrix is used for feature vector generation. In the third method, we propose to apply vector quantization (VQ) on the feature vectors generated by the earlier methods. It gives better accuracy, fast computation, and less storage space as compared with the earlier methods. Nearest neighbor and nearest mean classification algorithms are used to classify the input test image. The image database used for the experimentation contains 2000 images. All these methods generate a large number of outputs for a single test image by considering four similarity measures, six sizes of feature vector, two ways of classification, four VQ techniques, three sizes of codebook, and five combinations of wavelet transform matrix generation. We observed an improvement in accuracy from 63.22% to 74% (55% training data) through the series of techniques.
Keywords:
Feature Vector
Image Classification
Vector Quantization
Walsh Transform
Wavelet Transform
Copyright © 2016 Institute of Advanced Engineering and Science. All rights reserved.
Corresponding Author:
Tanuja Sarode,
Department of Computer Engineering,
Thadomal Shahani Engineering College,
Mumbai, India.
Email:
1. INTRODUCTION
Recent increases in storage capacity, processing power, and display resolution have enabled large image database development. Due to advancement in internet technologies, these databases have grown tremendously further. The task of accessing, processing, analyzing, and sharing these images has become more difficult. If images are properly organized, then accessing these images will be fast. Hence, with the large availability of high quality digital images, the need of classifying/categorizing images automatically is becoming increasingly important and challenging nowadays. Human beings easily classify images even if the images are poorly illuminated, partially occluded, and noisy. However, the classification task is not easy for a machine. Hence, designing a generic image classifier remains an elusive goal. The term image classification refers to the process of assigning an image to one of the predefined classes. Manual classification of relevant images from a large database is time consuming, laborious, expensive, and subjective. So many researchers have focused on automatic (machine) classification of images. Content-based image retrieval (CBIR) is a system of retrieving a set of images similar to a query image from a large image database. A successful classification of images will greatly enhance the performance of a CBIR system by filtering out images from irrelevant classes during matching [1].

The problem of classifying images in a database into predefined categories has many levels of generality [2].
It can be as broad as separating indoor and outdoor images [3] or different outdoor scenes [4]; it can be as generic as separating car, flower, and elephant classes; or it can be finer [5], such as separating different types of cloud images [6]. In a one-class classification (unary classification) problem, the image of an object is classified as a genuine object or an outlier object. This classification is useful in some data mining applications like outlier detection and anomaly detection [7],[8]. In a two-class classification (binary classification) problem, a test image is assigned to one of the two predefined classes [9]. This method has an application in the medical field to detect abnormality in medical images [10]. If the classification is for more than two classes, then we get multi-class classification. Classification is also categorized as supervised and unsupervised classification [11],[12]. In general, automatic supervised image classification contains two steps: 1. Feature Extraction, 2. Build Classifier [13]. The simplest way to represent an image in fewer coefficients is to extract color, shape, and texture information from an image and represent it in a compact form [14]-[17]. Global features are extracted from the entire image whereas local features are extracted from parts of an image [18]. Feature extraction can be done in the spatial domain or the transform domain. Image transforms such as the Walsh transform have the property of energy compaction: maximum energy is accumulated in fewer coefficients, which reduces the feature vector size. Once the feature vector is generated, classification procedures such as nearest neighbor classification [19], classification using artificial neural networks [20], and support vector machines [21] build the classifier.

We propose a system in which, initially, the Walsh transform [22] is applied to the columns of an image. Then the row mean vectors of the three planes are combined to make the feature vector of that image [23]. To make an effective and compact representation of the training set feature vectors, different techniques of vector quantization (VQ) are applied on the training set. A nearest neighbor (NN) classifier and a nearest mean (NM) classifier with different similarity measures [24] assign the output class for a test image. To enhance the accuracy of the classifier model, the second method suggests applying the Walsh wavelet to the columns of an image instead of the simple Walsh transform. Addition of VQ techniques increases accuracy further and decreases the time and storage required. The paper is organized as follows: Section 2 explains the detailed procedure of the proposed system. Section 3 discusses all the results of implementation. The conclusion is given in Section 4, followed by references.
2. RESEARCH METHOD
The paper gives three different methods to generate the training set of feature vectors, as explained in sections 2.1, 2.2, and 2.3.
2.1. 'Walsh Transform over Row Mean' based Feature Vector Generation
The stepwise procedure for the first method is as follows:
1. Apply the Walsh transform to the columns of the three planes (R, G, and B) of a training image.
2. Calculate the average of each row of the transformed image planes. This will give one row mean vector of size 256x1 for each plane (size of each image: 256x256) [25].
3. Organize the first 'Z' values of these three vectors one below the other to generate a feature vector of size '3Zx1'. By taking the value of 'Z' as 25, 50, 100, 150, 200, and 256, we get feature vectors of different sizes, namely 75x1, 150x1, 300x1, 450x1, 600x1, and 768x1 respectively. The above three steps are repeated for each training image. This will generate the training set (a sketch of steps 1-3 is given at the end of this subsection).
4. Apply the above procedure to the test image.
5. Apply the nearest neighbor (NN) classifier, where all training feature vectors are used, and apply the nearest mean (NM) classifier, where the average feature vector of each class is used as the training set. Figure 1 shows the procedures for these two classifiers (a sketch of both classifiers follows Figure 1).
Different distance measures such as Euclidean distance, Manhattan distance, Cosine correlation similarity, and Bray-Curtis [26]-[28] are used to calculate the distance between a training feature vector and a testing feature vector.
2.2. Walsh Wavelet Transform based Feature Vector Generation
This method uses the procedure of generation of a wavelet matrix from two orthogonal transform matrices. The algorithm is as follows:
1. Create a Walsh wavelet matrix from two Walsh matrices [29]. To generate a wavelet matrix of size 256x256, we can take two Walsh matrices of sizes '4x4 and 64x64' or '8x8 and 32x32' or '16x16 and 16x16' (an illustrative sketch follows this list).
2. Apply steps 1 to 5 from section 2.1 by replacing the Walsh matrix with the Walsh wavelet matrix.
3. If the wavelet matrix is created from two Walsh transform matrices of sizes 8x8 and 32x32, and the NN classifier is used for classification, then the results are shown under the name Walsh_wavelet_8X32_NN, and if the NM classifier is used, the results are shown under the name Walsh_wavelet_8X32_NM.
4. Apply the above procedure to the test image.
5. Apply the nearest neighbor (NN) classifier and the nearest mean (NM) classifier.
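The exact Walsh wavelet construction follows the hybrid wavelet scheme of [29]. The sketch below is only an illustrative stand-in (an assumption, not necessarily the authors' formulation from [29]): it builds a 256x256 orthogonal matrix from an 8x8 and a 32x32 Walsh matrix via a Kronecker product, and that matrix can then replace the Walsh matrix in the section 2.1 sketch.

```python
import numpy as np

def walsh_wavelet_like(p=8, q=32):
    """Illustrative (p*q) x (p*q) orthogonal matrix built from two Walsh matrices."""
    A = walsh_matrix(p)     # walsh_matrix() from the section 2.1 sketch
    B = walsh_matrix(q)
    return np.kron(A, B)    # Kronecker product of orthogonal matrices stays orthogonal

# Example: a drop-in replacement for W inside feature_vector() to obtain the
# 'Walsh_wavelet_8X32' variants of the method.
W_8x32 = walsh_wavelet_like(8, 32)
```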
[Figure: two flowcharts, left panel 'NN Classifier', right panel 'NM Classifier']
Figure 1. Nearest neighbour (NN) and nearest mean (NM) classifier
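Below is a minimal sketch of the two classifiers shown in Figure 1, together with the four similarity measures listed in section 2.1. The distance definitions are the standard ones (an assumption), and the training data is assumed to be a list of (feature vector, class label) pairs.

```python
import numpy as np

def euclidean(x, y):
    return np.sqrt(np.sum((x - y) ** 2))

def manhattan(x, y):
    return np.sum(np.abs(x - y))

def cosine_distance(x, y):
    # expressed as a distance: 1 - cosine correlation similarity
    return 1.0 - np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))

def bray_curtis(x, y):
    return np.sum(np.abs(x - y)) / np.sum(np.abs(x + y))

def nn_classify(test_vec, train_set, dist=manhattan):
    """Nearest neighbor: compare the test vector against every training feature vector."""
    return min(train_set, key=lambda pair: dist(test_vec, pair[0]))[1]

def nm_classify(test_vec, train_set, dist=manhattan):
    """Nearest mean: compare the test vector against the average feature vector of each class."""
    by_class = {}
    for fv, label in train_set:
        by_class.setdefault(label, []).append(fv)
    means = [(np.mean(fvs, axis=0), label) for label, fvs in by_class.items()]
    return min(means, key=lambda pair: dist(test_vec, pair[0]))[1]
```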
2.3. Proposed Method of Feature Vector Generation
The detailed procedure for the method is as follows:
1. Apply the first three steps from section 2.1 to generate the training set.
2. Each feature vector of size 'Mx1' in the training set is a training point in 'M' dimensional space. Apply vector quantization techniques [30] such as Linde-Buzo-Gray (LBG) [31], Kekre's proportionate error (KPE) [32], Kekre's Fast Codebook Generation (KFCG) [33] and Kekre's median codebook generation (KMCG) [34] algorithms to the training points of each class separately.
3. Generate the codebooks of size 4, 8 and 16. The code vectors are representative of the classes. Hence, the code vectors form the training set (a sketch of this per-class codebook generation is given after this list).
4. Apply the NN classifier to find the class of the test image.
The same procedure is applied on the second method (the Walsh wavelet based training set).
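The sketch below illustrates the per-class codebook generation of this method. LBG, KPE, KFCG, and KMCG are the codebook algorithms actually used in the paper; as an illustrative stand-in the sketch uses SciPy's k-means (an LBG-like procedure), so the specific codebook algorithm here is an assumption.

```python
import numpy as np
from scipy.cluster.vq import kmeans

def class_codebooks(train_set, cb_size=16):
    """Compress each class's training vectors into cb_size code vectors.

    train_set: list of (feature_vector, class_label) pairs.
    Returns a reduced training set of (code_vector, class_label) pairs."""
    by_class = {}
    for fv, label in train_set:
        by_class.setdefault(label, []).append(fv)
    reduced = []
    for label, fvs in by_class.items():
        centroids, _ = kmeans(np.asarray(fvs, dtype=float), cb_size)  # cb_size code vectors per class
        reduced.extend((c, label) for c in centroids)
    return reduced

# The reduced set can be fed directly to nn_classify() from the earlier sketch,
# e.g. nn_classify(test_vec, class_codebooks(train_set, cb_size=16)).
```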
3. RESULTS AND DISCUSSION
For experimentation, a large image database is constructed. This database contains a total of 2000 images (20 classes, 100 images per class). Six classes (bus, dinosaur, elephant, rose, horse, and mountain) are directly taken from the Wang database [35]. Images of the remaining fourteen classes (ibis bird, sunset, bonsai, car, panda, sunflower, airplane, coin, scooter, schooner, kingfisher bird, starfish, Windsor chair, and cup-saucer) are downloaded from the web related to the class keyword. Images are selected in such a way that they have many variations within the class and among the classes. Figure 2 shows the sample images of the training database and testing database. Accuracy of classification is calculated as per equation (1):

Accuracy (%) = (Number of correctly classified test images / Total number of test images) x 100    (1)

Initially, 35 images per class are used for training purposes and the remaining 65 images are used for testing purposes. Then the number of training images is increased by 10 two times (45 images per class and then 55 images per class). It has been observed that as the number of training images increases, accuracy increases. Table 1 indicates the size of the training set, the time required for generation of the training set, and the time required for classification of a single test image for the different methods when we have used 35 images per class for training purposes. It shows that the number of training feature vectors reduces because of vector quantization, hence reducing the classification time. Table 2 specifies the highest accuracy obtained for all three methods for 55 training images and 45 testing images per class. Table 3 shows the variations in the different factors considered for classification. Table 4 shows the confusion matrix obtained for the 'Walsh_wavelet_32X8_NN' method (similarity criteria: Manhattan, number of training images: 55 images/class). This table shows the individual class performance.
The sizes of the codebook we have tried are 4, 8, and 16, because the minimum number of training images is 35 per class. It has been observed that the proposed method of VQ gives better results compared to the earlier methods.
Table 1. Number of training feature vectors and corresponding time required for each method for 35 training images/class
(Number of training images: 700 (35 images/class); size of each feature vector: 768x1)

Method | No. of training feature vectors/class | Total training feature vectors | Time required for generation of all training feature vectors (in sec) | Time required for classification of single testing feature vector (in sec)
Walsh+Row mean+NN | 35 | 700 | 34.79 | 4.24
Walsh+Row mean+NM | 1 | 20 | 35.48 | 0.12
Walsh_wavelet+NN | 35 | 700 | 38.72 | 4.24
Walsh_wavelet+NM | 1 | 20 | 39.32 | 0.12
Proposed VQ (CB size 4) | 4 | 80 | 37.48 | 0.50
Proposed VQ (CB size 8) | 8 | 160 | 38.34 | 1.01
Proposed VQ (CB size 16) | 16 | 320 | 40.57 | 1.92
Observations: The 'NM' method requires the least classification time and the least storage space, as the size of the training set is only 20 training vectors. The 'NN' method requires high classification time and large storage space, as the size of the training set is 700 training vectors. The proposed 'VQ' method generates a training set of codebook (CB) size. It requires more classification time as compared to the 'NM' method, but it is quite less than the 'NN' method.
Figure 2. Sample images of training and testing database
Table 2. Comparison of highest overall classification accuracy achieved for all the methods
(No. of training images: 1100 (55/class); No. of testing images: 900 (45/class))

Method | Highest % accuracy achieved | Corresponding feature vector size | Corresponding similarity criteria | VQ method/Wavelet method
Walsh+Row mean+NN | 61.22 | 600x1 | Euclidean | -
Walsh+Row mean+NM | 52.67 | 300x1 | Cosine correlation | -
Proposed VQ (CB size 16) | 63.22 | 450x1 | Manhattan | Walsh+Row mean+LBG
Walsh_wavelet+NN | 72.78 | 768x1 | Manhattan | Walsh_wavelet_32X8_NN
Walsh_wavelet+NM | 63.89 | 768x1 | Manhattan | Walsh_wavelet_32X8_NM
Proposed VQ (CB size 16) | 74 | 768x1 | Bray-Curtis | Walsh_wavelet_32X8+KMCG
Observations: Accuracy increases from 52.67% to 74%. The highest accuracy is achieved with the proposed technique applied on the second method of Walsh wavelet.
Table 3. Variations in different factors considered for classification

Sr. No. | Factors | No. of Variations | Variations
1 | Feature vector size | 6 | 75x1, 150x1, 300x1, 450x1, 600x1, 768x1
2 | Similarity criteria | 4 | Euclidean, Manhattan, Cosine correlation, Bray-Curtis
3 | Codebook size | 3 | 4, 8, 16
4 | Classifiers | 2 | NN & NM classifier
5 | VQ methods | 4 | LBG, KPE, KFCG, KMCG
6 | Walsh wavelet matrix generation | 5 | 4x64, 8x32, 16x16, 32x8, 64x4
Observations: The 'Walsh transform + row mean' method with factors 1, 2 and 4 generates 48 results for a single test image. The proposed VQ technique over the 'Walsh transform + row mean' method with factors 1, 2, 3 and 5 generates 288 results for a single test image. The Walsh wavelet method with factors 1, 2, 4 and 6 generates 240 results for a single test image. The proposed VQ technique over the Walsh wavelet method with factors 1, 2, 3, 5 and 6 generates 1440 results for a single test image.
Due to space constraints it is not possible to show the results of all the methods with all factor variations. But after implementing all the methods, it has been observed that the proposed method of applying VQ over the earlier methods gives the highest accuracy in most of the classes. In some classes like bus, dinosaur, rose, horse and airplane, the 'NM' method gives better results. It means that those classes are compact and they can be best represented by a single feature vector. Bray-Curtis and Manhattan performance is almost similar, and they both give better performance than the other two similarity criteria in most classes. Euclidean and Cosine correlation show similar performance. In the sunset class and scooter class, cosine similarity gives better performance than any other distance.
Table 4. Confusion matrix for the Walsh_wavelet_32X8_NN method (feature vector size: 768x1, similarity measure: Manhattan distance, no. of training images: 55/class). Rows are actual classes; columns are predicted classes in the same order as the rows: Ibis bird, Sunset, Bonsai, Bus, Dinosaur, Elephant, Rose, Horse, Mountain, Car, Panda, Sunflower, Airplane, Coin, Scooter, Schooner, Kingfisher, Star fish, Windsor chair, Cup-saucer.

Ibis bird:      17  0  2  0  0  5  0  4  1  0  1  0  1  0  1  1  9  3  0  0
Sunset:          0 28  0  1  0  3  6  1  0  0  1  2  0  0  0  0  1  2  0  0
Bonsai:          1  0 26  1  0  2  0  2  0  0  2  0  0  3  1  0  3  1  1  2
Bus:             0  0  1 27  0  1  0  3  2  0  2  0  0  0  4  0  1  4  0  0
Dinosaur:        0  0  0  0 44  0  0  0  0  0  0  0  0  0  0  0  0  0  0  1
Elephant:        2  0  0  0  2 32  0  0  1  0  1  0  1  0  2  0  0  2  0  2
Rose:            0  0  0  0  0  0 41  1  0  0  2  0  0  0  0  0  0  1  0  0
Horse:           2  0  0  0  0  0  0 43  0  0  0  0  0  0  0  0  0  0  0  0
Mountain:        3  0  1  4  0  3  0  4 20  0  1  0  1  0  3  2  2  1  0  0
Car:             0  0  0  5  0  3  1  1  2 23  2  1  1  0  4  0  2  0  0  0
Panda:           1  0  0  1  0  0  0  3  1  2 36  0  0  0  1  0  0  0  0  0
Sunflower:       0  0  0  0  0  0  0  0  0  0  0 43  0  0  0  0  0  1  0  1
Airplane:        0  0  1  0  2  0  0  1  1  0  0  0 40  0  0  0  0  0  0  0
Coin:            1  0  0  0  0  2  0  0  0  0  0  0  0 40  0  0  1  1  0  0
Scooter:         0  0  1  1  0  0  0  0  1  0  0  0  0  0 41  0  0  1  0  0
Schooner:        0  0  0  0  0  0  0  0  0  0  1  0  0  0  0 44  0  0  0  0
Kingfisher:      0  0  0  0  0  2  3  4  0  0  4  0  0  0  0  0 27  3  0  2
Star fish:       0  0  0  4  0  3  1  2  0  0  3  0  0  2  0  0  0 26  1  3
Windsor chair:   2  0  0  0  3  0  0  0  0  0  1  0  0  0  0  0  0  2 37  0
Cup-saucer:      2  0  0  0  4  2  0  1  0  0  3  0  0  0  1  0  8  3  1 20
Observations: Classes whose accuracy is more than 90% are Dinosaur (98%), Rose (91%), Horse (96%), Sunflower (96%), Scooter (91%) and Schooner (98%). The worst performing classes are the Ibis bird class with 38% and the cup-saucer class with 44% accuracy. A few of the Ibis bird images are misclassified as Kingfisher bird images due to similarity in structure and shape. A few of the cup-saucer class images are misclassified as the Dinosaur class because of their plain background. Therefore, the accuracy of those classes degrades.
After the Walsh transform is applied to each column of an image, the low frequency components get stored in the first few coefficients of each column. Hence, the maximum feature vector size is not required to get the highest accuracy in the first method. But in the Walsh wavelet method, the low frequency components are spread. Hence, the highest accuracy is obtained with the feature vector of size 768x1.
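As a quick, illustrative check of this energy-compaction behaviour (on a synthetic smooth column, not on the paper's database), one can reuse the walsh_matrix() helper from the section 2.1 sketch:

```python
import numpy as np

col = np.linspace(0, 255, 256)           # a smooth 256-pixel column as a toy example
W = walsh_matrix(256)                    # sequency-ordered Walsh matrix from the earlier sketch
coeffs = W @ col
energy = coeffs ** 2
print("fraction of energy in the first 25 coefficients:",
      energy[:25].sum() / energy.sum())  # close to 1 for smooth data
```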
4. CONCLUSION
The paper gives methods for generation of a training set using the Walsh transform for classification of an image database. If we increase the training data, the accuracy of classification increases. In the paper, we have shown the results for 55% of the database (1100 images) used for training, and tried to increase accuracy by implementing different techniques of feature vector generation. For classification, a simple and fast method, the nearest neighbor classifier, is used. It has been observed that classes like dinosaur and rose give better results with the NM classifier than the NN classifier, while the other classes give better performance with the NN classifier. On analyzing this fact, it has been found that classes like dinosaur and rose are compact and close, i.e., they have low intra-class distance and high inter-class distance compared to other training classes. After a lot of experimentation, we realized that instead of all training vectors per class or an average training vector per class, there is a need to represent the class by a few but more effective training feature vectors. This resulted in the proposed method of applying vector quantization on the 'Walsh transform + row mean + NN' method. This method increases accuracy from 61.22% to 63.22% and it has the great advantage of reducing the time and storage space required for classification.
However, in pursuit of more accuracy, a better feature vector representation is generated using the wavelet transform. It has been observed that the Walsh wavelet has increased the overall accuracy from 63.22% to 72.78%. Manhattan and Bray-Curtis distances are more suitable similarity measures for this application. The Walsh_wavelet_32X8_NN method gives the highest accuracy of 72.78%. Among all VQ techniques, it has been observed that KMCG with a codebook size of 16 has given better performance. Hence, KMCG is applied on the feature vectors generated by the Walsh wavelet. The accuracy increased up to 74% for a codebook size of 16. Thus the proposed technique not only increases accuracy but also drastically reduces the size of the training set (from 900 training vectors to 320 training vectors), resulting in less storage space required and faster classification.
ACKNOWLEDGEMENTS
We are extremely grateful to our research guide, late Dr. H. B. Kekre, for his valuable guidance, scholarly suggestions and consistent encouragement. Sir had always clarified our doubts despite his busy schedules and provided academic support and facilities to carry out this research work. We acknowledge our deep sense of gratitude towards him. Though he is not physically present with us, his work, words and blessings will remain with us forever.
REFERENCES
[1] Vailaya A., et al., "On Image Classification: City Images vs. Landscapes," Pattern Recognition, Published by Elsevier Science Ltd., vol/issue: 31(12), pp. 1921-1935, 1998.
[2] Mio W., et al., "A learning approach to content-based image categorization and retrieval," VISAPP, vol. 2, pp. 36-43.
[3] N. Ghomsheh A. and Talebpour A., "A New Method for Indoor-Outdoor Image Classification using Color Correlated Temperature," International Journal of Image Processing (IJIP), vol/issue: 6(2), pp. 167-181, 2012.
[4] Favorskaya M. and Proskurin A., "Image Categorization using Colour G-SURF Invariant to Light Intensity," 19th International Conference on Knowledge Based and Intelligent Information and Engineering Systems, Procedia Computer Science, Elsevier publication, vol. 60, pp. 681-690, 2015.
[5] Gao S., et al., "Learning Category-Specific Dictionary and Shared Dictionary for Fine-Grained Image Categorization," IEEE Transaction on Image Processing, vol/issue: 23(2), pp. 623-634, 2013.
[6] Bajwa I., et al., "Feature based Classification by using Principal Component Analysis," ICGST-GVIP Journal, vol/issue: 9(2), pp. 11-17, 2009.
[7] Tax D., "One Class Classification: Concept Learning in the Absence of Counter Examples," Ph.D. Thesis, Delft University of Technology, Delft, Netherland, 2001.
[8] Hamdi N., et al., "A New Approach based on Quantum Clustering and Wavelet Transform for Breast Cancer Classification: Comparative Study," International Journal of Electrical and Computer Engineering (IJECE), vol/issue: 5(5), 2015.
[9] Gopakumar C. and Rahuman R., "Randon Transform based Classification of Mammograms with improved Localization," International Journal of Electrical and Computer Engineering (IJECE), vol/issue: 4(4), pp. 17-24, 2014.
[10] Le T., et al., "A New Support Vector Machine Method for Medical Image Classification," IEEE 2nd European Workshop on Visual Information Processing (EUVIP), pp. 165-170, 2010.
[11] Olaode A., et al., "Unsupervised Classification of Images: A Review," International Journal of Image Processing (IJIP), vol/issue: 8(5), pp. 325-342, 2014.
[12] Nazarloo M., et al., "Gender Classification using Hybrid of Gabor Filters and Binary Features of an Image," International Journal of Electrical and Computer Engineering (IJECE), vol/issue: 4(4), pp. 539-547, 2014.
[13] Jain A., et al., "Statistical pattern recognition: A review," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol/issue: 22(1), pp. 4-37, 2000.
[14] Song H., et al., "Adaptive Feature Selection and Extraction Approaches for Image Retrieval based on Region," Journal of Multimedia, vol/issue: 5(1), pp. 85-92, 2010.
[15] Haralick R., et al., "Textural Features for Image Classification," IEEE Transactions on Systems, Man and Cybernetics, vol/issue: 3(6), pp. 610-621, 1973.
[16] Teague M., "Image Analysis via the General Theory of Moments," Journal of the Optical Society of America, vol. 70, pp. 920-930.
[17] Zhang D. and Lu G., "Shape-based Image Retrieval using Generic Fourier Descriptor," Signal Processing: Image Communication, vol/issue: 17(10), pp. 825-848, 2002.
[18] Liping W. and Juncheng P., "Image Classification Algorithm based on Sparse Coding," Journal of Multimedia, vol/issue: 9(1), pp. 114-122, 2014.
[19] Boiman O., et al., "In Defense of Nearest-Neighbor Based Image Classification," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2008.
[20] Zhou S., et al., "Deep Adaptive Networks for Image Classification," ICIMCS'10 Proceedings of the Second International Conference on Internet Multimedia Computing and Service, pp. 61-64, 2010.
[21] Thai L., et al., "Image Classification using Support Vector Machine and Artificial Neural Network," International Journal on Information Technology and Computer Science, MECS publication, pp. 32-38, 2012.
[22] Walsh J., "A Closed Set of Orthogonal Functions," American Journal of Mathematics, vol. 45, pp. 5-24, 1923.
[23] Kekre H., et al., "DCT Applied to Row Mean and Column Vectors in Fingerprint Identification," in Proceedings of Int. Conf. on Computer Networks and Security (ICCNS), 2008.
[24] Deza E. and Deza M., "Dictionary of Distances," Elsevier, pp. 391, 2006.
[25] Kekre H. B., et al., "Walsh Transform over Row Mean and Column Mean using Image Fragmentation and Energy Compaction for Image Retrieval," International Journal on Computer Science and Engineering (IJCSE), vol/issue: 2(1), pp. 47-54, 2010.
[26] Kekre H., et al., "Effect of Distance Measures on Transform based Image Classification," International Journal of Engineering Science and Technology (IJEST), vol/issue: 4(8), pp. 3729-3742, 2012.
[27] Chen X. and Cham T., "Discriminative Distance Measures for Image Matching," International Conference on Pattern Recognition (ICPR), Cambridge, England, vol. 3, pp. 691-695, 2004.
[28] John P. Van De Geer, "Some Aspects of Minkowski Distance," Department of Data Theory, Leiden University, 1995.
[29] Kekre H., et al., "Inception of Hybrid Wavelet Transform using Two Orthogonal Transforms and It's Use for Image Compression," International Journal of Computer Science and Information Security, vol/issue: 9(6), pp. 80-87, 2011.
[30] Gray R., "Vector Quantization," IEEE ASSP Mag., pp. 4-29, 1984.
[31] Linde Y., et al., "An Algorithm for Vector Quantizer Design," IEEE Transactions on Communications, vol/issue: 28(1), pp. 84-95.
[32] Kekre H. and Sarode T., "Fast Improved Clustering Algorithms for Vector Quantization," NCSPA, Padmashree Dr. D. Y. Patil Institute of Engineering and Technology, Pune, India, 2007.
[33] Kekre H. and Sarode T., "Fast Codebook Generation Algorithm for Color Images using Vector Quantization," International Journal of Computer Science and Information Technology, vol/issue: 1(1), pp. 7-12, 2009.
[34] Kekre H. and Sarode T., "An Efficient Fast Algorithm to Generate Codebook for Vector Quantization," First International Conference on Emerging Trends in Engineering and Technology (ICETET), pp. 62-67, 2008.
[35] Wang J., et al., "SIMPLIcity: Semantics-sensitive Integrated Matching for Picture Libraries," IEEE Transaction on Pattern Analysis and Machine Intelligence, vol/issue: 23(9), pp. 947-963, 2001.
BIOGRAPHIES OF AUTHORS

Dr. Tanuja K. Sarode has received B.Sc. (Mathematics) from Mumbai University in 1996, B.Sc. Tech. (Computer Technology) from Mumbai University in 1999, M.E. (Computer Engineering) from Mumbai University in 2004 and Ph.D. from Mukesh Patel School of Technology, Management and Engineering, SVKM's NMIMS University, Vile-Parle (W), Mumbai, INDIA. She has more than 10 years of experience in teaching. She is currently working as Associate Professor in the Dept. of Computer Engineering at Thadomal Shahani Engineering College, Mumbai. She is a life member of IETE and ISTE, and a member of the International Association of Engineers (IAENG) and the International Association of Computer Science and Information Technology (IACSIT), Singapore. Her areas of interest are Image Processing, Signal Processing and Computer Graphics. She has more than 100 papers in National/International Conferences/journals to her credit.

Jagruti K. Save has received B.E. (Computer Engg.) from Mumbai University in 1996 and M.E. (Computer Engineering) from Mumbai University in 2004, and is currently pursuing a Ph.D. from Mukesh Patel School of Technology, Management and Engineering, SVKM's NMIMS University, Vile-Parle (W), Mumbai, INDIA. She has more than 10 years of experience in teaching. She is currently working as Associate Professor in the Dept. of Information Technology at Fr. Conceicao Rodrigues College of Engg., Bandra, Mumbai. Her areas of interest are Image Processing, Neural Networks, Fuzzy systems, Database management and Computer Vision. She has 10 papers in National/International Conferences/journals to her credit.