International Journal of Electrical and Computer Engineering (IJECE)
Vol. 5, No. 5, October 2015, pp. 1158~1163
ISSN: 2088-8708
Journal homepage: http://iaesjournal.com/online/index.php/IJECE
A Predictive Model for Mining Opinions of an Educational Database Using Neural Networks

M R Narasinga Rao1, Deepthi Gurram2, Sai Mahathi Vadde3, Sathish Tallam*4, N. Sai Chand5, L. Kiran6
Department of Computer Science, KL University, Guntur, Andhra Pradesh, India
Email: manda.ramanarasingarao@gmail.com1, deepug2110@gmail.com2, sathish.thallam@gmail.com4
Article Info

Article history:
Received Apr 14, 2015
Revised Jun 21, 2015
Accepted Jul 2, 2015

ABSTRACT
Assessing the performance of an educational institute is a prime concern in an educational scenario. Educational Data Mining (EDM) considers several tasks originating from an educational context. One of the identified tasks is providing feedback to support instructors, administrators, teachers, and course authors in decision making, thereby enabling them to take appropriate remedial action. In this research, we have developed a prototype Neural Network model that is trained to predict the performance of an educational institution. A Multilayer Perceptron (MLP) Neural Network model was developed for this proposed research, and the network is trained by the back propagation algorithm. Data were obtained from a well-defined questionnaire consisting of 14 questions in the domains of Academic Schedule, International Exposure, Jobs and Internship, Quality of the College, and Life at Campus. The responses to these questions have been taken as inputs, and the performance of the institute has been considered as the output. To validate the results generated by the network, statistical techniques have been used. The results generated by the Neural Network and by the statistical techniques have been compared, and it is observed that both methods generate accurate results. The results have been evaluated based on the Normalized System Error (NSE) values of the network. In summary, a prototype Neural Network model has been developed to assess the performance of an educational institution.
Keyword:
Back propagation algorithm
Multilayer perceptron
Neural network
Normalized system error
Performance analysis
Prediction
Copyright © 2015 Institute of Advanced Engineering and Science. All rights reserved.
Corresponding Author:
Deepthi Gurram,
Department of Computer Science & Engineering,
KL University, Vaddeswaram, Guntur (Dt), Andhra Pradesh, India.
Email: deepug2110@gmail.com
1. INTRODUCTION
There are increasing research interests in using data mining in an educational context. This field, called educational data mining, is concerned with the development of methods that extract knowledge from educational data. Educational Data Mining (EDM) is the application of data mining to analyze the types of data that arise in an educational context [1]. Many relevant studies have been carried out in educational data mining to date [2]. A number of data mining techniques have been developed for educational purposes. A model based on decision trees has been employed to evaluate student performance as a classification task [3]. A regression model using a statistical approach has been developed in educational data mining [4]. A survey has been made of the application of data mining in learning management systems [5]. A more versatile data mining tool has been developed to suggest career options for students, to improve student performance, to overcome the problem of low-grade students, and also to detect violent behavior among students [6][7].
DM techniques have been employed in different domains to produce different kinds of reports for analysis purposes [8].
The application of DM in an educational context differs from its application in other domains [9]. There has been increasing interest in the application of DM in the educational context [10]. Association rule mining has been used to confront the problem of continuous feedback in the educational process [11]. The relationship between learning behavior patterns in collaborative learning has also been studied [12]. DM techniques such as temporal DM, learning decomposition, and logistic regression have been used to describe and predict student behavior and to evaluate progress in relation to learning outcomes [13][14]. In this proposed research, we apply a Neural Network model to assess the performance of an educational institute; the results are compared with those of statistical techniques.
2. RELATED WORK
An educational instrument for collecting feedback was used in this research. Responses from different educational institutes were considered. The feedback questions relate to different areas, namely Academic Schedule, International Exposure, Jobs and Internship, Quality of the College, and Life at Campus. The responses are obtained on a 10-point scale as follows: Excellent (10), Very Good (08), Good (06), Satisfactory (04), Unsatisfactory (02). The students' feedback on this questionnaire is taken as the input to the model, and the average rating of each response row is calculated and taken as the output of the model.
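As an illustration of this preprocessing step, the following sketch (not the authors' code) turns one questionnaire row into an input vector and its average rating; scaling the 2-10 ratings to the 0-1 range is an assumption, made consistent with the 0-1 valued outputs reported in Table 4.

```python
# Illustrative sketch (not the authors' code): build one training pair from a
# feedback row of 14 ratings drawn from {2, 4, 6, 8, 10}. Scaling to [0, 1]
# is an assumption consistent with the 0-1 valued outputs in Table 4.
def make_training_pair(ratings, max_rating=10.0):
    if len(ratings) != 14:
        raise ValueError("expected 14 question ratings")
    inputs = [r / max_rating for r in ratings]   # scaled question ratings
    target = sum(inputs) / len(inputs)           # average rating = model output
    return inputs, target

# Example: a respondent who rated every question "Very Good" (8).
x, y = make_training_pair([8] * 14)
print(y)   # ~0.8
```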
A Multi-layer Perceptron (MLP) network model with the back-propagation algorithm has been used in this research. The inputs, together with the weighted sum and bias term, are passed through the transfer function at the activation level to produce the actual output of the network. The units are arranged in a layered feed-forward neural network. A schematic representation of the feed-forward back-propagated neural network, with 14 inputs, one hidden layer consisting of 9 neurons, and 1 unit in the output layer, is given in Figure 1. The sigmoidal transfer function is chosen because the algorithm requires a response function that is continuous, single-valued, and has a first derivative.
Figure 1. Feed Forward Neural Network
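A minimal construction of the topology in Figure 1 (14 inputs, one hidden layer of 9 sigmoid neurons, one sigmoid output unit) might look as follows; the random weight initialisation is an illustrative assumption, not taken from the paper.

```python
# Illustrative construction of the 14-9-1 network of Figure 1.
# The random initialisation is an assumption, not taken from the paper.
import numpy as np

def sigmoid(z):
    # Sigmoidal transfer function: continuous, single-valued, differentiable.
    return 1.0 / (1.0 + np.exp(-z))

n_inputs, n_hidden, n_outputs = 14, 9, 1
rng = np.random.default_rng(42)

W_hidden = rng.uniform(-0.5, 0.5, size=(n_hidden, n_inputs))   # input -> hidden
b_hidden = np.zeros(n_hidden)
W_output = rng.uniform(-0.5, 0.5, size=(n_outputs, n_hidden))  # hidden -> output
b_output = np.zeros(n_outputs)

x = np.full(n_inputs, 0.8)   # one scaled questionnaire row
print(sigmoid(W_output @ sigmoid(W_hidden @ x + b_hidden) + b_output))
```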
The operation of a typical MLP trained with the back-propagation algorithm occurs as follows [15].
1) After presenting input data to the input layer, information propagates through the network to the output layer (forward propagation). During this phase, the input and output states of each neuron are set.
$x_j^{[s]} = f(I_j^{[s]}) = f\left(\sum_i w_{ij}^{[s]} \, x_i^{[s-1]}\right)$ [15]
where $x_j^{[s]}$ denotes the current state of the jth neuron in the current layer [s], $I_j^{[s]}$ denotes the weighted sum of inputs to the jth neuron in the current layer [s], and $f$ is conventionally the sigmoid function [15].
$w_{ij}^{[s]}$ denotes the connection weight between the jth neuron in the current layer [s] and the ith neuron in the previous layer [s-1] [15].
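The layer equation can be transcribed almost literally into code. The following sketch keeps the indices of the equation visible; it is illustrative only and omits bias terms, since the equation above does not show them.

```python
# Near-literal transcription of x_j[s] = f( sum_i w_ij[s] * x_i[s-1] ),
# written with explicit loops so the indices match the equation.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward_layer(x_prev, weights):
    # x_prev[i]     : state x_i[s-1] of neuron i in the previous layer
    # weights[j][i] : connection weight w_ij[s] into neuron j of layer s
    x_curr = []
    for j in range(len(weights)):
        I_j = sum(weights[j][i] * x_prev[i] for i in range(len(x_prev)))
        x_curr.append(sigmoid(I_j))   # x_j[s] = f(I_j[s])
    return x_curr

# Toy usage: 3 previous-layer states feeding 2 current-layer neurons.
print(forward_layer([0.2, 0.8, 0.6], [[0.1, -0.4, 0.3], [0.5, 0.2, -0.1]]))
```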
2) The global error is generated from the summed difference between the required and calculated output values of each neuron in the output layer. The Normalized System Error $E_{glob}$ is given by

$E_{glob} = 0.5 \sum_k (r_k - o_k)^2$ [15]

where $(r_k - o_k)$ denotes the difference between the required and calculated output values of output neuron k.
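A direct reading of this error measure, assuming the summation runs over the output-layer neurons (a single unit in this paper), is sketched below.

```python
# Illustrative reading of the global error of step 2:
# E_glob = 0.5 * sum_k (r_k - o_k)^2, with r_k the required and o_k the
# calculated output of neuron k in the output layer.
def global_error(required, calculated):
    return 0.5 * sum((r - o) ** 2 for r, o in zip(required, calculated))

# With the single output unit used here the sum has one term.
print(global_error([0.8], [0.66]))   # 0.5 * (0.8 - 0.66)^2 ~= 0.0098
```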
3) The global error is back-propagated through the network to calculate local error values and delta weights for each neuron. Delta weights are modified according to the delta rule, which strictly controls the continuous decrease of the synaptic strength of those neurons that are mainly responsible for the global error. In this manner, a regular decrease of the global error can be assured.
$e_j^{[s]} = x_j^{[s]} \, (1.0 - x_j^{[s]}) \sum_k e_k^{[s+1]} \, w_{kj}^{[s+1]}$ [15]
where $e_j^{[s]}$ is the scaled local error of the jth neuron in the current layer [s].
$\Delta w_{ji}^{[s]} = \mathrm{lcoef} \cdot e_j^{[s]} \cdot x_i^{[s-1]}$ [15]
where $\Delta w_{ji}^{[s]}$ denotes the delta weight of the connection between the current neuron and the joining neuron, and lcoef denotes the learning coefficient (learning constant) of the training parameters.
4) Synaptic weights are updated by adding the delta weights to the current weights [15].
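Steps 3 and 4 can be sketched for a single layer as follows; this is an illustrative reading of the equations above, not the authors' implementation.

```python
# Sketch of steps 3 and 4 for one layer, following the equations above:
#   e_j[s]   = x_j[s] * (1 - x_j[s]) * sum_k e_k[s+1] * w_kj[s+1]
#   dw_ji[s] = lcoef * e_j[s] * x_i[s-1]; then w_ji[s] += dw_ji[s]
def local_errors(x_curr, errors_next, weights_next):
    # x_curr[j]          : state x_j[s] of the current layer
    # errors_next[k]     : local error e_k[s+1] of the next layer
    # weights_next[k][j] : weight w_kj[s+1]
    errs = []
    for j, xj in enumerate(x_curr):
        back = sum(errors_next[k] * weights_next[k][j]
                   for k in range(len(errors_next)))
        errs.append(xj * (1.0 - xj) * back)   # sigmoid derivative * back-propagated error
    return errs

def update_weights(weights, errors_curr, x_prev, lcoef=0.5):
    # weights[j][i] = w_ji[s]; modified in place by the delta rule (step 4).
    for j, ej in enumerate(errors_curr):
        for i, xi in enumerate(x_prev):
            weights[j][i] += lcoef * ej * xi
    return weights

# Toy usage: one hidden neuron feeding one output neuron.
e_hidden = local_errors([0.6], [0.05], [[0.3]])
print(update_weights([[0.1, 0.2]], e_hidden, [0.4, 0.9]))
```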
2.1 Analysis of Variance (ANOVA)
Analysis of variance (ANOVA) is a general method of studying relationships in sampled data. The method enables the difference between two or more sample means to be analyzed, which is achieved by subdividing the total sum of squares. The purpose of one-way ANOVA is to test for significant differences between class means, and this is done by analyzing the variances. Incidentally, if only two means are being compared, the method is the same as the t-test for independent samples.
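A minimal sketch of this decomposition, using made-up illustrative data rather than the paper's responses, is given below.

```python
# Minimal one-way ANOVA sketch (standard textbook decomposition, not the
# authors' code). The total sum of squares splits into between-group and
# within-group parts; mean squares and the F statistic follow.
def one_way_anova(groups):
    all_values = [v for g in groups for v in g]
    grand_mean = sum(all_values) / len(all_values)

    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)

    df_between = len(groups) - 1
    df_within = len(all_values) - len(groups)
    f_value = (ss_between / df_between) / (ss_within / df_within)
    return ss_between, ss_within, df_between, df_within, f_value

# Three made-up groups of ratings purely for illustration.
print(one_way_anova([[8, 8, 10, 8], [6, 6, 8, 6], [4, 6, 4, 6]]))
```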
3. RESULTS
Table 1 provides the results obtained by applying one-way ANOVA classification to the data set. The one-way analysis is calculated in three steps. First, the sum of squares is computed for all samples, both within classes and between classes; at each stage the degrees of freedom (df) are obtained, where df is the number of independent pieces of information that go into the estimate of a parameter. The second stage determines the sum of squares within the classes. Finally, the null hypothesis is evaluated using the significance values obtained.
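As a worked check of these quantities, the Q1 row of Table 1 can be reproduced from its reported sums of squares; the df values 7 and 92 imply 8 groups over 100 responses (an inference from the table, not stated in the text).

```python
# Worked check of the Q1 row of Table 1 from its reported sums of squares.
# df between = groups - 1 and df within = N - groups; the 7 / 92 / 99 split
# in the table is consistent with 8 groups over 100 responses (inferred).
ss_between, ss_within = 2.31, 1.332     # Q1, from Table 1
groups, n_samples = 8, 100              # implied by df = 7 and 92

df_between = groups - 1                 # 7
df_within = n_samples - groups          # 92
ms_between = ss_between / df_between    # ~0.33
ms_within = ss_within / df_within       # ~0.014
f_value = ms_between / ms_within        # ~22.79

print(df_between, df_within, round(ms_between, 3), round(ms_within, 3), round(f_value, 2))
```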
Table 1. Results obtained on applying one-way ANOVA classification to the data set (SS = sum of squares, MS = mean square, df = degrees of freedom)

Question | SS Between | SS Within | SS Total | df Between | df Within | df Total | MS Between | MS Within | F | Sig
Q1  | 2.31  | 1.332 | 3.642 | 7 | 92 | 99 | 0.33  | 0.014 | 22.79  | 0
Q2  | 2.309 | 2.609 | 4.918 | 7 | 92 | 99 | 0.33  | 0.028 | 11.633 | 0
Q3  | 2.052 | 1.898 | 3.95  | 7 | 92 | 99 | 0.293 | 0.021 | 14.207 | 0
Q4  | 2.419 | 2.236 | 4.654 | 7 | 92 | 99 | 0.346 | 0.024 | 14.217 | 0
Q5  | 3.069 | 2.472 | 5.54  | 7 | 92 | 99 | 0.438 | 0.027 | 16.316 | 0
Q6  | 2.777 | 2.769 | 5.546 | 7 | 92 | 99 | 0.397 | 0.03  | 13.183 | 0
Q7  | 3.52  | 2.782 | 6.302 | 7 | 92 | 99 | 0.503 | 0.03  | 16.632 | 0
Q8  | 3.21  | 2.478 | 5.688 | 7 | 92 | 99 | 0.459 | 0.027 | 17.023 | 0
Q9  | 2.616 | 2.458 | 5.074 | 7 | 92 | 99 | 0.374 | 0.027 | 13.988 | 0
Q10 | 2.674 | 3.128 | 5.802 | 7 | 92 | 99 | 0.382 | 0.034 | 11.235 | 0
Q11 | 2.044 | 2.942 | 4.986 | 7 | 92 | 99 | 0.292 | 0.032 | 9.133  | 0
Q12 | 3.364 | 2.362 | 5.726 | 7 | 92 | 99 | 0.481 | 0.026 | 18.717 | 0
Q13 | 3.58  | 2.812 | 6.392 | 7 | 92 | 99 | 0.511 | 0.031 | 16.73  | 0
Q14 | 3.737 | 2.613 | 6.35  | 7 | 92 | 99 | 0.534 | 0.028 | 18.79  | 0
Graphs were obtained by taking the rating for the questionnaire on the x-axis and the number of input samples on the y-axis.
Figure 2. Graph plotted for question 1 considering rating on the x-axis and no. of input samples on the y-axis

Figure 3. Graph plotted for question 2 considering rating on the x-axis and no. of input samples on the y-axis

Figure 4. Graph plotted for question 3 considering rating on the x-axis and no. of input samples on the y-axis

Figure 5. Graph plotted for question 4 considering rating on the x-axis and no. of input samples on the y-axis
3.1 Neural Network Model (Training)
Table 2. Values considered for different parameters for training the neural network model

Training Parameter | Value
Momentum Rate | 0.7
Learning Rate | 0.5
Maximum Error | 0.01
Maximum Individual Unit Error | 0.001
No. of max Iterations | 500
No. of Outputs | 1
Total No. of Inputs | 15
No. of hidden layers | 1
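The following sketch shows how the Table 2 settings could drive a back-propagation training loop with momentum for the 14-9-1 network of Figure 1. The data generation and weight initialisation are assumptions made for illustration, and 14 inputs are used as in Figure 1 (Table 2 lists 15 total inputs, which may include a bias term); this is not the authors' implementation.

```python
# Illustrative training loop under the Table 2 settings (assumed data and
# initialisation; not the authors' code).
import numpy as np

MOMENTUM, LEARNING_RATE = 0.7, 0.5
MAX_ERROR, MAX_ITERATIONS = 0.01, 500

rng = np.random.default_rng(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Dummy questionnaire data: 100 rows of 14 scaled ratings, target = row mean.
X = rng.choice([0.2, 0.4, 0.6, 0.8, 1.0], size=(100, 14))
y = X.mean(axis=1, keepdims=True)

W1 = rng.uniform(-0.5, 0.5, (14, 9)); W2 = rng.uniform(-0.5, 0.5, (9, 1))
V1 = np.zeros_like(W1); V2 = np.zeros_like(W2)   # momentum terms

for it in range(MAX_ITERATIONS):
    h = sigmoid(X @ W1)                  # hidden states
    o = sigmoid(h @ W2)                  # network output
    err = y - o
    nse = 0.5 * np.mean(err ** 2)        # global error, averaged over samples here
    if nse < MAX_ERROR:
        break
    # Back-propagate local errors and form delta weights (delta rule + momentum).
    e_out = err * o * (1 - o)
    e_hid = (e_out @ W2.T) * h * (1 - h)
    V2 = MOMENTUM * V2 + LEARNING_RATE * (h.T @ e_out) / len(X)
    V1 = MOMENTUM * V1 + LEARNING_RATE * (X.T @ e_hid) / len(X)
    W2 += V2; W1 += V1

print(f"stopped after {it + 1} iterations, error = {nse:.4f}")
```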
Table 3. Normalized System Error (NSE) obtained on training the neural network model for different numbers of input samples (one NSE value per neuron)

No. of input samples | Neuron 1 | Neuron 2 | Neuron 3 | Neuron 4 | Neuron 5 | Neuron 6 | Neuron 7 | Neuron 8 | Neuron 9 | Neuron 10
10  | 0.07   | 0.007 | 0.006 | 0.006 | 0.066 | 0.006 | 0.007 | 0.008 | 0.006 | 0.006
20  | 0.09   | 0.009 | 0.008 | 0.009 | 0.008 | 0.009 | 0.009 | 0.009 | 0.009 | 0.009
30  | 0.0769 | 0.067 | 0.076 | 0.067 | 0.067 | 0.067 | 0.067 | 0.067 | 0.067 | 0.066
40  | 0.059  | 0.059 | 0.059 | 0.050 | 0.050 | 0.050 | 0.050 | 0.050 | 0.050 | 0.050
50  | 0.089  | 0.089 | 0.089 | 0.089 | 0.080 | 0.080 | 0.080 | 0.080 | 0.080 | 0.080
60  | 0.074  | 0.067 | 0.075 | 0.067 | 0.075 | 0.067 | 0.067 | 0.066 | 0.067 | 0.067
70  | 0.066  | 0.066 | 0.066 | 0.066 | 0.057 | 0.057 | 0.057 | 0.057 | 0.057 | 0.057
80  | 0.059  | 0.059 | 0.050 | 0.050 | 0.050 | 0.050 | 0.050 | 0.050 | 0.050 | 0.050
90  | 0.055  | 0.055 | 0.045 | 0.045 | 0.045 | 0.044 | 0.044 | 0.044 | 0.044 | 0.044
100 | 0.075  | 0.067 | 0.077 | 0.063 | 0.066 | 0.066 | 0.064 | 0.066 | 0.066 | 0.066
3.2 Testing

Table 4. Results obtained on testing the neural network for different samples

Sample Number | Network Generated Value | Desired Value | Accuracy (%)
1  | 0.66 | 0.8 | 82.5
2  | 0.66 | 0.7 | 94.2
3  | 0.66 | 0.7 | 94.2
4  | 0.67 | 0.7 | 96.7
5  | 0.67 | 0.9 | 75.2
6  | 0.67 | 0.8 | 84.6
7  | 0.67 | 0.7 | 96.7
8  | 0.67 | 0.8 | 84.6
9  | 0.67 | 0.7 | 96.7
10 | 0.88 | 0.9 | 98.4
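The accuracy column appears consistent with expressing the network-generated value as a percentage of the desired value (for example, 0.66/0.8 gives 82.5); the remaining rows differ slightly, suggesting unrounded network outputs were used. A hedged sketch of this inferred calculation:

```python
# Inferred (not stated in the text): accuracy as the ratio of the network
# output to the desired value, expressed as a percentage. Small differences
# in some rows of Table 4 suggest unrounded outputs were used there.
def accuracy_percent(generated, desired):
    return 100.0 * generated / desired

print(accuracy_percent(0.66, 0.8))   # 82.5, matching sample 1 of Table 4
```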
4. DISCUSSION
We describe the development of a prototype Multi-Layer Perceptron Neural Network model, trained by the back-propagation algorithm to predict the performance of an educational institute. Although, in principle, similar results may be obtained using a variety of statistical methods such as logistic regression and the area under the receiver operating characteristic curve, the current method has the advantage of giving dynamic output as more data is generated and fed to it. It also has the advantage of not requiring the skills and insight needed to perform and analyze results from sophisticated statistical techniques. In fact, this is the first step in developing such a neural network system, which can be retrained and used for prediction as more data becomes available in the educational domain. Neural network models have been employed in a variety of domains, but to our knowledge this is the first time they are being used to assess the performance of an educational institute [1]. As more data becomes available, the system improves in precision with respect to accuracy and can be widely employed in different domains on the educational front.
5. CONCLUSION
We have developed a prototype Neural Network model to assess the performance of an educational institute given data from an educational domain. Although only a prototype model has been developed, it can be enhanced to any desired level by increasing the number of input samples. This model can also be applied in a wide variety of applications, even given an incomplete dataset.
REFERENCES
[1] Barnes T., Desmarais M., Romero C., Ventura S. (2009). Educational Data Mining 2009: 2nd International Conference on Educational Data Mining, Proceedings. Cordoba, Spain.
[2] Cristobal Romero, Sebastian Ventura, "Educational Data Mining: A Review of the State-of-the-Art", IEEE Transactions on Systems, Man and Cybernetics, Vol. XX, No. X, 200X, Pg: 1-21.
[3] Brijesh Kumar Baradwaj, Saurabh Pal, "Mining Educational Data to Analyze Students Performance", International Journal of Advanced Computer Science and Applications, Vol. 2, No. 6, 2011, Pg. No: 63-66.
[4] Dr. P.K. Srimani, Mrs. Malini M. Patil, "Regression Modeling on EDU-DATA in Technical Education Systems", International Journal of Advanced Scientific and Technical Research, Issue 3, Volume 1, January-February 2013, Pg. No: 320-336.
[5] Romero, C. et al., "Data mining in course management systems: Moodle case study and tutorial", Computers & Education (2007), doi: 10.1016/j.compedu.2007.05.016.
[6] Elakia, Gayathri, Aarthi, Naren J, "Applications of Data Mining in Educational Database for Predicting Behavioural Patterns of the Students", International Journal of Computer Science and Information Technology, Vol. 5(3), 2014, 4649-4652.
[7] Mohammed M. Abu Tair, Alaa M. El-Halees, "Mining Educational Data to Improve Students Performance: A Case Study", International Journal of Information and Communication Technology Research, Volume 2, No. 2, 2012.
[8] Han J., Kamber M. (2006). Data Mining: Concepts and Techniques, Morgan Kaufmann Publishers.
[9] Romero C., Ventura S. (2007). Educational Data Mining: a Survey from 1995 to 2005. Expert Systems with Applications, 33(1), 135-146.
[10] Romero C., Ventura S., Pechenizkiy M., Baker R. (2010). Handbook of Educational Data Mining. Taylor & Francis.
[11] Psaromiligkos Y., Orfanidou M., Kytagias C., Zafiri E. (2009). Mining log data for the analysis of learners' behaviour in web-based learning management systems. In Operational Research Journal, 1-14.
[12] Yu P., Own C., Lin L. (2001). On learning behavior analysis of web based interactive environment. In International Conference on Computer and Electrical Engineering, Oslo/Bergen, Norway, 1-9.
[13] Beal C.R. and Cohen P.R. (2008). Temporal Data Mining for Educational Applications. In Proceedings of the 10th Pacific Rim International Conference on Artificial Intelligence: Trends in Artificial Intelligence, Hanoi, Vietnam, 66-77.
[14] Feng M., Beck J.E., Heffernan N.T. (2009). Using Learning Decomposition and Bootstrapping with Randomization to Compare the Impact of Different Educational Interventions on Learning. In International Conference on Educational Data Mining, Cordoba, Spain, 51-60.
[15] Manda R. Narasinga Rao, G.R. Sridhar, K. Madhu, Allam Appa Rao, "A clinical decision support system using multilayer perceptron neural network to predict wellbeing in diabetes", Journal of Association of Physicians of India, February 2009, Pg. No: 127-33.