TELKOMNIKA, Vol. 12, No. 2, June 2014, pp. 465~474
ISSN: 1693-6930, accredited A by DIKTI, Decree No: 58/DIKTI/Kep/2013
DOI: 10.12928/TELKOMNIKA.v12i2.1603

Received July 23, 2013; Revised March 20, 2014; Accepted April 10, 2014
Early Model of Student's Graduation Prediction
Based on Neural Network
Budi Rahmani*1, Hugo Aprilianto2
Program Studi Teknik Informatika, STMIK Banjarbaru
Jl. Jend. Ahmad Yani Km. 33,5 Loktabat Banjarbaru, 05113251836
*Corresponding author, e-mail: budirahmani@gmail.com1, hugo_aprilianto@yahoo.com2
Abstract
Predicting the timing of student graduation would be a valuable input for the management of a Department at a University. However, this is a difficult task if it is done manually. With the help of learning on existing Artificial Neural Networks, it is possible to provide training with a certain configuration, in which, based on the experience of previous graduate data, it would be possible to predict the time grouping of a student's graduation. The input of the system is the performance index of the first, second, and third semester. Based on testing performed on 166 data, the Artificial Neural Networks that have been built were able to predict with up to 99.9% accuracy.
Keywords: prediction, time of graduation, Artificial Neural Network, Back-propagation
1. Introduction
STMIK Banjarbaru is one of many universities endeavouring to raise its accreditation status, in which one of its components is the period of study for students [1]. Table 1 shows the Student Graduation Level Data at the Fourth Graduation in 2012.
Table 1. STMIK Banjarbaru Student Graduation Level Data at the Fourth Graduation in 2012

Period of Yudisium | Department of Informatics Techniques | Department of System Information
                   | Graduation Time / Average of GPA     | Graduation Time / Average of GPA
June 2011          | 5 years and 4 months / 2.91          | 5 years and 3 months / 2.74
October 2011       | 4 years and 9 months / 3.02          | 4 years and 11 months / 2.89
January 2013       | 5 years and 1 month / 2.93           | 5 years and 2 months / 2.79
In another research, the GPA (Grade Point Average), the number of courses taken, the number of repeated courses and the number of certain courses taken can affect the duration period of study [2]. This was similarly stated in another research using regression trees, in which it was ascertained that the variables that can be used to differentiate the length of a student study period are the GPA, the duration of completing a mini-thesis and the faculty [3].
Based on the facts above, especially for STMIK Banjarbaru, in order to predict the period of study of a student, one can use the GPA data obtained from a person during the initial period of study (semesters 1-3). This of course depends on the expectation that the academics at STMIK Banjarbaru have implemented preventive measures to avoid surpassing the ideal nine-semester study period or the maximum of 3.5 years, in order for the graduation status to improve, besides increasing the point/grade, which is also one of the criteria for evaluating accreditation.
There is an assumption in some literature that the concept of the Artificial Neural Network (ANN) began with the paper of Warren McCulloch and Walter Pitts in 1943. In that paper they tried to formulate a mathematical model of brain cells. The method, which was developed based on the biology of the nervous system, was a step forward in the computer industry. The Artificial Neural Network is an information processing paradigm that was inspired by the biological nervous system cells, similar to the brain in processing information.
The basic element of the aforementioned paradigm is a new structure of the information processing system. The Artificial Neural Network, like a human, learns from an example. The Artificial Neural Network was formed to solve certain problems such as recognition of patterns or classification due to the learning process. The Artificial Neural Network has developed rapidly in the past few years [4]. The enormous interest in Artificial Neural Networks that recently occurred was due to several factors. First, the pattern of training was developed into a smarter network model that could solve problems. Second, digital computers with high speeds have made network process simulation easier to do. Third, today's technology provides specific hardware for neural networks. At the same time, the development of traditional computing has made Artificial Neural Network learning easier; the limitations faced by traditional computers have motivated several directions of research on Artificial Neural Networks [5].
The network used to predict the duration of the study period is the backpropagation Artificial Neural Network. This network has several layers, namely the input layer, output layer and several hidden layers. These hidden layers assist the network to recognize more input patterns compared to networks that do not have hidden layers [6],[7].
The backpropagation training process requires three stages, namely the feedforward of data input for training, backpropagation of error values, and adjustment of the weight value of each node of the individual layers. Beginning with the feedforward of input values, each first input unit (xi) receives an input signal which will be subsequently transmitted to the hidden layer Z1,....,Zp. The j hidden unit will then calculate the signal (Zj) value, which will be transmitted to the output layer, using the f activation function:

z_in_j = v_0j + Σ_{i=1}^{n} x_i v_ij,   Zj = f(z_in_j)

where v_0j = hidden bias of the j unit. The bias value and initial weight can be taken randomly.
Each unit of output then computes:

y_in_k = w_0k + Σ_{j=1}^{p} Zj w_jk,   Yk = f(y_in_k)

where w_0k = bias of the k output unit.
Throughout the duration of the training process, each output unit compares the target value (Tm) for an input pattern to calculate the parameter value which will correct (update) the weight value of each unit in the individual layers. The process of training the backpropagation algorithm has an activation function that must have the following characteristics, namely continuous, differentiable, and monotonically non-decreasing. One of the most used functions is the sigmoid function, which has a range of 0 to 1 [6],[7],[8].
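The feedforward pass and the sigmoid activation described above can be sketched in Python. This is a minimal illustration of the standard equations, not the authors' Matlab implementation; the layer sizes and random initialization ranges here are assumptions for demonstration only:

```python
import math
import random

def sigmoid(x):
    # Binary sigmoid: continuous, differentiable, range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def feedforward(x, v, v0, w, w0):
    # Hidden layer: z_in_j = v0[j] + sum_i x[i]*v[i][j]; Zj = f(z_in_j)
    z = [sigmoid(v0[j] + sum(x[i] * v[i][j] for i in range(len(x))))
         for j in range(len(v0))]
    # Output layer: y_in_k = w0[k] + sum_j z[j]*w[j][k]; Yk = f(y_in_k)
    y = [sigmoid(w0[k] + sum(z[j] * w[j][k] for j in range(len(z))))
         for k in range(len(w0))]
    return z, y

random.seed(0)
n_in, n_hidden, n_out = 3, 4, 1     # 3 Grade Point inputs, one output (illustrative sizes)
v  = [[random.uniform(-0.5, 0.5) for _ in range(n_hidden)] for _ in range(n_in)]
v0 = [random.uniform(-0.5, 0.5) for _ in range(n_hidden)]   # hidden biases, taken randomly
w  = [[random.uniform(-0.5, 0.5) for _ in range(n_out)] for _ in range(n_hidden)]
w0 = [random.uniform(-0.5, 0.5) for _ in range(n_out)]      # output biases, taken randomly
z, y = feedforward([3.14, 3.85, 3.68], v, v0, w, w0)
```

Because every unit passes through the sigmoid, each hidden signal Zj and output Yk stays strictly between 0 and 1, which is why the continuous group targets later have to be compared against rounded outputs.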
2. Research Method
In predicting the duration of study for STMIK Banjarbaru students, the writer used secondary data obtained from the Academic Affairs section of the STMIK Banjarbaru campus, in which as many as 166 samples of data on students who graduated in 2011 and 2012 were taken. Generally, the system to be developed was trying to apply the NN (Neural Network) method by using the semester 1, 2 and 3 Grade Point input. The hope was that, after testing, an NN accuracy would be obtained that would be compared to the alumni data already available, to predict the duration of study for students.
2.1. Use Case Diagram and Sequence Diagram
If figured in the form of a use case diagram, the tools to be built are as follows:

Figure 1. Use case diagram
From the Figure above one can see that there are three cases that can be done by the system, namely receiving the input from the grade point variable, the training and testing process by the backpropagation ANN (JST), and lastly, the prediction result that is given [4]. To be more detailed, what is done by the system can be figured in the sequence diagram as follows:

Figure 2. Sequence diagram
2.2. Determination of Data for Training and Testing
The following example data was obtained from the STMIK Banjarbaru BAAK (Biro Administrasi Akademik dan Kemahasiswaan / Bureau of Administration and Academic Affairs), namely among others:
Table 2. Judisium Data of June 2011 of the Department of Informatics Techniques
(Judisium is the date of the decision to graduate a student)

No. | Name                    | Student number | GPA  | Graduation time
1   | Nur Imansyah            | 310104020134   | 2.65 | 6 Years 11 Months
2   | Gusti Indra Muliawan    | 310105020310   | 2.58 | 5 Years 11 Months
3   | Roby Roosady            | 310105020320   | 2.44 | 5 Years 8 Months
4   | Nina Herlina            | 310106020415   | 3.14 | 4 Years 11 Months
5   | Ridha Faisal            | 310106020435   | 3.03 | 4 Years 11 Months
6   | Himalini Alpiyana       | 310106020447   | 3.14 | 4 Years 11 Months
7   | M. Freddy Pratama Putra | 310106020464   | 3.07 | 4 Years 9 Months
8   | Lagairi                 | 310106020500   | 2.83 | 4 Years 10 Months
9   | Asbihannor              | 310106020512   | 3.05 | 4 Years 11 Months
10  | Mawardi                 | 310106020524   | 3.18 | 4 Years 10 Months
Average                                       | 2.91 | 5 Years 4 Months
2.3. Recapitulation of Data that Will Be Processed Using Matlab 2011b
The data to be processed is grouped into six duration-of-study-period groups, namely:
a. ≥ 3.5 years (group 1)
b. ≥ 4 years (group 2)
c. ≥ 4.5 years (group 3)
d. ≥ 5 years (group 4)
e. ≥ 6 years (group 5) and
f. > 7 years (group 6)
The recapitulation of the data grouping is shown in a separate table of this paper.
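The six groups above can be expressed as a small lookup function. This Python sketch is one possible reading of the grouping (treating each listed bound as the lower edge of an interval), since the paper lists only the lower bounds:

```python
def study_group(years):
    # Map a study duration (in years) to the paper's six groups.
    # Assumption: each bound is the lower edge of an interval.
    if years > 7:
        return 6
    if years >= 6:
        return 5
    if years >= 5:
        return 4
    if years >= 4.5:
        return 3
    if years >= 4:
        return 2
    return 1  # >= 3.5 years

# Example: 4 years 11 months ~ 4.92 years
print(study_group(4 + 11 / 12))  # → 3
```

Under this reading, most of the June 2011 graduates in Table 2 (around 4 years 9-11 months) fall into group 3.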
2.4. Designing the Artificial Neural Network (Neural Network)
The following Figure shows how the NN structure will be built, with 3 inputs with 700 layers, and a hidden layer, as well as an output. The ANN (JST) was built using the NN Toolbox of Matlab 2011b.

Figure 3. NN Design (NN Toolbox)
In order to conduct training on the data, adjustments to parameters are required, namely as follows: the number of epochs is 10,000, with a target error to the sum of 1e-5. Thus, in Matlab, several of the aforementioned settings are set according to the following pseudo-codes:
net.trainParam.show = 10;
net.trainParam.epochs = 10000;
net.trainParam.goal = 1e-5;
The following command can be used to conduct training:

[net,tr]=train(net,input,target);

Afterwards the post-training workspace can be saved with the following name:

konfig_gabung_all.mat
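The stopping behaviour behind the Matlab settings above (a maximum of 10,000 epochs or an error goal of 1e-5, whichever is reached first) can be illustrated without the toolbox. The following self-contained Python sketch trains a single sigmoid unit on toy data with the same two stopping parameters; it is a hypothetical analogue for illustration, not the network used in the paper:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(inputs, targets, epochs=10000, goal=1e-5, lr=0.5):
    # Minimal gradient-descent loop mirroring the toolbox settings:
    # stop after `epochs` passes, or earlier once the MSE reaches `goal`.
    w, b = 0.0, 0.0
    epoch = 0
    for epoch in range(epochs):
        mse = 0.0
        for x, t in zip(inputs, targets):
            y = sigmoid(w * x + b)
            err = t - y
            grad = err * y * (1 - y)   # chain rule through the sigmoid
            w += lr * grad * x
            b += lr * grad
            mse += err * err
        mse /= len(inputs)
        if mse <= goal:
            break
    return w, b, mse, epoch + 1

# Toy separable data: low vs high normalized grade point (illustrative only)
w, b, mse, used = train([0.1, 0.2, 0.8, 0.9], [0.0, 0.0, 1.0, 1.0])
```

Because the sigmoid gradient vanishes as outputs approach 0 or 1, such a loop often exhausts the epoch budget before the tight 1e-5 goal is met, which matches the paper's training run stopping at 10,000 epochs with an error above the goal.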
2.5. GUI Design
The following is a GUI to assist in testing, also used as a tool to make predictions of student graduation timing based on the GPA in the first, second and third semesters.

Figure 4. GUI Design
There are two tabs created for this design. The first tab is named Grade Point Average, where there are three parts that would later have to be filled in with the semester grade point values of a student whose graduation is to be predicted. Such filling in must be in numeric form, between 0 through 4; it cannot be filled in with letters, and if in decimal form, the numbers have to be separated with a dot sign, not a comma. If these rules are violated, an error message on the design will appear with the following script:
if isnan(IPSem_3)
    set(hObject, 'String', 0);
    errordlg('Input Must Be Numeric 0 through 4 or use the dot sign as a replacement for commas','Error');
end
if (IPSem_3 > 4)
    set(hObject, 'String', 0);
    errordlg('Input Must Be Numeric 0 through 4 or use the dot sign as a replacement for commas','Error');
end
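The same validation rule (numeric, 0 through 4, dot rather than comma as the decimal sign) can be expressed compactly. This Python sketch is a hypothetical mirror of the Matlab GUI check, with `parse_grade_point` an illustrative name not taken from the paper:

```python
def parse_grade_point(text):
    # Mirrors the GUI rule: numeric, 0 through 4, dot (not comma) as
    # the decimal sign. Raises ValueError where the GUI calls errordlg.
    if "," in text:
        raise ValueError("use the dot sign as a replacement for commas")
    try:
        value = float(text)
    except ValueError:
        raise ValueError("Input Must Be Numeric 0 through 4")
    if not 0 <= value <= 4:
        raise ValueError("Input Must Be Numeric 0 through 4")
    return value

print(parse_grade_point("3.14"))  # → 3.14
```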
The second tab is named Prediction, and on it there are, among others, the following buttons: Process, Reset and Exit, including the prediction result writing, and also the prediction grouping from the six groups that were designed. An error message is designed with the following script if the system cannot make a prediction during the prediction process (when the 'Process' button is pressed):
if (hasil < 0.7)
    set(handles.text13, 'STRING', '0');
    set(handles.text14, 'STRING', 'System cannot make a prediction');
end
The ‘Reset’ button in this section is used to reset the filled-in information and the prediction result that was done earlier, in order to conduct the prediction process on new data. If this reset button is pressed, all grade point semester data will be set to 0, as well as all prediction results and prediction groups. Whereas the ‘Exit’ button is used to exit from the application (GUI) and close the opened GUI window.
2.6. System Testing
There are two ways to do this testing. The first is to automatically conduct testing against all data via a program with the following command:

sim(net_gabung,input);
Result data can later be seen in the ‘ans’ variable on the workspace; if this variable is clicked later on, it will display the numeric result of the prediction in its entirety from the data that was tested. The second way is to take advantage of the GUI that has already been made and test the existing available data one by one. The result of both ways will be filled in to the table already made available (the recapitulation table of data to be tested).
3. Results and Analysis
3.1. Neural Network Training Result
The following is the result of training conducted against 166 input data on the Neural Network that was created using NNTool on Matlab. The following is the input and target data:
input=[3.14 3.43 3.45 2.73 2.64 3.67 3.57 3.86 2.68 2.91 3.19 2.91 2.91 2.86 3.36 2.82
3.23 3.14 2.77 2.64 2.91 3.14 3.36 2.50 2.73 3.18 2.64 2.77 2.10 2.10 2.33 3.05 2.62
2.67 2.29 3.05 2.62 2.24 2.48 2.27 2.57 2.52 2.27 2.45 2.81 2.48 2.48 2.95 2.57 2.62
2.59 2.68 2.36 2.77 1.91 1.95 2.41 2.41 3.18 3.36 2.41 2.50 2.50 2.14 2.52 2.48 2.67
3.05 3.29 2.38 3.05 2.00 2.67 2.59 2.43 2.38 2.81 2.86 2.73 2.73 2.32 2.95 2.50 2.64
2.09 2.45 2.59 2.48 2.48 2.14 2.90 2.90 2.95 2.10 2.41 1.41 2.32 1.95 2.36 2.14 2.73
2.50 2.36 1.95 2.59 1.95 1.95 2.14 2.57 2.32 2.32 2.59 2.23 2.23 1.45 1.40 2.73 2.45
2.50 2.23 3.00 2.05 1.73 2.48 1.36 2.64 2.45 2.64 2.33 2.19 2.71 2.62 2.42 2.45 2.36
2.36 2.59 2.36 2.45 2.76 2.24 2.24 2.50 2.27 2.09 3.55 2.00 2.55 2.59 2.32 2.68 3.27
2.82 2.09 2.00 2.86 2.18 2.27 1.82 2.36 2.09 2.14 2.41 2.23 1.95 2.09;
3.85 3.23 2.76 2.53 2.53 3.86 3.73 3.79 2.26 3.05 3.42 3.32 3.16 3.05 3.48 3.00 2.76
2.62 2.63 2.53 2.42 3.11 3.00 2.26 2.42 2.79 2.16 1.75 2.75 1.56 2.59 2.81 2.79 2.26
1.65 3.19 2.44 2.44 2.29 1.82 2.26 2.11 2.50 2.67 2.53 2.47 2.95 2.74 2.63 2.21 2.00
2.75 2.53 2.74 2.60 1.67 2.18 2.06 2.13 3.38 2.24 2.05 2.58 1.35 2.82 2.47 2.06 3.00
3.48 1.82 2.32 2.19 2.63 2.50 2.35 2.12 3.16 2.68 2.42 2.40 2.00 2.63 2.33 2.50 2.35
2.50 1.40 1.95 2.47 2.59 2.47 2.79 2.42 2.53 2.18 1.45 1.82 2.43 1.65 2.50 2.45 2.26
2.00 2.33 2.58 1.76 0.88 1.35 2.68 1.67 2.18 2.15 1.69 1.94 1.75 1.00 2.15 1.85 1.72
1.39 2.55 2.61 2.19 2.21 2.25 2.75 2.50 2.21 2.35 1.76 2.42 2.35 2.29 2.00 2.18 2.17
2.17 2.45 2.67 2.89 1.94 1.59 2.65 2.17 2.00 3.23 1.81 2.33 2.55 1.75 1.10 3.14 2.15
1.83 2.00 2.45 1.33 2.45 1.75 2.00 1.56 2.25 3.00 2.22 2.00 2.00;
3.68 3.80 3.25 3.00 3.10 3.63 3.50 3.35 2.50 3.36 3.45 3.26 3.59 2.86 3.56 2.77 3.10
2.65 3.20 2.50 2.78 3.19 3.14 2.72 2.61 3.05 3.17 2.73 2.47 2.77 3.11 2.79 2.53 2.11
2.47 2.86 2.68 2.84 2.78 2.00 3.00 2.56 1.90 2.48 2.47 2.68 2.50 2.80 2.20 2.72 2.22
1.90 2.36 2.00 2.55 2.35 2.29 2.72 2.89 4.00 2.06 2.47 2.50 2.50 2.68 2.84 2.38 3.27
3.75 2.38 2.81 2.38 2.87 2.00 2.33 2.35 2.95 2.90 2.67 3.00 2.33 2.48 3.17 2.14 2.32
2.83 2.31 1.81 2.33 2.16 2.61 2.90 2.00 2.45 2.22 2.14 2.27 2.50 2.27 2.56 2.22 2.17
1.67 1.67 2.52 1.88 1.50 2.00 2.72 2.40 2.00 2.33 2.00 2.56 2.00 2.36 2.44 2.67 2.20
1.75 2.90 2.30 2.28 1.83 2.59 2.22 2.38 3.06 2.33 2.27 2.39 2.39 1.89 2.35 2.56 2.39
2.33 2.50 2.29 3.10 2.00 2.00 2.05 2.06 2.19 3.13 2.18 2.67 3.15 2.44 0.56 2.86 1.60
2.24 2.33 2.33 1.75 2.68 2.50 2.18 2.00 2.30 2.57 2.44 1.94 2.00];
target_all=[1 1 1 1 1 1 1 1 1 1 1 2 2 2 2 2 2 2 2 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3
3 3 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4
4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 5 5 5 5 5 5 5 5 5 5 5 6 6 6 6 6 6 6];
The training result display is:

Figure 5. Neural Network Training State

In the figure above, the training Neural Network result is shown with 10000 epochs, with the best performance grade to the sum of 0.0039396. Whereas the prediction output value on the regression graph is R=0.99829 (the most ideal value is 1, which shows that the target and the output result are the same).
3.2. GUI Design Result
The following figure shows the running result of the GUI that was created, with an input example in the form of the first data in the table of the previous testing (Grade Point in first semester = 3.14; Grade Point in second semester = 3.85; Grade Point in third semester = 3.68).

Figure 6. GUI Display with testing data
3.3. Data Testing Result
Testing will be conducted using the script from the NN toolbox (the first way) and, after being given the command:

sim(net_gabung,input);
Thus the result will be:
ans =
Columns 1 through 6
1.0007 1.0001 1.0015 0.9843 1.0395 0.9988
Columns 7 through 12
1.0008 0.9999 1.1106 0.9973 1.0006 2.0023
Columns 13 through 18
1.9981 2.0065 1.9967 1.9999 2.0013 2.0025
Columns 19 through 24
1.9987 3.1285 3.0122 2.9988 2.9992 3.1647
Columns 25 through 30
2.9166 2.9956 2.9955 3.0012 3.0020 3.0057
Columns 31 through 36
3.0017 3.0053 3.0060 3.0152 2.9947 3.0018
Columns 37 through 42
3.0135 3.0182 2.8903 3.0070 3.0080 2.9585
Columns 43 through 48
3.0011 2.9758 3.0027 3.0074 3.0084 3.0010
Columns 49 through 54
3.0081 2.9770 3.0003 3.0021 3.0207 3.0005
Columns 55 through 60
3.0039 2.9997 3.2305 2.9883 3.0005 3.0005
Columns 61 through 66
3.0484 3.0253 3.1139 2.9948 3.0087 3.0937
Columns 67 through 72
3.0432 3.0007 3.0027 3.0804 3.0014 2.9989
Columns 73 through 78
2.9859 2.9788 3.0181 3.1091 2.9990 3.0073
Columns 79 through 84
3.0614 2.9886 3.0206 2.9993 2.9976 3.0148
Columns 85 through 90
4.0331 3.9185 3.9960 3.9989 4.0394 3.9956
Columns 91 through 96
3.9909 3.9902 3.9998 3.9832 3.7824 3.9981
Columns 97 through 102
3.9571 3.9867 3.9644 4.0136 4.0015 4.0086
Columns 103 through 108
3.9970 4.0009 3.8415 4.0006 3.9998 4.0030
Columns 109 through 114
3.9956 4.0725 4.0034 3.9404 3.9875 4.0138
Columns 115 through 120
3.9997 3.9970 3.9306 4.0033 4.0069 4.0022
Columns 121 through 126
3.9959 4.0063 3.9992 3.9991 4.0000 3.9912
Columns 127 through 132
3.9034 4.0055 3.9797 4.0392 3.9875 3.9273
Columns 133 through 138
3.9885 3.9240 4.0063 3.8194 4.0412 4.0187
Columns 139 through 144
4.0027 3.9997 4.0109 4.0175 4.0046 4.0139
Columns 145 through 150
4.0050 4.0018 4.0048 3.9236 4.9777 4.8951
Columns 151 through 156
4.9985 4.9988 4.9989 4.9626 5.0054 5.0012
Columns 157 through 162
5.0025 4.9696 4.9986 5.9957 5.9888 5.9605
Columns 163 through 166
5.9931 6.0490 6.0005 5.9873
3.4. Result Analysis
The writer tries to compare between pre-test and post-test results (shown in separate tables of this paper: Table 3, Table 4 and Table 5). The testing and comparison result between data before and after testing (pre-test and post-test) shows that in 99.99% of cases the system already indicates a maximum result in executing the time grouping prediction of student graduation based on as many as 166 data.
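The comparison described above amounts to rounding each continuous network output to the nearest group number and checking it against the known target group. A minimal Python sketch of that step, using values taken from the ‘ans’ excerpt above (not the full 166-row recapitulation table):

```python
def grouping_accuracy(outputs, targets):
    # Round each continuous network output to the nearest group number
    # and count how many match the known graduation-time group.
    hits = sum(round(y) == t for y, t in zip(outputs, targets))
    return hits / len(targets)

# First six columns of the 'ans' output above; all six belong to target group 1
outputs = [1.0007, 1.0001, 1.0015, 0.9843, 1.0395, 0.9988]
targets = [1, 1, 1, 1, 1, 1]
print(grouping_accuracy(outputs, targets))  # → 1.0
```

Applied across all 166 outputs against target_all, this is the kind of tally that yields the near-perfect figure the paper reports.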
4. Conclusion
The writer concludes that, based on testing and observation against the Neural Network design results in order to predict the time grouping of student graduation based on as many as 166 data, the NN configuration to be made consists of three (3) input nodes with 700 input layers and one hidden layer, as well as an output. Performance shows the best performance figure to the sum of 0.0039396. Whereas the prediction output value in the regression graph is R=0.99829, which means it almost reached the most ideal value, namely one (1), which indicates that the target and output result are the same. The result of testing done against existing data shows correct results in making a prediction reached 99.9%, or almost all of its predictions are correct.
References
[1] Azi M. Prediksi Lama Masa Studi Mahasiswa dengan Metode Fuzzy Sugeno. Thesis. STMIK Banjarbaru Kalimantan Selatan Indonesia. 2012: 1-4.
[2] Meinanda A, Muhandri S. Prediksi Masa Studi Sarjana dengan Artificial Neural Network. Thesis. ITB. Bandung. 2009.
[3] Rahmani B. Early Model of Traffic Sign Reminder Based on Neural Network. TELKOMNIKA Telecommunication, Computing, Electronics and Control. 2012; 10(4): 479-758.
[4] Gupta M, Kumar R, Gupta RA. Neural Network Based Indexing and Recognition of Power Quality Disturbances. TELKOMNIKA Telecommunication, Computing, Electronics and Control. 2011; 9(2): 227-236.
[5] Bayu AT, Rodiyatul FS, Hermansyah. An Early Detection Method of Type-2 Diabetes Mellitus in Public Hospital. TELKOMNIKA Telecommunication, Computing, Electronics and Control. 2011; 9(2): 287-294.
[6] Hany F, Felix P, Henry K. Enhanced Neuro-Fuzzy Architecture for Electrical Load Forecasting. TELKOMNIKA. 2010; 8(2): 87-96.
[7] Dewi YS. Penerapan Metode Regresi Berstruktur Pohon pada Pendugaan Lama Masa Studi Mahasiswa Menggunakan Paket Program R. Jurnal Ilmu Dasar. 2007: 75-82.
[8] Wahyudi A. Prediksi Hasil Ujian Nasional Berbasis Jaringan Syaraf Tiruan. Laporan Skripsi. STMIK Banjarbaru. Banjarmasin. 2012: 1-2.
[9] Hermawan A. Jaringan Syaraf Tiruan Teori dan Aplikasi. Yogyakarta. Penerbit Andi. 2006.
[10] Kusumadewi S. Artifical Intellegence (Teknik dan Aplikasinya). Yogyakarta. Graha Ilmu. 2003.
[11] Kusumadewi S. Membangun Jaringan Syaraf Tiruan. Yogyakarta. Graha Ilmu. 2004.
[12] Purnomo MH, Agus K. Supervised Neural Network dan Aplikasinya. Yogyakarta. Graha Ilmu.