International Journal of Evaluation and Research in Education (IJERE)
Vol. 5, No. 3, September 2016, pp. 235–245
ISSN: 2252-8822
Institute of Advanced Engineering and Science
www.iaesjournal.com
Design and Implementation of Performance Metrics for Evaluation of Assessments Data

Irfan Ahmed* and Arif Bhatti*
*College of Computers and Information Technology, Taif University, Saudi Arabia
Article Info

Article history:
Received July 21, 2016
Revised August 10, 2016
Accepted August 16, 2016

Keywords:
assessment
evaluation
higher education
student outcomes

ABSTRACT
Evocative evaluation of assessment data is essential to quantify the achievements at course and program levels. The objective of this paper is to design performance metrics and respective formulas to quantitatively evaluate the achievement of set objectives and expected outcomes at the course level for program accreditation. Although assessment processes for accreditation are well documented, the existence of an evaluation process is usually assumed. This work provides performance metrics such as attainment, student achievement, and x-th percentile for the evaluation of assessment data at course and program levels. Then, sample course data and uniformly distributed synthetic data are used to analyze the results from the designed metrics. The primary findings of this work are twofold: (i) analysis with sample course assessment data reveals that qualitative mapping between marks obtained in assessments and the defined outcomes is essential for meaningful results; (ii) analysis with synthetic data shows that higher values of one metric do not imply higher values of the other metrics, since they depend upon the distribution of obtained marks. In particular, for uniformly distributed marks, $\mathrm{achievement} < \mathrm{attainment}$ for $\mathrm{meanOfUniformDistr} < \mathrm{averageMarks} < \mathrm{passingThreshold}$. The authors hope that the articulated description of the evaluation formulas will help convergence to a high quality standard in the evaluation process.

Copyright © 2016 Institute of Advanced Engineering and Science. All rights reserved.
Corresponding Author:
Irfan Ahmed
College of Computers and Information Technology,
Taif University,
Taif-21974, Saudi Arabia
i.ahmed@tu.edu.sa
1. INTRODUCTION
Any educational program starts with a mission statement, objectives, and the program or student outcomes. The mission statement describes the broad goals of the program. Program educational objectives (PEO) are statements regarding the expected positions that students may attain within a few years of graduation, whereas student outcomes (SO) are the expected skills at the time of graduation. SOs are directly tied to the course learning outcomes (CLO), which are the expected skills of a student at the end of a course in a program; more formal definitions can be found at [1].

Assessment and evaluation are integral parts of quality assurance, continuous improvement, and accreditation. Assessment is defined as one or more processes that identify, collect, and prepare the data necessary for evaluation. Evaluation is defined as one or more processes for interpreting the data acquired through the assessment processes in order to determine how well SOs are being attained [1].

Many authors have published their work on continuous improvement, data collection, and assessment strategies, but there is no work in the literature that focuses on the evaluation of the assessment data at course and program levels. A complete procedure for ABET accreditation for Engineering programs at Qassim University has been presented in [2]. It gives the detailed implementation of the continuous improvement process, effecting major changes in the educational plan, curricular content, facilities, activities, teaching methodologies, and assessment practices. But this paper does not go into the details of the evaluation process. Olds et al. [3] examine many possible assessment methods in compliance with ABET criteria. They categorize assessment methodologies into descriptive and experimental studies,
Journal Homepage: http://iaesjournal.com/online/index.php/IJERE
and provide various effective types of assessments for engineering education without the insights of evaluation strategies. An overview of program level assessment for continuous improvement is given in [4]. It explains the assessment process in five steps, from the identification of educational objectives to the measurement of assessment data, but none of these steps provides the evaluation methodology. A complete assessment process, from writing good learning outcomes, to mapping between course level outcomes and program level outcomes, to data collection using various direct and indirect assessment methods, is illustrated in [5] and [6]. In [7] the authors present an assessment plan and continuous improvement process at James Madison University. They have introduced course assessment and continuous improvement (CACI) reports at course level and a student outcomes summary report (SOSR) at program level. This paper shows some sample reports and assessment templates but does not discuss the evaluation process. A web-based tool has been introduced in [8] for outcome-based, open-ended, and recursive hierarchical quantitative assessment. This quantitative assessment is used to structure outcomes and measures into a leveled hierarchy, with course outcomes at the bottom and more general objectives at the top. A general curriculum outcome (GCO) layer has been added between course outcomes and program or student outcomes. In [9] both direct and indirect measures are used to collect and analyze data to assess the attainment of the student outcomes. To ensure data integrity, a set of rubrics with benchmarks and performance indicators at both the program and curriculum levels is developed. Each outcome has been assessed at different levels (introductory, beginning, developing, proficient, exemplary) and from different sources. The article [10] presents discussions on writing learning outcomes and on assessing soft skills in engineering education. The paper [11] describes the assessment techniques and the mapping of CLO to SO without the insight of the evaluation process. A case study [12] describes the features that contribute to assessment quality at the programme, course, and task level. This case study has a particular focus on technical aspects such as task analysis and task relationship patterns. Another case study [13] presents a health science program reform and evaluation. It discusses the potential for evaluation to establish responsive communication between students, teaching staff, and programme administrators, ensuring a match between the intended, implemented, and attained curriculum. A web-based Instrumental Module Development System (IMODS) for outcome-based course design has been presented in [14]. It defines the learning outcomes, mapping, and assessment process, but the evaluation of assessment data is not explained. The work of Ibrahim et al. [15] is close to our proposed design. It consists of a web-based tool to measure the mean, standard deviation, and achievement through course assessments' data. They formulate these evaluation metrics by assuming a normally distributed dataset only. These formulas cannot be applied to other distributions that may occur in real practice. None of these works, however, provides the detail of general evaluation metrics and their use in course and program assessment.

The rest of the article is organized as follows: the next section describes the performance metrics formulation; course level evaluation based on the performance metrics is explained in section 3; analysis and interpretations are discussed in section 4; and conclusions are drawn in section 5.
2. PERFORMANCE METRICS FORMULATION
Let $A_{i,j,m}$ be the marks obtained by student $m$ in question $j$ of assessment $i$ (homework, assignment, quiz, midterm, or final, etc.). Here $i$ can take the values $i = 1, 2, \ldots, I$; $j$ can take the values $j = 1, 2, \ldots, J_i$; and $m$ can take the values $m = 1, 2, \ldots, M$. $I$, $J_i$, and $M$ are the total number of assessments, the number of questions in assessment $i$, and the number of students, respectively.
For quantitative analysis, the question is the basic unit of computation for assessment. The average score of question $j$ in assessment $i$ that has $M$ students is given by

$$B_{i,j} = \frac{1}{M} \sum_{m=1}^{M} A_{i,j,m} \qquad (1)$$

$B_{i,j}$ can be written in vector form as

$$\tilde{B} = \begin{bmatrix} B_{1} & B_{2} & \cdots & B_{I} \end{bmatrix}_{1 \times L} \qquad (2)$$
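As a minimal sketch of (1), the per-question average can be computed directly from the marks array; the data and variable names below are ours, not from the paper:

```python
# Average score B[i][j] of question j in assessment i over M students, as in (1).
# A[i] is a list of questions; each question holds the marks of the M students.
# The marks here are hypothetical.
A = {
    0: [[5.0, 6.0, 7.0],   # assessment 0, question 0: marks of 3 students
        [2.0, 3.0, 4.0]],  # assessment 0, question 1
}

def average_score(marks):
    """Mean of one question's marks across all M students, eq. (1)."""
    return sum(marks) / len(marks)

# B mirrors the structure of A with one average per question, as in (2).
B = {i: [average_score(q) for q in questions] for i, questions in A.items()}
print(B[0])  # [6.0, 3.0]
```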
The passing threshold (PT) could be absolute, relative, or composite [8], such that the PT of question $j$ in assessment $i$ is given by one of the following:

$$PT_{i,j} = \gamma\, Q^{tot}_{i,j} \qquad (3)$$

$$PT_{i,j} = B_{i,j} \qquad (4)$$

$$PT_{i,j} = \min\{ B_{i,j},\ \gamma\, Q^{tot}_{i,j} \} \qquad (5)$$
where $Q^{tot}_{i,j}$ is the maximum or total marks of question $j$ in assessment $i$ and $0 < \gamma < 1$.
The maximum, minimum, standard deviation, and x-th percentile of question $j$ in assessment $i$ are calculated as

$$A_{i,j,\max} = \max_{m} A_{i,j,m}, \quad A_{i,j,\min} = \min_{m} A_{i,j,m}, \quad A_{i,j,std} = \mathrm{stdev}_m(A_{i,j,m}), \quad A_{i,j,per} = \mathrm{percentile}(A_{i,j,\cdot},\, x)$$
Course learning outcomes describe what students are expected to learn in a course. A mapping between CLOs and assessment questions is required to compute the attainment of the course CLOs. If a course covers $N$ CLOs, then the $n$-th CLO is written as $CLO_n$, $n = 1, 2, \ldots, N$. The three dimensional matrix $A$ is converted into a two dimensional matrix $\tilde{A}$ as

$$\tilde{A} = \begin{bmatrix} A^{T}_{1} & A^{T}_{2} & \cdots & A^{T}_{I} \end{bmatrix} \qquad (6)$$

where $A^{T}_{i}$ is the transpose of the $i$-th assessment matrix $A_{J \times M}$. Matrix $\tilde{A}$ has the dimension $M \times L$, where $M$ is the total number of students and $L = \sum_{i=1}^{I} J_i$ is the total number of questions in all assessments.
The CLO to SO mapping matrix is given by

$$CS = \begin{bmatrix}
CS_{1,a} & CS_{1,b} & \cdots & CS_{1,k} \\
CS_{2,a} & CS_{2,b} & & \vdots \\
\vdots & & \ddots & \vdots \\
CS_{N,a} & \cdots & CS_{N,j} & CS_{N,k}
\end{bmatrix} \qquad (7)$$

The matrix element $CS_{n,a}$ is a variable: $CS_{n,a} > 0$ if $CLO_n$ maps to $SO_a$, otherwise $CS_{n,a} = 0$. A non-zero value is the relevance of the CLO in computing the SO. It can take the values 1, 2, 3 for low, moderate, and high relevance, respectively.
Similarly, the CLO to question mapping is given by the following matrix:

$$CQ = \begin{bmatrix}
CQ_{1,1} & CQ_{1,2} & \cdots & CQ_{1,L} \\
CQ_{2,1} & CQ_{2,2} & & \vdots \\
\vdots & & \ddots & \vdots \\
CQ_{N,1} & \cdots & CQ_{N,L-1} & CQ_{N,L}
\end{bmatrix} \qquad (8)$$

Rows of the above matrix contain binary variables $CQ_{n,l}$ that represent the mapping of the $n$-th CLO to question $l$, where $l$ maps to the $j$-th question of an assessment $i$. $CQ_{n,l} = 1$ if $CLO_n$ maps to question $l$.
Student marks in assessment questions are used to compute how well the students have done and what percentage of students have met a certain criterion. Every question contributes to one or more CLOs, and every CLO contributes to one or more student outcomes (SO), as shown in (8) and (7), respectively.
2.1. CLO Attainment
This metric expresses how well the students have done, in percentage, for each CLO. Attainment of a CLO is derived from the average marks obtained divided by the total marks for all questions that map to the CLO. Let $Q^{tot}_{i,j}$ be the maximum or total marks of question $j$ in assessment $i$. In general, for assessment $i$,

$$Q^{tot}_{i} = \begin{bmatrix} Q^{tot}_{i,1} & Q^{tot}_{i,2} & \cdots & Q^{tot}_{i,J} \end{bmatrix} \qquad (9)$$

and

$$\tilde{Q}^{tot} = \begin{bmatrix} Q^{tot}_{1} & Q^{tot}_{2} & \cdots & Q^{tot}_{I} \end{bmatrix}_{1 \times L} \qquad (10)$$

Then, the percentage of CLO attainment for the $n$-th CLO is given by

$$\mathrm{attainmentCLO}_n[\%] = \frac{\sum_{l=1}^{L} \tilde{B}_{l} \odot CQ_{n,l}}{\sum_{l=1}^{L} \tilde{Q}^{tot}_{l} \odot CQ_{n,l}} \times 100 \qquad (11)$$

The operator $\odot$ is used for element-wise multiplication.
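A minimal sketch of (11) with hypothetical per-question averages, totals, and one binary mapping row (all names and numbers ours):

```python
# CLO attainment (11): average marks over total marks of the questions
# mapped to a CLO, in percent. The vectors below are hypothetical.
B_vec    = [5.0, 7.0, 3.0]   # average marks per question (B-tilde)
Qtot_vec = [8.0, 8.0, 10.0]  # total marks per question (Q-tilde tot)
CQ_n     = [1, 1, 0]         # binary CLO-to-question mapping row for CLO n

num = sum(b * c for b, c in zip(B_vec, CQ_n))      # marks earned on mapped questions
den = sum(q * c for q, c in zip(Qtot_vec, CQ_n))   # marks available on mapped questions
attainment = 100.0 * num / den
print(attainment)  # 75.0: (5 + 7) / (8 + 8) * 100
```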
2.2. CLO Weightage Information
In order to get meaningful results, one should design the CLOs such that there is a uniform distribution of the marks over the CLOs in the questions to CLO mapping. For example, if a course contains four CLOs then, ideally, each CLO should get 25% weightage. The ideal case of uniform distribution of marks over the CLOs is seldom realized. In these situations, the CLO weightage information renders a fair picture of the % CLO attainment. The percentage weightage of the $n$-th CLO is given by

$$\mathrm{WeightageCLO}_n = \sum_{i} \frac{\sum_{j}^{J_i} w(CQ_{n,j})\, Q^{tot}_{i,j}}{A^{tot}_{i}}\, w(A_i) \qquad (12)$$

where $w(CQ_{n,j})$ is the weight of $CLO_n$ in question $j$, $w(A_i)$ is the weight of assessment $i$, and $A^{tot}_{i}$ is the total marks of assessment $i$.
2.3. Student Achievement per CLO
Student achievement per CLO is defined as the percentage of students who are above the expected level, as shown in (5). The expectation or target is a design parameter; one choice of the target could be $\min(B_{i,j},\ 0.7\, Q^{tot}_{i,j})$, i.e., the minimum of the average obtained marks and 70% of the total marks [8]. The metric counts the number of students who met the criterion by comparing each student's marks in a question $j$ of an assessment $i$. If the marks obtained $A_{i,j,m}$ are greater than the passing threshold $PT_{i,j}$, then it increments the counting variable $CPS$ (count pass student) by 1. Finally, $CPS_{i,j}$ or $CPS_l$¹ contains the number of passed students for each question $j$ in assessment $i$. Therefore, the average student achievement per CLO is given by

$$\mathrm{SACLO}_n = \frac{\sum_{i=1}^{I} \frac{1}{M_i} \sum_{j=1}^{J_i} CPS_{i,j}\, CQ_{n,i,j}\, Q^{tot}_{i,j}}{\sum_{i=1}^{I} \sum_{j=1}^{J_i} Q^{tot}_{i,j}\, CQ_{n,i,j}} \qquad (13)$$

where $M_i$ is the number of students that participated in assessment $i$.
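A compact sketch of (13) for a single CLO, using the composite threshold (5); marks and names are hypothetical:

```python
# Student achievement per CLO (13): weighted fraction of students above the
# passing threshold per question, averaged over the mapped questions.
marks = {  # (assessment i, question j) -> list of student marks (hypothetical)
    (0, 0): [5.0, 6.0, 7.5, 3.0],
    (0, 1): [8.0, 2.0, 7.0, 7.5],
}
q_tot = {(0, 0): 8.0, (0, 1): 10.0}   # total marks per question
cq    = {(0, 0): 1, (0, 1): 1}        # both questions map to CLO n

def passing_threshold(avg, total, gamma=0.7):
    return min(avg, gamma * total)    # composite threshold, eq. (5)

num = den = 0.0
for key, ms in marks.items():
    pt = passing_threshold(sum(ms) / len(ms), q_tot[key])
    cps = sum(1 for m in ms if m > pt)           # count of passing students
    num += (cps / len(ms)) * cq[key] * q_tot[key]
    den += q_tot[key] * cq[key]
sa = num / den
print(round(sa, 3))  # 0.639: (0.5*8 + 0.75*10) / 18
```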
2.4. Student Perception of CLO Attainment
A course survey is conducted at the end of each semester to gauge students' perception of how well the CLOs were covered in the course. The metric is the average of CLO perception from the students. For each CLO, students provide their input on a scale of 1–5, where 1 means the CLO is not achieved and 5 means the CLO is achieved completely. The summary of responses is given in the following matrix:

$$SC = \begin{bmatrix}
SC_{1,1} & SC_{1,2} & \cdots & SC_{1,N} \\
SC_{2,1} & SC_{2,2} & & \vdots \\
\vdots & & \ddots & \vdots \\
SC_{M,1} & \cdots & SC_{M,N-1} & SC_{M,N}
\end{bmatrix} \qquad (14)$$

The students' perception of the $n$-th CLO attainment is given by

$$\mathrm{SECLO}_n = \frac{1}{M} \sum_{m=1}^{M} SC_{m,n} \qquad (15)$$
2.5. x-th Percentile Marks per CLO
The x-th percentile marks per CLO is defined as the weighted average of the x-th percentile marks divided by the total marks of the questions that map to a particular CLO. Let $xP_{i,j}$ be the x-th percentile marks of question $j$ in assessment $i$. In general, for assessment $i$ we have

$$xP_{i} = \begin{bmatrix} xP_{i,1} & xP_{i,2} & \cdots & xP_{i,J} \end{bmatrix} \qquad (16)$$

and

$$\tilde{xP} = \begin{bmatrix} xP_{1} & xP_{2} & \cdots & xP_{I} \end{bmatrix}_{1 \times L} \qquad (17)$$
¹ $CPS_l$ is a row vector form of $CPS_{i,j}$, similar to (6) or (2).
Table 1. Basic variables to compute evaluation metrics for a sample course, one row per question; the CLOs to questions mapping (8) is given by the CLOs column. PT is the passing threshold, "Above PT" is the number of students above the PT, and the last four columns are the minimum, maximum, standard deviation, and 50th percentile of the obtained marks.

Assessment    Q#  CLOs   Total  Avg    PT     Above PT  Min   Max   Std   50th pct
Midterm       1   1,2,3  8      5.22   5.22   6         1.2   7.35  1.78  5.85
Midterm       2   1      8      6.6    5.6    9         4.2   7.5   0.95  6.75
Midterm       3   2      8      6.99   5.6    10        6     7.5   0.62  7.35
Midterm       4   3,4    8      6.69   5.6    9         4.5   7.5   0.92  6.9
Quiz1         1   1      4      3.25   2.8    9         1.3   4     0.78  3.3
Quiz4         1   6      4      3.664  2.8    10        3.04  3.84  0.25  3.76
Quiz3         1   5      4      3.12   2.8    6         2.4   3.84  0.47  3.2
HW1           1   2      0.4    0.4    0.28   10        0.4   0.4   0.00  0.4
HW1           2   2      0.4    0.4    0.28   10        0.4   0.4   0.00  0.4
HW1           3   3      0.4    0.4    0.28   10        0.4   0.4   0.00  0.4
HW1           4   3      0.4    0.352  0.28   8         0.16  0.4   0.10  0.4
HW1           5   3      0.4    0.128  0.128  4         0     0.32  0.12  0.08
Quiz2         1   2      4      3.68   2.8    9         2.4   4     0.44  3.84
Final         1   5      5      1.9    1.9    7         0     4     1.37  2
Final         2   5      10     8.3    7      6         5     10    1.95  9.5
Final         3   5      10     7.2    7      4         5     10    1.60  6.5
Final         4   6      10     6.1    6.1    4         1     10    2.91  5.5
Final         5   6      10     9.3    7      4         8     10    0.64  9
Class part.   1   1-6    5      4.7    3.5    10        4.27  4.9   0.21  4.79
Then, the average percentage x-th percentile marks per CLO is given by

$$\mathrm{xPercentileCLO}_n = \frac{\sum_{l=1}^{L} xP_{l}\, CQ_{n,l}}{\sum_{l=1}^{L} Q^{tot}_{l}\, CQ_{n,l}} \qquad (18)$$
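A sketch of (18) with hypothetical marks; the nearest-rank percentile used below is only one of several common percentile definitions, and the paper does not specify which one it uses:

```python
# x-th percentile per CLO (18): weighted x-th percentile marks over the total
# marks of the questions mapping to the CLO. All data here is hypothetical.
def percentile(values, x):
    """Nearest-rank x-th percentile of a list of marks."""
    s = sorted(values)
    k = max(0, min(len(s) - 1, round(x / 100.0 * (len(s) - 1))))
    return s[k]

xp_vec   = [percentile([1, 5, 6, 8, 9], 50),    # median marks of question 1
            percentile([2, 4, 7, 9, 10], 50)]   # median marks of question 2
qtot_vec = [10.0, 10.0]                         # total marks per question
cq_n     = [1, 1]                               # both questions map to CLO n

num = sum(p * c for p, c in zip(xp_vec, cq_n))
den = sum(q * c for q, c in zip(qtot_vec, cq_n))
print(num / den)  # 0.65: median marks 6 and 7 over 20 total
```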
2.6. SO Attainment
By using the CLO-SO mapping in (7), course level SO assessment can be achieved. SO attainment for an SO is computed from the CLO attainment of all CLOs that map to the SO. SO attainment is defined as the weighted average of the CLO attainments (in %):

$$\mathrm{attainmentSO}_n = \frac{\sum_{i \in C_n} \mathrm{attainmentCLO}_i\, w_i}{\sum_{i \in C_n} w_i} \qquad (19)$$

where $C_n$ is the set of CLOs that map to $SO_n$ and $w_i$ is the weight (or relevance) of the $i$-th mapping between CLO and SO.
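The weighted average of (19) is a one-liner; the example values are the CLO attainments and relevance weights that Table 3 lists for SO "a":

```python
# SO attainment (19): relevance-weighted average of the attainments of the
# CLOs that map to the SO.
def so_attainment(clo_attain, weights):
    """clo_attain, weights: values for the CLOs mapping to one SO."""
    return sum(a * w for a, w in zip(clo_attain, weights)) / sum(weights)

# SO "a" of Table 3: CLO2, CLO5, CLO6 with relevance weights 2, 3, 3.
print(round(so_attainment([82.91, 74.18, 81.94], [2, 3, 3]), 2))  # 79.27
```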
2.7. Student Achievement per SO
It is defined as the weighted average of the student achievement of the CLOs (in %) that map to a particular SO:

$$\mathrm{SASO}_n = \frac{\sum_{i \in C_n} \mathrm{SACLO}_i\, w_i}{\sum_{i \in C_n} w_i} \qquad (20)$$
2.8. Student Perception of SOs Attainment
Student perception of SOs attainment gives an indirect measurement of SO attainment. This metric is derived from the student perception of CLOs attainment in (15) and the CLO-SO mapping in (7):

$$\mathrm{SESO}_n = \frac{\sum_{i \in C_n} \mathrm{SECLO}_i\, w_i}{\sum_{i \in C_n} w_i} \qquad (21)$$
2.9. x-th Percentile per SO
The x-th percentile per SO uses the CLO-average x-th percentile % marks with the CLO-SO mapping in (7):

$$\mathrm{xPercentileSO}_n = \frac{\sum_{i \in C_n} \mathrm{xPercentileCLO}_i\, w_i}{\sum_{i \in C_n} w_i} \qquad (22)$$
3. COURSE LEVEL PERFORMANCE EVALUATION
Direct assessment of an academic program is performed by evaluation of the courses in the study plan. If not all, at least a selected subset of the courses is required to find out the program's success level. The previous section presented formal formulations of the performance metrics that can be used in course evaluation. This section discusses an implementation of these metrics in the evaluation of a sample course. The section starts with the setup required for evaluation, followed by the evaluation results, and concludes by discussing issues and concerns.
Table 2. Mapping of Course Learning Outcomes (CLOs) to Student Outcomes (SOs) (7) as a Course Assessment Matrix (a blank cell means no mapping; SOs d, f, g, and j are not mapped by this course)

CLO    a    b    c    e    h    i    k
1                2
2      2    1              1
3                3
4                     2         1
5      3    2                        2
6      3    3    2
Table 3. Computation of SO attainment from CLO attainment using the Table 2 CLO-SO mapping

CLO                      a      b      c      e      h      i      k
1                                      79.08
2                        82.91  82.91                82.91
3                                      78.78
4                                             87.62         87.62
5                        74.18  74.18                              74.18
6                        81.94  81.94  81.94
SO attainment            79.68  79.68  79.94  87.62  82.91  87.62  74.18
Weighted SO attainment   79.27  79.52  79.77  87.62  82.91  87.62  74.18
Relevance                3      2      3      2      1      1      2
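As a cross-check, the "Weighted SO attainment" row of Table 3 can be reproduced from the CLO attainments and the Table 2 relevance weights via (19); the sketch below (variable names ours) matches the table to within rounding:

```python
# Reproducing the weighted SO attainments of Table 3 from the CLO attainments
# and the Table 2 relevance weights, using eq. (19).
clo_attainment = {1: 79.08, 2: 82.91, 3: 78.78, 4: 87.62, 5: 74.18, 6: 81.94}
cs = {  # SO -> {CLO: relevance}, transcribed from Table 2
    "a": {2: 2, 5: 3, 6: 3},
    "b": {2: 1, 5: 2, 6: 3},
    "c": {1: 2, 3: 3, 6: 2},
    "e": {4: 2}, "h": {2: 1}, "i": {4: 1}, "k": {5: 2},
}
weighted = {
    so: sum(clo_attainment[c] * w for c, w in m.items()) / sum(m.values())
    for so, m in cs.items()
}
for so in "abcehik":
    print(so, round(weighted[so], 2))
```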
3.1. Course Setup for Evaluation
Course evaluation is the computation of the performance metrics from the basic variables of the course, followed by analysis. To compute the metrics defined in section 2 from the collected data, each course must have well defined CLOs, a CLO to SO mapping as in (7), a CLO to question mapping in each assessment as in (8), and a passing threshold as defined in (5). Table 1 shows the basic variables of a sample course, including the mapping between CLOs and questions for all assessments conducted in the course. Table 2 shows the mapping between CLOs and SOs defined by the course designer. A numeric value in a cell represents a relationship between a CLO and an SO: a value of 1, 2, or 3 indicates that a CLO addresses an SO slightly, moderately, or substantively. The passing threshold is set to min(avg, 70%), which is used in the computation of student achievement per CLO (13) and student achievement per SO (20).
3.2. Performance Evaluation
This section presents computed values of the metrics defined in section 2 for the sample course.

3.2.1. CLO Attainment
CLO attainment for the sample single section course is shown in Figure 1. CLO attainment quantifies the student attainment level of a particular CLO through the percentage marks allocated to that CLO. Since this is a percentage value of the average marks obtained in the questions mapped to a particular CLO, it is necessary either to distribute the marks uniformly over the CLOs or to give explicit evidence of the CLO to marks ratio.
Figure 1. CLO Performance Evaluation (bar charts of CLO attainment [%], student achievement [%], 50th percentile % marks, and student perception of CLO [%] for CLO1–CLO6)
3.2.2. CLO Weightage
The CLO weightage for the sample single section course is assumed as CLO1 15%, CLO2 16%, CLO3 9%, CLO4 5%, CLO5 30%, and CLO6 25%. The CLO attainment and student achievement of CLO are based on these weightages.
3.2.3. Student Achievement per CLO
Student achievement per CLO for the sample single section course is shown in Figure 1. It is the percentage of students that meet or exceed the target or expectations. There is an upper limit for the target (70%), but there is no lower limit, and it depends upon the average marks. We can get absolute student achievement by fixing the target, for example, at a target value of 60%.
3.2.4. 50-th Percentile per CLO
The 50-th percentile for the sample course is shown in Figure 1. It shows the percentage median marks for each CLO.
3.2.5. Student Perception of CLOs Attainment
For each CLO, student perception of CLO attainment can be on a scale of 1 to 5, with 1 meaning strongly disagree and 5 meaning strongly agree. Student perception of CLO attainment for the sample course is shown in Figure 1. There were 10 students, but 8 participated in the course survey.
3.2.6. SO Attainment
Bar graphs for SO attainment are shown in Figure 2. These levels are averages of the CLO attainments that map to a particular SO; therefore, the health of the CLO attainments and of the CLO-SO mapping is critical.
Figure 2. SO Performance Evaluation (bar charts of average % SO attainment, student achievement of SO [%], 50th percentile % marks, and student perception of SO attainment [%] for SOs a, b, c, e, h, i, k)
3.2.7. Student Achievement per SO
Student achievement for the sample course is given in Figure 2. This value is derived from the CLO achievements and depicts the percentage of students that achieved the set target, averaged over the CLOs mapped to that SO.
3.2.8. 50-th Percentile per SO
The 50-th percentile per SO values are derived from the 50-th percentile per CLO. Figure 2 depicts the 50-th percentile, or median, marks for each mapped SO.
3.2.9. Student Perception of SOs Attainment
The student perception of SOs attainment is shown in Figure 2. It is an indirect measurement obtained from the course exit survey, where students provide their feedback about the CLOs attainment.
3.3. Issues and Guidelines
The course designer is responsible for establishing a quality mapping between CLOs and SOs. The course instructor is responsible for the CLOs to questions mapping for all assessments. The quality of these mappings has a direct impact on the evaluation results, as discussed in the rest of this section.
3.3.1. Relationship of Questions, Marks Distribution, and CLOs
It has been observed that the questions to CLO mapping requires a uniform marks distribution over the CLOs. The quantitative measurement of CLOs provides the baseline data for direct assessment; therefore, the questions to CLOs mapping is critical in direct assessment. CLOs should be designed in such a way that they cover all the core topics (qualitative equality), and course assessments should cover all CLOs with a uniform marks distribution over the CLOs (quantitative equality). Similar measures are required in the design of capstone project rubrics. The capstone project is an important entity of a program, in which students apply the knowledge gained during the course of the program to solve engineering problems. The capstone project rubrics map to CLOs, and these CLOs usually cover all the SOs. Since the sample size in this assessment is not as large as that of direct assessment, results may differ between these assessments.
3.3.2. Questions to CLOs Mapping Approaches
Due to the many-to-many mapping between questions and CLOs, a common question arises about the weights of a question that maps to multiple CLOs. There are three possibilities:
- One-to-many mapping with equal weights
- One-to-many mapping with proportional weights
- One-to-one mapping between questions and CLOs
In this manuscript, equal weights have been used in the questions to CLOs mapping. Proportional weights add one more level of complexity for the faculty and hence more chances of errors. One-to-one mapping is another attractive solution, which eliminates the weight problem because in this case one question can be mapped to at most one CLO. In this technique many questions can be mapped to one CLO, but the converse is not possible. Proportional weights and one-to-one schemes require a proper design of the CLOs and of the mapping table between questions and CLOs.
3.3.3. CLOs to SOs Mapping within a Course
There are three choices:
- One CLO can be mapped to any number of SOs without weights (one-to-many mapping without weights)
- One CLO can be mapped to any number of SOs with weights (one-to-many mapping with weights)
- One CLO can be mapped to one SO only (one-to-one mapping) [16]
In this manuscript, one-to-many mapping with weights has been used, as shown in Table 2. One-to-many mapping without weights assumes equal weights across all SOs mapped to a particular CLO. A straightforward way of mapping is one-to-one mapping, which does not require weights, but again the design of the CLOs is important in this case.
4. ANALYSIS AND INTERPRETATIONS
This section provides a detailed analysis of course level evaluations based on the formulated evaluation metrics using synthetic data. The implementation of these metrics has revealed several new directions and interpretations. Course evaluation produces quantitative values of the attainment, achievement, and x-th percentile metrics for CLOs and SOs. For a course, the relative values of these metrics provide insight into what happened in the course, and a zero value for a metric indicates that topics related to the corresponding CLO or SO are either not covered in the course or data was not collected for evaluation.
For a multi-section course, these metrics can point to a lack of coordination among course instructors and to differences in teaching and evaluation standards. If a course instructor does not cover some CLOs, then the corresponding metrics values will be zero. For the sample course, CLOs 4 and 5 are not covered, and since these two CLOs map to SO "a", the SO is also not covered in the course. For multi-section courses, a zero value for any of the defined metrics in some of the sections indicates a lack of coordination among course instructors.

Figure 3. Comparison of CLO attainment, student achievement (absolute, relative, and composite), and 50-th percentile, plotted against average marks
In order to explain the relationship, CLO attainment, student achievement, and the 50-th percentile have been plotted against the students' average marks in Figure 3. These graphs show the three evaluation metrics' values for a range of average marks associated with a particular CLO. In this figure, the average marks are obtained from a normal distribution with the given mean (average) and standard deviation 5. The number of students is 30, and the results are averaged over 1000 iterations.
From this figure, it can be seen that CLO attainment is a linear function of the average marks of the questions mapped to that CLO. The attainment is equal to the 50-th percentile because the mean and median of a normal distribution are equal. The composite student achievement remains almost constant up to 70% average marks due to the passing threshold $\min(\mathrm{averageMarks},\ 70\%\,\mathrm{TotalMarks})$. For normally distributed marks, there are always 50% of students below the average value and 50% of students above the average value; hence, the student achievement remains constant at the 50% value. When the average marks go above 70%, the passing threshold shifts from the average value to 70%, and all the students with marks greater than 70% contribute to the student achievement. At about 80% average marks, the student achievement reaches the 100% value because all the students, even with the standard deviation of 5, now lie above the 70% threshold. The relative achievement (passing threshold = average marks) always remains around 50%. The absolute achievement (passing threshold = 0.7 × total marks) for $\gamma = 0.7$ crosses the 50% value at average marks equal to 70.
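The synthetic experiment can be sketched as follows; this is our illustrative reconstruction, not the paper's code, and it uses fewer trials than the paper's 1000 iterations:

```python
# Sketch of the synthetic-data experiment: 30 students per trial with marks
# drawn from a normal distribution (std. dev. 5); attainment and composite
# achievement are averaged over repeated trials.
import random

random.seed(1)
TOTAL, STD, N, TRIALS = 100.0, 5.0, 30, 200

def trial(mean):
    marks = [min(TOTAL, max(0.0, random.gauss(mean, STD))) for _ in range(N)]
    avg = sum(marks) / N
    attainment = 100.0 * avg / TOTAL
    pt = min(avg, 0.7 * TOTAL)                     # composite threshold (5)
    achievement = 100.0 * sum(m > pt for m in marks) / N
    return attainment, achievement

for mean in (40, 60, 80):
    att, ach = (sum(v) / TRIALS for v in zip(*(trial(mean) for _ in range(TRIALS))))
    print(f"mean={mean}: attainment={att:.1f}%, achievement={ach:.1f}%")
```

Below 70 average marks the achievement hovers near 50% while attainment tracks the mean linearly; above the 70% threshold the achievement climbs toward 100%, reproducing the shape of Figure 3.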
The designed evaluation metrics give comprehensive results when considered collectively.
1. Attainment and Achievement: The relationship between attainment and student achievement (absolute, composite) is linear for average marks greater than the set target; i.e., for a sufficiently large population size and a normal distribution of obtained marks, high values of attainment correspond to high values of achievement, and low attainment predicts low achievement. Attainment and achievement are independent for achievement levels below the threshold. If the distribution is not normal, then linearity is not guaranteed. For example, if there are 10 students, 9 students secure 50 marks (out of 100), and one student gets 10 marks, then the composite achievement is 90% but the attainment is 46%.
2. Attainment and Percentile: The 50-th percentile (or median) gives additional information about the health of the attainment. It is also called a location parameter. A median close to the attainment indicates a normal distribution of marks.
3. Achievement and Percentile: If the 50-th percentile (median) is equal to the target value of achievement, then the achievement is equal to 50%. Median values above the achievement target show that more students have met the expectation, and the value of achievement will be high. Median values less than the achievement target result in an achievement level less than 50%.
4. Attainment, Achievement, and Percentile: Attainment and the 50th percentile (median) have the same units, i.e., average and median marks in a particular CLO, whereas student achievement gives a number of students. If attainment and median are similar (a normal distribution) and both are high, then the absolute and composite achievements will also be high, because more students would have marks greater than the set target, whereas the relative achievement will remain flat at 50% because of the normally distributed marks. Conversely, if attainment and median are both low, the composite achievement becomes 50%. Note that the absolute achievement is proportional to the attainment and median near the target and becomes independent of them for average marks sufficiently less or greater than the target value.
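The relationships above can be checked numerically. The following is a minimal sketch, not the paper's implementation: it assumes attainment is the class average expressed as a percentage of the maximum marks and composite achievement is the share of students at or above the target (the exact formulas are defined earlier in the paper). Applied to the 10-student example from point 1:

```python
from statistics import mean, median

def attainment(marks, max_marks=100):
    """Average obtained marks as a percentage of the maximum."""
    return 100 * mean(marks) / max_marks

def composite_achievement(marks, target):
    """Percentage of students whose marks meet or exceed the target."""
    return 100 * sum(m >= target for m in marks) / len(marks)

# Worked example from point 1: 9 students score 50/100, one scores 10.
marks = [50] * 9 + [10]
print(attainment(marks))                 # 46.0 -> attainment 46%
print(composite_achievement(marks, 50))  # 90.0 -> composite achievement 90%
print(median(marks))                     # 50.0 -> median sits at the target
```

The single low scorer drags the average (attainment) below the target while barely affecting the head count (achievement), and the median equals the target, which is exactly the non-normal case where the linearity between the metrics breaks down.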
5. CONCLUSION AND FUTURE WORK
This work quantitatively evaluates course assessment data using the designed performance metrics (attainment, student achievement, x-th percentile). The main contribution of this work is the design and implementation of performance metrics for the evaluation of assessment data. There are many published papers on the outcome-based assessment of courses and programs, but none of them explicitly depicts the formulation of the evaluation metrics. The first finding obtained using the designed metrics is that meaningful results depend upon a qualitative mapping between the marks obtained in assessments and the CLOs: the CLO definitions derived from the course core topics require qualitative equality, and the marks distribution over the CLOs requires quantitative equality. Analysis of the results obtained from the uniformly distributed synthetic data shows that higher values of one metric do not imply higher values of the other metrics; they depend upon the distribution of the obtained marks. In particular, for uniformly distributed marks, achievement < attainment for meanOfUniformDistr < averageMarks < passingThreshold (70).

As future work, these performance metrics can be used to evaluate outcome-based program assessment with real data over two or three years. Some qualitative measurements (student course survey, graduate student survey, employer survey) can complement the study presented here.
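The finding for uniformly distributed marks can be illustrated with a short sketch. The threshold value of 70 and the attainment/achievement formulas below are assumptions consistent with the definitions used in this paper, and a deterministic grid of marks stands in for a uniform random sample:

```python
# Deterministic stand-in for uniformly distributed marks:
# one student at each integer mark from 1 to 100.
marks = list(range(1, 101))
threshold = 70  # assumed passing threshold

# Attainment: class average marks (out of 100, so already a percentage).
attainment = sum(marks) / len(marks)
# Achievement: percentage of students at or above the threshold.
achievement = 100 * sum(m >= threshold for m in marks) / len(marks)

print(attainment, achievement)  # 50.5 31.0
assert achievement < attainment  # the stated relation holds
```

With the average marks (50.5) below the passing threshold (70), the head-count metric lags the average-based metric, matching the achievement < attainment relation reported for the synthetic data.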