International Journal of Electrical and Computer Engineering (IJECE)
Vol. 10, No. 6, December 2020, pp. 5772-5778
ISSN: 2088-8708, DOI: 10.11591/ijece.v10i6.pp5772-5778
A haptic feedback system based on leap motion controller for prosthetic hand application

Hussam K. Abdul-Ameer (1), Luma Issa Abdul-Kreem (2), Huda Adnan (3), Zahra Sami (4)

(1,3,4) Biomedical Engineering Department, Al-Khwarizmi College of Engineering, University of Baghdad, Iraq
(2) Control and Systems Engineering Department, University of Technology, Iraq
Article Info

Article history:
Received Jul 13, 2019
Revised Mar 1, 2020
Accepted Mar 8, 2020

Keywords:
Haptic feedback
InMoov hand
Leap motion controller
Sensors placement
ABSTRACT

The leap motion controller (LMC) is a gesture sensor consisting of three infrared light emitters and two infrared stereo cameras as tracking sensors. The LMC translates hand movements into graphical data that are used in a variety of applications, such as virtual/augmented reality and object movement control. In this work, we intend to control the movements of a prosthetic hand via the LMC, in which fingers are flexed or extended in response to hand movements. This is carried out by passing the data from the leap motion to a processing unit that processes the raw data with an open-source package (Processing i3) in order to control five servo motors using a microcontroller board. In addition, a haptic setup is proposed using force sensors (FSRs) and vibro-motors, in which the speed of these motors is proportional to the amount of grasp force exerted by the prosthetic hand. An investigation into the optimal placement of the FSRs on the prosthetic hand, to obtain convenient haptic feedback, has been carried out. The results show the effect of object shape and weight on the obtained FSR response and how they influence the locations of the sensors.
Copyright (c) 2020 Institute of Advanced Engineering and Science. All rights reserved.
Corresponding Author:
Hussam K. Abdul-Ameer,
Al-Khwarizmi College of Engineering,
University of Baghdad, Baghdad, Iraq.
Email: hussam@kecbu.uobaghdad.edu.iq
1. INTRODUCTION
Capturing human body motion is a growing research area due to potential applications in robotics and informatics. Motion capture of the human hand is a complex task because each finger, and even each phalanx, has distinct and independent movements that yield, in return, higher degrees of freedom [1-4]. Some techniques have been developed to capture hand motions, see [3-6]. However, accuracy remains a major setback in tracking hand motion. The leap motion controller is a motion sensor developed for hand motion tracking with a high detection accuracy, reaching 0.01 mm [7]. It consists of three parts, two cameras, LEDs, and a microcontroller, in which the cameras capture successive images of the hand; these are then passed to the controller, which processes the images and extracts spatial information about the hand and fingers.
Leap motion has been used in several projects to recognize hand gestures [8-13]. Although approaching the dexterity of a human hand to control a robotic arm is difficult, the LM was used for this task in [14-16], where hand gestures were translated into joint angles to perform a specific task. A potential application for the LM is in robotic surgery, in which a surgeon could control surgical instruments from a console located in the operating room [17-19]. Touchless interaction has been investigated in many works. Such studies can improve user performance in different applications, such as wheelchair control and maneuvering, and browsing of medical images using leap motion [20-23]. Additional works have been carried out to investigate haptic feedback with
Journal homepage: http://ijece.iaescore.com
implantation of leap motion, see [24-26]. However, the locations of the sensors responsible for generating appropriate control signals for the haptic actuators need further investigation.
The aim of this work is to investigate the robustness of a modified haptic-prosthetic system for telemedicine applications. We suggest using the leap motion controller to control the movements of a prosthetic hand and to create a haptic sensation for the subjects. The haptic system consists of three FSRs and two vibro-motors attached to a glove. A microcontroller is used to synchronize the movement of the actuators with the FSR signals. In addition, we investigate the response of the FSRs at different spatial locations and with different objects.
The paper is organized as follows: section 2 presents the adopted methodology, while section 3 shows the experimental setup. Experimental results and discussion are described in section 4. Our conclusions are drawn in the final section.
2. METHODOLOGY
For the sake of clarity, the proposed methodology is divided into three parts: leap motion, prosthetic hand, and haptic feedback system. Figure 1 shows a block diagram of the suggested control system. In the following, we explain each part in detail.
[Figure 1: Leap Motion Controller → Processing i3 (used to obtain fingertip directions and calculate finger joint angles) → Microcontroller (Arduino board, to control the servos and vibro-motors and acquire the FSR signals) → InMoov prosthetic hand with attached FSRs → haptic glove (V1 and V2 are haptic actuators)]
Figure 1. Block diagram of the suggested system for controlling the InMoov hand and creating a haptic sensation. Solid arrows refer to control signals, while empty arrows refer to feedback signals. The touchless interaction between the haptic glove and the prosthetic hand is expressed by two arrows: the filled arrow controls the prosthetic hand, while the empty arrow carries the haptic feedback.
2.1. Leap motion
The leap motion controller is a human-hand tracking device equipped with infrared sensors that enable the LMC to respond swiftly to finger and hand movements. It intends to revolutionize the way we use our computers in favor of the touchless interaction concept. The leap motion has a compact size, which makes it suitable for different control applications. The height of the interaction space is 60 cm above the LMC, and the field of interaction is shaped like an inverted pyramid. This pyramidal shape increases the cameras' field of view to 150 degrees, with a detection capability ranging from 25 mm to 600 mm above the LMC [1].
The first phase of the work methodology is to acquire the joint angles of the human fingers' movements from the LMC using the Processing software, an open-source framework for visual-art programming. The angle values are then converted into appropriate control signals that are fed to the actuators driving the prosthetic hand fingers.
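This angle-to-actuator step amounts to a linear mapping from a calibrated joint-angle range onto a servo command. The sketch below illustrates one way such a mapping could look; the function name, the clamping behavior, and the 0-180 degree servo range are illustrative assumptions, not taken from the paper's code:

```python
def angle_to_servo(angle_deg, angle_min, angle_max, servo_min=0, servo_max=180):
    """Linearly map a leap-motion joint angle (degrees) onto a servo command.

    angle_min and angle_max are per-finger calibration bounds (e.g. the
    min/max angles recorded for a subject); the result is clamped to the
    servo's mechanical range so out-of-range angles cannot drive the motor
    beyond its limits.
    """
    span = angle_max - angle_min
    if span == 0:
        return servo_min
    t = (angle_deg - angle_min) / span   # 0.0 at one calibration bound, 1.0 at the other
    t = max(0.0, min(1.0, t))
    return servo_min + t * (servo_max - servo_min)
```

For example, with index-finger bounds of -82 to 22 degrees, an angle of -30 degrees lands at the midpoint of the range and maps to a servo command of 90 degrees.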
2.2. Prosthetic hand
We suggest using a five-finger prosthetic hand in which each finger can be controlled individually according to the information obtained from the LMC. Many open-source 3D-printable models of five-finger prosthetic hands are available on the internet. A reliable, anthropomorphic, and low-cost design is the InMoov hand [27]. This model has been adopted in this work since it is based on a sharing-and-enhancing policy. Some amendments to the fingers should be made to ensure full integration between the model and the proposed haptic system. These modifications relate to the rounded finger geometry, where we modify the shape of the fingers to allow the feedback sensors to be fixed easily on the fingertips. Figure 2 shows the InMoov model and the modified printed design.
Figure 2. InMoov prosthetic hand: (a) the 3D-printed model, (b) the modified model, (c) the original finger model, and (d) the index finger modified to allow an FSR to be fixed at the fingertip. All models were printed using a MakerBot Replicator.
2.3. Haptic system
There are different configurations for creating a haptic sensation. The one proposed in this work is based on feedback obtained from the reaction forces on the grasped object. These reactions trigger the haptic actuators, where a vibro-motor is used for this purpose. Adding haptic feedback to touchless interaction has potential for many applications, such as gaming and rehabilitation.
3. EXPERIMENTAL SETUP
To demonstrate the proposed methodology, the practical part of this work has been divided into two subsections.
3.1. LMC and haptic system
The LMC was placed in front of the user to acquire images of the user's hands, which should be held within 150 degrees of the LM and from 25 mm to 600 mm directly above the device. The captured information is then transferred to a personal computer to extract information related to the fingers' movements and directions. The Processing software is used for this purpose, with the proper libraries added so that Processing can read and interpret the incoming information from the LMC. A setback was detected due to the large amount of data acquired by the LMC, which the Processing software cannot handle smoothly. This lagging problem was minimized by reducing the number of frames to be processed and by using a personal computer with a higher processing capability.
A haptic setup is proposed using three force-sensing resistors (FSRs), type Interlink 402, and two vibro-motors, model 1027. The FSRs are laid out at different locations on the modified InMoov hand. As mentioned earlier, these sensors are used to generate the signals that trigger the haptic actuators, in which the mean value of the incoming signals is used to activate the haptic actuators (the vibrating motors). To mimic a realistic grasp feeling, the speed of the haptic actuators is proportional to this mean value.
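The rule "actuator speed proportional to the mean FSR value" could be sketched as below, assuming 10-bit ADC readings and an 8-bit PWM output as on a typical Arduino; the function name and scaling are illustrative assumptions, not the paper's firmware:

```python
def vibro_pwm(fsr_readings, adc_max=1023, pwm_max=255):
    """Map the mean of the FSR readings onto a vibro-motor PWM duty value.

    fsr_readings are raw ADC samples (0..adc_max, as returned by an
    Arduino analogRead); the returned value (0..pwm_max) would be passed
    to analogWrite, so motor speed tracks the mean grasp-force signal.
    """
    if not fsr_readings:
        return 0                      # no sensor data: motor off
    mean = sum(fsr_readings) / len(fsr_readings)
    return round(mean / adc_max * pwm_max)
```

Scaling by the mean rather than any single sensor makes the vibration intensity reflect the overall grasp force rather than one contact point.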
3.2. Prosthetic hand and sensor calibrations
The prosthetic hand used in this work was the InMoov, where the fingers' shape was modified to be capable of holding the FSRs, see Figure 2. The hand was printed using a 3D MakerBot Replicator and PLA filament. The movement of each finger is actuated by a servomotor of type TowerPro MG995; five servo motors were used, one for each finger. A servo horn and a fishing line are attached to each servomotor to control the motion direction of the fingers, which can be flexed or relaxed depending on the motor's rotational direction. A microcontroller of type Arduino MEGA is utilized to control the servomotors and vibrating motors and to acquire the FSR signals. The FSRs were calibrated using a range of weight values, and a best-fitting polynomial function between the output voltage from each FSR and the applied weights was obtained for each FSR used. These polynomial functions were used to obtain the FSR responses at selected locations on the InMoov. A schematic diagram of the used components and their wiring is shown in Figure 3.
Figure 3. Schematic diagram of the haptic-hand system: (1) vibrating motor, (2) FSR, (3) Arduino MEGA, (4) servo motor, (5) battery.
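The calibration step, fitting a polynomial between known weights and the measured FSR output voltage, can be sketched with NumPy. The weight/voltage pairs below are hypothetical placeholders, not the paper's measured data, and the fit order is one plausible choice:

```python
import numpy as np

# Hypothetical calibration pairs: FSR output voltage (V) measured for known weights (g).
volts = np.array([0.5, 1.5, 3.0])
weights = np.array([50.0, 200.0, 800.0])

# Fit a best-fitting polynomial (here 2nd order) mapping voltage -> weight;
# one such function would be obtained per sensor.
coeffs = np.polyfit(volts, weights, deg=2)
volt_to_weight = np.poly1d(coeffs)

# Estimated weight for a measured voltage (recovers ~200 g at a calibration point):
estimate = volt_to_weight(1.5)
```

Once each sensor has its own polynomial, a raw voltage reading can be converted to an equivalent grasp weight before comparing responses across placement locations.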
4. RESULTS AND DISCUSSION
In this work, the LMC is used to control the motion of a modified version of the InMoov hand and to return haptic feedback. In addition, the locations of the feedback sensors for the suggested haptic system are investigated to provide a more realistic touch experience. To validate the obtained results, five subjects volunteered to carry out the prosthetic hand control experiments, in which the tests were carried out for all participants. Table 1 shows the age and gender of the subjects. Although the angles and directions gathered from the leap motion functions differ for each participant, the hand response matched the hand movements of all subjects. Rarely, response lagging occurred due to a limitation in the Processing software's ability to acquire and process images.
Table 1. The participants' gender and age, with the maximum and minimum angle values for each finger recorded by the LMC (full straight and straight-fist).

Gender  Age | Index Min/Max | Middle Min/Max | Ring Min/Max | Pinky Min/Max | Thumb Min/Max
Male    60  |   -82 / 22    |   -87 / 42     |  -80 / 35    |   -52 / 20    |   -60 / 39
Male    20  |   -61 / 23    |   -80 / 39     |  -84 / 42    |   -66 / 35    |   -98 / 31
Female  50  |   -67 / 15    |   -82 / 22     |  -75 / 24    |   -51 / 19    |   -88 / 38
Female  25  |   -72 / 32    |   -81 / 51     |  -82 / 57    |   -75 / 20    |   -68 / 30
Female  17  |   -65 / 23    |   -82 / 40     |  -75 / 33    |   -53 / 30    |   -80 / 24
An investigation has been carried out into the optimal placement of the FSRs, such that these FSRs can return convenient haptic feedback. Since cylindrical objects are widely utilized to examine grasping reactions [28], three cylindrical objects with different diameters and 12 locations on the prosthetic hand were selected for this investigation. Figure 4 shows the layout of the FSR sensors, in which the 12 locations are disseminated along the prosthetic hand; the output responses from the FSRs were recorded and are presented in Figure 5.
It can be noticed that the grasping reactions at locations 3 to 6 tend, in general, to decrease compared with the other locations. Such performance is linked to the cylindrical shape of the grasped object, in which line contact occurs between the grasped object and the upper part of the prosthetic palm. Interestingly, a wide variance in the reaction values at locations 7, 8, and 9 is detected when the objects' diameters are changed. This finding is related to the geometry of the object and the modified fingers, where the object is surrounded by the fingers; this encircling lets the FSRs generate an electric voltage proportional to the size of the grasped object. However, increasing the object size up to a critical value, which needs further investigation, can cause a superficial result due to loss of contact between the sensors and the grasped object. A similar FSR response occurs when the sensors are positioned at locations 10 and 11, while location 12 returns a lower reaction value compared with the previous locations. The reason for this rather contradictory result is still not completely clear, but we attribute it to the palm shape of the InMoov. Our findings would seem to show that the
FSRs located at the tips of the index and middle fingers give better reactions than the other locations, assuming that the grasped objects have a cylindrical shape. However, differently shaped test objects need to be investigated to obtain a comprehensive understanding of the relation between the grasped objects and the modified prosthetic hand. Partially captured objects can also affect the grasping reactions, which may decrease the reaction value and may also cause slipping of the grasped objects. Thus, the grasp force applied by the real hand should be proportional to the object diameter to prevent the object from sliding. This requires a further study, which is beyond the scope of this work.
The weight of the object affects the reaction depending on the prosthetic hand's orientation. In a horizontal orientation, the FSR is influenced by the object's weight, while in a vertical orientation the weight does not contribute to the grasping reactions. Instead, it has a slipping effect, which should be balanced by increasing the applied grasping force.
Figure 4. Test locations on the InMoov used to investigate the FSR responses when a cylindrical object is grasped
Figure 5. FSR responses at different locations on the InMoov hand; cylindrical objects are used to investigate the cylindrical grasp
5. CONCLUSION
In this work, the LMC is utilized to control an anthropomorphic hand with haptic feedback to perform cylindrical grasping tasks. The proposed system worked well, and a smooth response was obtained in general. Although the LMC enables decent touchless interaction between the user and the prosthetic hand, the geometry of the prosthetic hand requires additional modifications and dynamic analysis to ensure full integration with the haptic system. Such investigations will improve the potential of using the LMC in telesurgery and tele-examination, since accuracy and interaction are the utmost key factors. We investigated the locations of the feedback sensors for the proposed haptic system, where 12 locations distributed along the prosthetic hand were selected. The tip of the middle finger yields a good grasping reaction despite variations in the object's shape. Placing the FSR on the palm of the hand can give an acceptable reaction; however, the object size can decrease the contact area and return a less effective contact reaction. We recommend a further analysis of optimal sensor placement to prevent object slipping.
This may be achieved by fusing the data obtained from the FSRs and a proximity sensor, where the return signal from the latter sensor can detect the occurrence of object slipping while the FSRs maintain the haptic feedback.
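Such a fusion rule could be as simple as flagging a slip when the FSRs still report grip force while the proximity reading shows the object receding. The following is a hypothetical sketch of that rule; the thresholds, units, and function name are illustrative assumptions, not part of this work:

```python
def detect_slip(fsr_mean_v, proximity_mm, prev_proximity_mm,
                grip_threshold_v=0.2, motion_threshold_mm=2.0):
    """Return True when a grasped object appears to be slipping.

    A slip is flagged when the FSRs still indicate contact (mean output
    voltage above grip_threshold_v) while the proximity sensor reports the
    object receding between two samples by more than motion_threshold_mm.
    """
    in_contact = fsr_mean_v > grip_threshold_v
    receding = (proximity_mm - prev_proximity_mm) > motion_threshold_mm
    return in_contact and receding
```

On a slip event, the controller could increase the grasp force while the FSR signals continue to drive the haptic actuators unchanged.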
ACKNOWLEDGMENT
We would like to thank Dr. Ali H. Al-Timemy for providing a printed InMoov hand. Dr. Al-Timemy is a member of staff at the Al-Khwarizmi College of Engineering, University of Baghdad.
REFERENCES
[1] R. Brouet, R. Blanch, and M.-P. Cani, "Understanding Hand Degrees of Freedom and Natural Gestures for 3D Interaction on Tabletop," Human-Computer Interaction–INTERACT, vol. 8117, pp. 297–314, 2013.
[2] B. Buchholz, T. J. Armstrong, and S. A. Goldstein, "Anthropometric data for describing the kinematics of the human hand," Ergonomics, vol. 35, no. 3, pp. 261–273, 1992.
[3] J. Lin, Ying Wu, and T. Huang, "Capturing human hand motion in image sequences," Proceedings IEEE Comput. Soc. Workshop on Motion and Video Computing, pp. 99–104, 2002.
[4] A. H. Al-Timemy, R. N. Khushaba, G. Bugmann, and J. Escudero, "Improving the performance against force variation of EMG controlled multifunctional upper-limb prostheses for transradial amputees," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 24, no. 6, pp. 650–661, 2016.
[5] C. C. Moldovan and I. Staretu, "Capturing human hand movements with a webcam to control an anthropomorphic gripper," Procedia Manufacturing, vol. 22, pp. 519–526, 2018.
[6] M. Ye, et al., "A Survey on Human Motion Analysis from Depth Data," Time-of-Flight and Depth Imaging. Sensors, Algorithms, and Applications, pp. 149–187, 2013.
[7] F. Weichert, D. Bachmann, B. Rudak, and D. Fisseler, "Analysis of the Accuracy and Robustness of the Leap Motion Controller," Sensors, vol. 13, no. 5, pp. 6380–6393, 2013.
[8] G. Marin, F. Dominio, and P. Zanuttigh, "Hand gesture recognition with leap motion and kinect devices," IEEE International Conference on Image Processing, pp. 1565–1569, 2014.
[9] W. Lu, Z. Tong, and J. Chu, "Dynamic Hand Gesture Recognition With Leap Motion Controller," IEEE Signal Process. Lett., vol. 23, no. 9, pp. 1188–1192, 2016.
[10] S. Ameur, A. B. Khalifa, and M. S. Bouhlel, "A comprehensive leap motion database for hand gesture recognition," 7th International Conference on Sciences of Electronics, Technologies of Information and Telecommunications, pp. 514–519, 2016.
[11] A. Elons, M. Ahmed, H. Shedid, and M. Tolba, "Arabic sign language recognition using leap motion sensor," 9th International Conference on Computer Engineering and Systems, pp. 368–373, 2014.
[12] A. Setiawan and R. Pulungan, "Deep Belief Networks for Recognizing Handwriting Captured by Leap Motion Controller," IJECE, vol. 8, no. 6, p. 4693, Dec. 2018.
[13] K. N. Yasen, F. L. Malallah, L. F. Abdulrazak, A. M. Darwesh, A. Khmag, and B. T. Shareef, "Hand detection and segmentation using smart path tracking fingers as features and expert system classifier," International Journal of Electrical and Computer Engineering, vol. 9, no. 6, 2019.
[14] Y. Pititeeraphab, P. Choitkunnan, N. Thongpance, K. Kullathum, and C. Pintavirooj, "Robot-arm control system using LEAP motion controller," International Conference on Biomedical Engineering, pp. 109–112, 2016.
[15] S. Chen, H. Ma, C. Yang, and M. Fu, "Hand Gesture Based Robot Control System Using Leap Motion," Intelligent Robotics and Applications, vol. 9244, pp. 581–591, 2015.
[16] I. Staretu and C. Moldovan, "Leap Motion Device Used to Control a Real Anthropomorphic Gripper," International Journal of Advanced Robotic Systems, vol. 13, no. 3, 2016.
[17] F. Despinoy, N. Zemiti, G. Forestier, A. Sanchez, P. Jannin, and P. Poignet, "Evaluation of contactless human–machine interface for robotic surgical training," Int J CARS, vol. 13, no. 1, pp. 13–24, 2018.
[18] T. A. Travaglini, P. J. Swaney, K. D. Weaver, and R. J. Webster III, "Initial Experiments with the Leap Motion as a User Interface in Robotic Endonasal Surgery," Robotics and Mechatronics, vol. 37, pp. 171–179, 2016.
[19] Y. Kim, P. C. W. Kim, R. Selle, A. Shademan, and A. Krieger, "Experimental evaluation of contact-less hand tracking systems for tele-operation of surgical tasks," IEEE International Conference on Robotics and Automation, pp. 3502–3509, 2014.
[20] M. Sathiyanarayanan and S. Rajan, "Understanding the use of leap motion touchless device in physiotherapy and improving the healthcare system in India," 9th International Conference on Communication Systems and Networks, pp. 502–507, 2017.
[21] S. Nicola, L. Stoicu-Tivadar, I. Virag, and M. Crisan-Vida, "Leap Motion supporting medical education," 12th IEEE International Symposium on Electronics and Telecommunications, pp. 153–156, 2016.
[22] A. Skraba, A. Kolozvari, D. Kofjac, and R. Stojanovic, "Wheelchair maneuvering using leap motion controller and cloud based speech control: Prototype realization," 4th Mediterranean Conference on Embedded Computing, pp. 391–394, 2015.
[23] G. M. Rosa and M. L. Elizondo, "Use of a gesture user interface as a touchless image navigation system in dental surgery: Case series report," Imaging Sci Dent, vol. 44, no. 2, pp. 155–160, 2014.
[24] R. G. Lupu, N. Botezatu, F. Ungureanu, D. Ignat, and A. Moldoveanu, "Virtual reality based stroke recovery for upper limbs using leap motion," 20th International Conference on System Theory, Control and Computing, pp. 295–299, 2016.
[25] E. Freeman, S. Brewster, and V. Lantz, "Tactile Feedback for Above-Device Gesture Interfaces: Adding Touch to Touchless Interactions," Proceedings of the 16th International Conference on Multimodal Interaction, pp. 419–426, 2014.
[26] Mingyu Kim, Changyu Jeon, and Jinmo Kim, "A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality," Sensors, vol. 17, no. 5, 2017.
[27] A. Bulgarelli, G. Toscana, L. O. Russo, G. A. Farulla, M. Indaco, and B. Bona, "A Low-Cost Open Source 3D-Printable Dexterous Anthropomorphic Robotic Hand with a Parallel Spherical Joint Wrist for Sign Languages Reproduction," International Journal of Advanced Robotic Systems, vol. 13, no. 3, 2016.
[28] M. Nieuwenhuisen, J. Stueckler, A. Berner, R. Klein, and S. Behnke, "Shape-primitive based object recognition and grasping," 7th German Conference on Robotics, pp. 1–5, 2012.