International Journal of Electrical and Computer Engineering (IJECE)
Vol. 15, No. 1, February 2025, pp. 292∼302
ISSN: 2088-8708, DOI: 10.11591/ijece.v15i1.pp292-302
From concept to application: building and testing a low-cost light detection and ranging system for small mobile robots using time-of-flight sensors
Andrés García, Mauricio Díaz, Fredy Martínez
Facultad Tecnológica, Universidad Distrital Francisco José de Caldas, Bogotá D.C., Colombia
Article Info

Article history:
Received May 4, 2024
Revised Aug 15, 2024
Accepted Sep 3, 2024

Keywords:
Autonomous robots
Cost reduction
Light detection and ranging technology
Mobile robotics
Sensor development
Time-of-flight sensors
ABSTRACT

Advancements in light detection and ranging (LiDAR) technology have significantly improved robotics and automated navigation. However, the high cost of traditional LiDAR sensors restricts their use in small-scale robotic projects. This paper details the development of a low-cost LiDAR prototype for small mobile robots, using time-of-flight (ToF) sensors as a cost-effective alternative. Integrated with an ESP32 microcontroller for real-time data processing and Wi-Fi connectivity, the prototype facilitates accurate distance measurement and environmental mapping, crucial for autonomous navigation. Our approach included hardware design and assembly, followed by programming the ToF sensors and ESP32 for data collection and actuation. Experiments validated the accuracy of the ToF sensors under static, dynamic, and varied lighting conditions. Results show that our low-cost system achieves accuracy and reliability comparable to more expensive options, with an average mapping error within acceptable limits for practical use. This work offers a blueprint for affordable LiDAR systems, expanding access to technology for research and education, and demonstrating the viability of ToF sensors in economical robotic navigation and mapping solutions.

This is an open access article under the CC BY-SA license.
Corresponding Author:
Fredy Martínez
Facultad Tecnológica, Universidad Distrital Francisco José de Caldas
Carrera 7 No 40B-53, Bogotá D.C., Colombia
Email: fhmartinezs@udistrital.edu.co
1. INTRODUCTION
Light detection and ranging (LiDAR) technology has emerged as a key tool in the evolution of modern robotics, offering unprecedented precision in automated navigation and environmental mapping [1]–[3]. This technology utilizes pulsed laser beams to measure distances, creating detailed three-dimensional maps of surroundings, which is crucial for various applications in robotics [4], [5]. The accuracy and reliability of LiDAR have enabled robots to perform complex tasks such as autonomous driving, aerial surveys, and industrial automation with greater efficiency and minimal human intervention [6]. The adoption of LiDAR in sectors like agriculture for crop monitoring, in archaeology for exploring inaccessible historical sites, and in forestry for biomass estimation showcases its wide-ranging impact [7], [8]. Moreover, the integration of LiDAR with other technologies such as artificial intelligence and machine learning has further enhanced its capabilities, leading to smarter and more adaptive robotic systems [9].
Despite its vast potential, the application of LiDAR in robotics has traditionally been limited by its
high cost, which restricts its accessibility to large-scale industrial projects or well-funded research initiatives [10]. The financial barrier not only hampers innovation at the grassroots level but also limits the exploration of LiDAR's benefits in everyday applications [11]. This has prompted a growing interest in developing more affordable LiDAR alternatives that can democratize this transformative technology, making it available to a broader audience [12]. Recent advancements have seen the emergence of time-of-flight (ToF) sensors as a cost-effective solution that maintains a balance between performance and expense [2], [13], [14]. These developments signify a pivotal shift in how robotic technologies can be utilized across different fields, potentially leading to more widespread adoption and innovative applications of LiDAR technology in robotics.
Traditional LiDAR systems are often expensive due to their complex design and the precision components required for their operation, restricting their usage to well-funded industrial projects or specialized research laboratories [15], [16]. This economic barrier stifles innovation by limiting the diversity of ideas and applications that could otherwise enhance technological progress and practical implementations of robotics.
Recognizing the importance of accessibility in technology is crucial for fostering innovation and broadening the impact of advanced tools like LiDAR [17]. The democratization of such technologies can lead to a significant increase in creative solutions to everyday problems, allowing a wider range of users to experiment, innovate, and contribute to their fields. Thus, there is a compelling need for a low-cost LiDAR solution that maintains functional integrity while being economically feasible [18]. Developing such solutions would not only expand the application scope of LiDAR technology but also empower a new generation of technologists and enthusiasts to experiment and innovate, thereby accelerating advancements in robotics and related areas.
ToF sensors present a promising and cost-effective alternative to traditional LiDAR systems, addressing the critical barrier of high expense associated with conventional LiDAR technologies [19]. ToF sensors operate on the principle of measuring the time it takes for a light pulse to travel to an object and back to the sensor, thereby determining the distance based on the speed of light [20]. This straightforward yet effective mechanism allows ToF sensors to perform distance measurements and environmental mappings analogous to those achieved by more complex LiDAR setups. The ability of ToF sensors to deliver real-time spatial awareness and precision at a significantly reduced cost makes them particularly appealing for applications in consumer electronics and mobile robotics, where cost considerations are paramount.
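Stated explicitly (the relation is implied by the description above rather than written out in the paper): for a measured round-trip time $\Delta t$ and the speed of light $c$, the range is

$$d = \frac{c\,\Delta t}{2},$$

where the factor of two accounts for the out-and-back path. The scale involved explains the engineering challenge: a target 1 m away returns the pulse after only about 6.7 ns, so practical ToF devices rely on dedicated on-chip timing circuitry rather than general-purpose microcontroller timers.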
The primary objective of this research is to develop a low-cost prototype that leverages ToF sensors integrated into a mobile robot platform. This prototype is designed to execute tasks traditionally performed by more expensive LiDAR-equipped robots, such as distance measurement, object detection, and environmental mapping. By utilizing ToF sensors, the prototype aims to bring the benefits of precision navigation and mapping to smaller, potentially indoor environments where deploying large-scale, high-cost LiDAR systems is impractical [21]. The focus is on creating a versatile and accessible tool that can be used in educational settings, small business applications, and by robotics hobbyists. The development of this prototype underscores an effort to democratize advanced robotic technologies, making them available and affordable to a broader audience. This project not only enhances the technological capabilities of compact robotic systems but also expands the potential for innovation in spaces constrained by size and budget.
Our methodological approach encompassed a comprehensive design and development process, tailored to integrate ToF sensors with an ESP32 microcontroller, which served as the central unit for processing and communication. The development began with a conceptual design that outlined the key functionalities and system requirements, followed by the physical assembly of the prototype. ToF sensors were selected for their cost-effectiveness and ability to perform in a range of environmental conditions, mirroring the capabilities of more sophisticated LiDAR systems. We adopted an iterative development strategy, where initial testing in controlled environments led to successive refinements in both hardware and software components. Each iteration included rigorous testing under various conditions (ranging from low-light environments to obstacle-rich paths) to ensure reliability and accuracy. This process not only enhanced the prototype's adaptability to real-world scenarios but also helped in fine-tuning the system for optimal performance across different contexts.
This research contributes significantly to the field of robotic navigation by demonstrating the practical application of ToF sensors as a viable alternative to traditional LiDAR systems in a cost-effective manner [22]. By integrating these sensors into a mobile robot platform, we provide a blueprint for constructing low-cost robotic systems that do not compromise on functionality. Our work paves the way for broader access and experimentation in the robotics community, potentially fostering innovation in various sectors including education, small-scale industrial applications, and personal technology projects.
The structure of this paper is organized to guide the reader through our study: beginning with a detailed description of the design and
technical specifications of our prototype in the Methods section, followed by a presentation of our experimental setup and testing protocols in the Results section. We then discuss the implications and potential applications of our findings in the Discussion section, concluding with a summary of our insights and suggestions for future research in the Conclusion. This organization ensures a clear and logical flow, making it easy for readers to understand our processes, replicate our results, and extend our work to new applications.
2. LITERATURE REVIEW
LiDAR technology has increasingly become a staple in automation, enhancing applications across drones, mobile robotics, and broader automation contexts. These systems typically employ ToF sensors, known for their accuracy in measuring distances swiftly and efficiently [23]. Despite their potential, the high cost and substantial weight of traditional LiDAR systems restrict their broader application, particularly in cost-sensitive or weight-sensitive environments such as consumer drones and lightweight mobile robots [24]. This limitation has spurred significant research into the development of more accessible, low-cost LiDAR alternatives that leverage compact ToF sensors. One notable advancement in this domain is the TeraRanger Evo Mini, a compact LiDAR sensor that is both affordable and power-efficient, making it ideal for battery-powered devices and embedded applications [25].
The integration of ToF sensors in mobile robotics has opened new avenues for enhancing autonomous navigation capabilities, particularly within indoor environments where precision and reliability are crucial. Recent studies have focused on combining LiDAR with vision sensors to create robust, low-cost sensing arrays for mobile robots [26]. This fusion enhances spatial awareness and improves the robots' ability to navigate and localize within complex environments. Furthermore, the innovation of adaptive scanner technologies for mobile robots highlights a growing trend towards developing flexible and adaptive sensing systems that can dynamically adjust to their surroundings, thus providing more accurate localization and efficient navigation [27]. These advancements underline the shift towards developing versatile and budget-friendly LiDAR systems that do not compromise on functionality.
Advancements in solid-state LiDAR technology also underscore a significant shift towards more sustainable and scalable applications in mobile robotics. Unlike traditional mechanical LiDAR systems that rely on moving parts, solid-state LiDAR uses a stationary laser beam and an array of ToF sensors to detect distances, significantly reducing complexity, size, and susceptibility to mechanical failures [28]. These systems are particularly advantageous for applications requiring durable and compact solutions, such as in service robots operating within cluttered or dynamic human environments. The adoption of solid-state LiDAR is set to revolutionize how robots perceive and interact with their environment, enabling more sophisticated and widespread applications in industrial automation, personal robotics, and beyond. By leveraging the capabilities of ToF sensors, the development of these innovative LiDAR systems offers promising prospects for the future of autonomous mobile robotics, providing both cost-effective and high-performance solutions [29].
Performance comparisons reveal that while commercial LiDAR systems excel in resolution and range, low-cost alternatives like the TeraRanger Evo Mini provide adequate functionality for many applications. For instance, the TeraRanger Evo Mini, although less advanced, meets the requirements for indoor navigation and object detection in smaller environments [25]. Additionally, integrating LiDAR with vision sensors enhances the overall system performance by compensating for individual sensor limitations. This combination results in a more robust system capable of functioning effectively in diverse conditions [26]. The ongoing advancements in solid-state LiDAR further bolster these capabilities, offering high durability and reduced mechanical complexity, which are critical for long-term deployment in dynamic settings [28].
The development of low-cost LiDAR systems using ToF sensors marks a significant advancement in making this technology accessible for various applications. These systems balance performance with affordability, enabling broader adoption and fostering innovation across different fields. The integration of multiple sensing technologies and the move towards solid-state designs are key in overcoming the limitations of traditional LiDAR systems, paving the way for more versatile and efficient solutions in autonomous mobile robotics.
3. METHODS
The primary goal of this research was to develop a cost-effective LiDAR prototype that could be integrated into a mobile robot capable of distance measurement, object detection, and real-time environmental mapping. To achieve this, we opted for an alternative to traditional LiDAR sensors by utilizing the ToF infrared
sensor VL53L1X, as shown in Figure 1. This sensor provides functionality similar to a LiDAR sensor but on a smaller scale and is significantly less expensive, making it ideal for budget-sensitive projects. Additionally, a stepper motor controlled by an H-bridge was adapted to enhance the field of view of the ToF sensor from its standard 27 degrees to a wider angle of 270 degrees. The ESP32 module was selected as the robot's controller due to its adequate processing speed, low power consumption, and integrated Wi-Fi and Bluetooth capabilities, which facilitate data transmission to a web platform and allow remote control operations.
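The scanning loop that results from this pairing is simple: step the motor a fixed increment, trigger a ToF reading, and tag the sample with the current angle. The sketch below illustrates that loop on an ESP32 under the Arduino core. It is a minimal sketch, not the authors' firmware: it assumes the Pololu VL53L1X library and the standard Arduino Stepper library, and the pin numbers, step size, and 200 steps-per-revolution motor are illustrative assumptions.

```cpp
// Minimal 270-degree ToF sweep on an ESP32 (Arduino core).
// Assumptions: Pololu VL53L1X library, Arduino Stepper library,
// illustrative pins, and a 200-step motor driven through the H-bridge.
#include <Wire.h>
#include <VL53L1X.h>
#include <Stepper.h>

const int STEPS_PER_REV = 200;                  // assumed motor resolution
Stepper turret(STEPS_PER_REV, 14, 27, 26, 25);  // H-bridge inputs (assumed pins)
VL53L1X tof;

void setup() {
  Serial.begin(115200);
  Wire.begin();                          // default ESP32 I2C pins (21 SDA, 22 SCL)
  tof.setTimeout(500);
  if (!tof.init()) {
    Serial.println("VL53L1X not detected");
    while (true) {}
  }
  tof.setDistanceMode(VL53L1X::Long);    // up to ~4 m, matching the datasheet range
  tof.setMeasurementTimingBudget(50000); // 50 ms budget (~20 Hz); the part reaches
  tof.startContinuous(50);               // ~50 Hz only in its short-distance mode
  turret.setSpeed(30);                   // rpm
}

void loop() {
  // Sweep 270 degrees in 1.8-degree steps, then rewind to avoid twisting wires.
  for (int s = 0; s < (STEPS_PER_REV * 3) / 4; s++) {
    turret.step(1);
    float angleDeg = s * 360.0f / STEPS_PER_REV;
    uint16_t mm = tof.read();                  // blocking read of the latest range
    Serial.printf("%.1f,%u\n", angleDeg, mm);  // angle,distance pairs for the web UI
  }
  turret.step(-(STEPS_PER_REV * 3) / 4);
}
```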
Figure 1. Wiring diagram
3.1. Hardware design and assembly
The robot's design was conceived with a focus on functionality and cost-efficiency. The chassis of the robot was constructed from acrylic, and support pieces for the control, detection, and power modules were fabricated using 3D printing, as shown in Figure 2(a). The design integrates various components essential for the prototype's operation:
− VL53L1X sensor: This state-of-the-art, miniature ToF laser sensor operates at a frequency of 50 Hz and can measure distances up to four meters. Its ability to perform accurate measurements under different lighting conditions and irrespective of the object's color or reflectivity makes it highly versatile for robotic applications.
− ESP32 module: Known for its low cost and power efficiency, the ESP32 module supports a wide range of programming languages and is compatible with numerous existing libraries. Its built-in Wi-Fi and Bluetooth facilitate seamless communication between the prototype and the web interface.
− Additional hardware: The prototype also includes battery holders, a 5-volt battery, a gear motor ranging from 3 to 9 volts, a stepper motor, an L298N motor driver or H-bridge (a minimal driving sketch follows this list), a chassis, wheels, and a laptop for control and monitoring.
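For readers wiring the L298N by hand, the full-step sequence below makes explicit what the Stepper library used in the earlier sketch does internally: the two H-bridge channels energize the stepper's coils in a four-state cycle. The pin numbers are again illustrative assumptions, not the prototype's exact wiring.

```cpp
// Full-step drive of a bipolar stepper through an L298N (illustrative pins).
// Each row is one coil state: {IN1, IN2, IN3, IN4} -> bridge A and B polarity.
const int COIL_PINS[4] = {14, 27, 26, 25};
const int SEQ[4][4] = {
  {1, 0, 1, 0},   // A+, B+
  {0, 1, 1, 0},   // A-, B+
  {0, 1, 0, 1},   // A-, B-
  {1, 0, 0, 1},   // A+, B-
};

void setupStepperPins() {
  for (int p : COIL_PINS) pinMode(p, OUTPUT);
}

void stepOnce(int phase) {                  // phase in 0..3; advance to rotate
  for (int i = 0; i < 4; i++) {
    digitalWrite(COIL_PINS[i], SEQ[phase][i]);
  }
  delay(3);                                 // let the rotor settle before the next step
}
```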
The arrangement of these components was strategically planned to optimize the available space on the acrylic base, simplify connections, and ensure easy assembly and disassembly, enhancing the prototype's maintainability and reducing the risk of connection failures during operation, as shown in Figure 2(b).
Figure 2. Design of the support structure: (a) CAD design of the support and (b) side view of the prototype robot
3.2. Software development
The prototype's functionality is complemented by a web-based interface, which allows users to interact with the robot in real time. The web interface includes:
− Control interface: Users can control the robot's movements (forward, backward, left, right, and stop) and adjust the operation mode of the LiDAR sensor (near, mid, and far) through an intuitive web page, as shown in Figure 3.
− Data visualization: The interface displays real-time data from the ToF sensor, including distance measurements and the sensor's operational mode. A graphical representation of the LiDAR scan is also available, providing a visual map of the surroundings.
− JavaScript functions: Custom scripts handle HTTP requests for robot control and data retrieval, update sensor data on the webpage, and manage the display of controls and charts based on user interactions (a sketch of the matching server-side endpoints follows Figure 3).
Figure 3. System web environment
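On the robot side, these HTTP requests must terminate in handlers on the ESP32. The fragment below is one plausible shape for those endpoints using the WebServer class from the ESP32 Arduino core; the route names (/move, /mode, /scan), parameter handling, and credentials are assumptions for illustration, since the paper does not list its exact API.

```cpp
// Hypothetical ESP32 endpoints for the web interface (WebServer class from
// the ESP32 Arduino core; routes and parameters are illustrative).
#include <WiFi.h>
#include <WebServer.h>

WebServer server(80);
volatile char driveCmd = 's';      // f/b/l/r/s, consumed by the motor control loop
String lastScanCsv;                // "angle,mm" lines filled in by the sweep loop

void setup() {
  WiFi.begin("ssid", "password");  // placeholder credentials
  while (WiFi.status() != WL_CONNECTED) delay(100);

  server.on("/move", []() {        // e.g. GET /move?dir=f
    driveCmd = server.arg("dir").length() ? server.arg("dir")[0] : 's';
    server.send(200, "text/plain", "ok");
  });
  server.on("/mode", []() {        // near/mid/far -> sensor distance mode
    // map server.arg("m") to the VL53L1X Short/Medium/Long modes here
    server.send(200, "text/plain", "ok");
  });
  server.on("/scan", []() {        // polled by the chart-drawing scripts
    server.send(200, "text/csv", lastScanCsv);
  });
  server.begin();
}

void loop() {
  server.handleClient();           // dispatch pending HTTP requests
}
```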
The development process also included rigorous testing phases to ensure accuracy and reliability. These tests were conducted under various environmental conditions to simulate real-world applications, ensuring the robot's robustness and effectiveness in different settings. The integration of the ToF sensor with the mobile platform, controlled via a web interface, demonstrates a successful application of low-cost technologies in robotic navigation and environmental mapping.
4. RESULTS
This section presents the evaluation results of the LiDAR prototype integrated into a mobile robot. The evaluation aims to systematically assess the prototype's performance across several key functionalities: distance measurement accuracy, operational efficiency, environmental mapping capability, and user interface effectiveness. These evaluations were designed to validate the prototype's practical applications and identify areas for improvement.
4.1. Distance measurement accuracy
A series of tests was conducted to evaluate the distance measurement accuracy of the prototype. Using a standardized test environment, as shown in Figure 4(a), the robot was placed at a fixed starting point, and measurements were taken at various predetermined distances and angles. Objects of known dimensions were placed at specific locations, and the robot's measurements were recorded and compared against these known values. The tests were repeated under different environmental conditions to assess the sensor's reliability across varying light levels and surface reflectivities. Results indicated consistent performance in normal lighting conditions, but measurements varied under low light and on highly reflective surfaces, suggesting a need for sensor calibration and possibly software adjustments to enhance accuracy in diverse operating environments. A short video of the prototype's performance can be seen at the following link: https://youtu.be/59XUFDRoyEg
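The paper reports this accuracy qualitatively; one natural way to summarize such trials (our assumption, not the authors' stated metric) is the mean absolute error over the $N$ test positions,

$$\mathrm{MAE} = \frac{1}{N}\sum_{i=1}^{N}\left|d_i^{\text{meas}} - d_i^{\text{true}}\right|,$$

computed separately per lighting condition, so that the low-light and high-reflectivity degradations noted above appear as distinct error figures.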
4.2. Operational testing and mapping accuracy
Operational tests were designed to evaluate the robot's navigation and obstacle detection capabilities, along with the accuracy of its environmental mapping. The robot navigated a course with multiple obstacles, and its ability to detect and avoid these obstacles was recorded. Additionally, the robot performed a complete scan of the environment to create a 2D map, which was then compared to a pre-mapped layout of the area, as shown in Figure 4(b). The evaluation showed that while the robot could successfully navigate and avoid immediate obstacles, the precision of the generated map varied, especially near the boundaries of the environment. This suggests improvements are needed in the scanning algorithm to enhance edge detection and overall map accuracy.
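Building such a 2D map from the sweep reduces to converting each (angle, range) pair into Cartesian coordinates in the robot's frame. The exact map-building code is not given in the paper; a minimal conversion of the kind implied here is:

```cpp
#include <cmath>
#include <cstdint>
#include <cstdio>

// Convert one ToF sample (angle in degrees, range in mm) to a map point
// in meters, relative to the sensor's rotation axis.
struct Point { float x, y; };

Point toCartesian(float angleDeg, uint16_t rangeMm) {
  float r = rangeMm / 1000.0f;              // mm -> m
  float a = angleDeg * 3.14159265f / 180.0f;
  return { r * std::cos(a), r * std::sin(a) };
}

int main() {
  Point p = toCartesian(45.0f, 1414);       // a target 1.414 m away at 45 degrees
  std::printf("x=%.3f m, y=%.3f m\n", p.x, p.y);  // ~ (1.000, 1.000)
}
```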
Figure 4. Testing environment: (a) distance test track with obstacles and (b) cylindrical obstacle mapping test
4.3. Web interface functionality and usability testing
The functionality of the web interface, which allows for real-time interaction with the robot, was critically assessed. The interface was tested for user-friendliness, responsiveness, and accuracy of data presentation. Users were able to control the robot, change scanning modes, and view real-time data, including distance measurements and environmental maps. Feedback mechanisms were tested for delay, with most commands executed with minimal lag. However, improvements are suggested to enhance the user experience, particularly in streamlining the interface for easier navigation and quicker access to common functions.
4.4. Comparative analysis with commercial LiDAR systems
To benchmark the prototype's performance, comparative tests were conducted against several commercial LiDAR systems. These tests focused on comparing the distance measurement accuracy, mapping resolution, and operational robustness under similar test conditions. The prototype exhibited comparable accuracy in distance measurements but showed lower resolution in mapping details. The comparative analysis highlights the prototype's competitive performance, considering its significantly lower cost, but also underscores the need for further enhancements in sensor resolution and data processing capabilities.
4.5. Overall evaluation and future directions
The prototype demonstrated a promising capacity for basic navigation and environmental mapping tasks, suitable for educational and hobbyist applications, as shown in Figure 5. The tests confirmed that the prototype meets essential operational requirements but also revealed several areas where further development is needed. Future work will focus on improving the accuracy and resolution of the environmental mapping, enhancing the robustness of the navigation algorithms, and refining the user interface for a more intuitive user experience.
Figure 5. Navigation test results
5. DISCUSSION
The evaluation of the prototype using ToF sensors in a mobile robotic platform revealed several critical insights and implications for both the technology's capabilities and its developmental trajectory. Through rigorous testing and analysis, it became evident that while the prototype met many of the baseline expectations, it also highlighted areas requiring further refinement and innovation. This section delves into the detailed outcomes of the prototype's performance, offering a comprehensive discussion on the strengths and limitations observed during the testing phase. The insights gained from this evaluation not only shed light on the current state of ToF sensor technology in robotics but also point to potential avenues for future enhancements and applications. By situating these findings within the broader context of contemporary technological trends, we can better understand the implications for ongoing research and development in this field.
5.1. Assessment of measurement accuracy and environmental mapping
The testing phase highlighted the prototype's competence in distance measurement within certain limits. While the ToF sensor performed adequately within a controlled range, discrepancies emerged when dealing with complex angles and extended distances, as shown in Figure 6. These results illuminate the inherent challenges in relying solely on ToF sensors for applications where precision is crucial, such as in precise industrial measurements or complex navigation tasks in cluttered environments. Enhancements in sensor accuracy, possibly through advanced calibration methods or the integration of multiple sensors to mitigate individual sensor limitations, could substantially improve performance.
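One concrete form such calibration can take (our illustration; the paper does not specify its calibration procedure) is an affine correction d_corr = a·d_raw + b, with a and b fitted by least squares against reference distances:

```cpp
#include <cstdio>

// Fit d_true ≈ a * d_raw + b by ordinary least squares over n reference
// measurements; the fitted pair then corrects later raw readings.
void fitAffine(const float* raw, const float* truth, int n, float& a, float& b) {
  float sx = 0, sy = 0, sxx = 0, sxy = 0;
  for (int i = 0; i < n; i++) {
    sx += raw[i]; sy += truth[i];
    sxx += raw[i] * raw[i]; sxy += raw[i] * truth[i];
  }
  a = (n * sxy - sx * sy) / (n * sxx - sx * sx);
  b = (sy - a * sx) / n;
}

int main() {
  // Illustrative pairs: raw sensor readings vs. tape-measured truth (meters).
  float raw[]   = {0.52f, 1.03f, 2.08f, 3.11f};
  float truth[] = {0.50f, 1.00f, 2.00f, 3.00f};
  float a, b;
  fitAffine(raw, truth, 4, a, b);
  std::printf("corrected = %.4f * raw + %.4f\n", a, b);
}
```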
Figure 6. Measured and calculated average distance with respect to angle variation
5.2. Robotic navigation and obstacle avoidance capabilities
The prototype's ability to navigate and avoid obstacles underlines the potential of ToF sensors to support basic autonomous functions. However, the robot's performance in dynamic environments, where obstacles and environmental conditions change rapidly, highlighted areas for improvement. As shown in Figure 7, Figure 7(a) illustrates the robot's distance measurement accuracy in an environment with a white background, while Figure 7(b) presents the results in a setting with a black background. Both subfigures reveal that the measured distances deviate significantly from the calculated distances as the angle increases, with a more pronounced discrepancy observed in the black background scenario. These variations suggest that the ToF sensors' performance is affected by changes in environmental background, impacting the robot's ability to maintain accurate distance measurements during navigation. Future developments could focus on real-time learning algorithms that allow the robot to dynamically adjust to new obstacles and changes in the environment, thereby enhancing its applicability in more varied and unpredictable settings.
Figure 7. Performance against changes in environmental background: (a) white background and (b) black background
5.3. Interface usability and real-time data handling
The web interface was pivotal for user interaction, providing essential controls and feedback in real time. Feedback from users highlighted the ease of use and the effective communication facilitated by the interface. However, occasional lags and inconsistencies in data transmission were noted, particularly in lower bandwidth conditions. Improving the robustness of the communication protocols and enhancing the interface's capability to handle data intermittency and network variability could lead to broader deployment scenarios, including outdoor or industrial environments where network conditions are less controlled.
5.4. Comparative analysis and market positioning
When positioned against commercial high-end LiDAR systems, the prototype offered a significantly lower-cost alternative but with reduced performance in terms of range and resolution. This trade-off is crucial for potential users to consider, depending on their specific needs. For applications that require high precision and extensive data analysis, current ToF sensor capabilities might be limiting. However, for educational purposes, hobbyist projects, or initial prototyping where cost is a critical factor, this prototype offers substantial value. Future research could explore combining low-cost ToF sensors with other types of sensors, such as ultrasonic or infrared, to create a more robust system that balances cost and performance more effectively.
5.5. Future research directions and technological advancements
The study opens several avenues for further research, particularly in sensor technology and multi-sensor integration. Exploring the potential for hybrid sensing systems that leverage the strengths of various types of sensors could address the current limitations noted in ToF sensors. Additionally, advancements in machine learning and artificial intelligence could be applied to enhance the sensor data processing, providing more accurate and reliable outputs necessary for complex applications like autonomous driving or advanced robotic navigation.
6. CONCLUSION
The development and evaluation of a low-cost LiDAR prototype using a ToF sensor integrated into a mobile robot represents a significant achievement within the field of robotics, especially in terms of accessibility and cost efficiency. This project successfully demonstrated that ToF sensors, while less expensive, can still perform many of the core functions of more sophisticated LiDAR systems, such as distance measurement, object detection, and basic environmental mapping. The utilization of the VL53L1X ToF sensor, coupled with the ESP32 microcontroller, showcased a viable approach to reducing the financial barriers associated with robotic navigation technologies. Despite the lower cost, the prototype managed to perform reliably in controlled environments, offering a practical demonstration of its capability to navigate and map its immediate surroundings with a reasonable degree of accuracy.
However, the prototype's performance also highlighted several limitations, primarily its range and precision compared to high-end LiDAR systems. While adequate for simple tasks and smaller environments, the prototype struggled with complex navigation scenarios and larger area mappings. These limitations underscore the necessity for further enhancements, particularly in extending the range and improving the fidelity of environmental scans. Additionally, the prototype's dependency on stable Wi-Fi connectivity posed challenges in data transmission, suggesting that the exploration of more robust communication technologies could enhance operational reliability and extend the prototype's utility to more dynamic and challenging environments.
Moreover, the project illuminated potential areas for future research and development. Incorporating multiple ToF sensors could address the issues of limited coverage and mapping resolution, while advanced processing algorithms might better handle the data complexity from a more extensive sensor array. Optimizing these algorithms for real-time applications would also be crucial for expanding the prototype's use in scenarios requiring quick decision-making, such as dynamic obstacle avoidance. Furthermore, the prototype's framework provides a foundation for educational and hobbyist projects, offering a platform not only for teaching the principles of robotic navigation but also for inspiring innovations that could one day translate into more advanced applications.
ACKNOWLEDGMENT
This research received support from the Universidad Distrital Francisco José de Caldas. The opinions, findings, and conclusions expressed in this paper are those of the authors and do not necessarily reflect the
views of the Universidad Distrital. The authors extend their sincere appreciation to the ARMOS research group for their rigorous evaluation and valuable feedback on the prototype.
REFERENCES
[1] K. B. Olayemi et al., "The impact of LiDAR configuration on goal-based navigation within a deep reinforcement learning framework," Sensors, vol. 23, no. 24, 2023, doi: 10.3390/s23249732.
[2] G. Li, Z. Mo, and B. W. K. Ling, "AMFF-Net: An effective 3D object detector based on attention and multi-scale feature fusion," Sensors, vol. 23, no. 23, 2023, doi: 10.3390/s23239319.
[3] A. Hassani and M. Joerger, "Analytical and empirical navigation safety evaluation of a tightly integrated LiDAR/IMU using return-light intensity," Navigation, Journal of the Institute of Navigation, vol. 70, no. 4, 2023, doi: 10.33012/navi.623.
[4] A. Elamin and A. El-Rabbany, "UAV-based image and LiDAR fusion for pavement crack segmentation," Sensors, vol. 23, no. 23, 2023, doi: 10.3390/s23239315.
[5] K. T. Y. Mahima, A. Perera, S. Anavatti, and M. Garratt, "Exploring adversarial robustness of LiDAR semantic segmentation in autonomous driving," Sensors, vol. 23, no. 23, 2023, doi: 10.3390/s23239579.
[6] Z. Dai, J. Zhou, T. Li, H. Yao, S. Sun, and X. Zhu, "An intensity-enhanced LiDAR SLAM for unstructured environments," Measurement Science and Technology, vol. 34, no. 12, 2023, doi: 10.1088/1361-6501/acf38d.
[7] D. Tu, H. Cui, and S. Shen, "PanoVLM: Low-cost and accurate panoramic vision and LiDAR fused mapping," ISPRS Journal of Photogrammetry and Remote Sensing, vol. 206, no. 1, pp. 149–167, 2023, doi: 10.1016/j.isprsjprs.2023.11.012.
[8] R. Mkhaitari, Y. Mir, and M. Zazoui, "Assessing annual energy production using a combination of LiDAR and mast measurement campaigns," International Journal of Power Electronics and Drive Systems, vol. 14, no. 4, pp. 2398–2408, 2023, doi: 10.11591/ijpeds.v14.i4.pp2398-2408.
[9] I. A. Grishin, T. Y. Krutov, A. I. Kanev, and V. I. Terekhov, "Individual tree segmentation quality evaluation using deep learning models LiDAR based," Optical Memory and Neural Networks (Information Optics), vol. 32, no. 2, pp. 270–276, 2023, doi: 10.3103/S1060992X23060061.
[10] X. Li, S. Chen, S. Li, Y. Zhou, and S. Wang, "Accurate and consistent spatiotemporal calibration for heterogeneous-camera/LiDAR system based on continuous-time batch estimation," IEEE/ASME Transactions on Mechatronics, vol. 29, no. 3, pp. 2009–2020, 2024, doi: 10.1109/TMECH.2023.3323278.
[11] P. H. Salmane et al., "3D object detection for self-driving cars using video and LiDAR: An ablation study," Sensors, vol. 23, no. 6, 2023, doi: 10.3390/s23063223.
[12] L. Cheng and C. Xie, "An integrated off-line echo signal acquisition system implemented in SoC-FPGA for high repetition rate LiDAR," Electronics, vol. 12, no. 10, May 2023, doi: 10.3390/electronics12102331.
[13] T. Yang, Q. Yu, Y. Li, and Z. Yan, "Learn to model and filter point cloud noise for a near-infrared ToF LiDAR in adverse weather," IEEE Sensors Journal, vol. 23, no. 17, pp. 20412–20422, 2023, doi: 10.1109/JSEN.2023.3298909.
[14] X. Wang, R. Ma, D. Li, J. Hu, and Z. Zhu, "A wide dynamic range analog front-end with reconfigurable transimpedance amplifier for direct ToF LiDAR," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 70, no. 3, pp. 944–948, 2023, doi: 10.1109/TCSII.2022.3220888.
[15] J. Huang, S. Ran, W. Wei, and Q. Yu, "Digital integration of LiDAR system implemented in a low-cost FPGA," Symmetry, vol. 14, no. 6, 2022, doi: 10.3390/sym14061256.
[16] D. Hanto et al., "A simple and cost-effective physical distancing violation detector using a rotating time of flight LiDAR," International Journal on Advanced Science, Engineering and Information Technology, vol. 12, no. 3, pp. 1073–1079, 2022, doi: 10.18517/ijaseit.12.3.15444.
[17] M. Cao, J. Zhang, and W. Chen, "Visual-inertial-laser SLAM based on ORB-SLAM3," Unmanned Systems, no. 1, pp. 1–10, 2023, doi: 10.1142/S2301385024500262.
[18] L. G. da Silva and A. S. Cerqueira, "A LiDAR architecture based on indirect ToF for autonomous cars," Journal of Microwaves, Optoelectronics and Electromagnetic Applications, vol. 20, no. 3, pp. 504–512, 2021, doi: 10.1590/2179-10742021V20I31137.
[19] K. Yoshioka, "A tutorial and review of automobile direct ToF LiDAR SoCs: Evolution of next-generation LiDARs," IEICE Transactions on Electronics, no. 10, pp. 534–543, 2022, doi: 10.1587/transele.2021CTI0002.
[20] T. Yang, Y. Li, Y. Ruichek, and Z. Yan, "Performance modeling a near-infrared ToF LiDAR under fog: A data-driven approach," IEEE Transactions on Intelligent Transportation Systems, vol. 23, no. 8, pp. 11227–11236, 2022, doi: 10.1109/TITS.2021.3102138.
[21] G. Chen, C. Wiede, and R. Kokozinski, "Data processing approaches on SPAD-based d-ToF LiDAR systems: A review," IEEE Sensors Journal, vol. 21, no. 5, pp. 5656–5667, 2021, doi: 10.1109/JSEN.2020.3038487.
[22] S. Cattini, D. Cassanelli, L. Di Cecilia, L. Ferrari, and L. Rovati, "A procedure for the characterization and comparison of 3D LiDAR systems," IEEE Transactions on Instrumentation and Measurement, vol. 70, no. 1, pp. 1–10, 2021, doi: 10.1109/TIM.2020.3043114.