TELKOMNIKA Indonesian Journal of Electrical Engineering
Vol. 12, No. 7, July 2014, pp. 4973 ~ 4980
DOI: 10.11591/telkomnika.v12i7.4477
Received September 25, 2013; Revised January 26, 2014; Accepted February 10, 2014
Short-Term Prediction of Wind Power Based on an
Improved PSO Neural Network
Hong Zhang*, Guo Zhao,
Lixing Chen, Bailiang Liu
School of Electrical Engineering, Southeast University, Nanjing, Jiangsu, 210096, China
*Corresponding author, e-mail: hazh0216@163.com
Abstract

Connecting wind power to the power grid has recently become more common. To better manage and use wind power, its strength must be predicted precisely, which is of great safety and economic significance. In this paper, the short-term power prediction of wind power is based on self-adaptive niche particle swarm optimization (NPSO) in a neural net. Improved PSO adopts the rules of classification and elimination of a niche using a self-adaptive nonlinear mutation operator. Compared with the traditional method of maximum gradient, NPSO can skip a local optimal solution and approach the global optimal solution more easily in practice. Compared with the basic PSO, the number of iterations is reduced when the global optimal solution is obtained. The method proposed in this paper is experimentally shown to be capable of efficient prediction and useful for short-term power prediction.

Keywords: PSO, niche, mutation operator, short-term power prediction, neural net
Copyright © 2014 Institute of Advanced Engineering and Science. All rights reserved.
1. Introduction

Wind power is a renewable energy source that is becoming increasingly popular for application in the grid because of its environmentally friendly and low-cost properties. However, because the power fluctuates with the wind strength, connecting wind power to the grid is challenging. To make the use of wind power reasonable and reduce its negative effects on the power grid, scientists in many countries have been working to develop methods to predict the power of the wind generators, which is of great importance to the economical distribution and operation of the power grid. Denmark was among the first countries to develop a system of power prediction for wind power [1]. Prediktor is the wind power prediction system developed by Risø National Laboratory of Denmark, which mainly applies physical models [2]. ANEMOS, a research project sponsored by the European Union, combines physical and statistical methods [3]. eWind is a system developed by AWS Truewind in America [4]; it combines highly precise mathematical models of atmospheric physics with adaptive statistical models. The velocity of the wind and the power of the wind power plants have also been investigated in studies based on time series and neural networks [5-7]. The back propagation (BP) neural network is the most widely used neural network. The classic BP learning law is typically used in BP neural networks to determine network connection weights. However, this technique is slow in practice and may lead to a local optimal solution. In this paper, the short-term power prediction of wind power is based on self-adaptive niche particle swarm optimization (NPSO) in a neural network. Improved PSO adopts the rules of classification and elimination of a niche and uses a self-adaptive nonlinear mutation operator. Compared with the traditional method of maximum gradient, NPSO can skip a local optimal solution and approach the global optimal solution more easily in practice. Compared with the basic PSO, the number of iterations is reduced when the global optimal solution is obtained. The method proposed in this paper is experimentally shown to be capable of efficient prediction and useful for short-term power prediction.
2. Theoretical Basis for Improved Self-Adaptive PSO

2.1. Theoretical Basis for Basic Particle Swarm Optimization
In 1995, J. Kennedy and R. C. Eberhart developed PSO [8, 9], which aims to simulate a simple social system, such as a bird flock searching for food, to study and explain complex social behavior. In basic PSO, every candidate solution is compared to a bird searching the space and is called a particle. The position and velocity of a particle are denoted as $X_i = (x_{i1}, x_{i2}, \ldots, x_{iD})$ and $V_i = (v_{i1}, v_{i2}, \ldots, v_{iD})$, respectively. At the initial stage, a swarm of particles is randomly selected. Then, the swarm is updated according to the best known positions of the individual particles and the entire swarm. The equations defining the position and velocity of the particles are shown below:
$v_{id}(k+1) = w\,v_{id}(k) + c_1 r_1 (p_{id}(k) - x_{id}(k)) + c_2 r_2 (g_{id}(k) - x_{id}(k))$
$x_{id}(k+1) = x_{id}(k) + v_{id}(k+1)$    (1)

$w = (w_{ini} - w_{end})(k_{max} - k)/k_{max} + w_{end}$    (2)
In Equations (1) and (2), p is the best known position of a particle and g is the best known position of the entire swarm; i = 1, 2, ..., n; D is the dimension of a particle; k is the k-th iteration; d is the d-th dimension; k_max is the maximum number of iterations; w is the inertia weight; w_ini is the initial inertia weight; w_end is the final inertia weight; c_1 and c_2 are learning factors; and r_1 and r_2 are uniform random numbers in the range [0, 1].
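As an illustration of Equations (1) and (2), the following Python sketch performs the basic PSO update on a generic objective function. The swarm size, search bounds, and coefficient values (c1 = c2 = 2, w_ini = 0.9, w_end = 0.4) are illustrative assumptions and are not taken from this paper.

```python
import numpy as np

def basic_pso(objective, dim, n_particles=30, k_max=200,
              c1=2.0, c2=2.0, w_ini=0.9, w_end=0.4, bounds=(-4.0, 4.0)):
    """Minimal basic PSO following Equations (1) and (2).
    Coefficients and bounds are illustrative assumptions."""
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))        # positions X_i
    v = np.zeros((n_particles, dim))                   # velocities V_i
    p = x.copy()                                       # personal best positions
    p_val = np.array([objective(xi) for xi in x])
    g = p[np.argmin(p_val)].copy()                     # global best position

    for k in range(k_max):
        # Equation (2): linearly decreasing inertia weight
        w = (w_ini - w_end) * (k_max - k) / k_max + w_end
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Equation (1): velocity and position update
        v = w * v + c1 * r1 * (p - x) + c2 * r2 * (g - x)
        x = x + v
        # refresh personal and global bests
        val = np.array([objective(xi) for xi in x])
        better = val < p_val
        p[better], p_val[better] = x[better], val[better]
        g = p[np.argmin(p_val)].copy()
    return g, p_val.min()
```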
2.2. Adaptive Niching Particle Swarm Optimization

Basic PSO may lead to premature convergence to a local optimum, thus affecting the quality of the solution. The probability of prematurity can be reduced by mixing basic PSO with other algorithms or by adopting a comprehensive strategy. Niche technology simulates ecological balance, i.e., a species evolves to establish a surviving niche in a larger environment, which reflects the evolutionary rule of survival of the fittest. Goldberg and Richardson described niche technology based on a sharing mechanism in [10], and Brits et al. described NPSO in [11]. The following formulae are based on adaptive NPSO:
$v_{id}(k+1) = w\,v_{id}(k) + c_1 r_1 (p_{id}(k) - x_{id}(k)) + c_2 r_2 (g_{id}(k) - x_{id}(k)) + c_3 r_3 (\bar{p}_{id}(k) - x_{id}(k))$
$x_{id}(k+1) = x_{id}(k) + v_{id}(k+1)$    (3)

$w = (w_{ini} - w_{end})\exp\left(-1\big/\left[1 + (1 + k/k_{max})\right]\right) + w_{end}$    (4)
In Equations (3) and (4), $\bar{p}_{id}$ is the best known position of a sub-swarm; $c_3$ is the learning factor; and $r_3$ is a uniform random sequence in the range [0, 1]. The diversity selection of the swarm regulates the adaptability of individual particles by reflecting the sharing functions among them, upon which the later evolutionary process is selected, to create an evolved environment and to realize swarm diversity. The adaptive mutation operator adopts an adaptive non-linear decreasing inertia weight function [12]. The rate at which the inertia weight decreases is accelerated in the early iterations of the algorithm to achieve a more efficient solution.
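The sketch below shows how Equations (3) and (4) modify the basic update: a third term $c_3 r_3(\bar{p}_{id} - x_{id})$ pulls each particle toward its sub-swarm (niche) best, and the inertia weight decays nonlinearly. The fixed niche assignment supplied by the caller and the reconstructed form of Equation (4) are assumptions rather than the paper's exact rules.

```python
import numpy as np

def npso_step(x, v, p, g, niche_id, niche_best, k, k_max,
              c1=2.0, c2=2.0, c3=2.0, w_ini=0.9, w_end=0.4, rng=None):
    """One velocity/position update following Equations (3) and (4).
    `niche_id[i]` maps particle i to its sub-swarm; `niche_best[m]` is the
    best known position of sub-swarm m (assumed supplied by the caller)."""
    rng = rng or np.random.default_rng()
    n, dim = x.shape
    # Equation (4): nonlinear (exponential) decreasing inertia weight;
    # the exact expression is a reconstruction and should be treated as an assumption.
    w = (w_ini - w_end) * np.exp(-1.0 / (1.0 + (1.0 + k / k_max))) + w_end
    r1, r2, r3 = (rng.random((n, dim)) for _ in range(3))
    p_bar = niche_best[niche_id]            # sub-swarm best per particle
    # Equation (3): cognitive, social, and niche terms
    v = w * v + c1 * r1 * (p - x) + c2 * r2 * (g - x) + c3 * r3 * (p_bar - x)
    x = x + v
    return x, v
```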
2.3. The Main Steps of the Improved PSO Algorithm

The main steps of the improved PSO algorithm are as follows:
Step 1: Start.
Step 2: Generate the initial population by chaotic iteration.
Step 3: Initialize parameters.
Step 4: Select a particle randomly and divide all of the particles evenly into m small niche subpopulations based on adaptive functions.
Step 5: Establish the initial velocity of the particles randomly.
Step 6: Set the initial position of the present particle as the individual historical optimal value, pbx; set the historical optimal value of the optimal individual in each subpopulation as the population historical optimal value, pbx; and set the historical optimal value of all of the particles as the overall historical optimal value, gbx.
Step 7: When k is less than the maximum number of iterations, the following cycle of operations is performed for each subpopulation:
a) Calculate the inertia weight, threshold value, and calibration coefficient.
b) Update the velocity and position of every particle within each subpopulation.
Step 8: Adopt a niche elimination strategy.
Step 9: Determine whether the convergence conditions are met; if so, stop the calculation and output the results; if not, go to Step 6.
Step 10: End.
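A compact skeleton of this loop, reusing the npso_step sketch above, is given below. The even niche partition, the elimination rule (re-seeding the worst particle), and the convergence test are simplified stand-ins for the classification and elimination rules described in the paper.

```python
import numpy as np

def improved_pso(objective, dim, m_niches=5, n_particles=30, k_max=200, tol=1e-8):
    """Skeleton of the improved (niche) PSO loop of Section 2.3.
    The elimination and convergence rules below are illustrative assumptions."""
    rng = np.random.default_rng(0)
    x = rng.uniform(-4, 4, (n_particles, dim))           # Steps 2-3: initial population
    v = rng.uniform(-1, 1, (n_particles, dim))           # Step 5: initial velocities
    niche_id = np.arange(n_particles) % m_niches         # Step 4: even niche partition
    p, p_val = x.copy(), np.array([objective(xi) for xi in x])   # Step 6: pbx
    g = p[np.argmin(p_val)].copy()                                # Step 6: gbx

    for k in range(k_max):                                # Step 7
        niche_best = np.array([p[niche_id == m][np.argmin(p_val[niche_id == m])]
                               for m in range(m_niches)])
        x, v = npso_step(x, v, p, g, niche_id, niche_best, k, k_max, rng=rng)
        val = np.array([objective(xi) for xi in x])
        better = val < p_val
        p[better], p_val[better] = x[better], val[better]
        g = p[np.argmin(p_val)].copy()
        # Step 8: crude niche elimination -- re-seed the worst particle
        worst = np.argmax(p_val)
        x[worst] = rng.uniform(-4, 4, dim)
        # Step 9: convergence check
        if p_val.min() < tol:
            break
    return g, p_val.min()                                 # Step 10
```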
The flowchart in Figure 1 illustrates the main steps of the improved PSO algorithm.

Figure 1. Flowchart of the Improved PSO Algorithm
2.4. Testing the Improved PSO Algorithm Using Standard Test Functions

To test the performance of the improved PSO algorithm, two standard testing functions are selected: the 2-D Rosenbrock function and the 2-D Rastrigin function. Standard testing functions are commonly employed in the optimization literature to evaluate the efficiency of new algorithms [13, 14]. The two standard testing functions have numerous local optima and a global minimum that is very difficult to locate.
2.4.1. The 2-D Rosenbrock Function

The 2-D Rosenbrock function is given by Equation (5):

$f(x_1, x_2) = 100\,(x_2 - x_1^2)^2 + (1 - x_1)^2$    (5)
Figure 2. Graph of the Rosenbrock Function (surface of f(X1, X2) over X1 and X2, with the global minimum marked)

Figure 3. Graph of the Rastrigin Function (surface of g(X1, X2) over X1 and X2, with the global minimum marked)
For the 2-D Rosenbrock function in this paper, the global minimum is f_global = 0 at x = (1, 1), but the valley in which the minimum lies has steep edges and a narrow ridge. The tip of the ridge is also steep. Figure 2 illustrates the main characteristics of the 2-D Rosenbrock function.
2.4.2. The 2-D Rastrigin Function

The 2-D Rastrigin function is given by Equation (6):

$g(x_1, x_2) = x_1^2 + x_2^2 - 10\left[\cos(2\pi x_1) + \cos(2\pi x_2)\right] + 20$    (6)
For the 2-D Rastrigin function employed in this paper, the global minimum is f_global = 0 when x = (0, 0). There are many local minima arranged in a lattice configuration, as shown in Figure 3, which illustrates the main characteristics of the 2-D Rastrigin function. The global minima of the 2-D Rosenbrock function and the 2-D Rastrigin function can be located by simulation computation based on the improved PSO algorithm. Thus, the model based on the improved PSO can be used in practice.
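As a usage sketch, the two test functions of Equations (5) and (6) can be written directly and passed to the improved_pso skeleton above; the commented call and its settings are illustrative assumptions rather than the paper's experimental configuration.

```python
import numpy as np

def rosenbrock_2d(x):
    # Equation (5): global minimum f = 0 at x = (1, 1)
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rastrigin_2d(x):
    # Equation (6): global minimum g = 0 at x = (0, 0)
    return x[0] ** 2 + x[1] ** 2 - 10.0 * (np.cos(2 * np.pi * x[0])
                                           + np.cos(2 * np.pi * x[1])) + 20.0

# Illustrative calls (parameters are assumptions, not the paper's settings):
# best_x, best_f = improved_pso(rosenbrock_2d, dim=2)
# best_x, best_f = improved_pso(rastrigin_2d, dim=2)
```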
3. Neural Network Model Based on Self-Adaptive Niche PSO

3.1. Theoretical Basis for the Basic Neural Network

Figure 4. Artificial intelligence neural network

Figure 5. Learning process of the NPSO neural system
Since the insightful studies of neural networks in the 1980s [15-16], neural networks have been widely applied in industry. The artificial intelligence neural network is a complex nonlinear system. The artificial neural network is also a nonlinear mapping system with good self-adaptability and can be used to identify any complicated state or process. Figure 4 describes a simple artificial intelligence neural network. The basic principle by which the neural network model processes information is that the input signal X(i) acts on the intermediate nodes (the hidden layer) and, through a non-linear transformation, produces an output signal Y(k) at the output node. By repetitive learning and training, the error is reduced by adjusting W(ij), the weights relating the input nodes to the hidden layer nodes, and T(jk), the weights relating the hidden layer nodes to the output node, together with their respective threshold values; the network parameters (weights and threshold values) corresponding to the minimum error are thus determined. The training continues until the error reaches the threshold value. The BP neural network model is expressed in Equation (7):
$O_j = f\Big(\sum_i W_{ij} X_i - q_j\Big), \qquad Y_k = f\Big(\sum_j T_{jk} O_j - q_k\Big)$    (7)
where f is the activating function and q is the neural cell threshold. Figure 5 illustrates the performance of prediction based on the improved PSO neural network.
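For concreteness, a minimal forward pass matching Equation (7) might look as follows; the sigmoid activation and the array shapes are assumptions, since the paper does not fix them here.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bp_forward(X, W, q_hidden, T, q_out, f=sigmoid):
    """Forward pass of Equation (7).
    X: input vector (n_in,); W: (n_in, n_hidden) input-to-hidden weights;
    T: (n_hidden, n_out) hidden-to-output weights; q_*: threshold vectors."""
    O = f(X @ W - q_hidden)   # hidden outputs  O_j = f(sum_i W_ij X_i - q_j)
    Y = f(O @ T - q_out)      # network outputs Y_k = f(sum_j T_jk O_j - q_k)
    return Y
```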
3.2. The Steps of the Prediction Algorithm Based on the Self-Adaptive NPSO Neural Network

The main steps of the prediction algorithm based on the self-adaptive NPSO neural network are as follows:
Step 1: Start.
Step 2: Input the initial values and target values of the samples.
Step 3: Initialize the coupling weight values and thresholds.
Step 4: Convert connection weights and thresholds to particles.
Step 5: Divide the initial population into several small niche subpopulations.
Step 6: Calculate the adaptive values of the particle swarm.
Step 7: Determine the best known positions of the individuals, sub-populations, and overall population.
Step 8: Adjust the adaptability and inertia weight and update the velocity and position of the particles.
Step 9: Judge whether the niche update conditions are met. If not, go to Step 6.
Step 10: Run the niche optimization rules.
Step 11: Judge whether the maximum time is reached. If not, go to Step 6.
Step 12: Determine the coupling value and threshold.
Step 13: End.
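The central idea of Steps 3, 4, and 12 is to encode all connection weights and thresholds of the network as a single particle vector, evaluate a particle by the training error of the decoded network (Step 6), and decode the best particle at the end. The sketch below, which reuses the bp_forward and improved_pso sketches above, illustrates this mapping; the layer sizes and helper names are assumptions, not values from the paper.

```python
import numpy as np

# Assumed layer sizes, for illustration only.
N_IN, N_HID, N_OUT = 4, 8, 1

def decode_particle(particle):
    """Step 4 in reverse: unpack a flat particle into W, q_hidden, T, q_out."""
    i = 0
    W = particle[i:i + N_IN * N_HID].reshape(N_IN, N_HID);   i += N_IN * N_HID
    q_hidden = particle[i:i + N_HID];                        i += N_HID
    T = particle[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    q_out = particle[i:i + N_OUT]
    return W, q_hidden, T, q_out

def prediction_error(particle, X_train, y_train):
    """Adaptive (fitness) value of Step 6: mean squared training error."""
    W, q_hidden, T, q_out = decode_particle(particle)
    preds = np.array([bp_forward(x, W, q_hidden, T, q_out) for x in X_train])
    return float(np.mean((preds.ravel() - y_train.ravel()) ** 2))

# The improved PSO of Section 2.3 can then search this particle space, e.g.:
# dim = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT
# best, err = improved_pso(lambda p: prediction_error(p, X_train, y_train), dim)
```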
The flowchart in Figure 6 illustrates the main steps of the prediction algorithm based on the self-adaptive NPSO neural network.

Figure 6. Flowchart of the Prediction Algorithm Based on the Self-adaptive NPSO Neural Network
4. Predictive Analysis of the Neural Network Based on Self-Adaptive NPSO

The power prediction model is established by the neural network based on self-adaptive NPSO (improved PSO). The power of a wind generator in Dongtai (Jiangsu, China) was predicted in 2008 based on the meteorological data and the data for the power generated by the wind generator in the previous months. The predictive models for the neural network are based on PSO, NPSO, and Traingdm. First, the original data related to wind speed and wind power must be processed and normalized by advanced mathematical methods [17]. For example, the systematic error of the model decreases observably when the original data have been processed by
the Kalman filter described in the literature [18]. All predictive models are trained beforehand.
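The preprocessing referred to here can be as simple as min-max normalization of the wind speed and power series before training; the sketch below shows only that step and is an assumption about the preprocessing details, which the paper delegates to [17] and [18].

```python
import numpy as np

def minmax_normalize(series):
    """Scale a 1-D series of wind speed or wind power into [0, 1].
    Returns the scaled series plus the (min, max) needed to undo the scaling."""
    lo, hi = float(np.min(series)), float(np.max(series))
    scaled = (np.asarray(series, dtype=float) - lo) / (hi - lo)
    return scaled, (lo, hi)

def minmax_denormalize(scaled, bounds):
    """Map normalized model outputs back to physical units (e.g., kW)."""
    lo, hi = bounds
    return np.asarray(scaled, dtype=float) * (hi - lo) + lo
```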
Figure 7 illustrates the main characteristics obtained from the different prediction models 3 h ahead. Figure 7(a) illustrates that higher wind powers generally correspond to higher wind speeds. Figure 7(b) presents the measured power and forecasted power based on PSO, improved PSO, and Traingdm. Comparing the results of the three methods, the forecasted wind power curve based on the improved PSO is the closest to the measured power in Figure 7(b). Figure 7(c) presents the relative error from the different predictions. The minimum relative error of the forecast wind power is obtained by the improved PSO method.
[Figure 7: panel (a) plots wind speed (m/s) and wind power (100 kW) against time (h); panel (b) plots measured and forecasted wind power (kW) against time (h) for PSO, improved PSO, and Traingdm; panel (c) plots relative error against time (h); panel (d) plots frequency (counts) and cumulative probability (%) against relative error for each method.]
Figure 7. Main Characteristics Obtained from the Three Different Predictions. (a) Wind speed and wind power. (b) The measured power and forecasted power based on PSO, improved PSO, and Traingdm. (c) Relative error from different prediction models based on PSO, improved PSO, and Traingdm. (d) Frequency and probability from different prediction models based on PSO, improved PSO, and Traingdm.
Figure 7(d) illustrates the frequency and probability from the different prediction models based on PSO, improved PSO, and Traingdm. The probability of a relative error of less than 0.1 for the improved PSO method is greater than those of PSO and Traingdm. Thus, the prediction accuracy of the improved PSO method is better than those of PSO and Traingdm. The absolute error, relative error, mean absolute error, mean relative error, standard deviation, relative standard deviation, and interval probability used in this paper are defined by Equation (8) [19]. From Figure 7, the statistical data indicate that the power prediction based on the improved PSO is better than those based on PSO and Traingdm.
$\text{absolute error} = \lvert \text{forecast}(i) - \text{measure}(i) \rvert$
$\text{mean absolute error} = \frac{1}{n}\sum_{i=1}^{n}\lvert \text{forecast}(i) - \text{measure}(i) \rvert$
$\text{relative error} = \frac{\lvert \text{forecast}(i) - \text{measure}(i) \rvert}{\text{measure}(i)}$
$\text{mean relative error} = \frac{1}{n}\sum_{i=1}^{n}\text{relative error}_i$
$\text{standard deviation} = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\big(\text{forecast}(i) - \overline{\text{forecast}}\big)^2}$, where $\overline{\text{forecast}} = \frac{1}{n}\sum_{i=1}^{n}\text{forecast}(i)$
$\text{relative standard deviation} = \frac{\text{standard deviation}}{\overline{\text{forecast}}}$
$\text{interval probability} = \frac{\text{frequency (counts)}}{n}$    (8)
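The error statistics of Equation (8) can be computed directly from the measured and forecasted power series; the short sketch below assumes both are given as equal-length NumPy arrays and that the interval of interest is a relative error below 0.1, as discussed above.

```python
import numpy as np

def error_statistics(forecast, measure, interval=(0.0, 0.1)):
    """Error measures of Equation (8) for forecasted vs. measured power."""
    forecast, measure = np.asarray(forecast, float), np.asarray(measure, float)
    n = len(forecast)
    abs_err = np.abs(forecast - measure)
    rel_err = abs_err / measure
    std = np.sqrt(np.sum((forecast - forecast.mean()) ** 2) / (n - 1))
    lo, hi = interval
    in_interval = np.sum((rel_err >= lo) & (rel_err < hi))   # frequency (counts)
    return {
        "mean absolute error": abs_err.mean(),
        "mean relative error": rel_err.mean(),
        "standard deviation": std,
        "relative standard deviation": std / forecast.mean(),
        "interval probability": in_interval / n,
    }
```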
5. Conclusion

In this paper, a predictive model for neural networks based on self-adaptive NPSO is established. Using model analysis, experiments, and comparison with predictive models based on other algorithms, the model is shown to be more precise than the other two models considered; furthermore, it has the lowest absolute variance, demonstrating its effectiveness. The reliability of the model is significantly related to the precision of the weather forecast. With computers becoming increasingly powerful, the predictive method of the neural network based on hybrid multi-algorithms will be most useful in the future.
Acknowledgements

This paper is supported by the National High Technology Research and Development Program (863 Program) (2011AA05A107).
References

[1] Landberg L, Watson SJ. Short-term prediction of local wind conditions. Boundary-Layer Meteorology. 1994; 70(1-2): 171-195.
[2] Landberg L, Giebel G, Nielsen HA, et al. Short-term Prediction: An Overview. Wind Energy. 2003; 6(3): 273-280.
[3] Marti I, Kariniotakis G, Pinson P, et al. Evaluation of advanced wind power forecasting models - results of the Anemos project. 2006.
[4] Porter K, Rogers J. Status of Centralized Wind Power Forecasting in North America. NREL/SR-550-47853. Golden, CO: National Renewable Energy Laboratory. 2010.
[5] Yang X, Xiao Y, Chen S. Wind speed and generated power forecasting in wind farm. Proceedings of the CSEE. 2005; 25: 1-5.
[6] Ding M, Zhang L, Wu Y. Wind speed forecast model for wind farms based on time series analysis. Electric Power Automation Equipment. 2005; 8: 32-34.
[7] Xiao Y, Wang W, Huo X. Study on the time-series wind speed forecasting of the wind farm based on neural networks. Energy Conservation Technology. 2007; 2: 2.
[8] Eberhart R, Kennedy J. A new optimizer using particle swarm theory. Proceedings of the Sixth International Symposium on Micro Machine and Human Science. Nagoya. 1995: 39-43.
[9] Kennedy J, Eberhart R. Particle swarm optimization. IEEE International Conference on Neural Networks. 1995: 1942-1948.
[10] Goldberg DE, Richardson J. Genetic algorithms with sharing for multimodal function optimization. Cambridge. 1987: 41-49.
[11] Brits R, Engelbrecht AP, Van den Bergh F. A niching particle swarm optimizer. Singapore. 2002: 692-696.
[12] Gao Y, Ren Z. Adaptive Particle Swarm Optimization Algorithm with Genetic Mutation Operation. Third International Conference on Natural Computation. Haikou, China. 2007: 211-215.
[13] Faerman M, Birnbaum A, Berman F, et al. Resource allocation strategies for guided parameter space searches. International Journal of High Performance Computing Applications. 2003; 17(4): 383-402.
[14] Potter MA. The design and analysis of a computational model of cooperative coevolution. Citeseer. 1997.
[15] McClelland JL, Rumelhart DE, Group PR. Parallel distributed processing. Explorations in the microstructure of cognition. 1986; 2.
[16] Hopfield JJ, Tank DW. Computing with neural circuits: A model. Science. 1986; 233(4764): 625-633.
[17] Ogasawara E, Martinez LC, de Oliveira D, et al. Adaptive normalization: A novel data normalization approach for non-stationary time series. The 2010 International Joint Conference on Neural Networks (IJCNN). 2010: 1-8.
[18] Louka P, Galanis G, Siebert N, et al. Improvements in wind speed forecasts for wind power prediction purposes using Kalman filtering. Journal of Wind Engineering and Industrial Aerodynamics. 2008; 96(12): 2348-2362.
[19] Spiegel M, Schiller J, Srinivasan A. Schaum's Easy Outline of Probability and Statistics. McGraw-Hill. 2002.