TELKOMNIKA, Vol. 11, No. 11, November 2013, pp. 6373~6380
e-ISSN: 2087-278X
Received April 9, 2013; Revised May 31, 2013; Accepted June 27, 2013
A Modified Conjugate Gradient Method for
Unconstrained Optimization
Can Li
College of Mathematics, Honghe University, Mengzi, China
Corresponding author, e-mail: lican841014@163.com
Abstract
Conjugate gradient methods are an important class of methods for solving unconstrained optimization problems, especially large-scale problems. Recently, they have been studied in depth. In this paper, we further study the conjugate gradient method for unconstrained optimization, focusing our attention on the descent conjugate gradient method. This paper presents a modified conjugate gradient method. An interesting feature of the presented method is that the direction it generates is always a descent direction for the objective function. Moreover, this property is independent of the line search used. Under mild conditions, we prove that the modified conjugate gradient method with Armijo-type line search is globally convergent. We also present some numerical results to show the efficiency of the proposed method.
Keywords: unconstrained optimization problem, conjugate gradient method, Armijo-type line search, global convergence
Copyright © 2013 Universitas Ahmad Dahlan. All rights reserved.
1. Introduction
Let us consider the following unconstrained optimization problem:

\min_{x \in \mathbb{R}^n} f(x),   (1)

where f: \mathbb{R}^n \to \mathbb{R} is a continuously differentiable function whose gradient is denoted by g(x) = \nabla f(x). Optimization problems arise in many areas [1-2], such as engineering, production management, economy, etc. Generally, a constrained optimization problem is solved by converting it into an unconstrained optimization problem.
Conjugate gradient methods constitute an excellent choice for efficiently solving the optimization problem (1), especially when the dimension n is large, due to the simplicity of their iteration, their convergence properties and their low memory requirements. In fact, conjugate gradient methods have played special roles in solving large-scale nonlinear optimization problems. Although conjugate gradient methods are not the fastest or most robust optimization algorithms for nonlinear problems available today, they remain very popular with engineers and mathematicians engaged in solving large problems.
Conjugate gradient methods generate a sequence of points \{x_k\}, starting from an initial guess x_0 \in \mathbb{R}^n, using the iterative formula:

x_{k+1} = x_k + \alpha_k d_k,   (2)

where \alpha_k > 0 is obtained by a line search, and the direction d_k is generated by:

d_k = \begin{cases} -g_k, & k = 0, \\ -g_k + \beta_k d_{k-1}, & k \ge 1, \end{cases}   (3)

where g_k = g(x_k) and \beta_k is a scalar. Some well-known formulae for \beta_k are given in [3-8].
For a conjugate gradient method in the form (2)-(3), we say that the descent condition holds if:

g_k^T d_k < 0, \quad \forall k \ge 0.   (4)

In addition, we say that the sufficient descent condition holds if there exists a constant c_1 > 0 such that:

g_k^T d_k \le -c_1 \|g_k\|^2, \quad \forall k \ge 0,   (5)

where \|\cdot\| stands for the Euclidean norm of vectors.
The descent condition or the sufficient descent condition is often used in the literature to analyze the global convergence of conjugate gradient methods with inexact line search, and may be crucial for conjugate gradient methods. For classical conjugate gradient methods, however, whether the descent condition or the sufficient descent condition holds depends on the line search used.
During the last decade, much effort has been devoted to generating descent conjugate gradient methods independent of the line search used. Similar to the spectral conjugate gradient method [9], Zhang, Zhou and Li [10] proposed a descent modified FR conjugate gradient method as:

d_k = -\frac{d_{k-1}^T y_{k-1}}{\|g_{k-1}\|^2}\, g_k + \frac{\|g_k\|^2}{\|g_{k-1}\|^2}\, d_{k-1},
where d_0 = -g_0 and y_{k-1} = g_k - g_{k-1}. A remarkable property of the method is that it produces descent directions,

g_k^T d_k = -\|g_k\|^2, \quad \forall k \ge 0,

and this property is independent of the line search used. Motivated by this nice descent property, Zhang, Xiao and Wei [11] introduced a descent three-term conjugate gradient method based on the Dai-Liao method [12] as:

d_k = -g_k + \frac{g_k^T (y_{k-1} - t\, s_{k-1})}{y_{k-1}^T s_{k-1}}\, s_{k-1} - \frac{g_k^T s_{k-1}}{y_{k-1}^T s_{k-1}}\,(y_{k-1} - t\, s_{k-1}),
where d_0 = -g_0, s_{k-1} = x_k - x_{k-1} and t \ge 0. Again, it is easy to see that the sufficient descent condition also holds independently of the line search, i.e., for this method g_k^T d_k = -\|g_k\|^2 for all k. Other descent conjugate gradient methods and their global convergence analyses can be found in [13-17], etc.
In this paper, we propose a modified conjugate gradient method. The direction generated by the proposed method is always a descent direction of the objective function. This property is independent of the line search used. Under mild conditions, we prove that the modified conjugate gradient method with Armijo-type line search is globally convergent.
In the next section, we propose the method. Section 3 is devoted to the global convergence of the method. At last, we present some numerical results in Section 4.
2. Algorithm
Recently, Li and Feng [18] proposed a modified Liu-Storey (MLS) method by letting:

\beta_k^{MLS} = \frac{g_k^T y_{k-1}}{-d_{k-1}^T g_{k-1}} - t\,\frac{\|y_{k-1}\|^2\, g_k^T d_{k-1}}{(d_{k-1}^T g_{k-1})^2},
where t > \frac{1}{4}. In [18], Li and Feng proved that the MLS method can always generate descent directions which satisfy the sufficient descent condition:

g_k^T d_k \le -\left(1 - \frac{1}{4t}\right)\|g_k\|^2.
Li and Feng also established the global convergence of the MLS method with strong Wolfe line search. Based on the MLS method, we propose our algorithm as follows.
Algorithm 1:
Step 1. Given constants t > \frac{1}{4}, \rho \in (0, 1), \delta > 0 and \varepsilon \ge 0, choose an initial point x_0 \in \mathbb{R}^n. Set d_0 = -g_0, k := 0.
Step 2. Calculate the search direction d_k by (3), where \beta_k is defined by:

\beta_k = \frac{\|g_k\|^2}{-d_{k-1}^T g_{k-1}} - t\,\frac{\|g_k\|^2\, g_k^T d_{k-1}}{(d_{k-1}^T g_{k-1})^2},   (6)

where t > \frac{1}{4}.
Step 3. Determine \alpha_k = \max\{\rho^j,\ j = 0, 1, 2, \ldots\} satisfying:

f(x_k + \alpha_k d_k) \le f(x_k) - \delta\,\alpha_k^2\,\|d_k\|^4.   (7)

Step 4. Let x_{k+1} = x_k + \alpha_k d_k. If \|g_{k+1}\| \le \varepsilon, then stop. Otherwise go to Step 5.
Step 5. Let k := k + 1 and go to Step 2.
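For concreteness, the following minimal Python sketch implements Algorithm 1 as reconstructed above; the default parameter values are illustrative assumptions (t = 1 satisfies t > 1/4, and ρ = 0.5, δ = 0.01 match the settings used later in Section 4).

    import numpy as np

    def modified_cg(f, grad, x0, t=1.0, rho=0.5, delta=0.01, eps=1e-5, max_iter=20000):
        # Algorithm 1: modified conjugate gradient method with the
        # Armijo-type line search (7); requires t > 1/4 for Theorem 1.
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                                    # Step 1: d_0 = -g_0
        for k in range(max_iter):
            if np.linalg.norm(g) <= eps:          # Step 4: stopping test
                return x, k
            # Step 3: alpha_k = max{rho^j} with f(x+a d) <= f(x) - delta a^2 ||d||^4
            fx, dd4, alpha = f(x), np.dot(d, d) ** 2, 1.0
            while f(x + alpha * d) > fx - delta * alpha ** 2 * dd4:
                alpha *= rho                      # backtrack; terminates since d is descent
            x = x + alpha * d                     # Step 4: x_{k+1} = x_k + alpha_k d_k
            g_old, d_old = g, d
            g = grad(x)
            dg = np.dot(d_old, g_old)             # d_{k-1}^T g_{k-1} (negative by Theorem 1)
            # Step 2 of the next iteration: beta_k from (6)
            beta = np.dot(g, g) / (-dg) - t * np.dot(g, g) * np.dot(g, d_old) / dg ** 2
            d = -g + beta * d_old
        return x, max_iter

The backtracking loop always terminates because d_k is a descent direction: for small enough α the left-hand side of (7) decreases linearly in α while the right-hand side deficit is only quadratic.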
The following theorem shows that Algorithm 1 possesses the sufficient descent condition (5).
Theorem 1. Let the sequences \{g_k\} and \{d_k\} be generated by Algorithm 1. Then:

g_k^T d_k \le -\left(1 - \frac{1}{4t}\right)\|g_k\|^2, \quad \forall k \ge 0.   (8)
Proof: Since d_0 = -g_0, we have g_0^T d_0 = -\|g_0\|^2, which satisfies (8). Multiplying the direction d_k by g_k, we have from (3) and (6) that for all k \ge 1:

g_k^T d_k = -\|g_k\|^2 + \left(\frac{\|g_k\|^2}{-d_{k-1}^T g_{k-1}} - t\,\frac{\|g_k\|^2\, g_k^T d_{k-1}}{(d_{k-1}^T g_{k-1})^2}\right) g_k^T d_{k-1}
= -\|g_k\|^2 + \frac{\|g_k\|^2\,(g_k^T d_{k-1})}{-d_{k-1}^T g_{k-1}} - t\,\frac{\|g_k\|^2\,(g_k^T d_{k-1})^2}{(d_{k-1}^T g_{k-1})^2}.   (9)

Applying the inequality u^T v \le \frac{1}{2}(\|u\|^2 + \|v\|^2) to the term \frac{\|g_k\|^2\,(g_k^T d_{k-1})}{-d_{k-1}^T g_{k-1}}, we obtain:

\frac{\|g_k\|^2\,(g_k^T d_{k-1})}{-d_{k-1}^T g_{k-1}} = \left(\sqrt{2t}\,\frac{g_k^T d_{k-1}}{-d_{k-1}^T g_{k-1}}\,g_k\right)^T \left(\frac{g_k}{\sqrt{2t}}\right) \le t\,\frac{(g_k^T d_{k-1})^2\,\|g_k\|^2}{(d_{k-1}^T g_{k-1})^2} + \frac{1}{4t}\,\|g_k\|^2.

Substituting the inequality into (9), we get:

g_k^T d_k \le -\|g_k\|^2 + \frac{1}{4t}\,\|g_k\|^2 = -\left(1 - \frac{1}{4t}\right)\|g_k\|^2.
From the proof of Theorem 1, we can see that d_k provides a descent direction of f at x_k, and this property is independent of the line search used.
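As an illustrative check (ours, not part of the paper), property (8) can be verified numerically at any iterate by comparing g_k^T d_k against the bound:

    import numpy as np

    def satisfies_sufficient_descent(g, d, t):
        # Check property (8): g_k^T d_k <= -(1 - 1/(4t)) ||g_k||^2.
        # A tiny slack absorbs floating-point rounding.
        return np.dot(g, d) <= -(1.0 - 1.0 / (4.0 * t)) * np.dot(g, g) + 1e-12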
3. The Properties and the Global Convergence
In the following, we assume that g_k \ne 0 for all k; otherwise a stationary point has been found. The following assumptions are often used to prove the convergence of conjugate gradient methods.
Assumption A:
(1) The level set \Omega = \{x \in \mathbb{R}^n \mid f(x) \le f(x_0)\} is bounded.
(2) In some neighborhood N of \Omega, f(x) is continuously differentiable and its gradient g(x) is Lipschitz continuous, namely, there exists a constant L > 0 such that:

\|g(x) - g(y)\| \le L\,\|x - y\|, \quad \forall x, y \in N.   (10)
Since d_k is a descent direction of f at x_k, Algorithm 1 is well defined. Moreover, it follows from (7) that the function value sequence \{f(x_k)\} is decreasing. We also have from (7) that \sum_{k \ge 0} \delta\,\alpha_k^2\,\|d_k\|^4 < +\infty if f is bounded from below. In particular, we have:

\lim_{k \to \infty} \alpha_k \|d_k\|^2 = 0.   (11)
In addition, we can get from Assumption A that there exists a constant \gamma > 0 such that:

\|g(x)\| \le \gamma, \quad \forall x \in \Omega.   (12)
Lemma 1. Let the sequence \{x_k\} be generated by Algorithm 1. If there exists a constant \varepsilon > 0 such that:

\|g_k\| \ge \varepsilon, \quad \forall k,   (13)

then there exists a constant Q > 0 such that:

\|d_k\| \le Q, \quad \forall k.   (14)
Proof: By the definition of \beta_k, we can simplify its expression as:

\beta_k = \frac{\|g_k\|^2}{-d_{k-1}^T g_{k-1}} - t\,\frac{\|g_k\|^2\, g_k^T d_{k-1}}{(d_{k-1}^T g_{k-1})^2}
= \frac{-\|g_k\|^2\, d_{k-1}^T g_{k-1} - t\,\|g_k\|^2\, d_{k-1}^T g_k}{(d_{k-1}^T g_{k-1})^2}
= -\frac{\|g_k\|^2\, d_{k-1}^T (g_{k-1} + t\, g_k)}{(d_{k-1}^T g_{k-1})^2}.   (15)

By (15), (8), (12), (13) and (10), we obtain:

|\beta_k| \le \frac{\|g_k\|^2\,\|d_{k-1}\|\,\|g_{k-1} + t\, g_k\|}{\tau^2\,\|g_{k-1}\|^4} \le \frac{\gamma^2\, L\,\alpha_{k-1}\|d_{k-1}\| \cdot \|d_{k-1}\|}{\tau^2\,\varepsilon^4} = \frac{\gamma^2 L}{\tau^2 \varepsilon^4}\,\alpha_{k-1}\,\|d_{k-1}\|^2,   (16)

where \tau = 1 - \frac{1}{4t}.
By the definition of d_k, we get from (16) and (12) that:

\|d_k\| \le \|g_k\| + |\beta_k|\,\|d_{k-1}\| \le \gamma + \frac{\gamma^2 L}{\tau^2 \varepsilon^4}\,\alpha_{k-1}\|d_{k-1}\|^2\,\|d_{k-1}\|.

Since \lim_{k\to\infty} \alpha_k\|d_k\|^2 = 0, there exist a constant c \in (0, 1) and an integer k_0 such that the following inequality holds for all k \ge k_0:

\frac{\gamma^2 L}{\tau^2 \varepsilon^4}\,\alpha_{k-1}\|d_{k-1}\|^2 \le c.

Hence, we have for all k \ge k_0:

\|d_k\| \le \gamma + c\,\|d_{k-1}\| \le \gamma\,(1 + c + c^2 + \cdots + c^{k-k_0-1}) + c^{k-k_0}\,\|d_{k_0}\| \le \frac{\gamma}{1-c} + \|d_{k_0}\|.

Let Q = \max\left\{\|d_0\|, \|d_1\|, \ldots, \|d_{k_0}\|, \frac{\gamma}{1-c} + \|d_{k_0}\|\right\}; then we have \|d_k\| \le Q, \forall k \ge 0.
Theorem 2. Let the sequence \{x_k\} be generated by Algorithm 1. Then:

\liminf_{k \to \infty} \|g_k\| = 0.   (17)
Proof. We prove the result of this theorem by contradiction. Assume that the theorem is not true; then there exists a constant \varepsilon > 0 such that:

\|g_k\| \ge \varepsilon, \quad k = 0, 1, 2, \ldots   (18)
If \liminf_{k\to\infty} \alpha_k > 0, we get from (8) and (11) that \liminf_{k\to\infty} \|g_k\| = 0; indeed, (8) and the Cauchy-Schwarz inequality give \tau\|g_k\| \le \|d_k\|, while (11) forces \alpha_k\|d_k\|^2 \to 0 and hence \|d_k\| \to 0. This contradicts assumption (18).
If \liminf_{k\to\infty} \alpha_k = 0, there exists an infinite index set K such that:

\lim_{k \in K,\, k \to \infty} \alpha_k = 0.   (19)
By Step 3 of Algorithm 1, when k \in K is sufficiently large the step \rho^{-1}\alpha_k does not satisfy (7), i.e., we have:

f(x_k + \rho^{-1}\alpha_k d_k) - f(x_k) > -\delta\,(\rho^{-1}\alpha_k)^2\,\|d_k\|^4.   (20)
By the mean-value theorem, (10) and (8), there is a constant \theta_k \in (0, 1) such that:

f(x_k + \rho^{-1}\alpha_k d_k) - f(x_k) = \rho^{-1}\alpha_k\, g(x_k + \theta_k \rho^{-1}\alpha_k d_k)^T d_k
= \rho^{-1}\alpha_k\, g_k^T d_k + \rho^{-1}\alpha_k\,[g(x_k + \theta_k \rho^{-1}\alpha_k d_k) - g(x_k)]^T d_k
\le \rho^{-1}\alpha_k\, g_k^T d_k + L\,(\rho^{-1}\alpha_k)^2\,\|d_k\|^2
\le -\left(1 - \frac{1}{4t}\right)\rho^{-1}\alpha_k\,\|g_k\|^2 + L\,(\rho^{-1}\alpha_k)^2\,\|d_k\|^2.
Substituting the inequality into (20), we get for all k \in K sufficiently large:

\left(1 - \frac{1}{4t}\right)\|g_k\|^2 \le L\,\rho^{-1}\alpha_k\,\|d_k\|^2 + \delta\,\rho^{-1}\alpha_k\,\|d_k\|^4.

With (14), we get:

\left(1 - \frac{1}{4t}\right)\|g_k\|^2 \le L\,\rho^{-1}\alpha_k\,\|d_k\|^2 + \delta\,Q^2\,\rho^{-1}\alpha_k\,\|d_k\|^2.

Dividing both sides of this inequality by 1 - \frac{1}{4t} > 0, we get:

\|g_k\|^2 \le \frac{(L + \delta Q^2)\,\rho^{-1}}{1 - \frac{1}{4t}}\,\alpha_k\,\|d_k\|^2.
Since \lim_{k\to\infty}\alpha_k\|d_k\|^2 = 0, the last inequality implies that \lim_{k \in K,\, k\to\infty}\|g_k\| = 0. This also contradicts assumption (18). The contradiction shows that (17) is true.
4. Numerical Experiments
In this section, we report some results of the numerical experiments. We test Algorithm 1 and compare its performance with that of the PRP method, whose results are given in [19]. The problems that we tested are from [20]. Our line search subroutine computes \alpha_k such that the Armijo-type line search condition (7) holds with \rho = 0.5 and \delta = 0.01.
The termination condition is \|g_k\| \le 10^{-5}, or the iteration number exceeds 2 \times 10^4, or the function evaluation number exceeds 3 \times 10^5. In the following table, each column has the following meaning:
Problem: the name of the problem;
Dim: the dimension of the problem;
NI: the number of iterations;
NF: the total number of function evaluations.
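For illustration only, a driver such as the following (reusing the modified_cg sketch from Section 2 with ρ = 0.5, δ = 0.01 and the 10^{-5} tolerance) runs the first test problem, the two-dimensional Rosenbrock function ("rose"); the NI/NF figures reported in Table 1 come from the author's own implementation, not from this sketch.

    import numpy as np

    def rose(x):
        # Rosenbrock test function from [20], dimension 2
        return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

    def rose_grad(x):
        return np.array([
            -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
            200.0 * (x[1] - x[0] ** 2),
        ])

    # Standard starting point (-1.2, 1) for 'rose' in [20]
    x_min, n_iter = modified_cg(rose, rose_grad, np.array([-1.2, 1.0]),
                                t=1.0, rho=0.5, delta=0.01, eps=1e-5)
    print(n_iter, x_min)   # x_min should approach the minimizer (1, 1)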
From the numerical results, we can see that the proposed method performs better than the PRP method for some problems. For some test problems, although the number of iterations of Algorithm 1 is larger than that of the PRP method, the total number of function evaluations of Algorithm 1 is smaller. In summary, the numerical results show that Algorithm 1 is more efficient than the PRP method and provides an efficient method for solving unconstrained optimization problems.
Table 1. Numerical Results
Problem   Dim    Algorithm 1 (NI/NF)   PRPSWP (NI/NF)
rose      2      36/118                29/502
helix     3      41/207                49/255
bard      3      26/89                 23/98
Gulf      3      1/2                   1/2
kowosb    4      77/251                62/361
biggs     6      125/350               121/495
osb2      11     141/528               293/1372
watson    20     616/1307              990/2773
vardim    50     8/51                  10/52
Trig      100    68/241                46/342
Ie        500    4/9                   6/13
Lin       1000   1/3                   1/3
5. Conclusion
Different conjugate gradient algorithms correspond to different choices for the scalar parameter \beta_k. For classic conjugate gradient methods, the parameter \beta_k is selected so that, when the method is applied to minimize a strongly convex quadratic function, the directions d_k and d_{k+1} are conjugate with respect to the Hessian of the quadratic function. Therefore, minimizing a convex quadratic function over a subspace spanned by a set of mutually conjugate directions is equivalent to minimizing this function along each conjugate direction in turn. This is a very good idea. However, the directions in which the parameter \beta_k is computed by (6) do not satisfy the conjugacy condition in this paper.
References
[1] Asady B, Mansouri P, Gupta N. The modify version of artificial bee colony algorithm to solve real optimization problems. International Journal of Electrical and Computer Engineering. 2012; 2(4): 473-480.
[2] Seman K, Puspita FM, Taib BM, Shafii Z. An improved optimization model of internet charging scheme in multi service networks. TELKOMNIKA. 2012; 10(3): 592-598.
[3] Polak E, Ribière G. Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Recherche Opérationnelle, 3e Année. 1969; 16(1): 35-43.
[4] Polyak BT. The conjugate gradient method in extremal problems. USSR Comp. Math. and Math. Phys. 1969; 9(4): 94-112.
[5] Hestenes MR, Stiefel EL. Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Stand. 1952; 49(6): 409-432.
[6] Fletcher R, Reeves C. Function minimization by conjugate gradients. Comput. J. 1964; 7(2): 149-154.
[7] Dai YH, Yuan Y. A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 1999; 10(1): 177-182.
[8] Fletcher R. Practical Methods of Optimization. Vol. 1: Unconstrained Optimization. New York: John Wiley & Sons; 1987.
[9] Birgin E, Martinez JM. A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 2001; 43(2): 117-128.
[10] Zhang L, Zhou WJ, Li DH. Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search. Numerische Mathematik. 2006; 104(4): 561-572.
[11] Zhang J, Xiao Y, Wei Z. Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization. Mathematical Problems in Engineering. 2009; Article ID 243290. http://dx.doi.org/10.1155/2009/243290.
[12] Dai YH, Liao LZ. New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 2001; 43(1): 87-101.
[13] Zhang L, Zhou W, Li DH. A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence. IMA Journal of Numerical Analysis. 2006; 26(4): 629-640.
[14] Cheng W. A two-term PRP-based descent method. Numer. Funct. Anal. Optim. 2007; 28(11-12): 1217-1230.
[15] Al-Bayati AY, Sharif WH. A new three-term conjugate gradient method for unconstrained optimization. Can. J. Sci. Eng. Math. 2010; 1: 108-124.
[16] Zhang L, Zhou W, Li DH. Some descent three-term conjugate gradient methods and their global convergence. Optim. Methods Softw. 2007; 22(4): 697-711.
[17] Li C, Fang L, Cao XL. Global convergence of a kind of conjugate gradient method. TELKOMNIKA. 2013; 11(1): 544-549.
[18] Li M, Feng H. A sufficient descent LS conjugate gradient method for unconstrained optimization problems. Applied Mathematics and Computation. 2011; 218(5): 1577-1586.
[19] Wei ZX, Yao SW, Liu LY. The convergence properties of some new conjugate gradient methods. Applied Mathematics and Computation. 2006; 183(2): 1341-1350.
[20] More JJ, Garbow BS, Hillstrom KE. Testing unconstrained optimization software. ACM Trans. Math. Softw. 1981; 7(1): 17-41.