Grds international conference on pure and applied science (5)
A NEW HYBRID WYL-AMRI CONJUGATE GRADIENT METHOD WITH SUFFICIENT DESCENT CONDITION
FOR UNCONSTRAINED OPTIMIZATION
Ibrahim S. Mohammed*, Mustafa Mamat,
Abdelrhaman Abashar, Kamil Uba Kamfa
Contents:
* Introduction
* Objectives of the Research
* Conjugate Gradient Versions
* New Method and Algorithm
* Numerical Results
* Conclusion
* References
* Introduction
Conjugate gradient (CG) methods are designed to solve large-scale
unconstrained optimization problems. In general, the problem has the
form

$\min_{x \in \mathbb{R}^n} f(x)$  (1.1)

where $f : \mathbb{R}^n \to \mathbb{R}$ is continuously differentiable. Conjugate gradient
methods are iterative methods of the form

$x_{k+1} = x_k + \alpha_k d_k$  (1.2)

where $x_{k+1}$ is the new iterate point, $\alpha_k > 0$ is the step length, which is
computed by carrying out a line search, and $d_k$ is the search direction of the
conjugate gradient method, defined by

$d_k = \begin{cases} -g_k, & k = 0 \\ -g_k + \beta_k d_{k-1}, & k \ge 1 \end{cases}$  (1.3)

Some classical formulas for $\beta_k$ are given as follows:

$\beta_k^{HS} = \dfrac{g_k^T (g_k - g_{k-1})}{(g_k - g_{k-1})^T d_{k-1}}$  (1.4)
• Cont.: Introduction

$\beta_k^{FR} = \dfrac{g_k^T g_k}{g_{k-1}^T g_{k-1}}$  (1.5)

$\beta_k^{PRP} = \dfrac{g_k^T (g_k - g_{k-1})}{g_{k-1}^T g_{k-1}}$  (1.6)

$\beta_k^{CD} = -\dfrac{g_k^T g_k}{d_{k-1}^T g_{k-1}}$  (1.7)

$\beta_k^{DY} = \dfrac{g_k^T g_k}{(g_k - g_{k-1})^T d_{k-1}}$  (1.8)

$\beta_k^{LS} = -\dfrac{g_k^T (g_k - g_{k-1})}{d_{k-1}^T g_{k-1}}$  (1.9)
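As a minimal sketch (not part of the original slides), the six classical coefficients (1.4)-(1.9) can be computed directly from the current gradient, the previous gradient, and the previous search direction:

```python
import numpy as np

def cg_betas(g, g_prev, d_prev):
    """Classical CG coefficients (1.4)-(1.9) for the current gradient
    g = g_k, previous gradient g_prev = g_{k-1}, and previous search
    direction d_prev = d_{k-1}."""
    y = g - g_prev                                # g_k - g_{k-1}
    return {
        "HS":  g.dot(y) / y.dot(d_prev),          # (1.4)
        "FR":  g.dot(g) / g_prev.dot(g_prev),     # (1.5)
        "PRP": g.dot(y) / g_prev.dot(g_prev),     # (1.6)
        "CD":  -g.dot(g) / d_prev.dot(g_prev),    # (1.7)
        "DY":  g.dot(g) / y.dot(d_prev),          # (1.8)
        "LS":  -g.dot(y) / d_prev.dot(g_prev),    # (1.9)
    }
```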
• Cont.: Introduction
Zoutendijk proved that the FR method with exact line search is globally
convergent, and Al-Baali extended this result to the strong Wolfe-Powell
line search.
Recently, Wei et al. [7] proposed a new CG formula (WYL), and
Abashar et al. (2014) modified the RMIL method to suggest the AMRI formula.
• Research Objectives
* To propose a new formula for solving unconstrained optimization problems.
* To analyze the performance of the new formula on standard optimization test problem functions.
* To prove the sufficient descent condition of the new method.
• Conjugate Gradient Versions
(i) Hybrid CG methods, (ii) scaled CG methods, (iii) three-term CG methods.
An important class of conjugate gradient algorithms is the hybrid CG methods;
for example, Hu and Storey [18] proposed combining the PRP and FR formulas,
and Dai and Yuan [19] suggested two hybrid methods.
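The transcript does not reproduce the hybrid formulas themselves; as one concrete illustration, the standard Hu-Storey style hybridization clips the PRP coefficient between 0 and the FR value:

```python
def hybrid_beta_hs(beta_prp: float, beta_fr: float) -> float:
    """Hu-Storey style hybrid CG coefficient: restrict beta_PRP to
    [0, beta_FR], keeping PRP's automatic-restart behaviour while
    inheriting FR's convergence safeguard. (Standard form, shown
    here as a sketch; not copied from the slides.)"""
    return max(0.0, min(beta_prp, beta_fr))
```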
New Method and Algorithm
We propose a new hybrid CG method which is a combination of two CG methods
(WYL and AMRI).
ALGORITHM
Step 1. Given an initial point $x_0 \in \mathbb{R}^n$ and $\varepsilon \in (0,1)$, set $d_0 = -g_0$. If $\|g_0\| \le \varepsilon$, then stop.
Step 2. Compute $\beta_k^{SW\text{-}A}$ by the new hybrid formula.
Step 3. Compute $d_k$ based on (1.3). If $\|g_k\| \le \varepsilon$, then terminate.
Step 4. Compute the step size $\alpha_k$ by a line search.
Step 5. Update the new point based on (1.2).
Step 6. Convergence test and stopping criteria:
if $f(x_{k+1}) \le f(x_k)$ and $\|g_k\| \le \varepsilon$, then terminate; else, set $k = k + 1$ and go to Step 2.
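The six steps above can be sketched as a generic nonlinear CG loop. Since the transcript does not reproduce the hybrid SW-A formula, the WYL coefficient of Wei et al. [7] is used as a placeholder, and Armijo backtracking stands in for the exact line search used in the slides' experiments:

```python
import numpy as np

def cg_descent(f, grad, x0, beta_fn, eps=1e-6, max_iter=2000):
    """Skeleton of Steps 1-6: a nonlinear CG loop in which beta_fn
    supplies the CG coefficient (the hybrid SW-A formula would plug
    in here)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # Step 1: d_0 = -g_0
    for k in range(max_iter):
        if np.linalg.norm(g) <= eps:         # stop when ||g_k|| <= eps
            break
        # Step 4 stand-in: Armijo backtracking rather than exact line search.
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= 0.5
        x = x + alpha * d                    # Step 5: update iterate, (1.2)
        g_new = grad(x)
        beta = beta_fn(g_new, g, d)          # Step 2: CG coefficient
        d = -g_new + beta * d                # Step 3: new direction, (1.3)
        if g_new.dot(d) >= 0.0:              # safeguard: restart if not descent
            d = -g_new
        g = g_new
    return x

def beta_wyl(g, g_prev, d_prev):
    """WYL coefficient of Wei et al. [7], used as a placeholder for the
    hybrid WYL-AMRI (SW-A) formula, which the transcript omits."""
    ratio = np.linalg.norm(g) / np.linalg.norm(g_prev)
    return g.dot(g - ratio * g_prev) / g_prev.dot(g_prev)
```

On a simple convex quadratic the loop drives the gradient norm below the tolerance within a few hundred iterations; the descent safeguard keeps the Armijo search well defined even when the coefficient turns the direction uphill.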
Numerical Results
• Test problem functions considered by Andrei.
• Stopping criterion as in Hillstrom: $\|g_k\| \le \varepsilon$.
• MATLAB subroutine programming was used.
• Exact line search was used.
• Performance profiles introduced by Dolan and Moré.
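The Dolan-Moré performance profile used for the comparison can be sketched as follows (the cost matrix here is hypothetical, not the slides' data; `np.inf` marks a solver failing on a problem):

```python
import numpy as np

def performance_profile(T, taus):
    """Dolan-More performance profile. T is an (n_problems, n_solvers)
    array of costs (iteration counts or CPU times); np.inf marks a
    failure. Returns rho with rho[s, i] = fraction of problems that
    solver s solves within a factor taus[i] of the best solver."""
    ratios = T / T.min(axis=1, keepdims=True)   # performance ratios r_{p,s}
    return np.array([[float(np.mean(ratios[:, s] <= tau)) for tau in taus]
                     for s in range(T.shape[1])])
```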
Numerical Results
TABLE 1: A LIST OF PROBLEM FUNCTIONS
No Function Dim Initial Points
1 Six hump 2 (1, 1), (2, 2), (5,5), (10, -10)
2 Three hump 2 (24, 24), (29, 29), (33, 33), (50, 50)
3 Booth 2 (10, -10), (20, 20), (50, 50), (100, 100)
4 Treccani 2 (5, 5), (10,10), (-20, 20), (-50, -50)
5 Matyas 2 (1, 1), (5, 5), (10, 10), (50, 50)
6 Extended Maratos 2, 4 (0,0,0,0), (0.5,5, 0.5, 5), (10, 0.5, 10, 0.5), (70, 70, 70, 70)
7 Ext FREUD & ROTH 2, 4 (13, 13, 13, 13), (21, 21, 21, 21), (25, 25, 25,25), (23, 23, 23, 23)
8 Generalized Trig 2, 4, 10 (0.5, 5, β¦, 5),(5, 10, β¦, 10),(7,7, β¦, 7),(50, 50, β¦, 50)
9 Fletcher 2, 4, 10 (23, 23, β¦, 23), (45, 45, β¦, 45), (50, 5,β¦, 5), (70, 70, β¦,70)
10 Extended Penalty 2, 4, 10, 100 (0.5,5, β¦,5),(10,-0.5β¦,-0.5), (105,105, β¦,105), (130,130, β¦,130)
11 Raydan 1 2, 4, 10, 100 (1, 1, β¦,1), (3, 3, β¦, 3), (5, 5, β¦, 5), (-10, -10, β¦, -10)
12 Hager 2, 4, 10, 100 (3, -3, β¦, -3),(21, 21, β¦, 21), (-23, 23, β¦, 23), (23, 23, β¦, 23)
13 Rosenbrock 2, 4, 10, 100, 500, 1000, 10000 (7, 7, β¦, 7), (13, 13, β¦, 13), (23, 23, β¦, 23), (35, 35, β¦, 35)
14 Shallow 2, 4, 10, 100, 500, 1000, 10000 (21, -21, β¦, -21), (21, 21, β¦, 21), (50,50, β¦, 50),(130, 130, β¦, 130)
15 Tridiagonal 1 2, 4, 10, 100, 500, 1000, 10000 (0, 0,β¦, 0), (1, -1, β¦, -1), (17, -17, β¦, -17), (30, 30, β¦, 30)
16 Ext White & Holst 2, 4, 10, 100, 500, 1000, 10000 (-5, -5, β¦, -5), (2, -2, β¦, -2), (3, -3, β¦, -3), (7, -7, β¦, -7)
17 Ext Denschnb 2, 4, 10, 100, 500, 1000, 10000 (8, 8, β¦, 8), (11, 11, β¦,11), (12, 12, β¦, 12),(13, 13, β¦, 13)
18 Diagonal 4 2, 4, 10, 100, 500, 1000, 10000 (2, 2, β¦, 2), (5, 5, β¦,5), (10, 10, β¦, 10), (15, 15, β¦, 15)
Numerical Results
• Performance profile based on the number of iterations.
Cont. Numerical Results
• Performance profile based on CPU time.
β’ Conclusion
* AMRI was able to solve 95% of the test problems.
* WYL solved 97% of the test problems.
* SW-A solved all test problems.
• References
* M.R. Hestenes, E.L. Stiefel, Methods of conjugate gradients for solving linear systems, J. Res. Natl. Bur. Stand. Sec. B 49, 1952, pp. 409-432.
* Z. Wei, S. Yao, L. Liu, The convergence properties of some new conjugate gradient methods, Appl. Math. Comput. 183, 2006, pp. 1341-1350.
* G. Zoutendijk, Nonlinear programming, computational methods, in: J. Abadie (Ed.), Integer and Nonlinear Programming, North-Holland, Amsterdam, 1970, pp. 37-86.
* M. Rivaie, M. Mamat, L. June, M. Ismail, A new class of nonlinear conjugate gradient coefficients with global convergence properties, Appl. Math. Comput. 218, 2012, pp. 11323-11332.
* Y.H. Dai, Y. Yuan, An efficient hybrid conjugate gradient method for unconstrained optimization, Ann. Oper. Res. 103, 2001, pp. 33-47.
* E. Dolan, J.J. Moré, Benchmarking optimization software with performance profiles, Math. Prog. 91, 2002, pp. 201-213.
* Y.F. Hu, C. Storey, Global convergence result for conjugate gradient methods, J. Optim. Theory Appl. 71, 1991, pp. 399-405.
* N. Andrei, An unconstrained optimization test functions collection, Adv. Model. Optim. 10, 2008, pp. 147-161.
Thank You