Grds international conference on pure and applied science (5)

A NEW HYBRID WYL-AMRI CONJUGATE GRADIENT METHOD WITH SUFFICIENT DESCENT CONDITION FOR UNCONSTRAINED OPTIMIZATION
Ibrahim S. Mohammed*, Mustafa Mamat, Abdelrhaman Abashar, Kamil Uba Kamfa

Transcript of Grds international conference on pure and applied science (5)

Page 1

A NEW HYBRID WYL-AMRI CONJUGATE GRADIENT METHOD WITH SUFFICIENT DESCENT CONDITION

FOR UNCONSTRAINED OPTIMIZATION

Ibrahim S. Mohammed*, Mustafa Mamat,
Abdelrhaman Abashar, Kamil Uba Kamfa

Page 2

Contents:
* Introduction
* Objectives of the Research
* Conjugate Gradient Version
* New Method and Algorithm
* Numerical Results
* Conclusion
* References

Page 3

* Introduction

Conjugate gradient (CG) methods are designed to solve large-scale unconstrained optimization problems. In general, the method addresses the problem

\min_{x \in \mathbb{R}^n} f(x) \quad (1.1)

where f : \mathbb{R}^n \to \mathbb{R} is continuously differentiable. Conjugate gradient methods are iterative methods of the form

x_{k+1} = x_k + \alpha_k d_k \quad (1.2)

where x_k is the current iterate, \alpha_k > 0 is the step length, computed by carrying out a line search, and d_k is the search direction of the conjugate gradient method, defined by

d_k = \begin{cases} -g_k, & \text{if } k = 0, \\ -g_k + \beta_k d_{k-1}, & \text{if } k \ge 1, \end{cases} \quad (1.3)

where g_k = \nabla f(x_k).

Some classical formulas for \beta_k are given as follows:

\beta_k^{HS} = \frac{g_k^T (g_k - g_{k-1})}{(g_k - g_{k-1})^T d_{k-1}} \quad (1.4)

Page 4

• Cont.: Introduction

\beta_k^{FR} = \frac{g_k^T g_k}{g_{k-1}^T g_{k-1}} \quad (1.5)

\beta_k^{PRP} = \frac{g_k^T (g_k - g_{k-1})}{g_{k-1}^T g_{k-1}} \quad (1.6)

\beta_k^{CD} = -\frac{g_k^T g_k}{d_{k-1}^T g_{k-1}} \quad (1.7)

\beta_k^{DY} = \frac{g_k^T g_k}{(g_k - g_{k-1})^T d_{k-1}} \quad (1.8)

\beta_k^{LS} = -\frac{g_k^T (g_k - g_{k-1})}{d_{k-1}^T g_{k-1}} \quad (1.9)
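As an illustration, the classical coefficients (1.4)-(1.9) can all be computed from the current gradient, the previous gradient, and the previous direction. The helper below is a hypothetical sketch (not part of the presented method), assuming NumPy vectors:

```python
import numpy as np

def beta_classical(g, g_prev, d_prev, variant="FR"):
    """Classical CG coefficients (1.4)-(1.9).
    g, g_prev: current and previous gradients; d_prev: previous direction."""
    y = g - g_prev  # gradient difference g_k - g_{k-1}
    if variant == "HS":   # Hestenes-Stiefel (1.4)
        return (g @ y) / (y @ d_prev)
    if variant == "FR":   # Fletcher-Reeves (1.5)
        return (g @ g) / (g_prev @ g_prev)
    if variant == "PRP":  # Polak-Ribiere-Polyak (1.6)
        return (g @ y) / (g_prev @ g_prev)
    if variant == "CD":   # Conjugate Descent (1.7)
        return -(g @ g) / (d_prev @ g_prev)
    if variant == "DY":   # Dai-Yuan (1.8)
        return (g @ g) / (y @ d_prev)
    if variant == "LS":   # Liu-Storey (1.9)
        return -(g @ y) / (d_prev @ g_prev)
    raise ValueError(f"unknown variant {variant!r}")
```

Note that HS, FR, and PRP differ only in which numerator and denominator they pair, which is what makes hybrid combinations of them natural.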

Page 5

Cont.: Introduction

Zoutendijk proved that the FR method with an exact line search is globally convergent; Al-Baali extended this result to the strong Wolfe-Powell line search.

Recently, Wei et al. [7] proposed a new CG formula (WYL), and Abashar et al. (2014) modified the RMIL method to suggest a new coefficient (AMRI).

Page 6

• Research Objectives

* To propose a new formula for solving unconstrained optimization problems.
* To analyze the performance of these new formulas on standard optimization test problem functions.
* To prove the sufficient descent condition of our new method.

Page 7

• Conjugate Gradient Version

(i) Hybrid CG methods. (ii) Scaled CG methods. (iii) Three-term CG methods.

An important class of conjugate gradient algorithms is the hybrid CG methods; for example, Hu and Storey [18] proposed an early hybrid scheme, and Dai and Yuan [19] suggested two hybrid methods.
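To illustrate the hybridization idea, a hybrid coefficient typically truncates one classical formula by another. The sketch below is a generic, commonly studied pattern (clipping the PRP value (1.6) between 0 and the FR value (1.5)), shown only as an example of the idea, not as the exact Hu-Storey or Dai-Yuan formula:

```python
import numpy as np

def beta_hybrid(g, g_prev):
    """Generic hybrid CG coefficient: clip PRP between 0 and FR,
    combining PRP's automatic-restart behaviour with FR's convergence theory."""
    fr = (g @ g) / (g_prev @ g_prev)              # Fletcher-Reeves (1.5)
    prp = (g @ (g - g_prev)) / (g_prev @ g_prev)  # Polak-Ribiere-Polyak (1.6)
    return max(0.0, min(prp, fr))
```

The clipping to zero forces a steepest-descent restart whenever PRP turns negative, which is the usual motivation for this family of hybrids.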

Page 8

New method and Algorithm

We propose a new hybrid CG method which combines two CG methods, WYL and AMRI.

ALGORITHM

Step 1. Given an initial point x_0 \in R^n and \varepsilon \in (0, 1), set d_0 = -g_0; if \|g_0\| \le \varepsilon, then stop.

Step 2. Compute \beta_k^{SW-A} based on (14).

Step 3. Compute d_k based on (4). If \|g_k\| \le \varepsilon, then terminate.

Step 4. Compute the step size \alpha_k based on (3).

Step 5. Update the new point x_{k+1} based on (2).

Step 6. Convergence test and stopping criteria: if f(x_{k+1}) \le f(x_k) and \|g_k\| \le \varepsilon, then terminate; else set k = k + 1 and go to Step 2.
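The six steps above can be sketched as a generic loop. The SW-A coefficient itself (the slides refer to it only as equation (14) of the paper) is not reproduced here, so the sketch takes the beta rule as a parameter; as a further assumption it uses a backtracking Armijo line search in place of the exact line search used in the experiments:

```python
import numpy as np

def cg_minimize(f, grad, x0, beta_fn, eps=1e-6, max_iter=1000):
    """Generic CG loop with a pluggable beta rule (a sketch, not the
    authors' implementation)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                     # Step 1: d_0 = -g_0
    for k in range(max_iter):
        if np.linalg.norm(g) <= eps:           # stopping criterion ||g_k|| <= eps
            break
        if g @ d >= 0:                         # safeguard: restart with steepest
            d = -g                             # descent if d_k is not a descent direction
        # Backtracking (Armijo) line search as a stand-in for exact line search.
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= 0.5
        x = x + alpha * d                      # Step 5: x_{k+1} = x_k + alpha_k d_k
        g_new = grad(x)
        d = -g_new + beta_fn(g_new, g, d) * d  # d_k = -g_k + beta_k d_{k-1}
        g = g_new
    return x

# Fletcher-Reeves coefficient (1.5) as an example plug-in:
beta_fr = lambda g, g_prev, d_prev: (g @ g) / (g_prev @ g_prev)
```

Any of the classical or hybrid coefficients can be passed as `beta_fn`; only the beta rule changes between CG variants.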

Page 9

Numerical Results

• Test problem functions considered by Andrei.

• Stopping criterion as in Hillstrom: \|g_k\| \le \varepsilon.

• A MATLAB subroutine was used for the implementation.

• Exact line search was used.

• Performance profiles as introduced by Dolan and Moré.
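For reference, a Dolan-Moré performance profile reports, for each solver, the fraction of problems it solves within a factor tau of the best solver's cost (iterations or CPU time). A minimal sketch with a hypothetical helper name, assuming a cost matrix in which np.inf marks a failure:

```python
import numpy as np

def performance_profile(T, taus):
    """Dolan-More performance profile.
    T[i, s]: cost of solver s on problem i (np.inf = failure).
    Returns rho[t, s]: fraction of problems solver s solves within
    taus[t] times the best cost for that problem."""
    best = np.min(T, axis=1, keepdims=True)  # best cost per problem
    ratios = T / best                        # performance ratios r_{p,s}
    return np.array([[np.mean(ratios[:, s] <= tau) for s in range(T.shape[1])]
                     for tau in taus])
```

Plotting rho against tau for each solver gives the profile curves shown on the next slides; a curve that reaches 1.0 corresponds to a solver that solves every problem.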

Page 10

Numerical Results

TABLE 1: A LIST OF PROBLEM FUNCTIONS

No Function Dim Initial Points

1 Six hump 2 (1, 1), (2, 2), (5,5), (10, -10)

2 Three hump 2 (24, 24), (29, 29), (33, 33), (50, 50)

3 Booth 2 (10, -10), (20, 20), (50, 50), (100, 100)

4 Treccani 2 (5, 5), (10,10), (-20, 20), (-50, -50)

5 Matyas 2 (1, 1), (5, 5), (10, 10), (50, 50)

6 Extended Maratos 2, 4 (0,0,0,0), (0.5,5, 0.5, 5), (10, 0.5, 10, 0.5), (70, 70, 70, 70)

7 Ext Freudenstein & Roth 2, 4 (13, 13, 13, 13), (21, 21, 21, 21), (25, 25, 25, 25), (23, 23, 23, 23)

8 Generalized Trig 2, 4, 10 (0.5, 5, …, 5),(5, 10, …, 10),(7,7, …, 7),(50, 50, …, 50)

9 Fletcher 2, 4, 10 (23, 23, …, 23), (45, 45, …, 45), (50, 5,…, 5), (70, 70, …,70)

10 Extended Penalty 2, 4, 10, 100 (0.5,5, …,5),(10,-0.5…,-0.5), (105,105, …,105), (130,130, …,130)

11 Raydan 1 2, 4, 10, 100 (1, 1, …,1), (3, 3, …, 3), (5, 5, …, 5), (-10, -10, …, -10)

12 Hager 2, 4, 10, 100 (3, -3, …, -3),(21, 21, …, 21), (-23, 23, …, 23), (23, 23, …, 23)

13 Rosenbrock 2, 4, 10, 100, 500, 1000, 10000 (7, 7, …, 7), (13, 13, …, 13), (23, 23, …, 23), (35, 35, …, 35)

14 Shallow 2, 4, 10, 100, 500, 1000, 10000 (21, -21, …, -21), (21, 21, …, 21), (50,50, …, 50),(130, 130, …, 130)

15 Tridiagonal 1 2, 4, 10, 100, 500, 1000, 10000 (0, 0,…, 0), (1, -1, …, -1), (17, -17, …, -17), (30, 30, …, 30)

16 Ext White & Holst 2, 4, 10, 100, 500, 1000, 10000 (-5, -5, …, -5), (2, -2, …, -2), (3, -3, …, -3), (7, -7, …, -7)

17 Ext Denschnb 2, 4, 10, 100, 500, 1000, 10000 (8, 8, …, 8), (11, 11, …,11), (12, 12, …, 12),(13, 13, …, 13)

18 Diagonal 4 2, 4, 10, 100, 500, 1000, 10000 (2, 2, …, 2), (5, 5, …,5), (10, 10, …, 10), (15, 15, …, 15)

Page 11

Numerical Results

• Performance profile based on the number of iterations.

Page 12

Cont. Numerical Results

• Performance profile based on CPU time.

Page 13

• Conclusion

* AMRI was able to solve 95% of the test problems.

* WYL solved 97% of the test problems.

* The new hybrid method SW-A solved all of the test problems.

Page 14

• References

* M.R. Hestenes, E.L. Stiefel, Methods of conjugate gradients for solving linear systems, J. Res. Natl. Bur. Stand. Sec. B 49, 1952, pp. 409–432.

* Z. Wei, S. Yao, L. Liu, The convergence properties of some new conjugate gradient methods, Appl. Math. Comput. 183, 2006, pp. 1341–1350.

* G. Zoutendijk, Nonlinear programming, computational methods, in: J. Abadie (Ed.), Integer and Nonlinear Programming, North-Holland, Amsterdam, 1970, pp. 37–86.

* M. Rivaie, M. Mamat, L. June, M. Ismail, A new class of nonlinear conjugate gradient coefficients with global convergence properties, Appl. Math. Comput. 218, 2012, pp. 11323–11332.

* Y.H. Dai, Y. Yuan, An efficient hybrid conjugate gradient method for unconstrained optimization, Ann. Oper. Res. 103, 2001, pp. 33–47.

* E. Dolan, J.J. Moré, Benchmarking optimization software with performance profiles, Math. Prog. 91, 2002, pp. 201–213.

* Y.F. Hu, C. Storey, Global convergence result for conjugate gradient methods, J. Optim. Theory Appl. 71, 1991, pp. 399–405.

* N. Andrei, An unconstrained optimization test functions collection, Adv. Model. Optim. 10, 2008, pp. 147–161.

Page 15

Thank You