An application of conjugate gradient method under strong Wolfe line search for solving unconstrained optimization

 
 
 
  • Abstract


    The conjugate gradient (CG) method is one of the most prominent methods for solving linear and nonlinear optimization problems. In this paper, we propose a CG method that possesses the sufficient descent property under the strong Wolfe line search. The proposed CG method is then applied to solve systems of linear equations. The numerical results obtained from the tests are evaluated based on the number of iterations and CPU time, and then analyzed through performance profiles. To examine its efficiency, the performance of our CG formula is compared with that of other CG methods. The results show that the proposed CG formula performs better than the other tested CG methods.
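    The abstract applies CG to systems of linear equations. On the underlying quadratic f(x) = ½xᵀAx − bᵀx, the exact step along each conjugate direction also satisfies the strong Wolfe conditions (for sufficient-decrease parameter c₁ ≤ ½ the Armijo condition holds, and the curvature condition holds trivially because the directional derivative vanishes at the exact minimizer). As an illustrative sketch only — using the classical Fletcher–Reeves coefficient [2], not the paper's proposed formula, and with `linear_cg` as a hypothetical name — a NumPy implementation for symmetric positive definite systems might look like:

    ```python
    import numpy as np

    def linear_cg(A, b, x0=None, tol=1e-10, max_iter=1000):
        """Classic conjugate gradient for an SPD system Ax = b.

        On the quadratic f(x) = 0.5 x^T A x - b^T x, the exact step
        alpha = r^T r / (d^T A d) along each direction also satisfies
        the strong Wolfe conditions, so this mirrors line-search CG."""
        x = np.zeros_like(b, dtype=float) if x0 is None else x0.astype(float)
        r = b - A @ x              # residual = negative gradient of f
        d = r.copy()               # first direction: steepest descent
        rs_old = r @ r
        for k in range(max_iter):
            if np.sqrt(rs_old) < tol:
                return x, k        # converged after k iterations
            Ad = A @ d
            alpha = rs_old / (d @ Ad)   # exact minimizer along d
            x += alpha * d
            r -= alpha * Ad
            rs_new = r @ r
            beta = rs_new / rs_old      # Fletcher-Reeves coefficient
            d = r + beta * d            # new conjugate direction
            rs_old = rs_new
        return x, max_iter
    ```

    In exact arithmetic, CG solves an n-dimensional SPD system in at most n iterations, which is why iteration counts are a natural evaluation metric for such methods.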

     

     


  • Keywords


    Conjugate Gradient Method; Spectral Conjugate Gradient; Strong Wolfe Line Search.
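    The abstract evaluates solvers by iteration count and CPU time through performance profiles [28]. As a minimal sketch under stated assumptions — `performance_profile` is a hypothetical helper name, and the Dolan–Moré convention of marking failures with `inf` is assumed — the profile ρₛ(τ) (the fraction of problems a solver finishes within a factor τ of the best solver) can be computed as:

    ```python
    import numpy as np

    def performance_profile(T, taus):
        """Dolan-More performance profile [28].

        T    : (n_problems, n_solvers) array of costs (iterations or
               CPU time); np.inf marks a solver failure on a problem.
        taus : 1-D array of performance-ratio thresholds.

        Returns rho of shape (len(taus), n_solvers), where
        rho[i, s] is the fraction of problems that solver s solves
        within a factor taus[i] of the best solver on that problem."""
        T = np.asarray(T, dtype=float)
        best = T.min(axis=1, keepdims=True)   # best cost per problem
        ratios = T / best                     # performance ratios r_{p,s}
        taus = np.asarray(taus, dtype=float)
        # Broadcast thresholds over (problems, solvers), then average
        # over problems to get the cumulative fraction per solver.
        rho = (ratios[None, :, :] <= taus[:, None, None]).mean(axis=1)
        return rho
    ```

    For example, with two solvers on three problems, `performance_profile([[2., 4.], [3., 3.], [10., 5.]], [1., 2.])` reports the fraction of problems each solver wins (τ = 1) and the fraction it solves within twice the best cost (τ = 2).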

  • References


      [1] Hestenes MR and Stiefel E (1952), Methods of Conjugate Gradients for Solving Linear Systems, J. Res. Nat. Bur. Stand. 49, 409-436.

      [2] Fletcher R and Reeves C (1964), Function Minimization by Conjugate Gradients, Comput. J. 7, 149-154.

      [3] Polak E and Ribière G (1969), Note sur la convergence de méthodes de directions conjuguées, Rev. Française Informat. Recherche Opérationnelle 3, 35-43.

      [4] Liu Y and Storey C (1991), Efficient Generalized Conjugate Gradient Algorithms, Part 1: Theory, J. Optim. Theory Appl. 69, 129-137.

      [5] Dai YH and Yuan Y (1999), A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property, SIAM J. Optim. 10, 177-182.

      [6] Fletcher R, Practical Methods of Optimization, Vol. 1: Unconstrained Optimization, John Wiley and Sons, New York, (1987).

      [7] Rivaie M, Mustafa M, Mohd I, and Fauzi M (2010), Modified Hestenes-Stiefel Conjugate Gradient Coefficient for Unconstrained Optimization, Journal of Interdisciplinary Mathematics 13(3), 241-251.

      [8] Rivaie M, Mustafa M, June LW and Mohd I (2012), A New Class of Nonlinear Conjugate Gradient Coefficient with Global Convergence Properties, Appl. Math. Comp. 218, 11323-11332.

      [9] Wei Z, Yao S and Liu L (2006), The Convergence Properties of Some New Conjugate Gradient Methods, Applied Mathematics and Computation 183, 1341-1350.

      [10] Hajar N, Mamat M, Rivaie M and Jusoh I (2016), A New Type of Descent Conjugate Gradient Method with Exact Line Search, AIP Conf. Proc. 1739, 020089.

      [11] Ghani NHA, Rivaie M and Mamat M (2016), A Modified Form of Conjugate Gradient Method for Unconstrained Optimization Problems, AIP Conf. Proc. 1739, 020076.

      [12] Ghani NHA, Rivaie M and Mamat M (2017), A New Family of Polak-Ribiere-Polyak Conjugate Gradient Method with the Strong-Wolfe Line Search, AIP Conf. Proc. 1870, 040060.

      [13] Shoid S, Rivaie M and Mamat M (2016), A Modification of Classical Conjugate Gradient Method Using Strong Wolfe Line Search, AIP Conf. Proc. 1739, 020071.

      [14] Du X, Zhang P and Ma W (2016), Some Modified Conjugate Gradient Methods for Unconstrained Optimization, Journal of Computational and Applied Mathematics 305, 92-114.

      [15] Mohamed NS, Mamat M and Rivaie M (2016), Solving a Large Scale Nonlinear Unconstrained Optimization with Exact Line Search Direction by Using New Coefficient of Conjugate Gradient Methods, AIP Conf. Proc. 1787, 080018.

      [16] Mohamed NS, Mamat M and Rivaie M (2017), A New Nonlinear Conjugate Gradient Coefficient under Strong Wolfe-Powell Line Search, AIP Conf. Proc. 1870, 040055.

      [17] Shapiee N, Rivaie M and Mamat M (2016), A New Classical Conjugate Gradient Coefficient with Exact Line Search, AIP Conf. Proc. 1739, 020082.

      [18] Shapiee N, Rivaie M and Mamat M (2015), A New Simple Conjugate Gradient Coefficient for Unconstrained Optimization, Applied Mathematical Sciences 9(63), 3119-3130.

      [19] Khadijah W, Rivaie M and Mamat M (2017), A Three-Term Conjugate Gradient Method under the Strong-Wolfe Line Search, AIP Conf. Proc. 1870, 040056.

      [20] ‘Aini N, Rivaie M and Mamat M (2016), A Modified Conjugate Gradient Coefficient with Inexact Line Search for Unconstrained Optimization, AIP Conf. Proc. 1787, 080019.

      [21] Abidin ZZ, Mamat M and Rivaie M (2016), A New Steepest Descent Method with Global Convergence Properties, AIP Conf. Proc. 1739, 020070.

      [22] Birgin EG and Martinez JM (2001), A Spectral Conjugate Gradient Method for Unconstrained Optimization, Appl. Math Optim. 43, 117-128.

      [23] Zhang L, Zhou W and Li D (2006), Global Convergence of a Modified Fletcher-Reeves Conjugate Gradient Method with Armijo-Type Line Search, Numer. Math. 104, 561-572.

      [24] Zhang L and Zhou W (2008), Two Descent Hybrid Conjugate Gradient Methods for Optimization, J. Comp. and Appl. Math. 216, 251-264.

      [25] Khadijah W, Rivaie M, Mamat M and Jusoh I (2016), A Spectral KRMI Conjugate Gradient Method under the Strong-Wolfe Line Search, AIP Conf. Proc. 1739, 020072.

      [26] Andrei N (2008), An Unconstrained Optimization Test Functions Collection, Adv. Model. Optim. 10, 147-161.

      [27] Hillstrom KE (1977), A Simulation Test Approach to the Evaluation of Nonlinear Optimization Algorithms, ACM Trans. Math. Softw. 3, 305-315.

      [28] Dolan ED and Moré JJ (2002), Benchmarking Optimization Software with Performance Profiles, Math. Program. 91, 201-213.

      [29] Chong EKP and Zak SH, An Introduction to Optimization, 3rd Edition, John Wiley and Sons, New Jersey (2008).


 


Article ID: 20956
 
DOI: 10.14419/ijet.v7i3.28.20956




Copyright © 2012-2015 Science Publishing Corporation Inc. All rights reserved.