Modification on Spectral Conjugate Gradient Method for Unconstrained Optimization

 
 
 
  • Abstract


    The classical Newton direction and the spectral conjugate gradient (CG) direction are prominent search directions for solving large-scale unconstrained optimization problems. Using the standard secant equation, a modified spectral CG method (MSCG) is proposed; the scheme is a modification of the Birgin and Martinez spectral CG method (SCG). The sufficient descent property and global convergence are proved under the strong Wolfe line search. Numerical results show that the method is practically effective compared with the classical PRP, FR and spectral CG methods.
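    The general shape of a spectral CG iteration can be sketched as follows. This is an illustrative sketch only, not the paper's MSCG formula: beta is taken as the classical PRP coefficient and theta as the Barzilai-Borwein spectral parameter, and the strong Wolfe line search is replaced by an exact step on a convex quadratic test problem (which satisfies the strong Wolfe conditions).

    ```python
    # Sketch of a Birgin-Martinez-style spectral CG iteration (illustrative;
    # not the paper's MSCG coefficient). Stdlib-only, plain Python lists.

    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    def axpy(a, x, y):  # returns a*x + y componentwise
        return [a * xi + yi for xi, yi in zip(x, y)]

    # Test problem: f(x) = 0.5 x^T A x - b^T x with diagonal A (convex quadratic),
    # so the minimizer is x* = A^{-1} b = [0.5, 0.1].
    A = [2.0, 10.0]
    b = [1.0, 1.0]

    def grad(x):
        return [A[i] * x[i] - b[i] for i in range(len(x))]

    def spectral_cg(x, tol=1e-10, max_iter=100):
        g = grad(x)
        d = [-gi for gi in g]          # first direction: steepest descent
        for _ in range(max_iter):
            if dot(g, g) ** 0.5 < tol:
                break
            # Exact line search for the quadratic: alpha = -g^T d / d^T A d.
            # (Stand-in for the strong Wolfe line search used in the paper.)
            alpha = -dot(g, d) / dot(d, [A[i] * d[i] for i in range(len(d))])
            x_new = axpy(alpha, d, x)
            g_new = grad(x_new)
            s = [xn - xo for xn, xo in zip(x_new, x)]   # s_k = x_{k+1} - x_k
            y = [gn - go for gn, go in zip(g_new, g)]   # y_k = g_{k+1} - g_k
            # Spectral parameter (Barzilai-Borwein): theta = s^T s / s^T y.
            theta = dot(s, s) / dot(s, y)
            # PRP conjugacy coefficient: beta = g_{k+1}^T y_k / ||g_k||^2.
            beta = dot(g_new, y) / dot(g, g)
            # Spectral CG direction: d_{k+1} = -theta * g_{k+1} + beta * d_k.
            d = axpy(beta, d, [-theta * gi for gi in g_new])
            x, g = x_new, g_new
        return x

    x_star = spectral_cg([0.0, 0.0])
    ```

    With an exact step the new gradient is orthogonal to the old direction, so d_{k+1}^T g_{k+1} = -theta ||g_{k+1}||^2 < 0 whenever theta > 0, which is the descent property the paper establishes (under the strong Wolfe conditions) for its modified coefficient.
    
    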



  • Keywords


    Global convergence; inexact line search; spectral CG; secant condition; sufficient descent property.

  • References


      [1] M. Raydan (1997). The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim., 7(1), 26-33.

      [2] E. D. Dolan and J. J. Moré (2002). Benchmarking optimization software with performance profiles. Math. Program., 91, 201-213.

      [3] E. G. Birgin and J. M. Martinez (2001). A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim., 43(2), 117-128.

      [4] G. Zoutendijk (1970). Nonlinear programming, computational methods. In Integer and Nonlinear Programming, 37-86.

      [5] A. Perry (1978). A modified conjugate gradient algorithm. Operations Research, 26, 1073-1078.

      [6] M. J. D. Powell (1984). Nonconvex minimization calculations and the conjugate gradient method. Lecture Notes in Mathematics, 1066, 122-141.

      [7] J. Barzilai and J. M. Borwein (1988). Two-point step size gradient methods. IMA J. Numer. Anal., 8, 141-148.

      [8] N. Zull, M. Rivaie, M. Mamat, Z. Salleh and Z. Amani (2015). Global convergence of a spectral conjugate gradient by using strong Wolfe line search. Appl. Math. Sci., 63, 3105-3117.

      [9] W. W. Hager and H. Zhang (2006). A survey of nonlinear conjugate gradient methods. Pacific Journal of Optimization, 2(1), 35-58.

      [10] N. Andrei (2008). An unconstrained optimization test functions collection. Adv. Modell. Optim., 10, 147-161.

      [11] R. B. Yunus, M. Mamat, A. Abashar, M. Rivaie, Z. Salleh and Z. Amani (2015). The convergence properties of a new kind of conjugate gradient method for unconstrained optimization. Appl. Math. Sci., 38, 1845-1856.

      [12] M. Rivaie, M. Mamat and A. Albashar (2015). A new class of nonlinear conjugate gradient coefficients with exact and inexact line search. Appl. Math. and Comp., 268, 1152-1163.

      [13] A. Y. Usman, M. Mamat, M. Rivaie, A. M. Mohamad and B. Y. Rabi’u (2018). Secant free condition of a spectral WYL and its global convergence properties. Far East Journal of Mathematical Science, 103(12), 1889-1902.

      [14] A. Y. Usman, M. Mamat, M. Rivaie, A. M. Mohamad and J. Sabi’u (2018). A recent modification on Dai-Liao conjugate gradient method for solving symmetric nonlinear equations. Far East Journal of Mathematical Science, 103(12), 1961-1974.

      [15] K. U. Kamfa, M. Mamat, A. Abashar, M. Rivaie, P. L. B. Ghazali and Z. Salleh (2015). Another modified conjugate gradient coefficient with global convergence properties. Applied Mathematical Sciences, 9, 1833-1844.

      [16] N. Z. Abidin, M. Mamat, B. Dangerfield, J. H. Zulkepli, M. A. Baten and A. Wibowo (2014). Combating obesity through healthy eating behavior: A call for system dynamics optimization. Plos One, 9(12), 1-17.

      [17] M. Mamat, Y. Rokhayati, N. M. M. Noor and I. Mohd (2011). Optimizing human diet problem with fuzzy price using fuzzy linear programming approach. Pakistan Journal of Nutrition, 10(6), 594-598.

      [18] A. Abashar, M. Mamat, M. Rivaie and I. Mohd (2014). Global convergence properties of a new class of conjugate gradient method for unconstrained optimization. Applied Mathematical Sciences, 65-68, 3307-3319.

      [19] A. Abashar, M. Mamat, M. Rivaie, I. Mohd and O. Omer (2014). The proof of sufficient descent condition for a new type of conjugate gradient methods. AIP Conference Proceedings, 1602, 296-303.


Article ID: 23466
 
DOI: 10.14419/ijet.v7i3.28.23466




Copyright © 2012-2015 Science Publishing Corporation Inc. All rights reserved.