A New Conjugate Gradient Method with Exact Line Search

  • Authors

    • Mouiyad Bani Yousef
    • Mustafa Mamat
    • Mohd Rivaie
  • Published: 2018-12-09
  • DOI: https://doi.org/10.14419/ijet.v7i4.33.28167
  • Keywords: conjugate gradient method, exact line search, sufficient descent property, global convergence, performance profile.
  • The nonlinear conjugate gradient (CG) method is a widely used approach for solving large-scale optimization problems in many fields, such as physics, engineering, economics, and design. The efficiency of this method is mainly attributable to its global convergence properties and low memory requirements. In this paper, a new conjugate gradient coefficient is proposed based on the Aini-Rivaie-Mustafa (ARM) method. The proposed method is proved globally convergent under exact line search, and this is supported by numerical tests: the new CG method performs better and is more efficient than related previous CG methods.
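    The general scheme the abstract refers to can be sketched as follows. This is a minimal illustration, not the paper's method: the ARM-based coefficient proposed in the paper is not reproduced on this page, so the classical Fletcher-Reeves coefficient stands in for it, and the exact line search is shown on a convex quadratic, where the exact step length has a closed form.

    ```python
    # Nonlinear CG skeleton with exact line search, illustrated on a convex
    # quadratic f(x) = 0.5 x^T A x - b^T x. For this f, the exact minimizer
    # along direction d is alpha = -(g^T d) / (d^T A d).
    # NOTE: the Fletcher-Reeves coefficient below is a stand-in; the paper's
    # ARM-based coefficient is not given on this page.

    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    def matvec(A, v):
        return [dot(row, v) for row in A]

    def cg_exact(A, b, x0, tol=1e-10, max_iter=100):
        x = list(x0)
        g = [gi - bi for gi, bi in zip(matvec(A, x), b)]   # gradient Ax - b
        d = [-gi for gi in g]                              # initial search direction
        for _ in range(max_iter):
            if dot(g, g) ** 0.5 < tol:                     # stop when gradient is small
                break
            Ad = matvec(A, d)
            alpha = -dot(g, d) / dot(d, Ad)                # exact line search step
            x = [xi + alpha * di for xi, di in zip(x, d)]
            g_new = [gi + alpha * adi for gi, adi in zip(g, Ad)]
            beta = dot(g_new, g_new) / dot(g, g)           # Fletcher-Reeves coefficient
            d = [-gi + beta * di for gi, di in zip(g_new, d)]
            g = g_new
        return x

    # Small 2x2 SPD example: solves Ax = b, i.e. minimizes the quadratic.
    A = [[4.0, 1.0], [1.0, 3.0]]
    b = [1.0, 2.0]
    x = cg_exact(A, b, [0.0, 0.0])
    ```

    On an n-dimensional quadratic this scheme terminates in at most n iterations; for general nonlinear objectives the step length must instead be found by an exact or inexact one-dimensional minimization, which is where the choice of CG coefficient affects descent and global convergence.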

  • References

    [1] Polyak, Boris T. "The conjugate gradient method in extremal problems." USSR Computational Mathematics and Mathematical Physics 9.4 (1969): 94-112.

      [2] Polak, Elijah, and Gerard Ribiere. "Note sur la convergence de méthodes de directions conjuguées." Revue française d'informatique et de recherche opérationnelle. Série rouge 3.16 (1969): 35-43.

      [3] Fletcher, Roger, and Colin M. Reeves. "Function minimization by conjugate gradients." The Computer Journal 7.2 (1964): 149-154.

      [4] Wei, Zengxin, Shengwei Yao, and Liying Liu. "The convergence properties of some new conjugate gradient methods." Applied Mathematics and computation 183.2 (2006): 1341-1350.

      [5] Shapiee, Norrlaili, Mohd Rivaie, and Mustafa Mamat. "A new classical conjugate gradient coefficient with exact line search." AIP Conference Proceedings. Vol. 1739. No. 1. AIP Publishing, 2016.

      [6] Hestenes, Magnus Rudolph, and Eduard Stiefel. Methods of conjugate gradients for solving linear systems. Vol. 49. No. 1. Washington, DC: NBS, 1952.

      [7] Fletcher, Roger. Practical methods of optimization. John Wiley & Sons, 2013.

      [8] Dai, Yu-Hong, and Yaxiang Yuan. "A nonlinear conjugate gradient method with a strong global convergence property." SIAM Journal on optimization 10.1 (1999): 177-182.

      [9] Zoutendijk, G. "Nonlinear programming, computational methods." Integer and nonlinear programming (1970): 37-86.

      [10] Al-Baali, Mehiddin. "Descent property and global convergence of the Fletcher—Reeves method with inexact line search." IMA Journal of Numerical Analysis 5.1 (1985): 121-124.

      [11] Touati-Ahmed, D., and C. Storey. "Efficient hybrid conjugate gradient techniques." Journal of optimization theory and applications 64.2 (1990): 379-397.

      [12] Powell, Michael JD. "Nonconvex minimization calculations and the conjugate gradient method." Numerical analysis. Springer, Berlin, Heidelberg, 1984. 122-141.

      [13] Gilbert, Jean Charles, and Jorge Nocedal. "Global convergence properties of conjugate gradient methods for optimization." SIAM Journal on optimization 2.1 (1992): 21-42.

      [14] Powell, Michael James David. "Restart procedures for the conjugate gradient method." Mathematical programming 12.1 (1977): 241-254.

      [15] Hager, William W., and Hongchao Zhang. "A new conjugate gradient method with guaranteed descent and an efficient line search." SIAM Journal on optimization 16.1 (2005): 170-192.

      [16] Andrei, Neculai. "Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization." Journal of Computational and Applied Mathematics 230.2 (2009): 570-582.

      [17] Wei, Zengxin, Shengwei Yao, and Liying Liu. "The convergence properties of some new conjugate gradient methods." Applied Mathematics and computation 183.2 (2006): 1341-1350.

      [18] Abashar, Abdelrahman, et al. "Global convergence properties of a new class of conjugate gradient method for unconstrained optimization." (2017).

      [19] Jusoh, I., Mamat, M., & Rivaie, M. (2014). A new edition of conjugate gradient methods for large-scale unconstrained optimization. International Journal of Mathematical Analysis, 8, 2277-2291.

      [20] Rivaie, Mohd, et al. "The convergence properties of a new type of conjugate gradient methods." (2016).

      [21] Rivaie, Mohd, et al. "A new class of nonlinear conjugate gradient coefficients with global convergence properties." Applied Mathematics and Computation 218.22 (2012): 11323-11332.

      [22] Yuan, G., Lu, S., & Wei, Z. (2010). A line search algorithm for unconstrained optimization. Journal of Software Engineering and Applications, 3, 503-509.

      [23] Andrei, Neculai. "An unconstrained optimization test functions collection." Adv. Model. Optim 10.1 (2008): 147-161.

      [24] Dolan, Elizabeth D., and Jorge J. Moré. "Benchmarking optimization software with performance profiles." Mathematical programming 91.2 (2002): 201-213.

  • How to Cite

    Bani Yousef, M., Mamat, M., & Rivaie, M. (2018). A New Conjugate Gradient Method with Exact Line Search. International Journal of Engineering & Technology, 7(4.33), 521-525. https://doi.org/10.14419/ijet.v7i4.33.28167