An Efficient Modified Hestenes-Stiefel Conjugate Gradient Method with Global Convergence Properties Under Strong Wolfe-Powell Line Search for Large-Scale Unconstrained Optimization

  • Authors

    • Talat Alkhouli
    • Mustafa Mamat
    • Mohd Rivaie
    • Sukono
    • Mohamad Afendee Mohamed
    • Puspa Liza Ghazali
    https://doi.org/10.14419/ijet.v7i3.28.27379
  • Keywords

    unconstrained optimization, conjugate gradient, strong Wolfe-Powell line search, Hestenes-Stiefel formula, global convergence
  • Abstract

    The conjugate gradient (CG) method is widely used for solving unconstrained optimization problems and is well known for its low memory requirements and global convergence properties. Many works have been directed towards improving this method. In this paper, we propose a superior conjugate gradient coefficient β_k, obtained by revising the well-established Hestenes-Stiefel formula. Theoretical proofs show that the new method fulfils the sufficient descent condition when the strong Wolfe-Powell inexact line search is used. Moreover, numerical results show that the proposed method outperforms existing CG coefficients on a standard test set. The numerical results also show that the new formula for β_k performs significantly better than the original Hestenes-Stiefel formula.
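
    As a concrete illustration of the framework described above, the sketch below implements a generic nonlinear CG loop in which the search direction is updated with a Hestenes-Stiefel-type coefficient and the step length is chosen by a strong Wolfe-Powell inexact line search. This is a minimal sketch, not the authors' method: the paper's modified formula for β_k is not reproduced in the abstract, so the code uses the classical coefficient β_k = g_{k+1}^T y_k / (d_k^T y_k) with a common nonnegativity safeguard, and the function name cg_hs and its parameters are illustrative. SciPy's line_search returns a step length satisfying the strong Wolfe conditions.

```python
import numpy as np
from scipy.optimize import line_search

def cg_hs(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear CG with the classical Hestenes-Stiefel coefficient under a
    strong Wolfe-Powell line search (illustrative sketch; the paper's
    modified beta_k is not reproduced here)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # start along steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Step length satisfying the strong Wolfe conditions, with the
        # usual parameters 0 < c1 < c2 < 1.
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                     # line search failed: restart
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0] or 1e-8
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                         # gradient difference y_k
        # Classical Hestenes-Stiefel coefficient; the Wolfe curvature
        # condition keeps the denominator d^T y positive on descent steps.
        beta = (g_new @ y) / (d @ y)
        d = -g_new + max(beta, 0.0) * d       # HS+ nonnegativity safeguard
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from the standard starting point.
from scipy.optimize import rosen, rosen_der
x_star = cg_hs(rosen, rosen_der, np.array([-1.2, 1.0]))
```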


  • References

      [1] Hestenes, M. R., & Stiefel, E. (1952). Methods of conjugate gradients for solving linear systems. Journal of Research of the National Bureau of Standards, 49(6), 409-436.

      [2] Fletcher, R., & Reeves, C. M. (1964). Function minimization by conjugate gradients. Computer Journal, 7(2), 149-154.

      [3] Polyak, B. T. (1969). The conjugate gradient method in extremal problems. USSR Computational Mathematics and Mathematical Physics, 9(4), 94-112.

      [4] Polak, E., & Ribiere, G. (1969). Note sur la convergence de méthodes de directions conjuguées. Revue française d'informatique et de recherche opérationnelle. Série Rouge, 3(16), 35-43.

      [5] Fletcher, R. (2013). Practical methods of optimization. John Wiley and Sons.

      [6] Liu, Y., & Storey, C. (1991). Efficient generalized conjugate gradient algorithms, part 1: theory. Journal of Optimization Theory and Applications, 69(1), 129-137.

      [7] Dai, Y. H., & Yuan, Y. (1999). A nonlinear conjugate gradient method with a strong global convergence property. SIAM Journal on Optimization, 10(1), 177-182.

      [8] Zoutendijk, G. (1970). Nonlinear programming, computational methods. In J. Abadie (Ed.), Integer and Nonlinear Programming. Amsterdam: North-Holland, pp. 37-86.

      [9] Powell, M. J. D. (1977). Restart procedures for the conjugate gradient method. Mathematical Programming, 12(1), 241-254.

      [10] Powell, M. J. D. (1984). Nonconvex minimization calculations and the conjugate gradient method. In D. F. Griffiths (Ed.), Numerical Analysis. Berlin: Springer, pp. 122-141.

      [11] Powell, M. J. D. (1986). Convergence properties of algorithms for nonlinear optimization. SIAM Review, 28(4), 487-500.

      [12] Wei, Z., Li, G., & Qi, L. (2006). New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems. Applied Mathematics and Computation, 179(2), 407-430.

      [13] Dai, Z. F. (2011). Two modified HS type conjugate gradient methods for unconstrained optimization problems. Nonlinear Analysis: Theory, Methods and Applications, 74(3), 927-936.

      [14] Rivaie, M., Mamat, M., Mohd, I., & Fauzi, M. (2010). Modified Hestenes-Stiefel conjugate gradient coefficient for unconstrained optimization. Journal of Interdisciplinary Mathematics, 13(3), 241-251.

      [15] Dai, Z. F. (2011). Two modified HS type conjugate gradient methods for unconstrained optimization problems. Nonlinear Analysis: Theory, Methods and Applications, 74(3), 927-936.

      [16] Zhang, L. (2009). New versions of the Hestenes-Stiefel nonlinear conjugate gradient method based on the secant condition for optimization. Computational and Applied Mathematics, 28(1), 111-133.

      [17] Shapiee, N., Rivaie, M., & Mamat, M. (2016). A new classical conjugate gradient coefficient with exact line search. AIP Conference Proceedings, 1739(1), 1-8.

      [18] Hamoda, M., Rivaie, M., Mamat, M., & Salleh, Z. (2015). A new nonlinear conjugate gradient coefficient for unconstrained optimization. Applied Mathematical Sciences, 9, 1813-1822.

      [19] Al-Baali, M. (1985). Descent property and global convergence of the Fletcher-Reeves method with inexact line search. IMA Journal of Numerical Analysis, 5(1), 121-124.

      [20] Li, M., & Feng, H. (2011). A sufficient descent LS conjugate gradient method for unconstrained optimization problems. Applied Mathematics and Computation, 218(5), 1577-1586.

      [21] Andrei, N. (2011). 40 conjugate gradient algorithms for unconstrained optimization. Bull. Malay. Math. Sci. Soc., 34, 319-330.

      [22] Wei, Z., Yao, S., & Liu, L. (2006). The convergence properties of some new conjugate gradient methods. Applied Mathematics and Computation, 183(2), 1341-1350.

      [23] Zhang, Y., Zheng, H., & Zhang, C. (2012). Global convergence of a modified PRP conjugate gradient method. Procedia Engineering, 31, 986-995.

      [24] Dai, Z., & Wen, F. (2012). Another improved Wei–Yao–Liu nonlinear conjugate gradient method with sufficient descent property. Applied Mathematics and Computation, 218(14), 7421-7430.

      [25] Dai, Y. H., & Yuan, Y. (2000). Nonlinear conjugate gradient methods. Shanghai Science and Technology Publisher, Shanghai.

      [26] Yuan, Y., & Sun, W. (1999). Theory and methods of optimization. Science Press of China.

      [27] Hager, W. W., & Zhang, H. (2005). A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM Journal on Optimization, 16(1), 170-192.

      [28] Yuan, G., & Wei, Z. (2009). New line search methods for unconstrained optimization. Journal of the Korean Statistical Society, 38(1), 29-39.

      [29] Andrei, N. (2009). Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization. Journal of Computational and Applied Mathematics, 230(2), 570-582.

      [30] Mamat, M., Rivaie, M., Mohd, I., & Fauzi, M. (2010). A new conjugate gradient coefficient for unconstrained optimization. Int. J. Contemp. Math. Sciences, 5(29), 1429-1437.

      [31] Dolan, E. D., & Moré, J. J. (2002). Benchmarking optimization software with performance profiles. Mathematical Programming, 91(2), 201-213.

      [32] Hillstrom, K. E. (1977). A simulation test approach to the evaluation of nonlinear optimization algorithms. ACM Transactions on Mathematical Software, 3(4), 305-315.

      [33] Andrei, N. (2008). An unconstrained optimization test functions collection. Advanced Modeling and Optimization, 10(1), 147-161.

      [34] Abashar, A., Mamat, M., Rivaie, M., Mohd, I., & Omer, O. (2014). The proof of sufficient descent condition for a new type of conjugate gradient methods. AIP Conference Proceedings, 1602(1), 296-303.

  • How to Cite

    Alkhouli, T., Mamat, M., Rivaie, M., Sukono, Afendee Mohamed, M., & Liza Ghazali, P. (2018). An Efficient Modified Hestenes-Stiefel Conjugate Gradient Method with Global Convergence Properties Under Strong Wolfe-Powell Line Search for Large-Scale Unconstrained Optimization. International Journal of Engineering & Technology, 7(3.28), 329-334. https://doi.org/10.14419/ijet.v7i3.28.27379

    Received date: 2019-02-12

    Accepted date: 2019-02-12