The Sufficient Descent Condition of Nonlinear Conjugate Gradient Method

  • Authors

    • Srimazzura Basri
    • Mustafa Mamat
    • Puspa Liza Ghazali
    2018-11-30
    https://doi.org/10.14419/ijet.v7i4.30.22367
  • Keywords

    conjugate gradient, descent condition, exact line search, inexact line search, optimization.
  • Abstract

    Non-linear conjugate gradient methods have been widely used in solving large-scale optimization problems. These methods have been shown to require very little memory in addition to being numerically efficient. Thus, many studies have been conducted to improve these methods and to find the most efficient one. In this paper, we propose a new non-linear conjugate gradient coefficient that guarantees the sufficient descent condition. Numerical tests indicate that the proposed coefficient performs better than three classical conjugate gradient coefficients.
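
    To make the setting concrete, the sketch below outlines a generic non-linear conjugate gradient iteration with an inexact (backtracking) line search and an explicit check of the sufficient descent condition g_k^T d_k <= -c ||g_k||^2. This is only an illustration of the general framework, not the coefficient proposed in the paper: the function and parameter names are hypothetical, and the Fletcher-Reeves formula is used merely as one classical choice of the coefficient beta_k.

    ```python
    import numpy as np

    def nonlinear_cg(f, grad, x0, max_iter=2000, tol=1e-6, c=1e-4, rho=0.5):
        """Minimal nonlinear CG sketch (hypothetical names/parameters):
        backtracking (inexact) line search plus a sufficient descent check."""
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                          # first direction: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # Sufficient descent condition: g_k^T d_k <= -c * ||g_k||^2;
            # restart with the steepest descent direction if it fails.
            if g @ d > -c * (g @ g):
                d = -g
            # Backtracking (Armijo) line search as one simple inexact rule.
            alpha = 1.0
            while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
                alpha *= rho
            x_new = x + alpha * d
            g_new = grad(x_new)
            # Fletcher-Reeves coefficient, one classical choice of beta_k.
            beta = (g_new @ g_new) / (g @ g)
            d = -g_new + beta * d
            x, g = x_new, g_new
        return x

    # Usage example: minimize the Rosenbrock function from a standard start point.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(nonlinear_cg(f, grad, np.array([-1.2, 1.0])))
    ```

    Swapping in a different formula for beta_k (or a different line search) changes which method the loop implements; the sufficient descent check and restart are what keep every search direction a genuine descent direction.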

  • How to Cite

    Basri, S., Mamat, M., & Ghazali, P. L. (2018). The Sufficient Descent Condition of Nonlinear Conjugate Gradient Method. International Journal of Engineering & Technology, 7(4.30), 458-461. https://doi.org/10.14419/ijet.v7i4.30.22367

    Received date: 2018-11-29

    Accepted date: 2018-11-29

    Published date: 2018-11-30