A modified conjugate gradient coefficient under exact line search for unconstrained optimization
Published: 2018-08-17 | DOI: https://doi.org/10.14419/ijet.v7i3.28.20973
Keywords: Conjugate gradient method, unconstrained optimization, sufficient descent, global convergence, exact line search
Abstract
The conjugate gradient (CG) method is a well-known solver for large-scale unconstrained optimization problems. In this paper, a modified CG method based on the AMR* and CD methods is presented. The resulting algorithm is proved to be globally convergent under the exact line search, subject to some mild conditions. Numerical performance is compared between the new method and four other CG methods, and the results show that the proposed method is more efficient.
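For illustration, the following is a minimal sketch of a nonlinear CG iteration under an exact line search (approximated numerically here). It uses Fletcher's classical CD coefficient, β = ||g_{k+1}||² / (−d_kᵀ g_k), as a stand-in; the paper's modified AMR*/CD coefficient is not reproduced. The function names and the Rosenbrock test problem are illustrative assumptions, not taken from the paper.

```python
# Sketch of a nonlinear conjugate gradient method with an (approximate)
# exact line search. The CD coefficient of Fletcher is used as a stand-in
# for the paper's modified coefficient, which is not reproduced here.
import numpy as np
from scipy.optimize import minimize_scalar

def cg_exact_line_search(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # "Exact" line search: minimise f(x + alpha * d) over a bounded interval.
        alpha = minimize_scalar(lambda a: f(x + a * d),
                                bounds=(0.0, 10.0), method="bounded").x
        x_new = x + alpha * d
        g_new = grad(x_new)
        # CD coefficient (Fletcher, 1987): beta = ||g_{k+1}||^2 / (-d_k^T g_k).
        beta = g_new.dot(g_new) / (-d.dot(g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Example usage: minimise the Rosenbrock function from a standard start point.
    rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    rosen_grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(cg_exact_line_search(rosen, rosen_grad, [-1.2, 1.0]))
```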
How to Cite
Aini, N., Hajar, N., Rivaie, M., & Mamat, M. (2018). A modified conjugate gradient coefficient under exact line search for unconstrained optimization. International Journal of Engineering & Technology, 7(3.28), 84-88. https://doi.org/10.14419/ijet.v7i3.28.20973
Received date: 2018-10-04
Accepted date: 2018-10-04
Published date: 2018-08-17