GLOBAL CONVERGENCE ANALYSIS OF A MODIFIED CONJUGATE GRADIENT METHOD FOR UNCONSTRAINED OPTIMIZATION PROBLEMS
DOI: https://doi.org/10.33003/fjs-2024-0806-2942

Keywords: Optimization, Coefficient, Algorithm, Descent, Convergence

Abstract
In this paper, we consider the global convergence analysis of a modified conjugate gradient method for solving unconstrained optimization problems. The proposed method incorporates an adaptive step size selection scheme. We analyze its global convergence properties theoretically, demonstrating that it satisfies the sufficient descent condition and converges globally under suitable assumptions, and we provide numerical experiments to illustrate its effectiveness and efficiency. We also compare the numerical performance of the proposed method against three existing methods, namely FR, HS and PR, using MATLAB simulations. The proposed method was found to outperform FR and HS and to be competitive with PR with respect to computation time, number of iterations and number of function evaluations.
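For context, the sketch below shows the generic nonlinear conjugate gradient framework that the compared methods share, written in Python. It is only an illustration: the FR, PR and HS coefficient formulas are the standard ones from the literature, while the paper's modified coefficient and adaptive step size scheme are not reproduced here; a simple Armijo backtracking rule and a quadratic test function stand in as placeholders, and the function names (cg_minimize, beta_fr, beta_pr, beta_hs) are our own.

import numpy as np

def beta_fr(g_new, g_old, d):   # Fletcher-Reeves coefficient
    return (g_new @ g_new) / (g_old @ g_old)

def beta_pr(g_new, g_old, d):   # Polak-Ribiere coefficient
    return g_new @ (g_new - g_old) / (g_old @ g_old)

def beta_hs(g_new, g_old, d):   # Hestenes-Stiefel coefficient
    y = g_new - g_old
    return (g_new @ y) / (d @ y)

def cg_minimize(f, grad, x0, beta_rule, tol=1e-6, max_iter=500):
    """Generic nonlinear CG iteration; only beta_rule and the step size differ
    between the methods compared in the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                          # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking line search (placeholder for the paper's adaptive rule)
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(beta_rule(g_new, g, d), 0.0)     # nonnegativity restart safeguard
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a simple quadratic test problem
f = lambda x: 0.5 * (x @ x)
grad = lambda x: x
print(cg_minimize(f, grad, np.array([3.0, -4.0]), beta_pr))

In all of the compared methods the iterative framework above is the same; what distinguishes FR, HS, PR and the proposed method is the choice of the conjugate gradient coefficient and, in the proposed method, the adaptive step size rule.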
References
Andrei, N. (2004). Test functions for unconstrained optimization. Research Institute for Informatics, Centre for Advanced Modeling and Optimization, 1-15.
Dai, Y. H., & Yuan, Y. (1999); A Nonlinear Conjugate Gradient Method with a strong Global Convergence Property. SIAM journal on optimization, 10(1), 177482, http;//dx.doi.org
Dolan, E. D. & More, J. J. (2002). Benchmarking optimization software with performance profiles. Mathematical programming. 91(2), 201-213.
Fletcher, R. & Reeves, C. M. (1964); Function Minimization by Conjugate Gradients, 2(149-154), http://dx.doi.org/10.1098/comin/7.2.149
Ibrahim, A. & Rohanin, A. (2015). Convergence analysis of a new conjugate gradient method for unconstrained optimization. Applied Mathematical Sciences. 140(9), 6969-6984.
Ibrahim, A. & Rohanin, A. (2016). Global convergence analysis of a new hybrid conjugate gradient method for unconstrained optimization problems. Malaysian Journal of Fundamental and Applied Sciences, 13(2), 40-48.
Justin, O. O. & Ohoriemu, B. O. (2024). A hybrid approach to solving complex optimization problems using evolutionary algorithms and mathematical modeling. FUDMA Journal of Sciences, 8(3), 443-449.
Liu, J. & Du, X. (2012); Global Convergence of a Modified Ls Method, mathematical problems in Engineering, 910303/10.1156/910303
Lu, Y., Li, W., Zhang, C. & Yang, Y. (2015). A class of new hybrid conjugate gradient methods for unconstrained optimization. Journal of Information and Computational Science, 12(5), 1941-1949.
Powell, M. J. D. (1984). Nonconvex minimization calculations and the conjugate gradient method. In Numerical Analysis, Lecture Notes in Mathematics, Vol. 1066 (pp. 122-141). Springer.
Rivaie, M., Fauzi, M. & Mamat, M. (2012). New modifications of conjugate gradient coefficient with global convergence properties. 2012 IEEE Symposium on Humanities, Science and Engineering Research (SHUSER), 625-629.
Wei, Z., Li, G. & Qi, L. (2006a). New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems. Applied Mathematics and Computation, 179(2), 407-430.
Wei, Z., Yao, S. & Liu, L. (2006b). The convergence properties of some new conjugate gradient methods. Applied Mathematics and Computation, 183(2), 226-235.
Yuan, G., Meng, Z. & Li, Y. (2016). A modified Hestenes and Stiefel conjugate gradient algorithm for large scale nonsmooth minimizations and nonlinear equations. Journal of Optimization Theory and Applications, 168(1), 129-152.
Zoutendijk, G. (1970). Nonlinear programming, computational methods. In Integer and Nonlinear Programming (pp. 37-86). North-Holland.