Abstract
The subspace minimization conjugate gradient methods based on the Barzilai–Borwein method (SMCG_BB) and on a regularization model (SMCG_PR), proposed by Liu and Liu (J Optim Theory Appl 180(3):879–906, 2019) and Zhao et al. (Numer Algorithms, 2020. https://doi.org/10.1007/s11075-020-01017-1), respectively, are interesting and efficient methods for unconstrained optimization. In this paper, two accelerated subspace minimization conjugate gradient methods are proposed for unconstrained optimization. Motivated by the subspace minimization conjugate gradient methods and the finite termination property of linear conjugate gradient methods, we derive an acceleration parameter from a quadratic interpolation function to improve the stepsize; the modified stepsize may be closer to the stepsize obtained by an exact line search. Moreover, several specific acceleration criteria are designed to enhance the efficiency of the algorithms. Under standard conditions, the global convergence of the proposed methods is guaranteed. Numerical results show that the proposed accelerated methods are superior to the two excellent subspace minimization conjugate gradient methods SMCG_BB and SMCG_PR.
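To make the interpolation idea concrete, the following is a minimal sketch of stepsize refinement by quadratic interpolation along the search direction; the notation (φ, α_k, d_k) is generic and need not match the authors' exact derivation. Writing φ(α) = f(x_k + α d_k), the quadratic q interpolating φ(0), φ'(0), and φ(α_k), together with its minimizer, is

\[
q(\alpha)=\varphi(0)+\varphi'(0)\,\alpha
+\frac{\varphi(\alpha_k)-\varphi(0)-\varphi'(0)\,\alpha_k}{\alpha_k^{2}}\,\alpha^{2},
\qquad
\tilde{\alpha}_k=\frac{-\varphi'(0)\,\alpha_k^{2}}
{2\bigl(\varphi(\alpha_k)-\varphi(0)-\varphi'(0)\,\alpha_k\bigr)},
\]

provided the curvature term φ(α_k) − φ(0) − φ'(0)α_k is positive. When f is a strictly convex quadratic, q coincides with φ and \(\tilde{\alpha}_k\) is the exact one-dimensional minimizer, which is the sense in which such a modified stepsize approaches that of an exact line search.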
References
Andrei, N.: A new three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 68(2), 305–321 (2015)
Andrei, N.: An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 65, 859–874 (2014)
Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10(1), 147–161 (2008)
Andrei, N.: Nonlinear Conjugate Gradient Methods for Unconstrained Optimization. Springer, Berlin (2020)
Andrei, N.: Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization. Bull. Malays. Math. Sci. Soc. 34(2), 319–330 (2011)
Andrei, N.: Accelerated conjugate gradient algorithm with modified secant condition for unconstrained optimization. Stud. Inform. Control 18(3), 211–232 (2009)
Andrei, N.: Acceleration of conjugate gradient algorithms for unconstrained optimization. Appl. Math. Comput. 213(2), 361–369 (2009)
Andrei, N.: An acceleration of gradient descent algorithm with backtracking for unconstrained optimization. Numer. Algorithms 42(1), 63–73 (2006)
Andrei, N.: Diagonal approximation of the Hessian by finite differences for unconstrained optimization. J. Optim. Theory Appl. 185(3), 859–879 (2020)
Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)
Dai, Y.H., Kou, C.X.: A Barzilai–Borwein conjugate gradient method. Sci. China Math. 59(8), 1511–1524 (2016)
Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)
Dai, Y.H., Yuan, J.Y., Yuan, Y.X.: Modified two-point stepsize gradient methods for unconstrained optimization problems. Comput. Optim. Appl. 22(1), 103–109 (2002)
Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)
Dai, Y.H.: Nonlinear conjugate gradient methods. Wiley Encycl. Oper. Res. Manag. Sci. (2011). https://doi.org/10.1002/9780470400531.eorms0183
Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)
Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)
Hager, W.W., Zhang, H.: Algorithm 851: CG\_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)
Li, M., Liu, H.W., Liu, Z.X.: A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization. Numer. Algorithms 79, 195–219 (2018)
Li, Y.F., Liu, Z.X., Liu, H.W.: A subspace minimization conjugate gradient method based on conic model for unconstrained optimization. Comput. Appl. Math. 38(1), 16 (2019)
Liu, H.W., Liu, Z.X.: An efficient Barzilai–Borwein conjugate gradient method for unconstrained optimization. J. Optim. Theory Appl. 180(3), 879–906 (2019)
Liu, Z.X., Liu, H.W.: An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer. Algorithms 78(1), 21–39 (2018)
Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (1999)
Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Rech. Opérationnelle. 3(16), 35–43 (1969)
Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9(4), 94–112 (1969)
Sun, W.Y.: On nonquadratic model optimization methods. Asia Pac. J. Oper. Res. 13, 43–63 (1996)
Wang, T., Liu, Z.X., Liu, H.W.: A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization. Int. J. Comput. Math. 96(10), 1924–1942 (2019)
Yang, Y.T., Chen, Y.T., Lu, Y.L.: A subspace conjugate gradient algorithm for large-scale unconstrained optimization. Numer. Algorithms 76, 813–828 (2017)
Yuan, Y.X., Stoer, J.: A subspace study on conjugate gradient algorithms. Z. Angew. Math. Mech. 75(1), 69–77 (1995)
Yuan, Y.X., Sun, W.Y.: Optimization Theory and Methods. Science Press, Beijing (1997)
Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11(3), 325–332 (1991)
Yuan, Y.X.: A review on subspace methods for nonlinear optimization. In: Proceedings of the International Congress of Mathematicians, Seoul, Korea, pp. 807–827 (2014)
Zhao, T., Liu, H.W., Liu, Z.X.: New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization. Numer. Algorithms (2020). https://doi.org/10.1007/s11075-020-01017-1
Acknowledgements
The authors would like to thank the editor and the anonymous referees for their valuable suggestions and comments, which have greatly improved the presentation of this paper. This research was supported by the National Natural Science Foundation of China (No. 11901561) and the Guangxi Natural Science Foundation (No. 2018GXNSFBA281180).
Additional information
Communicated by Nobuo Yamashita.
Cite this article
Sun, W., Liu, H. & Liu, Z. A Class of Accelerated Subspace Minimization Conjugate Gradient Methods. J Optim Theory Appl 190, 811–840 (2021). https://doi.org/10.1007/s10957-021-01897-w
Keywords
- Conjugate gradient method
- Regularization model
- Barzilai–Borwein method
- Subspace minimization
- Acceleration method