Abstract
We introduce a new scaling parameter for the Dai-Kou family of conjugate gradient algorithms (2013), one of the most numerically efficient classes of methods for unconstrained optimization. The proposed parameter is derived from an eigenvalue analysis of the search direction matrix and from minimizing the measure function defined by Dennis and Wolkowicz (1993). The corresponding search direction satisfies the sufficient descent property and the extended conjugacy condition. Global convergence of the proposed algorithm is established for both uniformly convex and general nonlinear objective functions. Numerical experiments on test functions from the CUTEr collection and on a practical robot manipulator motion control problem show that the proposed method is effective.
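To make the setting concrete, the following Python sketch implements a generic member of the Dai-Kou family with a Wolfe line search. It is a minimal illustration only: the scaling tau_k used below is the classical choice s_k^T y_k / ||s_k||^2, not the eigenvalue-based parameter proposed in this paper, and the routine name dai_kou_cg, the tolerances, and the restart safeguards are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import line_search            # strong Wolfe line search
from scipy.optimize import rosen, rosen_der       # demo objective and gradient

def dai_kou_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Sketch of a Dai-Kou family CG method.

    tau below is the classical scaling s'y/||s||^2 (an illustrative
    assumption), NOT the eigenvalue-based parameter of the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                        # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                         # line search failed: restart
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-4
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        if abs(dy) > 1e-12 and s @ y > 1e-12:     # curvature safeguard
            tau = (s @ y) / (s @ s)               # illustrative scaling parameter
            beta = (g_new @ y) / dy \
                - (tau + (y @ y) / (s @ y) - (s @ y) / (s @ s)) * (g_new @ s) / dy
        else:
            beta = 0.0                            # restart with steepest descent
        d = -g_new + beta * d
        if g_new @ d >= 0.0:                      # enforce a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Example usage on the Rosenbrock function:
# x_star = dai_kou_cg(rosen, rosen_der, np.zeros(5))
```

The descent safeguard mirrors the sufficient descent requirement mentioned in the abstract: whenever the computed direction fails to be a descent direction, the iteration falls back to the negative gradient.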
References
A. B. Abubakar, P. Kumam, M. Malik, A. H. Ibrahim: A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems. Math. Comput. Simul. 201 (2022), 640–657.
Z. Aminifard, S. Babaie-Kafaki, S. Ghafoori: An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing. Appl. Numer. Math. 167 (2021), 187–201.
N. Andrei: Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization. Optim. Methods Softw. 32 (2017), 534–551.
N. Andrei: New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method. Calcolo 57 (2020), Article ID 17, 27 pages.
S. Babaie-Kafaki: On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae. J. Optim. Theory Appl. 167 (2015), 91–101.
R. H. Byrd, J. Nocedal: A tool for the analysis of quasi-Newton methods with application to unconstrained minimization. SIAM J. Numer. Anal. 26 (1989), 727–739.
Y.-H. Dai, C.-X. Kou: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23 (2013), 296–320.
M. L. de Leeuw den Bouter, M. B. van Gijzen, R. F. Remis: CG variants for general-form regularization with an application to low-field MRI. Numerical Mathematics and Advanced Applications: ENUMATH 2019. Lecture Notes in Computational Science and Engineering 139. Springer, Cham, 2021, pp. 673–681.
J. E. Dennis, Jr., H. Wolkowicz: Sizing and least-change secant methods. SIAM J. Numer. Anal. 30 (1993), 1291–1314.
E. D. Dolan, J. J. Moré: Benchmarking optimization software with performance profiles. Math. Program. 91 (2002), 201–213.
H. Esmaeili, S. Shabani, M. Kimiaei: A new generalized shrinkage conjugate gradient method for sparse recovery. Calcolo 56 (2019), Article ID 1, 38 pages.
N. I. M. Gould, D. Orban, P. L. Toint: CUTEr and SifDec: A constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29 (2003), 373–394.
W. W. Hager, H. Zhang: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32 (2006), 113–137.
I. E. Kaporin: New convergence results and preconditioning strategies for the conjugate gradient method. Numer. Linear Algebra Appl. 1 (1994), 179–210.
D. Kratzer, S. V. Parter, M. Steuerwalt: Block splittings for the conjugate gradient method. Comput. Fluids 11 (1983), 255–279.
W. Li, Y. Liu, J. Yang, W. Wu: A new conjugate gradient method with smoothing L1/2 regularization based on a modified secant equation for training neural networks. Neural Process. Lett. 48 (2018), 955–978.
S. Nezhadhosein: A modified descent spectral conjugate gradient method for unconstrained optimization. Iran. J. Sci. Technol., Trans. A, Sci. 45 (2021), 209–220.
J. Nocedal, S. J. Wright: Numerical Optimization. Springer Series in Operations Research and Financial Engineering. Springer, New York, 2006.
A. Perry: A class of conjugate gradient algorithms with a two-step variable metric memory. Discussion paper 269 (1977), 16 pages; Available at https://EconPapers.repec.org/RePEc:nwu:cmsems:269.
D. F. Shanno: Conjugate gradient methods with inexact searches. Math. Oper. Res. 3 (1978), 244–256.
M. Sun, J. Liu, Y. Wang: Two improved conjugate gradient methods with application in compressive sensing and motion control. Math. Probl. Eng. 2020 (2020), Article ID 9175496, 11 pages.
D. S. Watkins: Fundamentals of Matrix Computations. Pure and Applied Mathematics. John Wiley & Sons, New York, 2002.
R. Winther: Some superlinear convergence results for the conjugate gradient method. SIAM J. Numer. Anal. 17 (1980), 14–17.
P. Wolfe: Convergence conditions for ascent methods. SIAM Rev. 11 (1969), 226–235.
P. Wolfe: Convergence conditions for ascent methods. II: Some corrections. SIAM Rev. 13 (1971), 185–188.
Cite this article
M. Akbari, S. Nezhadhosein, A. Heydari: Adjustment of the scaling parameter of Dai-Kou type conjugate gradient methods with application to motion control. Appl. Math. (2024). https://doi.org/10.21136/AM.2024.0006-24