Adjustment of the scaling parameter of Dai-Kou type conjugate gradient methods with application to motion control

Applications of Mathematics

Abstract

We introduce a new scaling parameter for the Dai-Kou family of conjugate gradient algorithms (2013), one of the most numerically efficient classes of methods for unconstrained optimization. The suggested parameter is based on an eigenvalue analysis of the search direction matrix and on minimizing the measure function defined by Dennis and Wolkowicz (1993). The corresponding search direction satisfies the sufficient descent property and the extended conjugacy condition. Global convergence of the proposed algorithm is established for both uniformly convex and general nonlinear objective functions. Numerical experiments on a set of test functions from the CUTEr collection and on the practical problem of controlling the motion of a robot manipulator show that the proposed method is effective.
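
To make the construction concrete, the sketch below implements a Dai-Kou-type conjugate gradient iteration in Python. It is an illustration only, not the authors' algorithm: the direction uses the Dai-Kou family parameter beta_k(tau_k) = g_{k+1}^T y_k / (d_k^T y_k) - (tau_k + ||y_k||^2 / (s_k^T y_k) - s_k^T y_k / ||s_k||^2) * g_{k+1}^T s_k / (d_k^T y_k), the scaling parameter is filled in with the common choice tau_k = s_k^T y_k / ||s_k||^2 as a placeholder for the adjusted tau_k derived in the paper, and a plain weak Wolfe bisection stands in for the improved Wolfe line search of Dai and Kou. All function names are ours.

    import numpy as np

    def dai_kou_beta(g_new, d, s, y, tau):
        # Dai-Kou family parameter beta_k(tau_k) from the self-scaling
        # memoryless BFGS framework (Dai and Kou, 2013).
        dy = d @ y
        return (g_new @ y) / dy - (tau + (y @ y) / (s @ y) - (s @ y) / (s @ s)) * (g_new @ s) / dy

    def weak_wolfe(f, grad, x, d, g, c1=1e-4, c2=0.1):
        # Plain bisection search for the weak Wolfe conditions; a stand-in
        # for the improved Wolfe line search used in the paper.
        lo, hi, alpha = 0.0, np.inf, 1.0
        fx, slope = f(x), g @ d
        for _ in range(60):
            if f(x + alpha * d) > fx + c1 * alpha * slope:  # Armijo condition fails
                hi = alpha
            elif grad(x + alpha * d) @ d < c2 * slope:      # curvature condition fails
                lo = alpha
            else:
                return alpha
            alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * alpha
        return alpha

    def scaled_dai_kou_cg(f, grad, x0, tol=1e-8, max_iter=5000):
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g, np.inf) < tol:
                break
            alpha = weak_wolfe(f, grad, x, d, g)
            x_new = x + alpha * d
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            # Placeholder scaling parameter; NOT the adjusted tau_k of the paper.
            tau = (s @ y) / (s @ s)
            d = -g_new + dai_kou_beta(g_new, d, s, y, tau) * d
            if g_new @ d >= 0:  # safeguard: restart along steepest descent
                d = -g_new
            x, g = x_new, g_new
        return x

    # Example: minimize an ill-conditioned convex quadratic.
    A = np.diag([1.0, 10.0, 100.0])
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    print(scaled_dai_kou_cg(f, grad, np.ones(3)))  # approaches the origin

For reference, the measure function of Dennis and Wolkowicz mentioned above is, to our understanding, omega(A) = (tr(A)/n) / det(A)^{1/n} for a symmetric positive definite n x n matrix A; by the AM-GM inequality it is minimized exactly when all eigenvalues of A coincide, so a tau_k that decreases omega clusters the eigenvalues of the search direction matrix.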

References

  1. A. B. Abubakar, P. Kumam, M. Malik, A. H. Ibrahim: A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems. Math. Comput. Simul. 201 (2022), 640–657.

  2. Z. Aminifard, S. Babaie-Kafaki, S. Ghafoori: An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing. Appl. Numer. Math. 167 (2021), 187–201.

  3. N. Andrei: Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization. Optim. Methods Softw. 32 (2017), 534–551.

  4. N. Andrei: New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method. Calcolo 57 (2020), Article ID 17, 27 pages.

  5. S. Babaie-Kafaki: On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae. J. Optim. Theory Appl. 167 (2015), 91–101.

  6. R. H. Byrd, J. Nocedal: A tool for the analysis of quasi-Newton methods with application to unconstrained minimization. SIAM J. Numer. Anal. 26 (1989), 727–739.

  7. Y.-H. Dai, C.-X. Kou: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23 (2013), 296–320.

  8. M. L. de Leeuw den Bouter, M. B. van Gijzen, R. F. Remis: CG variants for general-form regularization with an application to low-field MRI. Numerical Mathematics and Advanced Applications: ENUMATH 2019. Lecture Notes in Computational Science and Engineering 139. Springer, Cham, 2019, pp. 673–681.

  9. J. E. Dennis, Jr., H. Wolkowicz: Sizing and least-change secant methods. SIAM J. Numer. Anal. 30 (1993), 1291–1314.

  10. E. D. Dolan, J. J. Moré: Benchmarking optimization software with performance profiles. Math. Program. 91 (2002), 201–213.

  11. H. Esmaeili, S. Shabani, M. Kimiaei: A new generalized shrinkage conjugate gradient method for sparse recovery. Calcolo 56 (2019), Article ID 1, 38 pages.

  12. N. I. M. Gould, D. Orban, P. L. Toint: CUTEr and SifDec: A constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29 (2003), 373–394.

  13. W. W. Hager, H. Zhang: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32 (2006), 113–137.

  14. I. E. Kaporin: New convergence results and preconditioning strategies for the conjugate gradient method. Numer. Linear Algebra Appl. 1 (1994), 179–210.

  15. D. Kratzer, S. V. Parter, M. Steuerwalt: Block splittings for the conjugate gradient method. Comput. Fluids 11 (1983), 255–279.

  16. W. Li, Y. Liu, J. Yang, W. Wu: A new conjugate gradient method with smoothing L1/2 regularization based on a modified secant equation for training neural networks. Neural Process. Lett. 48 (2018), 955–978.

  17. S. Nezhadhosein: A modified descent spectral conjugate gradient method for unconstrained optimization. Iran. J. Sci. Technol., Trans. A, Sci. 45 (2021), 209–220.

  18. J. Nocedal, S. J. Wright: Numerical Optimization. Springer Series in Operations Research and Financial Engineering. Springer, New York, 2006.

  19. A. Perry: A class of conjugate gradient algorithms with a two-step variable metric memory. Discussion paper 269 (1977), 16 pages; Available at https://EconPapers.repec.org/RePEc:nwu:cmsems:269.

  20. D. F. Shanno: Conjugate gradient methods with inexact searches. Math. Oper. Res. 3 (1978), 244–256.

  21. M. Sun, J. Liu, Y. Wang: Two improved conjugate gradient methods with application in compressive sensing and motion control. Math. Probl. Eng. 2020 (2020), Article ID 9175496, 11 pages.

  22. D. S. Watkins: Fundamentals of Matrix Computations. Pure and Applied Mathematics. John Wiley & Sons, New York, 2002.

  23. R. Winther: Some superlinear convergence results for the conjugate gradient method. SIAM J. Numer. Anal. 17 (1980), 14–17.

  24. P. Wolfe: Convergence conditions for ascent methods. SIAM Rev. 11 (1969), 226–235.

  25. P. Wolfe: Convergence conditions for ascent methods. II: Some corrections. SIAM Rev. 13 (1971), 185–188.

Author information

Corresponding author

Correspondence to Saeed Nezhadhosein.

About this article

Cite this article

Akbari, M., Nezhadhosein, S. & Heydari, A. Adjustment of the scaling parameter of Dai-Kou type conjugate gradient methods with application to motion control. Appl Math (2024). https://doi.org/10.21136/AM.2024.0006-24
