
Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization

Computational Optimization and Applications


Abstract

Augmented Lagrangian methods for large-scale optimization usually require efficient algorithms for minimization with box constraints. On the other hand, active-set box-constraint methods employ unconstrained optimization algorithms for minimization inside the faces of the box. Several approaches may be employed to compute internal search directions in the large-scale case. In this paper a minimal-memory quasi-Newton approach with secant preconditioners is proposed, taking into account the structure of Augmented Lagrangians that come from the popular Powell–Hestenes–Rockafellar scheme. A combined algorithm that uses either the quasi-Newton formula or a truncated-Newton procedure, depending on the presence of active constraints in the penalty-Lagrangian function, is also suggested. Numerical experiments using the CUTE collection are presented.
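For context, the Powell–Hestenes–Rockafellar (PHR) scheme named in the abstract attaches shifted quadratic penalties to the constraints. For a problem of the form minimize f(x) subject to h(x) = 0 and g(x) ≤ 0, with multipliers λ and μ ≥ 0 and penalty parameter ρ > 0, the PHR augmented Lagrangian is

    L_\rho(x, \lambda, \mu) = f(x) + \frac{\rho}{2} \left( \left\| h(x) + \frac{\lambda}{\rho} \right\|^2 + \left\| \max\left\{ 0,\, g(x) + \frac{\mu}{\rho} \right\} \right\|^2 \right),

so each outer iteration reduces to a (box-constrained) minimization of L_ρ with respect to x, which is where the internal search directions studied in the paper are needed.

The fragment below is a minimal Python sketch of this setup, under stated assumptions: the helper names (phr_augmented_lagrangian, penalty_active, inner_direction_strategy) and the switching test are illustrative only, not the paper's actual implementation or criterion.

    import numpy as np

    def phr_augmented_lagrangian(f, h, g, x, lam, mu, rho):
        # Standard PHR augmented Lagrangian for
        #   min f(x)  s.t.  h(x) = 0,  g(x) <= 0.
        hx = np.atleast_1d(h(x))
        gx = np.atleast_1d(g(x))
        shifted_eq = hx + lam / rho
        shifted_ineq = np.maximum(0.0, gx + mu / rho)
        return f(x) + 0.5 * rho * (shifted_eq @ shifted_eq
                                   + shifted_ineq @ shifted_ineq)

    def penalty_active(g, x, mu, rho):
        # Inequality constraints whose shifted value is positive contribute
        # to the penalty; across that boundary the max{0, .}^2 term has
        # first but not second derivatives, which is what motivates
        # structure-aware inner directions.
        gx = np.atleast_1d(g(x))
        return np.nonzero(gx + mu / rho > 0.0)[0]

    def inner_direction_strategy(g, x, mu, rho):
        # Hypothetical dispatch mirroring the combined algorithm of the
        # abstract: with no constraints active in the penalty, the function
        # is locally twice differentiable and a truncated-Newton step is
        # natural; otherwise fall back to the minimal-memory structured
        # quasi-Newton direction with a secant preconditioner.
        if penalty_active(g, x, mu, rho).size == 0:
            return "truncated-newton"
        return "structured-quasi-newton"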


References

1. Andreani, R., Birgin, E.G., Martínez, J.M., Schuverdt, M.L.: Augmented Lagrangian methods under the Constant Positive Linear Dependence constraint qualification. Math. Program. (2007, in press). DOI: 10.1007/s10107-006-0077-1

2. Andreani, R., Birgin, E.G., Martínez, J.M., Schuverdt, M.L.: On Augmented Lagrangian methods with general lower-level constraints. SIAM J. Optim. (2007, in press)

3. Andreani, R., Martínez, J.M., Schuverdt, M.L.: On the relation between the constant positive linear dependence condition and quasinormality constraint qualification. J. Optim. Theory Appl. 125, 473–485 (2005)

4. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)

5. Birgin, E.G., Castillo, R., Martínez, J.M.: Numerical comparison of Augmented Lagrangian algorithms for nonconvex problems. Comput. Optim. Appl. 31, 31–56 (2005)

6. Birgin, E.G., Martínez, J.M.: Large-scale active-set box-constrained optimization method with spectral projected gradients. Comput. Optim. Appl. 23, 101–125 (2002)

7. Birgin, E.G., Martínez, J.M., Raydan, M.: Nonmonotone spectral projected gradient methods on convex sets. SIAM J. Optim. 10, 1196–1211 (2000)

8. Birgin, E.G., Martínez, J.M., Raydan, M.: Algorithm 813: SPG – software for convex-constrained optimization. ACM Trans. Math. Softw. 27, 340–349 (2001)

9. Birgin, E.G., Martínez, J.M., Raydan, M.: Inexact spectral projected gradient methods on convex sets. IMA J. Numer. Anal. 23, 539–559 (2003)

10. Bongartz, I., Conn, A.R., Gould, N.I.M., Toint, P.L.: CUTE: constrained and unconstrained testing environment. ACM Trans. Math. Softw. 21, 123–160 (1995)

11. Burdakov, O., Martínez, J.M., Pilotta, E.A.: A limited memory multipoint secant method for bound constrained optimization. Ann. Oper. Res. 117, 51–70 (2002)

12. Byrd, R.H., Lu, P.H., Nocedal, J., Zhu, C.Y.: A limited memory algorithm for bound constrained optimization. SIAM J. Sci. Comput. 16, 1190–1208 (1995)

13. Conn, A.R., Gould, N.I.M., Toint, P.L.: A globally convergent Augmented Lagrangian algorithm for optimization with general constraints and simple bounds. SIAM J. Numer. Anal. 28, 545–572 (1991)

14. Conn, A.R., Gould, N.I.M., Toint, P.L.: LANCELOT: A Fortran Package for Large-Scale Nonlinear Optimization. Springer, Berlin (1992)

15. Dai, Y.H., Fletcher, R.: On the asymptotic behaviour of some new gradient methods. Math. Program. 103, 541–549 (2005)

16. Dai, Y.H., Liao, L.Z.: R-linear convergence of the Barzilai and Borwein gradient method. IMA J. Numer. Anal. 22, 1–10 (2002)

17. Dennis, J.E., Schnabel, R.B.: Least change secant updates for quasi-Newton methods. SIAM Rev. 21, 443–459 (1979)

18. Diniz-Ehrhardt, M.A., Gomes-Ruggiero, M.A., Martínez, J.M., Santos, S.A.: Augmented Lagrangian algorithms based on the spectral projected gradient for solving nonlinear programming problems. J. Optim. Theory Appl. 123, 497–517 (2004)

19. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)

20. Dostál, Z.: Inexact semimonotonic Augmented Lagrangians with optimal feasibility convergence for convex bound and equality constrained quadratic programming. SIAM J. Numer. Anal. 43, 96–115 (2005)

21. Dostál, Z., Friedlander, A., Santos, S.A.: Augmented Lagrangian with adaptive precision control for quadratic programming with simple bounds and equality constraints. SIAM J. Optim. 13, 1120–1140 (2003)

22. Dostál, Z., Gomes, F.A.M., Santos, S.A.: Duality based domain decomposition with natural coarse space for variational inequalities. J. Comput. Appl. Math. 126, 397–415 (2000)

23. Fletcher, R.: Practical Methods of Optimization. Academic, London (1987)

24. Fletcher, R.: On the Barzilai–Borwein method. Technical Report NA/207, Department of Mathematics, University of Dundee, Dundee, Scotland (2001)

25. Gill, P.E., Murray, W., Saunders, M.A.: SNOPT: an SQP algorithm for large-scale constrained optimization. SIAM Rev. 47, 99–131 (2005)

26. Golub, G.H., Van Loan, C.F.: Matrix Computations, 3rd edn. Johns Hopkins University Press, Baltimore/London (1996)

27. Groceri, G.M., Sottosanto, G.N., Maciel, M.C.: Augmented Penalization algorithms based on BFGS secant approximations and trust region. Technical report, Departamento de Matemáticas, Universidad Nacional del Sur, Bahía Blanca, Argentina (2005)

28. Gurwitz, C.: A test for cancellation errors in quasi-Newton methods. ACM Trans. Math. Softw. 18, 133–140 (1992)

29. Hager, W.W.: Analysis and implementation of a dual algorithm for constrained optimization. J. Optim. Theory Appl. 79, 37–71 (1993)

30. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)

31. Hager, W.W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32, 113–137 (2006)

32. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2, 35–58 (2006)

33. Hager, W.W., Zhang, H.: A new active set algorithm for box constrained optimization. SIAM J. Optim. 17, 526–557 (2006)

34. Hestenes, M.R.: Multiplier and gradient methods. J. Optim. Theory Appl. 4, 303–320 (1969)

35. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)

36. Krejić, N., Martínez, J.M., Mello, M.P., Pilotta, E.A.: Validation of an Augmented Lagrangian algorithm with a Gauss–Newton Hessian approximation using a set of hard-spheres problems. Comput. Optim. Appl. 16, 247–263 (2000)

37. Martínez, J.M., Qi, L.: Inexact Newton methods for solving nonsmooth equations. J. Comput. Appl. Math. 60, 127–145 (1995)

38. Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (1999)

39. Oren, S.S.: Self-scaling variable metric algorithms without line search for unconstrained minimization. Math. Comput. 27, 873–885 (1973)

40. Powell, M.J.D.: A method for nonlinear constraints in minimization problems. In: Fletcher, R. (ed.) Optimization, pp. 283–298. Academic, New York (1969)

41. Qi, L., Sun, J.: A nonsmooth version of Newton's method. Math. Program. 58, 353–367 (1993)

42. Qi, L., Wei, Z.: On the constant positive linear dependence condition and its application to SQP methods. SIAM J. Optim. 10, 963–981 (2000)

43. Raydan, M.: On the Barzilai and Borwein choice of steplength for the gradient method. IMA J. Numer. Anal. 13, 321–326 (1993)

44. Raydan, M.: The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 7, 26–33 (1997)

45. Rockafellar, R.T.: The multiplier method of Hestenes and Powell applied to convex programming. J. Optim. Theory Appl. 12, 555–562 (1973)

46. Rockafellar, R.T.: Augmented Lagrange multiplier functions and duality in nonconvex programming. SIAM J. Control 12, 268–285 (1974)

47. Zhu, C.Y., Byrd, R.H., Lu, P.H., Nocedal, J.: Algorithm 778: L-BFGS-B: Fortran subroutines for large-scale bound-constrained optimization. ACM Trans. Math. Softw. 23, 550–560 (1997)


Author information

Correspondence to E. G. Birgin.


Cite this article

Birgin, E.G., Martínez, J.M. Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization. Comput. Optim. Appl. 39, 1–16 (2008). https://doi.org/10.1007/s10589-007-9050-z
