GECCO '24 Conference Proceedings · research article
DOI: 10.1145/3638529.3654182

CMA-ES with Adaptive Reevaluation for Multiplicative Noise

Published: 14 July 2024

Abstract

The covariance matrix adaptation evolution strategy (CMA-ES) is a powerful method for continuous black-box optimization. Several noise-handling methods have been proposed to preserve the optimization performance of the CMA-ES on noisy objective functions. Adaptation of the population size and adaptation of the learning rate are two major approaches that perform well under additive Gaussian noise. Reevaluation, which evaluates each solution multiple times, is another such technique. In this paper, we discuss the differences between these methods from the perspective of stochastic relaxation, which considers the maximization of the expected utility function. We derive that the set of maximizers of the noise-independent utility, which underlies the reevaluation technique, always contains the optimal solution, whereas the noise-dependent utility, which underlies the population size and learning rate adaptations, does not satisfy this property under multiplicative noise. Based on this discussion, we develop the reevaluation adaptation CMA-ES (RA-CMA-ES), which computes two update directions, each from half of the evaluations, and adapts the number of reevaluations based on the estimated correlation between those two directions. Numerical simulations show that the RA-CMA-ES outperforms the comparative method under multiplicative noise while maintaining competitive performance under additive noise.
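The core idea described above, splitting the reevaluations of each solution into two halves, computing one update direction per half, and spending more reevaluations when the two directions disagree, can be sketched in a few lines. The sketch below is an illustration under simplifying assumptions, not the paper's RA-CMA-ES: it uses a plain weighted-recombination mean update instead of the full CMA-ES, a multiplicative-noise sphere as the objective, and a hypothetical correlation threshold and adaptation rule chosen for readability.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_noisy(x):
    """Sphere objective perturbed by multiplicative Gaussian noise (illustrative)."""
    return np.sum(x**2) * (1.0 + 0.5 * rng.standard_normal())

def update_direction(m, X, fitness):
    """Weighted-recombination direction from ranked samples (simplified ES step)."""
    lam = len(fitness)
    mu = lam // 2
    order = np.argsort(fitness)                      # minimization: best first
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()
    return (w[:, None] * (X[order[:mu]] - m)).sum(axis=0)

dim, lam, sigma = 5, 16, 0.3
m = np.ones(dim)                                     # initial mean
k = 2                                                # reevaluations per solution (kept even)
for gen in range(200):
    X = m + sigma * rng.standard_normal((lam, dim))
    # Split the k reevaluations into two halves; average within each half.
    f1 = np.array([np.mean([f_noisy(x) for _ in range(k // 2)]) for x in X])
    f2 = np.array([np.mean([f_noisy(x) for _ in range(k // 2)]) for x in X])
    d1 = update_direction(m, X, f1)
    d2 = update_direction(m, X, f2)
    corr = d1 @ d2 / (np.linalg.norm(d1) * np.linalg.norm(d2) + 1e-12)
    # Low agreement between the half-sample directions suggests noise dominates
    # the ranking, so spend more reevaluations; high agreement, spend fewer.
    k = min(64, k + 2) if corr < 0.5 else max(2, k - 2)
    m = m + 0.5 * (d1 + d2)                          # combine both half-sample directions
```

The correlation estimate plays the role of a noise detector: under multiplicative noise, its magnitude shrinks as the signal-to-noise ratio of the ranking degrades, which is what makes it a usable control signal for the reevaluation budget.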

Supplemental Material: PDF file (supplementary material)



Published In

GECCO '24: Proceedings of the Genetic and Evolutionary Computation Conference, July 2024, 1657 pages. ISBN 9798400704949. DOI: 10.1145/3638529.
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States



Conference

GECCO '24: Genetic and Evolutionary Computation Conference, July 14-18, 2024, Melbourne, VIC, Australia

Acceptance Rates

Overall Acceptance Rate 1,669 of 4,410 submissions, 38%


Article Metrics

  • Total citations: 0
  • Total downloads: 33
  • Downloads (last 12 months): 33
  • Downloads (last 6 weeks): 8

Reflects downloads up to 12 Dec 2024.
