DOI: 10.1145/3377930.3389831

Integrated vs. sequential approaches for selecting and tuning CMA-ES variants

Published: 26 June 2020

Abstract

When faced with a specific optimization problem, deciding which algorithm to apply is always a difficult task. Not only is there a vast variety of algorithms to select from, but these algorithms are often controlled by many hyperparameters, which need to be suitably tuned in order to achieve peak performance. Usually, the problem of selecting and configuring the optimization algorithm is addressed sequentially, by first selecting a suitable algorithm and then tuning it for the application at hand. Integrated approaches, commonly known as Combined Algorithm Selection and Hyperparameter optimization (CASH) solvers, have shown promise in several applications.
In this work we compare sequential and integrated approaches for selecting and tuning the best out of the 4,608 variants of the modular Covariance Matrix Adaptation Evolution Strategy (CMA-ES). We show that the ranking of these variants depends to a large extent on the quality of the hyperparameters. Sequential approaches are therefore likely to recommend sub-optimal choices. Integrated approaches, in contrast, manage to provide competitive results at much smaller computational cost. We also highlight important differences in the search behavior of two CASH approaches, which build on racing (irace) and on model-based optimization (MIP-EGO), respectively.
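
To make the structural difference concrete, the sketch below contrasts the two approaches in Python. It is purely illustrative: the module switches, hyperparameter ranges, and the evaluate function are hypothetical stand-ins for running a modular CMA-ES variant on a benchmark problem, and plain random search takes the place of dedicated configurators such as irace or MIP-EGO. The point is only to show that a sequential pipeline fixes the variant before tuning, whereas an integrated CASH-style search explores the joint (variant, hyperparameter) space.

```python
# Illustrative contrast between a sequential (select-then-tune) pipeline and an
# integrated, CASH-style search over the joint (variant, hyperparameter) space.
# All names and numbers below are hypothetical; random search stands in for
# tuners such as irace or MIP-EGO, and `evaluate` stands in for benchmarking a
# modular CMA-ES variant.
import random

# Hypothetical module switches; each combination defines one "variant".
MODULE_OPTIONS = {
    "active": [False, True],
    "elitist": [False, True],
    "mirrored": ["off", "mirrored", "mirrored-pairwise"],
}

# Hypothetical continuous hyperparameters with their search ranges.
HYPERPARAM_RANGES = {
    "c_1": (0.0, 1.0),
    "c_mu": (0.0, 1.0),
    "c_sigma": (0.0, 1.0),
}


def evaluate(variant, hyperparams):
    """Synthetic score to minimize; a real study would run the configured
    CMA-ES on a benchmark problem and return a performance measure."""
    # Each variant prefers different hyperparameter values, so the ranking of
    # variants under default hyperparameters differs from the tuned ranking.
    shift = 0.1 * sum(opts.index(variant[name]) for name, opts in MODULE_OPTIONS.items())
    return sum((v - (0.2 + shift)) ** 2 for v in hyperparams.values()) + 0.05 * shift


def random_variant():
    return {name: random.choice(opts) for name, opts in MODULE_OPTIONS.items()}


def random_hyperparams():
    return {name: random.uniform(lo, hi) for name, (lo, hi) in HYPERPARAM_RANGES.items()}


def default_hyperparams():
    return {name: (lo + hi) / 2 for name, (lo, hi) in HYPERPARAM_RANGES.items()}


def sequential(budget):
    """Step 1: pick the best variant under default hyperparameters.
    Step 2: tune the hyperparameters of that single variant."""
    half = budget // 2
    variant = min((random_variant() for _ in range(half)),
                  key=lambda v: evaluate(v, default_hyperparams()))
    hp = min((random_hyperparams() for _ in range(budget - half)),
             key=lambda h: evaluate(variant, h))
    return variant, hp, evaluate(variant, hp)


def integrated(budget):
    """Search variants and hyperparameters jointly with one budget."""
    candidates = [(random_variant(), random_hyperparams()) for _ in range(budget)]
    variant, hp = min(candidates, key=lambda c: evaluate(*c))
    return variant, hp, evaluate(variant, hp)


if __name__ == "__main__":
    random.seed(1)
    for name, approach in (("sequential", sequential), ("integrated", integrated)):
        variant, hp, score = approach(budget=200)
        print(f"{name:>10}: score = {score:.4f}, variant = {variant}")
```

In the synthetic score used here, each variant prefers different hyperparameter values, which is exactly the situation in which a select-then-tune pipeline can lock in a sub-optimal variant; the integrated search is free to trade variant choice against hyperparameter quality within the same budget.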

References

[1]
Martin Andersson, Sunith Bandaru, Amos HC Ng, and Anna Syberfeldt. 2015. Parameter tuned CMA-ES on the CEC'15 expensive problems. In Proc. of Congress on Evolutionary Computation (CEC'15). IEEE, 1950--1957.
[2]
Anne Auger, Dimo Brockhoff, and Nikolaus Hansen. 2011. Mirrored Sampling in Evolution Strategies with Weighted Recombination. In Proc. of Genetic and Evolutionary Computation (GECCO'11). ACM, 861--868.
[3]
Anne Auger and Nikolaus Hansen. 2005. A restart CMA evolution strategy with increasing population size. In Proc. of Congress on Evolutionary Computation (CEC'05). 1769--1776.
[4]
Anne Auger, Mohamed Jebalia, and Olivier Teytaud. 2005. Algorithms (X, sigma, eta): Quasi-random Mutations for Evolution Strategies. In Artificial Evolution. Springer, 296--307.
[5]
Thomas Bäck, Christophe Foussette, and Peter Krause. 2013. Contemporary Evolution Strategies. Springer.
[6]
Thomas Bartz-Beielstein. 2010. SPOT: An R Package For Automatic and Interactive Tuning of Optimization Algorithms by Sequential Parameter Optimization. CoRR abs/1006.4645 (2010). arXiv:1006.4645 http://arxiv.org/abs/1006.4645
[7]
Nacim Belkhir, Johann Dréo, Pierre Savéant, and Marc Schoenauer. 2017. Per instance algorithm configuration of CMA-ES with limited budget. In Proc. of Genetic and Evolutionary Computation (GECCO'17). ACM, 681--688.
[8]
Dimo Brockhoff, Anne Auger, Nikolaus Hansen, Dirk V. Arnold, and Tim Hohm. 2010. Mirrored Sampling and Sequential Selection for Evolution Strategies. In PPSN. Springer, 11--21.
[9]
Leslie Pérez Cáceres, Manuel López-Ibáñez, Holger Hoos, and Thomas Stützle. 2017. An Experimental Study of Adaptive Capping in irace. In LION, Roberto Battiti, Dmitri E. Kvasov, and Yaroslav D. Sergeyev (Eds.). 235--250.
[10]
Carola Doerr, Hao Wang, Furong Ye, Sander van Rijn, and Thomas Bäck. 2018. IOHprofiler: A Benchmarking and Profiling Tool for Iterative Optimization Heuristics. arXiv e-prints:1810.05281 (Oct. 2018). arXiv:1810.05281 https://arxiv.org/abs/1810.05281
[11]
Katharina Eggensperger, Marius Lindauer, and Frank Hutter. 2019. Pitfalls and Best Practices in Algorithm Configuration. J. Artif. Intell. Res. 64 (2019), 861--893.
[12]
Agoston Endre Eiben, Robert Hinterding, and Zbigniew Michalewicz. 1999. Parameter control in evolutionary algorithms. IEEE Transactions on Evolutionary Computation 3 (1999), 124--141.
[13]
Stefan Falkner, Aaron Klein, and Frank Hutter. 2018. BOHB: Robust and Efficient Hyperparameter Optimization at Scale. In Proc. of International Conference on Machine Learning (ICML'18). 1436--1445.
[14]
Matthias Feurer, Aaron Klein, Katharina Eggensperger, Jost Springenberg, Manuel Blum, and Frank Hutter. 2015. Efficient and robust automated machine learning. In Advances in neural information processing systems. 2962--2970.
[15]
Nikolaus Hansen. 2008. CMA-ES with Two-Point Step-Size Adaptation. arXiv:0805.0231 [cs] (May 2008). http://arxiv.org/abs/0805.0231 arXiv: 0805.0231.
[16]
Nikolaus Hansen. 2009. Benchmarking a BI-population CMA-ES on the BBOB-2009 Function Testbed. In Proc. of Genetic and Evolutionary Computation (GECCO'09). ACM, 2389--2396.
[17]
Nikolaus Hansen, Anne Auger, Dimo Brockhoff, Dejan Tušar, and Tea Tušar. 2016. COCO: Performance Assessment. arXiv:1605.03560 [cs] (May 2016). http://arxiv.org/abs/1605.03560 arXiv: 1605.03560.
[18]
Nikolaus Hansen, Anne Auger, Raymond Ros, Steffen Finck, and Petr Pošík. 2010. Comparing results of 31 algorithms from the black-box optimization benchmarking BBOB-2009. In Proc. of Genetic and Evolutionary Computation (GECCO'10). ACM, 1689--1696.
[19]
Nikolaus Hansen and Andreas Ostermeier. 1996. Adapting arbitrary normal mutation distributions in evolution strategies: the covariance matrix adaptation. In CEC. 312--317.
[20]
Frank Hutter, Holger H Hoos, and Kevin Leyton-Brown. 2011. Sequential model-based optimization for general algorithm configuration. In LION. Springer, 507--523.
[21]
Anja Janković and Carola Doerr. 2019. Adaptive landscape analysis. In Proc. of Genetic and Evolutionary Computation (GECCO'19). ACM, 2032--2035.
[22]
Grahame A. Jastrebski and Dirk V. Arnold. 2006. Improving Evolution Strategies through Active Covariance Matrix Adaptation. In Proc. of Congress on Evolutionary Computation (CEC'06). 2814--2821.
[23]
Pascal Kerschke, Holger H. Hoos, Frank Neumann, and Heike Trautmann. 2019. Automated Algorithm Selection: Survey and Perspectives. Evolutionary Computation 27, 1 (2019), 3--45.
[24]
Pascal Kerschke and Heike Trautmann. 2016. The R-Package FLACCO for exploratory landscape analysis with applications to multi-objective optimization problems. In Proc. of Congress on Evolutionary Computation (CEC'16). IEEE, 5262--5269.
[25]
Robert Kleinberg, Kevin Leyton-Brown, Brendan Lucier, and Devon R. Graham. 2019. Procrastinating with Confidence: Near-Optimal, Anytime, Adaptive Algorithm Configuration. In Proc. of Neural Information Processing Systems (NeurIPS'19). 8881--8891.
[26]
Lars Kotthoff, Chris Thornton, Holger H Hoos, Frank Hutter, and Kevin Leyton-Brown. 2017. Auto-WEKA 2.0: Automatic model selection and hyperparameter optimization in WEKA. The Journal of Machine Learning Research 18, 1 (2017), 826--830.
[27]
Johannes Lengler. 2018. A General Dichotomy of Evolutionary Algorithms on Monotone Functions. In Parallel Problem Solving from Nature - PPSN XV- 15th International Conference, Coimbra, Portugal, September 8-12, 2018, Proceedings, Part II, Vol. 11102. Springer, 3--15.
[28]
Lisha Li, Kevin Jamieson, Giulia DeSalvo, Afshin Rostamizadeh, and Ameet Talwalkar. 2016. Hyperband: A novel bandit-based approach to hyperparameter optimization. arXiv preprint arXiv:1603.06560 (2016).
[29]
Fernando G. Lobo, Cláudio F. Lima, and Zbigniew Michalewicz (Eds.). 2007. Parameter Setting in Evolutionary Algorithms. Studies in Computational Intelligence, Vol. 54. Springer.
[30]
Manuel López-Ibáñez, Jérémie Dubois-Lacoste, Leslie Pérez Cáceres, Mauro Birattari, and Thomas Stützle. 2016. The irace package: Iterated racing for automatic algorithm configuration. Operations Research Perspectives 3 (2016), 43 -- 58.
[31]
Manuel López-Ibáñez, Jérémie Dubois-Lacoste, Thomas Stützle, and Mauro Birattari. 2011. The irace package, Iterated Race for Automatic Algorithm Configuration. Technical Report TR/IRIDIA/2011-004. IRIDIA, Université Libre de Bruxelles, Belgium. http://iridia.ulb.ac.be/IridiaTrSeries/IridiaTr2011-004.pdf
[32]
Manuel López-Ibáñez and Leslie Pérez Cáceres. [n. d.]. The irace Package: Iterated Race for Automatic Algorithm Configuration. http://iridia.ulb.ac.be/irace/.
[33]
Olaf Mersmann, Bernd Bischl, Heike Trautmann, Mike Preuss, Claus Weihs, and Günter Rudolph. 2011. Exploratory landscape analysis. In Proc. of Genetic and Evolutionary Computation (GECCO'11). ACM, 829--836.
[34]
Alejandro Piad-Morffis, Suilan Estévez-Velarde, Antonio Bolufé-Röhler, James Montgomery, and Stephen Chen. 2015. Evolution strategies with thresheld convergence. In Proc. of Congress on Evolutionary Computation (CEC'15). 2097--2104.
[35]
Zbyněk Pitra, Jakub Repický, and Martin Holeňa. 2019. Landscape analysis of Gaussian process surrogates for the covariance matrix adaptation evolution strategy. In Proc. of Genetic and Evolutionary Computation (GECCO'19). ACM, 691--699.
[36]
Quentin Renau, Johann Dreo, Carola Doerr, and Benjamin Doerr. 2019. Expressiveness and Robustness of Landscape Features. In Proc. of Genetic and Evolutionary Computation (GECCO'19). ACM, 2048--2051.
[37]
John R. Rice. 1976. The Algorithm Selection Problem. Advances in Computers 15 (1976), 65--118.
[38]
Chris Thornton, Frank Hutter, Holger H Hoos, and Kevin Leyton-Brown. 2013. Auto-WEKA: Combined selection and hyperparameter optimization of classification algorithms. In ACM SIGKDD. ACM, 847--855.
[39]
Sander van Rijn. 2018. Modular CMA-ES framework from [41], v0.3.0. https://github.com/sjvrijn/ModEA. Pypi package: https://pypi.org/project/ModEA/0.3.0/.
[40]
Sander van Rijn, Carola Doerr, and Thomas Bäck. 2018. Towards an Adaptive CMA-ES Configurator. In PPSN (Lecture Notes in Computer Science), Vol. 11101. Springer, 54--65.
[41]
Sander van Rijn, Hao Wang, Matthijs van Leeuwen, and Thomas Bäck. 2016. Evolving the structure of Evolution Strategies. In SSCI. 1--8.
[42]
Sander van Rijn, Hao Wang, Bas van Stein, and Thomas Bäck. 2017. Algorithm Configuration Data Mining for CMA Evolution Strategies. In Proc. of Genetic and Evolutionary Computation (GECCO'17). ACM, 737--744.
[43]
Diederick Vermetten, Sander van Rijn, Thomas Bäck, and Carola Doerr. 2019. Github repository for Online Selection of CMA-ES Variants. (2019). https://github.com/Dvermetten/Online_CMA-ES_Selection.
[44]
Diederick Vermetten, Sander van Rijn, Thomas Bäck, and Carola Doerr. 2019. Online selection of CMA-ES variants. In Proc. of Genetic and Evolutionary Computation (GECCO'19). ACM, 951--959.
[45]
Diederick Vermetten, Hao Wang, Thomas Bäck, and Carola Doerr. 2020. Github repository with the data used in this paper. https://github.com/Dvermetten/CMA_ES_hyperparam.
[46]
Hao Wang, Michael Emmerich, and Thomas Bäck. 2014. Mirrored Orthogonal Sampling with Pairwise Selection in Evolution Strategies. In SAC. ACM, 154--156.
[47]
Hao Wang, Michael Emmerich, and Thomas Bäck. 2018. Cooling Strategies for the Moment-Generating Function in Bayesian Global Optimization. In Proc. of Congress on Evolutionary Computation (CEC'18).
[48]
Hao Wang, Bas van Stein, Michael Emmerich, and Thomas Bäck. 2017. A new acquisition function for Bayesian optimization based on the moment-generating function. In SMC. IEEE, 507--512.
[49]
David H. Wolpert and William G. Macready. 1997. No free lunch theorems for optimization. IEEE Trans. Evolutionary Computation 1, 1 (1997), 67--82.
[50]
Lin Xu, Holger Hoos, and Kevin Leyton-Brown. 2010. Hydra: Automatically configuring algorithms for portfolio-based selection. In Twenty-Fourth AAAI Conference on Artificial Intelligence.
[51]
Lin Xu, Frank Hutter, Holger H Hoos, and Kevin Leyton-Brown. 2011. Hydra-MIP: Automated algorithm configuration and selection for mixed integer programming. In RCRA workshop on experimental evaluation of algorithms for solving problems with combinatorial explosion at the international joint conference on artificial intelligence (IJCAI).



Published In

GECCO '20: Proceedings of the 2020 Genetic and Evolutionary Computation Conference
June 2020
1349 pages
ISBN:9781450371285
DOI:10.1145/3377930
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 26 June 2020


Author Tags

  1. algorithm control
  2. algorithm selection
  3. black-box optimization
  4. iterative optimization heuristics

Qualifiers

  • Research-article

Funding Sources

  • Paris Ile de France region

Conference

GECCO '20

Acceptance Rates

Overall Acceptance Rate 1,669 of 4,410 submissions, 38%


Cited By

  • (2024) Landscape-Aware Automated Algorithm Configuration Using Multi-output Mixed Regression and Classification. Parallel Problem Solving from Nature – PPSN XVIII, 10.1007/978-3-031-70068-2_6, 87-104. Online publication date: 14-Sep-2024.
  • (2023) Bayesian Optimization. Many-Criteria Optimization and Decision Analysis, 10.1007/978-3-031-25263-1_10, 271-297. Online publication date: 29-Jul-2023.
  • (2022) The importance of landscape features for performance prediction of modular CMA-ES variants. Proceedings of the Genetic and Evolutionary Computation Conference, 10.1145/3512290.3528832, 648-656. Online publication date: 8-Jul-2022.
  • (2022) Analyzing the impact of undersampling on the benchmarking and configuration of evolutionary algorithms. Proceedings of the Genetic and Evolutionary Computation Conference, 10.1145/3512290.3528799, 867-875. Online publication date: 8-Jul-2022.
  • (2022) An Efficient Contesting Procedure for AutoML Optimization. IEEE Access 10, 10.1109/ACCESS.2022.3192036, 75754-75771. Online publication date: 2022.
  • (2021) Tuning as a means of assessing the benefits of new ideas in interplay with existing algorithmic modules. Proceedings of the Genetic and Evolutionary Computation Conference Companion, 10.1145/3449726.3463167, 1375-1384. Online publication date: 7-Jul-2021.
  • (2021) OPTION. Proceedings of the Genetic and Evolutionary Computation Conference Companion, 10.1145/3449726.3459579, 239-240. Online publication date: 7-Jul-2021.
  • (2021) Improved Automated CASH Optimization with Tree Parzen Estimators for Class Imbalance Problems. 2021 IEEE 8th International Conference on Data Science and Advanced Analytics (DSAA), 10.1109/DSAA53316.2021.9564147, 1-9. Online publication date: 6-Oct-2021.
  • (2021) Unbalanced budget distribution for automatic algorithm configuration. Soft Computing, 10.1007/s00500-021-06403-y. Online publication date: 2-Nov-2021.
  • (2020) Towards dynamic algorithm selection for numerical black-box optimization. Proceedings of the 2020 Genetic and Evolutionary Computation Conference, 10.1145/3377930.3390189, 654-662. Online publication date: 25-Jun-2020.
