Research Article | Open Access
DOI: 10.1145/3638529.3654196

Already Moderate Population Sizes Provably Yield Strong Robustness to Noise

Published: 14 July 2024

Abstract

Experience shows that typical evolutionary algorithms can cope well with stochastic disturbances such as noisy function evaluations. In this first mathematical runtime analysis of the (1 + λ) and (1, λ) evolutionary algorithms in the presence of prior bit-wise noise, we show that both algorithms can tolerate constant noise probabilities without increasing the asymptotic runtime on the OneMax benchmark. For this, a population size λ that is at least logarithmic in the problem size n suffices. The only previous result in this direction concerned the less realistic one-bit noise model, required a population size super-linear in the problem size, and proved a runtime guarantee roughly cubic in the noiseless runtime for the OneMax benchmark. Our significantly stronger results are based on the novel proof argument that the noiseless offspring can be seen as a biased uniform crossover between the parent and the noisy offspring. We are optimistic that the technical lemmas resulting from this insight will also find applications in future mathematical runtime analyses of evolutionary algorithms.
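To make the setting concrete, the following is a minimal Python sketch (an illustration under stated assumptions, not the authors' implementation) of the (1 + λ) EA on OneMax with prior bit-wise noise: before every evaluation, each bit of the evaluated copy is flipped independently with probability q, so selection only ever sees noisy fitness values. The function names, the parameter choices, and the decision to re-evaluate the parent with fresh noise in every generation are illustrative assumptions.

```python
import math
import random

def onemax(x):
    # True (noiseless) OneMax fitness: the number of one-bits.
    return sum(x)

def noisy_onemax(x, q):
    # Prior bit-wise noise (assumed model): evaluate a copy of x in
    # which each bit has been flipped independently with probability q.
    return onemax([b ^ (random.random() < q) for b in x])

def one_plus_lambda_ea(n, lam, q, max_gens=100_000):
    # (1 + lambda) EA with standard bit-wise mutation rate 1/n.
    # Selection uses only noisy fitness values; re-evaluating the
    # parent in every generation is an illustrative modeling choice.
    parent = [random.randrange(2) for _ in range(n)]
    for gen in range(max_gens):
        if onemax(parent) == n:  # true optimum, checked only to stop the experiment
            return gen
        offspring = [
            [b ^ (random.random() < 1 / n) for b in parent]
            for _ in range(lam)
        ]
        # Plus selection: best of parent and all offspring by noisy fitness.
        parent = max([parent] + offspring, key=lambda x: noisy_onemax(x, q))
    return max_gens

if __name__ == "__main__":
    n = 100
    lam = max(1, math.ceil(2 * math.log(n)))  # logarithmic population size, as in the result
    random.seed(42)
    print(one_plus_lambda_ea(n, lam, q=0.1), "generations")
```

The (1, λ) variant differs only in the selection step: the new parent is the best of the λ offspring alone, i.e. `parent = max(offspring, key=...)`, without comparing against the current parent.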


Cited By

  • (2024) A Gentle Introduction to Theory (for Non-Theoreticians). Proceedings of the Genetic and Evolutionary Computation Conference Companion, 800-829. https://doi.org/10.1145/3638530.3648402

Information

Published In

GECCO '24: Proceedings of the Genetic and Evolutionary Computation Conference
July 2024, 1657 pages
ISBN: 9798400704949
DOI: 10.1145/3638529
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International License.

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. noisy optimization
  2. runtime analysis
  3. theory
  4. population-based algorithms

Funding Sources

  • FMJH Program Gaspard Monge for optimization and operations research and their interactions with data science
  • Australian Research Council

Conference

GECCO '24: Genetic and Evolutionary Computation Conference
July 14-18, 2024
Melbourne, VIC, Australia

Acceptance Rates

Overall Acceptance Rate: 1,669 of 4,410 submissions, 38%

Article Metrics

  • Downloads (last 12 months): 94
  • Downloads (last 6 weeks): 13

Reflects downloads up to 18 Dec 2024
