DOI: 10.1145/3449639.3459367

Generalized jump functions

Published: 26 June 2021

Abstract

Jump functions are the most studied non-unimodal benchmark in the theory of evolutionary algorithms (EAs). They have significantly improved our understanding of how EAs escape from local optima. However, their particular structure - to leave the local optimum, the EA can only jump directly to the global optimum - raises the question of how representative the recent findings are.
For this reason, we propose an extended class Jump_{k,δ} of jump functions that incorporate a valley of low fitness of width δ starting at distance k from the global optimum. We prove that several previous results extend to this more general class: for all k = o(n^{1/3}) and δ < k, the optimal mutation rate for the (1 + 1) EA is δ/n, and the fast (1 + 1) EA runs faster than the classical (1 + 1) EA by a factor super-exponential in δ. However, we also observe that some known results do not generalize: the randomized local search algorithm with stagnation detection, which is faster than the fast (1 + 1) EA by a factor polynomial in k on Jump_k, is slower by a factor polynomial in n on some Jump_{k,δ} instances.
Computationally, the new class allows experiments with wider fitness valleys, especially when they lie further away from the global optimum.
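
For readers who want to experiment with the benchmark, the following Python sketch gives one plausible formalization of Jump_{k,δ} together with a plain (1 + 1) EA and a heavy-tailed mutation-rate sampler in the spirit of the fast (1 + 1) EA. It is an illustration reconstructed from the abstract, not the paper's code: the exact placement of the valley (here, all search points whose number of one-bits lies strictly between n − k and n − k + δ, so that δ = k recovers the classical Jump_k), the function names, and the power-law exponent beta = 1.5 are assumptions.

```python
import random


def jump_k_delta(x, k, delta):
    """Assumed formalization of Jump_{k,delta} (see lead-in): outside the
    valley the fitness is |x|_1 + delta; inside the valley (all points whose
    number of one-bits lies strictly between n-k and n-k+delta) the fitness
    is n - |x|_1, i.e. the valley is deceptive. delta = k recovers Jump_k."""
    n = len(x)
    ones = sum(x)
    if n - k < ones < n - k + delta:
        return n - ones          # low-fitness valley
    return ones + delta          # slope towards the all-ones optimum


def heavy_tailed_rate(n, rng, beta=1.5):
    """Power-law mutation-strength sampler in the spirit of the fast (1+1) EA:
    draw a strength alpha in {1, ..., n//2} with Pr[alpha] ~ alpha^(-beta)
    and mutate with rate alpha/n. beta = 1.5 is an illustrative choice."""
    upper = max(1, n // 2)
    strengths = list(range(1, upper + 1))
    weights = [a ** (-beta) for a in strengths]
    alpha = rng.choices(strengths, weights=weights, k=1)[0]
    return alpha / n


def one_plus_one_ea(n, k, delta, mutation_rate=None, rate_sampler=None,
                    max_evals=10**6, seed=0):
    """Plain elitist (1+1) EA with standard bit mutation. By default the
    classical rate 1/n is used; pass mutation_rate=delta/n for the rate the
    abstract identifies as optimal, or rate_sampler=heavy_tailed_rate for a
    heavy-tailed variant. Returns the number of evaluations needed to reach
    the all-ones optimum, or None if the budget is exhausted."""
    rng = random.Random(seed)
    base_rate = mutation_rate if mutation_rate is not None else 1.0 / n
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = jump_k_delta(x, k, delta)
    for evals in range(1, max_evals + 1):
        p = rate_sampler(n, rng) if rate_sampler else base_rate
        y = [bit ^ 1 if rng.random() < p else bit for bit in x]
        fy = jump_k_delta(y, k, delta)
        if fy >= fx:             # accept ties (elitist selection)
            x, fx = y, fy
        if fx == n + delta:      # fitness value of the all-ones optimum
            return evals
    return None


if __name__ == "__main__":
    n, k, delta = 50, 6, 3
    print(one_plus_one_ea(n, k, delta, mutation_rate=delta / n))
    print(one_plus_one_ea(n, k, delta, rate_sampler=heavy_tailed_rate))
```

Under these assumptions, leaving the local optimum at n − k one-bits requires a net gain of at least δ one-bits in a single mutation, which is where a mutation rate of order δ/n (or a heavy-tailed rate that regularly samples such strengths) pays off; choosing k large while keeping δ moderate places the valley far from the optimum, the experimental flexibility highlighted in the abstract.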


Published In

GECCO '21: Proceedings of the Genetic and Evolutionary Computation Conference
June 2021
1219 pages
ISBN: 9781450383509
DOI: 10.1145/3449639

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. evolutionary algorithm
  2. multimodal optimization
  3. running time analysis
  4. theory

Qualifiers

  • Research-article

Funding Sources

  • ANR

Conference

GECCO '21

Acceptance Rates

Overall Acceptance Rate 1,669 of 4,410 submissions, 38%

