
Self-Adjusting Evolutionary Algorithms for Multimodal Optimization

Published: 01 June 2022

Abstract

Recent theoretical research has shown that self-adjusting and self-adaptive mechanisms can provably outperform static settings in evolutionary algorithms for binary search spaces. However, the vast majority of these studies focus on unimodal functions, which do not require the algorithm to flip several bits simultaneously to make progress. In fact, existing self-adjusting algorithms are not designed to detect local optima and offer no obvious benefit when crossing large Hamming gaps. We suggest a mechanism called stagnation detection that can be added as a module to existing evolutionary algorithms (both with and without prior self-adjusting schemes). Added to a simple (1+1) EA, it achieves a proven expected runtime on the well-known Jump benchmark that corresponds to an asymptotically optimal parameter setting and outperforms other mechanisms for multimodal optimization such as heavy-tailed mutation. We also investigate the module in the context of a self-adjusting (1+λ) EA. To explore the limitations of the approach, we additionally present an example where both self-adjusting mechanisms, including stagnation detection, fail to find a beneficial setting of the mutation rate. Finally, we investigate our module for stagnation detection experimentally.
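The idea described in the abstract can be illustrated with a short sketch: the algorithm counts steps without improvement and, once the counter exceeds a threshold under which an improvement at the current mutation strength would almost surely have been found, raises the strength by one; any improvement resets the module. The following minimal Python rendition on the Jump benchmark is a simplified illustration, not the paper's exact algorithm — the function names, the threshold constants, and the choice of the safety parameter R are assumptions made here for the sketch.

```python
import math
import random

def jump(x, m):
    """Jump_m benchmark: OneMax plus a gap of width m just below the optimum."""
    n = len(x)
    ones = sum(x)
    if ones == n or ones <= n - m:
        return m + ones
    return n - ones

def sd_one_plus_one_ea(f, n, max_iters, target=None, R=None, seed=0):
    """(1+1) EA with a stagnation-detection module (simplified sketch).

    u counts steps without improvement; once u exceeds a threshold under
    which an improvement flipping r bits would almost surely have been
    found at mutation rate r/n, the strength r is raised by one.  Any
    improvement resets r to 1 and the counter to 0.
    """
    rng = random.Random(seed)
    if R is None:
        R = n  # controls the tolerated failure probability (illustrative choice)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = f(x)
    r, u = 1, 0
    for _ in range(max_iters):
        if target is not None and fx >= target:
            break
        # standard bit mutation: flip each bit independently with probability r/n
        y = [b ^ (rng.random() < r / n) for b in x]
        fy = f(y)
        u += 1
        if fy > fx:  # strict improvement: accept the offspring, reset the module
            x, fx = y, fy
            r, u = 1, 0
        elif u > 2 * (math.e * n / r) ** r * math.log(n * R):
            r = min(r + 1, n // 2)  # no success at rate r/n: raise the strength
            u = 0
    return fx
```

On Jump_m the run first climbs the OneMax slope at strength 1, then stalls on the plateau until the counter drives r up to m, at which point flipping the m gap bits in one step becomes reasonably likely — this is the intuition behind the asymptotically optimal parameter setting mentioned above.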




        Published In

        Algorithmica  Volume 84, Issue 6
        Jun 2022
        327 pages

        Publisher

        Springer-Verlag

        Berlin, Heidelberg

        Publication History

        Published: 01 June 2022
        Accepted: 10 January 2022
        Received: 31 October 2020

        Author Tags

        1. Randomized search heuristics
        2. Self-adjusting algorithms
        3. Multimodal functions
        4. Runtime analysis

        Qualifiers

        • Research-article

        Funding Sources

        • Danish Council for Independent Research


        Cited By

        • (2024) A Gentle Introduction to Theory (for Non-Theoreticians). Proceedings of the Genetic and Evolutionary Computation Conference Companion, pp. 800–829. doi:10.1145/3638530.3648402. Published 14 Jul 2024
        • (2024) A Flexible Evolutionary Algorithm with Dynamic Mutation Rate Archive. Proceedings of the Genetic and Evolutionary Computation Conference, pp. 1578–1586. doi:10.1145/3638529.3654076. Published 14 Jul 2024
        • (2024) Self-adjusting offspring population sizes outperform fixed parameters on the cliff function. Artificial Intelligence, vol. 328. doi:10.1016/j.artint.2023.104061. Published 17 Apr 2024
        • (2024) Stagnation Detection in Highly Multimodal Fitness Landscapes. Algorithmica 86(9), 2929–2958. doi:10.1007/s00453-024-01249-w. Published 1 Sep 2024
        • (2024) Runtime Analysis for Permutation-based Evolutionary Algorithms. Algorithmica 86(1), 90–129. doi:10.1007/s00453-023-01146-8. Published 1 Jan 2024
        • (2024) When Does the Time-Linkage Property Help Optimization by Evolutionary Algorithms? Parallel Problem Solving from Nature – PPSN XVIII, pp. 280–294. doi:10.1007/978-3-031-70071-2_18. Published 14 Sep 2024
        • (2023) Runtime analysis for the NSGA-II. Proceedings of the Thirty-Seventh AAAI Conference on Artificial Intelligence, pp. 12399–12407. doi:10.1609/aaai.v37i10.26461. Published 7 Feb 2023
        • (2023) A Gentle Introduction to Theory (for Non-Theoreticians). Proceedings of the Companion Conference on Genetic and Evolutionary Computation, pp. 946–975. doi:10.1145/3583133.3595042. Published 15 Jul 2023
        • (2023) How the Move Acceptance Hyper-Heuristic Copes With Local Optima: Drastic Differences Between Jumps and Cliffs. Proceedings of the Genetic and Evolutionary Computation Conference, pp. 990–999. doi:10.1145/3583131.3590509. Published 15 Jul 2023
        • (2023) How Well Does the Metropolis Algorithm Cope With Local Optima? Proceedings of the Genetic and Evolutionary Computation Conference, pp. 1000–1008. doi:10.1145/3583131.3590390. Published 15 Jul 2023
