DOI: 10.1145/3638530.3648402

A Gentle Introduction to Theory (for Non-Theoreticians)

Published: 01 August 2024

References

[1]
[AAT15] Youhei Akimoto, Sandra Astete Morales, and Olivier Teytaud. Analysis of runtime of optimization algorithms for noisy functions over discrete codomains. Theoretical Computer Science, 605:42--50, 2015.
[2]
[ABD20] Denis Antipov, Maxim Buzdalov, and Benjamin Doerr. First steps towards a runtime analysis when starting with a good solution. In Parallel Problem Solving From Nature, PPSN 2020, Part II, pages 560--573. Springer, 2020.
[3]
[ABD22] Denis Antipov, Maxim Buzdalov, and Benjamin Doerr. Fast mutation in crossover-based algorithms. Algorithmica, 84:1724--1761, 2022.
[4]
[ABD24] Denis Antipov, Maxim Buzdalov, and Benjamin Doerr. Lazy parameter tuning and control: choosing all parameters randomly from a power-law distribution. Algorithmica, 86:442--484, 2024.
[5]
[AD11] Anne Auger and Benjamin Doerr, editors. Theory of Randomized Search Heuristics. World Scientific Publishing, 2011.
[6]
[AD20] Denis Antipov and Benjamin Doerr. Runtime analysis of a heavy-tailed (1 + (λ, λ)) genetic algorithm on jump functions. In Parallel Problem Solving From Nature, PPSN 2020, Part II, pages 545--559. Springer, 2020.
[7]
[AD21] Denis Antipov and Benjamin Doerr. A tight runtime analysis for the (μ + λ) EA. Algorithmica, 83:1054--1095, 2021.
[8]
[ADFH18] Denis Antipov, Benjamin Doerr, Jiefeng Fang, and Tangi Hetet. Runtime analysis for the (μ + λ) EA optimizing OneMax. In Genetic and Evolutionary Computation Conference, GECCO 2018, pages 1459--1466. ACM, 2018.
[9]
[ADI24] Denis Antipov, Benjamin Doerr, and Alexandra Ivanova. Already moderate population sizes provably yield strong robustness to noise. In Genetic and Evolutionary Computation Conference, GECCO 2024. ACM, 2024.
[10]
[ADK19] Denis Antipov, Benjamin Doerr, and Vitalii Karavaev. A tight runtime analysis for the (1 + (λ, λ)) GA on LeadingOnes. In Foundations of Genetic Algorithms, FOGA 2019, pages 169--182. ACM, 2019.
[11]
[ADK22] Denis Antipov, Benjamin Doerr, and Vitalii Karavaev. A rigorous runtime analysis of the (1 + (λ, λ)) GA on jump functions. Algorithmica, 84:1573--1602, 2022.
[12]
[ADY19] Denis Antipov, Benjamin Doerr, and Quentin Yang. The efficiency threshold for the offspring population size of the (1, λ) EA. In Genetic and Evolutionary Computation Conference, GECCO 2019, pages 1461--1469. ACM, 2019.
[13]
[AL14] Fawaz Alanazi and Per Kristian Lehre. Runtime analysis of selection hyper-heuristics with classical learning mechanisms. In Congress on Evolutionary Computation, CEC 2014, pages 2515--2523. IEEE, 2014.
[14]
[Bäc93] Thomas Bäck. Optimal mutation rates in genetic search. In International Conference on Genetic Algorithms, ICGA 1993, pages 2--8. Morgan Kaufmann, 1993.
[15]
[Bäc96] Thomas Bäck. Evolutionary Algorithms in Theory and Practice - Evolution Strategies, Evolutionary Programming, Genetic Algorithms. Oxford University Press, 1996.
[16]
[BB19] Anton Bassin and Maxim Buzdalov. The 1/5-th rule with rollbacks: on self-adjustment of the population size in the (1 + (λ, λ)) GA. In Genetic and Evolutionary Computation Conference Companion, GECCO 2019, pages 277--278. ACM, 2019.
[17]
[BB20] Anton Bassin and Maxim Buzdalov. The (1 + (λ, λ)) genetic algorithm for permutations. In Genetic and Evolutionary Computation Conference, GECCO 2020, Companion Volume, pages 1669--1677. ACM, 2020.
[18]
[BBD+09] Surender Baswana, Somenath Biswas, Benjamin Doerr, Tobias Friedrich, Piyush P. Kurur, and Frank Neumann. Computing single source shortest paths using single-objective fitness. In Foundations of Genetic Algorithms, FOGA 2009, pages 59--66. ACM, 2009.
[19]
[BBD21] Riade Benbaki, Ziyad Benomar, and Benjamin Doerr. A rigorous runtime analysis of the 2-MMASib on jump functions: ant colony optimizers can cope well with local optima. In Genetic and Evolutionary Computation Conference, GECCO 2021, pages 4--13. ACM, 2021.
[20]
[BBD24] Henry Bambury, Antoine Bultel, and Benjamin Doerr. An extended jump functions benchmark for the analysis of randomized search heuristics. Algorithmica, 86:1--32, 2024.
[21]
[BD17] Maxim Buzdalov and Benjamin Doerr. Runtime analysis of the (1 + (λ, λ)) genetic algorithm on random satisfiable 3-CNF formulas. In Genetic and Evolutionary Computation Conference, GECCO 2017, pages 1343--1350. ACM, 2017.
[22]
[BDDV20] Maxim Buzdalov, Benjamin Doerr, Carola Doerr, and Dmitry Vinokurov. Fixed-target runtime analysis. In Genetic and Evolutionary Computation Conference, GECCO 2020, pages 1295--1303. ACM, 2020.
[23]
[BDK23] Firas Ben Jedidia, Benjamin Doerr, and Martin S. Krejca. Estimation-of-distribution algorithms for multi-valued decision variables. In Genetic and Evolutionary Computation Conference, GECCO 2023, pages 230--238. ACM, 2023.
[24]
[BDN10] Süntje Böttcher, Benjamin Doerr, and Frank Neumann. Optimal fixed and adaptive mutation rates for the LeadingOnes problem. In Parallel Problem Solving from Nature, PPSN 2010, pages 1--10. Springer, 2010.
[25]
[BFM97] Thomas Bäck, David B. Fogel, and Zbigniew Michalewicz. Handbook of Evolutionary Computation. IOP Publishing Ltd., 1997.
[26]
[BFN08] Dimo Brockhoff, Tobias Friedrich, and Frank Neumann. Analyzing hypervolume indicator based algorithms. In Parallel Problem Solving from Nature, PPSN 2008, pages 651--660. Springer, 2008.
[27]
[BFQY20] Chao Bian, Chao Feng, Chao Qian, and Yang Yu. An efficient evolutionary algorithm for subset selection with general cost constraints. In Conference on Artificial Intelligence, AAAI 2020, pages 3267--3274. AAAI Press, 2020.
[28]
[BLS14] Golnaz Badkobeh, Per Kristian Lehre, and Dirk Sudholt. Unbiased black-box complexity of parallel search. In Parallel Problem Solving from Nature, PPSN 2014, pages 892--901. Springer, 2014.
[29]
[BQ22] Chao Bian and Chao Qian. Better running time of the non-dominated sorting genetic algorithm II (NSGA-II) by using stochastic tournament selection. In Parallel Problem Solving From Nature, PPSN 2022, pages 428--441. Springer, 2022.
[30]
[BQT18] Chao Bian, Chao Qian, and Ke Tang. A general approach to running time analysis of multi-objective evolutionary algorithms. In International Joint Conference on Artificial Intelligence, IJCAI 2018, pages 1405--1411. IJCAI, 2018.
[31]
[BS19] Jakob Bossek and Dirk Sudholt. Time complexity analysis of RLS and (1+1) EA for the edge coloring problem. In Foundations of Genetic Algorithms, FOGA 2019, pages 102--115. ACM, 2019.
[32]
[BZLQ23] Chao Bian, Yawen Zhou, Miqing Li, and Chao Qian. Stochastic population update can provably be helpful in multi-objective evolutionary algorithms. In International Joint Conference on Artificial Intelligence, IJCAI 2023, pages 5513--5521. ijcai.org, 2023.
[33]
[CDEL18] Dogan Corus, Duc-Cuong Dang, Anton V. Eremeev, and Per Kristian Lehre. Level-based analysis of genetic algorithms and other search processes. IEEE Transactions on Evolutionary Computation, 22:707--719, 2018.
[34]
[CDH+23] Sacha Cerf, Benjamin Doerr, Benjamin Hebras, Jakob Kahane, and Simon Wietheger. The first proven performance guarantees for the Non-Dominated Sorting Genetic Algorithm II (NSGA-II) on a combinatorial optimization problem. In International Joint Conference on Artificial Intelligence, IJCAI 2023, pages 5522--5530. ijcai.org, 2023.
[35]
[CL20] Brendan Case and Per Kristian Lehre. Self-adaptation in nonelitist evolutionary algorithms on discrete problems with unknown structure. IEEE Transactions on Evolutionary Computation, 24:650--663, 2020.
[36]
[CLNP16] Dogan Corus, Per Kristian Lehre, Frank Neumann, and Mojgan Pourhassan. A parameterised complexity analysis of bi-level optimisation with evolutionary algorithms. Evolutionary Computation, 24:183--203, 2016.
[37]
[CO18] Dogan Corus and Pietro S. Oliveto. Standard steady state genetic algorithms can hill climb faster than mutation-only evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 22:720--732, 2018.
[38]
[CO20] Dogan Corus and Pietro S. Oliveto. On the benefits of populations for the exploitation speed of standard steady-state genetic algorithms. Algorithmica, 82:3676--3706, 2020.
[39]
[Cra19] Victoria G. Crawford. An efficient evolutionary algorithm for minimum cost submodular cover. In International Joint Conference on Artificial Intelligence, IJCAI 2019, pages 1227--1233. ijcai.org, 2019.
[40]
[Cra21] Victoria G. Crawford. Faster guarantees of evolutionary algorithms for maximization of monotone submodular functions. In International Joint Conference on Artificial Intelligence, IJCAI 2021, pages 1661--1667. ijcai.org, 2021.
[41]
[CS18] Edgar Covantes Osuna and Dirk Sudholt. Runtime analysis of probabilistic crowding and restricted tournament selection for bimodal optimisation. In Genetic and Evolutionary Computation Conference, GECCO 2018, pages 929--936. ACM, 2018.
[42]
[DD15a] Benjamin Doerr and Carola Doerr. Optimal parameter choices through self-adjustment: Applying the 1/5-th rule in discrete settings. In Genetic and Evolutionary Computation Conference, GECCO 2015, pages 1335--1342. ACM, 2015.
[43]
[DD15b] Benjamin Doerr and Carola Doerr. A tight runtime analysis of the (1+(λ, λ)) genetic algorithm on OneMax. In Genetic and Evolutionary Computation Conference, GECCO 2015, pages 1423--1430. ACM, 2015.
[44]
[DD18] Benjamin Doerr and Carola Doerr. Optimal static and self-adjusting parameter choices for the (1 + (λ, λ)) genetic algorithm. Algorithmica, 80:1658--1709, 2018.
[45]
[DD20] Benjamin Doerr and Carola Doerr. Theory of parameter control for discrete black-box optimization: provable performance gains through dynamic parameter choices. In Benjamin Doerr and Frank Neumann, editors, Theory of Evolutionary Computation: Recent Developments in Discrete Optimization, pages 271--321. Springer, 2020. Also available at https://arxiv.org/abs/1804.05650.
[46]
[DD22] Benjamin Doerr and Marc Dufay. General univariate estimation-of-distribution algorithms. In Parallel Problem Solving From Nature, PPSN 2022, Part II, pages 470--484. Springer, 2022.
[47]
[DDE13] Benjamin Doerr, Carola Doerr, and Franziska Ebel. Lessons from the black-box: fast crossover-based genetic algorithms. In Genetic and Evolutionary Computation Conference, GECCO 2013, pages 781--788. ACM, 2013.
[48]
[DDE15] Benjamin Doerr, Carola Doerr, and Franziska Ebel. From black-box complexity to designing new genetic algorithms. Theoretical Computer Science, 567:87--104, 2015.
[49]
[DDHW23] Matthieu Dinot, Benjamin Doerr, Ulysse Hennebelle, and Sebastian Will. Runtime analyses of multi-objective evolutionary algorithms in the presence of noise. In International Joint Conference on Artificial Intelligence, IJCAI 2023, pages 5549--5557. ijcai.org, 2023.
[50]
[DDK14a] Benjamin Doerr, Carola Doerr, and Timo Kötzing. Unbiased black-box complexities of jump functions: how to cross large plateaus. In Genetic and Evolutionary Computation Conference, GECCO 2014, pages 769--776. ACM, 2014.
[51]
[DDK14b] Benjamin Doerr, Carola Doerr, and Timo Kötzing. The unbiased black-box complexity of partition is polynomial. Artificial Intelligence, 216:275--286, 2014.
[52]
[DDK17] Benjamin Doerr, Carola Doerr, and Timo Kötzing. Unknown solution length problems with no asymptotically optimal run time. In Genetic and Evolutionary Computation Conference, GECCO 2017, pages 921--928. ACM, 2017.
[53]
[DDK18] Benjamin Doerr, Carola Doerr, and Timo Kötzing. Static and self-adjusting mutation strengths for multi-valued decision variables. Algorithmica, 80:1732--1768, 2018.
[54]
[DDL21] Benjamin Doerr, Carola Doerr, and Johannes Lengler. Self-adjusting mutation rates with provably optimal success rules. Algorithmica, 83:3108--3147, 2021.
[55]
[DDLS23] Benjamin Doerr, Arthur Dremaux, Johannes F. Lutzeyer, and Aurélien Stumpf. How the move acceptance hyper-heuristic copes with local optima: drastic differences between jumps and cliffs. In Genetic and Evolutionary Computation Conference, GECCO 2023, pages 990--999. ACM, 2023.
[56]
[DDN19] Benjamin Doerr, Carola Doerr, and Frank Neumann. Fast re-optimization via structural diversity. In Genetic and Evolutionary Computation Conference, GECCO 2019, pages 233--241. ACM, 2019.
[57]
[DDN+20] Benjamin Doerr, Carola Doerr, Aneta Neumann, Frank Neumann, and Andrew M. Sutton. Optimization of chance-constrained submodular functions. In Conference on Artificial Intelligence, AAAI 2020, pages 1460--1467. AAAI Press, 2020.
[58]
[DDST16] Benjamin Doerr, Carola Doerr, Reto Spöhel, and Henning Thomas. Playing Mastermind with many colors. Journal of the ACM, 63:42:1--42:23, 2016.
[59]
[DDY16] Benjamin Doerr, Carola Doerr, and Jing Yang. k-bit mutation with self-adjusting k outperforms standard bit mutation. In Parallel Problem Solving from Nature, PPSN 2016, pages 824--834. Springer, 2016.
[60]
[DDY20] Benjamin Doerr, Carola Doerr, and Jing Yang. Optimal parameter choices via precise black-box analysis. Theoretical Computer Science, 801:1--34, 2020.
[61]
[DEJK24] Benjamin Doerr, Aymen Echarghaoui, Mohammed Jamal, and Martin S. Krejca. Runtime analysis of the (μ + 1) GA: provable speed-ups from strong drift towards diverse populations. In Conference on Artificial Intelligence, AAAI 2024, pages 20683--20691. AAAI Press, 2024.
[62]
[DFK+16] Duc-Cuong Dang, Tobias Friedrich, Timo Kötzing, Martin S. Krejca, Per Kristian Lehre, Pietro S. Oliveto, Dirk Sudholt, and Andrew M. Sutton. Escaping local optima with diversity mechanisms and crossover. In Genetic and Evolutionary Computation Conference, GECCO 2016, pages 645--652. ACM, 2016.
[63]
[DFK+18] Duc-Cuong Dang, Tobias Friedrich, Timo Kötzing, Martin S. Krejca, Per Kristian Lehre, Pietro S. Oliveto, Dirk Sudholt, and Andrew M. Sutton. Escaping local optima using crossover with emergent diversity. IEEE Transactions on Evolutionary Computation, 22:484--497, 2018.
[64]
[DGI24] Benjamin Doerr, Yassine Ghannane, and Marouane Ibn Brahim. Runtime analysis for permutation-based evolutionary algorithms. Algorithmica, 86:90--129, 2024.
[65]
[DGN16] Benjamin Doerr, Wanru Gao, and Frank Neumann. Runtime analysis of evolutionary diversity maximization for OneMinMax. In Genetic and Evolutionary Computation Conference, GECCO 2016, pages 557--564. ACM, 2016.
[66]
[DGWY19] Benjamin Doerr, Christian Gießen, Carsten Witt, and Jing Yang. The (1 + λ) evolutionary algorithm with self-adjusting mutation rate. Algorithmica, 81:593--631, 2019.
[67]
[DHK07] Benjamin Doerr, Edda Happ, and Christian Klein. A tight bound for the (1 + 1)-EA for the single source shortest path problem. In Congress on Evolutionary Computation, CEC 2007, pages 1890--1895. IEEE, 2007.
[68]
[DHK12a] Benjamin Doerr, Edda Happ, and Christian Klein. Crossover can provably be useful in evolutionary computation. Theoretical Computer Science, 425:17--33, 2012.
[69]
[DHK12b] Benjamin Doerr, Ashish Ranjan Hota, and Timo Kötzing. Ants easily solve stochastic shortest path problems. In Genetic and Evolutionary Computation Conference, GECCO 2012, pages 17--24. ACM, 2012.
[70]
[DHN07] Benjamin Doerr, Nils Hebbinghaus, and Frank Neumann. Speeding up evolutionary algorithms through asymmetric mutation operators. Evolutionary Computation, 15:401--410, 2007.
[71]
[DHP22] Benjamin Doerr, Omar El Hadri, and Adrien Pinard. The (1 + (λ, λ)) global SEMO algorithm. In Genetic and Evolutionary Computation Conference, GECCO 2022, pages 520--528. ACM, 2022.
[72]
[DJ07] Benjamin Doerr and Daniel Johannsen. Adjacency list matchings: an ideal genotype for cycle covers. In Genetic and Evolutionary Computation Conference, GECCO 2007, pages 1203--1210. ACM, 2007.
[73]
[DJ10] Benjamin Doerr and Daniel Johannsen. Edge-based representation beats vertex-based representation in shortest path problems. In Genetic and Evolutionary Computation Conference, GECCO 2010, pages 759--766. ACM, 2010.
[74]
[DJK+11] Benjamin Doerr, Daniel Johannsen, Timo Kötzing, Per Kristian Lehre, Markus Wagner, and Carola Winzen. Faster black-box algorithms through higher arity operators. In Foundations of Genetic Algorithms, FOGA 2011, pages 163--172. ACM, 2011.
[75]
[DJK+13] Benjamin Doerr, Daniel Johannsen, Timo Kötzing, Frank Neumann, and Madeleine Theile. More effective crossover operators for the all-pairs shortest path problem. Theoretical Computer Science, 471:12--26, 2013.
[76]
[DJL17] Duc-Cuong Dang, Thomas Jansen, and Per Kristian Lehre. Populations can be essential in tracking dynamic optima. Algorithmica, 78:660--680, 2017.
[77]
[DJS+13] Benjamin Doerr, Thomas Jansen, Dirk Sudholt, Carola Winzen, and Christine Zarges. Mutation rate matters even when optimizing monotone functions. Evolutionary Computation, 21:1--21, 2013.
[78]
[DJW98a] Stefan Droste, Thomas Jansen, and Ingo Wegener. On the optimization of unimodal functions with the (1 + 1) evolutionary algorithm. In Parallel Problem Solving from Nature, PPSN 1998, pages 13--22. Springer, 1998.
[79]
[DJW98b] Stefan Droste, Thomas Jansen, and Ingo Wegener. A rigorous complexity analysis of the (1 + 1) evolutionary algorithm for linear functions with Boolean inputs. In International Conference on Evolutionary Computation, ICEC 1998, pages 499--504. IEEE, 1998.
[80]
[DJW02] Stefan Droste, Thomas Jansen, and Ingo Wegener. On the analysis of the (1+1) evolutionary algorithm. Theoretical Computer Science, 276:51--81, 2002.
[81]
[DJW10] Benjamin Doerr, Daniel Johannsen, and Carola Winzen. Multiplicative drift analysis. In Genetic and Evolutionary Computation Conference, GECCO 2010, pages 1449--1456. ACM, 2010.
[82]
[DJW12] Benjamin Doerr, Daniel Johannsen, and Carola Winzen. Multiplicative drift analysis. Algorithmica, 64:673--697, 2012.
[83]
[DJWZ13] Benjamin Doerr, Thomas Jansen, Carsten Witt, and Christine Zarges. A method to derive fixed budget results from expected optimisation times. In Genetic and Evolutionary Computation Conference, GECCO 2013, pages 1581--1588. ACM, 2013.
[84]
[DK15] Benjamin Doerr and Marvin Künnemann. Optimizing linear functions with the (1 + λ) evolutionary algorithm---different asymptotic runtimes for different instances. Theoretical Computer Science, 561:3--23, 2015.
[85]
[DK20a] Benjamin Doerr and Martin S. Krejca. Bivariate estimation-of-distribution algorithms can find an exponential number of optima. In Genetic and Evolutionary Computation Conference, GECCO 2020, pages 796--804. ACM, 2020.
[86]
[DK20b] Benjamin Doerr and Martin S. Krejca. Significance-based estimation-of-distribution algorithms. IEEE Transactions on Evolutionary Computation, 24:1025--1034, 2020.
[87]
[DK20c] Benjamin Doerr and Martin S. Krejca. The univariate marginal distribution algorithm copes well with deception and epistasis. In Evolutionary Computation in Combinatorial Optimization, EvoCOP 2020, pages 51--66. Springer, 2020.
[88]
[DK21a] Benjamin Doerr and Timo Kötzing. Lower bounds from fitness levels made easy. In Genetic and Evolutionary Computation Conference, GECCO 2021, pages 1142--1150. ACM, 2021.
[89]
[DK21b] Benjamin Doerr and Timo Kötzing. Multiplicative up-drift. Algorithmica, 83:3017--3058, 2021.
[90]
[DK21c] Benjamin Doerr and Martin S. Krejca. A simplified run time analysis of the univariate marginal distribution algorithm on LeadingOnes. Theoretical Computer Science, 851:121--128, 2021.
[91]
[DKS07] Benjamin Doerr, Christian Klein, and Tobias Storch. Faster evolutionary algorithms by superior graph representation. In Foundations of Computational Intelligence, FOCI 2007, pages 245--250. IEEE, 2007.
[92]
[DKV24] Benjamin Doerr, Martin S. Krejca, and Nguyen Vu. Superior genetic algorithms for the target set selection problem based on power-law parameter choices and simple greedy heuristics. In Genetic and Evolutionary Computation Conference, GECCO 2024. ACM, 2024.
[93]
[DKW24] Benjamin Doerr, Martin S. Krejca, and Noé Weeks. Proven runtime guarantees for how the MOEA/D computes the Pareto front from the subproblem solutions. Preprint, 2024.
[94]
[DL15] Duc-Cuong Dang and Per Kristian Lehre. Simplified runtime analysis of estimation of distribution algorithms. In Genetic and Evolutionary Computation Conference, GECCO 2015, pages 513--518. ACM, 2015.
[95]
[DL16a] Duc-Cuong Dang and Per Kristian Lehre. Runtime analysis of non-elitist populations: from classical optimisation to partial information. Algorithmica, 75:428--461, 2016.
[96]
[DL16b] Duc-Cuong Dang and Per Kristian Lehre. Self-adaptation of mutation rates in non-elitist populations. In Parallel Problem Solving from Nature, PPSN 2016, pages 803--813. Springer, 2016.
[97]
[DLMN17] Benjamin Doerr, Huu Phuoc Le, Régis Makhmara, and Ta Duy Nguyen. Fast genetic algorithms. In Genetic and Evolutionary Computation Conference, GECCO 2017, pages 777--784. ACM, 2017.
[98]
[DLN19] Duc-Cuong Dang, Per Kristian Lehre, and Phan Trung Hai Nguyen. Level-based analysis of the univariate marginal distribution algorithm. Algorithmica, 81:668--702, 2019.
[99]
[DLOW18] Benjamin Doerr, Andrei Lissovoi, Pietro S. Oliveto, and John Alasdair Warwicker. On the runtime analysis of selection hyper-heuristics with adaptive learning periods. In Genetic and Evolutionary Computation Conference, GECCO 2018, pages 1015--1022. ACM, 2018.
[100]
[DN20] Benjamin Doerr and Frank Neumann, editors. Theory of Evolutionary Computation---Recent Developments in Discrete Optimization. Springer, 2020. Also available at http://www.lix.polytechnique.fr/Labo/Benjamin.Doerr/doerr_neumann_book.html.
[101]
[DNDD+18] Raphaël Dang-Nhu, Thibault Dardinier, Benjamin Doerr, Gautier Izacard, and Dorian Nogneng. A new analysis method for evolutionary optimization of dynamic and noisy objective functions. In Genetic and Evolutionary Computation Conference, GECCO 2018, pages 1467--1474. ACM, 2018.
[102]
[Doe16] Benjamin Doerr. Optimal parameter settings for the (1 + (λ, λ)) genetic algorithm. In Genetic and Evolutionary Computation Conference, GECCO 2016, pages 1107--1114. ACM, 2016.
[103]
[Doe20] Benjamin Doerr. Lower bounds for non-elitist evolutionary algorithms via negative multiplicative drift. In Parallel Problem Solving From Nature, PPSN 2020, Part II, pages 604--618. Springer, 2020.
[104]
[Doe21a] Benjamin Doerr. Exponential upper bounds for the runtime of randomized search heuristics. Theoretical Computer Science, 851:24--38, 2021.
[105]
[Doe21b] Benjamin Doerr. The runtime of the compact genetic algorithm on Jump functions. Algorithmica, 83:3059--3107, 2021.
[106]
[Doe22] Benjamin Doerr. Does comma selection help to cope with local optima? Algorithmica, 84:1659--1693, 2022.
[107]
[DOSS23a] Duc-Cuong Dang, Andre Opris, Bahare Salehi, and Dirk Sudholt. Analysing the robustness of NSGA-II under noise. In Genetic and Evolutionary Computation Conference, GECCO 2023, pages 642--651. ACM, 2023.
[108]
[DOSS23b] Duc-Cuong Dang, Andre Opris, Bahare Salehi, and Dirk Sudholt. A proof that using crossover can guarantee exponential speed-ups in evolutionary multi-objective optimisation. In Conference on Artificial Intelligence, AAAI 2023, pages 12390--12398. AAAI Press, 2023.
[109]
[DPAM02] Kalyanmoy Deb, Amrit Pratap, Sameer Agarwal, and T. Meyarivan. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6:182--197, 2002.
[110]
[dPdLDD15] Axel de Perthuis de Laillevault, Benjamin Doerr, and Carola Doerr. Money for nothing: speeding up evolutionary algorithms through better initialization. In Genetic and Evolutionary Computation Conference, GECCO 2015, pages 815--822. ACM, 2015.
[111]
[DQ23a] Benjamin Doerr and Zhongdi Qu. A first runtime analysis of the NSGA-II on a multimodal problem. IEEE Transactions on Evolutionary Computation, 27:1288--1297, 2023.
[112]
[DQ23b] Benjamin Doerr and Zhongdi Qu. From understanding the population dynamics of the NSGA-II to the first proven lower bounds. In Conference on Artificial Intelligence, AAAI 2023, pages 12408--12416. AAAI Press, 2023.
[113]
[DQ23c] Benjamin Doerr and Zhongdi Qu. Runtime analysis for the NSGA-II: provable speed-ups from crossover. In Conference on Artificial Intelligence, AAAI 2023, pages 12399--12407. AAAI Press, 2023.
[114]
[DR23] Benjamin Doerr and Amirhossein Rajabi. Stagnation detection meets fast mutation. Theoretical Computer Science, 946:113670, 2023.
[115]
[Dro02] Stefan Droste. Analysis of the (1+1) EA for a dynamically changing OneMax-variant. In Congress on Evolutionary Computation, CEC 2002, pages 55--60. IEEE, 2002.
[116]
[Dro03] Stefan Droste. Analysis of the (1+1) EA for a dynamically bitwise changing OneMax. In Genetic and Evolutionary Computation Conference, GECCO 2003, pages 909--921. Springer, 2003.
[117]
[Dro04] Stefan Droste. Analysis of the (1+1) EA for a noisy OneMax. In Genetic and Evolutionary Computation Conference, GECCO 2004, pages 1088--1099. Springer, 2004.
[118]
[Dro06] Stefan Droste. A rigorous analysis of the compact genetic algorithm for linear functions. Natural Computing, 5:257--283, 2006.
[119]
[DS19] Benjamin Doerr and Andrew M. Sutton. When resampling to cope with noise, use median, not mean. In Genetic and Evolutionary Computation Conference, GECCO 2019, pages 242--248. ACM, 2019.
[120]
[DSW13] Benjamin Doerr, Dirk Sudholt, and Carsten Witt. When do evolutionary algorithms optimize separable functions in parallel? In Foundations of Genetic Algorithms, FOGA 2013, pages 48--59. ACM, 2013.
[121]
[DT09] Benjamin Doerr and Madeleine Theile. Improved analysis methods for crossover-based algorithms. In Genetic and Evolutionary Computation Conference, GECCO 2009, pages 247--254. ACM, 2009.
[122]
[DW12a] Benjamin Doerr and Carola Winzen. Memory-restricted black-box complexity of OneMax. Information Processing Letters, 112:32--34, 2012.
[123]
[DW12b] Benjamin Doerr and Carola Winzen. Reducing the arity in unbiased black-box complexity. In Genetic and Evolutionary Computation Conference, GECCO 2012, pages 1309--1316. ACM, 2012.
[124]
[DW14] Benjamin Doerr and Carola Winzen. Ranking-based black-box complexity. Algorithmica, 68:571--609, 2014.
[125]
[DWY21] Benjamin Doerr, Carsten Witt, and Jing Yang. Runtime analysis for self-adaptive mutation rates. Algorithmica, 83:1012--1053, 2021.
[126]
[DZ20a] Benjamin Doerr and Weijie Zheng. From understanding genetic drift to a smart-restart parameter-less compact genetic algorithm. In Genetic and Evolutionary Computation Conference, GECCO 2020, pages 805--813. ACM, 2020.
[127]
[DZ20b] Benjamin Doerr and Weijie Zheng. Sharp bounds for genetic drift in estimation-of-distribution algorithms. IEEE Transactions on Evolutionary Computation, 24:1140--1149, 2020.
[128]
[DZ21] Benjamin Doerr and Weijie Zheng. Theoretical analyses of multi-objective evolutionary algorithms on multi-modal objectives. In Conference on Artificial Intelligence, AAAI 2021, pages 12293--12301. AAAI Press, 2021.
[129]
[EAvH90] A. E. Eiben, Emile H. L. Aarts, and Kees M. van Hee. Global convergence of genetic algorithms: A Markov chain analysis. In Parallel Problem Solving from Nature, PPSN 1990, pages 4--12. Springer, 1990.
[130]
[EGL+19] Hafsteinn Einarsson, Marcelo Matheus Gauy, Johannes Lengler, Florian Meier, Asier Mujika, Angelika Steger, and Felix Weissenberger. The linear hidden subset problem for the (1+1) EA with scheduled and adaptive mutation rates. Theoretical Computer Science, 785:150--170, 2019.
[131]
[EHM99] Agoston Endre Eiben, Robert Hinterding, and Zbigniew Michalewicz. Parameter control in evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 3:124--141, 1999.
[132]
[FGN+19] Tobias Friedrich, Andreas Göbel, Frank Neumann, Francesco Quinzan, and Ralf Rothenberger. Greedy maximization of functions with bounded curvature under partition matroid constraints. In Conference on Artificial Intelligence, AAAI 2019, pages 2272--2279. AAAI Press, 2019.
[133]
[FGQW18] Tobias Friedrich, Andreas Göbel, Francesco Quinzan, and Markus Wagner. Heavy-tailed mutation operators in single-objective combinatorial optimization. In Parallel Problem Solving from Nature, PPSN 2018, Part I, pages 134--145. Springer, 2018.
[134]
[FHH+09] Tobias Friedrich, Jun He, Nils Hebbinghaus, Frank Neumann, and Carsten Witt. Analyses of simple hybrid algorithms for the vertex cover problem. Evolutionary Computation, 17:3--19, 2009.
[135]
[FHN07] Tobias Friedrich, Nils Hebbinghaus, and Frank Neumann. Rigorous analyses of simple diversity mechanisms. In Genetic and Evolutionary Computation Conference, GECCO 2007, pages 1219--1225. ACM, 2007.
[136]
[FHN09] Tobias Friedrich, Nils Hebbinghaus, and Frank Neumann. Comparison of simple diversity mechanisms on plateau functions. Theoretical Computer Science, 410:2455--2462, 2009.
[137]
[FK13] Matthias Feldmann and Timo Kötzing. Optimizing expected path lengths with ant colony optimization using fitness proportional update. In Foundations of Genetic Algorithms, FOGA 2013, pages 65--74. ACM, 2013.
[138]
[FKK16] Tobias Friedrich, Timo Kötzing, and Martin S. Krejca. EDAs cannot be balanced and stable. In Genetic and Evolutionary Computation Conference, GECCO 2016, pages 1139--1146. ACM, 2016.
[139]
[FKKS17] Tobias Friedrich, Timo Kötzing, Martin S. Krejca, and Andrew M. Sutton. The compact genetic algorithm is efficient under extreme Gaussian noise. IEEE Transactions on Evolutionary Computation, 21:477--490, 2017.
[140]
[FKR+22] Tobias Friedrich, Timo Kötzing, Aishwarya Radhakrishnan, Leon Schiller, Martin Schirneck, Georg Tennigkeit, and Simon Wietheger. Crossover for cardinality constrained optimization. In Genetic and Evolutionary Computation Conference, GECCO 2022, pages 1399--1407. ACM, 2022.
[141]
[FN15] Tobias Friedrich and Frank Neumann. Maximizing submodular functions under matroid constraints by evolutionary algorithms. Evolutionary Computation, 23:543--558, 2015.
[142]
[Fog94] David B. Fogel. An introduction to simulated evolutionary optimization. IEEE Transactions on Neural Networks, 5:3--14, 1994.
[143]
[FQW18] Tobias Friedrich, Francesco Quinzan, and Markus Wagner. Escaping large deceptive basins of attraction with heavy-tailed mutation operators. In Genetic and Evolutionary Computation Conference, GECCO 2018, pages 293--300. ACM, 2018.
[144]
[FW04] Simon Fischer and Ingo Wegener. The Ising model on the ring: mutation versus recombination. In Genetic and Evolutionary Computation, GECCO 2004, pages 1113--1124. Springer, 2004.
[145]
[FW05] Simon Fischer and Ingo Wegener. The one-dimensional Ising model: Mutation versus recombination. Theoretical Computer Science, 344:208--225, 2005.
[146]
[GGL19] Tomas Gavenciak, Barbara Geissmann, and Johannes Lengler. Sorting by swaps with noisy comparisons. Algorithmica, 81:796--827, 2019.
[147]
[Gie03] Oliver Giel. Expected runtimes of a simple multi-objective evolutionary algorithm. In Congress on Evolutionary Computation, CEC 2003, pages 1918--1925. IEEE, 2003.
[148]
[GK16] Christian Gießen and Timo Kötzing. Robustness of populations in stochastic environments. Algorithmica, 75:462--489, 2016.
[149]
[GKS99] Josselin Garnier, Leila Kallel, and Marc Schoenauer. Rigorous hitting times for binary mutations. Evolutionary Computation, 7:173--203, 1999.
[150]
[Gol89] David E. Goldberg. Genetic Algorithms in Search, Optimization and Machine Learning. Addison-Wesley Longman Publishing Co., Inc., 1989.
[151]
[GP14] Brian W. Goldman and William F. Punch. Parameter-less population pyramid. In Genetic and Evolutionary Computation Conference, GECCO 2014, pages 785--792. ACM, 2014.
[152]
[GW03] Oliver Giel and Ingo Wegener. Evolutionary algorithms and the maximum matching problem. In Symposium on Theoretical Aspects of Computer Science, STACS 2003, pages 415--426. Springer, 2003.
[153]
[GW17] Christian Gießen and Carsten Witt. The interplay of population size and mutation probability in the (1+λ) EA on OneMax. Algorithmica, 78:587--609, 2017.
[154]
[GW18] Christian Gießen and Carsten Witt. Optimal mutation rates for the (1 + λ) EA on OneMax through asymptotically tight drift analysis. Algorithmica, 80:1710--1731, 2018.
[155]
[Han99] Thomas Hanne. On the convergence of multiobjective evolutionary algorithms. European Journal of Operational Research, 117:553--564, 1999.
[156]
[HGAK06] Nikolaus Hansen, Fabian Gemperle, Anne Auger, and Petros Koumoutsakos. When do heavy-tail distributions help? In Parallel Problem Solving from Nature, PPSN 2006, pages 62--71. Springer, 2006.
[157]
[HGD94] Jeffrey Horn, David E. Goldberg, and Kalyanmoy Deb. Long path problems. In Parallel Problem Solving from Nature, PPSN 1994, pages 149--158. Springer, 1994.
[158]
[HJKN08] Edda Happ, Daniel Johannsen, Christian Klein, and Frank Neumann. Rigorous analyses of fitness-proportional selection for optimizing linear functions. In Genetic and Evolutionary Computation Conference, GECCO 2008, pages 953--960. ACM, 2008.
[159]
[HLG99] Georges R. Harik, Fernando G. Lobo, and David E. Goldberg. The compact genetic algorithm. IEEE Transactions on Evolutionary Computation, 3:287--297, 1999.
[160]
[HS18] Václav Hasenöhrl and Andrew M. Sutton. On the runtime dynamics of the compact genetic algorithm on jump functions. In Genetic and Evolutionary Computation Conference, GECCO 2018, pages 967--974. ACM, 2018.
[161]
[HS21a] Mario Alejandro Hevia Fajardo and Dirk Sudholt. Self-adjusting offspring population sizes outperform fixed parameters on the cliff function. In Foundations of Genetic Algorithms, FOGA 2021, pages 5:1--5:15. ACM, 2021.
[162]
[HS21b] Mario Alejandro Hevia Fajardo and Dirk Sudholt. Self-adjusting population sizes for non-elitist evolutionary algorithms: why success rates matter. In Genetic and Evolutionary Computation Conference, GECCO 2021, pages 1151--1159. ACM, 2021.
[163]
[HS22] Mario Alejandro Hevia Fajardo and Dirk Sudholt. Theoretical and empirical analysis of parameter control mechanisms in the (1 + (λ, λ)) genetic algorithm. ACM Transactions on Evolutionary Learning and Optimization, 2:13:1--13:39, 2022.
[164]
[HY01] Jun He and Xin Yao. Drift analysis and average time complexity of evolutionary algorithms. Artificial Intelligence, 127:51--81, 2001.
[165]
[HZ20] Zhengxin Huang and Yuren Zhou. Runtime analysis of somatic contiguous hypermutation operators in MOEA/D framework. In Conference on Artificial Intelligence, AAAI 2020, pages 2359--2366. AAAI Press, 2020.
[166]
[HZC+21] Zhengxin Huang, Yuren Zhou, Zefeng Chen, Xiaoyu He, Xinsheng Lai, and Xiaoyun Xia. Running time analysis of MOEA/D on pseudo-Boolean functions. IEEE Transactions on Cybernetics, 51:5130--5141, 2021.
[167]
[HZCH19] Zhengxin Huang, Yuren Zhou, Zefeng Chen, and Xiaoyu He. Running time analysis of MOEA/D with crossover on discrete optimization problem. In Conference on Artificial Intelligence, AAAI 2019, pages 2296--2303. AAAI Press, 2019.
[168]
[HZLL21] Zhengxin Huang, Yuren Zhou, Chuan Luo, and Qingwei Lin. A runtime analysis of typical decomposition approaches in MOEA/D framework for many-objective optimization problems. In International Joint Conference on Artificial Intelligence, IJCAI 2021, pages 1682--1688, 2021.
[169]
[Jäg08] Jens Jägersküpper. A blend of Markov-chain and drift analysis. In Parallel Problem Solving From Nature, PPSN 2008, pages 41--51. Springer, 2008.
[170]
[Jan07] Thomas Jansen. On the brittleness of evolutionary algorithms. In Foundations of Genetic Algorithms, FOGA 2007, pages 54--69. Springer, 2007.
[171]
[Jan13] Thomas Jansen. Analyzing Evolutionary Algorithms - The Computer Science Perspective. Springer, 2013.
[172]
[JJW05] Thomas Jansen, Kenneth A. De Jong, and Ingo Wegener. On the choice of the offspring population size in evolutionary algorithms. Evolutionary Computation, 13:413--440, 2005.
[173]
[JOZ13] Thomas Jansen, Pietro S. Oliveto, and Christine Zarges. Approximating vertex cover using edge-based representations. In Foundations of Genetic Algorithms, FOGA 2013, pages 87--96. ACM, 2013.
[174]
[JS05] Thomas Jansen and Ulf Schellbach. Theoretical analysis of a mutation-based evolutionary algorithm for a tracking problem in the lattice. In Genetic and Evolutionary Computation Conference, GECCO 2005, pages 841--848. ACM, 2005.
[175]
[JS07] Jens Jägersküpper and Tobias Storch. When the plus strategy outperforms the comma strategy and when not. In Foundations of Computational Intelligence, FOCI 2007, pages 25--32. IEEE, 2007.
[176]
[JW99] Thomas Jansen and Ingo Wegener. On the analysis of evolutionary algorithms - a proof that crossover really can help. In European Symposium on Algorithms, ESA 1999, pages 184--193. Springer, 1999.
[177]
[JW00] Thomas Jansen and Ingo Wegener. On the choice of the mutation probability for the (1+1) EA. In Parallel Problem Solving from Nature, PPSN 2000, pages 89--98. Springer, 2000.
[178]
[JW02] Thomas Jansen and Ingo Wegener. The analysis of evolutionary algorithms - a proof that crossover really can help. Algorithmica, 34:47--66, 2002.
[179]
[JW05] Thomas Jansen and Ingo Wegener. Real royal road functions - where crossover provably is essential. Discrete Applied Mathematics, 149:111--125, 2005.
[180]
[JW06] Thomas Jansen and Ingo Wegener. On the analysis of a dynamic evolutionary algorithm. Journal of Discrete Algorithms, 4:181--199, 2006.
[181]
[JZ12] Thomas Jansen and Christine Zarges. Fixed budget computations: a different perspective on run time analysis. In Terence Soule and Jason H. Moore, editors, Genetic and Evolutionary Computation Conference, GECCO 2012, pages 1325--1332. ACM, 2012.
[182]
[JZ14a] Thomas Jansen and Christine Zarges. Performance analysis of randomised search heuristics operating with a fixed budget. Theoretical Computer Science, 545:39--58, 2014.
[183]
[JZ14b] Thomas Jansen and Christine Zarges. Reevaluating immune-inspired hypermutations using the fixed budget perspective. IEEE Transactions on Evolutionary Computation, 18:674--688, 2014.
[184]
[KAD19] Vitalii Karavaev, Denis Antipov, and Benjamin Doerr. Theoretical and empirical study of the (1 + (λ, λ)) EA on the LeadingOnes problem. In Genetic and Evolutionary Computation Conference, GECCO 2019, Companion Material, pages 2036--2039. ACM, 2019.
[185]
[KD06] Saku Kukkonen and Kalyanmoy Deb. Improved pruning of non-dominated solutions based on crowding distance for bi-objective optimization problems. In Congress on Evolutionary Computation, CEC 2006, pages 1179--1186. IEEE, 2006.
[186]
[KHE15] Giorgos Karafotias, Mark Hoogendoorn, and Ágoston E. Eiben. Parameter control in evolutionary algorithms: trends and challenges. IEEE Transactions on Evolutionary Computation, 19:167--187, 2015.
[187]
[KLLZ23] Marc Kaufmann, Maxime Larcher, Johannes Lengler, and Xun Zou. Self-adjusting population sizes for the (1, λ)-EA on monotone functions. Theoretical Computer Science, 979:114181, 2023.
[188]
[KLNO10] Stefan Kratsch, Per Kristian Lehre, Frank Neumann, and Pietro S. Oliveto. Fixed parameter evolutionary algorithms and maximum leaf spanning trees: a matter of mutation. In Parallel Problem Solving from Nature, PPSN 2010, Part I, pages 204--213. Springer, 2010.
[189]
[KLW15] Timo Kötzing, Andrei Lissovoi, and Carsten Witt. (1+1) EA on generalized dynamic OneMax. In Foundations of Genetic Algorithms, FOGA 2015, pages 40--51. ACM, 2015.
[190]
[KN13] Stefan Kratsch and Frank Neumann. Fixed-parameter evolutionary algorithms and the vertex cover problem. Algorithmica, 65:754--771, 2013.
[191]
[KST11] Timo Kötzing, Dirk Sudholt, and Madeleine Theile. How crossover helps in pseudo-Boolean optimization. In Genetic and Evolutionary Computation Conference, GECCO 2011, pages 989--996. ACM, 2011.
[192]
[KW20a] Timo Kötzing and Carsten Witt. Improved fixed-budget results via drift analysis. In Parallel Problem Solving from Nature, PPSN 2020, Part II, pages 648--660. Springer, 2020.
[193]
[KW20b] Martin S. Krejca and Carsten Witt. Lower bounds on the run time of the Univariate Marginal Distribution Algorithm on OneMax. Theoretical Computer Science, 832:143--165, 2020.
[194]
[Leh10] Per Kristian Lehre. Negative drift in populations. In Parallel Problem Solving from Nature, PPSN 2010, pages 244--253. Springer, 2010.
[195]
[Leh11] Per Kristian Lehre. Fitness-levels for non-elitist populations. In Genetic and Evolutionary Computation Conference, GECCO 2011, pages 2075--2082. ACM, 2011.
[196]
[Len20] Johannes Lengler. A general dichotomy of evolutionary algorithms on monotone functions. IEEE Transactions on Evolutionary Computation, 24:995--1009, 2020.
[197]
[LM24] Johannes Lengler and Jonas Meier. Large population sizes and crossover help in dynamic environments. Natural Computing, 23:115--129, 2024.
[198]
[LMS19] Johannes Lengler, Anders Martinsson, and Angelika Steger. When does hillclimbing fail on monotone functions: an entropy compression argument. In Analytic Algorithmics and Combinatorics, ANALCO 2019, pages 94--102. SIAM, 2019.
[199]
[LN17] Per Kristian Lehre and Phan Trung Hai Nguyen. Improved runtime bounds for the univariate marginal distribution algorithm via anti-concentration. In Genetic and Evolutionary Computation Conference, GECCO 2017, pages 1383--1390. ACM, 2017.
[200]
[LN18] Per Kristian Lehre and Phan Trung Hai Nguyen. Level-based analysis of the population-based incremental learning algorithm. In Parallel Problem Solving From Nature, PPSN 2018, pages 105--116. Springer, 2018.
[201]
[LN19a] Per Kristian Lehre and Phan Trung Hai Nguyen. On the limitations of the univariate marginal distribution algorithm to deception and where bivariate EDAs might help. In Foundations of Genetic Algorithms, FOGA 2019, pages 154--168. ACM, 2019.
[202]
[LN19b] Per Kristian Lehre and Phan Trung Hai Nguyen. Runtime analysis of the univariate marginal distribution algorithm under low selective pressure and prior noise. In Genetic and Evolutionary Computation Conference, GECCO 2019, pages 1497--1505. ACM, 2019.
[203]
[LOW20] Andrei Lissovoi, Pietro S. Oliveto, and John Alasdair Warwicker. Simple hyper-heuristics control the neighbourhood size of randomised local search optimally for LeadingOnes. Evolutionary Computation, 28:437--461, 2020.
[204]
[LOW23] Andrei Lissovoi, Pietro S. Oliveto, and John Alasdair Warwicker. When move acceptance selection hyper-heuristics outperform Metropolis and elitist evolutionary algorithms and when not. Artificial Intelligence, 314:103804, 2023.
[205]
[LS11] Jörg Lässig and Dirk Sudholt. Adaptive population models for offspring populations and parallel evolutionary algorithms. In Foundations of Genetic Algorithms, FOGA 2011, pages 181--192. ACM, 2011.
[206]
[LS15] Johannes Lengler and Nicholas Spooner. Fixed budget performance of the (1+1) EA on linear functions. In Foundations of Genetic Algorithms, FOGA 2015, pages 52--61. ACM, 2015.
[207]
[LS18] Johannes Lengler and Angelika Steger. Drift analysis and evolutionary algorithms revisited. Combinatorics, Probability & Computing, 27:643--666, 2018.
[208]
[LSW21] Johannes Lengler, Dirk Sudholt, and Carsten Witt. The complex parameter landscape of the compact genetic algorithm. Algorithmica, 83:1096--1137, 2021.
[209]
[LTZ+02] Marco Laumanns, Lothar Thiele, Eckart Zitzler, Emo Welzl, and Kalyanmoy Deb. Running time analysis of multi-objective evolutionary algorithms on a simple discrete optimization problem. In Parallel Problem Solving from Nature, PPSN 2002, pages 44--53. Springer, 2002.
[210]
[LTZ04] Marco Laumanns, Lothar Thiele, and Eckart Zitzler. Running time analysis of multiobjective evolutionary algorithms on pseudo-Boolean functions. IEEE Transactions on Evolutionary Computation, 8:170--182, 2004.
[211]
[LW12] Per Kristian Lehre and Carsten Witt. Black-box search by unbiased variation. Algorithmica, 64:623--642, 2012.
[212]
[LW16] Andrei Lissovoi and Carsten Witt. MMAS versus population-based EA on a family of dynamic fitness functions. Algorithmica, 75:554--576, 2016.
[213]
[LY11] Per Kristian Lehre and Xin Yao. Crossover can be constructive when computing unique input-output sequences. Soft Computing, 15:1675--1687, 2011.
[214]
[LY12] Per Kristian Lehre and Xin Yao. On the impact of mutation-selection balance on the runtime of evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 16:225--241, 2012.
[215]
[LZ21] Johannes Lengler and Xun Zou. Exponential slowdown for larger populations: The (μ + 1)-EA on monotone functions. Theoretical Computer Science, 875:28--51, 2021.
[216]
[LZZZ16] Yuan-Long Li, Yu-Ren Zhou, Zhi-Hui Zhan, and Jun Zhang. A primary theoretical study on decomposition-based multiobjective evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 20:563--576, 2016.
[217]
[Müh92] Heinz Mühlenbein. How genetic algorithms really work: mutation and hillclimbing. In Parallel Problem Solving from Nature, PPSN 1992, pages 15--26. Elsevier, 1992.
[218]
[Neu04] Frank Neumann. Expected runtimes of evolutionary algorithms for the Eulerian cycle problem. In Congress on Evolutionary Computation, CEC 2004, pages 904--910. IEEE, 2004.
[219]
[NNS17] Samadhi Nallaperuma, Frank Neumann, and Dirk Sudholt. Expected fitness gains of randomized search heuristics for the traveling salesperson problem. Evolutionary Computation, 25:673--705, 2017.
[220]
[NOW09] Frank Neumann, Pietro S. Oliveto, and Carsten Witt. Theoretical analysis of fitness-proportional selection: landscapes and efficiency. In Genetic and Evolutionary Computation Conference, GECCO 2009, pages 835--842. ACM, 2009.
[221]
[NS18] Frank Neumann and Andrew M. Sutton. Runtime analysis of evolutionary algorithms for the knapsack problem with favorably correlated weights. In Parallel Problem Solving from Nature, PPSN 2018, Part II, pages 141--152. Springer, 2018.
[222]
[NS19] Frank Neumann and Andrew M. Sutton. Runtime analysis of the (1 + 1) evolutionary algorithm for the chance-constrained knapsack problem. In Foundations of Genetic Algorithms, FOGA 2019, pages 147--153. ACM, 2019.
[223]
[NSN15] Anh Quang Nguyen, Andrew M. Sutton, and Frank Neumann. Population size matters: rigorous runtime results for maximizing the hypervolume indicator. Theoretical Computer Science, 561:24--36, 2015.
[224]
[NSW22] Frank Neumann, Dirk Sudholt, and Carsten Witt. The compact genetic algorithm struggles on Cliff functions. In Genetic and Evolutionary Computation Conference, GECCO 2022, pages 1426--1433. ACM, 2022.
[225]
[NT10] Frank Neumann and Madeleine Theile. How crossover speeds up evolutionary algorithms for the multi-criteria all-pairs-shortest-path problem. In Parallel Problem Solving from Nature, PPSN 2010, Part I, pages 667--676. Springer, 2010.
[226]
[NW07] Frank Neumann and Ingo Wegener. Randomized local search, evolutionary algorithms, and the minimum spanning tree problem. Theoretical Computer Science, 378:32--40, 2007.
[227]
[NW10] Frank Neumann and Carsten Witt. Bioinspired Computation in Combinatorial Optimization - Algorithms and Their Computational Complexity. Springer, 2010.
[228]
[Och02] Gabriela Ochoa. Setting the mutation rate: scope and limitations of the 1/L heuristic. In Genetic and Evolutionary Computation Conference, GECCO 2002, pages 495--502. Morgan Kaufmann, 2002.
[229]
[ODNS24] Andre Opris, Duc-Cuong Dang, Frank Neumann, and Dirk Sudholt. Runtime analyses of NSGA-III on many-objective problems. In Genetic and Evolutionary Computation Conference, GECCO 2024. ACM, 2024. To appear.
[230]
[OHY09] Pietro S. Oliveto, Jun He, and Xin Yao. Analysis of the (1+1)-EA for finding approximate solutions to vertex cover problems. IEEE Transactions on Evolutionary Computation, 13:1006--1029, 2009.
[231]
[OSW20] Pietro S. Oliveto, Dirk Sudholt, and Carsten Witt. A tight lower bound on the expected runtime of standard steady state genetic algorithms. In Genetic and Evolutionary Computation Conference, GECCO 2020, pages 1323--1331. ACM, 2020.
[232]
[OSZ19] Pietro S. Oliveto, Dirk Sudholt, and Christine Zarges. On the benefits and risks of using fitness sharing for multimodal optimisation. Theoretical Computer Science, 773:53--70, 2019.
[233]
[OW15] Pietro S. Oliveto and Carsten Witt. Improved time complexity analysis of the simple genetic algorithm. Theoretical Computer Science, 605:21--41, 2015.
[234]
[Pos09] Petr Posík. BBOB-benchmarking a simple estimation of distribution algorithm with Cauchy distribution. In Genetic and Evolutionary Computation Conference, GECCO 2009, Companion Material, pages 2309--2314. ACM, 2009.
[235]
[Pos10] Petr Posík. Comparison of Cauchy EDA and BIPOP-CMA-ES algorithms on the BBOB noiseless testbed. In Genetic and Evolutionary Computation Conference, GECCO 2010, Companion Material, pages 1697--1702. ACM, 2010.
[236]
[Prü04] Adam Prügel-Bennett. When a genetic algorithm outperforms hill-climbing. Theoretical Computer Science, 320:135--153, 2004.
[237]
[QBJT19] Chao Qian, Chao Bian, Wu Jiang, and Ke Tang. Running time analysis of the (1 + 1)-EA for OneMax and LeadingOnes under bit-wise noise. Algorithmica, 81:749--795, 2019.
[238]
[QGWF21] Francesco Quinzan, Andreas Göbel, Markus Wagner, and Tobias Friedrich. Evolutionary algorithms and submodular functions: benefits of heavy-tailed mutations. Natural Computing, 20:561--575, 2021.
[239]
[QSY+17] Chao Qian, Jing-Cheng Shi, Yang Yu, Ke Tang, and Zhi-Hua Zhou. Optimizing ratio of monotone set functions. In International Joint Conference on Artificial Intelligence, IJCAI 2017, pages 2606--2612. ijcai.org, 2017.
[240]
[QSYT17] Chao Qian, Jing-Cheng Shi, Yang Yu, and Ke Tang. On subset selection with general cost constraints. In International Joint Conference on Artificial Intelligence, IJCAI 2017, pages 2613--2619. ijcai.org, 2017.
[241]
[QTZ16] Chao Qian, Ke Tang, and Zhi-Hua Zhou. Selection hyper-heuristics can provably be helpful in evolutionary multi-objective optimization. In Parallel Problem Solving from Nature, PPSN 2016, pages 835--846. Springer, 2016.
[242]
[QYT+19] Chao Qian, Yang Yu, Ke Tang, Xin Yao, and Zhi-Hua Zhou. Maximizing submodular or monotone approximately submodular functions by multi-objective evolutionary algorithms. Artificial Intelligence, 275:279--294, 2019.
[243]
[QYZ13] Chao Qian, Yang Yu, and Zhi-Hua Zhou. An analysis on recombination in multi-objective evolutionary optimization. Artificial Intelligence, 204:99--119, 2013.
[244]
[QYZ15] Chao Qian, Yang Yu, and Zhi-Hua Zhou. On constrained Boolean Pareto optimization. In International Joint Conference on Artificial Intelligence, IJCAI 2015, pages 389--395. AAAI Press, 2015.
[245]
[RNNF19] Vahid Roostapour, Aneta Neumann, Frank Neumann, and Tobias Friedrich. Pareto optimization for subset selection with dynamic cost constraints. In Conference on Artificial Intelligence, AAAI 2019, pages 2354--2361. AAAI Press, 2019.
[246]
[RS14] Jonathan E. Rowe and Dirk Sudholt. The choice of the offspring population size in the (1, λ) evolutionary algorithm. Theoretical Computer Science, 545:20--38, 2014.
[247]
[Rud94] Günter Rudolph. Convergence analysis of canonical genetic algorithms. IEEE Transactions on Neural Networks, 5:96--101, 1994.
[248]
[Rud96] Günter Rudolph. How mutation and selection solve long path problems in polynomial expected time. Evolutionary Computation, 4:195--205, 1996.
[249]
[Rud97] Günter Rudolph. Convergence Properties of Evolutionary Algorithms. Verlag Dr. Kovač, 1997.
[250]
[Rud98a] Günter Rudolph. Evolutionary search for minimal elements in partially ordered finite sets. In Evolutionary Programming, EP 1998, pages 345--353. Springer, 1998.
[251]
[Rud98b] Günter Rudolph. On a multi-objective evolutionary algorithm and its convergence to the Pareto set. In International Conference on Evolutionary Computation, ICEC 1998, pages 511--516. IEEE, 1998.
[252]
[RW20] Amirhossein Rajabi and Carsten Witt. Evolutionary algorithms with self-adjusting asymmetric mutation. In Parallel Problem Solving from Nature, PPSN 2020, Part I, pages 664--677. Springer, 2020.
[253]
[RW21a] Amirhossein Rajabi and Carsten Witt. Stagnation detection in highly multimodal fitness landscapes. In Genetic and Evolutionary Computation Conference, GECCO 2021, pages 1178--1186. ACM, 2021.
[254]
[RW21b] Amirhossein Rajabi and Carsten Witt. Stagnation detection with randomized local search. In Evolutionary Computation in Combinatorial Optimization, EvoCOP 2021, pages 152--168. Springer, 2021.
[255]
[RW22] Amirhossein Rajabi and Carsten Witt. Self-adjusting evolutionary algorithms for multimodal optimization. Algorithmica, 84:1694--1723, 2022.
[256]
[RWP08] J. Neal Richter, Alden H. Wright, and John Paxton. Ignoble trails - where crossover is provably harmful. In Parallel Problem Solving from Nature, PPSN 2008, pages 92--101. Springer, 2008.
[257]
[SGS11] Tom Schaul, Tobias Glasmachers, and Jürgen Schmidhuber. High dimensions and heavy tails for natural evolution strategies. In Genetic and Evolutionary Computation Conference, GECCO 2011, pages 845--852. ACM, 2011.
[258]
[SH87] Harold H. Szu and Ralph L. Hartley. Fast simulated annealing. Physics Letters A, 122:157--162, 1987.
[259]
[Sha02] Jonathan L. Shapiro. The sensitivity of PBIL to its learning rate, and how detailed balance can remove it. In Foundations of Genetic Algorithms, FOGA 2002, pages 115--132. Morgan Kaufmann, 2002.
[260]
[Sha05] Jonathan L. Shapiro. Drift and scaling in estimation of distribution algorithms. Evolutionary Computation, 13:99--123, 2005.
[261]
[Sha06] Jonathan L. Shapiro. Diversity loss in general estimation of distribution algorithms. In Parallel Problem Solving from Nature, PPSN 2006, pages 92--101. Springer, 2006.
[262]
[SN12] Andrew M. Sutton and Frank Neumann. A parameterized runtime analysis of evolutionary algorithms for the Euclidean traveling salesperson problem. In Conference on Artificial Intelligence, AAAI 2012, pages 1105--1111. AAAI Press, 2012.
[263]
[SNN14] Andrew M. Sutton, Frank Neumann, and Samadhi Nallaperuma. Parameterized runtime analyses of evolutionary algorithms for the planar Euclidean traveling salesperson problem. Evolutionary Computation, 22:595--628, 2014.
[264]
[ST12] Dirk Sudholt and Christian Thyssen. A simple ant colony optimizer for stochastic shortest path problems. Algorithmica, 64:643--672, 2012.
[265]
[Sto06] Tobias Storch. How randomized search heuristics find maximum cliques in planar graphs. In Genetic and Evolutionary Computation Conference, GECCO 2006, pages 567--574. ACM, 2006.
[266]
[Sto07] Tobias Storch. Finding large cliques in sparse semi-random graphs by simple randomized search heuristics. Theoretical Computer Science, 386:114--131, 2007.
[267]
[Sto08] Tobias Storch. On the choice of the parent population size. Evolutionary Computation, 16:557--578, 2008.
[268]
[STW04] Jens Scharnow, Karsten Tinnefeld, and Ingo Wegener. The analysis of evolutionary algorithms on sorting and shortest paths problems. Journal of Mathematical Modelling and Algorithms, 3:349--366, 2004.
[269]
[Sud05] Dirk Sudholt. Crossover is provably essential for the Ising model on trees. In Genetic and Evolutionary Computation Conference, GECCO 2005, pages 1161--1167. ACM, 2005.
[270]
[Sud13] Dirk Sudholt. A new method for lower bounds on the running time of evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 17:418--435, 2013.
[271]
[Sud17] Dirk Sudholt. How crossover speeds up building block assembly in genetic algorithms. Evolutionary Computation, 25:237--274, 2017.
[272]
[Sud21] Dirk Sudholt. Analysing the robustness of evolutionary algorithms to noise: refined runtime bounds and an example where noise is beneficial. Algorithmica, 83:976--1011, 2021.
[273]
[Sut21] Andrew M. Sutton. Fixed-parameter tractability of crossover: steady-state GAs on the closest string problem. Algorithmica, 83:1138--1163, 2021.
[274]
[SW04] Tobias Storch and Ingo Wegener. Real royal road functions for constant population size. Theoretical Computer Science, 320:123--134, 2004.
[275]
[SW19] Dirk Sudholt and Carsten Witt. On the choice of the update strength in estimation-of-distribution algorithms and ant colony optimization. Algorithmica, 81:1450--1489, 2019.
[276]
[SZ10] Dirk Sudholt and Christine Zarges. Analysis of an iterated local search algorithm for vertex coloring. In International Symposium on Algorithms and Computation, ISAAC 2010, Part I, pages 340--352. Springer, 2010.
[277]
[Thi03] Dirk Thierens. Convergence time analysis for the multi-objective counting ones problem. In Evolutionary Multi-Criterion Optimization, EMO 2003, pages 355--364. Springer, 2003.
[278]
[WD23] Simon Wietheger and Benjamin Doerr. A mathematical runtime analysis of the Non-dominated Sorting Genetic Algorithm III (NSGA-III). In International Joint Conference on Artificial Intelligence, IJCAI 2023, pages 5657--5665. ijcai.org, 2023.
[279]
[Wit05] Carsten Witt. Worst-case and average-case approximations by simple randomized search heuristics. In Symposium on Theoretical Aspects of Computer Science, STACS 2005, pages 44--56. Springer, 2005.
[280]
[Wit06] Carsten Witt. Runtime analysis of the (μ + 1) EA on simple pseudo-Boolean functions. Evolutionary Computation, 14:65--86, 2006.
[281]
[Wit13] Carsten Witt. Tight bounds on the optimization time of a randomized search heuristic on linear functions. Combinatorics, Probability & Computing, 22:294--318, 2013.
[282]
[Wit14] Carsten Witt. Revised analysis of the (1+1) EA for the minimum spanning tree problem. In Genetic and Evolutionary Computation Conference, GECCO 2014, pages 509--516. ACM, 2014.
[283]
[Wit19] Carsten Witt. Upper bounds on the running time of the univariate marginal distribution algorithm on OneMax. Algorithmica, 81:632--667, 2019.
[284]
[Wit23] Carsten Witt. How majority-vote crossover and estimation-of-distribution algorithms cope with fitness valleys. Theoretical Computer Science, 940:18--42, 2023.
[285]
[WJ07] Richard A. Watson and Thomas Jansen. A building-block royal road where crossover is provably essential. In Genetic and Evolutionary Computation Conference, GECCO 2007, pages 1452--1459. ACM, 2007.
[286]
[WPN16] Junhua Wu, Sergey Polyakovskiy, and Frank Neumann. On the impact of the renting rate for the unconstrained nonlinear knapsack problem. In Genetic and Evolutionary Computation Conference, GECCO 2016, pages 413--419. ACM, 2016.
[287]
[WZD21] Shouda Wang, Weijie Zheng, and Benjamin Doerr. Choosing the right algorithm with hints from complexity theory. In International Joint Conference on Artificial Intelligence, IJCAI 2021, pages 1697--1703. ijcai.org, 2021.
[288]
[XNNS21] Yue Xie, Aneta Neumann, Frank Neumann, and Andrew M. Sutton. Runtime analysis of RLS and the (1+1) EA for the chance-constrained knapsack problem with correlated uniform weights. In Genetic and Evolutionary Computation Conference, GECCO 2021, pages 1187--1194. ACM, 2021.
[289]
[YL97] Xin Yao and Yong Liu. Fast evolution strategies. In Evolutionary Programming, volume 1213 of Lecture Notes in Computer Science, pages 151--162. Springer, 1997.
[290]
[YLL99] Xin Yao, Yong Liu, and Guangming Lin. Evolutionary programming made faster. IEEE Transactions on Evolutionary Computation, 3:82--102, 1999.
[291]
[YQZ15] Yang Yu, Chao Qian, and Zhi-Hua Zhou. Switch analysis for running time analysis of evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 19:777--792, 2015.
[292]
[ZD22] Weijie Zheng and Benjamin Doerr. Better approximation guarantees for the NSGA-II by using the current crowding distance. In Genetic and Evolutionary Computation Conference, GECCO 2022, pages 611--619. ACM, 2022.
[293]
[ZD23a] Weijie Zheng and Benjamin Doerr. From understanding genetic drift to a smart-restart mechanism for estimation-of-distribution algorithms. Journal of Machine Learning Research, 24:1--40, 2023.
[294]
[ZD23b] Weijie Zheng and Benjamin Doerr. Mathematical runtime analysis for the non-dominated sorting genetic algorithm II (NSGA-II). Artificial Intelligence, 325:104016, 2023.
[295]
[ZD23c] Weijie Zheng and Benjamin Doerr. Runtime analysis for the NSGA-II: proving, quantifying, and explaining the inefficiency for many objectives. IEEE Transactions on Evolutionary Computation, 2023. In press.
[296]
[ZD24] Weijie Zheng and Benjamin Doerr. Runtime analysis of the SMS-EMOA for many-objective optimization. In Conference on Artificial Intelligence, AAAI 2024, pages 20874--20882. AAAI Press, 2024.
[297]
[ZL07] Qingfu Zhang and Hui Li. MOEA/D: A multiobjective evolutionary algorithm based on decomposition. IEEE Transactions on Evolutionary Computation, 11:712--731, 2007.
[298]
[ZLD22] Weijie Zheng, Yufei Liu, and Benjamin Doerr. A first mathematical runtime analysis of the Non-Dominated Sorting Genetic Algorithm II (NSGA-II). In Conference on Artificial Intelligence, AAAI 2022, pages 10408--10416. AAAI Press, 2022.
[299]
[ZLDD24] Weijie Zheng, Mingfeng Li, Renzhong Deng, and Benjamin Doerr. How to use the Metropolis algorithm for multi-objective optimizations. In Conference on Artificial Intelligence, AAAI 2024, pages 20883--20891. AAAI Press, 2024.
[300]
[ZQL+11] Aimin Zhou, Bo-Yang Qu, Hui Li, Shi-Zheng Zhao, Ponnuthurai Nagaratnam Suganthan, and Qingfu Zhang. Multiobjective evolutionary algorithms: A survey of the state of the art. Swarm and Evolutionary Computation, 1:32--49, 2011.


Published In

GECCO '24 Companion: Proceedings of the Genetic and Evolutionary Computation Conference Companion
July 2024
2187 pages
ISBN: 9798400704956
DOI: 10.1145/3638530

Publisher

Association for Computing Machinery

New York, NY, United States


Qualifiers

  • Tutorial

Funding Sources

  • FMJH Program Gaspard Monge for optimization and operations research and their interactions with data science

Conference

GECCO '24 Companion

Acceptance Rates

Overall Acceptance Rate 1,669 of 4,410 submissions, 38%
