A Gentle Introduction to Theory (for Non-Theoreticians)
References
[1]
[ABD20] Denis Antipov, Maxim Buzdalov, and Benjamin Doerr. First steps towards a runtime analysis when starting with a good solution. In Parallel Problem Solving From Nature, PPSN 2020, Part II, pages 560--573. Springer, 2020.
[2]
[ABD21] Denis Antipov, Maxim Buzdalov, and Benjamin Doerr. Lazy parameter tuning and control: choosing all parameters randomly from a power-law distribution. In Genetic and Evolutionary Computation Conference, GECCO 2021, pages 1115--1123. ACM, 2021.
[3]
[ABD22] Denis Antipov, Maxim Buzdalov, and Benjamin Doerr. Fast mutation in crossover-based algorithms. Algorithmica, 84:1724--1761, 2022.
[4]
[AD11] Anne Auger and Benjamin Doerr, editors. Theory of Randomized Search Heuristics. World Scientific Publishing, 2011.
[5]
[AD20] Denis Antipov and Benjamin Doerr. Runtime analysis of a heavy-tailed (1 + (λ, λ)) genetic algorithm on jump functions. In Parallel Problem Solving From Nature, PPSN 2020, Part II, pages 545--559. Springer, 2020.
[6]
[AD21] Denis Antipov and Benjamin Doerr. A tight runtime analysis for the (μ + λ) EA. Algorithmica, 83:1054--1095, 2021.
[7]
[ADFH18] Denis Antipov, Benjamin Doerr, Jiefeng Fang, and Tangi Hetet. Runtime analysis for the (μ + λ) EA optimizing OneMax. In Genetic and Evolutionary Computation Conference, GECCO 2018, pages 1459--1466. ACM, 2018.
[8]
[ADK19] Denis Antipov, Benjamin Doerr, and Vitalii Karavaev. A tight runtime analysis for the (1 + (λ, λ)) GA on LeadingOnes. In Foundations of Genetic Algorithms, FOGA 2019, pages 169--182. ACM, 2019.
[9]
[ADK22] Denis Antipov, Benjamin Doerr, and Vitalii Karavaev. A rigorous runtime analysis of the (1 + (λ, λ)) GA on jump functions. Algorithmica, 84:1573--1602, 2022.
[10]
[ADY19] Denis Antipov, Benjamin Doerr, and Quentin Yang. The efficiency threshold for the offspring population size of the (μ, λ) EA. In Genetic and Evolutionary Computation Conference, GECCO 2019, pages 1461--1469. ACM, 2019.
[11]
[AL14] Fawaz Alanazi and Per Kristian Lehre. Runtime analysis of selection hyper-heuristics with classical learning mechanisms. In Congress on Evolutionary Computation, CEC 2014, pages 2515--2523. IEEE, 2014.
[12]
[AMT15] Youhei Akimoto, Sandra Astete Morales, and Olivier Teytaud. Analysis of runtime of optimization algorithms for noisy functions over discrete codomains. Theoretical Computer Science, 605:42--50, 2015.
[13]
[Bäc93] Thomas Bäck. Optimal mutation rates in genetic search. In International Conference on Genetic Algorithms, ICGA 1993, pages 2--8. Morgan Kaufmann, 1993.
[14]
[Bäc96] Thomas Bäck. Evolutionary Algorithms in Theory and Practice - Evolution Strategies, Evolutionary Programming, Genetic Algorithms. Oxford University Press, 1996.
[15]
[BB19] Anton Bassin and Maxim Buzdalov. The 1/5-th rule with rollbacks: on self-adjustment of the population size in the (1 + (λ, λ)) GA. In Genetic and Evolutionary Computation Conference Companion, GECCO 2019, pages 277--278. ACM, 2019.
[16]
[BB20] Anton Bassin and Maxim Buzdalov. The (1 + (λ, λ)) genetic algorithm for permutations. In Genetic and Evolutionary Computation Conference, GECCO 2020, Companion Volume, pages 1669--1677. ACM, 2020.
[17]
[BBD+09] Surender Baswana, Somenath Biswas, Benjamin Doerr, Tobias Friedrich, Piyush P. Kurur, and Frank Neumann. Computing single source shortest paths using single-objective fitness. In Foundations of Genetic Algorithms, FOGA 2009, pages 59--66. ACM, 2009.
[18]
[BBD21a] Henry Bambury, Antoine Bultel, and Benjamin Doerr. Generalized jump functions. In Genetic and Evolutionary Computation Conference, GECCO 2021, pages 1124--1132. ACM, 2021.
[19]
[BBD21b] Riade Benbaki, Ziyad Benomar, and Benjamin Doerr. A rigorous runtime analysis of the 2-MMASib on jump functions: ant colony optimizers can cope well with local optima. In Genetic and Evolutionary Computation Conference, GECCO 2021, pages 4--13. ACM, 2021.
[20]
[BD17] Maxim Buzdalov and Benjamin Doerr. Runtime analysis of the (1 + (λ, λ)) genetic algorithm on random satisfiable 3-CNF formulas. In Genetic and Evolutionary Computation Conference, GECCO 2017, pages 1343--1350. ACM, 2017.
[21]
[BDDV20] Maxim Buzdalov, Benjamin Doerr, Carola Doerr, and Dmitry Vinokurov. Fixed-target runtime analysis. In Genetic and Evolutionary Computation Conference, GECCO 2020, pages 1295--1303. ACM, 2020.
[22]
[BDK23] Firas Ben Jedidia, Benjamin Doerr, and Martin S. Krejca. Estimation-of-distribution algorithms for multi-valued decision variables. In Genetic and Evolutionary Computation Conference, GECCO 2023. ACM, 2023. To appear.
[23]
[BDN10] Süntje Böttcher, Benjamin Doerr, and Frank Neumann. Optimal fixed and adaptive mutation rates for the LeadingOnes problem. In Parallel Problem Solving from Nature, PPSN 2010, pages 1--10. Springer, 2010.
[24]
[BFM97] Thomas Bäck, David B. Fogel, and Zbigniew Michalewicz. Handbook of Evolutionary Computation. IOP Publishing Ltd., 1997.
[25]
[BFN08] Dimo Brockhoff, Tobias Friedrich, and Frank Neumann. Analyzing hypervolume indicator based algorithms. In Parallel Problem Solving from Nature, PPSN2008, pages 651--660. Springer, 2008.
[26]
[BFQY20] Chao Bian, Chao Feng, Chao Qian, and Yang Yu. An efficient evolutionary algorithm for subset selection with general cost constraints. In AAAI Conference on Artificial Intelligence, AAAI 2020, pages 3267--3274. AAAI, 2020.
[27]
[BLS14] Golnaz Badkobeh, Per Kristian Lehre, and Dirk Sudholt. Unbiased black-box complexity of parallel search. In Parallel Problem Solving from Nature, PPSN 2014, pages 892--901. Springer, 2014.
[28]
[BQ22] Chao Bian and Chao Qian. Better running time of the non-dominated sorting genetic algorithm II (NSGA-II) by using stochastic tournament selection. In Parallel Problem Solving From Nature, PPSN 2022, pages 428--441. Springer, 2022.
[29]
[BQT18] Chao Bian, Chao Qian, and Ke Tang. A general approach to running time analysis of multi-objective evolutionary algorithms. In International Joint Conference on Artificial Intelligence, IJCAI 2018, pages 1405--1411. IJCAI, 2018.
[30]
[BS19] Jakob Bossek and Dirk Sudholt. Time complexity analysis of RLS and (1+1) EA for the edge coloring problem. In Foundations of Genetic Algorithms, FOGA 2019, pages 102--115. ACM, 2019.
[31]
[BZLQ23] Chao Bian, Yawen Zhou, Miqing Li, and Chao Qian. Stochastic population update can provably be helpful in multi-objective evolutionary algorithms. In International Joint Conference on Artificial Intelligence, IJCAI 2023, 2023. To appear
[32]
[CDEL18] Dogan Corus, Duc-Cuong Dang, Anton V. Eremeev, and Per Kristian Lehre. Level-based analysis of genetic algorithms and other search processes. IEEE Transactions on Evolutionary Computation, 22:707--719, 2018.
[33]
[CDH+23] Sacha Cerf, Benjamin Doerr, Benjamin Hebras, Jakob Kahane, and Simon Wietheger. The first proven performance guarantees for the Non-Dominated Sorting Genetic Algorithm II (NSGA-II) on a combinatorial optimization problem. In International Joint Conference on Artificial Intelligence, IJCAI 2023. ijcai.org, 2023. To appear.
[34]
[CL20] Brendan Case and Per Kristian Lehre. Self-adaptation in nonelitist evolutionary algorithms on discrete problems with unknown structure. IEEE Transactions on Evolutionary Computation, 24:650--663, 2020.
[35]
[CLNP16] Dogan Corus, Per Kristian Lehre, Frank Neumann, and Mojgan Pourhassan. A parameterised complexity analysis of bi-level optimisation with evolutionary algorithms. Evolutionary Computation, 24:183--203, 2016.
[36]
[CO18] Dogan Corus and Pietro S. Oliveto. Standard steady state genetic algorithms can hillclimb faster than mutation-only evolutionary algorithms. IEEE Transactions on Evolutionary Compututation, 22:720--732, 2018.
[37]
[CO20] Dogan Corus and Pietro S. Oliveto. On the benefits of populations for the exploitation speed of standard steady-state genetic algorithms. Algorithmica, 82:3676--3706, 2020.
[38]
[Cra19] Victoria G. Crawford. An efficient evolutionary algorithm for minimum cost submodular cover. In International Joint Conference on Artificial Intelligence, IJCAI 2019, pages 1227--1233. ijcai.org, 2019.
[39]
[Cra21] Victoria G. Crawford. Faster guarantees of evolutionary algorithms for maximization of monotone submodular functions. In International Joint Conference on Artificial Intelligence, IJCAI 2021, pages 1661--1667. ijcai.org, 2021.
[40]
[CS18] Edgar Covantes Osuna and Dirk Sudholt. Runtime analysis of probabilistic crowding and restricted tournament selection for bimodal optimisation. In Genetic and Evolutionary Computation Conference, GECCO 2018, pages 929--936. ACM, 2018.
[41]
[DD15a] Benjamin Doerr and Carola Doerr. Optimal parameter choices through self-adjustment: Applying the 1/5-th rule in discrete settings. In Genetic and Evolutionary Computation Conference, GECCO 2015, pages 1335--1342. ACM, 2015.
[42]
[DD15b] Benjamin Doerr and Carola Doerr. A tight runtime analysis of the (1+(λ, λ)) genetic algorithm on OneMax. In Genetic and Evolutionary Computation Conference, GECCO 2015, pages 1423--1430. ACM, 2015.
[43]
[DD18] Benjamin Doerr and Carola Doerr. Optimal static and self-adjusting parameter choices for the (1 + (λ, λ)) genetic algorithm. Algorithmica, 80:1658--1709, 2018.
[44]
[DD20] Benjamin Doerr and Carola Doerr. Theory of parameter control for discrete black-box optimization: provable performance gains through dynamic parameter choice. In Benjamin Doerr and Frank Neumann, editors, Theory of Evolutionary Computation: Recent Developments in Discrete Optimization, pages 271--321. Springer, 2020 Also available at https://arxiv.org/abs/1804.05650.
[45]
[DD22] Benjamin Doerr and Marc Dufay. General univariate estimation-of-distribution algorithms. In Parallel Problem Solving From Nature, PPSN 2022, Part II, pages 470--484. Springer, 2022.
[46]
[DDE13] Benjamin Doerr, Carola Doerr, and Franziska Ebel. Lessons from the black-box: fast crossover-based genetic algorithms. In Genetic and Evolutionary Computation Conference, GECCO 2013, pages 781--788. ACM, 2013.
[47]
[DDE15] Benjamin Doerr, Carola Doerr, and Franziska Ebel. From black-box complexity to designing new genetic algorithms. Theoretical Computer Science, 567:87--104, 2015.
[48]
[DDHW23] Matthieu Dinot, Benjamin Doerr, Ulysse Hennebelle, and Sebastian Will. Runtime analyses of multi-objective evolutionary algorithms in the presence of noise. In International Joint Conference on Artificial Intelligence, IJCAI 2023. ijcai.org, 2023 To appear.
[49]
[DDK14a] Benjamin Doerr, Carola Doerr, and Timo Kötzing. Unbiased black-box complexities of jump functions: how to cross large plateaus. In Geneic and Evolutionary Computation Conference, GECCO 2014, pages 769--776. ACM, 2014.
[50]
[DDK14b] Benjamin Doerr, Carola Doerr, and Timo Kötzing. The unbiased black-box complexity of partition is polynomial. Artificial Intelligence, 216:275--286, 2014.
[51]
[DDK17] Benjamin Doerr, Carola Doerr, and Timo Kötzing. Unknown solution length problems with no asymptotically optimal run time. In Genetic and Evolutionary Computation Conference, GECCO 2017, pages 921--928. ACM, 2017.
[52]
[DDK18] Benjamin Doerr, Carola Doerr, and Timo Kötzing. Static and self-adjusting mutation strengths for multi-valued decision variables. Algorithmica, 80:1732--1768, 2018.
[53]
[DDL21] Benjamin Doerr, Carola Doerr, and Johannes Lengler. Self-adjusting mutation rates with provably optimal success rules. Algorithmica, 83:3108--3147, 2021.
[54]
[DDLS23] Benjamin Doerr, Arthur Dremaux, Johannes Lutzeyer, and Aurélien Stumpf. How the move acceptance hyper-heuristic copes with local optima: drastic differences between jumps and cliffs. In Genetic and Evolutionary Computation Conference, GECCO 2023. ACM, 2023. To appear.
[55]
[DDN19] Benjamin Doerr, Carola Doerr, and Frank Neumann. Fast re-optimization via structural diversity. In Genetic and Evolutionary Computation Conference, GECCO 2019, pages 233--241. ACM, 2019.
[56]
[DDN+20] Benjamin Doerr, Carola Doerr, Aneta Neumann, Frank Neumann, and Andrew M. Sutton. Optimization of chance-constrained submodular functions. In Conference on Artificial Intelligence, AAAI 2020, pages 1460--1467. AAAI Press, 2020.
[57]
[DDST16] Benjamin Doerr, Carola Doerr, Reto Spöhel, and Henning Thomas. Playing Mastermind with many colors. Journal of the ACM, 63:42:1--42:23, 2016.
[58]
[DDY16] Benjamin Doerr, Carola Doerr, and Jing Yang. k-bit mutation with self-adjusting k outperforms standard bit mutation. In Parallel Problem Solving from Nature, PPSN 2016, pages 824--834. Springer, 2016.
[59]
[DDY20] Benjamin Doerr, Carola Doerr, and Jing Yang. Optimal parameter choices via precise black-box analysis. Theoretical Computer Science, 801:1--34, 2020.
[60]
[DEJK23] Benjamin Doerr, Aymen Echarghaoui, Mohammed Jamal, and Martin S. Krejca. Lasting diversity and superior runtime guarantees for the (μ + 1) genetic algorithm. CoRR, abs/2302.12570, 2023.
[61]
[DFK+16] Duc-Cuong Dang, Tobias Friedrich, Timo Kötzing, Martin S. Krejca, Per Kristian Lehre, Pietro S. Oliveto, Dirk Sudholt, and Andrew M. Sutton. Escaping local optima with diversity mechanisms and crossover. In Genetic and Evolutionary Computation Conference, GECCO 2016, pages 645--652. ACM, 2016.
[62]
[DFK+18] Duc-Cuong Dang, Tobias Friedrich, Timo Kötzing, Martin S. Krejca, Per Kristian Lehre, Pietro S. Oliveto, Dirk Sudholt, and Andrew M. Sutton. Escaping local optima using crossover with emergent diversity. IEEE Transactions on Evolutionary Computation, 22: 484--497, 2018.
[63]
[DGN16] Benjamin Doerr, Wanru Gao, and Frank Neumann. Runtime analysis of evolutionary diversity maximization for OneMinMax. In Genetic and Evolutionary Computation Conference, GECCO 2016, pages 557--564. ACM, 2016.
[64]
[DGWY19] Benjamin Doerr, Christian Gießen, Carsten Witt, and Jing Yang. The (1 + λ) evolutionary algorithm with self-adjusting mutation rate. Algorithmica, 81:593--631, 2019.
[65]
[DHK07] Benjamin Doerr, Edda Happ, and Christian Klein. A tight bound for the (1 + 1)-EA for the source shortest path problem. In Congress on Evolutionary Computation, CEC 2007, pages 1890--1895. IEEE, 2007.
[66]
[DHK12a] Benjamin Doerr, Edda Happ, and Christian Klein. Crossover can provably be useful in evolutionary computation. Theoretical Computer Science, 425:17--33, 2012.
[67]
[DHK12b] Benjamin Doerr, Ashish Ranjan Hota, and Timo Kötzing. Ants easily solve stochastic shortest path problems. In Genetic and Evolutionary Computation Conference, GECCO 2012, pages 17--24. ACM, 2012.
[68]
[DHN07] Benjamin Doerr, Nils Hebbinghaus, and Frank Neumann. Speeding up evolutionary algorithms through asymmetric mutation operators. Evolutionary Computation, 15:401--410, 2007.
[69]
[DHP22] Benjamin Doerr, Omar El Hadri, and Adrien Pinard. The (1 + (λ, λ)) global SEMO algorithm. In Genetic and Evolutionary Computation Conference, GECCO 2022, pages 520--528. ACM, 2022.
[70]
[DJ07] Benjamin Doerr and Daniel Johannsen. Adjacency list matchings: an ideal genotype for cycle covers. In Genetic and Evolutionary Computation Conference, GECCO 2007, pages 1203--1210. ACM, 2007.
[71]
[DJ10] Benjamin Doerr and Daniel Johannsen. Edge-based representation beats vertex-based representation in shortest path problems. In Genetic and Evolutionary Computation Conference, GECCO 2010, pages 759--766. ACM, 2010.
[72]
[DJK+11] Benjamin Doerr, Daniel Johannsen, Timo Kötzing, Per Kristian Lehre, Markus Wagner, and Carola Winzen. Faster black-box algorithms through higher arity operators. In Foundations of Genetic Algorithms, FOGA 2011, pages 163--172. ACM, 2011.
[73]
[DJK+13] Benjamin Doerr, Daniel Johannsen, Timo Kötzing, Frank Neumann, and Madeleine Theile. More effective crossover operators for the all-pairs shortest path problem. Theoretical Computer Science, 471:12--26, 2013.
[74]
[DJL17] Duc-Cuong Dang, Thomas Jansen, and Per Kristian Lehre. Populations can be essential in tracking dynamic optima. Algorithmica, 78:660--680, 2017.
[75]
[DJS+13] Benjamin Doerr, Thomas Jansen, Dirk Sudholt, Carola Winzen, and Christine Zarges. Mutation rate matters even when optimizing monotone functions. Evolutionary Computation, 21:1--21, 2013.
[76]
[DJW98a] Stefan Droste, Thomas Jansen, and Ingo Wegener. On the optimization of unimodal functions with the (1 + 1) evolutionary algorithm. In Parallel Problem Solving from Nature, PPSN 1998, pages 13--22. Springer, 1998.
[77]
[DJW98b] Stefan Droste, Thomas Jansen, and Ingo Wegener. A rigorous complexity analysis of the (1+1) evolutionary algorithm for linear functions with Boolean inputs. In International Conference on Evolutionary Computation, ICEC 1998, pages 499--504. IEEE, 1998.
[78]
[DJW02] Stefan Droste, Thomas Jansen, and Ingo Wegener. On the analysis of the (1+1) evolutionary algorithm. Theoretical Computer Science, 276:51--81, 2002.
[79]
[DJW10] Benjamin Doerr, Daniel Johannsen, . Multiplicative drift analysis. In Genetic and Evolutionary Computation Conference, GECCO 2010, pages 1449--1456. ACM, 2010.
[80]
[DJW12] Benjamin Doerr, Daniel Johannsen, and Carola Winzen. Multiplicative drift analysis. Algorithmica, 64:673--697, 2012.
[81]
[DJWZ13] Benjamin Doerr, Thomas Jansen, Carsten Witt, and Christine Zarges. A method to derive fixed budget results from expected optimisation times. In Genetic and Evolutionary Computation Conference, GECCO 2013, pages 1581--1588. ACM, 2013.
[82]
[DK15] Benjamin Doerr and Marvin Künnemann. Optimizing linear functions with the (1 + λ) evolutionary algorithm---different asymptotic runtimes for different instances. Theoretical Computer Science, 561:3--23, 2015.
[83]
[DK20a] Benjamin Doerr and Martin S. Krejca. Bivariate estimation-of-distribution algorithms can find an exponential number of optima. In Genetic and Evolutionary Computation Conference, GECCO 2020, pages 796--804. ACM, 2020.
[84]
[DK20b] Benjamin Doerr and Martin S. Krejca. Significance-based estimation-of-distribution algorithms. IEEE Transactions on Evolutionary Computation, 24:1025--1034, 2020.
[85]
[DK20c] Benjamin Doerr and Martin S. Krejca. The univariate marginal distribution algorithm copes well with deception and epistasis. In Evolutionary Computation in Combinatorial Optimization, EvoCOP 2020, pages 51--66. Springer, 2020.
[86]
[DK21a] Benjamin Doerr and Timo Kötzing. Lower bounds from fitness levels made easy. In Genetic and Evolutionary Computation Conference, GECCO 2021, pages 1142--1150. ACM, 2021.
[87]
[DK21b] Benjamin Doerr and Timo Kötzing. Multiplicative up-drift. Algorithmica, 83:3017--3058, 2021.
[88]
[DK21c] Benjamin Doerr and Martin S. Krejca. A simplified run time analysis of the univariate marginal distribution algorithm on LeadingOnes. Theoretical Computer Science, 851:121--128, 2021.
[89]
[DKS07] Benjamin Doerr, Christian Klein, and Tobias Storch. Faster evolutionary algorithms by superior graph representation. In Foundations of Computational Intelligence, FOCI 2007, pages 245--250. IEEE, 2007.
[90]
[DL15] Duc-Cuong Dang and Per Kristian Lehre. Simplified runtime analysis of estimation of distribution algorithms. In Genetic and Evolutionary Computation Conference, GECCO 2015, pages 513--518. ACM, 2015.
[91]
[DL16a] Duc-Cuong Dang and Per Kristian Lehre. Runtime analysis of non-elitist populations: from classical optimisation to partial information. Algorithmica, 75:428--461, 2016.
[92]
[DL16b] Duc-Cuong Dang and Per Kristian Lehre. Self-adaptation of mutation rates in non-elitist populations. In Parallel Problem Solving from Nature, PPSN 2016, pages 803--813. Springer, 2016.
[93]
[DLMN17] Benjamin Doerr, Huu Phuoc Le, Régis Makhmara, and Ta Duy Nguyen. Fast genetic algorithms. In Genetic and Evolutionary Computation Conference, GECCO 2017, pages 777--784. ACM, 2017.
[94]
[DLN19] Duc-Cuong Dang, Per Kristian Lehre, and PhanTrung Hai Nguyen. Level-based analysis of the univariate marginal distribution algorithm. Algorithmica, 81: 668--702, 2019.
[95]
[DLOW18] Benjamin Doerr, Andrei Lissovoi, Pietro S. Oliveto, and John Alasdair Warwicker. On the runtime analysis of selection hyper-heuristics with adaptive learning periods. In Genetic and Evolutionary Computation Conference, GECCO 2018, pages 1015--1022. ACM, 2018.
[96]
[DN20] Benjamin Doerr and Frank Neumann, editors. Theory of Evolutionary Computation---Recent Developments in Discrete Optimization. Springer, 2020. Also available at http://www.lix.polytechnique.fr/Labo/Benjamin.Doerr/doerr_neumann.book.html.
[97]
[DNDD+18] Raphaël Dang-Nhu, Thibault Dardinier, Benjamin Doerr, Gautier Izacard, and Dorian Nogneng. A new analysis method for evolutionary optimization of dynamic and noisy objective functions. In Genetic and Evolutionary Computation Conference, GECCO 2018, pages 1467--1474. ACM, 2018.
[98]
[Doe16] Benjamin Doerr. Optimal parameter settings for the (1 + (λ, λ)) genetic algorithm. In Genetic and Evolutionary Computation Conference, GECCO 2016, pages 1107--1114. ACM, 2016.
[99]
[Doe20] Benjamin Doerr. Lower bounds for non-elitist evolutionary algorithms via negative multiplicative drift. In Parallel Problem Solving From Nature, PPSN 2020, Part II, pages 604--618. Springer, 2020.
[100]
[Doe21a] Benjamin Doerr. Exponential upper bounds for the runtime of randomized search heuristics. Theoretical Computer Science, 851:24--38, 2021.
[101]
[Doe21b] Benjamin Doerr. The runtime of the compact genetic algorithm on Jump functions. Algorithmica, 83:3059--3107, 2021.
[102]
[Doe22] Benjamin Doerr. Does comma selection help to cope with local optima? Algorithmica, 84:1659--1693, 2022.
[103]
[DOSS23] Duc-Cuong Dang, Andre Opris, Bahare Salehi, and Dirk Sudholt. A proof that using crossover can guarantee exponential speed-ups in evolutionary multi-objective optimisation. In Conference on Artificial Intelligence, AAAI 2023. AAAI Press, 2023. To appear.
[104]
[DPAM02] Kalyanmoy Deb, Amrit Pratap, Sameer Agarwal, and T. Meyarivan. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6:182--197, 2002.
[105]
[dPdLDD15] Axel de Perthuis de Laillevault, Benjamin Doerr, and Carola Doerr. Money for nothing: speeding up evolutionary algorithms through better initialization. In Genetic and Evolutionary Computation Conference, GECCO 2015, pages 815--822. ACM, 2015.
[106]
[DQ23a] Benjamin Doerr and Zhongdi Qu .A first runtime, analysis of the NSGA-II on a multimodal problem. Transactions on Evolutionary Computation, 2023.
[107]
[DQ23b] Benjamin Doerr and Zhongdi Qu. From understanding the population dynamics of the NSGA-II to the first proven lower bounds. In Conference on Artificial Intelligence, AAAI 2023. AAAI Press, 2023. To appear.
[108]
[DQ23c] Benjamin Doerr and Zhongdi Qu. Runtime analysis for the NSGA-II: Provable speed-ups from crossover. In Conference on Artificial Intelligence, AAAI 2023. AAAI Press, 2023. To appear.
[109]
[DR23] Benjamin Doerr and Amirhossein Rajabi. Stagnation detection meets fast mutation. Theoretical Computer Science, 946:113670, 2023.
[110]
[Dro02] Stefan Droste. Analysis of the (1+1) EA for a dynamically changing OneMax-variant. In Congress on Evolutionary Computation, CEC 2002, pages 55--60. IEEE, 2002.
[111]
[Dro03] Stefan Droste. Analysis of the (1+1) EA for a dynamically bitwise changing OneMax. In Genetic and Evolutionary Computation Conference, GECCO 2003, pages 909--921. Springer, 2003.
[112]
[Dro04] Stefan Droste. Analysis of the (1+1) EA for a noisy OneMax. In Genetic and Evolutionary Computation Conference, GECCO 2004, pages 1088--1099. Springer, 2004.
[113]
[Dro06] Stefan Droste. A rigorous analysis of the compact genetic algorithm for linear functions. Natural Computing, 5:257--283, 2006.
[114]
[DS19] Benjamin Doerr and Andrew M. Sutton. When resampling to cope with noise, use median, not mean. In Genetic and Evolutionary Computation Conference, GECCO 2019, pages 242--248. ACM, 2019.
[115]
[DSW13] Benjamin Doerr, Dirk Sudholt, and Carsten Witt. When do evolutionary algorithms optimize separable functions in parallel? In Foundations of Genetic Algorithms, FOGA 2013, pages 48--59. ACM, 2013.
[116]
[DT09] Benjamin Doerr and Madeleine Theile. Improved analysis methods for crossover-based algorithms. In Genetic and Evolutionary Computation Conference, GECCO 2009, pages 247--254. ACM, 2009.
[117]
[DW12a] Benjamin Doerr and Carola Winzen. Memory-restricted black-box complexity of OneMax. Information Processing Letters, 112:32--34, 2012.
[118]
[DW12b] Benjamin Doerr and Carola Winzen. Reducing the arity in unbiased black-box complexity. In Genetic and Evolutionary Computation Conference, GECCO 2012, pages 1309--1316. ACM, 2012.
[119]
[DW14] Benjamin Doerr and Carola Winzen. Ranking-based black-box complexity. Algorithmica, 68:571--609, 2014.
[120]
[DW23] Benjamin Doerr and Simon Wietheger. A mathematical runtime analysis of the Non-dominated Sorting Genetic Algorithm III (NSGA-III). In International Joint Conference on Artificial Intelligence, IJCAI 2023. ijcai.org, 2023. To appear
[121]
[DWY21] Benjamin Doerr, Carsten Witt, and Jing Yang. Runtime analysis for self-adaptive mutation rates. Algorithmica, 83:1012--1053, 2021.
[122]
[DZ20a] Benjamin Doerr and Weijie Zheng. From understanding genetic drift to a smart-restart parameter-less compact genetic algorithm. In Genetic and Evolutionary Computation Conference, GECCO 2020, pages 805--813. ACM, 2020.
[123]
[DZ20b] Benjamin Doerr and Weijie Zheng. Sharp bounds for genetic drift in estimation-of-distribution algorithms. IEEE Transactions on Evolutionary Computation, 24:1140--1149, 2020.
[124]
[DZ21] Benjamin Doerr and Weijie Zheng. Theoretical analyses of multi-objective evolutionary algorithms on multi-modal objectives. In Conference on Artificial Intelligence, AAAI 2021, pages 12293--12301. AAAI Press, 2021.
[125]
[EAvH90] A. E. Eiben, Emile H. L. Aarts, and Kees M. van Hee. Global convergence of genetic algorithms: A markov chain analysis. In Parallel Problem Solving from Nature, PPSN 1990, pages 4--12. Springer, 1990.
[126]
[EGL+19] Hafsteinn Einarsson, Marcelo Matheus Gauy, Johannes Lengler, Florian Meier, Asier Mujika, Angelika Steger, and Felix Weissenberger. The linear hidden subset problem for the (1+1) EA with scheduled and adaptive mutation rates. Theoretical Computer Science, 785:150--170, 2019.
[127]
[EHM99] Agoston Endre Eiben, Robert Hinterding, and Zbigniew Michalewicz. Parameter control in evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 3:124--141, 1999.
[128]
[FGN+19] Tobias Friedrich, Andreas Göbel, Frank Neumann, Francesco Quinzan, and Ralf Rothenberger. Greedy maximization of functions with bounded curvature under partition matroid constraints. In Artificial Intelligence, AAAI 2019, pages 2272--2279. AAAI Press, 2019.
[129]
[FGQW18] Tobias Friedrich, Andreas Göbel, Francesco Quinzan, and Markus Wagner. Heavy-tailed mutation operators in single-objective combinatorial optimization. In Parallel Problem-Solving from Nature, PPSN 2018, Part I, pages 134--145. Springer, 2018.
[130]
[FHH+09] Tobias Friedrich, Jun He, Nils Hebbinghaus, Frank Neumann, and Carsten Witt. Analyses of simple hybrid algorithms for the vertex cover problem. Evolutionary Computation, 17:3--19, 2009.
[131]
[FHN07] Tobias Friedrich, Nils Hebbinghaus, and Frank Neumann. Rigorous analyses of simple diversity mechanisms. In Genetic and Evolutionary Computation Conference, GECCO 2007, pages 1219--1225. ACM, 2007.
[132]
[FHN09] Tobias Friedrich, Nils Hebbinghaus, and Frank Neumann. Comparison of simple diversity mechanisms on plateau functions. Theoretical Computer Science, 410:2455--2462, 2009.
[133]
[FK13] Matthias Feldmann and Timo Kötzing. Optimizing expected path lengths with ant colony optimization using fitness proportional update. In Foundations of Genetic Algorithms, FOGA 2013, pages 65--74. ACM, 2013.
[134]
[FKK16] Tobias Friedrich, Timo Kötzing, and Martin S. Krejca. EDAs cannot be balanced and stable. In Genetic and Evolutionary Computation Conference, GECCO 2016, pages 1139--1146. ACM, 2016.
[135]
[FKKS17] Tobias Friedrich, Timo Kötzing, Martin S. Krejca, and Andrew M. Sutton. The compact genetic algorithm is efficient under extreme Gaussian noise. IEEE Transactions on Evolutionary Computation, 21:477--490, 2017.
[136]
[FKR+22] Tobias Friedrich, Timo Kötzing, Aishwarya Radhakrishnan, Leon Schiller, Martin Schirneck, Georg Tennigkeit, and Simon Wietheger. Crossover for cardinality constrained optimization. In Genetic and Evolutionary Computation Conference, GECCO 2022, pages 1399--1407. ACM, 2022.
[137]
[FN15] Tobias Friedrich and Frank Neumann. Maximizing submodular functions under matroid constraints by evolutionary algorithms. Evolutionary Computation, 23:543--558, 2015.
[138]
[Fog94] David B. Fogel. An introduction to simulated evolutionary optimization. IEEE Transactions on Neural Networks, 5:3--14, 1994.
[139]
[FQW18] Tobias Friedrich, Francesco Quinzan, and Markus Wagner. Escaping large deceptive basins of attraction with heavy-tailed mutation operators. In Genetic and Evolutionary Computation Conference, GECCO 2018, pages 293--300. ACM, 2018.
[140]
[FW04] Simon Fischer and Ingo Wegener. The Ising model on the ring: mutation versus recombination. In Genetic and Evolutionary Computation, GECCO 2004, pages 1113--1124. Springer, 2004.
[141]
[FW05] Simon Fischer and Ingo Wegener. The one-dimensional Ising model: Mutation versus recombination. Theoretical Computer Science, 344:208--225, 2005.
[142]
[GGL19] Tomas Gavenciak, Barbara Geissmann, and Johannes Lengler. Sorting by swaps with noisy comparisons. Algorithmica, 81: 796--827, 2019.
[143]
[Gie03] Oliver Giel. Expected runtimes of a simple multi-objective evolutionary algorithm. In Congress on Evolutionary Computation, CEC 2003, pages 1918--1925. IEEE, 2003.
[144]
[GK16] Christian Gießen and Tim Kötzing. Robustness of populations in stochastic environments. Algorithmica, 75:462--489, 2016.
[145]
[GKS99] Josselin Garnier, Leila Kallel, and Marc Schoenauer. Rigorous hitting times for binary mutations. Evolutionary Computation, 7:173--203, 1999.
[146]
[Gol89] David E. Goldberg. Genetic Algorithms in Search, Optimization and Machine Learning. Addison-Wesley Longman Publishing Co., Inc., 1989.
[147]
[GP14] Brian W. Goldman and William F. Punch. Parameter-less population pyramid. In Genetic and Evolutionary Computation Conference, GECCO 2014, pages 785--792. ACM, 2014.
[148]
[GW03] Oliver Giel and Ingo Wegener. Evolutionary algorithms and the maximum matching problem. In Symposium on Theoretical Aspects of Computer Science, STACS 2003, pages 415--426. Springer, 2003.
[149]
[GW17] Christian Gießen and Carsten Witt. The interplay of population size and mutation probability in the (1 + λ) EA on OneMax. Algorithmica, 78:587--609, 2017.
[150]
[GW18] Christian Gießen and Carsten Witt. Optimal mutation rates for the (1 + λ) EA on OneMax through asymptotically tight drift analysis. Algorithmica, 80:1710--1731, 2018.
[151]
[Han99] Thomas Hanne. On the convergence of multiobjective evolutionary algorithms. European Journal of Operations Research, 117:553--564, 1999.
[152]
[HFS20] Mario Alejandro Hevia Fajardo and Dirk Sudholt. On the choice of the parameter control mechanism in the (1 + (λ, λ)) genetic algorithm. In Genetic and Evolutionary Computation Conference, GECCO 2020, pages 832--840. ACM, 2020.
[153]
[HFS21a] Mario Alejandro Hevia Fajardo and Dirk Sudholt. Self-adjusting offspring population sizes outperform fixed parameters on the cliff function. In Foundations of Genetic Algorithms, FOGA 2021, pages 5:1--5:15. ACM, 2021.
[154]
[HFS21b] Mario Alejandro Hevia Fajardo and Dirk Sudholt. Self-adjusting population sizes for non-elitist evolutionary algorithms: why success rates matter. In Genetic and Evolutionary Computation Conference, GECCO 2021, pages 1151--1159. ACM, 2021.
[155]
[HGAK06] Nikolaus Hansen, Fabian Gemperle, Anne Auger, and Petros Koumoutsakos. When do heavy-tail distributions help? In Parallel Problem Solving from Nature, PPSN 2006, pages 62--71. Springer, 2006.
[156]
[HGD94] Jeffrey Horn, David E. Goldberg, and Kalyanmoy Deb. Long path problems. In Parallel Problem Solving from Nature, PPSN 1994, pages 149--158. Springer, 1994.
[157]
[HJKN08] Edda Happ, Daniel Johannsen, Christian Klein, and Frank Neumann. Rigorous analyses of fitness-proportional selection for optimizing linear functions. In Genetic and Evolutionary Computation Conference, GECCO 2008, pages 953--960. ACM, 2008.
[158]
[HLG99] Georges R. Harik, Fernando G. Lobo, and David E. Goldberg. The compact genetic algorithm. IEEE Transactions on Evolutionary Computation, 3:287--297, 1999.
[159]
[HS18] Václav Hasenöhrl and Andrew M. Sutton. On the runtime dynamics of the compact genetic algorithm on jump functions. In Genetic and Evolutionary Computation Conference, GECCO 2018, pages 967--974. ACM, 2018.
[160]
[HY01] Jun He and Xin Yao. Drift analysis and average time complexity of evolutionary algorithms. Artificial Intelligence, 127:51--81, 2001.
[161]
[HZ20] Zhengxin Huang and Yuren Zhou. Runtime analysis of somatic contiguous hypermutation operators in MOEA/D framework. In Conference on Artificial Intelligence, AAAI 2020, pages 2359--2366. AAAI Press, 2020.
[162]
[HZC+21] Zhengxin Huang, Yuren Zhou, Zefeng Chen, Xiaoyu He, Xinsheng Lai, and Xiaoyun Xia. Running time analysis of MOEA/D on pseudo-Boolean functions. IEEE Transactions on Cybernetics, 51:5130--5141, 2021.
[163]
[HZCH19] Zhengxin Huang, Yuren Zhou, Zefeng Chen, and Xiaoyu He. Running time analysis of MOEA/D with crossover on discrete optimization problem. In Conference on Artificial Intelligence, AAAI 2019, pages 2296--2303. AAAI Press, 2019.
[164]
[HZLL21] Zhengxin Huang, Yuren Zhou, Chuan Luo, and Qingwei Lin. A runtime analysis of typical decomposition approaches in MOEA/D framework for many-objective optimization problems. In International Joint Conference on Artificial Intelligence, IJCAI 2021, pages 1682--1688, 2021.
[165]
[Jäg08] Jens Jägersküpper. A blend of Markov-chain and drift analysis. In Parallel Problem Solving From Nature, PPSN 2008, pages 41--51. Springer, 2008.
[166]
[Jan07] Thomas Jansen. On the brittleness of evolutionary algorithms. In Foundations of Genetic Algorithms, FOGA 2007, pages 54--69. Springer, 2007.
[167]
[Jan13] Thomas Jansen. Analyzing Evolutionary Algorithms - The Computer Science Perspective. Springer, 2013.
[168]
[JJW05] Thomas Jansen, Kenneth A. De Jong, and Ingo Wegener. On the choice of the offspring population size in evolutionary algorithms. Evolutionary Computation, 13:413--440, 2005.
[169]
[JOZ13] Thomas Jansen, Pietro S. Oliveto, and Christine Zarges. Approximating vertex cover using edge-based representations. In Foundations of Genetic Algorithms, FOGA 2013, pages 87--96. ACM, 2013.
[170]
[JS05] Thomas Jansen and Ulf Schellbach. Theoretical analysis of a mutation-based evolutionary algorithm for a tracking problem in the lattice. In Genetic and Evolutionary Computation Conference, GECCO 2005, pages 841--848. ACM, 2005.
[171]
[JS07] Jens Jägersküpper and Tobias Storch. When the plus strategy outperforms the comma strategy and when not. In Foundations of Computational Intelligence, FOCI 2007, pages 25--32. IEEE, 2007.
[172]
[JW99] Thomas Jansen and Ingo Wegener. On the analysis of evolutionary algorithms - a proof that crossover really can help. In European Symposium on Algorithms, ESA 1999, pages 184--193. Springer, 1999.
[173]
[JW00] Thomas Jansen and Ingo Wegener. On the choice of the mutation probability for the (1+1) EA. In Parallel Problem Solving from Nature, PPSN 2000, pages 89--98. Springer, 2000.
[174]
[JW02] Thomas Jansen and Ingo Wegener. The analysis of evolutionary algorithms - a proof that crossover really can help. Algorithmica, 34:47--66, 2002.
[175]
[JW05] Thomas Jansen and Ingo Wegener. Real royal road functions - where crossover provably is essential. Discrete Applied Mathematics, 149:111--125, 2005.
[176]
[JW06] Thomas Jansen and Ingo Wegener. On the analysis of a dynamic evolutionary algorithm. Journal of Discrete Algorithms, 4: 181--199, 2006.
[177]
[JZ12] Thomas Jansen and Christine Zarges. Fixed budget computations: a different perspective on run time analysis. In Terence Soule and Jason H. Moore, editors, Genetic and Evolutionary Computation Conference, GECCO 2012, pages 1325--1332. ACM, 2012.
[178]
[JZ14a] Thomas Jansen and Christine Zarges. Performance analysis of randomised search heuristics operating with a fixed budget. Theoretical Computer Science, 545:39--58, 2014.
[179]
[JZ14b] Thomas Jansen and Christine Zarges. Reevaluating immune-inspired hypermutations using the fixed budget perspective. IEEE Transactions on Evolutionary Computation, 18:674--688, 2014.
[180]
[KAD19] Vitalii Karavaev, Denis Antipov, and Benjamin Doerr. Theoretical and empirical study of the (1 + (λ, λ)) EA on the LeadingOnes problem. In Genetic and Evolutionary Computation Conference, GECCO 2019, Companion Material, pages 2036--2039. ACM, 2019.
[181]
[KD06] Saku Kukkonen and Kalyanmoy Deb. Improved pruning of non-dominated solutions based on crowding distance for bi-objective optimization problems. In Conference on Evolutionary Computation, CEC 2006, pages 1179--1186. IEEE, 2006.
[182]
[KHE15] Giorgos Karafotias, Mark Hoogendoorn, and Ágoston E. Eiben. Parameter control in evolutionary algorithms: trends and challenges. IEEE Transactions on Evolutionary Computation, 19:167--187, 2015.
[183]
[KLNO10] Stefan Kratsch, Per Kristian Lehre, Frank Neumann, and Pietro Simone Oliveto. Fixed parameter evolutionary algorithms and maximum leaf spanning trees: a matter of mutation. In Parallel Problem Solving from Nature, PPSN 2010, Part I, pages 204--213. Springer, 2010.
[184]
[KLW15] Timo Kötzing, Andrei Lissovoi, and Carsten Witt. (1+1) EA on generalized dynamic OneMax. In Foundations of Genetic Algorithms, FOGA 2015, pages 40--51. ACM, 2015.
[185]
[KN13] Stefan Kratsch and Frank Neumann. Fixed-parameter evolutionary algorithms and the vertex cover problem. Algorithmica, 65:754--771, 2013.
[186]
[KST11] Timo Kötzing, Dirk Sudholt, and Madeleine Theile. How crossover helps in pseudo-Boolean optimization. In Genetic and Evolutionary Computation Conference, GECCO 2011, pages 989--996. ACM, 2011.
[187]
[KW20a] Timo Kötzing and Carsten Witt. Improved fixed-budget results via drift analysis. In Parallel Problem Solving from Nature, PPSN 2020, Part II, pages 648--660. Springer, 2020.
[188]
[KW20b] Martin S. Krejca and Carsten Witt. Lower bounds on the run time of the Univariate Marginal Distribution Algorithm on OneMax. Theoretical Computer Science, 832:143--165, 2020.
[189]
[Leh10] Per Kristian Lehre. Negative drift in populations. In Parallel Problem Solving from Nature, PPSN 2010, pages 244--253. Springer, 2010.
[190]
[Leh11] Per Kristian Lehre. Fitness-levels for non-elitist populations. In Genetic and Evolutionary Computation Conference, GECCO 2011, pages 2075--2082. ACM, 2011.
[191]
[Len20] Johannes Lengler. A general dichotomy of evolutionary algorithms on monotone functions. IEEE Transactions Evolutionary Computation, 24:995--1009, 2020.
[192]
[LM20] Johannes Lengler and Jonas Meier. Large population sizes and crossover help in dynamic environments. In Parallel Problem Solving from Nature, PPSN 2020, Part I, pages 610--622. Springer, 2020.
[193]
[LMS19] Johannes Lengler, Anders Martinsson, and Angelika Steger. When does hillclimbing fail on monotone functions: an entropy compression argument. In Analytic Algorithmics and Combinatorics, ANALCO 2019, pages 94--102. SIAM, 2019.
[194]
[LN17] Per Kristian Lehre and Phan Trung Hai Nguyen. Improved runtime bounds for the univariate marginal distribution algorithm via anti-concentration. In Genetic and Evolutionary Computation Conference, GECCO 2017, pages 1383--1390. ACM, 2017.
[195]
[LN18] Per Kristian Lehre and Phan Trung Hai Nguyen. Level-based analysis of the population-based incremental learning algorithm. In Parallel Problem Solving From Nature, PPSN 2018, pages 105--116. Springer, 2018.
[196]
[LN19a] Per Kristian Lehre and Phan Trung Hai Nguyen. On the limitations of the univariate marginal distribution algorithm to deception and where bivariate EDAs might help. In Foundations of Genetic Algorithms, FOGA 2019, pages 154--168. ACM, 2019
[197]
[LN19b] Per Kristian Lehre and Phan Trung Hai Nguyen. Runtime analysis of the univariate marginal distribution algorithm under low selective pressure and prior noise. In Genetic and Evolutionary Computation Conference, GECCO 2019, pages 1497--1505. ACM, 2019.
[198]
[LOW20] Andrei Lissovoi, Pietro S. Oliveto, and John Alasdair Warwicker. Simple hyper-heuristics control the neighbourhood size of randomised local search optimally for LeadingOnes. Evolutionary Computation, 28:437--461, 2020.
[199]
[LOW23] Andrei Lissovoi, Pietro S. Oliveto, and John Alasdair Warwicker. When move acceptance selection hyper-heuristics outperform Metropolis and elitist evolutionary algorithms and when not. Artificial Intelligence, 314:103804, 2023.
[200]
[LS11] Jörg Lässig and Dirk Sudholt. Adaptive population models for offspring populations and parallel evolutionary algorithms. In Foundations of Genetic Algorithms, FOGA 2011, pages 181--192. ACM, 2011.
[201]
[LS15] Johannes Lengler and Nicholas Spooner. Fixed budget performance of the (1+1) EA on linear functions. In Foundations of Genetic Algorithms, FOGA 2015, pages 52--61. ACM, 2015.
[202]
[LS18] Johannes Lengler and Angelika Steger. Drift analysis and evolutionary algorithms revisited. Combinatorics, Probability & Computing, 27:643--666, 2018.
[203]
[LSW21] Johannes Lengler, Dirk Sudholt, and Carsten Witt. The complex parameter landscape of the compact genetic algorithm. Algorithmica, 83:1096--1137, 2021.
[204]
[LTZ+02] Marco Laumanns, Lothar Thiele, Eckart Zitzler, Emo Welzl, and Kalyanmoy Deb. Running time analysis of multi-objective evolutionary algorithms on a simple discrete optimization problem. In Parallel Problem Solving from Nature, PPSN 2002, pages 44--53. Springer, 2002.
[205]
[LTZ04] Marco Laumanns, Lothar Thiele, and Eckart Zitzler. Running time analysis of multiobjective evolutionary algorithms on pseudo-Boolean functions. IEEE Transactions on Evolutionary Computation, 8:170--182, 2004.
[206]
[LW12] Per Kristian Lehre and Carsten Witt. Back-box search by unbiased variation. Algorithmica, 64:623--642, 2012.
[207]
[LW16] Andrei Lissovoi and Carsten Witt. MMAS versus population-based EA on a family of dynamic fitness functions. Algorithmica, 75:554--576, 2016.
[208]
[LY11] Per Kristian Lehre and Xin Yao. Crossover can be constructive when computing unique input-output sequences. Soft Computing, 15:1675--1687, 2011.
[209]
[LY12] Per Kristian Lehre and Xin Yao. On the impact of mutation-selection balance on the runtime of evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 16:225--241, 2012.
[210]
[LZ19] Johannes Lengler and Xun Zou. Exponential slowdown for larger populations: the (μ + 1)-EA on monotone functions. In Foundations of Genetic Algorithms, FOGA 2019, pages 87--101. ACM, 2019.
[211]
[LZZZ16] Yuan-Long Li, Yu-Ren Zhou, Zhi-Hui Zhan, and Jun Zhang. A primary theoretical study on decomposition-based multiobjective evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 20:563--576, 2016.
[212]
[Müh92] Heinz Mühlenbein. How genetic algorithms really work: mutation and hillclimbing. In Parallel Problem Solving from Nature, PPSN 1992, pages 15--26. Elsevier, 1992.
[213]
[Neu04] Frank Neumann. Expected runtimes of evolutionary algorithms for the eulerian cycle problem. In Congress on Evolutionary Computation, CEC 2004, pages 904--910. IEEE, 2004.
[214]
[NNS17] Samadhi Nallaperuma, Frank Neumann, and Dirk Sudholt. Expected fitness gains of randomized search heuristics for the traveling salesperson problem. Evolutionary Computation, 25:673--705, 2017.
[215]
[NOW09] Frank Neumann, Pietro S. Oliveto, and Carsten Witt. Theoretical analysis of fitness-proportional selection: landscapes and efficiency. In Genetic and Evolutionary Computation Conference, GECCO 2009, pages 835--842. ACM, 2009.
[216]
[NS18] Frank Neumann and Andrew M. Sutton. Runtime analysis of evolutionary algorithms for the knapsack problem with favorably correlated weights. In Parallel Problem Solving from. Nature, PPSN 2018, Part II, pages 141--152. Springer, 2018.
[217]
[NS19] Frank Neumann and Andrew M. Sutton. Runtime analysis of the (1 + 1) evolutionary algorithm for the chance-constrained knapsack problem. In Foundations of Genetic Algorithms, FOGA 2019, pages 147--153. ACM, 2019.
[218]
[NSN15] Anh Quang Nguyen, Andrew M. Sutton, and Frank Neumann. Population size matters: rigorous runtime results for maximizing the hypervolume indicator. Theoretical Computer Science, 561:24--36, 2015.
[219]
[NSW22] Frank Neumann, Dirk Sudholt, and Carsten Witt. The compact genetic algorithm struggles on Cliff functions. In Genetic and Evolutionary Computation Conference, GECCO 2022, pages 1426--1433. ACM, 2022.
[220]
[NT10] Frank Neumann and Madeleine Theile. How crossover speeds up evolutionary algorithms for the multi-criteria all-pairs-shortest-path problem. In Parallel Problem Solving from Nature, PPSN 2010, Part I, pages 667--676. Springer, 2010.
[221]
[NW07] Frank Neumann and Ingo Wegener. Randomized local search, evolutionary algorithms, and the minimum spanning tree problem. Theoretical Computer Science, 378:32--40, 2007.
[222]
[NW10] Frank Neumann and Carsten Witt. Bioinspired Computation in Combinatorial Optimization - Algorithms and Their Computational Complexity. Springer, 2010.
[223]
[Och02] Gabriela Ochoa. Setting the mutation rate: scope and limitations of the 1/L heuristic. In Genetic and Evolutionary Computation Conference, GECCO 2002, pages 495--502. Morgan Kaufmann, 2002.
[224]
[OHY09] Pietro S. Oliveto, Jun He, and Xin Yao. Analysis of the (1+1)-EA for finding approximate solutions to vertex cover problems. IEEE Transactions on Evolutionary Computation, 13:1006--1029, 2009.
[225]
[OSW20] Pietro S. Oliveto Dirk Sudholt, and Carsten Witt. A tight lower bound on the expected runtime of standard steady state genetic algorithms. In Genetic and Evolutionary Computation Conference, GECCO 2020, pages 1323--1331. ACM, 2020.
[226]
[OSZ19] Pietro S. Oliveto, Dirk Sudholt, and Christine Zarges. On the benefits and risks of using fitness sharing for multimodal optimisation. Theoretical Computer Science, 773:53--70, 2019.
[227]
[OW15] Pietro S. Oliveto and Carsten Witt. Improved time complexity analysis of the simple genetic algorithm. Theoretical Computer Science, 605:21--41, 2015.
[228]
[Pos09] Petr Posik. BBOB-benchmarking a simple estimation of distribution algorithm with Cauchy distribution. In Genetic and Evolutionary Computation Conference, GECCO 2009, Companion Material, pages 2309--2314. ACM, 2009.
[229]
[Pos10] Petr Posík. Comparison of Cauchy EDA and BIPOP-CMA-ES algorithms on the BBOB noiseless testbed. In Genetic and Evolutionary Computation Conference, GECCO 2010, Companion Material, pages 1697--1702. ACM, 2010.
[230]
[Prü04] Adam Prügel-Bennett. When a genetic algorithm outperforms hill-climbing. Theoretical Computer Science, 320:135--153, 2004.
[231]
[QBJT19] Chao Qian, Chao Bian, Wu Jiang, and Ke Tang. Running time analysis of the (1 + 1)-EA for OneMax and LeadingOnes under bit-wise noise. Algorithmica, 81: 749--795, 2019.
[232]
[QGWF21] Francesco Quinzan, Andreas Göbel, Markus Wagner, and Tobias Friedrich. Evolutionary algorithms and submodular functions: benefits of heavy-tailed mutations. Natural Computing, 20:561--575, 2021.
[233]
[QSY+17] Chao Qian, Jing-Cheng Shi, Yang Yu, Ke Tang, and Zhi-Hua Zhou. Optimizing ratio of monotone set functions. In International Joint Conference on Artificial Intelligence, IJCAI 2017, pages 2606--2612. ijcai.org, 2017.
[234]
[QSYT17] Chao Qian, Jing-Cheng Shi, Yang Yu, and Ke Tang. On subset selection with general cost constraints. In International Joint Conference on Artificial Intelligence, IJCAI 2017, pages 2613--2619. ijcai.org, 2017.
[235]
[QTZ16] Chao Qian, Ke Tang, and Zhi-Hua Zhou. Selection hyper-heuristics can provably be helpful in evolutionary multi-objective optimization. In Parallel Problem Solving from Nature, PPSN 2016, pages 835--846. Springer, 2016.
[236]
[QYT+19] Chao Qian, Yang Yu, Ke Tang, Xin Yao, and Zhi-Hua Zhou. Maximizing submodular or monotone approximately submodular functions by multi-objective evolutionary algorithms. Artificial Intelligence, 275:279--294, 2019.
[237]
[QYZ13] Chao Qian, Yang Yu, and Zhi-Hua Zhou. An analysis on recombination in multi-objective evolutionary optimization. Artificial Intelligence, 204:99--119, 2013.
[238]
[QYZ15] Chao Qian, Yang Yu, and Zhi-Hua Zhou. On constrained Boolean Pareto optimization. In International Joint Conference on Artificial Intelligence, IJCAI 2015, pages 389--395. AAAI Press, 2015.
[239]
[RNNF19] Vahid Roostapour, Aneta Neumann, Frank Neumann, and Tobias Friedrich. Pareto optimization for subset selection with dynamic cost constraints. In Conference on Artificial Intelligence, AAAI 2019, pages 2354--2361. AAAI Press, 2019.
[240]
[RS14] Jonathan E. Rowe and Dirk Sudholt. The choice of the offspring population size in the (1, λ) evolutionary algorithm. Theoretical Computer Science, 545:20--38, 2014.
[241]
[Rud94] Günter Rudolph. Convergence analysis of canonical genetic algorithms. IEEE Transactions on Neural Networks, 5:96--101, 1994.
[242]
[Rud96] Günter Rudolph. How mutation and selection solve long path problems in polynomial expected time. Evolutionary Computation, 4:195--205, 1996.
[243]
[Rud97] Günter Rudolph. Convergence Properties of Evolutionary Algorithms. Verlag Dr. Kovǎc, 1997.
[244]
[Rud98a] Günter Rudolph. Evolutionary search for minimal elements in partially ordered finite sets. In Evolutionary Programming, EP 1998, pages 345--353. Springer, 1998.
[245]
[Rud98b] Günter Rudolph. On a multi-objective evolutionary algorithm and its convergence to the pareto set. In Conference on Evolutionary Computation, ICEC 1998, pages 511--516. IEEE, 1998.
[246]
[RW20] Amirhossein Rajabi and Carsten Witt. Evolutionary algorithms with self-adjusting asymmetric mutation. In Parallel Problem Solving from Nature, PPSN 2020, Part I, pages 664--677. Springer, 2020.
[247]
[RW21a] Amirhossein Rajabi and Carsten Witt. Stagnation detection in highly multimodal fitness landscapes. In Genetic and Evolutionary Computation Conference, GECCO 2021, pages 1178--1186. ACM, 2021.
[248]
[RW21b] Amirhossein Rajabi and Carsten Witt. Stagnation detection with randomized local search. In Evolutionary Computation in Combinatorial Optimization, EvoCOP 2021, pages 152--168. Springer, 2021.
[249]
[RW22] Amirhossein Rajabi and Carsten Witt. Self-adjusting evolutionary algorithms for multimodal optimization. Algorithmica, 84:1694--1723, 2022.
[250]
[RWP08] J. Neal Richter, Alden H. Wright, and John Paxton. Ignoble trails - where crossover is provably harmful. In Parallel Problem Solving from Nature, PPSN 2008, pages 92--101. Springer, 2008.
[251]
[SGS11] Tom Schaul, Tobias Glasmachers, and Jürgen Schmidhuber. High dimensions and heavy tails for natural evolution strategies. In Genetic and Evolutionary Computation Conference, GECCO 2011, pages 845--852. ACM, 2011.
[252]
[SH87] Harold H. Szu and Ralph L. Hartley. Fast simulated annealing. Physics Letters A, 122:157--162, 1987.
[253]
[Sha02] Jonathan L. Shapiro. The sensitivity of PBIL to its learning rate, and how detailed balance can remove it. In Foundations of Genetic Algorithms, FOGA 2002, pages 115--132. Morgan Kaufmann, 2002.
[254]
[Sha05] Jonathan L. Shapiro. Drift and scaling in estimation of distribution algorithms. Evolutionary Computing, 13:99--123, 2005.
[255]
[Sha06] Jonathan L. Shapiro. Diversity loss in general estimation of distribution algorithms. In Parallel Problem Solving from Nature, PPSN 2006, pages 92--101. Springer, 2006.
[256]
[SN12] Andrew M. Sutton and Frank Neumann. A parameterized runtime analysis of evolutionary algorithms for the Euclidean traveling salesperson problem. In AAAI Conference on Artificial Intelligence, AAAI 2012, pages 1105--1111. AAAI Press, 2012.
[257]
[SNN14] Andrew M. Sutton, Frank Neumann, and Samadhi Nallaperuma. Parameterized runtime analyses of evolutionary algorithms for the planar Euclidean traveling salesperson problem. Evolutionary Computation, 22:595--628, 2014.
[258]
[ST12] Dirk Sudholt and Christian Thyssen. A simple ant colony optimizer for stochastic shortest path problems. Algorithmica, 64:643--672, 2012.
[259]
[Sto06] Tobias Storch. How randomized search heuristics find maximum cliques in planar graphs. In Genetic and Evolutionary Computation Conference, GECCO 2006, pages 567--574. ACM, 2006.
[260]
[Sto07] Tobias Storch. Finding large cliques in sparse semi-random graphs by simple randomized search heuristics. Theoretical Computer Science, 386:114--131, 2007.
[261]
[Sto08] Tobias Storch. On the choice of the parent population size. Evolutionary Computation, 16:557--578, 2008.
[262]
[STW04] Jens Scharnow, Karsten Tinnefeld, and Ingo Wegener. The analysis of evolutionary algorithms on sorting and shortest paths problems. Journal of Mathematical Modelling and Algorithms, 3:349--366, 2004.
[263]
[Sud05] Dirk Sudholt. Crossover is provably essential for the Ising model on trees. In Genetic and Evolutionary Computation Conference, GECCO 2005, pages 1161--1167. ACM, 2005.
[264]
[Sud13] Dirk Sudholt. A new method for lower bounds on the running time of evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 17:418--435, 2013.
[265]
[Sud17] Dirk Sudholt. How crossover speeds up building block assembly in genetic algorithms. Evolutionary Computation, 25:237--274, 2017.
[266]
[Sud21] Dirk Sudholt. Analysing the robustness of evolutionary algorithms to noise: refined runtime bounds and an example where noise is beneficial. Algorithmica, 83:976--1011, 2021.
[267]
[Sut21] Andrew M. Sutton. Fixed-parameter tractability of crossover: steady-state GAs on the closest string problem. Algorithmica, 83:1138--1163, 2021.
[268]
[SW04] Tobias Storch and Ingo Wegener. Real royal road functions for constant population size. Theoretical Computer Science, 320:123--134, 2004.
[269]
[SW19] Dirk Sudholt and Carsten Witt. On the choice of the update strength in estimation-of-distribution algorithms and ant colony optimization. Algorithmica, 81:1450--1489, 2019.
[270]
[SZ10] Dirk Sudholt and Christine Zarges. Analysis of an iterated local search algorithm for vertex coloring. In International Symposium on Algorithms and Computation, ISAAC 2010, Part. I, pages 340--352, Springer, 2010
[271]
[Thi03] Dirk Thierens. Convergence time analysis for the multi-objective counting ones problem. In Evolutionary Multi-Criterion Optimization, EMO 2003, pages 355--364. Springer, 2003.
[272]
[Wit05] Carsten Witt. Worst-case and average-case approximations by simple randomized search heuristics. In Symposium on Theoretical Aspects of Computer Science, STACS 2005, pages 44--56. Springer, 2005.
[273]
[Wit06] Carsten Witt. Runtime analysis of the (μ + 1) EA on simple pseudo-Boolean functions. Evolutionary Computation, 14:65--86, 2006.
[274]
[Wit13] Carsten Witt. Tight bounds on the optimization time of a randomized search heuristic on linear functions. Combinatorics, Probability & Computing, 22:294--318, 2013.
[275]
[Wit14] Carsten Witt. Revised analysis of the (1+1) EA for the minimum spanning tree problem. In Genetic and Evolutionary Computation Conference, GECCO 2014, pages 509--516. ACM, 2014.
[276]
[Wit19] Carsten Witt. Upper bounds on the running time of the univariate marginal distribution algorithm on OneMax. Algorithmica, 81:632--667, 2019.
[277]
[Wit21] Carsten Witt. On crossing fitness valleys with majority-vote crossover and estimation-ofdistribution algorithms. In Foundations of Genetic Algorithms, FOGA 2021, pages 2:1--2:15. ACM, 2021.
[278]
[WJ07] Richard A. Watson and Thomas Jansen. A building-block royal road where crossover is provably essential. In Genetic and Evolutionary Computation Conference, GECCO 2007, pages 1452--1459. ACM, 2007.
[279]
[WPN16] Junhua Wu, Sergey Polyakovskiy, and Frank Neumann. On the impact of the renting rate for the unconstrained nonlinear knapsack problem. In Genetic and Evolutionary Computation Conference, GECCO 2016, pages 413--419. ACM, 2016.
[280]
[WZD21] Shouda Wang, Weijie Zheng, and Benjamin Doerr. Choosing the right algorithm with hints from complexity theory. In International Joint Conference on Artificial Intelligence, IJCAI 2021, pages 1697--1703. ijcai.org, 2021.
[281]
[XNNS21] Yue Xie, Aneta Neumann, Frank Neumann, and Andrew M. Sutton. Runtime analysis of RLS and the (1+1) EA for the chance-constrained knapsack problem with correlated uniform weights. In Genetic and Evolutionary Computation Conference, GECCO 2021, pages 1187--1194. ACM, 2021.
[282]
[YL97] Xin Yao and Yong Liu. Fast evolution strategies. In Evolutionary Programming, volume 1213 of Lecture Notes in Computer Science, pages 151--162. Springer, 1997.
[283]
[YLL99] Xin Yao, Yong Liu, and Guangming Lin. Evolutionary programming made faster. IEEE Transactions on Evolutionary Computation, 3:82--102, 1999.
[284]
[YQZ15] Yang Yu, Chao Qian, and Zhi-Hua Zhou. Switch analysis for running time analysis of evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 19:777--792, 2015.
[285]
[ZD22a] Weijie Zheng and Benjamin Doerr. Better approximation guarantees for the NSGA-II by using the current crowding distance. In Genetic and Evolutionary Computation Conference, GECCO 2022, pages 611--619. ACM, 2022.
[286]
[ZD22b] Weijie Zheng and Benjamin Doerr. Runtime analysis for the NSGA-II: proving, quantifying, and explaining the inefficiency for three or more objectives. CoRR, abs/2211.13084, 2022.
[287]
[ZL07] Qingfu Zhang and Hui Li. MOEA/D: A multiobjective evolutionary algorithm based on decomposition. IEEE Transactions on Evolutionary Computation, 11:712--731, 2007.
[288]
[ZLD22] Weijie Zheng, Yufei Liu, and Benjamin Doerr. A first mathematical runtime analysis of the Non-Dominated Sorting Genetic Algorithm II (NSGA-II). In Conference on Artificial Intelligence, AAAI 2022, pages 10408--10416. AAAI Press, 2022
[289]
[ZQL+11] Aimin Zhou, Bo-Yang Qu, Hui Li, Shi-Zheng Zhao, Ponnuthurai Nagaratnam Suganthan, and Qingfu Zhang. Multiobjective evolutionary algorithms: A survey of the state of the art. Swarm and Evolutionary Computation, 1:32--49, 2011.
Information & Contributors
Information
Published In
July 2023
2519 pages
Copyright © 2023 ACM.
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].
Sponsors
Publisher
Association for Computing Machinery
New York, NY, United States
Publication History
Published: 24 July 2023
Check for updates
Qualifiers
- Tutorial
Funding Sources
- Agence National de Recherche
Conference
GECCO '23 Companion
Sponsor:
GECCO '23 Companion: Companion Conference on Genetic and Evolutionary Computation
July 15 - 19, 2023
Lisbon, Portugal
Acceptance Rates
Overall Acceptance Rate 1,669 of 4,410 submissions, 38%
Contributors
Other Metrics
Bibliometrics & Citations
Bibliometrics
Article Metrics
- 0Total Citations
- 68Total Downloads
- Downloads (Last 12 months)34
- Downloads (Last 6 weeks)1
Reflects downloads up to 18 Dec 2024
Other Metrics
Citations
Cited By
View allView Options
Login options
Check if you have access through your login credentials or your institution to get full access on this article.
Sign in