
Runtime analyses of multi-objective evolutionary algorithms in the presence of noise

Published: 19 August 2023

Abstract

In single-objective optimization, it is well known that evolutionary algorithms can, even without further adjustments, tolerate a certain amount of noise in the evaluation of the objective function. In contrast, this question is not at all understood for multi-objective optimization.
In this work, we conduct the first mathematical runtime analysis of a simple multi-objective evolutionary algorithm (MOEA) on a classic benchmark in the presence of noise in the objective function. We prove that when bit-wise prior noise with rate p ≤ α/n is present, for a suitable constant α, the simple evolutionary multi-objective optimizer (SEMO) without any adjustments to cope with noise finds the Pareto front of the OneMinMax benchmark in time O(n² log n), just as in the noise-free case. Given that the task here is to arrive at a population of n + 1 individuals witnessing the Pareto front, this is a surprisingly strong robustness to noise (comparably simple evolutionary algorithms cannot optimize the single-objective OneMax problem in polynomial time when p = ω(log(n)/n)). Our proofs suggest that the strong robustness of the MOEA stems from its implicit diversity mechanism, which is designed to enable it to compute a population covering the whole Pareto front.
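The abstract describes the algorithm and noise model only in words; the following minimal Python sketch makes the evaluate-once setting concrete. It assumes that bit-wise prior noise means each bit of a temporary copy of the search point is flipped independently with probability p before the objective is computed; the function names, the dominance test, and the coverage check are illustrative choices, not taken from the paper.

```python
import random

def one_min_max(x):
    """OneMinMax: maximize both the number of zeros and the number of ones in x."""
    ones = sum(x)
    return (len(x) - ones, ones)

def noisy_value(x, p):
    """Bit-wise prior noise (assumed reading): each bit of a temporary copy of x is
    flipped independently with probability p; the search point itself is unchanged."""
    return one_min_max([b ^ (random.random() < p) for b in x])

def weakly_dominates(u, v):
    """u weakly dominates v if u is at least as good in both (maximized) objectives."""
    return u[0] >= v[0] and u[1] >= v[1]

def semo_evaluate_once(n, p, max_iters=10**7):
    """SEMO in the evaluate-once setting: each solution is evaluated exactly once and
    the stored (possibly noisy) value is used in all later comparisons.
    Returns the number of iterations until the true Pareto front is covered."""
    x = tuple(random.randint(0, 1) for _ in range(n))
    pop = {x: noisy_value(x, p)}                    # search point -> stored objective value
    for t in range(1, max_iters + 1):
        parent = random.choice(list(pop))           # uniform parent selection
        i = random.randrange(n)
        child = parent[:i] + (1 - parent[i],) + parent[i + 1:]   # flip one uniform bit
        f_child = noisy_value(child, p)             # evaluated once, never again
        if not any(weakly_dominates(f, f_child) for f in pop.values()):
            pop = {y: f for y, f in pop.items() if not weakly_dominates(f_child, f)}
            pop[child] = f_child
        # Progress check against the *true* objective values (never seen by the algorithm).
        if len({one_min_max(y) for y in pop}) == n + 1:
            return t
    return None
```

Because an offspring is rejected whenever some stored value weakly dominates its own, the population never stores two solutions with the same objective vector, so its size stays at most n + 1.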
Interestingly, this result only holds when the objective value of a solution is determined once and the algorithm from then on works with this possibly noisy value. We prove that when all solutions are reevaluated in each iteration, any noise rate p = ω(log(n)/n²) leads to a super-polynomial runtime. This is very different from single-objective optimization, where it is generally preferable to reevaluate solutions whenever their fitness is important, and where examples are known in which not reevaluating solutions can lead to catastrophic performance losses.
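For contrast, here is an illustrative sketch of the reevaluation scheme discussed above, reusing the helpers from the previous block; the one change is that every stored solution receives a fresh noisy objective value in each iteration before the offspring is compared.

```python
def semo_reevaluate(n, p, max_iters=10**7):
    """SEMO variant in which all solutions are reevaluated (with fresh noise) in every
    iteration; reuses one_min_max, noisy_value and weakly_dominates from above."""
    pop = [tuple(random.randint(0, 1) for _ in range(n))]
    for t in range(1, max_iters + 1):
        values = [noisy_value(y, p) for y in pop]   # fresh noisy evaluations each iteration
        parent = random.choice(pop)
        i = random.randrange(n)
        child = parent[:i] + (1 - parent[i],) + parent[i + 1:]
        f_child = noisy_value(child, p)
        if not any(weakly_dominates(f, f_child) for f in values):
            # With fresh noise, the population may temporarily hold several solutions
            # whose true objective vectors coincide; this sketch does not deduplicate.
            pop = [y for y, f in zip(pop, values) if not weakly_dominates(f_child, f)]
            pop.append(child)
        if len({one_min_max(y) for y in pop}) == n + 1:     # true front covered
            return t
    return None
```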

Cited By

  • (2024) Hot off the Press: Runtime Analyses of Multi-Objective Evolutionary Algorithms in the Presence of Noise. Proceedings of the Genetic and Evolutionary Computation Conference Companion, pages 33-34. DOI: 10.1145/3638530.3664070. Online publication date: 14-Jul-2024.
  • (2024) A Gentle Introduction to Theory (for Non-Theoreticians). Proceedings of the Genetic and Evolutionary Computation Conference Companion, pages 800-829. DOI: 10.1145/3638530.3648402. Online publication date: 14-Jul-2024.
  • (2024) Already Moderate Population Sizes Provably Yield Strong Robustness to Noise. Proceedings of the Genetic and Evolutionary Computation Conference, pages 1524-1532. DOI: 10.1145/3638529.3654196. Online publication date: 14-Jul-2024.
  • (2024) A Block-Coordinate Descent EMO Algorithm: Theoretical and Empirical Analysis. Proceedings of the Genetic and Evolutionary Computation Conference, pages 493-501. DOI: 10.1145/3638529.3654169. Online publication date: 14-Jul-2024.

Published In

IJCAI '23: Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
August 2023
7242 pages
ISBN: 978-1-956792-03-4
Sponsor: International Joint Conferences on Artificial Intelligence (IJCAI)
