DOI: 10.5555/2888116.2888125
Article

Pareto ensemble pruning

Published: 25 January 2015

Abstract

Ensemble learning, which trains and combines many base learners, is among the state-of-the-art learning techniques. Ensemble pruning removes some of the base learners of an ensemble and has been shown to further improve the generalization performance. However, the two goals of ensemble pruning, i.e., maximizing the generalization performance and minimizing the number of base learners, can conflict when pushed to the limit. Most previous ensemble pruning approaches optimize objectives that mix the two goals. In this paper, motivated by recent theoretical advances in evolutionary optimization, we investigate solving the two goals explicitly in a bi-objective formulation and propose the PEP (Pareto Ensemble Pruning) approach. We disclose that PEP not only achieves significantly better performance than the state-of-the-art approaches, but also gains theoretical support.
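To make the bi-objective formulation concrete, here is a minimal sketch of Pareto-style ensemble pruning. It is not the PEP implementation from the paper, only an illustration of the idea: each candidate pruning is a binary selector over the base learners, the two objectives are the validation error of the pruned ensemble and its size, and a Pareto archive of non-dominated selectors is evolved by bit-flip mutation. The function names, the majority-vote error estimate, and the mutation scheme are assumptions made for this sketch.

```python
# Minimal sketch of bi-objective ensemble pruning with a Pareto archive.
# NOT the paper's PEP implementation; it only illustrates treating
# (validation error, ensemble size) as two objectives and evolving binary
# selector vectors by bit-flip mutation while keeping all non-dominated
# solutions. All names below are illustrative assumptions.
import random


def ensemble_error(selector, predictions, labels):
    """Validation error of the majority vote over the selected base learners.

    `predictions` is a list of per-learner prediction lists (labels in {0, 1});
    `selector` is a 0/1 tuple indicating which learners are kept.
    """
    chosen = [p for keep, p in zip(selector, predictions) if keep]
    if not chosen:                 # empty ensemble: treat as worst-case error
        return 1.0
    wrong = 0
    for i, y in enumerate(labels):
        votes = sum(p[i] for p in chosen)
        pred = 1 if 2 * votes >= len(chosen) else 0   # simple majority vote
        wrong += int(pred != y)
    return wrong / len(labels)


def weakly_dominates(a, b):
    """True if objective vector a is no worse than b in both objectives."""
    return a[0] <= b[0] and a[1] <= b[1]


def pareto_prune(predictions, labels, iterations=2000, rng=random):
    """Evolve a Pareto archive of selector vectors and return it."""
    n = len(predictions)
    empty = tuple([0] * n)
    archive = {empty: (ensemble_error(empty, predictions, labels), 0)}
    for _ in range(iterations):
        parent = rng.choice(list(archive))
        # flip each bit independently with probability 1/n
        child = tuple(b ^ int(rng.random() < 1.0 / n) for b in parent)
        obj = (ensemble_error(child, predictions, labels), sum(child))
        # discard the child if some archived solution strictly dominates it
        if any(weakly_dominates(o, obj) and o != obj for o in archive.values()):
            continue
        # otherwise keep it and drop archive members it weakly dominates
        archive = {s: o for s, o in archive.items()
                   if not weakly_dominates(obj, o)}
        archive[child] = obj
    return archive
```

After the loop, the archive holds the non-dominated trade-offs between validation error and ensemble size; a final ensemble can then be picked from it, e.g., the smallest one among those with the lowest validation error.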


Published In

AAAI'15: Proceedings of the Twenty-Ninth AAAI Conference on Artificial Intelligence
January 2015
4331 pages
ISBN: 0262511290

Sponsors

• Association for the Advancement of Artificial Intelligence

Publisher

AAAI Press


Cited By

• (2024) Stock market prediction with time series data and news headlines: a stacking ensemble approach. Journal of Intelligent Information Systems 62(1):27-56. DOI: 10.1007/s10844-023-00804-1. Online publication date: 1-Feb-2024.
• (2021) Evolutionary Large-Scale Multi-Objective Optimization: A Survey. ACM Computing Surveys 54(8):1-34. DOI: 10.1145/3470971. Online publication date: 4-Oct-2021.
• (2018) Noisy derivative-free optimization with value suppression. Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence and Thirtieth Innovative Applications of Artificial Intelligence Conference and Eighth AAAI Symposium on Educational Advances in Artificial Intelligence, 1447-1454. DOI: 10.5555/3504035.3504212. Online publication date: 2-Feb-2018.
• (2018) Efficiently Optimizing for Dendritic Connectivity on Tree-Structured Networks in a Multi-Objective Framework. Proceedings of the 1st ACM SIGCAS Conference on Computing and Sustainable Societies, 1-8. DOI: 10.1145/3209811.3209878. Online publication date: 20-Jun-2018.
• (2018) X-CLEaVER. ACM Transactions on Intelligent Systems and Technology 9(6):1-26. DOI: 10.1145/3205453. Online publication date: 29-Oct-2018.
• (2018) Evolutionary Strategy to Perform Batch-Mode Active Learning on Multi-Label Data. ACM Transactions on Intelligent Systems and Technology 9(4):1-26. DOI: 10.1145/3161606. Online publication date: 30-Jan-2018.
• (2018) Structural diversity for decision tree ensemble learning. Frontiers of Computer Science: Selected Publications from Chinese Universities 12(3):560-570. DOI: 10.1007/s11704-018-7151-8. Online publication date: 1-Jun-2018.
• (2018) Sequential quadratic programming enhanced backtracking search algorithm. Frontiers of Computer Science: Selected Publications from Chinese Universities 12(2):316-330. DOI: 10.1007/s11704-016-5556-9. Online publication date: 1-Apr-2018.
• (2017) Solving high-dimensional multi-objective optimization problems with low effective dimensions. Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence, 875-881. DOI: 10.5555/3298239.3298367. Online publication date: 4-Feb-2017.
• (2017) A Multiobjective Cooperative Coevolutionary Algorithm for Hyperspectral Sparse Unmixing. IEEE Transactions on Evolutionary Computation 21(2):234-248. DOI: 10.1109/TEVC.2016.2598858. Online publication date: 1-Apr-2017.
