DOI: 10.1007/978-3-642-42042-9_10
Article

Dynamic Ensemble of Ensembles in Nonstationary Environments

Published: 03 November 2013

Abstract

Classifier ensembles are an active topic in learning from non-stationary data. In particular, batch growing ensemble methods represent one important direction for dealing with the concept drift involved in non-stationary data. However, current batch growing ensemble methods combine only the available component classifiers, each trained independently from a batch of non-stationary data. They simply discard the interim ensembles and hence may lose useful information carried by those fine-tuned interim ensembles. Distinctively, we introduce a comprehensive hierarchical approach called Dynamic Ensemble of Ensembles (DE2). The novel method combines classifiers as an ensemble of all the interim ensembles, built dynamically from consecutive batches of non-stationary data. DE2 includes two key stages: (1) component classifiers and interim ensembles are trained dynamically; (2) the final ensemble is then learned by exponentially-weighted averaging over the available experts, i.e., the interim ensembles. We engage sparsity learning to choose component classifiers selectively and intelligently. We also incorporate the techniques of Dynamic Weighted Majority and Learn++.NSE to better integrate different classifiers dynamically. We perform experiments on data from a typical non-stationary environment, the Pascal Large Scale Learning Challenge 2008 Webspam data, and compare our DE2 method with other conventional competitive ensemble methods. Experimental results confirm that our approach consistently leads to better performance and shows promising generalization ability for learning in non-stationary environments.
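The core combination step the abstract describes, exponentially-weighted averaging over experts (here, interim ensembles), can be sketched as follows. This is a minimal illustration in the spirit of Weighted Majority-style schemes, not the authors' exact formulation: the function name, the cumulative-loss bookkeeping, and the learning rate `eta` are illustrative assumptions.

```python
import numpy as np

def exp_weighted_average(expert_preds, cum_losses, eta=0.5):
    """Combine expert predictions by exponentially-weighted averaging.

    expert_preds: array of shape (n_experts, n_samples) with each expert's
                  class scores in [0, 1].
    cum_losses:   cumulative loss incurred so far by each expert; experts
                  with lower loss receive exponentially larger weight.
    eta:          learning rate controlling how sharply weights concentrate.
    """
    weights = np.exp(-eta * np.asarray(cum_losses, dtype=float))
    weights /= weights.sum()                    # normalize to a distribution
    return weights @ np.asarray(expert_preds)   # weighted average per sample
```

With equal losses every expert gets weight 1/n; as one expert's loss grows, its influence decays exponentially, which is how such schemes track the currently best expert under drift.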

References

[1]
Kolter, J., Maloof, M.: Dynamic weighted majority: An ensemble method for drifting concepts. Journal of Machine Learning Research 8, 2755–2790 (2007)
[2]
Schlimmer, J., Granger, R.: Beyond incremental processing: Tracking concept drift. In: Proceedings of the National Conference on Artificial Intelligence, pp. 502–507 (1986)
[3]
Widmer, G., Kubat, M.: Learning in the presence of concept drift and hidden contexts. Machine Learning 23(1), 69–101 (1996)
[4]
Kuncheva, L.: Classifier ensembles for changing environments. In: Proceedings of the International Workshop on Multiple Classifier Systems, pp. 1–15 (2004)
[5]
Littlestone, N., Warmuth, M.: The weighted majority algorithm. Information and Computation 108(2), 212–261 (1994)
[6]
Herbster, M., Warmuth, M.: Tracking the best expert. Machine Learning 32(2), 151–178 (1998)
[7]
Bousquet, O., Warmuth, M.: Tracking a small set of experts by mixing past posteriors. Journal of Machine Learning Research 3, 363–396 (2002)
[8]
Kolter, J., Maloof, M.: Using additive expert ensembles to cope with concept drift. In: Proceedings of the International Conference on Machine Learning, pp. 449–456 (2005)
[9]
Kolter, J., Maloof, M.: Dynamic weighted majority: An ensemble method for drifting concepts. In: Proceedings of the IEEE International Conference on Data Mining, pp. 123–130 (2003)
[10]
Street, W., Kim, Y.: A streaming ensemble algorithm (SEA) for large-scale classification. In: Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 377–382 (2001)
[11]
Fan, W.: StreamMiner: A classifier ensemble-based engine to mine concept-drifting data streams. In: Proceedings of the International Conference on Very Large Data Bases, pp. 1257–1260 (2004)
[12]
Chen, S., He, H.: SERA: Selectively recursive approach towards nonstationary imbalanced stream data mining. In: International Joint Conference on Neural Networks, pp. 522–529 (2009)
[13]
Chen, S., He, H.: Toward incremental learning of nonstationary imbalanced data stream: A multiple selectively recursive approach. Evolving Systems 2(1), 30–50 (2011)
[14]
Elwell, R., Polikar, R.: Incremental learning of concept drift in nonstationary environments. IEEE Transactions on Neural Networks 22(10), 1517–1531 (2011)
[15]
Shalizi, C., Jacobs, A., Klinkner, K., Clauset, A.: Adapting to non-stationarity with growing expert ensembles. arXiv:1103.0949v2 (2011)
[16]
Webb, S., Caverlee, J., Pu, C.: Introducing the Webb Spam Corpus: Using email spam to identify web spam automatically. In: Proceedings of the Third Conference on Email and Anti-Spam (2006)
[17]
Chang, C.C., Lin, C.J.: LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology 2(3), 1–27 (2011)

Published In

ICONIP 2013: Proceedings, Part II, of the 20th International Conference on Neural Information Processing - Volume 8227
November 2013, 768 pages
ISBN: 9783642420412
Editors: Minho Lee, Akira Hirose, Zeng-Guang Hou, Rhee Man Kil
Publisher: Springer-Verlag, Berlin, Heidelberg

    Author Tags

    1. Ensemble of ensembles
    2. classifier ensemble
    3. concept drift
    4. growing ensemble
    5. nonstationary environment
    6. sparsity learning
