An Improved Mixture of Experts Model: Divide and Conquer Using Random Prototypes

Chapter in Ensembles in Machine Learning Applications

Part of the book series: Studies in Computational Intelligence (SCI, volume 373)

Abstract

The Mixture of Experts (ME) is one of the most popular ensemble methods used in Pattern Recognition and Machine Learning. This algorithm stochastically partitions the input space of a problem into a number of subspaces, with experts becoming specialized in each subspace. To manage this process, the ME uses an expert called the gating network, which is trained together with the other experts. In this chapter, we propose a modified version of the ME algorithm which first partitions the original problem into centralized regions and then uses a simple distance-based gating function to specialize the expert networks. Each expert contributes to classifying an input sample according to the distance between the input and a prototype embedded by the expert. The Hierarchical Mixture of Experts (HME) is a tree-structured architecture which can be considered a natural extension of the ME model. The training and testing strategies of the standard HME model are also modified, based on the same insight applied to the standard ME. In both cases, the proposed approach does not require training the gating networks, as they are implemented with simple distance-based rules. As a result, the overall time required for training a modified ME/HME system is considerably lower. Moreover, centralizing input subspaces and adopting a random strategy for selecting prototypes increases both the individual accuracy and the diversity of the ME/HME modules, which in turn increases the accuracy of the overall ensemble. Experimental results on a binary toy problem and on selected datasets from the UCI machine learning repository show the robustness of the proposed methods compared to the standard ME/HME models.
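
To make the idea concrete, below is a minimal sketch of such a distance-gated mixture in Python (NumPy). It follows the recipe described in the abstract: prototypes are drawn at random from the training set, each expert is trained on the region of samples nearest its prototype, and at test time the gate weights the expert outputs by their proximity to the input. The names (DistanceGatedME, fit_distance_gated_me), the Euclidean metric, and the softmax-over-negative-distances gate are illustrative assumptions, not details fixed by the chapter, which may use a different distance-based rule.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class DistanceGatedME:
    # Mixture of experts whose gate is a fixed, distance-based rule:
    # no gating network is trained.
    def __init__(self, experts, prototypes, temperature=1.0):
        self.experts = experts                    # fitted classifiers with .predict_proba(X)
        self.prototypes = np.asarray(prototypes)  # shape (n_experts, n_features)
        self.temperature = temperature

    def gate(self, X):
        # Weight each expert by proximity of the input to its prototype
        # (assumed rule: softmax of negative Euclidean distances).
        d = np.linalg.norm(X[:, None, :] - self.prototypes[None, :, :], axis=2)
        return softmax(-d / self.temperature)     # (n_samples, n_experts)

    def predict_proba(self, X):
        g = self.gate(X)
        # Note: assumes every region contains samples of every class,
        # so the expert output columns align.
        p = np.stack([e.predict_proba(X) for e in self.experts], axis=1)
        return (g[:, :, None] * p).sum(axis=1)    # convex combination of expert outputs

    def predict(self, X):
        return self.predict_proba(X).argmax(axis=1)

def fit_distance_gated_me(X, y, make_expert, n_experts, seed=0):
    # Draw random prototypes from the training samples, assign each
    # sample to its nearest prototype, and train one expert per region.
    rng = np.random.default_rng(seed)
    prototypes = X[rng.choice(len(X), size=n_experts, replace=False)]
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    region = d.argmin(axis=1)
    experts = []
    for k in range(n_experts):
        expert = make_expert()
        mask = region == k
        expert.fit(X[mask], y[mask])
        experts.append(expert)
    return DistanceGatedME(experts, prototypes)
```

With scikit-learn, make_expert could be as simple as lambda: MLPClassifier(hidden_layer_sizes=(10,)). The HME variant described above would apply the same distance-based rule at each level of a tree of such modules, rather than through a single flat gate.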

References

  1. Armano, G., Hatami, N.: Random prototype-based oracle for selection-fusion ensembles. In: Proc. the 20th Int. Conf. Patt. Recogn., Istanbul, Turkey, pp. 77–80. IEEE Comp. Society, Los Alamitos (2010)

  2. Avnimelech, R., Intrator, N.: Boosted mixture of experts: An ensemble learning scheme. Neural Comp. 11, 483–497 (1999)

  3. Ebrahimpour, R., Kabir, E., Yousefi, M.R.: Teacher-directed learning in view-independent face recognition with mixture of experts using overlapping eigenspaces. Comp. Vision and Image Understanding 111, 195–206 (2008)

  4. Haykin, S.: Neural networks: A comprehensive foundation. Prentice Hall, Upper Saddle River (1999)

  5. Jacobs, R., Jordan, M.I., Barto, A.: Task decomposition through competition in a modular connectionist architecture: The what and where vision tasks. Technical Report 90-44, Univ. Massachusetts, Amherst (1991)

  6. Jacobs, R., Jordan, M.I., Nowlan, S., Hinton, G.: Adaptive mixtures of local experts. Neural Comp. 3, 79–87 (1991)

  7. Jordan, M.I., Jacobs, R.A.: Hierarchical mixtures of experts and the EM algorithm. Neural Comp. 6, 181–214 (1994)

  8. Kuncheva, L.I.: Combining pattern classifiers: Methods and algorithms. John Wiley & Sons, Hoboken (2004)

  9. Tang, B., Heywood, M., Shepherd, M.: Input partitioning to mixture of experts. In: Proc. the 2002 Int. Joint Conf. Neural Networks, Honolulu, HI, pp. 227–232. IEEE Comp. Society, Los Alamitos (2002)

  10. Wan, E., Bone, D.: Interpolating earth-science data using RBF networks and mixtures of experts. In: Mozer, M., Jordan, M.I., Petsche, T. (eds.) Advances in Neural Inf. Proc. Syst., vol. 9, pp. 988–994. MIT Press, Cambridge (1997)

  11. Waterhouse, S., Cook, G.: Ensemble methods for phoneme classification. In: Mozer, M., Jordan, M.I., Petsche, T. (eds.) Advances in Neural Inf. Proc. Syst., vol. 9, pp. 800–806. MIT Press, Cambridge (1997)

  12. UCI Repository of Machine Learning Databases, Dept. of Inf. and Comp. Sci., Univ. of California, Irvine, http://archive.ics.uci.edu/ml/

Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Armano, G., Hatami, N. (2011). An Improved Mixture of Experts Model: Divide and Conquer Using Random Prototypes. In: Okun, O., Valentini, G., Re, M. (eds) Ensembles in Machine Learning Applications. Studies in Computational Intelligence, vol 373. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-22910-7_13

  • DOI: https://doi.org/10.1007/978-3-642-22910-7_13

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-22909-1

  • Online ISBN: 978-3-642-22910-7

  • eBook Packages: Engineering, Engineering (R0)
