Levels of Details for Gaussian Mixture Models

  • Conference paper
Computer Vision – ACCV 2009 (ACCV 2009)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5995)

Abstract

Mixtures of Gaussians are a crucial statistical modeling tool at the heart of many challenging applications in computer vision and machine learning. In this paper, we first describe a novel and efficient algorithm for simplifying Gaussian mixture models using a generalization of the celebrated k-means quantization algorithm tailored to relative entropy. Our method is shown experimentally to compare favourably with the state of the art in both running time and quality. Second, we propose a practical enhanced approach that provides a hierarchical representation of the simplified GMM while automatically computing the optimal number of Gaussians in the simplified mixture. An application to clustering-based image segmentation is reported.
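The simplification step outlined in the abstract can be pictured as a Lloyd-type k-means iteration over the mixture components, with relative entropy (Kullback–Leibler divergence) as the distortion measure. The sketch below is an illustrative stand-in, not the authors' exact algorithm: it restricts to 1-D Gaussians, initialises centroids from the heaviest components (an arbitrary choice), and uses moment matching as the centroid update.

```python
import math

def kl_gauss(m0, v0, m1, v1):
    """KL divergence KL(N(m0, v0) || N(m1, v1)) between 1-D Gaussians."""
    return 0.5 * (v0 / v1 + (m1 - m0) ** 2 / v1 - 1.0 + math.log(v1 / v0))

def simplify_gmm(weights, means, variances, k, iters=50):
    """Lloyd-style k-means over Gaussian components with KL as distortion.

    Each cluster centroid is the moment-matched merge of its assigned
    components, so it preserves the sub-mixture's mean and variance.
    Returns the simplified mixture (weights, means, variances).
    """
    n = len(weights)
    # Initialise centroids from the k heaviest components (illustrative choice).
    order = sorted(range(n), key=lambda i: -weights[i])
    cm = [means[i] for i in order[:k]]
    cv = [variances[i] for i in order[:k]]
    assign = [0] * n
    for _ in range(iters):
        # Assignment step: map each component to its KL-nearest centroid.
        new_assign = [
            min(range(k), key=lambda j: kl_gauss(means[i], variances[i], cm[j], cv[j]))
            for i in range(n)
        ]
        if new_assign == assign:
            break
        assign = new_assign
        # Update step: moment-matched centroid of each cluster.
        for j in range(k):
            idx = [i for i in range(n) if assign[i] == j]
            if not idx:
                continue
            w = sum(weights[i] for i in idx)
            m = sum(weights[i] * means[i] for i in idx) / w
            v = sum(weights[i] * (variances[i] + means[i] ** 2) for i in idx) / w - m ** 2
            cm[j], cv[j] = m, v
    cw = [sum(weights[i] for i in range(n) if assign[i] == j) for j in range(k)]
    return cw, cm, cv
```

For example, four equally weighted components at means 0.0, 0.1, 5.0, 5.1 collapse under k = 2 into two Gaussians centred near 0.05 and 5.05, each of weight 0.5. The moment-matching update here is only a toy 1-D stand-in for the relative-entropy centroids of the full method, which operates on multivariate Gaussians.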

Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Garcia, V., Nielsen, F., Nock, R. (2010). Levels of Details for Gaussian Mixture Models. In: Zha, H., Taniguchi, Ri., Maybank, S. (eds) Computer Vision – ACCV 2009. ACCV 2009. Lecture Notes in Computer Science, vol 5995. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-12304-7_48

  • DOI: https://doi.org/10.1007/978-3-642-12304-7_48

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-12303-0

  • Online ISBN: 978-3-642-12304-7

  • eBook Packages: Computer Science (R0)
