Abstract
In this paper we present a sequential expectation-maximization (EM) algorithm that adapts a Gaussian mixture model (GMM) classifier in an unsupervised manner. The goal is to cope with non-stationarity in the data to be classified and hence preserve classification accuracy. Experimental results on synthetic data show that the method learns time-varying statistical features in the data by adapting the GMM online. To control the adaptation and ensure the stability of the adapted model, we introduce an index that detects when adaptation would fail.
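The abstract describes adapting a GMM online via sequential EM. As a rough illustration of the general idea (not the authors' exact algorithm, whose index-based stability control and step-size schedule are not given here), the sketch below performs a stepwise EM update per sample using exponentially decayed sufficient statistics; the class name, learning rate `eta`, and initialisation are illustrative assumptions.

```python
import numpy as np

class OnlineGMM:
    """Sketch of sequential (online) EM for a 1-D Gaussian mixture.

    Old sufficient statistics are exponentially decayed, so the model
    tracks slowly drifting (non-stationary) data. Illustrative only.
    """

    def __init__(self, means, variances, weights, eta=0.02):
        self.mu = np.asarray(means, dtype=float)       # component means
        self.var = np.asarray(variances, dtype=float)  # component variances
        self.pi = np.asarray(weights, dtype=float)     # mixing weights
        self.eta = eta                                 # learning rate (assumed)
        # decayed sufficient statistics, initialised from the current model
        self.s0 = self.pi.copy()
        self.s1 = self.pi * self.mu
        self.s2 = self.pi * (self.var + self.mu ** 2)

    def _responsibilities(self, x):
        # E-step: posterior probability of each component given sample x
        dens = (self.pi / np.sqrt(2 * np.pi * self.var)
                * np.exp(-0.5 * (x - self.mu) ** 2 / self.var))
        return dens / dens.sum()

    def update(self, x):
        r = self._responsibilities(x)
        # sequential E-step: decay old statistics, blend in the new sample
        self.s0 = (1 - self.eta) * self.s0 + self.eta * r
        self.s1 = (1 - self.eta) * self.s1 + self.eta * r * x
        self.s2 = (1 - self.eta) * self.s2 + self.eta * r * x ** 2
        # M-step: re-estimate parameters from the decayed statistics
        self.pi = self.s0 / self.s0.sum()
        self.mu = self.s1 / self.s0
        self.var = np.maximum(self.s2 / self.s0 - self.mu ** 2, 1e-6)
```

Feeding a stream of samples whose class-conditional distributions drift over time, `update` keeps the component parameters tracking the current statistics, which is the behaviour the abstract's adaptation scheme targets.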
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Awwad Shiekh Hasan, B., Gan, J.Q. (2009). Sequential EM for Unsupervised Adaptive Gaussian Mixture Model Based Classifier. In: Perner, P. (eds) Machine Learning and Data Mining in Pattern Recognition. MLDM 2009. Lecture Notes in Computer Science(), vol 5632. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-03070-3_8
Print ISBN: 978-3-642-03069-7
Online ISBN: 978-3-642-03070-3