Abstract
Mixture modeling is one of the simplest ways to represent complicated probability density functions and to integrate information from different sources. In the context of information geometry, there are two typical kinds of mixture: the m-mixture and the e-mixture. This paper proposes a novel framework for non-parametric e-mixture modeling with a simple estimation algorithm based on geometrical insights into the characteristics of the e-mixture. An experimental result supports the proposed framework.
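For concreteness, the two mixtures referred to above can be sketched as follows; this is the standard information-geometric formulation, and the symbols $p_i$, $\pi_i$, and the normalizer $b(\pi)$ are generic notation rather than definitions taken from the paper itself:

% Standard definitions of the m-mixture and e-mixture of K component
% densities p_1, ..., p_K with mixing weights \pi (generic notation,
% assumed here for illustration).
\begin{align*}
  p_m(x;\pi) &= \sum_{i=1}^{K} \pi_i\, p_i(x) && \text{(m-mixture)} \\
  \log p_e(x;\pi) &= \sum_{i=1}^{K} \pi_i \log p_i(x) - b(\pi) && \text{(e-mixture)} \\
  b(\pi) &= \log \int \exp\!\left( \sum_{i=1}^{K} \pi_i \log p_i(x) \right) dx,
  \qquad \pi_i \ge 0,\ \sum_{i=1}^{K} \pi_i = 1
\end{align*}

The m-mixture combines the component densities linearly, whereas the e-mixture combines their logarithms and then renormalizes via $b(\pi)$; the latter is the form whose non-parametric estimation the paper addresses.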
Acknowledgement
Part of this work was supported by JSPS KAKENHI Nos. 25120009, 25120011, and 16K16108.
Cite this paper
Hino, H., Takano, K., Akaho, S., Murata, N. (2016). Non-parametric e-mixture of Density Functions. In: Hirose, A., Ozawa, S., Doya, K., Ikeda, K., Lee, M., Liu, D. (eds) Neural Information Processing. ICONIP 2016. Lecture Notes in Computer Science, vol 9948. Springer, Cham. https://doi.org/10.1007/978-3-319-46672-9_1