Abstract
We present a novel method for manifold learning, i.e., the identification of the low-dimensional manifold-like structure present in a set of data points in a possibly high-dimensional space. The main idea is derived from the concept of Riemannian normal coordinates, a coordinate system that generalizes Cartesian coordinates in Euclidean space. We translate this idea to a cloud of data points in order to perform dimension reduction. Our implementation currently uses Dijkstra’s algorithm to compute shortest paths in graphs, together with some basic concepts from differential geometry. We expect this approach to open up new possibilities for the analysis of, e.g., shape in medical imaging and signal processing of manifold-valued signals, where the coordinate system is “learned” from experimental high-dimensional data rather than defined analytically using, e.g., models based on Lie groups.
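To make the idea concrete, here is a minimal sketch of this style of embedding, not the authors' implementation: geodesic distances from a base point are computed with Dijkstra's algorithm on a k-nearest-neighbour graph, and each point receives the direction of its geodesic's first step, projected into an estimated tangent plane at the base point. The function name `normal_coords`, the choice of `k`, and the SVD-based tangent-plane estimate are all illustrative assumptions.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra

def normal_coords(X, k=8, base=0, d=2):
    """Sketch of a normal-coordinates embedding of X (n points in R^D)."""
    n = len(X)
    # Pairwise Euclidean distances and a k-nearest-neighbour graph.
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    idx = np.argsort(D, axis=1)[:, 1:k + 1]          # k neighbours of each point
    rows = np.repeat(np.arange(n), k)
    cols = idx.ravel()
    W = csr_matrix((D[rows, cols], (rows, cols)), shape=(n, n))

    # Geodesic distances (graph shortest paths) and predecessors from the base.
    dist, pred = dijkstra(W, directed=False, indices=base,
                          return_predecessors=True)

    # Tangent basis at the base point: top-d principal directions of its
    # neighbourhood (an assumption; other local estimates would also work).
    nb = X[idx[base]] - X[base]
    _, _, Vt = np.linalg.svd(nb, full_matrices=False)
    B = Vt[:d]                                        # (d, D) orthonormal rows

    # Each point gets polar-style coordinates: geodesic radius times the
    # tangent-plane direction of the geodesic's first step away from the base.
    Y = np.zeros((n, d))
    for i in range(n):
        if i == base or not np.isfinite(dist[i]):
            continue
        j = i                                         # walk back to the base
        while pred[j] != base and pred[j] >= 0:
            j = pred[j]
        v = B @ (X[j] - X[base])                      # first step, projected
        nv = np.linalg.norm(v)
        if nv > 0:
            Y[i] = dist[i] * v / nv
    return Y
```

By construction the base point maps to the origin and every other point keeps its geodesic distance to the base as its Euclidean norm in the embedding, which is the defining property of Riemannian normal coordinates.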
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Brun, A., Westin, CF., Herberthson, M., Knutsson, H. (2005). Fast Manifold Learning Based on Riemannian Normal Coordinates. In: Kalviainen, H., Parkkinen, J., Kaarna, A. (eds) Image Analysis. SCIA 2005. Lecture Notes in Computer Science, vol 3540. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11499145_93
DOI: https://doi.org/10.1007/11499145_93
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-26320-3
Online ISBN: 978-3-540-31566-7