Abstract
We discuss how dimensionality reduction algorithms can be used to put data points in high-dimensional spaces into correspondence by learning a transformation between assigned data points on a lower-dimensional structure. We assume that similar high-dimensional feature spaces are characterized by a similar underlying low-dimensional structure. To determine an affine transformation between two data sets, we make use of well-known dimensionality reduction algorithms. We demonstrate this procedure for applications such as classification and assignment between two given data sets, and we evaluate six well-known algorithms in several experiments with different objectives. We show that with these algorithms and our transformation approach, high-dimensional data sets can be related to each other. We also show that linear methods are more suitable for assignment tasks, whereas graph-based methods appear to be superior for classification tasks.
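The following is a minimal sketch, not the authors' implementation, of the idea summarized in the abstract: embed two high-dimensional data sets into a shared low-dimensional space, estimate an affine transformation from a few known correspondences, and use it to assign points of one set to points of the other. PCA stands in here for any of the six algorithms compared in the paper; the data, dimensions, and number of correspondences are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Two high-dimensional data sets assumed to share an underlying low-dimensional structure.
X = rng.normal(size=(200, 50))                                          # source set, 200 points in 50-D
Y = X @ rng.normal(size=(50, 50)) + 0.01 * rng.normal(size=(200, 50))   # related target set

# Step 1: embed each set separately into d dimensions (PCA as a placeholder method).
d = 3
Zx = PCA(n_components=d).fit_transform(X)
Zy = PCA(n_components=d).fit_transform(Y)

# Step 2: estimate an affine map Zy ~ Zx @ A + b from a handful of assigned correspondences.
idx = np.arange(20)                                    # indices of points with known assignments
Zx_h = np.hstack([Zx[idx], np.ones((len(idx), 1))])    # homogeneous coordinates
M, *_ = np.linalg.lstsq(Zx_h, Zy[idx], rcond=None)     # least-squares affine fit
A, b = M[:-1], M[-1]

# Step 3: map all source points into the target embedding and assign nearest target points.
Zx_mapped = Zx @ A + b
dists = np.linalg.norm(Zx_mapped[:, None, :] - Zy[None, :, :], axis=2)
assignment = dists.argmin(axis=1)
print("fraction correctly assigned:", np.mean(assignment == np.arange(len(X))))
```

In this sketch the separate embeddings live in arbitrary coordinate frames; the affine transformation estimated from the assigned points is exactly what relates the two frames, so the quality of the assignment depends on how well the chosen dimensionality reduction method preserves the shared structure.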
References
Bach, F.R., Jordan, M.I.: Spectral Clustering for Speech Separation. Wiley, Chichester (2009)
Mittal, A., Monnet, A., Paragios, N.: Scene Modeling and Change Detection in Dynamic Scenes: A Subspace Approach. In: CVUI, vol. 113 (2009)
Rao, S., Tron, R., Vidal, R., Ma, Y.: Motion segmentation via robust subspace separation in the presence of outlying, incomplete, or corrupted trajectories. In: CVPR, vol. 37, p. 18 (2008)
Murase, H.: Moving Object Recognition in Eigenspace Representation: Gait Analysis and Lip Reading. Pattern Recognition Letters 17, 155–162 (1996)
Jolliffe, I.T.: Principal Component Analysis. Springer, Heidelberg (2002)
Cox, T.F., Cox, M.A.: Multidimensional Scaling, vol. 30. Chapman & Hall, Sydney (1994)
Tenenbaum, J.B., Silva, V., Langford, J.C.: A Global Geometric Framework for Nonlinear Dimensionality Reduction. Science 290, 2319 (2000)
Roweis, S.T., Saul, L.K.: Nonlinear Dimensionality Reduction by Locally Linear Embedding. Science (2000)
Belkin, M., Niyogi, P.: Laplacian Eigenmaps for Dimensionality Reduction and Data Representation. Neural Computation 15, 1373–1396 (2003)
Nadler, B., Lafon, S., Coifman, R.R.: Diffusion Maps, Spectral Clustering and Reaction Coordinates of Dynamical Systems. Applied and Computational Harmonic Analysis 21, 113–127 (2006)
Weinberger, K.Q., Saul, L.K.: Unsupervised Learning of Image Manifolds by Semidefinite Programming. IJCV 70, 77–90 (2006)
Ham, J., Lee, D.D., Mika, S., Schölkopf, B.: A Kernel View of the Dimensionality Reduction of Manifolds. In: ICML, vol. 47 (2004)
Schölkopf, B., Smola, A., Müller, K.: Kernel Principal Component Analysis. MIT Press, Cambridge (1999)
De Silva, V., Tenenbaum, J.B.: Global versus Local Methods in Nonlinear Dimensionality Reduction. In: NIPS (2003)
Weinberger, K.Q., Packer, B.D., Saul, L.K.: Nonlinear Dimensionality Reduction by Semidefinite Programming and Kernel Matrix Factorization. In: International Workshop on Artificial Intelligence and Statistics, pp. 381–388 (2005)
Chang, H., Yeung, D.Y.: Robust Locally Linear Embedding. Pattern Recognition 39, 1053–1065 (2006)
Zhang, Z., Zha, H.: Principal Manifolds and Nonlinear Dimension Reduction via Local Tangent Space Alignment. SIAM Journal of Scientific Computing (2004)
Donoho, D.L., Grimes, C.: Hessian Eigenmaps: Locally Linear Embedding Techniques for High-Dimensional Data. National Academy of Sciences 100 (2003)
Saul, L.K., Roweis, S.T.: Think Globally, Fit Locally: Unsupervised Learning of Low Dimensional Manifolds. JMLR 4, 119–155 (2003)
Ham, J., Lee, D., Saul, L.: Learning High Dimensional Correspondences from Low Dimensional Manifolds. In: ICML (2003)
Tenenbaum, J., Freeman, W.: Separating Style and Content with Bilinear Models. Neural Computation 12 (2000)
De la Torre, F., Black, M.: Dynamic coupled component analysis. In: CVPR (2005)
Wang, C., Mahadevan, S.: Manifold Alignment Using Procrustes Analysis. In: ICML (2008)
Lee, M.: Algorithms for Representing Similarity Data (1999)
Seewald, A.K.: Digits–A dataset for Handwritten Digit Recognition. TR (2005)
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Roscher, R., Schindler, F., Förstner, W. (2011). High Dimensional Correspondences from Low Dimensional Manifolds – An Empirical Comparison of Graph-Based Dimensionality Reduction Algorithms. In: Koch, R., Huang, F. (eds) Computer Vision – ACCV 2010 Workshops. ACCV 2010. Lecture Notes in Computer Science, vol 6469. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-22819-3_34
DOI: https://doi.org/10.1007/978-3-642-22819-3_34
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-22818-6
Online ISBN: 978-3-642-22819-3