
An improved incremental nonlinear dimensionality reduction for isometric data embedding

Published: 01 April 2015

Abstract

Manifold learning has become a hot topic in the fields of machine learning and data mining. Several algorithms, such as ISOMAP and LLE, have been proposed to extract the intrinsic characteristics of different types of high-dimensional data by performing nonlinear dimensionality reduction. Most of these algorithms operate in batch mode and cannot be applied effectively when data are collected sequentially. In this paper, we propose a new incremental version of ISOMAP that reuses previous computation results as much as possible and efficiently updates the low-dimensional representation of the data points as new samples are accumulated. Experimental results on synthetic data as well as real-world images demonstrate that our approach constructs an accurate low-dimensional representation of the data in an efficient manner.

Highlights

  • An effective method to update the neighborhood graph and the geodesic distance matrix.
  • A simple method to detect short circuits in the incremental neighborhood graph.
  • A better solution to the incremental eigen-decomposition problem.
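The abstract names three components: updating the neighborhood graph and geodesic distance matrix, detecting short circuits, and solving the incremental eigen-decomposition. The paper's exact procedures are not reproduced on this page; the following is a minimal sketch of only the first component, under simplifying assumptions (a k-NN graph built on Euclidean distances, and an existing all-pairs geodesic distance matrix `D`). The function name `update_geodesics` is illustrative, not from the paper.

```python
import numpy as np

def update_geodesics(X, D, x_new, k=5):
    """Append x_new to the dataset and update the geodesic distance
    matrix D incrementally (illustrative sketch, not the paper's
    exact algorithm).

    X : (n, d) existing points
    D : (n, n) all-pairs geodesic distances on the current graph
    x_new : (d,) new sample
    k : neighborhood size
    """
    # Euclidean distances from the new point to all existing points.
    e = np.linalg.norm(X - x_new, axis=1)
    # The new point's graph edges go to its k nearest existing points.
    nbrs = np.argsort(e)[:k]
    # Geodesic from the new point to every existing point j:
    #   min over neighbors i of  |x_new - x_i| + D[i, j].
    # D already holds all-pairs shortest paths on the old graph,
    # so this single relaxation is exact for paths leaving x_new.
    g = np.min(e[nbrs][:, None] + D[nbrs, :], axis=0)
    # Paths between existing points may now shorten via the new point.
    D = np.minimum(D, g[:, None] + g[None, :])
    # Grow D to (n+1, n+1) with the new row and column.
    D = np.block([[D, g[:, None]], [g[None, :], np.zeros((1, 1))]])
    return np.vstack([X, x_new]), D
```

Because `D` already stores all-pairs shortest paths on the old graph, one relaxation step through the new point's edges suffices; a full shortest-path recomputation is avoided, which is the essential source of any incremental speed-up over batch ISOMAP.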


Cited By

  • Nonlinear multi-output regression on unknown input manifold. Annals of Mathematics and Artificial Intelligence 81(1-2), 209-240 (2018). https://doi.org/10.1007/s10472-017-9551-0


    Published In

Information Processing Letters, Volume 115, Issue 4
April 2015, 67 pages

    Publisher

    Elsevier North-Holland, Inc.

    United States


    Author Tags

    1. Design of algorithms
    2. ISOMAP
    3. Incremental learning
    4. Manifold learning
    5. Nonlinear dimensionality reduction

    Qualifiers

    • Research-article

