Abstract
Dimensionality reduction plays a key role in pattern recognition: it preserves the essential, inherent feature information of high-dimensional raw data while suppressing noise and redundancy, which improves the performance of subsequent tasks such as classification and clustering. Locality preserving projection (LPP), a typical dimensionality reduction method, explores the local sub-manifold of the raw data with the aid of the K-nearest neighbor (KNN) rule. However, LPP has several serious limitations: (1) the neighbor parameter must be set by hand, and its size can strongly affect LPP's performance in practice; (2) as a single-view method, LPP cannot handle multi-view data; (3) LPP ignores both the discriminative information and the global linear relationship of the raw data. To address these limitations, we propose a novel multi-view dimensionality reduction method called coupled locality discriminant analysis with globality preserving (CLDA-GP). CLDA-GP learns a couple of optimal mappings that project the raw spaces of the different views into a low-dimensional, uniform elastic subspace while preserving both the local sub-manifold structure and the global linear relationship. In addition, CLDA-GP adopts a strategy called local similarity self-learning (LSSL) to excavate the local manifold information of the multi-view data; by learning the local similarities directly, LSSL frees CLDA-GP from the neighbor parameter. CLDA-GP further incorporates the supervision information of the raw data, which gives it discriminant power. Experimental results on artificial and benchmark (COIL-20, GT, and Umist) datasets show that CLDA-GP outperforms the comparison methods, illustrating its effectiveness and feasibility.
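For readers unfamiliar with the baseline, the snippet below is a minimal Python sketch of standard LPP as summarized above — building a KNN graph with heat-kernel weights and solving the resulting generalized eigenproblem — not an implementation of CLDA-GP or LSSL; the function name lpp, the neighbor count k, the kernel width t, and the small ridge term are illustrative assumptions.

```python
# Minimal sketch of the standard LPP baseline (not CLDA-GP/LSSL), assuming
# X has shape (n_samples, n_features); k and t are the hand-set parameters
# whose sensitivity motivates the LSSL strategy.
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def lpp(X, n_components=2, k=5, t=1.0):
    n = X.shape[0]
    d2 = cdist(X, X, metric="sqeuclidean")          # pairwise squared distances
    # Symmetrized KNN adjacency: link i and j if either is among the other's k nearest
    idx = np.argsort(d2, axis=1)[:, 1:k + 1]        # skip self (column 0)
    A = np.zeros((n, n), dtype=bool)
    A[np.repeat(np.arange(n), k), idx.ravel()] = True
    A = A | A.T
    W = np.where(A, np.exp(-d2 / t), 0.0)           # heat-kernel weights on graph edges
    D = np.diag(W.sum(axis=1))
    L = D - W                                       # graph Laplacian
    Xc = X.T                                        # features x samples, as in LPP's formulation
    # Generalized eigenproblem X L X^T a = lambda X D X^T a; keep the smallest eigenvalues
    M1 = Xc @ L @ Xc.T
    M2 = Xc @ D @ Xc.T + 1e-6 * np.eye(X.shape[1])  # small ridge for numerical stability
    _, evecs = eigh(M1, M2)                         # eigenvalues returned in ascending order
    return evecs[:, :n_components]                  # projection matrix (features x components)

# Usage: Y = X @ lpp(X, n_components=2, k=5) embeds the samples in the 2-D subspace.
```

The projection depends directly on the choice of k, which is exactly the neighbor-parameter sensitivity that CLDA-GP's self-learned similarities are designed to avoid.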
Acknowledgments
This work is supported by the National Natural Science Foundation of China (Grant No. 61806006), the China Postdoctoral Science Foundation (Grant No. 2019M660149), the Institute of Energy, Hefei Comprehensive National Science Center (Grant No. 19KZS203), and the Science and Technology Research Project of Wuhu City (Grant No. 2020yf48).
Cite this article
Su, S., Zhu, G., Zhu, Y. et al. Coupled locality discriminant analysis with globality preserving for dimensionality reduction. Appl Intell 53, 7118–7131 (2023). https://doi.org/10.1007/s10489-022-03409-3