
Hypergraph expressing low-rank feature selection algorithm

Published: 01 November 2018

Abstract

Dimensionality reduction has attracted extensive attention in machine learning and typically takes two forms: feature selection and subspace learning. Previous work has demonstrated that dimensionality reduction is valuable in real applications, but much of it applies feature selection and subspace learning independently. This paper proposes a novel supervised feature selection algorithm that incorporates subspace learning. Specifically, it employs an ℓ2,1-norm regularizer and an ℓ2,p-norm regularizer to conduct sample denoising and feature selection, respectively, by exploring the correlation structure of the data. It then imposes two constraints, a hypergraph constraint and a low-rank constraint, to capture the local structure and the global structure of the data, respectively. Finally, it uses an alternating optimization framework that iteratively optimizes each variable while fixing the others until the algorithm converges. Extensive experiments show that the proposed supervised feature selection method achieves strong results on eighteen public data sets.
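
As a rough illustration of the kind of objective sketched in the abstract, the following is a minimal sketch (not the paper's algorithm) of an iteratively reweighted least-squares solver for a loss of the form ||XW - Y||_{2,1} + lam * ||W||_{2,p}^p + gamma * tr(W^T X^T L X W), where L stands in for a hypergraph Laplacian. All names here (l21_lp_feature_selection, lam, gamma, p) are illustrative assumptions, and the low-rank constraint is omitted for brevity.

import numpy as np

def l21_lp_feature_selection(X, Y, L, lam=1.0, gamma=1.0, p=1.0,
                             n_iter=50, eps=1e-6):
    """Illustrative IRLS-style sketch, NOT the paper's exact method.

    Approximately minimizes
        ||X W - Y||_{2,1} + lam * ||W||_{2,p}^p + gamma * tr(W^T X^T L X W)
    where X is n-by-d, Y is an n-by-c label matrix, and L is an n-by-n
    (hyper)graph Laplacian standing in for the hypergraph constraint.
    """
    n, d = X.shape
    W = np.zeros((d, Y.shape[1]))
    for _ in range(n_iter):
        # Reweighting for the l2,1 loss: each sample is weighted by the
        # inverse norm of its residual row (sample denoising effect).
        R = X @ W - Y
        d1 = 1.0 / (2.0 * np.maximum(np.linalg.norm(R, axis=1), eps))
        # Reweighting for the l2,p regularizer on the rows of W
        # (row sparsity is what drives feature selection).
        w_norms = np.maximum(np.linalg.norm(W, axis=1), eps)
        d2 = (p / 2.0) * w_norms ** (p - 2.0)
        # Closed-form update of W with the reweighting matrices fixed.
        XD1 = X.T * d1            # equals X^T D1 with D1 = diag(d1)
        A = XD1 @ X + lam * np.diag(d2) + gamma * (X.T @ L @ X)
        W_new = np.linalg.solve(A, XD1 @ Y)
        if np.linalg.norm(W_new - W) < eps * max(1.0, np.linalg.norm(W)):
            W = W_new
            break
        W = W_new
    # Rank features by the row norms of W; larger norm = more important.
    scores = np.linalg.norm(W, axis=1)
    return W, np.argsort(-scores)

# Purely illustrative usage with random data and a crude Laplacian stand-in:
# X = np.random.randn(100, 50)
# Y = np.eye(5)[np.random.randint(0, 5, 100)]
# L = np.eye(100) - np.ones((100, 100)) / 100
# W, ranking = l21_lp_feature_selection(X, Y, L)

In such a scheme the top-ranked rows of W would indicate the selected features; the paper's full method additionally enforces the low-rank and hypergraph structures described above.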


Published In

Multimedia Tools and Applications, Volume 77, Issue 22 (November 2018), 928 pages

Publisher

Kluwer Academic Publishers, United States

Author Tags

1. Feature selection
2. Hypergraph
3. Low-rank
