DOI: 10.1145/2783258.2783345

Unsupervised Feature Selection with Adaptive Structure Learning

Published: 10 August 2015

Abstract

The problem of feature selection has attracted considerable interest over the past decade. Traditional unsupervised methods select the features that faithfully preserve the intrinsic structures of the data, where those structures are estimated from all the input features. However, the estimated structures are unreliable and inaccurate when redundant and noisy features have not been removed. This creates a dilemma: one needs the true structures of the data to identify the informative features, and one needs the informative features to accurately estimate the true structures. To address this, we propose a unified learning framework that performs structure learning and feature selection simultaneously. The structures are adaptively learned from the results of feature selection, and the informative features are reselected to preserve the refined structures of the data. By leveraging the interaction between these two essential tasks, we are able to capture accurate structures and select more informative features. Experimental results on many benchmark data sets demonstrate that the proposed method outperforms many state-of-the-art unsupervised feature selection methods.
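The dilemma the abstract describes (good structure estimation needs good features; good feature scores need good structure) suggests an alternating loop. The sketch below is only a toy illustration of that alternation, not the paper's actual algorithm: it swaps in a plain kNN affinity graph for structure estimation and a Laplacian-score-style criterion for feature ranking, and all function names and parameters are hypothetical.

```python
# Toy sketch of alternating structure learning and feature selection.
# NOT the paper's method; a kNN graph + Laplacian-score stand-in for illustration.
import numpy as np

def knn_graph(X, k=5):
    """Symmetric 0/1 k-nearest-neighbor affinity matrix over the rows of X."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)            # a point is not its own neighbor
    idx = np.argsort(d2, axis=1)[:, :k]
    W = np.zeros_like(d2)
    rows = np.repeat(np.arange(X.shape[0]), k)
    W[rows, idx.ravel()] = 1.0
    return np.maximum(W, W.T)               # symmetrize

def laplacian_scores(X, W):
    """Lower score = feature varies smoothly over the graph (more informative)."""
    D = np.diag(W.sum(1))
    L = D - W
    scores = []
    for j in range(X.shape[1]):
        f = X[:, j] - X[:, j].mean()
        denom = f @ D @ f
        scores.append((f @ L @ f) / denom if denom > 1e-12 else np.inf)
    return np.array(scores)

def adaptive_select(X, n_select, n_iter=5, k=5):
    """Alternate: estimate structure from current features, then re-rank features."""
    selected = np.arange(X.shape[1])          # start from all features
    for _ in range(n_iter):
        W = knn_graph(X[:, selected], k=k)    # structure from selected features
        s = laplacian_scores(X, W)            # re-score ALL features on that graph
        selected = np.argsort(s)[:n_select]   # keep the most informative ones
    return selected
```

On data with two well-separated clusters in the first two dimensions plus uniform noise dimensions, this loop settles on the two cluster-informative features: once the graph is rebuilt from the current selection, the structure sharpens and the next ranking improves, which is the interaction the paper exploits.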




Information

    Published In

    cover image ACM Conferences
KDD '15: Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
    August 2015
    2378 pages
    ISBN:9781450336642
    DOI:10.1145/2783258

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. adaptive structure learning
    2. unsupervised feature selection

    Qualifiers

    • Research-article

    Funding Sources

• China 973 Program
    • NSFC

    Conference

    KDD '15

    Acceptance Rates

KDD '15 paper acceptance rate: 160 of 819 submissions (20%).
Overall acceptance rate: 1,133 of 8,635 submissions (13%).


    Article Metrics

• Downloads (last 12 months): 75
• Downloads (last 6 weeks): 22
Reflects downloads up to 13 Dec 2024.

Cited By
• (2024) MMVFL: A Simple Vertical Federated Learning Framework for Multi-Class Multi-Participant Scenarios. Sensors, 24(2):619. DOI: 10.3390/s24020619. Online publication date: 18-Jan-2024.
• (2024) Pseudo-Label Guided Structural Discriminative Subspace Learning for Unsupervised Feature Selection. IEEE Transactions on Neural Networks and Learning Systems, 35(12):18605-18619. DOI: 10.1109/TNNLS.2023.3319372. Online publication date: Dec-2024.
• (2024) Double-Structured Sparsity Guided Flexible Embedding Learning for Unsupervised Feature Selection. IEEE Transactions on Neural Networks and Learning Systems, 35(10):13354-13367. DOI: 10.1109/TNNLS.2023.3267184. Online publication date: Oct-2024.
• (2024) CGDD: Multiview Graph Clustering via Cross-Graph Diversity Detection. IEEE Transactions on Neural Networks and Learning Systems, 35(3):4206-4219. DOI: 10.1109/TNNLS.2022.3201964. Online publication date: Mar-2024.
• (2024) Unsupervised Feature Selection With Flexible Optimal Graph. IEEE Transactions on Neural Networks and Learning Systems, 35(2):2014-2027. DOI: 10.1109/TNNLS.2022.3186171. Online publication date: Feb-2024.
• (2024) Local Sample-Weighted Multiple Kernel Clustering With Consensus Discriminative Graph. IEEE Transactions on Neural Networks and Learning Systems, 35(2):1721-1734. DOI: 10.1109/TNNLS.2022.3184970. Online publication date: Feb-2024.
• (2024) Exploring Feature Selection With Limited Labels: A Comprehensive Survey of Semi-Supervised and Unsupervised Approaches. IEEE Transactions on Knowledge and Data Engineering, 36(11):6124-6144. DOI: 10.1109/TKDE.2024.3397878. Online publication date: Nov-2024.
• (2024) Outliers Robust Unsupervised Feature Selection for Structured Sparse Subspace. IEEE Transactions on Knowledge and Data Engineering, 36(3):1234-1248. DOI: 10.1109/TKDE.2023.3297226. Online publication date: Mar-2024.
• (2024) Unsupervised Feature Selection via Multi-Structure Learning and Indicator Matrix. 2024 International Conference on New Trends in Computational Intelligence (NTCI), pages 189-194. DOI: 10.1109/NTCI64025.2024.10776404. Online publication date: 18-Oct-2024.
• (2024) Sparse Variable Selection on High Dimensional Heterogeneous Data With Tree Structured Responses. IEEE Access, 12:50779-50791. DOI: 10.1109/ACCESS.2024.3384309. Online publication date: 2024.
