
Entropy Criterion for Classifier-Independent Feature Selection

  • Conference paper
Knowledge-Based Intelligent Information and Engineering Systems (KES 2005)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3684)

Abstract

Feature selection aims to find a subset of the original features that retains the discriminative information. In practice, the classifier to be used is not known beforehand, so it is preferable to find a feature subset that is universally effective for any classifier. Such an approach is called classifier-independent feature selection and can be realized by removing garbage features, i.e., features that carry no discriminative information. However, it is difficult to distinguish only the garbage features from the others. In this study, we propose an entropy criterion for this goal and confirm its effectiveness on a synthetic dataset.
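The paper's exact entropy criterion is not reproduced on this page. As a rough illustration of the underlying idea only, the sketch below scores each feature by the mutual information I(X_j; Y) = H(Y) − H(Y | X_j) between a discretized feature and the class label: a garbage feature, being independent of the class, scores near zero regardless of which classifier is later applied. The function names, the equal-width binning scheme, and the synthetic data are illustrative assumptions, not the authors' method.

```python
import math
import random
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) in bits of a label sequence."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def conditional_entropy(feature_bins, labels):
    """H(Y | X) for a discretized feature X."""
    n = len(labels)
    groups = {}
    for b, y in zip(feature_bins, labels):
        groups.setdefault(b, []).append(y)
    return sum(len(g) / n * entropy(g) for g in groups.values())

def discretize(values, n_bins=8):
    """Equal-width binning (an illustrative choice, not the paper's)."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0
    return [min(int((v - lo) / width), n_bins - 1) for v in values]

def relevance_scores(X, y, n_bins=8):
    """Per-feature mutual information I(X_j; Y) = H(Y) - H(Y | X_j)."""
    h_y = entropy(y)
    scores = []
    for j in range(len(X[0])):
        col = discretize([row[j] for row in X], n_bins)
        scores.append(h_y - conditional_entropy(col, y))
    return scores

# Synthetic data in the spirit of the paper's experiment:
# feature 0 separates the two classes, feature 1 is garbage noise.
random.seed(0)
X, y = [], []
for label in (0, 1):
    for _ in range(200):
        X.append([random.gauss(3.0 * label, 0.5), random.gauss(0.0, 1.0)])
        y.append(label)

scores = relevance_scores(X, y)
print(scores)  # informative feature scores high; garbage feature near zero
```

A feature whose score stays near zero under such a criterion can be discarded without hurting any downstream classifier, which is precisely the goal of classifier-independent selection.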





Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Abe, N., Kudo, M. (2005). Entropy Criterion for Classifier-Independent Feature Selection. In: Khosla, R., Howlett, R.J., Jain, L.C. (eds) Knowledge-Based Intelligent Information and Engineering Systems. KES 2005. Lecture Notes in Computer Science(), vol 3684. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11554028_96


  • DOI: https://doi.org/10.1007/11554028_96

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28897-8

  • Online ISBN: 978-3-540-31997-9

  • eBook Packages: Computer Science (R0)
