Abstract
This paper introduces a new dynamic feature selection method for classification algorithms. The method is based on individual similarity and uses a clustering algorithm to select the best features for each instance individually. In addition, an empirical analysis is performed to evaluate the performance of the proposed method and to compare it with existing feature selection methods applied to classification problems. The results reported in this paper indicate that, in most cases, the proposed method outperformed the existing methods it was compared with.
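To make the general idea concrete, the sketch below shows one plausible form of clustering-based, per-instance feature selection: the training data are partitioned with k-means, a feature subset is ranked inside each cluster (here by mutual information with the class), and each new instance is classified by the model of its most similar cluster. This is only a minimal illustration under assumed choices (k-means, mutual-information scoring, a k-NN base classifier, and the n_clusters/n_features parameters), not the authors' exact algorithm, and it assumes every cluster receives enough training instances.

```python
# Minimal sketch (assumed design, not the paper's exact algorithm): k-means
# partitions the training data, a feature subset is scored per cluster, and
# each test instance is classified by the model of its most similar cluster.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_selection import mutual_info_classif
from sklearn.neighbors import KNeighborsClassifier


def fit_dynamic_selector(X, y, n_clusters=3, n_features=5):
    """Cluster the training set and build a per-cluster feature subset and classifier."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
    models = {}
    for c in range(n_clusters):
        mask = km.labels_ == c
        # Rank features by mutual information with the class inside this cluster.
        scores = mutual_info_classif(X[mask], y[mask], random_state=0)
        feats = np.argsort(scores)[::-1][:n_features]
        clf = KNeighborsClassifier(n_neighbors=3).fit(X[mask][:, feats], y[mask])
        models[c] = (feats, clf)
    return km, models


def predict_dynamic(km, models, X_test):
    """Route each test instance to its nearest cluster and classify it there."""
    clusters = km.predict(X_test)
    preds = []
    for x, c in zip(X_test, clusters):
        feats, clf = models[c]
        preds.append(clf.predict(x[feats].reshape(1, -1))[0])
    return np.array(preds)
```

Typical usage would be `km, models = fit_dynamic_selector(X_train, y_train)` followed by `y_pred = predict_dynamic(km, models, X_test)`; the per-cluster classifiers are fitted once, so only cluster assignment and a single prediction happen per test instance.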
Copyright information
© 2017 Springer International Publishing AG
About this paper
Cite this paper
Dantas, C.A., Nunes, R.O., Canuto, A.M.P., Xavier-Júnior, J.C. (2017). Dynamic Feature Selection Based on Clustering Algorithm and Individual Similarity. In: Lintas, A., Rovetta, S., Verschure, P., Villa, A. (eds.) Artificial Neural Networks and Machine Learning – ICANN 2017. Lecture Notes in Computer Science, vol. 10614. Springer, Cham. https://doi.org/10.1007/978-3-319-68612-7_53
DOI: https://doi.org/10.1007/978-3-319-68612-7_53
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-68611-0
Online ISBN: 978-3-319-68612-7