Abstract
Feature selection and feature weighting are frequently used in machine learning to process high-dimensional data. These techniques reduce the number of features in a dataset and simplify the classification process. Meta-heuristic algorithms are widely adopted for feature selection and feature weighting because of their strong search ability. This paper compares five recently introduced meta-heuristic optimization algorithms for feature selection and feature weighting in artificial neural networks: the chimp optimization algorithm, the tunicate swarm algorithm, the bear smell search algorithm, the antlion optimization algorithm and a modified antlion optimization algorithm. Experimental evaluations are performed on five datasets to illustrate the improvements each algorithm brings to the classification process. The tunicate swarm algorithm and the chimp optimization algorithm achieve better classification accuracy than the other algorithms; nevertheless, all of the compared algorithms prove effective for feature selection and feature weighting.
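To make the wrapper-style setting behind the comparison concrete, the sketch below shows how a candidate feature-weight vector produced by any meta-heuristic can be scored with a neural-network classifier: weights near zero act as feature deselection, the remaining weights rescale the kept features, and classification accuracy serves as the fitness. This is a minimal illustrative sketch, not the authors' implementation; it assumes scikit-learn's MLPClassifier as the neural network, a built-in dataset as a stand-in for the UCI data, and a simple random-perturbation update standing in for the compared meta-heuristics.

```python
# Illustrative wrapper for meta-heuristic feature selection and weighting.
# NOT the paper's code: the random-perturbation update below is a placeholder
# for any of the compared meta-heuristics (ChOA, TSA, BSSA, ALO, modified ALO).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = load_breast_cancer(return_X_y=True)      # stand-in for a UCI dataset
rng = np.random.default_rng(0)

def fitness(weights):
    """Cross-validated ANN accuracy on feature-weighted data.
    Weights below a small threshold deselect a feature; others rescale it."""
    mask = weights > 0.1
    if not mask.any():
        return 0.0
    Xw = X[:, mask] * weights[mask]              # selection + weighting in one step
    clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=300, random_state=0)
    return cross_val_score(clf, Xw, y, cv=3).mean()

# Generic population-based search loop; a real meta-heuristic replaces the update.
pop = rng.random((5, X.shape[1]))
best, best_fit = pop[0].copy(), fitness(pop[0])
for _ in range(5):                               # a few iterations for illustration
    for i in range(len(pop)):
        cand = np.clip(pop[i] + rng.normal(0.0, 0.1, X.shape[1]), 0.0, 1.0)
        f = fitness(cand)
        if f > best_fit:
            best, best_fit = cand.copy(), f
        pop[i] = cand
print(f"best accuracy {best_fit:.3f} using {int((best > 0.1).sum())} features")
```

In the paper's setting the candidate-update step would follow the rules of each compared algorithm rather than random perturbation; the fitness evaluation and the mask/weight interpretation stay the same.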