A comparative analysis of meta-heuristic optimization algorithms for feature selection and feature weighting in neural networks

  • Research Paper
  • Published in Evolutionary Intelligence

Abstract

Feature selection and feature weighting are frequently used in machine learning to process high-dimensional data. They reduce the number of features in a dataset and simplify the classification process. Meta-heuristic algorithms are widely adopted for feature selection and feature weighting because of their strong search ability. This paper compares five recently introduced meta-heuristic optimization algorithms for feature selection and feature weighting in artificial neural networks: the chimp optimization algorithm, the tunicate swarm algorithm, the bear smell search algorithm, the antlion optimization algorithm and a modified antlion optimization algorithm. Experimental evaluations on five different datasets illustrate the improvements in classification performance obtained with each of the compared algorithms. The tunicate swarm algorithm and the chimp optimization algorithm achieve higher classification accuracy than the other algorithms, although all five prove effective for feature selection and feature weighting.
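To make the wrapper-style setup described in the abstract concrete, the sketch below pairs a generic population-based search over a binary feature mask (selection) and a per-feature weight vector (weighting) with a small neural network whose cross-validated accuracy serves as the fitness function. This is only an illustrative stand-in, not the paper's implementation: the dataset, network size, population settings and the simple perturb-the-best update rule are assumptions, and that update rule merely takes the place of the position-update equations of the compared algorithms (chimp, tunicate swarm, bear smell search, antlion and modified antlion).

```python
# Illustrative sketch only: a generic meta-heuristic wrapper for joint feature
# selection (binary mask) and feature weighting (continuous weights), scored by
# a small neural network's cross-validated accuracy. The perturbation rule is a
# placeholder for the update equations of ChOA, TSA, BSSA, ALO or MALO.
import numpy as np
from sklearn.datasets import load_breast_cancer   # assumed stand-in dataset
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]

def fitness(mask, weights):
    """Cross-validated ANN accuracy on the selected, weighted features."""
    if not mask.any():                              # guard: empty selection
        return 0.0
    Xw = X[:, mask] * weights[mask]                 # select, then weight columns
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=300, random_state=0)
    return cross_val_score(clf, Xw, y, cv=3).mean()

# Population of candidate solutions: (mask, weights) pairs.
pop_size, iterations = 5, 5
population = [(rng.random(n_features) > 0.5, rng.random(n_features))
              for _ in range(pop_size)]
scores = [fitness(m, w) for m, w in population]

for _ in range(iterations):
    best = int(np.argmax(scores))
    for i in range(pop_size):
        if i == best:
            continue                                # keep the elite unchanged
        mask, weights = population[best]
        new_mask = mask.copy()
        flip = rng.random(n_features) < 0.1         # flip a few selection bits
        new_mask[flip] = ~new_mask[flip]
        new_weights = np.clip(weights + rng.normal(0.0, 0.1, n_features), 0.0, 1.0)
        score = fitness(new_mask, new_weights)
        if score > scores[i]:                       # greedy replacement
            population[i], scores[i] = (new_mask, new_weights), score

best = int(np.argmax(scores))
print(f"best CV accuracy: {scores[best]:.4f}, "
      f"features kept: {int(population[best][0].sum())}/{n_features}")
```

In the setting studied in the paper, each compared algorithm would supply its own population-update rule in place of the random perturbation above, and the best mask and weights found would then be used to train and evaluate the final neural-network classifier.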





Author information

Corresponding author

Correspondence to P. M. Diaz.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Diaz, P.M., Jiju, M.J.E. A comparative analysis of meta-heuristic optimization algorithms for feature selection and feature weighting in neural networks. Evol. Intel. 15, 2631–2650 (2022). https://doi.org/10.1007/s12065-021-00634-6
