
A novel multi-swarm particle swarm optimization for feature selection

  • Published in: Genetic Programming and Evolvable Machines

Abstract

This paper proposes a novel feature selection method based on multi-swarm particle swarm optimization (MSPSO). Canonical particle swarm optimization (PSO) has been widely used for feature selection, but it suffers from stagnation in local optima and premature convergence on complex feature selection problems. This paper employs a multi-swarm topology in which the population is split into several small sub-swarms. Particles in each sub-swarm update their positions under the guidance of the local best particle of their own sub-swarm. To promote information exchange among the sub-swarms, an elite learning strategy is introduced in which the elite particles of each sub-swarm learn from the useful information found by other sub-swarms. Moreover, a local search operator is proposed to improve the exploitation ability of each sub-swarm. MSPSO thereby improves population diversity and explores the feature space more thoroughly. The performance of the proposed method is compared with six PSO-based wrappers, three traditional wrappers, and three popular filters on eleven datasets. Experimental results verify that MSPSO finds feature subsets with higher classification accuracy and fewer features. An analysis of the search behavior of MSPSO demonstrates its effectiveness in maintaining population diversity and finding better feature subsets, and statistical tests confirm that MSPSO's superiority over the other methods is significant.
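To make the multi-swarm structure concrete, the sketch below outlines a binary multi-swarm PSO wrapper in Python. The sub-swarm topology, sigmoid-based binary position update, and periodic elite learning follow the description in the abstract; all hyper-parameters, the KNN cross-validation wrapper, the subset-size penalty, and the specific elite-learning rule are illustrative assumptions rather than the paper's exact settings, and the paper's local search operator is omitted for brevity.

```python
# Minimal illustrative sketch of multi-swarm binary PSO for wrapper feature selection.
# Hyper-parameters, the KNN wrapper, and the elite-learning rule are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=30, n_informative=8, random_state=0)
n_feat = X.shape[1]

def fitness(mask):
    """Wrapper fitness: KNN cross-validated accuracy, lightly penalising subset size (assumed weight)."""
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(5), X[:, mask.astype(bool)], y, cv=3).mean()
    return acc - 0.01 * mask.sum() / n_feat

n_subswarms, swarm_size, iters = 4, 10, 30     # assumed population layout
w, c1, c2 = 0.7, 1.5, 1.5                      # assumed PSO coefficients
pos = rng.integers(0, 2, (n_subswarms, swarm_size, n_feat)).astype(float)
vel = rng.uniform(-1.0, 1.0, pos.shape)
pbest = pos.copy()
pbest_fit = np.array([[fitness(p) for p in sub] for sub in pos])

for t in range(iters):
    for s in range(n_subswarms):
        # Multi-swarm topology: the local best of this sub-swarm guides its particles.
        lbest = pbest[s, pbest_fit[s].argmax()]
        for i in range(swarm_size):
            r1, r2 = rng.random(n_feat), rng.random(n_feat)
            vel[s, i] = np.clip(w * vel[s, i]
                                + c1 * r1 * (pbest[s, i] - pos[s, i])
                                + c2 * r2 * (lbest - pos[s, i]), -4.0, 4.0)
            # Binary PSO: the sigmoid of the velocity gives the probability of selecting each feature.
            pos[s, i] = (rng.random(n_feat) < 1.0 / (1.0 + np.exp(-vel[s, i]))).astype(float)
            f = fitness(pos[s, i])
            if f > pbest_fit[s, i]:
                pbest_fit[s, i], pbest[s, i] = f, pos[s, i].copy()
    # Elite learning (assumed schedule and rule): every 5 iterations the best particle of
    # each sub-swarm copies a random half of the feature bits of the best particle found so far.
    if t % 5 == 4:
        g_s, g_i = np.unravel_index(pbest_fit.argmax(), pbest_fit.shape)
        gbest = pbest[g_s, g_i].copy()
        for s in range(n_subswarms):
            e = pbest_fit[s].argmax()
            swap = rng.random(n_feat) < 0.5
            pbest[s, e][swap] = gbest[swap]
            pbest_fit[s, e] = fitness(pbest[s, e])

best_s, best_i = np.unravel_index(pbest_fit.argmax(), pbest_fit.shape)
print("best fitness:", pbest_fit[best_s, best_i])
print("selected features:", np.flatnonzero(pbest[best_s, best_i]))
```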



Acknowledgements

This work was supported by the Natural Science Foundation of Jiangsu Province under Grant No. BK20160898 and by the NUPTSF under Grant No. NY214186.

Author information

Corresponding author: Chenye Qiu.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Area Editor: U.-M. O'Reilly.

Appendix

The evolving curves for the remaining datasets, i.e., all datasets other than wine, sonar, and LSVT, are shown in Fig. 4.

Fig. 4: Curves of fitness value and number of selected features with the number of iterations: a, b heart; c, d australia; e, f WDBC; g, h ionosphere; i, j musk1; k, l arrhythmia; m, n colon; o, p leukemia

About this article


Cite this article

Qiu, C. A novel multi-swarm particle swarm optimization for feature selection. Genet Program Evolvable Mach 20, 503–529 (2019). https://doi.org/10.1007/s10710-019-09358-0

