Abstract
This paper proposes a novel feature selection method based on multi-swarm particle swarm optimization (MSPSO). Canonical particle swarm optimization (PSO) has been widely applied to feature selection, but it suffers from stagnation in local optima and premature convergence on complex feature selection problems. The proposed method employs a multi-swarm topology in which the population is split into several small sub-swarms, and each particle updates its position under the guidance of the local best particle of its own sub-swarm. To promote information exchange among the sub-swarms, an elite learning strategy is introduced in which the elite particles of each sub-swarm learn from useful information found by the other sub-swarms. Moreover, a local search operator is proposed to improve the exploitation ability of each sub-swarm. MSPSO thereby improves population diversity and explores the entire feature space more thoroughly. The performance of the proposed method is compared with six PSO-based wrappers, three traditional wrappers, and three popular filters on eleven datasets. Experimental results verify that MSPSO finds feature subsets with higher classification accuracy and fewer features. An analysis of the search behavior of MSPSO demonstrates its effectiveness in maintaining population diversity and finding better feature subsets, and statistical tests confirm that its superiority over the other methods is significant.
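The multi-swarm scheme described above can be sketched in a few lines. This is an illustrative sketch only, not the paper's implementation: the parameter values (W, C1, C2, sub-swarm sizes), the continuous sphere objective, and the simplified elite-learning step are assumptions for demonstration. The paper uses a binary variant tailored to feature subsets and adds a local search operator, both omitted here; only the core idea survives, namely that each particle follows the local best of its own sub-swarm rather than a single global best.

```python
import random

# Assumed demonstration parameters (not from the paper).
N_SUBSWARMS, SWARM_SIZE, DIM, ITERS = 4, 5, 10, 50
W, C1, C2 = 0.7, 1.5, 1.5

def sphere(x):
    # Placeholder minimization objective standing in for the
    # classification-based fitness used in wrapper feature selection.
    return sum(v * v for v in x)

random.seed(0)
swarms = []
for _ in range(N_SUBSWARMS):
    particles = []
    for _ in range(SWARM_SIZE):
        pos = [random.uniform(-5, 5) for _ in range(DIM)]
        particles.append({"pos": pos, "vel": [0.0] * DIM,
                          "best": pos[:], "best_f": sphere(pos)})
    swarms.append(particles)

for _ in range(ITERS):
    for swarm in swarms:
        # Each sub-swarm is guided by its own local best, not a global best.
        lbest = min(swarm, key=lambda p: p["best_f"])
        for p in swarm:
            for d in range(DIM):
                r1, r2 = random.random(), random.random()
                p["vel"][d] = (W * p["vel"][d]
                               + C1 * r1 * (p["best"][d] - p["pos"][d])
                               + C2 * r2 * (lbest["best"][d] - p["pos"][d]))
                p["pos"][d] += p["vel"][d]
            f = sphere(p["pos"])
            if f < p["best_f"]:
                p["best"], p["best_f"] = p["pos"][:], f
    # Elite learning (simplified): the elite of each sub-swarm moves toward
    # the best position discovered by any sub-swarm, exchanging information.
    elites = [min(s, key=lambda p: p["best_f"]) for s in swarms]
    overall = min(elites, key=lambda p: p["best_f"])
    for elite in elites:
        if elite is not overall:
            for d in range(DIM):
                elite["pos"][d] += 0.5 * (overall["best"][d] - elite["pos"][d])

print(round(overall["best_f"], 4))
```

Because personal-best fitness is only overwritten on improvement, the overall best is monotonically non-increasing, so the sketch converges on this toy objective even though the sub-swarms search largely independently.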
Acknowledgements
This work was supported by the Natural Science Foundation of Jiangsu Province under Grant No. BK20160898, and by NUPTSF under Grant No. NY214186.
Area Editor: U.-M. O'Reilly.
Appendix
The evolution curves for the remaining datasets (all except wine, sonar, and LSVT) are shown in Fig. 4.
Qiu, C. A novel multi-swarm particle swarm optimization for feature selection. Genet Program Evolvable Mach 20, 503–529 (2019). https://doi.org/10.1007/s10710-019-09358-0