Hybrid Whale Optimization Algorithm with simulated annealing for feature selection

Published: 18 October 2017

Highlights

Four hybrid feature selection methods for classification tasks are proposed.
The hybrid methods combine the Whale Optimization Algorithm with simulated annealing.
Eighteen UCI datasets were used in the experiments.
Our approaches achieve higher classification accuracy while using fewer features.

Abstract

Hybrid metaheuristics are among the most interesting recent trends in optimization and memetic algorithms. In this paper, two hybridization models are used to design feature selection techniques based on the Whale Optimization Algorithm (WOA). In the first model, the Simulated Annealing (SA) algorithm is embedded within WOA; in the second, SA is applied to improve the best solution found after each WOA iteration. In both cases, the goal of using SA is to enhance exploitation by searching the most promising regions located by WOA. The performance of the proposed approaches is evaluated on 18 standard benchmark datasets from the UCI repository and compared with three well-known wrapper feature selection methods from the literature. The experimental results confirm that the proposed approaches improve classification accuracy relative to the other wrapper-based algorithms, which demonstrates the ability of WOA to search the feature space and select the most informative attributes for classification tasks.
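
To make the second hybridization model concrete, the sketch below shows a simplified wrapper built around a binary WOA with an SA refinement step applied to the best solution after every iteration. It is a minimal illustration, not the paper's implementation: the synthetic dataset, the KNN wrapper classifier, the sigmoid transfer function, the fitness weights, and all parameter values are assumptions introduced here for demonstration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
# Synthetic stand-in for one of the UCI benchmark datasets (assumption).
X, y = make_classification(n_samples=300, n_features=20, n_informative=6, random_state=0)


def fitness(mask):
    """Wrapper fitness: weighted KNN error plus a feature-ratio penalty (assumed weights)."""
    if mask.sum() == 0:
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5), X[:, mask == 1], y, cv=3).mean()
    return 0.99 * (1.0 - acc) + 0.01 * mask.sum() / mask.size


def binarize(pos):
    """Sigmoid transfer function mapping continuous whale positions to a 0/1 feature mask."""
    return (1.0 / (1.0 + np.exp(-pos)) > rng.random(pos.size)).astype(int)


def simulated_annealing(mask, fit, t0=1.0, cooling=0.9, steps=20):
    """Refine a feature mask by single-bit flips, accepting worse moves with Boltzmann probability."""
    best, best_fit = mask.copy(), fit
    cur, cur_fit, t = mask.copy(), fit, t0
    for _ in range(steps):
        cand = cur.copy()
        cand[rng.integers(cand.size)] ^= 1          # flip one randomly chosen feature bit
        cand_fit = fitness(cand)
        if cand_fit < cur_fit or rng.random() < np.exp((cur_fit - cand_fit) / t):
            cur, cur_fit = cand, cand_fit
            if cur_fit < best_fit:
                best, best_fit = cur.copy(), cur_fit
        t *= cooling
    return best, best_fit


def woa_sa(n_whales=10, n_iter=20):
    dim = X.shape[1]
    pos = rng.uniform(-1.0, 1.0, (n_whales, dim))   # continuous whale positions
    masks = np.array([binarize(p) for p in pos])
    fits = np.array([fitness(m) for m in masks])
    j = fits.argmin()
    best_pos, best_mask, best_fit = pos[j].copy(), masks[j].copy(), fits[j]
    for it in range(n_iter):
        a = 2.0 - 2.0 * it / n_iter                 # a decreases linearly from 2 to 0
        for i in range(n_whales):
            A, C, p = 2.0 * a * rng.random() - a, 2.0 * rng.random(), rng.random()
            if p < 0.5:                             # encircling prey or random search
                ref = best_pos if abs(A) < 1 else pos[rng.integers(n_whales)]
                pos[i] = ref - A * np.abs(C * ref - pos[i])
            else:                                   # spiral bubble-net update
                ell = rng.uniform(-1.0, 1.0)
                pos[i] = np.abs(best_pos - pos[i]) * np.exp(ell) * np.cos(2 * np.pi * ell) + best_pos
            masks[i] = binarize(pos[i])
            fits[i] = fitness(masks[i])
        j = fits.argmin()
        if fits[j] < best_fit:
            best_pos, best_mask, best_fit = pos[j].copy(), masks[j].copy(), fits[j]
        # Second hybridization model: SA exploits the neighborhood of the current best mask.
        best_mask, best_fit = simulated_annealing(best_mask, best_fit)
    return best_mask, best_fit


if __name__ == "__main__":
    mask, score = woa_sa()
    print("selected features:", np.flatnonzero(mask).tolist(), "fitness:", round(score, 4))
```

In the first hybridization model described above, the same simulated_annealing routine would instead be called inside the WOA loop to refine individual whale solutions; only the second model is sketched here.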


Published In

Neurocomputing, Volume 260, Issue C, October 2017, 505 pages

Publisher

Elsevier Science Publishers B.V., Netherlands

Author Tags

1. Feature selection
2. Hybrid optimization
3. Whale Optimization Algorithm
4. Simulated annealing
5. Classification
6. WOA
7. Optimization
