DOI: 10.1145/3474124.3474149

A New Harris Hawk Whale Optimization Algorithm for Enhancing Neural Networks

Published: 04 November 2021

Abstract

Training artificial neural networks is regarded as one of the more burdensome challenges facing researchers. The central difficulty is the nonlinear nature of the problem and its unknown controlling parameters, namely the weights and biases. Existing training algorithms suffer from slow convergence and entrapment in local optima. To overcome these drawbacks, this work proposes a hybrid of Harris hawks optimization and the whale optimization algorithm to train the neural network. Harris hawks optimization is a metaheuristic evolutionary algorithm and is used here to optimize the weights and biases of the network. The efficacy of the proposed algorithm is assessed on several cancer datasets as well as other datasets such as fraud detection and banknote authentication. The experimental results demonstrate that the proposed algorithm outperforms its contemporary counterparts.
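
To make the setup concrete, the following is a minimal sketch of the general technique the abstract describes: flattening a network's weights and biases into a single search vector and letting a population-based metaheuristic minimize a fitness function over it. It is not the paper's hybrid HHO-WOA rule; the simplified whale-style update, the one-hidden-layer architecture, the scikit-learn breast-cancer dataset, and the mean-squared-error fitness are all illustrative assumptions.

```python
# Minimal sketch (not the paper's algorithm): training a one-hidden-layer
# MLP by searching over its flattened weights and biases with a simplified
# WOA-style population update. Dataset, architecture, and fitness function
# are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)

n_in, n_hid = X.shape[1], 10
dim = n_in * n_hid + n_hid + n_hid + 1          # W1, b1, W2, b2 flattened

def forward(v, X):
    """Decode a flat search vector into MLP parameters and predict."""
    i = 0
    W1 = v[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = v[i:i + n_hid]; i += n_hid
    W2 = v[i:i + n_hid]; i += n_hid
    b2 = v[i]
    h = np.tanh(X @ W1 + b1)                     # hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

def fitness(v):
    """Mean squared error of the decoded network (lower is better)."""
    return np.mean((forward(v, X) - y) ** 2)

# Simplified whale-style search: each agent either encircles the current
# best solution or follows a logarithmic spiral toward it.
pop = rng.uniform(-1.0, 1.0, (30, dim))
best = min(pop, key=fitness).copy()
T = 200
for t in range(T):
    a = 2.0 - 2.0 * t / T                        # decreases linearly 2 -> 0
    for k in range(len(pop)):
        A = 2.0 * a * rng.random(dim) - a
        C = 2.0 * rng.random(dim)
        if rng.random() < 0.5:                   # shrinking encircling move
            pop[k] = best - A * np.abs(C * best - pop[k])
        else:                                    # spiral move toward best
            D = np.abs(best - pop[k])
            l = rng.uniform(-1.0, 1.0)
            pop[k] = D * np.exp(l) * np.cos(2.0 * np.pi * l) + best
        if fitness(pop[k]) < fitness(best):
            best = pop[k].copy()

acc = np.mean((forward(best, X) > 0.5) == y)
print(f"training accuracy of best agent: {acc:.3f}")
```

The point of the sketch is the encoding: once the network's parameters are a single real vector scored by a fitness function, any population-based metaheuristic (HHO, WOA, or a hybrid of the two) can be dropped in as the search loop without touching the network code.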


Cited By

  • (2024) An Improved Snow Ablation Optimizer for Stabilizing the Artificial Neural Network. In Advances in Data-Driven Computing and Intelligent Systems, 525–536. https://doi.org/10.1007/978-981-99-9521-9_40. Online publication date: 22-Feb-2024.
  • (2022) Recent Advances in Harris Hawks Optimization: A Comparative Study and Applications. Electronics 11(12): 1919. https://doi.org/10.3390/electronics11121919. Online publication date: 20-Jun-2022.



Published In

IC3-2021: Proceedings of the 2021 Thirteenth International Conference on Contemporary Computing
August 2021
483 pages
ISBN:9781450389204
DOI:10.1145/3474124
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Artificial neural network
  2. Exploitation
  3. Exploration
  4. Harris hawk optimization
  5. Whale optimization

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

IC3 '21
