
Optimizing connection weights in neural networks using the whale optimization algorithm

Published: 01 January 2018

Abstract

The learning process of artificial neural networks is considered one of the most difficult challenges in machine learning and has recently attracted many researchers. The main difficulty in training a neural network lies in its nonlinear nature and in the unknown best set of main controlling parameters (weights and biases). The main disadvantages of conventional training algorithms are local optima stagnation and slow convergence speed, which makes stochastic optimization algorithms a reliable alternative for alleviating these drawbacks. This work proposes a new training algorithm based on the recently proposed whale optimization algorithm (WOA). This algorithm has been shown to solve a wide range of optimization problems and to outperform existing algorithms, which motivated our attempt to benchmark its performance in training feedforward neural networks. For the first time in the literature, a set of 20 datasets with different levels of difficulty is chosen to test the proposed WOA-based trainer. The results are verified by comparison with the back-propagation algorithm and six evolutionary techniques. The qualitative and quantitative results show that the proposed trainer outperforms the current algorithms on the majority of datasets in terms of both local optima avoidance and convergence speed.
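The trainer described above can be sketched concretely. The following is a minimal, illustrative Python implementation, not the paper's own code: it flattens the weights and biases of a small single-hidden-layer network into one candidate vector, uses mean squared error as the fitness function, and applies the standard WOA update rules (shrinking encirclement of the best whale, random-whale exploration, and the bubble-net spiral) as defined by Mirjalili and Lewis [32]. The network size, the XOR stand-in dataset, and all parameter settings (`n_whales=30`, `n_iter=200`, bounds of ±5) are assumptions for this sketch, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny feedforward network: 2 inputs, one hidden layer of 4 tanh units,
# 1 sigmoid output. Every weight and bias is flattened into one vector,
# which serves as the "search agent" encoding for the trainer.
N_IN, N_HID, N_OUT = 2, 4, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # 17 parameters

def forward(vec, X):
    i = 0
    W1 = vec[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = vec[i:i + N_HID]; i += N_HID
    W2 = vec[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = vec[i:i + N_OUT]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

# XOR as a stand-in dataset (the paper benchmarks on 20 real datasets).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def mse(vec):
    return float(np.mean((forward(vec, X) - y) ** 2))

def woa_train(fitness, dim, n_whales=30, n_iter=200, lb=-5.0, ub=5.0):
    pop = rng.uniform(lb, ub, (n_whales, dim))
    best = min(pop, key=fitness).copy()
    best_fit = fitness(best)
    for t in range(n_iter):
        a = 2.0 * (1.0 - t / n_iter)        # decreases linearly from 2 to 0
        for i in range(n_whales):
            A = 2.0 * a * rng.random() - a  # WOA coefficient A
            C = 2.0 * rng.random()          # WOA coefficient C
            if rng.random() < 0.5:
                if abs(A) < 1:              # exploit: encircle the best whale
                    pop[i] = best - A * np.abs(C * best - pop[i])
                else:                       # explore: move around a random whale
                    rand = pop[rng.integers(n_whales)]
                    pop[i] = rand - A * np.abs(C * rand - pop[i])
            else:                           # bubble-net spiral toward the best
                l = rng.uniform(-1.0, 1.0)
                pop[i] = np.abs(best - pop[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
            pop[i] = np.clip(pop[i], lb, ub)
            f = fitness(pop[i])
            if f < best_fit:
                best, best_fit = pop[i].copy(), f
    return best, best_fit

best_vec, best_mse = woa_train(mse, DIM)
print(f"final training MSE: {best_mse:.4f}")
```

A real experiment would replace the XOR arrays with one of the paper's benchmark datasets and tune the population size, iteration count, and weight bounds accordingly.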

References

[1]
Baluja S (1994) Population-based incremental learning. A method for integrating genetic search based function optimization and competitive learning. Technical report, DTIC Document.
[2]
Basheer IA, Hajmeer M (2000) Artificial neural networks: fundamentals, computing, design, and application. J Microbiol Methods 43(1):3-31.
[3]
Beyer H-G, Schwefel H-P (2002) Evolution strategies-a comprehensive introduction. Natural Comput 1(1):3-52.
[4]
Blum C, Socha K (2005) Training feed-forward neural networks with ant colony optimization: an application to pattern classification. In: Fifth international conference on hybrid intelligent systems (HIS'05), IEEE, p 6.
[5]
Braik M, Sheta A, Arieqat A (2008) A comparison between GAs and PSO in training ANN to model the TE chemical process reactor. In: AISB 2008 convention communication, interaction and social intelligence, vol 1. Citeseer, p 24.
[6]
Chatterjee S, Sarkar S, Hore S, Dey N, Ashour AS, Balas VE (2016) Particle swarm optimization trained neural network for structural failure prediction of multistoried RC buildings. Neural Comput Appl 1-12.
[7]
Črepinšek M, Liu S-H, Mernik M (2013) Exploration and exploitation in evolutionary algorithms: a survey. ACM Comput Surv (CSUR) 45(3):35.
[8]
Das S, Suganthan PN (2011) Differential evolution: a survey of the state-of-the-art. IEEE Trans Evol Comput 15(1):4-31.
[9]
Ding S, Chunyang S, Junzhao Y (2011) An optimizing BP neural network algorithm based on genetic algorithm. Artif Intell Rev 36(2):153-162.
[10]
Dorigo M, Birattari M, Stützle T (2006) Ant colony optimization. Comput Intell Mag IEEE 1(4):28-39.
[11]
Faris H, Aljarah I, Mirjalili S (2016) Training feedforward neural networks using multi-verse optimizer for binary classification problems. Appl Intell 45(2):322-332.
[12]
Gang X (2013) An adaptive parameter tuning of particle swarm optimization algorithm. Appl Math Comput 219(9):4560-4569.
[13]
Goldberg DE et al (1989) Genetic algorithms in search, optimization and machine learning. Addison-Wesley, Reading.
[14]
Gupta JND, Sexton RS (1999) Comparing backpropagation with a genetic algorithm for neural network training. Omega 27(6):679-684.
[15]
Holland JH (1992) Adaptation in natural and artificial systems. MIT Press, Cambridge.
[16]
Ho YC, Pepyne DL (2002) Simple explanation of the no-free-lunch theorem and its implications. J Optim Theory Appl 115(3):549-570.
[17]
Huang W, Zhao D, Sun F, Liu H, Chang E (2015) Scalable gaussian process regression using deep neural networks. In: Proceedings of the 24th international conference on artificial intelligence. AAAI Press, pp 3576-3582.
[18]
Ilonen J, Kamarainen J-K, Lampinen J (2003) Differential evolution training algorithm for feed-forward neural networks. Neural Process Lett 17(1):93-105.
[19]
Jianbo Y, Wang S, Xi L (2008) Evolving artificial neural networks using an improved PSO and DPSO. Neurocomputing 71(4-6):1054-1060.
[20]
Karaboga D (2005) An idea based on honey bee swarm for numerical optimization. Technical report, Technical report-tr06, Erciyes University, Engineering Faculty, Computer Engineering Department.
[21]
Karaboga D, Gorkemli B, Ozturk C, Karaboga N (2014) A comprehensive survey: artificial bee colony (ABC) algorithm and applications. Artif Intell Rev 42(1):21-57.
[22]
Karaboga D, Akay B, Ozturk C (2007) Artificial bee colony (ABC) optimization algorithm for training feed-forward neural networks. In: Modeling decisions for artificial intelligence. Springer, pp 318-329.
[23]
Kennedy J (2010) Particle swarm optimization. In: Sammut C, Webb GI (eds) Encyclopedia of machine learning. Springer, Boston, pp 760-766.
[24]
Kim JS, Jung S (2015) Implementation of the rbf neural chip with the back-propagation algorithm for on-line learning. Appl Soft Comput 29:233-244.
[25]
Linggard R, Myers DJ, Nightingale C (2012) Neural networks for vision, speech and natural language, 1st edn. Springer, New York.
[26]
Meissner M, Schmuker M, Schneider G (2006) Optimized particle swarm optimization (OPSO) and its application to artificial neural network training. BMC Bioinform 7(1):125.
[27]
Mendes R, Cortez P, Rocha M, Neves J (2002) Particle swarms for feedforward neural network training. In: Proceedings of the 2002 international joint conference on neural networks, IJCNN '02, vol 2, pp 1895-1899.
[28]
Meng X, Li J, Qian B, Zhou M, Dai X (2014) Improved population-based incremental learning algorithm for vehicle routing problems with soft time windows. In: Networking, sensing and control (ICNSC), 2014 IEEE 11th international conference on IEEE, pp 548-553.
[29]
Mirjalili SA, Hashim SZM, Sardroudi HM (2012) Training feedforward neural networks using hybrid particle swarm optimization and gravitational search algorithm. Appl Math Comput 218(22):11125-11137.
[30]
Mirjalili S (2014) Let a biogeography-based optimizer train your multilayer perceptron. Inf Sci 269:188-209.
[31]
Mirjalili S (2015) How effective is the grey wolf optimizer in training multi-layer perceptrons. Appl Intell 43(1):150-161.
[32]
Mirjalili S, Lewis A (2016) The whale optimization algorithm. Adv Eng Softw 95:51-67.
[33]
Mohan BC, Baskaran R (2012) A survey: ant colony optimization based recent research and implementation on several engineering domain. Expert Syst Appl 39(4):4618-4627.
[34]
Panchal G, Ganatra A (2011) Behaviour analysis of multilayer perceptrons with multiple hidden neurons and hidden layers. Int J Comput Theory Eng 3(2):332.
[35]
Price K, Storn RM, Lampinen JA (2006) Differential evolution: a practical approach to global optimization. Springer, New York.
[36]
Rakitianskaia AS, Engelbrecht AP (2012) Training feedforward neural networks with dynamic particle swarm optimisation. Swarm Intell 6(3):233-270.
[37]
Rezaeianzadeh M, Tabari H, Arabi YA, Isik S, Kalin L (2014) Flood flow forecasting using ANN, ANFIS and regression models. Neural Comput Appl 25(1):25-37.
[38]
Sastry K, Goldberg DE, Kendall G (2014) Genetic algorithms. In: Burke EK, Kendall G (eds) Search methodologies: introductory tutorials in optimization and decision support techniques. Springer, Boston, pp 93-117.
[39]
Schmidhuber J (2015) Deep learning in neural networks: an overview. Neural Netw 61:85-117.
[40]
Seiffert U (2001) Multiple layer perceptron training using genetic algorithms. In: Proceedings of the European symposium on artificial neural networks, Bruges, Belgium.
[41]
Sexton RS, Dorsey RE, Johnson JD (1998) Toward global optimization of neural networks: a comparison of the genetic algorithm and backpropagation. Decis Support Syst 22(2):171-185.
[42]
Sexton RS, Gupta JND (2000) Comparative evaluation of genetic algorithm and backpropagation for training neural networks. Inf Sci 129(1-4):45-59.
[43]
Slowik A, Bialko M (2008) Training of artificial neural networks using differential evolution algorithm. In: Conference on human system interactions, IEEE, pp 60-65.
[44]
Socha K, Blum C (2007) An ant colony optimization algorithm for continuous optimization: application to feed-forward neural network training. Neural Comput Appl 16(3):235-247.
[45]
Storn R, Price K (1997) Differential evolution--a simple and efficient heuristic for global optimization over continuous spaces. J Glob Optim 11(4):341-359.
[46]
Wang L, Zeng Y, Chen T (2015) Back propagation neural network with adaptive differential evolution algorithm for time series forecasting. Expert Syst Appl 42(2):855-863.
[47]
Wdaa ASI (2008) Differential evolution for neural networks learning enhancement. Ph.D. thesis, Universiti Teknologi, Malaysia.
[48]
Whitley D, Starkweather T, Bogart C (1990) Genetic algorithms and neural networks: optimizing connections and connectivity. Parallel Comput 14(3):347-361.
[49]
Wienholt W (1993) Minimizing the system error in feedforward neural networks with evolution strategy. In: ICANN93, Springer, pp 490-493.
[50]
Wolpert DH, Macready WG (1997) No free lunch theorems for optimization. IEEE Trans Evol Comput 1(1):67-82.
[51]
Yang X-S (ed) (2014) Random walks and optimization. In: Nature-inspired optimization algorithms, chap 3. Elsevier, Oxford, pp 45-65.
[52]
Zhang Y, Wang S, Ji G (2015) A comprehensive survey on particle swarm optimization algorithm and its applications. Math Probl Eng 2015:931256.


Recommendations



Published In

Soft Computing - A Fusion of Foundations, Methodologies and Applications, Volume 22, Issue 1
January 2018
333 pages
ISSN:1432-7643
EISSN:1433-7479

Publisher

Springer-Verlag

Berlin, Heidelberg


Author Tags

  1. Evolutionary algorithm
  2. MLP
  3. Multilayer perceptron
  4. Optimization
  5. Training neural network
  6. WOA
  7. Whale optimization algorithm

Qualifiers

  • Article


Cited By

  • (2024) A Short-term Power Load Forecasting Based on CSWOA-TPA-BiGRU. Proceedings of the 2024 International Conference on Power Electronics and Artificial Intelligence, pp 677-681. 10.1145/3674225.3674348. Online publication date: 19-Jan-2024
  • (2024) An enhanced Coati Optimization Algorithm for global optimization and feature selection in EEG emotion recognition. Computers in Biology and Medicine 173:C. 10.1016/j.compbiomed.2024.108329. Online publication date: 9-Jul-2024
  • (2024) Metaheuristic learning algorithms for accurate prediction of hydraulic performance of porous embankment weirs. Applied Soft Computing 151:C. 10.1016/j.asoc.2023.111150. Online publication date: 17-Apr-2024
  • (2024) A Hybrid Meta-heuristics Algorithm: XGBoost-Based Approach for IDS in IoT. SN Computer Science 5:5. 10.1007/s42979-024-02913-2. Online publication date: 10-May-2024
  • (2024) Prediction of middle box-based attacks in Internet of Healthcare Things using ranking subsets and convolutional neural network. Wireless Networks 30:3, pp 1493-1511. 10.1007/s11276-023-03603-2. Online publication date: 1-Apr-2024
  • (2024) CAC-WOA: context aware clustering with whale optimization algorithm for knowledge discovery from multidimensional space in electricity application. Cluster Computing 27:1, pp 499-513. 10.1007/s10586-023-03965-4. Online publication date: 1-Feb-2024
  • (2024) Hybrid enhanced whale optimization algorithm for contrast and detail enhancement of color images. Cluster Computing 27:1, pp 231-267. 10.1007/s10586-022-03920-9. Online publication date: 1-Feb-2024
  • (2024) Optimizing beyond boundaries: empowering the salp swarm algorithm for global optimization and defective software module classification. Neural Computing and Applications 36:30, pp 18727-18759. 10.1007/s00521-024-10131-3. Online publication date: 1-Oct-2024
  • (2023) Deep neural fuzzy based fractional order PID controller for level control applications in quadruple tank system. Journal of Intelligent & Fuzzy Systems: Applications in Engineering and Technology 45:1, pp 1847-1861. 10.3233/JIFS-221674. Online publication date: 1-Jan-2023
  • (2023) Evolutionary meta-heuristic optimized model. Journal of Intelligent & Fuzzy Systems: Applications in Engineering and Technology 45:6, pp 10967-10983. 10.3233/JIFS-213423. Online publication date: 1-Jan-2023
