Abstract
Random Weight Networks (RWNs) have been used extensively over the last decade because they offer strong features such as fast learning and good generalization performance. Most traditional training techniques for RWNs select the connection weights and hidden biases at random and therefore suffer from stagnation in local optima and degraded convergence. The literature shows that stochastic population-based optimization techniques are a well-regarded and reliable alternative for RWN optimization because of their high capability for avoiding local optima and their flexibility. In addition, many practitioners and non-expert users find it difficult to set the other parameters of the network, such as the number of hidden neurons, the activation function, and the regularization factor. In this paper, an approach for training RWNs is proposed based on a recent variant of particle swarm optimization called competitive swarm optimization. Unlike most RWN training techniques, which optimize only the input weights and hidden biases, the proposed approach automatically tunes the weights, the biases, the number of hidden neurons, the regularization factor, and the activation function embedded in the network, simultaneously. The goal is to help users effectively identify a proper structure and suitable hyperparameter values for their applications while obtaining reasonable prediction results. Twenty benchmark classification datasets are used to compare the proposed approach with different types of basic and hybrid RWN-based models. The experimental results on the benchmark datasets show that reasonable classification results can be obtained by automatically tuning the hyperparameters using the proposed approach.
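For readers unfamiliar with the model being tuned, the following is a minimal sketch of an RWN: input weights and hidden biases are drawn at random and kept fixed, and only the output weights are computed in closed form via a ridge-regularized least-squares solve. The function names, the uniform initialization range, and the `I/C` form of the regularization term are illustrative assumptions, not the paper's exact formulation; the CSO-based search over `n_hidden`, `C`, and the activation function is omitted here.

```python
import numpy as np

def train_rwn(X, y, n_hidden=20, C=1.0, activation=np.tanh, seed=0):
    """Fit a Random Weight Network (illustrative sketch).

    X: (n_samples, n_features), y: (n_samples,) targets.
    n_hidden, C, activation are the hyperparameters the paper's
    approach tunes automatically; here they are fixed by hand.
    """
    rng = np.random.default_rng(seed)
    # Random input weights and hidden biases, drawn once and never trained.
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    # Hidden-layer output matrix.
    H = activation(X @ W + b)
    # Output weights via ridge-regularized least squares:
    # beta = (H^T H + I/C)^{-1} H^T y  (assumed regularization form).
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ y)
    return W, b, beta

def predict_rwn(X, W, b, beta, activation=np.tanh):
    return activation(X @ W + b) @ beta
```

Because training reduces to one linear solve, evaluating a candidate hyperparameter setting is cheap, which is what makes a population-based search such as competitive swarm optimization practical for tuning the whole configuration at once.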
Notes
Interested readers can refer to http://unboxresearch.com/articles/randnn.html for a visual exploration of RWNs.
Ethics declarations
Conflict of interest
The authors declare that there is no conflict of interest regarding the publication of this paper.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Eshtay, M., Faris, H., Heidari, A.A. et al. AutoRWN: automatic construction and training of random weight networks using competitive swarm of agents. Neural Comput & Applic 33, 5507–5524 (2021). https://doi.org/10.1007/s00521-020-05329-0