Abstract
The performance of most data science algorithms, and in particular machine learning algorithms, depends largely on the performance of their optimisation algorithm. In other words, without an effective optimisation algorithm there is no effective data science algorithm. Conventional optimisation algorithms suffer from drawbacks such as a tendency to get stuck in local optima and sensitivity to initial conditions. To tackle these, population-based metaheuristic algorithms, which work on a population of candidate solutions and incorporate stochastic behaviour, can be used. In this chapter, we present the Human Mental Search (HMS) algorithm, a population-based metaheuristic algorithm inspired by bid exploration in online auctions. HMS comprises three main operators: mental search, grouping, and movement. The mental search operator explores the vicinity of a candidate solution, the grouping operator employs an unsupervised clustering technique, k-means, to partition the candidate solutions, while the movement operator moves candidate solutions towards a promising area identified by the grouping operator. To evaluate the efficacy of the HMS algorithm, we conduct a set of experiments on benchmark functions with diverse characteristics, covering both standard and large-scale problems. The obtained results clearly show the merit of HMS compared to other algorithms.
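The three operators described above can be sketched in a few dozen lines. The following is an illustrative sketch only, not the authors' reference implementation: it assumes details not spelled out in the abstract, namely Lévy-flight steps for the mental search operator (a common choice in human-search models), a tiny k-means routine for grouping, and movement of all candidate solutions towards the best solution of the best-performing cluster, with simple elitism. The objective `sphere` and all parameter values are hypothetical choices for demonstration.

```python
import numpy as np
from math import gamma, sin, pi

def sphere(x):
    """Example objective: minimum 0 at the origin."""
    return float(np.sum(x ** 2))

def levy_step(dim, beta=1.5, rng=None):
    """Levy-distributed step via Mantegna's algorithm (assumed search move)."""
    rng = rng or np.random.default_rng()
    sigma = (gamma(1 + beta) * sin(pi * beta / 2)
             / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def kmeans(points, k, iters=10, rng=None):
    """Minimal k-means; returns a cluster label per point."""
    rng = rng or np.random.default_rng()
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels

def hms(f, dim=5, pop=30, k=3, iters=200, lb=-5.0, ub=5.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (pop, dim))
    for _ in range(iters):
        fit = np.array([f(x) for x in X])
        # Mental search: try a few Levy-flight moves around each solution,
        # keeping a move only if it improves the solution.
        for i in range(pop):
            for _ in range(3):
                cand = np.clip(X[i] + 0.1 * levy_step(dim, rng=rng), lb, ub)
                if f(cand) < fit[i]:
                    X[i], fit[i] = cand, f(cand)
        # Grouping: cluster the population and pick the cluster whose
        # mean fitness is best (a "promising area" of the search space).
        labels = kmeans(X, k, rng=rng)
        means = [fit[labels == j].mean() if np.any(labels == j) else np.inf
                 for j in range(k)]
        best_cluster = int(np.argmin(means))
        in_best = labels == best_cluster
        target = X[in_best][fit[in_best].argmin()]
        # Movement: pull all solutions towards the promising area.
        gbest = X[fit.argmin()].copy()  # elitism: keep the best-so-far
        X = np.clip(X + rng.uniform(size=(pop, dim)) * (target - X), lb, ub)
        X[0] = gbest
    fit = np.array([f(x) for x in X])
    return X[fit.argmin()], float(fit.min())

best_x, best_f = hms(sphere)
print(best_f)
```

The mental search loop exploits locally, while the grouping and movement operators share information across the population; the balance between the two is what the chapter's benchmark experiments evaluate.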
Copyright information
© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG
About this chapter
Cite this chapter
Mousavirad, S.J., Schaefer, G., Ebrahimpour-Komleh, H. (2021). The Human Mental Search Algorithm for Solving Optimisation Problems. In: Hassanien, AE., Taha, M.H.N., Khalifa, N.E.M. (eds) Enabling AI Applications in Data Science. Studies in Computational Intelligence, vol 911. Springer, Cham. https://doi.org/10.1007/978-3-030-52067-0_2
DOI: https://doi.org/10.1007/978-3-030-52067-0_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-52066-3
Online ISBN: 978-3-030-52067-0
eBook Packages: Intelligent Technologies and Robotics (R0)