
The Human Mental Search Algorithm for Solving Optimisation Problems

Chapter in: Enabling AI Applications in Data Science

Abstract

The performance of most data science algorithms, and in particular machine learning algorithms, depends largely on the performance of their underlying optimisation algorithm. In other words, without an effective optimisation algorithm there is no effective data science algorithm. Conventional optimisation algorithms suffer from drawbacks such as a tendency to get stuck in local optima and sensitivity to initial conditions. To tackle these, population-based metaheuristic algorithms, which work on a population of candidate solutions and incorporate stochastic behaviour, can be used. In this chapter, we present the Human Mental Search (HMS) algorithm, a population-based metaheuristic algorithm inspired by bid exploration in online auctions. HMS comprises three main operators: mental search, grouping, and movement. The mental search operator explores the vicinity of a candidate solution, the grouping operator employs an unsupervised clustering technique, k-means, to partition candidate solutions, while the movement operator moves candidate solutions towards a promising area identified by the grouping operator. To evaluate the efficacy of the HMS algorithm, we conduct a set of experiments on benchmark functions with diverse characteristics, covering both standard and large-scale problems. The obtained results clearly show the merit of HMS compared to other algorithms.
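The three-operator pipeline described above can be sketched in Python as follows. This is a minimal illustrative implementation, not the authors' reference code: the Lévy-flight step size of 0.1, a single mental-search neighbour per candidate, k = 3 clusters, the number of Lloyd iterations, and the sphere benchmark function are all assumptions made for the example.

```python
import numpy as np
from math import gamma, sin, pi

def sphere(x):
    # Benchmark objective (minimise): f(x) = sum(x_i^2), optimum 0 at the origin.
    return float(np.sum(x ** 2))

def levy_step(dim, beta=1.5, rng=None):
    # Mantegna's algorithm for a Levy-distributed step (mental search operator).
    if rng is None:
        rng = np.random.default_rng()
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, dim)
    v = rng.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)

def hms(f, dim=10, pop=30, iters=200, k=3, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (pop, dim))
    for _ in range(iters):
        # 1) Mental search: probe the vicinity of each candidate with a Levy
        #    flight, keeping the trial point only if it improves the objective.
        for i in range(pop):
            trial = np.clip(X[i] + 0.1 * levy_step(dim, rng=rng) * X[i], lo, hi)
            if f(trial) < f(X[i]):
                X[i] = trial
        # 2) Grouping: cluster the population with a few Lloyd (k-means)
        #    iterations and pick the cluster with the best mean objective.
        centres = X[rng.choice(pop, k, replace=False)].copy()
        for _ in range(5):
            labels = np.argmin(((X[:, None, :] - centres[None]) ** 2).sum(-1), axis=1)
            for c in range(k):
                if np.any(labels == c):
                    centres[c] = X[labels == c].mean(axis=0)
        fitness = np.array([f(x) for x in X])
        best_cluster = min(
            range(k),
            key=lambda c: fitness[labels == c].mean() if np.any(labels == c) else np.inf)
        members = np.flatnonzero(labels == best_cluster)
        target = X[members[np.argmin(fitness[members])]].copy()
        # 3) Movement: pull every candidate towards the best solution found
        #    in the most promising cluster.
        X += rng.uniform(0, 1, (pop, dim)) * (target - X)
        X = np.clip(X, lo, hi)
    best = min(X, key=f)
    return best, f(best)

best, val = hms(sphere)
print(val)
```

Under these assumptions the population contracts towards the best cluster's best member each iteration, while the Lévy flights provide occasional long jumps that help escape local optima on multimodal benchmarks.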




Copyright information

© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Mousavirad, S.J., Schaefer, G., Ebrahimpour-Komleh, H. (2021). The Human Mental Search Algorithm for Solving Optimisation Problems. In: Hassanien, AE., Taha, M.H.N., Khalifa, N.E.M. (eds) Enabling AI Applications in Data Science. Studies in Computational Intelligence, vol 911. Springer, Cham. https://doi.org/10.1007/978-3-030-52067-0_2
