Abstract
The goal of this work is to propose a hybrid algorithm, called the real-coded self-organizing migrating genetic algorithm, which combines a real-coded genetic algorithm (RCGA) with the self-organizing migrating algorithm (SOMA) to solve bound-constrained nonlinear optimization problems with multimodal continuous objective functions. In the RCGA component, exponential ranking selection, whole-arithmetic crossover and non-uniform mutation are used as the genetic operators, whereas the SOMA component is applied in a modified form. The performance of the proposed hybrid algorithm has been tested on a set of benchmark optimization problems taken from the existing literature, and the simulated results have been compared numerically and graphically with those of existing algorithms. For the graphical comparison, a modified performance index has been proposed. Finally, the proposed algorithm has been applied to solve two real-life problems.
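For concreteness, the non-uniform mutation operator named above can be sketched in a few lines. This is a minimal illustration of Michalewicz's non-uniform mutation, not the paper's implementation; the function name and the default shape parameter `b` are our own choices:

```python
import random

def nonuniform_mutation(x, lb, ub, t, T, b=5.0, rng=random):
    """Michalewicz's non-uniform mutation for one real-coded gene.

    The perturbation delta(y) = y * (1 - r**((1 - t/T)**b)) shrinks as the
    generation counter t approaches the maximum T, so the search is broad
    early on and increasingly fine-grained near the end of the run.
    """
    def delta(y):
        r = rng.random()
        return y * (1.0 - r ** ((1.0 - t / T) ** b))

    # Move toward the upper or lower bound with equal probability;
    # the offspring always stays inside [lb, ub].
    if rng.random() < 0.5:
        return x + delta(ub - x)
    return x - delta(x - lb)
```

Note that at t = T the perturbation collapses to zero, so the operator degenerates to the identity in the final generation.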
Acknowledgements
The authors are thankful to the anonymous reviewers for their constructive comments. The third and fifth authors gratefully acknowledge the financial support provided by WBDSTBT (429(Sanc)/ST/P/S&T/16G-23/2018) for continuing this research. The first author is thankful to Mrs. Kabita Garai, Dakshin Changrachak Sukanta Vidyapith, Moyna, Purba Medinipur, West Bengal, India-721644, and Dr. Navonil Bose, Department of Physics, Supreme Knowledge Foundation Group of Institutions, Mankundu, Hooghly, West Bengal, India-712139, for their kind cooperation and suggestions.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix
Detailed descriptions of the 25 test problems used in this paper are given below:
1. Ackley's problem:
$$\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{1} ({\mathbf{x}}) = - 20\exp \left( { - 0.2\sqrt {\frac{1}{n}\sum\limits_{i = 1}^{n} {x_{i}^{2} } } } \right) - \exp \left( {\frac{1}{n}\sum\limits_{i = 1}^{n} {\cos (2\pi x_{i} )} } \right) + 20 + e,$$where \(- 32 \le x_{i} \le 32\), with the known global optimum \(f_{1} ({\mathbf{x}}^{*} ) = 0\) at \(x_{i}^{*} = 0\) for \(i = 1,2, \ldots ,n.\)
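As a quick sanity check of the stated optimum, the formula above transcribes directly into Python (the function name is ours):

```python
import math

def ackley(x):
    """Ackley's function f1; global minimum 0 at the origin."""
    n = len(x)
    sq = sum(xi * xi for xi in x) / n
    co = sum(math.cos(2.0 * math.pi * xi) for xi in x) / n
    return -20.0 * math.exp(-0.2 * math.sqrt(sq)) - math.exp(co) + 20.0 + math.e
```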
2. Cosine mixture problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{2} ({\mathbf{x}}) = \sum\limits_{i = 1}^{n} {x_{i}^{2} } - 0.1\sum\limits_{i = 1}^{n} {\cos (5\pi x_{i} )}\), \({\text{where }}\,\, - 1 \le x_{i} \le 1\), with the known global optimum \(f_{2} ({\mathbf{x}}^{*} ) = - \,0.1\,n \,\) at \(x_{i}^{*} = 0\) for \(i = 1,2, \ldots ,n.\)
3. Exponential problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{3} ({\mathbf{x}}) = - \exp \left( { - 0.5\sum\limits_{i = 1}^{n} {x_{i}^{2} } } \right)\),\({\text{where }}\, - 1 \le x_{i} \le 1\), with the known global optimum \(f_{3} ({\mathbf{x}}^{*} ) = - 1 \,\) at \(x_{i}^{*} = 0\) for \(i = 1,2, \ldots ,n.\)
4. Griewank problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{4} ({\mathbf{x}}) = 1 + \frac{1}{4000}\sum\limits_{i = 1}^{n} {x_{i}^{2} } - \prod\limits_{i = 1}^{n} {\cos \left( {\frac{{x_{i} }}{\sqrt i }} \right)}\), \({\text{where }}\, - 600 \le x_{i} \le 600\), with the known global optimum \(f_{4} ({\mathbf{x}}^{*} ) = 0 \,\) at \(x_{i}^{*} = 0\) for \(i = 1,2, \ldots ,n.\)
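A direct Python transcription of the Griewank formula, useful for verifying the optimum at the origin (function name ours; `math.prod` requires Python 3.8+):

```python
import math

def griewank(x):
    """Griewank function f4; global minimum 0 at the origin."""
    s = sum(xi * xi for xi in x) / 4000.0
    p = math.prod(math.cos(xi / math.sqrt(i)) for i, xi in enumerate(x, start=1))
    return 1.0 + s - p
```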
5. Levy and Montalvo problem 1:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{5} ({\mathbf{x}}) = \frac{\pi }{n}\left( {10\sin^{2} (\pi y_{1} ) + \sum\limits_{i = 1}^{n - 1} {(y_{i} - 1)^{2} } \left[ {1 + 10\sin^{2} \left( {\pi y_{i + 1} } \right)} \right] + (y_{n} - 1)^{2} } \right),\) \({\text{where }}\,y_{i} = 1 + \frac{1}{4}(x_{i} + 1),\,\,\, - 10 \le x_{i} \le 10\), with the known global optimum \(f_{5} ({\mathbf{x}}^{*} ) = 0\) at \(x_{i}^{*} = - 1\) (so that \(y_{i} = 1\)) for \(i = 1,2, \ldots ,n.\)
6. Levy and Montalvo problem 2:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{6} ({\mathbf{x}}) = 0.1\left( {\sin^{2} (3\pi x_{1} ) + \sum\limits_{i = 1}^{n - 1} {(x_{i} - 1)^{2} } \left[ {1 + \sin^{2} (3\pi x_{i + 1} )} \right] + (x_{n} - 1)^{2} \left[ {1 + \sin^{2} (2\pi x_{n} )} \right]} \right)\)\({\text{where }}\,\, - 5 \le x_{i} \le 5\), with the known global optimum \(f_{6} ({\mathbf{x}}^{*} ) = 0 \,\) at \(x_{i}^{*} = 1\) for \(i = 1,2, \ldots ,n.\)
7. Paviani problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{7} ({\mathbf{x}}) = \sum\limits_{i = 1}^{10} {\left[ {(\ln (x_{i} - 2))^{2} + (\ln (10 - x_{i} ))^{2} } \right]} - \left( {\prod\limits_{i = 1}^{10} {x_{i} } } \right)^{0.2} ,\;{\text{where }}2 \le x_{i} \le 10\), with the known global optimum \(f_{7} ({\mathbf{x}}^{*} ) = - 45.778\) at \(x_{i}^{*} = 9.351\) for \(i = 1,2, \ldots ,10.\)
8. Rastrigin problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{8} ({\mathbf{x}}) = 10n + \sum\limits_{i = 1}^{n} {\left[ {x_{i}^{2} - 10\cos (2\pi x_{i} )} \right]}\), \({\text{where }} - 5.12 \le x_{i} \le 5.12\), with the known global optimum \(f_{8} ({\mathbf{x}}^{*} ) = 0\) at \(x_{i}^{*} = 0\) for \(i = 1,2, \ldots ,n.\)
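The Rastrigin formula above can be checked with a one-line Python transcription (function name ours):

```python
import math

def rastrigin(x):
    """Rastrigin function f8; global minimum 0 at the origin."""
    n = len(x)
    return 10.0 * n + sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi)
                          for xi in x)
```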
9. Rosenbrock problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{9} ({\mathbf{x}}) = \sum\limits_{i = 1}^{n - 1} {\left[ {100(x_{i + 1} - x_{i}^{2} )^{2} + (x_{i} - 1)^{2} } \right]}\), \({\text{where }} - 30 \le x_{i} \le 30\), with the known global optimum \(f_{9} ({\mathbf{x}}^{*} ) = 0\) at \(x_{i}^{*} = 1\) for \(i = 1,2, \ldots ,n.\)
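A Python transcription of the Rosenbrock formula, confirming that the minimum 0 is attained at (1, ..., 1) (function name ours):

```python
def rosenbrock(x):
    """Rosenbrock function f9; global minimum 0 at (1, ..., 1)."""
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1.0) ** 2
               for i in range(len(x) - 1))
```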
10. Schwefel problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{10} ({\mathbf{x}}) = 418.9829n - \sum\limits_{i = 1}^{n} {\left[ {x_{i} \sin (\sqrt {\left| {x_{i} } \right|} )} \right]\,}\), \({\text{where }} - 500 \le x_{i} \le 500\), with the known global optimum \(f_{10} ({\mathbf{x}}^{*} ) = 0\) at \(x_{i}^{*} = 420.97\) for \(i = 1,2, \ldots ,n.\)
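The constant 418.9829 in the Schwefel function is (approximately) the per-dimension value of \(x\sin(\sqrt{|x|})\) at its maximizer near 420.97, which is why the minimum is close to, but not exactly, zero. A Python transcription (function name ours):

```python
import math

def schwefel(x):
    """Schwefel function f10; minimum near x_i = 420.97 within [-500, 500]."""
    n = len(x)
    return 418.9829 * n - sum(xi * math.sin(math.sqrt(abs(xi))) for xi in x)
```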
11. Sinusoidal problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{11} ({\mathbf{x}}) = - \left[ {2.5\prod\limits_{i = 1}^{n} {\sin \left( {x_{i} - \frac{\pi }{6}} \right)} + \prod\limits_{i = 1}^{n} {\sin \left( {5\left( {x_{i} - \frac{\pi }{6}} \right)} \right)} } \right]\), \({\text{where }}\,\,0 \le x_{i} \le \pi\), with the known global optimum \(f_{11} ({\mathbf{x}}^{*} ) = - 3.5\) at \(x_{i}^{*} = 2\pi /3\) for \(i = 1,2, \ldots ,n\).
12. Zakharov's problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{12} ({\mathbf{x}}) = \sum\limits_{i = 1}^{n} {x_{i}^{2} } + \left( {\sum\limits_{i = 1}^{n} {\frac{i}{2}x_{i} } } \right)^{2} + \left( {\sum\limits_{i = 1}^{n} {\frac{i}{2}x_{i} } } \right)^{4}\), \({\text{where}} - 5.12 \le x_{i} \le 5.12\), with the known global optimum \(f_{12} ({\mathbf{x}}^{*} ) = 0\) at \(x_{i}^{*} = 0\) for \(i = 1,2, \ldots ,n\)
13. Sphere problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{13} ({\mathbf{x}}) = \sum\limits_{i = 1}^{n} {x_{i}^{2} }\), \({\text{where}} - 5.12 \le x_{i} \le 5.12\), with the known global optimum \(f_{13} ({\mathbf{x}}^{*} ) = 0\) at \(x_{i}^{*} = 0\) for \(i = 1,2, \ldots ,n.\)
14. Axis parallel hyper-ellipsoid problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{14} ({\mathbf{x}}) = \sum\limits_{i = 1}^{n} {ix_{i}^{2} }\), \({\text{where}} - 5.12 \le x_{i} \le 5.12\), with the known global optimum \(f_{14} ({\mathbf{x}}^{*} ) = 0\) at \(x_{i}^{*} = 0\) for \(i = 1,2, \ldots ,n.\)
15. Schwefel double sum problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{15} ({\mathbf{x}}) = \sum\limits_{i = 1}^{n} {\left( {\sum\limits_{j = 1}^{i} {x_{j} } } \right)^{2} }\), \({\text{where }}\,\, - 65.536 \le x_{i} \le 65.536\), with the known global optimum \(f_{15} ({\mathbf{x}}^{*} ) = 0\) at \(x_{i}^{*} = 0\) for \(i = 1,2, \ldots ,n\).
16. Schwefel problem 4:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{16} ({\mathbf{x}}) = \mathop {\hbox{max} }\limits_{i} \{ \,\left| {x_{i} } \right|,\,\,1 \le i \le n\}\), \({\text{where}}\, - 100 \le x_{i} \le 100\), with the known global optimum \(f_{16} ({\mathbf{x}}^{*} ) = 0\) at \(x_{i}^{*} = 0\) for \(i = 1,2, \ldots ,n.\)
17. De Jong problem with noise:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{17} ({\mathbf{x}}) = \sum\limits_{i = 1}^{n} {x_{i}^{4} + {\text{rand}}\, (0,\,\,1)}\), \({\text{where }} - 10 \le x_{i} \le 10\), with the known global optimum \(f_{17} ({\mathbf{x}}^{*} ) = 0\) at \(x_{i}^{*} = 0\) for \(i = 1,2, \ldots ,n.\)
18. Ellipsoidal problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{18} ({\mathbf{x}}) = \sum\limits_{i = 1}^{n} {(x_{i} - i)^{2} }\), \({\text{where }}\, - n \le x_{i} \le n\), with the known global optimum \(f_{18} ({\mathbf{x}}^{*} ) = 0\) at \(x_{i}^{*} = i\) for \(i = 1,2, \ldots ,n.\)
19. Generalized penalized problem 1:
$$\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{19} ({\mathbf{x}}) = \frac{\pi }{n}\left( {10\sin^{2} (\pi y_{1} ) + \sum\limits_{i = 1}^{n - 1} {(y_{i} - 1)^{2} } \left[ {1 + 10\sin^{2} (\pi y_{i + 1} )} \right] + (y_{n} - 1)^{2} } \right) + \sum\limits_{i = 1}^{n} u (x_{i} ,\,10,\,100,\,4),$$\({\text{where}}\;y_{i} = 1 + \frac{1}{4}(x_{i} + 1)\,\,{\text{and}}\,\, - 10 \le x_{i} \le 10\), with the known global optimum \(f_{19} ({\mathbf{x}}^{*} ) = 0\) at \(x_{i}^{*} = - 1\) (so that \(y_{i} = 1\)) for \(i = 1,2, \ldots ,n.\)
20. Generalized penalized problem 2:
$$\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{20} ({\mathbf{x}}) = 0.1\left( {\sin^{2} (3\pi x_{1} ) + \sum\limits_{i = 1}^{n - 1} {(x_{i} - 1)^{2} } \left[ {1 + \sin^{2} (3\pi x_{i + 1} )} \right] + (x_{n} - 1)^{2} \left[ {1 + \sin^{2} (2\pi x_{n} )} \right]} \right) + \sum\limits_{i = 1}^{n} {u(x_{i} ,\,10,\,100,\,4)} ,\,\,\,{\text{where }}\, - 5 \le x_{i} \le 5,$$with the known global optimum \(f_{20} ({\mathbf{x}}^{*} ) = 0\) at \(x_{i}^{*} = 1\) for \(i = 1,2, \ldots ,n.\)
In problems number 19 and 20, the penalty function u is given by the following expression:
$$u(x,\,a,\,k,\,m) = \left\{ {\begin{array}{*{20}l} {k(x - a)^{m} ,} \hfill & {{\text{if}}\;x > a,} \hfill \\ {k( - x - a)^{m} ,} \hfill & {{\text{if}}\;x < - a,} \hfill \\ {0,} \hfill & {{\text{otherwise}} .} \hfill \\ \end{array} } \right.$$
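The penalty term transcribes directly into Python; note that the \(x < -a\) branch must be \(k(-x-a)^{m}\) so that the penalty is nonnegative and grows with the distance past the bound (function name as in the text):

```python
def u(x, a, k, m):
    """Penalty term used by the generalized penalized problems f19 and f20.

    Zero inside [-a, a]; grows like k * (distance past the bound)**m outside.
    """
    if x > a:
        return k * (x - a) ** m
    if x < -a:
        return k * (-x - a) ** m
    return 0.0
```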
21. Easom problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{21} ({\mathbf{x}}) = - \left( {\mathop \prod \limits_{i = 1}^{n} \cos \,(x_{i} )} \right) \times \left( {e^{{ - \sum\limits_{i = 1}^{n} {(x_{i} - \pi )^{2} } }} } \right)\), \({\text{where }}\, - 10 \le x_{i} \le 10\), with the known global optimum \(f_{21} ({\mathbf{x}}^{*} ) = - 1\) at \(x_{i}^{*} = \pi\) for \(i = 1,2, \ldots ,n.\)
22. Trid problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{22} ({\mathbf{x}}) = \sum\limits_{i = 1}^{n} {(x_{i} - 1)^{2} } - \sum\limits_{i = 2}^{n} {x_{i} x_{i - 1} }\), \({\text{where }}\,\, - n^{2} \le x_{i} \le n^{2}\), with the known global optimum \(f_{22} ({\mathbf{x}}^{*} ) = - \frac{n(n + 4)(n - 1)}{6}\, \,\) at \(x_{i}^{*} = i\,(n + 1 - i)\) for \(i = 1,2, \ldots ,n.\)
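The closed-form optimum of the Trid function is easy to verify numerically; for example, with \(n = 3\) the minimizer is \((3, 4, 3)\) and the minimum is \(-3\cdot 7\cdot 2/6 = -7\). A Python transcription (function name ours):

```python
def trid(x):
    """Trid function f22; minimum -n(n+4)(n-1)/6 at x_i = i(n+1-i)."""
    return (sum((xi - 1.0) ** 2 for xi in x)
            - sum(x[i] * x[i - 1] for i in range(1, len(x))))
```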
23. Dixon and Price problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{23} ({\mathbf{x}}) = \,(x_{1} - 1)^{2} + \,\sum\limits_{i = 2}^{n} {i\,(2x_{i}^{2} - x_{i - 1} )^{2} }\), \({\text{where }} - 10 \le x_{i} \le 10\), with the known global optimum \(f_{23} ({\mathbf{x}}^{*} ) = 0\) at \(x_{i}^{*} = 2^{{ - \frac{{2^{i} - 2}}{{2^{i} }}}}\) for \(i = 1,2, \ldots ,n.\)
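The minimizer of the Dixon and Price function satisfies the recurrence \(2x_{i}^{2} = x_{i-1}\) with \(x_{1} = 1\), i.e. \(x_{i} = 2^{-(2^{i}-2)/2^{i}}\), which can be checked with a short transcription (function name ours):

```python
def dixon_price(x):
    """Dixon and Price function f23; minimum 0 at x_i = 2**(-(2**i - 2) / 2**i)."""
    return ((x[0] - 1.0) ** 2
            + sum((i + 1) * (2.0 * x[i] ** 2 - x[i - 1]) ** 2
                  for i in range(1, len(x))))
```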
24. Michalewicz's problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{24} ({\mathbf{x}}) = - \sum\limits_{i = 1}^{n} {\sin (x_{i} )} \left( {\sin \left( {\frac{{ix_{i}^{2} }}{\pi }} \right)} \right)^{2m}\), \(\,\,{\text{where }}\,0 \le x_{i} \le \pi\) and \(m = 10\,\), with the known global optimum \(f_{24} ({\mathbf{x}}^{*} ) = - 9.6602\) for \(n = 10.\)
25. Shekel 10 problem:
\(\mathop {\hbox{min} }\limits_{{\mathbf{x}}} f_{25} ({\mathbf{x}}) = - \sum\limits_{j = 1}^{m} {\left[ {\sum\limits_{i = 1}^{4} {(x_{i} - a_{ij} )^{2} + c_{j} } } \right]}^{ - 1}\), \(\,\,{\text{where }}\,0 \le x_{i} \le 10\) and \(m = 10\),
$$a_{ij} = \left[ {\begin{array}{*{20}c} 4 & 1 & 8 & 6 & 3 & 2 & 5 & 8 & 6 & 7 \\ 4 & 1 & 8 & 6 & 7 & 9 & 5 & 1 & 2 & {3.6} \\ 4 & 1 & 8 & 6 & 3 & 2 & 3 & 8 & 6 & 7 \\ 4 & 1 & 8 & 6 & 7 & 9 & 3 & 1 & 2 & {3.6} \\ \end{array} } \right]$$\({\text{and}}\,\,c_{j} = \left[ {\begin{array}{*{20}c} {0.1} & {0.2} & {0.2} & {0.4} & {0.4} & {0.6} & {0.3} & {0.7} & {0.5} & {0.5} \\ \end{array} } \right]^{\rm T}\), with the known global optimum \(f_{25} ({\mathbf{x}}^{*} ) = - 10.5364\) at \(x_{i}^{*} = 4\), for \(i = 1,2,3,4.\)
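The Shekel 10 function, with the matrix \(a_{ij}\) and vector \(c_{j}\) above hard-coded, can be evaluated to confirm the stated optimum near \((4,4,4,4)\) (function name ours):

```python
def shekel10(x):
    """Shekel 10 function f25 (n = 4, m = 10); minimum about -10.5364 near (4, 4, 4, 4)."""
    a = [[4, 1, 8, 6, 3, 2, 5, 8, 6, 7],
         [4, 1, 8, 6, 7, 9, 5, 1, 2, 3.6],
         [4, 1, 8, 6, 3, 2, 3, 8, 6, 7],
         [4, 1, 8, 6, 7, 9, 3, 1, 2, 3.6]]
    c = [0.1, 0.2, 0.2, 0.4, 0.4, 0.6, 0.3, 0.7, 0.5, 0.5]
    return -sum(1.0 / (sum((x[i] - a[i][j]) ** 2 for i in range(4)) + c[j])
                for j in range(10))
```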
Cite this article
Duary, A., Rahman, M.S., Shaikh, A.A. et al. A new hybrid algorithm to solve bound-constrained nonlinear optimization problems. Neural Comput & Applic 32, 12427–12452 (2020). https://doi.org/10.1007/s00521-019-04696-7