
Evolutionary and population dynamics of 3 parents differential evolution (3PDE) using self-adaptive tuning methodologies

Published in Natural Computing

Abstract

Differential Evolution (DE) is known for its simplicity and effectiveness as an evolutionary optimizer, and in recent years many researchers have focused on its exploration. The objective of this paper is to show the evolutionary and population dynamics in the empirical testing of 3-Parents Differential Evolution (3PDE) for unconstrained function optimization (Teng et al. 2007). Fifty repeated evolutionary runs for each of 20 well-known benchmarks were carried out to test the proposed algorithms against the original 4-parents DE algorithm. From the observed evolutionary dynamics, 3PDE-SAF performed best among the preliminary proposed algorithms, which included 3PDE-SACr and 3PDE-SACrF. 3PDE-SAF was therefore chosen for testing dynamic population sizing using the absolute (3PDE-SAF-Abs) and relative (3PDE-SAF-Rel) population size encodings. The final results show that 3PDE-SAF-Rel produced better overall performance and convergence than all the other proposed algorithms, including the original DE. In terms of population dynamics, the population size in 3PDE-SAF-Abs exhibited disadvantageously high dynamics that led to less efficient results. The population size in 3PDE-SAF-Rel, on the other hand, remained approximately constant at ten times the number of variables being optimized, giving better and more stable performance.
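The mutation–crossover–selection loop underlying all of the DE variants discussed above can be sketched in Python. This is the classic DE/rand/1/bin generation step of Storn and Price, shown only for orientation; the three-parent mutation and the self-adaptive Cr, F and population-size encodings of 3PDE are defined in the paper itself and are not reproduced here:

```python
import random

def de_rand_1_bin(pop, f_weight, cr, fitness):
    """One generation of classic DE/rand/1/bin (Storn and Price 1995)."""
    dim = len(pop[0])
    next_pop = []
    for i, target in enumerate(pop):
        # three mutually distinct parents, all different from the target
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [a[d] + f_weight * (b[d] - c[d]) for d in range(dim)]
        j_rand = random.randrange(dim)  # at least one gene comes from the mutant
        trial = [mutant[d] if (random.random() < cr or d == j_rand) else target[d]
                 for d in range(dim)]
        # greedy selection: keep the trial only if it is no worse
        next_pop.append(trial if fitness(trial) <= fitness(target) else target)
    return next_pop
```

Because selection is greedy, the best fitness in the population can never worsen between generations.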



References

  • Ali MM, Torn A (2004) Population set based global optimization algorithms: some modifications and numerical studies. Comput Oper Res 31(10):1703–1725


  • Ali MM, Khompatraphon C, Zabinsky ZB (2005) A numerical evaluation of several stochastic algorithms on selected continuous global optimization test problems. J Glob Optim 31:635–672


  • Chakraborty UK (ed) (2008) Advances in differential evolution. Springer, Heidelberg


  • Das S, Konar A, Chakraborty UK (2005) Two improved differential evolution schemes for faster global search. Proceedings of the 2005 conference on genetic and evolutionary computation GECCO’05. pp 991–998

  • Eiben AE, Hinterding R, Michalewicz Z (1999) Parameter control in evolutionary algorithms. IEEE Trans Evol Comput 3(2):124–141


  • Ho S, Shu S, Chen J (2003) Intelligent evolutionary algorithms for large parameter optimization problems. IEEE Trans Evol Comput 8(5):522–540


  • Kaelo P, Ali MM (2006) A numerical study of some modified differential evolution algorithms. Eur J Oper Res 169(3):1176–1184


  • Onwubolu GC, Babu BV (2004) New optimization techniques in engineering. Springer Verlag, Heidelberg, Germany


  • Price K, Storn R (1997) Differential evolution. Dr Dobb’s J, 18–24

  • Storn R, Price K (1995) Differential evolution: a simple and efficient adaptive scheme for global optimization over continuous spaces. Technical Report TR-95-012, International Computer Science Institute, Berkeley

  • Teng NS, Teo J, Ahmad Hijazi MH (2006) Comparing between 3-parents and 4-parents for differential evolution. M2USIC proceeding 2006, TS-1A, pp 12–16

  • Teng NS, Teo J, Ahmad Hijazi MH (2007) Empirical testing on 3 parents differential evolution 3PDE for unconstrained function optimization. IEEE Congress on Evolutionary Computation 2007. pp 2259–2266

  • Tu Z, Lu Y (2004) A robust stochastic genetic algorithm (StGA) for global numerical optimization. IEEE Trans Evol Comput 8(5):456–470


  • Yao X, Liu Y (1996) Fast evolutionary programming. In: Fogel LJ, Angeline PJ, Bäck T (eds) Evolutionary programming V: proceedings of the fifth annual conference on evolutionary programming. The MIT Press, Cambridge, MA, pp 451–460


Corresponding author

Correspondence to Teng Nga Sing.

Appendix

This appendix lists all the test functions used in the experiments, numbered as in the text above. Each entry gives a brief description of the function, its global optimum, and the variable values at which that optimum occurs.

1.1 De Jong’s Function 1 (Sphere Model)

De Jong’s Function 1 is the simplest test function. It is also known as the sphere model and is continuous, convex and unimodal. The tests were performed with n = 10. \( f(x)\, = \,\sum\nolimits_{i = 1}^{n} {x_{i}^{2} } \) subject to −100 ≤ x_i ≤ 100. The global minimum is located at x = (0, 0,…, 0) with f(x) = 0.
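The sphere model translates one-for-one into Python; a minimal sketch (not the authors' code):

```python
def sphere(x):
    """De Jong's Function 1: sum of squared components, minimum 0 at the origin."""
    return sum(v * v for v in x)
```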

1.2 De Jong’s Function 2 (Rosenbrock’s Saddle)

$$ f\left( x \right) = \sum\limits_{i = 1}^{n - 1} {\left[ {100\left( {x_{i + 1} - x_{i}^{2} } \right)^{2} + \left( {x_{i} - 1} \right)^{2} } \right]} $$

subject to −30 ≤ x_i ≤ 30. The tests were performed with n = 10. The function is unimodal, but a saddle point makes the minimizer x = (1, 1,…, 1) with f(x) = 0 very difficult to locate.
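A corresponding Python sketch of Rosenbrock's saddle:

```python
def rosenbrock(x):
    """De Jong's Function 2: 100*(x_{i+1} - x_i^2)^2 + (x_i - 1)^2 summed to n-1."""
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1) ** 2
               for i in range(len(x) - 1))
```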

1.3 Camel Back-Three Hump Problem

$$ f\left( x \right) = 2x_{1}^{2} - 1.05x_{1}^{4} + \frac{1}{6}x_{1}^{6} + x_{1} x_{2} + x_{2}^{2} $$

subject to −5 ≤ x1, x2 ≤ 5. This is a multimodal function with three local minima, one of which is the global minimum, located at x = (0, 0) with f(x) = 0.

1.4 Becker and Lago’s Problem

$$ f\left( x \right) = \left( {|x_{1} | - 5} \right)^{2} + \left( {|x_{2} | - 5} \right)^{2} $$

subject to −10 ≤ x1, x2 ≤ 10. The tests were performed with n = 2. This is also a multimodal function; it has four global minima located at (±5, ±5), all with f(x) = 0.

1.5 Schwefel’s Problem 2.22

$$ f\left( x \right) = \sum\limits_{i = 1}^{n} {\left| {x_{i} } \right|} + \prod\limits_{i = 1}^{n} {\left| {x_{i} } \right|} $$

subject to −10 ≤ x_i ≤ 10. This is a unimodal function. The global minimum is located at the origin with f(x) = 0. The tests were performed with n = 10.
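A Python sketch of the two terms (math.prod requires Python 3.8+):

```python
import math

def schwefel_2_22(x):
    """Schwefel's Problem 2.22: sum of |x_i| plus product of |x_i|."""
    return sum(abs(v) for v in x) + math.prod(abs(v) for v in x)
```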

1.6 Rastrigin’s Function

$$ f\left( x \right) = \sum\limits_{i = 1}^{n} {\left[ {x_{i}^{2} - 10\cos \left( {2\pi x_{i} } \right) + 10} \right]} $$

subject to −5.12 ≤ x_i ≤ 5.12 where i = 1,…, n. The tests were performed with n = 10. This is a multimodal function whose total number of minima is not known; however, the global minimizer is located at x = (0, 0,…, 0) with f(x) = 0.
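In Python, a direct transcription:

```python
import math

def rastrigin(x):
    """Rastrigin's function: x_i^2 - 10*cos(2*pi*x_i) + 10, summed over i."""
    return sum(v * v - 10 * math.cos(2 * math.pi * v) + 10 for v in x)
```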

1.7 Modified Rosenbrock’s Problem

$$ f\left( x \right) = 100\left( {x_{2} - x_{1}^{2} } \right)^{2} + \left[ {6.4\left( {x_{2} - 0.5} \right)^{2} - x_{1} - 0.6} \right]^{2} $$

subject to −5 ≤ x1, x2 ≤ 5. The tests again performed with n = 2. This is a multimodal function with two global minima with f(x) = 0 where corresponding to the intersection of two parabolas and a local minimum where the parabolas approach with no intersection. The global minima are located at x ≈ (0.3412, 0.1164), (1, 1).

1.8 Griewangk’s Function

$$ f\left( x \right) = \frac{1}{4000}\sum\limits_{i = 1}^{n} {x{}_{i}^{2} - \prod\limits_{i = 1}^{n} {\cos \left( {{\frac{{x_{i} }}{{\sqrt {i + 1} }}}} \right)} } + 1 $$

subject to −600 ≤ x_i ≤ 600, with n = 10. This function has a global minimum located at x = (0, 0,…, 0) with f(x) = 0. The number of local minima is unknown.
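A direct transcription of the formula as printed; note that the divisor \( \sqrt{i+1} \) follows the equation above, whereas the common textbook form divides by \( \sqrt{i} \):

```python
import math

def griewank(x):
    """Griewangk's function as printed above (note the sqrt(i+1) divisor)."""
    s = sum(v * v for v in x) / 4000.0
    p = math.prod(math.cos(v / math.sqrt(i + 1)) for i, v in enumerate(x, start=1))
    return s - p + 1.0
```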

1.9 Ackley’s Path Function

$$ f\left( x \right) = - 20\exp \left( { - 0.2\sqrt {\frac{1}{n}\sum\limits_{i = 1}^{n} {x_{i}^{2} } } } \right) - \exp \left( {\frac{1}{n}\sum\limits_{i = 1}^{n} {\cos 2\pi x_{i} } } \right) + 20 + e $$

subject to −32 ≤ x_i ≤ 32 where i = 1,…, n, and e ≈ 2.71828183. The tests were performed with n = 10. This is a multimodal function where the number of local minima is not known. The global minimum is located at the origin with f(x) = 0.
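A Python sketch of the three terms:

```python
import math

def ackley(x):
    """Ackley's path function; global minimum 0 at the origin."""
    n = len(x)
    return (-20 * math.exp(-0.2 * math.sqrt(sum(v * v for v in x) / n))
            - math.exp(sum(math.cos(2 * math.pi * v) for v in x) / n)
            + 20 + math.e)
```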

1.10 Bohachevsky’s Problem 2

$$ f\left( x \right) = x_{1}^{2} + 2x_{2}^{2} - 0.3\cos \left( {3\pi x_{1} } \right)\cos \left( {4\pi x_{2} } \right) + 0.3 $$

subject to −50 ≤ x1, x2 ≤ 50 with n = 2. This is a multimodal function where the number of local minima is not known; however, the global minimizer is located at x = (0, 0) with f(x) = 0.

1.11 Rotated Hyper-Ellipsoid Function

$$ f\left( x \right) = \sum\limits_{i = 1}^{n} {\left( {\sum\limits_{j = 1}^{i} {x_{j} } } \right)^{2} } $$

subject to −65.536 ≤ x_i ≤ 65.536. This function is continuous, convex and unimodal, and produces rotated hyper-ellipsoids. The tests were performed with n = 10. The global minimum is located at x = (0, 0,…, 0) with f(x) = 0.
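The nested double sum can be evaluated in O(n) with a running prefix sum rather than a naive double loop; a sketch:

```python
def rotated_hyper_ellipsoid(x):
    """Sum over i of the squared prefix sum x_1 + ... + x_i."""
    total = partial = 0.0
    for v in x:
        partial += v
        total += partial * partial
    return total
```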

1.12 Sum of Different Power Problem

$$ f\left( x \right) = \sum\limits_{i = 1}^{n} {\left| {x_{i} } \right|^{{\left( {i + 1} \right)}} } $$

subject to −1 ≤ x_i ≤ 1. This is a commonly used unimodal function. The tests were again performed with n = 10. The global minimum is located at x = (0, 0,…, 0) with f(x) = 0.

1.13 Miele and Cantrell’s Problem

$$ f\left( x \right) = \left( {\exp \left( {x_{1} } \right) - x_{2} } \right)^{4} + 100\left( {x_{2} - x_{3} } \right)^{6} + \left( {\tan \left( {x_{3} - x_{4} } \right)} \right)^{4} + x_{1}^{8} $$

subject to −1 ≤ x_i ≤ 1 where i = 1, 2, 3, 4. The tests were performed with n = 4. This is a multimodal function where the number of local minima is unknown; however, the global minimum is located at x = (0, 1, 1, 1) with f(x) = 0.

1.14 Schaffer’s Problem 1

$$ f\left( x \right) = 0.5 + {\frac{{\left( {\sin \sqrt {x_{1}^{2} + x_{2}^{2} } } \right)^{2} - 0.5}}{{\left( {1 + 0.001\left( {x_{1}^{2} + x_{2}^{2} } \right)} \right)^{2} }}} $$

subject to −100 ≤ x1, x2 ≤ 100. Again, the tests were performed with n = 2. This is also a multimodal function whose number of local minima is not known. The global minimum is located at x = (0, 0) with f(x) = 0.
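In Python:

```python
import math

def schaffer_1(x1, x2):
    """Schaffer's Problem 1; global minimum 0 at (0, 0)."""
    r2 = x1 * x1 + x2 * x2
    return 0.5 + (math.sin(math.sqrt(r2)) ** 2 - 0.5) / (1 + 0.001 * r2) ** 2
```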

1.15 Moved Axis Parallel Hyper-Ellipsoid Function

$$ f\left( x \right) = \sum\limits_{i = 1}^{n} {5i} x_{i}^{2} $$

subject to −5.12 ≤ x_i ≤ 5.12. It is derived from the axis parallel hyper-ellipsoid function and is unimodal. The tests were performed with n = 10. For the function as given, the global minimum is located at the origin with f(x) = 0.

1.16 Helical Valley problem

$$ f\left( x \right) = 100\left[ {\left( {x_{2} - 10\theta } \right)^{2} + \left( {\sqrt {\left( {x_{1}^{2} + x_{2}^{2} } \right)} - 1} \right)^{2} } \right] + x_{3}^{2} $$
$$ \theta = \begin{cases} \frac{1}{2\pi}\tan^{-1}\frac{x_{2}}{x_{1}} & \text{if } x_{1} \ge 0 \\ \frac{1}{2\pi}\tan^{-1}\frac{x_{2}}{x_{1}} + \frac{1}{2} & \text{if } x_{1} < 0 \end{cases} $$

subject to −10 ≤ x1, x2, x3 ≤ 10. This is a unimodal function with n = 3: a steep-sided valley that follows a helical path. The global minimum is located at x = (1, 0, 0) with f(x) = 0.
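A Python sketch that follows the piecewise definition of θ and the formula as printed above; x1 = 0 is not covered by that definition, so the sketch takes the one-sided limit there (an assumption):

```python
import math

def helical_valley(x1, x2, x3):
    """Helical valley function as printed above; global minimum 0 at (1, 0, 0)."""
    if x1 > 0:
        theta = math.atan(x2 / x1) / (2 * math.pi)
    elif x1 < 0:
        theta = math.atan(x2 / x1) / (2 * math.pi) + 0.5
    else:
        # x1 == 0 lies outside the piecewise definition; use the one-sided limit
        theta = 0.25 if x2 >= 0 else -0.25
    r = math.sqrt(x1 * x1 + x2 * x2)
    return 100 * ((x2 - 10 * theta) ** 2 + (r - 1) ** 2) + x3 * x3
```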

1.17 Salomon’s Problem

$$ f\left( x \right) = 1 - \cos \left( {2\pi ||x||} \right) + 0.1||x|| $$

where \( ||x|| = \sqrt {\sum\nolimits_{i = 1}^{n} {x_{i}^{2} } } , \) subject to −100 ≤ x_i ≤ 100. This is a multimodal function where the number of local minima is unknown; however, the global minimum is located at x = (0, 0,…, 0) with f(x) = 0. The tests were performed with n = 10.
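A sketch, working through the Euclidean norm:

```python
import math

def salomon(x):
    """Salomon's problem: 1 - cos(2*pi*||x||) + 0.1*||x||."""
    norm = math.sqrt(sum(v * v for v in x))
    return 1 - math.cos(2 * math.pi * norm) + 0.1 * norm
```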

1.18 Powell’s Quadratic Problem

$$ f\left( x \right) = \left( {x_{1} + 10x_{2} } \right)^{2} + 5\left( {x_{3} - x_{4} } \right)^{2} + \left( {x_{2} - 2x_{3} } \right)^{4} + 10\left( {x_{1} - x_{4} } \right)^{4} $$

subject to −10 ≤ x_i ≤ 10 where i = 1, 2, 3, 4. The tests were performed with n = 4. This is a unimodal function with its minimum f(x) = 0 at x = (0, 0, 0, 0).

1.19 Bohachevsky’s Problem 1

$$ f\left( x \right) = x_{1}^{2} + 2x_{2}^{2} - 0.3\cos \left( {3\pi x_{1} } \right) - 0.4\cos \left( {4\pi x_{2} } \right) + 0.7 $$

subject to −50 ≤ x1, x2 ≤ 50 with n = 2. This is a multimodal function where the number of local minima is unknown; however, the global minimum is located at x = (0, 0) with f(x) = 0.

1.20 Wood’s Function

$$ \begin{gathered} f\left( x \right) = 100\left( {x_{2} - x_{1}^{2} } \right)^{2} + \left( {1 - x_{1} } \right)^{2} + 90\left( {x_{4} - x_{3}^{2} } \right)^{2} + \left( {1 - x_{3} } \right)^{2} + \hfill \\ 10.1\left[ {\left( {x_{2} - 1} \right)^{2} + \left( {x_{4} - 1} \right)^{2} } \right] + 19.8\left( {x_{2} - 1} \right)\left( {x_{4} - 1} \right) \hfill \\ \end{gathered} $$

subject to −100 ≤ x_i ≤ 100 where i = 1, 2, 3, 4. This is a unimodal function with a saddle near (1, 1, 1, 1); its only minimum is located at x = (1, 1, 1, 1) with f(x) = 0. The tests were performed with n = 4.
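A direct transcription of the two-line formula above:

```python
def wood(x1, x2, x3, x4):
    """Wood's function; the only minimum is 0 at (1, 1, 1, 1)."""
    return (100 * (x2 - x1 ** 2) ** 2 + (1 - x1) ** 2
            + 90 * (x4 - x3 ** 2) ** 2 + (1 - x3) ** 2
            + 10.1 * ((x2 - 1) ** 2 + (x4 - 1) ** 2)
            + 19.8 * (x2 - 1) * (x4 - 1))
```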


Cite this article

Sing, T.N., Teo, J. Evolutionary and population dynamics of 3 parents differential evolution (3PDE) using self-adaptive tuning methodologies. Nat Comput 10, 507–526 (2011). https://doi.org/10.1007/s11047-010-9194-0
