Abstract
This article proposes a new physics-based meta-heuristic optimization algorithm, named the Transient Search Optimization (TSO) algorithm. The algorithm is inspired by the transient behavior of switched electrical circuits that include storage elements such as inductors and capacitors. The exploration and exploitation abilities of the TSO algorithm are verified using twenty-three benchmark functions, where its statistical results (average and standard deviation) are compared with those of 15 recent optimization algorithms. Furthermore, the non-parametric sign test, p values, execution times, and convergence curves confirm the superiority of the TSO over the other algorithms. The TSO algorithm is also applied to the optimal design of three well-known constrained engineering problems (coil spring, welded beam, and pressure vessel). In conclusion, the comparison reveals that the TSO is a promising and highly competitive algorithm for solving different engineering problems.
1 Introduction
The randomization concept has encouraged researchers to draw on different behaviors observed in nature and to propose new meta-heuristic algorithms for solving complicated mathematical problems [1]. Nature-inspiration can be biological or physical. Bio-inspiration is based on the behavior of living creatures as they locate food and adapt to their habitats. There are two types of bio-inspired algorithms: 1) swarm-based algorithms, which imitate the group behavior of animal societies searching for food sources; and 2) evolutionary-based algorithms, which imitate the evolution of species. On the other hand, physical-inspiration is based on scientific laws and equations from disciplines such as chemistry, astronomy, electrical engineering, and natural disasters. Mathematical equations from these sciences can be randomized in a suitable manner, then tested and re-evaluated until better results are obtained, and finally applied to many engineering applications. Simplicity, robustness, computational time, and a sound formulation are the general criteria by which algorithms are compared.
In the literature, tens of meta-heuristic algorithms have been proposed in recent decades and applied to solve many engineering problems. In the swarm-based group [2], particle swarm optimization (PSO) is inspired by the flocking behavior of birds and fish [3], ant colony optimization (ACO) is inspired by the shortest-path behavior of ants between a food source and the colony [4], the bat algorithm (BA) is motivated by the echolocation method of detecting food [5], the cuckoo search (CS) algorithm is inspired by the way cuckoos explore and memorize the best nests for their eggs, and the artificial bee colony (ABC) algorithm is inspired by the collective activities of honey bees [6]. The grey wolf optimizer (GWO) is motivated by the leadership hierarchy of wolf packs in exploring and pursuing prey [7, 8], and the whale optimization algorithm (WOA) is inspired by the prey-encircling behavior of humpback whales [9,10,11]. Furthermore, many swarm-based algorithms have appeared in the last five years to enter the competition and solve several engineering problems, such as the coyote optimizer [12], sunflower optimizer [13, 14], salp swarm algorithm [15], squirrel search algorithm [16], butterfly optimization algorithm [17], pity beetle algorithm [18], moth-flame optimization algorithm [19], mouth brooding fish algorithm [20], dolphin echolocation [21], spotted hyena optimizer [22], emperor penguin optimizer [23], virus colony search [24], krill herd algorithm [25, 26], and firefly algorithm (FA) [27].
In the evolutionary-based group [28], the genetic algorithm (GA) imitates the evolutionary mechanisms of natural selection and genetics, the biogeography-based optimizer (BBO) is inspired by the evolution of species (for example, predators and prey) through migration and mutation to find the best habitat [29], differential evolution (DE) relies on the weighted difference between two population vectors [30], and fast evolutionary programming (FEP) belongs to the same family [31]. All these algorithms have been utilized in solving several engineering optimization problems.
Physics-based algorithms have also been developed for the same purpose. In this regard, many algorithms have been proposed, such as the gravitational search algorithm (GSA), which is motivated by Newton's law of gravity [32], ray optimization (RO), which imitates Snell's law of light refraction [33], charged system search (CSS), which is inspired by Coulomb's law and Newton's laws of motion [34], and colliding bodies optimization (CBO), which is motivated by elementary collisions between masses [35]. Moreover, electromagnetic field optimization [36], thermal exchange optimization [37], the ions motion algorithm [38], water evaporation optimization [39], and the water cycle algorithm [40] have been introduced as competitive algorithms.
According to the No Free Lunch theorem [41], no optimization algorithm can efficiently solve all optimization problems: an algorithm may solve one problem well while failing to solve others. Therefore, establishing new meta-heuristic optimization algorithms remains worthwhile as long as they contribute something significant to the field. Recently, a tremendous expansion of such evolutionary computation algorithms has emerged. This represents the principal impetus for the authors to present the proposed transient search optimization (TSO) algorithm.
In this paper, the TSO algorithm is inspired by the transient response of electrical circuits that include energy storage elements (inductors and capacitors). The computational complexity of the mathematical model and flowchart of the TSO algorithm is kept as low as possible. The exploitation of the TSO algorithm is examined using uni-modal benchmark functions, while its exploration is examined using multi-modal benchmark functions. The statistical results, non-parametric sign test, and convergence curves verify the superiority of the TSO algorithm over 15 other meta-heuristic algorithms. Furthermore, the TSO algorithm is applied to the optimal design of three practical engineering problems and compared with different algorithms.
The rest of the paper is arranged as follows: Section 2 presents a brief background on the transient response of first-order and second-order RLC circuits (Section 2.1), describes the TSO algorithm (Section 2.2), and verifies it using 23 benchmark functions with statistical analysis (Section 2.3). In Section 3, the TSO is applied to find the optimal design of three engineering problems. Finally, a brief conclusion is drawn in Section 4.
2 Transient Search Optimization Algorithm
2.1 Background
The complete response of an electrical circuit containing a resistance (R) and energy storage elements such as capacitors (C), inductors (L), or both (LC) comprises a transient response and a steady-state (final) response, as shown in Eq. (1). Circuits that contain a single storage element (RL or RC) are called first-order circuits, as indicated in Fig. 1, while circuits that involve two storage elements (RLC) are known as second-order circuits, as depicted in Fig. 2. Switching such a circuit cannot move it instantaneously to the next steady state, because the capacitor or inductor takes time to charge or discharge before reaching the steady-state value. The transient response of a first-order circuit is governed by the differential equation shown in Eq. (2), whose solution x(t) is given in Eq. (3). The transient response of the first-order circuit is shown in Fig. 3, where it follows exponential charging and discharging curves [42, 43].
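In textbook form (e.g., [42, 43]), the relations referenced as Eqs. (1)–(3) can be written as follows; this is a reconstruction consistent with the definitions given below, not necessarily the authors' exact typesetting:

```latex
% Standard first-order transient relations (reconstruction of Eqs. (1)-(3)).
\begin{align*}
x(t) &= x_{\mathrm{transient}}(t) + x_{\mathrm{steady\text{-}state}}(t) && (1)\\
\frac{dx(t)}{dt} + \frac{x(t)}{\tau} &= \frac{x(\infty)}{\tau} && (2)\\
x(t) &= x(\infty) + K\,e^{-t/\tau} && (3)
\end{align*}
```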
where t is the time, x(t) can be the capacitor voltage v(t) of the RC circuit or the inductor current i(t) of the RL circuit, τ is the circuit time constant (τ = RC for the RC circuit and τ = L/R for the RL circuit), K is a constant that depends on the initial value x(0), and x(∞) is the final (steady-state) value. The transient response of a second-order circuit is governed by the differential equation shown in Eq. (4). The solution of this second-order differential equation is given in Eq. (5), where the response of the RLC circuit is considered an under-damped response.
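For the second-order case, a textbook form of Eqs. (4) and (5) is sketched below; in the under-damped regime the damped frequency satisfies 2πf_d = √(ω₀² − α²). This is a reconstruction consistent with the definitions that follow, and should be checked against the original equations:

```latex
% Standard second-order (RLC) relations (reconstruction of Eqs. (4)-(5)).
\begin{align*}
\frac{d^{2}x(t)}{dt^{2}} + 2\alpha\frac{dx(t)}{dt} + \omega_{0}^{2}\,x(t) &= f(t) && (4)\\
x(t) &= e^{-\alpha t}\left[B_{1}\cos(2\pi f_{d}\,t) + B_{2}\sin(2\pi f_{d}\,t)\right] + x(\infty) && (5)
\end{align*}
```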
where α is the damping coefficient, ω0 is the resonant frequency, fd is the damped resonant frequency, and B1 and B2 are constants. The under-damped response occurs when α < ω0, which causes damped oscillations in the transient response of the RLC circuit, as shown in Fig. 3.
2.2 Inspiration of the TSO Algorithm
In this section, the TSO algorithm is modeled in three stages: 1) initialization of the search agents between the lower and upper bounds of the search space; 2) searching for the best solution (exploration); and 3) converging towards the steady state, i.e., the best solution (exploitation). Firstly, the search agents are initialized randomly as in Eq. (6). Secondly, the exploration behavior of TSO is inspired by the oscillations of the second-order RLC circuit around zero, as depicted in Fig. 3, whereas the exploitation of TSO is inspired by the exponential decay of the first-order discharge, also displayed in Fig. 3. The random number r1 is used to balance between exploration (r1 ≥ 0.5) and exploitation (r1 < 0.5). The mathematical model of the exploitation and exploration of the TSO algorithm is given in Eq. (7), which is inspired by Eq. (3) and Eq. (5). The best solution (Yl*) of the TSO algorithm imitates the steady-state (final) value x(∞) of the electrical circuit, and B1 = B2 = |Yl − C1·Yl*|.
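A written-out sketch of Eqs. (6)–(10), reconstructed to be consistent with the variable definitions given below (it should be read as a sketch and checked against the original equations), is:

```latex
% Sketch of the TSO update rules (reconstruction of Eqs. (6)-(10)).
\begin{align*}
Y_{l} &= lb + \mathrm{rand}\cdot(ub - lb) && (6)\\
Y_{l+1} &=
\begin{cases}
Y_{l}^{*} + \left(Y_{l} - C_{1}\,Y_{l}^{*}\right)e^{-T}, & r_{1} < 0.5 \;\text{(exploitation)}\\[4pt]
Y_{l}^{*} + e^{-T}\!\left[\cos(2\pi T) + \sin(2\pi T)\right]\left|Y_{l} - C_{1}\,Y_{l}^{*}\right|, & r_{1} \ge 0.5 \;\text{(exploration)}
\end{cases} && (7)\\
T &= 2\,z\,r_{2} - z && (8)\\
C_{1} &= k\,z\,r_{3} + 1 && (9)\\
z &= 2 - 2\,\frac{l}{L_{\max}} && (10)
\end{align*}
```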
where lb is the lower bound of the search space, ub is the upper bound of the search space, rand is a uniformly distributed random number, z is a variable that decreases from 2 to 0 as in Eq. (10), T and C1 are random coefficients, r1, r2, and r3 are uniformly distributed random numbers ∈ [0, 1], Yl is the position of a search agent, Yl* is the best position, l is the iteration number, k is a constant (k = 0, 1, 2, …), and Lmax is the maximum number of iterations. Furthermore, the balance between the exploration and exploitation processes is realized by the coefficient T, which varies within [−2, 2]. The exploitation process of the TSO algorithm is performed when T > 0, while the exploration process is performed when T < 0, as demonstrated in Fig. 4. The transient response indicated in Fig. 4 starts with a high value, damps to its smallest value while T > 0, and then oscillates back towards higher values when T < 0. The pseudo code of the TSO algorithm is depicted in Fig. 5. The proposed algorithm is not complex: a single equation is used for updating the positions and balancing between the exploration and exploitation procedures.
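As an illustration of the pseudo code in Fig. 5, a minimal Python sketch of the procedure is given below. It assumes the update rule of Eq. (7) with the coefficients of Eqs. (8)–(10); the function names (`tso`, `cost`) and the default value of k are illustrative choices, not part of the original paper.

```python
import numpy as np

def tso(cost, lb, ub, dim, n_agents=30, max_iter=500, k=2, seed=None):
    """Minimal sketch of the Transient Search Optimization loop.

    cost : callable mapping a 1-D position vector to a scalar fitness.
    lb, ub : scalar (or array) lower / upper bounds of the search space.
    k : the constant of Eq. (9); the value 2 is chosen here only for illustration.
    """
    rng = np.random.default_rng(seed)
    # 1) Initialization of the search agents, Eq. (6)
    Y = lb + rng.random((n_agents, dim)) * (ub - lb)
    fitness = np.apply_along_axis(cost, 1, Y)
    best_idx = fitness.argmin()
    Y_best, f_best = Y[best_idx].copy(), fitness[best_idx]

    for l in range(max_iter):
        z = 2.0 - 2.0 * l / max_iter                 # Eq. (10): decreases from 2 to 0
        for i in range(n_agents):
            r1, r2, r3 = rng.random(3)
            T = 2.0 * z * r2 - z                     # Eq. (8): T lies in [-z, z]
            C1 = k * z * r3 + 1.0                    # Eq. (9)
            if r1 < 0.5:                             # exploitation (first-order decay)
                Y[i] = Y_best + (Y[i] - C1 * Y_best) * np.exp(-T)
            else:                                    # exploration (second-order oscillation)
                Y[i] = Y_best + np.exp(-T) * (np.cos(2 * np.pi * T)
                        + np.sin(2 * np.pi * T)) * np.abs(Y[i] - C1 * Y_best)
            Y[i] = np.clip(Y[i], lb, ub)             # keep the agent inside the bounds
            f = cost(Y[i])
            if f < f_best:                           # update the best solution
                Y_best, f_best = Y[i].copy(), f
    return Y_best, f_best
```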
In addition, the computational complexity of the TSO is expressed using big-O notation. The TSO algorithm begins by initializing the search agents, then evaluates them using the cost function, and finally updates the search agents according to the function evaluations. The initialization process has complexity O(N), where N is the number of search agents. The search agents then enter the main loop, which runs for a maximum number of iterations (Lmax), so the complexity of the function evaluations of all search agents is O(N × Lmax). Finally, the complexity of updating all search agents of dimension D over all iterations is O(N × Lmax × D). Therefore, the overall computational complexity of the TSO algorithm is O(N × (Lmax·D + Lmax + 1)).
2.3 Verification of the TSO Algorithm
In this section, the robustness of the TSO algorithm is tested using 23 well-known benchmark functions. These functions are sorted into three categories [44]: uni-modal, multi-modal, and fixed-dimension multi-modal functions, as depicted in Tables 1, 2, 3. Uni-modal functions are usually used to check the exploitation ability of algorithms, while multi-modal functions are usually used to check their exploration capability. The experimental tests are performed using MATLAB R2016b, and all tests are executed on a PC (Intel(R) Core(TM) i7-3770 CPU @ 3.40 GHz (8 CPUs), 16 GB RAM, Windows 7, 64-bit). Firstly, the TSO algorithm is compared with eight well-known algorithms that are widely applied to different engineering problems: the salp swarm algorithm (SSA) [15], grey wolf optimizer (GWO) [7], whale optimization algorithm (WOA) [9], PSO, CS, DE, FEP, and GSA. For a fair comparison, all algorithms use the same population size (30), the same maximum number of iterations (500), and the same number of independent runs (30); the algorithm-specific parameters are listed in Table 4. The statistical results (average and standard deviation) of the 30 independent runs are reported in Table 5, where the results of the TSO algorithm are compared with those of the other eight algorithms. The comparison reveals that the TSO algorithm achieved a higher number of best values (15/23) than the other algorithms, so the TSO algorithm takes the 1st rank, while DE takes the 2nd rank. It can also be noticed that an algorithm that achieves a good result for the Rastrigin function (F9) often achieves a poor result for the Rosenbrock function (F5); the TSO algorithm, however, achieved good results for both functions, which indicates that it balances between the exploration and exploitation processes, as shown in Table 5. Furthermore, Schwefel's function (F8) is a challenging test function for most meta-heuristic algorithms, yet the TSO algorithm succeeded in solving it and finding its best solution. For further investigation, the TSO algorithm is compared with recently proposed algorithms, namely the sandpiper optimization algorithm (SOA) [45], hybrid sine cosine algorithm (HSCA) [46], enhanced salp swarm algorithm (ESSA) [47], augmented grey wolf optimizer (AGWO) [48], GA, ABC [49], and FA [50], as shown in Tables 6, 7, 8. This comparison shows that the TSO algorithm achieved the 1st rank of best results (17/23).
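As a sketch of this experimental protocol (30 independent runs, 30 agents, 500 iterations), the following snippet collects the mean and standard deviation on a sample uni-modal function. It reuses the illustrative `tso` function from the sketch above, and the sphere function is shown only as a representative of the benchmark set.

```python
import numpy as np

def sphere(x):
    # F1 (sphere): a typical uni-modal benchmark used to probe exploitation
    return float(np.sum(x ** 2))

runs = [tso(sphere, lb=-100.0, ub=100.0, dim=30,
            n_agents=30, max_iter=500, seed=r)[1] for r in range(30)]
print(f"mean = {np.mean(runs):.3e}, std = {np.std(runs):.3e}")
```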
Moreover, the superiority of the TSO algorithm is assessed using the non-parametric Wilcoxon signed-rank test at the 5% significance level, as shown in Table 9. The test computes the rank of every algorithm for each benchmark function, and the sum of the ranks over all 23 benchmark functions is then calculated, as shown in Table 9. This test reveals that the TSO algorithm achieves the first rank compared with the other algorithms. The p value is also calculated for each benchmark function, and the null hypothesis (which states that there is no difference between the algorithms) is rejected because all p values are less than the significance level (5%). On the other hand, the execution times of the TSO algorithm and the other algorithms are compared in Table 10. The PSO algorithm has the shortest execution time due to its simplicity, and the TSO algorithm has the second shortest, whereas the DE algorithm has the second longest execution time. Finally, Fig. 6 shows that the TSO algorithm mostly converges faster than the other algorithms (PSO, SSA, GWO, DE, CS, and WOA).
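A minimal example of such a pairwise comparison, assuming two arrays of best-fitness results from TSO and a competitor on the same benchmarks, can be carried out with SciPy; the numbers below are placeholders, not the paper's values.

```python
import numpy as np
from scipy.stats import wilcoxon

# Placeholder best-fitness values of two algorithms on the same benchmark functions
tso_results   = np.array([1.2e-30, 3.4e-18, 0.0,    2.1e-3, 4.5e-1, 1.1e-2])
other_results = np.array([5.6e-12, 8.9e-09, 1.3e-4, 6.7e-2, 9.8e-1, 3.2e-1])

stat, p_value = wilcoxon(tso_results, other_results)
print(f"W = {stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Null hypothesis rejected: the difference is significant at the 5% level.")
```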
3 Application of the TSO to Classical Engineering Problems
In this section, the TSO algorithm is examined using three well-known classical engineering design problems: tension coil spring design, welded beam design, and pressure vessel design. These are problems with multiple constraints, which assess the constraint-handling ability of the TSO algorithm. Each engineering problem is solved 30 times, with a maximum of 500 iterations and a population size of 30 for all optimization algorithms.
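Constrained problems such as these are commonly handled in meta-heuristics by folding the constraints into the fitness through a penalty term. The sketch below shows one such static-penalty wrapper; the paper does not spell out its constraint-handling scheme, so the penalty form and factor here are assumptions for illustration only.

```python
import numpy as np

def penalized(objective, constraints, factor=1e6):
    """Fold constraints of the form g_i(x) <= 0 into the fitness via a static penalty."""
    def cost(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return objective(x) + factor * violation
    return cost

# Usage sketch: minimize x0^2 + x1^2 subject to x0 + x1 >= 1 (i.e. 1 - x0 - x1 <= 0)
cost = penalized(lambda x: x[0] ** 2 + x[1] ** 2,
                 [lambda x: 1.0 - x[0] - x[1]])
print(cost(np.array([0.2, 0.3])))   # infeasible point -> heavily penalized cost
```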
3.1 Coil Spring Design
The fitness function of this engineering test problem is the minimization of the weight of a tension coil spring, as shown in Fig. 7 [51]. The minimum weight is subject to four constraints: deflection, shear stress, surge frequency, and outer diameter, as written in Eq. (11). There are three design variables: the mean coil diameter (D), the wire diameter (d), and the number of active coils (N). The TSO algorithm is employed to optimize this problem and is compared with several algorithms. The statistical results (minimum, average, and standard deviation over 30 runs) are listed in Table 11. The TSO algorithm and the interactive search algorithm (ISA) [55] offered the minimum weight compared with the other algorithms, which proves the capability and superiority of the TSO algorithm in solving this optimization problem.
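For reference, the widely used textbook formulation of this problem (e.g., [51, 92]) is sketched below; the authors' Eq. (11) is expected to follow the same form, but the constants and variable ranges should be checked against the original paper.

```latex
% Standard tension/compression spring design problem, with variables (d, D, N).
\begin{align*}
\min\ f(d,D,N) &= (N + 2)\,D\,d^{2}\\
\text{s.t.}\quad
g_{1} &= 1 - \frac{D^{3}N}{71785\,d^{4}} \le 0 && \text{(deflection)}\\
g_{2} &= \frac{4D^{2} - dD}{12566\,(Dd^{3} - d^{4})} + \frac{1}{5108\,d^{2}} - 1 \le 0 && \text{(shear stress)}\\
g_{3} &= 1 - \frac{140.45\,d}{D^{2}N} \le 0 && \text{(surge frequency)}\\
g_{4} &= \frac{D + d}{1.5} - 1 \le 0 && \text{(outer diameter)}\\
&\quad 0.05 \le d \le 2.0,\qquad 0.25 \le D \le 1.3,\qquad 2 \le N \le 15.
\end{align*}
```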
3.2 Welded Beam Design
The objective function of the welded beam design problem is the minimization of the fabrication cost, as depicted in Fig. 8 [56]. The cost function is subject to seven constraint functions, as formulated in Eq. (12). Four design variables must be optimized to minimize the cost: the weld thickness (h), the length of the attached part of the bar (l), the bar height (t), and the bar thickness (b). The TSO algorithm is applied to attain the lowest cost of the welded beam design and is compared with other algorithms. The statistical results (minimum, average, and standard deviation over 30 runs) are listed in Table 12. The TSO, ISA, and interactive fuzzy search algorithm (IFSA) [57] offered the minimum cost of the welded beam design among the compared algorithms. However, ISA and IFSA are hybrid algorithms, which take a longer execution time.
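As a reference, the objective function and variable ranges of the standard welded beam formulation (e.g., [56, 96]) are sketched below; the seven constraints of Eq. (12) (shear stress, bending stress, buckling load, end deflection, and geometric limits) are omitted here for brevity and should be taken from the original formulation.

```latex
% Objective and bounds of the standard welded beam problem, variables (h, l, t, b).
\begin{align*}
\min\ f(h,l,t,b) &= 1.10471\,h^{2}l + 0.04811\,t\,b\,(14.0 + l)\\
&\quad 0.1 \le h \le 2.0,\quad 0.1 \le l \le 10.0,\quad 0.1 \le t \le 10.0,\quad 0.1 \le b \le 2.0.
\end{align*}
```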
3.3 Pressure Vessel Design
The purpose of this design is to find the minimum fabrication cost of a cylindrical pressure vessel, as shown in Fig. 9. The objective function is subject to four constraint functions, as demonstrated in Eq. (13). There are four design variables to be optimized: the shell thickness (Ts), the head thickness (Th), the inner radius (R), and the length of the cylindrical section without the heads (L). The TSO is applied to obtain the minimum cost of the cylindrical pressure vessel design, and the results are compared with those obtained using other algorithms. The statistical results (minimum, average, and standard deviation over 30 runs) are listed in Table 13. It is worth noting that the TSO offers the minimum cost compared with the results of the other algorithms. The ISA and IFSA algorithms are hybrid algorithms, which take a longer time to find the minimum solution.
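The commonly cited form of this problem (e.g., [92, 96]) is sketched below for reference; the authors' Eq. (13) is expected to be equivalent, but the coefficients and ranges should be verified against the original paper.

```latex
% Standard pressure vessel design problem, variables (T_s, T_h, R, L).
\begin{align*}
\min\ f &= 0.6224\,T_{s}RL + 1.7781\,T_{h}R^{2} + 3.1661\,T_{s}^{2}L + 19.84\,T_{s}^{2}R\\
\text{s.t.}\quad
g_{1} &= -T_{s} + 0.0193\,R \le 0\\
g_{2} &= -T_{h} + 0.00954\,R \le 0\\
g_{3} &= -\pi R^{2}L - \tfrac{4}{3}\pi R^{3} + 1296000 \le 0\\
g_{4} &= L - 240 \le 0\\
&\quad 0 \le T_{s},\,T_{h} \le 99\times 0.0625,\qquad 10 \le R,\,L \le 200.
\end{align*}
```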
4 Conclusion
This paper has presented a novel physics-inspired algorithm called the TSO algorithm. The proposed algorithm is inspired by the transient behavior of switched electrical RLC circuits. The TSO is validated using 23 benchmark functions and three well-known classical engineering problems. The statistical results (mean, standard deviation, rank, and p values) proved the superiority and significance of the TSO compared with 15 algorithms (SSA, WOA, GWO, GSA, DE, FEP, PSO, CS, ESSA, AGWO, SOA, HSCA, GA, ABC, and FA). The convergence curves verified the fast convergence behavior of the TSO in comparison with the other algorithms. Constrained problems were used to further test the TSO algorithm, since the constraints make it much more difficult to find the optimum values. The TSO offered the minimum weight of the coil spring design, the minimum cost of the welded beam design, and the minimum cost of the cylindrical pressure vessel design. The superiority of the TSO algorithm reflects its flexibility, robustness, and proper design. For future work, the TSO algorithm will be applied in different fields, such as feature selection and optimal power flow. The simplicity of the flowchart and the quality of the results also encourage researchers to apply the TSO algorithm in other disciplines.
References
Yang X-S (2014) Random walks and optimization. In: Yang X-SBT-N-IOA (ed) Nature-inspired optimization algorithms. Elsevier, Oxford, pp 45–65
Qais M, Abdulwahid Z (2013) A new method for improving particle swarm optimization algorithm (TriPSO). In: 2013 5th international conference on modeling, simulation and applied optimization, ICMSAO
Abualigah LM, Khader AT (2017) Unsupervised text feature selection technique based on hybrid particle swarm optimization algorithm with genetic operators for the text clustering. J Supercomput 73:4773–4795. https://doi.org/10.1007/s11227-017-2046-2
Nacional C (2004) Relationship between genetic algorithms and ant colony optimization algorithms. Quality 11:1–16. https://doi.org/10.1109/MCI.2006.329691
Yang XS (2010) A new metaheuristic bat-inspired algorithm. In: González JR, Pelta DA, Cruz C et al (eds) Studies in computational intelligence. Springer, Berlin, Heidelberg, pp 65–74
Karaboga D, Basturk B (2007) A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm. J Glob Optim 39:459–471. https://doi.org/10.1007/s10898-007-9149-x
Mirjalili S, Mirjalili SM, Lewis A (2014) Grey wolf optimizer. Adv Eng Softw 69:46–61. https://doi.org/10.1016/j.advengsoft.2013.12.007
Qais MH, Hasanien HM, Alghuwainem S (2018) A Grey wolf optimizer for optimum parameters of multiple PI controllers of a grid-connected PMSG driven by variable speed wind turbine. IEEE Access 6:44120–44128. https://doi.org/10.1109/ACCESS.2018.2864303
Mirjalili S, Lewis A (2016) The whale optimization algorithm. Adv Eng Softw 95:51–67. https://doi.org/10.1016/j.advengsoft.2016.01.008
Qais MH, Hasanien HM, Alghuwainem S (2020) Whale optimization algorithm-based Sugeno fuzzy logic controller for fault ride-through improvement of grid-connected variable speed wind generators. Eng Appl Artif Intell 87. https://doi.org/10.1016/j.engappai.2019.103328
Qais MH, Hasanien HM, Alghuwainem S (2019) Enhanced whale optimization algorithm for maximum power point tracking of variable-speed wind generators. Appl Soft Comput 105937. https://doi.org/10.1016/j.asoc.2019.105937
Qais MH, Hasanien HM, Alghuwainem S, Nouh AS (2019) Coyote optimization algorithm for parameters extraction of three-diode photovoltaic models of photovoltaic modules. Energy 187:116001. https://doi.org/10.1016/j.energy.2019.116001
Qais MH, Hasanien HM, Alghuwainem S (2019) Identification of electrical parameters for three-diode photovoltaic model using analytical and sunflower optimization algorithm. Appl Energy 250:109–117. https://doi.org/10.1016/j.apenergy.2019.05.013
Gomes GF, da Cunha SS, Ancelotti AC (2019) A sunflower optimization (SFO) algorithm applied to damage identification on laminated composite plates. Eng Comput 35:619–626. https://doi.org/10.1007/s00366-018-0620-8
Mirjalili S, Gandomi AH, Mirjalili SZ, Saremi S, Faris H, Mirjalili SM (2017) Salp swarm algorithm: a bio-inspired optimizer for engineering design problems. Adv Eng Softw 114:163–191. https://doi.org/10.1016/j.advengsoft.2017.07.002
Jain M, Singh V, Rani A (2019) A novel nature-inspired algorithm for optimization: squirrel search algorithm. Swarm Evol Comput 44:148–175. https://doi.org/10.1016/j.swevo.2018.02.013
Arora S, Singh S (2019) Butterfly optimization algorithm: a novel approach for global optimization. Soft Comput 23:715–734. https://doi.org/10.1007/s00500-018-3102-4
Kallioras NA, Lagaros ND, Avtzis DN (2018) Pity beetle algorithm – a new metaheuristic inspired by the behavior of bark beetles. Adv Eng Softw 121:147–166. https://doi.org/10.1016/j.advengsoft.2018.04.007
Mirjalili S (2015) Moth-flame optimization algorithm: a novel nature-inspired heuristic paradigm. Knowledge-Based Syst 89:228–249. https://doi.org/10.1016/j.knosys.2015.07.006
Jahani E, Chizari M (2018) Tackling global optimization problems with a novel algorithm – mouth brooding fish algorithm. Appl Soft Comput J 62:987–1002. https://doi.org/10.1016/j.asoc.2017.09.035
Kaveh A, Farhoudi N (2013) A new optimization method: dolphin echolocation. Adv Eng Softw 59:53–70. https://doi.org/10.1016/j.advengsoft.2013.03.004
Dhiman G, Kumar V (2017) Spotted hyena optimizer: a novel bio-inspired based metaheuristic technique for engineering applications. Adv Eng Softw 114:48–70. https://doi.org/10.1016/j.advengsoft.2017.05.014
Dhiman G, Kumar V (2018) Emperor penguin optimizer: a bio-inspired algorithm for engineering problems. Knowledge-Based Syst 159:20–50. https://doi.org/10.1016/j.knosys.2018.06.001
Li MD, Zhao H, Weng XW, Han T (2016) A novel nature-inspired algorithm for optimization: virus colony search. Adv Eng Softw 92:65–88. https://doi.org/10.1016/j.advengsoft.2015.11.004
Abualigah LMQ (2019) Feature selection and enhanced krill herd algorithm for text document clustering. Springer, Berlin
Abualigah LM, Khader AT, Hanandeh ES (2018) Hybrid clustering analysis using improved krill herd algorithm. Appl Intell 48:4047–4071. https://doi.org/10.1007/s10489-018-1190-6
Yang XS (2009) Firefly algorithms for multimodal optimization. In: Lecture notes in computer science (including subseries lecture notes in artificial intelligence and lecture notes in bioinformatics). Springer, Berlin, Heidelberg, pp 169–178
Marcelin JL (1999) Evolutionary optimisation of mechanical structures: towards an integrated optimisation. Eng Comput 15:326–333. https://doi.org/10.1007/s003660050027
Simon D (2008) Biogeography-based optimization. IEEE Trans Evol Comput 12:702–713. https://doi.org/10.1109/TEVC.2008.919004
Storn R, Price K (1997) Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces. J Glob Optim 11:341–359. https://doi.org/10.1023/A:1008202821328
Yao X, Liu Y, Lin G (1999) Evolutionary programming made faster. IEEE Trans Evol Comput 3:82–102. https://doi.org/10.1109/4235.771163
Hasanien HM (2017) Gravitational search algorithm-based optimal control of Archimedes wave swing-based wave energy conversion system supplying a DC microgrid under uncertain dynamics. IET Renew Power Gener 11:763–770. https://doi.org/10.1049/iet-rpg.2016.0677
Kaveh A, Khayatazad M (2012) A new meta-heuristic method: ray optimization. Comput Struct 112–113:283–294. https://doi.org/10.1016/j.compstruc.2012.09.003
Kaveh A, Talatahari S (2010) A novel heuristic optimization method: charged system search. Acta Mech 213:267–289. https://doi.org/10.1007/s00707-009-0270-4
Kaveh A, Mahdavi VR (2014) Colliding bodies optimization: a novel meta-heuristic method. Comput Struct 139:18–27. https://doi.org/10.1016/j.compstruc.2014.04.005
Abedinpourshotorban H, Mariyam Shamsuddin S, Beheshti Z, Jawawi DNA (2016) Electromagnetic field optimization: a physics-inspired metaheuristic optimization algorithm. Swarm Evol Comput 26:8–22. https://doi.org/10.1016/j.swevo.2015.07.002
Kaveh A, Dadras A (2017) A novel meta-heuristic optimization algorithm: thermal exchange optimization. Adv Eng Softw 110:69–84. https://doi.org/10.1016/j.advengsoft.2017.03.014
Javidy B, Hatamlou A, Mirjalili S (2015) Ions motion algorithm for solving optimization problems. Appl Soft Comput J 32:72–79. https://doi.org/10.1016/j.asoc.2015.03.035
Kaveh A, Bakhshpoori T (2016) Water evaporation optimization: a novel physically inspired optimization algorithm. Comput Struct 167:69–85. https://doi.org/10.1016/j.compstruc.2016.01.008
Eskandar H, Sadollah A, Bahreininejad A, Hamdi M (2012) Water cycle algorithm - a novel metaheuristic optimization method for solving constrained engineering optimization problems. Comput Struct 110–111:151–166. https://doi.org/10.1016/j.compstruc.2012.07.010
Wolpert DH, Macready WG (1997) No free lunch theorems for optimization. IEEE Trans Evol Comput 1:67–82. https://doi.org/10.1109/4235.585893
Dorf RC (2013) Introduction to electric circuits, 9th ed. John Wiley & Sons, London
Boylestad RL (1966) Introductory circuit analysis, 13th ed. Pearson, London
Digalakis JG, Margaritis KG (2001) On benchmarking functions for genetic algorithms. Int J Comput Math 77:481–506. https://doi.org/10.1080/00207160108805080
Kaur A, Jain S, Goel S (2019) Sandpiper optimization algorithm: a novel approach for solving real-life engineering problems. Appl Intell 50:582–619. https://doi.org/10.1007/s10489-019-01507-3
Gupta S, Deep K (2019) A novel hybrid sine cosine algorithm for global optimization and its application to train multilayer perceptrons. Appl Intell 50:993–1026. https://doi.org/10.1007/s10489-019-01570-w
Qais MH, Hasanien HM, Alghuwainem S (2019) Enhanced salp swarm algorithm: application to variable speed wind generators. Eng Appl Artif Intell 80:82–96. https://doi.org/10.1016/j.engappai.2019.01.011
Qais MH, Hasanien HMHM, Alghuwainem S (2018) Augmented grey wolf optimizer for grid-connected PMSG-based wind energy conversion systems. Appl Soft Comput J 69:504–515. https://doi.org/10.1016/j.asoc.2018.05.006
Yarpiz (2020). Artificial Bee Colony (ABC) in MATLAB (https://www.mathworks.com/matlabcentral/fileexchange/52966-artificial-bee-colony-abc-in-matlab), MATLAB Central File Exchange. Retrieved April 10, 2020
Yarpiz (2020). Firefly Algorithm (FA) (https://www.mathworks.com/matlabcentral/fileexchange/52900-firefly-algorithm-fa), MATLAB Central File Exchange. Retrieved April 10, 2020
Arora J (2012) Introduction to optimum design, 4th ed. Academic Press, London
He Q, Wang L (2007) An effective co-evolutionary particle swarm optimization for constrained engineering design problems. Eng Appl Artif Intell 20:89–99. https://doi.org/10.1016/j.engappai.2006.03.003
Huang F-Z, Wang L, He Q (2007) An effective co-evolutionary differential evolution for constrained optimization. Appl Math Comput 186:340–356. https://doi.org/10.1016/j.amc.2006.07.105
Rashedi E, Nezamabadi-pour H, Saryazdi S (2009) GSA: a gravitational search algorithm. Inf Sci (Ny) 179:2232–2248. https://doi.org/10.1016/j.ins.2009.03.004
Mortazavi A, Toğan V, Nuhoğlu A (2018) Interactive search algorithm: a new hybrid metaheuristic optimization algorithm. Eng Appl Artif Intell 71:275–292. https://doi.org/10.1016/j.engappai.2018.03.003
Coello Coello CA (2000) Use of a self-adaptive penalty approach for engineering optimization problems. Comput Ind 41:113–127. https://doi.org/10.1016/S0166-3615(99)00046-9
Mortazavi A (2019) Interactive fuzzy search algorithm: a new self-adaptive hybrid optimization algorithm. Eng Appl Artif Intell 81:270–282. https://doi.org/10.1016/j.engappai.2019.03.005
Acknowledgments
The authors would like to thank the Deanship of Scientific Research, King Saud University for funding and supporting this research through the initiative of graduate students research support (GSR).