Abstract
The water flow optimizer (WFO) is the latest swarm intelligence algorithm inspired by the shape of water flow. Its simplicity, efficiency, and robust performance motivated us to enhance it further. In this paper, we introduce fractional-order (FO) calculus, which has memory properties, into WFO, yielding the fractional-order water flow optimizer (FOWFO). To verify the superior performance and practicality of FOWFO, we compared it with nine state-of-the-art algorithms on benchmark functions from the IEEE Congress on Evolutionary Computation 2017 (CEC2017) and on four real-world optimization problems with large dimensions. Additionally, tuning adjustments were made for two crucial parameters within the fractional-order framework. Finally, we analyze the balance between exploration and exploitation within FOWFO and its algorithm complexity.
1 Introduction
The classification of meta-heuristic algorithms is typically based on their inspiration sources [1,2,3,4,5]. In this paper, we categorize meta-heuristic algorithms into two groups based on the number of offspring individuals [6]: single-solution-based optimization algorithms and population-based optimization algorithms. Single-solution-based optimization algorithms require only one individual to search for a solution, such as large neighborhood search, tabu search and variable neighborhood search. On the other hand, population-based optimization algorithms involve multiple individuals searching for the optimal solution in the global space through operators, such as crossover, mutation, and selection. Population-based algorithms can be further divided into swarm intelligence (SI) [7, 8], evolutionary algorithms (EA) [9, 10], and algorithms based on physics or chemistry [11]. Swarm intelligence includes both human-related algorithms and non-human algorithms.
We primarily focus on population-based optimization algorithms. To find the global optimal solution through cooperation among multiple individuals, the exploration and exploitation of an algorithm must be balanced [23, 24]. Exploration involves searching for multiple valuable solutions by distributing different individuals across various locations within the entire search space. On the other hand, exploitation entails individuals seeking better solutions in the vicinity of valuable solutions. Exploration increases the diversity of the population, preventing the algorithm from falling into local optima, while exploitation enables the algorithm to converge quickly towards the optimal solution. Achieving a balance between exploration and exploitation enables the algorithm to find the global optimal solution and accelerate convergence.
Water is abundant in nature, and there are many meta-heuristic algorithms inspired by water, as presented in Table 1. The water flow optimizer (WFO) is a novel global optimization algorithm inspired by the shape of water flow in nature. In [22], WFO was proposed for the first time and successfully applied to spacecraft trajectory optimization. In [25], a binary version of WFO was developed to solve the feature selection problem, and experimental results demonstrated the effectiveness of the binary water flow optimizer (BWFO) for this problem. In [26], strategies such as the fusion of a Halton sequence and Cauchy mutation were introduced to improve the convergence speed and convergence ability of WFO. WFO is a recently proposed algorithm, so few improved versions of it exist; given its strong search performance, more WFO variants are likely to emerge in the future.
In recent years, fractional-order calculus has garnered significant attention from researchers. In [27], the fractional-order genetic algorithm (FOGA) achieved higher efficiency and accuracy than the genetic algorithm (GA), a random algorithm (RA), and particle swarm optimization (PSO) in the parameter optimization of ecological systems. In [28], a fractional controller was added to augmented Lagrangian particle swarm optimization (ALPSO) to improve convergence speed when optimizing fixed-structure controllers; simulation results showed that augmented Lagrangian particle swarm optimization with fractional-order velocity (ALPSOFV) achieved good results. In [29], fractional-order Darwinian particle swarm optimization (FO-DPSO) was applied to the estimation of plane wave parameters, and the experimental results closely matched the expected values, verifying the accuracy of the scheme. In [30], FO-DPSO was used for line loss minimization and voltage deviation problems, and its results outperformed state-of-the-art algorithms. In [31], fractional calculus was incorporated into the bidirectional least mean square algorithm to form the fractional-order bidirectional least mean square algorithm (FOBLMS), which was applied to global positioning system (GPS) receivers and verified against existing beamforming algorithms. In [32], fractional-order particle swarm optimization (FOPSO) was used to optimize the multi-objective core reload pattern and was found to be robust and efficient. In [33], a hybrid of the autoregressive fractionally integrated moving average (ARFIMA) model and long short-term memory (LSTM) networks achieved better prediction accuracy in stock market forecasting. In [34], fractional chaotic maps were added to an enhanced whale optimization algorithm (EWOA), improving its accuracy in the parameter identification of isolated wind-diesel power systems.
In [35], fractional calculus (FC) was introduced into particle swarm optimization (PSO) to improve convergence speed and enhance the memory effect; the resulting fractional particle swarm optimization gravitational search algorithm (FPSOGSA) was applied to the optimal reactive power dispatch (ORPD) problem and obtained the best results compared with state-of-the-art counterparts. In [36], complex-order particle swarm optimization (CoPSO) was proposed by introducing conjugate-order derivatives into PSO; experimental results showed that CoPSO outperforms fractional-order particle swarm optimization (FOPSO). In [37], the velocity of the bat algorithm (BA) was updated through fractional calculus to improve the algorithm's ability to escape local solutions. In [38], FO-DPSO combined with an artificial neural network (ANN) was used to compute solutions of the corneal shape model, achieving more accurate solutions. In [39], introducing Shannon entropy into FOPSO solved the ORPD problem better. In [40], the memory of fractional calculus was used to capture the cuckoos' movement history, enabling cuckoo search (CS) to escape local minima and converge quickly to optimal solutions; the fractional-order cuckoo search (FO-CS) was applied to the parameter identification of a financial system, yielding more accurate and consistent results. In [41], the memory feature of fractional order (FO) was used to enhance the local search ability of the flower pollination algorithm (FPA), and experiments showed that the fractional-order flower pollination algorithm (FO-FPA) improves solution quality and accelerates convergence. In [42], fractional-order chaos maps were introduced into FPA, and their memory advantage and new dynamical distribution were used to adaptively adjust parameters.
After several rounds of validation, the fractional chaotic flower pollination algorithm showed the highest accuracy and convergence speed. In [43], fractional-order chaos maps were used to generate the initial population of the Harris hawks optimizer (HHO), helping the optimization converge to the global optimum. In [44], fractional long-term memory was used to recalculate the transition probability of the ant colony algorithm; experiments proved that the fractional-order ant colony algorithm (FACA) has better search ability. In [45], the history dependency of fractional calculus (FC) was used to improve the exploitation ability of the manta ray foraging optimizer (MRFO) and avoid falling into local optima; the superior performance of the fractional-order Caputo manta ray foraging optimizer (FCMRFO) was demonstrated on global optimization problems, constrained engineering problems, and image segmentation. In [46], the application of FO-DPSO to the identification of electrical parameters of solar photovoltaic cells achieved good results. In [47], a Caputo–Fabrizio fractional-order model was used to explore the dynamics of COVID-19 variants, and a fractional Adams–Bashforth method was used to compute the iterative solution of the model. In [48], the stability and performance of three FOPSO variants were analyzed and applied to the insulin injection optimization problem. In [49], the fractional-order dragonfly algorithm (FO-DA) was used for parameter identification of solid oxide fuel cells (SOFCs), obtaining better results than state-of-the-art approaches. In summary, FO possesses inherent advantages in long-term memory, non-locality, and weak singularity. It aids algorithms in improving convergence properties, enhancing memory effects, and increasing stability, reliability, and consistency.
According to the No-Free-Lunch theorem [50], no algorithm can successfully optimize all problems, and the same holds for WFO, which motivates us to improve it. In this paper, we introduce fractional order (FO) with memory properties to enhance the performance of WFO, and we replace the inherent laminar probability with a linearly increasing probability to balance exploration and exploitation, yielding the fractional-order water flow optimizer (FOWFO). In the experimental section, the superior performance of FOWFO is verified through experimental and statistical comparisons with nine other algorithms on the IEEE CEC2017 functions, and its practicality is demonstrated on four real-world optimization problems. Finally, the parameters, the balance between exploration and exploitation, and the algorithm complexity of FOWFO are discussed and analyzed, providing directions for further improvement and application.
The main contributions of the present study are summarized as follows: (1) fractional-order technology can significantly improve the algorithm's ability to optimize real-world problems with large dimensions; (2) fractional order has little effect on algorithm complexity.
The rest of this paper is organized as follows: Sect. 2 introduces the original WFO and fractional-order calculus; Sect. 3 proposes FOWFO; Sect. 4 presents comparative experiments and statistical analysis; Sect. 5 discusses the parameters, the balance between exploration and exploitation, and algorithm complexity; Sect. 6 draws conclusions and outlines future work.
2 Preliminaries
2.1 Water Flow Optimizer
The WFO algorithm consists of two operators, the laminar operator and the turbulent operator, which simulate two hydraulic phenomena of water particles flowing from highlands to lowlands [51]: regular laminar flow and irregular turbulent flow in hydraulic systems.
In WFO, there are N water particles in total, and each water particle \(X_{i}\) is expressed as: \(X_{i}=\left( x_{i}^{1}, x_{i}^{2}, \ldots , x_{i}^{d}\right)\), \(i \in \{1,2, \ldots , N\}\), where \(x_{i}^d\) represents the position of ith water particle in the dth dimension.
2.1.1 Laminar Operator
When the water velocity is small, the water particles move regularly in parallel straight lines in their respective layers; this regular flow is called laminar flow. By simulating this phenomenon, the laminar operator is designed in the mathematical model of WFO.
In laminar flow, the velocity of water particles is different in different layers, and particles in layers away from walls or obstacles are faster than particles in layers close to walls or obstacles. The laminar operator is modeled by the following equations:
where \(X_i(t)\) is the position of the ith particle at the tth iteration, and \(X_i(t+1)\) is its position at the (\(t+1\))th iteration. s represents the shifting coefficient of the ith particle, a random number between 0 and 1. The vector \(\vec {d}\) represents the common movement direction of all particles, determined by the position of the current best particle \(X_{\text {best}}\) and the position of a randomly selected particle \(X_{k}\).
During the same iteration, \(\vec {d}\) is constant, ensuring that each particle moves in the same direction, while the shifting coefficient of each particle is generated randomly, ensuring that each particle has a different shift.
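The display equations of the laminar operator (Eqs. (1)-(2)) do not survive in this version; from the symbol definitions above, they take the following form (our reconstruction from the surrounding description, not a verbatim copy of the paper's equations):

```latex
X_i(t+1) = X_i(t) + s \cdot \vec{d}, \qquad
\vec{d} = X_{\text{best}}(t) - X_k(t)
```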
2.1.2 Turbulent Operator
When rapid water flow hits obstacles, local oscillations and even eddies occur. In the mathematical model, each dimension of the problem to be solved is regarded as a layer of water flow, so transformations between dimensions can simulate the irregular motion of particles in turbulent flow. Therefore, the new position \(X_i(t+1)\) of a water particle in the turbulent operator is generated by oscillation in a randomly selected dimension, as:
where \(q_{1}\) and \(q_{2}\) are randomly selected distinct dimensions from the d dimensions, i.e., \(q_{1} \in \{1,2, \ldots , d\}\), \(q_{2} \in \{1,2, \ldots , d\}\) and \(q_{1} \ne q_{2}\). m is the mutation value. k is a randomly selected individual from the N particles, i.e., \(k \in \{1,2, \ldots , N\}\) and \(k \ne i\). r is a random number in [0, 1]. \(p_{e}\) represents the eddying probability in the range (0, 1). \(\psi\) is the eddying transformation, and \(\varphi\) is the over-layer moving transformation.
The eddying transformation equations are as follows:
where \(\theta\) is a random number in \([-\pi , \pi ]\). \(\beta\) is the shear force of the kth particle to the ith particle.
The over-layer moving transformation equation is
where lb and ub represent the lower and upper bounds of the search space, respectively.
The flowchart of WFO is shown in Fig. 1. \(T_{max}\) is the maximum number of iterations. The laminar probability \(p_{l} \in (0, 1)\) controls whether the algorithm implements laminar operator or turbulent operator. The eddying probability \(p_{e}\) controls whether the turbulent operator performs eddying transformation or over-layer moving transformation. Finally, the global optimal solution \(X_{\text {best}}\) is output.
2.2 Fractional-order Calculus
There are several definitions of fractional-order (FO) calculus in mathematics. In this paper, we adopt the Grünwald–Letnikov (GL) definition, whose mathematical formulation is as follows [52]:
where \(D^\epsilon (x(t))\) is the GL fractional derivative of order \(\epsilon\). \(\Gamma\) represents gamma function.
In discrete-time implementation, Eq. (8) can be formulated as [53]:
where T is the sampling period. e is the number of terms from memory or previous events.
When the derivative order coefficient \(\epsilon\) equals 1, Eq. (10) becomes
where \(D^1[x(t)]\) is the difference between two consecutive events.
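For reference, the Grünwald–Letnikov expressions that Eqs. (8), (10) and (11) denote can be written out from the standard GL definition (our reconstruction, consistent with the symbol descriptions above):

```latex
% Continuous GL derivative of order \epsilon (Eq. (8)):
D^{\epsilon}[x(t)] = \lim_{h \to 0} \frac{1}{h^{\epsilon}}
  \sum_{k=0}^{\infty} (-1)^{k}
  \frac{\Gamma(\epsilon+1)}{\Gamma(k+1)\,\Gamma(\epsilon-k+1)}\, x(t-kh)

% Discrete-time truncation with sampling period T and e memory terms (Eq. (10)):
D^{\epsilon}[x(t)] \approx \frac{1}{T^{\epsilon}}
  \sum_{k=0}^{e} (-1)^{k}
  \frac{\Gamma(\epsilon+1)}{\Gamma(k+1)\,\Gamma(\epsilon-k+1)}\, x(t-kT)

% Integer-order special case (Eq. (11)), with \epsilon = 1 and T = 1:
D^{1}[x(t+1)] = x(t+1) - x(t)
```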
3 Proposed FOWFO
WFO uses eddying transformation and over-layer moving transformation in the turbulent operator to increase population diversity and improve global exploration ability. To balance exploration and exploitation, we enhance the laminar operator of WFO based on fractional-order (FO), which improves the local exploitation ability of the algorithm. At the same time, we replace the laminar probability \(p_{l}\) in WFO with the control parameter \(\text {Coef}=t/T_{max}\) to balance the exploration and exploitation of the algorithm.
3.1 Enhancing the Laminar Operator of the Water Flow Optimizer Based on FO
Using the memory property of FO over previous events, FO is added to the laminar operator to improve solution accuracy and convergence speed by sharing information among solutions in the exploitation stage.
According to Eq. (11) in the fractional order definition, when the derivative order coefficient \(\epsilon =1\), the position update Eq. (1) of the laminar operator in WFO can be rewritten as:
On the GL general definition, for any \(\epsilon\), Eq. (12) can become
Substituting Eq. (13) into Eq. (10) with \(T=1\) gives
Rearranging terms on both sides, Eq. (14) becomes
When we set the first two terms (\(e=2\)) of memory, the position of FOWFO is updated as below:
When we set the first four terms (\(e=4\)) of memory,
When \(e=8\),
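Equations (12)-(18) are not reproduced above. Under the GL truncation with \(T=1\), the fractional laminar update with e memory terms follows the standard pattern below (our reconstruction; the coefficients are the GL binomial weights, consistent with the discrete form of Eq. (10)):

```latex
% General fractional laminar update with e memory terms:
X_i(t+1) = \sum_{k=1}^{e} (-1)^{k+1}
  \frac{\Gamma(\epsilon+1)}{\Gamma(k+1)\,\Gamma(\epsilon-k+1)}\, X_i(t-k+1)
  + s \cdot \vec{d}

% e.g. the first two memory terms (e = 2):
X_i(t+1) = \epsilon\, X_i(t) + \tfrac{1}{2}\,\epsilon(1-\epsilon)\, X_i(t-1) + s \cdot \vec{d}
```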
3.2 Linear Increase of Laminar Probability (\(p_{l}\))
In WFO, the laminar probability \(p_{l}\) is set as a constant, and we replace the constant \(p_{l}\) with \(\text {Coef}=t/T_{max}\). When \(rand < \text {Coef}\), FOWFO executes the laminar operator; otherwise, it executes the turbulent operator. As the number of iterations increases, the value of \(t/T_{max}\) increases linearly from \(1/T_{max}\) to 1. At the early stage of iteration, the value of \(\text {Coef}\) is small, and the algorithm is more inclined to global exploration. At the later stage of iteration, the value of \(\text {Coef}\) is large, and the algorithm is more inclined to local exploitation. The whole process balances exploration and exploitation.
The flowchart of FOWFO is in Fig. 2, and the pseudocode is in Algorithm 1. Due to the memory property of the fractional order, the populations of the last e iterations are recorded in memory according to a first-in-first-out rule.
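Since Fig. 2 and Algorithm 1 are not reproduced here, the overall loop can be illustrated with a minimal Python sketch. This is our own simplified rendering, not the authors' implementation: the turbulent operator is reduced to a per-dimension perturbation, greedy selection is assumed, and the names (`fowfo`, `gl_coeffs`) and parameter defaults are ours. The fractional laminar update keeps the last `e` populations in a FIFO memory and weights them with the truncated GL coefficients.

```python
import random
from collections import deque

def gl_coeffs(eps, e):
    """Truncated GL weights (-1)**(k+1) * C(eps, k) for k = 1..e."""
    coeffs, c = [], 1.0
    for k in range(1, e + 1):
        c *= (eps - k + 1) / k            # running binomial coefficient C(eps, k)
        coeffs.append((-1) ** (k + 1) * c)
    return coeffs

def fowfo(f, dim, lb, ub, n=30, t_max=200, eps=0.9999999, e=4, pe=0.7, seed=0):
    """Minimal FOWFO-style loop: fractional laminar operator + simplified turbulence."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(n)]
    fit = [f(x) for x in pop]
    best = min(range(n), key=fit.__getitem__)
    hist = deque([[row[:] for row in pop]], maxlen=e)  # FIFO memory of past populations
    w = gl_coeffs(eps, e)
    for t in range(1, t_max + 1):
        coef = t / t_max                  # linearly increasing laminar probability
        if rng.random() < coef:           # laminar operator with fractional memory
            k = rng.randrange(n)
            d = [pop[best][j] - pop[k][j] for j in range(dim)]
            new_pop = []
            for i in range(n):
                s = rng.random()          # shifting coefficient of the ith particle
                new_pop.append([
                    sum(w[m] * hist[-1 - m][i][j] for m in range(len(hist))) + s * d[j]
                    for j in range(dim)
                ])
        else:                             # turbulent operator (simplified sketch)
            new_pop = [row[:] for row in pop]
            for i in range(n):
                q1 = rng.randrange(dim)
                k = rng.randrange(n)
                if rng.random() < pe:     # crude stand-in for the eddying transformation
                    new_pop[i][q1] = pop[k][q1]
                else:                     # stand-in for over-layer moving: resample in bounds
                    new_pop[i][q1] = lb + rng.random() * (ub - lb)
        for i in range(n):                # clamp to bounds, then greedy selection
            xi = [min(max(v, lb), ub) for v in new_pop[i]]
            fi = f(xi)
            if fi < fit[i]:
                pop[i], fit[i] = xi, fi
        best = min(range(n), key=fit.__getitem__)
        hist.append([row[:] for row in pop])
    return pop[best], fit[best]
```

Because selection is greedy, the best fitness is monotonically non-increasing, which makes the sketch easy to sanity-check on a sphere function.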
3.3 Advantages of FOWFO
In this paper, FOWFO exhibits the following advantages:
(1) The fractional-order water flow optimizer (FOWFO) is derived through rigorous mathematical reasoning. FOWFO possesses the fractional-order advantages of long-term memory, non-locality, and weak singularity.
(2) In the experimental section, FOWFO demonstrates excellent performance on large-dimensional real-world problems.
4 Experiments
4.1 Experimental Settings
To verify the performance of the algorithm, we compared and analyzed the results of the proposed algorithm and state-of-the-art algorithms on the IEEE Congress on Evolutionary Computation 2017 (CEC2017) benchmark functions [54]. IEEE CEC2017 has 30 functions: F1-F3 are unimodal, F4-F10 are multimodal, F11-F20 are hybrid, and F21-F30 are composition functions. Because the optimization results on F2 are unstable, F2 is not used as a test function.
The basic parameters of all algorithms on the IEEE CEC2017 functions are set as follows: the population size (N) is 100, the upper and lower boundaries of the search space are 100 and -100 respectively, the maximum number of function evaluations is \(10000 \times D\), where D is the dimension, and each algorithm runs 51 times independently on each function. All experiments are implemented in MATLAB R2021b on a PC with a 2.60GHz Intel(R) Core(TM) i7-9750H CPU and 16GB RAM.
4.2 Performance Evaluation Criteria
To evaluate the performance of the algorithm, in this paper the experimental data and statistical data are processed according to the following criteria:
(1) The mean and standard deviation (std) in the IEEE CEC2017 experimental data tables are calculated from the optimization errors between the obtained optimal values and the known global optimal values. The best mean values are highlighted in \(\textbf{boldface}\).
(2) Non-parametric statistical tests include the Wilcoxon rank-sum test [55] and the Friedman test [56]. The Wilcoxon rank-sum test uses the optimization errors to detect whether there is a significant difference (\(\alpha =0.05\)) between the proposed algorithm and a compared algorithm. The symbol “\(+\)” indicates that the proposed algorithm is significantly better than its competitor, the symbol “−” that it is significantly worse, and “\(\approx\)” that there is no significant difference. “W/T/L” reports on how many functions the proposed algorithm wins, ties and loses against its competitor.
In the Friedman test, the mean values of the optimization errors are employed as test data. A smaller Friedman rank indicates better algorithm performance. The minimum value is highlighted in \(\textbf{boldface}\).
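The mean Friedman ranks reported later can be computed directly from a table of mean errors. The sketch below is illustrative only; the function name `friedman_mean_ranks` and the toy data in the usage note are ours, not from the paper:

```python
def friedman_mean_ranks(errors):
    """Mean Friedman rank per algorithm from a table of mean errors.

    errors: dict mapping algorithm name -> list of mean errors, one per
    benchmark function. Lower error gets a lower (better) rank; ties
    receive the average of the tied rank positions.
    """
    algs = list(errors)
    n_funcs = len(next(iter(errors.values())))
    total = {a: 0.0 for a in algs}
    for j in range(n_funcs):
        ordered = sorted(algs, key=lambda a: errors[a][j])
        i = 0
        while i < len(ordered):
            k = i
            # extend the tie group while error values are equal
            while k + 1 < len(ordered) and errors[ordered[k + 1]][j] == errors[ordered[i]][j]:
                k += 1
            avg_rank = (i + k) / 2 + 1    # ranks are 1-indexed
            for m in range(i, k + 1):
                total[ordered[m]] += avg_rank
            i = k + 1
    return {a: total[a] / n_funcs for a in algs}
```

For example, with `{"A": [1.0, 2.0], "B": [2.0, 1.0], "C": [3.0, 3.0]}` the mean ranks are A: 1.5, B: 1.5, C: 3.0, so A and B tie for first.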
(3) Box-and-whisker diagrams show the robustness and accuracy of the solutions. The lower edge, red line and upper edge of the blue box denote the first quartile, the median and the third quartile, respectively. The height of the box indicates the fluctuation of the solutions, and the median indicates their average level. The lines above and below the blue box represent the maximum and minimum non-outliers, respectively, and the red symbol “\(+\)” marks outliers.
(4) Convergence graphs intuitively display the convergence speed and accuracy of the algorithm optimization process.
(5) For the real-world optimization problems with large dimensions, smaller mean, std, best and worst values are better, and the best values are highlighted in \(\textbf{boldface}\).
4.3 Comparison for Competitive Algorithms
To verify the optimization performance of FOWFO, we tested FOWFO, WFO [22], FCMRFO [45], FOFPA [41], the spherical search algorithm (SS) [57], spherical evolution (SE) [58], the chameleon swarm algorithm (CSA) [59], an expanded particle swarm optimization (XPSO) [60], teaching-learning-based artificial bee colony (TLABC) [61] and the artificial hummingbird algorithm (AHA) [62] on the 29 IEEE CEC2017 benchmark functions with 10, 30, 50 and 100 dimensions, where XPSO is an expanded variant of particle swarm optimization (PSO) and TLABC is a hybrid of teaching-learning-based optimization (TLBO) and the artificial bee colony (ABC). Their parameter settings are in Table 2, and the experimental and statistical results are in Tables 3, 4, 5, 6 and 7. Note that the best results among all compared methods are shown in bold in the following tables.
The experimental and statistical results of FOWFO and the other nine algorithms on the IEEE CEC2017 benchmark functions with 10 dimensions are shown in Table 4. From the table, FOWFO has the best mean values on 10 functions and SS on 11 functions, so the number of best mean values of FOWFO ranks second. However, the statistical results (W/T/L) show that FOWFO beats WFO, FCMRFO, FOFPA, SS, SE, CSA, XPSO, TLABC and AHA on 15, 22, 29, 14, 24, 24, 21, 16 and 18 functions, respectively. Therefore, FOWFO performs best on the IEEE CEC2017 functions with 10 dimensions. The results with 30 dimensions are in Table 5: FOWFO, WFO, FCMRFO, FOFPA, SS, SE, CSA, XPSO, TLABC and AHA gain the best mean values on 15, 4, 0, 0, 4, 1, 0, 0, 5 and 0 functions, respectively, and from the W/T/L, FOWFO is better than the other algorithms on 14, 27, 29, 22, 27, 23, 22, 17 and 26 functions, respectively. These results indicate that FOWFO has the best performance on the functions with 30 dimensions. The results with 50 dimensions are presented in Table 6: FOWFO obtains the best mean values on 9 functions, ranking first, and significantly outperforms the competing algorithms on 8, 28, 29, 22, 25, 21, 20, 18 and 28 functions, respectively, which shows that FOWFO keeps its superior performance with 50 dimensions. The results with 100 dimensions are displayed in Table 7: FOWFO, WFO, FCMRFO, FOFPA, SS, SE, CSA, XPSO, TLABC and AHA find the best mean values on 8, 6, 0, 2, 5, 1, 1, 3, 3 and 0 functions, respectively, and the W/T/L indicates that FOWFO wins against the other nine algorithms on 11, 29, 28, 20, 26, 22, 22, 24 and 28 functions, respectively.
This demonstrates that FOWFO still maintains high performance on the functions with 100 dimensions.
The Friedman test in Table 3 can more intuitively show that the performance of FOWFO ranks first in every dimension, indicating that FOWFO performs best on IEEE CEC2017 functions.
Box-and-whisker diagrams and convergence graphs of the results obtained by FOWFO and the other nine algorithms on the IEEE CEC2017 functions with 10, 30, 50 and 100 dimensions are shown in Figs. 3, 4, 5 and 6, where the vertical axis of the convergence graphs denotes the logarithm of the average optimization error. In the box-and-whisker diagrams, the red median line of FOWFO is the lowest and its solution distribution is more stable, indicating superior and stable performance. In the convergence graphs, the convergence curves of FOWFO are at the lowest positions in the late stage of iteration, indicating that the average error of FOWFO is the smallest and that FOWFO retains strong exploration ability late in the search, which prevents it from falling into local optima.
4.4 Real-World Optimization Problems with Large Dimensions
Our research found that FOWFO performs well on real-world optimization problems with large dimensions. To demonstrate its practicality on such problems, FOWFO, WFO, FCMRFO, FOFPA, SS, SE, CSA, XPSO, TLABC and AHA are used to optimize the following four real-world optimization problems with large dimensions: the hydrothermal scheduling problem (HSP), the dynamic economic dispatch (DED) problem, the large scale transmission pricing problem (LSTPP), and the static economic load dispatch (ELD) problem. The dimensions of HSP, DED, LSTPP and ELD are 96, 120, 126 and 140, respectively, and their detailed descriptions can be found in [63]. The population size (N) of all algorithms is set to 100, and each algorithm runs 51 times independently on each problem. The optimization results are shown in Tables 8, 9, 10 and 11, where Mean, Std, Best and Worst denote the mean, standard deviation, minimum and maximum values, respectively.
From Tables 8, 9, 10 and 11, the Mean, Best and Worst values obtained by FOWFO on HSP, DED, LSTPP and ELD are the smallest among the ten algorithms, and the optimization values obtained by FOWFO are significantly better than those of the other algorithms. This indicates that FOWFO has superior performance on real-world optimization problems with large dimensions and can be applied to large-dimensional practical problems in the future.
5 Discussion
5.1 Analysis for Parameters of FOWFO
The two most important fractional-order parameters in this paper are the number of memory terms e and the derivative order coefficient \(\epsilon\), so the performance of FOWFO is affected by both. To analyze the sensitivity of e and \(\epsilon\), FOWFO with different values of e and \(\epsilon\) is tested on the IEEE CEC2017 functions with 30 dimensions. The experimental results of the eighteen combinations are presented in Tables 12 and 13, which show that FOWFO with \(e=12\) and \(\epsilon =0.9999999\) performs best.
5.2 Balance Between Exploration and Exploitation
To gain a more intuitive understanding of the exploration and exploitation of FOWFO, we display the distribution of the population in the solution space for the two-dimensional unimodal function (F3), multimodal function (F9), and composition function (F25) in Fig. 7. In the figure, the lines represent contour lines of fitness values, with redder lines indicating higher fitness values. The search space range for each dimension is \([-100, 100]\). The red dot signifies the current position of the individual, while the blue triangle denotes the current best individual position.
The population size is 100, and the maximum number of iterations is set at 200. In Fig. 7, at iteration \(t=1\), individuals are uniformly distributed within the solution space. For F3 and F9, as the number of iterations increases, the population gradually converges toward the optimal solution and eventually narrows down to a minimal range. This demonstrates the strong exploitation ability of FOWFO. In the case of the more complex function F25, the population explores multiple valuable regions and progressively consolidates within each region. Even at the end of the iteration, the population still maintains a certain degree of search range, indicating that FOWFO retains high exploration ability in later iterations. The search history of FOWFO population individuals on different functions demonstrates its ability to independently switch between exploration and exploitation on various problems, effectively achieving a balance between the two.
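The population snapshots in Fig. 7 can also be summarized numerically. One common proxy for the exploration/exploitation balance (not used in the paper; the function name is ours) is the mean distance of individuals to the population centroid, which is large while the population explores and shrinks toward zero as it converges during exploitation:

```python
import math

def population_diversity(pop):
    """Mean Euclidean distance of individuals to the population centroid.

    High values indicate exploration (a spread-out population); values
    shrinking toward zero indicate exploitation (a converging population).
    """
    n, d = len(pop), len(pop[0])
    centroid = [sum(x[j] for x in pop) / n for j in range(d)]
    return sum(
        math.sqrt(sum((x[j] - centroid[j]) ** 2 for j in range(d)))
        for x in pop
    ) / n
```

Tracking this quantity over iterations gives a curve that typically starts high (uniform initialization) and decays as the population concentrates around the best regions.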
5.3 Algorithm Complexity
In this subsection, the central processing unit (CPU) running times consumed by all tested algorithms on the IEEE CEC2017 functions with 10, 30, 50 and 100 dimensions and on the real-world optimization problems with large dimensions are given, where the maximum number of function evaluations is set to be the same for all algorithms on each function and problem. The CPU running times on the IEEE CEC2017 functions are shown in Table 14: WFO has the shortest CPU running time on the 10-, 30-, 50- and 100-dimensional functions, and the computational time of FOWFO is higher than that of WFO, but the increase is not substantial. The CPU running times on the real-world optimization problems are displayed in Table 15: FCMRFO has the shortest CPU running times on HSP and DED, WFO has the shortest on LSTPP and ELD, and FOWFO also has reasonable computation times on these problems. These findings indicate that FOWFO and WFO exhibit similar computational times and require very little time. Bar graphs of the CPU running times consumed by all tested algorithms are plotted in Figs. 8 and 9.
The computational complexities of FOWFO and WFO are closely aligned and reasonable, which indicates that the effect of fractional order on the algorithm complexity of WFO is small. The low computational complexity and small computation cost of FOWFO affirm its usability and indicate its potential applicability to high-dimensional and engineering problems.
6 Conclusion and Future Work
In this paper, the fractional-order water flow optimizer (FOWFO) is proposed, which incorporates fractional order into WFO to enhance its performance and uses a linearly increasing laminar probability to balance exploration and exploitation. A comparative analysis of the experimental results of FOWFO and nine state-of-the-art algorithms on the IEEE CEC2017 functions demonstrates that fractional order is effective in improving the performance of the algorithm. Experiments also show that FOWFO achieves good results on real-world optimization problems with large dimensions, and its low computational complexity and small computation cost enable FOWFO to be applied to high-dimensional practical problems.
For future work, we suggest the following: (1) the performance of FOWFO could be further enhanced, for example by introducing hydraulic drop and jump; (2) FOWFO could be applied to protein structure prediction [64, 65], solar photovoltaic parameter estimation [66, 67], dendritic neural models [68,69,70], biology [71, 72] and physics [73,74,75,76,77,78,79]; (3) fractional order (FO) could be used to improve other meta-heuristic algorithms.
Data Availability
Related data and material can be found at https://toyamaailab.github.io.
Abbreviations
- WFO: Water flow optimizer
- FOWFO: Fractional-order water flow optimizer
- EA: Evolutionary algorithms
- SI: Swarm intelligence
- FO: Fractional order
- GPS: Global positioning system
- CEC: Congress on Evolutionary Computation
- FCMRFO: Fractional-order Caputo manta ray foraging optimizer
- FOFPA: Fractional-order flower pollination algorithm
- SS: Spherical search algorithm
- SE: Spherical evolution
- CSA: Chameleon swarm algorithm
- XPSO: An expanded particle swarm optimization
- TLABC: Teaching-learning-based artificial bee colony
- AHA: Artificial hummingbird algorithm
- CPU: Central processing unit
- HSP: Hydrothermal scheduling problem
- DED: Dynamic economic dispatch problem
- LSTPP: Large scale transmission pricing problem
- ELD: Static economic load dispatch problem
References
Molina, Daniel, Poyatos, Javier, Del Ser, Javier, García, Salvador, Hussain, Amir, Herrera, Francisco: Comprehensive taxonomies of nature- and bio-inspired optimization: Inspiration versus algorithmic behavior, critical analysis and recommendations. Cognitive Computation 12(5), 897–939 (2020)
Hanif Halim, A., Ismail, I., Das, Swagatam: Performance assessment of the metaheuristic optimization algorithms: an exhaustive review. Artificial Intelligence Review 54(3), 2323–2409 (2021)
Ezugwu, Absalom E., Shukla, Amit K., Nath, Rahul, Akinyelu, Andronicus A., Agushaka, Jeffery O., Chiroma, Haruna, Muhuri, Pranab K.: Metaheuristics: a comprehensive overview and classification along with bibliometric analysis. Artificial Intelligence Review 54(6), 4237–4316 (2021)
Hare, Warren, Nutini, Julie, Tesfamariam, Solomon: A survey of non-gradient optimization methods in structural engineering. Advances in Engineering Software 59, 19–28 (2013)
Abualigah, Laith, Diabat, Ali: Advances in sine cosine algorithm: A comprehensive survey. Artificial Intelligence Review 54(4), 2567–2608 (2021)
Ma, Zhongqiang, Wu, Guohua, Suganthan, Ponnuthurai N., Song, Aijuan, Luo, Qizhang: Performance assessment and exhaustive listing of 500+ nature inspired metaheuristic algorithms. arXiv preprint arXiv:2212.09479, (2022)
Parpinelli, R.S., Lopes, H.S.: New inspirations in swarm intelligence: a survey. International Journal of Bio-Inspired Computation 3(1), 1–16 (2011)
Krause, Jonas, Cordeiro, Jelson, Parpinelli, Rafael Stubs, Lopes, Heitor Silvério: A survey of swarm algorithms applied to discrete optimization problems. In: Swarm Intelligence and Bio-Inspired Computation, pages 169–191. Elsevier, (2013)
Fonseca, Carlos M., Fleming, Peter J.: An overview of evolutionary algorithms in multiobjective optimization. Evolutionary Computation 3(1), 1–16 (1995)
Mühlenbein, H., Gorges-Schleuter, M., Krämer, O.: Evolution algorithms in combinatorial optimization. Parallel Computing 7(1), 65–85 (1988)
Biswas, Anupam, Mishra, K.K., Tiwari, Shailesh, Misra, A.K.: Physics-inspired optimization algorithms: a survey. Journal of Optimization, 2013, (2013)
Shah-Hosseini, Hamed: The intelligent water drops algorithm: a nature-inspired swarm-based optimization algorithm. Int. J. Bio-Inspired Computation 1(1–2), 71–79 (2009)
Tran, Trung Hieu, Ng, Kien Ming: A water-flow algorithm for flexible flow shop scheduling with intermediate buffers. Journal of Scheduling 14(5), 483–500 (2011)
Eskandar, Hadi, Sadollah, Ali, Bahreininejad, Ardeshir, Hamdi, Mohd: Water cycle algorithm - a novel metaheuristic optimization method for solving constrained engineering optimization problems. Computers & Structures 110–111, 151–166 (2012)
Zheng, Yu-Jun.: Water wave optimization: A new nature-inspired metaheuristic. Computers & Operations Research 55, 1–11 (2015)
Kaveh, A., Bakhshpoori, T.: Water evaporation optimization: A novel physically inspired optimization algorithm. Computers & Structures 167, 69–85 (2016)
Wedyan, Ahmad, Whalley, Jacqueline, Narayanan, Ajit: Hydrological cycle algorithm for continuous optimization problems. Journal of Optimization, 2017, (2017)
Biyanto, Totok R., Matradji, Febrianto, Henokh Y, Afdanny, Naindar, Rahman, Ahmad Hasinur, Gunawan, Kevin Sanjoyo: Rain water algorithm: Newton’s law of rain water movements during free fall and uniformly accelerated motion utilization. In: AIP Conference Proceedings, volume 2088, page 020053. AIP Publishing LLC, (2019)
Ghasemi, Mojtaba, Davoudkhani, Iraj Faraji, Akbari, Ebrahim, Rahimnejad, Abolfazl, Ghavidel, Sahand, Li, Li.: A novel and effective optimization algorithm for global optimization and its engineering applications: Turbulent flow of water-based optimization (TFWO). Engineering Applications of Artificial Intelligence 92, 103666 (2020)
Karami, Hojat, Anaraki, Mahdi Valikhan, Farzin, Saeed, Mirjalili, Seyedali: Flow direction algorithm (FDA): A novel optimization approach for solving optimization problems. Computers & Industrial Engineering 156, 107224 (2021)
Guha, Ritam, Ghosh, Soulib, Ghosh, Kushal Kanti, Cuevas, Erik, Perez-Cisneros, Marco, Sarkar, Ram: Groundwater flow algorithm: A novel hydro-geology based optimization algorithm. IEEE Access 10, 132193–132211 (2022)
Luo, Kaiping: Water flow optimizer: A nature-inspired evolutionary algorithm for global optimization. IEEE Transactions on Cybernetics 52(8), 7753–7764 (2022)
Alba, E., Dorronsoro, B.: The exploration/exploitation tradeoff in dynamic cellular genetic algorithms. IEEE Transactions on Evolutionary Computation 9(2), 126–142 (2005)
Lynn, Nandar, Suganthan, Ponnuthurai Nagaratnam: Heterogeneous comprehensive learning particle swarm optimization with enhanced exploration and exploitation. Swarm and Evolutionary Computation 24, 11–24 (2015)
de Matos Macêdo, Fagner José, da Rocha Neto, Ajalmar Rêgo: A binary water flow optimizer applied to feature selection. In: Intelligent Data Engineering and Automated Learning – IDEAL 2022, pages 94–103. Springer International Publishing, (2022)
Cheng, Mang-Mang., Zhang, Jing, Wang, De-Guang., Tan, Wei, Yang, Jing: A localization algorithm based on improved water flow optimizer and max-similarity path for 3-D heterogeneous wireless sensor networks. IEEE Sensors Journal 23(12), 13774–13788 (2023)
Yang, Xiao-Hua., Liu, Tong, Li, Yu.-Qi.: A fractional-order genetic algorithm for parameter optimization of the moisture movement in a bio-retention system. Thermal Science 23(4), 2343–2350 (2019)
Shahri, Esmat Sadat Alaviyan., Alfi, Alireza, Tenreiro Machado, J.A.: Fractional fixed-structure H∞ controller design using augmented lagrangian particle swarm optimization with fractional order velocity. Applied Soft Computing 77, 688–695 (2019)
Akbar, Sadiq, Zaman, Fawad, Asif, Muhammad, Rehman, Ata Ur, Raja, Muhammad Asif Zahoor.: Novel application of FO-DPSO for 2-D parameter estimation of electromagnetic plane waves. Neural Computing and Applications 31(8), 3681–3690 (2019)
Muhammad, Yasir, Khan, Rahimdad, Ullah, Farman, Rehman, Ata ur, Aslam, Muhammad Saeed, Raja, Muhammad Asif Zahoor.: Design of fractional swarming strategy for solution of optimal reactive power dispatch. Neural Computing and Applications 32(14), 10501–10518 (2020)
Siridhara, A.L., Ratnam, D.V.: Mitigation of multipath effects based on a robust fractional order bidirectional least mean square (FOBLMS) beamforming algorithm for GPS receivers. Wireless Personal Communications 112(2), 743–761 (2020)
Zameer, Aneela, Muneeb, Muhammad, Mirza, Sikander M., Raja, Muhammad Asif Zahoor.: Fractional-order particle swarm based multi-objective PWR core loading pattern optimization. Annals of Nuclear Energy 135, 106982 (2020)
Bukhari, Ayaz Hussain, Raja, Muhammad Asif Zahoor., Sulaiman, Muhammad, Islam, Saeed, Shoaib, Muhammad, Kumam, Poom: Fractional neuro-sequential ARFIMA-LSTM for financial market forecasting. IEEE Access 8, 71326–71338 (2020)
Mousavi, Yashar, Alfi, Alireza, Kucukdemiral, Ibrahim Beklan: Enhanced fractional chaotic whale optimization algorithm for parameter identification of isolated wind-diesel power systems. IEEE Access 8, 140862–140875 (2020)
Khan, Noor Habib, Wang, Yong, Tian, De., Raja, Muhammad Asif Zahoor., Jamal, Raheela, Muhammad, Yasir: Design of fractional particle swarm optimization gravitational search algorithm for optimal reactive power dispatch problems. IEEE Access 8, 146785–146806 (2020)
Tenreiro Machado, J.A., Pahnehkolaei, Seyed Mehdi Abedi., Alfi, Alireza: Complex-order particle swarm optimization. Communications in Nonlinear Science and Numerical Simulation 92, 105448 (2021)
Boudjemaa, Redouane, Oliva, Diego, Ouaar, Fatima: Fractional lévy flight bat algorithm for global optimisation. Int. J. Bio-Inspired Computation 15(2), 100–112 (2020)
Waseem, W., Sulaiman, Muhammad, Alhindi, Ahmad, Alhakami, Hosam: A soft computing approach based on fractional order DPSO algorithm designed to solve the corneal model for eye surgery. IEEE Access 8, 61576–61592 (2020)
Muhammad, Yasir, Khan, Rahimdad, Raja, Muhammad Asif Zahoor., Ullah, Farman, Chaudhary, Naveed Ishtiaq, He, Yigang: Design of fractional swarm intelligent computing with entropy evolution for optimal power flow problems. IEEE Access 8, 111401–111419 (2020)
Yousri, Dalia, Mirjalili, Seyedali: Fractional-order cuckoo search algorithm for parameter identification of the fractional-order chaotic, chaotic with noise and hyper-chaotic financial systems. Engineering Applications of Artificial Intelligence 92, 103662 (2020)
Yousri, Dalia, Elaziz, Mohamed Abd, Mirjalili, Seyedali: Fractional-order calculus-based flower pollination algorithm with local search for global optimization and image segmentation. Knowledge-Based Systems 197, 105889 (2020)
Yousri, Dalia, Allam, Dalia, Babu, Thanikanti Sudhakar, AbdelAty, Amr M., Radwan, Ahmed G., Ramachandaramurthy, Vigna K., Eteiba, M.B.: Fractional chaos maps with flower pollination algorithm for chaotic systems’ parameters identification. Neural Computing and Applications 32(20), 16291–16327 (2020)
Elaziz, Mohamed Abd, Yousri, Dalia, Mirjalili, Seyedali: A hybrid harris hawks-moth-flame optimization algorithm including fractional-order chaos maps and evolutionary population dynamics. Advances in Engineering Software 154, 102973 (2021)
Pu, Yi-Fei, Siarry, Patrick, Zhu, Wu-Yang, Wang, Jian, Zhang, Ni: Fractional-order ant colony algorithm: A fractional long term memory based cooperative learning approach. Swarm and Evolutionary Computation 69, 101014 (2022)
Yousri, Dalia, AbdelAty, Amr M., Al-qaness, Mohammed A.A., Ewees, Ahmed A., Radwan, Ahmed G., Elaziz, Mohamed Abd: Discrete fractional-order Caputo method to overcome trapping in local optima: Manta ray foraging optimizer as a case study. Expert Systems with Applications 192, 116355 (2022)
Ahmed, Waleed Abd El Maguid, Abdel Mageed, Hala M., Mohamed, Samah AbdEltwab, Saleh, Amr A.: Fractional order darwinian particle swarm optimization for parameters identification of solar PV cells and modules. Alexandria Engineering Journal 61(2), 1249–1263 (2022)
Baba, Isa Abdullahi, Rihan, Fathalla A.: A fractional-order model with different strains of COVID-19. Physica A: Statistical Mechanics and its Applications 603, 127813 (2022)
Pahnehkolaei, Seyed Mehdi Abedi., Alfi, Alireza, Tenreiro Machado, J.A.: Analytical stability analysis of the fractional-order particle swarm optimization algorithm. Chaos, Solitons & Fractals 155, 111658 (2022)
Guo, Haibing, Wei, Gu., Khayatnezhad, Majid, Ghadimi, Noradin: Parameter extraction of the SOFC mathematical model based on fractional order version of dragonfly algorithm. International Journal of Hydrogen Energy 47(57), 24059–24068 (2022)
Wolpert, D.H., Macready, W.G.: No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation 1(1), 67–82 (1997)
Abbott, Michael B., Minns, Anthony W.: Computational Hydraulics, 2nd edn. Routledge (2017)
Podlubny, Igor: An introduction to fractional derivatives, fractional differential equations, to methods of their solution and some of their applications. Math. Sci. Eng 198, 261–300 (1999)
Ostalczyk, Piotr: Discrete fractional calculus: applications in control and image processing, vol. 4. World Scientific (2015)
Awad, N.H., Ali, M.Z., Liang, J.J., Qu, B.Y., Suganthan, P.N: Problem definitions and evaluation criteria for the CEC 2017 special session and competition on single objective bound constrained real-parameter numerical optimization. In: Technical Report. Nanyang Technological University Singapore, (2016)
Luengo, Julián, García, Salvador, Herrera, Francisco: A study on the use of statistical tests for experimentation with neural networks: Analysis of parametric test conditions and non-parametric tests. Expert Systems with Applications 36(4), 7798–7808 (2009)
Carrasco, J., García, S., Rueda, M.M., Das, S., Herrera, F.: Recent trends in the use of statistical tests for comparing swarm and evolutionary computing algorithms: Practical guidelines and a critical review. Swarm and Evolutionary Computation 54, 100665 (2020)
Kumar, Abhishek, Misra, Rakesh Kumar, Singh, Devender, Mishra, Sujeet, Das, Swagatam: The spherical search algorithm for bound-constrained global optimization problems. Applied Soft Computing 85, 105734 (2019)
Tang, Deyu: Spherical evolution for solving continuous optimization problems. Applied Soft Computing 81, 105499 (2019)
Braik, Malik Shehadeh: Chameleon swarm algorithm: A bio-inspired optimizer for solving engineering design problems. Expert Systems with Applications 174, 114685 (2021)
Xia, Xuewen, Gui, Ling, He, Guoliang, Wei, Bo., Zhang, Yinglong, Fei, Yu., Hongrun, Wu., Zhan, Zhi-Hui.: An expanded particle swarm optimization based on multi-exemplar and forgetting ability. Information Sciences 508, 105–120 (2020)
Chen, Xu., Bin, Xu., Mei, Congli, Ding, Yuhan, Li, Kangji: Teaching-learning-based artificial bee colony for solar photovoltaic parameter estimation. Applied Energy 212, 1578–1588 (2018)
Zhao, Weiguo, Wang, Liying, Mirjalili, Seyedali: Artificial hummingbird algorithm: A new bio-inspired optimizer with its engineering applications. Computer Methods in Applied Mechanics and Engineering 388, 114194 (2022)
Das, Swagatam, Suganthan, Ponnuthurai N: Problem definitions and evaluation criteria for CEC 2011 competition on testing evolutionary algorithms on real world optimization problems. Jadavpur University, Nanyang Technological University, Kolkata, pages 341–359, (2010)
Lei, Zhenyu, Gao, Shangce, Zhang, Zhiming, Zhou, MengChu, Cheng, Jiujun: MO4: A many-objective evolutionary algorithm for protein structure prediction. IEEE Transactions on Evolutionary Computation 26(3), 417–430 (2022)
Zhang, Yu., Gao, Shangce, Cai, Pengxing, Lei, Zhenyu, Wang, Yirui: Information entropy-based differential evolution with extremely randomized trees and lightGBM for protein structural class prediction. Applied Soft Computing 136, 110064 (2023)
Gao, Shangce, Wang, Kaiyu, Tao, Sichen, Jin, Ting, Dai, Hongwei, Cheng, Jiujun: A state-of-the-art differential evolution algorithm for parameter estimation of solar photovoltaic models. Energy Conversion and Management 230, 113784 (2021)
Yang, Yu., Gao, Shangce, Zhou, MengChu, Wang, Yirui, Lei, Zhenyu, Zhang, Tengfei, Wang, Jiahai: Scale-free network-based differential evolution to solve function optimization and parameter estimation of photovoltaic models. Swarm and Evolutionary Computation 74, 101142 (2022)
Gao, Shangce, Zhou, Mengchu, Wang, Yirui, Cheng, Jiujun, Yachi, Hanaki, Wang, Jiahai: Dendritic neuron model with effective learning algorithms for classification, approximation, and prediction. IEEE Transactions on Neural Networks and Learning Systems 30(2), 601–614 (2019)
Gao, Shangce, Zhou, MengChu, Wang, Ziqian, Sugiyama, Daiki, Cheng, Jiujun, Wang, Jiahai, Todo, Yuki: Fully complex-valued dendritic neuron model. IEEE Transactions on Neural Networks and Learning Systems 34(4), 2105–2118 (2023)
Yang, Yu., Lei, Zhenyu, Wang, Yirui, Zhang, Tengfei, Peng, Chen, Gao, Shangce: Improving dendritic neuron model with dynamic scale-free network-based differential evolution. IEEE/CAA Journal of Automatica Sinica 9(1), 99–110 (2022)
Umar, Muhammad, Amin, Fazli, Al-Mdallal, Qasem, Ali, Mohamed R.: A stochastic computing procedure to solve the dynamics of prevention in HIV system. Biomedical Signal Processing and Control 78, 103888 (2022)
Mukdasai, Kanit, Sabir, Zulqurnain, Raja, Muhammad Asif Zahoor., Sadat, R., Ali, Mohamed R., Singkibud, Peerapongpat: A numerical simulation of the fractional order leptospirosis model using the supervise neural network. Alexandria Engineering Journal 61(12), 12431–12441 (2022)
Shahzad, Azeem, Liaqat, Fakhira, Ellahi, Zaffer, Sohail, Muhammad, Ayub, Muhammad, Ali, Mohamed R.: Thin film flow and heat transfer of Cu-nanofluids with slip and convective boundary condition over a stretching sheet. Scientific Reports 12(1), 14254 (2022)
Sadaf, Maasoomah, Arshed, Saima, Akram, Ghazala, Ali, Mohamed R., Bano, Iffat: Analytical investigation and graphical simulations for the solitary wave behavior of Chaffee-Infante equation. Results in Physics 54, 107097 (2023)
Ali, Karmina K., Yusuf, Abdullahi, Yokus, Asıf, Ali, Mohamed R.: Optical waves solutions for the perturbed Fokas-Lenells equation through two different methods. Results in Physics 53, 106869 (2023)
Waqas, Hassan, Farooq, Umar, Hassan, Ali, Liu, Dong, Noreen, Sobia, Makki, Roa, Imran, Muhammad, Ali, Mohamed R.: Numerical and Computational simulation of blood flow on hybrid nanofluid with heat transfer through a stenotic artery: Silver and gold nanoparticles. Results in Physics 44, 106152 (2023)
Ali, Karmina K., Tarla, Sibel, Ali, Mohamed R., Yusuf, Abdullahi: Modulation instability analysis and optical solutions of an extended (2+1)-dimensional perturbed nonlinear Schrödinger equation. Results in Physics 45, 106255 (2023)
Ali, Karmina K., Tarla, Sibel, Ali, Mohamed R., Yusuf, Abdullahi, Yilmazer, Resat: Physical wave propagation and dynamics of the Ivancevic option pricing model. Results in Physics 52, 106751 (2023)
Zafar, Asim, Raheel, M., Mahnashi, Ali M., Bekir, Ahmet, Ali, Mohamed R., Hendy, A.S.: Exploring the new soliton solutions to the nonlinear M-fractional evolution equations in shallow water by three analytical techniques. Results in Physics 54, 107092 (2023)
Acknowledgements
This work was mainly supported by the Research on Deep-layered Swarm Intelligence Algorithms and Their Application in Solar Photovoltaic Models under Grant NSF2024CB14.
Funding
This research was partially supported by the Research on Deep-layered Swarm Intelligence Algorithms and Their Application in Solar Photovoltaic Models under Grant NSF2024CB14.
Author information
Authors and Affiliations
Contributions
ZT: methodology, software, writing—original draft preparation. KW: methodology and software. YZ: methodology and software. QZ: methodology, software, writing—reviewing and editing. YT: writing—reviewing and editing. SG: conceptualization, methodology, software, supervision, writing—review & editing. All authors read and approved the final manuscript.
Corresponding author
Ethics declarations
Conflict of interest
The authors declare no conflict of interest.
Ethical approval and consent to participate
Not applicable.
Consent for publication
Not applicable.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Tang, Z., Wang, K., Zang, Y. et al. Fractional-Order Water Flow Optimizer. Int J Comput Intell Syst 17, 84 (2024). https://doi.org/10.1007/s44196-024-00445-4