Abstract
This paper enhances the learning performance of the radial basis function neural network (RBFnn) by means of the self-organizing map (SOM) neural network (SOMnn). In addition, an algorithm based on particle swarm optimization (PSO) and the genetic algorithm (GA), referred to as the PG algorithm, is employed to train the RBFnn for function approximation. The proposed mix of SOMnn with PG (MSPG) algorithm combines the automatic clustering ability of the SOMnn with the PG algorithm. The simulation results show that the SOMnn, PSO, and GA approaches can be combined into a hybrid algorithm that learns more accurately than the related algorithms. Evaluation results for four continuous test function experiments and a demand estimation case show that the MSPG algorithm outperforms the other algorithms and the Box–Jenkins models in accuracy. The proposed MSPG algorithm can also be embedded into a business's enterprise resource planning system in different industries to give suppliers, resellers, and retailers in the supply chain more accurate demand information and thus lower inventory costs. Furthermore, it can be applied in intelligent manufacturing systems to cope with real industrial situations and meet the need for customization.
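As a rough illustration of this two-stage idea (a minimal sketch under assumed settings, not the authors' implementation), the NumPy example below uses a small one-dimensional SOM to cluster the training inputs and place the RBF centers, then a PSO loop with GA-style uniform crossover and mutation, standing in for the PG stage, to tune the RBF widths and output weights against the RMSE. All function names (`som_centers`, `pg_train`, `rbf_predict`), parameter values, and the toy sine-approximation task are illustrative.

```python
import numpy as np

# Illustrative sketch only (not the authors' MSPG implementation):
# a SOM places the RBF centers, a PSO+GA-style loop tunes widths and weights.

def som_centers(X, n_nodes=8, iters=2000, lr0=0.5, sigma0=2.0, seed=0):
    """Cluster inputs with a tiny 1-D SOM and return its codebook as RBF centers."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), n_nodes, replace=False)].astype(float)
    grid = np.arange(n_nodes)
    for t in range(iters):
        x = X[rng.integers(len(X))]
        bmu = np.argmin(np.linalg.norm(W - x, axis=1))      # best-matching unit
        lr = lr0 * np.exp(-t / iters)                       # decaying learning rate
        sigma = sigma0 * np.exp(-t / iters)                 # shrinking neighborhood
        h = np.exp(-((grid - bmu) ** 2) / (2.0 * sigma ** 2))
        W += lr * h[:, None] * (x - W)
    return W

def rbf_predict(X, centers, widths, weights):
    """Gaussian RBF network output for inputs X."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.exp(-(d ** 2) / (2.0 * widths ** 2)) @ weights

def pg_train(X, y, centers, swarm=30, iters=200, seed=0):
    """PSO with GA-style crossover/mutation tuning RBF widths and output weights."""
    rng = np.random.default_rng(seed)
    k = len(centers)
    pos = rng.uniform(-1.0, 1.0, (swarm, 2 * k))
    pos[:, :k] = rng.uniform(0.1, 2.0, (swarm, k))          # widths start positive
    vel = np.zeros_like(pos)

    def rmse(p):
        w, c = np.abs(p[:k]) + 1e-6, p[k:]
        return np.sqrt(np.mean((rbf_predict(X, centers, w, c) - y) ** 2))

    pbest, pbest_f = pos.copy(), np.array([rmse(p) for p in pos])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2,) + pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        vel = np.clip(vel, -1.0, 1.0)                       # keep the swarm stable
        pos = pos + vel
        # GA step: uniform crossover with the global best plus small mutation.
        mask = rng.random(pos.shape) < 0.1
        pos = np.where(mask, gbest, pos) + 0.01 * mask * rng.standard_normal(pos.shape)
        f = np.array([rmse(p) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return np.abs(gbest[:k]) + 1e-6, gbest[k:]              # widths, weights

# Toy usage: approximate y = sin(x) on [0, 2*pi].
X = np.linspace(0.0, 2.0 * np.pi, 200)[:, None]
y = np.sin(X).ravel()
centers = som_centers(X)
widths, weights = pg_train(X, y, centers)
pred = rbf_predict(X, centers, widths, weights)
print("train RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```

In the paper's actual MSPG setting, the SOM topology, the PSO/GA operators, and the fitness definition follow the authors' specification rather than this sketch.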
References
Akaike, H. (1974). A new look at the statistical model identification. IEEE Transactions on Automatic Control, 19, 716–723.
Amraee, T., Ranjbar, A. M., Mozafari, B., & Sadati, N. (2007). An enhanced under-voltage load-shedding scheme to provide voltage stability. Electric Power Systems Research, 77, 1038–1046.
Anbazhagan, S., & Kumarappan, N. (2014). Day-ahead deregulated electricity market price forecasting using neural network input featured by DCT. Energy Conversion and Management, 78, 711–719.
Ayala, H. V. H., & Coelho, L. D. S. (2016). Cascaded evolutionary algorithm for nonlinear system identification based on correlation functions and radial basis functions neural networks. Mechanical Systems and Signal Processing, 68–69, 378–392.
Azadeh, A., & Tarverdian, S. (2007). Integration of genetic algorithm, computer simulation and design of experiments for forecasting electrical energy consumption. Energy Policy, 35, 5229–5241.
Babu, C. N., & Reddy, B. E. (2014). A moving-average-filter-based hybrid ARIMA–ANN model for forecasting time series data. Applied Soft Computing, 23, 27–38.
Bagheri, M., Mirbagheri, S. A., Bagheri, Z., & Kamarkhani, A. M. (2015). Modeling and optimization of activated sludge bulking for a real wastewater treatment plant using hybrid artificial neural networks–genetic algorithm approach. Process Safety and Environmental Protection, 95, 12–25.
Barra, T. V., Bezerra, G. B., & de Castro, L. N. (2006). An immunological density-preserving approach to the synthesis of RBF neural networks for classification. In International joint conference on neural networks (pp. 929–935).
Box, G. E. P., & Jenkins, G. (1976). Time series analysis, forecasting and control. San Francisco: Holden-Day.
Chan, K. Y., Dillon, T. S., Singh, J. S., & Chang, E. (2012). Neural-network-based models for short-term traffic flow forecasting using a hybrid exponential smoothing and Levenberg–Marquardt algorithm. IEEE Transactions on Intelligent Transportation Systems, 13(2), 644–654.
Chang, P. C., & Liao, T. W. (2006). Combining SOM and fuzzy rule base for flow time prediction in semiconductor manufacturing factory. Applied Soft Computing, 6, 198–206.
Chen, S., Cowan, C. F. N., & Grant, P. M. (1991). Orthogonal least squares learning algorithm for radial basis function networks. IEEE Transactions on Neural Networks, 2(2), 302–309.
Chen, S., Wu, Y., & Luk, B. L. (1999). Combined genetic algorithm optimization and regularized orthogonal least squares learning for radial basis function networks. IEEE Transactions on Neural Networks, 10(5), 1239–1243.
Chiroma, H., Abdulkareem, S., & Herawan, T. (2015). Evolutionary neural network model for West Texas Intermediate crude oil price prediction. Applied Energy, 142, 266–273.
DelaOssa, L., Gamez, J. A., & Puerta, J. M. (2006). Learning weighted linguistic fuzzy rules with estimation of distribution algorithms. In IEEE congress on evolutionary computation (pp. 900–907). Vancouver, BC: Sheraton Vancouver Wall Centre Hotel.
Deng, W., Chen, R., Gao, J., Song, Y., & Xu, J. (2012). A novel parallel hybrid intelligence optimization algorithm for a function approximation problem. Computers and Mathematics with Applications, 63, 325–336.
Denker, J. S. (1986). Neural network models of learning and adaptation. Physica D, 22, 216–232.
Dey, S., Bhattacharyya, S., & Maulik, U. (2014). Quantum inspired genetic algorithm and particle swarm optimization using chaotic map model based interference for gray level image thresholding. Swarm and Evolutionary Computation, 15, 38–57.
Dickey, D. A., & Fuller, W. A. (1981). Likelihood ratio statistics for autoregressive time series with a unit root. Econometrica, 49(4), 1057–1072.
Du, W., Leung, S. Y. S., & Kwong, C. K. (2015). A multiobjective optimization-based neural network model for short-term replenishment forecasting in fashion industry. Neurocomputing, 151, 342–353.
Duda, R. O., & Hart, P. E. (1973). Pattern classification and scene analysis. New York: Wiley.
Duman, S., Yorukeren, N., & Altas, I. H. (2015). A novel modified hybrid PSOGSA based on fuzzy logic for non-convex economic dispatch problem with valve-point effect. Electrical Power and Energy Systems, 64, 121–135.
Engle, R. F., & Yoo, B. S. (1987). Forecasting and testing in cointegrated systems. Journal of Econometrics, 35, 588–589.
Er, M. J., Li, Z., Cai, H., & Chen, Q. (2005). Adaptive noise cancellation using enhanced dynamic fuzzy neural network. IEEE Transactions on Fuzzy Systems, 13(3), 331–342.
Feng, H. M. (2006). Self-generation RBFNs using evolutional PSO learning. Neurocomputing, 70, 241–251.
Galvez, A., & Iglesias, A. (2013). A new iterative mutually coupled hybrid GA–PSO approach for curve fitting in manufacturing. Applied Soft Computing, 13, 1491–1504.
Garcia-Gonzalo, E., & Fernandez-Martinez, J. L. (2012). A brief historical review of particle swarm optimization (PSO). Journal of Bioinformatics and Intelligent Control, 1(1), 3–16.
Goldberg, D. E. (1989). Genetic algorithms in search, optimization & machine learning. Reading, MA: Addison-Wesley.
Golub, G. H., & Van Loan, C. F. (1996). Matrix computations (3rd ed.). Baltimore, MD: Johns Hopkins University Press.
Hadavandi, E., Shavandi, H., Ghanbari, A., & Naghneh, S. A. (2012). Developing a hybrid artificial intelligence model for outpatient visits forecasting in hospitals. Applied Soft Computing, 12, 700–711.
Holland, J. H. (1975). Adaptation in natural and artificial systems. Ann Arbor, MI: University of Michigan Press. Reprinted in 1998.
Hsu, S. H., Hsieh, J. P. A., Chih, T. C., & Hsu, K. C. (2009). A two-stage architecture for stock price forecasting by integrating self-organizing map and support vector regression. Expert Systems with Applications, 36, 7947–7951.
Huang, C. M., & Wang, F. L. (2007). An RBF network with OLS and EPSO algorithms for real-time power dispatch. IEEE Transactions on Power Systems, 22(1), 96–104.
Jaipuria, S., & Mahapatra, S. S. (2014). An improved demand forecasting method to reduce bullwhip effect in supply chains. Expert Systems with Applications, 41, 2395–2408.
Jin, D., Wang, P., Bai, Z., Wang, X., Peng, H., Qi, R., et al. (2011). Analysis of bacterial community in bulking sludge using culture-dependent and -independent approaches. Journal of Environmental Sciences, 23, 1880–1887.
Karaboga, D., & Basturk, B. (2008). On the performance of artificial bee colony (ABC) algorithm. Applied Soft Computing, 8(1), 687–697.
Katherasan, D., Elias, J. V., Sathiya, P., & Haq, A. N. (2014). Simulation and parameter optimization of flux cored arc welding using artificial neural network and particle swarm optimization algorithm. Journal of Intelligent Manufacturing, 25, 67–76.
Kennedy, J., & Eberhart, R. C. (1995). Particle swarm optimization. In Proceedings of IEEE international conference on neural networks (pp. 1942–1948). Perth: IEEE Service Center.
Kennedy, J., & Eberhart, R. C. (2001). Swarm intelligence. San Mateo, CA: Morgan Kaufmann.
Ketabchi, H., & Ataie-Ashtiani, B. (2015). Evolutionary algorithms for the optimal management of coastal groundwater: A comparative study toward future challenges. Journal of Hydrology, 520, 193–213.
Kmenta, J. (1986). Elements of econometrics (2nd ed.). New York: Macmillan Publishing Co.
Kohonen, T. (1987). Self-organization and associative memory (2nd ed.). Berlin: Springer.
Kohonen, T. (1990). The self-organizing map. Proceedings of the IEEE, 78(9), 1464–1480.
Kuo, R. J., & Han, Y. S. (2011). A hybrid of genetic algorithm and particle swarm optimization for solving bi-level linear programming problem: A case study on supply chain model. Applied Mathematical Modelling, 35(8), 3905–3917.
Kuo, R. J., Syu, Y. J., Chen, Z. Y., & Tien, F. C. (2012). Integration of particle swarm optimization and genetic algorithm for dynamic clustering. Information Sciences, 195, 124–140.
Kurt, I., Ture, M., & Kurum, A. T. (2008). Comparing performances of logistic regression, classification and regression tree, and neural networks for predicting coronary artery disease. Expert Systems with Applications, 34, 366–374.
Kuzmanovski, I., Lazova, S. D., & Aleksovska, S. (2007). Classification of perovskites with supervised self-organizing maps. Analytica Chimica Acta, 595, 182–189.
Lee, Z. J. (2008). A novel hybrid algorithm for function approximation. Expert Systems with Applications, 34, 384–390.
Lin, C. F., Wu, C. C., Yang, P. H., & Kuo, T. Y. (2009). Application of Taguchi method in light-emitting diode backlight design for wide color gamut displays. Journal of Display Technology, 5(8), 323–330.
Lin, G., & Wu, M. (2009). A hybrid neural network model for typhoon-rainfall forecasting. Journal of Hydrology, 375, 450–458.
Lin, G. F., & Wu, M. C. (2011). An RBF network with a two-step learning algorithm for developing a reservoir inflow forecasting model. Journal of Hydrology, 405, 439–450.
Looney, C. G. (1996). Advances in feedforward neural networks: Demystifying knowledge acquiring black boxes. IEEE Transactions on Knowledge and Data Engineering, 8(2), 211–226.
Lopez, M., Valero, S., Senabre, C., Aparicio, J., & Gabaldon, A. (2012). Application of SOM neural networks to short-term load forecasting: The Spanish electricity market case study. Electric Power Systems Research, 91, 18–27.
Lu, J., Hu, H., & Bai, Y. (2015). Generalized radial basis function neural network based on an improved dynamic particle swarm optimization and AdaBoost algorithm. Neurocomputing, 152, 305–315.
Mezura-Montes, E., & Coello Coello, C. A. (2011). Constraint-handling in nature-inspired numerical optimization: Past, present and future. Swarm and Evolutionary Computation, 1, 173–194.
Olabi, A. G. (2008). Using Taguchi method to optimize welding pool of dissimilar laser-welded components. Optics & Laser Technology, 40, 379–388.
Ozturk, C., Hancer, E., & Karaboga, D. (2015). A novel binary artificial bee colony algorithm based on genetic operators. Information Sciences, 297, 154–170.
Qasem, S. N., Shamsuddin, S. M., & Zain, A. M. (2012). Multi-objective hybrid evolutionary algorithms for radial basis function neural network design. Knowledge-Based Systems, 27, 475–497.
Qiu, X., & Lau, H. Y. K. (2014). An AIS-based hybrid algorithm for static job shop scheduling problem. Journal of Intelligent Manufacturing, 25, 489–503.
Rezaee-Jordehi, A., & Jasni, J. (2013). Parameter selection in particle swarm optimisation: A survey. Journal of Experimental & Theoretical Artificial Intelligence, 25(4), 527–542.
Rocha, I. B. C. M., Parente, E., Jr., & Melo, A. M. C. (2014). A hybrid shared/distributed memory parallel genetic algorithm for optimization of laminate composites. Composite Structures, 107, 288–297.
Rumbell, T., Denham, S. L., & Wennekers, T. (2014). A spiking self-organizing map combining STDP, oscillations, and continuous learning. IEEE Transactions on Neural Networks and Learning Systems, 25(5), 894–907.
Sarimveis, H., Alexandridis, A., Mazarakis, S., & Bafas, G. (2004). A new algorithm for developing dynamic radial basis function neural network models based on genetic algorithms. Computers and Chemical Engineering, 28, 209–217.
Sattari, M. T., Yurekli, K., & Pal, M. (2012). Performance evaluation of artificial neural network approaches in forecasting reservoir inflow. Applied Mathematical Modelling, 36, 2649–2657.
Savsani, P., Jhala, R. L., & Savsani, V. (2014). Effect of hybridizing biogeography-based optimization (BBO) technique with artificial immune algorithm (AIA) and ant colony optimization (ACO). Applied Soft Computing, 21, 542–553.
Shafie-khah, M., Moghaddam, M. P., & Sheikh-El-Eslami, M. K. (2011). Price forecasting of day-ahead electricity markets using a hybrid forecast method. Energy Conversion and Management, 52, 2165–2169.
Shelokar, P. S., Siarry, P., Jayaraman, V. K., & Kulkarni, B. D. (2007). Particle swarm and ant colony algorithms hybridized for improved continuous optimization. Applied Mathematics and Computation, 188, 129–142.
Shi, Y., & Eberhart, R. C. (1998). Parameter selection in particle swarm optimization. In Evolutionary programming VII: Proceedings of EP98 (pp. 591–600). New York: Springer.
Syswerda, G. (1989). Uniform crossover in genetic algorithms. In J. D. Schaffer (Ed.), Proceedings of the third international conference on genetic algorithms and their applications (pp. 2–9). San Mateo, CA: Morgan Kaufmann Publishers.
Taguchi, G., Chowdhury, S., & Wu, Y. (2005). Taguchi’s quality engineering handbook. Hoboken, NJ: Wiley.
Taguchi, G., & Yokoyama, T. (1993). Taguchi methods: Design of experiments. Dearborn, MI: ASI Press.
Tsai, J. T., Chou, J. H., & Liu, T. K. (2006). Tuning the structure and parameters of a neural network by using hybrid Taguchi-genetic algorithm. IEEE Transactions on Neural Networks, 17(1), 69–80.
Tsekouras, G. E., & Tsimikas, J. (2013). On training RBF neural networks using input–output fuzzy clustering and particle swarm optimization. Fuzzy Sets and Systems, 221, 65–89.
Vitorino, L. N., Ribeiro, S. F., & Bastos-Filho, C. J. A. (2015). A mechanism based on artificial bee colony to generate diversity in particle swarm optimization. Neurocomputing, 148, 39–45.
Wang, D., & Lu, W. Z. (2006). Forecasting of ozone level in time series using MLP model with a novel hybrid training algorithm. Atmospheric Environment, 40, 913–924.
Wang, W. M., Peng, X., Nhu, G. N., Hu, J., & Peng, Y. H. (2014). Dynamic representation of fuzzy knowledge based on fuzzy petri net and genetic-particle swarm optimization. Expert Systems with Applications, 41, 1369–1376.
Whitehead, B. A., & Choate, T. D. (1996). Cooperative–competitive genetic evolution of radial basis function centers and widths for time series prediction. IEEE Transactions on Neural Networks, 7(4), 869–880.
Wu, J. D., & Liu, J. C. (2012). A forecasting system for car fuel consumption using a radial basis function neural network. Expert Systems with Applications, 39, 1883–1888.
Xu, R., Venayagamoorthy, G. K., & Wunsch, D. C. (2007). Modeling of gene regulatory networks with hybrid differential evolution and particle swarm optimization. Neural Networks, 20, 917–927.
Yadav, V., & Srinivasan, D. (2011). A SOM-based hybrid linear-neural model for short-term load forecasting. Neurocomputing, 74, 2874–2885.
Yousefi, M., Enayatifar, R., & Darus, A. N. (2012). Optimal design of plate-fin heat exchangers by a hybrid evolutionary algorithm. International Communications in Heat and Mass Transfer, 39, 258–263.
Yu, L., Wang, S., Lai, K. K., & Wen, F. (2010). A multiscale neural network learning paradigm for financial crisis forecasting. Neurocomputing, 73, 716–725.
Yu, S., Wang, K., & Wei, Y. M. (2015a). A hybrid self-adaptive particle swarm optimization-genetic algorithm-radial basis function model for annual electricity demand prediction. Energy Conversion and Management, 91, 176–185.
Yu, S., Wei, Y. M., & Wang, K. (2012a). A PSO–GA optimal model to estimate primary energy demand of China. Energy Policy, 42, 329–340.
Yu, S., Wei, Y. M., & Wang, K. (2012b). China’s primary energy demands in 2020: Predictions from an MPSO–RBF estimation model. Energy Conversion and Management, 61, 59–66.
Yu, S., Zhang, J., Zheng, S., & Sun, H. (2015b). Provincial carbon intensity abatement potential estimation in China: A PSO-GA-optimized multi-factor environmental learning curve method. Energy Policy, 77, 46–55.
Yu, S., Zhu, K., & Gao, S. (2009). A hybrid MPSO-BP structure adaptive algorithm for RBFNs. Neural Computing and Applications, 18, 769–779.
Zhang, Z., Su, S., Lin, Y., Cheng, X., Shuang, K., & Xu, P. (2015a). Adaptive multi-objective artificial immune system based virtual network embedding. Journal of Network and Computer Applications, 53, 140–155.
Zhang, J. L., Zhang, Y. J., & Zhang, L. (2015b). A novel hybrid method for crude oil forecasting. Energy Economics, 49, 649–659.
Zou, H. F., Xia, G. P., Yang, F. T., & Wang, H. Y. (2007). An investigation and comparison of artificial neural network and time series models for Chinese food grain price forecasting. Neurocomputing, 70, 2913–2923.
Appendix: Four continuous test functions (Shelokar et al. 2007; Whitehead and Choate 1996)
The first experiment uses the Rosenbrock function (Shelokar et al. 2007), with the following settings (a sketch of its conventional form is given after the list):
- (a) search domain: \(-5 \leqq x_j \leqq 5,\ j = 1, 2\);
- (b) one global minimum: \((x_1, x_2) = (1, 1)\); \(RS(x_1, x_2) = 0\).
The second experiment uses the Griewank function (Shelokar et al. 2007), with the following settings (sketched after the list):
- (a) search domain: \(-300 \leqq x_j \leqq 600,\ j = 1, 2\);
- (b) one global minimum: \((x_1, x_2) = (0, 0)\); \(GR(x_1, x_2) = 0\).
The third experiment uses the B2 function (Shelokar et al. 2007), with the following settings (sketched after the list):
- (a) search domain: \(-100 \leqq x_j \leqq 100,\ j = 1, 2\);
- (b) one global minimum: \((x_1, x_2) = (0, 0)\); \(B2(x_1, x_2) = 0\).
The fourth experiment uses the Mackey–Glass time series (Whitehead and Choate 1996). The series was evaluated over time steps 118 to 1118, and 1000 samples were drawn randomly from this range (a generation sketch is given below).
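A common way to generate such a series is to integrate the Mackey–Glass delay differential equation \(\dot{x}(t) = 0.2\,x(t-\tau)/(1 + x(t-\tau)^{10}) - 0.1\,x(t)\) with delay \(\tau = 17\). The sketch below uses simple Euler integration with conventional parameter values; these choices, and the random sampling step, are assumptions rather than the exact procedure of Whitehead and Choate (1996).

```python
import numpy as np

def mackey_glass(n_steps=1200, tau=17, a=0.2, b=0.1, x0=1.2, dt=1.0):
    """Generate a Mackey-Glass series with simple Euler integration (assumed setup)."""
    x = np.full(n_steps + tau, x0)          # constant history and initial value
    for t in range(tau, n_steps + tau - 1):
        x_tau = x[t - tau]
        x[t + 1] = x[t] + dt * (a * x_tau / (1.0 + x_tau ** 10) - b * x[t])
    return x[tau:]

series = mackey_glass()
rng = np.random.default_rng(0)
idx = rng.choice(np.arange(118, 1119), size=1000, replace=False)  # time steps 118..1118
samples = series[idx]
print(len(samples), samples[:3])
```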
About this article
Cite this article
Chen, ZY., Kuo, R.J. Combining SOM and evolutionary computation algorithms for RBF neural network training. J Intell Manuf 30, 1137–1154 (2019). https://doi.org/10.1007/s10845-017-1313-7