Abstract
The Incremental Extreme Learning Machine (I-ELM) is a powerful tool for regression prediction with good nonlinear approximation ability, but in the original model an uneven distribution of the output weights degrades generalization. This paper proposes an Incremental Extreme Learning Machine based on an Attenuated Regularization Term (ARI-ELM). ARI-ELM adds an attenuated regularization term to the iterative computation of the output weights: it shrinks the output weights of hidden nodes added early in the iteration while ensuring that nodes added after many iterations are not penalized by a large regularization coefficient. As a result, the output weights of the whole network become relatively small and evenly distributed, which reduces model complexity. We also prove that the model retains its convergence property after the attenuated regularization term is added. Simulation results on benchmark data sets demonstrate that the proposed approach achieves better generalization performance than other incremental extreme learning machine variants. In addition, the algorithm is applied to a weight-prediction scenario in dynamic scheduling for intelligent manufacturing and also obtains good results.
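To make the idea concrete, the following is a minimal Python sketch of an incremental ELM whose node-by-node output-weight update includes an attenuating ridge-style coefficient. It is an illustration under stated assumptions, not the paper's exact method: the function names (ari_elm_fit, ari_elm_predict), the sigmoid hidden nodes, and the exponential decay schedule lam_k = lam0 * decay**k are illustrative choices, since the abstract does not give the precise ARI-ELM update rule.

import numpy as np

def ari_elm_fit(X, y, max_nodes=50, lam0=1.0, decay=0.9, rng=None):
    """Sketch of an incremental ELM with an attenuated regularization term.

    Follows the classic I-ELM scheme of adding one random hidden node at a
    time and fitting only its output weight to the current residual.  As an
    assumption, each new weight is shrunk by a coefficient
    lam_k = lam0 * decay**k that decays as more nodes are added, so early
    nodes are regularized more strongly than later ones.
    """
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    residual = y.astype(float).copy()      # current training residual e_{k-1}
    weights, biases, betas = [], [], []

    for k in range(max_nodes):
        # Randomly generate one hidden node (sigmoid activation).
        w = rng.uniform(-1.0, 1.0, size=d)
        b = rng.uniform(-1.0, 1.0)
        h = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # hidden-node output vector

        # Output weight with an attenuating ridge-style term (assumed form).
        lam_k = lam0 * decay ** k
        beta = (residual @ h) / (h @ h + lam_k)

        residual -= beta * h                      # update residual error
        weights.append(w); biases.append(b); betas.append(beta)

    return np.array(weights), np.array(biases), np.array(betas)

def ari_elm_predict(X, weights, biases, betas):
    H = 1.0 / (1.0 + np.exp(-(X @ weights.T + biases)))
    return H @ betas

if __name__ == "__main__":
    # Toy usage on a 1-D regression problem.
    X = np.linspace(-3, 3, 200).reshape(-1, 1)
    y = np.sin(X).ravel() + 0.05 * np.random.default_rng(0).standard_normal(200)
    W, b, beta = ari_elm_fit(X, y, max_nodes=100, rng=0)
    rmse = np.sqrt(np.mean((ari_elm_predict(X, W, b, beta) - y) ** 2))
    print("train RMSE:", rmse)

Because lam_k shrinks toward zero, early nodes carry smaller weights while later nodes behave almost like plain I-ELM nodes, which is the qualitative effect the abstract describes; the convergence argument of the paper itself is not reproduced here.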
This work was supported by the National Key Research and Development Program of China under Grant 2018YFB1003700 and the National Natural Science Foundation of China under Grant 61906015.
Copyright information
© 2022 Springer Nature Switzerland AG
About this paper
Cite this paper
Wang, C., Li, Y., Zou, W., Xia, Y. (2022). An Incremental Extreme Learning Machine Prediction Method Based on Attenuated Regularization Term. In: Tan, Y., Shi, Y., Niu, B. (eds) Advances in Swarm Intelligence. ICSI 2022. Lecture Notes in Computer Science, vol 13345. Springer, Cham. https://doi.org/10.1007/978-3-031-09726-3_17
DOI: https://doi.org/10.1007/978-3-031-09726-3_17
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-09725-6
Online ISBN: 978-3-031-09726-3