Abstract
In reality, the inputs of many complicated systems are continuous time-varying functions. Traditional Elman neural networks (ENNs) have difficulty simulating such complicated nonlinear systems directly because their inputs are instantaneous constant values. To overcome this limitation, an Elman-style process neural network (EPNN) is proposed in this paper. Architecturally, the EPNN is similar to the ENN; the major characteristics that distinguish the EPNN from the ENN are that its inputs and connection weights are time-varying functions. A corresponding learning algorithm based on the expansion of orthogonal basis functions is developed. The effectiveness of the EPNN and its learning algorithm is demonstrated by predicting the lubricating oil iron concentration in aircraft engine health condition monitoring, and the test results indicate that the EPNN learns faster and achieves higher accuracy than an ENN of the same scale.
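To make the idea concrete, the following minimal Python sketch illustrates how expanding both the time-varying input functions and the weight functions in an orthonormal basis reduces a process neuron's functional aggregation integral to a dot product of expansion coefficients, with an Elman-style context layer feeding the hidden activations back at the next step. This is a hypothetical illustration under assumed choices (Legendre basis, layer sizes, and all names below), not the authors' implementation.

```python
import numpy as np
from numpy.polynomial import legendre

# Sketch only; basis choice, layer sizes, and names are assumptions for illustration.
L, N_IN, N_HID = 6, 3, 5   # basis functions, input signals, hidden (process) neurons

def basis_coeffs(signal, t, L):
    """Project a sampled signal x(t), t in [-1, 1], onto the first L
    orthonormal Legendre polynomials b_l(t) = sqrt((2l+1)/2) * P_l(t)."""
    return np.array([
        np.trapz(signal * np.sqrt((2 * l + 1) / 2.0)
                 * legendre.Legendre.basis(l)(t), t)
        for l in range(L)
    ])

rng = np.random.default_rng(0)
W_in  = rng.normal(scale=0.3, size=(N_HID, N_IN, L))  # basis coefficients of weight functions w_ij(t)
W_ctx = rng.normal(scale=0.3, size=(N_HID, N_HID))    # constant weights from the context units
W_out = rng.normal(scale=0.3, size=(N_HID,))          # hidden-to-output weights

def epnn_step(input_coeffs, context):
    """One forward pass. With an orthonormal basis, the functional
    aggregation integral of w_ij(t) * x_j(t) over time collapses to a
    dot product of the basis coefficients of w_ij(t) and x_j(t)."""
    net = np.einsum('hjl,jl->h', W_in, input_coeffs) + W_ctx @ context
    hidden = np.tanh(net)            # hidden activations are copied into the context layer
    return W_out @ hidden, hidden

# Usage: one sample consisting of three time-varying input signals on [-1, 1].
t = np.linspace(-1.0, 1.0, 200)
signals = np.stack([np.sin(np.pi * t), np.cos(2 * t), t ** 2])
coeffs = np.stack([basis_coeffs(s, t, L) for s in signals])
y, ctx = epnn_step(coeffs, context=np.zeros(N_HID))
print(y)
```

Because the same expansion can be applied to the weight functions, training reduces to adjusting finite coefficient arrays (here `W_in`), which is what makes a gradient-based learning algorithm over functional weights tractable.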
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Ding, G., Lin, L. (2011). Elman-Style Process Neural Network with Application to Aircraft Engine Health Condition Monitoring. In: Liu, D., Zhang, H., Polycarpou, M., Alippi, C., He, H. (eds) Advances in Neural Networks – ISNN 2011. ISNN 2011. Lecture Notes in Computer Science, vol 6675. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21105-8_56
DOI: https://doi.org/10.1007/978-3-642-21105-8_56
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-21104-1
Online ISBN: 978-3-642-21105-8