Abstract
In this paper, the stability problems of a general class of discrete-time delayed recurrent neural networks are re-investigated in light of some recent results. These networks are obtained by modeling synapses as Finite Impulse Response (FIR) filters instead of multiplicative scalars. We first derive sufficient conditions, via a Lyapunov functional method, for the network operating in closed loop to converge to a fixed point; the symmetry of the connection matrix is not assumed. We then show how these conditions relate to other conditions ensuring both the existence of the error gradient over arbitrarily long trajectories and the asymptotic stability of the fixed points at each time step.
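To make the setting concrete, the following is a minimal illustrative sketch (not the paper's exact model or its stability conditions) of a recurrent network whose synapses are FIR filters: each connection carries a vector of taps applied to delayed copies of the state, and with sufficiently small weights the closed-loop map is a contraction, so the state settles to a fixed point. The network size, tap count, and weight scale below are arbitrary choices for illustration.

```python
import numpy as np

# Illustrative FIR-recurrent network (assumed form, for intuition only):
#     x(t+1) = tanh( sum_{d=0}^{D-1} W[d] @ x(t-d) + b )
# Each synapse is an FIR filter of order D rather than a single scalar gain.
rng = np.random.default_rng(0)
n, D = 4, 3                                 # neurons, FIR taps per synapse
W = 0.1 * rng.standard_normal((D, n, n))    # small taps -> contractive map
b = rng.standard_normal(n)

# State history buffer, most recent state first.
hist = [np.zeros(n) for _ in range(D)]
for t in range(500):
    x_next = np.tanh(sum(W[d] @ hist[d] for d in range(D)) + b)
    hist = [x_next] + hist[:-1]

# At a fixed point, all delayed states coincide with x*, so the
# one-step residual should vanish after the iteration has converged.
x_star = hist[0]
residual = np.linalg.norm(
    np.tanh(sum(W[d] @ x_star for d in range(D)) + b) - x_star
)
```

With larger tap magnitudes the contraction argument fails and the closed-loop trajectory need not converge; characterizing that boundary, without assuming a symmetric connection matrix, is the subject of the paper's Lyapunov analysis.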
Copyright information
© 2003 Springer-Verlag Berlin Heidelberg
Cite this paper
Aussem, A. (2003). Closed Loop Stability of FIR-Recurrent Neural Networks. In: Kaynak, O., Alpaydin, E., Oja, E., Xu, L. (eds) Artificial Neural Networks and Neural Information Processing — ICANN/ICONIP 2003. ICANN ICONIP 2003 2003. Lecture Notes in Computer Science, vol 2714. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44989-2_62
Print ISBN: 978-3-540-40408-8
Online ISBN: 978-3-540-44989-8