Abstract
Avoiding the difficulty of constructing a proper Lyapunov function, the generalized Dahlquist constant approach is employed to investigate the exponential stability of static neural networks. Without assuming boundedness or monotonicity of the activation functions, new sufficient conditions for the existence of a unique equilibrium and for the exponential stability of the networks are presented. An example is given to show the effectiveness of the results.
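The Dahlquist constant underlying this approach is, roughly, a one-sided Lipschitz bound: for a vector field F it is the supremum of ⟨F(x) − F(y), x − y⟩ / ‖x − y‖² over x ≠ y, and a negative value yields a unique, exponentially stable equilibrium. The following is a minimal numerical sketch of that idea, not the paper's formulation: it assumes the standard static model dx/dt = −x + g(Wx + u) with a tanh activation, and the weight matrix `W` and input `u` are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (assumed) parameters for a 2-neuron static network
# dx/dt = F(x) = -x + tanh(W x + u).
W = np.array([[0.2, -0.1],
              [0.05, 0.1]])
u = np.array([0.5, -0.3])

def F(x):
    """Right-hand side of the static neural network model."""
    return -x + np.tanh(W @ x + u)

def dahlquist_estimate(n_pairs=20000):
    """Monte Carlo lower estimate of the Dahlquist constant
    sup_{x != y} <F(x)-F(y), x-y> / ||x-y||^2 over sampled pairs."""
    best = -np.inf
    for _ in range(n_pairs):
        x = rng.normal(scale=3.0, size=2)
        y = rng.normal(scale=3.0, size=2)
        d = x - y
        n2 = d @ d
        if n2 > 1e-12:
            best = max(best, (F(x) - F(y)) @ d / n2)
    return best

alpha = dahlquist_estimate()
print(f"estimated Dahlquist constant: {alpha:.3f}")
# Since tanh is 1-Lipschitz and ||W|| < 1 here, alpha < 0: the network
# is contractive, so trajectories converge exponentially to one equilibrium.
```

Here negativity follows analytically as well: ⟨F(x) − F(y), x − y⟩ ≤ (−1 + ‖W‖₂)‖x − y‖², and ‖W‖₂ ≤ 0.25 for this `W`, so the constant is at most −0.75.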
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Li, G., Xu, J. (2009). The Dahlquist Constant Approach to Stability Analysis of the Static Neural Networks. In: Yu, W., He, H., Zhang, N. (eds) Advances in Neural Networks – ISNN 2009. ISNN 2009. Lecture Notes in Computer Science, vol 5551. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-01507-6_35
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-01506-9
Online ISBN: 978-3-642-01507-6