Abstract
This paper studies the global output convergence of a class of recurrent delayed neural networks with time-varying inputs. We consider non-decreasing activations that may have jump discontinuities, modeling the ideal situation in which the gain of the neuron amplifiers tends to infinity. In particular, we drop the assumptions of Lipschitz continuity and boundedness on the activation functions, which most existing works require. Because the activation functions may be discontinuous, we introduce a suitable notion of limit to study the convergence of the network output. Under suitable assumptions on the interconnection matrices and the time-varying inputs, we establish a sufficient condition for global output convergence of this class of neural networks. The convergence results are useful for solving some optimization problems and for designing recurrent delayed neural networks with discontinuous neuron activations.
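For orientation, delayed recurrent networks of the type considered here are commonly written in the following form; the notation is illustrative, and the precise system analyzed in the paper may differ in detail:
\[
\dot{x}_i(t) = -d_i x_i(t) + \sum_{j=1}^{n} a_{ij}\, g_j\bigl(x_j(t)\bigr) + \sum_{j=1}^{n} b_{ij}\, g_j\bigl(x_j(t-\tau_{ij})\bigr) + u_i(t), \qquad y_i(t) = g_i\bigl(x_i(t)\bigr), \quad i = 1,\dots,n,
\]
where $d_i > 0$ are self-inhibition rates, $A=(a_{ij})$ and $B=(b_{ij})$ are the interconnection and delayed interconnection matrices, $\tau_{ij} \ge 0$ are transmission delays, $u_i(t)$ are the time-varying inputs, and each $g_j$ is non-decreasing and possibly discontinuous. Global output convergence means that each output $y_i(t)$ approaches a limit as $t \to \infty$.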
Cite this article
Guo, Z., Huang, L. Global Output Convergence of a Class of Recurrent Delayed Neural Networks with Discontinuous Neuron Activations. Neural Process Lett 30, 213–227 (2009). https://doi.org/10.1007/s11063-009-9119-z