
Global Output Convergence of a Class of Recurrent Delayed Neural Networks with Discontinuous Neuron Activations

Published in: Neural Processing Letters

Abstract

This paper studies the global output convergence of a class of recurrent delayed neural networks with time-varying inputs. We consider non-decreasing activations that may have jump discontinuities, in order to model the ideal situation where the gain of the neuron amplifiers is very high and tends to infinity. In particular, we drop the assumptions of Lipschitz continuity and boundedness on the activation functions, which are required in most existing works. Because of the possible discontinuities of the activation functions, we introduce a suitable notion of limit to study the convergence of the output of the recurrent delayed neural networks. Under suitable assumptions on the interconnection matrices and the time-varying inputs, we establish a sufficient condition for global output convergence of this class of neural networks. The convergence results are useful for solving some optimization problems and for the design of recurrent delayed neural networks with discontinuous neuron activations.
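The class of systems the abstract describes can be illustrated with a small numerical sketch: a two-neuron delayed recurrent network whose activation is the discontinuous hard-limiting sign function, integrated by forward Euler with a delay buffer. All parameters below (decay rates, weight matrices, input, delay) are hypothetical illustrative choices, not taken from the paper; they are picked so the decay term dominates and the output settles to a constant vector, the kind of behavior the paper's convergence condition guarantees.

```python
import numpy as np

def sgn(x):
    # hard-limiting (discontinuous) neuron activation
    return np.sign(x)

# hypothetical parameters: strong decay, weak interconnections
d = np.array([1.0, 1.0])                      # self-decay rates
A = np.array([[0.10, -0.05], [0.05, 0.10]])   # instantaneous weights
B = np.array([[0.05, 0.02], [-0.02, 0.05]])   # delayed weights
u = np.array([1.5, -1.5])                     # constant external input

dt, tau = 0.01, 0.5
delay_steps = int(tau / dt)
steps = 5000

x = np.zeros(2)
hist = [np.zeros(2)] * (delay_steps + 1)      # constant zero initial history
for _ in range(steps):
    x_delayed = hist[0]
    # dx/dt = -D x(t) + A f(x(t)) + B f(x(t - tau)) + u
    dx = -d * x + A @ sgn(x) + B @ sgn(x_delayed) + u
    x = x + dt * dx
    hist = hist[1:] + [x.copy()]

print(sgn(x))  # network output after the transient dies out
```

With these weights the state converges to the equilibrium solving D x* = (A + B) sgn(x*) + u, and the output sgn(x) stops changing, so the *output* of the network converges even though the activation itself is discontinuous at the origin.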



Author information

Correspondence to Lihong Huang.

Cite this article

Guo, Z., Huang, L. Global Output Convergence of a Class of Recurrent Delayed Neural Networks with Discontinuous Neuron Activations. Neural Process Lett 30, 213–227 (2009). https://doi.org/10.1007/s11063-009-9119-z
