
Real-time computation at the edge of chaos in recurrent neural networks

Published: 01 July 2004

Abstract

Depending on the connectivity, recurrent networks of simple computational units can show very different types of dynamics, ranging from totally ordered to chaotic. We analyze how the type of dynamics (ordered or chaotic) exhibited by randomly connected networks of threshold gates driven by a time-varying input signal depends on the parameters describing the distribution of the connectivity matrix. In particular, we calculate the critical boundary in parameter space where the transition from ordered to chaotic dynamics takes place. Employing a recently developed framework for analyzing real-time computations, we show that only near the critical boundary can such networks perform complex computations on time series. Hence, this result strongly supports conjectures that dynamical systems that are capable of doing complex computational tasks should operate near the edge of chaos, that is, the transition from ordered to chaotic dynamics.
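The distinction the abstract draws can be illustrated with a minimal simulation sketch (not the authors' code; network size, weight variance, input statistics, and the ±1 threshold convention below are illustrative assumptions). It builds a randomly connected network of threshold gates driven by a shared time-varying input and uses damage spreading, the divergence of two trajectories that initially differ in a single unit, as a rough indicator of ordered versus chaotic dynamics:

```python
import numpy as np

def damage_spread(N=250, sigma2=4.0, T=200, seed=0):
    """Mean Hamming distance after T steps between two trajectories
    of a random threshold-gate network that initially differ in one unit.

    Update rule (illustrative): x_i(t+1) = sgn(sum_j w_ij x_j(t) + u(t)),
    with weights w_ij ~ N(0, sigma2 / N) and a shared binary input u(t).
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(0.0, np.sqrt(sigma2 / N), size=(N, N))
    u = rng.choice([-1.0, 1.0], size=T)   # time-varying input signal
    x = rng.choice([-1.0, 1.0], size=N)   # reference trajectory
    y = x.copy()
    y[0] = -y[0]                          # flip one unit: the "damage"
    for t in range(T):
        x = np.where(W @ x + u[t] >= 0.0, 1.0, -1.0)
        y = np.where(W @ y + u[t] >= 0.0, 1.0, -1.0)
    return float(np.mean(x != y))
```

With small weight variance the input dominates and the perturbation dies out (ordered regime); with large variance the recurrent drive dominates and the perturbation spreads through the network (chaotic regime). Sweeping the variance (and the input statistics) and locating where the damage first persists gives a numerical estimate of the critical boundary the paper computes analytically.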



Published In

Neural Computation  Volume 16, Issue 7
July 2004
215 pages

Publisher

MIT Press

Cambridge, MA, United States



Cited By

  • (2024) Evolving Reservoirs for Meta Reinforcement Learning. Applications of Evolutionary Computation, pp. 36-60. DOI: 10.1007/978-3-031-56855-8_3. Online publication date: 3-Mar-2024.
  • (2022) Low-power-consumption physical reservoir computing model based on overdamped bistable stochastic resonance system. Neurocomputing, 468:C, pp. 137-147. DOI: 10.1016/j.neucom.2021.09.074. Online publication date: 11-Jan-2022.
  • (2021) Increasing liquid state machine performance with edge-of-chaos dynamics organized by astrocyte-modulated plasticity. Proceedings of the 35th International Conference on Neural Information Processing Systems, pp. 25703-25719. DOI: 10.5555/3540261.3542229. Online publication date: 6-Dec-2021.
  • (2021) Effect of Neural Decay Factors on Prediction Performance in Chaotic Echo State Networks. 2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 1888-1893. DOI: 10.1109/SMC52423.2021.9659012. Online publication date: 17-Oct-2021.
  • (2021) Stability of Gated Recurrent Unit Neural Networks: Convex Combination Formulation Approach. Journal of Optimization Theory and Applications, 188:1, pp. 291-306. DOI: 10.1007/s10957-020-01776-w. Online publication date: 1-Jan-2021.
  • (2020) Feed-forward versus recurrent architecture and local versus cellular automata distributed representation in reservoir computing for sequence memory learning. Artificial Intelligence Review, 53:7, pp. 5083-5112. DOI: 10.1007/s10462-020-09815-8. Online publication date: 1-Oct-2020.
  • (2019) Soft bodies as input reservoir: role of softness from the viewpoint of reservoir computing. 2019 International Symposium on Micro-NanoMechatronics and Human Science (MHS), pp. 1-7. DOI: 10.1109/MHS48134.2019.9249256. Online publication date: 1-Dec-2019.
  • (2019) Perturbations and phase transitions in swarm optimization algorithms. Natural Computing: An International Journal, 18:3, pp. 579-591. DOI: 10.1007/s11047-019-09741-x. Online publication date: 1-Sep-2019.
  • (2019) Global Asymptotic Stability and Stabilization of Long Short-Term Memory Neural Networks with Constant Weights and Biases. Journal of Optimization Theory and Applications, 181:1, pp. 231-243. DOI: 10.1007/s10957-018-1447-6. Online publication date: 1-Apr-2019.
  • (2019) Hyper-spherical Reservoirs for Echo State Networks. Artificial Neural Networks and Machine Learning – ICANN 2019: Workshop and Special Sessions, pp. 89-93. DOI: 10.1007/978-3-030-30493-5_9. Online publication date: 17-Sep-2019.
