Abstract
In this paper we give an overview of a recently developed theory [1, 2] for calculating finite size corrections to the equations describing the dynamics of separable neural networks, away from saturation. According to this theory, finite size effects are described by a linear-noise Fokker-Planck equation for the fluctuations (corresponding to an Ornstein-Uhlenbeck process), whose solution is fully characterized by its first two moments. The theory is applied to a particular problem in which detailed balance does not hold.
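The central objects in such a linear-noise description can be illustrated numerically. The sketch below (not the paper's model; all parameter values are hypothetical) simulates a one-dimensional Ornstein-Uhlenbeck process with Euler-Maruyama and checks that its stationary statistics are indeed captured by the first two moments, with variance sigma^2/(2*theta).

```python
import numpy as np

# Illustrative sketch, not the paper's network equations: a 1-d
# Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW, integrated
# with Euler-Maruyama. Its stationary law is Gaussian, so it is fully
# characterized by the first two moments: mean 0, variance sigma^2/(2*theta).

rng = np.random.default_rng(0)
theta, sigma = 1.0, 0.5          # hypothetical relaxation rate and noise strength
dt, n_steps, n_paths = 1e-2, 20_000, 2_000

x = np.zeros(n_paths)            # ensemble of fluctuation trajectories
for _ in range(n_steps):
    x += -theta * x * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)

empirical_mean = x.mean()
empirical_var = x.var()
theoretical_var = sigma**2 / (2 * theta)
print(empirical_mean, empirical_var, theoretical_var)  # mean near 0; variances agree up to sampling error
```

Running long enough to reach stationarity, the ensemble mean relaxes to zero and the ensemble variance settles near sigma^2/(2*theta) = 0.125, which is why the first two moments suffice to describe the fluctuations in this regime.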
References
Castellanos, A., Coolen, A.C.C., Viana, L.: Finite size effects in separable recurrent neural networks. J. Phys. A: Math. Gen. 31 (1998) 6615–6634.
Castellanos, A.: Ph.D. Thesis, CICESE-UNAM, México (1998).
Kohring, G.A.: J. Phys. A: Math. Gen. 23 (1990) 2237.
Coolen, A.C.C., Sherrington, D.: In: Taylor, J.G. (ed.) Mathematical Approaches to Neural Networks. North-Holland, Amsterdam, p. 293.
Gardiner, C.W.: Handbook of Stochastic Methods. Springer, Berlin (1990).
Copyright information
© 1999 Springer-Verlag Berlin Heidelberg
Cite this paper
Viana, L., Castellanos, A., Coolen, A.C.C. (1999). Finite size effects in neural networks. In: Mira, J., Sánchez-Andrés, J.V. (eds) Foundations and Tools for Neural Modeling. IWANN 1999. Lecture Notes in Computer Science, vol 1606. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0098196
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-66069-9
Online ISBN: 978-3-540-48771-5