
Dreaming of mathematical neuroscience for half a century

Published: 01 January 2013

Abstract

Theoreticians have been enchanted by the secrets of the brain for many years: how and why does it work so well? There is a long history of searching for its mechanisms. Theoretical and mathematical scientists have proposed various models of neural networks, which have led to the birth of a new field of research. We can think of the 'pre-historic' period of Rashevsky and Wiener; then the period of perceptrons, which marked the beginning of learning machines; neurodynamics approaches; and later connectionist approaches. We are now in the period of computational neuroscience. I have worked in this field for nearly half a century and have experienced its repeated rise and fall. Having reached a very old age, I would like to recount my own endeavors over half a century toward establishing mathematical neuroscience, from a personal, even biased, point of view. It would be my pleasure if my experiences could encourage young researchers to participate in mathematical neuroscience.


Published In

Neural Networks, Volume 37, January 2013, 200 pages

Publisher

Elsevier Science Ltd., United Kingdom


Author Tags

  1. Dynamics of neural field
  2. Mathematical neuroscience
  3. Neural learning
  4. Statistical neurodynamics
