
Training the Hopfield Neural Network for Classification Using a STDP-Like Rule

  • Conference paper
Neural Information Processing (ICONIP 2017)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 10636)


Abstract

The backpropagation algorithm has played a critical role in training deep neural networks. Many studies suggest that the brain may implement a similar algorithm, but most of the proposed models require symmetric weights between neurons, which makes them less biologically plausible. Inspired by recent work by Bengio et al., we show that the well-known Hopfield neural network (HNN) can be trained in a biologically plausible way. The network can take hierarchical architectures, and the weights between neurons need not be symmetric. The network runs in two alternating phases. The weight change is proportional to the firing rate of the presynaptic neuron and to the change in the state (or membrane potential) of the postsynaptic neuron between the two phases, which approximates a classical spike-timing-dependent plasticity (STDP) rule. Several HNNs with one or two hidden layers are trained on the MNIST dataset, and all of them converge to low training errors. These results further our understanding of how the brain may implement supervised learning.
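The two-phase rule described in the abstract can be sketched in a few lines. Everything below is an illustrative assumption rather than the authors' implementation: the toy layer sizes, the relaxation dynamics, the weak-clamping scheme (in the spirit of the energy-based formulation of Bengio et al.), and all parameter values are hypothetical. The sketch only shows the shape of the update: the weight change is the presynaptic firing rate times the change in the postsynaptic state between the free and the clamped phase.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (hypothetical; the paper trains MNIST-scale networks)
n_pre, n_post = 4, 3
W = rng.normal(scale=0.1, size=(n_pre, n_post))

def rho(v):
    # Hard-sigmoid firing-rate nonlinearity (an assumption)
    return np.clip(v, 0.0, 1.0)

x = rng.random(n_pre)  # presynaptic firing rates (fixed input)

def settle(W, x, v, clamp=None, steps=50, dt=0.1):
    # Relax the postsynaptic membrane potentials toward a fixed point;
    # `clamp`, if given, weakly pulls the output toward a target
    # (the second, "clamped" phase).
    for _ in range(steps):
        dv = -v + rho(x) @ W
        if clamp is not None:
            dv += clamp - v
        v = v + dt * dv
    return v

v_free = settle(W, x, np.zeros(n_post))          # first (free) phase
target = np.array([1.0, 0.0, 0.0])
v_clamped = settle(W, x, v_free, clamp=target)   # second (clamped) phase

# STDP-like update: presynaptic rate x postsynaptic potential change
eta = 0.5
W += eta * np.outer(rho(x), v_clamped - v_free)
```

After the update, re-running the free phase yields an output closer to the target, which is the qualitative behavior the abstract reports on MNIST.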


References

  1. Baldi, P., Pineda, F.J.: Contrastive learning and neural oscillations. Neural Comput. 3(4), 526–545 (1991)

  2. Bengio, Y., Fischer, A.: Early inference in energy-based models approximates back-propagation. arXiv preprint arXiv:1510.02777 (2015)

  3. Bengio, Y., Lee, D.H., Bornschein, J., Lin, Z.: Towards biologically plausible deep learning. arXiv preprint arXiv:1502.04156 (2015)

  4. Bengio, Y., Mesnard, T., Fischer, A., Zhang, S., Wu, Y.: STDP-compatible approximation of backpropagation in an energy-based model. Neural Comput. 29, 555–577 (2017)

  5. Bi, G.Q., Poo, M.M.: Synaptic modification by correlated activity: Hebb's postulate revisited. Annu. Rev. Neurosci. 24(1), 139–166 (2001)

  6. Forti, M., Tesi, A.: New conditions for global stability of neural networks with application to linear and quadratic programming problems. IEEE Trans. Circ. Syst. I: Fundam. Theory Appl. 42(7), 354–366 (1995)

  7. Grossberg, S.: Competitive learning: from interactive activation to adaptive resonance. Cogn. Sci. 11(1), 23–63 (1987)

  8. Hopfield, J.J.: Neurons with graded response have collective computational properties like those of two-state neurons. Proc. Natl. Acad. Sci. 81(10), 3088–3092 (1984)

  9. Hu, S., Wang, J.: Absolute exponential stability of a class of continuous-time recurrent neural networks. IEEE Trans. Neural Netw. 14(1), 35–45 (2003)

  10. LeCun, Y., Bengio, Y., Hinton, G.: Deep learning. Nature 521(7553), 436–444 (2015)

  11. Liao, Q., Leibo, J.Z., Poggio, T.: How important is weight symmetry in backpropagation? In: Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence (AAAI), pp. 1837–1844 (2016)

  12. Lillicrap, T.P., Cownden, D., Tweed, D.B., Akerman, C.J.: Random feedback weights support learning in deep neural networks. arXiv preprint arXiv:1411.0247 (2014)

  13. Mazzoni, P., Andersen, R.A., Jordan, M.I.: A more biologically plausible learning rule for neural networks. Proc. Natl. Acad. Sci. 88(10), 4433–4437 (1991)

  14. Movellan, J.R.: Contrastive Hebbian learning in the continuous Hopfield model. In: Connectionist Models: Proceedings of the 1990 Summer School, pp. 10–17 (1991)

  15. Scellier, B., Bengio, Y.: Towards a biologically plausible backprop. arXiv preprint arXiv:1602.05179 (2016)

  16. Xie, X., Seung, H.S.: Equivalence of backpropagation and contrastive Hebbian learning in a layered network. Neural Comput. 15(2), 441–454 (2003)

Acknowledgments

This work was supported in part by the National Natural Science Foundation of China under Grant 91420201, Grant 61332007, Grant 61621136008 and Grant 61620106010, in part by the Beijing Municipal Science and Technology Commission under Grant Z161100000216126, and in part by Huawei Technology under Contract YB2015120018.

Author information

Corresponding author

Correspondence to Xiaolin Hu.

Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Hu, X., Wang, T. (2017). Training the Hopfield Neural Network for Classification Using a STDP-Like Rule. In: Liu, D., Xie, S., Li, Y., Zhao, D., El-Alfy, E.S. (eds) Neural Information Processing. ICONIP 2017. Lecture Notes in Computer Science, vol 10636. Springer, Cham. https://doi.org/10.1007/978-3-319-70090-8_74

  • DOI: https://doi.org/10.1007/978-3-319-70090-8_74

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-70089-2

  • Online ISBN: 978-3-319-70090-8

  • eBook Packages: Computer Science (R0)
