Abstract
The backpropagation algorithm has played a critical role in training deep neural networks. Many studies suggest that the brain may implement a similar algorithm, but most of the proposed models require symmetric weights between neurons, which makes them less biologically plausible. Inspired by recent work by Bengio et al., we show that the well-known Hopfield neural network (HNN) can be trained in a biologically plausible way. The network can adopt hierarchical architectures, and the weights between neurons need not be symmetric. The network runs in two alternating phases, and the weight change is proportional to the firing rate of the presynaptic neuron and the change in the state (or membrane potential) of the postsynaptic neuron between the two phases, which approximates a classical spike-timing-dependent plasticity (STDP) rule. Several HNNs with one or two hidden layers are trained on the MNIST dataset, and all of them converge to low training errors. These results further our understanding of how the brain might implement supervised learning.
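To make the two-phase scheme concrete, here is a minimal sketch in Python/NumPy. It follows the weakly-clamped, energy-based formulation of Scellier and Bengio (2016) cited below; the nonlinearity rho, the relaxation dynamics, the nudging strength beta, and all function names are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def rho(u):
    # Firing-rate nonlinearity; a hard sigmoid is one common choice
    # in energy-based models (the paper's exact form may differ).
    return np.clip(u, 0.0, 1.0)

def relax(u0, W, x_idx, x, y_idx=None, y=None, beta=0.0, steps=100, dt=0.1):
    """Run the leaky rate dynamics toward a fixed point.
    Input units are clamped to x; if y is given, the output units are
    weakly nudged toward it with strength beta (the second phase)."""
    u = u0.copy()
    for _ in range(steps):
        u[x_idx] = x                       # hard-clamp the input units
        du = -u + W.T @ rho(u)             # postsynaptic drive from presynaptic rates
        if y is not None:
            du[y_idx] += beta * (y - u[y_idx])  # weak clamping of outputs
        u += dt * du
    u[x_idx] = x
    return u

def train_step(W, u0, x_idx, x, y_idx, y, beta=1.0, lr=0.01):
    """One two-phase update: the weight change is the presynaptic firing
    rate times the postsynaptic state change between the two phases."""
    u_free = relax(u0, W, x_idx, x)                            # phase 1: free
    u_clamp = relax(u_free, W, x_idx, x, y_idx, y, beta=beta)  # phase 2: clamped
    # STDP-like rule: dW[i, j] = rho(u_i_free) * (u_j_clamped - u_j_free)
    W += (lr / beta) * np.outer(rho(u_free), u_clamp - u_free)
    return W, u_free
```

Note that the outer-product update changes W[i, j] and W[j, i] independently, so the learned weights are generally asymmetric, consistent with the claim above.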
References
Baldi, P., Pineda, F.J.: Contrastive learning and neural oscillations. Neural Comput. 3(4), 526–545 (1991)
Bengio, Y., Fischer, A.: Early inference in energy-based models approximates back-propagation. arXiv preprint arXiv:1510.02777 (2015)
Bengio, Y., Lee, D.H., Bornschein, J., Lin, Z.: Towards biologically plausible deep learning. arXiv preprint arXiv:1502.04156 (2015)
Bengio, Y., Mesnard, T., Fischer, A., Zhang, S., Wu, Y.: STDP-compatible approximation of backpropagation in an energy-based model. Neural Comput. 29, 555–577 (2017)
Bi, G.Q., Poo, M.M.: Synaptic modification by correlated activity: Hebb’s postulate revisited. Annu. Rev. Neurosci. 24(1), 139–166 (2001)
Forti, M., Tesi, A.: New conditions for global stability of neural networks with application to linear and quadratic programming problems. IEEE Trans. Circ. Syst. I: Fundam. Theory Appl. 42(7), 354–366 (1995)
Grossberg, S.: Competitive learning: from interactive activation to adaptive resonance. Cogn. Sci. 11(1), 23–63 (1987)
Hopfield, J.J.: Neurons with graded response have collective computational properties like those of two-state neurons. Proc. Natl. Acad. Sci. 81(10), 3088–3092 (1984)
Hu, S., Wang, J.: Absolute exponential stability of a class of continuous-time recurrent neural networks. IEEE Trans. Neural Netw. 14(1), 35–45 (2003)
LeCun, Y., Bengio, Y., Hinton, G.: Deep learning. Nature 521(7553), 436–444 (2015)
Liao, Q., Leibo, J.Z., Poggio, T.: How important is weight symmetry in backpropagation? In: Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence (AAAI), pp. 1837–1844 (2016)
Lillicrap, T.P., Cownden, D., Tweed, D.B., Akerman, C.J.: Random feedback weights support learning in deep neural networks. arXiv preprint arXiv:1411.0247 (2014)
Mazzoni, P., Andersen, R.A., Jordan, M.I.: A more biologically plausible learning rule for neural networks. Proc. Natl. Acad. Sci. 88(10), 4433–4437 (1991)
Movellan, J.R.: Contrastive Hebbian learning in the continuous Hopfield model. In: Connectionist Models: Proceedings of the 1990 Summer School, pp. 10–17 (1991)
Scellier, B., Bengio, Y.: Towards a biologically plausible backprop. arXiv preprint arXiv:1602.05179 (2016)
Xie, X., Seung, H.S.: Equivalence of backpropagation and contrastive Hebbian learning in a layered network. Neural Comput. 15(2), 441–454 (2003)
Acknowledgments
This work was supported in part by the National Natural Science Foundation of China under Grant 91420201, Grant 61332007, Grant 61621136008 and Grant 61620106010, in part by the Beijing Municipal Science and Technology Commission under Grant Z161100000216126, and in part by Huawei Technology under Contract YB2015120018.
Copyright information
© 2017 Springer International Publishing AG
About this paper
Cite this paper
Hu, X., Wang, T. (2017). Training the Hopfield Neural Network for Classification Using a STDP-Like Rule. In: Liu, D., Xie, S., Li, Y., Zhao, D., El-Alfy, ES. (eds) Neural Information Processing. ICONIP 2017. Lecture Notes in Computer Science, vol 10636. Springer, Cham. https://doi.org/10.1007/978-3-319-70090-8_74
DOI: https://doi.org/10.1007/978-3-319-70090-8_74
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-70089-2
Online ISBN: 978-3-319-70090-8