A nonlinear extension of the Generalized Hebbian learning

  • Published in: Neural Processing Letters

Abstract

In this letter, we introduce a nonlinear hierarchical PCA-type neural network with a simple architecture. The learning algorithm is a nonlinear extension of Sanger's well-known Generalized Hebbian Algorithm (GHA) and is derived from a nonlinear optimization criterion. Experiments with sinusoidal data show that the neurons become sensitive to different sinusoids, a separation property that standard linear PCA algorithms do not have.
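The full text is not reproduced here, so the exact update rule of the letter is not available. As a rough illustration of what a nonlinear extension of Sanger's GHA can look like, the sketch below applies an elementwise nonlinearity g (tanh here) to the neuron outputs inside the standard GHA update. The function name nonlinear_gha_step, the choice of g, the learning rate, and the sinusoidal toy data (frequencies, window length) are assumptions for illustration only and are not taken from the letter.

```python
# Illustrative sketch only: one plausible nonlinear GHA-style update,
# not necessarily the rule derived in the letter.
import numpy as np

def nonlinear_gha_step(W, x, lr=0.01, g=np.tanh):
    """One update of an assumed nonlinear GHA-style rule.

    W  : (m, n) weight matrix, one row per neuron
    x  : (n,) zero-mean input sample
    g  : elementwise nonlinearity applied to the neuron outputs
    """
    y = W @ x                          # linear outputs of the m neurons
    gy = g(y)                          # nonlinear outputs
    # Sanger-style lower-triangular deflation: neuron k is trained only on
    # what neurons 1..k-1 have not already captured, giving the hierarchy.
    LT = np.tril(np.outer(gy, gy))
    W = W + lr * (np.outer(gy, x) - LT @ W)
    return W

# Toy usage: a sum of two sinusoids plus noise, loosely mimicking the
# sinusoidal experiments mentioned in the abstract (frequencies, window
# length and learning rate are assumed values).
rng = np.random.default_rng(0)
t = np.arange(2000)
signal = np.sin(0.3 * t) + np.sin(0.8 * t) + 0.1 * rng.standard_normal(t.size)
n, m = 16, 2                           # window length and number of neurons
W = 0.1 * rng.standard_normal((m, n))
for i in range(t.size - n):
    x = signal[i:i + n]
    W = nonlinear_gha_step(W, x - x.mean(), lr=0.005)
```

With g equal to the identity, this rule reduces to the ordinary linear GHA; the lower-triangular term is what makes the network hierarchical in Sanger's sense.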

References

  1. A. Cichocki, R. Unbehauen. Neural networks for optimization and signal processing, Wiley, New York, 1993.

  2. S. Haykin. Neural networks: a comprehensive foundation, IEEE Press and Macmillan, New York, 1994.

  3. T.D. Sanger. Optimal unsupervised learning in a single-layer linear feedforward neural network, Neural Networks, vol. 2, pp. 459–473, 1989.

  4. J. Karhunen, J. Joutsensalo. Nonlinear generalizations of principal component learning algorithms, Proc. 1993 Int. Joint Conf. on Neural Networks (Nagoya, Japan), vol. 3, pp. 2599–2602, 1993.

  5. J. Karhunen. Optimization criteria and nonlinear PCA neural networks, Proc. 1994 IEEE Int. Conf. on Neural Networks (Orlando), vol. II, pp. 1242–1246, 1994.

  6. E. Oja, H. Ogawa, J. Wangviwattana. Learning in nonlinear constrained Hebbian networks, in T. Kohonen et al., eds., Artificial Neural Networks, North-Holland, Amsterdam, pp. 385–390, 1991.

  7. J. Karhunen, J. Joutsensalo. Representation and separation of signals using nonlinear PCA type learning, Neural Networks, vol. 7, pp. 113–127, 1994.

  8. F. Palmieri. Hebbian learning and self-association in nonlinear neural networks, Proc. 1994 IEEE Int. Conf. on Neural Networks (Orlando), vol. II, pp. 1258–1263, 1994.

  9. C.W. Therrien. Discrete random signals and statistical signal processing, Prentice-Hall, Englewood Cliffs, NJ, 1992.

About this article

Cite this article

Joutsensalo, J., Karhunen, J. A nonlinear extension of the Generalized Hebbian learning. Neural Process Lett 2, 5–8 (1995). https://doi.org/10.1007/BF02312375
