
Binary Perceptron Learning Algorithm Using Simplex-Method

  • Conference paper
Artificial Intelligence and Soft Computing (ICAISC 2012)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7267)

Abstract

A number of researchers, following E. Gardner, have proved that the maximum achievable memory load of a binary perceptron is 2. A learning algorithm is proposed that makes it possible to reach and even exceed this critical load. The learning task is reduced to a linear programming problem. The proposed algorithm develops the ideas of Krauth and Mezard: it allows one to construct networks whose storage capacity and noise stability are comparable to those of the Krauth-Mezard algorithm, while the suggested modification outperforms it.
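As a rough illustration of the reduction mentioned in the abstract (not the authors' code), the sketch below poses perceptron learning with a stability margin as a linear program and solves it with SciPy's linprog. The L-infinity bound on the weights, the random patterns, and all variable names are assumptions made for this example only; the paper's own formulation may differ.

    # Hypothetical sketch: maximal-stability perceptron learning as a linear program.
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)
    N, M = 50, 40                           # synapses and stored patterns (example sizes)
    X = rng.choice([-1, 1], size=(M, N))    # binary input patterns xi^mu
    y = rng.choice([-1, 1], size=M)         # desired outputs sigma^mu

    # Variables: [w_1 .. w_N, kappa]; maximize kappa  <=>  minimize -kappa
    c = np.zeros(N + 1)
    c[-1] = -1.0

    # Margin constraints: sigma^mu * (w . xi^mu) >= kappa
    #   ->  -sigma^mu * (xi^mu . w) + kappa <= 0
    A_ub = np.hstack([-(y[:, None] * X), np.ones((M, 1))])
    b_ub = np.zeros(M)

    # Bound the weights (L-infinity normalisation) so the LP stays bounded.
    bounds = [(-1.0, 1.0)] * N + [(0.0, None)]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    w, kappa = res.x[:N], res.x[-1]
    print(f"achieved margin kappa = {kappa:.3f}")
    print("all patterns stored:", np.all(y * (X @ w) > 0))

A positive optimal kappa means every pattern is stored with a stability margin; under this formulation the margin is the quantity the Krauth-Mezard-style learning seeks to maximise.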

References

  1. Karandashev, I., Kryzhanovsky, B., Litinskii, L.: Hopfield-type memory without catastrophic forgetting. In: Future Computing 2011: The Third International Conference on Future Computational Technologies and Applications, pp. 57–61. IARIA (2011)

  2. Gardner, E., Derrida, B.: Optimal storage properties of neural network models. J. Phys. A: Math. Gen. 21, 271–284 (1988)

  3. Hertz, J., Krogh, A., Palmer, R.: Introduction to the Theory of Neural Computation. Addison-Wesley, Massachusetts (1991)

  4. Kryzhanovsky, B.V., Kryzhanovsky, V.M.: Binary Optimization: On the Probability of a Local Minimum Detection in Random Search. In: Rutkowski, L., Tadeusiewicz, R., Zadeh, L.A., Zurada, J.M. (eds.) ICAISC 2008. LNCS (LNAI), vol. 5097, pp. 89–100. Springer, Heidelberg (2008)

  5. Krauth, W., Mezard, M.: Learning algorithms with optimal stability in neural networks. J. Phys. A: Math. Gen. 20, L745–L752 (1987)

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kryzhanovskiy, V., Zhelavskaya, I., Karandashev, J. (2012). Binary Perceptron Learning Algorithm Using Simplex-Method. In: Rutkowski, L., Korytkowski, M., Scherer, R., Tadeusiewicz, R., Zadeh, L.A., Zurada, J.M. (eds) Artificial Intelligence and Soft Computing. ICAISC 2012. Lecture Notes in Computer Science, vol 7267. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-29347-4_13

  • DOI: https://doi.org/10.1007/978-3-642-29347-4_13

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-29346-7

  • Online ISBN: 978-3-642-29347-4

  • eBook Packages: Computer Science, Computer Science (R0)
