
Pattern Classification Using a Set of Compact Hyperspheres

  • Conference paper in Neural Information Processing (ICONIP 2006)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4233)


Abstract

Prototype classifiers are one of the simplest and most intuitive approaches in pattern classification. However, they require careful positioning of the prototypes to capture the distribution of each class region. Classical methods, such as learning vector quantization (LVQ), are sensitive to the initial choice of the number and locations of the prototypes. To alleviate this problem, a new method is proposed that represents each class region by a set of compact hyperspheres. The number of hyperspheres and their locations are determined by formulating the problem as a set of quadratic optimization problems. Experimental results show that the proposed approach significantly outperforms LVQ and the Restricted Coulomb Energy (RCE) classifier on most performance measures.
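
The abstract does not include the paper's quadratic-programming formulation, so the Python sketch below only illustrates the general flavor of a hypersphere-prototype classifier: each class is summarized by one or more (center, radius) hyperspheres, and a query point is assigned to the class whose nearest hypersphere surface is closest (signed distance ||x - c|| - r, negative when the point is covered). The fitting step shown here, a single centroid-plus-median-radius sphere per class, is a hypothetical placeholder and not the optimization-based procedure proposed in the paper.

# Minimal, illustrative sketch of a hypersphere-prototype decision rule.
# The fitting procedure (one crude sphere per class) is a hypothetical
# stand-in for the paper's QP-based placement of compact hyperspheres.

import numpy as np


def fit_single_hypersphere(points):
    """Fit one crude hypersphere to a class: centroid center, median-distance radius."""
    center = points.mean(axis=0)
    radius = np.median(np.linalg.norm(points - center, axis=1))
    return center, radius


def fit_class_hyperspheres(X, y):
    """Return {class label: [(center, radius), ...]} with one sphere per class."""
    spheres = {}
    for label in np.unique(y):
        spheres[label] = [fit_single_hypersphere(X[y == label])]
    return spheres


def classify(x, spheres):
    """Assign x to the class whose nearest hypersphere surface is closest.

    The signed distance ||x - c|| - r is negative when x lies inside a sphere,
    so covered points are preferred and uncovered points fall back to the
    nearest class region.
    """
    best_label, best_dist = None, np.inf
    for label, class_spheres in spheres.items():
        for center, radius in class_spheres:
            d = np.linalg.norm(x - center) - radius
            if d < best_dist:
                best_label, best_dist = label, d
    return best_label


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    spheres = fit_class_hyperspheres(X, y)
    print(classify(np.array([3.5, 4.2]), spheres))  # expected: 1

If fit_class_hyperspheres returned several spheres per class instead of one, the same decision rule would cover multimodal class regions, which is the kind of distribution the compact-hypersphere representation is intended to capture.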



Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Atiya, A., Hashem, S., Fayed, H. (2006). Pattern Classification Using a Set of Compact Hyperspheres. In: King, I., Wang, J., Chan, LW., Wang, D. (eds) Neural Information Processing. ICONIP 2006. Lecture Notes in Computer Science, vol 4233. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11893257_13


  • DOI: https://doi.org/10.1007/11893257_13

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-46481-5

  • Online ISBN: 978-3-540-46482-2

  • eBook Packages: Computer Science (R0)
