
Dynamic and Static Weighting in Classifier Fusion

  • Conference paper
Pattern Recognition and Image Analysis (IbPRIA 2005)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 3523)


Abstract

When a Multiple Classifier System is employed, one of the most popular methods for classifier fusion is simple majority voting. However, when the performance of the ensemble members is not uniform, the effectiveness of this voting scheme suffers. In this paper, a comparison between simple and weighted voting (both dynamic and static) is presented. New weighting methods, mainly in the direction of the dynamic approach, are also introduced. Experimental results with several real-problem data sets demonstrate the advantages of the weighting strategies over the simple voting scheme. When comparing the dynamic and static approaches, the results show that dynamic weighting is superior in terms of classification accuracy.
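The three fusion rules contrasted in the abstract are simple to state in code. Below is a minimal Python sketch of simple majority voting versus weighted voting with static and dynamic weights. The concrete weight choices shown (validation-set accuracy for the static weights, per-instance confidence for the dynamic weights) are illustrative assumptions, not the paper's exact formulas.

    # Illustrative sketch: classifier-fusion voting rules.
    # The weighting functions below are assumptions for illustration only.
    from collections import Counter

    def majority_vote(predictions):
        """Simple majority voting: each classifier casts one unweighted vote."""
        return Counter(predictions).most_common(1)[0][0]

    def weighted_vote(predictions, weights):
        """Each classifier's vote counts with its weight; the label with the
        largest total weight wins."""
        scores = {}
        for label, w in zip(predictions, weights):
            scores[label] = scores.get(label, 0.0) + w
        return max(scores, key=scores.get)

    # Three ensemble members classify one test instance.
    preds = ["A", "B", "B"]
    print(majority_vote(preds))                   # -> "B" (2 votes vs. 1)

    # Static weighting: weights fixed once for all instances, e.g. from each
    # classifier's validation accuracy (an assumed choice).
    # B still wins: 0.60 + 0.55 = 1.15 > 0.90.
    static_weights = [0.90, 0.60, 0.55]
    print(weighted_vote(preds, static_weights))   # -> "B"

    # Dynamic weighting: weights recomputed for every test instance, e.g. from
    # each classifier's confidence on this instance (also an assumed choice).
    # The first classifier is very confident here, so A wins: 0.95 > 0.50.
    dynamic_weights = [0.95, 0.30, 0.20]
    print(weighted_vote(preds, dynamic_weights))  # -> "A"

Note how the dynamic rule can overturn the majority on a particular instance when the dissenting classifier is highly confident there; this per-instance adaptivity is exactly what fixed static weights cannot provide.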




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Valdovinos, R.M., Sánchez, J.S., Barandela, R. (2005). Dynamic and Static Weighting in Classifier Fusion. In: Marques, J.S., Pérez de la Blanca, N., Pina, P. (eds) Pattern Recognition and Image Analysis. IbPRIA 2005. Lecture Notes in Computer Science, vol 3523. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11492542_8


  • DOI: https://doi.org/10.1007/11492542_8

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-26154-4

  • Online ISBN: 978-3-540-32238-2

  • eBook Packages: Computer Science (R0)
