Abstract
Combining multiple classifiers is expected to improve classification accuracy, and research on combination strategies for multiple classifiers has become a popular topic. For a crisp classifier, which returns a discrete class label rather than a set of real-valued probabilities over the classes, the most commonly used combination method is majority voting. Both majority voting and weighted majority voting are classifier-based voting schemes, in which a given base classifier casts every vote with a single fixed confidence. However, each classifier should have different voting priorities in different regions of its learning space, and these differences cannot be reflected by a classifier-based voting strategy. In this paper, we propose two further voting strategies that take such differences into consideration. We apply the AdaBoost algorithm to generate multiple classifiers and vary its voting strategy. The predictive ability of each voting strategy is then tested and compared on 8 datasets taken from the UCI Machine Learning Repository. The experimental results show that one of the proposed voting strategies, the sample-based voting scheme, achieves better classification accuracy.
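The distinction between the two families of schemes can be illustrated with a minimal sketch. The `weighted_majority_vote` function below is standard classifier-based voting (with unit weights it reduces to plain majority voting); `sample_based_vote` is a hypothetical illustration only, since the abstract does not specify how the paper's sample-based scheme computes per-sample confidences. The names and the `confidence_fns` interface are assumptions, not the authors' method.

```python
from collections import Counter

def weighted_majority_vote(predictions, weights):
    """Classifier-based voting: each crisp classifier casts its discrete
    label with one fixed weight (e.g. AdaBoost's alpha_t), regardless of
    the test sample. With all weights equal this is plain majority voting."""
    tally = Counter()
    for label, w in zip(predictions, weights):
        tally[label] += w
    # Return the label with the largest accumulated weight.
    return tally.most_common(1)[0][0]

def sample_based_vote(predictions, confidence_fns, x):
    """Hypothetical sample-based voting (illustration, not the paper's
    exact scheme): each classifier's voting weight varies with the test
    sample x, via an assumed per-classifier confidence function that is
    meant to reflect how well x falls within that classifier's learning
    space."""
    tally = Counter()
    for label, conf_fn in zip(predictions, confidence_fns):
        tally[label] += conf_fn(x)
    return tally.most_common(1)[0][0]
```

For example, with predictions `['A', 'B', 'A']` and fixed weights `[0.2, 0.9, 0.3]`, classifier-based voting returns `'B'` even though two of three classifiers say `'A'`; a sample-based scheme could instead favor whichever classifiers are most confident near the particular test point.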
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Sun, Y., Kamel, M.S., Wong, A.K.C. (2005). Empirical Study on Weighted Voting Multiple Classifiers. In: Singh, S., Singh, M., Apte, C., Perner, P. (eds) Pattern Recognition and Data Mining. ICAPR 2005. Lecture Notes in Computer Science, vol 3686. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11551188_36
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-28757-5
Online ISBN: 978-3-540-28758-2