Abstract
Combining an ensemble of classifiers has proven to be an effective way of improving performance on many classification problems. The Random Subspace Method, which trains a set of classifiers on different subsets of the feature space, has been shown to increase the accuracy of individual classifiers, notably the nearest neighbor classifier. Since data in many real-world domains also suffer from several aspects of uncertainty, including incompleteness and inconsistency, an Enhanced Evidential k-Nearest Neighbor classifier was recently introduced to handle the uncertainty pervading both the attribute values and the classifier outputs within the belief function framework. In this paper, we build on the Enhanced Evidential k-Nearest Neighbor classifier to construct an ensemble pattern classification system. More precisely, we adopt the Random Subspace Method in this context to build ensemble classifiers from imperfect data.
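To make the ensemble construction concrete, the sketch below illustrates the generic Random Subspace Method with a plain k-NN base learner standing in for the Enhanced Evidential k-NN; the function names and the scikit-learn dependency are illustrative assumptions, not the authors' implementation, and the belief-function handling of uncertain attributes and outputs is not reproduced.

# A minimal sketch of the Random Subspace Method, assuming NumPy and
# scikit-learn. A standard k-NN replaces the paper's Enhanced Evidential
# k-NN, and majority voting replaces evidential combination of outputs.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def train_random_subspace_ensemble(X, y, n_members=10, subspace_size=5, k=3):
    # Each ensemble member is trained on its own random subset of the features.
    ensemble = []
    for _ in range(n_members):
        feats = rng.choice(X.shape[1], size=subspace_size, replace=False)
        clf = KNeighborsClassifier(n_neighbors=k).fit(X[:, feats], y)
        ensemble.append((feats, clf))
    return ensemble

def predict_majority_vote(ensemble, X):
    # Assumes integer class labels; each member votes using its own subspace.
    votes = np.stack([clf.predict(X[:, feats]) for feats, clf in ensemble])
    return np.array([np.bincount(col).argmax() for col in votes.T])

Under these assumptions, y_pred = predict_majority_vote(train_random_subspace_ensemble(X_train, y_train), X_test) gives the ensemble decision; the paper itself combines the members' outputs within the belief function framework rather than by voting.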
Cite this paper
Trabelsi, A., Elouedi, Z., Lefevre, E. (2017). Ensemble Enhanced Evidential k-NN Classifier Through Random Subspaces. In: Antonucci, A., Cholvy, L., Papini, O. (eds.) Symbolic and Quantitative Approaches to Reasoning with Uncertainty. ECSQARU 2017. Lecture Notes in Computer Science, vol. 10369. Springer, Cham. https://doi.org/10.1007/978-3-319-61581-3_20