DOI: 10.5555/645327.649533
Article

An Empirical Study of MetaCost Using Boosting Algorithms

Published: 31 May 2000

Abstract

MetaCost is a recently proposed procedure that converts an error-based learning algorithm into a cost-sensitive one. This paper investigates two important issues concerning the procedure that were not addressed in the paper proposing MetaCost. First, no comparison was made between MetaCost's final model and the internal cost-sensitive classifier on which MetaCost depends; it is plausible that the internal classifier outperforms the final model while avoiding the additional computation required to derive it. Second, MetaCost assumes that its internal cost-sensitive classifier is obtained by applying the minimum expected cost criterion, and it is unclear whether violating this assumption affects MetaCost's performance. We study these issues using two boosting procedures and compare the results with the original form of MetaCost, which employs bagging.
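To make the procedure concrete, the Python sketch below illustrates the two ingredients named above: the minimum expected cost criterion and MetaCost-style relabelling of the training set via an internal bagged ensemble (the component this study replaces with boosting). It is a minimal sketch under stated assumptions, not the paper's implementation; `fit_predict_proba` is a hypothetical stand-in for any base learner that, given training data and evaluation points, returns class-probability estimates.

```python
import numpy as np

def min_expected_cost_class(class_probs, cost_matrix):
    """Minimum expected cost criterion: for each example, choose the class i
    that minimises sum_j P(j|x) * C(i, j), where C(i, j) is the cost of
    predicting class i when the true class is j."""
    expected_cost = class_probs @ cost_matrix.T      # shape (n_samples, n_classes)
    return np.argmin(expected_cost, axis=1)

def metacost_relabel(X, y, cost_matrix, fit_predict_proba, n_bags=10, seed=0):
    """MetaCost-style relabelling (sketch): estimate P(j|x) with a bagged
    ensemble of the base learner, then replace each training label with its
    minimum-expected-cost class.  Retraining the base learner on (X, new_y)
    gives the final model; applying the same criterion directly to the
    ensemble's averaged probabilities corresponds to the internal
    cost-sensitive classifier discussed in the abstract."""
    rng = np.random.default_rng(seed)
    n, k = len(y), cost_matrix.shape[0]
    probs = np.zeros((n, k))
    for _ in range(n_bags):
        idx = rng.integers(0, n, size=n)               # bootstrap resample
        probs += fit_predict_proba(X[idx], y[idx], X)  # hypothetical helper
    probs /= n_bags                                    # averaged P(j|x)
    return min_expected_cost_class(probs, cost_matrix)
```

For instance, with cost_matrix = [[0, 5], [1, 0]] (predicting class 0 for a true class-1 example costs 5, the reverse mistake costs 1), an example with estimated probabilities (0.7, 0.3) has expected costs 1.5 and 0.7, so it is relabelled to the less probable class 1; this is how the relabelled training set encodes the cost information.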

Cited By

  • (2013) A survey of cost-sensitive decision tree induction algorithms, ACM Computing Surveys, 45(2):1-35, DOI: 10.1145/2431211.2431215, online publication date: 12-Mar-2013
  • (2009) Exploratory undersampling for class-imbalance learning, IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 39(2):539-550, DOI: 10.1109/TSMCB.2008.2007853, online publication date: 1-Apr-2009
  • (2006) Training Cost-Sensitive Neural Networks with Methods Addressing the Class Imbalance Problem, IEEE Transactions on Knowledge and Data Engineering, 18(1):63-77, DOI: 10.1109/TKDE.2006.17, online publication date: 1-Jan-2006

    Published In

    ECML '00: Proceedings of the 11th European Conference on Machine Learning
    May 2000
    452 pages
    ISBN: 3540676023

    Publisher

    Springer-Verlag

    Berlin, Heidelberg
