Definition
A decision stump is a Decision Tree that uses only a single attribute for splitting. For discrete attributes, this typically means that the tree consists of only a single interior node (i.e., the root has only leaves as successor nodes). If the attribute is numerical, the tree may be more complex.
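For a numeric attribute, fitting the simplest form of stump amounts to searching over candidate thresholds for the single split that minimizes training error. The following is a minimal sketch of that procedure; the function names (`fit_stump`, `predict_stump`) and the exhaustive midpoint search are illustrative choices, not part of this entry.

```python
# A minimal decision stump for one numeric attribute, fit by
# exhaustive search over candidate thresholds (illustrative sketch).
import numpy as np

def fit_stump(x, y):
    """Find the threshold and leaf labels minimizing 0/1 training error.

    x : 1-D array of numeric attribute values
    y : 1-D array of binary class labels (0 or 1)
    Returns (threshold, left_label, right_label).
    """
    best = None
    # Candidate thresholds: midpoints between consecutive sorted values.
    xs = np.unique(x)
    thresholds = (xs[:-1] + xs[1:]) / 2.0
    for t in thresholds:
        left = x <= t
        for left_label in (0, 1):
            right_label = 1 - left_label
            pred = np.where(left, left_label, right_label)
            err = np.mean(pred != y)
            if best is None or err < best[0]:
                best = (err, t, left_label, right_label)
    return best[1], best[2], best[3]

def predict_stump(stump, x):
    t, left_label, right_label = stump
    return np.where(x <= t, left_label, right_label)

# Example: one informative numeric attribute.
x = np.array([1.0, 2.0, 3.0, 6.0, 7.0, 8.0])
y = np.array([0, 0, 0, 1, 1, 1])
stump = fit_stump(x, y)
print(stump)                    # e.g. (4.5, 0, 1)
print(predict_stump(stump, x))  # [0 0 0 1 1 1]
```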
Decision stumps perform surprisingly well on some commonly used benchmark datasets from the UCI repository (Holte, 1993); this illustrates that learners with high Bias and low Variance can perform well because they are less prone to Overfitting. Decision stumps are also often used as weak learners in Ensemble Methods such as boosting (Freund & Schapire, 1996).
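To make the role of stumps as weak learners concrete, here is a short sketch of boosting with stumps, assuming scikit-learn is available: a `DecisionTreeClassifier` with `max_depth=1` is exactly a decision stump, and each boosting round fits one stump on reweighted training data. The `estimator` keyword assumes scikit-learn 1.2 or later; the dataset is synthetic and purely illustrative.

```python
# Stumps as weak learners in AdaBoost (sketch, assuming scikit-learn).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each of the 50 boosting rounds fits one depth-1 tree (a stump).
clf = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),
    n_estimators=50,
    random_state=0,
)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```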
Cross References
Decision Tree
Ensemble Methods
Recommended Reading
Freund, Y., & Schapire, R. E. (1996). Experiments with a new boosting algorithm. In L. Saitta (Ed.), Proceedings of the 13th International Conference on Machine Learning, Bari, Italy (pp. 148–156). San Francisco: Morgan Kaufmann.
Holte, R. C. (1993). Very simple classification rules perform well on most commonly used datasets. Machine Learning, 11, 63–91.