
Decision Stump

  • Reference work entry
Encyclopedia of Machine Learning

Definition

A decision stump is a Decision Tree that uses only a single attribute for splitting. For discrete attributes, this typically means that the tree consists of only a single interior node (i.e., the root has only leaves as successor nodes). If the attribute is numerical, the tree may be more complex.
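For a numerical attribute, a stump amounts to choosing a single threshold and a polarity. The sketch below is illustrative only (the function names and representation are not from this entry); it fits a stump on one numerical attribute for binary labels in {-1, +1} by exhaustively testing the midpoints between consecutive distinct values.

```python
def fit_stump(xs, ys):
    """Fit a decision stump on a single numerical attribute.

    Returns (threshold, polarity, errors): examples with x <= threshold
    are predicted as `polarity`, the rest as `-polarity`.
    Labels are assumed to be in {-1, +1}.
    """
    best = (None, None, len(ys) + 1)  # (threshold, polarity, #errors)
    values = sorted(set(xs))
    # Only midpoints between consecutive distinct values (plus the two
    # extremes) can change the split, so these are the only candidates.
    thresholds = [(a + b) / 2 for a, b in zip(values, values[1:])]
    thresholds = [values[0] - 1] + thresholds + [values[-1] + 1]
    for t in thresholds:
        for polarity in (+1, -1):
            preds = [polarity if x <= t else -polarity for x in xs]
            errors = sum(p != y for p, y in zip(preds, ys))
            if errors < best[2]:
                best = (t, polarity, errors)
    return best

def predict_stump(stump, x):
    t, polarity, _ = stump
    return polarity if x <= t else -polarity
```

Fitting takes O(n) candidate thresholds, each evaluated in O(n), so the whole search is quadratic here; sorting once and sweeping the threshold brings it down to O(n log n).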

Decision stumps perform surprisingly well on some commonly used benchmark datasets from the UCI repository (Holte, 1993), which illustrates that learners with a high Bias and low Variance may perform well because they are less prone to Overfitting. Decision stumps are also often used as weak learners in Ensemble Methods such as boosting (Freund & Schapire, 1996).
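As a sketch of the boosting use case, the following pure-Python AdaBoost loop (after Freund & Schapire, 1996) uses a weighted decision stump as its weak learner. All names and the exact representation are illustrative assumptions, not taken from the entry.

```python
import math

def fit_weighted_stump(xs, ys, w):
    """Stump minimizing weighted misclassification; labels in {-1, +1}."""
    best = (0.0, 1, float("inf"))  # (threshold, polarity, weighted error)
    values = sorted(set(xs))
    thresholds = [values[0] - 1] + [(a + b) / 2
                                    for a, b in zip(values, values[1:])]
    for t in thresholds:
        for pol in (+1, -1):
            err = sum(wi for xi, yi, wi in zip(xs, ys, w)
                      if (pol if xi <= t else -pol) != yi)
            if err < best[2]:
                best = (t, pol, err)
    return best

def adaboost(xs, ys, rounds=10):
    n = len(xs)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        t, pol, err = fit_weighted_stump(xs, ys, w)
        err = max(err, 1e-10)  # guard against log(0) on separable data
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, pol))
        # Reweight: misclassified examples gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * yi * (pol if xi <= t else -pol))
             for xi, yi, wi in zip(xs, ys, w)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * (pol if x <= t else -pol) for a, t, pol in ensemble)
    return 1 if score >= 0 else -1
```

The high-bias, low-variance character of stumps is exactly what makes them suitable here: boosting supplies the missing expressive power by combining many of them.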

Cross References

Bias and Variance

Decision Tree

Overfitting


Recommended Reading

  • Freund, Y., & Schapire, R. E. (1996). Experiments with a new boosting algorithm. In L. Saitta (Ed.), Proceedings of the 13th international conference on machine learning; Bari, Italy (pp. 148–156). San Francisco: Morgan Kaufmann.


  • Holte, R. C. (1993). Very simple classification rules perform well on most commonly used datasets. Machine Learning, 11, 63–91.



Copyright information

© 2011 Springer Science+Business Media, LLC

About this entry

Cite this entry

(2011). Decision Stump. In: Sammut, C., Webb, G.I. (eds) Encyclopedia of Machine Learning. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-30164-8_202
