Published by De Gruyter, September 16, 2007

Super Learner

  • Mark J. van der Laan, Eric C. Polley, and Alan E. Hubbard

When trying to learn a model for the prediction of an outcome given a set of covariates, a statistician has many estimation procedures in their toolbox. A few examples of these candidate learners are least squares, least angle regression, random forests, and spline regression. Previous articles (van der Laan and Dudoit (2003); van der Laan et al. (2006); Sinisi et al. (2007)) theoretically validated the use of cross-validation to select an optimal learner among many candidate learners. Motivated by this use of cross-validation, we propose a new prediction method that builds the super learner as a weighted combination of many candidate learners. This article proposes a fast algorithm for constructing a super learner for prediction, which uses V-fold cross-validation to select the weights that combine an initial set of candidate learners. In addition, this paper contains a practical demonstration of the adaptivity of this so-called super learner to various true data-generating distributions. This approach to constructing a super learner generalizes to any parameter that can be defined as a minimizer of a loss function.
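The construction described above can be sketched in a few lines: obtain cross-validated predictions from each candidate learner via V-fold cross-validation, choose combination weights by minimizing the cross-validated squared-error loss (here via ordinary least squares, a simplification of the paper's weight selection), and refit each candidate on the full data. The two candidate learners below (an overall-mean predictor and ordinary least squares) are hypothetical stand-ins chosen for illustration, not the candidates used in the article.

```python
import numpy as np

def super_learner(X, y, learners, V=5, seed=0):
    """Minimal sketch of a super learner: V-fold cross-validation
    produces out-of-fold predictions Z for each candidate learner,
    then weights are chosen to minimize squared-error loss on Z."""
    rng = np.random.default_rng(seed)
    n = len(y)
    folds = rng.permutation(n) % V          # assign each observation to a fold
    Z = np.empty((n, len(learners)))        # cross-validated predictions
    for v in range(V):
        tr, te = folds != v, folds == v
        for j, learner in enumerate(learners):
            predict = learner(X[tr], y[tr])  # each learner returns a predict fn
            Z[te, j] = predict(X[te])
    # weights minimizing cross-validated squared error (unconstrained here;
    # the article's method selects weights by cross-validated risk)
    w, *_ = np.linalg.lstsq(Z, y, rcond=None)
    # refit every candidate on the full data; final predictor is the combination
    fitted = [learner(X, y) for learner in learners]
    return (lambda Xnew: sum(wj * f(Xnew) for wj, f in zip(w, fitted))), w

def mean_learner(X, y):
    """Candidate 1: predict the overall mean of the outcome."""
    m = y.mean()
    return lambda Xnew: np.full(len(Xnew), m)

def ols_learner(X, y):
    """Candidate 2: ordinary least squares with an intercept."""
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda Xnew: np.column_stack([np.ones(len(Xnew)), Xnew]) @ beta
```

Because the weights are fit to out-of-fold predictions, a candidate that overfits within folds is penalized in the combination; the same template extends to any loss whose minimizer defines the target parameter.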

Published Online: 2007-9-16

©2011 Walter de Gruyter GmbH & Co. KG, Berlin/Boston

https://www.degruyter.com/document/doi/10.2202/1544-6115.1309/html