Stacked Regressions

Report Number
367
Authors
Leo Breiman
Citation
Machine Learning, 24, 49–64 (1996)
Abstract

Stacking regressions is a method for forming linear combinations of different predictors to give improved prediction accuracy. The idea is to use cross-validation data and least squares under non-negativity constraints to determine the coefficients in the combination. Its effectiveness is demonstrated in stacking regression trees of different sizes and in a simulation stacking linear subset and ridge regressions. Reasons why this method works are explored. The idea of stacking originated with Wolpert [1992].
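A minimal sketch of the procedure described in the abstract, assuming scikit-learn-style base learners; the models, data, and function names below are illustrative, not the paper's own implementation. Cross-validated predictions of each base model form the columns of a matrix, and non-negative least squares on that matrix gives the stacking coefficients.

import numpy as np
from scipy.optimize import nnls
from sklearn.model_selection import cross_val_predict
from sklearn.linear_model import Ridge
from sklearn.tree import DecisionTreeRegressor

def stack_regressions(base_models, X, y, cv=10):
    """Estimate non-negative stacking weights from cross-validated predictions."""
    # Column k holds the cross-validation predictions of the k-th base model.
    Z = np.column_stack([cross_val_predict(m, X, y, cv=cv) for m in base_models])
    # Least squares under non-negativity constraints: min ||y - Z w||^2, w >= 0.
    weights, _ = nnls(Z, y)
    # Refit each base model on all the data for use in future predictions.
    fitted = [m.fit(X, y) for m in base_models]
    return weights, fitted

def stacked_predict(weights, fitted, X_new):
    preds = np.column_stack([m.predict(X_new) for m in fitted])
    return preds @ weights

# Illustrative example: stack regression trees of different sizes with a ridge regression.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=200)
models = [DecisionTreeRegressor(max_depth=d, random_state=0) for d in (2, 4, 8)]
models.append(Ridge(alpha=1.0))
w, fitted = stack_regressions(models, X, y)
print("stacking weights:", np.round(w, 3))

The non-negativity constraint typically drives many of the coefficients to zero, so the stacked predictor ends up combining only a few of the base regressions.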
