16. Bagging, Boosting and Ensemble Methods
Peter Bühlmann
Subsections
16.1 An Introduction to Ensemble Methods
16.2 Bagging and Related Methods
16.2.1 Bagging
16.2.1.1 Bagging algorithm
16.2.2 Unstable Estimators with Hard Decision Indicator
16.2.2.1 Regression Trees
16.2.3 Subagging
16.2.3.1 Subagging Algorithm
16.2.3.2 Subagging Regression Trees
16.2.4 Bagging More "Smooth" Base Procedures and Bragging
16.2.5 Bragging
16.2.6 Out-of-Bag Error Estimation
16.2.7 Disadvantages
16.2.8 Other References
16.3 Boosting
16.3.1 Boosting as Functional Gradient Descent
16.3.1.1 The Margin for Classification
16.3.2 The Generic Boosting Algorithm
16.3.2.1 L₂Boost
16.3.2.2 LogitBoost
16.3.2.3 Multi-class Problems
16.3.3 Small Step Size
16.3.4 The Bias-variance Trade-off for L₂Boost
16.3.5 L₂Boost with Smoothing Spline Base Procedure for One-dimensional Curve Estimation
16.3.5.1 Choosing the Base Procedure
16.3.5.2 MSE Trace and Stopping
16.3.5.3 Asymptotic Optimality
16.3.6 L₂Boost for Additive and Interaction Regression Models
16.3.6.1 Additive Modeling
16.3.6.2 Degrees of Freedom and AIC-stopping Estimates
16.3.6.3 Penalized L₂Boosting
16.3.6.4 Interaction Modeling
16.3.7 Linear Modeling
16.3.8 Boosting Trees
16.3.8.1 Interpretation
16.3.9 Boosting and ℓ₁-penalized Methods (Lasso)
16.3.10 Other References