A basic, clear implementation of tree-based gradient boosting designed to illustrate the core operation of boosting models. Tuning parameters (such as stochastic subsampling, a modified learning rate, or regularization) are not implemented; the only adjustable parameter is the number of training rounds. If you are looking for a high-performance boosting implementation with tuning parameters, consider the 'xgboost' package.
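The core operation the package illustrates — repeatedly fitting a small rpart tree to the current residuals and adding it to the ensemble — can be sketched as below. This is a minimal conceptual sketch, not the DidacticBoost API: the function names `boost_fit` and `boost_predict` are illustrative, and it assumes squared-error loss with explicitly named predictors in the formula.

```r
library(rpart)

# Conceptual sketch of tree-based gradient boosting with squared-error loss.
# Each round fits an rpart stump to the residuals (the negative gradient)
# and adds its predictions to the running ensemble prediction.
boost_fit <- function(formula, data, rounds = 50) {
  response <- all.vars(formula)[1]
  y <- data[[response]]
  pred <- rep(mean(y), nrow(data))       # initialize with the mean response
  trees <- vector("list", rounds)
  for (i in seq_len(rounds)) {
    data$.resid <- y - pred              # current residuals
    fit <- rpart(update(formula, .resid ~ .), data = data,
                 control = rpart.control(maxdepth = 1))  # depth-1 weak learner
    trees[[i]] <- fit
    pred <- pred + predict(fit, data)    # update ensemble prediction
  }
  list(init = mean(y), trees = trees)
}

# Sum the initial constant and every round's tree predictions.
boost_predict <- function(model, newdata) {
  pred <- rep(model$init, nrow(newdata))
  for (fit in model$trees) pred <- pred + predict(fit, newdata)
  pred
}

# Example: boost on mtcars; training error falls below the variance baseline.
m <- boost_fit(mpg ~ wt + hp, mtcars, rounds = 25)
p <- boost_predict(m, mtcars)
```

Because no learning rate or subsampling is applied, each round adds the full tree prediction — the same simplification the package makes, where the number of rounds is the only knob.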

Documentation

Manual: DidacticBoost.pdf
Vignette: None available.

Maintainer: David Shaub <davidshaub at gmx.com>

Author(s): David Shaub

Install the package and any missing dependencies by running this line in your R console:

install.packages("DidacticBoost")

Depends: R (>= 3.1.1), rpart (>= 4.1-10)
Suggests: testthat

Package: DidacticBoost
URL: https://github.com/dashaub/DidacticBoost
Version: 0.1.1
Published: 2016-04-19
License: GPL-3
BugReports: https://github.com/dashaub/DidacticBoost/issues
NeedsCompilation: no
CRAN checks: DidacticBoost check results
Package source: DidacticBoost_0.1.1.tar.gz