shapr: Prediction Explanation with Dependence-Aware Shapley Values

Complex machine learning models are often hard to interpret. In many situations, however, it is crucial to understand and explain why a model made a specific prediction. Shapley values is the only prediction explanation framework with a solid theoretical foundation. Previously known methods for estimating Shapley values, however, assume that the features are independent. This package implements the method described in Aas, Jullum and Løland (2019) <arXiv:1903.10464>, which accounts for any feature dependence and thereby produces more accurate estimates of the true Shapley values.
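A minimal sketch of typical usage, based on the workflow shown in the package vignette for the 0.2.x API: fit a model, build an explainer with shapr(), then compute dependence-aware Shapley values with explain(). The feature choice, number of boosting rounds, and test split below are illustrative assumptions, not part of the package documentation.

```r
library(xgboost)
library(shapr)

# Boston housing data from MASS (listed in Suggests)
data("Boston", package = "MASS")
x_var <- c("lstat", "rm", "dis", "indus")  # illustrative feature subset
y_var <- "medv"

x_train <- as.matrix(Boston[-(1:6), x_var])
y_train <- Boston[-(1:6), y_var]
x_test  <- as.matrix(Boston[1:6, x_var])   # first six rows held out

# Any model supported by shapr; here a small xgboost regressor
model <- xgboost(data = x_train, label = y_train,
                 nrounds = 20, verbose = FALSE)

# Prepare the explainer (precomputes the feature subsets)
explainer <- shapr(x_train, model)

# prediction_zero: the reference prediction, here the training mean
p0 <- mean(y_train)

# Dependence-aware Shapley values via the empirical conditional approach
explanation <- explain(x_test, explainer,
                       approach = "empirical",
                       prediction_zero = p0)
print(explanation$dt)  # one row of Shapley values per test observation
```

The approach argument also accepts "gaussian", "copula", or "ctree" for the other conditional-distribution estimators described in the reference.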

Version: 0.2.0
Depends: R (≥ 3.5.0)
Imports: stats, data.table, Rcpp (≥ 0.12.15), condMVNorm, mvnfast, Matrix
LinkingTo: RcppArmadillo, Rcpp
Suggests: ranger, xgboost, mgcv, testthat, knitr, rmarkdown, roxygen2, MASS, ggplot2, caret, gbm, party, partykit
Published: 2021-01-28
Author: Nikolai Sellereite [aut], Martin Jullum [cre, aut], Annabelle Redelmeier [aut], Anders Løland [ctb], Jens Christian Wahl [ctb], Camilla Lingjærde [ctb], Norsk Regnesentral [cph, fnd]
Maintainer: Martin Jullum <Martin.Jullum at>
License: MIT + file LICENSE
NeedsCompilation: yes
Language: en-US
Materials: README NEWS
CRAN checks: shapr results


Reference manual: shapr.pdf
Vignettes: 'shapr': Explaining individual machine learning predictions with Shapley values


Package source: shapr_0.2.0.tar.gz
Windows binaries: r-devel:, r-release:, r-oldrel:
macOS binaries: r-release (arm64): shapr_0.2.0.tgz, r-release (x86_64): shapr_0.2.0.tgz, r-oldrel: shapr_0.2.0.tgz
Old sources: shapr archive


Please use the canonical form to link to this page.