SHAP Importance for Ranger in R
I have a binary classification problem: how can I get the SHAP contributions of the variables for a ranger model?
Sample data:
library(ranger)
library(tidyverse)

# Binary Dataset
df <- iris
df$Target <- if_else(df$Species == "setosa", 1, 0)
df$Species <- NULL

# Train Ranger Model
model <- ranger(
  x = df %>% select(-Target),
  y = df %>% pull(Target))
I have tried several packages (DALEX, shapr, fastshap, shapper) but could not get a working solution. What I would like is something similar to what SHAPforxgboost provides for xgboost, for example the output of shap.values, i.e. the SHAP contributions of the variables, and shap.plot.summary.
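For reference, this is roughly the workflow I mean with SHAPforxgboost (just a sketch of that package applied to an xgboost model, which is exactly why it does not carry over to ranger; the model parameters here are arbitrary):

library(xgboost)
library(SHAPforxgboost)

# Fit an xgboost model on the same data (illustration only)
X <- as.matrix(df %>% select(-Target))
xgb_model <- xgboost(data = X, label = df %>% pull(Target),
                     objective = "binary:logistic", nrounds = 50, verbose = 0)

# Per-variable SHAP contributions and their mean absolute values
shap_vals <- shap.values(xgb_model = xgb_model, X_train = X)
shap_vals$mean_shap_score

# Summary (beeswarm) plot
shap_long <- shap.prep(xgb_model = xgb_model, X_train = X)
shap.plot.summary(shap_long)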
Good morning!
From what I have found, you can use ranger() together with the fastshap package, as follows:
library(fastshap)
library(ranger)
library(tidyverse)

data(iris)

# Binary Dataset
df <- iris
df$Target <- if_else(df$Species == "setosa", 1, 0)
df$Species <- NULL
x <- df %>% select(-Target)

# Train Ranger Model
model <- ranger(
  x = df %>% select(-Target),
  y = df %>% pull(Target))

# Prediction wrapper
pfun <- function(object, newdata) {
  predict(object, data = newdata)$predictions
}

# Compute fast (approximate) Shapley values using 10 Monte Carlo repetitions
system.time({  # estimate run time
  set.seed(5038)
  shap <- fastshap::explain(model, X = x, pred_wrapper = pfun, nsim = 10)
})
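A side note that goes beyond the vignette: the model above is a regression forest on the 0/1 target, so the wrapper returns raw predictions. If you would rather have Shapley values on the probability scale, one option (only a sketch, and it requires refitting the model) is a probability forest whose wrapper returns the probability of the positive class:

# Optional variant (not in the original answer): probability forest, explaining P(Target = 1)
model_prob <- ranger(
  x = df %>% select(-Target),
  y = as.factor(df %>% pull(Target)),
  probability = TRUE)

# predictions is then a matrix with one column per class level ("0", "1")
pfun_prob <- function(object, newdata) {
  predict(object, data = newdata)$predictions[, "1"]
}

set.seed(5038)
shap_prob <- fastshap::explain(model_prob, X = x, pred_wrapper = pfun_prob, nsim = 10)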
# Load required packages
library(ggplot2)
theme_set(theme_bw())

# Aggregate Shapley values into an importance measure (mean |Shapley value| per feature)
shap_imp <- data.frame(
  Variable = names(shap),
  Importance = apply(shap, MARGIN = 2, FUN = function(x) mean(abs(x)))
)
Then, for example, for variable importance you can do:
# Plot Shap-based variable importance
ggplot(shap_imp, aes(reorder(Variable, Importance), Importance)) +
  geom_col() +
  coord_flip() +
  xlab("") +
  ylab("mean(|Shapley value|)")
Also, if you want explanations for individual predictions, you can do it like this:
# Plot individual explanations
expl <- fastshap::explain(model, X = x, pred_wrapper = pfun, nsim = 10, newdata = x[1L, ])
autoplot(expl, type = "contribution")
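And if what you are after are the numbers themselves (the analogue of the shap.values output you mentioned), note that explain() already returns exactly that: one row of Shapley values per observation, one column per feature, so you can inspect it directly:

# Numeric Shapley contributions: the explained single row and the full training data
expl          # contributions behind the plot above
head(shap)    # contributions for the first training rows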
All of this, and more, can be found here: https://bgreenwell.github.io/fastshap/articles/fastshap.html
Take a look at the link; it should clear up your doubts! :)