R xgboost xgb.cv pred values: best iteration or final iteration?
I'm using the xgb.cv function in the R implementation of xgboost to grid-search the best hyperparameters. When prediction is set to TRUE, it returns the held-out (out-of-fold) prediction for each observation. Assuming early stopping is used, do these predictions correspond to the best iteration, or are they the predictions from the final iteration?
The CV predictions correspond to the best iteration. You can see this by using a 'strict' early_stopping_rounds value, then comparing the predictions against models trained with the 'best' number of iterations and with the 'final' number of iterations, e.g.:
# Load minimum reproducible example
library(xgboost)
data(agaricus.train, package='xgboost')
data(agaricus.test, package='xgboost')
train <- agaricus.train
dtrain <- xgb.DMatrix(train$data, label=train$label)
test <- agaricus.test
dtest <- xgb.DMatrix(test$data, label=test$label)
# Perform cross validation with a 'strict' early_stopping
cv <- xgb.cv(data = train$data, label = train$label, nfold = 5, max_depth = 2,
eta = 1, nthread = 4, nrounds = 10, objective = "binary:logistic",
prediction = TRUE, early_stopping_rounds = 1)
# Check which round was the best iteration (the one that initiated the early stopping)
print(cv$best_iteration)
[1] 3
# Get the predictions
head(cv$pred)
[1] 0.84574515 0.15447612 0.15390711 0.84502697 0.09661318 0.15447612
# Train a model using 3 rounds (corresponds to best iteration)
trained_model <- xgb.train(data = dtrain, max_depth = 2,
eta = 1, nthread = 4, nrounds = 3,
watchlist = list(train = dtrain, eval = dtrain),
objective = "binary:logistic")
# Get predictions
head(predict(trained_model, dtrain))
[1] 0.84625006 0.15353635 0.15353635 0.84625006 0.09530514 0.15353635
# Train a model using 10 rounds (corresponds to final iteration)
trained_model <- xgb.train(data = dtrain, max_depth = 2,
eta = 1, nthread = 4, nrounds = 10,
watchlist = list(train = dtrain, eval = dtrain),
objective = "binary:logistic")
head(predict(trained_model, dtrain))
[1] 0.9884467125 0.0123147098 0.0050151693 0.9884467125 0.0008781737 0.0123147098
So the CV predictions closely match the predictions made with the 'best' number of iterations, not the 'final' number. (They are not bit-identical, because cv$pred holds out-of-fold predictions while each comparison model above is trained on the full training set.)
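As a practical follow-up, here is a minimal sketch of using cv$best_iteration inside a grid search to pick nrounds per hyperparameter setting. It assumes the same agaricus data and the classic xgb.cv interface used above; the grid values are illustrative, not recommendations:

```r
# Sketch: record cv$best_iteration and its held-out metric for each
# hyperparameter setting, then refit with the winning combination.
library(xgboost)
data(agaricus.train, package = 'xgboost')
train <- agaricus.train

results <- list()
for (depth in c(2, 4)) {
  cv <- xgb.cv(data = train$data, label = train$label, nfold = 5,
               max_depth = depth, eta = 1, nrounds = 50,
               objective = "binary:logistic",
               early_stopping_rounds = 3, verbose = FALSE)
  # best_iteration is the round that minimized the held-out metric;
  # evaluation_log stores the per-round CV means (logloss is the
  # default metric for binary:logistic)
  best <- cv$best_iteration
  results[[as.character(depth)]] <- list(
    best_iteration = best,
    test_logloss   = cv$evaluation_log$test_logloss_mean[best]
  )
}
# Pick the max_depth with the lowest held-out logloss, then train a
# final model on all data with nrounds = that setting's best_iteration.
```

Because cv$pred reflects the best iteration, the logloss recorded here is consistent with the predictions you would extract from cv$pred for that same setting.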