XGBoost reports eval-auc steadily declining while train-auc keeps rising; is this result normal?

I want to use XGBoost's early_stopping_rounds to train without overfitting. For this, I use the following code:

import xgboost as xgb
from sklearn.model_selection import train_test_split

parameters = {
    'nthread': 4, 'objective': 'binary:logistic', 'learning_rate': 0.06,
    'max_depth': 6, 'min_child_weight': 3, 'silent': 0, 'gamma': 0,
    'subsample': 0.7, 'colsample_bytree': 0.5, 'n_estimators': 5,
    'missing': -999, 'scale_pos_weight': scale_pos_weight, 'seed': 4789,
    'eval_metric': 'auc', 'early_stopping_rounds': 100,
}
X_train, X_test, y_train, y_test = train_test_split(
    train_feature, train_label, test_size=0.3, random_state=4789)
dtrain = xgb.DMatrix(X_train, label=y_train)
dtest = xgb.DMatrix(X_test, label=y_test)
evallist = [(dtest, 'eval'), (dtrain, 'train')]
bst = xgb.train(parameters, dtrain, num_boost_round=1500, evals=evallist)

When printing the intermediate results, I get logs like this:

[1469]  eval-auc:0.912417   train-auc:0.986104
[16:04:23] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 110 extra nodes, 0 pruned nodes, max_depth=6
[1470]  eval-auc:0.912412   train-auc:0.986118
[16:04:27] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 102 extra nodes, 0 pruned nodes, max_depth=6
[1471]  eval-auc:0.912405   train-auc:0.986129
[16:04:30] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 116 extra nodes, 0 pruned nodes, max_depth=6
[1472]  eval-auc:0.912383   train-auc:0.986143
[16:04:34] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 116 extra nodes, 0 pruned nodes, max_depth=6
[1473]  eval-auc:0.912375   train-auc:0.986159

Now I want to know: is this training result correct? How can I detect whether my model is overfitting, and how do I choose the number of rounds?

As @Stepan Novikov said, the result you are seeing is correct: your model has just started to overfit.

Regarding your second question, the early_stopping_rounds parameter works by stopping training after N rounds pass without any improvement in eval-auc (N being early_stopping_rounds). Note that eval-auc may drop on intermediate rounds; as long as there is an absolute improvement within the last N rounds, training continues.

In your example, round [1469] has the maximum eval-auc, so training will not stop until round [1569] (100 rounds later, as configured).

Finally, the best round reached should be stored in the bst variable of your example.