Issues with neural net number of rounds with caret ensemble

I am creating a simple ensemble of two models, xgboost and mxnet. The data frame is A3n.df, with the classification variable in A3n.df[1]. Both models run fine on their own and achieve believable accuracy. All data are normalized to 0-1 and shuffled, and the class variable is converted to a factor (for caret). I have already run a grid search for the best hyperparameters, but I need to include a grid for caretEnsemble.

#training grid for xgboost
xgb_grid_A4 = expand.grid(
  nrounds = 1200,
  eta = 0.01,
  max_depth = 20,
  gamma = 1,
  colsample_bytree = 0.6,
  min_child_weight = 2,
  subsample = 0.8)
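Not part of the original post, but as a hedged sketch: one common way to settle on a fixed nrounds before handing it to caret is xgboost's own xgb.cv with early stopping. This assumes the xgboost package and the xE/yE objects defined later in the post:

```r
library(xgboost)

# Hypothetical sketch: cross-validate to find a sensible nrounds,
# assuming xE is the numeric feature matrix and yE the 0/1 labels.
dtrain <- xgb.DMatrix(data = xE, label = yE)

cv <- xgb.cv(
  params = list(
    objective = "binary:logistic",
    eval_metric = "auc",
    eta = 0.01,
    max_depth = 20,
    gamma = 1,
    colsample_bytree = 0.6,
    min_child_weight = 2,
    subsample = 0.8),
  data = dtrain,
  nrounds = 2000,
  nfold = 5,
  early_stopping_rounds = 50,
  verbose = FALSE)

# The best iteration can then be used as the fixed nrounds in the caret grid.
cv$best_iteration
```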

#training grid for mxnet
mxnet_grid_A4 = expand.grid(layer1 = 12,
                            layer2 = 2,
                            layer3 = 0,
                            learningrate = 0.001,
                            dropout = 0,
                            beta1 = .9,
                            beta2 = 0.999,
                            activation = 'relu')

yE = A4n.df[,1]
xE = data.matrix(A4n.df[,-1])
yEf <- ifelse(yE == 0, "no", "yes")
yEf <- factor(yEf)

Ensemble_control_A4 <- trainControl(
  method = "cv",
  number = 5,
  verboseIter = TRUE,
  returnData = TRUE,
  returnResamp = "all",
  classProbs = TRUE,
  summaryFunction = twoClassSummary,
  allowParallel = TRUE,
  sampling = "up",
  index = createResample(yEf, 20))

Ensemble_list_A4 <- caretList(
  x=xE,
  y=yEf,
  trControl=Ensemble_control_A4,
  metric="ROC",
  methodList=c("glm", "rpart"),
  tuneList=list(
    xgbA4=caretModelSpec(method="xgbTree", tuneGrid=xgb_grid_A4),
    mxA4=caretModelSpec(method="mxnetAdam", tuneGrid=mxnet_grid_A4)))

XGBoost appears to train fine:

+ Resample01: eta=0.01, max_depth=20, gamma=1, colsample_bytree=0.6, min_child_weight=2, subsample=0.8, nrounds=1200 
....
+ Resample20: eta=0.01, max_depth=20, gamma=1, colsample_bytree=0.6, min_child_weight=2, subsample=0.8, nrounds=1200 
- Resample20: eta=0.01, max_depth=20, gamma=1, colsample_bytree=0.6, min_child_weight=2, subsample=0.8, nrounds=1200 
Aggregating results
Selecting tuning parameters
Fitting nrounds = 1200, max_depth = 20, eta = 0.01, gamma = 1, colsample_bytree = 0.6, min_child_weight = 2, subsample = 0.8 on full training set

However, mxnet appears to run for only 10 rounds, when 1,000 or 2,000 would make more sense, and some parameters seem to be missing:

+ Resample01: layer1=12, layer2=2, layer3=0, learningrate=0.001, dropout=0, beta1=0.9, beta2=0.999, activation=relu 
Start training with 1 devices
[1] Train-accuracy=0.487651209677419
[2] Train-accuracy=0.624751984126984
[3] Train-accuracy=0.599082341269841
[4] Train-accuracy=0.651909722222222
[5] Train-accuracy=0.662202380952381
[6] Train-accuracy=0.671006944444444
[7] Train-accuracy=0.676463293650794
[8] Train-accuracy=0.683407738095238
[9] Train-accuracy=0.691964285714286
[10] Train-accuracy=0.698660714285714
- Resample01: layer1=12, layer2=2, layer3=0, learningrate=0.001, dropout=0, beta1=0.9, beta2=0.999, activation=relu

+ Resample01: parameter=none 
- Resample01: parameter=none 
+ Resample02: parameter=none 
Aggregating results
Selecting tuning parameters
Fitting cp = 0.0243 on full training set
There were 40 warnings (use warnings() to see them)

Warnings (1-40):

1: In predict.lm(object, newdata, se.fit, scale = 1, type = ifelse(type ==  ... :
  prediction from a rank-deficient fit may be misleading

I expect mxnet to train for a few thousand rounds, with training accuracy ending up at 60-70% like the pre-ensemble models. *On second thought, some of the 20 mxnet runs do reach 60-70%, but it seems inconsistent. Maybe it is running normally?

There is a note in the caret documentation that num.round needs to be set by the user outside of tune_grid: http://topepo.github.io/caret/train-models-by-tag.html

Ensemble_list_A2 <- caretList(
  x=xE,
  y=yEf,
  trControl=Ensemble_control_A2,
  metric="ROC",
  methodList=c("glm", "rpart", "bayesglm"),
  tuneList=list(
    xgbA2=caretModelSpec(method="xgbTree", tuneGrid=xgb_grid_A2),
    mxA2=caretModelSpec(method="mxnetAdam", tuneGrid=mxnet_grid_A2, num.round=1500, ctx=mx.gpu()),
    svmA2=caretModelSpec(method="svmLinear2", tuneGrid=svm_grid_A2),
    rfA2=caretModelSpec(method="rf", tuneGrid=rf_grid_A2)))
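For completeness (this step is not shown in the original question): once caretList returns, the base models are typically combined with caretEnsemble from the caretEnsemble package. A minimal sketch, assuming the Ensemble_list_A4 object built above:

```r
library(caretEnsemble)

# Hypothetical sketch: blend the base models into a single ensemble,
# weighting them by resampled ROC.
greedy_A4 <- caretEnsemble(
  Ensemble_list_A4,
  metric = "ROC",
  trControl = trainControl(
    number = 5,
    classProbs = TRUE,
    summaryFunction = twoClassSummary))

# Inspect the weights each base model receives in the blend.
summary(greedy_A4)
```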