Tuning size parameter for neural network

I want to fit a neural network model using the caret package. There are 208 predictors, all of which matter and cannot be dropped. The largest value I can give the size parameter is 4; above that I get an error saying there are too many weights.

> ctrl<-trainControl(method = 'cv',number = 5)
> my.grid <- expand.grid(.decay = 0.1, .size =5)
> nn.fit <- train(train_predictors,train_responses[["r2c1"]],method = "nnet",algorithm = 'backprop', tuneGrid = my.grid,trace=F, linout = TRUE,trControl = ctrl)
Something is wrong; all the RMSE metric values are missing:
      RMSE        Rsquared        MAE     
 Min.   : NA   Min.   : NA   Min.   : NA  
 1st Qu.: NA   1st Qu.: NA   1st Qu.: NA  
 Median : NA   Median : NA   Median : NA  
 Mean   :NaN   Mean   :NaN   Mean   :NaN  
 3rd Qu.: NA   3rd Qu.: NA   3rd Qu.: NA  
 Max.   : NA   Max.   : NA   Max.   : NA  
 NA's   :1     NA's   :1     NA's   :1    
Error: Stopping
In addition: Warning messages:
1: model fit failed for Fold1: decay=0.1, size=5 Error in nnet.default(x, y, w, ...) : too many (1051) weights

2: model fit failed for Fold2: decay=0.1, size=5 Error in nnet.default(x, y, w, ...) : too many (1051) weights

3: model fit failed for Fold3: decay=0.1, size=5 Error in nnet.default(x, y, w, ...) : too many (1051) weights

4: model fit failed for Fold4: decay=0.1, size=5 Error in nnet.default(x, y, w, ...) : too many (1051) weights

5: model fit failed for Fold5: decay=0.1, size=5 Error in nnet.default(x, y, w, ...) : too many (1051) weights

6: In nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo,  :
  There were missing values in resampled performance measures.

The model performs very poorly with 4 neurons (size = 4). If I want more than 5 neurons, what do I need to do to make the model work?

You can specify other parameters for the nnet method in the tuning grid. The available parameters for each method are documented online, but they can be hard to find. Here is an example I used for an Adam neural network with mxnet:

mxnet_grid_A2 = expand.grid(layer1 = c(10, 12),
                            layer2 = c(4, 6),
                            layer3 = 2,
                            learningrate = c(0.001, 0.0001),
                            dropout = c(0, 0.2),
                            beta1 = .9,
                            beta2 = 0.999,
                            activation = 'relu')
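
As a rough sketch, such a grid could be plugged into train like this; the "mxnetAdam" method name, the predictor/response objects carried over from the question, and an installed mxnet package are all assumptions here:

library(caret)
mxnet_fit <- train(train_predictors, train_responses[["r2c1"]],
                   method = "mxnetAdam",      # Adam-trained mxnet network (assumed available)
                   tuneGrid = mxnet_grid_A2,
                   trControl = ctrl)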

You can always pass additional parameters on to the underlying training function (nnet in this case) through the ... optional arguments of caret's train function. The CRAN documentation for the nnet package describes the MaxNWts argument, which controls the maximum allowable number of weights.
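
For the original nnet fit, a sketch of the same call with MaxNWts raised high enough for size = 5: with 208 predictors and one output, nnet needs (208 + 1) * 5 + (5 + 1) = 1051 weights, so any limit above that works (2000 below is an arbitrary choice):

my.grid <- expand.grid(.decay = 0.1, .size = 5)
nn.fit <- train(train_predictors, train_responses[["r2c1"]],
                method = "nnet",
                tuneGrid = my.grid,
                trace = FALSE,
                linout = TRUE,
                trControl = ctrl,
                MaxNWts = 2000)   # forwarded via ... to nnet; must exceed 1051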