How to control learning rate in KerasR in R

To fit a classification model in R, I have been using library(kerasR). To control the learning rate in kerasR, I tried:

compile(
  optimizer = Adam(lr = 0.001, beta_1 = 0.9, beta_2 = 0.999, epsilon = 1e-08,
                   decay = 0, clipnorm = -1, clipvalue = -1),
  loss      = 'binary_crossentropy',
  metrics   = c('categorical_accuracy')
)

But it gives me this error:

Error in modules$keras.optimizers$Adam(lr = lr, beta_1 = beta_2, beta_2 = beta_2, : attempt to apply non-function

I have also tried keras_compile and still get the same error. I can change the optimizer string in compile, but then the maximum learning rate is 0.01, and I want to try 0.2.

model <- keras_model_sequential()

model %>%
  layer_dense(units = 512, activation = 'relu', input_shape = ncol(X_train)) %>%
  layer_dropout(rate = 0.2) %>%
  layer_dense(units = 128, activation = 'relu') %>%
  layer_dropout(rate = 0.1) %>%
  layer_dense(units = 2, activation = 'sigmoid') %>%
  compile(
    optimizer = 'Adam',
    loss      = 'binary_crossentropy',
    metrics   = c('categorical_accuracy')
  )

I think the problem is that you are mixing two different libraries, kerasR and keras; you should use only one of them. You are calling the keras_model_sequential function from keras, and then trying to use the Adam function from the kerasR library. You can find the differences between the two libraries here: https://www.datacamp.com/community/tutorials/keras-r-deep-learning#differences
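As a quick diagnostic (a sketch that assumes both packages happen to be installed and attached), base R's find() reports every attached environment that defines a given name, which makes this kind of cross-package confusion easy to spot:

```r
library(keras)
library(kerasR)

# find() lists each attached package that exports the name;
# if two packages define it, both appear and one masks the other
find("keras_model_sequential")
```

If a name resolves to the wrong package, calling it explicitly with `keras::` or `kerasR::` removes the ambiguity.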

The following code works for me; it uses only the keras library.

library(keras)
model <- keras_model_sequential()

model %>%
  layer_dense(units = 512, activation = 'relu', input_shape = ncol(X_train)) %>%
  layer_dropout(rate = 0.2) %>%
  layer_dense(units = 128, activation = 'relu') %>%
  layer_dropout(rate = 0.1) %>%
  layer_dense(units = 2, activation = 'sigmoid') %>%
  compile(
    optimizer = optimizer_adam(lr = 0.2),
    loss      = 'binary_crossentropy',
    metrics   = c('accuracy')
  )
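One caveat, which is an observation about later releases of the keras R package rather than part of the original answer: newer versions renamed the `lr` argument to `learning_rate`, so on a recent install the call above may emit a deprecation warning or an error. The equivalent call would be:

```r
# In newer keras releases the argument is `learning_rate`, not `lr`
model %>% compile(
  optimizer = optimizer_adam(learning_rate = 0.2),
  loss      = 'binary_crossentropy',
  metrics   = c('accuracy')
)
```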