How do you set the Adam optimizer learning rate in tensorflow.js?

For tensorflow.js running in Node.js, how do you set the learning rate of the Adam optimizer? I get the following error:

model.optimizer.setLearningRate is not a function

const optimizer = tf.train.adam(0.001)
model.compile({
    loss: 'sparseCategoricalCrossentropy',
    optimizer,
    shuffle: true,
    metrics: ['accuracy']
});

await model.fit(trainValues, trainLabels, {
    epochs: 50,
    validationData: [testValues, testLabels],
    callbacks: {
        onEpochBegin: async (epoch) => {
            const newRate = getNewRate();
            model.optimizer.setLearningRate(newRate);
        }
    }
});

When you call model.compile, you can pass an instance of tf.train.Optimizer instead of a string. These instances are created via the tf.train.* factory functions, where the learning rate can be passed as the first argument.

Code example

model.compile({
    optimizer: tf.train.sgd(0.000001), // custom learning rate
    /* ... */
});

Changing the learning rate during training

Currently, only the sgd optimizer has a setLearningRate method implemented, meaning the following code only works for optimizer instances created via tf.train.sgd:

const optimizer = tf.train.sgd(0.001);
optimizer.setLearningRate(0.000001);

Using the non-official API

Optimizer instances have a protected property learningRate that you can change. The property is not public, but because this is JavaScript you can simply change the value by setting learningRate on the object, like this:

const optimizer = tf.train.adam();
optimizer.learningRate = 0.000001;
// or via your model:
model.optimizer.learningRate = 0.000001;

Keep in mind that you are using a non-official part of the API, which may break at any time.
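Applied to the original Adam setup, a minimal sketch could look like the following. The step-decay schedule (halving the rate every 10 epochs) is just an assumption for illustration, and the wiring relies on the non-official learningRate assignment instead of setLearningRate:

```javascript
// Hypothetical step-decay schedule (an assumption for illustration):
// halve the base learning rate every 10 epochs.
function getNewRate(epoch, baseRate = 0.001) {
  return baseRate * Math.pow(0.5, Math.floor(epoch / 10));
}

// Wiring the schedule into the fit() call from the question.
// Note: this sets the non-official learningRate property directly,
// since adam instances do not implement setLearningRate().
async function train(model, trainValues, trainLabels, testValues, testLabels) {
  await model.fit(trainValues, trainLabels, {
    epochs: 50,
    validationData: [testValues, testLabels],
    callbacks: {
      onEpochBegin: async (epoch) => {
        model.optimizer.learningRate = getNewRate(epoch);
      }
    }
  });
}
```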

When creating the model, you can set the learning rate by passing the optimizer to model.compile:
const myOptimizer = tf.train.sgd(myLearningRate) 
model.compile({optimizer: myOptimizer, loss: 'meanSquaredError'});

The learning rate can be changed during training using setLearningRate:
model.fit(xs, ys, {
  epochs: 800,
  callbacks: {
    onEpochEnd: async (epoch, logs) => {
      if (epoch === 300) {
        model.optimizer.setLearningRate(0.14);
      }
      if (epoch === 400) {
        model.optimizer.setLearningRate(0.02);
      }
    }
  }
});
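The hard-coded epoch checks above can also be expressed as a small lookup table, which keeps the schedule in one place. A sketch, reusing the thresholds and rates from the snippet above:

```javascript
// Epoch -> new learning rate, using the values from the snippet above.
const schedule = { 300: 0.14, 400: 0.02 };

// Returns the new rate for this epoch, or null if the rate is unchanged.
function rateForEpoch(epoch) {
  return Object.prototype.hasOwnProperty.call(schedule, epoch)
    ? schedule[epoch]
    : null;
}

// Inside fit(), the callback then becomes:
// callbacks: {
//   onEpochEnd: async (epoch) => {
//     const rate = rateForEpoch(epoch);
//     if (rate !== null) model.optimizer.setLearningRate(rate);
//   }
// }
```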