Get Gradients with Keras Tensorflow 2.0

I would like to track the gradients in TensorBoard. However, since session.run statements no longer exist and the write_grads argument of tf.keras.callbacks.TensorBoard is deprecated, I would like to know how to track gradients during training with Keras in TensorFlow 2.0.

My current approach is to create a new callback class for this purpose, but so far without success. Maybe someone else knows how to accomplish this kind of advanced stuff.

The code created for testing is shown below, but it runs into errors regardless of whether the gradient values are written to the console or to TensorBoard.

import tensorflow as tf
from tensorflow.keras import backend as K

mnist = tf.keras.datasets.mnist

(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = tf.keras.models.Sequential([
  tf.keras.layers.Flatten(input_shape=(28, 28)),
  tf.keras.layers.Dense(128, activation='relu', name='dense128'),
  tf.keras.layers.Dropout(0.2),
  tf.keras.layers.Dense(10, activation='softmax', name='dense10')
])

model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])


class GradientCallback(tf.keras.callbacks.Callback):
    console = True

    def on_epoch_end(self, epoch, logs=None):
        weights = [w for w in self.model.trainable_weights if 'dense' in w.name and 'bias' in w.name]
        # NOTE: model.total_loss and optimizer.get_gradients are symbolic
        # (graph-mode) APIs; under eager execution in TF 2.0 this is where
        # the callback errors out.
        loss = self.model.total_loss
        optimizer = self.model.optimizer
        gradients = optimizer.get_gradients(loss, weights)
        for t in gradients:
            if self.console:
                print('Tensor: {}'.format(t.name))
                print('{}\n'.format(K.get_value(t)[:10]))
            else:
                tf.summary.histogram(t.name, data=t)


file_writer = tf.summary.create_file_writer("./metrics")
file_writer.set_as_default()

# write_grads has been removed
tensorboard_cb = tf.keras.callbacks.TensorBoard(histogram_freq=1, write_grads=True)
gradient_cb = GradientCallback()

model.fit(x_train, y_train, epochs=5, callbacks=[gradient_cb, tensorboard_cb])

To compute the gradients of the loss with respect to the weights, use:

# x, y_true and loss_fn are placeholders: a batch of inputs, its labels,
# and a loss function such as tf.keras.losses.SparseCategoricalCrossentropy()
with tf.GradientTape() as tape:
    loss = loss_fn(y_true, model(x))

tape.gradient(loss, model.trainable_weights)

This is (arguably poorly) documented on GradientTape.

We do not need to tape.watch the variables, because trainable parameters are watched by default.
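
For completeness, a minimal custom training step built on this pattern could look like the sketch below; loss_fn, x_batch and y_batch are assumed names, not part of the original answer:

loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()
optimizer = tf.keras.optimizers.Adam()

def train_step(model, x_batch, y_batch):
    with tf.GradientTape() as tape:
        y_pred = model(x_batch, training=True)  # forward pass
        loss = loss_fn(y_batch, y_pred)         # scalar loss
    grads = tape.gradient(loss, model.trainable_weights)  # d(loss)/d(weights)
    optimizer.apply_gradients(zip(grads, model.trainable_weights))
    return loss, grads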

As a function, this can be written as:

def gradient(model, x):
    # Gradient of the model output with respect to the input x
    x_tensor = tf.convert_to_tensor(x, dtype=tf.float32)
    with tf.GradientTape() as t:
        t.watch(x_tensor)  # inputs are not trainable, so watch them explicitly
        loss = model(x_tensor)
    return t.gradient(loss, x_tensor).numpy()
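
A quick usage check against the MNIST model from the question (the slice x_test[:5] is just an assumed sample batch):

g = gradient(model, x_test[:5])
print(g.shape)  # (5, 28, 28): one input-gradient per sample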

Also see here: https://github.com/tensorflow/tensorflow/issues/31542#issuecomment-630495970

richardwth wrote a child class of TensorBoard.

I adapted it as follows:

class ExtendedTensorBoard(tf.keras.callbacks.TensorBoard):
    def _log_gradients(self, epoch):
        writer = self._writers['train']  # _writers is a private attribute of the TensorBoard callback; it may differ across TF versions

        with writer.as_default(), tf.GradientTape() as g:
            # here we use test data to calculate the gradients
            # (val_dataset is assumed to be a tf.data.Dataset of (features, labels))
            features, y_true = list(val_dataset.batch(100).take(1))[0]

            y_pred = self.model(features)  # forward-propagation
            loss = self.model.compiled_loss(y_true=y_true, y_pred=y_pred)  # calculate loss
            gradients = g.gradient(loss, self.model.trainable_weights)  # back-propagation

            # In eager mode, gradients do not carry names, so we get the names from model.trainable_weights
            for weights, grads in zip(self.model.trainable_weights, gradients):
                tf.summary.histogram(
                    weights.name.replace(':', '_') + '_grads', data=grads, step=epoch)

        writer.flush()

    def on_epoch_end(self, epoch, logs=None):
        # This function overwrites the on_epoch_end in tf.keras.callbacks.TensorBoard
        # but we do need to run the original on_epoch_end, so here we use the super function.
        super(ExtendedTensorBoard, self).on_epoch_end(epoch, logs=logs)

        if self.histogram_freq and epoch % self.histogram_freq == 0:
            self._log_gradients(epoch)
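
Hooked up to the question's model, usage could look like this; note that val_dataset is a free variable consumed inside _log_gradients, so it has to exist before training starts (the dataset construction and log directory here are assumptions):

val_dataset = tf.data.Dataset.from_tensor_slices(
    (x_test.astype('float32'), y_test))  # consumed inside _log_gradients

tensorboard_cb = ExtendedTensorBoard(log_dir='./logs', histogram_freq=1)
model.fit(x_train, y_train, epochs=5, callbacks=[tensorboard_cb])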