Stopping Gradient back prop through a particular layer in keras
x = Conv2D(768, (3, 3), padding='same', activation='relu', kernel_initializer='normal',
           name='rpn_conv1', trainable=trainable)(base_layers)
x_class = Conv2D(num_anchors, (1, 1), activation='sigmoid', kernel_initializer='uniform',
                 name='rpn_out_class', trainable=trainable)(x)
# stop gradient backflow through regression layer
x_regr = Conv2D(num_anchors * 4, (1, 1), activation='linear', kernel_initializer='zero',
                name='rpn_out_regress', trainable=trainable)(x)
How can I stop gradient backpropagation through only the regression layer (x_regr) using K.stop_gradient()?
You need a Lambda layer to apply a custom function inside a Keras model.
x_regr_constant = Lambda(
    lambda x: K.stop_gradient(x)  # output_shape is not necessary with the TensorFlow backend
)(x_regr)
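To see the effect, here is a minimal, self-contained sketch (using `tf.keras` and a toy two-head `Dense` model rather than the RPN layers above) that wraps `K.stop_gradient` in a `Lambda` layer and checks that no gradient flows back through the blocked branch:

```python
import tensorflow as tf
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Input, Dense, Lambda
from tensorflow.keras.models import Model

# Toy model: two heads share a trunk; the second head sits behind
# a Lambda(K.stop_gradient) layer, so its loss cannot update the trunk.
inp = Input(shape=(4,))
trunk = Dense(8, activation='relu', name='trunk')(inp)
head_a = Dense(1, name='head_a')(trunk)                 # gradients flow normally
blocked = Lambda(lambda t: K.stop_gradient(t))(trunk)   # gradients blocked here
head_b = Dense(1, name='head_b')(blocked)

model = Model(inp, [head_a, head_b])

x = tf.random.normal((2, 4))
with tf.GradientTape() as tape:
    ya, yb = model(x)
    loss_b = tf.reduce_sum(yb)  # loss computed only from the blocked branch

# No gradient reaches the trunk through head_b: every entry is None.
grads = tape.gradient(loss_b, model.get_layer('trunk').trainable_weights)
print(grads)
```

The `head_b` weights themselves still receive gradients and keep training; only the path from `head_b` back into the shared trunk is cut, which is exactly what blocking the regression branch in the question achieves.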