Keras: Mimic PyTorch's conv2d and linear/dense weight initialization?
I'm porting a model from PyTorch to Keras/TensorFlow, and I want to make sure I use the same weight-initialization algorithm. How can I mimic PyTorch's weight initialization in Keras?
If you dig into the PyTorch initialization code, you'll find that the weight-initialization algorithm is surprisingly simple. The comment in that code is accurate; just read the comment and mimic it.
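For reference, PyTorch's reset_parameters for nn.Linear and nn.Conv2d calls kaiming_uniform_ with a=sqrt(5), which works out to a uniform draw on (-1/sqrt(fan_in), 1/sqrt(fan_in)); that is where scale=1/3 with fan-in uniform scaling comes from. Here is a minimal sketch of the arithmetic, assuming that default (check your PyTorch version's source if in doubt):

import math
import torch.nn as nn

# PyTorch default (per nn.Linear.reset_parameters): kaiming_uniform_(weight, a=sqrt(5))
#   gain  = sqrt(2 / (1 + a**2)) = sqrt(1/3)
#   bound = gain * sqrt(3 / fan_in) = 1 / sqrt(fan_in)
# so weights ~ U(-1/sqrt(fan_in), 1/sqrt(fan_in)), i.e. variance = (1/3) / fan_in,
# which matches VarianceScaling(scale=1/3, mode='fan_in', distribution='uniform').

linear = nn.Linear(784, 10)                      # fan_in = 784
bound = 1.0 / math.sqrt(linear.in_features)
observed = linear.weight.abs().max().item()
print(f"expected |w| <= {bound:.5f}, observed max |w| = {observed:.5f}")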
Here is working Keras/TensorFlow code that mimics it:
import tensorflow as tf
from tensorflow.keras import layers

class PytorchInitialization(tf.keras.initializers.VarianceScaling):
    def __init__(self, seed=None):
        # Uniform fan-in scaling with scale=1/3 reproduces PyTorch's default
        # U(-1/sqrt(fan_in), 1/sqrt(fan_in)) draw.
        super().__init__(
            scale=1 / 3, mode='fan_in', distribution='uniform', seed=seed)

# Conv layer
conv = layers.Conv2D(32, 3, activation="relu", padding="SAME",
                     input_shape=(28, 28, 1),
                     kernel_initializer=PytorchInitialization(),
                     bias_initializer=PytorchInitialization())

# Dense / linear layer
classifier = layers.Dense(10,
                          kernel_initializer=PytorchInitialization(),
                          bias_initializer=PytorchInitialization())
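As a quick sanity check (the kernel shape and layer size below are just an illustration), you can call the initializer directly and confirm that the samples stay inside PyTorch's +/- 1/sqrt(fan_in) bound:

import math
import tensorflow as tf

init = PytorchInitialization(seed=0)
fan_in = 784
kernel = init(shape=(fan_in, 10))       # the kernel shape Keras would use for Dense(10) on 784 inputs
bound = 1.0 / math.sqrt(fan_in)         # PyTorch's default bound for this fan-in
max_abs = float(tf.reduce_max(tf.abs(kernel)))
print(f"expected |w| <= {bound:.5f}, observed max |w| = {max_abs:.5f}")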