Tensorflow Addons R2 ValueError: Dimension 0 in both shapes must be equal, but are 1 and 5
I've been trying to add tfa metrics to my model compile so that they are tracked throughout training. However, when I add the R2 metric I get the following error. I thought y_shape=(1,) would fix this, but it doesn't.
ValueError: Dimension 0 in both shapes must be equal, but are 1 and 5. Shapes are [1] and [5]. for '{{node AssignAddVariableOp_8}} = AssignAddVariableOp[dtype=DT_FLOAT](AssignAddVariableOp_8/resource, Sum_6)' with input shapes: [], [5].
My code looks like this:
import tensorflow_addons as tfa
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Input, Dense, Normalization
from tensorflow.keras.regularizers import l2
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.metrics import MeanSquaredError, MeanAbsoluteError

model = Sequential()
model.add(Input(shape=(4,)))
model.add(Normalization())
model.add(Dense(5, activation="relu", kernel_regularizer=l2(l2=1e-2)))
print(model.summary())

opt = Adam(learning_rate=1e-2)
model.compile(loss="mean_squared_error",
              optimizer=opt,
              metrics=[MeanSquaredError(name="mse"),
                       MeanAbsoluteError(name="mae"),
                       tfa.metrics.RSquare(name="R2", y_shape=(1,))])

history = model.fit(x=training_x,
                    y=training_y,
                    epochs=10,
                    batch_size=64,
                    validation_data=(validation_x, validation_y))
Any help is much appreciated! Note that I also tried changing y_shape to (5,), but then I get the error that the dimensions are not equal, but are 5 and 1 instead...
Your model's last Dense layer outputs 5 values per sample, while RSquare with y_shape=(1,) expects a single target value, which is what causes the shape mismatch. You need to add an output layer to your model, like this:
model.add(Dense(1))
Then your model will look like this:
from tensorflow.keras import Sequential, regularizers
from tensorflow.keras.layers import Input, Dense, Normalization

model = Sequential()
model.add(Input(shape=(4,)))
model.add(Normalization())
model.add(Dense(5, activation="relu", kernel_regularizer=regularizers.l2(l2=1e-2)))
model.add(Dense(1))  # single-unit output layer so predictions have shape (None, 1)
print(model.summary())
Output:
Model: "sequential_10"
_________________________________________________________________
 Layer (type)                      Output Shape         Param #
=================================================================
 normalization_10 (Normalization)  (None, 4)            9
 dense_12 (Dense)                  (None, 5)            25
 dense_13 (Dense)                  (None, 1)            6
=================================================================
Total params: 40
Trainable params: 31
Non-trainable params: 9
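With the single-unit output layer in place, the model's predictions have shape (None, 1), which matches y_shape=(1,) passed to RSquare, so the metric's accumulator no longer clashes with a per-batch sum of shape [5]. A minimal sketch of the compile/fit step, reusing the arguments from the question (training_x, training_y, validation_x and validation_y are assumed to be your own arrays with one target value per sample):

import tensorflow_addons as tfa
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.metrics import MeanSquaredError, MeanAbsoluteError

# y_shape=(1,) now matches the (None, 1) output of the final Dense(1) layer
model.compile(loss="mean_squared_error",
              optimizer=Adam(learning_rate=1e-2),
              metrics=[MeanSquaredError(name="mse"),
                       MeanAbsoluteError(name="mae"),
                       tfa.metrics.RSquare(name="R2", y_shape=(1,))])

history = model.fit(x=training_x,
                    y=training_y,
                    epochs=10,
                    batch_size=64,
                    validation_data=(validation_x, validation_y))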