Using Caffe2 to create a model that uses dropout but getting an error related to dropout code
I'm trying to create a model in Caffe2 that uses dropout, but I get an error that points at the dropout code in my model.
def someModel(model, data):
    conv1 = brew.conv(model, data, 'conv1', dim_in=1, dim_out=20, kernel=5)
    conv_relu_1 = model.net.Relu(conv1, 'relu1')
    conv2 = brew.conv(model, conv_relu_1, 'conv2', dim_in=1, dim_out=20, kernel=5)
    conv_relu_2 = model.net.Relu(conv2, 'relu2')
    pool1 = model.net.MaxPool(conv_relu_2, 'pool1', kernel=2, stride=2)
    drop1 = model.Dropout(pool1, 'drop1', ratio=0.5, is_test=0)
    #drop1 = model.Dropout(pool1, 'drop1', ratio=0.5)
    conv3 = brew.conv(model, drop1, 'conv3', dim_in=1, dim_out=50, kernel=3)
    conv_relu_3 = model.net.Relu(conv3, 'relu3')
    conv4 = brew.conv(model, conv_relu_3, 'conv4', dim_in=1, dim_out=20, kernel=5)
    conv_relu_4 = model.net.Relu(conv4, 'relu4')
    pool2 = model.net.MaxPool(conv_relu_4, 'pool1', kernel=2, stride=2)
    drop2 = model.Dropout(pool2, 'drop2', ratio=0.5)
    fc1 = brew.fc(model, drop2, 'fc1', dim_in=20 * 4 * 4, dim_out=50)
    fc_relu_1 = model.net.Relu(fc1, 'relu5')
    fc2 = brew.fc(model, fc_relu_1, 'fc2', dim_in=50 * 4 * 4, dim_out=10)
    pred = brew.fc(model, fc2, 'pred', 500, 10)
    softmax = model.net.Softmax(pred, 'softmax')
    return softmax
    return pred
Here is the error I'm getting:
Exception when creating gradient for [Dropout]:[enforce fail at operator_gradient.h:86] schema->Verify(def_). (GradientMaker) Operator def did not pass schema checking: input: "pool1" output: "drop2" name: "" type: "Dropout" arg { name: "ratio" f: 0.5 } . Op: input: "pool1" output: "drop2" name: "" type: "Dropout" arg {name: "ratio" f: 0.5}
Define the dropout layer through brew instead:

dropout1 = brew.dropout(model, pool1, 'dropout1', ratio=0.5, is_test=0)

Calling model.Dropout(...) emits the raw Dropout operator on the net, without the extra bookkeeping (the mask output and the is_test argument) that the training-time operator schema expects, so schema verification fails when the gradient is created. brew.dropout wires the operator up correctly for training.
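For context, both dropout calls in the posted someModel would change the same way. A minimal sketch, assuming the surrounding layers stay exactly as posted (this fragment needs a working Caffe2 install and lives inside someModel):

```python
from caffe2.python import brew

# Replace the raw model.Dropout(...) calls with brew.dropout(...).
# brew.dropout registers the operator with the outputs/arguments the
# training-time schema expects; is_test=0 marks training mode.
drop1 = brew.dropout(model, pool1, 'drop1', ratio=0.5, is_test=0)

# ... conv3/conv4/pool2 as in the original model ...

drop2 = brew.dropout(model, pool2, 'drop2', ratio=0.5, is_test=0)
```

Note that the second call in the original code also omitted is_test entirely, which matches the operator shown in the error message (only the ratio argument is present).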