Error in a Caffe prototxt: parser expected an identifier?
I am trying to run this simple piece of Python code with Caffe:
import caffe
net = caffe.Net("myfile.prototxt", caffe.TEST)
I get the following message, so I assume there is an error in my .prototxt:
[libprotobuf ERROR google/protobuf/text_format.cc:274] Error parsing text-format caffe.NetParameter: 26:22: Expected identifier.
WARNING: Logging before InitGoogleLogging() is written to STDERR
F0205 14:29:24.097086 1120 upgrade_proto.cpp:88] Check failed: ReadProtoFromTextFile(param_file, param) Failed to parse NetParameter file: caffeModel.prototxt
I cannot make sense of this error. The architecture is just a depthwise separable convolution with a batch-normalization layer in between; maybe I need to add an input layer of some kind?
name: "UNIPINET"
# transform_param {
# scale: 0.017
# mirror: false
# crop_size: 224
# mean_value: [103.94,116.78,123.68]
# }
input: "data"
input_dim: 1
input_dim: 1
input_dim: 63
input_dim: 13
layer {
name: "conv1/dw"
type: "Convolution"
bottom: "data"
top: "conv1/dw"
param {
lr_mult: 1
decay_mult: 1
}
convolution_param {
num_output: 1
bias_term: true
pad: 0
kernel_size: 15, 3
group: 1
#engine: CAFFE
stride: 1
weight_filler {
type: "msra"
}
}
}
layer {
name: "conv1/sep"
type: "Convolution"
bottom: "conv1/dw"
top: "conv1/sep"
param {
lr_mult: 1
decay_mult: 1
}
convolution_param {
num_output: 16
bias_term: false
pad: 0
kernel_size: 1
stride: 1
weight_filler {
type: "msra"
}
}
}
layer {
name: "conv1/sep/bn"
type: "BatchNorm"
bottom: "conv1/sep"
top: "conv1/sep"
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
}
layer {
name: "relu1/sep"
type: "ReLU"
bottom: "conv1/sep"
top: "conv1/sep"
}
layer {
name: "avg_pool"
type: "Pooling"
bottom: "conv1/sep"
top: "pool6"
pooling_param {
pool: AVE
global_pooling: true
}
}
layer {
name: "fc"
type: "InnerProduct"
bottom: "avg_pool"
top: "fc"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
inner_product_param {
num_output: 12
weight_filler {
type: "msra"
}
bias_filler {
type: "constant"
value: 0
}
}
}
layer {
name: "output"
type: "Softmax"
bottom: "fc"
top: "output"
}
Solved: a single kernel_size value means both spatial dimensions are equal. The error location 26:22 points at the line kernel_size: 15, 3. The protobuf text parser treats the comma as a separator between fields, so after the 15 it expects a field name and finds 3 instead, hence "Expected identifier". To use different sizes for H and W, use kernel_h and kernel_w instead.
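For reference, a minimal sketch of the corrected convolution_param for the conv1/dw layer, assuming the 15 was meant to run along the height axis (input_dim 63) and the 3 along the width axis (input_dim 13):

convolution_param {
  num_output: 1
  bias_term: true
  pad: 0
  kernel_h: 15  # kernel height, assumed to cover the height axis (input_dim 63)
  kernel_w: 3   # kernel width, assumed to cover the width axis (input_dim 13)
  group: 1
  stride: 1
  weight_filler {
    type: "msra"
  }
}

As for the side question about an input layer: the input: / input_dim: header used above is the legacy way of declaring the network input and is still accepted by the parser, so the parse error was unrelated to it; the newer equivalent would be an explicit layer of type "Input" with an input_param shape.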