Caffe test net with images as labels
Question
I am trying to create a CNN in which I use images as labels, with values between 0 and 1. After some training my net has a loss of about 23. Now I want to look at the results. For this I am using this Python script:
import caffe
import numpy as np
from PIL import Image
net = caffe.Net('D:/caffe/net.prototxt',
                'D:/caffe/net_iter_35000.caffemodel',
                caffe.TEST)
# load input and configure preprocessing
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_mean('data', np.load('train_mean.npy').mean(1).mean(1))
transformer.set_transpose('data', (2,0,1))
transformer.set_channel_swap('data', (2,1,0))
transformer.set_raw_scale('data', 255.0)
#note we can change the batch size on-the-fly
#since we classify only one image, we change batch size from 10 to 1
net.blobs['data'].reshape(1,3,360,360)
#load the image in the data layer
im = caffe.io.load_image('train/img0.png')
net.blobs['data'].data[...] = transformer.preprocess('data', im)
#compute
out = net.forward()
result = out['conv7'][0][0]
Now I would expect the values of the result to be roughly between 0 and 1. But in fact result.max() returns 5.92 and result.min() returns -4315.5.
Is there an error in the Python script, or are these values normal for a loss of 23?
Additional information
My train_test.prototxt:
layer {
  name: "mynet"
  type: "Data"
  top: "data0"
  top: "label0"
  include {
    phase: TRAIN
  }
  transform_param {
    mean_file: "train_mean.binaryproto"
    scale: 0.00390625
  }
  data_param {
    source: "train_lmdb"
    batch_size: 32
    backend: LMDB
  }
}
layer {
  name: "mynetlabel"
  type: "Data"
  top: "data1"
  top: "label1"
  include {
    phase: TRAIN
  }
  transform_param {
    scale: 0.00390625
  }
  data_param {
    source: "train_label_lmdb_2"
    batch_size: 32
    backend: LMDB
  }
}
layer {
  name: "mnist"
  type: "Data"
  top: "data0"
  top: "label0"
  include {
    phase: TEST
  }
  transform_param {
    mean_file: "train_mean.binaryproto"
    scale: 0.00390625
  }
  data_param {
    source: "val_lmdb"
    batch_size: 16
    backend: LMDB
  }
}
layer {
  name: "mnistlabel"
  type: "Data"
  top: "data1"
  top: "label1"
  include {
    phase: TEST
  }
  transform_param {
    scale: 0.00390625
  }
  data_param {
    source: "val_label_lmdb_2"
    batch_size: 16
    backend: LMDB
  }
}
.
.
.
layer {
  name: "conv7"
  type: "Convolution"
  bottom: "conv6"
  top: "conv7"
  param {
    lr_mult: 5.0
    decay_mult: 1.0
  }
  param {
    lr_mult: 10.0
    decay_mult: 0.0
  }
  convolution_param {
    num_output: 1
    pad: 0
    kernel_size: 1
    weight_filler {
      type: "gaussian"
      std: 0.00999999977648
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "conv7"
  bottom: "data1"
  top: "accuracy"
  include {
    phase: TEST
  }
}
layer {
  name: "loss"
  type: "SigmoidCrossEntropyLoss"
  bottom: "conv7"
  bottom: "data1"
  top: "loss"
}
My net.prototxt:
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param { shape: { dim: 50 dim: 3 dim: 360 dim: 360 } }
  transform_param {
    scale: 0.00390625
  }
}
.
.
.
layer {
  name: "conv7"
  type: "Convolution"
  bottom: "conv6"
  top: "conv7"
  param {
    lr_mult: 5.0
    decay_mult: 1.0
  }
  param {
    lr_mult: 10.0
    decay_mult: 0.0
  }
  convolution_param {
    num_output: 1
    pad: 0
    kernel_size: 1
    weight_filler {
      type: "gaussian"
      std: 0.00999999977648
    }
    bias_filler {
      type: "constant"
    }
  }
}
Your train_val.prototxt uses a "SigmoidCrossEntropyLoss" layer. As the name of this layer suggests, it comprises (internally) a "Sigmoid" layer and a cross-entropy loss. Therefore, when deploying your net you should replace this loss layer with a plain "Sigmoid" layer in your net.prototxt file. See this thread for more details.
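As a minimal sketch of what this means for the Python script above (reusing its out variable and numpy import, and leaving net.prototxt untouched rather than adding a "Sigmoid" layer to it), you can apply the sigmoid to the raw conv7 output yourself after the forward pass:
# out['conv7'] holds the raw, pre-sigmoid conv7 activations from net.forward().
# The logistic sigmoid maps them into (0, 1), which is what the
# "SigmoidCrossEntropyLoss" layer applies internally during training.
logits = out['conv7'][0][0]
probs = 1.0 / (1.0 + np.exp(-logits))
# Very large negative logits (e.g. -4315.5) may trigger an overflow warning
# in np.exp, but the result still saturates cleanly to 0.
print(probs.min(), probs.max())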
PS,
Caffe does not support the "Accuracy" layer for a single binary output: the "Accuracy" layer assumes the dimension of your prediction equals the number of class labels (which suits "SoftmaxWithLoss"). In your case you have two labels, {0, 1}, but your output dim is only 1. See this thread for more information.
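Since the "Accuracy" layer cannot be used here, a simple workaround (a sketch only; binary_accuracy, probs and labels are assumed names, not from the original post) is to threshold the sigmoid output at 0.5 and compare it against the binary label image yourself:
import numpy as np

# Hypothetical helper: per-pixel accuracy for a single-channel binary prediction.
def binary_accuracy(probs, labels, threshold=0.5):
    preds = (probs >= threshold).astype(np.uint8)
    return float(np.mean(preds == labels.astype(np.uint8)))

# Example with stand-in data shaped like the 360x360 network output:
probs = np.random.rand(360, 360)              # sigmoid outputs in (0, 1)
labels = np.random.randint(0, 2, (360, 360))  # ground-truth binary label image
print(binary_accuracy(probs, labels))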