Using GPU despite setting CPU_Only, yielding unexpected keyword argument
I am installing Caffe on an Ubuntu 14.04 virtual server and installed CUDA (without the driver), using https://github.com/BVLC/caffe/wiki/Ubuntu-14.04-VirtualBox-VM as inspiration. During the installation I edited the Makefile to include "CPU_ONLY := 1" before building. However, Caffe still seems to be trying to use the GPU. When I try to run a test example, I get the following error:
python python/classify.py examples/images/cat.jpg foo
Traceback (most recent call last):
File "python/classify.py", line 130, in <module>
main(sys.argv)
File "python/classify.py", line 103, in main
channel_swap=channel_swap)
TypeError: __init__() got an unexpected keyword argument 'gpu'
How do I fix this and run entirely on the CPU?
There are currently some issues caused by the large number of interface changes introduced by the Caffe developers; the Python wrappers have not yet been updated to match these changes.
See the PR that fixes the issue: https://github.com/BVLC/caffe/pull/1964
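For reference, a minimal CPU-only sketch against the updated Python interface might look like the following. The model paths are the standard bundled CaffeNet files and are only illustrative; adjust them to your checkout.
import caffe

# CPU vs. GPU is now a global switch rather than a Classifier keyword argument
caffe.set_mode_cpu()

# Build the classifier without the removed 'gpu=' keyword
net = caffe.Classifier(
    'models/bvlc_reference_caffenet/deploy.prototxt',
    'models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel',
    image_dims=(256, 256),
    raw_scale=255,
    channel_swap=(2, 1, 0))

# Classify a single image entirely on the CPU
img = caffe.io.load_image('examples/images/cat.jpg')
predictions = net.predict([img])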
I would like to add a few words to Mailerdaimon's answer.
I followed the installation guide (https://github.com/BVLC/caffe/wiki/Ubuntu-14.04-VirtualBox-VM) to set up Caffe in my Vagrant virtual machine. FYI, virtual machines DO NOT support GPU acceleration. Back to the point: after I applied the 'CPU / GPU switch in example scripts' fix (https://github.com/BVLC/caffe/pull/2058) and added the '--print_results --labels_file' options (https://github.com/jetpacapp/caffe/blob/master/python/classify.py) to 'python/classify.py', the command './python/classify.py ./examples/images/cat.jpg foo --print_results' still threw the following error:
Traceback (most recent call last):
File "./python/classify.py", line 175, in <module>
main(sys.argv)
File "./python/classify.py", line 129, in main
channel_swap=channel_swap)
File "/home/vagrant/caffe/python/caffe/classifier.py", line 38, in __init__
self.transformer.set_mean(in_, mean)
File "/home/vagrant/caffe/python/caffe/io.py", line 267, in set_mean
raise ValueError('Mean shape incompatible with input shape.')
ValueError: Mean shape incompatible with input shape.
I then dumped the shapes of 'mean' (3*256*256) and 'input' (3*227*227). Obviously the two shapes are incompatible. But the old version of 'set_mean()' did not throw an error, so I dug into the Python code and found that the old 'set_mean()' function looks like this (python/caffe/pycaffe.py, lines 195-202, https://github.com/jetpacapp/caffe/):
if mode == 'elementwise':
    if mean.shape != in_shape[1:]:
        # Resize mean (which requires H x W x K input in range [0,1]).
        m_min, m_max = mean.min(), mean.max()
        normal_mean = (mean - m_min) / (m_max - m_min)
        mean = caffe.io.resize_image(normal_mean.transpose((1,2,0)),
                in_shape[2:]).transpose((2,0,1)) * (m_max - m_min) + m_min
But in the latest Caffe, the contributors have wrapped 'set_mean()' and the other transformation functions into the class 'Transformer'. The new 'set_mean()' function looks like this (python/caffe/io.py, lines 253-254, https://github.com/BVLC/caffe/):
if ms != self.inputs[in_][1:]:
    raise ValueError('Mean shape incompatible with input shape.')
Oh my, how can these two be the same function? So I modified the new 'set_mean()': I commented out the statement that raises the error and added the same shape-adjusting routine as the old 'set_mean()':
if ms != ins:
    print(self.inputs[in_])
    in_shape = self.inputs[in_][1:]
    m_min, m_max = mean.min(), mean.max()
    normal_mean = (mean - m_min) / (m_max - m_min)
    mean = resize_image(normal_mean.transpose((1,2,0)),
            in_shape[1:]).transpose((2,0,1)) * \
            (m_max - m_min) + m_min
    '''
    raise ValueError('Mean shape incompatible with input shape.')
    '''
Voila, problem solved.
Classifying 1 inputs.
Done in 1.17 s.
[('tabby', '0.27933'), ('tiger cat', '0.21915'), ('Egyptian cat', '0.16064'), ('lynx', '0.12844'), ('kit fox', '0.05155')]
Another Google group thread solved the same error:
All you need to do is change this:
mean=np.load(mean_file)
to this:
mean=np.load(mean_file).mean(1).mean(1)
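This works because averaging over the two spatial axes turns the (3, 256, 256) per-pixel mean into a (3,) per-channel mean, which the new 'Transformer.set_mean()' accepts and broadcasts over any input size. A minimal sketch, assuming the ImageNet mean file shipped with Caffe (adjust the path to your checkout):
import numpy as np

mean = np.load('python/caffe/imagenet/ilsvrc_2012_mean.npy')
print(mean.shape)                   # (3, 256, 256): one value per channel and pixel

per_channel = mean.mean(1).mean(1)  # average over height, then over width
print(per_channel.shape)            # (3,): one value per BGR channel

# A 1-D mean is broadcast across all pixels during preprocessing, so the
# 'Mean shape incompatible with input shape' check no longer applies.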
User 2696499's answer just has one typo; the corrected version is:
if ms != self.inputs[in_][1:]:
    print(self.inputs[in_])
    in_shape = self.inputs[in_][1:]
    m_min, m_max = mean.min(), mean.max()
    normal_mean = (mean - m_min) / (m_max - m_min)
    mean = resize_image(normal_mean.transpose((1,2,0)),
            in_shape[1:]).transpose((2,0,1)) * \
            (m_max - m_min) + m_min
    '''
    raise ValueError('Mean shape incompatible with input shape.')
    '''