How to convert Inception V4 from .ckpt to .pb in Colab?
I am using a Coral Dev Board and a Jetson TX2 dev board. To deploy a model to them, the model must have the .pb extension.
Is there a link to models that already have the .pb extension? Currently I am using this link:
TF_slim
All the models there only have the .ckpt extension, nothing else. There is no .meta file or anything like that, and I don't know how to convert them to .pb.
I'm working in Colab. Here is my code:
# Now let's download the pretrained model from tensorflow's model zoo.
!mkdir /content/pretrained_model
%cd /content/pretrained_model
!wget http://download.tensorflow.org/models/inception_v4_2016_09_09.tar.gz
!tar xvf inception_v4_2016_09_09.tar.gz
# Exporting the inference graph
!python /content/models/research/slim/export_inference_graph.py \
--alsologtostderr \
--model_name=inception_v4.ckpt \
--output_file=/content/pretrained_model/inception_v4_inf_graph.pb
Here is the error I get:
Traceback (most recent call last):
File "/content/models/research/slim/export_inference_graph.py", line 162, in <module>
tf.app.run()
File "/tensorflow-1.15.2/python3.6/tensorflow_core/python/platform/app.py", line 40, in run
_run(main=main, argv=argv, flags_parser=_parse_flags_tolerate_undef)
File "/usr/local/lib/python3.6/dist-packages/absl/app.py", line 299, in run
_run_main(main, args)
File "/usr/local/lib/python3.6/dist-packages/absl/app.py", line 250, in _run_main
sys.exit(main(argv))
File "/content/models/research/slim/export_inference_graph.py", line 128, in main
FLAGS.dataset_dir)
File "/content/models/research/slim/datasets/dataset_factory.py", line 59, in get_dataset
reader)
File "/content/models/research/slim/datasets/imagenet.py", line 187, in get_split
labels_to_names = create_readable_names_for_imagenet_labels()
File "/content/models/research/slim/datasets/imagenet.py", line 93, in create_readable_names_for_imagenet_labels
filename, _ = urllib.request.urlretrieve(synset_url)
File "/usr/lib/python3.6/urllib/request.py", line 248, in urlretrieve
with contextlib.closing(urlopen(url, data)) as fp:
File "/usr/lib/python3.6/urllib/request.py", line 223, in urlopen
return opener.open(url, data, timeout)
File "/usr/lib/python3.6/urllib/request.py", line 532, in open
response = meth(req, response)
File "/usr/lib/python3.6/urllib/request.py", line 642, in http_response
'http', request, response, code, msg, hdrs)
File "/usr/lib/python3.6/urllib/request.py", line 564, in error
result = self._call_chain(*args)
File "/usr/lib/python3.6/urllib/request.py", line 504, in _call_chain
result = func(*args)
File "/usr/lib/python3.6/urllib/request.py", line 756, in http_error_302
return self.parent.open(new, timeout=req.timeout)
File "/usr/lib/python3.6/urllib/request.py", line 532, in open
response = meth(req, response)
File "/usr/lib/python3.6/urllib/request.py", line 642, in http_response
'http', request, response, code, msg, hdrs)
File "/usr/lib/python3.6/urllib/request.py", line 570, in error
return self._call_chain(*args)
File "/usr/lib/python3.6/urllib/request.py", line 504, in _call_chain
result = func(*args)
File "/usr/lib/python3.6/urllib/request.py", line 650, in http_error_default
raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 404: Not Found
Thanks
There seems to be a bug in this URL in tensorflow/models. I submitted a PR (tensorflow/models#9207) to fix it:
- base_url = 'https://raw.githubusercontent.com/tensorflow/models/master/research/inception/inception/data/'
+ base_url = 'https://raw.githubusercontent.com/tensorflow/models/master/research/slim/datasets'
Making this change fixes the 404 error.
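Until that PR is merged, you can apply the same one-line patch to your local clone of the repo. A minimal sketch, assuming the repo is cloned at /content/models as in your notebook:
# Apply the PR's one-line fix to the cloned copy of imagenet.py,
# replacing the dead base_url with the working one from the diff above
!sed -i "s|base_url = 'https://raw.githubusercontent.com/tensorflow/models/master/research/inception/inception/data/'|base_url = 'https://raw.githubusercontent.com/tensorflow/models/master/research/slim/datasets'|" /content/models/research/slim/datasets/imagenet.py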
See the instructions at https://github.com/tensorflow/models/tree/master/research/slim#exporting-the-inference-graph:
Exporting the Inference Graph
Saves out a GraphDef containing the architecture of the model.
To use it with a model name defined by slim, run:
$ python export_inference_graph.py \
--alsologtostderr \
--model_name=inception_v3 \
--output_file=/tmp/inception_v3_inf_graph.pb
$ python export_inference_graph.py \
--alsologtostderr \
--model_name=mobilenet_v1 \
--image_size=224 \
--output_file=/tmp/mobilenet_v1_224.pb
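Note that --model_name expects the slim model identifier (inception_v4), not the checkpoint filename (inception_v4.ckpt), which is a second problem in your command. Adapted to the paths in your notebook, the export step would presumably look like:
# Export the Inception V4 inference graph (model name, not checkpoint file)
!python /content/models/research/slim/export_inference_graph.py \
  --alsologtostderr \
  --model_name=inception_v4 \
  --output_file=/content/pretrained_model/inception_v4_inf_graph.pb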
Freezing the exported Graph
If you then want to use the resulting model with your own or pretrained checkpoints as part of a mobile model, you can run freeze_graph to get a graph def with the variables inlined as constants using:
bazel build tensorflow/python/tools:freeze_graph
bazel-bin/tensorflow/python/tools/freeze_graph \
--input_graph=/tmp/inception_v3_inf_graph.pb \
--input_checkpoint=/tmp/checkpoints/inception_v3.ckpt \
--input_binary=true --output_graph=/tmp/frozen_inception_v3.pb \
--output_node_names=InceptionV3/Predictions/Reshape_1
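Colab has no bazel-built TensorFlow source tree, but the same tool ships with the pip package and can be run as a module. A sketch for your Inception V4 case, assuming TF 1.x and the paths from your notebook; InceptionV4/Logits/Predictions is my assumption for the slim Inception V4 output node, so verify it against your exported graph before relying on it:
# Freeze the exported graph by inlining the checkpoint variables as constants
!python -m tensorflow.python.tools.freeze_graph \
  --input_graph=/content/pretrained_model/inception_v4_inf_graph.pb \
  --input_checkpoint=/content/pretrained_model/inception_v4.ckpt \
  --input_binary=true \
  --output_graph=/content/pretrained_model/frozen_inception_v4.pb \
  --output_node_names=InceptionV4/Logits/Predictions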