Convert graph (pb) to SavedModel for gcloud ml-engine predict
I trained an object detector with Cloud Machine Learning Engine following the recent post by Google’s Derek Chow on the Google Cloud Big Data And Machine Learning Blog, and now want to run predictions with Cloud Machine Learning Engine.
The instructions include the code to export the TensorFlow graph as output_inference_graph.pb, but not how to convert that protobuf format (pb) into the SavedModel format required for gcloud ml-engine prediction.
I looked at the answer by Google’s @rhaertel80 on how to convert a “Tensorflow For Poets” image classification model, and at the one on how to convert a “Tensorflow For Poets 2” image classification model, but neither seems to apply to the object detector graph (pb) described in the blog post.
How can I convert the object detector graph (pb) so it can be used for prediction with gcloud ml-engine?
A SavedModel contains a MetaGraphDef inside its structure.
To create a SavedModel from a GraphDef in Python, you would likely want to use the builder, as described in the link:
export_dir = ...
...
builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
with tf.Session(graph=tf.Graph()) as sess:
    ...
    builder.add_meta_graph_and_variables(sess,
                                         [tag_constants.TRAINING],
                                         signature_def_map=foo_signatures,
                                         assets_collection=foo_assets)
...
with tf.Session(graph=tf.Graph()) as sess:
    ...
    builder.add_meta_graph(["bar-tag", "baz-tag"])
...
builder.save()
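Since the builder snippet above is generic, here is a minimal sketch of how one might wrap a frozen object-detection graph (pb) into a SavedModel with a serving signature. This is an assumption-laden sketch, not the blog post's own method: the tensor names (`image_tensor`, `detection_boxes`, `detection_scores`, `detection_classes`, `num_detections`) are the ones the TF Object Detection API conventionally exports, so verify them against your own graph; `tf.compat.v1` is used so the snippet also runs on newer TensorFlow versions.

```python
# Sketch: convert a frozen inference graph (pb) into a SavedModel.
# Assumes Object Detection API tensor names; check them in your graph.
import tensorflow.compat.v1 as tf


def convert_pb_to_saved_model(frozen_graph_path, export_dir):
    # Read the frozen GraphDef (e.g. the output_inference_graph.pb file).
    graph_def = tf.GraphDef()
    with tf.gfile.GFile(frozen_graph_path, 'rb') as f:
        graph_def.ParseFromString(f.read())

    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
    with tf.Session(graph=tf.Graph()) as sess:
        # Import the frozen graph into this session's (empty) graph.
        tf.import_graph_def(graph_def, name='')
        g = sess.graph
        inputs = {'inputs': tf.saved_model.utils.build_tensor_info(
            g.get_tensor_by_name('image_tensor:0'))}
        outputs = {
            name: tf.saved_model.utils.build_tensor_info(
                g.get_tensor_by_name(name + ':0'))
            for name in ('detection_boxes', 'detection_scores',
                         'detection_classes', 'num_detections')
        }
        signature = tf.saved_model.signature_def_utils.build_signature_def(
            inputs=inputs,
            outputs=outputs,
            method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)
        # A frozen graph holds constants only, so no variables are saved here.
        builder.add_meta_graph_and_variables(
            sess, [tf.saved_model.tag_constants.SERVING],
            signature_def_map={
                tf.saved_model.signature_constants
                .DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature
            })
    builder.save()
```

The resulting `export_dir` should contain a `saved_model.pb` plus (empty) variables, which is the layout gcloud ml-engine expects when you create a model version.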
This post saved me! Hope it helps people here. The method I used exported successfully:
https://github.com/tensorflow/tensorflow/pull/15855/commits/81ec5d20935352d71ff56fac06c36d6ff0a7ae05
def export_model(sess, architecture, saved_model_dir):
  if architecture == 'inception_v3':
    input_tensor = 'DecodeJpeg/contents:0'
  elif architecture.startswith('mobilenet_'):
    input_tensor = 'input:0'
  else:
    raise ValueError('Unknown architecture', architecture)
  in_image = sess.graph.get_tensor_by_name(input_tensor)
  inputs = {'image': tf.saved_model.utils.build_tensor_info(in_image)}

  out_classes = sess.graph.get_tensor_by_name('final_result:0')
  outputs = {'prediction': tf.saved_model.utils.build_tensor_info(out_classes)}

  signature = tf.saved_model.signature_def_utils.build_signature_def(
      inputs=inputs,
      outputs=outputs,
      method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME
  )

  legacy_init_op = tf.group(tf.tables_initializer(), name='legacy_init_op')

  # Save out the SavedModel.
  builder = tf.saved_model.builder.SavedModelBuilder(saved_model_dir)
  builder.add_meta_graph_and_variables(
      sess, [tf.saved_model.tag_constants.SERVING],
      signature_def_map={
          tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
              signature
      },
      legacy_init_op=legacy_init_op)
  builder.save()
# Call this at the end of def main(_):
export_model(sess, FLAGS.architecture, FLAGS.saved_model_dir)
parser.add_argument(
'--saved_model_dir',
type=str,
default='/tmp/saved_models/1/',
help='Where to save the exported graph.'
)