Getting error 'str' object has no attribute 'dtype' when exporting textsum model for TensorFlow Serving
I am currently trying to export the TF textsum model with the PREDICT signature. I have _Decode return the result for a test article string that is passed in, and I then pass that result to build_tensor_info. What is actually being returned is a plain Python string.
Now, when I run the textsum_export.py logic to export the model, it gets as far as building the TensorInfo object and then fails with the traceback below. I know the PREDICT signature is usually used for images. Is that the problem? Since I am working with strings, can I not use it for the textsum model?
The error is:
Traceback (most recent call last):
  File "export_textsum.py", line 129, in Export
    tensor_info_outputs = tf.saved_model.utils.build_tensor_info(res)
  File "/usr/local/lib/python2.7/site-packages/tensorflow/python/saved_model/utils_impl.py", line 37, in build_tensor_info
    dtype_enum = dtypes.as_dtype(tensor.dtype).as_datatype_enum
AttributeError: 'str' object has no attribute 'dtype'
The TF session that exports the model is below:
with tf.Session(config=config) as sess:
    # Restore variables from training checkpoints.
    ckpt = tf.train.get_checkpoint_state(FLAGS.checkpoint_dir)
    if ckpt and ckpt.model_checkpoint_path:
        saver.restore(sess, ckpt.model_checkpoint_path)
        global_step = ckpt.model_checkpoint_path.split('/')[-1].split('-')[-1]
        print('Successfully loaded model from %s at step=%s.' %
              (ckpt.model_checkpoint_path, global_step))
        res = decoder._Decode(saver, sess)
        print("Decoder value {}".format(type(res)))
    else:
        print('No checkpoint file found at %s' % FLAGS.checkpoint_dir)
        return

    # Export model
    export_path = os.path.join(FLAGS.export_dir, str(FLAGS.export_version))
    print('Exporting trained model to %s' % export_path)
    #-------------------------------------------

    tensor_info_inputs = tf.saved_model.utils.build_tensor_info(serialized_tf_example)
    tensor_info_outputs = tf.saved_model.utils.build_tensor_info(res)

    prediction_signature = (
        tf.saved_model.signature_def_utils.build_signature_def(
            inputs={tf.saved_model.signature_constants.PREDICT_INPUTS: tensor_info_inputs},
            outputs={tf.saved_model.signature_constants.PREDICT_OUTPUTS: tensor_info_outputs},
            method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME
        ))
    #----------------------------------

    legacy_init_op = tf.group(tf.tables_initializer(), name='legacy_init_op')
    builder = saved_model_builder.SavedModelBuilder(export_path)
    builder.add_meta_graph_and_variables(
        sess=sess,
        tags=[tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            'predict': prediction_signature,
        },
        legacy_init_op=legacy_init_op)
    builder.save()
    print('Successfully exported model to %s' % export_path)
The PREDICT signature works with tensors. If res is a Python variable of type str, then res_tensor will have dtype tf.string:
res_tensor = tf.convert_to_tensor(res)
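In the export code above, that would mean converting the decoded string to a tensor before building the TensorInfo. A minimal sketch (assuming TF 1.x and that res is the decoded summary string returned by decoder._Decode):

# Wrap the Python str in a 0-d tf.string tensor so build_tensor_info
# can read its dtype and shape instead of failing on a plain str.
res_tensor = tf.convert_to_tensor(res)
tensor_info_outputs = tf.saved_model.utils.build_tensor_info(res_tensor)

The rest of the signature construction (build_signature_def and the SavedModelBuilder calls) can stay as it is.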