Testing TF serving model fails with bytes as strings and strings as bytes confusion
I'm having trouble serving my text classification model on TensorFlow 1.12. I read in my data with tf.estimator.inputs.pandas_input_fn and train/evaluate with tf.estimator.DNNClassifier. I then want to serve my model.
(Apologies in advance, it's hard to provide a full working example here, but it is very much like the example TF provides at https://www.tensorflow.org/api_docs/python/tf/estimator/DNNClassifier)
I am currently saving my model with...
...
estimator.export_savedmodel("./TEST_SERVING/", self.serving_input_receiver_fn, strip_default_attrs=True)
...
def serving_input_receiver_fn(self):
    """An input receiver that expects a serialized tf.Example."""
    # the feature spec dictionary determines our input parameters for the model
    feature_spec = {
        'Headline': tf.VarLenFeature(dtype=tf.string),
        'Description': tf.VarLenFeature(dtype=tf.string)
    }
    # the inputs will be initially fed as strings with data serialized by
    # Google ProtoBuffers
    serialized_tf_example = tf.placeholder(
        dtype=tf.string, shape=None, name='input_example_tensor')
    receiver_tensors = {'examples': serialized_tf_example}
    # deserialize input
    features = tf.parse_example(serialized_tf_example, feature_spec)
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)
This actually fails at run time with the error:
TypeError: Failed to convert object of type <class 'tensorflow.python.framework.sparse_tensor.SparseTensor'> to Tensor. Contents: SparseTensor(indices=Tensor("ParseExample/ParseExample:0", shape=(?, 2),
dtype=int64), values=Tensor("ParseExample/ParseExample:2", shape=(?,), dtype=string), dense_shape=Tensor("ParseExample/ParseExample:4", shape=(2,), dtype=int64)). Consider casting elements to a supported type.
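(As an aside, one possible workaround for this error, a minimal sketch rather than anything from the original post: tf.parse_example turns each VarLenFeature into a SparseTensor, so you can densify the parsed features before building the receiver, assuming the downstream model expects dense string tensors as it would get from pandas_input_fn.)

import tensorflow as tf

def serving_input_receiver_fn():
    """Sketch: same receiver as above, but with densified features."""
    feature_spec = {
        'Headline': tf.VarLenFeature(dtype=tf.string),
        'Description': tf.VarLenFeature(dtype=tf.string)
    }
    serialized_tf_example = tf.placeholder(
        dtype=tf.string, shape=None, name='input_example_tensor')
    receiver_tensors = {'examples': serialized_tf_example}
    features = tf.parse_example(serialized_tf_example, feature_spec)
    # convert each SparseTensor into a dense string tensor, padding with ''
    features = {
        key: tf.sparse_tensor_to_dense(value, default_value='')
        for key, value in features.items()
    }
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)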
I tried saving it a second way:
def serving_input_receiver_fn(self):
    """Build the serving inputs."""
    INPUT_COLUMNS = ["Headline", "Description"]
    inputs = {}
    for feat in INPUT_COLUMNS:
        inputs[feat] = tf.placeholder(shape=[None], dtype=tf.string, name=feat)
    return tf.estimator.export.ServingInputReceiver(inputs, inputs)
This does work, until I try to test it with saved_model_cli. Some output from saved_model_cli show --all --dir TEST_SERVING/1553879255/:
MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['predict']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['Description'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: Description:0
    inputs['Headline'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: Headline:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['class_ids'] tensor_info:
        dtype: DT_INT64
        shape: (-1, 1)
        name: dnn/head/predictions/ExpandDims:0
    outputs['classes'] tensor_info:
        dtype: DT_STRING
        shape: (-1, 1)
        name: dnn/head/predictions/str_classes:0
    outputs['logits'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 3)
        name: dnn/logits/BiasAdd:0
    outputs['probabilities'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 3)
        name: dnn/head/predictions/probabilities:0
  Method name is: tensorflow/serving/predict
But now I can't seem to test it:
>>> saved_model_cli run --dir TEST_SERVING/1553879255/ --tag_set serve --signature_def predict --input_examples 'inputs=[{"Description":["What is going on"],"Headline":["Help me"]}]'
Traceback (most recent call last):
...
File "/Users/Josh/miniconda3/envs/python36/lib/python3.6/site-packages/tensorflow/python/tools/saved_model_cli.py", line 489, in _create_example_string
feature_list)
TypeError: 'What is going on' has type str, but expected one of: bytes
OK, let's turn these into bytes objects by changing them to b["What is going on"] and b["Help me"]...
ValueError: Type <class 'bytes'> for value b'What is going on' is not supported for tf.train.Feature.
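(A side note on why these attempts fail, with a sketch that is not from the original post: --input_examples has the CLI build tf.Example protos, but this second export's signature expects raw string tensors, not serialized Examples. If you ever do need to build a serialized tf.Example by hand in Python, the string values must be UTF-8 encoded into a BytesList, roughly like this:)

import tensorflow as tf

# string features must be encoded to bytes before going into a BytesList
example = tf.train.Example(features=tf.train.Features(feature={
    'Headline': tf.train.Feature(
        bytes_list=tf.train.BytesList(value=['Help me'.encode('utf-8')])),
    'Description': tf.train.Feature(
        bytes_list=tf.train.BytesList(value=['What is going on'.encode('utf-8')])),
}))
serialized = example.SerializeToString()  # feed this to an 'examples' input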
Any ideas/thoughts?? Thanks!

OK, so eventually I found the answer.
The problem was on the serialization side, which I didn't fully understand. The solution is to pass raw strings in via tf.estimator.export.build_raw_serving_input_receiver_fn.
My save function now looks like this:
def save_serving_model(self, estimator):
    feature_placeholder = {
        'Headline': tf.placeholder('string', [1], name='headline_placeholder'),
        'Description': tf.placeholder('string', [1], name='description_placeholder')
    }
    serving_input_fn = tf.estimator.export.build_raw_serving_input_receiver_fn(
        feature_placeholder)
    estimator.export_savedmodel("TEST_SERVING/", serving_input_fn)
This works when testing with saved_model_cli, i.e.:
saved_model_cli run --dir /path/to/model/ --tag_set serve --signature_def predict --input_exprs="Headline=['Finally, it works'];Description=['Yay, it works']"
Result for output key class_ids:
[[2]]
Result for output key classes:
[[b'2']]
Result for output key logits:
[[-0.56755465 0.31625098 0.39260274]]
Result for output key probabilities:
[[0.16577701 0.40119565 0.4330274 ]]
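(For completeness, a quick way to sanity-check the same export from Python, sketched here under the assumption that tf.contrib.predictor is available in TF 1.12: load the SavedModel and feed raw strings directly, no tf.Example serialization needed.)

from tensorflow.contrib import predictor

# load the exported model; the timestamped directory name is hypothetical
predict_fn = predictor.from_saved_model(
    "TEST_SERVING/1553879255/", signature_def_key="predict")

# raw strings can be fed directly to the raw serving input receiver
result = predict_fn({
    "Headline": ["Finally, it works"],
    "Description": ["Yay, it works"],
})
print(result["probabilities"])  # shape (1, 3), one row per input example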