TensorFlow Serving fails to serve a TensorFlow 2.0 keras.layers.LSTM model
Why does TensorFlow Serving fail to serve this simple Keras LSTM layer, while it runs successfully with saved_model_cli run? How can I fix this?
TensorFlow version: 2.0 alpha
$ pip install tensorflow==2.0.0-alpha0
Code that produces the SavedModel:
import tensorflow as tf
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, LSTM

img_width = 10
img_height = 10

def build_model():
    # A single LSTM layer over a sequence of img_width * img_height = 100 timesteps
    input_img = Input(shape=(img_width * img_height, 1), name='input_data', dtype='float32')
    x = LSTM(12, return_sequences=False, name='lstm_1')(input_img)
    model = Model(input_img, x)
    return model

def save():
    model = build_model()
    tf.saved_model.save(model, "./test_keras_serving/1/")

if __name__ == '__main__':
    save()
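As a sanity check (not part of the original post), the exported signature's input and output tensor names can be listed with saved_model_cli; the names input_data and lstm_1 come from the model definition above:
$ saved_model_cli show --dir ./test_keras_serving/1 --tag_set serve --signature_def serving_default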
TensorFlow Serving installation:
$ docker pull tensorflow/serving
$ docker images -a
REPOSITORY TAG IMAGE ID CREATED SIZE
tensorflow/serving latest 38bee21b2ca0 2 months ago 229MB
Serving the SavedModel:
$ docker run -p 8501:8501 --mount type=bind,source=/the/path/of/dir/test_keras_serving,target=/models/my_model -e MODEL_NAME=my_model -t tensorflow/serving
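For reference, tf.saved_model.save produces (and Serving expects) the usual versioned layout under the mounted directory; Serving loads the highest version number it finds:
test_keras_serving/
└── 1/
    ├── saved_model.pb
    └── variables/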
Python prediction code using TensorFlow Serving:
import json
import requests
import numpy as np

def pred():
    # A batch of 2 sequences, 100 timesteps, 1 feature each
    inp_value = np.zeros((2, 100, 1))
    _url = 'http://localhost:8501/v1/models/my_model:predict'
    headers = {"cache-control": "no-cache", "content-type": "application/json"}
    data = json.dumps({"signature_name": "serving_default", "instances": inp_value.tolist()})
    json_response = requests.post(url=_url, data=data, headers=headers)
    print(json_response)

if __name__ == '__main__':
    pred()
On the client side, the result is <Response [400]> instead of [200].
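To see the same error from the client side, the response body can be printed (standard requests API; not in the original code):
print(json_response.status_code)  # 400
print(json_response.text)         # JSON body carrying the server's error message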
And the server logs:
2019-05-12 13:21:49.370594: W external/org_tensorflow/tensorflow/core/framework/op_kernel.cc:1401] OP_REQUIRES failed at partitioned_function_ops.cc:118 : Invalid argument: Expected input[1] == 'TensorArrayV2Stack/TensorListStack/element_shape:output:0' to be a control input.
In {{node TensorArrayV2Stack/TensorListStack}}
However, the SavedModel works fine with saved_model_cli:
$ saved_model_cli run --dir ./test_keras_serving/1 --tag_set serve --signature_def serving_default --input_exprs input_data=np.zeros((2,100,1))
Output:
Result for output key lstm_1:
[[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]]
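The same check can be reproduced in Python by loading the SavedModel directly (a minimal sketch, assuming the TF 2.0 tf.saved_model.load API; the input and output names are taken from the signature above):
import numpy as np
import tensorflow as tf

loaded = tf.saved_model.load("./test_keras_serving/1/")
infer = loaded.signatures["serving_default"]
out = infer(input_data=tf.constant(np.zeros((2, 100, 1), dtype=np.float32)))
print(out["lstm_1"])  # same all-zero result as the CLI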
How can I get the same result from TensorFlow Serving as from saved_model_cli?
Solved this by pulling the nightly image: docker pull tensorflow/serving:nightly
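Presumably the latest image at the time was built against TF 1.x and did not yet handle the TensorList ops that a TF 2.0 alpha LSTM emits, while the nightly build did. A sketch of the corresponding commands (same flags as in the question; only the image tag changes):
$ docker pull tensorflow/serving:nightly
$ docker run -p 8501:8501 --mount type=bind,source=/the/path/of/dir/test_keras_serving,target=/models/my_model -e MODEL_NAME=my_model -t tensorflow/serving:nightly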