I built a classification model using the TensorFlow Estimator API and saved it. When converting the SavedModel to TensorFlow Lite, I get an error.
import tensorflow as tf
converter = tf.lite.TFLiteConverter.from_saved_model("/content/drive/MyDrive/tensorflowtest/1618754788")  # path to the SavedModel directory
converter.target_spec.supported_ops = [
tf.lite.OpsSet.TFLITE_BUILTINS, # enable TensorFlow Lite ops.
tf.lite.OpsSet.SELECT_TF_OPS # enable TensorFlow ops.
]
The model saves without any error.
tflite_model = converter.convert()
When I execute this line, I get this exception:
ConverterError Traceback (most recent call last)
/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/convert.py in toco_convert_protos(model_flags_str, toco_flags_str, input_data_str, debug_info_str, enable_mlir_converter)
295 return model_str
296 except Exception as e:
--> 297 raise ConverterError(str(e))
298
299 if distutils.spawn.find_executable(_toco_from_proto_bin) is None:
ConverterError: <unknown>:0: error: loc("head/predictions/str_classes"): 'tf.AsString' op is neither a custom op nor a flex op
<unknown>:0: error: failed while converting: 'main':
Some ops in the model are custom ops, See instructions to implement custom ops: https://www.tensorflow.org/lite/guide/ops_custom
Custom ops: AsString
Details:
tf.AsString(tensor<?x1xi64>) -> (tensor<?x1x!tf.string>) : {device = "", fill = "", precision = -1 : i64, scientific = false, shortest = false, width = -1 : i64}
I tried using tensorflow-nightly, but the error remains.
I am building a classification model with TensorFlow because I want to convert it to TensorFlow Lite for an Android app. If there is another approach that avoids converting to TensorFlow Lite, that would be acceptable too.
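For comparison, this minimal self-contained sketch (a toy Keras model, not my Estimator) converts without error because its output is a plain float tensor rather than a string prediction head, which makes me suspect the `AsString` op in `head/predictions/str_classes` is the only blocker:

```python
import tensorflow as tf

# Toy classifier with purely numeric inputs/outputs -- no AsString op,
# so TFLITE_BUILTINS alone is enough for conversion.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()  # succeeds: returns the serialized flatbuffer
```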