Convert .pb to .tflite for a model of variable input shape

I am working on a problem where I trained a model on a custom dataset using the TensorFlow Object Detection API. I am using TF version 2.2.0. I exported the trained model like this:

output_directory = 'inference_graph'
!python /content/models/research/object_detection/exporter_main_v2.py \
--trained_checkpoint_dir {model_dir} \
--output_directory {output_directory} \
--pipeline_config_path {pipeline_config_path}

I successfully obtained the .pb file and the .ckpt files. But now I need to convert the model to .tflite, and I have not been able to do so; I keep running into one error or another.

I tried the basic approach described in the TensorFlow documentation, but that did not work either. The other code I tried is the following:

import tensorflow as tf
from tensorflow.keras.models import Sequential, Model
from tensorflow.keras.layers import Conv2D, Flatten, MaxPooling2D, Dense, Input, Reshape, Concatenate, GlobalAveragePooling2D, BatchNormalization, Dropout, Activation, GlobalMaxPooling2D
from tensorflow.keras.utils import Sequence

model = tf.saved_model.load(f'/content/drive/MyDrive/FINAL DNET MODEL/inference_graph/saved_model/')
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.post_training_quantize=True
converter.inference_type=tf.uint8
tflite_model = converter.convert()
open("val_converted_model_int8.tflite", "wb").write(tflite_model)

The error I get is:

AttributeError                            Traceback (most recent call last)
in ()
      8 converter.post_training_quantize=True
      9 converter.inference_type=tf.uint8
---> 10 tflite_model = converter.convert()
     11 open("val_converted_model_int8.tflite", "wb").write(tflite_model)

/usr/local/lib/python3.6/dist-packages/tensorflow/lite/python/lite.py in convert(self)
    837     # to None.
    838     # Once we have better support for dynamic shapes, we can remove this.
--> 839     if not isinstance(self._keras_model.call, _def_function.Function):
    840       # Pass keep_original_batch_size=True will ensure that we get an input
    841       # signature including the batch dimension specified by the user.

AttributeError: '_UserObject' object has no attribute 'call'

Can anyone help me solve this?

I don't think the problem is the variable input shape (the error message is just confusing).

tf.saved_model.load returns a loaded SavedModel object (the _UserObject in your traceback), but tf.lite.TFLiteConverter.from_keras_model expects a tf.keras.Model, so it cannot handle that object.
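
As a quick sanity check (just a minimal sketch; saved_model_dir is assumed to be the path to your exported saved_model directory), you can confirm that the loaded object is not a Keras model:

import tensorflow as tf

# saved_model_dir is assumed to point at your exported saved_model directory
loaded = tf.saved_model.load(saved_model_dir)
print(isinstance(loaded, tf.keras.Model))  # False: it is a restored trackable object, not a Keras model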

You need to use the TFLiteConverter.from_saved_model API instead, like this:

saved_model_dir = '/content/drive/MyDrive/FINAL DNET MODEL/inference_graph/saved_model/'
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
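
If you also want the post-training quantization you were attempting with post_training_quantize and inference_type, a rough sketch of the full conversion with the current converter API could look like the one below. The output filename is just a placeholder, and as far as I know full integer (uint8) quantization would additionally need a representative_dataset, which is not shown here:

import tensorflow as tf

saved_model_dir = '/content/drive/MyDrive/FINAL DNET MODEL/inference_graph/saved_model/'
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)

# Dynamic-range post-training quantization (the simplest option)
converter.optimizations = [tf.lite.Optimize.DEFAULT]

tflite_model = converter.convert()
with open('converted_model.tflite', 'wb') as f:  # placeholder filename
    f.write(tflite_model)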

Let us know if you run into any other issues.