Failed to use a custom model in the TensorFlow Lite object detection Android app
I trained a model with Google AutoML and exported a TensorFlow Lite model to detect plastic bottles and similar objects. I want to use it in the TensorFlow Lite object detection Android example, but it fails.
This is the GitHub repo I am referring to: https://github.com/tensorflow/examples/tree/master/lite/examples/object_detection/android
I replaced the .tflite file and the label .txt file with my own, and installing from Android Studio works fine, but the app crashes and will not run:
public class DetectorActivity extends CameraActivity implements OnImageAvailableListener {
private static final Logger LOGGER = new Logger();
// Configuration values for the prepackaged SSD model.
private static final int TF_OD_API_INPUT_SIZE = 300;
private static final boolean TF_OD_API_IS_QUANTIZED = true;
private static final String TF_OD_API_MODEL_FILE = "swai.tflite";
private static final String TF_OD_API_LABELS_FILE = "file:///android_asset/swai.txt";
private static final DetectorMode MODE = DetectorMode.TF_OD_API;
// Minimum detection confidence to track a detection.
private static final float MINIMUM_CONFIDENCE_TF_OD_API = 0.5f;
private static final boolean MAINTAIN_ASPECT = false;
private static final Size DESIRED_PREVIEW_SIZE = new Size(640, 480);
private static final boolean SAVE_PREVIEW_BITMAP = false;
private static final float TEXT_SIZE_DIP = 10;
OverlayView trackingOverlay;
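For context, here is a minimal sketch of how the example's TFLiteObjectDetectionAPIModel derives the input buffer size from these constants (paraphrased from the linked repo; the standalone class and helper name are mine). With TF_OD_API_INPUT_SIZE = 300 and a quantized model it allocates 1 * 300 * 300 * 3 = 270000 bytes, which is exactly the ByteBuffer size reported in the crash below.
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class InputBufferSizing {
  // Roughly mirrors the allocation in TFLiteObjectDetectionAPIModel:
  // a quantized (uint8) model uses 1 byte per channel, a float model uses 4.
  public static ByteBuffer allocateInputBuffer(int inputSize, boolean isQuantized) {
    int numBytesPerChannel = isQuantized ? 1 : 4;
    // batch of 1, inputSize x inputSize pixels, 3 channels (RGB)
    ByteBuffer imgData =
        ByteBuffer.allocateDirect(1 * inputSize * inputSize * 3 * numBytesPerChannel);
    imgData.order(ByteOrder.nativeOrder());
    return imgData;
  }

  public static void main(String[] args) {
    System.out.println(allocateInputBuffer(300, true).capacity()); // 270000
    System.out.println(allocateInputBuffer(512, true).capacity()); // 786432
  }
}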
This is the error I get when deploying to a virtual device:
09-17 13:32:09.283 1599-1856/? D/gralloc_ranchu: gralloc_alloc: Creating ashmem region of size 462848
09-17 13:32:09.325 9980-10000/org.tensorflow.lite.examples.detection E/AndroidRuntime: FATAL EXCEPTION: inference
Process: org.tensorflow.lite.examples.detection, PID: 9980
java.lang.IllegalArgumentException: Cannot convert between a TensorFlowLite buffer with 786432 bytes and a ByteBuffer with 270000 bytes.
at org.tensorflow.lite.Tensor.throwIfShapeIsIncompatible(Tensor.java:272)
at org.tensorflow.lite.Tensor.throwIfDataIsIncompatible(Tensor.java:249)
at org.tensorflow.lite.Tensor.setTo(Tensor.java:110)
at org.tensorflow.lite.NativeInterpreterWrapper.run(NativeInterpreterWrapper.java:151)
at org.tensorflow.lite.Interpreter.runForMultipleInputsOutputs(Interpreter.java:275)
at org.tensorflow.lite.examples.detection.tflite.TFLiteObjectDetectionAPIModel.recognizeImage(TFLiteObjectDetectionAPIModel.java:193)
at org.tensorflow.lite.examples.detection.DetectorActivity.run(DetectorActivity.java:181)
at android.os.Handler.handleCallback(Handler.java:873)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:193)
at android.os.HandlerThread.run(HandlerThread.java:65)
I think I am missing the settings needed to make this model work. Any suggestions? Thanks!
Case closed: solved by changing TF_OD_API_INPUT_SIZE from 300 to 512. 786432 corresponds to 512x512x3, but my input was only 270000 bytes (300x300x3).
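If you want to avoid guessing, the input size can also be read directly from the .tflite file with the TFLite Java API. Below is a minimal sketch (the class name ModelInputCheck is made up for illustration); given the 786432-byte input tensor reported in the crash, it should show a [1, 512, 512, 3] UINT8 input, i.e. TF_OD_API_INPUT_SIZE = 512.
import java.io.File;
import java.util.Arrays;

import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.Tensor;

public class ModelInputCheck {
  // Prints the model's input tensor shape, dtype and byte size so that
  // TF_OD_API_INPUT_SIZE and TF_OD_API_IS_QUANTIZED can be set to match.
  public static void printInputInfo(File modelFile) {
    Interpreter interpreter = new Interpreter(modelFile);
    try {
      Tensor input = interpreter.getInputTensor(0);
      System.out.println("Input shape: " + Arrays.toString(input.shape())); // e.g. [1, 512, 512, 3]
      System.out.println("Input dtype: " + input.dataType());               // e.g. UINT8
      System.out.println("Input bytes: " + input.numBytes());               // e.g. 786432
    } finally {
      interpreter.close();
    }
  }
}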