TensorFlow C API: decision forest SavedModel loaded but status is not TF_OK

I am using TensorFlow Decision Forests. I train my model in Python and save it in the SavedModel format. Then, for inference, I am trying to load the model in C using the TensorFlow C API. I found that for this task I need to load the Decision Forests inference.so file from the Python package.

On Debian 10, the Python package can be installed with:

pip3 install tensorflow-decision-forests

After that, in my program, I loaded the inference.so file with TF_LoadLibrary, and then loaded the model with TF_LoadSessionFromSavedModel.

Here is the code:

#include <stdio.h>
#include <tensorflow/c/c_api.h>

int main() {
  TF_Graph *Graph = TF_NewGraph();
  TF_Status *Status = TF_NewStatus();
  TF_SessionOptions *SessionOpts = TF_NewSessionOptions();
  TF_Buffer *RunOpts = NULL;

  // Load the custom ops required by decision forest models.
  TF_Library *Library = TF_LoadLibrary(
      "/home/user/.local/lib/python3.7/site-packages/tensorflow_decision_forests/tensorflow/ops/inference/inference.so",
      Status);
  if (TF_GetCode(Status) != TF_OK) {
    printf("library not loaded: %s\n", TF_Message(Status));
    return 1;
  }

  const char *saved_model_dir = "randomforests-model/";
  const char *tags = "serve";
  int ntags = 1;

  TF_Session *Session = TF_LoadSessionFromSavedModel(
      SessionOpts, RunOpts, saved_model_dir, &tags, ntags, Graph, NULL, Status);

  printf("status: %s\n", TF_Message(Status));

  if (TF_GetCode(Status) == TF_OK) {
    printf("loaded\n");
  } else {
    printf("not loaded\n");
  }

  // Clean up.
  if (Session) {
    TF_CloseSession(Session, Status);
    TF_DeleteSession(Session, Status);
  }
  TF_DeleteLibraryHandle(Library);
  TF_DeleteSessionOptions(SessionOpts);
  TF_DeleteGraph(Graph);
  TF_DeleteStatus(Status);

  return 0;
}

Output:

$ gcc -g main.c -ltensorflow -o main.out
user@debian:/home/code/tftest$ ./main.out 
Hello from TensorFlow C library version 2.7.0-dev20211101
2022-01-22 19:39:28.539621: I tensorflow/cc/saved_model/reader.cc:43] Reading SavedModel from: randomforests-mtproto-model-001028/
2022-01-22 19:39:28.547223: I tensorflow/cc/saved_model/reader.cc:107] Reading meta graph with tags { serve }
2022-01-22 19:39:28.547792: I tensorflow/cc/saved_model/reader.cc:148] Reading SavedModel debug info (if present) from: randomforests-mtproto-model-001028/
2022-01-22 19:39:28.548298: I tensorflow/core/platform/cpu_feature_guard.cc:151] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-01-22 19:39:28.598841: I tensorflow/cc/saved_model/loader.cc:210] Restoring SavedModel bundle.
2022-01-22 19:39:28.743885: I tensorflow/cc/saved_model/loader.cc:194] Running initialization op on SavedModel bundle at path: randomforests-mtproto-model-001028/
[INFO kernel.cc:1153] Loading model from path
[INFO decision_forest.cc:617] Model loaded with 300 root(s), 618972 node(s), and 28 input feature(s).
[INFO abstract_model.cc:1063] Engine "RandomForestOptPred" built
[INFO kernel.cc:1001] Use fast generic engine
2022-01-22 19:39:30.922861: I tensorflow/cc/saved_model/loader.cc:283] SavedModel load for tags { serve }; Status: success: OK. Took 2383248 microseconds.

status: No shape inference function exists for op 'SimpleMLLoadModelFromPathWithHandle', did you forget to define it?
not loaded

The problem is that the loader's log output says the model was loaded successfully (Status: success: OK), but the Status variable does not equal TF_OK, and the associated message is: No shape inference function exists for op 'SimpleMLLoadModelFromPathWithHandle', did you forget to define it?

So how can I load the model in the right way?

After getting no answer here for a long time, I asked on the TensorFlow forum and received an answer: the current version of TensorFlow seems to have a problem loading decision forests through the C API. Instead, we can use the Yggdrasil library discussed in that answer.