How to run Yolov5 tensorflow model.pb inside c++ code?
I have trained a model using YOLOv5 and got model.pt. I converted it with
the export script to a TensorFlow-compatible model.pb, and now I want to
use this model from C++ instead of Python. I did a lot of research but
couldn't figure out how to do this, so where can I find an example that
uses model.pb inside C++ code?
I tried running model.pt using TorchScript and it worked fine. I also
tried running model.onnx; it runs, but slowly. Now I'm trying to run
model.pb.
I couldn't find a way to run model.pb directly, but after a lot of research I was able to run the saved_model instead. These are the important lines:
// Input and output tensor names come from the SavedModel signature;
// they can be listed with: saved_model_cli show --dir ./yolov5s_saved_model --all
const std::string input_node = "serving_default_input_1:0";
const std::vector<std::string> output_nodes = {"StatefulPartitionedCall:0"};

tensorflow::SessionOptions session_options;
tensorflow::RunOptions run_options;
tensorflow::SavedModelBundle bundle;
// path = path to the saved model folder, e.g. "./yolov5s_saved_model/"
tensorflow::Status status = tensorflow::LoadSavedModel(
    session_options, run_options, path, {"serve"}, &bundle);

// image_output is the preprocessed input image as a tensorflow::Tensor
std::vector<std::pair<std::string, tensorflow::Tensor>> inputs_data =
    {{input_node, image_output}};
std::vector<tensorflow::Tensor> predictions;
bundle.GetSession()->Run(inputs_data, output_nodes, {}, &predictions);
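After `Run` returns, `predictions[0]` holds the raw YOLOv5 head output (each row is x, y, w, h, objectness, then per-class scores), so you still need confidence filtering and non-maximum suppression to get final boxes. Here is a minimal, dependency-free sketch of that post-processing; the `Detection` struct, thresholds, and helper names are my own assumptions, not part of the exported model's API:

```cpp
#include <vector>
#include <algorithm>
#include <cstddef>

struct Detection {
    float x, y, w, h;   // box center, width, height (pixels)
    float conf;         // objectness * best class score
    int cls;            // class index
};

// Intersection-over-union of two center-format boxes.
inline float iou(const Detection& a, const Detection& b) {
    float ax1 = a.x - a.w / 2, ay1 = a.y - a.h / 2;
    float ax2 = a.x + a.w / 2, ay2 = a.y + a.h / 2;
    float bx1 = b.x - b.w / 2, by1 = b.y - b.h / 2;
    float bx2 = b.x + b.w / 2, by2 = b.y + b.h / 2;
    float ix = std::max(0.0f, std::min(ax2, bx2) - std::max(ax1, bx1));
    float iy = std::max(0.0f, std::min(ay2, by2) - std::max(ay1, by1));
    float inter = ix * iy;
    float uni = a.w * a.h + b.w * b.h - inter;
    return uni > 0 ? inter / uni : 0.0f;
}

// Decode a flat output buffer of shape [rows, 5 + num_classes]
// (x, y, w, h, objectness, class scores...) into thresholded detections.
std::vector<Detection> decode(const float* data, std::size_t rows,
                              std::size_t num_classes, float conf_thresh) {
    const std::size_t stride = 5 + num_classes;
    std::vector<Detection> out;
    for (std::size_t r = 0; r < rows; ++r) {
        const float* p = data + r * stride;
        if (p[4] < conf_thresh) continue;          // objectness gate
        std::size_t best = 0;                      // highest-scoring class
        for (std::size_t c = 1; c < num_classes; ++c)
            if (p[5 + c] > p[5 + best]) best = c;
        float conf = p[4] * p[5 + best];
        if (conf < conf_thresh) continue;
        out.push_back({p[0], p[1], p[2], p[3], conf, (int)best});
    }
    return out;
}

// Greedy per-class NMS: keep highest-confidence boxes, drop any box
// whose IoU with an already-kept box of the same class is too high.
std::vector<Detection> nms(std::vector<Detection> dets, float iou_thresh) {
    std::sort(dets.begin(), dets.end(),
              [](const Detection& a, const Detection& b) { return a.conf > b.conf; });
    std::vector<Detection> kept;
    for (const auto& d : dets) {
        bool suppressed = false;
        for (const auto& k : kept)
            if (k.cls == d.cls && iou(k, d) > iou_thresh) { suppressed = true; break; }
        if (!suppressed) kept.push_back(d);
    }
    return kept;
}
```

To use it, copy the tensor data out with `predictions[0].flat<float>().data()` and feed it to `decode`, then `nms`; the row count and class count follow from the tensor's shape.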