Pytorch LSTM in ONNX.js - Uncaught (in promise) Error: unrecognized input '' for node: LSTM_4
I am trying to run a PyTorch LSTM network in the browser, but I am getting this error:
graph.ts:313 Uncaught (in promise) Error: unrecognized input '' for node: LSTM_4
at t.buildGraph (graph.ts:313)
at new t (graph.ts:139)
at Object.from (graph.ts:77)
at t.load (model.ts:25)
at session.ts:85
at t.event (instrument.ts:294)
at e.initialize (session.ts:81)
at e.<anonymous> (session.ts:63)
at onnx.min.js:14
at Object.next (onnx.min.js:14)
How can I fix this? Here is the code I use to export the model to ONNX:
import torch

net = torch.load('trained_model/trained_model.pt')
net.eval()
with torch.no_grad():
    input = torch.tensor([[1,2,3,4,5,6,7,8,9]])
    h0, c0 = net.init_hidden(1)
    output, (hn, cn) = net.forward(input, (h0, c0))
    torch.onnx.export(net, (input, (h0, c0)), 'trained_model/trained_model.onnx',
                      input_names=['input', 'h0', 'c0'],
                      output_names=['output', 'hn', 'cn'],
                      dynamic_axes={'input': {0: 'sequence'}})
I set input as the only dynamic axis, since it is the only one whose size can vary. With this code the model saves correctly as trained_model.onnx, but it does give me a warning:
UserWarning: Exporting a model to ONNX with a batch_size other than 1, with a variable length with LSTM can cause an error when running the ONNX model with a different batch size. Make sure to save the model with a batch size of 1, or define the initial states (h0/c0) as inputs of the model.
warnings.warn("Exporting a model to ONNX with a batch_size other than 1, "
This warning is a bit confusing, since I am exporting with a batch_size of 1:
- input has shape torch.Size([1, 9])
- h0 has shape torch.Size([2, 1, 256]), which corresponds to (num_lstm_layers, batch_size, hidden_dim)
- c0 also has shape torch.Size([2, 1, 256])
But since I do define the initial states h0/c0 as inputs of the model, I don't think that is what the problem is about.
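For concreteness, a minimal module whose init_hidden produces tensors of exactly these shapes could look like the sketch below (the vocabulary size and embedding dimension are placeholders, not my real values):

import torch
import torch.nn as nn

class LSTMNet(nn.Module):
    # Minimal sketch illustrating the (num_lstm_layers, batch_size, hidden_dim)
    # convention described above; vocab_size and embed_dim are placeholder values.
    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=256, num_layers=2):
        super().__init__()
        self.hidden_dim = hidden_dim
        self.num_layers = num_layers
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers, batch_first=True)

    def init_hidden(self, batch_size):
        # h0 and c0 both have shape (num_layers, batch_size, hidden_dim),
        # e.g. torch.Size([2, 1, 256]) for a single sequence.
        h0 = torch.zeros(self.num_layers, batch_size, self.hidden_dim)
        c0 = torch.zeros(self.num_layers, batch_size, self.hidden_dim)
        return h0, c0

    def forward(self, x, hidden):
        # x has shape (batch, sequence), e.g. torch.Size([1, 9])
        out, hidden = self.lstm(self.embedding(x), hidden)
        return out, hidden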
Here is the JavaScript code I use to run it in the browser:
<script src="https://cdn.jsdelivr.net/npm/onnxjs/dist/onnx.min.js"></script>
<!-- Code that consumes ONNX.js -->
<script>
// create a session
const myOnnxSession = new onnx.InferenceSession();
console.log('trying to load the model')
// load the ONNX model file
myOnnxSession.loadModel("./trained_model.onnx").then(() => {
console.log('successfully loaded model!')
// after this I generate input and run the model
// since my code fails before this it isn't relevant
});
</script>
Judging by the console.log statements, the model never loads. How should I fix this? In case it is relevant, I am using Python 3.8.5, PyTorch 1.6.0, and ONNX 1.8.0.
For anyone who comes across this in the future: I believe I am getting this error because, even though ONNX supports PyTorch LSTM networks, ONNX.js does not support them yet.
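One way to see this is to list the operators in the exported graph with the onnx package (a quick sanity check, not part of my original export script); if 'LSTM' shows up in the list, the model depends on an operator that ONNX.js does not implement:

import onnx

# Load the exported graph and list the ONNX operators it uses.
model = onnx.load('trained_model/trained_model.onnx')
onnx.checker.check_model(model)
print(sorted({node.op_type for node in model.graph.node}))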
To get around this, I will probably use a simple web application framework called streamlit instead of running the model in the browser.
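A rough sketch of what that could look like, with the PyTorch model running server-side so no ONNX conversion is needed (the text-input parsing here is just a placeholder):

import streamlit as st
import torch

# Load the trained PyTorch model once at startup and run it on the server.
net = torch.load('trained_model/trained_model.pt')
net.eval()

st.title('LSTM demo')
raw = st.text_input('Comma-separated token ids', '1,2,3,4,5,6,7,8,9')

if st.button('Run model'):
    tokens = torch.tensor([[int(t) for t in raw.split(',')]])
    h0, c0 = net.init_hidden(1)
    with torch.no_grad():
        output, (hn, cn) = net(tokens, (h0, c0))
    st.write(output)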