Export ONNX #331
Hi Victor, thanks for bringing this topic up. width, height and n_channels are only needed for CNN models. If the model is an RNN (meaning there is at least one Recurrent layer), these parameters can be None. Thanks,
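To make the rule above concrete, here is a minimal, purely illustrative sketch in Python. It is not the project's actual API (the function name, layer-type strings, and signature are all assumptions); it just shows the logic being described: spatial input dimensions are validated only when a convolutional layer is present.

```python
# Hypothetical sketch, not the project's real validation code.
def validate_input_layer(layer_types, width=None, height=None, n_channels=None):
    """Require spatial input dimensions only for CNN models."""
    has_conv = any(t == "Convolution" for t in layer_types)
    if has_conv and None in (width, height, n_channels):
        raise ValueError("width, height and n_channels are required for CNN models")
    return True

# A pure-RNN model validates without spatial dimensions:
validate_input_layer(["Recurrent", "Dense"])
# A CNN without them would raise ValueError:
# validate_input_layer(["Convolution", "Dense"])
```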
Hi Victor, please let me know if the answer solves the question so we can close the issue. Thanks,
Hi Maggie, thanks a lot for your answer! I raised this issue because I trained an RNN for a sentiment analysis task and then tried to export it in ONNX format:
And got the following error:
I managed to export it by adding a dummy value to the InputLayer parameters. I think that requirement could be relaxed for non-image models, as suggested in the issue description. Thanks,
Hi Victor - we don't officially support exporting an RNN model to ONNX format. With your workaround, aren't you getting an OnnxWriteError exception for your RNN layers? The error message should be something like "<layer_type> is not supported". Doug
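The check Doug describes can be sketched as follows. This is an illustrative mock-up, not the library's real exporter: the exception class is named after the one mentioned above, but the supported-layer set and function are assumptions. The idea is simply that the ONNX writer walks the model's layers and raises for any type it cannot translate.

```python
# Hypothetical sketch of the unsupported-layer check described above.
class OnnxWriteError(Exception):
    pass

# Assumed set of exportable layer types, for illustration only.
SUPPORTED_LAYERS = {"InputLayer", "Dense", "Convolution", "Pooling"}

def check_exportable(layer_types):
    """Raise OnnxWriteError for any layer the writer cannot translate."""
    for layer_type in layer_types:
        if layer_type not in SUPPORTED_LAYERS:
            raise OnnxWriteError(f"{layer_type} is not supported")

check_exportable(["InputLayer", "Dense"])        # a simple DNN passes
# check_exportable(["InputLayer", "Recurrent"])  # would raise OnnxWriteError
```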
Hi Doug, you are right! I'm sorry, I ran many tests to find a workaround and forgot to mention that it worked with a simple DNN - NOT with the original RNN. Here is the code I ran:
Anyway, it's good to know RNNs aren't supported yet. Many thanks,
Hi,
When you try to export a model to ONNX format, the deploy method expects the InputLayer parameters height and n_channels not to be None, even though they are only meant to be used with image data.
It would be nice not to have to specify dummy values for them when training RNNs or DNNs, for example.
Best regards,
Victor
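The workaround the request would remove can be sketched like this. Everything here is illustrative (the function is a stand-in for the real deploy method, and the parameter handling is an assumption): today a text model must pass placeholder spatial values, and the ask is for None to be accepted instead.

```python
# Hypothetical stand-in for the current strict validation in deploy().
def deploy(width, height, n_channels):
    if None in (width, height, n_channels):
        raise ValueError("height and n_channels must not be None")
    return (width, height, n_channels)

# Current workaround for an RNN/DNN: pass dummy values such as 1.
dims = deploy(width=1, height=1, n_channels=1)
```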