Update onnx creation with dynamic batch size option
bartoszptak committed Jan 5, 2024
1 parent 44e4b41 commit 7bb32f8
Showing 1 changed file with 21 additions and 1 deletion.
22 changes: 21 additions & 1 deletion docs/source/creators/creators_example_onnx_model.rst
@@ -31,7 +31,7 @@ Steps based on `EXPORTING A MODEL FROM PYTORCH TO ONNX AND RUNNING IT USING ONNX
x = torch.rand(1, INP_CHANNEL, INP_HEIGHT, INP_WIDTH) # eg. torch.rand([1, 3, 256, 256])
_ = model(x)
* Step 3. Call export function
* Step 3a. Call export function with static batch_size=1:

.. code-block::
@@ -44,6 +44,20 @@ Steps based on `EXPORTING A MODEL FROM PYTORCH TO ONNX AND RUNNING IT USING ONNX
                      output_names=['output'],
                      do_constant_folding=False)
* Step 3b. Call export function with dynamic batch_size:

.. code-block::

    torch.onnx.export(model,
                      x,                        # model input
                      'model.onnx',             # where to save the model
                      export_params=True,
                      opset_version=15,
                      input_names=['input'],
                      output_names=['output'],
                      dynamic_axes={'input': {0: 'batch_size'},    # variable length axes
                                    'output': {0: 'batch_size'}})
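Once exported, the dynamic-axes model can be sanity-checked by running it at several batch sizes through ONNX Runtime. A minimal sketch, assuming ``onnxruntime`` is installed and using a hypothetical one-layer ``Conv2d`` as a stand-in for ``model``:

.. code-block::

    import numpy as np
    import torch
    import onnxruntime as ort

    # hypothetical stand-in model and dummy input, for illustration only
    model = torch.nn.Conv2d(3, 8, kernel_size=3, padding=1).eval()
    x = torch.rand(1, 3, 32, 32)
    torch.onnx.export(model, x, 'model.onnx',
                      export_params=True,
                      opset_version=15,
                      input_names=['input'],
                      output_names=['output'],
                      dynamic_axes={'input': {0: 'batch_size'},
                                    'output': {0: 'batch_size'}})

    # a single session now accepts any batch size on axis 0
    sess = ort.InferenceSession('model.onnx', providers=['CPUExecutionProvider'])
    for batch in (1, 4):
        inp = np.random.rand(batch, 3, 32, 32).astype(np.float32)
        out = sess.run(None, {'input': inp})[0]
        assert out.shape == (batch, 8, 32, 32)

With a static export (Step 3a), the second iteration would fail with a shape mismatch instead.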
================
Tensorflow/Keras
================
@@ -63,3 +77,9 @@ Steps based on the `tensorflow-onnx <https://github.com/onnx/tensorflow-onnx>`_
.. code-block::

    python -m tf2onnx.convert --saved-model YOUR_MODEL_CHECKPOINT_PATH --output model.onnx --opset 15
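For completeness, a minimal sketch of producing the SavedModel directory that ``YOUR_MODEL_CHECKPOINT_PATH`` stands for, using a hypothetical one-layer Keras model (the layer sizes and the ``saved_model_dir`` path are placeholders):

.. code-block::

    import tensorflow as tf

    # hypothetical one-layer network standing in for your real model
    model = tf.keras.Sequential([tf.keras.Input(shape=(8,)),
                                 tf.keras.layers.Dense(4)])
    tf.saved_model.save(model, 'saved_model_dir')

Running ``python -m tf2onnx.convert --saved-model saved_model_dir --output model.onnx --opset 15`` would then produce the ONNX file.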
===============================================
Update ONNX model to support dynamic batch size
===============================================

To convert a model to support a dynamic batch size, you need to update the :code:`model.onnx` file. You can do it manually using `this <https://github.com/onnx/onnx/issues/2182#issuecomment-881752539>`_ script. Please note that the script is not perfect and may not work for all models.
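The same idea can be sketched directly with the ``onnx`` Python API: rewrite the first dimension of every graph input and output to a symbolic name. The tiny ``Identity`` graph below is a hypothetical stand-in for :code:`model.onnx`; like the linked script, this touches only graph inputs and outputs, so operators with hard-coded shapes (e.g. ``Reshape``) may still need manual fixes:

.. code-block::

    import onnx
    from onnx import TensorProto, helper

    # hypothetical stand-in for model.onnx: an Identity graph with batch fixed to 1
    inp = helper.make_tensor_value_info('input', TensorProto.FLOAT, [1, 3, 8, 8])
    out = helper.make_tensor_value_info('output', TensorProto.FLOAT, [1, 3, 8, 8])
    graph = helper.make_graph([helper.make_node('Identity', ['input'], ['output'])],
                              'demo', [inp], [out])
    model = helper.make_model(graph, opset_imports=[helper.make_opsetid('', 15)])

    # rewrite axis 0 of every graph input/output to a symbolic batch dimension
    for value in list(model.graph.input) + list(model.graph.output):
        value.type.tensor_type.shape.dim[0].dim_param = 'batch_size'

    onnx.checker.check_model(model)
    onnx.save(model, 'model_dynamic.onnx')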
