Onnx Dynamic Shapes Support #20001
Conversation
Hey @Zha0q1, thanks for submitting the PR.
CI supported jobs: [windows-gpu, unix-cpu, sanity, centos-cpu, website, edge, windows-cpu, miscellaneous, centos-gpu, unix-gpu, clang]
Note:
@mxnet-bot run ci [all]
Jenkins CI successfully triggered: [centos-gpu, centos-cpu, website, clang, unix-gpu, miscellaneous, unix-cpu, windows-cpu, edge, windows-gpu, sanity]
# Create the model (ModelProto)
onnx_model = helper.make_model(onnx_graph)

# Run shape inference on the model. Due to ONNX bug/incompatibility this may or may not crash
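Since the comment above notes that shape inference may or may not crash, one way to guard the call is shown in this minimal sketch. It uses the standard onnx Python package and is illustrative, not the exact code in this PR:

from onnx import shape_inference

try:
    # Annotate the model with inferred intermediate shapes; this can raise
    # on some exported graphs due to ONNX bugs/incompatibilities.
    onnx_model = shape_inference.infer_shapes(onnx_model)
except Exception as e:
    # Fall back to the un-annotated model rather than failing the export.
    print('Skipping shape inference: {}'.format(e))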
Why do we run shape inference here? It seems we didn't have it previously.
This is an optional step. Running shape inference may help with some runtime optimizations, and it lets us visualize the graph better, since the nodes will have their input and output shapes labeled.
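For illustration, once shape inference succeeds, the inferred shapes are stored on the model's graph and can be read back (or rendered by graph viewers such as Netron). A small sketch using the standard onnx API; the model path is a placeholder:

import onnx
from onnx import shape_inference

model = onnx.load('exported_model.onnx')  # placeholder path
inferred = shape_inference.infer_shapes(model)
# Each intermediate tensor now has a value_info entry carrying its shape.
for vi in inferred.graph.value_info:
    dims = [d.dim_value or d.dim_param for d in vi.type.tensor_type.shape.dim]
    print(vi.name, dims)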
Shape inference is off by default.
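A sketch of what an off-by-default switch could look like at the export step; the run_shape_inference name is only an illustration, not necessarily the actual mx2onnx parameter:

from onnx import shape_inference

def maybe_infer_shapes(onnx_model, run_shape_inference=False):
    # Off by default; only attempt inference when explicitly requested,
    # since it can fail on some exported graphs.
    if run_shape_inference:
        onnx_model = shape_inference.infer_shapes(onnx_model)
    return onnx_model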
Makes sense. We can keep it here and turn it off later if it crashes in some cases.
waytrue17 left a comment
LGTM, thanks
RFC #20000