This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

Use higher opset_version when exporting ONNX models #3964

Closed
xiaowu0162 opened this issue Jul 20, 2021 · 0 comments · Fixed by #3968

xiaowu0162 commented Jul 20, 2021

What would you like to be added:

  1. A dummy_input argument for the export_model in Compressor.
  2. Allow using opset_version=10 or higher when calling torch.onnx.export inside the export_model in Compressor.

Why is this needed:

  1. Some models need a tuple instead of a single tensor as the model input. For example, transformer models require both the input IDs and the input mask. In these cases, the original input_shape argument is not enough to specify the dummy input.
  2. Models that use dynamic indexing (e.g., transformers) do not work with the default opset_version (9) of torch.onnx.export. The reported error is RuntimeError: Unsupported: ONNX export of Slice with dynamic inputs. DynamicSlice is a deprecated experimental op. The error disappears when opset_version=10 is used.

Without this feature, how does current NNI work:

It does not work for exporting transformers to ONNX.

Components that may involve changes:

  1. export_model function in Compressor
  2. Model Speedup may be affected

Brief description of your proposal if any:

Add a dummy_input argument to export_model, and possibly an opset_version argument that is forwarded to torch.onnx.export.
