[Model Compression] Expand export_model arguments: dummy input and onnx opset_version #3968
Conversation
Ready for review @J-shang @QuanluZhang
```python
    device = torch.device('cpu')
    input_data = torch.Tensor(*input_shape).to(device)
else:
    input_data = dummy_input
```
What if the user sets both `dummy_input` and `device`?
I think `device` should be ignored in that case. I have updated the docstring accordingly.
I recommend we also do `input_data = dummy_input.to(device)`; otherwise it may confuse the user if they set `device` but we silently ignore it.
But I think that operation may fail when `dummy_input` is, e.g., a tuple of tensors, since tuples have no `.to()` method.
Fix #3964.