Parameterize ONNX --opset-version #3154

Merged
merged 1 commit into ultralytics:master on May 16, 2021

Conversation

@CristiFati (Contributor) commented May 13, 2021

Ran into a situation where, due to external constraints, opset version 12 is unsupported (too new). Made it configurable in case anyone else needs it. The change is trivial, and I'm not sure whether it brings much value. It's always a matter of gains versus losses: the gain is the ability to configure the version, the loss is another argument (increased complexity). Perfectly fine if it ends up being rejected.
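
A minimal sketch of this kind of change (the --opset-version argument name is from this PR; the surrounding export code is simplified and assumed, not copied from the repository):

# Hypothetical, simplified export script; only the --opset-version argument reflects this PR.
import argparse

import torch
import torch.nn as nn

parser = argparse.ArgumentParser()
parser.add_argument('--opset-version', type=int, default=12, help='ONNX opset version')  # new argument
opt = parser.parse_args()

model = nn.Conv2d(3, 16, 3)          # stand-in for the real YOLOv5 model (assumption)
dummy = torch.zeros(1, 3, 640, 640)  # dummy input for tracing
torch.onnx.export(model, dummy, 'model.onnx',
                  opset_version=opt.opset_version,  # previously a hard-coded value
                  input_names=['images'], output_names=['output'])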

I might add some other enhancements (fixes) if I encounter problems along the way. One that caught my eye, but that I haven't looked into yet, is:

ONNX: starting export with onnx 1.8.1...
e:\Work\Dev\GitHub\CristiFati\yolov5\models\yolo.py:51: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if self.grid[i].shape[2:4] != x[i].shape[2:4] or self.onnx_dynamic:
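
For context, this class of warning appears whenever a traced forward() converts a tensor (or a shape comparison involving tensors) into a Python bool: the branch taken for the example input is frozen into the trace. A generic, assumed reproduction (not the yolo.py code itself):

# Hypothetical minimal reproduction of the TracerWarning, not from the repository.
import torch
import torch.nn as nn

class Branchy(nn.Module):
    def forward(self, x):
        if x.sum() > 0:  # tensor converted to a Python bool -> TracerWarning
            return x * 2
        return x

traced = torch.jit.trace(Branchy(), torch.ones(1, 3))  # warning emitted here
print(traced(-torch.ones(1, 3)))  # still multiplies by 2: the 'True' branch was baked into the trace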

🛠️ PR Summary

Made with ❤️ by Ultralytics Actions

🌟 Summary

Enhanced customization for ONNX export with opset version selection.

📊 Key Changes

  • Added command-line argument --opset-version to specify the ONNX opset version.

🎯 Purpose & Impact

  • 🚀 Allows users to select the ONNX opset version during model export, offering greater control over model compatibility.
  • ✨ May improve compatibility with different ONNX-supporting inference engines, letting users match the opset version their runtime requires (see the illustrative invocation below).
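
An illustrative invocation (script path, weights file, and opset value are assumptions, not taken from this PR; only the --opset-version flag itself is new here):

python models/export.py --weights yolov5s.pt --opset-version 11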

@glenn-jocher (Member) commented:

@CristiFati sure, that makes sense. Thank you for your contributions!

@glenn-jocher (Member) commented:

/rebase

@glenn-jocher glenn-jocher changed the title Enhancement(s) Parameterize ONNX --opset-version May 16, 2021
@glenn-jocher glenn-jocher changed the title Parameterize ONNX --opset-version Parameterize ONNX --opset-versio May 16, 2021
@glenn-jocher glenn-jocher changed the title Parameterize ONNX --opset-versio Parameterize ONNX --opset-version May 16, 2021
@glenn-jocher glenn-jocher merged commit 9ab561d into ultralytics:master May 16, 2021
Lechtr pushed a commit to Lechtr/yolov5 that referenced this pull request Jul 20, 2021
BjarneKuehl pushed a commit to fhkiel-mlaip/yolov5 that referenced this pull request Aug 26, 2022