
Conv() dilation argument fix #9466

Merged · glenn-jocher merged 1 commit into master from glenn-jocher-patch-1 on Sep 18, 2022
Conversation

glenn-jocher (Member) commented on Sep 18, 2022

Resolves #9384

Signed-off-by: Glenn Jocher <glenn.jocher@ultralytics.com>

🛠️ PR Summary

Made with ❤️ by Ultralytics Actions

🌟 Summary

Improved clarity of Conv layer initialization in the Focus and GhostConv modules.

📊 Key Changes

• Clarified how the act parameter is passed when initializing Conv layers in the Focus and GhostConv modules, so that it is now given explicitly by keyword.

🎯 Purpose & Impact

• The purpose is to make the code more readable and to ensure the act argument is explicitly named when constructing Conv layers (a sketch of the change appears after this list).
• This ensures that future changes to the default values or behavior of act in the Conv class constructor cannot inadvertently affect these modules, improving maintainability.
• Users should see no functional change, while developers get clearer code and more predictable behavior when adjusting or extending these classes. 🛠💡
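
As a rough illustration of the change the bullets describe, here is a minimal, self-contained sketch (not the merged diff): the Conv signature, the autopad helper, and the Focus wrapper below are simplified stand-ins modeled on YOLOv5's models/common.py, and the comments show why passing act positionally becomes fragile once a dilation parameter d is inserted ahead of it.

```python
import torch
import torch.nn as nn


def autopad(k, p=None, d=1):
    # 'same'-style padding helper, in the spirit of YOLOv5's autopad (assumed here)
    if d > 1:
        k = d * (k - 1) + 1
    return k // 2 if p is None else p


class Conv(nn.Module):
    # Assumed signature: a dilation parameter `d` sits between `g` and `act`.
    def __init__(self, c1, c2, k=1, s=1, p=None, g=1, d=1, act=True):
        super().__init__()
        self.conv = nn.Conv2d(c1, c2, k, s, autopad(k, p, d), groups=g, dilation=d, bias=False)
        self.bn = nn.BatchNorm2d(c2)
        self.act = nn.SiLU() if act is True else (act if isinstance(act, nn.Module) else nn.Identity())

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))


class Focus(nn.Module):
    # Focus-style wrapper: concatenates 4 pixel-shuffled slices, then convolves.
    def __init__(self, c1, c2, k=1, s=1, p=None, g=1, act=True):
        super().__init__()
        # Before the fix (positional): Conv(c1 * 4, c2, k, s, p, g, act)
        # -> `act` fills the dilation slot `d`, which can raise a TypeError
        #    inside conv2d when `act` is an nn.Module rather than a bool.
        # After the fix (keyword): `act` can never be mistaken for `d`.
        self.conv = Conv(c1 * 4, c2, k, s, p, g, act=act)

    def forward(self, x):
        return self.conv(torch.cat((x[..., ::2, ::2], x[..., 1::2, ::2],
                                     x[..., ::2, 1::2], x[..., 1::2, 1::2]), 1))


m = Focus(3, 64, k=3, act=nn.LeakyReLU(0.1))  # custom activation is safe with act=act
print(m(torch.zeros(1, 3, 64, 64)).shape)     # torch.Size([1, 64, 32, 32])
```

The same keyword-argument pattern applies to the GhostConv module's internal Conv calls; the sketch shows only Focus for brevity.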

glenn-jocher self-assigned this on Sep 18, 2022
glenn-jocher merged commit 4d50cd3 into master on Sep 18, 2022
glenn-jocher deleted the glenn-jocher-patch-1 branch on Sep 18, 2022 at 13:02
Labels: None yet
Projects: None yet
Development

Successfully merging this pull request may close these issues:

- TypeError when training with custom data (#9384)
1 participant