
Avoid FP64 ops for MPS support in train.py #8511

Merged 1 commit into master on Jul 7, 2022
Conversation

glenn-jocher (Member) commented on Jul 7, 2022

Resolves #7878 (comment)

πŸ› οΈ PR Summary

Made with ❀️ by Ultralytics Actions

🌟 Summary

Refines the data types used in the weight-computation helpers (labels_to_class_weights and labels_to_image_weights) so they avoid FP64 operations.

πŸ“Š Key Changes

  • Changed np.int to the built-in Python int for type casting in labels_to_class_weights and labels_to_image_weights.
  • Ensured the weights returned by labels_to_class_weights are of type float (see the sketch after this summary).

🎯 Purpose & Impact

  • 📈 Consistency: Python's built-in int replaces the deprecated np.int alias, giving consistent behavior across platforms and NumPy/Python versions.
  • πŸ” Precision: Ensuring weights are in floating-point format (float) enables more precise calculations, especially important when weights are used in loss calculations or gradients during model training.
  • πŸš€ User Experience: These backend changes ensure more reliable and accurate training processes for users, although most might not notice the subtle changes directly.
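
For context, here is a minimal sketch of what the two helpers in utils/general.py look like after this change. It is an illustration based on the PR description rather than the exact diff, and the signatures and defaults shown are assumptions:

```python
import numpy as np
import torch


def labels_to_class_weights(labels, nc=80):
    # Inverse-frequency class weights from a list of per-image label arrays,
    # where each row is (class, x, y, w, h).
    if labels[0] is None:  # no labels loaded
        return torch.Tensor()
    labels = np.concatenate(labels, 0)
    classes = labels[:, 0].astype(int)            # built-in int, not np.int
    weights = np.bincount(classes, minlength=nc)  # per-class occurrence counts
    weights[weights == 0] = 1                     # avoid division by zero for empty classes
    weights = 1 / weights                         # inverse frequency (float64 in NumPy)
    weights /= weights.sum()                      # normalize to sum to 1
    return torch.from_numpy(weights).float()      # cast FP64 -> FP32 before use in training


def labels_to_image_weights(labels, nc=80, class_weights=np.ones(80)):
    # Weight each image by the class weights of the objects it contains,
    # used to bias image sampling toward under-represented classes.
    class_counts = np.array(
        [np.bincount(x[:, 0].astype(int), minlength=nc) for x in labels]
    )
    return (class_weights.reshape(1, nc) * class_counts).sum(1)
```

The .float() cast is the part relevant to the PR title: NumPy produces float64 arrays by default, and the PyTorch MPS backend does not support FP64 tensors, so keeping the class weights in FP32 lets train.py run on Apple Silicon.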

glenn-jocher self-assigned this Jul 7, 2022
glenn-jocher merged commit dd28df9 into master Jul 7, 2022
glenn-jocher deleted the update/fp64 branch July 7, 2022 18:36
Shivvrat pushed a commit to Shivvrat/epic-yolov5 that referenced this pull request Jul 12, 2022
ctjanuhowski pushed a commit to ctjanuhowski/yolov5 that referenced this pull request Sep 8, 2022