
torch.cuda.amp bug fix #2750

Merged
merged 1 commit into from
Apr 9, 2021

Commits on Apr 9, 2021

  1. torch.cuda.amp bug fix

    PR #2725 introduced a specific bug that only affects multi-GPU training. The cause was apparently the use of the torch.cuda.amp decorator on the autoShape forward method. This PR implements amp in the more conventional way, which resolves the bug.
    glenn-jocher authored Apr 9, 2021
    Commit 49b4a4d
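
The change described above is structural: instead of applying `torch.cuda.amp.autocast` as a decorator on `forward` (wrapping the method once at class-definition time), amp is entered as a context manager inside each call. A minimal sketch of that refactor is below; the class and method names mirror the PR's context, but this is a hypothetical illustration, not the actual ultralytics code, and `autocast` here is a dummy stand-in for `torch.cuda.amp.autocast` so the example runs without PyTorch.

```python
from contextlib import contextmanager

@contextmanager
def autocast(enabled=True):
    # Stand-in for torch.cuda.amp.autocast: in real code, ops executed
    # inside this block run under automatic mixed precision.
    yield

class AutoShape:
    # Before (per the commit message, broke multi-GPU training):
    #
    #   @autocast()          # i.e. @torch.cuda.amp.autocast()
    #   def forward(self, x):
    #       ...
    #
    # After: enter autocast explicitly, per call, inside forward.
    def forward(self, x):
        with autocast():
            return x * 2  # placeholder for the real inference step
```

The per-call `with` block keeps the amp state scoped to each invocation rather than baked into the method object, which is the "more traditional" usage the commit refers to.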