
Fix the loss initialization in intermediate_source/char_rnn_generation_tutorial.py #2380

Merged · 8 commits · Jun 2, 2023

Conversation

@akjalok (Contributor) commented Jun 1, 2023

Fixes #844

Description

Changed the loss initialization from 0 to torch.Tensor([0]) to avoid confusion about dtypes.
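
For context, here is a minimal, self-contained sketch of the accumulation pattern the tutorial's train() step uses, with the initialization this PR proposes. The data below is a toy stand-in (random logits instead of the tutorial's RNN outputs), not the tutorial's code:

import torch
import torch.nn as nn
import torch.nn.functional as F

criterion = nn.NLLLoss()

# Toy stand-ins for the tutorial's per-character predictions and targets
# (hypothetical data; the real tutorial feeds an RNN one character at a time).
logits = torch.randn(5, 10, requires_grad=True)   # 5 time steps, 10 classes
targets = torch.randint(0, 10, (5,))

loss = torch.Tensor([0])   # this PR: start from a float tensor instead of the Python int 0
for i in range(5):
    log_probs = F.log_softmax(logits[i:i + 1], dim=1)
    loss += criterion(log_probs, targets[i:i + 1])  # in-place accumulation keeps a float tensor throughout

loss.backward()
print(loss.item(), logits.grad.shape)

Note that torch.Tensor([0]) (capital T) creates a float32 tensor, which matters for this pattern: accumulating float losses in place into an integer tensor would raise a dtype error.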

Checklist

  • The issue being fixed is referenced in the description (see "Fixes #ISSUE_NUMBER" above)
  • Only one issue is addressed in this pull request
  • Labels from the issue that this PR is fixing are added to this pull request
  • No unnecessary issues are included in this pull request.

cc @svekars @carljparker @kit1980

@netlify netlify bot commented Jun 1, 2023

Deploy Preview for pytorch-tutorials-preview ready!

  • 🔨 Latest commit: 9578973
  • 🔍 Latest deploy log: https://app.netlify.com/sites/pytorch-tutorials-preview/deploys/647a225e657d2d0008a93f19
  • 😎 Deploy Preview: https://deploy-preview-2380--pytorch-tutorials-preview.netlify.app

github-actions bot added the labels Text (Issues relating to text tutorials), docathon-h1-2023 (A label for the docathon in H1 2023), and easy, and removed the cla signed label on Jun 1, 2023
@akjalok changed the title from "Fix the loss initialization in intermediate_source /char_rnn_generation_tutorial.py" to "Fix the loss initialization in intermediate_source/char_rnn_generation_tutorial.py" on Jun 1, 2023
@NicolasHug (Member) left a comment


I'll approve, but as mentioned in the original issue #844 (comment), the current code is just fine. We don't need to initialize loss as a tensor.

Review thread on intermediate_source/char_rnn_generation_tutorial.py (outdated):
@@ -278,7 +278,7 @@ def train(category_tensor, input_line_tensor, target_line_tensor):

     rnn.zero_grad()

-    loss = 0
+    loss = torch.tensor([0]) # in PyTorch 2.0 and later, this does not need to be a tensor; ``loss = 0`` also works
Review comment from a Member on this line:

I believe this is true for any PyTorch version; it's not related to 2.0.
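
As a quick, self-contained check of that point (not part of the PR), starting from the plain Python int 0 already produces a tensor after the first accumulation, regardless of PyTorch version:

import torch

loss = 0   # plain Python int, as in the original tutorial code
for step_loss in [torch.tensor(1.5, requires_grad=True), torch.tensor(2.5, requires_grad=True)]:
    loss = loss + step_loss   # 0 + tensor promotes to a tensor on the first iteration

print(type(loss), loss.requires_grad)   # <class 'torch.Tensor'> True
loss.backward()                         # works: loss is a scalar tensor in the autograd graph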

@svekars merged commit 769cff9 into pytorch:main on Jun 2, 2023
7 checks passed
Labels
docathon-h1-2023 (A label for the docathon in H1 2023), easy, Text (Issues relating to text tutorials)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Typo instantiating loss at zero (#844)
4 participants