
Typo instantiating loss at zero #844

Closed
aymebou opened this issue Feb 7, 2020 · 3 comments · Fixed by #2380
Labels: docathon-h1-2023 (a label for the docathon in H1 2023), easy, Text (issues relating to text tutorials)

Comments


aymebou commented Feb 7, 2020

Hello folks at PyTorch,

When instantiating loss at a real zero, I think you rather meant something like this (I don't know if there is a better way, but I usually do it this way):

loss = torch.autograd.Variable(torch.Tensor([0]))
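A minimal runnable sketch of the pattern being suggested, accumulating a running loss that starts at a zero tensor (the criterion and shapes here are illustrative, not from the tutorial). Note that `torch.autograd.Variable` is deprecated in modern PyTorch; a plain tensor behaves the same way:

```python
import torch

# Illustrative criterion; any differentiable loss works the same way.
criterion = torch.nn.MSELoss()

loss = torch.zeros(1)  # tensor-valued zero, as suggested in the comment
for _ in range(3):
    pred = torch.randn(4, requires_grad=True)  # stand-in for a model output
    target = torch.randn(4)
    loss = loss + criterion(pred, target)

print(loss.requires_grad)  # True: the accumulated loss is part of the graph
```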

@holly1238 added the Text label Jul 30, 2021
@svekars added the easy and docathon-h1-2023 labels May 31, 2023
zabboud (Contributor) commented May 31, 2023

I would take this on, but first I wanted some clarity on the question: is the question whether setting loss = 0 vs. loss = torch.autograd.Variable(torch.Tensor([0])) is better? Or just to explain why loss = 0 is used instead?

abhi-glitchhg (Contributor) commented, quoting the original report:

> When instantiating loss at a real zero, I think you rather meant something like this (I don't know if there is a better way, but I usually do it this way):

I think instantiating the loss with 0 (a Python int) is not a problem, because once you use the add operation with an integer and a tensor, the result is promoted to a tensor [1][2].

But initializing it as a tensor won't do any harm and could avoid future confusion like this.
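The type-promotion point above can be checked directly (a small sketch, not from the thread): starting the accumulator at the Python int 0 still yields a tensor after the first addition, so autograd tracking is preserved.

```python
import torch

loss = 0  # plain Python int, as in the tutorial
loss = loss + torch.tensor([0.5], requires_grad=True)

# Adding an int to a tensor promotes the result to a tensor.
print(type(loss))          # <class 'torch.Tensor'>
print(loss.requires_grad)  # True
```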


akjalok commented Jun 1, 2023

/assigntome

6 participants