
03_pytorch_computer_vision.ipynb crossEntropyLoss() #1027

Open
Jiammmmmin opened this issue Jul 31, 2024 · 2 comments

Comments

@Jiammmmmin

[Screenshots 2024-07-31 11:18 and 11:19 showing the error] It seems that I need to convert `y` to `torch.tensor(y)`, but that does not help. Is anyone else having the same problem?
@rithvikshetty

Try doing it with `torch.tensor(y).type(torch.LongTensor)`
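For context, `nn.CrossEntropyLoss` expects class-index targets as a long (int64) tensor, so a target that arrives as a Python list or a float tensor needs to be cast first. A minimal sketch, with hypothetical labels and logits:

```python
import torch
import torch.nn as nn

# Hypothetical labels for a 3-sample batch, arriving as a plain Python list
y = [0, 2, 1]

# CrossEntropyLoss needs integer (long) class targets, so cast explicitly;
# .type(torch.LongTensor) is equivalent to torch.tensor(y, dtype=torch.long)
y_long = torch.tensor(y).type(torch.LongTensor)

# Hypothetical model outputs of shape (batch_size, num_classes)
logits = torch.randn(3, 3)

loss_fn = nn.CrossEntropyLoss()
loss = loss_fn(logits, y_long)  # works; a float target here would raise a dtype error
print(y_long.dtype)  # torch.int64
```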

@mrdbourke
Owner

Hi @Jiammmmmin ,

How did you go with this?

I was just able to run the example notebook with no issues, see here: https://www.learnpytorch.io/03_pytorch_computer_vision/

Did you perhaps check your dataloader code?

Example:

from torch.utils.data import DataLoader

# Setup the batch size hyperparameter
BATCH_SIZE = 32

# Turn datasets into iterables (batches)
train_dataloader = DataLoader(train_data, # dataset to turn into iterable
    batch_size=BATCH_SIZE, # how many samples per batch? 
    shuffle=True # shuffle data every epoch?
)

test_dataloader = DataLoader(test_data,
    batch_size=BATCH_SIZE,
    shuffle=False # don't necessarily have to shuffle the testing data
)

# Let's check out what we've created
print(f"Dataloaders: {train_dataloader, test_dataloader}") 
print(f"Length of train dataloader: {len(train_dataloader)} batches of {BATCH_SIZE}")
print(f"Length of test dataloader: {len(test_dataloader)} batches of {BATCH_SIZE}")

Or your loss_fn code?

What happens when you run print(y)?
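A quick way to check is to inspect one batch from the dataloader and confirm the targets are long tensors. A sketch, assuming a FashionMNIST-style dataloader yielding `(X, y)` batches (the shapes below are placeholders):

```python
import torch

# Placeholder batch standing in for `X, y = next(iter(train_dataloader))`
X = torch.randn(32, 1, 28, 28)   # hypothetical image batch
y = torch.randint(0, 10, (32,))  # torchvision datasets yield integer labels like this

print(y)
print(y.dtype)  # should be torch.int64 (long) for nn.CrossEntropyLoss
```

If `y.dtype` prints as `torch.float32` (or `y` is not a tensor at all), that would explain the `CrossEntropyLoss` error.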
