How to return a final val loss in trainer? #1942

Closed
Data-drone opened this issue May 25, 2020 · 4 comments · Fixed by #2029
Labels: feature (Is an improvement or enhancement) · priority: 0 (High priority task) · question (Further information is requested)
Milestone: 0.8.0

Comments
@Data-drone

What is your question?

Most optimisation packages, e.g. Ray Tune / Hyperopt, expect the training loop to return a final accuracy or loss so the optimiser can decide what to try next.

How do I do this with the Trainer module in PyTorch Lightning?
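
(For context, the kind of thing I'm after is something like the sketch below, where the tuning objective reads one final metric off the trainer after `fit()`. I'm assuming that metrics returned from `validation_epoch_end` end up in `trainer.callback_metrics`; `MyLightningModule`, `config` and the `'val_loss'` key are placeholders.)

```python
import pytorch_lightning as pl

def tune_objective(config):
    # Hypothetical tuning objective for Ray Tune / Hyperopt:
    # train, then hand back one final number for the optimiser.
    model = MyLightningModule(config)   # placeholder LightningModule
    trainer = pl.Trainer(max_epochs=10)
    trainer.fit(model)
    # Metrics returned/logged during validation are assumed to be
    # collected in trainer.callback_metrics after fitting.
    return float(trainer.callback_metrics['val_loss'])
```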

What's your environment?

  • OS: Linux
  • Packaging: pip
  • Version: 0.7.6
@Data-drone Data-drone added the question Further information is requested label May 25, 2020
@rajarajanvakil

I have the same issue too: I want to return "val_loss" from validation_step and "avg_val_loss" from validation_epoch_end.

@rajarajanvakil

#321

```python
def validation_end(self, outputs):
    avg_loss = torch.stack([x['batch_val_loss'] for x in outputs]).mean()
    avg_acc = torch.stack([x['batch_val_acc'] for x in outputs]).mean()

    return {
        'val_loss': avg_loss,
        'val_acc': avg_acc,
        'progress_bar': {'val_loss': avg_loss, 'val_acc': avg_acc}}
```

@Data-drone
Author

I managed to achieve this using callbacks and loggers, but it doesn't work with the ddp backend when doing distributed training. I think I need to include a manual gather step for that.
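
Roughly what I did, as a sketch (the FinalValLossCallback name is mine, and I'm assuming the validation metrics show up in `trainer.callback_metrics`):

```python
import pytorch_lightning as pl

class FinalValLossCallback(pl.Callback):
    """Remember the most recent validation loss reported by the module."""

    def __init__(self):
        self.final_val_loss = None

    def on_validation_end(self, trainer, pl_module):
        # callback_metrics holds whatever validation_epoch_end returned/logged
        metrics = trainer.callback_metrics
        if 'val_loss' in metrics:
            self.final_val_loss = metrics['val_loss']

callback = FinalValLossCallback()
trainer = pl.Trainer(callbacks=[callback])
# trainer.fit(model)
# After fitting, callback.final_val_loss holds the last val_loss.
# Under the ddp backend each process only sees its own value, hence
# the manual gather step mentioned above.
```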

@williamFalcon
Contributor

let’s formally support this somehow so it doesn’t have to be hacked around :)

@williamFalcon williamFalcon added feature Is an improvement or enhancement priority: 0 High priority task labels May 26, 2020
@williamFalcon williamFalcon added this to the 0.8.0 milestone May 26, 2020