
Early stopping when validation is disabled #1235

Merged
merged 6 commits into Lightning-AI:master from earlystoptraining
Mar 31, 2020

Conversation

@awaelchli (Contributor) commented Mar 25, 2020

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

What does this PR do?

Fixes #1201 (right now it's just a check that the proposed fix by @Dunrar passes the unit tests).

  • Early stopping falls back to the training metrics returned in e.g. training_step (see the sketch after this list)
  • Documented this behaviour
  • Added a test to make sure this doesn't get broken in the future
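
To make the new behaviour concrete, here is a minimal sketch, assuming the 0.7.x-era API; the model, data, and the 'loss' monitor key are illustrative, not taken from this PR:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

import pytorch_lightning as pl
from pytorch_lightning.callbacks import EarlyStopping


class NoValModel(pl.LightningModule):
    """Defines no validation_step, so no validation loop runs."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.mse_loss(self(x), y)
        # With validation absent, early stopping falls back to the
        # metrics returned from the training loop, such as 'loss'.
        return {'loss': loss}

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

    def train_dataloader(self):
        data = TensorDataset(torch.randn(64, 32), torch.randn(64, 1))
        return DataLoader(data, batch_size=16)


# Monitor the training loss instead of the default (absent) val_loss.
early_stop = EarlyStopping(monitor='loss', patience=3)
trainer = pl.Trainer(early_stop_callback=early_stop, max_epochs=100)
trainer.fit(NoValModel())
```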

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@awaelchli awaelchli changed the title early stop fallback to train epoch Early stopping when validation is disabled Mar 25, 2020
@awaelchli (Contributor, Author)

@jeremyjordan @neggert I think you guys worked on callbacks and early stopping. Is it OK to make this change here? Basically what we want (according to @williamFalcon) is for early stopping to work on the training metrics if no validation loop is defined (or it is disabled).
If you agree with this change, then I will add a unit test for the case I described.
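
For completeness, the "disabled" case can also be triggered at the Trainer level; a minimal sketch, assuming the 0.7.x-era val_percent_check flag (not something this PR adds):

```python
import pytorch_lightning as pl

# Even if the model defines validation_step, setting val_percent_check=0
# skips the validation loop entirely, so early stopping should again
# fall back to the metrics from the training loop.
trainer = pl.Trainer(val_percent_check=0.0, early_stop_callback=True)
```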

@jeremyjordan (Contributor)

@awaelchli yeah i think that looks fine!

@Borda (Member) left a comment

LGTM 🚀

@mergify mergify bot requested a review from a team March 30, 2020 22:33
@awaelchli awaelchli marked this pull request as ready for review March 31, 2020 00:22
@awaelchli (Contributor, Author)

Thanks for looking at it already. I added a test and properly documented the behaviour. Could you review it again?

codecov bot commented Mar 31, 2020

Codecov Report

Merging #1235 into master will increase coverage by 0%.
The diff coverage is 100%.

@@          Coverage Diff           @@
##           master   #1235   +/-   ##
======================================
  Coverage      92%     92%           
======================================
  Files          62      62           
  Lines        3181    3181           
======================================
+ Hits         2923    2924    +1     
+ Misses        258     257    -1     

@mergify mergify bot merged commit 1aba411 into Lightning-AI:master Mar 31, 2020

mergify bot commented Mar 31, 2020

Great job! =)

@awaelchli awaelchli deleted the earlystoptraining branch March 31, 2020 06:49
@Borda Borda added the bug Something isn't working label Mar 31, 2020
@Borda Borda added this to the 0.7.2 milestone Mar 31, 2020
alexeykarnachev pushed a commit to alexeykarnachev/pytorch-lightning that referenced this pull request Apr 3, 2020
* early stop fallback to train epoch

* added test

* fix imports

* update docs

* update changelog

* fix typo
@Borda Borda modified the milestones: v0.7., v0.7.x Apr 18, 2021
Labels
bug Something isn't working
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Early stopping not working on 0.7.1
4 participants