
Early Stopping kicks in at min_epochs + 2 instead of min_epochs #606

Closed
awaelchli opened this issue Dec 8, 2019 · 2 comments · Fixed by #617
Labels
bug Something isn't working

Comments

awaelchli (Member) commented Dec 8, 2019

Describe the bug

I was working on a fix for #524 and found that early stopping starts to kick in at epoch 3 despite min_epochs = 1.

To Reproduce

Run basic_examples/gpu_template.py and log the callback calls every epoch.

Expected behavior

When setting min_epochs=n (counting from 1), we should evaluate early stopping at the end of epoch n.

Proposed fix:

I propose to change this line in the training loop:

`met_min_epochs = epoch > self.min_epochs`

to

`met_min_epochs = epoch >= self.min_epochs - 1`

  • Why the "-1"? The epoch variable in the training loop starts at 0, but the Trainer argument min_epochs counts from 1.

  • Why the ">="? The early-stop check runs at the end of each epoch, so the zero-based epoch counter equals min_epochs - 1 once min_epochs epochs have completed; the check must fire at that point and at every epoch after it.
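The off-by-one can be checked in isolation. Below is a minimal sketch of the condition, assuming a zero-based epoch counter and a 1-based min_epochs argument as described above; the helper function name is hypothetical and the loop is a simplified stand-in for the real Trainer loop, not the library's actual code.

```python
def should_check_early_stop(epoch: int, min_epochs: int) -> bool:
    """Return True once enough epochs have passed to evaluate early stopping.

    `epoch` is zero-based (as in the training loop); `min_epochs` is 1-based
    (as in the Trainer argument).
    """
    # Buggy original: `epoch > min_epochs` first becomes True at zero-based
    # epoch min_epochs + 1, i.e. at the end of 1-based epoch min_epochs + 2.
    # Proposed fix:
    return epoch >= min_epochs - 1


# With min_epochs=1, the check should fire at the end of the first epoch,
# i.e. at zero-based epoch 0:
first_checked = next(e for e in range(10) if should_check_early_stop(e, min_epochs=1))
print(first_checked)  # 0
```

With the buggy comparison `epoch > min_epochs` and min_epochs=1, the first True occurs at zero-based epoch 2, which is the end of 1-based epoch 3, matching the min_epochs + 2 behavior reported in the title.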

Desktop (please complete the following information):

  • OS: Linux
  • Version: master
awaelchli added the bug label Dec 8, 2019
williamFalcon (Contributor) commented:

submit the PR for this?

awaelchli (Member, Author) commented:

yep. was waiting for an approval.
