Change nb to num in ABCs, comments, and tqdm logging #613
Conversation
is it all? :]
pytorch_lightning/trainer/logging.py (outdated)
@@ -177,7 +177,7 @@ def reduce_distributed_output(self, output, num_gpus):
     elif isinstance(output[k], torch.Tensor) and output[k].dim() == 0:
         pass

-    # reduce only metrics that have the same nb of gpus
+    # reduce only metrics that have the same num of gpus
you can say "full" number of gpus
Agreed.
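For context on the comment being reworded here, the reduction rule it describes can be sketched in pure Python (a hypothetical paraphrase: the real `reduce_distributed_output` operates on `torch.Tensor` outputs, and the names below are illustrative):

```python
def reduce_distributed_output(output, num_gpus):
    # Hypothetical sketch: mean-reduce each metric only when it has exactly
    # one value per GPU ("the same num of gpus"); leave everything else as-is.
    if num_gpus <= 1:
        return output
    reduced = {}
    for k, v in output.items():
        if isinstance(v, list) and len(v) == num_gpus:
            reduced[k] = sum(v) / num_gpus  # one value per GPU: average them
        else:
            reduced[k] = v  # scalar or mismatched size: keep unchanged
    return reduced


out = reduce_distributed_output({"loss": [0.2, 0.4], "epoch": 3}, num_gpus=2)
```

Here `out["loss"]` is the mean across both GPUs while `out["epoch"]`, which is not per-GPU, passes through untouched.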
@@ -169,22 +170,22 @@ class TrainerTrainLoopMixin(ABC):
     def __init__(self):
         # this is just a summary on variables used in this abstract class,
         # the proper values/initialisation should be done in child class
-        self.max_nb_epochs = None
+        self.max_epochs = None
+        self.min_epochs = None
is the min needed here in this class?
yes, it is used in the training loop.
`self.min_nb_epochs` was there before, just lower down in the list, so I moved it up so it would be next to `self.max_epochs`.
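Why `min_epochs` belongs in this summary: the training loop consults both bounds. A hypothetical sketch of that stopping logic (the function name and signature are illustrative, not Lightning's actual API):

```python
def should_stop(current_epoch, min_epochs, max_epochs, met_stop_criterion):
    # Illustrative sketch: training always halts at max_epochs, while an
    # early-stopping signal is honored only once min_epochs have elapsed.
    if current_epoch >= max_epochs:
        return True
    if met_stop_criterion and current_epoch >= min_epochs:
        return True
    return False
```

So even if the early-stopping criterion fires at epoch 2, training continues until `min_epochs` is reached.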
    @property
    def max_nb_epochs(self):
        """
        .. warning:: `max_nb_epochs` is deprecated and will be removed in
pls. just one line
Got it.
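The deprecation shim under discussion could be condensed along these lines (a minimal sketch of the pattern, assuming the old name simply forwards to the new attribute; not the PR's exact code):

```python
import warnings


class TrainerTrainLoopMixin:
    def __init__(self):
        # new canonical attribute; the child class supplies the real value
        self.max_epochs = None

    @property
    def max_nb_epochs(self):
        """.. warning:: `max_nb_epochs` is deprecated, use `max_epochs` instead."""
        warnings.warn(
            "`max_nb_epochs` is deprecated, use `max_epochs` instead.",
            DeprecationWarning,
        )
        return self.max_epochs
```

This keeps the docstring to the single line the reviewer asked for while still warning callers who use the old name.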
I think so. I did a project-wide search for "nb" and these were all the remaining ones I found, other than the ones used for SLURM, since SLURM uses "nb" in some of its variables.
I updated this PR with the suggestions by @Borda if someone wants to re-review it, or merge it if it looks good.
LGTM
What does this PR do?
Fixes #589 (comment)