
Change nb to num in ABCs, comments, and tqdm logging #613

Merged
merged 4 commits into from
Dec 9, 2019
Conversation

elliotwaite
Contributor

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos or doc improvements)
  • Did you read the contributor guideline?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

What does this PR do?

Fixes #589 (comment)

Member

@Borda left a comment


is it all? :]

@@ -177,7 +177,7 @@ def reduce_distributed_output(self, output, num_gpus):
         elif isinstance(output[k], torch.Tensor) and output[k].dim() == 0:
             pass

-        # reduce only metrics that have the same nb of gpus
+        # reduce only metrics that have the same num of gpus
Member


you can say "full" number

Contributor Author


Agreed.
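For readers without the full diff context, here is a hedged, torch-free sketch of the guarded reduction the comment describes. The function name and the `num_gpus` check follow the hunk above; representing per-GPU metrics as plain lists instead of tensors is an assumption made purely for illustration, not the library's actual implementation.

```python
def reduce_distributed_output(output, num_gpus):
    # Illustrative sketch only. Averages only metrics that have one entry
    # per GPU; everything else is left untouched, mirroring the
    # "reduce only metrics that have the same num of gpus" guard above.
    if num_gpus <= 1:
        return output
    for k, v in output.items():
        if isinstance(v, dict):
            # recurse into nested metric dicts
            output[k] = reduce_distributed_output(v, num_gpus)
        elif isinstance(v, list) and len(v) == num_gpus:
            # reduce only metrics that have the same num of gpus
            output[k] = sum(v) / num_gpus
    return output
```

The point of the guard is that a metric with a different length was not gathered across devices, so averaging it would silently produce a wrong value.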

@@ -169,22 +170,22 @@ class TrainerTrainLoopMixin(ABC):
     def __init__(self):
         # this is just a summary on variables used in this abstract class,
         # the proper values/initialisation should be done in child class
-        self.max_nb_epochs = None
+        self.max_epochs = None
+        self.min_epochs = None
Member


is the min needed here in this class?

Member


yes, it is used in the training loop.

Contributor Author


self.min_nb_epochs was there before, just lower down in the list, so I moved it up so it would be next to self.max_epochs.
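The abstract-mixin pattern being discussed can be sketched as follows. The mixin and attribute names come from the hunk above; the `Trainer` child class and its constructor arguments are a hypothetical illustration of where the real initialisation would happen.

```python
from abc import ABC


class TrainerTrainLoopMixin(ABC):
    def __init__(self):
        # this is just a summary of the variables used in this abstract
        # class; the proper values/initialisation are done in the child
        self.max_epochs = None
        self.min_epochs = None


class Trainer(TrainerTrainLoopMixin):
    # hypothetical child class, shown only to illustrate the pattern
    def __init__(self, max_epochs=1000, min_epochs=1):
        super().__init__()
        self.max_epochs = max_epochs
        self.min_epochs = min_epochs
```

Declaring the attributes as `None` in the mixin documents what the training loop expects without committing the abstract class to any concrete defaults.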

@property
def max_nb_epochs(self):
"""
.. warning:: `max_nb_epochs` is deprecated and will be removed in
Member


pls. just one line

Contributor Author


Got it.
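For reference, a minimal sketch of the one-line deprecation alias being asked for. The property name matches the diff above; the class name, warning text, and target attribute are assumptions for illustration, not the PR's exact code.

```python
import warnings


class TrainerSketch:
    # hypothetical stand-in for the real Trainer class
    def __init__(self):
        self.max_epochs = 1000  # the new, preferred attribute

    @property
    def max_nb_epochs(self):
        """.. warning:: `max_nb_epochs` is deprecated, use `max_epochs` instead."""
        warnings.warn("`max_nb_epochs` is deprecated, use `max_epochs`.", DeprecationWarning)
        return self.max_epochs
```

Keeping the docstring to a single line, as requested in the review, still lets the deprecation notice show up in generated docs while the runtime warning steers users to the new attribute.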

@elliotwaite
Contributor Author

is it all? :]

I think so. I did a project-wide search for "nb", and these were all the remaining ones I found, other than the ones used for SLURM, since SLURM uses "nb" in some of its variable names.

@elliotwaite
Contributor Author

elliotwaite commented Dec 9, 2019

I updated this PR with @Borda's suggestions, if someone wants to re-review it, or merge it if it looks good.

Member

@Borda left a comment


LGTM

@williamFalcon williamFalcon merged commit b492e2b into Lightning-AI:master Dec 9, 2019