
change print to logging #457

Merged: 6 commits merged into master, Nov 5, 2019
Conversation

@Ir1d (Contributor) commented Nov 4, 2019

Before submitting

What does this PR do?

Fixes #282.

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

Note

I changed print to logging.info, and added some simple support for a custom logger to override the logging with custom logging functions.
However, many logging call sites don't have access to trainer.logger at this moment, so I left them untouched (for example, the callback has test_tube_logger, and some mixin functions have no clear direct access to a logger). In these cases, I let them use logging.info directly instead of a more robust self.logger.info.

@Borda (Member) left a comment:

just an author reminder #282 (comment) :)

@@ -214,17 +214,17 @@ def __dataloader(self, train):

     @pl.data_loader
     def train_dataloader(self):
-        print('training data loader called')
+        self.logger.info('training data loader called')
@Borda (Member):

this is not very safe with self.logger = None in LightningModule

@Ir1d (Contributor, Author):

ah indeed. in that case exposing the logger becomes much harder 🌚

@Borda (Member):

You may try changing it to self.logger = logging.getLogger()
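The suggestion above can be sketched as a fallback default in `__init__` (the `LightningModuleSketch` class below is a hypothetical stand-in for illustration, not the real LightningModule):

```python
import logging

class LightningModuleSketch:
    """Hypothetical stand-in for LightningModule, for illustration only."""

    def __init__(self):
        # Instead of leaving self.logger = None until a Trainer attaches a
        # logger, default to a standard-library logger so calls like
        # self.logger.info(...) never hit AttributeError on None.
        self.logger = logging.getLogger(__name__)

    def train_dataloader(self):
        self.logger.info('training data loader called')
```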

warnings.warn(
    'Early stopping conditioned on metric `%s` '
    'which is not available. Available metrics are: %s' %
    (self.monitor, ','.join(list(logs.keys()))), RuntimeWarning)
@Borda (Member):

warnings.warn and logging.warning are completely different streams with different consequences, so please be sure that you want this behaviour... You may also consider raising it as an error.
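To illustrate the difference being pointed out (a minimal sketch, not project code): warnings.warn is routed through the warnings filter, which can deduplicate a message or escalate a warning category into an exception, while logging.warning only emits a record to logging handlers and is never raised.

```python
import logging
import warnings

# The warnings filter can escalate a category to an exception...
escalated = False
with warnings.catch_warnings():
    warnings.simplefilter('error', RuntimeWarning)
    try:
        warnings.warn('metric not available', RuntimeWarning)
    except RuntimeWarning:
        escalated = True

# ...whereas logging.warning only sends a record to handlers.
logging.warning('metric not available')
```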

@Ir1d (Contributor, Author):

hmmm, I'm not familiar with this, but I noticed that the previous code used warnings.warn a lot, so I thought it was kind of a best practice.

@Borda (Member):

I believe that was related to using print; in such a case, warnings is better.
I'm not saying it is wrong; in this case it is fine, just be aware of it... :]

pytorch_lightning/callbacks/pt_callbacks.py (outdated, resolved)
@@ -142,7 +143,7 @@ def on_epoch_end(self, epoch, logs=None):

     def on_train_end(self, logs=None):
         if self.stopped_epoch > 0 and self.verbose > 0:
-            logging.info('Epoch %05d: early stopping' % (self.stopped_epoch + 1))
+            logging.info(f'Epoch {self.stopped_epoch + 1:05d}: early stopping')
@Borda (Member):

not directly about this line, but we should be sure that all this new logging is tested (covered by tests)

@Ir1d (Contributor, Author):

This did occur to me, but I didn't know how to test this logging. Any ideas?

@Borda (Member):

I usually just test that the message does not crash the code...
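One common way to go a bit beyond a smoke test is unittest's assertLogs, which captures emitted records so a test can assert on the message itself (the `early_stop_message` helper below is hypothetical, modeled on the early-stopping diff above):

```python
import logging
import unittest

def early_stop_message(stopped_epoch):
    # Hypothetical helper mirroring the early-stopping log line in the diff.
    logging.getLogger('pl_sketch').info(
        f'Epoch {stopped_epoch + 1:05d}: early stopping')

class TestEarlyStopLogging(unittest.TestCase):
    def test_message_is_emitted(self):
        # assertLogs fails if no record at INFO or above is emitted.
        with self.assertLogs('pl_sketch', level='INFO') as captured:
            early_stop_message(4)
        self.assertIn('Epoch 00005: early stopping', captured.output[0])
```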

@Ir1d (Contributor, Author) commented Nov 5, 2019

A summary of this PR:

  1. changed print -> logging.info and warnings.warn
  2. set logging options here, though I'm not sure this is appropriate. @williamFalcon you might want to take a close look.
  3. no new tests included, because I didn't know how to test logging :|

Oh, and by the way, I thought it might be better to leave the 'custom logging function' for another day. At this moment, I don't feel like adding a property to the trainer pointing at logging.info; the logging conventions in the project seem not yet settled.

@williamFalcon (Contributor):

amazing!

@Borda let's do a different PR/issue for test coverage.

@williamFalcon williamFalcon merged commit 5a9afb1 into Lightning-AI:master Nov 5, 2019
Successfully merging this pull request may close these issues.

Switch from print to logging
3 participants