Automatic batch-size scaling is missing properties #1828

Closed
williamFalcon opened this issue May 14, 2020 · 4 comments · Fixed by #1836
Labels: bug (Something isn't working), help wanted (Open to be worked on)

Comments

@williamFalcon
Contributor

  File "envs/demo2/lib/python3.7/site-packages/pytorch_lightning/trainer/training_tricks.py", line 267, in _run_power_scaling
    trainer.fit(model)
  File "envs/demo2/lib/python3.7/site-packages/pytorch_lightning/trainer/trainer.py", line 839, in fit
    self.single_gpu_train(model)
  File "/lib/python3.7/site-packages/pytorch_lightning/trainer/distrib_parts.py", line 499, in single_gpu_train
    self.run_pretrain_routine(model)
  File "pytorch_lightning/trainer/trainer.py", line 981, in run_pretrain_routine
    False)
  File "evaluation_loop.py", line 326, in _evaluate
    eval_results = model.validation_epoch_end(outputs)
  File "vae.py", line 83, in validation_epoch_end
    self.logger.experiment.add_image('images', grid, 0)
AttributeError: 'NoneType' object has no attribute 'experiment'

@SkafteNicki

Looks like loggers are gone?
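A minimal repro sketch (the module and data below are hypothetical stand-ins for the vae.py model in the traceback, and this assumes the batch-size finder scales a batch_size attribute on the model):

import torch
from torch.utils.data import DataLoader, TensorDataset
from pytorch_lightning import LightningModule, Trainer

class LitModel(LightningModule):
    def __init__(self):
        super().__init__()
        self.batch_size = 2  # attribute the batch-size finder scales in this sketch
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return {'loss': torch.nn.functional.cross_entropy(self.layer(x), y)}

    def validation_step(self, batch, batch_idx):
        x, y = batch
        return {'val_loss': torch.nn.functional.cross_entropy(self.layer(x), y)}

    def validation_epoch_end(self, outputs):
        grid = torch.rand(3, 8, 8)  # stand-in for a real image grid
        # self.logger is None while the finder's trial fits run -> AttributeError
        self.logger.experiment.add_image('images', grid, 0)
        return {'val_loss': torch.stack([o['val_loss'] for o in outputs]).mean()}

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())

    def _dataset(self):
        return TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,)))

    def train_dataloader(self):
        return DataLoader(self._dataset(), batch_size=self.batch_size)

    def val_dataloader(self):
        return DataLoader(self._dataset(), batch_size=self.batch_size)

trainer = Trainer(gpus=1, auto_scale_batch_size='power', max_epochs=1)  # gpus=1 matches the traceback path
trainer.fit(LitModel())  # crashes inside _run_power_scaling, as above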

@williamFalcon added the bug and help wanted labels on May 14, 2020
@SkafteNicki
Member

Loggers are disabled during auto scaling, so that the small trial runs done inside the scaling routine are not logged. This works fine as long as the user does not explicitly call the logger inside their model (i.e. it is only called by Lightning in the background). I am not sure of the way around this; of course, the simplest solution would be to ask the user to do something like:

if self.logger:
    self.logger.experiment.add_image('images', grid, 0)

But that is not a good idea...

@SkafteNicki
Member

This problem should also be present for the learning rate finder...
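For example (assuming the LitModel sketch from the issue body and the 0.7-era lr_find API):

trainer = Trainer()
lr_finder = trainer.lr_find(LitModel())  # the lr finder also runs trial fits with the
                                         # logger disabled, so the same error should surface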

@williamFalcon
Contributor Author

yeah, can’t ask the user to do that. maybe replace it with a logger that no-ops somehow?

@SkafteNicki
Member

Yes, I think you are right; we should just initialize a dummy logger. Something like this should work:

from pytorch_lightning.loggers import LightningLoggerBase


class DummyExperiment(object):
    """Experiment stub whose every method call is a silent no-op."""

    def nop(self, *args, **kw):
        pass

    def __getattr__(self, _):
        # any attribute lookup (add_image, add_scalar, ...) resolves to nop
        return self.nop


class DummyLogger(LightningLoggerBase):
    """No-op logger to swap in while the small scaling runs execute."""

    def __init__(self):
        super().__init__()  # the base class sets up internal state (e.g. rank handling)
        self._experiment = DummyExperiment()

    @property
    def experiment(self):
        return self._experiment

    def log_metrics(self, metrics, step):
        pass

    def log_hyperparams(self, params):
        pass

    @property
    def name(self):
        pass

    @property
    def version(self):
        pass


logger = DummyLogger()
logger.experiment.add_image('hest', [1, 2, 3], global_step=3)  # has no effect, but the attribute exists
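The idea would then be to temporarily swap the real logger for the dummy while the scaling trials run and restore it afterwards. A rough sketch (the helper name run_scaling_trials and the exact attribute the model reads back are assumptions here; the real wiring is up to the PR):

def scale_batch_size(trainer, model):
    # hypothetical integration point inside the batch-size finder
    original_logger = trainer.logger
    trainer.logger = DummyLogger()  # self.logger calls in the model now no-op
    try:
        run_scaling_trials(trainer, model)  # hypothetical helper running the small fits
    finally:
        trainer.logger = original_logger  # always restore the real logger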
