
Use Lightning DataModules #130

Merged · 16 commits · Jul 29, 2020

Conversation

nateraw
Contributor

@nateraw nateraw commented Jul 28, 2020

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

What does this PR do?

Resolves #129

I suggest we don't merge until 0.9.0 has been released. This PR uses the prerelease version 0.9.0rc2.
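To illustrate the pattern this PR adopts, here is a minimal plain-Python sketch of the DataModule idea (hypothetical `ToyDataModule`, not the real `pytorch_lightning.LightningDataModule` API or the pl_bolts code): dataset download and loader construction move out of each model and into one reusable object.

```python
# Illustrative sketch only: a toy stand-in for the DataModule pattern.
# Real DataModules wrap torch DataLoaders; here we use plain lists so the
# structure (prepare_data -> train_dataloader) is visible on its own.

class ToyDataModule:
    """Groups data preparation and batching in one reusable object."""

    def __init__(self, val_split=5000, num_workers=16, batch_size=32):
        self.val_split = val_split
        self.num_workers = num_workers
        self.batch_size = batch_size
        self.train_data = None

    def prepare_data(self):
        # one-time work (e.g. downloading) would happen here
        self.train_data = list(range(10))

    def train_dataloader(self):
        # yield fixed-size batches from the prepared data
        b = self.batch_size
        return [self.train_data[i:i + b]
                for i in range(0, len(self.train_data), b)]


dm = ToyDataModule(batch_size=4)
dm.prepare_data()
batches = dm.train_dataloader()
```

Because the module owns its own hyperparameters (`val_split`, `num_workers`, `batch_size`), the same data setup can be shared across every model in the repo instead of being re-implemented per `LightningModule`.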

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Yeah 💯

@mergify mergify bot requested a review from Borda July 28, 2020 06:01
@ananyahjha93 ananyahjha93 self-requested a review July 28, 2020 06:06
@ananyahjha93 ananyahjha93 changed the title [DONT MERGE] Use Lightning DataModules [wip] Use Lightning DataModules Jul 28, 2020
@codecov

codecov bot commented Jul 29, 2020

Codecov Report

Merging #130 into master will increase coverage by 0.02%.
The diff coverage is 100.00%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #130      +/-   ##
==========================================
+ Coverage   92.27%   92.30%   +0.02%     
==========================================
  Files          79       79              
  Lines        4038     4000      -38     
==========================================
- Hits         3726     3692      -34     
+ Misses        312      308       -4     
Flag         Coverage Δ
#unittests   92.30% <100.00%> (+0.02%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
pl_bolts/datamodules/__init__.py 100.00% <ø> (ø)
...bolts/models/self_supervised/amdim/amdim_module.py 96.51% <ø> (ø)
..._bolts/models/self_supervised/moco/moco2_module.py 85.71% <ø> (-0.73%) ⬇️
pl_bolts/models/vision/image_gpt/igpt_module.py 98.66% <ø> (-0.12%) ⬇️
pl_bolts/datamodules/mnist_datamodule.py 97.67% <100.00%> (ø)
pl_bolts/datamodules/sklearn_datamodule.py 88.75% <100.00%> (ø)
...ts/models/autoencoders/basic_ae/basic_ae_module.py 100.00% <100.00%> (ø)
.../models/autoencoders/basic_vae/basic_vae_module.py 93.06% <100.00%> (-0.39%) ⬇️
pl_bolts/models/gans/basic/basic_gan_module.py 93.02% <100.00%> (-0.16%) ⬇️
pl_bolts/models/self_supervised/cpc/cpc_module.py 87.50% <100.00%> (-0.53%) ⬇️
... and 2 more

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 2fd2969...bbbabf2. Read the comment docs.

@nateraw nateraw added the enhancement New feature or request label Jul 29, 2020
@nateraw nateraw changed the title [wip] Use Lightning DataModules Use Lightning DataModules Jul 29, 2020
@nateraw
Contributor Author

nateraw commented Jul 29, 2020

@Borda @ananyahjha93

Good to go. Had to add some unrelated unit tests (autoencoder components, LARS optimizer) because I couldn't stand the coverage failing.

Currently pinned at lightning v0.9.0rc3. I would normally be against it, but this PR has a lot of changes which will probably cause some nasty merge conflicts, so I'll leave it up to you two if you want to merge before stable release of lightning 0.9.0.

@@ -19,6 +19,7 @@ def __init__(
data_dir,
val_split=5000,
num_workers=16,
batch_size=32,
*args,
Member


this is tricky as it tends to confuse users why some arguments were not used...
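The concern here can be shown with a short hedged sketch (hypothetical `init_strict`/`init_loose` functions, not the pl_bolts code): a trailing `*args`/`**kwargs` accepts misspelled or unused arguments silently instead of raising.

```python
# Hypothetical sketch of the reviewer's concern: with a strict signature
# a typo'd keyword raises a TypeError; with trailing *args/**kwargs it is
# silently swallowed and the default is used instead.

def init_strict(data_dir, val_split=5000, num_workers=16, batch_size=32):
    return dict(data_dir=data_dir, val_split=val_split,
                num_workers=num_workers, batch_size=batch_size)

def init_loose(data_dir, val_split=5000, num_workers=16, batch_size=32,
               *args, **kwargs):
    # extra positional/keyword args are accepted and simply ignored
    return dict(data_dir=data_dir, val_split=val_split,
                num_workers=num_workers, batch_size=batch_size)

# A typo like `batchsize` raises with the strict signature...
try:
    init_strict("/data", batchsize=64)
    raised = False
except TypeError:
    raised = True

# ...but is silently dropped with the loose one: batch_size stays 32.
cfg = init_loose("/data", batchsize=64)
```

The trade-off is that `*args, **kwargs` lets a shared constructor forward options it doesn't know about, at the cost of hiding mistakes like the one above.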

@@ -122,15 +124,6 @@ def configure_optimizers(self):
def prepare_data(self):
self.datamodule.prepare_data()
Contributor


why do we still need to call prepare_data? shouldn't that be automatically called for the user?

Contributor


I think @nateraw mentioned here that we need manual calls to prepare_data for now: Lightning-AI/pytorch-lightning#2742

Contributor Author


solved here: Lightning-AI/pytorch-lightning#2755

so we can remove those calls to prepare_data across the repo as soon as that is merged and the next release is up on PyPI
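The behavior that upstream fix enables can be sketched with toy classes (hypothetical `ToyTrainer`/`ToyDataModule`, not Lightning's actual implementation): once the trainer invokes the datamodule's `prepare_data` itself, the model no longer needs to forward the call.

```python
# Toy sketch of the post-fix flow: the trainer calls prepare_data on the
# datamodule, so LightningModule.prepare_data overrides that merely do
# `self.datamodule.prepare_data()` become redundant and can be deleted.

class ToyDataModule:
    def __init__(self):
        self.prepared = False

    def prepare_data(self):
        self.prepared = True

class ToyTrainer:
    def fit(self, model, datamodule):
        # after the fix, this call happens inside the trainer...
        datamodule.prepare_data()
        # ...so models need no prepare_data forwarding of their own.
        return datamodule.prepared


dm = ToyDataModule()
done = ToyTrainer().fit(model=None, datamodule=dm)
```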

@Borda Borda merged commit 07dcc0f into Lightning-Universe:master Jul 29, 2020
Labels
enhancement New feature or request
Development

Successfully merging this pull request may close these issues.

Use Pytorch Lightning's LightningDataModule
4 participants