
Swav #239

Merged: 55 commits into master on Oct 19, 2020
Conversation

ananyahjha93
Contributor

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typo and docs fixes)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

What does this PR do?

Adapts SwAV from the official implementation.

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@pep8speaks

pep8speaks commented Sep 16, 2020

Hello @ananyahjha93! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2020-10-19 16:09:54 UTC

@mergify mergify bot requested a review from Borda September 16, 2020 22:08
@codecov

codecov bot commented Sep 16, 2020

Codecov Report

Merging #239 into master will decrease coverage by 1.85%.
The diff coverage is 66.95%.


@@            Coverage Diff             @@
##           master     #239      +/-   ##
==========================================
- Coverage   83.91%   82.06%   -1.86%     
==========================================
  Files          91       97       +6     
  Lines        4861     5441     +580     
==========================================
+ Hits         4079     4465     +386     
- Misses        782      976     +194     
Flag Coverage Δ
#cpu 23.42% <19.10%> (-0.54%) ⬇️
#pytest 23.42% <19.10%> (-0.54%) ⬇️
#unittests 81.49% <66.09%> (-1.89%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
...ts/models/self_supervised/swav/swav_online_eval.py 21.31% <21.31%> (ø)
pl_bolts/models/self_supervised/swav/transforms.py 66.66% <66.66%> (ø)
...l_bolts/models/self_supervised/swav/swav_resnet.py 70.20% <70.20%> (ø)
...l_bolts/models/self_supervised/swav/swav_module.py 74.35% <74.35%> (ø)
pl_bolts/models/self_supervised/__init__.py 100.00% <100.00%> (ø)
pl_bolts/models/self_supervised/amdim/datasets.py 61.11% <100.00%> (ø)
pl_bolts/models/self_supervised/swav/__init__.py 100.00% <100.00%> (ø)
...olts/models/self_supervised/swav/swav_finetuner.py 100.00% <100.00%> (ø)
pl_bolts/datasets/base_dataset.py 95.45% <0.00%> (-4.55%) ⬇️
... and 5 more

Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update d32d3eb...1ccfbaa.

@Borda Borda added the fix fixing issues... label Sep 17, 2020
Contributor

@nateraw nateraw left a comment


Can we avoid passing the datamodule and assigning it to the model as self.datamodule? I'd like to get rid of this pattern entirely. (#207)
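A toy sketch of the pattern the reviewer is suggesting (the names `ToyTrainer`, `ToyModel`, etc. are illustrative stand-ins, not the bolts API): data flows through the trainer at `fit()` time, so the model never stores a reference to the datamodule.

```python
class ToyDataModule:
    """Stand-in datamodule that just yields (input, target) pairs."""
    def train_dataloader(self):
        return [(1, 0), (2, 1)]

class ToyModel:
    """Note: no self.datamodule attribute anywhere."""
    def training_step(self, batch):
        x, y = batch
        return (x - y) ** 2  # dummy squared-error loss

class ToyTrainer:
    def fit(self, model, datamodule):
        # The trainer wires data to the model, keeping the model decoupled
        # from where its data comes from.
        return [model.training_step(b) for b in datamodule.train_dataloader()]

losses = ToyTrainer().fit(ToyModel(), datamodule=ToyDataModule())
```

This mirrors the `trainer.fit(model, datamodule=dm)` call signature that Lightning supports, which is what makes dropping the `self.datamodule` assignment feasible.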


if prob < self.p:
    sigma = (self.max - self.min) * np.random.random_sample() + self.min
    sample = cv2.GaussianBlur(sample, (self.kernel_size, self.kernel_size), sigma)


trainer = pl.Trainer(
    gpus=0, fast_dev_run=False, max_epochs=1, default_root_dir=tmpdir, max_steps=3
)
Contributor

@teddykoker teddykoker Oct 19, 2020


Can we at least do a fast_dev_run to test?
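The point of `fast_dev_run` is that it caps the loop at a single batch, so a smoke test still exercises the full training path cheaply. A toy stand-in (not the real pytorch-lightning internals) showing the behavior the reviewer is asking for:

```python
class ToyTrainer:
    """Illustrative stand-in: fast_dev_run=True limits fit() to one batch."""
    def __init__(self, fast_dev_run=False, max_steps=None):
        self.limit = 1 if fast_dev_run else (max_steps or float("inf"))

    def fit(self, batches):
        run = 0
        for _ in batches:
            if run >= self.limit:
                break
            run += 1  # one full train step per batch
        return run

steps_smoke = ToyTrainer(fast_dev_run=True).fit(range(10))   # single batch
steps_capped = ToyTrainer(max_steps=3).fit(range(10))        # capped at 3
```

In the real test, this would amount to flipping `fast_dev_run=False` to `True` in the `pl.Trainer(...)` call above.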

requirements.txt Outdated Show resolved Hide resolved

class GaussianBlur(object):
    # Implements Gaussian blur as described in the SimCLR paper
    def __init__(self, kernel_size, p=0.5, min=0.1, max=2.0):

Might not want to use min and max since these are Python built-ins. Could lead to potential issues.
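One way to address this (a hypothetical rename, not the PR's actual code) is `sigma_min`/`sigma_max`. The sketch below also swaps in a cv2-free separable Gaussian blur so it is self-contained; the official implementation uses `cv2.GaussianBlur`.

```python
import numpy as np

class GaussianBlur:
    """Random Gaussian blur as in the SimCLR paper, with the `min`/`max`
    parameters renamed so the Python built-ins are not shadowed."""

    def __init__(self, kernel_size, p=0.5, sigma_min=0.1, sigma_max=2.0):
        self.kernel_size = kernel_size
        self.p = p
        self.sigma_min = sigma_min
        self.sigma_max = sigma_max

    def _kernel(self, sigma):
        # 1-D Gaussian kernel, normalized to sum to 1
        half = self.kernel_size // 2
        x = np.arange(-half, half + 1, dtype=np.float64)
        k = np.exp(-(x ** 2) / (2 * sigma ** 2))
        return k / k.sum()

    def __call__(self, sample):
        if np.random.random_sample() < self.p:
            sigma = (self.sigma_max - self.sigma_min) * np.random.random_sample() + self.sigma_min
            k = self._kernel(sigma)
            # Separable blur: convolve each column, then each row
            sample = np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 0, sample)
            sample = np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 1, sample)
        return sample

blur = GaussianBlur(kernel_size=3, p=1.0)
out = blur(np.ones((8, 8)))
```

The rename is purely cosmetic at call sites (`self.sigma_min` instead of `self.min`), but it avoids surprises if `min(...)` or `max(...)` is later needed inside the class.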

tests/models/self_supervised/test_models.py Outdated Show resolved Hide resolved
requirements.txt Outdated
@@ -1,2 +1,2 @@
pytorch-lightning>=1.0

reverse order

torch>=1.6
pytorch-lightning>=1.0

pl_bolts/models/self_supervised/swav/swav_finetuner.py Outdated Show resolved Hide resolved
pl_bolts/models/self_supervised/swav/swav_finetuner.py Outdated Show resolved Hide resolved
6 participants