
simclr fixes #329

Merged 46 commits into master from fix/simclr on Nov 17, 2020
Conversation

@ananyahjha93 (Contributor) commented Nov 2, 2020

What does this PR do?

One PR to fix a bunch of SimCLR issues.

Fixes #318
Fixes #227
Fixes #320
Fixes #322
Fixes #327
Fixes #179
Fixes #309
Fixes #355
Fixes #357

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together? Otherwise, we ask you to create a separate PR for every change.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

  • Is this pull request ready for review? (if not, please submit in draft mode)

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

codecov bot commented Nov 2, 2020

Codecov Report

Merging #329 (634a050) into master (b746be0) will decrease coverage by 0.81%.
The diff coverage is 70.81%.


@@            Coverage Diff             @@
##           master     #329      +/-   ##
==========================================
- Coverage   82.00%   81.18%   -0.82%     
==========================================
  Files         100      100              
  Lines        5639     5715      +76     
==========================================
+ Hits         4624     4640      +16     
- Misses       1015     1075      +60     
Flag       Coverage Δ
cpu        24.28% <16.88%> (-0.27%) ⬇️
pytest     24.28% <16.88%> (-0.27%) ⬇️
unittests  80.47% <70.81%> (-0.79%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files                                          Coverage Δ
pl_bolts/callbacks/ssl_online.py                        100.00% <ø> (ø)
pl_bolts/models/self_supervised/byol/models.py          100.00% <ø> (ø)
...olts/models/self_supervised/swav/swav_finetuner.py   100.00% <ø> (ø)
...l_bolts/models/self_supervised/swav/swav_module.py   66.79% <ø> (ø)
...lts/models/self_supervised/simclr/simclr_module.py   71.63% <68.36%> (-11.18%) ⬇️
..._bolts/models/self_supervised/simclr/transforms.py   80.88% <69.23%> (-19.12%) ⬇️
...l_bolts/models/self_supervised/byol/byol_module.py   83.90% <100.00%> (-1.04%) ⬇️
pl_bolts/models/self_supervised/resnets.py              92.04% <100.00%> (-0.74%) ⬇️
.../models/self_supervised/simclr/simclr_finetuner.py   100.00% <100.00%> (ø)
pl_bolts/losses/self_supervised_learning.py             71.33% <0.00%> (-6.37%) ⬇️

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update b746be0...634a050.

@Borda Borda added the fix label Nov 3, 2020
@Borda Borda added the model and Priority labels Nov 6, 2020
@Borda (Member) commented Nov 6, 2020

@ananyahjha93 how is it going here? Seems like a very important fix...

@pep8speaks commented Nov 13, 2020

Hello @ananyahjha93! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2020-11-17 18:09:42 UTC

@ananyahjha93 ananyahjha93 changed the title [wip] simclr fixes simclr fixes Nov 16, 2020
@ananyahjha93 ananyahjha93 changed the title simclr fixes [wip] simclr fixes Nov 16, 2020
Borda previously requested changes Nov 16, 2020

@Borda (Member) left a comment:

pls check the argparser issues

@ananyahjha93 ananyahjha93 changed the title [wip] simclr fixes simclr fixes Nov 17, 2020
@SeanNaren (Contributor) left a comment:

I'll trust you on the model logic; everything seems fine. Follow-up conversation about hparams should happen @tchaton @Borda

@SeanNaren SeanNaren requested a review from Borda November 17, 2020 19:14
dm.val_transforms = SimCLREvalDataTransform(32)
dm.test_transforms = SimCLREvalDataTransform(32)
args.num_samples = dm.num_samples
dm = CIFAR10DataModule(
@Borda (Member) commented:

can we simplify this one, wrap it as a func which does this hard overwrite?
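A minimal sketch of such a helper (the function name and the reuse of the transforms from this diff are assumptions, not the merged code):

from pl_bolts.models.self_supervised.simclr.transforms import SimCLREvalDataTransform, SimCLRTrainDataTransform

def set_simclr_transforms(dm, input_height):
    # apply the hard overwrite of all three transform slots in one place
    dm.train_transforms = SimCLRTrainDataTransform(input_height)
    dm.val_transforms = SimCLREvalDataTransform(input_height)
    dm.test_transforms = SimCLREvalDataTransform(input_height)
    return dm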

num_workers=args.num_workers
)

dm.train_transforms = SimCLRFinetuneTransform(
@Borda (Member) commented:

well in fact this shall be in the dm as def set_finetune_transforms(self) -> None

@Borda (Member) commented:

maybe @akihironitta could take it in another PR...
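If that follow-up happens, the method could look roughly like this (a sketch; the constructor arguments to SimCLRFinetuneTransform are assumptions):

from pl_bolts.models.self_supervised.simclr.transforms import SimCLRFinetuneTransform
from pytorch_lightning import LightningDataModule

class CIFAR10DataModule(LightningDataModule):
    def set_finetune_transforms(self) -> None:
        # keep the finetune-specific overwrite inside the datamodule
        # instead of repeating it in every training script
        self.train_transforms = SimCLRFinetuneTransform(input_height=32)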

from pl_bolts.models.self_supervised.simclr.transforms import SimCLREvalDataTransform, SimCLRTrainDataTransform
from pl_bolts.optimizers.lars_scheduling import LARSWrapper
from pl_bolts.optimizers.lr_scheduler import LinearWarmupCosineAnnealingLR
class SyncFunction(torch.autograd.Function):
@Borda (Member) commented:

can we add docs to what it does?
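For context, the usual shape of such a class is a differentiable all_gather (a sketch of the common pattern; the merged implementation may differ in details):

import torch
import torch.distributed as dist

class SyncFunction(torch.autograd.Function):
    """Gather tensors from all processes while keeping gradients flowing.

    Plain dist.all_gather does not backpropagate; wrapping it lets the
    NT-Xent loss use negatives from every GPU.
    """

    @staticmethod
    def forward(ctx, tensor):
        ctx.batch_size = tensor.shape[0]
        gathered = [torch.zeros_like(tensor) for _ in range(dist.get_world_size())]
        dist.all_gather(gathered, tensor)
        return torch.cat(gathered, dim=0)

    @staticmethod
    def backward(ctx, grad_output):
        grad_input = grad_output.clone()
        dist.all_reduce(grad_input, op=dist.ReduceOp.SUM)
        # return only the slice of the gradient for this rank's input
        idx_from = dist.get_rank() * ctx.batch_size
        return grad_input[idx_from:idx_from + ctx.batch_size]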

Comment on lines +152 to +155
if self.arch == 'resnet18':
backbone = resnet18
elif self.arch == 'resnet50':
backbone = resnet50
@Borda (Member) commented:
Suggested change
-        if self.arch == 'resnet18':
-            backbone = resnet18
-        elif self.arch == 'resnet50':
-            backbone = resnet50
+        backbones = {
+            'resnet18': resnet18,
+            'resnet50': resnet50,
+        }
+        backbone = backbones[self.arch]


def nt_xent_loss(self, out_1, out_2, temperature, eps=1e-6):
"""
assume out_1 and out_2 are normalized
@Borda (Member) commented:
Suggested change
-        assume out_1 and out_2 are normalized
+        Args:
+            assume out_1 and out_2 are normalized
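For reference, a hedged sketch of the NT-Xent computation this docstring describes (assumes out_1 and out_2 are L2-normalized [batch, dim] projections; not necessarily the merged code):

import torch

def nt_xent_loss(out_1, out_2, temperature, eps=1e-6):
    # both views stacked: [2B, dim]; rows are unit-norm, so the matrix
    # product gives cosine similarities
    out = torch.cat([out_1, out_2], dim=0)
    sim = torch.exp(torch.mm(out, out.t()) / temperature)
    # denominator: similarity to everything except oneself
    # (self-similarity of a unit vector is 1, hence exp(1 / temperature))
    neg = sim.sum(dim=-1) - torch.exp(torch.tensor(1.0 / temperature))
    # numerator: similarity between the two views of the same image
    pos = torch.exp(torch.sum(out_1 * out_2, dim=-1) / temperature)
    pos = torch.cat([pos, pos], dim=0)
    return -torch.log(pos / neg.clamp(min=eps)).mean()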

parser.add_argument('--meta_dir', default='.', type=str, help='path to meta.bin for imagenet')

# model params
parser.add_argument("--arch", default="resnet50", type=str, help="convnet architecture")
@Borda (Member) commented:

not for PL, this could be taken automatically from Model init arguments...
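Illustrating the idea generically (a sketch, not a Lightning API; add_init_args is a hypothetical helper):

import inspect
from argparse import ArgumentParser

def add_init_args(parser: ArgumentParser, cls) -> ArgumentParser:
    # derive CLI flags from a class's __init__ defaults instead of
    # hand-writing every parser.add_argument call
    for name, param in inspect.signature(cls.__init__).parameters.items():
        if name == 'self' or param.default is inspect.Parameter.empty or param.default is None:
            continue
        parser.add_argument(f'--{name}', default=param.default, type=type(param.default))
    return parser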

dm.val_transforms = SimCLREvalDataTransform(32)
args.num_samples = dm.num_samples
if args.dataset == 'stl10':
dm = STL10DataModule(
@Borda (Member) commented:

can we wrap these hardcoded dataset-dependent changes in a function?
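One possible shape for that wrapper (a sketch; the dispatch dict and the build_datamodule name are assumptions):

from pl_bolts.datamodules import CIFAR10DataModule, STL10DataModule

DATAMODULES = {'cifar10': CIFAR10DataModule, 'stl10': STL10DataModule}

def build_datamodule(args):
    # choose the datamodule from the dataset flag, then apply the
    # dataset-dependent settings in one place
    dm_cls = DATAMODULES[args.dataset]
    return dm_cls(data_dir=args.data_dir, batch_size=args.batch_size, num_workers=args.num_workers)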

)

# Implements Gaussian blur as described in the SimCLR paper
def __init__(self, kernel_size, p=0.5, min=0.1, max=2.0):
@Borda (Member) commented:

Suggested change
-    def __init__(self, kernel_size, p=0.5, min=0.1, max=2.0):
+    def __init__(self, kernel_size: int, p: float = 0.5, min: float = 0.1, max: float = 2.0):

pls use a longer variable name than p and do not overwrite native functions like min/max
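A sketch implementing both suggestions (prob, sigma_min and sigma_max are hypothetical names, and the cv2 backend is an assumption):

import numpy as np
import cv2

class GaussianBlur:
    # Gaussian blur as described in the SimCLR paper, with the renamed,
    # type-annotated parameters suggested above (kernel_size should be odd)
    def __init__(self, kernel_size: int, prob: float = 0.5, sigma_min: float = 0.1, sigma_max: float = 2.0):
        self.kernel_size = kernel_size
        self.prob = prob
        self.sigma_min = sigma_min
        self.sigma_max = sigma_max

    def __call__(self, sample):
        if np.random.random_sample() < self.prob:
            sigma = np.random.uniform(self.sigma_min, self.sigma_max)
            sample = cv2.GaussianBlur(np.asarray(sample), (self.kernel_size, self.kernel_size), sigma)
        return sample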

@Borda Borda self-requested a review November 17, 2020 19:28
@Borda Borda dismissed their stale review November 17, 2020 19:33

still to do...

@ananyahjha93 ananyahjha93 merged commit 77ff983 into master Nov 17, 2020
@ananyahjha93 ananyahjha93 deleted the fix/simclr branch November 17, 2020 19:36
@Borda (Member) commented Nov 17, 2020

just for the record @ananyahjha93 promised to fix all comments above in another PR 🐰

Labels: fix, model, Priority

6 participants