How to set learning rate for SimCLR with multiple GPU training #373

Answered by ananyahjha93
FrankXinqi asked this question in Q&A
@FrankXinqi For SimCLR with a batch size of 256, you can use the regular Adam optimizer with a learning rate of 1e-4. LARS is preferred for bigger batch sizes, 1024 and above.
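
Here's a minimal sketch of that rule of thumb in PyTorch. The `LARS` import path is an assumption (bolts ships a LARS implementation, but the exact module has moved between versions), and the large-batch learning rate follows the linear scaling rule from the SimCLR paper:

```python
import torch

try:
    # pl_bolts ships a LARS implementation; the exact import path has
    # changed between versions, so treat this as an assumption.
    from pl_bolts.optimizers.lars import LARS
except ImportError:
    LARS = None


def configure_optimizer(model, batch_size, base_lr=1e-4):
    """Hypothetical helper: pick the optimizer based on batch size."""
    if batch_size < 1024:
        # Batch size ~256: plain Adam with lr=1e-4 works fine.
        return torch.optim.Adam(model.parameters(), lr=base_lr)
    if LARS is None:
        raise RuntimeError("No LARS implementation available")
    # Batch size >= 1024: LARS with a scaled learning rate. The SimCLR
    # paper uses linear scaling: lr = 0.3 * batch_size / 256.
    return LARS(model.parameters(), lr=0.3 * batch_size / 256)
```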

Also, try using the updated SimCLR from the master branch; the online fine-tuning is fixed.

We'll soon have the ImageNet weights in for SimCLR as well.

Also, SwAV provides a queue so it can run with a batch size of 256, and the authors have shown SwAV to be much more robust to small batch sizes. That's something you might want to look at.
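
For reference, here's a sketch of enabling that queue with the bolts SwAV module. The queue arguments mirror the original SwAV repo's `--queue_length` / `--epoch_queue_starts` flags; treat the exact parameter names and values as assumptions and check the bolts docs for your version:

```python
from pl_bolts.models.self_supervised import SwAV

# Minimal sketch, not a verified signature: a small batch (256) compensated
# by a feature queue, enabled after the prototypes have stabilized.
model = SwAV(
    gpus=1,
    num_samples=45000,      # number of training samples (assumed setup)
    batch_size=256,         # small batch, augmented by queued features
    dataset="cifar10",
    queue_length=3840,      # stored features used alongside the batch
    epoch_queue_starts=15,  # delay the queue until prototypes settle
)
```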

Answer selected by Borda

This discussion was converted from issue #373 on December 08, 2020 19:21.