fix custom gamma in rl #550

Merged
merged 1 commit into Lightning-Universe:master on Mar 4, 2021

Conversation

BartekRoszak (Contributor)

What does this PR do?

I didn't open an issue, as the bug seems obvious.
In a few RL models, a custom gamma could be set in the constructor but was then not passed to the loss computation function.
This PR fixes that.

Fixes # (issue)
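
For context, here is a minimal sketch of the bug pattern. It is not the actual pl_bolts code: dqn_loss and the class bodies below are hypothetical stand-ins for the loss calls in pl_bolts/models/rl/dqn_model.py, double_dqn_model.py, and per_dqn_model.py.

```python
# Minimal, self-contained sketch of the bug (illustrative names only).

def dqn_loss(batch, gamma: float = 0.99) -> float:
    """Stand-in TD loss: reward + gamma * bootstrapped next-state value."""
    reward, next_value = batch
    return reward + gamma * next_value


class DQN:
    def __init__(self, gamma: float = 0.99) -> None:
        self.gamma = gamma  # the custom discount factor is stored here...

    def training_step_buggy(self, batch) -> float:
        # ...but was never forwarded, so the default 0.99 silently won.
        return dqn_loss(batch)

    def training_step_fixed(self, batch) -> float:
        # The fix: pass the stored hyperparameter through explicitly.
        return dqn_loss(batch, gamma=self.gamma)


model = DQN(gamma=0.5)
batch = (1.0, 2.0)
print(model.training_step_buggy(batch))  # 2.98 -- custom gamma ignored
print(model.training_step_fixed(batch))  # 2.0  -- gamma=0.5 actually used
```
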

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests? [not needed for typos/docs]
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

  • Is this pull request ready for review? (if not, please submit in draft mode)

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

codecov bot commented Jan 30, 2021

Codecov Report

Merging #550 (cb1aa65) into master (b9d5f5e) will decrease coverage by 0.07%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master     #550      +/-   ##
==========================================
- Coverage   77.60%   77.53%   -0.08%     
==========================================
  Files         115      115              
  Lines        6707     6707              
==========================================
- Hits         5205     5200       -5     
- Misses       1502     1507       +5     
Flag        Coverage Δ
cpu         ?
pytest      ?
unittests   77.53% <100.00%> (+0.44%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files                                           Coverage Δ
pl_bolts/models/rl/double_dqn_model.py                   95.65% <100.00%> (ø)
pl_bolts/models/rl/dqn_model.py                          78.48% <100.00%> (ø)
pl_bolts/models/rl/per_dqn_model.py                      86.66% <100.00%> (ø)
pl_bolts/utils/warnings.py                               38.46% <0.00%> (-61.54%) ⬇️
...s/models/detection/components/_supported_models.py    71.42% <0.00%> (-28.58%) ⬇️
pl_bolts/datasets/mnist_dataset.py                       36.36% <0.00%> (-9.10%) ⬇️
pl_bolts/models/detection/faster_rcnn/backbones.py       84.61% <0.00%> (-7.70%) ⬇️
pl_bolts/transforms/dataset_normalizations.py            80.00% <0.00%> (-5.00%) ⬇️
pl_bolts/callbacks/vision/image_generation.py            87.87% <0.00%> (-3.04%) ⬇️
..._bolts/models/self_supervised/simclr/transforms.py    75.71% <0.00%> (-2.86%) ⬇️
... and 16 more

Continue to review the full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update b9d5f5e...c8000de.

akihironitta (Contributor) left a comment

@BartekRoszak LGTM. Thank you for your contribution!

@akihironitta akihironitta self-assigned this Feb 13, 2021
akihironitta (Contributor)

@Borda Could I have your review here?

@Borda Borda enabled auto-merge (squash) March 4, 2021 14:50
@Borda Borda disabled auto-merge March 4, 2021 21:38
@Borda Borda merged commit 298fc28 into Lightning-Universe:master Mar 4, 2021
4 participants