
Removing regularisation applied to linear model bias #669

Merged: 1 commit into Lightning-Universe:master on Jun 24, 2021

Conversation

@stanton119 (Contributor) commented Jun 23, 2021

What does this PR do?

Fixes #668
Removes the bias term from the regularisation applied within linear models.
Regularisation should only apply to the model weights, not the bias.
In the class's training_step method, using self.parameters() includes the bias in the penalty, whereas using self.linear.weight excludes it.

The included approach matches sklearn's behaviour; I did a semi-related write-up to check the removal of the bias against sklearn here - https://github.com/stanton119/data-analysis/blob/master/PyTorchStuff/elastic_net/elastic_linear.md
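A minimal sketch of the idea (not the exact bolts diff; the class name, training_loss, l1_strength and l2_strength are illustrative assumptions), showing the before/after of which parameters get penalised:

```python
import torch
from torch import nn
import torch.nn.functional as F


class LinearRegression(nn.Module):
    """Illustrative linear model y = Wx + b with optional L1/L2 penalties."""

    def __init__(self, input_dim: int, l1_strength: float = 0.0, l2_strength: float = 0.0):
        super().__init__()
        self.linear = nn.Linear(input_dim, 1)
        self.l1_strength = l1_strength
        self.l2_strength = l2_strength

    def training_loss(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        y_hat = self.linear(x)
        loss = F.mse_loss(y_hat, y)

        # Before: iterating self.parameters() also penalised the bias term.
        # l1_reg = sum(param.abs().sum() for param in self.parameters())

        # After: only the weights are penalised, matching sklearn's behaviour.
        if self.l1_strength > 0:
            loss += self.l1_strength * self.linear.weight.abs().sum()
        if self.l2_strength > 0:
            loss += self.l2_strength * self.linear.weight.pow(2).sum()
        return loss
```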

It's my first PR here so please let me know if anything is out of sorts!

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests? [not needed for typos/docs]
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

  • Is this pull request ready for review? (if not, please submit in draft mode)

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

100% 🙃

@Borda added the ready label on Jun 23, 2021
codecov bot commented Jun 23, 2021

Codecov Report

Merging #669 (c83c763) into master (354534a) will decrease coverage by 0.45%.
The diff coverage is 75.00%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #669      +/-   ##
==========================================
- Coverage   71.83%   71.38%   -0.46%     
==========================================
  Files         118      118              
  Lines        7148     7114      -34     
==========================================
- Hits         5135     5078      -57     
- Misses       2013     2036      +23     
| Flag   | Coverage Δ                   |
|--------|------------------------------|
| cpu    | 71.38% <75.00%> (-0.46%) ⬇️ |
| pytest | 71.38% <75.00%> (-0.46%) ⬇️ |

Flags with carried forward coverage won't be shown.

| Impacted Files | Coverage Δ |
|---|---|
| pl_bolts/models/regression/logistic_regression.py | 93.02% <50.00%> (ø) |
| pl_bolts/models/regression/linear_regression.py | 98.71% <100.00%> (ø) |
| pl_bolts/models/self_supervised/byol/models.py | 33.33% <0.00%> (-66.67%) ⬇️ |
| pl_bolts/callbacks/byol_updates.py | 68.18% <0.00%> (-31.82%) ⬇️ |
| ...l_bolts/models/self_supervised/byol/byol_module.py | 22.34% <0.00%> (-29.24%) ⬇️ |
| pl_bolts/datasets/mnist_dataset.py | 87.50% <0.00%> (-4.17%) ⬇️ |
| ...l_bolts/models/rl/vanilla_policy_gradient_model.py | 93.44% <0.00%> (-2.46%) ⬇️ |
| ..._bolts/models/self_supervised/moco/moco2_module.py | 56.25% <0.00%> (-1.59%) ⬇️ |
| pl_bolts/datasets/ssl_amdim_datasets.py | 75.34% <0.00%> (-0.98%) ⬇️ |
| pl_bolts/models/vision/image_gpt/igpt_module.py | 19.78% <0.00%> (-0.88%) ⬇️ |

... and 18 more

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 354534a...c83c763.

@akihironitta enabled auto-merge (squash) on June 24, 2021 01:37
@Borda disabled auto-merge on June 24, 2021 06:56
@Borda merged commit ace22fd into Lightning-Universe:master on Jun 24, 2021
Development

Successfully merging this pull request may close these issues.

Classic Regression model regularisation error
3 participants