
Fix - Avoid two softmax calls in LogisticRegression #655

Merged
merged 4 commits into Lightning-Universe:master from pierrenodet:patch-1
Jun 16, 2021

Conversation

pierrenodet
Contributor

@pierrenodet pierrenodet commented May 30, 2021

Avoid two softmax calls in LogisticRegression

What does this PR do?

Fixes #654

Avoid two softmax calls in LogisticRegression
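The underlying bug (issue #654) is that `F.cross_entropy` already applies log-softmax to its input internally, so calling `F.softmax` on the logits first applies softmax twice and distorts the loss. A minimal pure-Python sketch (the `softmax` and `cross_entropy` helpers below are hypothetical stand-ins for the PyTorch functions, not the pl_bolts code) shows the effect:

```python
import math

def softmax(logits):
    # numerically stable softmax over a list of floats
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(logits, target):
    # like torch.nn.functional.cross_entropy, this applies
    # (log-)softmax to its input internally
    return -math.log(softmax(logits)[target])

logits = [2.0, 0.5, -1.0]
target = 0

correct = cross_entropy(logits, target)         # softmax applied once
buggy = cross_entropy(softmax(logits), target)  # softmax applied twice

# the double softmax flattens the distribution toward uniform,
# so the loss on the confident prediction is inflated
print(f"correct={correct:.4f}  buggy={buggy:.4f}")
# correct ≈ 0.2413, buggy ≈ 0.7017
```

The fix is therefore simply to pass the raw logits to the loss and drop the explicit softmax on the training path.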
@github-actions github-actions bot added the model label May 30, 2021
@Borda Borda changed the title Fix issue #654 Fix - Avoid two softmax calls in LogisticRegression Jun 15, 2021
@Borda Borda added the ready label Jun 15, 2021
@codecov

codecov bot commented Jun 15, 2021

Codecov Report

Merging #655 (fe2b0c2) into master (50969a8) will not change coverage.
The diff coverage is 0.00%.


@@           Coverage Diff           @@
##           master     #655   +/-   ##
=======================================
  Coverage   25.54%   25.54%           
=======================================
  Files         118      118           
  Lines        7110     7110           
=======================================
  Hits         1816     1816           
  Misses       5294     5294           
Flag Coverage Δ
cpu 25.54% <0.00%> (ø)
pytest 25.54% <0.00%> (ø)

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
pl_bolts/models/regression/logistic_regression.py 23.80% <0.00%> (ø)

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 50969a8...fe2b0c2.


@tchaton tchaton left a comment


LGTM!

@tchaton tchaton enabled auto-merge (squash) June 16, 2021 07:20
@garryod
Contributor

garryod commented Jun 16, 2021

Hello @pierrenodet, could we make the same changes in validation_step() and test_step() so that the losses match? (We would then need to add an F.softmax() inside the accuracy() call, since it expects probabilities.)
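A rough sketch of what that suggestion amounts to, in plain Python with hypothetical `accuracy` and `validation_step` stand-ins (not the actual pl_bolts or torchmetrics code): the loss side consumes raw logits, while the metric side is handed explicit probabilities because the metric rejects values outside [0, 1].

```python
import math

def softmax(logits):
    # numerically stable softmax over a list of floats
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def accuracy(probs_batch, targets):
    # hypothetical metric that, like the accuracy() mentioned above,
    # expects probabilities in [0, 1] rather than raw logits
    for row in probs_batch:
        assert all(0.0 <= p <= 1.0 for p in row), "expected probabilities"
    preds = [max(range(len(row)), key=row.__getitem__) for row in probs_batch]
    return sum(p == t for p, t in zip(preds, targets)) / len(targets)

def validation_step(batch_logits, targets):
    # the loss would be computed on the raw logits (cross-entropy applies
    # log-softmax itself); only the metric gets explicit probabilities
    probs = [softmax(row) for row in batch_logits]
    return accuracy(probs, targets)

logits = [[2.0, 0.5, -1.0], [0.1, 0.3, -0.2]]
acc = validation_step(logits, [0, 1])
print(acc)  # 1.0
```

Note that feeding the raw logits straight into this accuracy() would fail its range check (e.g. 2.0 > 1), which is why the softmax moves inside the metric call rather than disappearing entirely.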

@Borda Borda disabled auto-merge June 16, 2021 08:14
@Borda Borda merged commit b087892 into Lightning-Universe:master Jun 16, 2021
@pierrenodet pierrenodet deleted the patch-1 branch June 16, 2021 17:08

Successfully merging this pull request may close these issues.

Softmax applied twice when computing loss in LogisticRegression