
Removes need to unsqueeze from dp #1319

Merged
williamFalcon merged 11 commits into master from ddp on Apr 2, 2020
Conversation

williamFalcon (Contributor)

No description provided.
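Since the PR carries no description, context from the title: under `nn.DataParallel` (dp), per-GPU outputs are gathered by concatenating along a dimension, which fails for zero-dim tensors, so `training_step` previously had to return the loss via `.unsqueeze(0)`. A minimal sketch of that failure mode and the old workaround (values and names are illustrative, not from the PR):

```python
import torch

# Per-GPU losses as a training_step would return them:
# plain 0-dim scalar tensors, one per replica.
losses = [torch.tensor(0.5), torch.tensor(0.7)]

# DataParallel's gather concatenates replica outputs along a dim,
# which rejects scalars:
# torch.cat(losses)  # RuntimeError: zero-dimensional tensor cannot be concatenated

# The manual workaround this PR removes: unsqueeze before returning.
gathered = torch.cat([loss.unsqueeze(0) for loss in losses])
print(gathered)  # tensor([0.5000, 0.7000])
```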

@williamFalcon williamFalcon changed the title removes need to unsqueeze from dp [WIP] removes need to unsqueeze from dp Mar 31, 2020
@mergify mergify bot requested a review from a team March 31, 2020 14:06
@williamFalcon williamFalcon changed the title [WIP] removes need to unsqueeze from dp Removes need to unsqueeze from dp Mar 31, 2020
mergify bot commented Mar 31, 2020

This pull request is now in conflict... :(

awaelchli (Contributor) left a comment

Nice! :)

pytorch_lightning/overrides/data_parallel.py (outdated, resolved)
```diff
@@ -199,3 +202,15 @@ def _worker(i, module, input, kwargs, device=None):
             raise output
         outputs.append(output)
     return outputs
+
+
+def auto_squeeze_dim_zeros(output):
```
awaelchli (Contributor):

How about just "unsqueeze_scalars"?

Co-Authored-By: Adrian Wälchli <adrian.waelchli@students.unibe.ch>
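
For readers without the full diff open, a minimal sketch of what `auto_squeeze_dim_zeros` plausibly does, inferred from its name and the thread above (the authoritative body is in the PR diff; this reconstruction is an assumption):

```python
import torch

def auto_squeeze_dim_zeros(output):
    # Assumed behavior: walk the dict returned by training_step and
    # promote any zero-dim tensor to 1-dim, so DataParallel's gather
    # can concatenate per-GPU results without a manual unsqueeze.
    for k, v in output.items():
        if isinstance(v, torch.Tensor) and v.dim() == 0:
            output[k] = v.unsqueeze(0)
    return output
```

With a hook like this in the dp override's forward path, `training_step` can return a plain scalar loss, e.g. `{'loss': loss}`, and gathering still works.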
mergify bot commented Apr 1, 2020

This pull request is now in conflict... :(

codecov bot commented Apr 2, 2020

Codecov Report

Merging #1319 into master will increase coverage by <1%.
The diff coverage is 100%.

@@           Coverage Diff           @@
##           master   #1319    +/-   ##
=======================================
+ Coverage      92%     92%   +<1%     
=======================================
  Files          62      62            
  Lines        3239    3246     +7     
=======================================
+ Hits         2964    2971     +7     
  Misses        275     275

@williamFalcon williamFalcon merged commit 3cb149f into master Apr 2, 2020
@Borda Borda added the feature Is an improvement or enhancement label Apr 2, 2020
@Borda Borda added this to the 0.7.2 milestone Apr 2, 2020
@Borda Borda deleted the ddp branch April 2, 2020 15:55

alexeykarnachev pushed a commit to alexeykarnachev/pytorch-lightning that referenced this pull request Apr 3, 2020
* removes need to unsqueeze from dp

* removes need to unsqueeze from dp

* fixed examples

* added auto unsqueeze

* added auto unsqueeze

* added auto unsqueeze

* added auto unsqueeze

* Update pytorch_lightning/overrides/data_parallel.py

Co-Authored-By: Adrian Wälchli <adrian.waelchli@students.unibe.ch>

* fixed dp parse

* fixed dp parse

Co-authored-by: Adrian Wälchli <adrian.waelchli@students.unibe.ch>
@Borda Borda modified the milestones: v0.7., v0.7.x Apr 18, 2021