
Remove unused param tpu_core_idx #1948

Merged
merged 1 commit into Lightning-AI:master from rohitgr7:fix_tpu_idx_param on May 25, 2020
Conversation

rohitgr7
Contributor

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

What does this PR do?

Removed unused parameter tpu_core_idx.

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@mergify mergify bot requested a review from a team May 25, 2020 18:53
@codecov

codecov bot commented May 25, 2020

Codecov Report

Merging #1948 into master will not change coverage.
The diff coverage is 100%.

@@          Coverage Diff           @@
##           master   #1948   +/-   ##
======================================
  Coverage      88%     88%           
======================================
  Files          74      74           
  Lines        4645    4645           
======================================
  Hits         4068    4068           
  Misses        577     577           

@mergify mergify bot requested a review from a team May 25, 2020 19:51
@Borda Borda added the bug Something isn't working label May 25, 2020
@Borda Borda added this to the 0.7.7 milestone May 25, 2020
@Borda Borda added the ready PRs ready to be merged label May 25, 2020
@williamFalcon williamFalcon merged commit d0ec11b into Lightning-AI:master May 25, 2020
@rohitgr7 rohitgr7 deleted the fix_tpu_idx_param branch May 25, 2020 20:09
@Borda Borda modified the milestones: 0.7.7, 0.8.0 May 26, 2020
@awaelchli
Contributor

Hi
Did you actually test this?
I'm getting

Exception in device=TPU:0: tpu_train() takes 2 positional arguments but 3 were given
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/torch_xla/distributed/xla_multiprocessing.py", line 119, in _start_fn
    fn(gindex, *args)
TypeError: tpu_train() takes 2 positional arguments but 3 were given

in the MNIST TPU example
https://colab.research.google.com/drive/1-_LKx4HwAxl5M6xPJmqAAu444LTDQoa3
Could we fix this ASAP? I'd like to test my PR #1756 to make sure TPU works.

@awaelchli
Contributor

Note that we don't have TPU tests yet, so we need to always test it manually in the notebooks.

@@ -885,7 +885,7 @@ def fit(

             # train
             if self.tpu_id is not None:
-                self.tpu_train(self.tpu_id, model)
+                self.tpu_train(model)
             else:
                 xmp.spawn(self.tpu_train, args=(model,), nprocs=self.tpu_cores, start_method=start_method)
Contributor

As you can see on this line, the tpu_train function is passed to the spawn function, which expects it to accept the tpu_core_idx.
It's not unused after all.
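
For anyone hitting the same TypeError, here is a minimal sketch of the calling convention, assuming torch_xla is available (illustration only, not the actual Trainer code): xmp.spawn prepends the process index to the arguments it forwards, so the spawned function must accept it as its first positional parameter even if the body never reads it.

import torch_xla.distributed.xla_multiprocessing as xmp

def tpu_train(tpu_core_idx, model):
    # xmp.spawn invokes this as tpu_train(process_index, *args);
    # dropping tpu_core_idx makes the call fail with
    # "tpu_train() takes N positional arguments but N+1 were given".
    print(f"training on TPU core {tpu_core_idx}")

# xmp.spawn(tpu_train, args=(model,), nprocs=8, start_method='fork')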

Member

Damn... we need to get TPU tests ASAP...
cc: @luiscape

@mergify mergify bot requested a review from a team May 26, 2020 20:29
Borda added a commit that referenced this pull request May 26, 2020
williamFalcon pushed a commit that referenced this pull request May 26, 2020
justusschock pushed a commit that referenced this pull request Jun 29, 2020
Labels
bug (Something isn't working), ready (PRs ready to be merged)
4 participants