Remove unused param tpu_core_idx #1948
Conversation
Codecov Report

```
@@          Coverage Diff          @@
##          master   #1948   +/-  ##
======================================
  Coverage     88%     88%
======================================
  Files         74      74
  Lines       4645    4645
======================================
  Hits        4068    4068
  Misses       577     577
```
Hi, in the MNIST TPU example
Note that we don't have TPU tests yet, so we always need to test it manually in the notebooks.
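Since there is no automated TPU coverage, "testing manually in the notebooks" amounts to a short smoke run of this sort. This is a minimal sketch, not the actual example notebook: the model is a stand-in, and the Trainer flag name is an assumption that varies by Lightning version (e.g. `num_tpu_cores` in 0.7.x vs `tpu_cores` later).

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class TinyModel(pl.LightningModule):
    """Stand-in module; the real check would use the MNIST TPU example."""
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return {'loss': torch.nn.functional.cross_entropy(self(x), y)}

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)

    def train_dataloader(self):
        ds = TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,)))
        return DataLoader(ds, batch_size=16)

# Run in a TPU notebook (Colab/Kaggle) with torch_xla installed;
# the flag name here assumes a post-0.7 Trainer API.
trainer = pl.Trainer(tpu_cores=8, max_epochs=1)
trainer.fit(TinyModel())
```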
```diff
@@ -885,7 +885,7 @@ def fit(
         # train
         if self.tpu_id is not None:
-            self.tpu_train(self.tpu_id, model)
+            self.tpu_train(model)
         else:
             xmp.spawn(self.tpu_train, args=(model,), nprocs=self.tpu_cores, start_method=start_method)
```
As you can see on this line, the tpu_train function is passed to the spawn function, which expects the tpu_core_idx.
It's not unused after all.
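The calling convention being referenced, as a minimal sketch: it assumes the standard torch_xla multiprocessing API, and the function body is illustrative, not the Lightning source.

```python
import torch_xla.distributed.xla_multiprocessing as xmp

def tpu_train(tpu_core_idx, model):
    # xmp.spawn invokes the target as fn(index, *args): the process index
    # (0..nprocs-1) is prepended automatically, so dropping the first
    # parameter breaks the spawn code path shown in the diff above.
    print(f"TPU core {tpu_core_idx} received {type(model).__name__}")

if __name__ == '__main__':
    model = object()  # stand-in for a real model
    xmp.spawn(tpu_train, args=(model,), nprocs=8, start_method='fork')
```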
damn... we need to get TPU tests ASAP...
cc: @luiscape
What does this PR do?
Removed the unused parameter `tpu_core_idx`.

PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃