
check for kaggle env variable #1568

Merged

Conversation

lezwon
Contributor

@lezwon lezwon commented Apr 23, 2020

Before submitting

  • [x] Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • [x] Did you read the contributor guideline, Pull Request section?
  • [ ] Did you make sure to update the docs?
  • [ ] Did you write any new necessary tests?
  • [x] If you made a notable change (that affects users), did you update the CHANGELOG?

What does this PR do?

Fixes #1538
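
The linked issue reports that requesting multiple TPU cores fails inside Kaggle kernels, so the trainer needs a way to detect that it is running on Kaggle. A minimal sketch of such an environment check, assuming Kaggle kernels export the `KAGGLE_URL_BASE` environment variable (the exact variable name and how the result is used inside the trainer are assumptions, not taken from this PR's diff):

```python
import os


def on_kaggle() -> bool:
    """Return True when running inside a Kaggle kernel.

    Kaggle kernels set KAGGLE_URL_BASE in the environment
    (assumed variable name; verify in your own environment).
    """
    return "KAGGLE_URL_BASE" in os.environ


# Hypothetical usage: fall back to a single TPU core on Kaggle,
# where requesting 8 cores is what triggered issue #1538.
num_tpu_cores = 1 if on_kaggle() else 8
```

Outside the trainer, the same check is handy in user code, e.g. to adjust data paths or logging when a script moves between local runs and Kaggle kernels.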

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@mergify mergify bot requested a review from a team April 23, 2020 04:14
@lezwon lezwon force-pushed the bugfix/1538_kaggle_tpu_support branch from 5ca9dbb to f9360e6 Compare April 23, 2020 04:19
@codecov

codecov bot commented Apr 23, 2020

Codecov Report

Merging #1568 into master will not change coverage.
The diff coverage is n/a.

@@          Coverage Diff           @@
##           master   #1568   +/-   ##
======================================
  Coverage      89%     89%           
======================================
  Files          68      68           
  Lines        3906    3906           
======================================
  Hits         3471    3471           
  Misses        435     435           

@williamFalcon williamFalcon merged commit 8318429 into Lightning-AI:master Apr 23, 2020
Successfully merging this pull request may close these issues.

num_tpu_cores=8 does not work on kaggle