
Update/merge multi-gpu docs #2021

Merged
merged 14 commits into master from docs/multi-gpu-docs on Jun 2, 2020

Conversation

@awaelchli (Member) commented May 30, 2020

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos and docs improvements)
    - yes, on Slack
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

@mergify bot requested a review from a team May 30, 2020 22:26
@awaelchli added the docs (Documentation related) label May 30, 2020
@awaelchli marked this pull request as ready for review May 31, 2020 01:09
Resolved review threads (outdated):
  • docs/source/multi_gpu.rst (3 threads)
  • docs/source/slurm.rst (2 threads)

From one of these threads, the snippet under discussion:
dataloader = DataLoader(dataset, sampler=dist_sampler)
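
For context, the snippet above presupposes that dist_sampler already exists. A minimal self-contained sketch of the full pattern follows; the dataset and the hard-coded num_replicas/rank are placeholders for illustration only, since in a real job they come from the initialized process group:

import torch
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

# placeholder dataset; in practice this is your own Dataset
dataset = TensorDataset(torch.randn(100, 32))

# DistributedSampler shards the dataset so each process/GPU sees a
# disjoint subset; num_replicas and rank are hard-coded here only so
# the sketch runs without an initialized process group
dist_sampler = DistributedSampler(dataset, num_replicas=2, rank=0)

# shuffle must stay False/unset whenever an explicit sampler is given
dataloader = DataLoader(dataset, sampler=dist_sampler)

Note that Lightning normally injects an equivalent sampler automatically when a distributed backend is used, so the sketch mainly shows what happens under the hood.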


Auto-slurm-job-submission
Member:

what about this? @awaelchli

Member Author (@awaelchli):

I moved it and renamed the title to "Building SLURM scripts" because auto SLURM job submission was already described in the other document, and this title seemed to fit better.

Member:

I found it later too :]
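
For readers arriving from the merged docs: the sections under discussion cover launching multi-node training via SLURM, which on the Trainer side comes down to a handful of flags. Below is a minimal sketch using the Lightning API as of this PR (mid-2020); the GPU and node counts are illustrative, and later Lightning versions renamed distributed_backend:

from pytorch_lightning import Trainer

# Lightning API circa mid-2020; newer releases replace
# distributed_backend with accelerator/strategy arguments
trainer = Trainer(
    gpus=8,                     # GPUs per node
    num_nodes=4,                # node count requested from SLURM
    distributed_backend='ddp',  # DistributedDataParallel across 4 x 8 GPUs
)
# trainer.fit(model)  # model is a user-defined LightningModule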

@mergify bot requested a review from a team June 1, 2020 12:07
@mergify bot requested a review from a team June 1, 2020 12:08
@Borda (Member) commented Jun 2, 2020

@awaelchli pls fix the doc build and ready to go... :]

@Borda added the ready (PRs ready to be merged) label Jun 2, 2020
@codecov bot commented Jun 2, 2020

Codecov Report

Merging #2021 into master will not change coverage.
The diff coverage is 100%.

@@          Coverage Diff           @@
##           master   #2021   +/-   ##
======================================
  Coverage      86%     86%           
======================================
  Files          74      74           
  Lines        4710    4710           
======================================
  Hits         4066    4066           
  Misses        644     644           

@williamFalcon merged commit a699003 into master Jun 2, 2020
@awaelchli deleted the docs/multi-gpu-docs branch June 2, 2020 23:29
justusschock pushed a commit that referenced this pull request Jun 29, 2020
* merge multi-gpu docs

* extend slurm docs

* update links to elastic

* format docs and type hints in distrib parts

* reference multi-gpu/slurm in trainer args docs

* fix doctest

* typo

* doctest

* Apply suggestions from code review

Co-authored-by: Lucas Vazquez <lucasgouvaz@gmail.com>

* wall time

* Update docs/source/slurm.rst

Co-authored-by: Lucas Vazquez <lucasgouvaz@gmail.com>

* fix title

* update docs for weights summary

* update changelog

Co-authored-by: Lucas Vazquez <lucasgouvaz@gmail.com>
Labels
docs (Documentation related) · ready (PRs ready to be merged)

Projects
None yet

Development
Linked issues: none yet

4 participants