docs: Add FSDP to deepspeed (#9182)
tara-det-ai committed May 14, 2024
1 parent 4f180db commit 2092943
Showing 1 changed file with 7 additions and 0 deletions.
@@ -305,6 +305,13 @@ interleaving micro batches:
.. _deepspeed-profiler:

Fully Sharded Data Parallelism (FSDP)
=====================================

To use FSDP, use the PyTorch FSDP package as usual together with the Core API; the
``PyTorchTrial`` API does not support FSDP. To learn more about the PyTorch FSDP package, visit
`PyTorch tutorials <https://pytorch.org/tutorials/>`__ > Getting Started with Fully Sharded Data
Parallel (FSDP).
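
As a rough illustration, a training script might look like the following minimal sketch, which
wraps a toy model in FSDP and reports metrics through the Core API. The model, hyperparameters,
and metric names here are illustrative assumptions, not part of this commit:

.. code:: python

   import os

   import torch
   import torch.distributed as dist
   from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

   import determined as det


   def train(core_context: det.core.Context) -> None:
       # Pin this process to its local GPU, then shard the model with FSDP.
       local_rank = int(os.environ["LOCAL_RANK"])
       torch.cuda.set_device(local_rank)
       model = FSDP(torch.nn.Linear(1024, 1024).cuda())  # toy model for illustration
       optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

       for batch_idx in range(100):
           x = torch.randn(8, 1024, device="cuda")  # synthetic batch
           loss = model(x).sum()
           optimizer.zero_grad()
           loss.backward()
           optimizer.step()
           # Report metrics from the chief worker via the Core API.
           if core_context.distributed.rank == 0:
               core_context.train.report_training_metrics(
                   steps_completed=batch_idx + 1, metrics={"loss": loss.item()}
               )


   if __name__ == "__main__":
       # FSDP builds on torch.distributed, so initialize the process group first,
       # then hand the resulting topology to the Core API.
       dist.init_process_group("nccl")
       distributed = det.core.DistributedContext.from_torch_distributed()
       with det.core.init(distributed=distributed) as core_context:
           train(core_context)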

***********
Profiling
***********
