Commit
quick documentation fix (#3379)
Co-authored-by: v-chen_data <v-chen_data@example.com>
2 people authored and mvpatel2000 committed Jul 22, 2024
1 parent 3e6fccb commit c0dc9ab
Showing 1 changed file with 1 addition and 1 deletion.
composer/distributed/dist_strategy.py (2 changes: 1 addition & 1 deletion)

@@ -209,7 +209,7 @@ def prepare_fsdp_module(
     Args:
         model (torch.nn.Module): The model to wrap.
         optimizers (torch.optim.Optimizer | Sequence[torch.optim.Optimizer], optional): The optimizer for `model`, assumed to have a single param group := model.parameters().
-        fsdp_config (dict[str, Any]): The FSDP config.
+        fsdp_config (FSDPConfig): The FSDP config.
         precision: (Precision): The precision being used by the Trainer, used to fill in defaults for FSDP `mixed_precision` settings.
         device (Device): The device being used by the Trainer.
         auto_microbatching (bool, optional): Whether or not auto microbatching is enabled.
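The fix brings the docstring in line with the actual parameter type: `prepare_fsdp_module` takes a typed `FSDPConfig` object, not a raw `dict[str, Any]`. A minimal sketch of why the typed annotation is the better description, using hypothetical stand-in classes (not Composer's real implementation):

```python
from dataclasses import dataclass

@dataclass
class FSDPConfig:
    """Hypothetical stand-in for Composer's FSDPConfig dataclass."""
    sharding_strategy: str = "FULL_SHARD"
    mixed_precision: str = "DEFAULT"

def prepare_fsdp_module(fsdp_config: FSDPConfig) -> str:
    # With a dataclass, a misspelled field raises AttributeError
    # immediately, whereas dict.get() on a raw dict would silently
    # return None and surface the error much later.
    return fsdp_config.sharding_strategy

cfg = FSDPConfig()
print(prepare_fsdp_module(cfg))  # FULL_SHARD
```

Documenting the parameter as `FSDPConfig` also lets readers discover the valid fields from the dataclass definition rather than guessing at dictionary keys.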
