
Adding in the possibility of 'None' for MixedPrecision FSDP #1796

Merged
merged 8 commits on Dec 8, 2022

Conversation

bcui19
Contributor

@bcui19 bcui19 commented Dec 7, 2022

What does this PR do?

Passing 'NONE' as the value for MixedPrecision in FSDP will keep all values in full precision:

https://github.com/pytorch/pytorch/blob/eb56b08f96fdbca17ee09ab8500ca145085e1a3a/test/distributed/fsdp/test_fsdp_mixed_precision.py#L78
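
For context, here is a minimal sketch of how a 'NONE' option could be translated into FSDP's MixedPrecision alongside 'FULL'. The helper name get_mixed_precision and its return convention are assumptions for illustration, not necessarily what composer/trainer/dist_strategy.py actually does:

```python
# Sketch only: one way a precision string could be mapped for FSDP.
# `get_mixed_precision` is a hypothetical helper, not Composer's actual API.
from typing import Optional

import torch
from torch.distributed.fsdp import MixedPrecision


def get_mixed_precision(precision: str) -> Optional[MixedPrecision]:
    precision = precision.upper()
    if precision == 'NONE':
        # Returning None means FSDP receives mixed_precision=None: no dtype
        # casting is configured, so params, grads, and buffers stay in
        # whatever (full) precision the model already uses.
        return None
    if precision == 'FULL':
        # Explicitly cast everything to float32.
        return MixedPrecision(
            param_dtype=torch.float32,
            reduce_dtype=torch.float32,
            buffer_dtype=torch.float32,
        )
    raise ValueError(f'Unsupported mixed_precision value: {precision!r}')
```

In this sketch both paths end up with float32 values for a float32 model; the difference is whether FSDP's mixed-precision casting machinery is configured at all, which is the distinction the reviewer asks about below.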

What issue(s) does this change relate to?

CO-1503

Before submitting

  • Have you read the contributor guidelines?
  • Is this change a documentation change or typo fix? If so, skip the rest of this checklist.
  • Was this change discussed/approved in a GitHub issue first? It is much more likely to be merged if so.
  • Did you update any related docs and document your change?
  • Did you update any related tests and add any new tests related to your change? (see testing)
  • Did you run the tests locally to make sure they pass?
  • Did you run pre-commit on your change? (see the pre-commit section of prerequisites)

@bcui19 bcui19 changed the title from "Adding in the possibility of 'None for MixedPrecision FSDP" to "Adding in the possibility of 'None' for MixedPrecision FSDP" on Dec 7, 2022
composer/trainer/dist_strategy.py (review thread, outdated, resolved)
@vchiley
Contributor

vchiley commented Dec 7, 2022

Can you comment on how this differs from 'FULL', or why 'FULL' doesn't work if this is the same thing?

Contributor

@vchiley vchiley left a comment


The requested change regarding get_torch_dtype being passed None is blocking.

vchiley and others added 2 commits December 8, 2022 08:41
  • add error if mixed_precision dict is passed incorrectly
  • enable mixed_precision dict to set keep_low_precision_grads
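
Taken together, those two commits suggest the rough shape below: validate a user-supplied mixed_precision dict and pass keep_low_precision_grads through to FSDP's MixedPrecision. This is a hedged sketch only; the key names and dtype strings are assumptions, not Composer's exact schema:

```python
# Sketch only: building an FSDP MixedPrecision from a user-supplied dict,
# with validation of unknown keys and support for keep_low_precision_grads.
# Key names and dtype strings are illustrative assumptions.
import torch
from torch.distributed.fsdp import MixedPrecision

_ALLOWED_KEYS = {'param_dtype', 'reduce_dtype', 'buffer_dtype', 'keep_low_precision_grads'}
_DTYPES = {'fp32': torch.float32, 'fp16': torch.float16, 'bf16': torch.bfloat16}


def _to_dtype(name):
    # Missing entries stay None so FSDP skips casting for that field.
    if name is None:
        return None
    if name not in _DTYPES:
        raise ValueError(f'Unknown dtype string: {name!r}')
    return _DTYPES[name]


def mixed_precision_from_dict(cfg: dict) -> MixedPrecision:
    # Raise a clear error if the mixed_precision dict is passed incorrectly.
    unknown = set(cfg) - _ALLOWED_KEYS
    if unknown:
        raise ValueError(f'Invalid mixed_precision keys {sorted(unknown)}; '
                         f'allowed keys are {sorted(_ALLOWED_KEYS)}')
    return MixedPrecision(
        param_dtype=_to_dtype(cfg.get('param_dtype')),
        reduce_dtype=_to_dtype(cfg.get('reduce_dtype')),
        buffer_dtype=_to_dtype(cfg.get('buffer_dtype')),
        keep_low_precision_grads=cfg.get('keep_low_precision_grads', False),
    )
```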
@bcui19 bcui19 requested a review from vchiley December 8, 2022 17:16
@bcui19 bcui19 merged commit 250c654 into mosaicml:dev Dec 8, 2022