
[AnyPrecision optimizer] consider FP32 defaults, possibly automated via BF16 support check #59

Open
lessw2020 opened this issue Aug 31, 2022 · 1 comment
Labels
enhancement New feature or request

Comments

@lessw2020
Contributor

Enhancement (credit to @rohan-varma):
"This can be done in a follow-up PR, but let's consider eventually not defaulting things to torch.bfloat16. It might be good to make this optimizer usable out of the box with its defaults on all hardware architectures, but only the A100 supports bfloat16 well at the moment.

But the downside here would be that the default optimizer won't be too interesting; it would just be AdamW."

A possible way to accomplish this would be a simple native BF16 support check, and if the check fails, revert any BF16 defaults to FP32 (and turn off Kahan summation as well, since it adds no benefit in FP32).
The remaining dilemma is whether to warn the user about this change: the upside is that they know they are not getting the BF16 benefits; the downside is that they may already be aware and would not enjoy seeing the same one-line warning repeated across 128 GPUs.
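A minimal sketch of what that check and fallback could look like, assuming PyTorch's torch.cuda.is_bf16_supported() is available; the parameter names momentum_dtype, variance_dtype, and use_kahan_summation are illustrative and may not match the actual AnyPrecision signature. It also warns only on rank 0 to avoid the one-line-warning-times-128-GPUs problem:

```python
import warnings

import torch
import torch.distributed as dist


def bf16_natively_supported() -> bool:
    # True on devices (e.g. A100) that handle bfloat16 natively;
    # CPU-only runs or older GPUs fall through to the FP32 path below.
    return torch.cuda.is_available() and torch.cuda.is_bf16_supported()


def resolve_precision_defaults(momentum_dtype=torch.bfloat16,
                               variance_dtype=torch.bfloat16,
                               use_kahan_summation=True,
                               warn_on_fallback=True):
    # If BF16 is not natively supported, revert BF16 defaults to FP32 and
    # turn off Kahan summation (it adds no benefit for FP32 state).
    if bf16_natively_supported():
        return momentum_dtype, variance_dtype, use_kahan_summation

    if momentum_dtype is torch.bfloat16:
        momentum_dtype = torch.float32
    if variance_dtype is torch.bfloat16:
        variance_dtype = torch.float32
    use_kahan_summation = False

    # Warn only on rank 0 so the message appears once, not once per GPU.
    is_rank_zero = (not dist.is_available() or not dist.is_initialized()
                    or dist.get_rank() == 0)
    if warn_on_fallback and is_rank_zero:
        warnings.warn(
            "BF16 is not natively supported on this device; optimizer state "
            "dtypes reverted to FP32 and Kahan summation disabled.",
            stacklevel=2,
        )
    return momentum_dtype, variance_dtype, use_kahan_summation
```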

@lessw2020
Contributor Author

From the review discussion:
"Actually, I think having this concept of 'smart defaults', where the optimizer attempts BF16 but rolls back to FP32 when BF16 is not supported, is a nice user experience. The same could apply to Kahan summation: if BF16 is not supported, revert it back to off and move both momentum and variance to FP32 automatically (since Kahan adds no value for FP32). This would be nice because users would inherently get the optimal setup on various hardware.

That also gives it a bit more value even in the scenario where it becomes plain AdamW, because it will automatically switch back to the BF16 defaults on future hardware where BF16 is supported."
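A sketch of how those smart defaults could be wired into the constructor, reusing the resolve_precision_defaults() helper from the sketch above; AnyPrecisionAdamWSketch and its argument list are illustrative shells, not the actual AnyPrecisionAdamW API:

```python
import torch


class AnyPrecisionAdamWSketch(torch.optim.Optimizer):
    """Illustrative shell only; the real AnyPrecisionAdamW signature may differ."""

    def __init__(self, params, lr=1e-3, betas=(0.9, 0.999), eps=1e-8,
                 weight_decay=0.0,
                 momentum_dtype=torch.bfloat16,
                 variance_dtype=torch.bfloat16,
                 use_kahan_summation=True):
        # "Smart defaults": attempt BF16, roll back to FP32 (with Kahan off)
        # when the hardware lacks native BF16 support. On future hardware
        # with native BF16, the same defaults light up again automatically.
        momentum_dtype, variance_dtype, use_kahan_summation = (
            resolve_precision_defaults(momentum_dtype, variance_dtype,
                                       use_kahan_summation))
        defaults = dict(lr=lr, betas=betas, eps=eps, weight_decay=weight_decay,
                        momentum_dtype=momentum_dtype,
                        variance_dtype=variance_dtype,
                        use_kahan_summation=use_kahan_summation)
        super().__init__(params, defaults)
```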
