
Enable or disable bf16 support based on availability #1116

Merged
4 commits merged into axolotl-ai-cloud:main on Jan 14, 2024

Conversation

@simhallq (Contributor) commented Jan 13, 2024

Issue #1090

Added support for auto as a value for the bf16 setting in config files. With bf16: auto, bf16 is now resolved to True or False depending on whether the GPU supports bf16.
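A minimal sketch of what the resolution described above amounts to (simplified; the real logic lives in axolotl's config normalization, and `gpu_supports_bf16` here is a stand-in for the actual hardware check such as transformers' `is_torch_bf16_gpu_available()`):

```python
# Sketch of resolving a config value of True/False/"auto" into a concrete bool.
# `gpu_supports_bf16` is a stand-in for the real availability check.

def resolve_bf16(bf16_setting, gpu_supports_bf16: bool) -> bool:
    """Turn bf16: True/False/"auto" into a concrete bool."""
    if bf16_setting == "auto":
        # Enable bf16 only when the hardware actually supports it
        return gpu_supports_bf16
    return bool(bf16_setting)

print(resolve_bf16("auto", gpu_supports_bf16=True))   # True
print(resolve_bf16("auto", gpu_supports_bf16=False))  # False
```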

@winglian (Collaborator)

Also, if you're feeling ambitious: add a test for the config normalization that checks it sets bf16 appropriately, based on a mock of is_torch_bf16_gpu_available.
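A hypothetical sketch of the suggested test: patch the availability check and assert that normalization resolves bf16: auto both ways. The names mirror the real ones (normalize_config, is_torch_bf16_gpu_available), but the bodies below are stand-ins, not axolotl's actual code:

```python
# Stand-in module under test plus a mocked unit test, assuming the
# availability check is looked up at call time on the module.
import sys
from types import SimpleNamespace
from unittest import mock

def is_torch_bf16_gpu_available() -> bool:  # stand-in for the real check
    raise RuntimeError("patched in tests")

def normalize_config(cfg):
    if cfg.bf16 == "auto":
        cfg.bf16 = is_torch_bf16_gpu_available()
    return cfg

def test_bf16_auto():
    for available in (True, False):
        # Patch the module-level check so the test runs without a GPU
        with mock.patch.object(sys.modules[__name__],
                               "is_torch_bf16_gpu_available",
                               return_value=available):
            cfg = normalize_config(SimpleNamespace(bf16="auto"))
            assert cfg.bf16 is available

test_bf16_auto()
```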

@simhallq (Contributor, Author)

@winglian I've changed logging level and added a unit test.

@simhallq (Contributor, Author) commented Jan 14, 2024

Btw, the test fails on Mac since device == "mps" sets bf16 == False (which is the desired behavior). Should I mock the value of choose_device in the test, or is that overkill?

@winglian (Collaborator) left a comment

Perfect. Thanks! No need to worry about mocking the device for now.

@winglian winglian merged commit 0865613 into axolotl-ai-cloud:main Jan 14, 2024
7 checks passed
@simhallq simhallq deleted the feature/bf16auto-support branch January 15, 2024 15:44
@simhallq simhallq restored the feature/bf16auto-support branch January 15, 2024 15:44
@simhallq (Contributor, Author)

@winglian I just realized I missed something: validate_config now throws an error at this line if bf16 is unavailable, since we are not explicitly checking that cfg.bf16 is True but just checking for any truthy value (which includes a non-empty string like "auto").

Should I open a new PR with the fix?
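The truthiness pitfall described above can be shown in two lines: any non-empty string, including "auto", is truthy, so a plain `if cfg.bf16:` fires even when bf16 was never resolved to True. An identity check against True avoids that:

```python
bf16 = "auto"

print(bool(bf16))    # True  -- a non-empty string is truthy, so `if bf16:` fires
print(bf16 is True)  # False -- the explicit check only passes for an actual True
```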
