Adds precision to eval #148

Merged
merged 5 commits into main from mvpatel2000/bf16 on May 18, 2023
Conversation

@mvpatel2000 (Collaborator) commented May 16, 2023

Adds precision to eval and sets MPT to bf16. For reasons not yet understood, bf16 + FSDP requires mixed_precision: FULL; it works fine without FSDP. fp16 also works fine with FSDP under any mixed_precision setting and gives essentially the same numbers.
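The working combinations described above can be sketched as an eval config fragment. This is illustrative: the key names (`precision`, `fsdp_config.mixed_precision`) follow the Composer/llm-foundry convention and are assumptions, not the exact diff in this PR.

```yaml
# bf16 + FSDP: observed to work only with full mixed precision (cause not yet debugged)
precision: amp_bf16
fsdp_config:
  mixed_precision: FULL

# fp16 + FSDP: works with any mixed_precision setting and gives
# essentially the same eval numbers
# precision: amp_fp16
# fsdp_config:
#   mixed_precision: DEFAULT
```

Without FSDP, `precision: amp_bf16` works on its own, so the restriction appears specific to the bf16 + FSDP combination.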

@mvpatel2000 mvpatel2000 requested a review from vchiley May 17, 2023 18:54
@dakinggg (Collaborator) left a comment

Please wait for @abhi-mosaic approval as well

@abhi-mosaic (Member) left a comment

Looks good for now; we'll debug amp_bf16 + FSDP later so we can reduce memory and compute requirements in the future.

@abhi-mosaic abhi-mosaic merged commit fb9f7dd into main May 18, 2023
6 checks passed
@mvpatel2000 mvpatel2000 deleted the mvpatel2000/bf16 branch May 18, 2023 19:28
bmosaicml pushed a commit that referenced this pull request Jun 6, 2023

3 participants