
Keep model in CPU during ONNX export #1586

Merged: 1 commit merged into main from onnx_export_cpu on May 31, 2023

Conversation

@natuan (Contributor) commented on May 27, 2023

This fixes an OOM issue when exporting the 13B OPT model.

@natuan requested review from a team, shubhra, rahul-tuli, KSGulin, bfineran, markurtz and dbogunowicz, and removed the review request for a team, shubhra, rahul-tuli and KSGulin on May 27, 2023 04:19
@dbogunowicz (Contributor) left a comment

Looks good to me. Just wondering: why did model.cpu() not work in the first place? Was the Trainer ignoring this piece of logic?

@natuan (Contributor, Author) commented on May 30, 2023

> Looks good to me. Just wondering: why did model.cpu() not work in the first place? Was the Trainer ignoring this piece of logic?

It's too late at that point.
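
For context, a minimal sketch of the idea behind the fix: keep the model (and its sample inputs) on CPU for the entire export, rather than calling model.cpu() only once export begins. The function name, signature, and export arguments below are illustrative assumptions, not the actual sparseml change:

```python
import torch

def export_on_cpu(model: torch.nn.Module, sample_inputs: tuple, onnx_path: str):
    # Illustrative sketch only (not the sparseml implementation).
    # The model is assumed to have stayed on CPU from the start; per the
    # discussion above, moving a 13B model with model.cpu() only at export
    # time is too late to avoid the OOM.
    model = model.eval()
    cpu_inputs = tuple(t.cpu() for t in sample_inputs)  # inputs must match the model's device
    with torch.no_grad():
        torch.onnx.export(model, cpu_inputs, onnx_path)
```

A caller would pass a model that was never moved to GPU, e.g. export_on_cpu(model, (input_ids,), "model.onnx").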

@natuan merged commit 9c7c285 into main on May 31, 2023
12 checks passed
@natuan deleted the onnx_export_cpu branch on May 31, 2023 18:34