Can't run with multiple gpu #364
Comments
same
I referred to example_code/example_chat.py to run the newest InternLM-XComposer-2.5 model using 4 A800 GPUs, but I still hit the OOM problem.
same question |
Please install transformers==4.33.1 with the following command and try again: pip install transformers==4.33.1
I found the model cannot take multiple images as input, nor a list of images, so the fix is
I still hit the same problem with transformers 4.33.1. I'm running the video-understanding example from the Hugging Face page.
Hello, thanks for the great work!
I referred to example_code/example_chat.py to run the newest InternLM-XComposer-2.5 model using 4 NVIDIA 4090 GPUs, but I still hit the OOM problem. It seems that although the weights are divided across the GPUs successfully, the first GPU always runs out of memory when model.chat is called. Any response will be greatly appreciated!
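Since the symptom is that weights shard fine but GPU 0 still OOMs during generation, one common workaround with transformers' device_map="auto" is to cap the weight budget on GPU 0 so generation-time activations (KV cache, image features) have headroom there. A minimal sketch, assuming 24 GiB cards and the standard from_pretrained API; the helper name, GiB numbers, and model path are illustrative assumptions, not the repo's official fix:

```python
def build_max_memory(num_gpus, per_gpu_gib=24, gpu0_headroom_gib=8):
    """Build a max_memory mapping for from_pretrained(device_map='auto').

    GPU 0 gets a smaller weight budget so that activations produced during
    model.chat (which accumulate on the first device) do not push it into OOM.
    The per-GPU sizes here are assumptions; adjust them to your hardware.
    """
    budget = {i: f"{per_gpu_gib}GiB" for i in range(num_gpus)}
    budget[0] = f"{per_gpu_gib - gpu0_headroom_gib}GiB"
    return budget

# Hypothetical usage with the Hugging Face checkpoint (requires torch and
# transformers installed; left commented so this sketch stays self-contained):
#
# import torch
# from transformers import AutoModel
# model = AutoModel.from_pretrained(
#     "internlm/internlm-xcomposer2d5-7b",
#     torch_dtype=torch.float16,
#     trust_remote_code=True,
#     device_map="auto",
#     max_memory=build_max_memory(num_gpus=4),
# )
```

With this mapping, accelerate's dispatcher places fewer weight shards on GPU 0, trading slightly more inter-GPU traffic for headroom during generation.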