
Can't run with multiple gpu #364

Open
volcverse opened this issue Jul 10, 2024 · 6 comments

@volcverse

Hello, thanks for the great work!

I followed example_code/example_chat.py to run the newest InternLM-XComposer-2.5 model on 4 NVIDIA 4090 GPUs, but I still hit the OOM problem. It seems that although the weights are split across the GPUs successfully, the first GPU always runs out of memory when model.chat is called.

Any response will be greatly appreciated!
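For context, here is a minimal sketch of how per-GPU memory caps can be set when sharding the weights. The helper name, model ID, and memory figures are my assumptions, not part of example_chat.py; the idea is to leave headroom on GPU 0, where generation activations tend to accumulate during model.chat.

```python
# Hedged sketch: reserve extra headroom on GPU 0 when sharding a model
# with device_map="auto". The caps below are assumptions for 24 GiB
# cards (4090); tune them for your hardware.
def build_max_memory(num_gpus, cap_gib=20, first_gpu_cap_gib=12):
    """Return a max_memory dict for transformers/accelerate.

    GPU 0 gets a smaller cap because activations produced during
    generation land there and can push it into OOM.
    """
    return {
        i: f"{first_gpu_cap_gib if i == 0 else cap_gib}GiB"
        for i in range(num_gpus)
    }

print(build_max_memory(4))
# The dict is then passed to from_pretrained, e.g.:
# model = AutoModel.from_pretrained(
#     'internlm/internlm-xcomposer2d5-7b', torch_dtype=torch.bfloat16,
#     device_map='auto', max_memory=build_max_memory(4),
#     trust_remote_code=True)
```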

@Uoops

Uoops commented Jul 11, 2024

same

@hyyuan123

I followed example_code/example_chat.py to run the newest InternLM-XComposer-2.5 model on 4 A800 GPUs, but I still hit the OOM problem.

@waltonfuture

same question

@yhcao6
Collaborator

yhcao6 commented Jul 18, 2024

Please try to install transformers==4.33.1 with the following command and try again:

pip install transformers==4.33.1

@YerongLi

YerongLi commented Jul 18, 2024

> Hello, thanks for the great work!
>
> I followed example_code/example_chat.py to run the newest InternLM-XComposer-2.5 model on 4 NVIDIA 4090 GPUs, but I still hit the OOM problem. It seems that although the weights are split across the GPUs successfully, the first GPU always runs out of memory when model.chat is called.
>
> Any response will be greatly appreciated!

I found that the model cannot take multiple images as input, nor can it take a list of images, so the fix is to pass a single image path and query:

image = './examples/dubai.png'
query = '<ImageHere>Please describe this image'
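A small guard based on this observation can catch the mismatch before calling model.chat. The helper below is hypothetical (not part of the repo), and it only encodes the workaround described above: one image path string, one placeholder tag.

```python
# Hedged sketch: validate that exactly one <ImageHere> placeholder and a
# single image path string are passed, per the workaround above.
# check_single_image is a hypothetical helper, not part of the repo.
def check_single_image(image, query):
    if not isinstance(image, str):
        raise TypeError("pass one image path string, not a list of images")
    if query.count('<ImageHere>') != 1:
        raise ValueError("query must contain exactly one <ImageHere> tag")
    return image, query

image, query = check_single_image('./examples/dubai.png',
                                  '<ImageHere>Please describe this image')
```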

@resi1ience

> Please try to install transformers==4.33.1 with the following command and try again:
>
> pip install transformers==4.33.1

Still hitting the same problem with transformers 4.33.1. I'm running the video understanding example from the Hugging Face model card.
Any response will be greatly appreciated.
