Error on trying to run internlm/internlm-xcomposer2-vl-7b-4bit #222
Comments
enkie358 changed the title from "Error on trying to run internlm/internlm-xcompser2-vl-7b-4bit" to "Error on trying to run internlm/internlm-xcomposer2-vl-7b-4bit" on Jun 25, 2024.
You have "Load in 4-bit" selected and the device is set to GPU, right?
It is set to load in 4-bit, but the machine I was testing this on was set to CPU.
That model can only be used with a GPU. But the program should not have let you proceed if you selected CPU. Did you modify the code?
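The guard described above can be sketched as a small check run before model loading. This is a hypothetical, minimal illustration (the function name and messages are assumptions, not the project's actual code): if "Load in 4-bit" is enabled while the device is CPU, refuse to proceed instead of attempting a load that is bound to fail.

```python
def ensure_device_supports_4bit(device: str, load_in_4bit: bool) -> None:
    """Hypothetical guard: 4-bit quantized models such as
    internlm/internlm-xcomposer2-vl-7b-4bit require a CUDA GPU,
    so refuse to proceed when the selected device is the CPU."""
    if load_in_4bit and device.lower() != "cuda":
        raise ValueError(
            "4-bit quantized models require a GPU. "
            "Select GPU as the device or disable 'Load in 4-bit'."
        )
```

A check like this surfaces the configuration problem with a clear message in the UI, rather than letting `from_pretrained` fail later with an unrelated-looking `OSError`.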
No, I did not. Thank you for letting me know!
I get this error when I try to caption images with this model:
--
Loading internlm/internlm-xcomposer2-vl-7b-4bit...
Traceback (most recent call last):
File "auto_captioning\captioning_thread.py", line 532, in run
File "auto_captioning\captioning_thread.py", line 528, in run
File "auto_captioning\captioning_thread.py", line 415, in run_captioning
File "auto_captioning\captioning_thread.py", line 271, in load_processor_and_model
File "transformers\models\auto\auto_factory.py", line 558, in from_pretrained
File "transformers\modeling_utils.py", line 3451, in from_pretrained
OSError: internlm/internlm-xcomposer2-vl-7b-4bit does not appear to have a file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt or flax_model.msgpack.
--
Any advice would be appreciated!