[BEIT-3] error happens when I evaluate BEiT-3 finetuned model on VQAv2 #1597
Changing
@Sv3n01 Thank you for replying!
I removed "torch.distributed.launch --nproc_per_node=2" and ran it again with this code:
The error:
I am trying to solve this problem now.
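For context, the two launch styles look roughly like this. This is a hedged sketch: the full argument list (model, task, checkpoint paths, etc.) is not shown in this thread and should be taken from the linked get_started_for_vqav2.md guide, so it is elided here.

```shell
# Distributed evaluation across 2 GPUs, as in get_started_for_vqav2.md
# (remaining --model/--task/... arguments omitted; see the linked guide):
python -m torch.distributed.launch --nproc_per_node=2 run_beit3_finetuning.py --eval ...

# Plain single-process run, with the launcher prefix removed:
python run_beit3_finetuning.py --eval ...
```

Dropping the launcher prefix runs the script as a single process, so any code paths guarded by torch.distributed initialization checks may behave differently between the two invocations.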
I can get submit_vqav2_test.json (the list of question_id/answer pairs). I wrote this in run_beit3_finetuning.py (line 141). Then, I ran this code. (Maybe you should not omit "-m torch.distributed.launch --nproc_per_node=2".)
Then, I can get submit_vqav2_test.json.
I don't know why I can now get the JSON file, but I am closing this issue.
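The snippet added at line 141 of run_beit3_finetuning.py is not shown above, so as a hedged illustration only, writing the question_id/answer pairs to submit_vqav2_test.json might look like the sketch below. The `predictions` variable and its sample values are hypothetical; in the real script the pairs would come from the model's decoded VQA answers.

```python
import json

# Hypothetical predictions collected during evaluation; in the actual
# script these would be the decoded answers for each test question.
predictions = [
    {"question_id": 262148000, "answer": "net"},
    {"question_id": 262148001, "answer": "2"},
]

# Write the submission file as a JSON list of
# {"question_id": ..., "answer": ...} objects, which matches the
# "list of pairs of question_id and answer" described in this thread.
with open("submit_vqav2_test.json", "w") as f:
    json.dump(predictions, f)
```

Reading the file back with `json.load` should return the same list of dicts, which is a quick sanity check before submitting.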
Describe
Model I am using (UniLM, MiniLM, LayoutLM ...): BEIT-3
I want to evaluate BEiT-3 finetuned model on VQAv2.
https://github.com/microsoft/unilm/blob/master/beit3/get_started/get_started_for_vqav2.md#example-evaluate-beit-3-finetuned-model-on-vqav2-visual-question-answering
However, an error occurs. I cannot understand what this error message means. How do I solve this problem? Please help me.
Thank you for sharing the code for BEiT-3.