
Put prompt_token_ids, attention_mask and weights on the same device #719

Merged
rlouf merged 1 commit into main on Mar 1, 2024

Conversation

rlouf (Member) commented on Mar 1, 2024

Closes #679
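
For reference, the pattern behind the fix: a minimal sketch, not the actual diff from commit 42f465c, of moving the prompt token IDs and attention mask onto the same device as the model's weights before a forward pass. It assumes a standard `transformers` causal LM; the model name is illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative model; the integration wraps whatever model the user loads.
model = AutoModelForCausalLM.from_pretrained("gpt2")
tokenizer = AutoTokenizer.from_pretrained("gpt2")

# The device the model's weights live on (e.g. cuda:0 or cpu).
device = model.device

inputs = tokenizer("An example prompt", return_tensors="pt")

# Move every input tensor to the weights' device; otherwise torch raises
# "Expected all tensors to be on the same device, but found at least two devices".
prompt_token_ids = inputs["input_ids"].to(device)
attention_mask = inputs["attention_mask"].to(device)

with torch.no_grad():
    logits = model(prompt_token_ids, attention_mask=attention_mask).logits
```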

rlouf added the bug and transformers (Linked to the `transformers` integration) labels on Mar 1, 2024
rlouf merged commit 42f465c into main on Mar 1, 2024
5 checks passed
rlouf deleted the all-tensor-same-device branch on Mar 1, 2024
Labels
bug, transformers (Linked to the `transformers` integration)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

"Expected all tensors to be on the same device, but found at least two devices" when using different threads
1 participant