
Use LogitsProcessor for transformers integration #926

Closed
wants to merge 1 commit into from

Conversation


@lapp0 lapp0 commented May 29, 2024

Fixes #806

Related: #789

Requesting comment on whether we should unify the logits processor in this PR as described in the last section of this message. @brandonwillard @rlouf

Problem

The transformers integration has two issues in its generate logic:

  • it fails if input_ids is empty
  • its implementation is inconsistent with the other models: it is the only one that doesn't use logits processors

Solution

Implement logits processors for transformers.
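For context, transformers applies logits processors via a `__call__(input_ids, scores)` hook at each decoding step; a constrained-generation processor masks tokens the guiding FSM disallows. The sketch below is a minimal pure-Python illustration of that pattern (no torch; the `fsm` interface with `allowed_tokens` and `next_state` is hypothetical), not this PR's actual implementation:

```python
import math


class RegexLogitsProcessor:
    """Minimal sketch of a masking logits processor.

    `fsm` is a hypothetical guide object exposing
    allowed_tokens(state) and next_state(state, token).
    """

    def __init__(self, fsm):
        self.fsm = fsm
        self.state = 0  # FSM start state

    def __call__(self, input_ids, scores):
        # Advance the FSM over the most recently generated token, if any.
        # Guarding on emptiness is what avoids the empty-input_ids failure.
        if input_ids:
            self.state = self.fsm.next_state(self.state, input_ids[-1])
        allowed = set(self.fsm.allowed_tokens(self.state))
        # Set disallowed token scores to -inf so they are never sampled.
        return [s if i in allowed else -math.inf
                for i, s in enumerate(scores)]
```

In the real integration the same logic operates on torch tensors, but the control flow is the same: advance the guide, then mask the score vector.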

Proposed Additional Work

We have four logits processor implementations that do the same thing. They have drifted apart, and the differences in their details keep growing. We should consolidate them into a single logits processor used by all models, so that a bug fix or enhancement lands once and applies everywhere.

This WIP PR shows a generalized logits_processors.py which aims to be compatible with all inference engines, along with a modified generate/regex.py incorporating the new single logits processor implementation.
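One way to structure the unification described above is an engine-agnostic core that owns the FSM state, plus a thin adapter per backend that only converts argument types. The class and method names below are hypothetical sketches, not the code in logits_processors.py:

```python
class BaseLogitsProcessor:
    """Engine-agnostic core: tracks the FSM state and computes
    the allowed-token set. The `fsm` interface is hypothetical."""

    def __init__(self, fsm):
        self.fsm = fsm
        self.state = 0

    def allowed(self, generated_tokens):
        # Advance the guide over the last token, then ask what may follow.
        if generated_tokens:
            self.state = self.fsm.next_state(self.state,
                                             generated_tokens[-1])
        return set(self.fsm.allowed_tokens(self.state))


class TransformersAdapter(BaseLogitsProcessor):
    """Per-engine adapter: transformers passes (input_ids, scores),
    other engines pass differently shaped arguments. Each adapter
    only translates arguments; the masking logic lives in the base."""

    def __call__(self, input_ids, scores):
        allowed = self.allowed(list(input_ids))
        return [s if i in allowed else float("-inf")
                for i, s in enumerate(scores)]
```

With this shape, a bug fix in `BaseLogitsProcessor.allowed` propagates to every backend automatically, which is the point of the consolidation.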

@lapp0 lapp0 force-pushed the fix-806 branch 2 times, most recently from 62cc80e to 1ad23d6 Compare May 29, 2024 09:28

lapp0 commented Jun 23, 2024

closed in favor of #966

@lapp0 lapp0 closed this Jun 23, 2024
Labels: enhancement, transformers (linked to the `transformers` integration)
Development

Successfully merging this pull request may close these issues.

Update the transformers integration
2 participants