
[Llama + Mistral] Add attention dropout #27315

Merged
merged 10 commits into main from llama-nit on Nov 13, 2023

Conversation

ArthurZucker (Collaborator) commented Nov 6, 2023

What does this PR do?

fixes #26616 by adding an attention_dropout attribute to the config and the corresponding dropout logic to the modeling code.
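For context, here is a minimal sketch of the pattern this change describes: a config-driven attention_dropout probability applied to the post-softmax attention weights, active only in training mode. Class and argument names are illustrative, not the exact transformers diff.

```python
import torch
import torch.nn as nn

class SimpleAttention(nn.Module):
    """Toy attention block illustrating config-driven attention dropout."""

    def __init__(self, hidden_size: int, num_heads: int, attention_dropout: float = 0.0):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = hidden_size // num_heads
        self.attention_dropout = attention_dropout  # the new config attribute
        self.q_proj = nn.Linear(hidden_size, hidden_size, bias=False)
        self.k_proj = nn.Linear(hidden_size, hidden_size, bias=False)
        self.v_proj = nn.Linear(hidden_size, hidden_size, bias=False)
        self.o_proj = nn.Linear(hidden_size, hidden_size, bias=False)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        bsz, seq_len, _ = hidden_states.shape
        q = self.q_proj(hidden_states).view(bsz, seq_len, self.num_heads, self.head_dim).transpose(1, 2)
        k = self.k_proj(hidden_states).view(bsz, seq_len, self.num_heads, self.head_dim).transpose(1, 2)
        v = self.v_proj(hidden_states).view(bsz, seq_len, self.num_heads, self.head_dim).transpose(1, 2)

        attn_weights = (q @ k.transpose(2, 3)) / (self.head_dim ** 0.5)
        attn_weights = nn.functional.softmax(attn_weights, dim=-1)
        # The core of the change: dropout on the attention probabilities,
        # only applied while the module is in training mode.
        attn_weights = nn.functional.dropout(
            attn_weights, p=self.attention_dropout, training=self.training
        )
        attn_output = (attn_weights @ v).transpose(1, 2).reshape(bsz, seq_len, -1)
        return self.o_proj(attn_output)
```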

HuggingFaceDocBuilderDev commented Nov 6, 2023

The documentation is not available anymore as the PR was closed or merged.

@ArthurZucker ArthurZucker changed the title [Llama] Add attention dorpout [Llama] Add attention dropout Nov 9, 2023
@ArthurZucker ArthurZucker marked this pull request as ready for review November 9, 2023 12:29
@ArthurZucker ArthurZucker changed the title [Llama] Add attention dropout [Llama + Mistral] Add attention dropout Nov 9, 2023
amyeroberts (Collaborator) left a comment


Thanks for adding!

@ArthurZucker ArthurZucker merged commit 210e38d into main Nov 13, 2023
19 checks passed
@ArthurZucker ArthurZucker deleted the llama-nit branch November 13, 2023 13:51
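With the PR merged, enabling the new behavior is a config-level change. A hedged usage example, assuming a transformers version that includes this PR (the attribute defaults to 0.0, so existing checkpoints are unaffected):

```python
from transformers import LlamaConfig, LlamaForCausalLM

# Assumes a transformers release containing this PR; attention_dropout
# defaults to 0.0, which preserves the previous (no-dropout) behavior.
config = LlamaConfig(attention_dropout=0.1)
model = LlamaForCausalLM(config)

model.train()  # dropout active on the attention weights
model.eval()   # dropout disabled
```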
EduardoPach pushed a commit to EduardoPach/transformers that referenced this pull request Nov 19, 2023
* add droppouts
* add the dropout
* add doc in the config
* nits
* fix mistral config
* nits
Development

Successfully merging this pull request may close these issues.

Any way we can get dropout added to modeling_llama.py?
3 participants