
Mistral example fail ValueError: You are attempting to perform batched generation with padding_side='right' #677

Closed
kvikk opened this issue Oct 5, 2023 · 10 comments
Labels
bug Something isn't working

Comments

kvikk commented Oct 5, 2023

Please check that this issue hasn't been reported before.

  • I searched previous bug reports and didn't find any similar reports.

Expected Behavior

Training should work.

Current behaviour

Running the example fails with:
File "/home/axolotl/env/lib/python3.10/site-packages/transformers/models/mistral/modeling_mistral.py", line 882, in forward
    raise ValueError(
ValueError: You are attempting to perform batched generation with padding_side='right' this may lead to unexpected behaviour for Flash Attention version of Mistral. Make sure to call tokenizer.padding_side = 'left' before tokenizing the input.
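For context, the check that raises this error can be approximated by the sketch below. This is a simplified illustration, not the actual modeling_mistral.py code; `check_flash_attention_padding`, `batch_size`, and `padding_side` are stand-ins for the values the model inspects at forward time. The fix the message asks for is to set `tokenizer.padding_side = "left"` before tokenizing.

```python
def check_flash_attention_padding(batch_size: int, padding_side: str) -> None:
    """Approximates the guard in transformers' Mistral model: batched
    generation with right padding is rejected when the Flash Attention
    implementation is in use."""
    if batch_size > 1 and padding_side == "right":
        raise ValueError(
            "You are attempting to perform batched generation with "
            "padding_side='right', this may lead to unexpected behaviour for "
            "Flash Attention version of Mistral. Make sure to call "
            "tokenizer.padding_side = 'left' before tokenizing the input."
        )

# Left padding (or an unbatched input) passes the guard:
check_flash_attention_padding(batch_size=4, padding_side="left")
check_flash_attention_padding(batch_size=1, padding_side="right")
```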

Steps to reproduce

On the main branch (43856c0), install and run example
accelerate launch -m axolotl.cli.train examples/mistral/config.yml

req.txt

Config yaml

examples/mistral/config.yml

Possible solution

No response

Which Operating Systems are you using?

  • Linux
  • macOS
  • Windows

Python Version

3.10

axolotl branch-commit

main/43856c0a393fb7c4c44c56dc1a35ab7bc4bd52fd

Acknowledgements

  • My issue title is concise, descriptive, and in title casing.
  • I have searched the existing issues to make sure this bug has not been reported yet.
  • I am using the latest version of axolotl.
  • I have provided enough information for the maintainers to reproduce and diagnose the issue.
kvikk added the bug label Oct 5, 2023
NanoCode012 (Collaborator)

Fixed in #681. Please let me know if the issue still arises.

wmiller commented Oct 5, 2023

I'm still seeing this issue after the merge. The mistral example trains without a problem after altering it to use qlora, but when I switch datasets from the original alpaca_2k_test to a local sharegpt-formatted dataset, I get the error again.


kvikk commented Oct 6, 2023

@NanoCode012 Yes, +1 @wmiller: using the sharegpt format produces the same error as before, while the example (with lora for memory reasons) works.

NanoCode012 (Collaborator) commented Oct 6, 2023

@wmiller @kvikk, if you're using a local dataset, could you try clearing your dataset_prepared_path and running again? The old dataset might have been cached.
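Clearing the cache amounts to deleting the directory that dataset_prepared_path points to, so axolotl re-tokenizes the raw dataset on the next run. A minimal sketch, assuming the path is a plain local directory; the "last_run_prepared" value below is a placeholder, so substitute whatever your config sets:

```python
import shutil
from pathlib import Path

def clear_prepared_dataset(prepared_path: str) -> None:
    # Delete the cached, pre-tokenized dataset so the next training run
    # re-processes the raw data with the current tokenizer settings.
    path = Path(prepared_path)
    if path.is_dir():
        shutil.rmtree(path)

clear_prepared_dataset("last_run_prepared")  # placeholder path; no-op if absent
```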

Secondly, is this limited to only the sharegpt format + Mistral + FA combination? I'm just trying to figure out the common fault. Is it still the padding_side issue?

NanoCode012 reopened this Oct 6, 2023

kvikk commented Oct 9, 2023

I cleaned the path, so the dataset was reprocessed. I only tried the sharegpt format (and the example with alpaca which works now). The error message is the same as above.


kvikk commented Oct 9, 2023

To reproduce, this yml can be used:

(The dataset is just a random set from HF that I found, but it is in sharegpt format. It throws a lot of "roles not alternating" warnings. Rename the file to .yml.)

mistral_test.txt

teknium1 (Contributor)

still happens, yes

PocketDocLabs (Contributor)

#728
This seems to fix it for me.

winglian (Collaborator)

@kvikk #728 has been merged and should be the correct fix.


kvikk commented Oct 17, 2023

Thank you! It does work now.

kvikk closed this as completed Oct 17, 2023