Llama 3 & Mistral LoRA Examples Error (needs `eval_sample_packing: False`)
#1644
Labels: bug (Something isn't working)
Please check that this issue hasn't been reported before.
Expected Behavior

I'm running the preprocess command shown under Steps to reproduce below. It should complete successfully, outputting the preprocessed dataset.
Current behaviour
It errors out:
Steps to reproduce

```
CUDA_VISIBLE_DEVICES="" python -m axolotl.cli.preprocess examples/llama-3/lora-8b.yml
```
Config yaml
No response
Possible solution

Set `sample_packing` to `false` in the example config? But it's explicitly set to `true` there, so not sure if something else is wrong.

Similar issue: #999
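Going by the issue title, a workaround sketch might be to leave training packing on but disable packing for the eval split, rather than turning off `sample_packing` entirely. This is an assumption based on the title, not a confirmed fix; the exact keys are taken from the issue title and the example config:

```yaml
# Hypothetical workaround for examples/llama-3/lora-8b.yml (unverified):
# keep sample packing for training, but disable it for evaluation.
sample_packing: true
eval_sample_packing: false
```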
Edit: Issue happens for `mistral/lora.yml` too.

Which Operating Systems are you using?
Python Version
3.10
axolotl branch-commit
main/22ae21a
Acknowledgements