
Respect sequence_len in config for type: llama2_chat #926

Merged · 6 commits · Dec 12, 2023

Commits on Dec 9, 2023

  1. Respect sequence_len in config for type: llama2_chat

    It was hardcoded to `4096`; I am not sure why. This updates it to pull the value from the config.

    cc: @winglian
    hamelsmu committed Dec 9, 2023
    523bca6
  2. Update llama2_chat.py

    hamelsmu committed Dec 9, 2023
    f46f72b

Commits on Dec 12, 2023

  1. apply black formatting

    hamelsmu committed Dec 12, 2023
    c59561d
  2. fix tokenizer

    hamelsmu committed Dec 12, 2023
    2db48c3
  3. update test data

    hamelsmu committed Dec 12, 2023
    44b13d3
  4. lint fixtures

    hamelsmu committed Dec 12, 2023
    8336cf4
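The fix described in the first commit (523bca6) can be sketched as follows. This is a minimal illustration, not the actual axolotl code: the `Config` dataclass and `get_max_length` function are hypothetical names, and the only claim taken from the PR is the idea of replacing a hardcoded `4096` with a value read from the user's config.

```python
from dataclasses import dataclass


@dataclass
class Config:
    """Hypothetical stand-in for the training config.

    sequence_len mirrors the axolotl config key the PR reads from;
    4096 is kept as the default, matching the previously hardcoded value.
    """

    sequence_len: int = 4096


def get_max_length(cfg: Config) -> int:
    # Before the fix: return 4096 (hardcoded, ignoring the config)
    # After the fix: respect the user's configured sequence length
    return cfg.sequence_len
```

With this change, a user who sets `sequence_len: 2048` in their config gets prompts truncated at 2048 tokens instead of the hardcoded 4096.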