
try #2: pin hf transformers and accelerate to latest release, don't reinstall pytorch #867

Merged
8 commits merged on Nov 16, 2023
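The PR title describes two dependency-management changes: pinning transformers and accelerate to exact releases, and keeping pip from reinstalling the PyTorch that is already present. A minimal sketch of one common way to do this, using a requirements file plus a pip constraints file; the version numbers below are hypothetical placeholders, not the versions pinned by this PR:

    # requirements.txt (sketch; version numbers are placeholders, not from this PR)
    transformers==4.35.2    # "latest release" at pin time; value assumed
    accelerate==0.24.1      # "latest release" at pin time; value assumed

    # constraints.txt (sketch): freeze torch to whatever is already installed,
    # so the resolver satisfies transformers' torch dependency without
    # downloading a replacement
    torch==2.1.0            # hypothetical preinstalled version

Installed with: pip install -r requirements.txt -c constraints.txt. A constraints file only caps which versions pip may pick during resolution and never installs anything itself, which is what leaves the preinstalled torch untouched.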

Commits on Nov 16, 2023

  1. 351a2d9
  2. 47fa020
  3. b103697
  4. d67b405: try w auto-gptq==0.5.1 (winglian committed Nov 16, 2023; this pin is collected in the sketch after this list)
  5. 9678eea
  6. 642b113: pin xformers to 0.0.22 (winglian committed Nov 16, 2023)
  7. bafa035: bump flash-attn to 2.3.3 (winglian committed Nov 16, 2023)
  8. aaafd34
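Taken together, commits 4, 6, and 7 pin three CUDA-sensitive packages. As pip requirement lines they would read as below; how the repository actually records them (setup.py, requirements.txt, or a Dockerfile pip install step) is not visible from this commit list, so the file layout is an assumption:

    # pins from commits 4, 6, and 7 (versions as given in the commit messages)
    auto-gptq==0.5.1
    xformers==0.0.22
    flash-attn==2.3.3

Pinning these three alongside a fixed torch is a deliberate pairing: xformers and flash-attn ship builds compiled against specific torch/CUDA versions, so they are normally installed on top of the existing torch (for example with the constraints-file approach sketched earlier) rather than being allowed to pull in a different torch of their own.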