
Brandon/flux model loading #6739

Merged: 113 commits from brandon/flux-model-loading into main on Aug 27, 2024

Conversation

@brandonrising (Collaborator) commented on Aug 12, 2024

Summary

Add FLUX support to the model manager.

QA Instructions

Install the FLUX model via the "Starter Models" tab:

  • Test fresh install and inference of all FLUX variants
    • FLUX schnell
    • FLUX schnell quantized
    • FLUX dev
    • FLUX dev quantized
  • Test quantized on 16GB GPU
  • Maybe: Test quantized on 8GB GPU
  • Test that the app still runs on CPU and MPS systems (main concerns are the pip install and accidentally importing bitsandbytes on systems where it isn't supported); see the sketch below
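
A minimal sketch of the kind of deferred, guarded bitsandbytes import that keeps CPU and MPS installs working. The helper names here are illustrative assumptions, not the actual InvokeAI code from this PR:

```python
# Illustrative sketch only -- not the PR's actual implementation.
# bitsandbytes requires CUDA, so it is imported lazily and only when a
# CUDA device is actually available.
import importlib.util

import torch


def bitsandbytes_available() -> bool:
    """True only if a CUDA device is usable and bitsandbytes is installed."""
    return torch.cuda.is_available() and importlib.util.find_spec("bitsandbytes") is not None


def get_bnb():
    """Deferred import so CPU/MPS systems never touch bitsandbytes at startup."""
    if not bitsandbytes_available():
        raise RuntimeError("Quantized FLUX models require a CUDA GPU with bitsandbytes installed.")
    import bitsandbytes as bnb

    return bnb
```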

Merge Plan

Can be merged with proper approvals

@github-actions bot added labels: python, invocations, backend, frontend (Aug 12, 2024)
@github-actions bot added labels: Root, python-deps, services (Aug 15, 2024)
@RyanJDick changed the base branch from ryan/flux to main on August 20, 2024 at 18:10
@github-actions bot added the python-tests label (Aug 20, 2024)
Review comments (since resolved) were left on: invokeai/app/invocations/fields.py, invokeai/app/invocations/flux_text_encoder.py, invokeai/backend/requantize.py, invokeai/configs/flux/flux1-schnell.yaml, and pyproject.toml
@brandonrising force-pushed the brandon/flux-model-loading branch 2 times, most recently from 5606eea to 5edec7f, on August 21, 2024 at 13:11
RyanJDick and others added 25 commits on August 26, 2024 at 19:21. One commit message notes: "…ly equivalent, but in my test VAE decoding was ~8% faster after the change."
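
For context, a hedged sketch of how such a before/after VAE decode timing might be taken. The vae and latents arguments are hypothetical placeholders, not InvokeAI objects, and this is not the measurement actually used in the PR:

```python
# Illustrative timing harness for comparing VAE decode speed before/after a
# change; `vae` and `latents` are hypothetical placeholders.
import time

import torch


def time_vae_decode(vae, latents, n_iters: int = 10) -> float:
    """Return the mean seconds per decode over n_iters runs."""
    with torch.no_grad():
        vae.decode(latents)  # warm-up run (kernel compilation, cache population)
        if torch.cuda.is_available():
            torch.cuda.synchronize()
        start = time.perf_counter()
        for _ in range(n_iters):
            vae.decode(latents)
        if torch.cuda.is_available():
            torch.cuda.synchronize()
    return (time.perf_counter() - start) / n_iters
```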
@brandonrising brandonrising merged commit 50085b4 into main Aug 27, 2024
14 checks passed
@brandonrising brandonrising deleted the brandon/flux-model-loading branch August 27, 2024 00:17
Labels
backend, CI-CD, frontend, invocations, python, python-deps, python-tests, Root, services
6 participants