fix: CUDA out of memory error on Windows #23

Open · wants to merge 1 commit into main

Conversation


@michaelgold michaelgold commented Sep 18, 2023

Added Windows installation instructions for xformers==0.0.16 and torch==1.13.1+cu117.
This resolves the CUDA out-of-memory errors on Windows.

@FurkanGozukara

I am using
torch==2.0.1+cu118
xformers==0.0.21

but I guess those could use less VRAM.

I have an RTX 3090, so 24 GB of VRAM.

@michaelgold
Author

I have an RTX 3080, and to clarify, I was only getting the out-of-memory errors without xformers installed. xformers with PyTorch 2 did not work correctly for me, but I was able to get things working with the earlier version.
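
For context on why xformers makes the difference: its memory-efficient attention kernel avoids materializing the full attention matrix that plain PyTorch attention allocates, which is typically what triggers the OOM. A minimal sketch of the call (shapes and values are illustrative, not from this project):

import torch
import xformers.ops as xops

# Illustrative shapes: (batch, sequence, heads, head_dim)
q = torch.randn(1, 4096, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(1, 4096, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(1, 4096, 8, 64, device="cuda", dtype=torch.float16)

# Computes softmax(q @ k^T / sqrt(head_dim)) @ v in tiles, without
# allocating the full 4096 x 4096 attention matrix at once.
out = xops.memory_efficient_attention(q, k, v)
print(out.shape)  # torch.Size([1, 4096, 8, 64])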

@michaelgold
Author

pip install xformers==0.0.16
# install a compatible version of PyTorch
pip install torch==1.13.1+cu117 torchvision==0.14.1+cu117 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu117
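
For what it's worth, a quick post-install sanity check (generic PyTorch/xformers calls, nothing specific to this repo):

import torch
import xformers

print(torch.__version__)          # expect 1.13.1+cu117
print(xformers.__version__)       # expect 0.0.16
print(torch.cuda.is_available())  # should print True on a working CUDA setup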

How did you install?


p80ric commented Sep 23, 2023

I had to do the following:

pip install xformers==0.0.21
pip install torch==2.0.1+cu118 torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu118

and then there was a strange conflict with libiomp5md.dll, which I resolved by removing the duplicate DLL:

Remove X:\miniconda3\envs\rerenderX\Library\bin\libiomp5md.dll
(where rerenderX is my conda environment name)
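
An alternative to deleting the DLL, for anyone who prefers not to touch the environment's files: that conflict usually comes from two copies of the Intel OpenMP runtime being loaded, and a common (if officially "unsafe") workaround is to allow the duplicate before importing torch. A sketch, assuming the standard KMP_DUPLICATE_LIB_OK escape hatch:

import os

# Must be set before torch/numpy load the OpenMP runtime; tells
# Intel OpenMP to tolerate the duplicate libiomp5md.dll copies.
os.environ["KMP_DUPLICATE_LIB_OK"] = "TRUE"

import torch  # now loads without the "OMP: Error #15" conflict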


tbxmh commented Sep 25, 2023

I am using torch==2.0.1+cu118 xformers==0.0.21

but I guess those could use less VRAM

I have an RTX 3090 so 24 GB VRAM

Can I ask if it works as long as I download and install xformers?

@s-marcelle

I am using torch==2.0.1+cu118 xformers==0.0.21

but I guess those could use less VRAM

I have an RTX 3090 so 24 GB VRAM

I look forward to your tutorials on this!

@FurkanGozukara

I have a 1-click auto installer that works amazingly well.

Hopefully I will make a tutorial too.

https://www.patreon.com/posts/1-click-auto-for-89457537

(video attachment: 6r8SWXGPgDz4jCKw.mp4)

@andypotato

So just to clarify, this can work on cards with less than 24 GB of VRAM? Like a 16 GB 4060 Ti?

@michaelgold
Author

Yes. It runs for me on a 3080.
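
If you are unsure what your card reports, here is a quick generic check (plain PyTorch, nothing project-specific):

import torch

# Reports the total VRAM PyTorch sees on the first CUDA device.
props = torch.cuda.get_device_properties(0)
print(f"{props.name}: {props.total_memory / 1024**3:.1f} GB")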
