CUDA out of Memory with 2 x 3090 RTX #72

Open
MiriamJo opened this issue May 19, 2022 · 2 comments

@MiriamJo

I wonder if something's off with my code, or if two RTX 3090s with 24 GB of VRAM each are simply not enough. I turned the batch size all the way down; however, I have massive input images, 1440x1920 in size. I also used the apex O1 optimization level and divided the sample_points by 48 instead of 24.

How can I further improve the code to save some memory? Kind regards.
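For reference, a minimal sketch of the apex O1 setup mentioned above — the model, optimizer, and criterion here are placeholders, not the repo's actual code:

```python
import torch
from apex import amp  # NVIDIA apex, installed separately

# Placeholder model/optimizer/criterion; substitute the repo's actual ones.
model = torch.nn.Conv2d(3, 8, kernel_size=3).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
criterion = torch.nn.MSELoss()

# O1 patches common ops to run in float16, which roughly halves activation memory.
model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

def train_step(images, targets):
    optimizer.zero_grad()
    loss = criterion(model(images), targets)
    # Scale the loss so float16 gradients don't underflow.
    with amp.scale_loss(loss, optimizer) as scaled_loss:
        scaled_loss.backward()
    optimizer.step()
    return loss.item()  # .item() drops the graph reference
```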

@MiriamJo
Author

Alright, I tried it with 8x 3090s with 24 GB of memory each and it still went out of memory. I really don't know what to do next. I added empty_cache and similar tweaks, but it always runs out of memory when training.
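For what it's worth, torch.cuda.empty_cache() only releases PyTorch's cached blocks back to the driver; it cannot free memory that live tensors still hold, so a retained loss tensor (with its autograd graph) is often the real culprit. A sketch of how it is typically called — the loop variables here are placeholders:

```python
import torch

def train_epoch(model, loader, optimizer, criterion):
    for images, targets in loader:
        images, targets = images.cuda(), targets.cuda()
        optimizer.zero_grad()
        loss = criterion(model(images), targets)
        loss.backward()
        optimizer.step()
        running_loss = loss.item()  # keep a Python float, not the tensor + graph
        # Drop live references first, then release the cached blocks.
        del loss, images, targets
        torch.cuda.empty_cache()
```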

@xiaoWen9246

Maybe you can lower the batch_size; I ran it successfully on a 3090 with batch_size = 3.
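As a sketch of that suggestion, with a dummy dataset standing in for the repo's real one at the issue's 1440x1920 input size:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Dummy stand-in: 12 RGB images at 1440x1920 with integer labels.
dataset = TensorDataset(
    torch.randn(12, 3, 1440, 1920),
    torch.zeros(12, dtype=torch.long),
)

# batch_size=3 is the value reported to fit on a single 3090.
loader = DataLoader(dataset, batch_size=3, shuffle=True,
                    num_workers=4, pin_memory=True)
```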
