sat-12l-sm running on GPU #120
Comments
Hi, this is a bit odd and should not be the case. Is your torch install set up properly? Did you also check with nvidia-smi after the call?
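A minimal sketch of that check: confirm PyTorch itself can see a CUDA device before moving any model to it (if `torch.cuda.is_available()` is False, the installed torch build has no working CUDA support and nothing will reach the GPU).

```python
# Sketch: verify the PyTorch/CUDA setup independently of wtpsplit.
import torch

print(torch.__version__)
print(torch.cuda.is_available())  # False means torch cannot use the GPU at all
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```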
Hi, yes, I will check. Thank you!
Can we set the CUDA index?
Should be possible, yes. It is no different from other PyTorch models in this regard.
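Since the model behaves like any other PyTorch module here, a device index can be selected with the usual `"cuda:N"` string. A minimal sketch, shown with a stand-in torch module (an `SaT` instance would be moved the same way via its `.to(...)` call):

```python
# Sketch: selecting a GPU by index, as with any PyTorch module.
# "cuda:1" targets the second GPU; fall back to CPU when it does not exist.
import torch

device = "cuda:1" if torch.cuda.device_count() > 1 else "cpu"
model = torch.nn.Linear(8, 8).to(device)
print(next(model.parameters()).device)
```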
There is a bug in …
Why is this a bug? I have been using …
Hi @bminixhofer,
I'm trying to use sat-12l-sm on GPU with the following code:

MODEL_NAME = "sat-12l-sm"
sat = SaT(MODEL_NAME)
sat.to("cuda")
However, when I run nvidia-smi in the terminal, it shows no GPU usage, so the model does not seem to be using the GPU. Could you please provide any guidance or suggestions on how to ensure that the model is actually using the GPU?
Thank you!
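One way to check placement directly from Python is to inspect where the parameters live after `.to("cuda")`. A minimal sketch with a stand-in torch module (the SaT wrapper moves its underlying torch model the same way; once parameters are on the GPU, nvidia-smi should show memory allocated by the process):

```python
# Sketch: verify device placement after .to("cuda").
import torch

model = torch.nn.Linear(16, 16)
if torch.cuda.is_available():
    model = model.to("cuda")

on_gpu = all(p.is_cuda for p in model.parameters())
print(on_gpu)  # True only when every parameter tensor lives on a CUDA device
```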