Thanks so much for this repo, and please forgive me if this is trivial. I've been trying for a little while now to run the model on Google Colab, and I'm running into two separate issues, which I think may be linked. The first is that even when I load the model in a GPU runtime, it defaults to the 'cpu' device.

After running:

```python
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model.to(device)
```
there's an error when trying to run `goemotions(texts)`:
"RuntimeError: Expected object of device type cuda but got device type cpu for argument #3 'index' in call to _th_index_select".
Second, when trying to run `goemotions` over more than a few thousand rows on Colab, even in a high-RAM runtime, I run into an out-of-memory error. I'm wondering if this is a problem with batching in the data loader? I'll be looking for solutions and hope to close this issue myself, but in the meantime any help is much appreciated, thanks!
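In case it helps anyone hitting the same wall, here is a hedged sketch of how I'd chunk the calls, assuming `classify` stands in for the goemotions pipeline (its real batching behavior may differ). Wrapping inference in `torch.no_grad()` stops autograd from retaining activations, which is a common cause of memory blow-ups when looping over thousands of rows:

```python
import torch

def predict_in_batches(texts, classify, batch_size=32):
    """Run a classifier over a large list in fixed-size chunks.

    `classify` is a placeholder for the goemotions pipeline (an
    assumption, not its actual API). torch.no_grad() disables gradient
    bookkeeping, so activations are freed as soon as each batch is done.
    """
    results = []
    with torch.no_grad():
        for start in range(0, len(texts), batch_size):
            results.extend(classify(texts[start:start + batch_size]))
    return results
```

Lowering `batch_size` trades speed for peak memory, so it can be tuned to whatever the Colab runtime tolerates.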