
Activating logger but still saving model #10

Open
florpi opened this issue May 20, 2021 · 1 comment

Comments

florpi commented May 20, 2021

Hi! I'm very happy to see an implementation of gpytorch together with pytorch lightning :D Thanks for making it publicly available.

I was wondering whether you had to deal with this error when turning on the logger and saving the model hyperparameters:


  File "/cosma/home/dp004/dc-cues1/.local/lib/python3.7/site-packages/torch/utils/tensorboard/summary.py", line 192, in hparams
    ssi.hparams[k].number_value = v
TypeError: array([-1.7629994 ,  0.74281293,  0.3584844 , -0.03555403, -1.3293115 ,
        1.0728952 ,  0.      has type numpy.ndarray, but expected one of: int, long, float

I guess this comes from storing the training data together with the model. Do you have any idea how to solve it, or can you think of a way around it? I love callbacks, and not being able to use the logger at all is quite annoying.

acxz (Owner) commented May 20, 2021

Glad you are finding this useful! I have been meaning to make a tutorial about this on the GPyTorch repo for a while, but just haven't gotten around to it.

I think I have seen this error before, and not just with gpytorch models. The issue arises because hparams can't store arrays. There is a PR that actually deprecates hparams in favor of another system (Lightning-AI/pytorch-lightning#1896). I'm not sure whether the documentation has been updated, but the workaround is probably just to not use hparams. It has been a while, and my current code in this repo might not align with best PyTorch Lightning practices. Sorry I can't help more up front, but let's keep this issue open for the time being.
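One way to sketch the workaround: the traceback comes from TensorBoard's hparams plugin, which only accepts scalar values (int, float, str, bool), so any array-valued entry (like training data captured by `save_hyperparameters()`) has to be kept out of the logged hyperparameters. A hypothetical helper (not from this repo) that filters a hparams dict down to logger-safe scalars might look like:

```python
def scalar_hparams(hparams):
    """Return a copy of hparams containing only logger-safe scalar values.

    TensorBoard's hparams plugin rejects anything that isn't an
    int, float, str, or bool, which is what triggers the TypeError
    above when a numpy array sneaks in.
    """
    allowed = (int, float, str, bool)
    return {k: v for k, v in hparams.items() if isinstance(v, allowed)}


hparams = {
    "learning_rate": 1e-3,           # kept: float
    "num_epochs": 100,               # kept: int
    "train_x": [-1.76, 0.74, 0.36],  # dropped: array-like breaks hparams
}

print(scalar_hparams(hparams))
# {'learning_rate': 0.001, 'num_epochs': 100}
```

Alternatively, if I recall correctly, recent versions of Lightning let you exclude specific constructor arguments from being saved at all, e.g. `self.save_hyperparameters(ignore=["train_x", "train_y"])` (argument names here are just placeholders for whatever your module takes), which would keep the training data out of the logger in the first place.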
