🐛 Bug
The Wandb logger does not flatten hyperparameters before logging, so nested dictionaries are sent to Wandb as raw dicts. Dict-valued config entries are not searchable in the Wandb UI, which loses some of Wandb's features.
To Reproduce
Run the cpu_template with the wandb logger and log a nested dictionary of hyperparameters, for example as sketched below.
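A minimal repro sketch (the project name and hyperparameter names here are illustrative, not from the original report; offline=True avoids needing a wandb account):

```python
from pytorch_lightning.loggers import WandbLogger

logger = WandbLogger(project="flatten-repro", offline=True)

# Nested hyperparameters: these currently reach Wandb as a raw dict,
# so the inner keys ("lr", "weight_decay") are not searchable in the UI.
logger.log_hyperparams({"optimizer": {"lr": 1e-3, "weight_decay": 1e-4}})
```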
Expected behavior
The Wandb logger should flatten the dictionary of parameters before logging. Every other logger already follows the pattern below; the fix is simply to add the same call to the wandb logger:

params = self._flatten_dict(params)
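For context, a hedged sketch of what the patched method could look like. The surrounding method body is paraphrased and may differ slightly in this pytorch-lightning version; `_convert_params` and `_flatten_dict` are helpers inherited from `LightningLoggerBase`, and `_flatten_dict` joins nested keys with "/" by default:

```python
# Sketch of the patch, inside pytorch_lightning/loggers/wandb.py
# (WandbLogger class; rank_zero_only is already imported in that module):

@rank_zero_only
def log_hyperparams(self, params):
    params = self._convert_params(params)
    # The missing call: flatten nested dicts so each hyperparameter becomes
    # a searchable top-level key in the Wandb run config, e.g.
    # {"optimizer": {"lr": 0.001}} -> {"optimizer/lr": 0.001}.
    params = self._flatten_dict(params)
    self.experiment.config.update(params)
```

With this in place, nested configs show up as flat, filterable keys in the Wandb runs table, matching the behavior of the other loggers.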
Environment
CUDA:
  GPU:
  available: False
  version: None
Packages:
  numpy: 1.18.5
  pyTorch_debug: False
  pyTorch_version: 1.5.0
  pytorch-lightning: 0.8.4
  tensorboard: 2.2.2
  tqdm: 4.46.1
System:
  OS: Darwin
  architecture: 64bit
  processor: i386
  python: 3.7.7
  version: Darwin Kernel Version 19.4.0: Wed Mar 4 22:28:40 PST 2020; root:xnu-6153.101.6~15/RELEASE_X86_64