Copy wandb param dict before training to avoid overwrites (#7317)
* Copy wandb param dict before training to avoid overwrites.

Copy the hyperparameter dict retrieved from the wandb configuration before passing it to `train()`. Training overwrites parameters in the dictionary (e.g. scaling the obj/box/cls gains), so the values reported in wandb no longer match the input values. This makes it hard to reproduce a run and also throws off wandb's Bayesian sweep algorithm.
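A minimal sketch of the aliasing problem described above, using a stand-in `train()` and made-up gain names (not the actual YOLOv5 code): when the dict is passed by reference, in-place scaling inside training mutates the caller's copy, so the values later reported diverge from what the sweep agent supplied.

```python
def train(hyp):
    # Stand-in for YOLOv5's train(), which rescales gains in place.
    hyp["box"] = hyp["box"] * 2


# Without a copy: train() mutates the sweep agent's dict.
hyp_dict = {"box": 0.05, "cls": 0.5}
train(hyp_dict)
mutated = hyp_dict["box"]  # no longer the input value 0.05

# With a copy (the fix): the original values survive for reporting.
hyp_dict = {"box": 0.05, "cls": 0.5}
train(hyp_dict.copy())
preserved = hyp_dict["box"]  # still 0.05
```

Passing `hyp_dict.copy()` is enough here because only top-level keys are rebound; the original dict keeps the values the sweep agent chose.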

* Cleanup

Co-authored-by: Glenn Jocher <glenn.jocher@ultralytics.com>
n1mmy and glenn-jocher committed Apr 6, 2022
1 parent 245d645 commit a88a814
Showing 1 changed file with 2 additions and 2 deletions: utils/loggers/wandb/sweep.py
@@ -16,8 +16,8 @@

 def sweep():
     wandb.init()
-    # Get hyp dict from sweep agent
-    hyp_dict = vars(wandb.config).get("_items")
+    # Get hyp dict from sweep agent. Copy because train() modifies parameters which confused wandb.
+    hyp_dict = vars(wandb.config).get("_items").copy()
 
     # Workaround: get necessary opt args
     opt = parse_opt(known=True)
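Note that `.copy()` is a shallow copy, which suffices when the dict values are scalars (as the hyperparameter gains are). A sketch with hypothetical keys showing the limit of that choice: nested objects remain shared, and `copy.deepcopy` would be needed to decouple them too.

```python
import copy

hyp = {"lr0": 0.01, "anchors": [10, 13]}  # hypothetical hyperparameters

shallow = hyp.copy()
shallow["lr0"] = 0.02          # rebinding a scalar does not touch the original
shallow["anchors"].append(16)  # but a nested list is still shared

deep = copy.deepcopy(hyp)      # deepcopy decouples nested objects as well
deep["anchors"].append(30)     # does not affect hyp["anchors"]
```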
