Dynamic normalization layer selection (#7392)
* Dynamic normalization layer selection

Based on the actually available layers. Torch 1.7 compatible; resolves #7381

* Update train.py
glenn-jocher committed Apr 12, 2022
1 parent fa569cd commit 4bb7eb8
Showing 1 changed file with 1 addition and 1 deletion.

train.py
@@ -151,7 +151,7 @@ def train(hyp, opt, device, callbacks):  # hyp is path/to/hyp.yaml or hyp dictionary
     LOGGER.info(f"Scaled weight_decay = {hyp['weight_decay']}")

     g = [], [], []  # optimizer parameter groups
-    bn = nn.BatchNorm2d, nn.LazyBatchNorm2d, nn.GroupNorm, nn.InstanceNorm2d, nn.LazyInstanceNorm2d, nn.LayerNorm
+    bn = tuple(v for k, v in nn.__dict__.items() if 'Norm' in k)  # normalization layers, i.e. BatchNorm2d()
     for v in model.modules():
         if hasattr(v, 'bias') and isinstance(v.bias, nn.Parameter):  # bias
             g[2].append(v.bias)
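The fix replaces a hard-coded tuple of normalization classes (which fails on Torch 1.7, where the `Lazy*` variants do not exist) with a lookup over the module namespace. A minimal stdlib-only sketch of why this is version-robust; the `nn_new`/`nn_old` namespaces and the stub classes below are hypothetical stand-ins for `torch.nn` under different PyTorch versions:

```python
from types import SimpleNamespace

# Stub classes standing in for torch.nn layer types (hypothetical)
class BatchNorm2d: pass
class LazyBatchNorm2d: pass
class GroupNorm: pass
class Conv2d: pass

# Newer torch exposes Lazy* normalization layers; torch 1.7 does not
nn_new = SimpleNamespace(BatchNorm2d=BatchNorm2d, LazyBatchNorm2d=LazyBatchNorm2d,
                         GroupNorm=GroupNorm, Conv2d=Conv2d)
nn_old = SimpleNamespace(BatchNorm2d=BatchNorm2d, GroupNorm=GroupNorm, Conv2d=Conv2d)

def norm_classes(nn):
    # Collect every class whose name contains 'Norm' from the namespace,
    # so the tuple adapts to whichever layers this torch version provides
    return tuple(v for k, v in vars(nn).items() if 'Norm' in k)

print(sorted(c.__name__ for c in norm_classes(nn_new)))
# -> ['BatchNorm2d', 'GroupNorm', 'LazyBatchNorm2d']
print(sorted(c.__name__ for c in norm_classes(nn_old)))
# -> ['BatchNorm2d', 'GroupNorm']  (no AttributeError on the missing Lazy layer)
```

The resulting tuple is then used exactly like the old hard-coded one, e.g. `isinstance(v, bn)` when sorting model parameters into weight-decay groups; referencing classes by name-match rather than by attribute access is what makes the code degrade gracefully on older torch versions.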
