Fix lp layernorm weight (#2954)
* lp layernorm weight fix

* Update composer/algorithms/low_precision_layernorm/low_precision_layernorm.py

Co-authored-by: Mihir Patel <mihir.v.patel7@gmail.com>

---------

Co-authored-by: Brian <23239305+b-chu@users.noreply.github.com>
Co-authored-by: Mihir Patel <mihir.v.patel7@gmail.com>
3 people committed Feb 1, 2024
1 parent 58c69c6 commit 3c14906
Showing 1 changed file with 1 addition and 1 deletion.
composer/algorithms/low_precision_layernorm/low_precision_layernorm.py

```diff
@@ -143,7 +143,7 @@ def _to_LPLayerNorm(layer: torch.nn.Module, module_index: int) -> LPLayerNorm:
     lp_layernorm = LPLayerNorm(layer.normalized_shape, layer.eps, layer.elementwise_affine)
 
     with torch.no_grad():
-        if hasattr(layer, 'weight'):
+        if layer.weight is None:  # pyright: ignore[reportUnnecessaryComparison]
            lp_layernorm.register_parameter('weight', None)
        else:
            lp_layernorm.weight.copy_(layer.weight)  # type: ignore
```
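A minimal sketch of why the old condition was a bug, assuming standard `torch.nn.LayerNorm` behavior: the module defines a `weight` attribute even when `elementwise_affine=False` (it is registered as `None`), so `hasattr(layer, 'weight')` is `True` in both cases and cannot detect a missing affine weight. Checking `layer.weight is None` does distinguish them.

```python
import torch

# LayerNorm with and without learnable affine parameters.
affine = torch.nn.LayerNorm(8, elementwise_affine=True)
plain = torch.nn.LayerNorm(8, elementwise_affine=False)

# The attribute exists on both modules, so hasattr() is always True...
print(hasattr(affine, 'weight'), hasattr(plain, 'weight'))  # True True

# ...but only the non-affine module has weight registered as None.
print(affine.weight is None, plain.weight is None)  # False True
```

With the old `hasattr` check, the branch that calls `register_parameter('weight', None)` was taken unconditionally, so the replacement `LPLayerNorm` never received the original layer's weight; the `is None` check routes affine layers to the `copy_` branch instead.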
