Commit

Fix unfreeze_and_add_param_group expects `modules` rather than `module`
sadiqj authored and kaushikb11 committed Apr 6, 2021
1 parent cd997d6 commit c92f84a
Showing 1 changed file with 1 addition and 1 deletion.
pytorch_lightning/callbacks/finetuning.py (1 addition, 1 deletion)
@@ -77,7 +77,7 @@ def finetune_function(self, pl_module, current_epoch, optimizer, optimizer_idx):
     # When `current_epoch` is 10, feature_extractor will start training.
     if current_epoch == self._unfreeze_at_epoch:
         self.unfreeze_and_add_param_group(
-            module=pl_module.feature_extractor,
+            modules=pl_module.feature_extractor,
             optimizer=optimizer,
             train_bn=True,
         )
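
For reference, the hunk above edits a usage example in finetuning.py. Below is a minimal, self-contained sketch of the kind of BaseFinetuning callback that example describes, assuming a LightningModule that exposes a feature_extractor submodule; the class name, the constructor, and the freeze_before_training hook shown here are illustrative additions and are not part of this diff. The point of the fix is that unfreeze_and_add_param_group takes the keyword modules (plural), not module.

    # Sketch only: assumes `pl_module.feature_extractor` exists on the LightningModule.
    from pytorch_lightning.callbacks import BaseFinetuning


    class FeatureExtractorFreezeUnfreeze(BaseFinetuning):

        def __init__(self, unfreeze_at_epoch=10):
            super().__init__()
            self._unfreeze_at_epoch = unfreeze_at_epoch

        def freeze_before_training(self, pl_module):
            # Freeze the feature extractor before training starts.
            self.freeze(pl_module.feature_extractor)

        def finetune_function(self, pl_module, current_epoch, optimizer, optimizer_idx):
            # When `current_epoch` reaches the configured epoch, unfreeze the
            # feature extractor and add its parameters to the optimizer.
            # Note the keyword is `modules` (plural), as corrected in this commit.
            if current_epoch == self._unfreeze_at_epoch:
                self.unfreeze_and_add_param_group(
                    modules=pl_module.feature_extractor,
                    optimizer=optimizer,
                    train_bn=True,
                )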
