
Commit

Fix(model): Linear detected and added to target module with rope linear (#738)

* Fix(model): Linear detected and added to target module with rope linear

* fix: exclude layer instead
NanoCode012 authored Oct 19, 2023
1 parent 992d57f commit 440c3ab
Showing 1 changed file with 5 additions and 1 deletion.
src/axolotl/utils/models.py (5 additions, 1 deletion)
@@ -507,7 +507,11 @@ def find_all_linear_names(model):
     cls = (bnb.nn.Linear4bit, bnb.nn.Linear8bitLt, torch.nn.Linear, QuantLinear)
     lora_module_names = set()
     for name, module in model.named_modules():
-        if isinstance(module, cls) or "Linear" in module.__class__.__name__:
+        if (
+            isinstance(module, cls)
+            or "Linear" in module.__class__.__name__
+            and module.__class__.__name__ not in ("LlamaLinearScalingRotaryEmbedding",)
+        ):
             names = name.split(".")
             lora_module_names.add(names[0] if len(names) == 1 else names[-1])
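
The fix relies on Python operator precedence: `and` binds more tightly than `or`, so modules that genuinely are one of the `cls` linear types are always collected, while the looser name-based check ("Linear" appearing in the class name) is now gated by the exclusion tuple. That keeps LlamaLinearScalingRotaryEmbedding, whose class name contains "Linear" only because of the rope linear-scaling variant, out of the LoRA target modules.

Below is a minimal, self-contained sketch of the behaviour, assuming only torch is installed; the bitsandbytes and QuantLinear types from the real function are omitted, and TinyModel plus the stand-in rotary-embedding class are hypothetical, introduced only to illustrate the check.

import torch
import torch.nn as nn


class LlamaLinearScalingRotaryEmbedding(nn.Module):
    """Stand-in for the HF rotary-embedding class whose name contains 'Linear'."""


class TinyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.q_proj = nn.Linear(8, 8)                           # a real linear layer
        self.rotary_emb = LlamaLinearScalingRotaryEmbedding()   # not a linear layer


def find_all_linear_names(model):
    cls = (torch.nn.Linear,)  # the real function also checks bnb and GPTQ linear types
    lora_module_names = set()
    for name, module in model.named_modules():
        # `and` binds tighter than `or`: the exclusion only applies to the
        # name-based "Linear" match, never to a true isinstance match.
        if (
            isinstance(module, cls)
            or "Linear" in module.__class__.__name__
            and module.__class__.__name__ not in ("LlamaLinearScalingRotaryEmbedding",)
        ):
            names = name.split(".")
            lora_module_names.add(names[0] if len(names) == 1 else names[-1])
    return lora_module_names


print(find_all_linear_names(TinyModel()))  # {'q_proj'}; with the old check, 'rotary_emb' was picked up too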

