Commit

Be explicit with kwargs
winglian committed Sep 26, 2023
1 parent 68b57b5 commit 367d8ec
Showing 1 changed file with 5 additions and 5 deletions.
10 changes: 5 additions & 5 deletions src/axolotl/monkeypatch/llama_attn_hijack_flash.py
@@ -530,11 +530,11 @@ def custom_forward(*inputs):
                 layer_outputs = torch.utils.checkpoint.checkpoint(
                     create_custom_forward(decoder_layer),
                     hidden_states,
-                    attention_mask,
-                    position_ids,
-                    None,
-                    cu_seqlens,
-                    max_seqlen,
+                    attention_mask=attention_mask,
+                    position_ids=position_ids,
+                    use_cache=None,
+                    cu_seqlens=cu_seqlens,
+                    max_seqlen=max_seqlen,
                 )
             else:
                 layer_outputs = decoder_layer(
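The change replaces positional arguments with explicit keyword arguments in the checkpointed forward call, so each value is bound to the parameter it was meant for rather than to whatever slot it happens to occupy. A minimal sketch of the pitfall, using a hypothetical decoder-layer-style signature (for illustration only, not the actual axolotl code):

```python
# Hypothetical forward with several optional parameters, in the style of a
# decoder layer's signature (illustrative; not the real method).
def forward(hidden_states, attention_mask=None, position_ids=None,
            past_key_value=None, use_cache=None, cu_seqlens=None,
            max_seqlen=None):
    # Return the bindings so we can inspect where each argument landed.
    return {
        "attention_mask": attention_mask,
        "position_ids": position_ids,
        "past_key_value": past_key_value,
        "use_cache": use_cache,
        "cu_seqlens": cu_seqlens,
        "max_seqlen": max_seqlen,
    }

# Positional call: the bare None silently fills `past_key_value`, and every
# argument after it shifts one slot to the right.
positional = forward("h", "mask", "pos", None, "cu", "max")

# Keyword call: each value reaches its intended parameter regardless of the
# order parameters are declared in the signature.
keyword = forward("h", attention_mask="mask", position_ids="pos",
                  use_cache=None, cu_seqlens="cu", max_seqlen="max")
```

With the positional call, `cu_seqlens` ends up holding `"max"` and `max_seqlen` is left as `None`; the keyword call binds everything correctly. That silent shifting is exactly what explicit kwargs guard against when a signature gains or reorders parameters.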
