Commit

just pass args into torch checkpoint
winglian committed Sep 26, 2023
1 parent bb87204 commit 8076bba
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions in src/axolotl/monkeypatch/llama_attn_hijack_flash.py

@@ -520,9 +520,6 @@ def custom_forward(*inputs):
                         # None for past_key_value
                         return module(
                             *inputs,
-                            past_key_value, # pylint: disable=(cell-var-from-loop)
-                            output_attentions,
-                            padding_mask=padding_mask,
                         )

                     return custom_forward
@@ -532,7 +529,10 @@ def custom_forward(*inputs):
                     hidden_states,
                     attention_mask,
                     position_ids,
+                    past_key_value,
+                    output_attentions,
                     None,
+                    padding_mask,
                     cu_seqlens,
                     max_seqlen,
                 )
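The `pylint: disable=(cell-var-from-loop)` on the removed line hints at why this change passes the values to the checkpoint call as explicit arguments instead of capturing them inside `custom_forward`: a closure created in a loop sees the loop variable's value at call time, not at creation time, and (for torch's activation checkpointing) only values passed as explicit arguments are treated as inputs when the function is re-run during backward. A minimal, stdlib-only sketch of the capture pitfall, using hypothetical names rather than the axolotl code:

```python
# Late-binding closure pitfall that pylint flags as cell-var-from-loop.
# Hypothetical illustration; not the axolotl code itself.

def make_forwards_buggy(layers):
    fns = []
    for layer in layers:
        # `layer` is captured by reference: every closure ends up
        # seeing the loop variable's final value.
        fns.append(lambda x: (layer, x))
    return fns

def make_forwards_fixed(layers):
    # Binding the loop variable as a default argument freezes it
    # per iteration, analogous to passing the values into the
    # checkpoint call as explicit arguments.
    return [lambda x, layer=layer: (layer, x) for layer in layers]

buggy = make_forwards_buggy(["a", "b", "c"])
fixed = make_forwards_fixed(["a", "b", "c"])
print([f(0)[0] for f in buggy])  # every closure saw the last layer
print([f(0)[0] for f in fixed])  # each closure keeps its own layer
```

The same reasoning applies to tensors: values captured in the closure are invisible to the checkpointing machinery, while explicit arguments are saved and replayed correctly.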
