Remove xpos and llama-landmark
joecummings committed Jan 11, 2024
1 parent 3019d1f commit cb08b2d
Showing 1 changed file with 0 additions and 7 deletions.
README.md (0 additions, 7 deletions)
@@ -825,15 +825,8 @@ flash_attn_fuse_mlp: # Whether to fuse part of the MLP into a single operation
 # Whether to use scaled-dot-product attention
 # https://pytorch.org/docs/stable/generated/torch.nn.functional.scaled_dot_product_attention.html
 sdp_attention:
-# Landmark attention (only llama)
-landmark_attention:
 # Shifted-sparse attention (only llama)
 s2_attention:
-
-# xpos RoPE see https://github.com/kaiokendev/cutoff-len-is-context-len/blob/main/util/xpos_rope_llama_monkey_patch.py
-# LLaMA only
-xpos_rope:
-
 # Resume from a specific checkpoint dir
 resume_from_checkpoint:
 # If resume_from_checkpoint isn't set and you simply want it to start where it left off.
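
For context, these keys belong to the YAML training config documented by this README. A minimal sketch of how the surviving options might read after this commit, assuming the usual boolean/path values for these keys; the specific values below are illustrative placeholders, not part of the change:

# Sketch of the remaining documented options (values are illustrative assumptions)
# Scaled-dot-product attention, per the README entry above
sdp_attention: true
# Shifted-sparse attention (only llama)
s2_attention: true
# Resume from a specific checkpoint dir (placeholder path, not from the commit)
resume_from_checkpoint: ./outputs/checkpoint-500
# landmark_attention: and xpos_rope: are no longer documented after this commit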