Commit 832464f

handle key padding mask directly passed into Attend

lucidrains committed Jul 6, 2024
1 parent 65e9d36

Showing 2 changed files with 6 additions and 1 deletion.

setup.py (1 addition, 1 deletion)

@@ -3,7 +3,7 @@
 setup(
   name = 'x-transformers',
   packages = find_packages(exclude=['examples']),
-  version = '1.31.8',
+  version = '1.31.9',
   license='MIT',
   description = 'X-Transformers - Pytorch',
   author = 'Phil Wang',

x_transformers/attend.py (5 additions, 0 deletions)

@@ -268,6 +268,11 @@ def forward(

         causal = self.causal
 
+        # handle key padding mask
+
+        if exists(mask) and mask.ndim == 2:
+            mask = rearrange(mask, 'b j -> b 1 1 j')
+
         # handle kv cached decoding
 
         if n == 1 and causal:
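
For context, here is a minimal sketch (not part of this commit) of what the change enables: a 2D boolean key padding mask of shape (batch, kv_seq_len) can now be passed straight into Attend, which broadcasts it to (batch, 1, 1, kv_seq_len) via the rearrange above, instead of the caller reshaping it by hand. The constructor argument and the (out, intermediates) return pair are assumptions based on the library's existing Attend interface.

    import torch
    from x_transformers.attend import Attend

    attend = Attend(causal = False)  # constructor args assumed from the library's defaults

    b, h, n, d = 2, 8, 16, 64
    q = torch.randn(b, h, n, d)
    k = torch.randn(b, h, n, d)
    v = torch.randn(b, h, n, d)

    # boolean key padding mask: True = attend, False = padded position
    # before this commit, callers had to reshape it to (b, 1, 1, j) themselves
    key_padding_mask = torch.ones(b, n, dtype = torch.bool)
    key_padding_mask[:, -4:] = False  # treat the last 4 key positions as padding

    out, intermediates = attend(q, k, v, mask = key_padding_mask)
    print(out.shape)  # torch.Size([2, 8, 16, 64])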