
Remove patch #8

Merged: 1 commit merged into flash_attention_for_rocm on Aug 16, 2023
Conversation

@groenenboomj
No description provided.

Commit: Use a version of PyTorch with the hipify changes included.
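As a quick way to confirm the premise of this commit, one can check that the PyTorch inside the docker image already ships the hipify utilities. A hedged sketch of such a check, not code from this repository:

```python
# Hedged check, assuming a ROCm build of PyTorch (1.13 or 2.0+): verify that
# the installed torch already bundles the hipify utilities, which is what
# lets this PR drop the external patch.
try:
    from torch.utils.hipify import hipify_python  # ships with modern PyTorch
    print("hipify bundled at:", hipify_python.__file__)
except ImportError:
    print("hipify not bundled; this PyTorch build may still need the patch")
```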
@sabreshao (Collaborator)

Our main branch is flash_attention_for_rocm. Could you close this PR and create a new PR to flash_attention_for_rocm?

@groenenboomj changed the base branch from main to flash_attention_for_rocm on August 8, 2023, 15:16
@groenenboomj (Author)

> Our main branch is flash_attention_for_rocm. Could you close this PR and create a new PR to flash_attention_for_rocm?

Done

@sabreshao (Collaborator) commented on Aug 9, 2023

So you added the patches inside the latest PyTorch docker, so no patch is needed anymore when building FA?
Does it work for both PyTorch 1.13 and 2.0+? Would you please update the build-instructions part of README.md?
I would prefer to add the simplified build steps while keeping the existing build steps for legacy docker images.
We can remove the legacy build steps in the future.
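For context on the simplified path asked about here: once the ROCm PyTorch image bundles the hipify changes, an extension's setup.py can pass its CUDA sources directly to torch's build machinery, which hipifies them during the build, so no source patch is required beforehand. A minimal sketch of that general pattern (the package name and source path are illustrative assumptions, not this repository's actual setup.py):

```python
# Minimal sketch of a setup.py that relies on PyTorch's bundled hipify.
# On a ROCm build of PyTorch, torch.utils.cpp_extension detects HIP and
# hipifies the listed .cu sources itself during build_ext, so no separate
# patch step is needed. Names below are hypothetical.
from setuptools import setup
from torch.utils.cpp_extension import BuildExtension, CUDAExtension

setup(
    name="flash_attn_rocm_demo",  # hypothetical package name
    ext_modules=[
        CUDAExtension(
            name="flash_attn_rocm_demo._C",
            sources=["csrc/demo_kernel.cu"],  # hypothetical source path
        )
    ],
    cmdclass={"build_ext": BuildExtension},
)
```

Under this pattern the same build instructions could plausibly cover both the 1.13 and 2.0+ images asked about above, since the hipify step moves from a repo patch into PyTorch itself.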

@groenenboomj merged commit 52427b5 into flash_attention_for_rocm on Aug 16, 2023