npu_fusion_attention error #381

Closed
WangFeng18 opened this issue Aug 6, 2024 · 0 comments
@WangFeng18
Hi, thanks a lot for the great project, but I have run into a problem with the flash attention op. When I run your code on an Ascend 910B NPU, the npu_fusion_attention op fails with the following error:

[screenshot of the npu_fusion_attention error]

Have you ever encountered this problem, and how can it be resolved?

Thanks again for the great work.
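For reference, a minimal sketch of how the op is typically invoked, assuming the error comes from a direct torch_npu.npu_fusion_attention call; the shapes, layout, and scale below are illustrative and not taken from the project's code:

```python
# Minimal reproduction sketch (assumption: error originates from a direct
# torch_npu.npu_fusion_attention call; tensor shapes are illustrative).
import torch
import torch_npu

batch, heads, seq_len, head_dim = 1, 8, 1024, 128
q = torch.randn(batch, heads, seq_len, head_dim, dtype=torch.float16).npu()
k = torch.randn(batch, heads, seq_len, head_dim, dtype=torch.float16).npu()
v = torch.randn(batch, heads, seq_len, head_dim, dtype=torch.float16).npu()

# input_layout="BNSD" matches the (batch, num_heads, seq, head_dim) tensors above;
# scale is the usual 1/sqrt(head_dim) softmax scaling for attention.
out = torch_npu.npu_fusion_attention(
    q, k, v, heads,
    input_layout="BNSD",
    scale=1.0 / head_dim ** 0.5,
)[0]
print(out.shape)
```

If a snippet like this runs cleanly on the same device, the error is more likely caused by the specific shapes, dtype, or input_layout used in the project than by the op itself.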
