【Prim】Support amp logic for layer_norm and softmax #51473
Conversation
Your PR was submitted successfully. Thank you for your contribution to this open source project!
@@ -119,7 +119,7 @@ def compare_forward(self):
             atol=attrs.get_atol("forward"),
         )

-    def test_forward(self):
+    def _test_forward(self):
debug typo?
done
@@ -34,15 +34,25 @@ def _composite(op, *args):
 @REGISTER_COMPOSITE('softmax')
 def softmax_composite(x, axis):
     """define composite rule of op softmax"""
+    is_amp = False
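For context, the AMP handling added here roughly follows the pattern below. This is a minimal sketch, not the PR's exact code: the `origin_dtype` bookkeeping and the use of public `paddle` ops in place of prim primitives are assumptions. The idea is to detect a low-precision input, upcast to float32 for the decomposed math, then cast the result back.

```python
import paddle

def softmax_composite_sketch(x, axis):
    """Illustrative decomposed softmax with AMP handling (not the PR's exact code)."""
    origin_dtype = x.dtype
    # exp/sum in float16/bfloat16 easily overflows or loses precision,
    # so under AMP the decomposition is computed in float32.
    is_amp = origin_dtype in (paddle.float16, paddle.bfloat16)
    if is_amp:
        x = paddle.cast(x, "float32")
    # Numerically stable softmax: subtract the per-axis max before exp.
    max_tmp = paddle.max(x, axis=axis, keepdim=True)
    out = paddle.exp(x - max_tmp)
    out = out / paddle.sum(out, axis=axis, keepdim=True)
    if is_amp:
        # Cast back so the composite rule matches the original output dtype.
        out = paddle.cast(out, origin_dtype)
    return out
```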
Better to record in a comment why float16 is not used directly.
done
LGTM
PR types
Others
PR changes
Others
Describe
Pcard-66975
This PR supports AMP mode for softmax and layer_norm.
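A hedged usage sketch of what this enables: running softmax and layer_norm under AMP so the composite rules take the low-precision path. These are standard paddle APIs, but actual behavior depends on the AMP white/black lists, the prim-mode switches, and running on a device with float16 support.

```python
import paddle
import paddle.nn.functional as F

x = paddle.randn([4, 8, 16])
with paddle.amp.auto_cast(dtype="float16"):
    # Under AMP the ops receive float16 inputs; the composite rules
    # upcast to float32 internally and cast the result back.
    y = F.softmax(x, axis=-1)
    z = F.layer_norm(x, normalized_shape=[16])
print(y.dtype, z.dtype)
```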