Add Half support for softmax and log_softmax on CPU #103315
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/103315. Note: Links to docs will display an error until the docs builds have completed.
✅ You can merge normally! (1 unrelated failure) As of commit 7cf7caf with merge base 2aaa7e5. FLAKY: the following job failed but was likely due to flakiness present on trunk.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
test/test_nn.py (outdated review thread)
why move it to a different place? It makes reviewing harder. :-)
Because it can't use the @dtypes decorator in its original place; see the sketch below.
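For context, a minimal sketch of why the move matters: @dtypes only works on tests inside a device-type test class that is instantiated via instantiate_device_type_tests, so a Half/float-parameterized softmax test roughly looks like the following. This is an illustrative sketch, not the actual test added in this PR; the class name, test name, and tolerances are made up.

```python
# Hypothetical sketch of a @dtypes-parameterized device-type test.
import torch
import torch.nn.functional as F
from torch.testing._internal.common_utils import TestCase, run_tests
from torch.testing._internal.common_device_type import (
    instantiate_device_type_tests,
    dtypes,
)


class TestSoftmaxHalfCPU(TestCase):
    # @dtypes only applies inside device-type test classes, which is why
    # the test had to move out of its original location.
    @dtypes(torch.half, torch.float)
    def test_softmax(self, device, dtype):
        x = torch.randn(4, 16, device=device, dtype=dtype)
        out = F.softmax(x, dim=-1)
        # Compare against a float32 reference computation.
        ref = F.softmax(x.float(), dim=-1).to(dtype)
        self.assertEqual(out, ref, atol=1e-3, rtol=1e-3)


instantiate_device_type_tests(TestSoftmaxHalfCPU, globals(), only_for="cpu")

if __name__ == "__main__":
    run_tests()
```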
@mruberry Could you please review this PR?
@mikaylagawarecki Could you please review this PR? Thanks.
Looks OK to me, other than a question about half_to_float.
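For readers following along: half_to_float is the boolean flag on the internal aten::_softmax / aten::_log_softmax ops that asks the kernel to read Half input and write a float32 output directly instead of computing everything in Half. A rough, hedged illustration from the Python side; the output dtypes shown assume a build that includes this PR, and the mapping to the internal flag is approximate.

```python
import torch
import torch.nn.functional as F

x = torch.randn(2, 5, dtype=torch.half)  # Half input on CPU

# Plain call: the output stays in Half.
y_half = F.softmax(x, dim=-1)

# Requesting a float32 result; this public-API route roughly corresponds to
# the internal half_to_float path, where the kernel reads Half input and
# produces float32 output.
y_float = F.softmax(x, dim=-1, dtype=torch.float32)

print(y_half.dtype)   # torch.float16
print(y_float.dtype)  # torch.float32
```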
@pytorchbot merge
Co-authored-by: mikaylagawarecki <mikaylagawarecki@gmail.com>
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Add Half support for softmax and log_softmax on CPU.
Note: This introduces a correctness issue with MPS (see #111416 and #111479).
cc @jgong5 @mingfeima @XiaobingSuper @sanchitintel @ashokei @jingxu10 @voznesenskym @penguinwu @EikanWang @Guobing-Chen @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx @peterbell10 @ipiszy @yf225 @chenyang78 @kadeng @muchulee8 @aakhundov @ColinPeppler @ngimel
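As a quick, hedged usage sketch of what this PR enables (assuming a PyTorch build that includes it): softmax and log_softmax now accept Half tensors on CPU, and their results can be sanity-checked against a float32 reference. The tolerances below are illustrative.

```python
import torch
import torch.nn.functional as F

# Half (float16) tensor on CPU; without this PR, Half softmax/log_softmax
# was not supported on CPU, so these calls would error.
x = torch.randn(8, 128, dtype=torch.half)

probs = F.softmax(x, dim=-1)      # Half output
logp = F.log_softmax(x, dim=-1)   # Half output

# Sanity check against a float32 reference computation.
ref_probs = F.softmax(x.float(), dim=-1)
ref_logp = F.log_softmax(x.float(), dim=-1)
print(torch.allclose(probs.float(), ref_probs, atol=1e-3, rtol=1e-3))
print(torch.allclose(logp.float(), ref_logp, atol=1e-2, rtol=1e-2))
```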