OpInfo: log_softmax #59336
Conversation
💊 CI failures summary and remediations
As of commit 714e727 (more details on the Dr. CI page): 💚 Looks good so far! There are no failures yet. 💚
'log_softmax',
variant_test_name='dtype',
supports_out=False,
dtypes=all_types_and_complex_and(torch.bool, torch.float16, torch.bfloat16),
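For context, the dtype variant under test casts the input before computing the result; a quick REPL check with the standard torch API shows the output dtype following the kwarg:

>>> import torch
>>> t = torch.randn(3, 3)
>>> torch.log_softmax(t, dim=0).dtype
torch.float32
>>> torch.log_softmax(t, dim=0, dtype=torch.float64).dtype
torch.float64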
Not sure how complex inputs are also supported when dtype=torch.float64 is passed.
Looks like this part of the code facilitates that:
pytorch/aten/src/ATen/native/SoftMax.cpp (lines 273 to 274 in c3bf42e):

Tensor converted = dtype.has_value() ? input_.toType(dtype.value()) : input_;
return at::_softmax(converted, dim_, false);
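In other words, a rough Python-level sketch of the cast-then-dispatch pattern those two lines implement (log_softmax_with_dtype is a hypothetical name here, not a PyTorch API):

import torch
import torch.nn.functional as F

def log_softmax_with_dtype(input, dim, dtype=None):
    # cast first (this is where a complex input gets converted to the
    # requested dtype), then run the regular kernel on the converted tensor
    converted = input.to(dtype) if dtype is not None else input
    return F.log_softmax(converted, dim=dim)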
>>> t = torch.randn(3,3, dtype=torch.complex128)
>>> torch.log_softmax(t, 0, dtype=torch.float64)
<stdin>:1: UserWarning: Casting complex values to real discards the imaginary part (Triggered internally at ../aten/src/ATen/native/Copy.cpp:240.)
It does raise a warning, but I think it should just error out.
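If it were to error out instead, a guard along these lines would do it (a sketch of that proposal only, not what this PR or current PyTorch does; strict_log_softmax is a hypothetical name):

import torch

def strict_log_softmax(input, dim, dtype=None):
    # hypothetical strict variant: reject the lossy complex-to-real cast
    # instead of warning (this is not current PyTorch behavior)
    if dtype is not None and input.is_complex() and not dtype.is_complex:
        raise RuntimeError(
            "log_softmax: casting a complex input to a real dtype "
            "discards the imaginary part")
    return torch.log_softmax(input, dim, dtype=dtype)

With a check like this, the snippet above would raise instead of warn.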
This is really interesting -- thanks for taking a closer look. I don't think we should take any action for this behavior in this PR, however.
fyi @anjali411
Wahoo!
@mruberry has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Summary:
Reference: pytorch#54261

Pull Request resolved: pytorch#59336

Reviewed By: agolynski

Differential Revision: D28899052

Pulled By: mruberry

fbshipit-source-id: 60a9a4ffbca5a0f2c899d4d83500dcab4555ffb0
Reference: #54261