OpInfo: log_softmax by kshitij12345 · Pull Request #59336 · pytorch/pytorch · GitHub

Conversation

@kshitij12345 (Collaborator) commented:

Reference: #54261

@facebook-github-bot (Contributor) commented Jun 2, 2021:

💊 CI failures summary and remediations

As of commit 714e727 (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚


This comment was automatically generated by Dr. CI.

@jbschlosser added the triaged label (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) Jun 2, 2021
```python
'log_softmax',
variant_test_name='dtype',
supports_out=False,
dtypes=all_types_and_complex_and(torch.bool, torch.float16, torch.bfloat16),
```
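For context on what this OpInfo entry exercises: log_softmax computes `x - log(sum(exp(x)))` along an axis, and the `dtype` variant casts the input before computing. A minimal numpy sketch of those semantics (a hypothetical reference helper, not PyTorch's kernel):

```python
import numpy as np

def log_softmax(x, axis, dtype=None):
    # Hypothetical reference helper, not PyTorch's implementation: if dtype
    # is given, cast the input first, then compute log-softmax on the result.
    x = np.asarray(x, dtype=dtype)
    shifted = x - x.max(axis=axis, keepdims=True)  # subtract max for stability
    return shifted - np.log(np.exp(shifted).sum(axis=axis, keepdims=True))
```

Exponentiating the output along the reduced axis recovers a probability distribution summing to 1.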
@kshitij12345 (Collaborator, Author) commented:

Not sure how complex input is also supported with dtype=torch.float64.

@kshitij12345 (Collaborator, Author) commented:

Looks like this part of the code facilitates that:

```cpp
Tensor converted = dtype.has_value() ? input_.toType(dtype.value()) : input_;
return at::_softmax(converted, dim_, false);
```

```python
>>> t = torch.randn(3, 3, dtype=torch.complex128)
>>> torch.log_softmax(t, 0, dtype=torch.float64)
<stdin>:1: UserWarning: Casting complex values to real discards the imaginary part (Triggered internally at ../aten/src/ATen/native/Copy.cpp:240.)
```

It does raise a warning, but I think it should just error out.
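The same warn-instead-of-error behavior can be reproduced with a plain numpy cast (a sketch of the analogous semantics, not PyTorch code): converting a complex array to a real dtype discards the imaginary part and emits a ComplexWarning rather than raising.

```python
import warnings
import numpy as np

# numpy analogue of the cast discussed above: complex -> real conversion
# silently keeps only the real part, signalling this with a warning.
z = np.array([[1 + 2j, 3 - 1j]], dtype=np.complex128)
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    real = z.astype(np.float64)  # warns: imaginary part discarded

assert np.allclose(real, z.real)
assert any("ComplexWarning" in w.category.__name__ for w in caught)
```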

A collaborator replied:

This is really interesting -- thanks for taking a closer look. I don't think we should take any action for this behavior in this PR, however.

fyi @anjali411

@mruberry (Collaborator) left a review:

Wahoo!

@facebook-github-bot (Contributor) commented:

@mruberry has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) commented:

@mruberry merged this pull request in 1be7ca7.

deniskokarev pushed a commit to deniskokarev/pytorch that referenced this pull request Jun 9, 2021
Summary:
Reference: pytorch#54261

Pull Request resolved: pytorch#59336

Reviewed By: agolynski

Differential Revision: D28899052

Pulled By: mruberry

fbshipit-source-id: 60a9a4ffbca5a0f2c899d4d83500dcab4555ffb0

Labels

cla signed · Merged · open source · triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)

5 participants