Alias for logsumexp to special namespace by krshrimali · Pull Request #58838 · pytorch/pytorch

Conversation

@krshrimali (Contributor) commented May 24, 2021

@facebook-github-bot (Contributor) commented May 24, 2021

💊 CI failures summary and remediations

As of commit 7de4af3 (more details on the Dr. CI page and at hud.pytorch.org/pr/58838):


💚 💚 Looks good so far! There are no failures yet. 💚 💚


Preview docs built from this PR


@facebook-github-bot added the "oncall: jit" label (Add this issue/PR to JIT oncall triage queue) May 24, 2021
@krshrimali changed the title from "[WIP] Alias for logsumexp to special namespace" to "(WIP) Alias for logsumexp to special namespace" May 24, 2021
@krshrimali marked this pull request as a draft May 24, 2021 07:00
@krshrimali added the "module: special" label (Functions with no exact solutions, analogous to those in scipy.special) May 24, 2021
@krshrimali mentioned this pull request May 24, 2021
@ezyang removed their request for review May 24, 2021 18:52
@krshrimali (Contributor, Author) commented:

Updates:

Apologies for the long hold on this PR. Most of the issues have been resolved; I'm waiting for the tests to pass, and then it will be ready for review. :)

@kshitij12345 (Collaborator) left a comment:

Overall this looks good to me, but I have one suggested change.

Thanks @krshrimali

@lezcano (Collaborator) left a comment:

I left a few small points. Overall it looks very good, thanks @krshrimali!

Example::
>>> a = torch.randn(3, 3)
>>> torch.logsumexp(a, 1)
@lezcano (Collaborator) commented on the diff, Jun 8, 2021:

This should be torch.special.logsumexp.

It would also be more meaningful to write here:

>>> torch.dist(torch.special.logsumexp(a, 1), torch.log(torch.sum(torch.exp(a), dim)))
tensor(1.1921e-07)

@krshrimali (Contributor, Author) replied:

I'm unfortunately of a different opinion on this change: I think we should be consistent with the examples for other functions, and having >>> torch.special.logsumexp(a, 1) seems OK here.

That said, your suggestion also makes sense. To keep the example simple while better conveying what the function does, how about this instead?

>>> torch.special.logsumexp(a, 1)
#output
>>> torch.dist(torch.special.logsumexp(a, 1), torch.log(torch.sum(torch.exp(a), dim)))
#output

@lezcano (Collaborator) replied, Jun 22, 2021:

I think having both would be a good addition. Note that the formula I wrote had an error: dim should be 1.
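
For reference, here is a minimal runnable sketch of the example both sides converged on (the manual seed is illustrative, so the printed values will differ from any hard-coded doc output):

    import torch

    torch.manual_seed(0)  # illustrative; the doc example uses an unseeded randn
    a = torch.randn(3, 3)

    # The plain alias call, consistent with other torch.special examples:
    print(torch.special.logsumexp(a, 1))

    # The identity the second line demonstrates, with the corrected dim=1:
    # logsumexp(a, 1) == log(sum(exp(a), dim=1)), up to floating-point error.
    print(torch.dist(torch.special.logsumexp(a, 1),
                     torch.log(torch.sum(torch.exp(a), dim=1))))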

@krshrimali changed the title from "(WIP) Alias for logsumexp to special namespace" to "Alias for logsumexp to special namespace" Jun 8, 2021
@krshrimali (Contributor, Author) commented:

Gentle ping, @mruberry - Can you please take a look at this whenever you find time? Thanks!

@lezcano (Collaborator) left a comment:

I don't know enough to say whether the changes to the JIT are fine; I'll let @mruberry assess that or find the person who knows about this. Everything else looks good to me. Thank you, Kush!

On this topic, I have a question: would this fix happen to resolve all (or a number of) the tests that we used to skip in the OpInfos, or does it address something different? @krshrimali @kshitij12345
If so, that's great! We should then remove the relevant skips.
If not, how do other functions with this problem deal with the JIT errors we found here?

@krshrimali (Contributor, Author) replied:

> I don't know enough to say whether the changes to the JIT are fine; I'll let @mruberry assess that or find the person who knows about this. Everything else looks good to me. Thank you, Kush!

Thanks, @lezcano, for taking a closer look at this PR. I appreciate your comments. I'll update the doc to add both formulae.

> On this topic, I have a question: would this fix happen to resolve all (or a number of) the tests that we used to skip in the OpInfos, or does it address something different? @krshrimali @kshitij12345

This is a great question. The fix in this PR is specifically for test_jit_alias_remapping (i.e., only for the JIT remapping tests on aliases). I just noticed a few skipped tests that the fix from this PR can probably resolve; I'll check whether they pass on this branch.

> If so, that's great! We should then remove the relevant skips.

Agreed! :)
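
To make the scope of the test_jit_alias_remapping fix concrete, here is a rough, hypothetical sketch of the property that test checks for this alias (simplified; the real test in test/test_ops.py also inspects the scripted graph):

    import torch

    # The alias must be scriptable, and the scripted version must agree
    # with the original op.
    @torch.jit.script
    def fn(a: torch.Tensor) -> torch.Tensor:
        return torch.special.logsumexp(a, 1)

    a = torch.randn(3, 3)
    assert torch.allclose(fn(a), torch.logsumexp(a, 1))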

@krshrimali (Contributor, Author) commented:

Update:

There are currently two tests skipped:

  1. vstack, which is skipped because the function takes a sequence of tensors as input instead of a single tensor. This is not fixed in this PR. We could remove i0 from the formals and pass the sample inputs directly as the actuals, but I'm not sure that is the correct solution for this issue.
  2. movedim, which has been fixed by this PR. :)

Thanks for asking the right question, @lezcano!

cc: @mruberry @lezcano @kshitij12345

@codecov (bot) commented Jun 22, 2021

Codecov Report

Merging #58838 (89352af) into master (45ae2e7) will increase coverage by 0.00%.
The diff coverage is 66.66%.

❗ Current head 89352af differs from pull request most recent head 2f7d050. Consider uploading reports for the commit 2f7d050 to get more accurate results

@@           Coverage Diff           @@
##           master   #58838   +/-   ##
=======================================
  Coverage   76.23%   76.24%           
=======================================
  Files        2054     2054           
  Lines      205033   205117   +84     
=======================================
+ Hits       156306   156387   +81     
- Misses      48727    48730    +3     

>>> a = torch.randn(3, 3)
>>> torch.logsumexp(a, 1)
tensor([ 0.8442, 1.4322, 0.8711])
Alias for :func:`torch.special.logsumexp`.
A Collaborator commented on the diff:

Let's keep the docs in torch for now and have special.logsumexp alias it.
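
One way the suggested arrangement could look, as a minimal sketch assuming the _add_docstr pattern used in torch/special/__init__.py (the signature line and wording here are illustrative, not the literal diff):

    from torch._C import _add_docstr, _special

    # Keep the full docs on torch.logsumexp; the special-namespace entry
    # only points back to it.
    logsumexp = _add_docstr(
        _special.special_logsumexp,
        r"""
    logsumexp(input, dim, keepdim=False, *, out=None) -> Tensor

    Alias for :func:`torch.logsumexp`.
    """)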

skips=(
# Expected a value of type 'int' for argument 'source'
# but instead found type 'list'.
SkipInfo('TestJit', 'test_jit_alias_remapping'),
A Collaborator replied:

Awesome skip removal
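
For context, a hypothetical sketch of the OpInfo edit being praised, assuming (per the update above) that the removed skip belonged to movedim's entry; fields other than aliases are elided or illustrative:

    from torch.testing._internal.common_methods_invocations import OpInfo

    # With the JIT fix in this PR, the alias test no longer needs skipping,
    # so the skips tuple can simply be deleted from the entry.
    op = OpInfo(
        'movedim',
        aliases=('moveaxis',),
        # Removed by this PR:
        # skips=(
        #     # Expected a value of type 'int' for argument 'source'
        #     # but instead found type 'list'.
        #     SkipInfo('TestJit', 'test_jit_alias_remapping'),
        # ),
    )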

@mruberry (Collaborator) left a comment:

This looks really good. cc @eellison, who's interested in reducing jit test skips.

@eellison, do these jit test changes make sense?

I also made one comment about keeping the docs on torch.logsumexp for now (we'll review what goes in special vs. torch later).

@krshrimali (Contributor, Author) commented:

Hi @eellison and @mruberry - a gentle ping in case you can take a look at this. Thanks!

@eellison (Contributor) left a comment:

Thanks for removing the skipped test 😍 😍 😍 LGTM if tests pass

@mruberry (Collaborator) left a comment:

LGTM! Thanks @krshrimali

@facebook-github-bot (Contributor) commented:

@mruberry has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) commented:

@mruberry merged this pull request in 423523d.


Labels

cla signed · Merged · module: special (Functions with no exact solutions, analogous to those in scipy.special) · oncall: jit (Add this issue/PR to JIT oncall triage queue) · open source


7 participants