OpInfo for nn.functional.layer_norm
#63276
Conversation
💊 CI failures summary and remediations

As of commit b97c6ec (more details on the Dr. CI page):

🕵️ 1 new failure recognized by patterns

The following CI failures do not appear to be due to upstream breakages:

| Job | Step | Action |
|---|---|---|
| pytorch_linux_xenial_py3_clang7_asan_test2 | Run tests | 🔁 rerun |

1 job timed out:
- pytorch_linux_xenial_py3_clang7_asan_test2

ci.pytorch.org: 1 failed
@zou3519 has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

NB: some internal tests are failing on this. I will investigate and report back.

I'm still working on merging this! The internal tests were failing due to an internal issue.
Review context from test_nn.py (the lines removed by this PR):

```python
self._test_LayerNorm_cuda_half(device)

@onlyOnCPUAndCUDA
def test_LayerNorm_numeric(self, device):
```
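For reference, here is a minimal sketch of what a large-input numerics test for LayerNorm can look like. This is not the deleted test itself; the reference implementation, shapes, and tolerances below are illustrative assumptions:

```python
# Hypothetical sketch, not the original test_LayerNorm_numeric:
# compare F.layer_norm against a reference built from plain tensor ops
# on a large input, where accumulated error is most likely to show up.
import torch
import torch.nn.functional as F

def reference_layer_norm(x, normalized_shape, eps=1e-5):
    # Normalize over the trailing `normalized_shape` dimensions.
    dims = tuple(range(-len(normalized_shape), 0))
    mean = x.mean(dim=dims, keepdim=True)
    var = x.var(dim=dims, unbiased=False, keepdim=True)
    return (x - mean) / torch.sqrt(var + eps)

x = torch.randn(16, 512, 1024)  # illustrative "large" input
out = F.layer_norm(x, tuple(x.shape[1:]))
ref = reference_layer_norm(x, tuple(x.shape[1:]))
torch.testing.assert_close(out, ref, rtol=1e-4, atol=1e-4)
```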
@krshrimali please restore this test. It is testing numerics for large inputs, and it was added on purpose. The added OpInfo does not cover this (and it should not; OpInfos are not intended for testing interesting numerics of individual operations).
Oops, sorry for not noticing this! I'll put it back in (and cc you and krshrimali as reviewers).
Summary: Pull Request resolved: #64385

It was deleted in #63276. The numerics test was meant to check LayerNorm behavior on large inputs, but we deleted it without realizing that.

Test Plan: wait for tests.

Reviewed By: ngimel
Differential Revision: D30702950
Pulled By: zou3519
fbshipit-source-id: a480e26c45ec38fb628938b70416cdb22d976a46
Please see pytorch/functorch#78 and #54261.

Note: this PR also removes the corresponding tests from test_nn.py.

cc @mruberry @zou3519
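For readers unfamiliar with OpInfo-based testing: an OpInfo entry registers the op together with a sample-input generator in torch/testing/_internal/common_methods_invocations.py, and the generic test suites then exercise gradients, aliasing, dtype support, and so on. The sketch below shows roughly what such a generator could look like; the function name and the specific (shape, normalized_shape) cases are illustrative assumptions, not the exact code this PR landed.

```python
# Rough sketch of a sample-input generator in the style used by OpInfos.
# SampleInput comes from PyTorch's internal test utilities; the cases
# below are illustrative, not the ones added in this PR.
import torch
from torch.testing._internal.common_methods_invocations import SampleInput

def sample_inputs_layer_norm(op_info, device, dtype, requires_grad, **kwargs):
    def make(shape):
        return torch.randn(shape, device=device, dtype=dtype,
                           requires_grad=requires_grad)

    # (input_shape, normalized_shape): normalize over 1 or 2 trailing dims.
    cases = [
        ((1, 2, 3), (3,)),
        ((2, 2, 3), (2, 3)),
        ((1, 2), (2,)),
    ]
    return [SampleInput(make(shape), args=(normalized_shape,))
            for shape, normalized_shape in cases]
```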