[jit] Add some of nn.init to weak script #19640
Conversation
[jit] Add some of nn.init to weak script gh-metadata: pytorch pytorch 19640 gh/driazati/26/head
Force-pushed from b606124 to ebc8c39
LGTM. A little more coverage on different inputs would be nice, as would adding the div float / int overloads, which should be a really easy change.
```python
def test_nn_init(self):
    tests = (
        ('constant_', (lambda: (torch.ones(2, 2), 2.5)), "Tensor, float"),
```
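For context, each tuple in this test table names an `nn.init` function, a factory producing its arguments, and the script-side signature. A minimal sketch of what the `constant_` entry exercises, using the public `torch.nn.init.constant_` API (the test harness itself is not shown here):

```python
import torch
import torch.nn as nn

# nn.init.constant_ fills the tensor in place with the given value
# and returns the same tensor object.
t = torch.ones(2, 2)
out = nn.init.constant_(t, 2.5)
print(out)
```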
Can you switch it to rand? Please do so before landing.
```diff
     """
-    with torch.no_grad():
-        return tensor.fill_(1)
+    return _no_grad_fill_(tensor, 1.)
```
Hmm, it looks like this is changing the semantics: previously the type of the tensor would have been preserved, but now it is being filled with floats. Although I'm not 100% sure.
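To illustrate the concern, a small sketch: the `_no_grad_fill_` below is a hypothetical reconstruction of the helper discussed in this thread, not the PR's actual implementation. In practice `Tensor.fill_` casts the scalar to the tensor's dtype, so passing `1.` to an integer tensor still yields integer ones; whether that held uniformly at the time is exactly what the reviewer is unsure about.

```python
import torch

def _no_grad_fill_(tensor, val):
    # Hypothetical reconstruction: fill the tensor in place
    # without recording autograd history.
    with torch.no_grad():
        return tensor.fill_(val)

# fill_ casts the scalar to the tensor's dtype, so the
# integer dtype of the input is preserved even with a float value.
t = torch.zeros(2, 2, dtype=torch.int64)
out = _no_grad_fill_(t, 1.)
print(out.dtype)
```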
Summary: Stack from [ghstack](https://github.com/ezyang/ghstack): * **pytorch#19640 [jit] Add some of nn.init to weak script** Pull Request resolved: pytorch#19640 Pulled By: driazati Differential Revision: D15065332 fbshipit-source-id: 30df9f02e527cd5e5ebe34b7e003444eae96c66d