[doc][hackathon] To add Adam Optimizer to the documentation #63251
Conversation
💊 CI failures summary and remediations
As of commit 6626eb5 (more details on the Dr. CI page):
🕵️ 3 new failures recognized by patterns
The following CI failures do not appear to be due to upstream breakages:
Force-pushed 5970814 to 64543f0
Force-pushed 42f20bd to fa7c43c
Force-pushed af59471 to fc67b05
Codecov Report

@@            Coverage Diff             @@
##           master   #63251      +/-   ##
==========================================
- Coverage   67.09%   67.00%   -0.09%
==========================================
  Files         692      691       -1
  Lines       90579    90558      -21
==========================================
- Hits        60774    60680      -94
- Misses      29805    29878      +73
Force-pushed 44c88c5 to 09d0911
LGTM
torch/optim/adam.py (Outdated)
This is not true anymore?
That paper describes AdamW rather than Adam; that's why I deleted it :) Thanks for pointing it out.
ok
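The distinction raised in the review thread above — Adam versus AdamW — comes down to where weight decay enters the update. A minimal sketch, with illustrative function names not taken from the PyTorch codebase: in Adam, weight decay is folded into the gradient as an L2 penalty (and is therefore rescaled by the adaptive denominator), whereas AdamW (Loshchilov & Hutter) decays the parameter directly, decoupled from the moment estimates.

```python
def weight_decay_adam(grad, param, wd):
    # Adam-style: decay enters the gradient, so it later gets divided
    # by sqrt(v_hat) like every other gradient component.
    return grad + wd * param

def weight_decay_adamw(param, lr, wd):
    # AdamW-style: decay shrinks the parameter directly, outside the
    # adaptive moment update.
    return param - lr * wd * param

# Even with zero gradient, Adam-style decay produces a nonzero
# "gradient", while AdamW-style decay shrinks the weight directly.
g = weight_decay_adam(0.0, 2.0, 0.1)
p = weight_decay_adamw(2.0, 1.0, 0.1)
```

This is why the AdamW paper was an inappropriate citation for the Adam documentation page: the two algorithms differ in exactly this step.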
@iramazanli has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Force-pushed 09d0911 to 6626eb5
@iramazanli merged this pull request in 43248d9.
It has been discussed previously that adding descriptions of optimization algorithms to the PyTorch core documentation could serve as a useful optimization research tutorial. The tracking issue #63236 lists all the relevant algorithms, with links to the papers in which they were originally published.
In this PR we add a description of the Adam algorithm to the documentation. For details, we refer to the paper https://arxiv.org/abs/1412.6980.
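As a sketch of the algorithm this PR documents — not the actual PyTorch implementation — the Adam update from Kingma & Ba can be written for a single scalar parameter as follows. Parameter names loosely mirror the defaults of torch.optim.Adam; the function itself is illustrative only.

```python
import math

def adam_step(param, grad, exp_avg, exp_avg_sq, step,
              lr=1e-3, betas=(0.9, 0.999), eps=1e-8, weight_decay=0.0):
    """One Adam update for a single scalar parameter (illustrative sketch)."""
    beta1, beta2 = betas
    if weight_decay != 0.0:
        # Classic Adam folds weight decay into the gradient (L2 penalty).
        grad = grad + weight_decay * param
    # Biased first- and second-moment estimates (exponential moving averages).
    exp_avg = beta1 * exp_avg + (1 - beta1) * grad
    exp_avg_sq = beta2 * exp_avg_sq + (1 - beta2) * grad * grad
    # Bias correction compensates for zero initialization of the moments.
    m_hat = exp_avg / (1 - beta1 ** step)
    v_hat = exp_avg_sq / (1 - beta2 ** step)
    param = param - lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, exp_avg, exp_avg_sq

# Usage: minimize f(x) = x^2 starting from x = 1.0 (gradient is 2x).
x, m, v = 1.0, 0.0, 0.0
for t in range(1, 1001):
    x, m, v = adam_step(x, 2 * x, m, v, t, lr=0.1)
```

After a thousand steps the iterate sits near the minimum at zero, which matches the qualitative behavior the documentation describes.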
cc @vincentqb @iramazanli