[optim]: Adagrad with sparse grad is not on multi_tensor testing path + bug · Issue #110444 · pytorch/pytorch · GitHub


@jon-chuang

Description


🐛 Describe the bug

Bug: the code should not `return` here; it should just call the function (the early `return` exits before any remaining work is done):

return _single_tensor_adagrad(

Solution: fix this, and add Adagrad with sparse gradients to the testing path (`test_optim`, inductor tests).
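The control-flow problem can be illustrated with a minimal, self-contained sketch (the function and variable names below are hypothetical stand-ins, not the actual `torch.optim` internals):

```python
# Sketch of the bug: the multi-tensor path iterates over groups of
# tensors, and falling back to the single-tensor path via `return`
# abandons every group that comes after the first sparse one.

def _single_path(group, processed):
    processed.append(f"single:{group['id']}")
    return processed

def _multi_path(group, processed):
    processed.append(f"multi:{group['id']}")
    return processed

def buggy(groups):
    processed = []
    for g in groups:
        if g["sparse"]:
            # Bug: `return` exits the whole loop, so later groups
            # are never updated.
            return _single_path(g, processed)
        _multi_path(g, processed)
    return processed

def fixed(groups):
    processed = []
    for g in groups:
        if g["sparse"]:
            # Fix: dispatch this group to the single-tensor fallback,
            # then keep iterating over the remaining groups.
            _single_path(g, processed)
            continue
        _multi_path(g, processed)
    return processed

groups = [{"id": 0, "sparse": True}, {"id": 1, "sparse": False}]
print(buggy(groups))  # ['single:0'] -- group 1 is silently skipped
print(fixed(groups))  # ['single:0', 'multi:1']
```

By analogy, the fix in `_multi_tensor_adagrad` would be to call `_single_tensor_adagrad` for the affected tensors and then continue, rather than `return`ing its result.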

Versions

Main

cc @vincentqb @jbschlosser @albanD @janeyx99 @crcrpar

Metadata


Assignees

No one assigned

    Labels

    actionable, module: optimizer (Related to torch.optim), triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
