[optim]: `NAdam`, `RAdam` and `_multi_tensor_adadelta` do not support complex types · Issue #110606 · pytorch/pytorch · GitHub


@jon-chuang

Description


🐛 Describe the bug

It is trivial to add support (by viewing the params, grads, and intermediate state as real tensors via `view_as_real`), yet, unlike the other optim classes, these three inconsistently do not support complex parameters.
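The `view_as_real` trick mentioned above can be sketched as follows: a complex tensor is reinterpreted as a real tensor with a trailing dimension of 2 (real and imaginary parts), so the optimizer's update math runs entirely on real dtypes while sharing storage with the original complex parameter. This is a minimal illustration, not the actual `torch.optim` internals.

```python
import torch

# A complex parameter and its gradient.
param = torch.randn(3, dtype=torch.complex64)
grad = torch.randn(3, dtype=torch.complex64)

# view_as_real returns a real view of shape (..., 2) that shares
# storage with the complex tensor.
p_real = torch.view_as_real(param)  # shape (3, 2), dtype float32
g_real = torch.view_as_real(grad)

# A plain SGD-style step on the real views updates the complex
# parameter in place, because the views alias the same memory.
lr = 0.1
p_real.add_(g_real, alpha=-lr)

# The complex view of the updated real tensor is the updated param.
assert torch.equal(torch.view_as_complex(p_real), param)
```

Optimizers that already support complex (e.g. Adam) apply exactly this viewing step to params, grads, and state buffers before the update.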

Versions

main

Tasks

  • adadelta multitensor
  • NAdam
  • RAdam
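For the multi-tensor (`_foreach`) code paths, the same fix amounts to normalizing every complex tensor in each tensor list to its real view before the fused update. The helper name `_materialize_real_views` below is hypothetical (not the actual PyTorch internal), but the pattern mirrors what the complex-supporting optimizers do; `torch._foreach_add_` is the private multi-tensor primitive these implementations are built on.

```python
import torch

def _materialize_real_views(tensorlists):
    """Hypothetical helper: view every complex tensor in each list
    as a real (..., 2) tensor so foreach ops run on real dtypes."""
    return [
        [torch.view_as_real(t) if torch.is_complex(t) else t for t in tl]
        for tl in tensorlists
    ]

params = [torch.zeros(2, dtype=torch.complex64)]
grads = [torch.ones(2, dtype=torch.complex64)]

# Normalize once up front, then the fused update needs no
# complex-specific branches.
params, grads = _materialize_real_views([params, grads])
torch._foreach_add_(params, grads, alpha=-0.1)
```

Because the real views alias the original complex storage, the in-place foreach update is reflected in the complex parameters.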

cc @vincentqb @jbschlosser @albanD @janeyx99 @crcrpar @ezyang @anjali411 @dylanbespalko @mruberry @lezcano @nikitaved


Labels

  • actionable
  • enhancement (not as big of a feature, but technically not a bug; should be easy to fix)
  • module: complex (related to complex number support in PyTorch)
  • module: optimizer (related to torch.optim)
  • triaged (this issue has been looked at by a team member, triaged, and prioritized into an appropriate module)
