
channels_last/channels_last_3d memory format not supported for some modules on ROCm that should be supported on CUDA #70125

@mikaylagawarecki

Description

🐛 Describe the bug

#69317 adds a test, test_memory_format, to test/test_modules.py to check whether outputs are valid and in the correct memory format when the inputs and modules are converted to various memory formats. However, for dtype=float32 on the ROCm CI (which runs the CUDA test configuration), the following modules do not produce the expected behavior:

  • BatchNorm2d
  • BatchNorm3d
  • Conv2d
  • Conv3d
  • ConvTranspose2d

i.e., when the input is converted with input.to(memory_format=torch.channels_last) (or torch.channels_last_3d where relevant), and/or the module is converted with module.to(memory_format=torch.channels_last) when it has state, the assertion output.is_contiguous(memory_format=torch.channels_last) fails.
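A minimal sketch of the check that fails, roughly approximating what test_memory_format does (the module choice, tensor sizes, and device handling below are illustrative assumptions, not the exact test parameters):

```python
import torch
import torch.nn as nn

# Illustrative repro; Conv2d and the shapes are arbitrary choices, not the
# exact parameters used by test_memory_format in test/test_modules.py.
device = "cuda"  # a ROCm build exposes HIP devices through the "cuda" device type

module = nn.Conv2d(3, 8, kernel_size=3).to(device)
inp = torch.randn(2, 3, 16, 16, device=device)

# Convert the input, and the module's parameters (since Conv2d has state),
# to channels_last and run a forward pass.
inp = inp.to(memory_format=torch.channels_last)
module = module.to(memory_format=torch.channels_last)
out = module(inp)

# On CUDA this holds; on the ROCm CI it reportedly fails for the modules listed above.
assert out.is_contiguous(memory_format=torch.channels_last)
```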

logs here: https://ci.pytorch.org/jenkins/job/pytorch-builds/job/pytorch-linux-bionic-rocm4.3.1-py3.6-test1/14485/console

Versions

ROCm CI

cc @jeffdaily @sunway513 @jithunnair-amd @ROCmSupport @KyleCZH @jbschlosser

Metadata

    Labels

    Stale · module: rocm (AMD GPU support for PyTorch) · triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
