[ROCm] [Upstream Triton] Flex attention `Assertion idx < size()' failed.` · Issue #139621 · pytorch/pytorch · GitHub

@jataylo

Description

🐛 Describe the bug

Encountered while testing the preview release/2.6 builds as part of #139175:

python test_flex_attention.py -k "test_load_from_bias_seq_only_float16"

Traceback

test_load_from_bias_seq_only_float16 (__main__.TestFlexAttention) ... python: /root/.triton/llvm/llvm-b5cc222d-ubuntu-x64/include/llvm/ADT/SmallVector.h:291: T& llvm::SmallVectorTemplateCommon<T, <template-parameter-1-2> >::operator[](llvm::SmallVectorTemplateCommon<T, <template-parameter-1-2> >::size_type) [with T = unsigned int; <template-parameter-1-2> = void; llvm::SmallVectorTemplateCommon<T, <template-parameter-1-2> >::reference = unsigned int&; llvm::SmallVectorTemplateCommon<T, <template-parameter-1-2> >::size_type = long unsigned int]: Assertion `idx < size()' failed.
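For context, the failed check is LLVM's debug-build bounds assertion in `SmallVector::operator[]`: somewhere in the Triton compilation path, an index greater than or equal to the vector's size was passed in, aborting the process. A minimal Python sketch of the same check (an illustration of the assertion semantics only, not the actual Triton/LLVM code path):

```python
# Illustration of LLVM's SmallVector bounds check, not Triton code:
# operator[] asserts idx < size() before indexing, so an out-of-range
# index aborts the process with `Assertion `idx < size()' failed.`
def small_vector_get(vec, idx):
    assert idx < len(vec), "idx < size()"  # the check that fires here
    return vec[idx]

print(small_vector_get([1, 2, 3], 2))  # in bounds -> prints 3
```

In a release build of LLVM the assertion is compiled out and the same bug would instead read out of bounds, which is why this class of failure typically surfaces only with assertion-enabled Triton/LLVM builds.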

Reproducer:
https://gist.github.com/jataylo/dfd4179d26e8fd5019a81189949e6f3b

Versions

#139206

cc @jeffdaily @sunway513 @jithunnair-amd @pruthvistony @ROCmSupport @dllehr-amd @hongxiayang @naromero77amd @bertmaher @int3 @davidberard98 @nmacchioni @chenyang78 @embg @peterbell10 @aakhundov

Metadata
Labels

module: rocm — AMD GPU support for Pytorch
triaged — This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
upstream triton — Upstream Triton Issue

Status

Done
