Qwen3: Fix eagle hidden states by IzzyPutterman · Pull Request #6199 · NVIDIA/TensorRT-LLM

Conversation

@IzzyPutterman
Collaborator

@IzzyPutterman commented Jul 20, 2025

Summary by CodeRabbit

  • Bug Fixes
    • Corrected hidden state synchronization and normalization during model execution, ensuring correct behavior when speculative (eagle) decoding is active.
    • Improved stability and accuracy for users relying on speculative metadata features in model layers.

@IzzyPutterman requested a review from a team as a code owner July 20, 2025 05:05
@coderabbitai
Contributor

coderabbitai bot commented Jul 20, 2025

Walkthrough

The update adds conditional checks in the Qwen3MoEDecoderLayer.forward method to determine if speculative metadata is present and active for the current layer. Depending on these checks, the code selectively applies allreduce operations, hidden state capturing, and layer normalization, ensuring correct coordination between speculative execution and fusion logic.
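To make the described gating concrete, here is a minimal, runnable sketch of the pattern, not the actual TensorRT-LLM code: `SpecMetadata`, its `is_layer_capture`/`capture` helpers, and the `post_moe_fusion` flag are illustrative stand-ins, and the fused allreduce + RMSNorm kernels are simplified to plain unfused PyTorch ops.

```python
import torch
import torch.nn as nn


class SpecMetadata:
    """Hypothetical stand-in for the speculative-decoding metadata object."""

    def __init__(self, layers_to_capture):
        self.layers_to_capture = set(layers_to_capture)
        self.captured = {}

    def is_layer_capture(self, layer_idx: int) -> bool:
        return layer_idx in self.layers_to_capture

    def capture(self, layer_idx: int, hidden_states: torch.Tensor) -> None:
        # Save the pre-norm hidden states for the eagle draft model.
        self.captured[layer_idx] = hidden_states


class DecoderLayerTail(nn.Module):
    """Post-MoE tail of a decoder layer, with the spec-capture gating."""

    def __init__(self, layer_idx: int, hidden_size: int):
        super().__init__()
        self.layer_idx = layer_idx
        self.post_moe_fusion = True  # stands in for fusion_config.POST_MOE_FUSION
        self.norm = nn.LayerNorm(hidden_size)  # RMSNorm in the real model

    def forward(self, hidden_states, residual, spec_metadata=None):
        needs_capture = (
            spec_metadata is not None
            and spec_metadata.is_layer_capture(self.layer_idx)
        )
        if needs_capture:
            # A fused residual-add + norm (+ allreduce) kernel would destroy
            # the states eagle needs, so run the ops unfused and capture first.
            hidden_states = hidden_states + residual
            spec_metadata.capture(self.layer_idx, hidden_states)
            hidden_states = self.norm(hidden_states)
        else:
            # Regular path; in the real code this becomes a single fused
            # kernel when post-MoE fusion is enabled.
            hidden_states = self.norm(hidden_states + residual)
        return hidden_states
```

Calling `DecoderLayerTail(layer_idx=27, hidden_size=64)` with `SpecMetadata({27})` populates `spec_metadata.captured[27]` with the pre-norm states while still returning normalized states to the next layer.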

Changes

File(s) Change Summary
tensorrt_llm/_torch/models/modeling_qwen3_moe.py Added conditional logic using spec_metadata to control allreduce, hidden state capture, and normalization in the decoder layer's forward method. No changes to method signatures.

Poem

In the warren of MoE code so deep,
New checks now guide what secrets we keep.
Speculative paths, allreduce in stride,
Only when metadata says to abide.
With careful hops, the logic aligns—
A rabbit’s delight in conditional signs! 🐇


📜 Recent review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 98428f3 and 9c114b1.

📒 Files selected for processing (1)
  • tensorrt_llm/_torch/models/modeling_qwen3_moe.py (2 hunks)
🔇 Additional comments (3)
tensorrt_llm/_torch/models/modeling_qwen3_moe.py (3)

260-281: LGTM! Proper handling of speculative metadata in the finalized fusion path.

The conditional logic correctly differentiates between speculative capture and regular fusion modes. When capturing hidden states for speculative execution, the code appropriately disables fusion operations to preserve the original states.
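As a hedged sketch of that differentiation (illustrative enum and helper names only; this is not TensorRT-LLM's actual allreduce API), the finalized path can be read as picking between a fused and an unfused epilogue:

```python
from enum import Enum, auto


class FusionOp(Enum):
    # Illustrative names, not TensorRT-LLM's real fusion-op enum.
    NONE = auto()
    RESIDUAL_RMS_NORM = auto()


def finalize_moe_output(hidden_states, residual, norm, capturing: bool):
    """Fuse residual-add + norm into the epilogue unless this layer's
    pre-norm states must be preserved for eagle."""
    op = FusionOp.NONE if capturing else FusionOp.RESIDUAL_RMS_NORM
    if op is FusionOp.RESIDUAL_RMS_NORM:
        # In the real code this is a single fused allreduce kernel.
        return norm(hidden_states + residual), None
    # Unfused: hand back the raw pre-norm states alongside the normalized ones.
    pre_norm = hidden_states + residual
    return norm(pre_norm), pre_norm
```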


291-318: LGTM! Consistent handling of speculative metadata in the non-finalized MoE fusion path.

The implementation correctly adapts the MoE allreduce parameters based on whether speculative capture is active. The separation of fusion operations when capturing hidden states maintains consistency with the finalized path.
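A small sketch of that parameter adaptation (hypothetical field names; the real MoE allreduce takes its own parameter object): when capture is active the fused residual/norm inputs are simply not attached, so the collective returns raw states.

```python
def build_moe_allreduce_params(residual, norm_weight, eps, capturing: bool) -> dict:
    """Attach the fused residual+norm inputs only when not capturing for eagle."""
    if capturing:
        # Plain allreduce; residual add, capture, and norm happen afterwards.
        return {"fusion_enabled": False}
    return {
        "fusion_enabled": True,
        "residual": residual,
        "norm_weight": norm_weight,
        "eps": eps,
    }
```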


321-324: LGTM! Speculative capture added to the non-fusion path.

The implementation correctly adds hidden state capture for speculative execution when fusion is disabled, maintaining consistency across all execution paths.
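In the non-fusion path the change reduces to a guarded capture between the residual add and the next norm; a self-contained sketch, again with hypothetical names:

```python
from dataclasses import dataclass, field

import torch


@dataclass
class CaptureState:  # hypothetical stand-in for spec_metadata
    layers_to_capture: set
    captured: dict = field(default_factory=dict)


def maybe_capture(state, layer_idx: int, hidden_states: torch.Tensor) -> None:
    """Store this layer's post-residual hidden states if eagle needs them."""
    if state is not None and layer_idx in state.layers_to_capture:
        state.captured[layer_idx] = hidden_states


state = CaptureState(layers_to_capture={27})
maybe_capture(state, 27, torch.randn(2, 8))  # runs just before the final norm
```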

@IzzyPutterman
Collaborator Author

Might be easier to turn off self.fusion_config.POST_MOE_FUSION if spec metadata is not None and the layer idx is required to be saved
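A sketch of what that suggestion would look like, reusing the hypothetical `SpecMetadata.is_layer_capture` from the sketch above (`POST_MOE_FUSION` is the real flag named here; the helper itself is illustrative): resolve the flag once instead of branching in every fusion path.

```python
def resolve_post_moe_fusion(post_moe_fusion: bool, spec_metadata, layer_idx: int) -> bool:
    """Disable POST_MOE_FUSION for layers whose hidden states eagle must save."""
    if spec_metadata is not None and spec_metadata.is_layer_capture(layer_idx):
        return False  # the unfused path leaves the raw states capturable
    return post_moe_fusion
```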

@IzzyPutterman force-pushed the iputterman/qwen3-eagle-hidden branch from 9c114b1 to 8515466 on July 23, 2025 20:06
@nv-yilinf (Collaborator) left a comment

Thanks for the fix and refactoring.

@IzzyPutterman changed the title from "Draft: Qwen3: Fix eagle hidden states" to "Qwen3: Fix eagle hidden states" on Jul 23, 2025
@IzzyPutterman
Collaborator Author

/bot run

@tensorrt-cicd
Collaborator

PR_Github #12755 [ run ] triggered by Bot

@tensorrt-cicd
Collaborator

PR_Github #12755 [ run ] completed with state SUCCESS
/LLM/main/L0_MergeRequest_PR pipeline #9497 completed with status: 'FAILURE'

@IzzyPutterman
Collaborator Author

/bot run

@tensorrt-cicd
Collaborator

PR_Github #12789 [ run ] triggered by Bot

@tensorrt-cicd
Collaborator

PR_Github #12789 [ run ] completed with state SUCCESS
/LLM/main/L0_MergeRequest_PR pipeline #9527 completed with status: 'SUCCESS'

@mikeiovine merged commit 7e0158b into NVIDIA:main on Aug 6, 2025
3 checks passed