Fix Gemma3 and Gemma2 flops computation. by gagika · Pull Request #2009 · AI-Hypercomputer/maxtext · GitHub

Conversation


@gagika gagika commented Jul 22, 2025

Description

  • Add correct local (sliding-window) attention flops for Gemma3.
  • Change both Gemma2 and Gemma3 to use causal attention flops (total attention flops / 2, since causal masking computes only the lower triangle of the score matrix); see the sketch below.
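For context, here is a minimal sketch of the two accounting changes. This is not MaxText's actual code; the names `seq_len`, `window_size`, `num_heads`, and `head_dim` are illustrative assumptions.

```python
def attention_matmul_flops(attended_pairs: float, num_heads: int,
                           head_dim: int) -> float:
    # Each attended (query, key) pair costs ~2 * head_dim FLOPs in the
    # QK^T matmul and ~2 * head_dim more in the attention @ V matmul,
    # per head (counting a multiply-add as 2 FLOPs).
    return 4.0 * attended_pairs * num_heads * head_dim

def causal_attention_flops(seq_len, num_heads, head_dim):
    # Causal masking: query i attends to positions 0..i, so only the
    # lower triangle of the seq_len x seq_len score matrix is computed,
    # i.e. roughly total attention flops / 2.
    attended_pairs = seq_len * (seq_len + 1) / 2
    return attention_matmul_flops(attended_pairs, num_heads, head_dim)

def local_causal_attention_flops(seq_len, window_size, num_heads, head_dim):
    # Sliding-window (local) attention: query i attends to at most
    # window_size earlier positions, so the attended-pair count grows
    # linearly in seq_len once seq_len >> window_size, not quadratically.
    w = min(window_size, seq_len)
    attended_pairs = w * (w + 1) / 2 + (seq_len - w) * w
    return attention_matmul_flops(attended_pairs, num_heads, head_dim)
```

Since Gemma2 and Gemma3 mix local (sliding-window) and global attention layers, the per-model total under this sketch would weight the two formulas by the number of layers of each type.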

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code.
  • I have necessary comments in my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above if applicable.
  • I have made or will make corresponding changes to the doc if needed.


@gobbleturk gobbleturk left a comment


Thanks gagika!

@copybara-service copybara-service bot merged commit 6da8ede into main Jul 23, 2025
20 checks passed
@copybara-service copybara-service bot deleted the gemma3-flops branch July 23, 2025 19:42
