[FIX] Minor bug fixes by zhuohan123 · Pull Request #1035 · vllm-project/vllm · GitHub

Conversation

@zhuohan123
Member

From #959.

cc @wanmok

Collaborator

@WoosukKwon WoosukKwon left a comment


LGTM! Thanks for the PR.

# We use float32 for probabilities and log probabilities.
# Compute the probabilities.
probs = torch.softmax(logits, dim=-1, dtype=torch.float)
# Compute the log probabilities (before applying top-p and top-k).
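The snippet above deliberately computes probabilities in float32. As a minimal NumPy sketch (not vLLM's actual code; the `log_softmax` helper below is hypothetical) of why half precision is risky for log probabilities:

```python
import numpy as np

def log_softmax(logits, dtype):
    """Naive log-softmax computed at the given precision (illustration only)."""
    x = logits.astype(dtype)
    x = x - x.max()  # standard max-subtraction for numerical stability
    probs = np.exp(x) / np.exp(x).sum()
    with np.errstate(divide="ignore"):
        return np.log(probs)

logits = np.array([0.0, -20.0])
lp16 = log_softmax(logits, np.float16)  # exp(-20) underflows to 0 in fp16
lp32 = log_softmax(logits, np.float32)  # fp32 keeps it representable
print(lp16)  # second entry collapses to -inf
print(lp32)  # second entry stays close to -20
```

In float16, `exp(-20)` rounds to zero (it is below the smallest fp16 subnormal, about 6e-8), so the log probability becomes `-inf`; float32 preserves a finite value near -20.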

Besides, should we remove "(before applying top-p and top-k)"?

@zhuohan123 zhuohan123 merged commit f04908c into main Sep 13, 2023
@zhuohan123 zhuohan123 deleted the minor-bug-fixes branch October 16, 2023 21:02
hongxiayang pushed a commit to hongxiayang/vllm that referenced this pull request Feb 13, 2024
* [FIX] Minor bug fixes

* Address review comments
yiliu30 pushed a commit to yiliu30/vllm-fork that referenced this pull request Apr 23, 2025
Previously, we used `if hasattr(self, "scaling_factors")` to determine
whether to call `prepare_cos_sin`. However, some models' scaling factor in
`rope_scaling` has a slightly different name, `scaling_factor`
(**note: no trailing 's'**); in that case, `prepare_cos_sin` was never
called, which is not expected.

This PR will help with the following classes:
- `Llama3RotaryEmbedding`
- `DynamicNTKScalingRotaryEmbedding`
- `YaRNScalingRotaryEmbedding`
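A minimal, hypothetical sketch of the guard described above (the class and method names below are illustrative, not vLLM's actual ones): accept either spelling of the attribute before calling the prepare step.

```python
class RopeSketch:
    """Hypothetical stand-in for a rotary-embedding class (illustration only)."""

    def __init__(self, **attrs):
        self.prepared = False
        for name, value in attrs.items():
            setattr(self, name, value)
        self._maybe_prepare_cos_sin()

    def prepare_cos_sin(self):
        # In the real classes this would precompute the cos/sin caches.
        self.prepared = True

    def _maybe_prepare_cos_sin(self):
        # Guard on BOTH spellings: `scaling_factors` (plural) and
        # `scaling_factor` (singular, used by e.g. Llama3/DynamicNTK/YaRN).
        if hasattr(self, "scaling_factors") or hasattr(self, "scaling_factor"):
            self.prepare_cos_sin()

print(RopeSketch(scaling_factor=8.0).prepared)   # singular name now triggers it
print(RopeSketch(scaling_factors=[1.0]).prepared)
print(RopeSketch().prepared)                     # no scaling attribute: skipped
```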

Signed-off-by: zhouyu5 <yu.zhou@intel.com>
Co-authored-by: Michał Kuligowski <mkuligowski@habana.ai>
