OpenAI-Server: Only fail if logit_bias has actual values by LLukas22 · Pull Request #1045 · vllm-project/vllm · GitHub

Conversation

@LLukas22
Contributor

Many OpenAI wrappers in popular libraries such as langchain or haystack always append an empty dictionary as the logit_bias value in their requests. Currently, the server returns an error in these cases. I think the server should only error out if logit_bias actually contains values, and process the request otherwise. This enables users to simply use their vLLM servers via langchain.
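A minimal sketch of the proposed behavior (names are illustrative, not the actual vLLM code): treat an empty `logit_bias` dict the same as an absent one, and only reject requests that carry real bias values.

```python
from typing import Optional


def check_logit_bias(logit_bias: Optional[dict]) -> Optional[str]:
    """Return an error message if logit_bias is unsupported, else None.

    Hypothetical helper: an empty dict ({}) — as sent by langchain or
    haystack by default — passes, matching the behavior this PR proposes.
    """
    if logit_bias is not None and len(logit_bias) > 0:
        # Only reject when the client actually asked for biasing.
        return "logit_bias is not currently supported"
    return None
```

With this check, `check_logit_bias({})` and `check_logit_bias(None)` both succeed, while a populated dict such as `{50256: -100}` is still rejected.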

Member

@zhuohan123 zhuohan123 left a comment

LGTM! Thanks for the fix!

@zhuohan123 zhuohan123 merged commit b5f93d0 into vllm-project:main Sep 15, 2023
hongxiayang pushed a commit to hongxiayang/vllm that referenced this pull request Feb 13, 2024
yiliu30 pushed a commit to yiliu30/vllm-fork that referenced this pull request Apr 15, 2025
Adjusted method of extracting synapse build id for release branches

2 participants