Use standard extras for `uvicorn` by danilopeixoto · Pull Request #1166 · vllm-project/vllm · GitHub

Conversation

@danilopeixoto
Contributor

By adding the `standard` extras, Uvicorn will install and use a set of recommended optional dependencies.

These include uvloop, a high-performance drop-in replacement for asyncio. uvloop is not supported on Windows, so Uvicorn will fall back to asyncio there.
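The change itself is a one-line dependency update (`uvicorn` → `uvicorn[standard]`). A minimal sketch of the fallback behavior described above, assuming uvloop may or may not be installed (the helper function name is an illustration, not Uvicorn's actual API):

```python
import asyncio
import sys

try:
    import uvloop  # pulled in by the uvicorn[standard] extras
    HAS_UVLOOP = True
except ImportError:
    HAS_UVLOOP = False


def install_event_loop_policy() -> str:
    """Use uvloop where available; fall back to asyncio elsewhere.

    uvloop does not support Windows, so we keep the default asyncio
    event loop policy on that platform.
    """
    if HAS_UVLOOP and sys.platform != "win32":
        asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
        return "uvloop"
    return "asyncio"
```

Uvicorn performs an equivalent selection internally when its loop setting is left at `auto`.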

Member

@zhuohan123 zhuohan123 left a comment


LGTM! Thanks for your contribution!

@zhuohan123 zhuohan123 merged commit 649aa73 into vllm-project:main Sep 28, 2023
@danilopeixoto danilopeixoto deleted the uvicorn-standard branch September 29, 2023 11:46
hongxiayang pushed a commit to hongxiayang/vllm that referenced this pull request Feb 13, 2024
yiliu30 pushed a commit to yiliu30/vllm-fork that referenced this pull request May 8, 2025
pi314ever pushed a commit to pi314ever/vllm that referenced this pull request May 20, 2025
After PR vllm-project#1166, giving text-only input to Llama 3.2 raises the error:
`ValueError: The encoder prompt cannot be empty`

This PR fixes this minor error.

Signed-off-by: zhouyu5 <yu.zhou@intel.com>
