Support OpenAI o3 and o4-mini #6457
Conversation
Walkthrough
The changes update model recognition and request handling to support the "o3" and "o4-mini" models. Specifically, the model grouping logic in the OpenAI platform client is extended so that models whose names start with "o4-mini" are treated the same as "o1" and "o3" models, which affects whether the max_tokens parameter is included in the request.
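As a rough illustration of that grouping, the prefix check could look like the sketch below. This is a minimal sketch, not the PR's actual code; the function name `isReasoningModel` and the exact prefix list are assumptions.

```ts
// Minimal sketch (not the actual NextChat source): group the reasoning
// models by name prefix so they share the same request handling.
function isReasoningModel(modelName: string): boolean {
  return (
    modelName.startsWith("o1") ||
    modelName.startsWith("o3") ||
    modelName.startsWith("o4-mini")
  );
}
```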
Sequence Diagram(s)
sequenceDiagram
participant Client
participant OpenAIPlatform
participant Constants
Client->>OpenAIPlatform: Send request with selected model (e.g., "o1", "o3", "o4-mini")
OpenAIPlatform->>Constants: Check if model matches vision regexes
Constants-->>OpenAIPlatform: Return match result
OpenAIPlatform->>OpenAIPlatform: Determine if model is "o1", "o3", or "o4-mini"
alt If model is "o1", "o3", or "o4-mini"
OpenAIPlatform->>OpenAIPlatform: Exclude max_tokens from vision request
else Other models
OpenAIPlatform->>OpenAIPlatform: Include max_tokens in vision request
end
OpenAIPlatform->>Client: Return response
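The "vision regexes" step in the diagram could be pictured as follows. This is a hedged sketch only: the constant name `VISION_MODEL_REGEXES`, the exact patterns, and the helper `isVisionModel` are illustrative assumptions, not necessarily what the project's constants module contains.

```ts
// Illustrative only: a regex list used to flag models with vision support.
// The real constant name and patterns may differ from the project's code.
const VISION_MODEL_REGEXES: RegExp[] = [
  /gpt-4o/,
  /o3/,       // assumption: o3 supports image input
  /o4-mini/,  // assumption: o4-mini supports image input
];

function isVisionModel(modelName: string): boolean {
  return VISION_MODEL_REGEXES.some((regex) => regex.test(modelName));
}
```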
💻 Change Type
🔀 Description of Change
Adds support for OpenAI's latest models, o3 and o4-mini.
Updates the regular expression that detects vision models so that the multimodal capabilities of o3 and o4-mini are recognized.
For new models such as o3 and o4-mini, the API request automatically switches to the max_completion_tokens parameter, keeping full compatibility with OpenAI's new interface and avoiding errors caused by an unsupported parameter (a sketch of this switch follows below).
Reference: https://openai.com/index/introducing-o3-and-o4-mini/
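To make the parameter switch concrete, here is a hedged sketch. The payload shape, helper names, and field handling are assumptions for illustration rather than the PR's exact implementation.

```ts
interface ChatPayload {
  model: string;
  messages: unknown[];
  max_tokens?: number;
  max_completion_tokens?: number;
}

// Same prefix check as in the earlier sketch (assumed helper).
const isReasoningModel = (model: string) =>
  model.startsWith("o1") || model.startsWith("o3") || model.startsWith("o4-mini");

// Sketch: reasoning models reject `max_tokens`, so the output limit is
// sent as `max_completion_tokens` instead.
function buildPayload(model: string, messages: unknown[], limit: number): ChatPayload {
  const payload: ChatPayload = { model, messages };
  if (isReasoningModel(model)) {
    payload.max_completion_tokens = limit;
  } else {
    payload.max_tokens = limit;
  }
  return payload;
}
```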
📝 Additional Information
Resolves Issue: #6456