Add Llama API OAI compatible endpoint support #6442
Conversation
Add Llama API Experimental
Llama support is cool! You could start with autogen/python/packages/autogen-ext/src/autogen_ext/models/openai/_message_transform.py Lines 413 to 432 in c7757de
and autogen/python/packages/autogen-ext/src/autogen_ext/models/openai/_message_transform.py Lines 470 to 492 in c7757de
Since you are a Llama expert, if you know about Llama's message limitations, you could add better support via _message_transform. For example, in the Anthropic case Claude does not support empty messages, so its message transformer removes empty messages.
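The transformer behavior described above (dropping messages a provider rejects, such as Claude rejecting empty content) can be sketched as a small filter step. This is an illustrative sketch only; the function name and plain-dict message shape are assumptions, not AutoGen's actual `_message_transform` API:

```python
# Hypothetical sketch of a provider-specific message-transform step that
# drops empty messages, as the Anthropic transformer reportedly does.
# Names and the plain-dict message format are illustrative, not AutoGen's API.

def drop_empty_messages(messages: list[dict]) -> list[dict]:
    """Remove messages whose content is missing, empty, or whitespace-only."""
    return [
        m for m in messages
        if isinstance(m.get("content"), str) and m["content"].strip()
    ]

messages = [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": ""},        # a provider like Claude would reject this
    {"role": "user", "content": "Hello!"},
]
cleaned = drop_empty_messages(messages)
```

A Llama-specific transformer would plug an analogous filter into the transform pipeline for whatever message limitations the Llama API enforces.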
Adding links to Llama API website for sign-up
@WuhanMonkey thanks for the PR. Could you address @SongChiYoung's comments?
Set Llama models to use base message transformer. It is fully compatible with OAI
Hey, yes, just updated it. We are still pending CLA review from our legal side.
@microsoft-github-policy-service agree company="Meta"
We finally got the CLA approved and signed. @SongChiYoung and @ekzhu, would you mind helping me review and approve this PR? Thanks
One more question, @SongChiYoung: does AutoGen allow extra headers in the request, for a customized x-title or http-referer for tracking purposes?
You can pass in
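For context on the question above: OpenAI-compatible clients generally take a mapping of default headers at construction time and merge it into every outgoing request. A minimal stdlib-only sketch of that merge; the function and parameter names here are illustrative, not AutoGen's actual API:

```python
# Sketch of merging caller-supplied tracking headers into request headers.
# In practice an OpenAI-compatible client usually accepts a default-headers
# mapping at construction; this helper only demonstrates the merge semantics.
# All names are illustrative assumptions, not AutoGen parameters.

def with_extra_headers(base_headers: dict[str, str],
                       extra_headers: dict[str, str]) -> dict[str, str]:
    """Return request headers with caller extras merged in; extras win on collision."""
    return {**base_headers, **extra_headers}

headers = with_extra_headers(
    {"Authorization": "Bearer <token>", "Content-Type": "application/json"},
    {"x-title": "my-app", "http-referer": "https://example.com"},
)
```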
Fixed lint error
Fix issue during rebase.
Codecov Report
Attention: Patch coverage is
Additional details and impacted files:

@@ Coverage Diff @@
##             main    #6442      +/-   ##
==========================================
- Coverage   79.53%   79.52%   -0.01%
==========================================
  Files         225      225
  Lines       16644    16661      +17
==========================================
+ Hits        13237    13249      +12
- Misses       3407     3412       +5

Flags with carried forward coverage won't be shown.
Fix mypy
@WuhanMonkey I will bring it to the finish line. Thanks. For local checks you can see
Thanks, appreciate it.
Why are these changes needed?
To add the latest support for using Llama API offerings with AutoGen
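Since the PR adds Llama API support through an OpenAI-compatible endpoint, usage presumably amounts to pointing an OpenAI-style client configuration at the Llama API. A hypothetical configuration sketch; the base URL, model id, and key placeholder below are assumptions, not the real service values (consult the Llama API docs for actual ones):

```python
# Hypothetical configuration for an OpenAI-compatible client targeting a
# Llama API endpoint. Every value below is a placeholder assumption; the
# real endpoint URL, model names, and credentials come from the provider.

llama_client_config = {
    "model": "<llama-model-name>",              # placeholder model id
    "base_url": "https://<llama-api-host>/v1",  # placeholder OAI-compatible endpoint
    "api_key": "<YOUR_LLAMA_API_KEY>",          # placeholder credential
}
```

With an OAI-compatible endpoint, such a config can typically be handed to any OpenAI-style chat-completion client, which is why the PR routes Llama models through the base message transformer.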
Checks