Default usage statistics for streaming responses #6578
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@           Coverage Diff           @@
##             main    #6578   +/-   ##
=======================================
  Coverage   79.53%   79.53%
=======================================
  Files         225      225
  Lines       16650    16656       +6
=======================================
+ Hits        13242    13248       +6
  Misses       3408     3408
```
Reviewed files:
- python/packages/autogen-ext/tests/models/test_openai_model_client.py
- python/packages/autogen-ext/src/autogen_ext/models/openai/_openai_client.py
It looks like we actually have documentation around this behavior. I added a flag to enable it more easily, defaulted to False to preserve the current behavior.
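The opt-in flag described above might be wired up roughly as follows. This is a minimal sketch, not the PR's actual code: the class and flag names here are hypothetical, though `stream_options={"include_usage": True}` is the real parameter the OpenAI API uses to return usage for streamed responses.

```python
from typing import Any, Dict


class StreamingClient:
    """Hypothetical client wrapper; names are illustrative, not the PR's API."""

    def __init__(self, include_usage: bool = False) -> None:
        # Defaulting to False preserves the current behavior.
        self.include_usage = include_usage

    def _stream_kwargs(self) -> Dict[str, Any]:
        # Only request usage statistics from the API when the flag is set.
        kwargs: Dict[str, Any] = {"stream": True}
        if self.include_usage:
            kwargs["stream_options"] = {"include_usage": True}
        return kwargs
```

With the flag off, the request kwargs are unchanged from today; turning it on adds only the extra `stream_options` entry.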
Why are these changes needed?
Enables usage statistics for streaming responses by default.
There is a similar bug in the AzureAI client. Theoretically, adding the parameter should fix that problem as well, but I'm currently unable to test that workflow.
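For context on why the change works: when `stream_options={"include_usage": True}` is set on a streamed chat completion, the OpenAI API sends usage only on the final chunk, with `usage` being `None` on all earlier chunks. A minimal sketch of collecting it is below; the `Usage` and `Chunk` dataclasses are stand-ins for the SDK's response types, not the real ones.

```python
from dataclasses import dataclass
from typing import Iterable, List, Optional, Tuple


@dataclass
class Usage:
    prompt_tokens: int
    completion_tokens: int


@dataclass
class Chunk:
    content: str
    usage: Optional[Usage] = None  # only the final chunk carries usage


def collect_stream(chunks: Iterable[Chunk]) -> Tuple[str, Optional[Usage]]:
    """Accumulate streamed text and pick up the trailing usage chunk."""
    parts: List[str] = []
    usage: Optional[Usage] = None
    for chunk in chunks:
        parts.append(chunk.content)
        if chunk.usage is not None:
            usage = chunk.usage
    return "".join(parts), usage


# In a real call the stream would come from something like:
#   client.chat.completions.create(..., stream=True,
#                                  stream_options={"include_usage": True})
text, usage = collect_stream([Chunk("Hel"), Chunk("lo"), Chunk("", Usage(5, 2))])
```

Without the `stream_options` parameter, every chunk's `usage` is `None`, which is why streamed responses previously reported no token counts.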
Related issue number
closes #6548
Checks