Fix OllamaChatCompletionClient load_component() error by adding to WELL_KNOWN_PROVIDERS by Copilot · Pull Request #7030 · microsoft/autogen · GitHub

Conversation

Contributor

Copilot AI commented Sep 16, 2025

The ChatCompletionClient.load_component() method was failing with ValueError("Invalid") when trying to load OllamaChatCompletionClient using provider configuration. This prevented users from using the same configuration-based loading approach available for other chat completion clients.

Problem

The following code would fail with ValueError("Invalid"):

import asyncio
from autogen_core.models import ChatCompletionClient

async def main():
    config = {
        "provider": "OllamaChatCompletionClient",
        "config": {
            "model": "qwen3",
            "host": "http://1.2.3.4:30130",
        }
    }
    client = ChatCompletionClient.load_component(config)

if __name__ == "__main__":
    asyncio.run(main())

While the direct instantiation approach worked fine:

from autogen_ext.models.ollama import OllamaChatCompletionClient

client = OllamaChatCompletionClient(
    model="qwen3", 
    host="http://1.2.3.4:30130"
)

Root Cause

The issue occurred because OllamaChatCompletionClient was missing from the WELL_KNOWN_PROVIDERS dictionary in _component_config.py. When load_component() couldn't find the provider in the known providers list, it attempted to use "OllamaChatCompletionClient" directly as a module path. Since this string contains no dots, the parsing logic provider.rsplit(".", maxsplit=1) would return a single-element list, causing the validation len(output) != 2 to fail and raise ValueError("Invalid").
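The parsing failure is easy to reproduce in isolation. The snippet below is a minimal sketch of the logic described above, not the actual code from _component_config.py:

```python
# Illustrative reproduction of the provider-string parsing described above
# (a sketch, not the actual _component_config.py implementation).
provider = "OllamaChatCompletionClient"

# The alias contains no dot, so rsplit yields a single-element list.
output = provider.rsplit(".", maxsplit=1)
print(output)  # ['OllamaChatCompletionClient']

# A fully qualified path splits into the expected (module, class) pair.
dotted = "autogen_ext.models.ollama.OllamaChatCompletionClient"
module_path, class_name = dotted.rsplit(".", maxsplit=1)
print(module_path, class_name)  # autogen_ext.models.ollama OllamaChatCompletionClient

# The length check described above therefore rejects the dotless alias.
if len(output) != 2:
    print("would raise ValueError('Invalid')")
```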

Solution

Added the missing entry to WELL_KNOWN_PROVIDERS:

WELL_KNOWN_PROVIDERS = {
    "azure_openai_chat_completion_client": "autogen_ext.models.openai.AzureOpenAIChatCompletionClient",
    "AzureOpenAIChatCompletionClient": "autogen_ext.models.openai.AzureOpenAIChatCompletionClient",
    "openai_chat_completion_client": "autogen_ext.models.openai.OpenAIChatCompletionClient",
    "OpenAIChatCompletionClient": "autogen_ext.models.openai.OpenAIChatCompletionClient",
    "OllamaChatCompletionClient": "autogen_ext.models.ollama.OllamaChatCompletionClient",  # Added
}
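With the entry in place, the lookup can translate the short alias to a fully qualified path before splitting and importing it. The following is a hedged sketch of that resolution flow (the function name resolve_provider and the import-based mechanics are assumptions; see _component_config.py for the real logic), demonstrated with a stdlib class so it runs without autogen installed:

```python
import importlib

# Illustrative copy of the mapping; the real one lives in _component_config.py.
WELL_KNOWN_PROVIDERS = {
    "OllamaChatCompletionClient": "autogen_ext.models.ollama.OllamaChatCompletionClient",
}

def resolve_provider(provider: str) -> type:
    # Translate a known alias to its fully qualified path; dotted paths pass through.
    path = WELL_KNOWN_PROVIDERS.get(provider, provider)
    module_path, class_name = path.rsplit(".", maxsplit=1)  # raises if no dot remains
    module = importlib.import_module(module_path)
    return getattr(module, class_name)

# Demonstrated with a stdlib class so the sketch is runnable anywhere:
print(resolve_provider("collections.OrderedDict"))  # <class 'collections.OrderedDict'>
```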

Changes

  • Core fix: Added OllamaChatCompletionClient mapping to WELL_KNOWN_PROVIDERS in _component_config.py
  • Tests: Added test_ollama_in_well_known_providers() to validate the provider mapping
  • Integration tests: Added test_ollama_load_component() and test_ollama_load_component_via_class() to test the load_component functionality

Impact

  • ✅ The original issue's code example now works without errors
  • ✅ OllamaChatCompletionClient can be loaded using the same configuration approach as other clients
  • ✅ Maintains consistency with existing OpenAI and Azure OpenAI client patterns
  • ✅ No breaking changes to existing functionality
  • ✅ Full backward compatibility preserved

Fixes #6994.



Co-authored-by: ekzhu <320302+ekzhu@users.noreply.github.com>
Copilot AI changed the title from "[WIP] OllamaChatCompletionClient load_component() error" to "Fix OllamaChatCompletionClient load_component() error by adding to WELL_KNOWN_PROVIDERS" on Sep 16, 2025
Copilot AI requested a review from ekzhu September 16, 2025 08:57
@ekzhu ekzhu marked this pull request as ready for review September 16, 2025 09:05

codecov bot commented Sep 16, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 80.94%. Comparing base (79d5d6a) to head (0fb779f).
⚠️ Report is 3 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #7030      +/-   ##
==========================================
+ Coverage   80.93%   80.94%   +0.01%     
==========================================
  Files         237      237              
  Lines       18240    18241       +1     
==========================================
+ Hits        14762    14765       +3     
+ Misses       3478     3476       -2     
Flag        Coverage           Δ
unittests   80.94% <ø>         +0.01% ⬆️

Flags with carried forward coverage won't be shown.



@ekzhu ekzhu merged commit c469fc0 into main Sep 16, 2025
34 checks passed
@ekzhu ekzhu deleted the copilot/fix-6994 branch September 16, 2025 09:30