Conversation


@cwarre33 cwarre33 commented Oct 21, 2025

Add try-except blocks to model client `__init__.py` files (OpenAI, Anthropic, Azure, Ollama) to provide clear, actionable error messages when required dependencies are missing.

This improvement helps users quickly identify and install missing packages instead of encountering generic ImportError messages.

Changes:

  • Wrap imports in try-except blocks for all main LLM client modules
  • Include specific installation instructions (e.g., pip install autogen-ext[openai])
  • Preserve original error context with 'raise ... from e' for debugging
  • Consistent error message format across all client implementations

Fixes #4605
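The guarded-import pattern described above can be sketched as follows. The `import_client` wrapper and the `_missing_dependency` module name are hypothetical stand-ins (in the real `__init__.py` files the import would be the client's hard dependency, e.g. `openai`); the `autogen-ext[openai]` extra is taken from the PR description.

```python
def import_client():
    # "_missing_dependency" is a placeholder so the except branch is
    # exercised in this sketch; substitute the client's real dependency.
    try:
        import _missing_dependency  # noqa: F401
    except ImportError as e:
        # Actionable message, with "raise ... from e" preserving the
        # original error context for debugging.
        raise ImportError(
            "Dependencies for this client are missing. "
            "Install them with: pip install 'autogen-ext[openai]'"
        ) from e
```

A caller hitting the missing dependency now sees the install command, while `err.__cause__` still carries the original `ModuleNotFoundError`.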

@cwarre33

@microsoft-github-policy-service agree



Development

Successfully merging this pull request may close these issues.

Improve Import Error Messages for LLM Client Dependencies
