feat: add CometAPI integration with LLM service and examples #2739
Conversation
- Updated README.md to include CometAPI quickstart instructions.
- Added CometAPI environment variable to env.example.
- Updated pyproject.toml to include CometAPI as a dependency.
- Implemented CometAPILLMService for LLM interactions with CometAPI.
- Added example script for minimal CometAPI LLM usage.
- Created unit tests for CometAPILLMService utilities.
Pull Request Overview
This PR adds CometAPI as an OpenAI-compatible LLM provider to Pipecat, with comprehensive integration including service implementation, model curation utilities, and example usage.
- Implements CometAPILLMService extending OpenAILLMService with model filtering and recommendation helpers
- Adds comprehensive test coverage for model utilities and chat model fetching functionality
- Includes foundational example demonstrating CometAPI integration with environment configuration
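As an illustration of what the model filtering and recommendation helpers described above might look like, here is a minimal sketch for an OpenAI-compatible provider. All names here (`RECOMMENDED_MODELS`, `filter_chat_models`, `recommend_models`) are hypothetical and not taken from the PR's actual API:

```python
# Hypothetical sketch of model-curation helpers for an OpenAI-compatible
# provider. The curated list and exclusion markers are illustrative only.

RECOMMENDED_MODELS = [
    "gpt-4o",
    "claude-3-5-sonnet",
    "gemini-1.5-pro",
]

def filter_chat_models(model_ids):
    """Keep only chat-capable models, dropping embedding/audio endpoints."""
    excluded_markers = ("embedding", "whisper", "tts", "dall-e")
    return [
        m for m in model_ids
        if not any(marker in m.lower() for marker in excluded_markers)
    ]

def recommend_models(available):
    """Intersect the provider's available models with the curated list,
    preserving the curated ordering."""
    available_set = set(available)
    return [m for m in RECOMMENDED_MODELS if m in available_set]
```

Curating on the client side like this keeps the example scripts pointed at models known to work well, while still allowing any model the provider exposes.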
Reviewed Changes
Copilot reviewed 8 out of 9 changed files in this pull request and generated 1 comment.
Show a summary per file
| File | Description |
|---|---|
| src/pipecat/services/cometapi/llm.py | Core CometAPI LLM service implementation with model curation and filtering utilities |
| src/pipecat/services/cometapi/__init__.py | Module initialization with deprecation proxy setup |
| tests/test_cometapi_llm_service.py | Unit tests for model utilities and chat model fetching functionality |
| examples/foundational/02a-llm-say-one-thing-cometapi.py | Foundational example demonstrating CometAPI usage |
| src/pipecat/tests/utils.py | Test utility improvements for frame queue draining |
| pyproject.toml | Build configuration updates for CometAPI extra and test markers |
| env.example | Environment variable template addition for CometAPI_KEY |
| README.md | Documentation updates with CometAPI quickstart guide |
Comments suppressed due to low confidence (1)
tests/test_cometapi_llm_service.py:1
- Redundant import of `asyncio` - it's already imported on line 15. Remove this duplicate import.
"""Tests for CometAPILLMService helper utilities.
Hello! Sorry for the delayed response. We've just rolled out Community Integrations and we'd like to invite you to submit your integration for listing. While your integration won't be part of Pipecat's core code, it will be discoverable in the Pipecat docs and fully usable with Pipecat. Please review the guidelines here and let me know if you have any questions:
Sorry I'm late. It's my pleasure, and I really hope to bring CometAPI into Pipecat's community integrations. I've read the community integration specification document you published, but I still have a few questions:
I look forward to your reply.

Summary
Adds CometAPI as an OpenAI‑compatible LLM provider with a curated recommended model list and filtering helpers.
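For context, an OpenAI-compatible provider is typically wired up by reusing the OpenAI client protocol with a different base URL and API key. The sketch below illustrates that pattern; the base URL and the `COMETAPI_KEY` environment variable name are assumptions for illustration, not details confirmed by this PR:

```python
import os

def cometapi_client_config(model="gpt-4o"):
    """Build an OpenAI-style client configuration for a hypothetical
    CometAPI endpoint. Base URL and env var name are assumptions."""
    api_key = os.getenv("COMETAPI_KEY")  # assumed env var name
    if not api_key:
        raise RuntimeError("Set COMETAPI_KEY in your environment")
    return {
        "api_key": api_key,
        "base_url": "https://api.cometapi.com/v1",  # assumed endpoint
        "model": model,
    }
```

A subclass of OpenAILLMService would pass a config like this to its parent constructor, which is the usual shape for OpenAI-compatible providers in Pipecat.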
Related issue
#2649