🧭 Epic
Title: Sample Agent – CrewAI Integration (OpenAI & A2A Endpoints)
Goal: Deliver a ready‑to‑fork reference agent built with CrewAI that can:
- Connect to an MCP Gateway instance via REST & WebSocket.
- Dynamically retrieve the current tool catalog or accept a user‑supplied CSV list of tool IDs.
- Expose both OpenAI‑compatible `/v1/chat/completions` and A2A JSON‑RPC endpoints so existing clients and other gateways can invoke it.
Why now: Many teams are experimenting with CrewAI’s autonomous‑agent workflow. A concrete sample demystifies crew & task orchestration, accelerates integrations and serves as a comparative companion to the LangChain example.
🧭 Type of Feature
- Developer tooling / sample code
- Integration & connectivity
🙋‍♂️ User Story 1 — Dynamic Tool Discovery
As a: Gateway app developer
I want: the agent to call `GET /tools` on startup (or accept a comma‑separated `TOOLS=` env/CLI arg)
So that: the available tool list stays in sync without code edits.
✅ Acceptance Criteria
Scenario: Auto‑discover tools at launch
Given the gateway URL is "https://gw.local"
And TOOLS env var is *unset*
When the CrewAI agent starts
Then it MUST call GET https://gw.local/tools
And map each tool into a CrewAI `Tool` wrapper for later invocation
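A minimal sketch of the discovery path, assuming the catalog returns a JSON array with `id` and `description` fields and that tools are wrapped via CrewAI's `BaseTool` (the import path and the gateway's invoke route are assumptions, not documented API):

```python
# Hypothetical sketch of startup tool discovery (agent_crewai.py / gateway_client.py).
# Endpoint paths and payload field names are assumptions, not the gateway's documented contract.
import httpx
from crewai.tools import BaseTool  # import path differs across CrewAI releases


class GatewayTool(BaseTool):
    """Wraps a single MCP Gateway tool as a CrewAI tool."""
    name: str
    description: str
    tool_id: str
    gateway_url: str

    def _run(self, **kwargs) -> str:
        # Assumed invocation route; adjust to the gateway's real REST/RPC contract.
        resp = httpx.post(
            f"{self.gateway_url}/rpc",
            json={"method": self.tool_id, "params": kwargs},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.text


def discover_tools(gateway_url: str) -> list[GatewayTool]:
    """GET /tools and map each entry into a CrewAI tool wrapper."""
    catalog = httpx.get(f"{gateway_url}/tools", timeout=10).json()
    return [
        GatewayTool(
            name=t["id"],                       # field names are illustrative
            description=t.get("description", ""),
            tool_id=t["id"],
            gateway_url=gateway_url,
        )
        for t in catalog
    ]
```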
🙋‍♂️ User Story 2 — Dual Endpoint Exposure
As a: Client integrator
I want: the agent to serve both `/v1/chat/completions` (OpenAI spec) and `/a2a` JSON‑RPC
✅ Acceptance Criteria
Scenario: OpenAI endpoint responds via CrewAI
Given the agent is running on :8100
When I POST valid OpenAI chat JSON to /v1/chat/completions
Then I receive a streaming "choices" response produced by CrewAI
And any tool calls are resolved through the MCP Gateway
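A rough shape for `openai_adapter.py`, assuming FastAPI. The request and response fields follow the public OpenAI chat-completion format; `run_crew()` is a hypothetical helper around `Crew.kickoff()`, and the streaming path would emit the same content as SSE `data:` chunks:

```python
# Sketch of the OpenAI-compatible adapter (openai_adapter.py).
import time
import uuid

from fastapi import FastAPI
from pydantic import BaseModel

from agent_crewai import run_crew  # hypothetical module listed in the component table below

app = FastAPI()


class ChatMessage(BaseModel):
    role: str
    content: str


class ChatRequest(BaseModel):
    model: str
    messages: list[ChatMessage]
    stream: bool = False


@app.post("/v1/chat/completions")
def chat_completions(req: ChatRequest) -> dict:
    # Hand the latest user message to the crew; tool calls resolve via the gateway wrappers.
    answer = run_crew(req.messages[-1].content)
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": req.model,
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": answer},
            "finish_reason": "stop",
        }],
    }
```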
🙋‍♂️ User Story 3 — Parameterised Tool Allow‑List
As a: DevOps engineer
I want: to restrict the agent to --tools "weather.get,math.calculate"
So that: only approved tools are callable in production.
✅ Acceptance Criteria
Scenario: CSV override takes precedence
Given CLI flag --tools="weather.get,math.calculate"
When the agent starts
Then it MUST NOT call /tools autodiscovery
And only register those two tools with CrewAI
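The precedence rule could look roughly like this; flag/env names come from the story, while `discover_tools()` is the hypothetical helper sketched under User Story 1:

```python
# Sketch of the allow-list precedence (cli.py): an explicit --tools / TOOLS value
# disables auto-discovery entirely.
import os

from gateway_client import discover_tools  # hypothetical module from the component table below


def resolve_tool_ids(cli_tools: str | None, gateway_url: str) -> list[str]:
    raw = cli_tools or os.getenv("TOOLS")
    if raw:
        # CSV override: never call /tools, only register the approved IDs.
        return [t.strip() for t in raw.split(",") if t.strip()]
    # No override: fall back to auto-discovery (User Story 1).
    return [t.name for t in discover_tools(gateway_url)]
```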
🙋‍♂️ User Story 4 — Tool Schema Introspection
As a: Tool author
I want: the agent to inspect the JSON schema of each tool
So that: it can auto‑generate structured‑output prompts and argument validation for each CrewAI `Task`.
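One possible approach, assuming each catalog entry carries a JSON Schema under an `input_schema` field (the field name is an assumption) and using the `jsonschema` package for validation:

```python
# Sketch of schema-driven prompt generation and argument validation for a tool call.
import json

from jsonschema import Draft202012Validator


def validate_args(tool_schema: dict, args: dict) -> list[str]:
    """Return human-readable validation errors for a proposed tool call."""
    validator = Draft202012Validator(tool_schema)
    return [e.message for e in validator.iter_errors(args)]


def schema_prompt_hint(tool_id: str, tool_schema: dict) -> str:
    """Embed the schema in the Task prompt so the LLM emits structured arguments."""
    return (
        f"When calling `{tool_id}`, reply with JSON matching this schema:\n"
        f"{json.dumps(tool_schema, indent=2)}"
    )
```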
📐 Design Sketch
```mermaid
flowchart TD
    subgraph Startup
        A[Parse CLI / ENV] --> B{Tool list?}
        B -- CSV given --> C[Load CSV list]
        B -- none --> D[GET /tools]
        C --> E[Build CrewAI Tool wrappers]
        D --> E
    end
    E --> F[Crew: Orchestrator]
    F --> G[OpenAI Adapter]
    F --> H[A2A RPC handler]
    G --> I[MCP Gateway REST / WS]
    H --> I
```
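The `Crew: Orchestrator` node could be wired roughly as follows; the role/goal text and task wording are illustrative, and only the `Agent`/`Task`/`Crew` primitives come from CrewAI itself:

```python
# Sketch of the crew wiring (agent_crewai.py); LLM config is omitted and uses CrewAI defaults.
from crewai import Agent, Crew, Task


def build_crew(tools: list) -> Crew:
    orchestrator = Agent(
        role="Gateway tool orchestrator",
        goal="Answer user requests, delegating to MCP Gateway tools when needed",
        backstory="Reference agent for the MCP Gateway CrewAI sample.",
        tools=tools,
    )
    answer_task = Task(
        description="Handle the incoming chat request: {input}",
        expected_output="A concise answer for the end user",
        agent=orchestrator,
    )
    return Crew(agents=[orchestrator], tasks=[answer_task])


def run_crew(user_input: str, tools: list | None = None) -> str:
    # kickoff() runs the tasks; `inputs` interpolates into the task description.
    result = build_crew(tools or []).kickoff(inputs={"input": user_input})
    return str(result)
```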
| Component | Change | Detail |
|---|---|---|
| `agent_crewai.py` | NEW | Core agent; defines Crew, Members & Tasks with tool router |
| `gateway_client.py` | UPDATE | `/tools` fetch, auth token support (shared w/ LangChain sample) |
| `cli.py` | NEW | `sample-agent-crewai start --gateway-url --tools CSV` |
| `openai_adapter.py` | NEW | FastAPI router mapping OpenAI chat to Crew `run()` |
| `a2a_rpc.py` | NEW | JSON‑RPC handler exposing `crew.invoke` |
| `docker/Dockerfile` | NEW | Multi‑arch image for quick deploy |
| Docs | ADD | Quick‑start, env vars, curl examples, Helm chart snippet |
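For the docs row, a possible client-side example against an agent running on :8100; the model name, auth value, and A2A `params` shape are placeholders, while the `crew.invoke` method name comes from the table above:

```python
# How a client might exercise both endpoints of the sample agent.
import httpx
from openai import OpenAI

# OpenAI-compatible endpoint via the official SDK.
client = OpenAI(base_url="http://localhost:8100/v1", api_key="not-needed")
chat = client.chat.completions.create(
    model="crewai-sample",
    messages=[{"role": "user", "content": "What is the weather in Berlin?"}],
)
print(chat.choices[0].message.content)

# A2A JSON-RPC endpoint.
rpc = httpx.post(
    "http://localhost:8100/a2a",
    json={
        "jsonrpc": "2.0",
        "id": 1,
        "method": "crew.invoke",
        "params": {"input": "What is the weather in Berlin?"},
    },
    timeout=60,
)
print(rpc.json())
```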
🔄 Roll‑out Plan
- Phase 0: Skeleton repo + CI + lint.
- Phase 1: Implement tool discovery & CSV override.
- Phase 2: Wire OpenAI adapter; unit tests.
- Phase 3: Wire A2A RPC; integration tests with local gateway.
- Phase 4: Docker/Helm artefacts; publish to ghcr.io.
📣 Next Steps
- Scaffold repo (`agent-crewai`) under `agent_runtimes/`.
- Create a tutorial under docs.