diff --git a/daprdocs/content/en/developing-applications/building-blocks/conversation/conversation-overview.md b/daprdocs/content/en/developing-applications/building-blocks/conversation/conversation-overview.md
index 1287f4cf5f9..7ccea189a15 100644
--- a/daprdocs/content/en/developing-applications/building-blocks/conversation/conversation-overview.md
+++ b/daprdocs/content/en/developing-applications/building-blocks/conversation/conversation-overview.md
@@ -14,7 +14,12 @@ Dapr's conversation API reduces the complexity of securely and reliably interact
-In additon to enabling critical performance and security functionality (like [prompt caching]({{% ref "#prompt-caching" %}}) and [PII scrubbing]({{% ref "#personally-identifiable-information-pii-obfuscation" %}})), you can also pair the conversation API with Dapr functionalities, like:
+In addition to enabling critical performance and security functionality (like [prompt caching]({{% ref "#prompt-caching" %}}) and [PII scrubbing]({{% ref "#personally-identifiable-information-pii-obfuscation" %}})), the conversation API also provides:
+
+- **Tool calling capabilities** that allow LLMs to interact with external functions and APIs, enabling more sophisticated AI applications
+- **OpenAI-compatible interface** for seamless integration with existing AI workflows and tools
+
+You can also pair the conversation API with Dapr functionalities, like:
- Resiliency circuit breakers and retries to circumvent limit and token errors, or
- Middleware to authenticate requests coming to and from the LLM
@@ -45,6 +50,17 @@ The PII scrubber obfuscates the following user information:
- SHA-256 hex
- MD5 hex
+### Tool calling support
+
+The conversation API supports tool calling, which lets an LLM request calls to external functions and APIs that your application executes. This enables you to build AI applications that can:
+
+- Execute custom functions based on user requests
+- Integrate with external services and databases
+- Provide dynamic, context-aware responses
+- Create multi-step workflows and automation
+
+Tool calling follows OpenAI's interface standards, making it easy to integrate with existing AI development workflows and tools.
+
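+For example, a tool is made available to the LLM by including a function definition in the converse request. The following is a minimal sketch of such a request (the `openai` component name and the `get_weather` function are illustrative placeholders; see the [conversation API reference]({{% ref conversation_api.md %}}) for the full schema):
+
+```json
+{
+  "name": "openai",
+  "inputs": [{
+    "messages": [{
+      "of_user": {
+        "content": [{ "text": "What is the weather like in San Francisco?" }]
+      }
+    }]
+  }],
+  "tools": [{
+    "function": {
+      "name": "get_weather",
+      "description": "Get the current weather for a location",
+      "parameters": {
+        "type": "object",
+        "properties": {
+          "location": { "type": "string", "description": "The city and state, e.g. San Francisco, CA" }
+        },
+        "required": ["location"]
+      }
+    }
+  }],
+  "tool_choice": "auto"
+}
+```
+
+If the model chooses to use the tool, the response includes a `tool_calls` entry containing the function name and JSON-encoded arguments for your application to execute.
+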
## Demo
Watch the demo presented during [Diagrid's Dapr v1.15 celebration](https://www.diagrid.io/videos/dapr-1-15-deep-dive) to see how the conversation API works using the .NET SDK.
diff --git a/daprdocs/content/en/reference/api/conversation_api.md b/daprdocs/content/en/reference/api/conversation_api.md
index 1a4e006b348..2c470d59f79 100644
--- a/daprdocs/content/en/reference/api/conversation_api.md
+++ b/daprdocs/content/en/reference/api/conversation_api.md
@@ -10,14 +10,14 @@ weight: 1400
The conversation API is currently in [alpha]({{% ref "certification-lifecycle.md#certification-levels" %}}).
{{% /alert %}}
-Dapr provides an API to interact with Large Language Models (LLMs) and enables critical performance and security functionality with features like prompt caching and PII data obfuscation.
+Dapr provides an API to interact with Large Language Models (LLMs) and enables critical performance and security functionality with features like prompt caching, PII data obfuscation, and tool calling capabilities.
## Converse
-This endpoint lets you converse with LLMs.
+This endpoint lets you converse with LLMs using the Alpha2 version of the API, which adds tool calling support and aligns the request and response format with OpenAI's interface.
```
-POST http://localhost:<daprPort>/v1.0-alpha1/conversation/<llm-name>/converse
+POST http://localhost:<daprPort>/v1.0-alpha2/conversation/<llm-name>/converse
```
### URL parameters
@@ -30,34 +30,117 @@ POST http://localhost:<daprPort>/v1.0-alpha2/conversation/<llm-name>/converse
| Field | Description |
| --------- | ----------- |
+| `name` | The name of the conversation component. Required |
+| `context_id` | The ID of an existing chat (like in ChatGPT). Optional |
| `inputs` | Inputs for the conversation. Multiple inputs at one time are supported. Required |
-| `cacheTTL` | A time-to-live value for a prompt cache to expire. Uses Golang duration format. Optional |
-| `scrubPII` | A boolean value to enable obfuscation of sensitive information returning from the LLM. Set this value if all PII (across contents) in the request needs to be scrubbed. Optional |
-| `temperature` | A float value to control the temperature of the model. Used to optimize for consistency and creativity. Optional |
+| `parameters` | Parameters for all custom fields. Optional |
| `metadata` | [Metadata](#metadata) passed to conversation components. Optional |
+| `scrub_pii` | A boolean value to enable obfuscation of sensitive information returned by the LLM. Optional |
+| `temperature` | A float value to control the temperature of the model. Used to optimize for consistency (0) or creativity (1). Optional |
+| `tools` | Registers the tools that the LLM can call during the conversation. Optional |
+| `tool_choice` | Controls which (if any) tool is called by the model. Values: `""`, `auto`, `required`, or specific tool name. Defaults to `""` if no tools are present, or `auto` if tools are present. Optional |
#### Input body
| Field | Description |
| --------- | ----------- |
-| `content` | The message content to send to the LLM. Required |
-| `role` | The role for the LLM to assume. Possible values: 'user', 'tool', 'assistant' |
-| `scrubPII` | A boolean value to enable obfuscation of sensitive information present in the content field. Set this value if PII for this specific content needs to be scrubbed exclusively. Optional |
+| `messages` | Array of conversation messages. Required |
+| `scrub_pii` | A boolean value to enable obfuscation of sensitive information present in the content field. Optional |
+
+#### Message types
+
+The API supports the following message types, selected through a `oneof` field (a sketch showing how they combine in a request follows this list):
+
+- **`of_developer`**: Developer role messages with optional name and content
+- **`of_system`**: System role messages with optional name and content
+- **`of_user`**: User role messages with optional name and content
+- **`of_assistant`**: Assistant role messages with optional name, content, and tool calls
+- **`of_tool`**: Tool role messages with tool ID, name, and content
+
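+For illustration, the sketch below combines several of these message types in a single input. The `content` shape mirrors the `of_user` example later on this page; the exact field names should be verified against the protos or SDKs:
+
+```json
+{
+  "name": "openai",
+  "inputs": [{
+    "messages": [
+      { "of_system": { "content": [{ "text": "You are a concise assistant." }] } },
+      { "of_user": { "name": "alice", "content": [{ "text": "What is Dapr?" }] } },
+      { "of_assistant": { "content": [{ "text": "Dapr is a distributed application runtime." }] } },
+      { "of_user": { "content": [{ "text": "How does the conversation API help?" }] } }
+    ]
+  }]
+}
+```
+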
+#### Tool calling
+
+Tools can be defined using the `tools` field with function definitions:
+
+| Field | Description |
+| --------- | ----------- |
+| `function.name` | The name of the function to be called. Required |
+| `function.description` | A description of what the function does. Optional |
+| `function.parameters` | JSON Schema object describing the function parameters. Optional |
+
+### Request content examples
-### Request content example
+#### Basic conversation
```json
REQUEST = {
- "inputs": [
- {
- "content": "What is Dapr?",
- "role": "user", // Optional
- "scrubPII": "true", // Optional. Will obfuscate any sensitive information found in the content field
+ "name": "openai",
+ "inputs": [{
+ "messages": [{
+ "of_user": {
+ "content": [{
+ "text": "What is Dapr?"
+ }]
+ }
+ }]
+ }],
+ "parameters": {},
+ "metadata": {}
+}
+```
+
+#### Conversation with tool calling
+
+```json
+{
+ "name": "openai",
+ "inputs": [{
+ "messages": [{
+ "of_user": {
+ "content": [{
+ "text": "What is the weather like in San Francisco in celsius?"
+ }]
+ }
+ }],
+ "scrub_pii": false
+ }],
+ "parameters": {
+ "max_tokens": {
+ "@type": "type.googleapis.com/google.protobuf.Int64Value",
+ "value": "100"
},
- ],
- "cacheTTL": "10m", // Optional
- "scrubPII": "true", // Optional. Will obfuscate any sensitive information returning from the LLM
- "temperature": 0.5 // Optional. Optimizes for consistency (0) or creativity (1)
+ "model": {
+ "@type": "type.googleapis.com/google.protobuf.StringValue",
+ "value": "claude-3-5-sonnet-20240620"
+ }
+ },
+ "metadata": {
+ "api_key": "test-key",
+ "version": "1.0"
+ },
+ "scrub_pii": false,
+ "temperature": 0.7,
+ "tools": [{
+ "function": {
+ "name": "get_weather",
+ "description": "Get the current weather for a location",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "location": {
+ "type": "string",
+ "description": "The city and state, e.g. San Francisco, CA"
+ },
+ "unit": {
+ "type": "string",
+ "enum": ["celsius", "fahrenheit"],
+ "description": "The temperature unit to use"
+ }
+ },
+ "required": ["location"]
+ }
+ }
+ }],
+ "tool_choice": "auto"
}
```
@@ -71,21 +154,63 @@ Code | Description
### Response content
+#### Basic conversation response
+
```json
-RESPONSE = {
- "outputs": {
- {
- "result": "Dapr is distribution application runtime ...",
- "parameters": {},
- },
- {
- "result": "Dapr can help developers ...",
- "parameters": {},
- }
- },
+RESPONSE = {
+ "outputs": [{
+ "choices": [{
+ "finish_reason": "stop",
+ "index": 0,
+ "message": {
+ "content": "Dapr is a distributed application runtime that makes it easy for developers to build resilient, stateless and stateful applications that run on the cloud and edge.",
+ "tool_calls": []
+ }
+ }]
+ }]
+}
+```
+
+#### Tool calling response
+
+```json
+{
+ "outputs": [{
+ "choices": [{
+ "finish_reason": "tool_calls",
+ "index": 0,
+ "message": {
+ "content": null,
+ "tool_calls": [{
+ "id": "call_123",
+ "function": {
+ "name": "get_weather",
+ "arguments": "{\"location\": \"San Francisco, CA\", \"unit\": \"celsius\"}"
+ }
+ }]
+ }
+ }]
+ }]
}
```
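+
+When a response finishes with `tool_calls`, your application executes the requested function and sends the result back in a follow-up request so the LLM can produce a final answer. The sketch below illustrates that follow-up input for the `get_weather` call above; the `of_tool` field names (such as `tool_id`) are inferred from the message type descriptions earlier on this page and should be verified against the protos or SDKs:
+
+```json
+{
+  "name": "openai",
+  "inputs": [{
+    "messages": [
+      {
+        "of_assistant": {
+          "tool_calls": [{
+            "id": "call_123",
+            "function": {
+              "name": "get_weather",
+              "arguments": "{\"location\": \"San Francisco, CA\", \"unit\": \"celsius\"}"
+            }
+          }]
+        }
+      },
+      {
+        "of_tool": {
+          "tool_id": "call_123",
+          "name": "get_weather",
+          "content": [{ "text": "{\"temperature\": 18, \"unit\": \"celsius\"}" }]
+        }
+      }
+    ]
+  }]
+}
+```
+
+Depending on the component, you may also resend the earlier user message so the model keeps the full conversation history, or pass `context_id` to continue an existing chat.
+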
+### Tool choice options
+
+The `tool_choice` parameter controls whether and how the model uses the available tools (a short sketch follows this list):
+
+- **`""`**: The model will not call any tool and instead generates a message (default when no tools are present)
+- **`auto`**: The model can pick between generating a message or calling one or more tools (default when tools are present)
+- **`required`**: The model must call one or more tools
+- **`{tool_name}`**: Forces the model to call a specific tool by name
+
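+For example, to force the model to call the `get_weather` tool from the earlier example, a request could set `tool_choice` to that tool's name (illustrative sketch):
+
+```json
+{
+  "name": "openai",
+  "inputs": [{
+    "messages": [{
+      "of_user": { "content": [{ "text": "What is the weather like in San Francisco?" }] }
+    }]
+  }],
+  "tools": [{
+    "function": {
+      "name": "get_weather",
+      "description": "Get the current weather for a location"
+    }
+  }],
+  "tool_choice": "get_weather"
+}
+```
+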
+## Legacy Alpha1 API
+
+The previous Alpha1 version of the API is still supported for backward compatibility but is deprecated. For new implementations, use the Alpha2 version described above.
+
+```
+POST http://localhost:<daprPort>/v1.0-alpha1/conversation/<llm-name>/converse
+```
+
## Next steps
- [Conversation API overview]({{% ref conversation-overview.md %}})