Dapr's conversation API reduces the complexity of securely and reliably interacting with large language models (LLMs) at scale.

<img src="/images/conversation-overview.png" width=800 alt="Diagram showing the flow of a user's app communicating with Dapr's LLM components.">

In addition to enabling critical performance and security functionality (like [prompt caching]({{% ref "#prompt-caching" %}}) and [PII scrubbing]({{% ref "#personally-identifiable-information-pii-obfuscation" %}})), the conversation API also provides:

- **Tool calling capabilities** that allow LLMs to interact with external functions and APIs, enabling more sophisticated AI applications
- **OpenAI-compatible interface** for seamless integration with existing AI workflows and tools

You can also pair the conversation API with Dapr functionalities, like:
- Resiliency circuit breakers and retries to work around rate limit and token errors, or
- Middleware to authenticate requests coming to and from the LLM

The PII scrubber obfuscates the following user information:
- SHA-256 hex
- MD5 hex

### Tool calling support

The conversation API supports advanced tool calling capabilities that allow LLMs to interact with external functions and APIs. This enables you to build sophisticated AI applications that can:

- Execute custom functions based on user requests
- Integrate with external services and databases
- Provide dynamic, context-aware responses
- Create multi-step workflows and automation

Tool calling follows OpenAI's [function calling format](https://platform.openai.com/docs/guides/function-calling), making it easy to integrate with existing AI development workflows and tools.
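
As a sketch of what that format looks like, a tool is a function description plus a JSON Schema for its parameters. The `get_weather` name, description, and fields below are illustrative examples, not part of the API:

```python
# Illustrative sketch of a tool definition in OpenAI's function calling format.
# The function name, description, and parameters are hypothetical examples.
get_weather_tool = {
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a location",
        "parameters": {  # JSON Schema describing the expected arguments
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                }
            },
            "required": ["location"],
        },
    }
}
```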
## Demo

Watch the demo presented during [Diagrid's Dapr v1.15 celebration](https://www.diagrid.io/videos/dapr-1-15-deep-dive) to see how the conversation API works using the .NET SDK.
daprdocs/content/en/reference/api/conversation_api.md
The conversation API is currently in [alpha]({{% ref "certification-lifecycle.md#certification-levels" %}}).
{{% /alert %}}

Dapr provides an API to interact with Large Language Models (LLMs) and enables critical performance and security functionality with features like prompt caching, PII data obfuscation, and tool calling. Tool calling follows OpenAI's [function calling format](https://platform.openai.com/docs/guides/function-calling), making it easy to integrate with existing AI development workflows and tools.
## Converse

This endpoint lets you converse with LLMs using the Alpha2 version of the API, which provides enhanced tool calling support and alignment with OpenAI's interface.

```
POST http://localhost:<daprPort>/v1.0-alpha2/conversation/<llm-name>/converse
```

### URL parameters

| Field | Description |
| --------- | ----------- |
| `name` | The name of the conversation component. Required |
| `context_id` | The ID of an existing chat (like in ChatGPT). Optional |
| `inputs` | Inputs for the conversation. Multiple inputs at one time are supported. Required |
| `cacheTTL` | A time-to-live value after which a prompt cache expires. Uses Golang duration format. Optional |
| `parameters` | Parameters for all custom fields. Optional |
| `metadata` | [Metadata](#metadata) passed to conversation components. Optional |
| `scrub_pii` | A boolean value to enable obfuscation of sensitive information returning from the LLM. Set this value if all PII (across contents) in the request needs to be scrubbed. Optional |
| `temperature` | A float value to control the temperature of the model. Used to optimize for consistency (0) or creativity (1). Optional |
| `tools` | Registers the tools available to be used by the LLM during the conversation. Optional |
| `tool_choice` | Controls which (if any) tool is called by the model. Values: `""`, `auto`, `required`, or a specific tool name. Defaults to `""` if no tools are present, or `auto` if tools are present. Optional |

#### Input body

| Field | Description |
| --------- | ----------- |
| `messages` | Array of conversation messages. Required |
| `scrub_pii` | A boolean value to enable obfuscation of sensitive information present in the content field. Set this value if PII for this specific input needs to be scrubbed exclusively. Optional |
#### Message types

The API supports different message types through the `oneof` field:

- **`of_developer`**: Developer role messages with optional name and content
- **`of_system`**: System role messages with optional name and content
- **`of_user`**: User role messages with optional name and content
- **`of_assistant`**: Assistant role messages with optional name, content, and tool calls
- **`of_tool`**: Tool role messages with tool ID, name, and content
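
As an illustrative sketch (the message texts are placeholders), a `messages` array mixing a system and a user message could be assembled like this:

```python
# Hypothetical `messages` array combining two of the message types above.
messages = [
    {
        "of_system": {
            "content": [{"text": "You are a concise assistant."}]  # placeholder prompt
        }
    },
    {
        "of_user": {
            "content": [{"text": "What is Dapr?"}]
        }
    },
]
```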

#### Tool calling

Tools can be defined using the `tools` field with function definitions:

| Field | Description |
| --------- | ----------- |
| `function.name` | The name of the function to be called. Required |
| `function.description` | A description of what the function does. Optional |
| `function.parameters` | JSON Schema object describing the function parameters. Optional |

### Request content examples

#### Basic conversation

```json
{
  "name": "openai",
  "inputs": [{
    "messages": [{
      "of_user": {
        "content": [{
          "text": "What is Dapr?"
        }]
      }
    }]
  }],
  "parameters": {},
  "metadata": {}
}
```

#### Conversation with tool calling

```json
{
"name": "openai",
"inputs": [{
"messages": [{
"of_user": {
"content": [{
"text": "What is the weather like in San Francisco in celsius?"
}]
}
}],
"scrub_pii": false
}],
"parameters": {
"max_tokens": {
"@type": "type.googleapis.com/google.protobuf.Int64Value",
"value": "100"
},
"model": {
"@type": "type.googleapis.com/google.protobuf.StringValue",
"value": "claude-3-5-sonnet-20240620"
}
},
"metadata": {
"api_key": "test-key",
"version": "1.0"
},
"scrub_pii": false,
"temperature": 0.7,
"tools": [{
"function": {
"name": "get_weather",
"description": "Get the current weather for a location",
"parameters": {
"type": "object",
"properties": {
"location": {
"type": "string",
"description": "The city and state, e.g. San Francisco, CA"
},
"unit": {
"type": "string",
"enum": ["celsius", "fahrenheit"],
"description": "The temperature unit to use"
}
},
"required": ["location"]
}
}
}],
"tool_choice": "auto"
}
```
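
As a minimal sketch of sending this request from application code — assuming the default Dapr HTTP port 3500 and a conversation component named `openai`, both of which may differ in your setup — the payload can be built and posted like this:

```python
import json
import urllib.request

DAPR_PORT = 3500      # assumption: default Dapr HTTP port
COMPONENT = "openai"  # assumption: the name of your conversation component
URL = f"http://localhost:{DAPR_PORT}/v1.0-alpha2/conversation/{COMPONENT}/converse"

# Request body mirroring the tool calling example above.
payload = {
    "name": COMPONENT,
    "inputs": [{
        "messages": [{
            "of_user": {
                "content": [{"text": "What is the weather like in San Francisco in celsius?"}]
            }
        }]
    }],
    "tools": [{
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a location",
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        }
    }],
    "tool_choice": "auto",
}

def converse(body: dict) -> dict:
    """POST the body to the Dapr sidecar and return the decoded JSON response."""
    req = urllib.request.Request(
        URL,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Calling `converse(payload)` requires a running sidecar; the function is shown only to illustrate the shape of the call.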

Code | Description

### Response content

#### Basic conversation response

```json
{
"outputs": [{
"choices": [{
"finish_reason": "stop",
"index": 0,
"message": {
"content": "Dapr is a distributed application runtime that makes it easy for developers to build resilient, stateless and stateful applications that run on the cloud and edge.",
"tool_calls": []
}
}]
}]
}
```

#### Tool calling response

```json
{
"outputs": [{
"choices": [{
"finish_reason": "tool_calls",
"index": 0,
"message": {
"content": null,
"tool_calls": [{
"id": "call_123",
"function": {
"name": "get_weather",
"arguments": "{\"location\": \"San Francisco, CA\", \"unit\": \"celsius\"}"
}
}]
}
}]
}]
}
```
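
To complete the round trip, the returned tool calls can be executed locally and their results sent back as tool messages in a follow-up request. This is a hedged sketch: the `get_weather` implementation is hypothetical, and the `tool_id` field name for the tool message is an assumption based on the message types above:

```python
import json

def get_weather(location: str, unit: str = "celsius") -> str:
    # Hypothetical local implementation of the registered `get_weather` tool.
    return f"18 degrees {unit} in {location}"

LOCAL_TOOLS = {"get_weather": get_weather}

def tool_messages(response: dict) -> list:
    """Execute each tool call in a response and wrap the results as tool messages."""
    follow_ups = []
    for output in response["outputs"]:
        for choice in output["choices"]:
            for call in choice["message"]["tool_calls"]:
                fn = call["function"]
                args = json.loads(fn["arguments"])    # arguments arrive as a JSON string
                result = LOCAL_TOOLS[fn["name"]](**args)
                follow_ups.append({
                    "of_tool": {
                        "tool_id": call["id"],        # assumed field name for the call ID
                        "name": fn["name"],
                        "content": [{"text": result}],
                    }
                })
    return follow_ups
```

The returned messages would be appended to the next request's `inputs[].messages` so the LLM can produce its final answer from the tool results.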

### Tool choice options

The `tool_choice` parameter controls how the model uses available tools:

- **`""`**: The model will not call any tool and instead generates a message (default when no tools are present)
- **`auto`**: The model can pick between generating a message or calling one or more tools (default when tools are present)
- **`required`**: Requires one or more functions to be called
- **`{tool_name}`**: Forces the model to call a specific tool by name

## Legacy Alpha1 API

The previous Alpha1 version of the API is still supported for backward compatibility but is deprecated. For new implementations, use the Alpha2 version described above.

```
POST http://localhost:<daprPort>/v1.0-alpha1/conversation/<llm-name>/converse
```

## Next steps

- [Conversation API overview]({{% ref conversation-overview.md %}})