
Conversation component specs

The supported conversation components that interface with Dapr

Table headers to note:

| Header | Description | Example |
| ------ | ----------- | ------- |
| Status | Component certification status | Alpha, Beta, Stable |
| Component version | The version of the component | v1 |
| Since runtime version | The version of the Dapr runtime when the component status was set or updated | 1.11 |

Amazon Web Services (AWS)

| Component | Status | Component version | Since runtime version |
| --------- | ------ | ----------------- | --------------------- |
| AWS Bedrock | Alpha | v1 | 1.15 |

Generic

| Component | Status | Component version | Since runtime version |
| --------- | ------ | ----------------- | --------------------- |
| Anthropic | Alpha | v1 | 1.15 |
| DeepSeek | Alpha | v1 | 1.15 |
| GoogleAI | Alpha | v1 | 1.16 |
| Huggingface | Alpha | v1 | 1.15 |
| Local echo | Stable | v1 | 1.15 |
| Mistral | Alpha | v1 | 1.15 |
| Ollama | Alpha | v1 | 1.16 |
| OpenAI | Alpha | v1 | 1.15 |

1 - Anthropic

Detailed information on the Anthropic conversation component

Component format

A Dapr conversation.yaml component file has the following structure:

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: anthropic
spec:
  type: conversation.anthropic
  version: v1
  metadata:
  - name: key
    value: "mykey"
  - name: model
    value: claude-3-5-sonnet-20240620
  - name: cacheTTL
    value: 10m

Spec metadata fields

| Field | Required | Details | Example |
| ----- | -------- | ------- | ------- |
| key | Y | API key for Anthropic. | "mykey" |
| model | N | The Anthropic LLM to use. Defaults to claude-3-5-sonnet-20240620. | claude-3-5-sonnet-20240620 |
| cacheTTL | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | 10m |
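
The examples on this page pass the API key as a plain string for brevity. In production, the key can be resolved from a Dapr secret store instead. The following is a minimal sketch, assuming a secret store component named mysecretstore that holds a secret anthropic-secret with the key api-key (all three names are hypothetical):

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: anthropic
spec:
  type: conversation.anthropic
  version: v1
  metadata:
  - name: key
    secretKeyRef:             # resolve the API key from a secret store instead of inlining it
      name: anthropic-secret  # hypothetical secret name
      key: api-key            # hypothetical key within that secret
  - name: model
    value: claude-3-5-sonnet-20240620
auth:
  secretStore: mysecretstore  # hypothetical secret store component name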

2 - AWS Bedrock

Detailed information on the AWS Bedrock conversation component

Component format

A Dapr conversation.yaml component file has the following structure:

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: awsbedrock
spec:
  type: conversation.aws.bedrock
  version: v1
  metadata:
  - name: endpoint
    value: "http://localhost:4566"
  - name: model
    value: amazon.titan-text-express-v1
  - name: cacheTTL
    value: 10m

Spec metadata fields

| Field | Required | Details | Example |
| ----- | -------- | ------- | ------- |
| endpoint | N | AWS endpoint for the component to use and connect to emulators. Not recommended for production AWS use. | http://localhost:4566 |
| model | N | The LLM to use. Defaults to Bedrock's default provider model from Amazon. | amazon.titan-text-express-v1 |
| cacheTTL | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | 10m |

Authenticating AWS

Instead of using a key metadata field, AWS Bedrock authenticates with Dapr's standard AWS authentication methods: IAM roles or static credentials. Learn more about authenticating with AWS.
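
For local development, static credentials can be supplied through the component metadata. A minimal sketch, assuming the component honors the common Dapr AWS authentication fields region, accessKey, and secretKey (all values below are placeholders):

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: awsbedrock
spec:
  type: conversation.aws.bedrock
  version: v1
  metadata:
  - name: region               # AWS region hosting the Bedrock model
    value: "us-east-1"
  - name: accessKey            # static credentials; prefer IAM roles or a secret store in production
    value: "AKIAIOSFODNN7EXAMPLE"
  - name: secretKey
    value: "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
  - name: model
    value: amazon.titan-text-express-v1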

3 - DeepSeek

Detailed information on the DeepSeek conversation component

Component format

A Dapr conversation.yaml component file has the following structure:

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: deepseek
spec:
  type: conversation.deepseek
  version: v1
  metadata:
  - name: key
    value: mykey
  - name: maxTokens
    value: 2048

Spec metadata fields

| Field | Required | Details | Example |
| ----- | -------- | ------- | ------- |
| key | Y | API key for DeepSeek. | mykey |
| maxTokens | N | The maximum number of tokens for each request. | 2048 |

4 - Local Testing

Detailed information on the echo conversation component used for local testing

Component format

A Dapr conversation.yaml component file has the following structure:

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: echo
spec:
  type: conversation.echo
  version: v1

5 - GoogleAI

Detailed information on the GoogleAI conversation component

Component format

A Dapr conversation.yaml component file has the following structure:

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: googleai
spec:
  type: conversation.googleai
  version: v1
  metadata:
  - name: key
    value: mykey
  - name: model
    value: gemini-1.5-flash
  - name: cacheTTL
    value: 10m

Spec metadata fields

| Field | Required | Details | Example |
| ----- | -------- | ------- | ------- |
| key | Y | API key for GoogleAI. | mykey |
| model | N | The GoogleAI LLM to use. Defaults to gemini-1.5-flash. | gemini-2.0-flash |
| cacheTTL | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | 10m |

6 - Huggingface

Detailed information on the Huggingface conversation component

Component format

A Dapr conversation.yaml component file has the following structure:

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: huggingface
spec:
  type: conversation.huggingface
  version: v1
  metadata:
  - name: key
    value: mykey
  - name: model
    value: meta-llama/Meta-Llama-3-8B
  - name: cacheTTL
    value: 10m

Spec metadata fields

| Field | Required | Details | Example |
| ----- | -------- | ------- | ------- |
| key | Y | API key for Huggingface. | mykey |
| model | N | The Huggingface LLM to use. Defaults to meta-llama/Meta-Llama-3-8B. | meta-llama/Meta-Llama-3-8B |
| cacheTTL | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | 10m |

7 - Mistral

Detailed information on the Mistral conversation component

Component format

A Dapr conversation.yaml component file has the following structure:

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: mistral
spec:
  type: conversation.mistral
  version: v1
  metadata:
  - name: key
    value: mykey
  - name: model
    value: open-mistral-7b
  - name: cacheTTL
    value: 10m

Spec metadata fields

| Field | Required | Details | Example |
| ----- | -------- | ------- | ------- |
| key | Y | API key for Mistral. | mykey |
| model | N | The Mistral LLM to use. Defaults to open-mistral-7b. | open-mistral-7b |
| cacheTTL | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | 10m |

8 - Ollama

Detailed information on the Ollama conversation component

Component format

A Dapr conversation.yaml component file has the following structure:

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: ollama
spec:
  type: conversation.ollama
  version: v1
  metadata:
  - name: model
    value: llama3.2:latest
  - name: cacheTTL
    value: 10m

Spec metadata fields

| Field | Required | Details | Example |
| ----- | -------- | ------- | ------- |
| model | N | The Ollama LLM to use. Defaults to llama3.2:latest. | phi4:latest |
| cacheTTL | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | 10m |

OpenAI Compatibility

Ollama exposes an OpenAI-compatible API, so you can use the OpenAI component with Ollama models by making the following changes:

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: ollama-openai
spec:
  type: conversation.openai # use the openai component type
  version: v1
  metadata:
  - name: key
    value: 'ollama' # any non-empty placeholder; Ollama ignores the API key
  - name: model
    value: gpt-oss:20b # any Ollama model (https://ollama.com/search); here OpenAI's open-weight gpt-oss model, see https://ollama.com/library/gpt-oss
  - name: endpoint
    value: 'http://localhost:11434/v1' # Ollama's OpenAI-compatible endpoint

9 - OpenAI

Detailed information on the OpenAI conversation component

Component format

A Dapr conversation.yaml component file has the following structure:

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: openai
spec:
  type: conversation.openai
  version: v1
  metadata:
  - name: key
    value: mykey
  - name: model
    value: gpt-4-turbo
  - name: endpoint
    value: 'https://api.openai.com/v1'
  - name: cacheTTL
    value: 10m
  # - name: apiType # Optional
  #   value: 'azure'
  # - name: apiVersion # Optional
  #   value: '2025-01-01-preview'

Spec metadata fields

| Field | Required | Details | Example |
| ----- | -------- | ------- | ------- |
| key | Y | API key for OpenAI. | mykey |
| model | N | The OpenAI LLM to use. Defaults to gpt-4-turbo. | gpt-4-turbo |
| endpoint | N | Custom API endpoint URL for OpenAI API-compatible services. If not specified, the default OpenAI API endpoint is used. Required when apiType is set to azure. | https://api.openai.com/v1, https://example.openai.azure.com/ |
| cacheTTL | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | 10m |
| apiType | N | Specifies the API provider type. Required when using a provider that does not follow the default OpenAI API endpoint conventions. | azure |
| apiVersion | N | The API version to use. Required when apiType is set to azure. | 2025-04-01-preview |
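
When targeting Azure OpenAI, the endpoint, apiType, and apiVersion fields are used together, as noted in the table above. A minimal sketch, assuming a hypothetical Azure OpenAI resource at https://example.openai.azure.com/ with a deployment named gpt-4-turbo:

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: openai-azure
spec:
  type: conversation.openai
  version: v1
  metadata:
  - name: key
    value: mykey                                # Azure OpenAI API key
  - name: model
    value: gpt-4-turbo                          # Azure deployment/model name
  - name: endpoint
    value: 'https://example.openai.azure.com/'  # required when apiType is azure
  - name: apiType
    value: 'azure'
  - name: apiVersion
    value: '2025-01-01-preview'                 # required when apiType is azure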