Let’s take a look at how the Dapr conversation building block makes interacting with Large Language Models (LLMs) easier. In this quickstart, you use the echo component to communicate with the mock LLM and ask it to define Dapr.
You can try out this conversation quickstart by either running all applications at once with the Multi-App Run template file, or running the application and Dapr sidecar manually without the template.
For this example, you will need:
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/python/sdk/conversation
Install the dependencies:
pip3 install -r requirements.txt
Navigate back to the sdk directory and start the conversation service with the following command:
dapr run -f .
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
When you ran dapr init during Dapr install, the dapr.yaml Multi-App Run template file was generated in the .dapr/components directory. Running dapr run -f . in this Quickstart started app.py.
Running the Multi-App Run template file with dapr run -f . starts all applications in your project. This Quickstart has only one application, so the dapr.yaml file contains the following:
version: 1
common:
  resourcesPath: ../../components/
apps:
  - appID: conversation
    appDirPath: ./conversation/
    command: ["python3", "app.py"]
In the conversation/components directory of the quickstart, the conversation.yaml file configures the echo mock LLM component.
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: echo
spec:
  type: conversation.echo
  version: v1
To interface with a real LLM, swap out the mock component with one of the supported conversation components. For example, to use an OpenAI component, see the example in the conversation how-to guide.
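As a sketch of what that swap looks like, a conversation.openai component definition might resemble the following. The key and model values here are placeholders you must replace with your own, and the exact metadata fields should be checked against the conversation component reference:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: openai
spec:
  type: conversation.openai
  version: v1
  metadata:
    # Placeholder: supply your own OpenAI API key
    - name: key
      value: <YOUR_OPENAI_API_KEY>
    # Placeholder: pick the model your account has access to
    - name: model
      value: gpt-4-turbo
```

Because the application code addresses the component by name, you would also update the component name passed to the conversation client (echo in this Quickstart) to match.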
In the app.py application code:
from dapr.clients import DaprClient
from dapr.clients.grpc._request import ConversationInput

with DaprClient() as d:
    inputs = [
        ConversationInput(content="What is dapr?", role='user', scrub_pii=True),
    ]

    metadata = {
        'model': 'modelname',
        'key': 'authKey',
        'cacheTTL': '10m',
    }

    print('Input sent: What is dapr?')

    response = d.converse_alpha1(
        name='echo', inputs=inputs, temperature=0.7, context_id='chat-123', metadata=metadata
    )

    for output in response.outputs:
        print(f'Output response: {output.result}')
For this example, you will need:
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/javascript/http/conversation
Install the dependencies:
npm install
Navigate back to the http directory and start the conversation service with the following command:
dapr run -f .
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
When you ran dapr init during Dapr install, the dapr.yaml Multi-App Run template file was generated in the .dapr/components directory. Running dapr run -f . in this Quickstart started index.js.
Running the Multi-App Run template file with dapr run -f . starts all applications in your project. This Quickstart has only one application, so the dapr.yaml file contains the following:
version: 1
common:
  resourcesPath: ../../components/
apps:
  - appID: conversation
    appDirPath: ./conversation/
    daprHTTPPort: 3502
    command: ["npm", "run", "start"]
In the conversation/components directory of the quickstart, the conversation.yaml file configures the echo mock LLM component.
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: echo
spec:
  type: conversation.echo
  version: v1
To interface with a real LLM, swap out the mock component with one of the supported conversation components. For example, to use an OpenAI component, see the example in the conversation how-to guide.
In the index.js application code:
const conversationComponentName = "echo";

async function main() {
  const daprHost = process.env.DAPR_HOST || "http://localhost";
  const daprHttpPort = process.env.DAPR_HTTP_PORT || "3500";

  const inputBody = {
    name: "echo",
    inputs: [{ message: "What is dapr?" }],
    parameters: {},
    metadata: {},
  };

  const reqURL = `${daprHost}:${daprHttpPort}/v1.0-alpha1/conversation/${conversationComponentName}/converse`;

  try {
    const response = await fetch(reqURL, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
      },
      body: JSON.stringify(inputBody),
    });

    console.log("Input sent: What is dapr?");

    const data = await response.json();
    const result = data.outputs[0].result;
    console.log("Output response:", result);
  } catch (error) {
    console.error("Error:", error.message);
    process.exit(1);
  }
}

main().catch((error) => {
  console.error("Unhandled error:", error);
  process.exit(1);
});
For this example, you will need:
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/csharp/sdk
Start the conversation service with the following command:
dapr run -f .
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
When you ran dapr init during Dapr install, the dapr.yaml Multi-App Run template file was generated in the .dapr/components directory. Running dapr run -f . in this Quickstart started Program.cs.
Running the Multi-App Run template file with dapr run -f . starts all applications in your project. This Quickstart has only one application, so the dapr.yaml file contains the following:
version: 1
common:
  resourcesPath: ../../components/
apps:
  - appDirPath: ./conversation/
    appID: conversation
    daprHTTPPort: 3500
    command: ["dotnet", "run"]
In the conversation/components directory of the quickstart, the conversation.yaml file configures the echo mock LLM component.
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: echo
spec:
  type: conversation.echo
  version: v1
To interface with a real LLM, swap out the mock component with one of the supported conversation components. For example, to use an OpenAI component, see the example in the conversation how-to guide.
In the Program.cs application code:
using Dapr.AI.Conversation;
using Dapr.AI.Conversation.Extensions;

class Program
{
    private const string ConversationComponentName = "echo";

    static async Task Main(string[] args)
    {
        const string prompt = "What is dapr?";

        var builder = WebApplication.CreateBuilder(args);
        builder.Services.AddDaprConversationClient();
        var app = builder.Build();

        // Instantiate the Dapr conversation client
        var conversationClient = app.Services.GetRequiredService<DaprConversationClient>();

        try
        {
            // Send a request to the echo mock LLM component
            var response = await conversationClient.ConverseAsync(ConversationComponentName, [new(prompt, DaprConversationRole.Generic)]);
            Console.WriteLine("Input sent: " + prompt);

            if (response != null)
            {
                Console.Write("Output response:");
                foreach (var resp in response.Outputs)
                {
                    Console.WriteLine($" {resp.Result}");
                }
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine("Error: " + ex.Message);
        }
    }
}
For this example, you will need:
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/go/sdk
Start the conversation service with the following command:
dapr run -f .
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
When you ran dapr init during Dapr install, the dapr.yaml Multi-App Run template file was generated in the .dapr/components directory. Running dapr run -f . in this Quickstart started conversation.go.
Running the Multi-App Run template file with dapr run -f . starts all applications in your project. This Quickstart has only one application, so the dapr.yaml file contains the following:
version: 1
common:
  resourcesPath: ../../components/
apps:
  - appDirPath: ./conversation/
    appID: conversation
    daprHTTPPort: 3501
    command: ["go", "run", "."]
In the conversation/components directory of the quickstart, the conversation.yaml file configures the echo mock LLM component.
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
name: echo
spec:
type: conversation.echo
version: v1
To interface with a real LLM, swap out the mock component with one of the supported conversation components. For example, to use an OpenAI component, see the example in the conversation how-to guide.
In the conversation.go application code:
package main

import (
	"context"
	"fmt"
	"log"

	dapr "github.com/dapr/go-sdk/client"
)

func main() {
	client, err := dapr.NewClient()
	if err != nil {
		panic(err)
	}

	input := dapr.ConversationInput{
		Message: "What is dapr?",
		// Role:     nil, // Optional
		// ScrubPII: nil, // Optional
	}

	fmt.Println("Input sent:", input.Message)

	var conversationComponent = "echo"

	request := dapr.NewConversationRequest(conversationComponent, []dapr.ConversationInput{input})

	resp, err := client.ConverseAlpha1(context.Background(), request)
	if err != nil {
		log.Fatalf("err: %v", err)
	}

	fmt.Println("Output response:", resp.Outputs[0].Result)
}
For this example, you will need:
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/python/sdk/conversation
Install the dependencies:
pip3 install -r requirements.txt
Navigate back to the sdk directory and start the conversation service with the following command:
dapr run --app-id conversation --resources-path ../../../components -- python3 app.py
Note: Since python3.exe is not defined on Windows, you may need to use python app.py instead of python3 app.py.
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
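The sections above call the conversation API either through a language SDK or, in the JavaScript example, by posting to the sidecar's v1.0-alpha1 HTTP endpoint with fetch. The same HTTP endpoint can be exercised from Python with only the standard library. This is a sketch, not part of the quickstart: build_converse_request is a hypothetical helper name, and the main block assumes a Dapr sidecar listening on localhost:3500.

```python
import json
import urllib.request

def build_converse_request(host, port, component, prompt):
    # Mirrors the request shape used in the JavaScript HTTP example:
    # POST {host}:{port}/v1.0-alpha1/conversation/{component}/converse
    url = f"{host}:{port}/v1.0-alpha1/conversation/{component}/converse"
    body = {
        "name": component,
        "inputs": [{"message": prompt}],
        "parameters": {},
        "metadata": {},
    }
    return url, json.dumps(body).encode("utf-8")

if __name__ == "__main__":
    # Assumes a Dapr sidecar on localhost:3500 with the echo component loaded.
    url, payload = build_converse_request("http://localhost", 3500, "echo", "What is dapr?")
    req = urllib.request.Request(url, data=payload, headers={"Content-Type": "application/json"})
    print("Input sent: What is dapr?")
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
        print("Output response:", data["outputs"][0]["result"])
```

With the echo component, the printed output response simply repeats the input, just as in the SDK examples.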
For this example, you will need:
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/javascript/http/conversation
Install the dependencies:
npm install
Navigate back to the http directory and start the conversation service with the following command:
dapr run --app-id conversation --resources-path ../../../components/ -- npm run start
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
For this example, you will need:
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/csharp/sdk/conversation
Install the dependencies:
dotnet build
Start the conversation service with the following command:
dapr run --app-id conversation --resources-path ../../../components/ -- dotnet run
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
For this example, you will need:
Clone the sample provided in the Quickstarts repo.
git clone https://github.com/dapr/quickstarts.git
From the root of the Quickstarts directory, navigate into the conversation directory:
cd conversation/go/sdk/conversation
Install the dependencies:
go build .
Start the conversation service with the following command:
dapr run --app-id conversation --resources-path ../../../components/ -- go run .
Expected output
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
Watch the demo presented during Diagrid’s Dapr v1.15 celebration to see how the conversation API works using the .NET SDK.
We’re continuously working to improve our Quickstart examples and value your feedback. Did you find this Quickstart helpful? Do you have suggestions for improvement?
Join the discussion in our Discord channel.