Dapr Software Development Kits (SDKs)

Use your favorite languages with Dapr

The Dapr SDKs are the easiest way for you to get Dapr into your application. Choose your favorite language and get up and running with Dapr in minutes.

SDK packages

Select your preferred language below to learn more about client, server, actor, and workflow packages.

  • Client: The Dapr client allows you to invoke Dapr building block APIs and perform each building block’s actions
  • Server extensions: The Dapr service extensions allow you to create services that can be invoked by other services and subscribe to topics
  • Actor: The Dapr Actor SDK allows you to build virtual actors with methods, state, timers, and persistent reminders
  • Workflow: Dapr Workflow makes it easy for you to write long running business logic and integrations in a reliable way

SDK languages

| Language   | Status         | Server extensions      |
|------------|----------------|------------------------|
| .NET       | Stable         | ASP.NET Core           |
| Python     | Stable         | gRPC, FastAPI, Flask   |
| Java       | Stable         | Spring Boot, Quarkus   |
| Go         | Stable         |                        |
| PHP        | Stable         |                        |
| JavaScript | Stable         |                        |
| C++        | In development |                        |
| Rust       | In development |                        |

Further reading

1 - Dapr .NET SDK

.NET SDK packages for developing Dapr applications

Dapr offers a variety of packages to help with the development of .NET applications. Using them you can create .NET clients, servers, and virtual actors with Dapr.

Prerequisites

Installation

To get started with the Client .NET SDK, install the Dapr .NET SDK package:

dotnet add package Dapr.Client

Try it out

Put the Dapr .NET SDK to the test. Walk through the .NET quickstarts and tutorials to see Dapr in action:

| SDK samples      | Description                                                                          |
|------------------|--------------------------------------------------------------------------------------|
| Quickstarts      | Experience Dapr’s API building blocks in just a few minutes using the .NET SDK.      |
| SDK samples      | Clone the SDK repo to try out some examples and get started.                         |
| Pub/sub tutorial | See how the Dapr .NET SDK works alongside other Dapr SDKs to enable pub/sub applications. |

Available packages

Client

Create .NET clients that interact with a Dapr sidecar and other Dapr applications.

Server

Write servers and services in .NET using the Dapr SDK. Includes support for ASP.NET.

Actors

Create virtual actors with state, reminders/timers, and methods in .NET.

Workflow

Create and manage workflows that work with other Dapr APIs in .NET.

Jobs

Create and manage the scheduling and orchestration of jobs in .NET.

AI

Create and manage AI operations in .NET.

More information

Learn more about local development options, best practices, or browse NuGet packages to add to your existing .NET applications.

Development

Learn about local development integration options

Best Practices

Learn about best practices for developing .NET Dapr applications

NuGet packages

NuGet packages for adding Dapr to your .NET applications.


1.1 - Getting started with the Dapr client .NET SDK

How to get up and running with the Dapr .NET SDK

The Dapr client package allows you to interact with other Dapr applications from a .NET application.

Building blocks

The .NET SDK allows you to interface with all of the Dapr building blocks.

Invoke a service

HTTP

You can use either the DaprClient or System.Net.Http.HttpClient to invoke your services.

Using DaprClient:

using var client = new DaprClientBuilder()
    .UseTimeout(TimeSpan.FromSeconds(2)) // Optionally, set a timeout
    .Build();

// Invokes a POST method named "deposit" that takes input of type "Transaction"
var data = new { id = "17", amount = 99m };
var account = await client.InvokeMethodAsync<Account>("routing", "deposit", data, cancellationToken);
Console.WriteLine("Returned: id:{0} | Balance:{1}", account.Id, account.Balance);

Using HttpClient:

var client = DaprClient.CreateInvokeHttpClient(appId: "routing");

// To set a timeout on the HTTP client:
client.Timeout = TimeSpan.FromSeconds(2);

var deposit = new Transaction { Id = "17", Amount = 99m };
var response = await client.PostAsJsonAsync("/deposit", deposit, cancellationToken);
var account = await response.Content.ReadFromJsonAsync<Account>(cancellationToken: cancellationToken);
Console.WriteLine("Returned: id:{0} | Balance:{1}", account.Id, account.Balance);

gRPC

You can use the DaprClient to invoke your services over gRPC.

using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(20));
var invoker = DaprClient.CreateInvocationInvoker(appId: myAppId, daprEndpoint: serviceEndpoint);
var client = new MyService.MyServiceClient(invoker);

var options = new CallOptions(cancellationToken: cts.Token, deadline: DateTime.UtcNow.AddSeconds(1));

try
{
    await client.MyMethodAsync(new Empty(), options);
}
catch (RpcException ex)
{
    // The call fails once the one-second deadline has elapsed
    Assert.Equal(StatusCode.DeadlineExceeded, ex.StatusCode);
}

Save & get application state

var client = new DaprClientBuilder().Build();

var state = new Widget() { Size = "small", Color = "yellow", };
await client.SaveStateAsync(storeName, stateKeyName, state, cancellationToken: cancellationToken);
Console.WriteLine("Saved State!");

state = await client.GetStateAsync<Widget>(storeName, stateKeyName, cancellationToken: cancellationToken);
Console.WriteLine($"Got State: {state.Size} {state.Color}");

await client.DeleteStateAsync(storeName, stateKeyName, cancellationToken: cancellationToken);
Console.WriteLine("Deleted State!");

Query State (Alpha)

var query = "{" +
                "\"filter\": {" +
                    "\"EQ\": { \"value.Id\": \"1\" }" +
                "}," +
                "\"sort\": [" +
                    "{" +
                        "\"key\": \"value.Balance\"," +
                        "\"order\": \"DESC\"" +
                    "}" +
                "]" +
            "}";

var client = new DaprClientBuilder().Build();
var queryResponse = await client.QueryStateAsync<Account>("querystore", query, cancellationToken: cancellationToken);

Console.WriteLine($"Got {queryResponse.Results.Count}");
foreach (var account in queryResponse.Results)
{
    Console.WriteLine($"Account: {account.Data.Id} has {account.Data.Balance}");
}

Publish messages

var client = new DaprClientBuilder().Build();

var eventData = new { Id = "17", Amount = 10m, };
await client.PublishEventAsync(pubsubName, "deposit", eventData, cancellationToken);
Console.WriteLine("Published deposit event!");

Interact with output bindings

using var client = new DaprClientBuilder().Build();

// Example payload for the Twilio SendGrid binding
var email = new 
{
    metadata = new 
    {
        emailTo = "customer@example.com",
        subject = "An email from Dapr SendGrid binding",    
    }, 
    data =  "<h1>Testing Dapr Bindings</h1>This is a test.<br>Bye!",
};
await client.InvokeBindingAsync("send-email", "create", email);

Retrieve secrets

var client = new DaprClientBuilder().Build();

// Retrieve a key-value-pair-based secret - returns a Dictionary<string, string>
var secrets = await client.GetSecretAsync("mysecretstore", "key-value-pair-secret");
Console.WriteLine($"Got secret keys: {string.Join(", ", secrets.Keys)}");

// Retrieve a single-valued secret - returns a Dictionary<string, string>
// containing a single value with the secret name as the key
var data = await client.GetSecretAsync("mysecretstore", "single-value-secret");
var value = data["single-value-secret"];
Console.WriteLine("Got a secret value, I'm not going to print it, it's a secret!");

Get Configuration Keys

var client = new DaprClientBuilder().Build();

// Retrieve a specific set of keys.
var specificItems = await client.GetConfiguration("configstore", new List<string>() { "key1", "key2" });
Console.WriteLine($"Here are my values:\n{specificItems[0].Key} -> {specificItems[0].Value}\n{specificItems[1].Key} -> {specificItems[1].Value}");

// Retrieve all configuration items by providing an empty list.
var configItems = await client.GetConfiguration("configstore", new List<string>());
Console.WriteLine($"I got {configItems.Count} entries!");
foreach (var item in configItems)
{
    Console.WriteLine($"{item.Key} -> {item.Value}");
}

Subscribe to Configuration Keys

var client = new DaprClientBuilder().Build();

// The Subscribe Configuration API returns a wrapper around an IAsyncEnumerable<IEnumerable<ConfigurationItem>>.
// Iterate through it by accessing its Source in a foreach loop. The loop will end when the stream is severed
// or if the cancellation token is cancelled.
var subscribeConfigurationResponse = await client.SubscribeConfiguration(store, keys, metadata, cts.Token);
await foreach (var items in subscribeConfigurationResponse.Source.WithCancellation(cts.Token))
{
    foreach (var item in items)
    {
        Console.WriteLine($"{item.Key} -> {item.Value}");
    }
}

Distributed lock (Alpha)

Acquire a lock

using System;
using System.Threading.Tasks;
using Dapr.Client;

namespace LockService
{
    class Program
    {
        [Obsolete("Distributed Lock API is in Alpha, this can be removed once it is stable.")]
        static async Task Main(string[] args)
        {
            var daprLockName = "lockstore";
            var fileName = "my_file_name";
            var client = new DaprClientBuilder().Build();
     
            // Locking with this approach will also unlock it automatically, as this is a disposable object
            await using (var fileLock = await client.Lock(daprLockName, fileName, "random_id_abc123", 60))
            {
                if (fileLock.Success)
                {
                    Console.WriteLine("Success");
                }
                else
                {
                    Console.WriteLine($"Failed to lock {fileName}.");
                }
            }
        }
    }
}

Unlock an existing lock

using System;
using System.Threading.Tasks;
using Dapr.Client;

namespace LockService
{
    class Program
    {
        static async Task Main(string[] args)
        {
            var daprLockName = "lockstore";
            var client = new DaprClientBuilder().Build();

            var response = await client.Unlock(daprLockName, "my_file_name", "random_id_abc123");
            Console.WriteLine(response.status);
        }
    }
}

Sidecar APIs

Sidecar Health

The .NET SDK provides a way to poll for the sidecar health, as well as a convenience method to wait for the sidecar to be ready.

Poll for health

This health endpoint returns true when both the sidecar and your application are up (fully initialized).

var client = new DaprClientBuilder().Build();

var isDaprReady = await client.CheckHealthAsync();

if (isDaprReady) 
{
    // Execute Dapr dependent code.
}

Poll for health (outbound)

This health endpoint returns true when Dapr has initialized all its components, but may not have finished setting up a communication channel with your application.

This is best used when you want to utilize a Dapr component in your startup path, for instance, loading secrets from a secretstore.

var client = new DaprClientBuilder().Build();

var isDaprComponentsReady = await client.CheckOutboundHealthAsync();

if (isDaprComponentsReady) 
{
    // Execute Dapr component dependent code.
}

Wait for sidecar

The DaprClient also provides a helper method to wait for the sidecar to become healthy (components only). When using this method, it is recommended to include a CancellationToken to allow the request to time out. Below is an example of how this is used in the DaprSecretStoreConfigurationProvider.

// Wait for the Dapr sidecar to report healthy before attempting to use Dapr components.
using (var tokenSource = new CancellationTokenSource(sidecarWaitTimeout))
{
    await client.WaitForSidecarAsync(tokenSource.Token);
}

// Perform Dapr component operations here i.e. fetching secrets.

Shutdown the sidecar

var client = new DaprClientBuilder().Build();
await client.ShutdownSidecarAsync();

1.1.1 - DaprClient usage

Essential tips and advice for using DaprClient

Lifetime management

A DaprClient holds access to networking resources in the form of TCP sockets used to communicate with the Dapr sidecar. DaprClient implements IDisposable to support eager cleanup of resources.
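For short-lived scenarios, a minimal sketch of eager cleanup with a using declaration (the store and key names are illustrative, and a running Dapr sidecar is assumed):

```csharp
using System.Threading.Tasks;
using Dapr.Client;

class DisposalExample
{
    public static async Task RunAsync()
    {
        // The using declaration disposes the client, releasing its
        // underlying networking resources when the scope ends.
        using var client = new DaprClientBuilder().Build();
        await client.SaveStateAsync("statestore", "mykey", "myvalue");
    } // client.Dispose() runs here
}
```

In most applications you should not dispose the client after every operation; prefer a single long-lived instance as described below under Manual Instantiation.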

Dependency Injection

The AddDaprClient() method registers the Dapr client with ASP.NET Core dependency injection. This method accepts an optional options delegate for configuring the DaprClient and a ServiceLifetime argument, allowing you to specify a different lifetime for the registered resources instead of the default Singleton value.

The following example assumes all default values are acceptable and is sufficient to register the DaprClient.

services.AddDaprClient();

The optional configuration delegates are used to configure DaprClient by specifying options on the provided DaprClientBuilder as in the following example:

services.AddDaprClient(daprBuilder => {
    daprBuilder.UseJsonSerializerOptions(new JsonSerializerOptions {
            WriteIndented = true,
            MaxDepth = 8
        });
    daprBuilder.UseTimeout(TimeSpan.FromSeconds(30));
});

Another optional configuration delegate overload provides access to both the DaprClientBuilder and an IServiceProvider, allowing for more advanced configurations that may require injecting services from the dependency injection container.

services.AddSingleton<SampleService>();
services.AddDaprClient((serviceProvider, daprBuilder) => {
    var sampleService = serviceProvider.GetRequiredService<SampleService>();
    var timeoutValue = sampleService.TimeoutOptions;
    
    daprBuilder.UseTimeout(timeoutValue);
});

Manual Instantiation

Rather than using dependency injection, a DaprClient can also be built using the static client builder.

For best performance, create a single long-lived instance of DaprClient and provide access to that shared instance throughout your application. DaprClient instances are thread-safe and intended to be shared.

Avoid creating a DaprClient per-operation and disposing it when the operation is complete.
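One way to follow this guidance without dependency injection is a single shared instance. The static holder below is an illustrative sketch, not a prescribed pattern:

```csharp
using Dapr.Client;

// Illustrative: one DaprClient for the whole process.
// DaprClient is thread-safe, so concurrent callers can share it.
public static class DaprClientHolder
{
    public static DaprClient Client { get; } = new DaprClientBuilder().Build();
}
```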

Configuring DaprClient

A DaprClient can be configured by invoking methods on DaprClientBuilder class before calling .Build() to create the client. The settings for each DaprClient object are separate and cannot be changed after calling .Build().

var daprClient = new DaprClientBuilder()
    .UseJsonSerializerOptions( ... ) // Configure JSON serialization
    .Build();

By default, the DaprClientBuilder will prioritize the following locations, in the following order, to source the configuration values:

  • The value provided to a method on the DaprClientBuilder (e.g. UseTimeout(TimeSpan.FromSeconds(30)))
  • The value pulled from an optionally injected IConfiguration matching the name expected in the associated environment variable
  • The value pulled from the associated environment variable
  • Default values

Configuring on DaprClientBuilder

The DaprClientBuilder contains the following methods to set configuration options:

  • UseHttpEndpoint(string): The HTTP endpoint of the Dapr sidecar
  • UseGrpcEndpoint(string): Sets the gRPC endpoint of the Dapr sidecar
  • UseGrpcChannelOptions(GrpcChannelOptions): Sets the gRPC channel options used to connect to the Dapr sidecar
  • UseHttpClientFactory(IHttpClientFactory): Configures the DaprClient to use a registered IHttpClientFactory when building HttpClient instances
  • UseJsonSerializerOptions(JsonSerializerOptions): Used to configure JSON serialization
  • UseDaprApiToken(string): Adds the provided token to every request to authenticate to the Dapr sidecar
  • UseTimeout(TimeSpan): Specifies a timeout value used by the HttpClient when communicating with the Dapr sidecar

Configuring From IConfiguration

Rather than sourcing configuration values directly from environment variables, or when the values come from dependency-injected services, another option is to make these values available on IConfiguration.

For example, you might be registering your application in a multi-tenant environment and need to prefix the environment variables used. The following example shows how these values can be sourced from the environment variables to your IConfiguration when their keys are prefixed with test_:

var builder = WebApplication.CreateBuilder(args);
builder.Configuration.AddEnvironmentVariables("test_"); //Retrieves all environment variables that start with "test_" and removes the prefix when sourced from IConfiguration
builder.Services.AddDaprClient();

Configuring From Environment Variables

The SDK will read the following environment variables to configure the default values:

  • DAPR_HTTP_ENDPOINT: used to find the HTTP endpoint of the Dapr sidecar, example: https://dapr-api.mycompany.com
  • DAPR_GRPC_ENDPOINT: used to find the gRPC endpoint of the Dapr sidecar, example: https://dapr-grpc-api.mycompany.com
  • DAPR_HTTP_PORT: if DAPR_HTTP_ENDPOINT is not set, this is used to find the HTTP local endpoint of the Dapr sidecar
  • DAPR_GRPC_PORT: if DAPR_GRPC_ENDPOINT is not set, this is used to find the gRPC local endpoint of the Dapr sidecar
  • DAPR_API_TOKEN: used to set the API Token

Configuring gRPC channel options

Dapr’s use of CancellationToken for cancellation relies on the configuration of the gRPC channel options and this is enabled by default. If you need to configure these options yourself, make sure to enable the ThrowOperationCanceledOnCancellation setting.

var daprClient = new DaprClientBuilder()
    .UseGrpcChannelOptions(new GrpcChannelOptions { ... ThrowOperationCanceledOnCancellation = true })
    .Build();

Using cancellation with DaprClient

The APIs on DaprClient that perform asynchronous operations accept an optional CancellationToken parameter. This follows a standard .NET idiom for cancellable operations. Note that when cancellation occurs, there is no guarantee that the remote endpoint stops processing the request, only that the client has stopped waiting for completion.

When an operation is cancelled, it will throw an OperationCanceledException.
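A sketch of passing a token and handling cancellation (the store and key names are illustrative, and a running sidecar is assumed):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Dapr.Client;

class CancellationExample
{
    public static async Task RunAsync()
    {
        var client = new DaprClientBuilder().Build();

        // Cancel automatically if the call takes longer than 5 seconds.
        using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(5));
        try
        {
            await client.GetStateAsync<string>("statestore", "mykey", cancellationToken: cts.Token);
        }
        catch (OperationCanceledException)
        {
            // The client stopped waiting; the sidecar may still be processing.
        }
    }
}
```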

Understanding DaprClient JSON serialization

Many methods on DaprClient perform JSON serialization using the System.Text.Json serializer. Methods that accept an application data type as an argument will JSON serialize it, unless the documentation clearly states otherwise.

It is worth reading the System.Text.Json documentation if you have advanced requirements. The Dapr .NET SDK provides no unique serialization behavior or customizations - it relies on the underlying serializer to convert data to and from the application’s .NET types.

DaprClient is configured to use a serializer options object configured from JsonSerializerDefaults.Web. This means that DaprClient will use camelCase for property names, allow reading quoted numbers ("10.99"), and will bind properties case-insensitively. These are the same settings used with ASP.NET Core and the System.Net.Http.Json APIs, and are designed to follow interoperable web conventions.

System.Text.Json as of .NET 5.0 does not have good support for all of F# language features built-in. If you are using F# you may want to use one of the converter packages that add support for F#’s features such as FSharp.SystemTextJson.

Simple guidance for JSON serialization

Your experience using JSON serialization and DaprClient will be smooth if you use a feature set that maps to JSON’s type system. These are general guidelines that will simplify your code where they can be applied.

  • Avoid inheritance and polymorphism
  • Do not attempt to serialize data with cyclic references
  • Do not put complex or expensive logic in constructors or property accessors
  • Use .NET types that map cleanly to JSON types (numeric types, strings, DateTime)
  • Create your own classes for top-level messages, events, or state values so you can add properties in the future
  • Design types with get/set properties OR use the supported pattern for immutable types with JSON
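As an illustration of the last two points, both shapes below serialize cleanly with System.Text.Json (the type names are illustrative):

```csharp
// Mutable shape: get/set properties map directly to JSON members.
public class WidgetState
{
    public string Color { get; set; }
    public string Size { get; set; }
}

// Immutable shape: System.Text.Json binds the constructor parameters
// by name when deserializing a positional record.
public record WidgetRecord(string Color, string Size);
```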

Polymorphism and serialization

The System.Text.Json serializer used by DaprClient uses the declared type of values when performing serialization.

This section will use DaprClient.SaveStateAsync<TValue>(...) in examples, but the advice is applicable to any Dapr building block exposed by the SDK.

public class Widget
{
    public string Color { get; set; }
}
...

// Storing a Widget value as JSON in the state store
Widget widget = new Widget() { Color = "Green", };
await client.SaveStateAsync("mystatestore", "mykey", widget);

In the example above, the type parameter TValue has its type argument inferred from the type of the widget variable. This is important because the System.Text.Json serializer will perform serialization based on the declared type of the value. The result is that the JSON value { "color": "Green" } will be stored.

Consider what happens when you try to use a derived type of Widget:

public class Widget
{
    public string Color { get; set; }
}

public class SuperWidget : Widget
{
    public bool HasSelfCleaningFeature { get; set; }
}
...

// Storing a SuperWidget value as JSON in the state store
Widget widget = new SuperWidget() { Color = "Green", HasSelfCleaningFeature = true, };
await client.SaveStateAsync("mystatestore", "mykey", widget);

In this example we’re using a SuperWidget but the variable’s declared type is Widget. Since the JSON serializer’s behavior is determined by the declared type, it only sees a simple Widget and will save the value { "color": "Green" } instead of { "color": "Green", "hasSelfCleaningFeature": true }.

If you want the properties of SuperWidget to be serialized, then the best option is to override the type argument with object. This will cause the serializer to include all data as it knows nothing about the type.

Widget widget = new SuperWidget() { Color = "Green", HasSelfCleaningFeature = true, };
await client.SaveStateAsync<object>("mystatestore", "mykey", widget);

Error handling

Methods on DaprClient will throw DaprException or a subclass when a failure is encountered.

try
{
    var widget = new Widget() { Color = "Green", };
    await client.SaveStateAsync("mystatestore", "mykey", widget);
}
catch (DaprException ex)
{
    // handle the exception, log, retry, etc.
}

The most common cases of failure will be related to:

  • Incorrect configuration of Dapr component
  • Transient failures such as a networking problem
  • Invalid data, such as a failure to deserialize JSON

In any of these cases you can examine more exception details through the .InnerException property.

1.2 - Dapr actors .NET SDK

Get up and running with the Dapr actors .NET SDK

With the Dapr actor package, you can interact with Dapr virtual actors from a .NET application.

To get started, walk through the Dapr actors how-to guide.

1.2.1 - The IActorProxyFactory interface

Learn how to create actor clients with the IActorProxyFactory interface

Inside of an Actor class or an ASP.NET Core project, the IActorProxyFactory interface is recommended to create actor clients.

The AddActors(...) method will register actor services with ASP.NET Core dependency injection.

  • Outside of an actor instance: The IActorProxyFactory instance is available through dependency injection as a singleton service.
  • Inside an actor instance: The IActorProxyFactory instance is available as a property (this.ProxyFactory).
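Outside of an actor, a sketch of receiving the factory through constructor injection; the controller, actor interface, and Account type are hypothetical:

```csharp
using System.Threading.Tasks;
using Dapr.Actors;
using Dapr.Actors.Client;
using Microsoft.AspNetCore.Mvc;

// Hypothetical ASP.NET Core controller; AddActors(...) makes
// IActorProxyFactory available for injection as a singleton.
[ApiController]
public class AccountsController : ControllerBase
{
    private readonly IActorProxyFactory proxyFactory;

    public AccountsController(IActorProxyFactory proxyFactory)
    {
        this.proxyFactory = proxyFactory;
    }

    [HttpGet("/accounts/{id}")]
    public Task<Account> GetAsync(string id)
    {
        // IAccountActor and GetAccountAsync are assumed interface members.
        var proxy = this.proxyFactory.CreateActorProxy<IAccountActor>(new ActorId(id), "AccountActor");
        return proxy.GetAccountAsync();
    }
}
```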

The following is an example of creating a proxy inside an actor:

public async Task<MyData> GetDataAsync()
{
    var proxy = this.ProxyFactory.CreateActorProxy<IOtherActor>(ActorId.CreateRandom(), "OtherActor");
    await proxy.DoSomethingGreat();

    return await this.StateManager.GetStateAsync<MyData>("my_data");
}

In this guide, you will learn how to use IActorProxyFactory.

Identifying an actor

All of the APIs on IActorProxyFactory will require an actor type and actor id to communicate with an actor. For strongly-typed clients, you also need one of its interfaces.

  • Actor type uniquely identifies the actor implementation across the whole application.
  • Actor id uniquely identifies an instance of that type.

If you don’t have an actor id and want to communicate with a new instance, create a random id with ActorId.CreateRandom(). Since the random id is a cryptographically strong identifier, the runtime will create a new actor instance when you interact with it.

You can use the type ActorReference to exchange an actor type and actor id with other actors as part of messages.

Two styles of actor client

The actor client supports two different styles of invocation:

Actor client styleDescription
Strongly-typedStrongly-typed clients are based on .NET interfaces and provide the typical benefits of strong-typing. They don’t work with non-.NET actors.
Weakly-typedWeakly-typed clients use the ActorProxy class. It is recommended to use these only when required for interop or other advanced reasons.

Using a strongly-typed client

The following example uses the CreateActorProxy<> method to create a strongly-typed client. CreateActorProxy<> requires an actor interface type, and will return an instance of that interface.

// Create a proxy for IOtherActor to type OtherActor with a random id
var proxy = this.ProxyFactory.CreateActorProxy<IOtherActor>(ActorId.CreateRandom(), "OtherActor");

// Invoke a method defined by the interface to invoke the actor
//
// proxy is an implementation of IOtherActor so we can invoke its methods directly
await proxy.DoSomethingGreat();

Using a weakly-typed client

The following example uses the Create method to create a weakly-typed client. Create returns an instance of ActorProxy.

// Create a proxy for type OtherActor with a random id
var proxy = this.ProxyFactory.Create(ActorId.CreateRandom(), "OtherActor");

// Invoke a method by name to invoke the actor
//
// proxy is an instance of ActorProxy.
await proxy.InvokeMethodAsync("DoSomethingGreat");

Since ActorProxy is a weakly-typed proxy, you need to pass in the actor method name as a string.

You can also use ActorProxy to invoke methods with both a request and a response message. Request and response messages will be serialized using the System.Text.Json serializer.

// Create a proxy for type OtherActor with a random id
var proxy = this.ProxyFactory.Create(ActorId.CreateRandom(), "OtherActor");

// Invoke a method on the proxy to invoke the actor
//
// proxy is an instance of ActorProxy.
var request = new MyRequest() { Message = "Hi, it's me.", };
var response = await proxy.InvokeMethodAsync<MyRequest, MyResponse>("DoSomethingGreat", request);

When using a weakly-typed proxy, you must proactively define the correct actor method names and message types. When using a strongly-typed proxy, these names and types are defined for you as part of the interface definition.

Actor method invocation exception details

The actor method invocation exception details are surfaced to the caller and the callee, providing an entry point to track down the issue. Exception details include:

  • Method name
  • Line number
  • Exception type
  • UUID

You use the UUID to match the exception on the caller and callee side. Below is an example of exception details:

Dapr.Actors.ActorMethodInvocationException: Remote Actor Method Exception, DETAILS: Exception: NotImplementedException, Method Name: ExceptionExample, Line Number: 14, Exception uuid: d291a006-84d5-42c4-b39e-d6300e9ac38b

Next steps

Learn how to author and run actors with ActorHost.

1.2.2 - Author & run actors

Learn all about authoring and running actors with the .NET SDK

Author actors

ActorHost

The ActorHost:

  • Is a required constructor parameter of all actors
  • Is provided by the runtime
  • Must be passed to the base class constructor
  • Contains all of the state that allows that actor instance to communicate with the runtime

internal class MyActor : Actor, IMyActor, IRemindable
{
    public MyActor(ActorHost host) // Accept ActorHost in the constructor
        : base(host) // Pass ActorHost to the base class constructor
    {
    }
}

Since the ActorHost contains state unique to the actor, you don’t need to pass the instance into other parts of your code. It’s recommended to create your own instances of ActorHost only in tests.

Dependency injection

Actors support dependency injection of additional parameters into the constructor. Any other parameters you define will have their values satisfied from the dependency injection container.

internal class MyActor : Actor, IMyActor, IRemindable
{
    public MyActor(ActorHost host, BankService bank) // Accept BankService in the constructor
        : base(host)
    {
        ...
    }
}

An actor type should have a single public constructor. The actor infrastructure uses the ActivatorUtilities pattern for constructing actor instances.

You can register types with dependency injection in Startup.cs to make them available. Read more about the different ways of registering your types.

// In Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    ...

    // Register additional types with dependency injection.
    services.AddSingleton<BankService>();
}

Each actor instance has its own dependency injection scope and remains in memory for some time after performing an operation. During that time, the dependency injection scope associated with the actor is also considered live. The scope will be released when the actor is deactivated.

If an actor injects an IServiceProvider in the constructor, the actor will receive a reference to the IServiceProvider associated with its scope. The IServiceProvider can be used to resolve services dynamically in the future.

internal class MyActor : Actor, IMyActor, IRemindable
{
    public MyActor(ActorHost host, IServiceProvider services) // Accept IServiceProvider in the constructor
        : base(host)
    {
        ...
    }
}

When using this pattern, avoid creating many instances of transient services which implement IDisposable. Since the scope associated with an actor could be considered valid for a long time, you can accumulate many services in memory. See the dependency injection guidelines for more information.

IDisposable and actors

Actors can implement IDisposable or IAsyncDisposable. It’s recommended that you rely on dependency injection for resource management rather than implementing dispose functionality in application code. Dispose support is provided in the rare case where it is truly necessary.
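If you do need it, a minimal sketch of an actor owning a disposable resource; the connection type is hypothetical:

```csharp
using System;
using System.Threading.Tasks;
using Dapr.Actors.Runtime;

internal class MyActor : Actor, IMyActor, IAsyncDisposable
{
    // Hypothetical disposable resource owned by this actor instance.
    private readonly HypotheticalConnection connection = new();

    public MyActor(ActorHost host)
        : base(host)
    {
    }

    // Invoked when the actor instance is disposed.
    public ValueTask DisposeAsync() => connection.DisposeAsync();
}
```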

Logging

Inside an actor class, you have access to an ILogger instance through a property on the base Actor class. This instance is connected to the ASP.NET Core logging system and should be used for all logging inside an actor. Read more about logging. You can configure a variety of different logging formats and output sinks.

Use structured logging with named placeholders like the example below:

public Task<MyData> GetDataAsync()
{
    this.Logger.LogInformation("Getting state at {CurrentTime}", DateTime.UtcNow);
    return this.StateManager.GetStateAsync<MyData>("my_data");
}

When logging, avoid using format strings like: $"Getting state at {DateTime.UtcNow}"

Logging should use the named placeholder syntax which offers better performance and integration with logging systems.

Using an explicit actor type name

By default, the type of the actor, as seen by clients, is derived from the name of the actor implementation class. The default name will be the class name (without namespace).

If desired, you can specify an explicit type name by attaching an ActorAttribute attribute to the actor implementation class.

[Actor(TypeName = "MyCustomActorTypeName")]
internal class MyActor : Actor, IMyActor
{
    // ...
}

In the example above, the name will be MyCustomActorTypeName.

No change is needed to the code that registers the actor type with the runtime; providing the value via the attribute is all that is required.

Host actors on the server

Registering actors

Actor registration is part of ConfigureServices in Startup.cs. You can register services with dependency injection via the ConfigureServices method. Registering the set of actor types is part of the registration of actor services.

Inside ConfigureServices you can:

  • Register the actor runtime (AddActors)
  • Register actor types (options.Actors.RegisterActor<>)
  • Configure actor runtime settings options
  • Register additional service types for dependency injection into actors (services)
// In Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    // Register actor runtime with DI
    services.AddActors(options =>
    {
        // Register actor types and configure actor settings
        options.Actors.RegisterActor<MyActor>();
        
        // Configure default settings
        options.ActorIdleTimeout = TimeSpan.FromMinutes(10);
        options.ActorScanInterval = TimeSpan.FromSeconds(35);
        options.DrainOngoingCallTimeout = TimeSpan.FromSeconds(35);
        options.DrainRebalancedActors = true;
    });

    // Register additional services for use with actors
    services.AddSingleton<BankService>();
}

Configuring JSON options

The actor runtime uses System.Text.Json for:

  • Serializing data to the state store
  • Handling requests from the weakly-typed client

By default, the actor runtime uses settings based on JsonSerializerDefaults.Web.

You can configure the JsonSerializerOptions as part of ConfigureServices:

// In Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    services.AddActors(options =>
    {
        ...
        
        // Customize JSON options
        options.JsonSerializerOptions = ...
    });
}
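As a purely illustrative sketch of the kind of object you might assign to options.JsonSerializerOptions (the specific settings below are arbitrary examples, not Dapr defaults), you can build on the same Web defaults the runtime uses:

```csharp
using System;
using System.Text.Json;

// Hypothetical options of the kind you might assign inside AddActors.
// Building on JsonSerializerDefaults.Web keeps the behavior consistent
// with the actor runtime's defaults.
var jsonOptions = new JsonSerializerOptions(JsonSerializerDefaults.Web)
{
    WriteIndented = false,     // arbitrary example settings
    AllowTrailingCommas = true,
};

// The Web defaults camel-case property names on serialization:
var payload = JsonSerializer.Serialize(new { UserName = "alice" }, jsonOptions);
Console.WriteLine(payload); // {"userName":"alice"}
```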

Actors and routing

The ASP.NET Core hosting support for actors uses the endpoint routing system. The .NET SDK provides no support for hosting actors with the legacy routing system from early ASP.NET Core releases.

Since actors use endpoint routing, the actors HTTP handler is part of the middleware pipeline. The following is a minimal example of a Configure method setting up the middleware pipeline with actors.

// in Startup.cs
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    app.UseRouting();

    app.UseEndpoints(endpoints =>
    {
        // Register actors handlers that interface with the Dapr runtime.
        endpoints.MapActorsHandlers();
    });
}

The UseRouting and UseEndpoints calls are necessary to configure routing. Configure actors as part of the pipeline by adding MapActorsHandlers inside the endpoint middleware.

This is a minimal example; it’s valid for actors functionality to exist alongside:

  • Controllers
  • Razor Pages
  • Blazor
  • gRPC Services
  • Dapr pub/sub handler
  • other endpoints such as health checks

Problematic middleware

Certain middleware may interfere with the routing of Dapr requests to the actors handlers. In particular, UseHttpsRedirection is problematic for Dapr’s default configuration. Dapr sends requests over unencrypted HTTP by default, which the UseHttpsRedirection middleware will block. This middleware cannot be used with Dapr at this time.

// in Startup.cs
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    // INVALID - this will block non-HTTPS requests
    app.UseHttpsRedirection();

    app.UseRouting();

    app.UseEndpoints(endpoints =>
    {
        // Register actors handlers that interface with the Dapr runtime.
        endpoints.MapActorsHandlers();
    });
}

Next steps

Try the Running and using virtual actors example.

1.2.3 - Actor serialization in the .NET SDK

Necessary steps to serialize your types for remoted and non-remoted Actors in .NET

Actor Serialization

The Dapr actor package enables you to use Dapr virtual actors within a .NET application with either a weakly- or strongly-typed client. Each utilizes a different serialization approach. This document will review the differences and convey a few key ground rules to understand in either scenario.

Please be advised that it is not a supported scenario to use the weakly- or strongly-typed actor clients interchangeably because of these different serialization approaches. The data persisted using one Actor client will not be accessible using the other Actor client, so it is important to pick one and use it consistently throughout your application.

Weakly-typed Dapr Actor client

In this section, you will learn how to configure your C# types so they are properly serialized and deserialized at runtime when using a weakly-typed actor client. These clients use string-based names of methods with request and response payloads that are serialized using the System.Text.Json serializer. Please note that this serialization framework is not specific to Dapr and is separately maintained by the .NET team within the .NET GitHub repository.

When using the weakly-typed Dapr Actor client to invoke methods from your various actors, it’s not necessary to independently serialize or deserialize the method payloads as this will happen transparently on your behalf by the SDK.

The client will use the latest version of System.Text.Json available for the version of .NET you’re building against and serialization is subject to all the inherent capabilities provided in the associated .NET documentation.

The serializer will be configured to use the JsonSerializerDefaults.Web default options unless overridden with a custom options configuration, which means the following are applied:

  • Deserialization of the property name is performed in a case-insensitive manner
  • Serialization of the property name is performed using camel casing unless the property is overridden with a [JsonPropertyName] attribute
  • Deserialization will read numeric values from number and/or string values

Basic Serialization

In the following example, we present a simple class named Doodad, though it could just as well be a record.

public class Doodad
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public int Count { get; set; }
}

By default, this will serialize using the names of the members as used in the type and whatever values it was instantiated with:

{"id": "a06ced64-4f42-48ad-84dd-46ae6a7e333d", "name": "DoodadName", "count": 5}

Override Serialized Property Name

The default property names can be overridden by applying the [JsonPropertyName] attribute to desired properties.

Generally, this isn’t going to be necessary for types you’re persisting to the actor state as you’re not intended to read or write them independent of Dapr-associated functionality, but the following is provided just to clearly illustrate that it’s possible.

Override Property Names on Classes

Here’s an example demonstrating the use of JsonPropertyName to change the name of the first property following serialization. Note that the last usage of JsonPropertyName, on the Count property, matches what it would be expected to serialize to anyway. This is largely just to demonstrate that applying this attribute won’t negatively impact anything - in fact, it might be preferable if you later decide to change the default serialization options but still need to consistently access the properties as previously serialized, since JsonPropertyName will override those options.

public class Doodad
{
    [JsonPropertyName("identifier")]
    public Guid Id { get; set; }
    public string Name { get; set; }
    [JsonPropertyName("count")]
    public int Count { get; set; }
}

This would serialize to the following:

{"identifier": "a06ced64-4f42-48ad-84dd-46ae6a7e333d", "name": "DoodadName", "count": 5}

Override Property Names on Records

Let’s try doing the same thing with a record:

public record Thingy(string Name, [JsonPropertyName("count")] int Count); 

Because the argument passed to a record's primary constructor can be applied to either a property or a field within the record, using the [JsonPropertyName] attribute may require specifying that you intend the attribute to apply to the property and not the field in some ambiguous cases. Should this be necessary, you’d indicate as much in the primary constructor with:

public record Thingy(string Name, [property: JsonPropertyName("count")] int Count);

If [property: ] is applied to the [JsonPropertyName] attribute where it’s not necessary, it will not negatively impact serialization or deserialization as the operation will proceed normally as though it were a property (as it typically would if not marked as such).
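Serialized with the same Web defaults the weakly-typed client uses, the attribute value wins for Count while Name falls back to camel casing; a small runnable sketch:

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// Name camel-cases to "name"; Count uses the explicit attribute value.
var options = new JsonSerializerOptions(JsonSerializerDefaults.Web);
var json = JsonSerializer.Serialize(new Thingy("Widget", 3), options);
Console.WriteLine(json); // {"name":"Widget","count":3}

public record Thingy(string Name, [property: JsonPropertyName("count")] int Count);
```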

Enumeration types

Enumerations, including flag enumerations, are serializable to JSON, but the value persisted may surprise you. Again, it’s not expected that the developer should ever engage with the serialized data independently of Dapr, but the following information may at least help in diagnosing why a seemingly mild version migration isn’t working as expected.

Take the following enum type providing the various seasons in the year:

public enum Season
{
    Spring,
    Summer,
    Fall,
    Winter
}

We’ll go ahead and use a separate demonstration type that references our Season and simultaneously illustrate how this works with records:

public record Engagement(string Name, Season TimeOfYear);

Given the following initialized instance:

var myEngagement = new Engagement("Ski Trip", Season.Winter);

This would serialize to the following JSON:

{"name": "Ski Trip", "timeOfYear": 3}

It might be unexpected that our Season.Winter value was represented as a 3, but this is because the serializer automatically uses numeric representations of the enum values, starting with zero for the first value and incrementing for each additional value. Again, if a migration were taking place and a developer had flipped the order of the enum members, this would effect a breaking change in your solution as the serialized numeric values would point to different values when deserialized.

Rather, there is a JsonConverter available with System.Text.Json that will use a string-based value instead of the numeric value. The [JsonConverter] attribute needs to be applied to the enum type itself to enable this, but will then be realized in any downstream serialization or deserialization operation that references the enum.

[JsonConverter(typeof(JsonStringEnumConverter<Season>))]
public enum Season
{
    Spring,
    Summer,
    Fall,
    Winter
}

Using the same values from our myEngagement instance above, this would produce the following JSON instead:

{"name": "Ski Trip", "timeOfYear": "Winter"}

As a result, the enum members can be shifted around without fear of introducing errors during deserialization.
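A runnable sketch of the converter in action (the generic JsonStringEnumConverter&lt;T&gt; shown here requires .NET 8 or later; note that with the Web defaults the TimeOfYear property name itself camel-cases to timeOfYear):

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// The converter on the enum type emits the member name instead of its number.
var options = new JsonSerializerOptions(JsonSerializerDefaults.Web);
var json = JsonSerializer.Serialize(new Engagement("Ski Trip", Season.Winter), options);
Console.WriteLine(json); // {"name":"Ski Trip","timeOfYear":"Winter"}

[JsonConverter(typeof(JsonStringEnumConverter<Season>))]
public enum Season
{
    Spring,
    Summer,
    Fall,
    Winter
}

public record Engagement(string Name, Season TimeOfYear);
```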

Custom Enumeration Values

The System.Text.Json serialization platform doesn’t, out of the box, support the use of [EnumMember] to allow you to change the value of an enum that’s used during serialization or deserialization, but there are scenarios where this could be useful. Again, assume that you’re tasked with refactoring the solution to apply some better names to your various enums. You’re using the JsonStringEnumConverter&lt;TType&gt; detailed above, so you’re saving the name of the enum as the value instead of a numeric value, but if you change the enum name, that will introduce a breaking change as the name will no longer match what’s in state.

Do note that if you opt into using this approach, you should decorate all your enum members with the [EnumMember] attribute so that the values are consistently applied for each enum value instead of haphazardly. Nothing will validate this at build or runtime, but it is considered a best practice.

How can you specify the precise value persisted while still changing the name of the enum member in this scenario? Use a custom JsonConverter with an extension method that can pull the value out of the attached [EnumMember] attributes where provided. Add the following to your solution:

public sealed class EnumMemberJsonConverter<T> : JsonConverter<T> where T : struct, Enum
{
    /// <summary>Reads and converts the JSON to type <typeparamref name="T" />.</summary>
    /// <param name="reader">The reader.</param>
    /// <param name="typeToConvert">The type to convert.</param>
    /// <param name="options">An object that specifies serialization options to use.</param>
    /// <returns>The converted value.</returns>
    public override T Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        // Get the string value from the JSON reader
        var value = reader.GetString();

        // Loop through all the enum values
        foreach (var enumValue in Enum.GetValues<T>())
        {
            // Get the value from the EnumMember attribute, if any
            var enumMemberValue = GetValueFromEnumMember(enumValue);

            // If the values match, return the enum value
            if (value == enumMemberValue)
            {
                return enumValue;
            }
        }

        // If no match found, throw an exception
        throw new JsonException($"Invalid value for {typeToConvert.Name}: {value}");
    }

    /// <summary>Writes a specified value as JSON.</summary>
    /// <param name="writer">The writer to write to.</param>
    /// <param name="value">The value to convert to JSON.</param>
    /// <param name="options">An object that specifies serialization options to use.</param>
    public override void Write(Utf8JsonWriter writer, T value, JsonSerializerOptions options)
    {
        // Get the value from the EnumMember attribute, if any
        var enumMemberValue = GetValueFromEnumMember(value);

        // Write the value to the JSON writer
        writer.WriteStringValue(enumMemberValue);
    }

    private static string GetValueFromEnumMember(T value)
    {
        MemberInfo[] member = typeof(T).GetMember(value.ToString(), BindingFlags.DeclaredOnly | BindingFlags.Static | BindingFlags.Public);
        if (member.Length == 0)
            return value.ToString();
        object[] customAttributes = member[0].GetCustomAttributes(typeof(EnumMemberAttribute), false);
        if (customAttributes.Length != 0)
        {
            EnumMemberAttribute enumMemberAttribute = (EnumMemberAttribute)customAttributes[0];
            if (enumMemberAttribute != null && enumMemberAttribute.Value != null)
                return enumMemberAttribute.Value;
        }
        return value.ToString();
    }
}

Now let’s add a sample enum. We’ll set a value that uses the lower-case version of each enum member to demonstrate this. Don’t forget to decorate the enum with the [JsonConverter] attribute and reference our custom converter in place of the JsonStringEnumConverter used in the last section.

[JsonConverter(typeof(EnumMemberJsonConverter<Season>))]
public enum Season
{
    [EnumMember(Value="spring")]
    Spring,
    [EnumMember(Value="summer")]
    Summer,
    [EnumMember(Value="fall")]
    Fall,
    [EnumMember(Value="winter")]
    Winter
}

Let’s use our sample record from before. We’ll also add a [JsonPropertyName] attribute just to augment the demonstration:

public record Engagement([property: JsonPropertyName("event")] string Name, Season TimeOfYear);

And finally, let’s initialize a new instance of this:

var myEngagement = new Engagement("Conference", Season.Fall);

This time, serialization will take into account the values from the attached [EnumMember] attribute providing us a mechanism to refactor our application without necessitating a complex versioning scheme for our existing enum values in the state.

{"event": "Conference", "timeOfYear": "fall"}

Polymorphic Serialization

When working with polymorphic types in Dapr Actor clients, it is essential to handle serialization and deserialization correctly to ensure that the appropriate derived types are instantiated. Polymorphic serialization allows you to serialize objects of a base type while preserving the specific derived type information.

To enable polymorphic deserialization, you must use the [JsonPolymorphic] attribute on your base type. Additionally, it is crucial to include the [AllowOutOfOrderMetadataProperties] attribute to ensure that metadata properties, such as $type can be processed correctly by System.Text.Json even if they are not the first properties in the JSON object.

Example

[JsonPolymorphic]
[AllowOutOfOrderMetadataProperties]
[JsonDerivedType(typeof(DerivedSampleValue), nameof(DerivedSampleValue))]
public abstract class SampleValueBase
{
    public string CommonProperty { get; set; }
}

public class DerivedSampleValue : SampleValueBase
{
    public string SpecificProperty { get; set; }
}

In this example, the SampleValueBase class is marked with both [JsonPolymorphic] and [AllowOutOfOrderMetadataProperties] attributes. This setup ensures that the $type metadata property can be correctly identified and processed during deserialization, regardless of its position in the JSON object.

By following this approach, you can effectively manage polymorphic serialization and deserialization in your Dapr Actor clients, ensuring that the correct derived types are instantiated and used.
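To see the mechanics outside of Dapr, the following sketch registers the derived type with [JsonDerivedType] (which System.Text.Json requires in order to know which derived types may appear) and round-trips a derived instance through a base-typed reference; [AllowOutOfOrderMetadataProperties] (available from .NET 9) is omitted here so the sketch compiles on earlier runtimes:

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// Serializing through a base-typed reference emits a $type discriminator,
// which is what lets deserialization rebuild the derived type.
SampleValueBase value = new DerivedSampleValue { CommonProperty = "a", SpecificProperty = "b" };
var json = JsonSerializer.Serialize(value);
Console.WriteLine(json); // includes "$type":"DerivedSampleValue"

var roundTripped = JsonSerializer.Deserialize<SampleValueBase>(json);
Console.WriteLine(roundTripped is DerivedSampleValue); // True

[JsonPolymorphic]
[JsonDerivedType(typeof(DerivedSampleValue), nameof(DerivedSampleValue))]
public abstract class SampleValueBase
{
    public string CommonProperty { get; set; } = "";
}

public class DerivedSampleValue : SampleValueBase
{
    public string SpecificProperty { get; set; } = "";
}
```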

Strongly-typed Dapr Actor client

In this section, you will learn how to configure your classes and records so they are properly serialized and deserialized at runtime when using a strongly-typed actor client. These clients are implemented using .NET interfaces and are not compatible with Dapr Actors written using other languages.

This actor client serializes data using an engine called the Data Contract Serializer which converts your C# types to and from XML documents. This serialization framework is not specific to Dapr and is separately maintained by the .NET team within the .NET GitHub repository.

When sending or receiving primitives (like strings or ints), this serialization happens transparently and there’s no requisite preparation needed on your part. However, when working with complex types such as those you create, there are some important rules to take into consideration so this process works smoothly.

Serializable Types

There are several important considerations to keep in mind when using the Data Contract Serializer:

  • By default, all types, read/write properties (after construction) and fields marked as publicly visible are serialized
  • All types must either expose a public parameterless constructor or be decorated with the DataContractAttribute attribute
  • Init-only setters are only supported with the use of the DataContractAttribute attribute
  • Read-only fields, properties without a Get and Set method, and internal properties or properties with private Get and Set methods are ignored during serialization
  • Serialization is supported for types that use other complex types that are not themselves marked with the DataContractAttribute attribute through the use of the KnownTypesAttribute attribute
  • If a type is marked with the DataContractAttribute attribute, all members you wish to serialize and deserialize must be decorated with the DataMemberAttribute attribute as well or they’ll be set to their default values
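The opt-in rule in the last bullet can be demonstrated with a short round trip through a DataContractSerializer (the Widget type here is hypothetical, used only for illustration):

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;

// Round-trip a [DataContract] type: Name is not marked [DataMember],
// so it is silently dropped and comes back as its default value (null).
var serializer = new DataContractSerializer(typeof(Widget));
using var stream = new MemoryStream();
serializer.WriteObject(stream, new Widget { Name = "ignored", Count = 5 });
stream.Position = 0;
var copy = (Widget)serializer.ReadObject(stream)!;

Console.WriteLine($"{copy.Name ?? "(null)"} / {copy.Count}"); // (null) / 5

[DataContract]
public class Widget
{
    public string? Name { get; set; }  // not serialized: no [DataMember]

    [DataMember]
    public int Count { get; set; }
}
```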

How does deserialization work?

The approach used for deserialization depends on whether or not the type is decorated with the DataContractAttribute attribute. If this attribute isn’t present, an instance of the type is created using the parameterless constructor. Each of the properties and fields are then mapped into the type using their respective setters and the instance is returned to the caller.

If the type is marked with [DataContract], the serializer instead uses reflection to read the metadata of the type and determine which properties or fields should be included based on whether or not they’re marked with the DataMemberAttribute attribute, as it’s performed on an opt-in basis. It then allocates an uninitialized object in memory (avoiding the use of any constructors, parameterless or not) and sets the value directly on each mapped property or field, even if it is private or uses init-only setters. Serialization callbacks are invoked as applicable throughout this process and then the object is returned to the caller.

Use of the serialization attributes is highly recommended as they grant more flexibility to override names and namespaces and generally use more of the modern C# functionality. While the default serializer can be relied on for primitive types, it’s not recommended for any of your own types, whether they be classes, structs or records. It’s recommended that if you decorate a type with the DataContractAttribute attribute, you also explicitly decorate each of the members you want to serialize or deserialize with the DataMemberAttribute attribute as well.

.NET Classes

Classes are fully supported by the Data Contract Serializer provided that the other rules detailed on this page and in the Data Contract Serializer documentation are also followed.

The most important thing to remember here is that you must either have a public parameterless constructor or you must decorate it with the appropriate attributes. Let’s review some examples to really clarify what will and won’t work.

In the following example, we present a simple class named Doodad. We don’t provide an explicit constructor here, so the compiler will provide a default parameterless constructor. Because we’re using supported primitive types (Guid, string and int) and all our members have a public getter and setter, no attributes are required and we’ll be able to use this class without issue when sending and receiving it from a Dapr actor method.

public class Doodad
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public int Count { get; set; }
}

By default, this will serialize using the names of the members as used in the type and whatever values it was instantiated with:

<Doodad>
  <Id>a06ced64-4f42-48ad-84dd-46ae6a7e333d</Id>
  <Name>DoodadName</Name>
  <Count>5</Count>
</Doodad>
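You can reproduce this behavior outside of Dapr by round-tripping the type through a DataContractSerializer directly; a minimal sketch (the real serializer also emits XML namespace attributes omitted from the simplified document above):

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;

var original = new Doodad { Id = Guid.NewGuid(), Name = "DoodadName", Count = 5 };

// No attributes needed: public parameterless constructor + public get/set
// properties of supported primitive types serialize as-is.
var serializer = new DataContractSerializer(typeof(Doodad));
using var stream = new MemoryStream();
serializer.WriteObject(stream, original);
stream.Position = 0;
var copy = (Doodad)serializer.ReadObject(stream)!;

Console.WriteLine(copy.Name);  // DoodadName
Console.WriteLine(copy.Count); // 5

public class Doodad
{
    public Guid Id { get; set; }
    public string? Name { get; set; }
    public int Count { get; set; }
}
```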

So let’s tweak it - let’s add our own constructor and only use init-only setters on the members. This will fail to serialize and deserialize not because of the use of the init-only setters, but because there’s no parameterless constructor.

// WILL NOT SERIALIZE PROPERLY!
public class Doodad
{
    public Doodad(string name, int count)
    {
        Id = Guid.NewGuid();
        Name = name;
        Count = count;
    }

    public Guid Id { get; set; }
    public string Name { get; init; }
    public int Count { get; init; }
}

If we add a public parameterless constructor to the type, we’re good to go and this will work without further annotations.

public class Doodad
{
    public Doodad()
    {
    }

    public Doodad(string name, int count)
    {
        Id = Guid.NewGuid();
        Name = name;
        Count = count;
    }

    public Guid Id { get; set; }
    public string Name { get; set; }
    public int Count { get; set; }
}

But what if we don’t want to add this constructor? Perhaps you don’t want your developers to accidentally create an instance of this Doodad using an unintended constructor. That’s where the more flexible attributes are useful. If you decorate your type with a DataContractAttribute attribute, you can drop your parameterless constructor and it will work once again.

[DataContract]
public class Doodad
{
    public Doodad(string name, int count)
    {
        Id = Guid.NewGuid();
        Name = name;
        Count = count;
    }

    public Guid Id { get; set; }
    public string Name { get; set; }
    public int Count { get; set; }
}

In the above example, we don’t need to also use the DataMemberAttribute attributes because again, we’re using built-in primitives that the serializer supports. But, we do get more flexibility if we use the attributes. From the DataContractAttribute attribute, we can specify our own XML namespace with the Namespace argument and, via the Name argument, change the name of the type as used when serialized into the XML document.

It’s a recommended practice to append the DataContractAttribute attribute to the type and the DataMemberAttribute attributes to all the members you want to serialize anyway - if they’re not necessary and you’re not changing the default values, they’ll just be ignored, but they give you a mechanism to opt into serializing members that wouldn’t otherwise have been included such as those marked as private or that are themselves complex types or collections.

Note that if you do opt into serializing your private members, their values will be serialized into plain text - they can very well be viewed, intercepted and potentially manipulated depending on how you’re handling the data once serialized, so it’s an important consideration whether you want to mark these members or not in your use case.

In the following example, we’ll look at using the attributes to change the serialized names of some of the members as well as introduce the IgnoreDataMemberAttribute attribute. As the name indicates, this tells the serializer to skip this property even though it’d be otherwise eligible to serialize. Further, because I’m decorating the type with the DataContractAttribute attribute, it means that I can use init-only setters on the properties.

[DataContract(Name="Doodad")]
public class Doodad
{
    public Doodad(string name = "MyDoodad", int count = 5)
    {
        Id = Guid.NewGuid();
        Name = name;
        Count = count;
    }

    [DataMember(Name = "id")]
    public Guid Id { get; init; }
    [IgnoreDataMember]
    public string Name { get; init; }
    [DataMember]
    public int Count { get; init; }
}

When this is serialized, because we’re changing the names of the serialized members, we can expect a new instance of Doodad using the default values to be serialized as:

<Doodad>
  <id>a06ced64-4f42-48ad-84dd-46ae6a7e333d</id>
  <Count>5</Count>
</Doodad>

Classes in C# 12 - Primary Constructors

C# 12 brought us primary constructors on classes. Use of a primary constructor means the compiler will be prevented from creating the default implicit parameterless constructor. While a primary constructor on a class doesn’t generate any public properties, it does mean that if you pass this primary constructor any arguments or have non-primitive types in your class, you’ll either need to specify your own parameterless constructor or use the serialization attributes.

Here’s an example where we’re using the primary constructor to inject an ILogger to a field and add our own parameterless constructor without the need for any attributes.

public class Doodad(ILogger<Doodad> _logger)
{
    public Doodad() {} //Our parameterless constructor

    public Doodad(string name, int count)
    {
        Id = Guid.NewGuid();
        Name = name;
        Count = count;
    }

    public Guid Id { get; set; }
    public string Name { get; set; }
    public int Count { get; set; } 
}

And using our serialization attributes (again, opting for init-only setters since we’re using the serialization attributes):

[DataContract]
public class Doodad(ILogger<Doodad> _logger)
{
    public Doodad(string name, int count)
    {
        Id = Guid.NewGuid();
        Name = name;
        Count = count;
    }

    [DataMember]
    public Guid Id { get; init; }
    [DataMember]
    public string Name { get; init; }
    [DataMember]
    public int Count { get; init; }
}

.NET Structs

Structs are supported by the Data Contract serializer provided that they are marked with the DataContractAttribute attribute and the members you wish to serialize are marked with the DataMemberAttribute attribute. Further, to support deserialization, the struct will also need to have a parameterless constructor. This works even if you define your own parameterless constructor as enabled in C# 10.

[DataContract]
public struct Doodad
{
    [DataMember]
    public int Count { get; set; }
}

.NET Records

Records were introduced in C# 9 and follow precisely the same rules as classes when it comes to serialization. We recommend that you decorate all your records with the DataContractAttribute attribute and members you wish to serialize with DataMemberAttribute attributes so you don’t experience any deserialization issues using this or other newer C# functionalities. Because record classes use init-only setters for properties by default and encourage the use of the primary constructor, applying these attributes to your types ensures that the serializer can properly accommodate your types as-is.

Typically records are presented as a simple one-line statement using the new primary constructor concept:

public record Doodad(Guid Id, string Name, int Count);

This will throw an error encouraging the use of the serialization attributes as soon as you use it in a Dapr actor method invocation because there’s no parameterless constructor available nor is it decorated with the aforementioned attributes.

Here we add an explicit parameterless constructor and it won’t throw an error, but none of the values will be set during deserialization since they’re created with init-only setters. Because this doesn’t use the DataContractAttribute attribute or the DataMemberAttribute attribute on any members, the serializer will be unable to map the target members correctly during deserialization.

public record Doodad(Guid Id, string Name, int Count)
{
    public Doodad() {}
}

This approach does without the additional constructor and instead relies on the serialization attributes. Because we mark the type with the DataContractAttribute attribute and decorate each member with its own DataMemberAttribute attribute, the serialization engine will be able to map from the XML document to our type without issue.

[DataContract]
public record Doodad(
        [property: DataMember] Guid Id,
        [property: DataMember] string Name,
        [property: DataMember] int Count);

Supported Primitive Types

There are several types built into .NET that are considered primitive and eligible for serialization without additional effort on the part of the developer:

There are additional types that aren’t actually primitives but have similar built-in support:

Again, if you want to pass these types around via your actor methods, no additional consideration is necessary as they’ll be serialized and deserialized without issue. Further, types that are themselves marked with the SerializableAttribute attribute will be serialized.

Enumeration Types

Enumerations, including flag enumerations, are serializable if appropriately marked. The enum members you wish to be serialized must be marked with the EnumMemberAttribute attribute. Passing a custom value into the optional Value argument on this attribute allows you to specify the value used for the member in the serialized document instead of having the serializer derive it from the name of the member.

The enum type does not require that the type be decorated with the DataContractAttribute attribute - only that the members you wish to serialize be decorated with the EnumMemberAttribute attributes.

public enum Colors
{
    [EnumMember]
    Red,
    [EnumMember(Value="g")]
    Green,
    Blue, //Even if used by a type, this value will not be serialized as it's not decorated with the EnumMember attribute
}

Collection Types

With regards to the data contract serializer, all collection types that implement the IEnumerable interface, including arrays and generic collections, are considered collections. Those types that implement IDictionary or the generic IDictionary&lt;TKey, TValue&gt; are considered dictionary collections; all others are list collections.

Not unlike other complex types, collection types must have a parameterless constructor available. Further, they must also have a method called Add so they can be properly serialized and deserialized. The types used by these collection types must themselves be marked with the DataContractAttribute attribute or otherwise be serializable as described throughout this document.

Data Contract Versioning

As the data contract serializer is only used in Dapr with respect to serializing the values in the .NET SDK to and from the Dapr actor instances via the proxy methods, there’s little need to consider versioning of data contracts as the data isn’t being persisted between application versions using the same serializer. For those interested in learning more about data contract versioning visit here.

Known Types

Nesting your own complex types is easily accommodated by marking each of the types with the DataContractAttribute attribute. This informs the serializer as to how deserialization should be performed. But what if you’re working with polymorphic types and one of your members is a base class or interface with derived classes or other implementations? Here, you’ll use the KnownTypeAttribute attribute to give a hint to the serializer about how to proceed.

When you apply the KnownTypeAttribute attribute to a type, you are informing the data contract serializer about what subtypes it might encounter allowing it to properly handle the serialization and deserialization of these types, even when the actual type at runtime is different from the declared type.

[DataContract]
[KnownType(typeof(DerivedClass))]
public class BaseClass
{
    //Members of the base class
}

[DataContract]
public class DerivedClass : BaseClass 
{
    //Additional members of the derived class
}

In this example, the BaseClass is marked with [KnownType(typeof(DerivedClass))], which tells the data contract serializer that DerivedClass is a possible implementation of BaseClass that it may need to serialize or deserialize. Without this attribute, the serializer would not be aware of DerivedClass when it encounters an instance of BaseClass that is actually of type DerivedClass, and this could lead to a serialization exception because the serializer would not know how to handle the derived type. By specifying all possible derived types as known types, you ensure that the serializer can process the type and its members correctly.

For more information and examples about using [KnownType], please refer to the official documentation.

1.2.4 - How to: Run and use virtual actors in the .NET SDK

Try out .NET Dapr virtual actors with this example

The Dapr actor package allows you to interact with Dapr virtual actors from a .NET application. In this guide, you learn how to:

  • Create an Actor (MyActor).
  • Invoke its methods on the client application.
MyActor --- MyActor.Interfaces
         |
         +- MyActorService
         |
         +- MyActorClient

The interface project (\MyActor\MyActor.Interfaces)

This project contains the interface definition for the actor. Actor interfaces can be defined in any project with any name. The interface defines the actor contract shared by:

  • The actor implementation
  • The clients calling the actor

Because client projects may depend on it, it’s better to define it in an assembly separate from the actor implementation.

The actor service project (\MyActor\MyActorService)

This project implements the ASP.NET Core web service that hosts the actor. It contains the implementation of the actor, MyActor.cs. An actor implementation is a class that:

  • Derives from the base type Actor
  • Implements the interfaces defined in the MyActor.Interfaces project.

An actor class must also implement a constructor that accepts an ActorHost instance and passes it to the base Actor class.

The actor client project (\MyActor\MyActorClient)

This project contains the implementation of the actor client, which calls the methods defined in the actor interface.

Prerequisites

Step 0: Prepare

Since we’ll be creating 3 projects, choose an empty directory to start from, and open it in your terminal of choice.

Step 1: Create actor interfaces

The actor interface defines the contract that is shared by the actor implementation and the clients calling the actor.

The actor interface must meet the following requirements:

  • The actor interface must inherit the Dapr.Actors.IActor interface
  • The return type of an actor method must be Task or Task<T>
  • An actor method can have at most one argument

Create interface project and add dependencies

# Create Actor Interfaces
dotnet new classlib -o MyActor.Interfaces

cd MyActor.Interfaces

# Add Dapr.Actors nuget package. Please use the latest package version from nuget.org
dotnet add package Dapr.Actors

cd ..

Implement IMyActor interface

Define IMyActor interface and MyData data object. Paste the following code into MyActor.cs in the MyActor.Interfaces project.

using Dapr.Actors;
using Dapr.Actors.Runtime;
using System.Threading.Tasks;

namespace MyActor.Interfaces
{
    public interface IMyActor : IActor
    {       
        Task<string> SetDataAsync(MyData data);
        Task<MyData> GetDataAsync();
        Task RegisterReminder();
        Task UnregisterReminder();
        Task<IActorReminder> GetReminder();
        Task RegisterTimer();
        Task UnregisterTimer();
    }

    public class MyData
    {
        public string PropertyA { get; set; }
        public string PropertyB { get; set; }

        public override string ToString()
        {
            var propAValue = this.PropertyA == null ? "null" : this.PropertyA;
            var propBValue = this.PropertyB == null ? "null" : this.PropertyB;
            return $"PropertyA: {propAValue}, PropertyB: {propBValue}";
        }
    }
}

Step 2: Create actor service

Dapr uses an ASP.NET Core web service to host the actor service. This section implements the IMyActor actor interface and registers the actor with the Dapr runtime.

Create actor service project and add dependencies

# Create ASP.Net Web service to host Dapr actor
dotnet new web -o MyActorService

cd MyActorService

# Add Dapr.Actors.AspNetCore nuget package. Please use the latest package version from nuget.org
dotnet add package Dapr.Actors.AspNetCore

# Add Actor Interface reference
dotnet add reference ../MyActor.Interfaces/MyActor.Interfaces.csproj

cd ..

Add actor implementation

Implement the IMyActor interface and derive from the Dapr.Actors.Actor class. The following example also shows how to use actor reminders. For an actor to use reminders, it must implement IRemindable. If you don’t intend to use the reminder feature, you can skip implementing IRemindable and the reminder-specific methods shown in the code below.

Paste the following code into MyActor.cs in the MyActorService project:

using Dapr.Actors;
using Dapr.Actors.Runtime;
using MyActor.Interfaces;
using System;
using System.Threading.Tasks;

namespace MyActorService
{
    internal class MyActor : Actor, IMyActor, IRemindable
    {
        // The constructor must accept ActorHost as a parameter, and can also accept additional
        // parameters that will be retrieved from the dependency injection container
        //
        /// <summary>
        /// Initializes a new instance of MyActor
        /// </summary>
        /// <param name="host">The Dapr.Actors.Runtime.ActorHost that will host this actor instance.</param>
        public MyActor(ActorHost host)
            : base(host)
        {
        }

        /// <summary>
        /// This method is called whenever an actor is activated.
        /// An actor is activated the first time any of its methods are invoked.
        /// </summary>
        protected override Task OnActivateAsync()
        {
            // Provides opportunity to perform some optional setup.
            Console.WriteLine($"Activating actor id: {this.Id}");
            return Task.CompletedTask;
        }

        /// <summary>
        /// This method is called whenever an actor is deactivated after a period of inactivity.
        /// </summary>
        protected override Task OnDeactivateAsync()
        {
            // Provides opportunity to perform optional cleanup.
            Console.WriteLine($"Deactivating actor id: {this.Id}");
            return Task.CompletedTask;
        }

        /// <summary>
        /// Set MyData into actor's private state store
        /// </summary>
        /// <param name="data">the user-defined MyData which will be stored into state store as "my_data" state</param>
        public async Task<string> SetDataAsync(MyData data)
        {
            // Data is saved to configured state store implicitly after each method execution by Actor's runtime.
            // Data can also be saved explicitly by calling this.StateManager.SaveStateAsync();
            // State to be saved must be DataContract serializable.
            await this.StateManager.SetStateAsync<MyData>(
                "my_data",  // state name
                data);      // data saved for the named state "my_data"

            return "Success";
        }

        /// <summary>
        /// Get MyData from actor's private state store
        /// </summary>
        /// <return>the user-defined MyData which is stored into state store as "my_data" state</return>
        public Task<MyData> GetDataAsync()
        {
            // Gets state from the state store.
            return this.StateManager.GetStateAsync<MyData>("my_data");
        }

        /// <summary>
        /// Register MyReminder reminder with the actor
        /// </summary>
        public async Task RegisterReminder()
        {
            await this.RegisterReminderAsync(
                "MyReminder",              // The name of the reminder
                null,                      // User state passed to IRemindable.ReceiveReminderAsync()
                TimeSpan.FromSeconds(5),   // Time to delay before invoking the reminder for the first time
                TimeSpan.FromSeconds(5));  // Time interval between reminder invocations after the first invocation
        }

        /// <summary>
        /// Get MyReminder reminder details with the actor
        /// </summary>
        public async Task<IActorReminder> GetReminder()
        {
            return await this.GetReminderAsync("MyReminder");
        }

        /// <summary>
        /// Unregister MyReminder reminder with the actor
        /// </summary>
        public Task UnregisterReminder()
        {
            Console.WriteLine("Unregistering MyReminder...");
            return this.UnregisterReminderAsync("MyReminder");
        }

        /// <summary>
        /// Implement IRemindable.ReceiveReminderAsync(), the callback invoked when an actor reminder is triggered.
        /// </summary>
        public Task ReceiveReminderAsync(string reminderName, byte[] state, TimeSpan dueTime, TimeSpan period)
        {
            Console.WriteLine("ReceiveReminderAsync is called!");
            return Task.CompletedTask;
        }

        /// <summary>
        /// Register MyTimer timer with the actor
        /// </summary>
        public Task RegisterTimer()
        {
            return this.RegisterTimerAsync(
                "MyTimer",                  // The name of the timer
                nameof(this.OnTimerCallBack),       // Timer callback
                null,                       // User state passed to OnTimerCallback()
                TimeSpan.FromSeconds(5),    // Time to delay before the async callback is first invoked
                TimeSpan.FromSeconds(5));   // Time interval between invocations of the async callback
        }

        /// <summary>
        /// Unregister MyTimer timer with the actor
        /// </summary>
        public Task UnregisterTimer()
        {
            Console.WriteLine("Unregistering MyTimer...");
            return this.UnregisterTimerAsync("MyTimer");
        }

        /// <summary>
        /// Timer callback once timer is expired
        /// </summary>
        private Task OnTimerCallBack(byte[] data)
        {
            Console.WriteLine("OnTimerCallBack is called!");
            return Task.CompletedTask;
        }
    }
}

Register actor runtime with ASP.NET Core startup

The Actor runtime is configured through ASP.NET Core Startup.cs.

The runtime uses the ASP.NET Core dependency injection system to register actor types and essential services. This integration is provided through the AddActors(...) method call in ConfigureServices(...). Use the delegate passed to AddActors(...) to register actor types and configure actor runtime settings. You can register additional types for dependency injection inside ConfigureServices(...). These will be available to be injected into the constructors of your Actor types.

Actors are implemented via HTTP calls with the Dapr runtime. This functionality is part of the application’s HTTP processing pipeline and is registered inside UseEndpoints(...) inside Configure(...).

Paste the following code into Startup.cs in the MyActorService project:

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

namespace MyActorService
{
    public class Startup
    {
        public void ConfigureServices(IServiceCollection services)
        {
            services.AddActors(options =>
            {
                // Register actor types and configure actor settings
                options.Actors.RegisterActor<MyActor>();
            });
        }

        public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
        {
            if (env.IsDevelopment())
            {
                app.UseDeveloperExceptionPage();
            }

            app.UseRouting();

            // Register actors handlers that interface with the Dapr runtime.
            app.UseEndpoints(endpoints =>
            {
                endpoints.MapActorsHandlers();
            });
        }
    }
}

Step 3: Add a client

Create a simple console app to call the actor service. The Dapr SDK provides an actor proxy client to invoke the actor methods defined in the actor interface.

Create actor client project and add dependencies

# Create Actor's Client
dotnet new console -o MyActorClient

cd MyActorClient

# Add Dapr.Actors nuget package. Please use the latest package version from nuget.org
dotnet add package Dapr.Actors

# Add Actor Interface reference
dotnet add reference ../MyActor.Interfaces/MyActor.Interfaces.csproj

cd ..

Invoke actor methods with strongly-typed client

You can use ActorProxy.Create<IMyActor>(..) to create a strongly-typed client and invoke methods on the actor.

Paste the following code into Program.cs in the MyActorClient project:

using System;
using System.Threading.Tasks;
using Dapr.Actors;
using Dapr.Actors.Client;
using MyActor.Interfaces;

namespace MyActorClient
{
    class Program
    {
        static async Task Main(string[] args)
        {
            Console.WriteLine("Startup up...");

            // Registered Actor Type in Actor Service
            var actorType = "MyActor";

            // An ActorId uniquely identifies an actor instance
            // If the actor matching this id does not exist, it will be created
            var actorId = new ActorId("1");

            // Create the local proxy by using the same interface that the service implements.
            //
            // You need to provide the type and id so the actor can be located. 
            var proxy = ActorProxy.Create<IMyActor>(actorId, actorType);

            // Now you can use the actor interface to call the actor's methods.
            Console.WriteLine($"Calling SetDataAsync on {actorType}:{actorId}...");
            var response = await proxy.SetDataAsync(new MyData()
            {
                PropertyA = "ValueA",
                PropertyB = "ValueB",
            });
            Console.WriteLine($"Got response: {response}");

            Console.WriteLine($"Calling GetDataAsync on {actorType}:{actorId}...");
            var savedData = await proxy.GetDataAsync();
            Console.WriteLine($"Got response: {savedData}");
        }
    }
}

Running the code

You can now use the projects you’ve created to test the sample.

  1. Run MyActorService

    Since MyActorService is hosting actors, it needs to be run with the Dapr CLI.

    cd MyActorService
    dapr run --app-id myapp --app-port 5000 --dapr-http-port 3500 -- dotnet run
    

    You will see commandline output from both daprd and MyActorService in this terminal. You should see something like the following, which indicates that the application started successfully.

    ...
    ℹ️  Updating metadata for app command: dotnet run
    ✅  You're up and running! Both Dapr and your app logs will appear here.
    
    == APP == info: Microsoft.Hosting.Lifetime[0]
    
    == APP ==       Now listening on: https://localhost:5001
    
    == APP == info: Microsoft.Hosting.Lifetime[0]
    
    == APP ==       Now listening on: http://localhost:5000
    
    == APP == info: Microsoft.Hosting.Lifetime[0]
    
    == APP ==       Application started. Press Ctrl+C to shut down.
    
    == APP == info: Microsoft.Hosting.Lifetime[0]
    
    == APP ==       Hosting environment: Development
    
    == APP == info: Microsoft.Hosting.Lifetime[0]
    
    == APP ==       Content root path: /Users/ryan/actortest/MyActorService
    
  2. Run MyActorClient

    MyActorClient acts as the client, and it can be run normally with dotnet run.

    Open a new terminal and navigate to the MyActorClient directory. Then run the project with:

    dotnet run
    

    You should see commandline output like:

    Startup up...
    Calling SetDataAsync on MyActor:1...
    Got response: Success
    Calling GetDataAsync on MyActor:1...
    Got response: PropertyA: ValueA, PropertyB: ValueB
    

💡 This sample relies on a few assumptions. The default listening port for an ASP.NET Core web project is 5000, which is being passed to dapr run as --app-port 5000. The default HTTP port for the Dapr sidecar is 3500. We’re telling the sidecar for MyActorService to use 3500 so that MyActorClient can rely on the default value.

Now you have successfully created an actor service and client. See the related links section to learn more.

1.3 - Dapr Workflow .NET SDK

Get up and running with Dapr Workflow and the Dapr .NET SDK

1.3.1 - DaprWorkflowClient usage

Essential tips and advice for using DaprWorkflowClient

Lifetime management

A DaprWorkflowClient holds access to networking resources in the form of TCP sockets used to communicate with the Dapr sidecar as well as other types used in the management and operation of Workflows. DaprWorkflowClient implements IAsyncDisposable to support eager cleanup of resources.
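For example, when your code owns the client instance directly (rather than resolving it from dependency injection, where the container manages disposal), an await using scope ensures the cleanup runs deterministically; the variable name below is illustrative:

```csharp
// 'workflowClient' is a DaprWorkflowClient instance that your code owns.
// (When resolved from dependency injection, the container disposes it for you.)
await using (workflowClient)
{
    // Use the client to schedule and query workflows here.
}
// DisposeAsync has now run, releasing the underlying network resources.
```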

Dependency Injection

The AddDaprWorkflow() method will register the Dapr workflow services with ASP.NET Core dependency injection. This method requires an options delegate that defines each of the workflows and activities you wish to register and use in your application.

Singleton Registration

By default, the AddDaprWorkflow method will register the DaprWorkflowClient and associated services using a singleton lifetime. This means that the services will be instantiated only a single time.

The following is an example of how registration of the DaprWorkflowClient would appear in a typical Program.cs file:

builder.Services.AddDaprWorkflow(options => {
    options.RegisterWorkflow<YourWorkflow>();
    options.RegisterActivity<YourActivity>();
});

var app = builder.Build();
await app.RunAsync();

Scoped Registration

While this may generally be acceptable in your use case, you may instead wish to override the lifetime specified. This is done by passing a ServiceLifetime argument in AddDaprWorkflow. For example, you may wish to inject another scoped service into your ASP.NET Core processing pipeline that needs context used by the DaprClient that wouldn’t be available if the former service were registered as a singleton.

This is demonstrated in the following example:

builder.Services.AddDaprWorkflow(options => {
    options.RegisterWorkflow<YourWorkflow>();
    options.RegisterActivity<YourActivity>();
}, ServiceLifetime.Scoped);

var app = builder.Build();
await app.RunAsync();

Transient Registration

Finally, Dapr services can also be registered using a transient lifetime meaning that they will be initialized every time they’re injected. This is demonstrated in the following example:

builder.Services.AddDaprWorkflow(options => {
    options.RegisterWorkflow<YourWorkflow>();
    options.RegisterActivity<YourActivity>();
}, ServiceLifetime.Transient);

var app = builder.Build();
await app.RunAsync();

Injecting Services into Workflow Activities

Workflow activities support the same dependency injection that developers have come to expect of modern C# applications. Assuming a proper registration at startup, any such type can be injected into the constructor of the workflow activity and available to utilize during the execution of the workflow. This makes it simple to add logging via an injected ILogger or access to other Dapr building blocks by injecting DaprClient or DaprJobsClient, for example.

internal sealed class SquareNumberActivity : WorkflowActivity<int, int>
{
    private readonly ILogger _logger;
    
    public SquareNumberActivity(ILogger<SquareNumberActivity> logger)
    {
        this._logger = logger;
    }
    
    public override Task<int> RunAsync(WorkflowActivityContext context, int input) 
    {
        this._logger.LogInformation("Squaring the value {number}", input);
        var result = input * input;
        this._logger.LogInformation("Got a result of {squareResult}", result);
        
        return Task.FromResult(result);
    }
}

Using ILogger in Workflow

Because workflows must be deterministic, it is not possible to inject arbitrary services into them. For example, if you could inject a standard ILogger into a workflow and it needed to be replayed because of an error, each subsequent replay from the event source log would record additional log entries for operations that didn’t actually take place a second or third time because their results were sourced from the log. This has the potential to introduce a significant amount of confusion. Instead, a replay-safe logger is made available for use within workflows. It only logs events the first time the workflow runs and logs nothing while the workflow is being replayed.

This logger can be retrieved from a method on the WorkflowContext available on your workflow instance and used precisely as you would any other ILogger instance.

An end-to-end sample demonstrating this can be seen in the .NET SDK repository but a brief extraction of this sample is available below.

public class OrderProcessingWorkflow : Workflow<OrderPayload, OrderResult>
{
    public override async Task<OrderResult> RunAsync(WorkflowContext context, OrderPayload order)
    {
        string orderId = context.InstanceId;
        var logger = context.CreateReplaySafeLogger<OrderProcessingWorkflow>(); //Use this method to access the logger instance

        logger.LogInformation("Received order {orderId} for {quantity} {name} at ${totalCost}", orderId, order.Quantity, order.Name, order.TotalCost);
        
        //...
    }
}

1.3.2 - How to: Author and manage Dapr Workflow in the .NET SDK

Learn how to author and manage Dapr Workflow using the .NET SDK

Let’s create a Dapr workflow and invoke it using the console. In the provided order processing workflow example, the console prompts provide directions on how to both purchase and restock items. In this guide, you will:

  • Deploy a .NET console application (WorkflowConsoleApp).
  • Utilize the .NET workflow SDK and API calls to start and query workflow instances.

In the .NET example project:

  • The main Program.cs file contains the setup of the app, including the registration of the workflow and workflow activities.
  • The workflow definition is found in the Workflows directory.
  • The workflow activity definitions are found in the Activities directory.

Prerequisites

Set up the environment

Clone the .NET SDK repo.

git clone https://github.com/dapr/dotnet-sdk.git

From the .NET SDK root directory, navigate to the Dapr Workflow example.

cd examples/Workflow

Run the application locally

To run the Dapr application, you need to start the .NET program and a Dapr sidecar. Navigate to the WorkflowConsoleApp directory.

cd WorkflowConsoleApp

Start the program.

dotnet run

In a new terminal, navigate again to the WorkflowConsoleApp directory and run the Dapr sidecar alongside the program.

dapr run --app-id wfapp --dapr-grpc-port 4001 --dapr-http-port 3500

Dapr listens for HTTP requests at http://localhost:3500 and internal workflow gRPC requests at http://localhost:4001.

Start a workflow

To start a workflow, you have two options:

  1. Follow the directions from the console prompts.
  2. Use the workflow API and send a request to Dapr directly.

This guide focuses on the workflow API option.

Run the following command to start a workflow:

curl -i -X POST http://localhost:3500/v1.0/workflows/dapr/OrderProcessingWorkflow/start?instanceID=12345678 \
  -H "Content-Type: application/json" \
  -d '{"Name": "Paperclips", "TotalCost": 99.95, "Quantity": 1}'

On PowerShell, use backticks for line continuation instead:

curl -i -X POST http://localhost:3500/v1.0/workflows/dapr/OrderProcessingWorkflow/start?instanceID=12345678 `
  -H "Content-Type: application/json" `
  -d '{"Name": "Paperclips", "TotalCost": 99.95, "Quantity": 1}'

If successful, you should see a response like the following:

{"instanceID":"12345678"}

Send an HTTP request to get the status of the workflow that was started:

curl -i -X GET http://localhost:3500/v1.0/workflows/dapr/12345678

The workflow is designed to take several seconds to complete. If the workflow hasn’t completed when you issue the HTTP request, you’ll see the following JSON response (formatted for readability) with workflow status as RUNNING:

{
  "instanceID": "12345678",
  "workflowName": "OrderProcessingWorkflow",
  "createdAt": "2023-05-10T00:42:03.911444105Z",
  "lastUpdatedAt": "2023-05-10T00:42:06.142214153Z",
  "runtimeStatus": "RUNNING",
  "properties": {
    "dapr.workflow.custom_status": "",
    "dapr.workflow.input": "{\"Name\": \"Paperclips\", \"TotalCost\": 99.95, \"Quantity\": 1}"
  }
}

Once the workflow has completed running, you should see the following output, indicating that it has reached the COMPLETED status:

{
  "instanceID": "12345678",
  "workflowName": "OrderProcessingWorkflow",
  "createdAt": "2023-05-10T00:42:03.911444105Z",
  "lastUpdatedAt": "2023-05-10T00:42:18.527704176Z",
  "runtimeStatus": "COMPLETED",
  "properties": {
    "dapr.workflow.custom_status": "",
    "dapr.workflow.input": "{\"Name\": \"Paperclips\", \"TotalCost\": 99.95, \"Quantity\": 1}",
    "dapr.workflow.output": "{\"Processed\":true}"
  }
}

When the workflow has completed, the stdout of the workflow app should look like:

info: WorkflowConsoleApp.Activities.NotifyActivity[0]
      Received order 12345678 for Paperclips at $99.95
info: WorkflowConsoleApp.Activities.ReserveInventoryActivity[0]
      Reserving inventory: 12345678, Paperclips, 1
info: WorkflowConsoleApp.Activities.ProcessPaymentActivity[0]
      Processing payment: 12345678, 99.95, USD
info: WorkflowConsoleApp.Activities.NotifyActivity[0]
      Order 12345678 processed successfully!

If you have Zipkin configured for Dapr locally on your machine, then you can view the workflow trace spans in the Zipkin web UI (typically at http://localhost:9411/zipkin/).

Demo

Watch this video demonstrating .NET Workflow:

Next steps

1.4 - Dapr AI .NET SDK

Get up and running with the Dapr AI .NET SDK

With the Dapr AI package, you can interact with the Dapr AI workloads from a .NET application.

Today, Dapr provides the Conversational API to engage with large language models. To get started with this workload, walk through the Dapr Conversational AI how-to guide.

1.4.1 - Dapr AI Client

Learn how to create Dapr AI clients

The Dapr AI client package allows you to interact with the AI capabilities provided by the Dapr sidecar.

Lifetime management

A DaprConversationClient is a version of the Dapr client that is dedicated to interacting with the Dapr Conversation API. It can be registered alongside a DaprClient and other Dapr clients without issue.

It maintains access to networking resources in the form of TCP sockets used to communicate with the Dapr sidecar.

For best performance, create a single long-lived instance of DaprConversationClient and provide access to that shared instance throughout your application. DaprConversationClient instances are thread-safe and intended to be shared.

This can be aided by utilizing the dependency injection functionality. The registration method supports registration as a singleton, a scoped instance or as transient (meaning it’s recreated every time it’s injected), but also enables registration to utilize values from an IConfiguration or other injected service in a way that’s impractical when creating the client from scratch in each of your classes.

Avoid creating a DaprConversationClient for each operation.

Configuring DaprConversationClient via DaprConversationClientBuilder

A DaprConversationClient can be configured by invoking methods on the DaprConversationClientBuilder class before calling .Build() to create the client itself. The settings for each DaprConversationClient are separate and cannot be changed after calling .Build().

var daprConversationClient = new DaprConversationClientBuilder()
    .UseDaprApiToken("abc123") // Specify the API token used to authenticate to other Dapr sidecars
    .Build();

The DaprConversationClientBuilder contains settings for:

  • The HTTP endpoint of the Dapr sidecar
  • The gRPC endpoint of the Dapr sidecar
  • The JsonSerializerOptions object used to configure JSON serialization
  • The GrpcChannelOptions object used to configure gRPC
  • The API token used to authenticate requests to the sidecar
  • The factory method used to create the HttpClient instance used by the SDK
  • The timeout used for the HttpClient instance when making requests to the sidecar

The SDK will read the following environment variables to configure the default values:

  • DAPR_HTTP_ENDPOINT: used to find the HTTP endpoint of the Dapr sidecar, example: https://dapr-api.mycompany.com
  • DAPR_GRPC_ENDPOINT: used to find the gRPC endpoint of the Dapr sidecar, example: https://dapr-grpc-api.mycompany.com
  • DAPR_HTTP_PORT: if DAPR_HTTP_ENDPOINT is not set, this is used to find the HTTP local endpoint of the Dapr sidecar
  • DAPR_GRPC_PORT: if DAPR_GRPC_ENDPOINT is not set, this is used to find the gRPC local endpoint of the Dapr sidecar
  • DAPR_API_TOKEN: used to set the API token
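For example, these variables can be exported in the shell before launching your application; the endpoint and token values below are placeholders:

```shell
# Point the SDK at a remote Dapr API endpoint and supply an API token.
# All values shown are placeholders.
export DAPR_HTTP_ENDPOINT="https://dapr-api.mycompany.com"
export DAPR_GRPC_ENDPOINT="https://dapr-grpc-api.mycompany.com"
export DAPR_API_TOKEN="your-api-token"
```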

Configuring gRPC channel options

Dapr’s use of CancellationToken for cancellation relies on the configuration of the gRPC channel options. If you need to configure these options yourself, make sure to enable the ThrowOperationCanceledOnCancellation setting.

var daprConversationClient = new DaprConversationClientBuilder()
    .UseGrpcChannelOptions(new GrpcChannelOptions { ... ThrowOperationCanceledOnCancellation = true })
    .Build();

Using cancellation with DaprConversationClient

The APIs on DaprConversationClient perform asynchronous operations and accept an optional CancellationToken parameter. This follows a standard .NET practice for cancellable operations. Note that when cancellation occurs, there is no guarantee that the remote endpoint stops processing the request, only that the client has stopped waiting for completion.

When an operation is cancelled, it will throw an OperationCanceledException.
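A minimal sketch of applying a timeout-based cancellation to a conversation call follows; the component name, inputs variable, and exact ConverseAsync parameters are illustrative, so consult the DaprConversationClient API reference for the precise signature:

```csharp
using System;
using System.Threading;

// Cancel automatically if the sidecar hasn't responded within 20 seconds.
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(20));

try
{
    // 'conversationClient' and 'inputs' are assumed to exist in scope.
    var response = await conversationClient.ConverseAsync(
        "myConversationComponent",
        inputs,
        cancellationToken: cts.Token);
}
catch (OperationCanceledException)
{
    // The client stopped waiting; the sidecar may still be processing the request.
    Console.WriteLine("The conversation request was cancelled.");
}
```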

Configuring DaprConversationClient via dependency injection

Using the built-in extension methods for registering the DaprConversationClient in a dependency injection container can provide the benefit of registering the long-lived service a single time, centralizing complex configuration, and improving performance by ensuring similarly long-lived resources are re-purposed when possible (e.g. HttpClient instances).

There are three overloads available to give the developer the greatest flexibility in configuring the client for their scenario. Each of these will register the IHttpClientFactory on your behalf if not already registered, and configure the DaprConversationClientBuilder to use it when creating the HttpClient instance in order to re-use the same instance as much as possible and avoid socket exhaustion and other issues.

In the first approach, there’s no configuration done by the developer and the DaprConversationClient is configured with the default settings.

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprConversationClient(); //Registers the `DaprConversationClient` to be injected as needed
var app = builder.Build();

Sometimes the developer will need to configure the created client using the various configuration options detailed above. This is done through an overload that passes in the DaprConversationClientBuilder and exposes methods for configuring the necessary options.

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprConversationClient((_, daprConversationClientBuilder) => {
   //Set the API token
   daprConversationClientBuilder.UseDaprApiToken("abc123");
   //Specify a non-standard HTTP endpoint
   daprConversationClientBuilder.UseHttpEndpoint("http://dapr.my-company.com");
});

var app = builder.Build();

Finally, it’s possible that the developer may need to retrieve information from another service in order to populate these configuration values. That value may be provided from a DaprClient instance, a vendor-specific SDK or some local service, but as long as it’s also registered in DI, it can be injected into this configuration operation via the last overload:

var builder = WebApplication.CreateBuilder(args);

//Register a fictional service that retrieves secrets from somewhere
builder.Services.AddSingleton<SecretService>();

builder.Services.AddDaprConversationClient((serviceProvider, daprConversationClientBuilder) => {
    //Retrieve an instance of the `SecretService` from the service provider
    var secretService = serviceProvider.GetRequiredService<SecretService>();
    var daprApiToken = secretService.GetSecret("DaprApiToken").Value;

    //Configure the `DaprConversationClientBuilder`
    daprConversationClientBuilder.UseDaprApiToken(daprApiToken);
});

var app = builder.Build();

1.4.2 - How to: Create and use Dapr AI Conversations in the .NET SDK

Learn how to create and use the Dapr Conversational AI client using the .NET SDK

Prerequisites

Installation

To get started with the Dapr AI .NET SDK client, install the Dapr.AI package from NuGet:

dotnet add package Dapr.AI

A DaprConversationClient maintains access to networking resources in the form of TCP sockets used to communicate with the Dapr sidecar.

Dependency Injection

The AddDaprAiConversation() method will register the Dapr client with ASP.NET Core dependency injection and is the recommended approach for using this package. This method accepts an optional options delegate for configuring the DaprConversationClient and a ServiceLifetime argument, allowing you to specify a different lifetime for the registered services instead of the default Singleton value.

The following example assumes all default values are acceptable and is sufficient to register the DaprConversationClient:

services.AddDaprAiConversation();

The optional configuration delegate is used to configure the DaprConversationClient by specifying options on the DaprConversationClientBuilder as in the following example:

services.AddSingleton<DefaultOptionsProvider>();
services.AddDaprAiConversation((serviceProvider, clientBuilder) => {
     //Inject a service to source a value from
     var optionsProvider = serviceProvider.GetRequiredService<DefaultOptionsProvider>();
     var standardTimeout = optionsProvider.GetStandardTimeout();
     
     //Configure the value on the client builder
     clientBuilder.UseTimeout(standardTimeout);
});

Manual Instantiation

Rather than using dependency injection, a DaprConversationClient can also be built using the static client builder.

For best performance, create a single long-lived instance of DaprConversationClient and provide access to that shared instance throughout your application. DaprConversationClient instances are thread-safe and intended to be shared.

Avoid creating a DaprConversationClient per-operation.

A DaprConversationClient can be configured by invoking methods on the DaprConversationClientBuilder class before calling .Build() to create the client. The settings for each DaprConversationClient are separate and cannot be changed after calling .Build().

var daprConversationClient = new DaprConversationClientBuilder()
    .UseJsonSerializerSettings( ... ) //Configure JSON serializer
    .Build();

See the .NET documentation here for more information about the options available when configuring the Dapr client via the builder.

Try it out

Put the Dapr AI .NET SDK to the test. Walk through the samples to see Dapr in action:

SDK Samples | Description
SDK samples | Clone the SDK repo to try out some examples and get started.

Building Blocks

This part of the .NET SDK allows you to interface with the Conversations API to send and receive messages from large language models.

1.5 - Dapr Jobs .NET SDK

Get up and running with Dapr Jobs and the Dapr .NET SDK

With the Dapr Job package, you can interact with the Dapr Job APIs from a .NET application to trigger future operations to run according to a predefined schedule with an optional payload.

To get started, walk through the Dapr Jobs how-to guide and refer to best practices documentation for additional guidance.

1.5.1 - How to: Author and manage Dapr Jobs in the .NET SDK

Learn how to author and manage Dapr Jobs using the .NET SDK

Let’s create an endpoint that will be invoked by Dapr Jobs when it triggers, then schedule the job in the same app. We’ll use the simple example provided here for the following demonstration, walking through how you can schedule one-time or recurring jobs using either an interval or a Cron expression. In this guide, you will:

  • Deploy a .NET Web API application (JobsSample)
  • Utilize the Dapr .NET Jobs SDK to schedule a job invocation and set up the endpoint to be triggered

In the .NET example project:

  • The main Program.cs file comprises the entirety of this demonstration.

Prerequisites

Set up the environment

Clone the .NET SDK repo.

git clone https://github.com/dapr/dotnet-sdk.git

From the .NET SDK root directory, navigate to the Dapr Jobs example.

cd examples/Jobs

Run the application locally

To run the Dapr application, you need to start the .NET program and a Dapr sidecar. Navigate to the JobsSample directory.

cd JobsSample

We’ll run a command that starts both the Dapr sidecar and the .NET program at the same time.

dapr run --app-id jobsapp --dapr-grpc-port 4001 --dapr-http-port 3500 -- dotnet run

Dapr listens for HTTP requests at http://localhost:3500 and internal Jobs gRPC requests at http://localhost:4001.

Register the Dapr Jobs client with dependency injection

The Dapr Jobs SDK provides an extension method to simplify the registration of the Dapr Jobs client. Before completing the dependency injection registration in Program.cs, add the following line:

var builder = WebApplication.CreateBuilder(args);

//Add anywhere between these two lines
builder.Services.AddDaprJobsClient();

var app = builder.Build();

Note that in today’s implementation of the Jobs API, the app that schedules the job will also be the app that receives the trigger notification. In other words, you cannot schedule a trigger to run in another application. As a result, while you don’t explicitly need the Dapr Jobs client to be registered in your application to schedule a trigger invocation endpoint, your endpoint will never be invoked without the same app also scheduling the job somehow (whether via this Dapr Jobs .NET SDK or an HTTP call to the sidecar).

It’s possible that you may want to provide some configuration options to the Dapr Jobs client that should be present with each call to the sidecar such as a Dapr API token, or you want to use a non-standard HTTP or gRPC endpoint. This is possible through use of an overload of the registration method that allows configuration of a DaprJobsClientBuilder instance:

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprJobsClient((_, daprJobsClientBuilder) =>
{
    daprJobsClientBuilder.UseDaprApiToken("abc123");
    daprJobsClientBuilder.UseHttpEndpoint("http://localhost:8512"); //Non-standard sidecar HTTP endpoint
});

var app = builder.Build();

Still, it’s possible that whatever values you wish to inject need to be retrieved from some other source, itself registered as a dependency. There’s one more overload you can use to inject an IServiceProvider into the configuration action method. In the following example, we register a fictional singleton that can retrieve secrets from somewhere and pass it into the configuration method for AddDaprJobsClient so we can retrieve our Dapr API token for registration here:

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddSingleton<SecretRetriever>();
builder.Services.AddDaprJobsClient((serviceProvider, daprJobsClientBuilder) =>
{
    var secretRetriever = serviceProvider.GetRequiredService<SecretRetriever>();
    var daprApiToken = secretRetriever.GetSecret("DaprApiToken").Value;
    daprJobsClientBuilder.UseDaprApiToken(daprApiToken);

    daprJobsClientBuilder.UseHttpEndpoint("http://localhost:8512");
});

var app = builder.Build();

Use the Dapr Jobs client using IConfiguration

It’s also possible to configure the Dapr Jobs client using values from your registered IConfiguration, without explicitly specifying each override on the DaprJobsClientBuilder as demonstrated in the previous section. Rather, by populating an IConfiguration made available through dependency injection, the AddDaprJobsClient() registration will automatically use these values over their respective defaults.

Start by populating the values in your configuration. This can be done in several different ways as demonstrated below.

Configuration via ConfigurationBuilder

Application settings can be configured without using a configuration source and by instead populating the value in-memory using a ConfigurationBuilder instance:

var builder = WebApplication.CreateBuilder();

//Create the configuration
var configuration = new ConfigurationBuilder()
    .AddInMemoryCollection(new Dictionary<string, string> {
            { "DAPR_HTTP_ENDPOINT", "http://localhost:54321" },
            { "DAPR_API_TOKEN", "abc123" }
        })
    .Build();

builder.Configuration.AddConfiguration(configuration);
builder.Services.AddDaprJobsClient(); //This will automatically populate the HTTP endpoint and API token values from the IConfiguration

Configuration via Environment Variables

Application settings can be accessed from environment variables available to your application.

The following environment variables will be used to populate both the HTTP endpoint and API token used to register the Dapr Jobs client.

Key | Value
DAPR_HTTP_ENDPOINT | http://localhost:54321
DAPR_API_TOKEN | abc123

var builder = WebApplication.CreateBuilder();

builder.Configuration.AddEnvironmentVariables();
builder.Services.AddDaprJobsClient();

The Dapr Jobs client will be configured to use both the HTTP endpoint http://localhost:54321 and populate all outbound requests with the API token header abc123.

Configuration via prefixed Environment Variables

However, in shared-host scenarios where multiple applications run on the same machine without containers, or in development environments, it’s not uncommon to prefix environment variables. The following example assumes that both the HTTP endpoint and the API token will be pulled from environment variables prefixed with the value “myapp_”. The two environment variables used in this scenario are as follows:

Key | Value
myapp_DAPR_HTTP_ENDPOINT | http://localhost:54321
myapp_DAPR_API_TOKEN | abc123

These environment variables will be loaded into the registered configuration in the following example and made available without the prefix attached.

var builder = WebApplication.CreateBuilder();

builder.Configuration.AddEnvironmentVariables(prefix: "myapp_");
builder.Services.AddDaprJobsClient();

The Dapr Jobs client will be configured to use both the HTTP endpoint http://localhost:54321 and populate all outbound requests with the API token header abc123.

Use the Dapr Jobs client without relying on dependency injection

While the use of dependency injection simplifies the use of complex types in .NET and makes it easier to deal with complicated configurations, you’re not required to register the DaprJobsClient in this way. Rather, you can also elect to create an instance of it from a DaprJobsClientBuilder instance as demonstrated below:


public class MySampleClass
{
    public void DoSomething()
    {
        var daprJobsClientBuilder = new DaprJobsClientBuilder();
        var daprJobsClient = daprJobsClientBuilder.Build();

        //Do something with the `daprJobsClient`
    }
}

Set up an endpoint to be invoked when the job is triggered

It’s easy to set up a jobs endpoint if you’re at all familiar with minimal APIs in ASP.NET Core, as the syntax is the same.

Once dependency injection registration has been completed, configure the application the same way you would to handle mapping an HTTP request via the minimal API functionality in ASP.NET Core. Implemented as an extension method, pass the name of the job it should be responsive to and a delegate. Services can be injected into the delegate’s arguments as you wish and the job payload can be accessed from the ReadOnlyMemory<byte> originally provided to the job registration.

There are two delegates you can use here. One provides an IServiceProvider in case you need to inject other services into the handler:

//We have this from the example above
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprJobsClient();

var app = builder.Build();

//Add our endpoint registration
app.MapDaprScheduledJob("myJob", (IServiceProvider serviceProvider, string jobName, ReadOnlyMemory<byte> jobPayload) => {
    var logger = serviceProvider.GetService<ILogger>();
    logger?.LogInformation("Received trigger invocation for '{jobName}'", "myJob");

    //Do something...
});

app.Run();

The other overload of the delegate doesn’t require an IServiceProvider if not necessary:

//We have this from the example above
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprJobsClient();

var app = builder.Build();

//Add our endpoint registration
app.MapDaprScheduledJob("myJob", (string jobName, ReadOnlyMemory<byte> jobPayload) => {
    //Do something...
});

app.Run();

Support cancellation tokens when processing mapped invocations

You may want to ensure that timeouts are handled on job invocations so that they don’t indefinitely hang and use system resources. When setting up the job mapping, there’s an optional TimeSpan parameter that can be provided as the last argument to specify a timeout for the request. Every time the job mapping invocation is triggered, a new CancellationTokenSource will be created using this timeout parameter and a CancellationToken will be created from it to put an upper bound on the processing of the request. If a timeout isn’t provided, this defaults to CancellationToken.None and a timeout will not be automatically applied to the mapping.

//We have this from the example above
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprJobsClient();

var app = builder.Build();

//Add our endpoint registration
app.MapDaprScheduledJob("myJob", (string jobName, ReadOnlyMemory<byte> jobPayload) => {
    //Do something...
}, TimeSpan.FromSeconds(15)); //Assigns a maximum timeout of 15 seconds for handling the invocation request

app.Run();

Register the job

Finally, we have to register the job we want scheduled. Note that from here, all SDK methods have cancellation token support and use a default token if not otherwise set.

There are three different ways to set up a job that vary based on how you want to configure the schedule. The following shows the different arguments available when scheduling a job:

Argument Name | Type | Description | Required
jobName | string | The name of the job being scheduled. | Yes
schedule | DaprJobSchedule | The schedule defining when the job will be triggered. | Yes
payload | ReadOnlyMemory<byte> | Job data provided to the invocation endpoint when triggered. | No
startingFrom | DateTime | The point in time from which the job schedule should start. | No
repeats | int | The maximum number of times the job should be triggered. | No
ttl | | When the job should expire and no longer trigger. | No
overwrite | bool | A flag indicating whether an existing job should be overwritten when submitted, or false to require that an existing job with the same name be deleted first. | No
cancellationToken | CancellationToken | Used to cancel out of the operation early, e.g. because of an operation timeout. | No

DaprJobSchedule

All jobs are scheduled via the SDK using the DaprJobSchedule, which creates an expression passed to the runtime to schedule jobs. There are several static methods exposed on the DaprJobSchedule to facilitate easy registration of each of the kinds of job schedules available, as follows. This separates specifying the job schedule itself from any additional options like repeating the operation or providing a cancellation token.

One-time job

A one-time job is exactly that; it will run at a single point in time and will not repeat.

This approach requires that you select a job name and specify a time it should be triggered.

DaprJobSchedule.FromDateTime(DateTimeOffset scheduledTime)

One-time jobs can be scheduled from the Dapr Jobs client as in the following example:

public class MyOperation(DaprJobsClient daprJobsClient)
{
    public async Task ScheduleOneTimeJobAsync(CancellationToken cancellationToken)
    {
        var today = DateTimeOffset.UtcNow;
        var threeDaysFromNow = today.AddDays(3);

        var schedule = DaprJobSchedule.FromDateTime(threeDaysFromNow);
        await daprJobsClient.ScheduleJobAsync("job", schedule, cancellationToken: cancellationToken);
    }
}

Interval-based job

An interval-based job is one that runs on a recurring loop configured as a fixed amount of time, not unlike how reminders work in the Actors building block today.

DaprJobSchedule.FromDuration(TimeSpan interval)

Interval-based jobs can be scheduled from the Dapr Jobs client as in the following example:

public class MyOperation(DaprJobsClient daprJobsClient)
{

    public async Task ScheduleIntervalJobAsync(CancellationToken cancellationToken)
    {
        var hourlyInterval = TimeSpan.FromHours(1);

        //Trigger the job hourly, but a maximum of 5 times
        var schedule = DaprJobSchedule.FromDuration(hourlyInterval);
        await daprJobsClient.ScheduleJobAsync("job", schedule, repeats: 5, cancellationToken: cancellationToken);
    }
}

Cron-based job

A Cron-based job is scheduled using a Cron expression. This gives more calendar-based control over when the job is triggered, as it can use calendar-based values in the expression.

DaprJobSchedule.FromCronExpression(string cronExpression)

The Dapr SDK supports two different approaches to scheduling a Cron-based job.

Provide your own Cron expression

You can provide your own Cron expression as a string via DaprJobSchedule.FromExpression():

public class MyOperation(DaprJobsClient daprJobsClient)
{
    public async Task ScheduleCronJobAsync(CancellationToken cancellationToken)
    {
        //At the top of every other hour on the fifth day of the month
        const string cronSchedule = "0 */2 5 * *";
        var schedule = DaprJobSchedule.FromExpression(cronSchedule);

        //Don't start this until next month
        var now = DateTime.UtcNow;
        var oneMonthFromNow = now.AddMonths(1);
        var firstOfNextMonth = new DateTime(oneMonthFromNow.Year, oneMonthFromNow.Month, 1, 0, 0, 0);

        await daprJobsClient.ScheduleJobAsync("myJobName", schedule, startingFrom: firstOfNextMonth, cancellationToken: cancellationToken);
    }
}

Use the CronExpressionBuilder

Alternatively, you can use our fluent builder to produce a valid Cron expression:

public class MyOperation(DaprJobsClient daprJobsClient)
{
    public async Task ScheduleCronJobAsync(CancellationToken cancellationToken)
    {
        //At the top of every other hour on the fifth day of the month
        var cronExpression = new CronExpressionBuilder()
            .Every(EveryCronPeriod.Hour, 2)
            .On(OnCronPeriod.DayOfMonth, 5)
            .ToString();
        var schedule = DaprJobSchedule.FromExpression(cronExpression);

        //Don't start this until next month
        var now = DateTime.UtcNow;
        var oneMonthFromNow = now.AddMonths(1);
        var firstOfNextMonth = new DateTime(oneMonthFromNow.Year, oneMonthFromNow.Month, 1, 0, 0, 0);

        await daprJobsClient.ScheduleJobAsync("myJobName", schedule, startingFrom: firstOfNextMonth, cancellationToken: cancellationToken);
    }
}

Get details of already-scheduled job

If you know the name of an already-scheduled job, you can retrieve its metadata without waiting for it to be triggered. The returned JobDetails exposes a few helpful properties for consuming the information from the Dapr Jobs API:

  • If the Schedule property contains a Cron expression, the IsCronExpression property will be true and the expression will also be available in the CronExpression property.
  • If the Schedule property contains a duration value, the IsIntervalExpression property will instead be true and the value will be converted to a TimeSpan value accessible from the Interval property.

This can be done by using the following:

public class MyOperation(DaprJobsClient daprJobsClient)
{
    public async Task<JobDetails> GetJobDetailsAsync(string jobName, CancellationToken cancellationToken)
    {
        var jobDetails = await daprJobsClient.GetJobAsync(jobName, cancellationToken);
        return jobDetails;
    }
}
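
Putting those properties to use, a minimal sketch (based on the JobDetails properties described above) might branch on the schedule type like this:

```csharp
public class MyScheduleInspector(DaprJobsClient daprJobsClient)
{
    public async Task LogScheduleAsync(string jobName, CancellationToken cancellationToken)
    {
        var jobDetails = await daprJobsClient.GetJobAsync(jobName, cancellationToken);

        if (jobDetails.IsCronExpression)
        {
            //The schedule was registered as a Cron expression
            Console.WriteLine($"'{jobName}' uses the Cron expression {jobDetails.CronExpression}");
        }
        else if (jobDetails.IsIntervalExpression)
        {
            //The schedule was registered as a fixed duration
            Console.WriteLine($"'{jobName}' repeats every {jobDetails.Interval}");
        }
    }
}
```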

Delete a scheduled job

To delete a scheduled job, you’ll need to know its name. From there, it’s as simple as calling the DeleteJobAsync method on the Dapr Jobs client:

public class MyOperation(DaprJobsClient daprJobsClient)
{
    public async Task DeleteJobAsync(string jobName, CancellationToken cancellationToken)
    {
        await daprJobsClient.DeleteJobAsync(jobName, cancellationToken);
    }
}

1.5.2 - DaprJobsClient usage

Essential tips and advice for using DaprJobsClient

Lifetime management

A DaprJobsClient is a version of the Dapr client that is dedicated to interacting with the Dapr Jobs API. It can be registered alongside a DaprClient and other Dapr clients without issue.

It maintains access to networking resources in the form of TCP sockets used to communicate with the Dapr sidecar and implements IDisposable to support the eager cleanup of resources.

For best performance, create a single long-lived instance of DaprJobsClient and provide access to that shared instance throughout your application. DaprJobsClient instances are thread-safe and intended to be shared.

This can be aided by utilizing the dependency injection functionality. The registration method supports registration as a singleton, a scoped instance, or as a transient (meaning it’s recreated every time it’s injected), but also enables registration to utilize values from an IConfiguration or other injected service in a way that’s impractical when creating the client from scratch in each of your classes.

Avoid creating a DaprJobsClient for each operation and disposing it when the operation is complete.

Configuring DaprJobsClient via the DaprJobsClientBuilder

A DaprJobsClient can be configured by invoking methods on the DaprJobsClientBuilder class before calling .Build() to create the client itself. The settings for each DaprJobsClient are separate and cannot be changed after calling .Build().

var daprJobsClient = new DaprJobsClientBuilder()
    .UseDaprApiToken("abc123") // Specify the API token used to authenticate to other Dapr sidecars
    .Build();

The DaprJobsClientBuilder contains settings for:

  • The HTTP endpoint of the Dapr sidecar
  • The gRPC endpoint of the Dapr sidecar
  • The JsonSerializerOptions object used to configure JSON serialization
  • The GrpcChannelOptions object used to configure gRPC
  • The API token used to authenticate requests to the sidecar
  • The factory method used to create the HttpClient instance used by the SDK
  • The timeout used for the HttpClient instance when making requests to the sidecar
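
As a sketch, several of these settings can be combined on a single builder before calling .Build(); the endpoint, token, and timeout values below are placeholders:

```csharp
var daprJobsClient = new DaprJobsClientBuilder()
    .UseHttpEndpoint("http://localhost:3500")  //HTTP endpoint of the Dapr sidecar
    .UseDaprApiToken("abc123")                 //API token attached to outbound requests
    .UseTimeout(TimeSpan.FromSeconds(30))      //Timeout applied to the underlying HttpClient
    .Build();
```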

The SDK will read the following environment variables to configure the default values:

  • DAPR_HTTP_ENDPOINT: used to find the HTTP endpoint of the Dapr sidecar, example: https://dapr-api.mycompany.com
  • DAPR_GRPC_ENDPOINT: used to find the gRPC endpoint of the Dapr sidecar, example: https://dapr-grpc-api.mycompany.com
  • DAPR_HTTP_PORT: if DAPR_HTTP_ENDPOINT is not set, this is used to find the HTTP local endpoint of the Dapr sidecar
  • DAPR_GRPC_PORT: if DAPR_GRPC_ENDPOINT is not set, this is used to find the gRPC local endpoint of the Dapr sidecar
  • DAPR_API_TOKEN: used to set the API token

Configuring gRPC channel options

Dapr’s use of CancellationToken for cancellation relies on the configuration of the gRPC channel options. If you need to configure these options yourself, make sure to enable the ThrowOperationCanceledOnCancellation setting.

var daprJobsClient = new DaprJobsClientBuilder()
    .UseGrpcChannelOptions(new GrpcChannelOptions { ... ThrowOperationCanceledOnCancellation = true })
    .Build();

Using cancellation with DaprJobsClient

The APIs on DaprJobsClient perform asynchronous operations and accept an optional CancellationToken parameter. This follows a standard .NET practice for cancellable operations. Note that when cancellation occurs, there is no guarantee that the remote endpoint stops processing the request, only that the client has stopped waiting for completion.

When an operation is cancelled, it will throw an OperationCanceledException.
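
For example, a caller might bound a scheduling call with a timeout-based token and handle the resulting exception. This is a sketch, and the five-second timeout is an arbitrary value:

```csharp
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(5));

try
{
    var schedule = DaprJobSchedule.FromDuration(TimeSpan.FromHours(1));
    await daprJobsClient.ScheduleJobAsync("myJob", schedule, cancellationToken: cts.Token);
}
catch (OperationCanceledException)
{
    //The client has stopped waiting; the sidecar may still process the request
}
```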

Configuring DaprJobsClient via dependency injection

Using the built-in extension methods for registering the DaprJobsClient in a dependency injection container can provide the benefit of registering the long-lived service a single time, centralizing complex configuration, and improving performance by ensuring similarly long-lived resources are reused when possible (e.g. HttpClient instances).

There are three overloads available to give the developer the greatest flexibility in configuring the client for their scenario. Each of these will register the IHttpClientFactory on your behalf if not already registered, and configure the DaprJobsClientBuilder to use it when creating the HttpClient instance in order to re-use the same instance as much as possible and avoid socket exhaustion and other issues.

In the first approach, there’s no configuration done by the developer and the DaprJobsClient is configured with the default settings.

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprJobsClient(); //Registers the `DaprJobsClient` to be injected as needed
var app = builder.Build();

Sometimes the developer will need to configure the created client using the various configuration options detailed above. This is done through an overload that passes in the DaprJobsClientBuilder and exposes methods for configuring the necessary options.

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprJobsClient((_, daprJobsClientBuilder) => {
   //Set the API token
   daprJobsClientBuilder.UseDaprApiToken("abc123");
   //Specify a non-standard HTTP endpoint
   daprJobsClientBuilder.UseHttpEndpoint("http://dapr.my-company.com");
});

var app = builder.Build();

Finally, it’s possible that the developer may need to retrieve information from another service in order to populate these configuration values. That value may be provided from a DaprClient instance, a vendor-specific SDK or some local service, but as long as it’s also registered in DI, it can be injected into this configuration operation via the last overload:

var builder = WebApplication.CreateBuilder(args);

//Register a fictional service that retrieves secrets from somewhere
builder.Services.AddSingleton<SecretService>();

builder.Services.AddDaprJobsClient((serviceProvider, daprJobsClientBuilder) => {
    //Retrieve an instance of the `SecretService` from the service provider
    var secretService = serviceProvider.GetRequiredService<SecretService>();
    var daprApiToken = secretService.GetSecret("DaprApiToken").Value;

    //Configure the `DaprJobsClientBuilder`
    daprJobsClientBuilder.UseDaprApiToken(daprApiToken);
});

var app = builder.Build();

Understanding payload serialization on DaprJobsClient

While there are many methods on the DaprClient that automatically serialize and deserialize data using the System.Text.Json serializer, this SDK takes a different philosophy. Instead, the relevant methods accept an optional payload of ReadOnlyMemory<byte> meaning that serialization is an exercise left to the developer and is not generally handled by the SDK.

That said, there are some helper extension methods available for each of the scheduling methods. If you know that you want to use a type that’s JSON-serializable, you can use the Schedule*WithPayloadAsync method for each scheduling type that accepts an object as a payload and an optional JsonSerializerOptions to use when serializing the value. This will convert the value to UTF-8 encoded bytes for you as a convenience. Here’s an example of what this might look like when scheduling a Cron expression:

public sealed record Doodad (string Name, int Value);

//...
var doodad = new Doodad("Thing", 100);
await daprJobsClient.ScheduleCronJobWithPayloadAsync("myJob", "5 * * * *", doodad);

In the same vein, if you have a plain string value, you can use an overload of the same method that accepts a string-typed payload; the JSON serialization step is skipped and the value is simply encoded as an array of UTF-8 bytes. Here’s an example of what this might look like when scheduling a one-time job:

var now = DateTime.UtcNow;
var oneWeekFromNow = now.AddDays(7);
await daprJobsClient.ScheduleOneTimeJobWithPayloadAsync("myOtherJob", oneWeekFromNow, "This is a test!");

The delegate handling the job invocation expects at least two arguments to be present:

  • A string that is populated with the jobName, providing the name of the invoked job
  • A ReadOnlyMemory<byte> that is populated with the bytes originally provided during the job registration.

Because the payload is stored as a ReadOnlyMemory<byte>, the developer has the freedom to serialize and deserialize as they wish, but there are again two helper extensions included that can deserialize this to either a JSON-compatible type or a string. Both methods assume that the developer encoded the originally scheduled job (perhaps using the helper serialization methods) as these methods will not force the bytes to represent something they’re not.

To deserialize the bytes to a string, the following helper method can be used:

var payloadAsString = Encoding.UTF8.GetString(jobPayload.Span); //If successful, returns a string with the value
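
For a JSON payload, assuming the job was originally scheduled with one of the JSON helper methods (reusing the hypothetical Doodad record from above), System.Text.Json can deserialize the bytes directly:

```csharp
var doodad = JsonSerializer.Deserialize<Doodad>(jobPayload.Span); //Returns the Doodad instance originally scheduled
```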

Error handling

Methods on DaprJobsClient will throw a DaprJobsServiceException if an issue is encountered between the SDK and the Jobs API service running on the Dapr sidecar. If a failure is encountered because of a poorly formatted request made to the Jobs API service through this SDK, a DaprMalformedJobException will be thrown. In case of illegal argument values, the appropriate standard exception will be thrown (e.g. ArgumentOutOfRangeException or ArgumentNullException) with the name of the offending argument. And for anything else, a DaprException will be thrown.

The most common cases of failure will be related to:

  • Incorrect argument formatting while engaging with the Jobs API
  • Transient failures such as a networking problem
  • Invalid data, such as a failure to deserialize a value into a type it wasn’t originally serialized from

In any of these cases, you can examine more exception details through the .InnerException property.
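As a hedged sketch of catching these exception types in order of specificity (the scheduling call mirrors the earlier example and assumes a live sidecar; the handler bodies are illustrative):

```csharp
try
{
    // Any DaprJobsClient call, e.g. the one-time job scheduled earlier
    await daprJobsClient.ScheduleOneTimeJobWithPayloadAsync("myJob", DateTime.UtcNow.AddDays(7), "This is a test!");
}
catch (DaprMalformedJobException ex)
{
    // The request was rejected as poorly formatted
    Console.WriteLine($"Malformed job request: {ex.Message}");
}
catch (DaprJobsServiceException ex)
{
    // A failure between the SDK and the Jobs API, e.g. a networking problem;
    // inspect InnerException for the underlying cause
    Console.WriteLine($"Jobs API failure: {ex.InnerException?.Message}");
}
catch (ArgumentException ex)
{
    // Illegal argument values (ArgumentNullException and
    // ArgumentOutOfRangeException both derive from ArgumentException)
    Console.WriteLine($"Invalid argument: {ex.ParamName}");
}
catch (DaprException ex)
{
    // Anything else surfaced by the SDK
    Console.WriteLine($"Dapr failure: {ex.InnerException?.Message}");
}
```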

1.6 - Dapr Cryptography .NET SDK

Get up and running with the Dapr Cryptography .NET SDK

With the Dapr Cryptography package, you can perform high-performance encryption and decryption operations with Dapr.

To get started with this functionality, walk through the [Dapr Cryptography](https://v1-16.docs.dapr.io/developing-applications/sdks/dotnet/dotnet-cryptography/dotnet-cryptography-howto/) how-to guide.

1.6.1 - Dapr Cryptography Client

Learn how to create Dapr Cryptography clients

The Dapr Cryptography package allows you to perform encryption and decryption operations provided by the Dapr sidecar.

Lifetime management

A DaprEncryptionClient is a version of the Dapr client that is dedicated to interacting with the Dapr Cryptography API. It can be registered alongside a DaprClient and other Dapr clients without issue.

It maintains access to networking resources in the form of TCP sockets used to communicate with the Dapr sidecar.

For best performance, create a single long-lived instance of DaprEncryptionClient and provide access to that shared instance throughout your application. DaprEncryptionClient instances are thread-safe and intended to be shared.

This can be aided by utilizing the dependency injection functionality. The registration method supports registration as a singleton, a scoped instance, or as a transient (meaning it’s recreated every time it’s injected), but also enables registration to utilize values from an IConfiguration or other injected service in a way that’s impractical when creating the client from scratch in each of your classes.

Avoid creating a DaprEncryptionClient for each operation.

Configuring DaprEncryptionClient via DaprEncryptionClientBuilder

A DaprEncryptionClient can be configured by invoking methods on the DaprEncryptionClientBuilder class before calling .Build() to create the client itself. The settings for each DaprEncryptionClientBuilder are separate and cannot be changed after calling .Build().

var daprEncryptionClient = new DaprEncryptionClientBuilder()
    .UseDaprApiToken("abc123") //Specify the API token used to authenticate to the Dapr sidecar
    .Build();

The DaprEncryptionClientBuilder contains settings for:

  • The HTTP endpoint of the Dapr sidecar
  • The gRPC endpoint of the Dapr sidecar
  • The JsonSerializerOptions object used to configure JSON serialization
  • The GrpcChannelOptions object used to configure gRPC
  • The API token used to authenticate requests to the sidecar
  • The factory method used to create the HttpClient instance used by the SDK
  • The timeout used for the HttpClient instance when making requests to the sidecar

The SDK will read the following environment variables to configure the default values:

  • DAPR_HTTP_ENDPOINT: used to find the HTTP endpoint of the Dapr sidecar, example: https://dapr-api.mycompany.com
  • DAPR_GRPC_ENDPOINT: used to find the gRPC endpoint of the Dapr sidecar, example: https://dapr-grpc-api.mycompany.com
  • DAPR_HTTP_PORT: if DAPR_HTTP_ENDPOINT is not set, this is used to find the HTTP local endpoint of the Dapr sidecar
  • DAPR_GRPC_PORT: if DAPR_GRPC_ENDPOINT is not set, this is used to find the gRPC local endpoint of the Dapr sidecar
  • DAPR_API_TOKEN: used to set the API token

Configuring gRPC channel options

Dapr’s use of CancellationToken for cancellation relies on the configuration of the gRPC channel options. If you need to configure these options yourself, make sure to enable the ThrowOperationCanceledOnCancellation setting.

var daprEncryptionClient = new DaprEncryptionClientBuilder()
    .UseGrpcChannelOptions(new GrpcChannelOptions { ThrowOperationCanceledOnCancellation = true })
    .Build();

Using cancellation with DaprEncryptionClient

The APIs on DaprEncryptionClient perform asynchronous operations and accept an optional CancellationToken parameter. This follows a standard .NET practice for cancellable operations. Note that when cancellation occurs, there is no guarantee that the remote endpoint stops processing the request, only that the client has stopped waiting for completion.

When an operation is cancelled, it will throw an OperationCanceledException.
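A hedged sketch of bounding a call to five seconds; the specific client method is elided since any DaprEncryptionClient API accepts an optional token:

```csharp
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(5));

try
{
    await daprEncryptionClient.EncryptAsync( ... , cts.Token);
}
catch (OperationCanceledException)
{
    // The client stopped waiting; the sidecar may still be processing
    Console.WriteLine("Operation cancelled before completion");
}
```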

Configuring DaprEncryptionClient via dependency injection

Using the built-in extension methods for registering the DaprEncryptionClient in a dependency injection container can provide the benefit of registering the long-lived service a single time, centralizing complex configuration, and improving performance by ensuring similarly long-lived resources are reused when possible (e.g. HttpClient instances).

There are three overloads available to give the developer the greatest flexibility in configuring the client for their scenario. Each of these will register the IHttpClientFactory on your behalf if not already registered, and configure the DaprEncryptionClientBuilder to use it when creating the HttpClient instance in order to re-use the same instance as much as possible and avoid socket exhaustion and other issues.

In the first approach, there’s no configuration done by the developer and the DaprEncryptionClient is configured with the default settings.

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprEncryptionClient(); //Registers the `DaprEncryptionClient` to be injected as needed
var app = builder.Build();

Sometimes the developer will need to configure the created client using the various configuration options detailed above. This is done through an overload that passes in the DaprEncryptionClientBuilder and exposes methods for configuring the necessary options.

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprEncryptionClient((_, daprEncryptionClientBuilder) => {
   //Set the API token
   daprEncryptionClientBuilder.UseDaprApiToken("abc123");
   //Specify a non-standard HTTP endpoint
   daprEncryptionClientBuilder.UseHttpEndpoint("http://dapr.my-company.com");
});

var app = builder.Build();

Finally, it’s possible that the developer may need to retrieve information from another service in order to populate these configuration values. That value may be provided from a DaprClient instance, a vendor-specific SDK or some local service, but as long as it’s also registered in DI, it can be injected into this configuration operation via the last overload:

var builder = WebApplication.CreateBuilder(args);

//Register a fictional service that retrieves secrets from somewhere
builder.Services.AddSingleton<SecretService>();

builder.Services.AddDaprEncryptionClient((serviceProvider, daprEncryptionClientBuilder) => {
    //Retrieve an instance of the `SecretService` from the service provider
    var secretService = serviceProvider.GetRequiredService<SecretService>();
    var daprApiToken = secretService.GetSecret("DaprApiToken").Value;

    //Configure the `DaprEncryptionClientBuilder`
    daprEncryptionClientBuilder.UseDaprApiToken(daprApiToken);
});

var app = builder.Build();

1.6.2 - How to: Create and use Dapr Cryptography in the .NET SDK

Learn how to create and use the Dapr Cryptography client using the .NET SDK

Prerequisites

Installation

To get started with the Dapr Cryptography client, install the Dapr.Cryptography package from NuGet:

dotnet add package Dapr.Cryptography

A DaprEncryptionClient maintains access to networking resources in the form of TCP sockets used to communicate with the Dapr sidecar.

Dependency Injection

The AddDaprEncryptionClient() method will register the Dapr client with dependency injection and is the recommended approach for using this package. This method accepts an optional options delegate for configuring the DaprEncryptionClient and a ServiceLifetime argument, allowing you to specify a different lifetime for the registered services instead of the default Singleton value.

The following example assumes all default values are acceptable and is sufficient to register the DaprEncryptionClient:

services.AddDaprEncryptionClient();
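If the default Singleton lifetime isn’t appropriate, the same registration can specify another one. A hedged sketch, as the exact overload shape may vary by SDK version:

```csharp
// Register the client with a scoped lifetime instead of the default singleton
// (argument name is illustrative; check the overload in your SDK version)
services.AddDaprEncryptionClient(lifetime: ServiceLifetime.Scoped);
```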

The optional configuration delegate is used to configure the DaprEncryptionClient by specifying options on the DaprEncryptionClientBuilder as in the following example:

services.AddSingleton<DefaultOptionsProvider>();
services.AddDaprEncryptionClient((serviceProvider, clientBuilder) => {
     //Inject a service to source a value from
     var optionsProvider = serviceProvider.GetRequiredService<DefaultOptionsProvider>();
     var standardTimeout = optionsProvider.GetStandardTimeout();
     
     //Configure the value on the client builder
     clientBuilder.UseTimeout(standardTimeout);
});

Manual Instantiation

Rather than using dependency injection, a DaprEncryptionClient can also be built using the static client builder.

For best performance, create a single long-lived instance of DaprEncryptionClient and provide access to that shared instance throughout your application. DaprEncryptionClient instances are thread-safe and intended to be shared.

Avoid creating a DaprEncryptionClient per-operation.

A DaprEncryptionClient can be configured by invoking methods on the DaprEncryptionClientBuilder class before calling .Build() to create the client. The settings for each DaprEncryptionClient are separate and cannot be changed after calling .Build().

var daprEncryptionClient = new DaprEncryptionClientBuilder()
    .UseJsonSerializerSettings( ... ) //Configure JSON serializer
    .Build();

See the .NET documentation here for more information about the options available when configuring the Dapr client via the builder.

Try it out

Put the Dapr Cryptography .NET SDK to the test. Walk through the samples to see Dapr in action:

SDK SamplesDescription
SDK samplesClone the SDK repo to try out some examples and get started.

1.7 - Dapr Messaging .NET SDK

Get up and running with the Dapr Messaging .NET SDK

With the Dapr Messaging package, you can interact with the Dapr messaging APIs from a .NET application. In the v1.15 release, this package only contains the functionality corresponding to the streaming PubSub capability.

Future Dapr .NET SDK releases will migrate existing messaging capabilities out from Dapr.Client to this Dapr.Messaging package. This will be documented in the release notes, documentation and obsolete attributes in advance.

To get started, walk through the Dapr Messaging how-to guide and refer to best practices documentation for additional guidance.

1.7.1 - How to: Author and manage Dapr streaming subscriptions in the .NET SDK

Learn how to author and manage Dapr streaming subscriptions using the .NET SDK

Let’s create a subscription to a pub/sub topic or queue using the streaming capability. We’ll use the simple example provided here for the following demonstration and walk through it to explain how you can configure message handlers at runtime without pre-configuring an endpoint. In this guide, you will:

  • Deploy a .NET Web API application (StreamingSubscriptionExample)
  • Utilize the Dapr .NET Messaging SDK to subscribe dynamically to a pub/sub topic.

Prerequisites

Set up the environment

Clone the .NET SDK repo.

git clone https://github.com/dapr/dotnet-sdk.git

From the .NET SDK root directory, navigate to the Dapr streaming PubSub example.

cd examples/Client/PublishSubscribe

Run the application locally

To run the Dapr application, you need to start the .NET program and a Dapr sidecar. Navigate to the StreamingSubscriptionExample directory.

cd StreamingSubscriptionExample

We’ll run a command that starts both the Dapr sidecar and the .NET program at the same time.

dapr run --app-id pubsubapp --dapr-grpc-port 4001 --dapr-http-port 3500 -- dotnet run

Dapr listens for HTTP requests at http://localhost:3500 and internal gRPC requests at http://localhost:4001.

Register the Dapr PubSub client with dependency injection

The Dapr Messaging SDK provides an extension method to simplify the registration of the Dapr PubSub client. Before completing the dependency injection registration in Program.cs, add the following line:

var builder = WebApplication.CreateBuilder(args);

//Add anywhere between these two
builder.Services.AddDaprPubSubClient(); //That's it

var app = builder.Build();

It’s possible that you may want to provide some configuration options to the Dapr PubSub client that should be present with each call to the sidecar, such as a Dapr API token, or you may want to use a non-standard HTTP or gRPC endpoint. This is possible through an overload of the registration method that allows configuration of a DaprPublishSubscribeClientBuilder instance:

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprPubSubClient((_, daprPubSubClientBuilder) => {
    daprPubSubClientBuilder.UseDaprApiToken("abc123");
    daprPubSubClientBuilder.UseHttpEndpoint("http://localhost:8512"); //Non-standard sidecar HTTP endpoint
});

var app = builder.Build();

Still, it’s possible that whatever values you wish to inject need to be retrieved from some other source, itself registered as a dependency. There’s one more overload you can use to inject an IServiceProvider into the configuration action method. In the following example, we register a fictional singleton that can retrieve secrets from somewhere and pass it into the configuration method for AddDaprPubSubClient so we can retrieve our Dapr API token from somewhere else for registration here:

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddSingleton<SecretRetriever>();
builder.Services.AddDaprPubSubClient((serviceProvider, daprPubSubClientBuilder) => {
    var secretRetriever = serviceProvider.GetRequiredService<SecretRetriever>();
    var daprApiToken = secretRetriever.GetSecret("DaprApiToken").Value;
    daprPubSubClientBuilder.UseDaprApiToken(daprApiToken);
    
    daprPubSubClientBuilder.UseHttpEndpoint("http://localhost:8512");
});

var app = builder.Build();

Use the Dapr PubSub client using IConfiguration

It’s possible to configure the Dapr PubSub client using the values in your registered IConfiguration without explicitly specifying each value override using the DaprPublishSubscribeClientBuilder as demonstrated in the previous section. Rather, by populating an IConfiguration made available through dependency injection, the AddDaprPubSubClient() registration will automatically use these values over their respective defaults.

Start by populating the values in your configuration. This can be done in several different ways as demonstrated below.

Configuration via ConfigurationBuilder

Application settings can be configured without using a configuration source and by instead populating the value in-memory using a ConfigurationBuilder instance:

var builder = WebApplication.CreateBuilder();

//Create the configuration
var configuration = new ConfigurationBuilder()
    .AddInMemoryCollection(new Dictionary<string, string> {
            { "DAPR_HTTP_ENDPOINT", "http://localhost:54321" },
            { "DAPR_API_TOKEN", "abc123" }
        })
    .Build();

builder.Configuration.AddConfiguration(configuration);
builder.Services.AddDaprPubSubClient(); //This will automatically populate the HTTP endpoint and API token values from the IConfiguration

Configuration via Environment Variables

Application settings can be accessed from environment variables available to your application.

The following environment variables will be used to populate both the HTTP endpoint and API token used to register the Dapr PubSub client.

KeyValue
DAPR_HTTP_ENDPOINThttp://localhost:54321
DAPR_API_TOKENabc123
var builder = WebApplication.CreateBuilder();

builder.Configuration.AddEnvironmentVariables();
builder.Services.AddDaprPubSubClient();

The Dapr PubSub client will be configured to use both the HTTP endpoint http://localhost:54321 and populate all outbound requests with the API token header abc123.

Configuration via prefixed Environment Variables

However, in shared-host scenarios where there are multiple applications all running on the same machine without using containers or in development environments, it’s not uncommon to prefix environment variables. The following example assumes that both the HTTP endpoint and the API token will be pulled from environment variables prefixed with the value “myapp_”. The two environment variables used in this scenario are as follows:

KeyValue
myapp_DAPR_HTTP_ENDPOINThttp://localhost:54321
myapp_DAPR_API_TOKENabc123

These environment variables will be loaded into the registered configuration in the following example and made available without the prefix attached.

var builder = WebApplication.CreateBuilder();

builder.Configuration.AddEnvironmentVariables(prefix: "myapp_");
builder.Services.AddDaprPubSubClient();

The Dapr PubSub client will be configured to use both the HTTP endpoint http://localhost:54321 and populate all outbound requests with the API token header abc123.

Use the Dapr PubSub client without relying on dependency injection

While the use of dependency injection simplifies the use of complex types in .NET and makes it easier to deal with complicated configurations, you’re not required to register the DaprPublishSubscribeClient in this way. Rather, you can also elect to create an instance of it from a DaprPublishSubscribeClientBuilder instance as demonstrated below:


public class MySampleClass
{
    public void DoSomething()
    {
        var daprPubSubClientBuilder = new DaprPublishSubscribeClientBuilder();
        var daprPubSubClient = daprPubSubClientBuilder.Build();

        //Do something with the `daprPubSubClient`
    }
}

Set up message handler

The streaming subscription implementation in Dapr gives you greater control over handling backpressure from events by leaving the messages in the Dapr runtime until your application is ready to accept them. The .NET SDK supports a high-performance queue for maintaining a local cache of these messages in your application while processing is pending. These messages will persist in the queue until processing either times out for each one or a response action is taken for each (typically after processing succeeds or fails). Until this response action is received by the Dapr runtime, the messages will be persisted by Dapr and made available in case of a service failure.

The various response actions available are as follows:

Response ActionDescription
RetryThe event should be delivered again in the future.
DropThe event should be deleted (or forwarded to a dead letter queue, if configured) and not attempted again.
SuccessThe event should be deleted as it was successfully processed.

The handler will receive only one message at a time and if a cancellation token is provided to the subscription, this token will be provided during the handler invocation.

The handler must be configured to return a Task<TopicResponseAction> indicating one of these operations, even if from a try/catch block. If an exception is not caught by your handler, the subscription will use the response action configured in the options during subscription registration.

The following demonstrates the sample message handler provided in the example:

Task<TopicResponseAction> HandleMessageAsync(TopicMessage message, CancellationToken cancellationToken = default)
{
    try
    {
        //Do something with the message
        Console.WriteLine(Encoding.UTF8.GetString(message.Data.Span));
        return Task.FromResult(TopicResponseAction.Success);
    }
    catch
    {
        return Task.FromResult(TopicResponseAction.Retry);
    }
}

Configure and subscribe to the PubSub topic

Configuration of the streaming subscription requires the name of the PubSub component registered with Dapr, the name of the topic or queue being subscribed to, the DaprSubscriptionOptions providing the configuration for the subscription, the message handler and an optional cancellation token. The only required argument to the DaprSubscriptionOptions is the default MessageHandlingPolicy which consists of a per-event timeout and the TopicResponseAction to take when that timeout occurs.

Other options are as follows:

Property NameDescription
MetadataAdditional subscription metadata
DeadLetterTopicThe optional name of the dead-letter topic to send dropped messages to.
MaximumQueuedMessagesBy default, there is no maximum boundary enforced for the internal queue, but setting this property would impose an upper limit.
MaximumCleanupTimeoutWhen the subscription is disposed of or the token flags a cancellation request, this specifies the maximum amount of time available to process the remaining messages in the internal queue.

Subscription is then configured as in the following example:

var messagingClient = app.Services.GetRequiredService<DaprPublishSubscribeClient>();

var cancellationTokenSource = new CancellationTokenSource(TimeSpan.FromSeconds(60)); //Override the default of 30 seconds
var options = new DaprSubscriptionOptions(new MessageHandlingPolicy(TimeSpan.FromSeconds(10), TopicResponseAction.Retry));
var subscription = await messagingClient.SubscribeAsync("pubsub", "mytopic", options, HandleMessageAsync, cancellationTokenSource.Token);

Terminate and clean up subscription

When you’ve finished with your subscription and wish to stop receiving new events, simply await a call to DisposeAsync() on your subscription instance. This will cause the client to unregister from additional events and proceed to finish processing all the events still leftover in the backpressure queue, if any, before disposing of any internal resources. This cleanup will be limited to the timeout interval provided in the DaprSubscriptionOptions when the subscription was registered and by default, this is set to 30 seconds.
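Continuing the subscription example above, cleanup is a single awaited call:

```csharp
// Stop receiving new events; remaining queued messages are drained for up to
// the MaximumCleanupTimeout configured in DaprSubscriptionOptions (default 30s)
await subscription.DisposeAsync();
```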

1.7.2 - DaprPublishSubscribeClient usage

Essential tips and advice for using DaprPublishSubscribeClient

Lifetime management

A DaprPublishSubscribeClient is a version of the Dapr client that is dedicated to interacting with the Dapr Messaging API. It can be registered alongside a DaprClient and other Dapr clients without issue.

It maintains access to networking resources in the form of TCP sockets used to communicate with the Dapr sidecar and implements IAsyncDisposable to support the eager cleanup of resources.

For best performance, create a single long-lived instance of DaprPublishSubscribeClient and provide access to that shared instance throughout your application. DaprPublishSubscribeClient instances are thread-safe and intended to be shared.

This can be aided by utilizing the dependency injection functionality. The registration method supports registration as a singleton, a scoped instance, or a transient (meaning it’s recreated every time it’s injected), but also enables registration to utilize values from an IConfiguration or other injected service in a way that’s impractical when creating the client from scratch in each of your classes.

Avoid creating a DaprPublishSubscribeClient for each operation and disposing it when the operation is complete. It’s intended that the DaprPublishSubscribeClient should only be disposed when you no longer wish to receive events on the subscription as disposing it will cancel the ongoing receipt of new events.

Configuring DaprPublishSubscribeClient via the DaprPublishSubscribeClientBuilder

A DaprPublishSubscribeClient can be configured by invoking methods on the DaprPublishSubscribeClientBuilder class before calling .Build() to create the client itself. The settings for each DaprPublishSubscribeClient are separate and cannot be changed after calling .Build().

var daprPubsubClient = new DaprPublishSubscribeClientBuilder()
    .UseDaprApiToken("abc123") // Specify the API token used to authenticate to other Dapr sidecars
    .Build();

The DaprPublishSubscribeClientBuilder contains settings for:

  • The HTTP endpoint of the Dapr sidecar
  • The gRPC endpoint of the Dapr sidecar
  • The JsonSerializerOptions object used to configure JSON serialization
  • The GrpcChannelOptions object used to configure gRPC
  • The API token used to authenticate requests to the sidecar
  • The factory method used to create the HttpClient instance used by the SDK
  • The timeout used for the HttpClient instance when making requests to the sidecar

The SDK will read the following environment variables to configure the default values:

  • DAPR_HTTP_ENDPOINT: used to find the HTTP endpoint of the Dapr sidecar, example: https://dapr-api.mycompany.com
  • DAPR_GRPC_ENDPOINT: used to find the gRPC endpoint of the Dapr sidecar, example: https://dapr-grpc-api.mycompany.com
  • DAPR_HTTP_PORT: if DAPR_HTTP_ENDPOINT is not set, this is used to find the HTTP local endpoint of the Dapr sidecar
  • DAPR_GRPC_PORT: if DAPR_GRPC_ENDPOINT is not set, this is used to find the gRPC local endpoint of the Dapr sidecar
  • DAPR_API_TOKEN: used to set the API token

Configuring gRPC channel options

Dapr’s use of CancellationToken for cancellation relies on the configuration of the gRPC channel options. If you need to configure these options yourself, make sure to enable the ThrowOperationCanceledOnCancellation setting.

var daprPubsubClient = new DaprPublishSubscribeClientBuilder()
    .UseGrpcChannelOptions(new GrpcChannelOptions { ThrowOperationCanceledOnCancellation = true })
    .Build();

Using cancellation with DaprPublishSubscribeClient

The APIs on DaprPublishSubscribeClient perform asynchronous operations and accept an optional CancellationToken parameter. This follows a standard .NET practice for cancellable operations. Note that when cancellation occurs, there is no guarantee that the remote endpoint stops processing the request, only that the client has stopped waiting for completion.

When an operation is cancelled, it will throw an OperationCanceledException.

Configuring DaprPublishSubscribeClient via dependency injection

Using the built-in extension methods for registering the DaprPublishSubscribeClient in a dependency injection container can provide the benefit of registering the long-lived service a single time, centralizing complex configuration, and improving performance by ensuring similarly long-lived resources are reused when possible (e.g. HttpClient instances).

There are three overloads available to give the developer the greatest flexibility in configuring the client for their scenario. Each of these will register the IHttpClientFactory on your behalf if not already registered, and configure the DaprPublishSubscribeClientBuilder to use it when creating the HttpClient instance in order to re-use the same instance as much as possible and avoid socket exhaustion and other issues.

In the first approach, there’s no configuration done by the developer and the DaprPublishSubscribeClient is configured with the default settings.

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprPubSubClient(); //Registers the `DaprPublishSubscribeClient` to be injected as needed
var app = builder.Build();

Sometimes the developer will need to configure the created client using the various configuration options detailed above. This is done through an overload that passes in the DaprPublishSubscribeClientBuilder and exposes methods for configuring the necessary options.

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprPubSubClient((_, daprPubSubClientBuilder) => {
   //Set the API token
   daprPubSubClientBuilder.UseDaprApiToken("abc123");
   //Specify a non-standard HTTP endpoint
   daprPubSubClientBuilder.UseHttpEndpoint("http://dapr.my-company.com");
});

var app = builder.Build();

Finally, it’s possible that the developer may need to retrieve information from another service in order to populate these configuration values. That value may be provided from a DaprClient instance, a vendor-specific SDK or some local service, but as long as it’s also registered in DI, it can be injected into this configuration operation via the last overload:

var builder = WebApplication.CreateBuilder(args);

//Register a fictional service that retrieves secrets from somewhere
builder.Services.AddSingleton<SecretService>();

builder.Services.AddDaprPubSubClient((serviceProvider, daprPubSubClientBuilder) => {
    //Retrieve an instance of the `SecretService` from the service provider
    var secretService = serviceProvider.GetRequiredService<SecretService>();
    var daprApiToken = secretService.GetSecret("DaprApiToken").Value;

    //Configure the `DaprPublishSubscribeClientBuilder`
    daprPubSubClientBuilder.UseDaprApiToken(daprApiToken);
});

var app = builder.Build();

1.8 - Best Practices for the Dapr .NET SDK

Using Dapr .NET SDK effectively

Building with confidence

The Dapr .NET SDK offers a rich set of capabilities for building distributed applications. This section provides practical guidance for using the SDK effectively in production scenarios—focusing on reliability, maintainability, and developer experience.

Topics covered include:

  • Error handling strategies across Dapr building blocks
  • Managing experimental features and suppressing related warnings
  • Leveraging source analyzers and generators to reduce boilerplate and catch issues early
  • General .NET development practices in Dapr-based applications

Error model guidance

Dapr operations can fail for many reasons—network issues, misconfigured components, or transient faults. The SDK provides structured error types to help you distinguish between retryable and fatal errors.

Learn how to use DaprException and its derived types effectively here.

Experimental attributes

Some SDK features are marked as experimental and may change in future releases. These are annotated with [Experimental] and generate build-time warnings by default. You can:

  • Suppress warnings selectively using #pragma warning disable
  • Use SuppressMessage attributes for finer control
  • Track experimental usage across your codebase
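As a sketch, a selective suppression might look like the following; DAPR_EXPERIMENTAL_EXAMPLE is a placeholder diagnostic ID and the method is a hypothetical experimental API, since each experimental feature reports its own ID:

```csharp
// Placeholder diagnostic ID - substitute the ID your build reports
#pragma warning disable DAPR_EXPERIMENTAL_EXAMPLE
var result = someClient.SomeExperimentalMethod(); // hypothetical experimental API
#pragma warning restore DAPR_EXPERIMENTAL_EXAMPLE
```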

Learn more about our use of the [Experimental] attribute here.

Source tooling

The SDK includes Roslyn-based analyzers and source generators to help you write better code with less effort. These tools:

  • Warn about common misuses of the SDK
  • Generate boilerplate for actor registration and invocation
  • Support IDE integration for faster feedback

Read more about how to install and use these analyzers here.

Additional guidance

This section is designed to support a wide range of development scenarios. As your applications grow in complexity, you’ll find increasingly relevant practices and patterns for working with Dapr in .NET—from actor lifecycle management to configuration strategies and performance tuning.

1.8.1 - Error Model in the Dapr .NET SDK

Learn how to use the richer error model in the .NET SDK.

The Dapr .NET SDK supports the richer error model, implemented by the Dapr runtime. This model provides a way for applications to enrich their errors with added context, allowing consumers of the application to better understand the issue and resolve it faster. You can read more about the richer error model here, and you can find the Dapr proto file implementing these errors here.

The Dapr .NET SDK supports all details provided by the Dapr runtime, implemented in the Dapr.Common.Exceptions namespace, and accessible through the DaprException extension method TryGetExtendedErrorInfo. Currently, this detail extraction is only supported for RpcExceptions where the details are present.

// Example usage of ExtendedErrorInfo

try
{
    // Perform some action with the Dapr client that throws a DaprException.
}
catch (DaprException daprEx)
{
    if (daprEx.TryGetExtendedErrorInfo(out DaprExtendedErrorInfo errorInfo))
    {
        Console.WriteLine(errorInfo.Code);
        Console.WriteLine(errorInfo.Message);

        foreach (DaprExtendedErrorDetail detail in errorInfo.Details)
        {
            Console.WriteLine(detail.ErrorType);
            switch (detail.ErrorType)
            {
                case ExtendedErrorType.ErrorInfo:
                    Console.WriteLine(detail.Reason);
                    Console.WriteLine(detail.Domain);
                    break;
                default:
                    Console.WriteLine(detail.TypeUrl);
                    break;
            }
        }
    }
}

DaprExtendedErrorInfo

Contains Code (the status code) and Message (the error message) associated with the error, parsed from an inner RpcException. Also contains a collection of DaprExtendedErrorDetails parsed from the details in the exception.

DaprExtendedErrorDetail

All details implement the abstract DaprExtendedErrorDetail and have an associated DaprExtendedErrorType.

  1. RetryInfo

  2. DebugInfo

  3. QuotaFailure

  4. PreconditionFailure

  5. RequestInfo

  6. LocalizedMessage

  7. BadRequest

  8. ErrorInfo

  9. Help

  10. ResourceInfo

  11. Unknown

RetryInfo

Information notifying the client how long it should wait before retrying. Provides a DaprRetryDelay with the properties Second (offset in seconds) and Nano (offset in nanoseconds).
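When acting on a RetryInfo detail, the two offsets can be combined into a single TimeSpan before scheduling a retry. A minimal sketch, assuming the Second/Nano shape described above (RetryDelayExtensions is a hypothetical helper, not part of the SDK):

```csharp
using System;

public static class RetryDelayExtensions
{
    // Hypothetical helper: combine the Second and Nano offsets reported by a
    // DaprRetryDelay into one TimeSpan. TimeSpan ticks are 100ns each, so the
    // nanosecond offset is divided by 100.
    public static TimeSpan ToTimeSpan(long seconds, long nanos) =>
        TimeSpan.FromSeconds(seconds) + TimeSpan.FromTicks(nanos / 100);
}
```

For example, a RetryInfo detail reporting Second = 5 and Nano = 500000000 yields a 5.5 second delay you could pass to Task.Delay before retrying.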

DebugInfo

Debugging information offered by the server. Contains StackEntries (a collection of strings containing the stack trace), and Detail (further debugging information).

QuotaFailure

Information relating to some quota that may have been reached, such as a daily usage limit on an API. It has one property Violations, a collection of DaprQuotaFailureViolation, which each contain a Subject (the subject of the request) and Description (further information regarding the failure).

PreconditionFailure

Information informing the client that some required precondition was not met. Has one property Violations, a collection of DaprPreconditionFailureViolation, which each has Subject (the subject where the precondition failure occurred, e.g. “Azure”), Type (representation of the precondition type, e.g. “TermsOfService”), and Description (further description, e.g. “ToS must be accepted.”).

RequestInfo

Information returned by the server that it can use to identify the client’s request. Contains RequestId and ServingData properties, RequestId being some string (such as a UID) the server can interpret, and ServingData being some arbitrary data that made up part of the request.

LocalizedMessage

Contains a localized message, along with the locale of the message. Contains Locale (the locale e.g. “en-US”) and Message (the localized message).

BadRequest

Describes a bad request field. Contains a collection of DaprBadRequestDetailFieldViolation, which each has Field (the offending field in the request, e.g. ‘first_name’) and Description (further information detailing the reason, e.g. “first_name cannot contain special characters”).

ErrorInfo

Details the cause of an error. Contains three properties, Reason (the reason for the error, which should take the form of UPPER_SNAKE_CASE, e.g. DAPR_INVALID_KEY), Domain (domain the error belongs to, e.g. ‘dapr.io’), and Metadata, a key/value-based collection with further information.

Help

Provides resources for the client to perform further research into the issue. Contains a collection of DaprHelpDetailLink, which provides Url (a url to help or documentation), and Description (a description of what the link provides).

ResourceInfo

Provides information relating to an accessed resource. Provides four properties: ResourceType (the type of the resource being accessed, e.g. “Azure service bus”), ResourceName (the name of the resource, e.g. “my-configured-service-bus”), Owner (the owner of the resource, e.g. “subscriptionowner@dapr.io”), and Description (further information on the resource relating to the error, e.g. “missing permissions to use this resource”).

Unknown

Returned when the detail type url cannot be mapped to the correct DaprExtendedErrorDetail implementation. Provides one property TypeUrl (the type url that could not be parsed, e.g. “type.googleapis.com/Google.rpc.UnrecognizedType”).

1.8.2 - Experimental Attributes

Learn about why we mark some methods with the [Experimental] attribute

Experimental Attributes

Introduction to Experimental Attributes

With the release of .NET 8, C# 12 introduced the [Experimental] attribute, which provides a standardized way to mark APIs that are still in development or experimental. This attribute is defined in the System.Diagnostics.CodeAnalysis namespace and requires a diagnostic ID parameter used to generate compiler warnings when the experimental API is used.

In the Dapr .NET SDK, we now use the [Experimental] attribute instead of [Obsolete] to mark building blocks and components that have not yet passed the stable lifecycle certification. This approach provides a clearer distinction between:

  1. Experimental APIs - Features that are available but still evolving and have not yet been certified as stable according to the Dapr Component Certification Lifecycle.

  2. Obsolete APIs - Features that are truly deprecated and will be removed in a future release.

Usage in the Dapr .NET SDK

In the Dapr .NET SDK, we apply the [Experimental] attribute at the class level for building blocks that are still in the Alpha or Beta stages of the Component Certification Lifecycle. The attribute includes:

  • A diagnostic ID that identifies the experimental building block
  • A URL that points to the relevant documentation for that block

For example:

using System.Diagnostics.CodeAnalysis;
namespace Dapr.Cryptography.Encryption 
{ 
    [Experimental("DAPR_CRYPTOGRAPHY", UrlFormat = "https://docs.dapr.io/developing-applications/building-blocks/cryptography/cryptography-overview/")] 
    public class DaprEncryptionClient 
    { 
        // Implementation 
    } 
}

The diagnostic IDs follow a naming convention of DAPR_[BUILDING_BLOCK_NAME], such as:

  • DAPR_CONVERSATION - For the Conversation building block
  • DAPR_CRYPTOGRAPHY - For the Cryptography building block
  • DAPR_JOBS - For the Jobs building block
  • DAPR_DISTRIBUTEDLOCK - For the Distributed Lock building block

Suppressing Experimental Warnings

When you use APIs marked with the [Experimental] attribute, the compiler will generate errors. To build your solution without marking your own code as experimental, you will need to suppress these errors. Here are several approaches to do this:

Option 1: Using #pragma directive

You can use the #pragma warning directive to suppress the warning for specific sections of code:

// Disable experimental warning 
#pragma warning disable DAPR_CRYPTOGRAPHY 
// Your code using the experimental API 
var client = new DaprEncryptionClient(); 
// Re-enable the warning 
#pragma warning restore DAPR_CRYPTOGRAPHY

This approach is useful when you want to suppress warnings only for specific sections of your code.

Option 2: Project-level suppression

To suppress warnings for an entire project, add the following to your .csproj file:

<PropertyGroup>
    <NoWarn>$(NoWarn);DAPR_CRYPTOGRAPHY</NoWarn>
</PropertyGroup>

You can include multiple diagnostic IDs separated by semicolons:

<PropertyGroup>
    <NoWarn>$(NoWarn);DAPR_CONVERSATION;DAPR_JOBS;DAPR_DISTRIBUTEDLOCK;DAPR_CRYPTOGRAPHY</NoWarn>
</PropertyGroup>

This approach is particularly useful for test projects that need to use experimental APIs.

Option 3: Directory-level suppression

For suppressing warnings across multiple projects in a directory, add a Directory.Build.props file:

<PropertyGroup>
    <NoWarn>$(NoWarn);DAPR_CONVERSATION;DAPR_JOBS;DAPR_DISTRIBUTEDLOCK;DAPR_CRYPTOGRAPHY</NoWarn>
</PropertyGroup>

This file should be placed in the root directory of your test projects. You can learn more about using Directory.Build.props files in the MSBuild documentation.

Lifecycle of Experimental APIs

As building blocks move through the certification lifecycle and reach the “Stable” stage, the [Experimental] attribute will be removed. No migration or code changes will be required from users when this happens, except for the removal of any warning suppressions if they were added.

Conversely, the [Obsolete] attribute will now be reserved exclusively for APIs that are truly deprecated and scheduled for removal. When you see a method or class marked with [Obsolete], you should plan to migrate away from it according to the migration guidance provided in the attribute message.

Best Practices

  1. In application code:

    • Be cautious when using experimental APIs, as they may change in future releases
    • Consider isolating usage of experimental APIs to make future updates easier
    • Document your use of experimental APIs for team awareness
  2. In test code:

    • Use project-level suppression to avoid cluttering test code with warning suppressions
    • Regularly review which experimental APIs you’re using and check if they’ve been stabilized
  3. When contributing to the SDK:

    • Use [Experimental] for new building blocks that haven’t completed certification
    • Use [Obsolete] only for truly deprecated APIs
    • Provide clear documentation links in the UrlFormat parameter

Additional Resources

1.8.3 - Dapr source code analyzers and generators

Code analyzers and fixes for common Dapr issues

Dapr supports a growing collection of optional Roslyn analyzers and code fix providers that inspect your code for code quality issues. Starting with the release of v1.16, developers have the opportunity to install additional projects from NuGet alongside each of the standard capability packages to enable these analyzers in their solutions.

Rule violations will typically be marked as Info or Warning so that if the analyzer identifies an issue, it won’t necessarily break builds. All code analysis violations appear with the prefix “DAPR” and are uniquely distinguished by a number following this prefix.

Install and configure analyzers

The following packages will be available via NuGet following the v1.16 Dapr release:

  • Dapr.Actors.Analyzers
  • Dapr.Jobs.Analyzers
  • Dapr.Workflow.Analyzers

Install each NuGet package on every project where you want the analyzers to run. The package will be installed as a project dependency and analyzers will run as you write your code or as part of a CI/CD build. The analyzers will flag issues in your existing code and warn you about new issues as you build your project.

Many of our analyzers have associated code fixes that can be applied to automatically correct the problem. If your IDE supports this capability, any available code fixes will show up as an inline menu option in your code.

Further, most of our analyzers should also report a specific line and column number in your code of the syntax that’s been identified as a key aspect of the rule. If your IDE supports it, double clicking any of the analyzer warnings should jump directly to the part of your code responsible for violating the analyzer’s rule.

Suppress specific analyzers

If you wish to keep an analyzer from firing against some particular piece of your project, their outputs can be individually targeted for suppression through a number of ways. Read more about suppressing analyzers in projects or files in the associated .NET documentation.

Disable all analyzers

If you wish to disable all analyzers in your project without removing any packages providing them, set the EnableNETAnalyzers property to false in your csproj file.
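An illustrative csproj fragment:

```xml
<PropertyGroup>
  <EnableNETAnalyzers>false</EnableNETAnalyzers>
</PropertyGroup>
```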

Available Analyzers

| Diagnostic ID | Dapr Package | Category | Severity | Version Added | Description | Code Fix Available |
|---|---|---|---|---|---|---|
| DAPR1301 | Dapr.Workflow | Usage | Warning | 1.16 | The workflow type is not registered with the dependency injection provider | Yes |
| DAPR1302 | Dapr.Workflow | Usage | Warning | 1.16 | The workflow activity type is not registered with the dependency injection provider | Yes |
| DAPR1401 | Dapr.Actors | Usage | Warning | 1.16 | Actor timer method invocations require the named callback method to exist on type | No |
| DAPR1402 | Dapr.Actors | Usage | Warning | 1.16 | The actor type is not registered with dependency injection | Yes |
| DAPR1403 | Dapr.Actors | Interoperability | Info | 1.16 | Set options.UseJsonSerialization to true to support interoperability with non-.NET actors | Yes |
| DAPR1404 | Dapr.Actors | Usage | Warning | 1.16 | Call app.MapActorsHandlers to map endpoints for Dapr actors | Yes |
| DAPR1501 | Dapr.Jobs | Usage | Warning | 1.16 | Job invocations require the MapDaprScheduledJobHandler to be set and configured for each anticipated job on IEndpointRouteBuilder | No |

Analyzer Categories

The following are each of the eligible categories that an analyzer can be assigned to and are modeled after the standard categories used by the .NET analyzers:

  • Design
  • Documentation
  • Globalization
  • Interoperability
  • Maintainability
  • Naming
  • Performance
  • Reliability
  • Security
  • Usage

1.9 - Developing applications with the Dapr .NET SDK

Deployment integrations with the Dapr .NET SDK

Thinking more than one at a time

Using your favorite IDE or editor to launch an application typically assumes that you only need to run one thing: the application you’re debugging. However, developing microservices challenges you to think about your local development process for more than one at a time. A microservices application has multiple services that you might need running simultaneously, and dependencies (like state stores) to manage.

Adding Dapr to your development process means you need to manage the following concerns:

  • Each service you want to run
  • A Dapr sidecar for each service
  • Dapr component and configuration manifests
  • Additional dependencies such as state stores
  • optional: the Dapr placement service for actors

This document assumes that you’re building a production application and want to create a repeatable and robust set of development practices. The guidance here is generalized, and applies to any .NET server application using Dapr (including actors).

Managing components

You have two primary methods of storing component definitions for local development with Dapr:

  • Use the default location (~/.dapr/components)
  • Use your own location

Creating a folder within your source code repository to store components and configuration will give you a way to version and share these definitions. The guidance provided here will assume you created a folder next to the application source code to store these files.
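For example, a components folder checked in next to your solution might contain a Redis state store definition like the following (the component name and connection values are illustrative):

```yaml
# components/statestore.yaml -- illustrative Redis state store definition
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: statestore
spec:
  type: state.redis
  version: v1
  metadata:
  - name: redisHost
    value: localhost:6379
  - name: redisPassword
    value: ""
```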

Development options

Choose one of these links to learn about tools you can use in local development scenarios. It’s suggested that you familiarize yourself with each of them to get a sense of the options provided by the .NET SDK.

1.9.1 - Dapr .NET SDK Development with Dapr CLI

Learn about local development with the Dapr CLI

Dapr CLI

Consider this to be a .NET companion to the Dapr Self-Hosted with Docker Guide.

The Dapr CLI provides you with a good base to work from by initializing a local Redis container, a Zipkin container, the placement service, and component manifests for Redis. This enables you to work with building blocks such as state management and pub/sub on a fresh install with no additional setup.

You can run .NET services with dapr run as your strategy for developing locally. Plan on running one of these commands per-service in order to launch your application.

  • Pro: this is easy to set up since it’s part of the default Dapr installation
  • Con: this uses long-running docker containers on your machine, which might not be desirable
  • Con: the scalability of this approach is poor since it requires running a separate command per-service

Using the Dapr CLI

For each service you need to choose:

  • A unique app-id for addressing (app-id)
  • A unique listening port for HTTP (port)

You should also have decided where you are storing components (components-path).

The following command can be run from multiple terminals to launch each service, with the respective values substituted.

dapr run --app-id <app-id> --app-port <port> --components-path <components-path> -- dotnet run --project <project> --urls http://localhost:<port>

Explanation: this command will use dapr run to launch each service and its sidecar. The first half of the command (before --) passes required configuration to the Dapr CLI. The second half of the command (after --) passes required configuration to the dotnet run command.

If any of your services do not accept HTTP traffic, then modify the command above by removing the --app-port and --urls arguments.

Next steps

If you need to debug, then use the attach feature of your debugger to attach to one of the running processes.

If you want to scale up this approach, then consider building a script which automates this process for your whole application.
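A sketch of such a script, using echo to preview the commands (the app-ids, ports, and project paths below are hypothetical; replace echo with the real invocation once the values match your solution):

```shell
#!/usr/bin/env bash
# Sketch: one `dapr run` per service, driven from a single list.
# Each entry is "<app-id> <port> <project path>"; all values are hypothetical.
services=(
  "frontend 5000 ./FrontEnd/FrontEnd.csproj"
  "backend 5001 ./BackEnd/BackEnd.csproj"
)

for entry in "${services[@]}"; do
  read -r app_id port project <<< "$entry"
  # `echo` previews each command; drop it (and append ` &` to launch in the
  # background) once the values match your solution.
  echo dapr run --app-id "$app_id" --app-port "$port" \
    --components-path ./components \
    -- dotnet run --project "$project" --urls "http://localhost:$port"
done
```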

1.9.2 - Dapr .NET SDK Development with Docker-Compose

Learn about local development with Docker-Compose

Docker-Compose

Consider this to be a .NET companion to the Dapr Self-Hosted with Docker Guide.

docker-compose is a CLI tool included with Docker Desktop that you can use to run multiple containers at a time. It is a way to automate the lifecycle of multiple containers together, and offers a development experience similar to a production environment for applications targeting Kubernetes.

  • Pro: Since docker-compose manages containers for you, you can make dependencies part of the application definition and stop the long-running containers on your machine.
  • Con: most investment required, services need to be containerized to get started.
  • Con: can be difficult to debug and troubleshoot if you are unfamiliar with Docker.

Using docker-compose

From the .NET perspective, there is no specialized guidance needed for docker-compose with Dapr. docker-compose runs containers, and once your service is in a container, configuring it is similar to any other programming technology.

To summarize the approach:

  • Create a Dockerfile for each service
  • Create a docker-compose.yaml and check it in to the source code repository

To understand how to author the docker-compose.yaml you should start with the Hello, docker-compose sample.

Similar to running locally with dapr run, for each service you need to choose a unique app-id. Choosing the container name as the app-id will make this simple to remember.

The compose file will contain at a minimum:

  • A network that the containers use to communicate
  • Each service’s container
  • A <service>-daprd sidecar container with the service’s port and app-id specified
  • Additional dependencies that run in containers (redis for example)
  • optional: Dapr placement container (for actors)
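The list above might translate into a compose file sketched as follows. Service names, ports, and image tags are illustrative; the daprd flags shown use recent CLI naming (older releases use --components-path instead of --resources-path):

```yaml
# docker-compose.yaml sketch: one service, its daprd sidecar, redis, a network
services:
  myservice:
    build: ./MyService            # hypothetical service project
    networks:
      - hello-dapr
  myservice-daprd:
    image: daprio/daprd:latest
    command: ["./daprd",
      "--app-id", "myservice",
      "--app-port", "5000",
      "--resources-path", "/components"]
    volumes:
      - "./components/:/components"
    depends_on:
      - myservice
    network_mode: "service:myservice"   # share the service's network namespace
  redis:
    image: redis:alpine
    networks:
      - hello-dapr
networks:
  hello-dapr:
```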

You can also view a larger example from the eShopOnContainers sample application.

1.9.3 - Dapr .NET SDK Development with .NET Aspire

Learn about local development with .NET Aspire

.NET Aspire

.NET Aspire is a development tool designed to make it easier to include external software into .NET applications by providing a framework that allows third-party services to be readily integrated, observed and provisioned alongside your own software.

Aspire simplifies local development by providing rich integration with popular IDEs including Microsoft Visual Studio, Visual Studio Code, JetBrains Rider and others to launch your application with the debugger while automatically launching and provisioning access to other integrations as well, including Dapr.

While Aspire also assists with deployment of your application to various cloud hosts like Microsoft Azure and Amazon AWS, deployment is currently outside the scope of this guide. More information can be found in Aspire’s documentation here.

An end-to-end demonstration of service invocation between multiple Dapr-enabled services can be found here.

Prerequisites

  • Both the Dapr .NET SDK and .NET Aspire are compatible with .NET 8 or .NET 9
  • An OCI compliant container runtime such as Docker Desktop or Podman
  • Install and initialize Dapr v1.16 or later

Using .NET Aspire via CLI

We’ll start by creating a brand new .NET application. Open your preferred CLI and navigate to the directory you wish to create your new .NET solution within. Start by using the following command to install a template that will create an empty Aspire application:

dotnet new install Aspire.ProjectTemplates

Once that’s installed, proceed to create an empty .NET Aspire application in your current directory. The -n argument allows you to specify the name of the output solution. If it’s excluded, the .NET CLI will instead use the name of the output directory, e.g. C:\source\aspiredemo will result in the solution being named aspiredemo. The rest of this tutorial will assume a solution named aspiredemo.

dotnet new aspire -n aspiredemo

This will create two Aspire-specific directories and one file in your directory:

  • aspiredemo.AppHost/ contains the Aspire orchestration project that is used to configure each of the integrations used in your application(s).
  • aspiredemo.ServiceDefaults/ contains a collection of extensions meant to be shared across your solution to aid in resilience, service discovery and telemetry capabilities offered by Aspire (these are distinct from the capabilities offered in Dapr itself).
  • aspiredemo.sln is the file that maintains the layout of your current solution

We’ll next create two projects that’ll serve as our Dapr applications and demonstrate Dapr functionality. From the same directory, use the following to create an empty ASP.NET Core project called FrontEndApp and another called BackEndApp. Each will be created relative to your current directory, in FrontEndApp\FrontEndApp.csproj and BackEndApp\BackEndApp.csproj, respectively.

dotnet new web --name FrontEndApp
dotnet new web --name BackEndApp

Next we’ll configure the AppHost project to add the necessary package to support local Dapr development. Navigate into the AppHost directory with the following and install the CommunityToolkit.Aspire.Hosting.Dapr package from NuGet into the project.

We’ll also add a reference to our FrontEndApp project so we can reference it during the registration process.

cd aspiredemo.AppHost
dotnet add package CommunityToolkit.Aspire.Hosting.Dapr
dotnet add reference ../FrontEndApp/
dotnet add reference ../BackEndApp/

Next, we need to configure Dapr as a resource to be loaded alongside your project. Open the Program.cs file in that project within your preferred IDE. It should look similar to the following:

var builder = DistributedApplication.CreateBuilder(args);

builder.Build().Run();

If you’re familiar with the dependency injection approach used in ASP.NET Core projects or others utilizing the Microsoft.Extensions.DependencyInjection functionality, you’ll find that this will be a familiar experience.

Because we’ve already added project references to FrontEndApp and BackEndApp, we need to start by registering them in this configuration as well. Add the following before the builder.Build().Run() line:

var backEndApp = builder
    .AddProject<Projects.BackEndApp>("be")
    .WithDaprSidecar();

var frontEndApp = builder
    .AddProject<Projects.FrontEndApp>("fe")
    .WithDaprSidecar();

Because the project reference has been added to this solution, your project shows up as a type within the Projects. namespace for our purposes here. The name of the variable you assign the project to doesn’t much matter in this tutorial but would be used if you wanted to create a reference between this project and another using Aspire’s service discovery functionality.

Adding .WithDaprSidecar() configures Dapr as a .NET Aspire resource so that when the project runs, the sidecar will be deployed alongside your application. This accepts a number of different options and could optionally be configured as in the following example:

DaprSidecarOptions sidecarOptions = new()
{
    AppId = "how-dapr-identifies-your-app",
    AppPort = 8080, //Note that this argument is required if you intend to configure pubsub, actors or workflows as of Aspire v9.0 
    DaprGrpcPort = 50001,
    DaprHttpPort = 3500,
    MetricsPort = 9090
};

builder
    .AddProject<Projects.BackEndApp>("be")
    .WithDaprSidecar(sidecarOptions);

Finally, let’s add an endpoint to the back-end app that the front-end can invoke using Dapr’s service invocation, demonstrating that Dapr is working as expected.
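A minimal sketch of such an endpoint in BackEndApp/Program.cs (the route name and response text are illustrative, not from the original sample):

```csharp
// BackEndApp/Program.cs: a minimal HTTP endpoint that the front-end can reach
// through Dapr service invocation. The "/weather" route is illustrative.
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

app.MapGet("/weather", () => "Sunny and 75 degrees");

app.Run();
```

From the front-end, the endpoint could then be invoked through the sidecar with the Dapr client, e.g. await client.InvokeMethodAsync<string>(HttpMethod.Get, "be", "weather"), where "be" is the app-id registered in the AppHost above.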

When you open the solution in your IDE, ensure that aspiredemo.AppHost is configured as your startup project. When you launch it in a debug configuration, your integrated console should reflect the expected Dapr logs, and the sidecar will be available to your application.

1.10 - How to troubleshoot and debug with the Dapr .NET SDK

Tips, tricks, and guides for troubleshooting and debugging with the Dapr .NET SDKs

1.10.1 - Troubleshoot Pub/Sub with the .NET SDK

Troubleshoot Pub/Sub with the .NET SDK

Troubleshooting Pub/Sub

The most common problem with pub/sub is that the pub/sub endpoint in your application is not being called.

There are a few layers to this problem with different solutions:

  • The application is not receiving any traffic from Dapr
  • The application is not registering pub/sub endpoints with Dapr
  • The pub/sub endpoints are registered with Dapr, but the request is not reaching the desired endpoint

Step 1: Turn up the logs

This is important. Future steps will depend on your ability to see logging output. ASP.NET Core logs almost nothing with the default log settings, so you will need to change it.

Adjust the logging verbosity to include Information logging for ASP.NET Core as described here. Set the Microsoft key to Information.
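For instance, an appsettings.Development.json similar to the following raises ASP.NET Core logging to Information (the file name follows the usual convention; adjust to wherever your logging configuration lives):

```json
{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft": "Information"
    }
  }
}
```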

Step 2: Verify you can receive traffic from Dapr

  1. Start the application as you would normally (dapr run ...). Make sure that you’re including an --app-port argument in the commandline. Dapr needs to know that your application is listening for traffic. By default an ASP.NET Core application will listen for HTTP on port 5000 in local development.

  2. Wait for Dapr to finish starting

  3. Examine the logs

You should see a log entry like:

info: Microsoft.AspNetCore.Hosting.Diagnostics[1]
      Request starting HTTP/1.1 GET http://localhost:5000/.....

During initialization Dapr will make some requests to your application for configuration. If you can’t find these then it means that something has gone wrong. Please ask for help either via an issue or in Discord (include the logs). If you see requests made to your application, then continue to step 3.

Step 3: Verify endpoint registration

  1. Start the application as you would normally (dapr run ...).

  2. Use curl at the command line (or another HTTP testing tool) to access the /dapr/subscribe endpoint.

Here’s an example command assuming your application’s listening port is 5000:

curl http://localhost:5000/dapr/subscribe -v

For a correctly configured application the output should look like the following:

*   Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 5000 (#0)
> GET /dapr/subscribe HTTP/1.1
> Host: localhost:5000
> User-Agent: curl/7.64.1
> Accept: */*
>
< HTTP/1.1 200 OK
< Date: Fri, 15 Jan 2021 22:31:40 GMT
< Content-Type: application/json
< Server: Kestrel
< Transfer-Encoding: chunked
<
* Connection #0 to host localhost left intact
[{"topic":"deposit","route":"deposit","pubsubName":"pubsub"},{"topic":"withdraw","route":"withdraw","pubsubName":"pubsub"}]* Closing connection 0

Pay particular attention to the HTTP status code, and the JSON output.

< HTTP/1.1 200 OK

A 200 status code indicates success.

The JSON blob that’s included near the end is the output of /dapr/subscribe that’s processed by the Dapr runtime. In this case it’s using the ControllerSample in this repo - so this is an example of correct output.

[
    {"topic":"deposit","route":"deposit","pubsubName":"pubsub"},
    {"topic":"withdraw","route":"withdraw","pubsubName":"pubsub"}
]

With the output of this command in hand, you are ready to diagnose a problem or move on to the next step.

Option 0: The response was a 200 and included some pub/sub entries

If you have entries in the JSON output from this test, then the problem lies elsewhere; move on to the next step.

Option 1: The response was not a 200, or didn’t contain JSON

If the response was not a 200 or did not contain JSON, then the MapSubscribeHandler() endpoint was not reached.

Make sure you have some code like the following in Startup.cs and repeat the test.

app.UseRouting();

app.UseCloudEvents();

app.UseEndpoints(endpoints =>
{
    endpoints.MapSubscribeHandler(); // This is the Dapr subscribe handler
    endpoints.MapControllers();
});

If adding the subscribe handler did not resolve the problem, please open an issue on this repo and include the contents of your Startup.cs file.

Option 2: The response contained JSON but it was empty (like [])

If the JSON output was an empty array (like []) then the subscribe handler is registered, but no topic endpoints were registered.


If you’re using a controller for pub/sub you should have a method like:

[Topic("pubsub", "deposit")]
[HttpPost("deposit")]
public async Task<ActionResult> Deposit(...)

// Using Pub/Sub routing
[Topic("pubsub", "transactions", "event.type == \"withdraw.v2\"", 1)]
[HttpPost("withdraw")]
public async Task<ActionResult> Withdraw(...)

In this example the Topic and HttpPost attributes are required, but other details might be different.


If you’re using routing for pub/sub you should have an endpoint like:

endpoints.MapPost("deposit", ...).WithTopic("pubsub", "deposit");

In this example the call to WithTopic(...) is required but other details might be different.


After correcting this code and re-testing if the JSON output is still the empty array (like []) then please open an issue on this repository and include the contents of Startup.cs and your pub/sub endpoint.

Step 4: Verify endpoint reachability

In this step we’ll verify that the entries registered with pub/sub are reachable. The last step should have left you with some JSON output like the following:

[
  {
    "pubsubName": "pubsub",
    "topic": "deposit",
    "route": "deposit"
  },
  {
    "pubsubName": "pubsub",
    "topic": "deposit",
    "routes": {
      "rules": [
        {
          "match": "event.type == \"withdraw.v2\"",
          "path": "withdraw"
        }
      ]
    }
  }
]

Keep this output, as we’ll use the route information to test the application.

  1. Start the application as you would normally (dapr run ...).

  2. Use curl at the command line (or another HTTP testing tool) to access one of the routes registered with a pub/sub endpoint.

Here’s an example command assuming your application’s listening port is 5000, and one of your pub/sub routes is withdraw:

curl http://localhost:5000/withdraw -H 'Content-Type: application/json' -d '{}' -v

Here’s the output from running the above command against the sample:

*   Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 5000 (#0)
> POST /withdraw HTTP/1.1
> Host: localhost:5000
> User-Agent: curl/7.64.1
> Accept: */*
> Content-Type: application/json
> Content-Length: 2
>
* upload completely sent off: 2 out of 2 bytes
< HTTP/1.1 400 Bad Request
< Date: Fri, 15 Jan 2021 22:53:27 GMT
< Content-Type: application/problem+json; charset=utf-8
< Server: Kestrel
< Transfer-Encoding: chunked
<
* Connection #0 to host localhost left intact
{"type":"https://tools.ietf.org/html/rfc7231#section-6.5.1","title":"One or more validation errors occurred.","status":400,"traceId":"|5e9d7eee-4ea66b1e144ce9bb.","errors":{"Id":["The Id field is required."]}}* Closing connection 0

Based on the HTTP 400 and JSON payload, this response indicates that the endpoint was reached but the request was rejected due to a validation error.

You should also look at the console output of the running application. This is example output with the Dapr logging headers stripped away for clarity.

info: Microsoft.AspNetCore.Hosting.Diagnostics[1]
      Request starting HTTP/1.1 POST http://localhost:5000/withdraw application/json 2
info: Microsoft.AspNetCore.Routing.EndpointMiddleware[0]
      Executing endpoint 'ControllerSample.Controllers.SampleController.Withdraw (ControllerSample)'
info: Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker[3]
      Route matched with {action = "Withdraw", controller = "Sample"}. Executing controller action with signature System.Threading.Tasks.Task`1[Microsoft.AspNetCore.Mvc.ActionResult`1[ControllerSample.Account]] Withdraw(ControllerSample.Transaction, Dapr.Client.DaprClient) on controller ControllerSample.Controllers.SampleController (ControllerSample).
info: Microsoft.AspNetCore.Mvc.Infrastructure.ObjectResultExecutor[1]
      Executing ObjectResult, writing value of type 'Microsoft.AspNetCore.Mvc.ValidationProblemDetails'.
info: Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker[2]
      Executed action ControllerSample.Controllers.SampleController.Withdraw (ControllerSample) in 52.1211ms
info: Microsoft.AspNetCore.Routing.EndpointMiddleware[1]
      Executed endpoint 'ControllerSample.Controllers.SampleController.Withdraw (ControllerSample)'
info: Microsoft.AspNetCore.Hosting.Diagnostics[2]
      Request finished in 157.056ms 400 application/problem+json; charset=utf-8

The log entry of primary interest is the one coming from routing:

info: Microsoft.AspNetCore.Routing.EndpointMiddleware[0]
      Executing endpoint 'ControllerSample.Controllers.SampleController.Withdraw (ControllerSample)'

This entry shows that:

  • Routing executed
  • Routing chose the ControllerSample.Controllers.SampleController.Withdraw (ControllerSample) endpoint

Now you have the information needed to troubleshoot this step.

Option 0: Routing chose the correct endpoint

If the information in the routing log entry is correct, then it means that in isolation your application is behaving correctly.

Example:

info: Microsoft.AspNetCore.Routing.EndpointMiddleware[0]
      Executing endpoint 'ControllerSample.Controllers.SampleController.Withdraw (ControllerSample)'

You might want to try using the Dapr CLI to send a pub/sub message directly and compare the logging output.

Example command:

dapr publish --pubsub pubsub --topic withdraw --data '{}'

If after doing this you still don’t understand the problem, please open an issue on this repo and include the contents of your Startup.cs.

Option 1: Routing did not execute

If you don’t see an entry for Microsoft.AspNetCore.Routing.EndpointMiddleware in the logs, then it means that the request was handled by something other than routing. Usually the problem in this case is a misbehaving middleware. Other logs from the request might give you a clue to what’s happening.

If you need help understanding the problem please open an issue on this repo and include the contents of your Startup.cs.

Option 2: Routing chose the wrong endpoint

If you see an entry for Microsoft.AspNetCore.Routing.EndpointMiddleware in the logs, but it contains the wrong endpoint then it means that you’ve got a routing conflict. The endpoint that was chosen will appear in the logs so that should give you an idea of what’s causing the conflict.

If you need help understanding the problem please open an issue on this repo and include the contents of your Startup.cs.

2 - Dapr Go SDK

Go SDK packages for developing Dapr applications

A client library to help build Dapr applications in Go. This client supports all public Dapr APIs while focusing on idiomatic Go experiences and developer productivity.

Client

Use the Go Client SDK for invoking public Dapr APIs [**Learn more about the Go Client SDK**](https://v1-16.docs.dapr.io/developing-applications/sdks/go/go-client/)

Service

Use the Dapr Service (Callback) SDK for Go to create services that will be invoked by Dapr. [**Learn more about the Go Service (Callback) SDK**](https://v1-16.docs.dapr.io/developing-applications/sdks/go/go-service/)

2.1 - Getting started with the Dapr client Go SDK

How to get up and running with the Dapr Go SDK

The Dapr client package allows you to interact with other Dapr applications from a Go application.

Prerequisites

Import the client package

import "github.com/dapr/go-sdk/client"

Error handling

Dapr errors are based on gRPC’s richer error model. The following code shows an example of how you can parse and handle the error details:

if err != nil {
    st := status.Convert(err)

    fmt.Printf("Code: %s\n", st.Code().String())
    fmt.Printf("Message: %s\n", st.Message())

    for _, detail := range st.Details() {
        switch t := detail.(type) {
        case *errdetails.ErrorInfo:
            // Handle ErrorInfo details
            fmt.Printf("ErrorInfo:\n- Domain: %s\n- Reason: %s\n- Metadata: %v\n", t.GetDomain(), t.GetReason(), t.GetMetadata())
        case *errdetails.BadRequest:
            // Handle BadRequest details
            fmt.Println("BadRequest:")
            for _, violation := range t.GetFieldViolations() {
                fmt.Printf("- Key: %s\n", violation.GetField())
                fmt.Printf("- The %q field was wrong: %s\n", violation.GetField(), violation.GetDescription())
            }
        case *errdetails.ResourceInfo:
            // Handle ResourceInfo details
            fmt.Printf("ResourceInfo:\n- Resource type: %s\n- Resource name: %s\n- Owner: %s\n- Description: %s\n",
                t.GetResourceType(), t.GetResourceName(), t.GetOwner(), t.GetDescription())
        case *errdetails.Help:
            // Handle Help details
            fmt.Println("HelpInfo:")
            for _, link := range t.GetLinks() {
                fmt.Printf("- Url: %s\n", link.Url)
                fmt.Printf("- Description: %s\n", link.Description)
            }

        default:
            // Add cases for other types of details you expect
            fmt.Printf("Unhandled error detail type: %v\n", t)
        }
    }
}

Building blocks

The Go SDK allows you to interface with all of the Dapr building blocks.

Service Invocation

To invoke a specific method on another service running with a Dapr sidecar, the Dapr client Go SDK provides two options:

Invoke a service without data:

resp, err := client.InvokeMethod(ctx, "app-id", "method-name", "post")

Invoke a service with data:

content := &dapr.DataContent{
    ContentType: "application/json",
    Data:        []byte(`{ "id": "a123", "value": "demo", "valid": true }`),
}

resp, err = client.InvokeMethodWithContent(ctx, "app-id", "method-name", "post", content)

For a full guide on service invocation, visit How-To: Invoke a service.

Workflows

Workflows and their activities can be authored and managed using the Dapr Go SDK like so:

import (
...
"github.com/dapr/go-sdk/workflow"
...
)

func ExampleWorkflow(ctx *workflow.WorkflowContext) (any, error) {
    var output string
    input := "world"

    if err := ctx.CallActivity(ExampleActivity, workflow.ActivityInput(input)).Await(&output); err != nil {
        return nil, err
    }

    // Print output - "hello world"
    fmt.Println(output)

    return nil, nil
}

func ExampleActivity(ctx workflow.ActivityContext) (any, error) {
    var input string
    if err := ctx.GetInput(&input); err != nil {
        return "", err
    }

    return fmt.Sprintf("hello %s", input), nil
}

func main() {
    // Create a workflow worker
    w, err := workflow.NewWorker()
    if err != nil {
        log.Fatalf("error creating worker: %v", err)
    }

    // Register the workflow
    w.RegisterWorkflow(ExampleWorkflow)

    // Register the activity
    w.RegisterActivity(ExampleActivity)

    // Start workflow runner
    if err := w.Start(); err != nil {
        log.Fatal(err)
    }

    // Create a workflow client
    wfClient, err := workflow.NewClient()
    if err != nil {
        log.Fatal(err)
    }

    // Start a new workflow
    id, err := wfClient.ScheduleNewWorkflow(context.Background(), "ExampleWorkflow")
    if err != nil {
        log.Fatal(err)
    }

    // Wait for the workflow to complete
    metadata, err := wfClient.WaitForWorkflowCompletion(context.Background(), id)
    if err != nil {
        log.Fatal(err)
    }

    // Print workflow status post-completion
    fmt.Println(metadata.RuntimeStatus)

    // Shutdown Worker
    w.Shutdown()
}

State Management

For simple use cases, the Dapr client provides easy-to-use Save, Get, and Delete methods:

ctx := context.Background()
data := []byte("hello")
store := "my-store" // defined in the component YAML

// save state with the key key1, default options: strong, last-write
if err := client.SaveState(ctx, store, "key1", data, nil); err != nil {
    panic(err)
}

// get state for key key1
item, err := client.GetState(ctx, store, "key1", nil)
if err != nil {
    panic(err)
}
fmt.Printf("data [key:%s etag:%s]: %s", item.Key, item.Etag, string(item.Value))

// delete state for key key1
if err := client.DeleteState(ctx, store, "key1", nil); err != nil {
    panic(err)
}

For more granular control, the Dapr Go client exposes the SetStateItem type, which can be used to gain more control over state operations and allows multiple items to be saved at once:

item1 := &dapr.SetStateItem{
    Key:  "key1",
    Etag: &dapr.ETag{
        Value: "1",
    },
    Metadata: map[string]string{
        "created-on": time.Now().UTC().String(),
    },
    Value: []byte("hello"),
    Options: &dapr.StateOptions{
        Concurrency: dapr.StateConcurrencyLastWrite,
        Consistency: dapr.StateConsistencyStrong,
    },
}

item2 := &dapr.SetStateItem{
    Key:  "key2",
    Metadata: map[string]string{
        "created-on": time.Now().UTC().String(),
    },
    Value: []byte("hello again"),
}

item3 := &dapr.SetStateItem{
    Key:  "key3",
    Etag: &dapr.ETag{
        Value: "1",
    },
    Value: []byte("hello again"),
}

if err := client.SaveBulkState(ctx, store, item1, item2, item3); err != nil {
    panic(err)
}

Similarly, GetBulkState method provides a way to retrieve multiple state items in a single operation:

keys := []string{"key1", "key2", "key3"}
items, err := client.GetBulkState(ctx, store, keys, nil, 100)

And the ExecuteStateTransaction method to execute multiple upsert or delete operations transactionally.

ops := make([]*dapr.StateOperation, 0)

op1 := &dapr.StateOperation{
    Type: dapr.StateOperationTypeUpsert,
    Item: &dapr.SetStateItem{
        Key:   "key1",
        Value: []byte(data),
    },
}
op2 := &dapr.StateOperation{
    Type: dapr.StateOperationTypeDelete,
    Item: &dapr.SetStateItem{
        Key:   "key2",
    },
}
ops = append(ops, op1, op2)
meta := map[string]string{}
err := testClient.ExecuteStateTransaction(ctx, store, meta, ops)

Retrieve, filter, and sort key/value data stored in your state store using QueryState.

// Define the query string
query := `{
	"filter": {
		"EQ": { "value.Id": "1" }
	},
	"sort": [
		{
			"key": "value.Balance",
			"order": "DESC"
		}
	]
}`

// Use the client to query the state
queryResponse, err := c.QueryState(ctx, "querystore", query)
if err != nil {
	log.Fatal(err)
}

fmt.Printf("Got %d\n", len(queryResponse))

for _, account := range queryResponse {
	var data Account
	err := account.Unmarshal(&data)
	if err != nil {
		log.Fatal(err)
	}

	fmt.Printf("Account: %s has %f\n", data.ID, data.Balance)
}

Note: Query state API is currently in alpha

For a full guide on state management, visit How-To: Save & get state.

Publish Messages

To publish data onto a topic, the Dapr Go client provides a simple method:

data := []byte(`{ "id": "a123", "value": "abcdefg", "valid": true }`)
if err := client.PublishEvent(ctx, "component-name", "topic-name", data); err != nil {
    panic(err)
}

To publish multiple messages at once, the PublishEvents method can be used:

events := []string{"event1", "event2", "event3"}
res := client.PublishEvents(ctx, "component-name", "topic-name", events)
if res.Error != nil {
    panic(res.Error)
}

For a full guide on pub/sub, visit How-To: Publish & subscribe.

Workflow

You can create workflows using the Go SDK. For example, start with a simple workflow activity:

func TestActivity(ctx workflow.ActivityContext) (any, error) {
	var input int
	if err := ctx.GetInput(&input); err != nil {
		return "", err
	}

	// Do something here
	return "result", nil
}

Write a simple workflow function:

func TestWorkflow(ctx *workflow.WorkflowContext) (any, error) {
	var input int
	if err := ctx.GetInput(&input); err != nil {
		return nil, err
	}
	var output string
	if err := ctx.CallActivity(TestActivity, workflow.ActivityInput(input)).Await(&output); err != nil {
		return nil, err
	}
	if err := ctx.WaitForExternalEvent("testEvent", time.Second*60).Await(&output); err != nil {
		return nil, err
	}

	if err := ctx.CreateTimer(time.Second).Await(nil); err != nil {
		return nil, err
	}
	return output, nil
}

Then compose your application that will use the workflow you’ve created. Refer to the How-To: Author workflows guide for a full walk-through.

Try out the Go SDK workflow example.

Jobs

The Dapr client Go SDK allows you to schedule, get, and delete jobs. Jobs enable you to schedule work to be executed at specific times or intervals.

Scheduling a Job

To schedule a new job, use the ScheduleJobAlpha1 method:

import (
    "google.golang.org/protobuf/types/known/anypb"
)

// Create job data
data, err := anypb.New(&YourDataStruct{Message: "Hello, Job!"})
if err != nil {
    panic(err)
}

// Create a simple job using the builder pattern
job := client.NewJob("my-scheduled-job",
    client.WithJobData(data),
    client.WithJobDueTime("10s"), // Execute in 10 seconds
)

// Schedule the job
err = client.ScheduleJobAlpha1(ctx, job)
if err != nil {
    panic(err)
}

Job with Schedule and Repeats

You can create recurring jobs using the Schedule field with cron expressions:

job := client.NewJob("recurring-job",
    client.WithJobData(data),
    client.WithJobSchedule("0 9 * * *"), // Run at 9 AM every day
    client.WithJobRepeats(10),            // Repeat 10 times
    client.WithJobTTL("1h"),              // Job expires after 1 hour
)

err = client.ScheduleJobAlpha1(ctx, job)

Job with Failure Policy

Configure how jobs should handle failures using failure policies:

// Constant retry policy with max retries and interval
job := client.NewJob("resilient-job",
    client.WithJobData(data),
    client.WithJobDueTime("2024-01-01T10:00:00Z"),
    client.WithJobConstantFailurePolicy(),
    client.WithJobConstantFailurePolicyMaxRetries(3),
    client.WithJobConstantFailurePolicyInterval(30*time.Second),
)

err = client.ScheduleJobAlpha1(ctx, job)

For jobs that should not be retried on failure, use the drop policy:

job := client.NewJob("one-shot-job",
    client.WithJobData(data),
    client.WithJobDueTime("2024-01-01T10:00:00Z"),
    client.WithJobDropFailurePolicy(),
)

err = client.ScheduleJobAlpha1(ctx, job)

Getting a Job

To get information about a scheduled job:

job, err := client.GetJobAlpha1(ctx, "my-scheduled-job")
if err != nil {
    panic(err)
}

fmt.Printf("Job: %s, Schedule: %s, Repeats: %d\n",
    job.Name, job.Schedule, job.Repeats)

Deleting a Job

To cancel a scheduled job:

err = client.DeleteJobAlpha1(ctx, "my-scheduled-job")
if err != nil {
    panic(err)
}

For a full guide on jobs, visit How-To: Schedule and manage jobs.

Output Bindings

The Dapr Go client SDK provides two methods to invoke an operation on a Dapr-defined binding. Dapr supports input, output, and bidirectional bindings.

For simple, output-only binding:

in := &dapr.InvokeBindingRequest{ Name: "binding-name", Operation: "operation-name" }
err = client.InvokeOutputBinding(ctx, in)

To invoke a method with content and metadata:

in := &dapr.InvokeBindingRequest{
    Name:      "binding-name",
    Operation: "operation-name",
    Data:      []byte("hello"),
    Metadata:  map[string]string{"k1": "v1", "k2": "v2"},
}

out, err := client.InvokeBinding(ctx, in)

For a full guide on output bindings, visit How-To: Use bindings.

Actors

Use the Dapr Go client SDK to write actors.

// MyActor represents an example actor type.
type MyActor struct {
	actors.Actor
}

// MyActorMethod is a method that can be invoked on MyActor.
func (a *MyActor) MyActorMethod(ctx context.Context, req *actors.Message) (string, error) {
	log.Printf("Received message: %s", req.Data)
	return "Hello from MyActor!", nil
}

func main() {
	// Create a Dapr client
	daprClient, err := client.NewClient()
	if err != nil {
		log.Fatal("Error creating Dapr client: ", err)
	}

	// Register the actor type with Dapr
	actors.RegisterActor(&MyActor{})

	// Create an actor client
	actorClient := actors.NewClient(daprClient)

	// Create an actor ID
	actorID := actors.NewActorID("myactor")

	// Get or create the actor
	err = actorClient.SaveActorState(context.Background(), "myactorstore", actorID, map[string]interface{}{"data": "initial state"})
	if err != nil {
		log.Fatal("Error saving actor state: ", err)
	}

	// Invoke a method on the actor
	resp, err := actorClient.InvokeActorMethod(context.Background(), "myactorstore", actorID, "MyActorMethod", &actors.Message{Data: []byte("Hello from client!")})
	if err != nil {
		log.Fatal("Error invoking actor method: ", err)
	}

	log.Printf("Response from actor: %s", resp.Data)

	// Wait for a few seconds before terminating
	time.Sleep(5 * time.Second)

	// Delete the actor
	err = actorClient.DeleteActor(context.Background(), "myactorstore", actorID)
	if err != nil {
		log.Fatal("Error deleting actor: ", err)
	}

	// Close the Dapr client
	daprClient.Close()
}

For a full guide on actors, visit the Actors building block documentation.

Secret Management

The Dapr client also provides access to runtime secrets that can be backed by any number of secret stores (e.g. Kubernetes Secrets, HashiCorp Vault, or Azure Key Vault):

opt := map[string]string{
    "version": "2",
}

secret, err := client.GetSecret(ctx, "store-name", "secret-name", opt)

Authentication

By default, Dapr relies on the network boundary to limit access to its API. If, however, the target Dapr API is configured with token-based authentication, users can configure the Go Dapr client with that token in two ways:

Environment Variable

If the DAPR_API_TOKEN environment variable is defined, the client automatically attaches it to Dapr API invocations for authentication.

Explicit Method

Users can also set the API token explicitly on any Dapr client instance. This approach is helpful when your code needs to create multiple clients for different Dapr API endpoints.

func main() {
    client, err := dapr.NewClient()
    if err != nil {
        panic(err)
    }
    defer client.Close()
    client.WithAuthToken("your-Dapr-API-token-here")
}

For a full guide on secrets, visit How-To: Retrieve secrets.

Distributed Lock

The Dapr client provides mutually exclusive access to a resource using a lock. With a lock, you can:

  • Provide access to a database row, table, or an entire database
  • Lock reading messages from a queue in a sequential manner

package main

import (
    "context"
    "fmt"

    dapr "github.com/dapr/go-sdk/client"
)

func main() {
    client, err := dapr.NewClient()
    if err != nil {
        panic(err)
    }
    defer client.Close()

    ctx := context.Background()

    resp, err := client.TryLockAlpha1(ctx, "lockstore", &dapr.LockRequest{
        LockOwner:       "random_id_abc123",
        ResourceID:      "my_file_name",
        ExpiryInSeconds: 60,
    })
    if err != nil {
        panic(err)
    }

    fmt.Println(resp.Success)
}

For a full guide on distributed lock, visit How-To: Use a lock.

Configuration

With the Dapr client Go SDK, you can consume configuration items that are returned as read-only key/value pairs, and subscribe to configuration item changes.

Config Get

items, err := client.GetConfigurationItem(ctx, "example-config", "mykey")
if err != nil {
	panic(err)
}
fmt.Printf("get config = %s\n", (*items).Value)

Config Subscribe

go func() {
	if err := client.SubscribeConfigurationItems(ctx, "example-config", []string{"mySubscribeKey1", "mySubscribeKey2", "mySubscribeKey3"}, func(id string, items map[string]*dapr.ConfigurationItem) {
		for k, v := range items {
			fmt.Printf("get updated config key = %s, value = %s \n", k, v.Value)
		}
		subscribeID = id
	}); err != nil {
		panic(err)
	}
}()

For a full guide on configuration, visit How-To: Manage configuration from a store.

Cryptography

With the Dapr client Go SDK, you can use the high-level Encrypt and Decrypt cryptography APIs to encrypt and decrypt files while working on a stream of data.

To encrypt:

// Encrypt the data using Dapr
out, err := client.Encrypt(context.Background(), rf, dapr.EncryptOptions{
	// These are the 3 required parameters
	ComponentName: "mycryptocomponent",
	KeyName:       "mykey",
	Algorithm:     "RSA",
})
if err != nil {
	panic(err)
}

To decrypt:

// Decrypt the data using Dapr
out, err := client.Decrypt(context.Background(), rf, dapr.EncryptOptions{
	// Only required option is the component name
	ComponentName: "mycryptocomponent",
})

For a full guide on cryptography, visit How-To: Use the cryptography APIs.

Go SDK Examples

2.2 - Getting started with the Dapr Service (Callback) SDK for Go

How to get up and running with the Dapr Service (Callback) SDK for Go

In addition to this Dapr API client, the Dapr Go SDK also provides a service package to bootstrap your Dapr callback services. These services can be developed in either gRPC or HTTP:

2.2.1 - Getting started with the Dapr HTTP Service SDK for Go

How to get up and running with the Dapr HTTP Service SDK for Go

Prerequisite

Start by importing the Dapr Go service/http package:

daprd "github.com/dapr/go-sdk/service/http"

Creating and Starting Service

To create an HTTP Dapr service, first create a Dapr callback instance with a specific address:

s := daprd.NewService(":8080")

Or with an address and an existing http.ServeMux, in case you want to combine existing server implementations:

mux := http.NewServeMux()
mux.HandleFunc("/", myOtherHandler)
s := daprd.NewServiceWithMux(":8080", mux)

Once you create a service instance, you can “attach” to that service any number of event, binding, and service invocation logic handlers as shown below. Once the logic is defined, you are ready to start the service:

if err := s.Start(); err != nil && err != http.ErrServerClosed {
	log.Fatalf("error: %v", err)
}

Event Handling

To handle events from a specific topic, you need to add at least one topic event handler before starting the service:

sub := &common.Subscription{
	PubsubName: "messages",
	Topic:      "topic1",
	Route:      "/events",
}
err := s.AddTopicEventHandler(sub, eventHandler)
if err != nil {
	log.Fatalf("error adding topic subscription: %v", err)
}

The handler method itself can be any method with the expected signature:

func eventHandler(ctx context.Context, e *common.TopicEvent) (retry bool, err error) {
	log.Printf("event - PubsubName:%s, Topic:%s, ID:%s, Data: %v", e.PubsubName, e.Topic, e.ID, e.Data)
	// do something with the event
	return true, nil
}

Optionally, you can use routing rules to send messages to different handlers based on the contents of the CloudEvent.

sub := &common.Subscription{
	PubsubName: "messages",
	Topic:      "topic1",
	Route:      "/important",
	Match:      `event.type == "important"`,
	Priority:   1,
}
err := s.AddTopicEventHandler(sub, importantHandler)
if err != nil {
	log.Fatalf("error adding topic subscription: %v", err)
}

You can also create a custom type that implements the TopicEventSubscriber interface to handle your events:

type EventHandler struct {
	// any data or references that your event handler needs.
}

func (h *EventHandler) Handle(ctx context.Context, e *common.TopicEvent) (retry bool, err error) {
    log.Printf("event - PubsubName:%s, Topic:%s, ID:%s, Data: %v", e.PubsubName, e.Topic, e.ID, e.Data)
    // do something with the event
    return true, nil
}

The EventHandler can then be added using the AddTopicEventSubscriber method:

sub := &common.Subscription{
    PubsubName: "messages",
    Topic:      "topic1",
}
eventHandler := &EventHandler{
// initialize any fields
}
if err := s.AddTopicEventSubscriber(sub, eventHandler); err != nil {
    log.Fatalf("error adding topic subscription: %v", err)
}

Service Invocation Handler

To handle service invocations you will need to add at least one service invocation handler before starting the service:

if err := s.AddServiceInvocationHandler("/echo", echoHandler); err != nil {
	log.Fatalf("error adding invocation handler: %v", err)
}

The handler method itself can be any method with the expected signature:

func echoHandler(ctx context.Context, in *common.InvocationEvent) (out *common.Content, err error) {
	log.Printf("echo - ContentType:%s, Verb:%s, QueryString:%s, %+v", in.ContentType, in.Verb, in.QueryString, string(in.Data))
	// do something with the invocation here 
	out = &common.Content{
		Data:        in.Data,
		ContentType: in.ContentType,
		DataTypeURL: in.DataTypeURL,
	}
	return
}

Binding Invocation Handler

if err := s.AddBindingInvocationHandler("/run", runHandler); err != nil {
	log.Fatalf("error adding binding handler: %v", err)
}

The handler method itself can be any method with the expected signature:

func runHandler(ctx context.Context, in *common.BindingEvent) (out []byte, err error) {
	log.Printf("binding - Data:%v, Meta:%v", in.Data, in.Metadata)
	// do something with the invocation here 
	return nil, nil
}

2.2.2 - Getting started with the Dapr gRPC Service SDK for Go

How to get up and running with the Dapr gRPC Service SDK for Go

Dapr gRPC Service SDK for Go

Prerequisite

Start by importing the Dapr Go service/grpc package:

daprd "github.com/dapr/go-sdk/service/grpc"

Creating and Starting Service

To create a gRPC Dapr service, first create a Dapr callback instance with a specific address:

s, err := daprd.NewService(":50001")
if err != nil {
    log.Fatalf("failed to start the server: %v", err)
}

Or with an address and an existing net.Listener, in case you want to reuse an existing server listener:

list, err := net.Listen("tcp", "localhost:0")
if err != nil {
	log.Fatalf("gRPC listener creation failed: %s", err)
}
s := daprd.NewServiceWithListener(list)

Once you create a service instance, you can “attach” to that service any number of event, binding, and service invocation logic handlers as shown below. Once the logic is defined, you are ready to start the service:

if err := s.Start(); err != nil {
    log.Fatalf("server error: %v", err)
}

Event Handling

To handle events from a specific topic, you need to add at least one topic event handler before starting the service:

sub := &common.Subscription{
	PubsubName: "messages",
	Topic:      "topic1",
}
if err := s.AddTopicEventHandler(sub, eventHandler); err != nil {
    log.Fatalf("error adding topic subscription: %v", err)
}

The handler method itself can be any method with the expected signature:

func eventHandler(ctx context.Context, e *common.TopicEvent) (retry bool, err error) {
	log.Printf("event - PubsubName:%s, Topic:%s, ID:%s, Data: %v", e.PubsubName, e.Topic, e.ID, e.Data)
	// do something with the event
	return true, nil
}

Optionally, you can use routing rules to send messages to different handlers based on the contents of the CloudEvent.

sub := &common.Subscription{
	PubsubName: "messages",
	Topic:      "topic1",
	Route:      "/important",
	Match:      `event.type == "important"`,
	Priority:   1,
}
err := s.AddTopicEventHandler(sub, importantHandler)
if err != nil {
	log.Fatalf("error adding topic subscription: %v", err)
}

You can also create a custom type that implements the TopicEventSubscriber interface to handle your events:

type EventHandler struct {
	// any data or references that your event handler needs.
}

func (h *EventHandler) Handle(ctx context.Context, e *common.TopicEvent) (retry bool, err error) {
    log.Printf("event - PubsubName:%s, Topic:%s, ID:%s, Data: %v", e.PubsubName, e.Topic, e.ID, e.Data)
    // do something with the event
    return true, nil
}

The EventHandler can then be added using the AddTopicEventSubscriber method:

sub := &common.Subscription{
    PubsubName: "messages",
    Topic:      "topic1",
}
eventHandler := &EventHandler{
// initialize any fields
}
if err := s.AddTopicEventSubscriber(sub, eventHandler); err != nil {
    log.Fatalf("error adding topic subscription: %v", err)
}

Service Invocation Handler

To handle service invocations you will need to add at least one service invocation handler before starting the service:

if err := s.AddServiceInvocationHandler("echo", echoHandler); err != nil {
    log.Fatalf("error adding invocation handler: %v", err)
}

The handler method itself can be any method with the expected signature:

func echoHandler(ctx context.Context, in *common.InvocationEvent) (out *common.Content, err error) {
	log.Printf("echo - ContentType:%s, Verb:%s, QueryString:%s, %+v", in.ContentType, in.Verb, in.QueryString, string(in.Data))
	// do something with the invocation here 
	out = &common.Content{
		Data:        in.Data,
		ContentType: in.ContentType,
		DataTypeURL: in.DataTypeURL,
	}
	return
}

Binding Invocation Handler

To handle binding invocations you will need to add at least one binding invocation handler before starting the service:

if err := s.AddBindingInvocationHandler("run", runHandler); err != nil {
    log.Fatalf("error adding binding handler: %v", err)
}

The handler method itself can be any method with the expected signature:

func runHandler(ctx context.Context, in *common.BindingEvent) (out []byte, err error) {
	log.Printf("binding - Data:%v, Meta:%v", in.Data, in.Metadata)
	// do something with the invocation here 
	return nil, nil
}

3 - Dapr Java SDK

Java SDK packages for developing Dapr applications

Dapr offers a variety of packages to help with the development of Java applications. Using them you can create Java clients, servers, and virtual actors with Dapr.

Prerequisites

Import Dapr’s Java SDK

Next, import the Java SDK packages to get started. Select your preferred build tool to learn how to import.

For a Maven project, add the following to your pom.xml file:

<project>
  ...
  <dependencies>
    ...
    <!-- Dapr's core SDK with all features, except Actors. -->
    <dependency>
      <groupId>io.dapr</groupId>
      <artifactId>dapr-sdk</artifactId>
      <version>1.15.0</version>
    </dependency>
    <!-- Dapr's SDK for Actors (optional). -->
    <dependency>
      <groupId>io.dapr</groupId>
      <artifactId>dapr-sdk-actors</artifactId>
      <version>1.15.0</version>
    </dependency>
    <!-- Dapr's SDK integration with SpringBoot (optional). -->
    <dependency>
      <groupId>io.dapr</groupId>
      <artifactId>dapr-sdk-springboot</artifactId>
      <version>1.15.0</version>
    </dependency>
    ...
  </dependencies>
  ...
</project>

For a Gradle project, add the following to your build.gradle file:

dependencies {
...
    // Dapr's core SDK with all features, except Actors.
    compile('io.dapr:dapr-sdk:1.15.0')
    // Dapr's SDK for Actors (optional).
    compile('io.dapr:dapr-sdk-actors:1.15.0')
    // Dapr's SDK integration with SpringBoot (optional).
    compile('io.dapr:dapr-sdk-springboot:1.15.0')
}

If you are also using Spring Boot, you may run into a common issue where the OkHttp version that the Dapr SDK uses conflicts with the one specified in the Spring Boot Bill of Materials.

You can fix this by specifying a compatible OkHttp version in your project to match the version that the Dapr SDK uses:

<dependency>
  <groupId>com.squareup.okhttp3</groupId>
  <artifactId>okhttp</artifactId>
  <version>1.15.0</version>
</dependency>

Try it out

Put the Dapr Java SDK to the test. Walk through the Java quickstarts and tutorials to see Dapr in action:

SDK samplesDescription
QuickstartsExperience Dapr’s API building blocks in just a few minutes using the Java SDK.
SDK samplesClone the SDK repo to try out some examples and get started.
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;

try (DaprClient client = (new DaprClientBuilder()).build()) {
  // sending a class with message; BINDING_OPERATION="create"
  client.invokeBinding(BINDING_NAME, BINDING_OPERATION, myClass).block();

  // sending a plain string
  client.invokeBinding(BINDING_NAME, BINDING_OPERATION, message).block();
}

Available packages

Client

Create Java clients that interact with a Dapr sidecar and other Dapr applications.

Workflow

Create and manage workflows that work with other Dapr APIs in Java.

3.1 - AI

With the Dapr Conversation AI package, you can interact with the Dapr AI workloads from a Java application. To get started, walk through the Dapr AI how-to guide.

3.1.1 - How to: Author and manage Dapr Conversation AI in the Java SDK

How to get up and running with Conversation AI using the Dapr Java SDK

As part of this demonstration, we will look at how to use the Conversation API to converse with a Large Language Model (LLM). The API returns the LLM's response for a given prompt. The provided Conversation AI example walks through this end to end.

This example uses the default configuration from dapr init in self-hosted mode.

Prerequisites

Set up the environment

Clone the Java SDK repo and navigate into it.

git clone https://github.com/dapr/java-sdk.git
cd java-sdk

Run the following command to install the requirements for running the Conversation AI example with the Dapr Java SDK.

mvn clean install -DskipTests

From the Java SDK root directory, navigate to the examples directory.

cd examples

Run the Dapr sidecar.

dapr run --app-id conversationapp --dapr-grpc-port 51439 --dapr-http-port 3500 --app-port 8080

Now, Dapr is listening for HTTP requests at http://localhost:3500 and gRPC requests on port 51439.

Send a prompt with Personally identifiable information (PII) to the Conversation AI API

The DemoConversationAI example shows how to send a prompt using the converse method on the DaprPreviewClient.

public class DemoConversationAI {
  /**
   * The main method to start the client.
   *
   * @param args Input arguments (unused).
   */
  public static void main(String[] args) {
    try (DaprPreviewClient client = new DaprClientBuilder().buildPreviewClient()) {
      System.out.println("Sending the following input to LLM: Hello How are you? This is the my number 672-123-4567");

      ConversationInput daprConversationInput = new ConversationInput("Hello How are you? "
              + "This is the my number 672-123-4567");

      // Component name is the name provided in the metadata block of the conversation.yaml file.
      Mono<ConversationResponse> responseMono = client.converse(new ConversationRequest("echo",
              List.of(daprConversationInput))
              .setContextId("contextId")
              .setScrubPii(true).setTemperature(1.1d));
      ConversationResponse response = responseMono.block();
      System.out.printf("Conversation output: %s", response.getConversationOutputs().get(0).getResult());
    } catch (Exception e) {
      throw new RuntimeException(e);
    }
  }
}

Run the DemoConversationAI with the following command.

java -jar target/dapr-java-sdk-examples-exec.jar io.dapr.examples.conversation.DemoConversationAI

Sample output

== APP == Conversation output: Hello How are you? This is the my number <ISBN>

As shown in the output, the phone number sent to the API is obfuscated and returned as a masked token (<ISBN> in this case). The example above uses an “echo” component for testing, which simply returns the input message. When integrated with LLMs like OpenAI or Claude, you’ll receive meaningful responses instead of echoed input.

Next steps

3.2 - Getting started with the Dapr client Java SDK

How to get up and running with the Dapr Java SDK

The Dapr client package allows you to interact with other Dapr applications from a Java application.

Prerequisites

Complete initial setup and import the Java SDK into your project

Initializing the client

You can initialize a Dapr client as so:

DaprClient client = new DaprClientBuilder().build();

This will connect to the default Dapr gRPC endpoint localhost:50001. For information about configuring the client using environment variables and system properties, see Properties.

Error Handling

Initially, errors in Dapr followed the Standard gRPC error model. However, to provide more detailed and informative error messages, version 1.13 introduced an enhanced error model that aligns with the gRPC Richer error model. In response, the Java SDK extended DaprException to include the error details that were added in Dapr.

Example of handling the DaprException and consuming the error details when using the Dapr Java SDK:

...
      try {
        client.publishEvent("unknown_pubsub", "mytopic", "mydata").block();
      } catch (DaprException exception) {
        System.out.println("Dapr exception's error code: " + exception.getErrorCode());
        System.out.println("Dapr exception's message: " + exception.getMessage());
        // DaprException now contains `getStatusDetails()` to include more details about the error from Dapr runtime.
        System.out.println("Dapr exception's reason: " + exception.getStatusDetails().get(
        DaprErrorDetails.ErrorDetailType.ERROR_INFO,
            "reason",
        TypeRef.STRING));
      }
...

Building blocks

The Java SDK allows you to interface with all of the Dapr building blocks.

Invoke a service

import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;

try (DaprClient client = (new DaprClientBuilder()).build()) {
  // invoke a 'GET' method (HTTP) skipping serialization: /say with a Mono<byte[]> return type
  // for gRPC set HttpExtension.NONE parameters below
  response = client.invokeMethod(SERVICE_TO_INVOKE, METHOD_TO_INVOKE, "{\"name\":\"World!\"}", HttpExtension.GET, byte[].class).block();

  // invoke a 'POST' method (HTTP) skipping serialization: /say with a Mono<byte[]> return type
  response = client.invokeMethod(SERVICE_TO_INVOKE, METHOD_TO_INVOKE, "{\"id\":\"100\", \"FirstName\":\"Value\", \"LastName\":\"Value\"}", HttpExtension.POST, byte[].class).block();

  System.out.println(new String(response));

  // invoke a 'POST' method (HTTP) with serialization: /employees with a Mono<Employee> return type
  Employee newEmployee = new Employee("Nigel", "Guitarist");
  Employee employeeResponse = client.invokeMethod(SERVICE_TO_INVOKE, "employees", newEmployee, HttpExtension.POST, Employee.class).block();
}

Save & get application state

import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.domain.State;
import reactor.core.publisher.Mono;

try (DaprClient client = (new DaprClientBuilder()).build()) {
  // Save state
  client.saveState(STATE_STORE_NAME, FIRST_KEY_NAME, myClass).block();

  // Get state
  State<MyClass> retrievedMessage = client.getState(STATE_STORE_NAME, FIRST_KEY_NAME, MyClass.class).block();

  // Delete state
  client.deleteState(STATE_STORE_NAME, FIRST_KEY_NAME).block();
}

Publish & subscribe to messages

Publish messages
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.domain.Metadata;
import static java.util.Collections.singletonMap;

try (DaprClient client = (new DaprClientBuilder()).build()) {
  client.publishEvent(PUBSUB_NAME, TOPIC_NAME, message, singletonMap(Metadata.TTL_IN_SECONDS, MESSAGE_TTL_IN_SECONDS)).block();
}
Subscribe to messages
import com.fasterxml.jackson.databind.ObjectMapper;
import io.dapr.Rule;
import io.dapr.Topic;
import io.dapr.client.domain.BulkSubscribeAppResponse;
import io.dapr.client.domain.BulkSubscribeAppResponseEntry;
import io.dapr.client.domain.BulkSubscribeAppResponseStatus;
import io.dapr.client.domain.BulkSubscribeMessage;
import io.dapr.client.domain.BulkSubscribeMessageEntry;
import io.dapr.client.domain.CloudEvent;
import io.dapr.springboot.annotations.BulkSubscribe;
import java.util.ArrayList;
import java.util.List;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Mono;

@RestController
public class SubscriberController {

  private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();

  @Topic(name = "testingtopic", pubsubName = "${myAppProperty:messagebus}")
  @PostMapping(path = "/testingtopic")
  public Mono<Void> handleMessage(@RequestBody(required = false) CloudEvent<?> cloudEvent) {
    return Mono.fromRunnable(() -> {
      try {
        System.out.println("Subscriber got: " + cloudEvent.getData());
        System.out.println("Subscriber got: " + OBJECT_MAPPER.writeValueAsString(cloudEvent));
      } catch (Exception e) {
        throw new RuntimeException(e);
      }
    });
  }

  @Topic(name = "testingtopic", pubsubName = "${myAppProperty:messagebus}",
          rule = @Rule(match = "event.type == 'myevent.v2'", priority = 1))
  @PostMapping(path = "/testingtopicV2")
  public Mono<Void> handleMessageV2(@RequestBody(required = false) CloudEvent<?> cloudEvent) {
    return Mono.fromRunnable(() -> {
      try {
        System.out.println("Subscriber got: " + cloudEvent.getData());
        System.out.println("Subscriber got: " + OBJECT_MAPPER.writeValueAsString(cloudEvent));
      } catch (Exception e) {
        throw new RuntimeException(e);
      }
    });
  }

  @BulkSubscribe()
  @Topic(name = "testingtopicbulk", pubsubName = "${myAppProperty:messagebus}")
  @PostMapping(path = "/testingtopicbulk")
  public Mono<BulkSubscribeAppResponse> handleBulkMessage(
          @RequestBody(required = false) BulkSubscribeMessage<CloudEvent<String>> bulkMessage) {
    return Mono.fromCallable(() -> {
      if (bulkMessage.getEntries().size() == 0) {
        return new BulkSubscribeAppResponse(new ArrayList<BulkSubscribeAppResponseEntry>());
      }

      System.out.println("Bulk Subscriber received " + bulkMessage.getEntries().size() + " messages.");

      List<BulkSubscribeAppResponseEntry> entries = new ArrayList<BulkSubscribeAppResponseEntry>();
      for (BulkSubscribeMessageEntry<?> entry : bulkMessage.getEntries()) {
        try {
          System.out.printf("Bulk Subscriber message has entry ID: %s\n", entry.getEntryId());
          CloudEvent<?> cloudEvent = (CloudEvent<?>) entry.getEvent();
          System.out.printf("Bulk Subscriber got: %s\n", cloudEvent.getData());
          entries.add(new BulkSubscribeAppResponseEntry(entry.getEntryId(), BulkSubscribeAppResponseStatus.SUCCESS));
        } catch (Exception e) {
          e.printStackTrace();
          entries.add(new BulkSubscribeAppResponseEntry(entry.getEntryId(), BulkSubscribeAppResponseStatus.RETRY));
        }
      }
      return new BulkSubscribeAppResponse(entries);
    });
  }
}
Bulk Publish Messages

Note: API is in Alpha stage

import io.dapr.client.DaprClientBuilder;
import io.dapr.client.DaprPreviewClient;
import io.dapr.client.domain.BulkPublishResponse;
import io.dapr.client.domain.BulkPublishResponseFailedEntry;
import java.util.ArrayList;
import java.util.List;
class Solution {
  public void publishMessages() {
    try (DaprPreviewClient client = (new DaprClientBuilder()).buildPreviewClient()) {
      // Create a list of messages to publish
      List<String> messages = new ArrayList<>();
      for (int i = 0; i < NUM_MESSAGES; i++) {
        String message = String.format("This is message #%d", i);
        messages.add(message);
        System.out.println("Going to publish message : " + message);
      }

      // Publish list of messages using the bulk publish API
      BulkPublishResponse<String> res = client.publishEvents(PUBSUB_NAME, TOPIC_NAME, "text/plain", messages).block();
    }
  }
}

Interact with output bindings

import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;

try (DaprClient client = (new DaprClientBuilder()).build()) {
  // sending a class with message; BINDING_OPERATION="create"
  client.invokeBinding(BINDING_NAME, BINDING_OPERATION, myClass).block();

  // sending a plain string
  client.invokeBinding(BINDING_NAME, BINDING_OPERATION, message).block();
}

Interact with input bindings

import org.springframework.web.bind.annotation.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import reactor.core.publisher.Mono;

@RestController
@RequestMapping("/")
public class myClass {
    private static final Logger log = LoggerFactory.getLogger(myClass.class);
        @PostMapping(path = "/checkout")
        public Mono<String> getCheckout(@RequestBody(required = false) byte[] body) {
            return Mono.fromRunnable(() ->
                    log.info("Received Message: " + new String(body)));
        }
}

Retrieve secrets

import com.fasterxml.jackson.databind.ObjectMapper;
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
import java.util.Map;

try (DaprClient client = (new DaprClientBuilder()).build()) {
  Map<String, String> secret = client.getSecret(SECRET_STORE_NAME, secretKey).block();
  // JSON_SERIALIZER is a Jackson ObjectMapper used to print the secret map as JSON.
  System.out.println(JSON_SERIALIZER.writeValueAsString(secret));
}

Actors

An actor is an isolated, independent unit of compute and state with single-threaded execution. Dapr provides an actor implementation based on the Virtual Actor pattern, which provides a single-threaded programming model and garbage-collects actors when they are not in use. With Dapr’s implementation, you write your Dapr actors according to the Actor model, and Dapr leverages the scalability and reliability that the underlying platform provides.

import io.dapr.actors.ActorMethod;
import io.dapr.actors.ActorType;
import reactor.core.publisher.Mono;

@ActorType(name = "DemoActor")
public interface DemoActor {

  void registerReminder();

  @ActorMethod(name = "echo_message")
  String say(String something);

  void clock(String message);

  @ActorMethod(returns = Integer.class)
  Mono<Integer> incrementAndGet(int delta);
}

Get & Subscribe to application configurations

Note: this is a preview API, and is therefore only accessible via the DaprPreviewClient interface, not the normal DaprClient interface.

import io.dapr.client.DaprClientBuilder;
import io.dapr.client.DaprPreviewClient;
import io.dapr.client.domain.ConfigurationItem;
import io.dapr.client.domain.GetConfigurationRequest;
import io.dapr.client.domain.SubscribeConfigurationRequest;
import io.dapr.client.domain.SubscribeConfigurationResponse;
import io.dapr.client.domain.UnsubscribeConfigurationResponse;
import java.util.Map;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

try (DaprPreviewClient client = (new DaprClientBuilder()).buildPreviewClient()) {
  // Get configuration for a single key
  Mono<ConfigurationItem> item = client.getConfiguration(CONFIG_STORE_NAME, CONFIG_KEY);

  // Get configurations for multiple keys
  Mono<Map<String, ConfigurationItem>> items =
          client.getConfiguration(CONFIG_STORE_NAME, CONFIG_KEY_1, CONFIG_KEY_2);

  // Subscribe to configuration changes
  Flux<SubscribeConfigurationResponse> outFlux = client.subscribeConfiguration(CONFIG_STORE_NAME, CONFIG_KEY_1, CONFIG_KEY_2);
  outFlux.subscribe(configItems -> configItems.forEach(...));

  // Unsubscribe from configuration changes
  Mono<UnsubscribeConfigurationResponse> unsubscribe = client.unsubscribeConfiguration(SUBSCRIPTION_ID, CONFIG_STORE_NAME);
}

Query saved state

Note: this is a preview API, and is therefore only accessible via the DaprPreviewClient interface, not the normal DaprClient interface.

import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.DaprPreviewClient;
import io.dapr.client.domain.QueryStateItem;
import io.dapr.client.domain.QueryStateRequest;
import io.dapr.client.domain.QueryStateResponse;
import io.dapr.client.domain.SaveStateRequest;
import io.dapr.client.domain.State;
import io.dapr.client.domain.query.Query;
import io.dapr.client.domain.query.Sorting;
import io.dapr.client.domain.query.filters.EqFilter;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

try (DaprClient client = builder.build(); DaprPreviewClient previewClient = builder.buildPreviewClient()) {
        String searchVal = args.length == 0 ? "searchValue" : args[0];
        
        // Create JSON data
        Listing first = new Listing();
        first.setPropertyType("apartment");
        first.setId("1000");
        ...
        Listing second = new Listing();
        second.setPropertyType("row-house");
        second.setId("1002");
        ...
        Listing third = new Listing();
        third.setPropertyType("apartment");
        third.setId("1003");
        ...
        Listing fourth = new Listing();
        fourth.setPropertyType("apartment");
        fourth.setId("1001");
        ...
        Map<String, String> meta = new HashMap<>();
        meta.put("contentType", "application/json");

        // Save state
        SaveStateRequest saveRequest = new SaveStateRequest(STATE_STORE_NAME).setStates(
          new State<>("1", first, null, meta, null),
          new State<>("2", second, null, meta, null),
          new State<>("3", third, null, meta, null),
          new State<>("4", fourth, null, meta, null)
        );
        client.saveBulkState(saveRequest).block();


        // Create query and query state request

        Query query = new Query()
          .setFilter(new EqFilter<>("propertyType", "apartment"))
          .setSort(Arrays.asList(new Sorting("id", Sorting.Order.DESC)));
        QueryStateRequest queryRequest = new QueryStateRequest(STATE_STORE_NAME)
          .setQuery(query);

        // Use preview client to call the query state API
        QueryStateResponse<Listing> result = previewClient.queryState(queryRequest, Listing.class).block();

        // View query state response
        System.out.println("Found " + result.getResults().size() + " items.");
        for (QueryStateItem<Listing> item : result.getResults()) {
          System.out.println("Key: " + item.getKey());
          System.out.println("Data: " + item.getValue());
        }
}

Distributed lock

package io.dapr.examples.lock.grpc;

import io.dapr.client.DaprClientBuilder;
import io.dapr.client.DaprPreviewClient;
import io.dapr.client.domain.LockRequest;
import io.dapr.client.domain.UnlockRequest;
import io.dapr.client.domain.UnlockResponseStatus;
import reactor.core.publisher.Mono;

public class DistributedLockGrpcClient {
  private static final String LOCK_STORE_NAME = "lockstore";

  /**
   * Executes various methods to exercise the different APIs.
   *
   * @param args arguments
   * @throws Exception throws Exception
   */
  public static void main(String[] args) throws Exception {
    try (DaprPreviewClient client = (new DaprClientBuilder()).buildPreviewClient()) {
      System.out.println("Using preview client...");
      tryLock(client);
      unlock(client);
    }
  }

  /**
   * Trying to get lock.
   *
   * @param client DaprPreviewClient object
   */
  public static void tryLock(DaprPreviewClient client) {
    System.out.println("*******trying to get a free distributed lock********");
    try {
      LockRequest lockRequest = new LockRequest(LOCK_STORE_NAME, "resource1", "owner1", 5);
      Mono<Boolean> result = client.tryLock(lockRequest);
      System.out.println("Lock result -> " + (Boolean.TRUE.equals(result.block()) ? "SUCCESS" : "FAIL"));
    } catch (Exception ex) {
      System.out.println(ex.getMessage());
    }
  }

  /**
   * Unlock a lock.
   *
   * @param client DaprPreviewClient object
   */
  public static void unlock(DaprPreviewClient client) {
    System.out.println("*******unlock a distributed lock********");
    try {
      UnlockRequest unlockRequest = new UnlockRequest(LOCK_STORE_NAME, "resource1", "owner1");
      Mono<UnlockResponseStatus> result = client.unlock(unlockRequest);
      System.out.println("Unlock result -> " + result.block().name());
    } catch (Exception ex) {
      System.out.println(ex.getMessage());
    }
  }
}

Workflow

package io.dapr.examples.workflows;

import io.dapr.workflows.client.DaprWorkflowClient;
import io.dapr.workflows.client.WorkflowInstanceStatus;

import java.time.Duration;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

/**
 * For setup instructions, see the README.
 */
public class DemoWorkflowClient {

  /**
   * The main method.
   *
   * @param args Input arguments (unused).
   * @throws InterruptedException If program has been interrupted.
   */
  public static void main(String[] args) throws InterruptedException {
    DaprWorkflowClient client = new DaprWorkflowClient();

    try (client) {
      String separatorStr = "*******";
      System.out.println(separatorStr);
      String instanceId = client.scheduleNewWorkflow(DemoWorkflow.class, "input data");
      System.out.printf("Started new workflow instance with random ID: %s%n", instanceId);

      System.out.println(separatorStr);
      System.out.println("**GetInstanceMetadata:Running Workflow**");
      WorkflowInstanceStatus workflowMetadata = client.getInstanceState(instanceId, true);
      System.out.printf("Result: %s%n", workflowMetadata);

      System.out.println(separatorStr);
      System.out.println("**WaitForInstanceStart**");
      try {
        WorkflowInstanceStatus waitForInstanceStartResult =
            client.waitForInstanceStart(instanceId, Duration.ofSeconds(60), true);
        System.out.printf("Result: %s%n", waitForInstanceStartResult);
      } catch (TimeoutException ex) {
        System.out.printf("waitForInstanceStart has an exception:%s%n", ex);
      }

      System.out.println(separatorStr);
      System.out.println("**SendExternalMessage**");
      client.raiseEvent(instanceId, "TestEvent", "TestEventPayload");

      System.out.println(separatorStr);
      System.out.println("** Registering parallel Events to be captured by allOf(t1,t2,t3) **");
      client.raiseEvent(instanceId, "event1", "TestEvent 1 Payload");
      client.raiseEvent(instanceId, "event2", "TestEvent 2 Payload");
      client.raiseEvent(instanceId, "event3", "TestEvent 3 Payload");
      System.out.printf("Events raised for workflow with instanceId: %s\n", instanceId);

      System.out.println(separatorStr);
      System.out.println("** Registering Event to be captured by anyOf(t1,t2,t3) **");
      client.raiseEvent(instanceId, "e2", "event 2 Payload");
      System.out.printf("Event raised for workflow with instanceId: %s\n", instanceId);


      System.out.println(separatorStr);
      System.out.println("**WaitForInstanceCompletion**");
      try {
        WorkflowInstanceStatus waitForInstanceCompletionResult =
            client.waitForInstanceCompletion(instanceId, Duration.ofSeconds(60), true);
        System.out.printf("Result: %s%n", waitForInstanceCompletionResult);
      } catch (TimeoutException ex) {
        System.out.printf("waitForInstanceCompletion has an exception:%s%n", ex);
      }

      System.out.println(separatorStr);
      System.out.println("**purgeInstance**");
      boolean purgeResult = client.purgeInstance(instanceId);
      System.out.printf("purgeResult: %s%n", purgeResult);

      System.out.println(separatorStr);
      System.out.println("**raiseEvent**");

      String eventInstanceId = client.scheduleNewWorkflow(DemoWorkflow.class);
      System.out.printf("Started new workflow instance with random ID: %s%n", eventInstanceId);
      client.raiseEvent(eventInstanceId, "TestException", null);
      System.out.printf("Event raised for workflow with instanceId: %s\n", eventInstanceId);

      System.out.println(separatorStr);
      String instanceToTerminateId = "terminateMe";
      client.scheduleNewWorkflow(DemoWorkflow.class, null, instanceToTerminateId);
      System.out.printf("Started new workflow instance with specified ID: %s%n", instanceToTerminateId);

      TimeUnit.SECONDS.sleep(5);
      System.out.println("Terminate this workflow instance manually before the timeout is reached");
      client.terminateWorkflow(instanceToTerminateId, null);
      System.out.println(separatorStr);

      String restartingInstanceId = "restarting";
      client.scheduleNewWorkflow(DemoWorkflow.class, null, restartingInstanceId);
      System.out.printf("Started new workflow instance with ID: %s%n", restartingInstanceId);
      System.out.println("Sleeping 30 seconds to restart the workflow");
      TimeUnit.SECONDS.sleep(30);

      System.out.println("**SendExternalMessage: RestartEvent**");
      client.raiseEvent(restartingInstanceId, "RestartEvent", "RestartEventPayload");

      System.out.println("Sleeping 30 seconds to terminate the eternal workflow");
      TimeUnit.SECONDS.sleep(30);
      client.terminateWorkflow(restartingInstanceId, null);
    }

    System.out.println("Exiting DemoWorkflowClient.");
    System.exit(0);
  }
}

Sidecar APIs

Wait for sidecar

The DaprClient also provides a helper method to wait for the sidecar to become healthy (components only). When using this method, be sure to specify a timeout in milliseconds and block() to wait for the result of a reactive operation.

// Wait for the Dapr sidecar to report healthy before attempting to use Dapr components.
try (DaprClient client = new DaprClientBuilder().build()) {
  System.out.println("Waiting for Dapr sidecar ...");
  client.waitForSidecar(10000).block(); // Specify the timeout in milliseconds
  System.out.println("Dapr sidecar is ready.");

  // Perform Dapr component operations here, i.e. fetching secrets or saving state.
  ...
}

Shutdown the sidecar

try (DaprClient client = new DaprClientBuilder().build()) {
  logger.info("Sending shutdown request.");
  client.shutdown().block();
  logger.info("Ensuring dapr has stopped.");
  ...
}

Learn more about the Dapr Java SDK packages available to add to your Java applications.

For a full list of SDK properties and how to configure them, visit Properties.

3.2.1 - Properties

SDK-wide properties for configuring the Dapr Java SDK using environment variables and system properties

Properties

The Dapr Java SDK provides a set of global properties that control the behavior of the SDK. These properties can be configured using environment variables or system properties. System properties can be set using the -D flag when running your Java application.

These properties affect the entire SDK, including clients and runtime. They control aspects such as:

  • Sidecar connectivity (endpoints, ports)
  • Security settings (TLS, API tokens)
  • Performance tuning (timeouts, connection pools)
  • Protocol settings (gRPC, HTTP)
  • String encoding

Environment Variables

The following environment variables are available for configuring the Dapr Java SDK:

Sidecar Endpoints

When these variables are set, the client will automatically use them to connect to the Dapr sidecar.

Environment VariableDescriptionDefault
DAPR_GRPC_ENDPOINTThe gRPC endpoint for the Dapr sidecarlocalhost:50001
DAPR_HTTP_ENDPOINTThe HTTP endpoint for the Dapr sidecarlocalhost:3500
DAPR_GRPC_PORTThe gRPC port for the Dapr sidecar (legacy, DAPR_GRPC_ENDPOINT takes precedence)50001
DAPR_HTTP_PORTThe HTTP port for the Dapr sidecar (legacy, DAPR_HTTP_ENDPOINT takes precedence)3500
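For example, you can point the SDK at a specific sidecar before starting your application. The endpoint values and jar name below are illustrative:

```shell
# Point the Java SDK at a specific sidecar via environment variables.
export DAPR_GRPC_ENDPOINT=localhost:50001
export DAPR_HTTP_ENDPOINT=http://localhost:3500

java -jar myapp.jar
```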

API Token

Environment VariableDescriptionDefault
DAPR_API_TOKENAPI token for authentication between app and Dapr sidecar. This is the same token used by the Dapr runtime for API authentication. For more details, see Dapr API token authentication and Environment variables reference.null

gRPC Configuration

TLS Settings

For secure gRPC communication, you can configure TLS settings using the following environment variables:

Environment VariableDescriptionDefault
DAPR_GRPC_TLS_INSECUREWhen set to “true”, enables insecure TLS mode which still uses TLS but doesn’t verify certificates. This uses InsecureTrustManagerFactory to trust all certificates. This should only be used for testing or in secure environments.false
DAPR_GRPC_TLS_CA_PATHPath to the CA certificate file. This is used for TLS connections to servers with self-signed certificates.null
DAPR_GRPC_TLS_CERT_PATHPath to the TLS certificate file for client authentication.null
DAPR_GRPC_TLS_KEY_PATHPath to the TLS private key file for client authentication.null
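As a sketch of how these fit together, a mutual-TLS setup might export all three certificate paths before launching the application. The file paths and jar name below are placeholders for your own certificate locations:

```shell
# Illustrative certificate paths; replace with your own.
export DAPR_GRPC_TLS_CA_PATH=/etc/dapr/certs/ca.pem
export DAPR_GRPC_TLS_CERT_PATH=/etc/dapr/certs/client.pem
export DAPR_GRPC_TLS_KEY_PATH=/etc/dapr/certs/client-key.pem

java -jar myapp.jar
```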

Keepalive Settings

Configure gRPC keepalive behavior using these environment variables:

Environment VariableDescriptionDefault
DAPR_GRPC_ENABLE_KEEP_ALIVEWhether to enable gRPC keepalivefalse
DAPR_GRPC_KEEP_ALIVE_TIME_SECONDSgRPC keepalive time in seconds10
DAPR_GRPC_KEEP_ALIVE_TIMEOUT_SECONDSgRPC keepalive timeout in seconds5
DAPR_GRPC_KEEP_ALIVE_WITHOUT_CALLSWhether to keep gRPC connection alive without callstrue

Inbound Message Settings

Configure gRPC inbound message settings using these environment variables:

Environment VariableDescriptionDefault
DAPR_GRPC_MAX_INBOUND_MESSAGE_SIZE_BYTESDapr’s maximum inbound message size for gRPC in bytes. This value sets the maximum size of a gRPC message that can be received by the application4194304
DAPR_GRPC_MAX_INBOUND_METADATA_SIZE_BYTESDapr’s maximum inbound metadata size for gRPC in bytes8192

HTTP Client Configuration

These properties control the behavior of the HTTP client used for communication with the Dapr sidecar:

Environment VariableDescriptionDefault
DAPR_HTTP_CLIENT_READ_TIMEOUT_SECONDSTimeout in seconds for HTTP client read operations. This is the maximum time to wait for a response from the Dapr sidecar.60
DAPR_HTTP_CLIENT_MAX_REQUESTSMaximum number of concurrent HTTP requests that can be executed. Above this limit, requests will queue in memory waiting for running calls to complete.1024
DAPR_HTTP_CLIENT_MAX_IDLE_CONNECTIONSMaximum number of idle connections in the HTTP connection pool. This is the maximum number of connections that can remain idle in the pool.128

API Configuration

These properties control the behavior of API calls made through the SDK:

Environment VariableDescriptionDefault
DAPR_API_MAX_RETRIESMaximum number of retries for retriable exceptions when making API calls to the Dapr sidecar0
DAPR_API_TIMEOUT_MILLISECONDSTimeout in milliseconds for API calls to the Dapr sidecar. A value of 0 means no timeout.0
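To illustrate what a retry count means in practice, the following is a minimal, self-contained sketch of the retry semantics described above (an initial attempt plus up to `maxRetries` retries on a retriable exception). This is not the SDK's internal implementation; `callWithRetries` is a hypothetical helper:

```java
import java.util.concurrent.Callable;

public class RetrySketch {

    // Hypothetical helper mirroring the documented DAPR_API_MAX_RETRIES
    // semantics: attempt the call once, then retry up to maxRetries more
    // times if it throws; rethrow the last exception when exhausted.
    static <T> T callWithRetries(Callable<T> call, int maxRetries) throws Exception {
        Exception last = null;
        for (int attempt = 0; attempt <= maxRetries; attempt++) {
            try {
                return call.call();
            } catch (Exception e) {
                last = e; // retriable failure: try again until attempts run out
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        int[] failures = {2}; // simulate a call that fails twice, then succeeds
        String result = callWithRetries(() -> {
            if (failures[0]-- > 0) {
                throw new RuntimeException("transient error");
            }
            return "ok";
        }, 3);
        System.out.println(result); // prints "ok" after two retried failures
    }
}
```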

String Encoding

Environment VariableDescriptionDefault
DAPR_STRING_CHARSETCharacter set used for string encoding/decoding in the SDK. Must be a valid Java charset name.UTF-8

System Properties

All environment variables can be set as system properties using the -D flag. Here is the complete list of available system properties:

System PropertyDescriptionDefault
dapr.sidecar.ipIP address for the Dapr sidecarlocalhost
dapr.http.portHTTP port for the Dapr sidecar3500
dapr.grpc.portgRPC port for the Dapr sidecar50001
dapr.grpc.tls.cert.pathPath to the gRPC TLS certificatenull
dapr.grpc.tls.key.pathPath to the gRPC TLS keynull
dapr.grpc.tls.ca.pathPath to the gRPC TLS CA certificatenull
dapr.grpc.tls.insecureWhether to use insecure TLS modefalse
dapr.grpc.endpointgRPC endpoint for remote sidecarnull
dapr.grpc.enable.keep.aliveWhether to enable gRPC keepalivefalse
dapr.grpc.keep.alive.time.secondsgRPC keepalive time in seconds10
dapr.grpc.keep.alive.timeout.secondsgRPC keepalive timeout in seconds5
dapr.grpc.keep.alive.without.callsWhether to keep gRPC connection alive without callstrue
dapr.http.endpointHTTP endpoint for remote sidecarnull
dapr.api.maxRetriesMaximum number of retries for API calls0
dapr.api.timeoutMillisecondsTimeout for API calls in milliseconds0
dapr.api.tokenAPI token for authenticationnull
dapr.string.charsetString encoding used in the SDKUTF-8
dapr.http.client.readTimeoutSecondsTimeout in seconds for HTTP client reads60
dapr.http.client.maxRequestsMaximum number of concurrent HTTP requests1024
dapr.http.client.maxIdleConnectionsMaximum number of idle HTTP connections128

Property Resolution Order

Properties are resolved in the following order:

  1. Override values (if provided when creating a Properties instance)
  2. System properties (set via -D)
  3. Environment variables
  4. Default values

The SDK checks each source in order. If a value is invalid for the property type (e.g., non-numeric for a numeric property), the SDK will log a warning and try the next source. For example:

# Invalid boolean value - will be ignored
java -Ddapr.grpc.enable.keep.alive=not-a-boolean -jar myapp.jar

# Valid boolean value - will be used
export DAPR_GRPC_ENABLE_KEEP_ALIVE=false

In this case, the environment variable is used because the system property value is invalid. However, if both values are valid, the system property takes precedence:

# Valid boolean value - will be used
java -Ddapr.grpc.enable.keep.alive=true -jar myapp.jar

# Valid boolean value - will be ignored
export DAPR_GRPC_ENABLE_KEEP_ALIVE=false
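The resolution and validation behavior described above can be sketched as follows. This is a simplified stand-in, not the SDK's actual implementation; the helper name and validator are illustrative:

```java
import java.util.Map;
import java.util.function.Predicate;

// Simplified sketch of the four-step resolution order with validation fallback.
public class PropertyResolutionSketch {

  static String resolve(String key, String envKey, Map<String, String> overrides,
                        Predicate<String> isValid, String defaultValue) {
    String v = overrides.get(key);                 // 1. Override values
    if (v != null && isValid.test(v)) return v;
    v = System.getProperty(key);                   // 2. System properties (-D)
    if (v != null && isValid.test(v)) return v;
    v = System.getenv(envKey);                     // 3. Environment variables
    if (v != null && isValid.test(v)) return v;
    return defaultValue;                           // 4. Default value
  }

  public static void main(String[] args) {
    // An invalid system property is skipped, falling through to the next source.
    System.setProperty("dapr.grpc.enable.keep.alive", "not-a-boolean");
    Predicate<String> isBoolean =
        s -> s.equalsIgnoreCase("true") || s.equalsIgnoreCase("false");
    String value = resolve("dapr.grpc.enable.keep.alive",
        "DAPR_GRPC_ENABLE_KEEP_ALIVE", Map.of(), isBoolean, "false");
    System.out.println(value);
  }
}
```

With no override and no environment variable set, the invalid system property is skipped and the default is returned.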

Override values can be set using the DaprClientBuilder in two ways:

  1. Using individual property overrides (recommended for most cases):
import io.dapr.config.Properties;

// Set a single property override
DaprClient client = new DaprClientBuilder()
    .withPropertyOverride(Properties.GRPC_ENABLE_KEEP_ALIVE, "true")
    .build();

// Or set multiple property overrides
DaprClient client = new DaprClientBuilder()
    .withPropertyOverride(Properties.GRPC_ENABLE_KEEP_ALIVE, "true")
    .withPropertyOverride(Properties.HTTP_CLIENT_READ_TIMEOUT_SECONDS, "120")
    .build();
  2. Using a Properties instance (useful when you have many properties to set at once):
// Create a map of property overrides
Map<String, String> overrides = new HashMap<>();
overrides.put("dapr.grpc.enable.keep.alive", "true");
overrides.put("dapr.http.client.readTimeoutSeconds", "120");

// Create a Properties instance with overrides
Properties properties = new Properties(overrides);

// Use these properties when creating a client
DaprClient client = new DaprClientBuilder()
    .withProperties(properties)
    .build();

For most use cases, you’ll use system properties or environment variables. Override values are primarily used when you need different property values for different instances of the SDK in the same application.

Proxy Configuration

You can configure proxy settings for your Java application using system properties. These are standard Java system properties that are part of Java’s networking layer (java.net package), not specific to Dapr. They are used by Java’s networking stack, including the HTTP client that Dapr’s SDK uses.

For detailed information about Java’s proxy configuration, including all available properties and their usage, see the Java Networking Properties documentation.

For example, here’s how to configure a proxy:

# Configure HTTP proxy - replace with your actual proxy server details
java -Dhttp.proxyHost=your-proxy-server.com -Dhttp.proxyPort=8080 -jar myapp.jar

# Configure HTTPS proxy - replace with your actual proxy server details
java -Dhttps.proxyHost=your-proxy-server.com -Dhttps.proxyPort=8443 -jar myapp.jar

Replace your-proxy-server.com with your actual proxy server hostname or IP address, and adjust the port numbers to match your proxy server configuration.

These proxy settings will affect all HTTP/HTTPS connections made by your Java application, including connections to the Dapr sidecar.
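The same standard java.net properties can also be set programmatically at startup, before any HTTP connections are created. The host and port below are placeholders, as in the command-line examples:

```java
// Equivalent to passing -Dhttp.proxyHost and -Dhttp.proxyPort on the command line.
public class ProxySetup {
  public static void main(String[] args) {
    System.setProperty("http.proxyHost", "your-proxy-server.com");
    System.setProperty("http.proxyPort", "8080");
    System.out.println("Proxy: " + System.getProperty("http.proxyHost")
        + ":" + System.getProperty("http.proxyPort"));
  }
}
```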

3.3 - Jobs

With the Dapr Jobs package, you can interact with the Dapr Jobs APIs from a Java application to trigger future operations to run according to a predefined schedule with an optional payload. To get started, walk through the Dapr Jobs how-to guide.

3.3.1 - How to: Author and manage Dapr Jobs in the Java SDK

How to get up and running with Jobs using the Dapr Java SDK

As part of this demonstration we will schedule a Dapr Job. The scheduled job will trigger an endpoint registered in the same app. With the provided jobs example, you will:

This example uses the default configuration from dapr init in self-hosted mode.

Prerequisites

Set up the environment

Clone the Java SDK repo and navigate into it.

git clone https://github.com/dapr/java-sdk.git
cd java-sdk

Run the following command to install the requirements for running the jobs example with the Dapr Java SDK.

mvn clean install -DskipTests

From the Java SDK root directory, navigate to the examples directory.

cd examples

Run the Dapr sidecar.

dapr run --app-id jobsapp --dapr-grpc-port 51439 --dapr-http-port 3500 --app-port 8080

Now, Dapr is listening for HTTP requests at http://localhost:3500 and internal Jobs gRPC requests at localhost:51439.

Schedule and Get a job

In the DemoJobsClient there are steps to schedule a job. Calling scheduleJob using the DaprPreviewClient will schedule a job with the Dapr Runtime.

public class DemoJobsClient {

  /**
   * The main method of this app to schedule and get jobs.
   */
  public static void main(String[] args) throws Exception {
    // Property overrides pointing the client at the ports used by `dapr run` above.
    Map<Property<?>, String> overrides = Map.of(
        Properties.HTTP_PORT, "3500",
        Properties.GRPC_PORT, "51439");

    try (DaprPreviewClient client = new DaprClientBuilder().withPropertyOverrides(overrides).buildPreviewClient()) {

      // Schedule a job.
      System.out.println("**** Scheduling a Job with name dapr-jobs-1 *****");
      ScheduleJobRequest scheduleJobRequest = new ScheduleJobRequest("dapr-job-1",
          JobSchedule.fromString("* * * * * *")).setData("Hello World!".getBytes());
      client.scheduleJob(scheduleJobRequest).block();

      System.out.println("**** Scheduling job dapr-jobs-1 completed *****");
    }
  }
}
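The schedule string * * * * * * passed to JobSchedule.fromString is a six-field cron expression with a leading seconds field, so this job fires every second. A quick illustration of how the fields map (the field names reflect standard cron semantics, not an SDK API):

```java
// Label each field of the six-field (seconds-first) cron expression.
public class CronFieldsDemo {
  public static void main(String[] args) {
    String schedule = "* * * * * *";
    String[] names = {"second", "minute", "hour", "day-of-month", "month", "day-of-week"};
    String[] fields = schedule.split(" ");
    for (int i = 0; i < fields.length; i++) {
      System.out.println(names[i] + ": " + fields[i]);
    }
  }
}
```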

Call getJob to retrieve the job details that were previously created and scheduled.

client.getJob(new GetJobRequest("dapr-job-1")).block()

Run the DemoJobsClient with the following command.

java -jar target/dapr-java-sdk-examples-exec.jar io.dapr.examples.jobs.DemoJobsClient

Sample output

**** Scheduling a Job with name dapr-jobs-1 *****
**** Scheduling job dapr-jobs-1 completed *****
**** Retrieving a Job with name dapr-jobs-1 *****

Set up an endpoint to be invoked when the job is triggered

The DemoJobsSpringApplication class starts a Spring Boot application that registers the endpoints specified in the JobsController class. This endpoint acts as the callback for the scheduled job requests.

@RestController
public class JobsController {

  /**
   * Handles jobs callback from Dapr.
   *
   * @param jobName name of the job.
   * @param payload data from the job if payload exists.
   * @return Empty Mono.
   */
  @PostMapping("/job/{jobName}")
  public Mono<Void> handleJob(@PathVariable("jobName") String jobName,
                              @RequestBody(required = false) byte[] payload) {
    System.out.println("Job Name: " + jobName);
    // Guard against a null payload, since the request body is optional.
    System.out.println("Job Payload: " + (payload == null ? "<empty>" : new String(payload)));

    return Mono.empty();
  }
}

Parameters:

  • jobName: The name of the triggered job.
  • payload: Optional payload data associated with the job (as a byte array).

Run the Spring Boot application with the following command.

java -jar target/dapr-java-sdk-examples-exec.jar io.dapr.examples.jobs.DemoJobsSpringApplication

Sample output

Job Name: dapr-job-1
Job Payload: Hello World!

Delete a scheduled job

public class DemoJobsClient {

  /**
   * The main method of this app deletes a job that was previously scheduled.
   */
  public static void main(String[] args) throws Exception {
    try (DaprPreviewClient client = new DaprClientBuilder().buildPreviewClient()) {

      // Delete a job.
      System.out.println("**** Delete a Job with name dapr-jobs-1 *****");
      client.deleteJob(new DeleteJobRequest("dapr-job-1")).block();
    }
  }
}

Next steps

3.4 - Workflow

How to get up and running with the Dapr Workflow extension

3.4.1 - How to: Author and manage Dapr Workflow in the Java SDK

How to get up and running with workflows using the Dapr Java SDK

Let’s create a Dapr workflow and invoke it using the console. With the provided workflow example, you will:

This example uses the default configuration from dapr init in self-hosted mode.

Prerequisites

  • Verify you’re using the latest proto bindings

Set up the environment

Clone the Java SDK repo and navigate into it.

git clone https://github.com/dapr/java-sdk.git
cd java-sdk

Run the following command to install the requirements for running this workflow sample with the Dapr Java SDK.

mvn clean install

From the Java SDK root directory, navigate to the Dapr Workflow example.

cd examples

Run the DemoWorkflowWorker

The DemoWorkflowWorker class registers an implementation of DemoWorkflow in Dapr’s workflow runtime engine. In the DemoWorkflowWorker.java file, you can find the DemoWorkflowWorker class and the main method:

public class DemoWorkflowWorker {

  public static void main(String[] args) throws Exception {
    // Register the Workflow with the runtime.
    WorkflowRuntime.getInstance().registerWorkflow(DemoWorkflow.class);
    System.out.println("Start workflow runtime");
    WorkflowRuntime.getInstance().startAndBlock();
    System.exit(0);
  }
}

In the code above:

  • WorkflowRuntime.getInstance().registerWorkflow() registers DemoWorkflow as a workflow in the Dapr Workflow runtime.
  • WorkflowRuntime.getInstance().startAndBlock() builds and starts the engine within the Dapr Workflow runtime, blocking until the runtime shuts down.

In the terminal, execute the following command to kick off the DemoWorkflowWorker:

dapr run --app-id demoworkflowworker --resources-path ./components/workflows --dapr-grpc-port 50001 -- java -jar target/dapr-java-sdk-examples-exec.jar io.dapr.examples.workflows.DemoWorkflowWorker

Expected output

You're up and running! Both Dapr and your app logs will appear here.

...

== APP == Start workflow runtime
== APP == Sep 13, 2023 9:02:03 AM com.microsoft.durabletask.DurableTaskGrpcWorker startAndBlock
== APP == INFO: Durable Task worker is connecting to sidecar at 127.0.0.1:50001.

Run the DemoWorkflowClient

The DemoWorkflowClient starts instances of workflows that have been registered with Dapr.

public class DemoWorkflowClient {

  // ...
  public static void main(String[] args) throws InterruptedException {
    DaprWorkflowClient client = new DaprWorkflowClient();

    try (client) {
      String separatorStr = "*******";
      System.out.println(separatorStr);
      String instanceId = client.scheduleNewWorkflow(DemoWorkflow.class, "input data");
      System.out.printf("Started new workflow instance with random ID: %s%n", instanceId);

      System.out.println(separatorStr);
      System.out.println("**GetInstanceMetadata:Running Workflow**");
      WorkflowInstanceStatus workflowMetadata = client.getInstanceState(instanceId, true);
      System.out.printf("Result: %s%n", workflowMetadata);

      System.out.println(separatorStr);
      System.out.println("**WaitForInstanceStart**");
      try {
        WorkflowInstanceStatus waitForInstanceStartResult =
            client.waitForInstanceStart(instanceId, Duration.ofSeconds(60), true);
        System.out.printf("Result: %s%n", waitForInstanceStartResult);
      } catch (TimeoutException ex) {
        System.out.printf("waitForInstanceStart has an exception:%s%n", ex);
      }

      System.out.println(separatorStr);
      System.out.println("**SendExternalMessage**");
      client.raiseEvent(instanceId, "TestEvent", "TestEventPayload");

      System.out.println(separatorStr);
      System.out.println("** Registering parallel Events to be captured by allOf(t1,t2,t3) **");
      client.raiseEvent(instanceId, "event1", "TestEvent 1 Payload");
      client.raiseEvent(instanceId, "event2", "TestEvent 2 Payload");
      client.raiseEvent(instanceId, "event3", "TestEvent 3 Payload");
      System.out.printf("Events raised for workflow with instanceId: %s\n", instanceId);

      System.out.println(separatorStr);
      System.out.println("** Registering Event to be captured by anyOf(t1,t2,t3) **");
      client.raiseEvent(instanceId, "e2", "event 2 Payload");
      System.out.printf("Event raised for workflow with instanceId: %s\n", instanceId);


      System.out.println(separatorStr);
      System.out.println("**WaitForInstanceCompletion**");
      try {
        WorkflowInstanceStatus waitForInstanceCompletionResult =
            client.waitForInstanceCompletion(instanceId, Duration.ofSeconds(60), true);
        System.out.printf("Result: %s%n", waitForInstanceCompletionResult);
      } catch (TimeoutException ex) {
        System.out.printf("waitForInstanceCompletion has an exception:%s%n", ex);
      }

      System.out.println(separatorStr);
      System.out.println("**purgeInstance**");
      boolean purgeResult = client.purgeInstance(instanceId);
      System.out.printf("purgeResult: %s%n", purgeResult);

      System.out.println(separatorStr);
      System.out.println("**raiseEvent**");

      String eventInstanceId = client.scheduleNewWorkflow(DemoWorkflow.class);
      System.out.printf("Started new workflow instance with random ID: %s%n", eventInstanceId);
      client.raiseEvent(eventInstanceId, "TestException", null);
      System.out.printf("Event raised for workflow with instanceId: %s\n", eventInstanceId);

      System.out.println(separatorStr);
      String instanceToTerminateId = "terminateMe";
      client.scheduleNewWorkflow(DemoWorkflow.class, null, instanceToTerminateId);
      System.out.printf("Started new workflow instance with specified ID: %s%n", instanceToTerminateId);

      TimeUnit.SECONDS.sleep(5);
      System.out.println("Terminate this workflow instance manually before the timeout is reached");
      client.terminateWorkflow(instanceToTerminateId, null);
      System.out.println(separatorStr);

      String restartingInstanceId = "restarting";
      client.scheduleNewWorkflow(DemoWorkflow.class, null, restartingInstanceId);
      System.out.printf("Started new  workflow instance with ID: %s%n", restartingInstanceId);
      System.out.println("Sleeping 30 seconds to restart the workflow");
      TimeUnit.SECONDS.sleep(30);

      System.out.println("**SendExternalMessage: RestartEvent**");
      client.raiseEvent(restartingInstanceId, "RestartEvent", "RestartEventPayload");

      System.out.println("Sleeping 30 seconds to terminate the eternal workflow");
      TimeUnit.SECONDS.sleep(30);
      client.terminateWorkflow(restartingInstanceId, null);
    }

    System.out.println("Exiting DemoWorkflowClient.");
    System.exit(0);
  }
}

In a second terminal window, start the workflow by running the following command:

java -jar target/dapr-java-sdk-examples-exec.jar io.dapr.examples.workflows.DemoWorkflowClient

Expected output

*******
Started new workflow instance with random ID: 0b4cc0d5-413a-4c1c-816a-a71fa24740d4
*******
**GetInstanceMetadata:Running Workflow**
Result: [Name: 'io.dapr.examples.workflows.DemoWorkflow', ID: '0b4cc0d5-413a-4c1c-816a-a71fa24740d4', RuntimeStatus: RUNNING, CreatedAt: 2023-09-13T13:02:30.547Z, LastUpdatedAt: 2023-09-13T13:02:30.699Z, Input: '"input data"', Output: '']
*******
**WaitForInstanceStart**
Result: [Name: 'io.dapr.examples.workflows.DemoWorkflow', ID: '0b4cc0d5-413a-4c1c-816a-a71fa24740d4', RuntimeStatus: RUNNING, CreatedAt: 2023-09-13T13:02:30.547Z, LastUpdatedAt: 2023-09-13T13:02:30.699Z, Input: '"input data"', Output: '']
*******
**SendExternalMessage**
*******
** Registering parallel Events to be captured by allOf(t1,t2,t3) **
Events raised for workflow with instanceId: 0b4cc0d5-413a-4c1c-816a-a71fa24740d4
*******
** Registering Event to be captured by anyOf(t1,t2,t3) **
Event raised for workflow with instanceId: 0b4cc0d5-413a-4c1c-816a-a71fa24740d4
*******
**WaitForInstanceCompletion**
Result: [Name: 'io.dapr.examples.workflows.DemoWorkflow', ID: '0b4cc0d5-413a-4c1c-816a-a71fa24740d4', RuntimeStatus: FAILED, CreatedAt: 2023-09-13T13:02:30.547Z, LastUpdatedAt: 2023-09-13T13:02:55.054Z, Input: '"input data"', Output: '']
*******
**purgeInstance**
purgeResult: true
*******
**raiseEvent**
Started new workflow instance with random ID: 7707d141-ebd0-4e54-816e-703cb7a52747
Event raised for workflow with instanceId: 7707d141-ebd0-4e54-816e-703cb7a52747
*******
Started new workflow instance with specified ID: terminateMe
Terminate this workflow instance manually before the timeout is reached
*******
Started new  workflow instance with ID: restarting
Sleeping 30 seconds to restart the workflow
**SendExternalMessage: RestartEvent**
Sleeping 30 seconds to terminate the eternal workflow
Exiting DemoWorkflowClient.

What happened?

  1. When you ran dapr run, the workflow worker registered the workflow (DemoWorkflow) and its activities to the Dapr Workflow engine.
  2. When you ran java, the workflow client started the workflow instance with the following activities. You can follow along with the output in the terminal where you ran dapr run.
    1. The workflow is started, raises three parallel tasks, and waits for them to complete.
    2. The workflow client calls the activity and sends the “Hello Activity” message to the console.
    3. The workflow times out and is purged.
    4. The workflow client starts two more instances: one with the specified ID terminateMe, which it terminates manually, and one with the ID restarting, an eternal workflow that it restarts via an event and then terminates.
    5. The workflow client then exits.

Next steps

3.5 - Getting started with Dapr and Spring Boot

How to get started with Dapr and Spring Boot

By combining Dapr and Spring Boot, we can create infrastructure independent Java applications that can be deployed across different environments, supporting a wide range of on-premises and cloud provider services.

First, we will start with a simple integration covering the DaprClient and the Testcontainers integration, and then use the Spring and Spring Boot mechanisms and programming model to leverage the Dapr APIs under the hood. This helps teams remove dependencies such as clients and drivers required to connect to environment-specific infrastructure (databases, key-value stores, message brokers, configuration/secret stores, etc.).

Adding the Dapr and Spring Boot integration to your project

If you already have a Spring Boot application, you can directly add the following dependencies to your project:

	<dependency>
        <groupId>io.dapr.spring</groupId>
		<artifactId>dapr-spring-boot-starter</artifactId>
		<version>0.15.0</version>
	</dependency>
	<dependency>
		<groupId>io.dapr.spring</groupId>
		<artifactId>dapr-spring-boot-starter-test</artifactId>
		<version>0.15.0</version>
		<scope>test</scope>
	</dependency>

You can find the latest released version here.

By adding these dependencies, you can:

  • Autowire a DaprClient to use inside your applications
  • Use the Spring Data and Messaging abstractions and programming model that uses the Dapr APIs under the hood
  • Improve your inner-development loop by relying on Testcontainers to bootstrap Dapr Control plane services and default components

Once these dependencies are in your application, you can rely on Spring Boot autoconfiguration to autowire a DaprClient instance:

@Autowired
private DaprClient daprClient;

This will connect to the default Dapr gRPC endpoint localhost:50001, requiring you to start Dapr outside of your application.

You can use the DaprClient to interact with the Dapr APIs anywhere in your application, for example from inside a REST endpoint:

@RestController
public class DemoRestController {
  @Autowired
  private DaprClient daprClient;

  @PostMapping("/store")
  public void storeOrder(@RequestBody Order order){
    daprClient.saveState("kvstore", order.orderId(), order).block();
  }
}

record Order(String orderId, Integer amount){}

If you want to avoid managing Dapr outside of your Spring Boot application, you can rely on Testcontainers to bootstrap Dapr beside your application for development purposes. To do this we can create a test configuration that uses Testcontainers to bootstrap all we need to develop our applications using the Dapr APIs.

Using Testcontainers and Dapr integrations, we let the @TestConfiguration bootstrap Dapr for our applications. Notice that for this example, we are configuring Dapr with a Statestore component called kvstore that connects to an instance of PostgreSQL also bootstrapped by Testcontainers.

@TestConfiguration(proxyBeanMethods = false)
public class DaprTestContainersConfig {
  @Bean
  @ServiceConnection
  public DaprContainer daprContainer(Network daprNetwork, PostgreSQLContainer<?> postgreSQLContainer){
    
    return new DaprContainer("daprio/daprd:1.16.0-rc.3")
            .withAppName("producer-app")
            .withNetwork(daprNetwork)
            .withComponent(new Component("kvstore", "state.postgresql", "v1", STATE_STORE_PROPERTIES))
            .withComponent(new Component("kvbinding", "bindings.postgresql", "v1", BINDING_PROPERTIES))
            .dependsOn(postgreSQLContainer);
  }
}

Inside the test classpath you can add a new Spring Boot Application that uses this configuration for tests:

@SpringBootApplication
public class TestProducerApplication {

  public static void main(String[] args) {

    SpringApplication
            .from(ProducerApplication::main)
            .with(DaprTestContainersConfig.class)
            .run(args);
  }
  
}

Now you can start your application with:

mvn spring-boot:test-run

Running this command will start the application, using the provided test configuration that includes the Testcontainers and Dapr integration. In the logs you should be able to see that the daprd and the placement service containers were started for your application.

Beyond the previous configuration (DaprTestContainersConfig), your tests shouldn't test Dapr itself, only the REST endpoints that your application exposes.

Leveraging Spring & Spring Boot programming model with Dapr

The Java SDK allows you to interface with all of the Dapr building blocks. But if you want to leverage the Spring and Spring Boot programming model you can use the dapr-spring-boot-starter integration. This includes implementations of Spring Data (KeyValueTemplate and CrudRepository) as well as a DaprMessagingTemplate for producing and consuming messages (similar to Spring Kafka, Spring Pulsar and Spring AMQP for RabbitMQ) and Dapr workflows.

Using Spring Data CrudRepository and KeyValueTemplate

You can use well known Spring Data constructs relying on a Dapr-based implementation. With Dapr, you don’t need to add any infrastructure-related driver or client, making your Spring application lighter and decoupled from the environment where it is running.

Under the hood these implementations use the Dapr Statestore and Binding APIs.

Configuration parameters

With Spring Data abstractions you can configure which statestore and bindings will be used by Dapr to connect to the available infrastructure. This can be done by setting the following properties:

dapr.statestore.name=kvstore
dapr.statestore.binding=kvbinding

Then you can @Autowire a KeyValueTemplate or a CrudRepository like this:

@RestController
@EnableDaprRepositories
public class OrdersRestController {
  @Autowired
  private OrderRepository repository;
  
  @PostMapping("/orders")
  public void storeOrder(@RequestBody Order order){
    repository.save(order);
  }

  @GetMapping("/orders")
  public Iterable<Order> getAll(){
    return repository.findAll();
  }


}

Where OrderRepository is defined in an interface that extends the Spring Data CrudRepository interface:

public interface OrderRepository extends CrudRepository<Order, String> {}
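Conceptually, each CrudRepository call is translated into a Dapr state-store operation. A minimal in-memory analogy of that mapping (not the SDK implementation; the class and method bodies are illustrative):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// In-memory analogy: repository calls map onto state-store operations.
public class RepositoryAnalogy {
  private final Map<String, String> state = new HashMap<>();

  void save(String key, String json) { state.put(key, json); }   // ~ Dapr saveState
  Optional<String> findById(String key) {                        // ~ Dapr getState
    return Optional.ofNullable(state.get(key));
  }

  public static void main(String[] args) {
    RepositoryAnalogy repo = new RepositoryAnalogy();
    repo.save("abc-123", "{\"orderId\":\"abc-123\",\"amount\":100}");
    System.out.println(repo.findById("abc-123").orElse("not found"));
  }
}
```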

Notice that the @EnableDaprRepositories annotation does all the magic of wiring the Dapr APIs under the CrudRepository interface. Because Dapr allows users to interact with different state stores from the same application, as a user you need to provide the following beans as a Spring Boot @Configuration:

@Configuration
@EnableConfigurationProperties({DaprStateStoreProperties.class})
public class ProducerAppConfiguration {
  
  @Bean
  public KeyValueAdapterResolver keyValueAdapterResolver(DaprClient daprClient, ObjectMapper mapper, DaprStateStoreProperties daprStatestoreProperties) {
    String storeName = daprStatestoreProperties.getName();
    String bindingName = daprStatestoreProperties.getBinding();

    return new DaprKeyValueAdapterResolver(daprClient, mapper, storeName, bindingName);
  }

  @Bean
  public DaprKeyValueTemplate daprKeyValueTemplate(KeyValueAdapterResolver keyValueAdapterResolver) {
    return new DaprKeyValueTemplate(keyValueAdapterResolver);
  }
  
}

Using Spring Messaging for producing and consuming events

Similar to Spring Kafka, Spring Pulsar and Spring AMQP you can use the DaprMessagingTemplate to publish messages to the configured infrastructure. To consume messages you can use the @Topic annotation (soon to be renamed to @DaprListener).

To publish events/messages, you can @Autowire the DaprMessagingTemplate in your Spring application. For this example, we are publishing Order events to the topic named topic.

@Autowired
private DaprMessagingTemplate<Order> messagingTemplate;

@PostMapping("/orders")
public void storeOrder(@RequestBody Order order){
  repository.save(order);
  messagingTemplate.send("topic", order);
}

Similar to the CrudRepository, we need to specify which pub/sub broker we want to use to publish and consume our messages.

dapr.pubsub.name=pubsub

Because with Dapr you can connect to multiple PubSub brokers you need to provide the following bean to let Dapr know which PubSub broker your DaprMessagingTemplate will use:

@Bean
public DaprMessagingTemplate<Order> messagingTemplate(DaprClient daprClient,
                                                             DaprPubSubProperties daprPubSubProperties) {
  return new DaprMessagingTemplate<>(daprClient, daprPubSubProperties.getName());
}

Finally, because Dapr PubSub requires a bidirectional connection between your application and Dapr you need to expand your Testcontainers configuration with a few parameters:

@Bean
@ServiceConnection
public DaprContainer daprContainer(Network daprNetwork, PostgreSQLContainer<?> postgreSQLContainer, RabbitMQContainer rabbitMQContainer){
    
    return new DaprContainer("daprio/daprd:1.16.0-rc.3")
            .withAppName("producer-app")
            .withNetwork(daprNetwork)
            .withComponent(new Component("kvstore", "state.postgresql", "v1", STATE_STORE_PROPERTIES))
            .withComponent(new Component("kvbinding", "bindings.postgresql", "v1", BINDING_PROPERTIES))
            .withComponent(new Component("pubsub", "pubsub.rabbitmq", "v1", rabbitMqProperties))
            .withAppPort(8080)
            .withAppChannelAddress("host.testcontainers.internal")
            .dependsOn(rabbitMQContainer)
            .dependsOn(postgreSQLContainer);
}

Now, in the Dapr configuration we have included a pubsub component that will connect to an instance of RabbitMQ started by Testcontainers. We have also set two important parameters .withAppPort(8080) and .withAppChannelAddress("host.testcontainers.internal") which allows Dapr to contact back to the application when a message is published in the broker.

To listen to events/messages, you need to expose an endpoint in the application that will be responsible for receiving the messages. If you expose a REST endpoint, you can use the @Topic annotation to let Dapr know where it needs to forward the events/messages to:

@PostMapping("subscribe")
@Topic(pubsubName = "pubsub", name = "topic")
public void subscribe(@RequestBody CloudEvent<Order> cloudEvent){
    events.add(cloudEvent);
}

Upon bootstrapping your application, Dapr registers a subscription so that messages are forwarded to the subscribe endpoint exposed by your application.

If you are writing tests for these subscribers you need to ensure that Testcontainers knows that your application will be running on port 8080, so containers started with Testcontainers know where your application is:

@BeforeAll
public static void setup(){
  org.testcontainers.Testcontainers.exposeHostPorts(8080);
}

You can check and run the full example source code here.

Using Dapr Workflows with Spring Boot

Following the same approach that we used for Spring Data and Spring Messaging, the dapr-spring-boot-starter brings Dapr Workflow integration for Spring Boot users.

To work with Dapr Workflows you need to define and implement your workflows using code. The Dapr Spring Boot Starter makes your life easier by managing Workflows and WorkflowActivitys as Spring beans.

In order to enable the automatic bean discovery you can annotate your @SpringBootApplication with the @EnableDaprWorkflows annotation:

@SpringBootApplication
@EnableDaprWorkflows
public class MySpringBootApplication {}

By adding this annotation, all the WorkflowActivitys will be automatically managed by Spring and registered to the workflow engine.

By having all WorkflowActivitys as managed beans, we can use Spring's @Autowired mechanism to inject any bean that our workflow activity might need to implement its functionality, for example a RestTemplate:

public class MyWorkflowActivity implements WorkflowActivity {

  @Autowired
  private RestTemplate restTemplate;

You can also @Autowired the DaprWorkflowClient to create new instances of your workflows.

@Autowired
private DaprWorkflowClient daprWorkflowClient;

This enables applications to schedule new workflow instances and raise events.

String instanceId = daprWorkflowClient.scheduleNewWorkflow(MyWorkflow.class, payload);

and

daprWorkflowClient.raiseEvent(instanceId, "MyEvent", event);

Check the Dapr Workflow documentation for more information about how to work with Dapr Workflows.

Next steps

Learn more about the Dapr Java SDK packages available to add to your Java applications.

4 - JavaScript SDK

JavaScript SDK packages for developing Dapr applications

A client library for building Dapr apps in JavaScript and TypeScript. This client abstracts the public Dapr APIs like service to service invocation, state management, pub/sub, secrets, and much more, and provides a simple, intuitive API for building applications.

Installation

To get started with the JavaScript SDK, install the Dapr JavaScript SDK package from NPM:

npm install --save @dapr/dapr

Structure

The Dapr JavaScript SDK contains two major components:

  • DaprServer: to manage all Dapr sidecar to application communication.
  • DaprClient: to manage all application to Dapr sidecar communication.

The above communication can be configured to use either gRPC or HTTP.


Getting Started

To help you get started, check out the resources below:

Client

Create a JavaScript client and interact with the Dapr sidecar and other Dapr applications (e.g., publishing events, output binding support, etc.).

Server

Create a JavaScript server and let the Dapr sidecar interact with your application (e.g., subscribing to events, input binding support, etc.).

Actors

Create virtual actors with state, reminders/timers, and methods.


Logging

Configure and customize the SDK logging.

Examples

Clone the JavaScript SDK source code and try out some of the examples to get started quickly.

4.1 - JavaScript Client SDK

JavaScript Client SDK for developing Dapr applications

Introduction

The Dapr Client allows you to communicate with the Dapr Sidecar and get access to its client facing features such as Publishing Events, Invoking Output Bindings, State Management, Secret Management, and much more.

Pre-requisites

Installing and importing Dapr’s JS SDK

  1. Install the SDK with npm:
npm i @dapr/dapr --save
  2. Import the libraries:
import { DaprClient, DaprServer, HttpMethod, CommunicationProtocolEnum } from "@dapr/dapr";

const daprHost = "127.0.0.1"; // Dapr Sidecar Host
const daprPort = "3500"; // Dapr Sidecar Port of this Example Server
const serverHost = "127.0.0.1"; // App Host of this Example Server
const serverPort = "50051"; // App Port of this Example Server

// HTTP Example
const client = new DaprClient({ daprHost, daprPort });

// GRPC Example
const client = new DaprClient({ daprHost, daprPort, communicationProtocol: CommunicationProtocolEnum.GRPC });

Running

To run the examples, you can use two different protocols to interact with the Dapr sidecar: HTTP (default) or gRPC.

Using HTTP (default)

import { DaprClient } from "@dapr/dapr";
const client = new DaprClient({ daprHost, daprPort });

# Using dapr run
dapr run --app-id example-sdk --app-protocol http -- npm run start

# or, using npm script
npm run start:dapr-http

Using gRPC

Since HTTP is the default, you will have to adapt the communication protocol to use gRPC. You can do this by passing an extra argument to the client or server constructor.

import { DaprClient, CommunicationProtocolEnum } from "@dapr/dapr";
const client = new DaprClient({ daprHost, daprPort, communicationProtocol: CommunicationProtocolEnum.GRPC });
# Using dapr run
dapr run --app-id example-sdk --app-protocol grpc -- npm run start

# or, using npm script
npm run start:dapr-grpc

Environment Variables

Dapr Sidecar Endpoints

You can use the DAPR_HTTP_ENDPOINT and DAPR_GRPC_ENDPOINT environment variables to set the Dapr Sidecar’s HTTP and gRPC endpoints respectively. When these variables are set, daprHost and daprPort don’t have to be set in the options argument of the constructor; the client will parse them automatically out of the provided endpoints.

import { DaprClient, CommunicationProtocolEnum } from "@dapr/dapr";

// Using HTTP, when DAPR_HTTP_ENDPOINT is set
const client = new DaprClient();

// Using gRPC, when DAPR_GRPC_ENDPOINT is set
const client = new DaprClient({ communicationProtocol: CommunicationProtocolEnum.GRPC });

If the environment variables are set, but daprHost and daprPort values are passed to the constructor, the latter will take precedence over the environment variables.
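
For example, a minimal sketch of the precedence rule (the host and port values below are illustrative):

```typescript
import { DaprClient } from "@dapr/dapr";

// Even when DAPR_HTTP_ENDPOINT is set in the environment,
// these explicitly passed values take precedence for this client instance.
const client = new DaprClient({ daprHost: "127.0.0.1", daprPort: "3501" });
```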

Dapr API Token

You can use the DAPR_API_TOKEN environment variable to set the Dapr API token. When this variable is set, the daprApiToken doesn’t have to be set in the options argument of the constructor; the client will pick it up automatically.
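
For instance, a minimal sketch showing both approaches (the token value here is illustrative):

```typescript
import { DaprClient } from "@dapr/dapr";

// With DAPR_API_TOKEN set in the environment, no extra option is needed:
const client = new DaprClient({ daprHost: "127.0.0.1", daprPort: "3500" });

// Or pass the token explicitly through the options argument:
const clientWithToken = new DaprClient({
  daprHost: "127.0.0.1",
  daprPort: "3500",
  daprApiToken: "my-example-token",
});
```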

General

Increasing Body Size

You can increase the body size that is used by the application to communicate with the sidecar by using a DaprClient option.

import { DaprClient, CommunicationProtocolEnum } from "@dapr/dapr";

// Allow a body size of 10Mb to be used
// The default is 4Mb
const client = new DaprClient({
  daprHost,
  daprPort,
  communicationProtocol: CommunicationProtocolEnum.HTTP,
  maxBodySizeMb: 10,
});

Proxying Requests

By proxying requests, we can utilize the unique capabilities that Dapr brings with its sidecar architecture such as service discovery, logging, etc., enabling us to instantly “upgrade” our gRPC services. This feature of gRPC proxying was demonstrated in community call 41.

Creating a Proxy

To perform gRPC proxying, simply create a proxy by calling the client.proxy.create() method:

// As always, create a client to our dapr sidecar
// this client takes care of making sure the sidecar is started, that we can communicate, ...
const clientSidecar = new DaprClient({ daprHost, daprPort, communicationProtocol: CommunicationProtocolEnum.GRPC });

// Create a Proxy that allows us to use our gRPC code
const clientProxy = await clientSidecar.proxy.create<GreeterClient>(GreeterClient);

We can now call the methods as defined in our GreeterClient interface (which in this case is from the Hello World example).
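
For instance, assuming a GreeterClient generated from the Hello World proto with a sayHello method (names taken from that gRPC example, not from this SDK), the proxied call looks like a regular gRPC client call:

```typescript
// The proxy exposes the same callback-based API as the generated gRPC client;
// Dapr transparently forwards the call to the target app's gRPC server.
clientProxy.sayHello({ name: "World" }, (err, response) => {
  if (err) {
    console.error(err);
    return;
  }
  console.log(response.message);
});
```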

Behind the Scenes (Technical Working)

Architecture

  1. The gRPC service gets started in Dapr. We tell Dapr which port this gRPC server is running on through --app-port and give it a unique Dapr app ID with --app-id <APP_ID_HERE>
  2. We can now call the Dapr Sidecar through a client that will connect to the Sidecar
  3. Whilst calling the Dapr Sidecar, we provide a metadata key named dapr-app-id with the value of our gRPC server booted in Dapr (e.g. server in our example)
  4. Dapr will now forward the call to the gRPC server configured
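
As a sketch of the steps above (the app id, port, and entrypoint are illustrative values):

```shell
# Step 1: start the gRPC server under Dapr, passing its port and a unique app id
dapr run --app-id server --app-port 50051 --app-protocol grpc -- node my-grpc-server.js

# Steps 2-4: a client connects to the Dapr sidecar's gRPC endpoint and sets the
# metadata key "dapr-app-id: server"; Dapr forwards the call to that gRPC server
```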

Building blocks

The JavaScript Client SDK allows you to interface with all of the Dapr building blocks focusing on Client to Sidecar features.

Invocation API

Invoke a Service

import { DaprClient, HttpMethod } from "@dapr/dapr";

const daprHost = "127.0.0.1";
const daprPort = "3500";

async function start() {
  const client = new DaprClient({ daprHost, daprPort });

  const serviceAppId = "my-app-id";
  const serviceMethod = "say-hello";

  // POST Request
  const postResponse = await client.invoker.invoke(serviceAppId, serviceMethod, HttpMethod.POST, { hello: "world" });

  // POST Request with headers
  const postWithHeadersResponse = await client.invoker.invoke(
    serviceAppId,
    serviceMethod,
    HttpMethod.POST,
    { hello: "world" },
    { headers: { "X-User-ID": "123" } },
  );

  // GET Request
  const getResponse = await client.invoker.invoke(serviceAppId, serviceMethod, HttpMethod.GET);
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});

For a full guide on service invocation visit How-To: Invoke a service.

State Management API

Save, Get and Delete application state

import { DaprClient } from "@dapr/dapr";

const daprHost = "127.0.0.1";
const daprPort = "3500";

async function start() {
  const client = new DaprClient({ daprHost, daprPort });

  const serviceStoreName = "my-state-store-name";

  // Save State
  const saveResponse = await client.state.save(
    serviceStoreName,
    [
      {
        key: "first-key-name",
        value: "hello",
        metadata: {
          foo: "bar",
        },
      },
      {
        key: "second-key-name",
        value: "world",
      },
    ],
    {
      metadata: {
        ttlInSeconds: "3", // this should override the ttl in the state item
      },
    },
  );

  // Get State
  const getResponse = await client.state.get(serviceStoreName, "first-key-name");

  // Get Bulk State
  const getBulkResponse = await client.state.getBulk(serviceStoreName, ["first-key-name", "second-key-name"]);

  // State Transactions
  await client.state.transaction(serviceStoreName, [
    {
      operation: "upsert",
      request: {
        key: "first-key-name",
        value: "new-data",
      },
    },
    {
      operation: "delete",
      request: {
        key: "second-key-name",
      },
    },
  ]);

  // Delete State
  const deleteResponse = await client.state.delete(serviceStoreName, "first-key-name");
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});

For a full list of state operations visit How-To: Get & save state.

Query State API

import { DaprClient } from "@dapr/dapr";

const daprHost = "127.0.0.1";
const daprPort = "3500";

async function start() {
  const client = new DaprClient({ daprHost, daprPort });

  const res = await client.state.query("state-mongodb", {
    filter: {
      OR: [
        {
          EQ: { "person.org": "Dev Ops" },
        },
        {
          AND: [
            {
              EQ: { "person.org": "Finance" },
            },
            {
              IN: { state: ["CA", "WA"] },
            },
          ],
        },
      ],
    },
    sort: [
      {
        key: "state",
        order: "DESC",
      },
    ],
    page: {
      limit: 10,
    },
  });

  console.log(res);
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});

PubSub API

Publish messages

import { DaprClient } from "@dapr/dapr";

const daprHost = "127.0.0.1";
const daprPort = "3500";

async function start() {
  const client = new DaprClient({ daprHost, daprPort });

  const pubSubName = "my-pubsub-name";
  const topic = "topic-a";

  // Publish message to topic as text/plain
  // Note, the content type is inferred from the message type unless specified explicitly
  const response = await client.pubsub.publish(pubSubName, topic, "hello, world!");
  // If publish fails, response contains the error
  console.log(response);

  // Publish message to topic as application/json
  await client.pubsub.publish(pubSubName, topic, { hello: "world" });

  // Publish a JSON message as plain text
  const plainTextOptions = { contentType: "text/plain" };
  await client.pubsub.publish(pubSubName, topic, { hello: "world" }, plainTextOptions);

  // Publish message to topic as application/cloudevents+json
  // You can also use the cloudevent SDK to create cloud events https://github.com/cloudevents/sdk-javascript
  const cloudEvent = {
    specversion: "1.0",
    source: "/some/source",
    type: "example",
    id: "1234",
  };
  await client.pubsub.publish(pubSubName, topic, cloudEvent);

  // Publish a cloudevent as raw payload
  const rawPayloadOptions = { metadata: { rawPayload: true } };
  await client.pubsub.publish(pubSubName, topic, "hello, world!", rawPayloadOptions);

  // Publish multiple messages to a topic as text/plain
  await client.pubsub.publishBulk(pubSubName, topic, ["message 1", "message 2", "message 3"]);

  // Publish multiple messages to a topic as application/json
  await client.pubsub.publishBulk(pubSubName, topic, [
    { hello: "message 1" },
    { hello: "message 2" },
    { hello: "message 3" },
  ]);

  // Publish multiple messages with explicit bulk publish messages
  const bulkPublishMessages = [
    {
      entryID: "entry-1",
      contentType: "application/json",
      event: { hello: "foo message 1" },
    },
    {
      entryID: "entry-2",
      contentType: "application/cloudevents+json",
      event: { ...cloudEvent, data: "foo message 2", datacontenttype: "text/plain" },
    },
    {
      entryID: "entry-3",
      contentType: "text/plain",
      event: "foo message 3",
    },
  ];
  await client.pubsub.publishBulk(pubSubName, topic, bulkPublishMessages);
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});

Bindings API

Invoke Output Binding

Output Bindings

import { DaprClient } from "@dapr/dapr";

const daprHost = "127.0.0.1";
const daprPort = "3500";

async function start() {
  const client = new DaprClient({ daprHost, daprPort });

  const bindingName = "my-binding-name";
  const bindingOperation = "create";
  const message = { hello: "world" };

  const response = await client.binding.send(bindingName, bindingOperation, message);
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});

For a full guide on output bindings visit How-To: Use bindings.

Secret API

Retrieve secrets

import { DaprClient } from "@dapr/dapr";

const daprHost = "127.0.0.1";
const daprPort = "3500";

async function start() {
  const client = new DaprClient({ daprHost, daprPort });

  const secretStoreName = "my-secret-store";
  const secretKey = "secret-key";

  // Retrieve a single secret from secret store
  const secretResponse = await client.secret.get(secretStoreName, secretKey);

  // Retrieve all secrets from secret store
  const bulkSecretsResponse = await client.secret.getBulk(secretStoreName);
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});

For a full guide on secrets visit How-To: Retrieve secrets.

Configuration API

Get Configuration Keys

import { DaprClient, CommunicationProtocolEnum } from "@dapr/dapr";

const daprHost = "127.0.0.1";

async function start() {
  const client = new DaprClient({
    daprHost,
    daprPort: process.env.DAPR_GRPC_PORT,
    communicationProtocol: CommunicationProtocolEnum.GRPC,
  });

  const config = await client.configuration.get("config-store", ["key1", "key2"]);
  console.log(config);
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});

Sample output:

{
   items: {
     key1: { key: 'key1', value: 'foo', version: '', metadata: {} },
     key2: { key: 'key2', value: 'bar2', version: '', metadata: {} }
   }
}

Subscribe to Configuration Updates

import { DaprClient, CommunicationProtocolEnum } from "@dapr/dapr";

const daprHost = "127.0.0.1";

async function start() {
  const client = new DaprClient({
    daprHost,
    daprPort: process.env.DAPR_GRPC_PORT,
    communicationProtocol: CommunicationProtocolEnum.GRPC,
  });

  // Subscribes to config store changes for keys "key1" and "key2"
  const stream = await client.configuration.subscribeWithKeys("config-store", ["key1", "key2"], async (data) => {
    console.log("Subscribe received updates from config store: ", data);
  });

  // Wait for 60 seconds and unsubscribe.
  await new Promise((resolve) => setTimeout(resolve, 60000));
  stream.stop();
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});

Sample output:

Subscribe received updates from config store:  {
  items: { key2: { key: 'key2', value: 'bar', version: '', metadata: {} } }
}
Subscribe received updates from config store:  {
  items: { key1: { key: 'key1', value: 'foobar', version: '', metadata: {} } }
}

Cryptography API

Support for the cryptography API is only available on the gRPC client in the JavaScript SDK.

import { createReadStream, createWriteStream } from "node:fs";
import { readFile, writeFile } from "node:fs/promises";
import { pipeline } from "node:stream/promises";

import { DaprClient, CommunicationProtocolEnum } from "@dapr/dapr";

const daprHost = "127.0.0.1";
const daprPort = "50050"; // Dapr Sidecar Port of this example server

async function start() {
  const client = new DaprClient({
    daprHost,
    daprPort,
    communicationProtocol: CommunicationProtocolEnum.GRPC,
  });

  // Encrypt and decrypt a message using streams
  await encryptDecryptStream(client);

  // Encrypt and decrypt a message from a buffer
  await encryptDecryptBuffer(client);
}

async function encryptDecryptStream(client: DaprClient) {
  // First, encrypt the message
  console.log("== Encrypting message using streams");
  console.log("Encrypting plaintext.txt to ciphertext.out");

  await pipeline(
    createReadStream("plaintext.txt"),
    await client.crypto.encrypt({
      componentName: "crypto-local",
      keyName: "symmetric256",
      keyWrapAlgorithm: "A256KW",
    }),
    createWriteStream("ciphertext.out"),
  );

  // Decrypt the message
  console.log("== Decrypting message using streams");
  console.log("Encrypting ciphertext.out to plaintext.out");
  await pipeline(
    createReadStream("ciphertext.out"),
    await client.crypto.decrypt({
      componentName: "crypto-local",
    }),
    createWriteStream("plaintext.out"),
  );
}

async function encryptDecryptBuffer(client: DaprClient) {
  // Read "plaintext.txt" so we have some content
  const plaintext = await readFile("plaintext.txt");

  // First, encrypt the message
  console.log("== Encrypting message using buffers");

  const ciphertext = await client.crypto.encrypt(plaintext, {
    componentName: "crypto-local",
    keyName: "my-rsa-key",
    keyWrapAlgorithm: "RSA",
  });

  await writeFile("test.out", ciphertext);

  // Decrypt the message
  console.log("== Decrypting message using buffers");
  const decrypted = await client.crypto.decrypt(ciphertext, {
    componentName: "crypto-local",
  });

  // The contents should be equal
  if (plaintext.compare(decrypted) !== 0) {
    throw new Error("Decrypted message does not match original message");
  }
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});

For a full guide on cryptography visit How-To: Cryptography.

Distributed Lock API

Try Lock and Unlock APIs

import { CommunicationProtocolEnum, DaprClient } from "@dapr/dapr";
import { LockStatus } from "@dapr/dapr/types/lock/UnlockResponse";

const daprHost = "127.0.0.1";
const daprPort = "3500";

async function start() {
  const client = new DaprClient({ daprHost, daprPort });

  const storeName = "redislock";
  const resourceId = "resourceId";
  const lockOwner = "owner1";
  let expiryInSeconds = 1000;

  console.log(`Acquiring lock on ${storeName}, ${resourceId} as owner: ${lockOwner}`);
  const lockResponse = await client.lock.lock(storeName, resourceId, lockOwner, expiryInSeconds);
  console.log(lockResponse);

  console.log(`Unlocking on ${storeName}, ${resourceId} as owner: ${lockOwner}`);
  const unlockResponse = await client.lock.unlock(storeName, resourceId, lockOwner);
  console.log("Unlock API response: " + getResponseStatus(unlockResponse.status));
}

function getResponseStatus(status: LockStatus) {
  switch (status) {
    case LockStatus.Success:
      return "Success";
    case LockStatus.LockDoesNotExist:
      return "LockDoesNotExist";
    case LockStatus.LockBelongsToOthers:
      return "LockBelongsToOthers";
    default:
      return "InternalError";
  }
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});

For a full guide on distributed locks visit How-To: Use Distributed Locks.

Workflow API

Workflow management

import { DaprClient } from "@dapr/dapr";

async function start() {
  const client = new DaprClient();

  // Start a new workflow instance
  const instanceId = await client.workflow.start("OrderProcessingWorkflow", {
    Name: "Paperclips",
    TotalCost: 99.95,
    Quantity: 4,
  });
  console.log(`Started workflow instance ${instanceId}`);

  // Get a workflow instance
  const workflow = await client.workflow.get(instanceId);
  console.log(
    `Workflow ${workflow.workflowName}, created at ${workflow.createdAt.toUTCString()}, has status ${
      workflow.runtimeStatus
    }`,
  );
  console.log(`Additional properties: ${JSON.stringify(workflow.properties)}`);

  // Pause a workflow instance
  await client.workflow.pause(instanceId);
  console.log(`Paused workflow instance ${instanceId}`);

  // Resume a workflow instance
  await client.workflow.resume(instanceId);
  console.log(`Resumed workflow instance ${instanceId}`);

  // Terminate a workflow instance
  await client.workflow.terminate(instanceId);
  console.log(`Terminated workflow instance ${instanceId}`);

  // Purge a workflow instance
  await client.workflow.purge(instanceId);
  console.log(`Purged workflow instance ${instanceId}`);
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});

4.2 - JavaScript Server SDK

JavaScript Server SDK for developing Dapr applications

Introduction

The Dapr Server allows you to receive communication from the Dapr Sidecar and get access to its server-facing features, such as Subscribing to Events, Receiving Input Bindings, and much more.

Prerequisites

Installing and importing Dapr’s JS SDK

  1. Install the SDK with npm:
npm i @dapr/dapr --save
  2. Import the libraries:
import { DaprServer, CommunicationProtocolEnum } from "@dapr/dapr";

const daprHost = "127.0.0.1"; // Dapr Sidecar Host
const daprPort = "3500"; // Dapr Sidecar Port of this Example Server
const serverHost = "127.0.0.1"; // App Host of this Example Server
const serverPort = "50051"; // App Port of this Example Server

// HTTP Example
const server = new DaprServer({
  serverHost,
  serverPort,
  communicationProtocol: CommunicationProtocolEnum.HTTP, // If not specified, the DaprClient uses the same communication protocol as the DaprServer
  clientOptions: {
    daprHost,
    daprPort,
  },
});

// GRPC Example
const server = new DaprServer({
  serverHost,
  serverPort,
  communicationProtocol: CommunicationProtocolEnum.GRPC,
  clientOptions: {
    daprHost,
    daprPort,
  },
});

Running

To run the examples, you can use two different protocols to interact with the Dapr sidecar: HTTP (default) or gRPC.

Using HTTP (built-in express webserver)

import { DaprServer } from "@dapr/dapr";

const server = new DaprServer({
  serverHost: appHost,
  serverPort: appPort,
  clientOptions: {
    daprHost,
    daprPort,
  },
});
// initialize subscriptions, ... before server start
// the dapr sidecar relies on these
await server.start();
# Using dapr run
dapr run --app-id example-sdk --app-port 50051 --app-protocol http -- npm run start

# or, using npm script
npm run start:dapr-http

ℹ️ Note: The app-port is required here, as this is the port our server binds to. Dapr checks that the application is bound to this port before finishing start-up.

Using HTTP (bring your own express webserver)

Instead of using the built-in web server for Dapr sidecar to application communication, you can also bring your own instance. This is helpful in scenarios such as when you are building a REST API back-end and want to integrate Dapr directly into it.

Note, this is currently available for express only.

💡 Note: when using a custom web server, the SDK will configure server properties such as max body size and add new routes to it. The routes are named to avoid collisions with your application, but uniqueness is not guaranteed.

import { DaprServer, CommunicationProtocolEnum } from "@dapr/dapr";
import express from "express";

const myApp = express();

myApp.get("/my-custom-endpoint", (req, res) => {
  res.send({ msg: "My own express app!" });
});

const daprServer = new DaprServer({
  serverHost: "127.0.0.1", // App Host
  serverPort: "50002", // App Port
  serverHttp: myApp,
  clientOptions: {
    daprHost,
    daprPort,
  },
});

// Initialize subscriptions before the server starts, the Dapr sidecar uses it.
// This will also initialize the app server itself (removing the need for `app.listen` to be called).
await daprServer.start();

After configuring the above, you can call your custom endpoint as you normally would:

const res = await fetch(`http://127.0.0.1:50002/my-custom-endpoint`);
const json = await res.json();

Using gRPC

Since HTTP is the default, you will have to adapt the communication protocol to use gRPC. You can do this by passing an extra argument to the client or server constructor.

import { DaprServer, CommunicationProtocolEnum } from "@dapr/dapr";

const server = new DaprServer({
  serverHost: appHost,
  serverPort: appPort,
  communicationProtocol: CommunicationProtocolEnum.GRPC,
  clientOptions: {
    daprHost,
    daprPort,
  },
});
// initialize subscriptions, ... before server start
// the dapr sidecar relies on these
await server.start();
# Using dapr run
dapr run --app-id example-sdk --app-port 50051 --app-protocol grpc -- npm run start

# or, using npm script
npm run start:dapr-grpc

ℹ️ Note: The app-port is required here, as this is the port our server binds to. Dapr checks that the application is bound to this port before finishing start-up.

Building blocks

The JavaScript Server SDK allows you to interface with all of the Dapr building blocks focusing on Sidecar to App features.

Invocation API

Listen to an Invocation

import { DaprServer, DaprInvokerCallbackContent, HttpMethod } from "@dapr/dapr";

const daprHost = "127.0.0.1"; // Dapr Sidecar Host
const daprPort = "3500"; // Dapr Sidecar Port of this Example Server
const serverHost = "127.0.0.1"; // App Host of this Example Server
const serverPort = "50051"; // App Port of this Example Server

async function start() {
  const server = new DaprServer({
    serverHost,
    serverPort,
    clientOptions: {
      daprHost,
      daprPort,
    },
  });

  const callbackFunction = (data: DaprInvokerCallbackContent) => {
    console.log("Received body: ", data.body);
    console.log("Received metadata: ", data.metadata);
    console.log("Received query: ", data.query);
    console.log("Received headers: ", data.headers); // only available in HTTP
  };

  await server.invoker.listen("hello-world", callbackFunction, { method: HttpMethod.GET });

  // You can now invoke the service with your app id and method "hello-world"

  await server.start();
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});

For a full guide on service invocation visit How-To: Invoke a service.

PubSub API

Subscribe to messages

Subscribing to messages can be done in several ways to offer flexibility of receiving messages on your topics:

  • Direct subscription through the subscribe method
  • Direct subscription with options through the subscribeWithOptions method
  • Subscription afterwards through the subscribeToRoute method

Each time an event arrives, we pass its body as data and the headers as headers, which can contain properties of the event publisher (e.g., a device ID from IoT Hub).

Dapr requires subscriptions to be set up on startup, but in the JS SDK we allow event handlers to be added afterwards as well, giving you more programming flexibility.

An example is provided below.

import { DaprServer } from "@dapr/dapr";

const daprHost = "127.0.0.1"; // Dapr Sidecar Host
const daprPort = "3500"; // Dapr Sidecar Port of this Example Server
const serverHost = "127.0.0.1"; // App Host of this Example Server
const serverPort = "50051"; // App Port of this Example Server

async function start() {
  const server = new DaprServer({
    serverHost,
    serverPort,
    clientOptions: {
      daprHost,
      daprPort,
    },
  });

  const pubSubName = "my-pubsub-name";
  const topic = "topic-a";

  // Configure Subscriber for a Topic
  // Method 1: Direct subscription through the `subscribe` method
  await server.pubsub.subscribe(pubSubName, topic, async (data: any, headers: object) =>
    console.log(`Received Data: ${JSON.stringify(data)} with headers: ${JSON.stringify(headers)}`),
  );

  // Method 2: Direct subscription with options through the `subscribeWithOptions` method
  await server.pubsub.subscribeWithOptions(pubSubName, topic, {
    callback: async (data: any, headers: object) =>
      console.log(`Received Data: ${JSON.stringify(data)} with headers: ${JSON.stringify(headers)}`),
  });

  // Method 3: Subscription afterwards through the `subscribeToRoute` method
  // Note: we use default, since if no route was passed (empty options) we utilize "default" as the route name
  await server.pubsub.subscribeWithOptions("pubsub-redis", "topic-options-1", {});
  server.pubsub.subscribeToRoute("pubsub-redis", "topic-options-1", "default", async (data: any, headers: object) => {
    console.log(`Received Data: ${JSON.stringify(data)} with headers: ${JSON.stringify(headers)}`);
  });

  // Start the server
  await server.start();
}

For a full list of state operations visit How-To: Publish & subscribe.

Subscribe with SUCCESS/RETRY/DROP status

Dapr supports status codes for retry logic to specify what should happen after a message gets processed.

⚠️ The JS SDK allows multiple callbacks on the same topic, we handle priority of status on RETRY > DROP > SUCCESS and default to SUCCESS

⚠️ Make sure to configure resiliency in your application to handle RETRY messages

In the JS SDK we support these messages through the DaprPubSubStatusEnum enum. To ensure Dapr will retry we configure a Resiliency policy as well.

components/resiliency.yaml

apiVersion: dapr.io/v1alpha1
kind: Resiliency
metadata:
  name: myresiliency
spec:
  policies:
    retries:
      # Global Retry Policy for Inbound Component operations
      DefaultComponentInboundRetryPolicy:
        policy: constant
        duration: 500ms
        maxRetries: 10
  targets:
    components:
      messagebus:
        inbound:
          retry: DefaultComponentInboundRetryPolicy

src/index.ts

import { DaprServer, DaprPubSubStatusEnum } from "@dapr/dapr";

const daprHost = "127.0.0.1"; // Dapr Sidecar Host
const daprPort = "3500"; // Dapr Sidecar Port of this Example Server
const serverHost = "127.0.0.1"; // App Host of this Example Server
const serverPort = "50051"; // App Port of this Example Server

async function start() {
  const server = new DaprServer({
    serverHost,
    serverPort,
    clientOptions: {
      daprHost,
      daprPort,
    },
  });

  const pubSubName = "my-pubsub-name";
  const topic = "topic-a";

  // Process a message successfully
  await server.pubsub.subscribe(pubSubName, topic, async (data: any, headers: object) => {
    return DaprPubSubStatusEnum.SUCCESS;
  });

  // Retry a message
  // Note: this example will keep on retrying to deliver the message
  // Note 2: each component can have their own retry configuration
  //   e.g., https://docs.dapr.io/reference/components-reference/supported-pubsub/setup-redis-pubsub/
  await server.pubsub.subscribe(pubSubName, topic, async (data: any, headers: object) => {
    return DaprPubSubStatusEnum.RETRY;
  });

  // Drop a message
  await server.pubsub.subscribe(pubSubName, topic, async (data: any, headers: object) => {
    return DaprPubSubStatusEnum.DROP;
  });

  // Start the server
  await server.start();
}

Subscribe to messages rule based

Dapr supports routing messages to different handlers (routes) based on rules.

For example, if you are writing an application that needs to handle messages depending on their “type”, you can send them to different routes handlerType1 and handlerType2, with the default route being handlerDefault.

import { DaprServer } from "@dapr/dapr";

const daprHost = "127.0.0.1"; // Dapr Sidecar Host
const daprPort = "3500"; // Dapr Sidecar Port of this Example Server
const serverHost = "127.0.0.1"; // App Host of this Example Server
const serverPort = "50051"; // App Port of this Example Server

async function start() {
  const server = new DaprServer({
    serverHost,
    serverPort,
    clientOptions: {
      daprHost,
      daprPort,
    },
  });

  const pubSubName = "my-pubsub-name";
  const topic = "topic-a";

  // Configure Subscriber for a Topic with rule set
  // Note: the default route and match patterns are optional
  await server.pubsub.subscribe("pubsub-redis", "topic-1", {
    default: "/default",
    rules: [
      {
        match: `event.type == "my-type-1"`,
        path: "/type-1",
      },
      {
        match: `event.type == "my-type-2"`,
        path: "/type-2",
      },
    ],
  });

  // Add handlers for each route
  server.pubsub.subscribeToRoute("pubsub-redis", "topic-1", "default", async (data) => {
    console.log(`Handling Default`);
  });
  server.pubsub.subscribeToRoute("pubsub-redis", "topic-1", "type-1", async (data) => {
    console.log(`Handling Type 1`);
  });
  server.pubsub.subscribeToRoute("pubsub-redis", "topic-1", "type-2", async (data) => {
    console.log(`Handling Type 2`);
  });

  // Start the server
  await server.start();
}

Subscribe with Wildcards

The popular wildcards * and + are supported (make sure to verify that the pub/sub component supports them) and can be subscribed to as follows:

import { DaprServer } from "@dapr/dapr";

const daprHost = "127.0.0.1"; // Dapr Sidecar Host
const daprPort = "3500"; // Dapr Sidecar Port of this Example Server
const serverHost = "127.0.0.1"; // App Host of this Example Server
const serverPort = "50051"; // App Port of this Example Server

async function start() {
  const server = new DaprServer({
    serverHost,
    serverPort,
    clientOptions: {
      daprHost,
      daprPort,
    },
  });

  const pubSubName = "my-pubsub-name";

  // * Wildcard
  await server.pubsub.subscribe(pubSubName, "/events/*", async (data: any, headers: object) =>
    console.log(`Received Data: ${JSON.stringify(data)}`),
  );

  // + Wildcard
  await server.pubsub.subscribe(pubSubName, "/events/+/temperature", async (data: any, headers: object) =>
    console.log(`Received Data: ${JSON.stringify(data)}`),
  );

  // Start the server
  await server.start();
}

Bulk Subscribe to messages

Bulk Subscription is supported and is available through the following API:

  • Bulk subscription through the subscribeBulk method: maxMessagesCount and maxAwaitDurationMs are optional; if not provided, the component’s default values are used.

While listening for messages, the application receives messages from Dapr in bulk. However, like regular subscribe, the callback function receives a single message at a time, and the user can choose to return a DaprPubSubStatusEnum value to acknowledge successfully, retry, or drop the message. The default behavior is to return a success response.

Please refer to this document for more details.

import { DaprServer, DaprPubSubStatusEnum } from "@dapr/dapr";

const pubSubName = "orderPubSub";
const topic = "topicbulk";

const daprHost = process.env.DAPR_HOST || "127.0.0.1";
const daprHttpPort = process.env.DAPR_HTTP_PORT || "3502";
const serverHost = process.env.SERVER_HOST || "127.0.0.1";
const serverPort = process.env.APP_PORT || "5001";

async function start() {
  const server = new DaprServer({
    serverHost,
    serverPort,
    clientOptions: {
      daprHost,
      daprPort: daprHttpPort,
    },
  });

  // Subscribe to messages on a topic with the default config.
  await server.pubsub.subscribeBulk(pubSubName, topic, (data) =>
    console.log("Subscriber received: " + JSON.stringify(data)),
  );

  // Subscribe to messages on a topic with a specific maxMessagesCount and maxAwaitDurationMs.
  await server.pubsub.subscribeBulk(
    pubSubName,
    topic,
    (data) => {
      console.log("Subscriber received: " + JSON.stringify(data));
      return DaprPubSubStatusEnum.SUCCESS; // If App doesn't return anything, the default is SUCCESS. App can also return RETRY or DROP based on the incoming message.
    },
    {
      maxMessagesCount: 100,
      maxAwaitDurationMs: 40,
    },
  );
}
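The acknowledgement decision made inside the callback can be sketched as a pure function. The sketch below uses a local stand-in enum so it is self-contained; in real code you would return the `DaprPubSubStatusEnum` values imported from `@dapr/dapr`, and the message fields checked here are hypothetical:

```typescript
// Local stand-in for DaprPubSubStatusEnum; the real values come from "@dapr/dapr".
enum PubSubStatus {
  SUCCESS = "SUCCESS",
  RETRY = "RETRY",
  DROP = "DROP",
}

// Decide how to acknowledge a message: malformed messages are dropped,
// transient failures are retried, everything else succeeds.
// The `malformed`/`transientError` fields are illustrative assumptions.
function ackFor(msg: { malformed?: boolean; transientError?: boolean }): PubSubStatus {
  if (msg.malformed) return PubSubStatus.DROP;
  if (msg.transientError) return PubSubStatus.RETRY;
  return PubSubStatus.SUCCESS;
}

console.log(ackFor({})); // SUCCESS
console.log(ackFor({ malformed: true })); // DROP
```

Returning nothing from the callback is equivalent to returning SUCCESS.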

Dead Letter Topics

Dapr supports dead letter topics. This means that when a message fails to be processed, it gets sent to a dead letter queue. For example, when a message fails to be handled on /my-queue, it will be sent to /my-queue-failed.

You can use the following options with subscribeWithOptions method:

  • deadletterTopic: Specify a deadletter topic name (note: if none is provided we create one named deadletter)
  • deadletterCallback: The method to trigger as handler for our deadletter

Implementing Deadletter support in the JS SDK can be done by either

  • Passing the deadletterCallback as an option
  • By subscribing to route manually with subscribeToRoute

An example of both methods is provided below:

import { DaprServer } from "@dapr/dapr";

const daprHost = "127.0.0.1"; // Dapr Sidecar Host
const daprPort = "3500"; // Dapr Sidecar Port of this Example Server
const serverHost = "127.0.0.1"; // App Host of this Example Server
const serverPort = "50051"; // App Port of this Example Server

async function start() {
  const server = new DaprServer({
    serverHost,
    serverPort,
    clientOptions: {
      daprHost,
      daprPort,
    },
  });

  const pubSubName = "my-pubsub-name";

  // Method 1 (direct subscribing through subscribeWithOptions)
  await server.pubsub.subscribeWithOptions("pubsub-redis", "topic-options-5", {
    callback: async (data: any) => {
      throw new Error("Triggering Deadletter");
    },
    deadLetterCallback: async (data: any) => {
      console.log("Handling Deadletter message");
    },
  });

  // Method 2 (subscribe afterwards)
  await server.pubsub.subscribeWithOptions("pubsub-redis", "topic-options-1", {
    deadletterTopic: "my-deadletter-topic",
  });
  server.pubsub.subscribeToRoute("pubsub-redis", "topic-options-1", "default", async () => {
    throw new Error("Triggering Deadletter");
  });
  server.pubsub.subscribeToRoute("pubsub-redis", "topic-options-1", "my-deadletter-topic", async () => {
    console.log("Handling Deadletter message");
  });

  // Start server
  await server.start();
}

Bindings API

Receive an Input Binding

import { DaprServer } from "@dapr/dapr";

const daprHost = "127.0.0.1";
const daprPort = "3500";
const serverHost = "127.0.0.1";
const serverPort = "5051";

async function start() {
  const server = new DaprServer({
    serverHost,
    serverPort,
    clientOptions: {
      daprHost,
      daprPort,
    },
  });

  const bindingName = "my-binding-name";

  await server.binding.receive(bindingName, async (data: any) =>
    console.log(`Got Data: ${JSON.stringify(data)}`),
  );

  await server.start();
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});

For a full guide on output bindings visit How-To: Use bindings.

Configuration API

💡 The configuration API is currently only available through gRPC

Getting a configuration value

import { CommunicationProtocolEnum, DaprClient } from "@dapr/dapr";

const daprHost = "127.0.0.1";
const daprPort = "3500";

async function start() {
  const client = new DaprClient({
    daprHost,
    daprPort,
    communicationProtocol: CommunicationProtocolEnum.GRPC,
  });
  const config = await client.configuration.get("config-redis", ["myconfigkey1", "myconfigkey2"]);
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});

Subscribing to Key Changes

import { CommunicationProtocolEnum, DaprClient } from "@dapr/dapr";

const daprHost = "127.0.0.1";
const daprPort = "3500";

async function start() {
  const client = new DaprClient({
    daprHost,
    daprPort,
    communicationProtocol: CommunicationProtocolEnum.GRPC,
  });
  const stream = await client.configuration.subscribeWithKeys("config-redis", ["myconfigkey1", "myconfigkey2"], () => {
    // Received a key update
  });

  // When you are ready to stop listening, call the following
  await stream.close();
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});

4.3 - JavaScript SDK for Actors

How to get up and running with Actors using the Dapr JavaScript SDK

The Dapr actors package allows you to interact with Dapr virtual actors from a JavaScript application. The examples below demonstrate how to use the JavaScript SDK for interacting with virtual actors.

For a more in-depth overview of Dapr actors, visit the actors overview page.

Prerequisites

Scenario

The below code examples loosely describe the scenario of a Parking Garage Spot Monitoring System, which can be seen in this video by Mark Russinovich.

A parking garage consists of hundreds of parking spaces, where each parking space includes a sensor that provides updates to a centralized monitoring system. The parking space sensors (our actors) detect if a parking space is occupied or available.

To jump in and run this example yourself, clone the source code, which can be found in the JavaScript SDK examples directory.

Actor Interface

The actor interface defines the contract that is shared between the actor implementation and the clients calling the actor. In the example below, we have created an interface for a parking garage sensor. Each sensor has two methods, carEnter and carLeave, which define the state of the parking space:

export default interface ParkingSensorInterface {
  carEnter(): Promise<void>;
  carLeave(): Promise<void>;
}

Actor Implementation

An actor implementation defines a class by extending the base type AbstractActor and implementing the actor interface (ParkingSensorInterface in this case).

The following code describes an actor implementation along with a few helper methods.

import { AbstractActor } from "@dapr/dapr";
import ParkingSensorInterface from "./ParkingSensorInterface";

export default class ParkingSensorImpl extends AbstractActor implements ParkingSensorInterface {
  async carEnter(): Promise<void> {
    // Implementation that updates state that this parking space is occupied.
  }

  async carLeave(): Promise<void> {
    // Implementation that updates state that this parking space is available.
  }

  private async getInfo(): Promise<object> {
    // Implementation of requesting an update from the parking space sensor.
  }

  /**
   * @override
   */
  async onActivate(): Promise<void> {
    // Initialization logic called by AbstractActor.
  }
}

Configuring Actor Runtime

To configure actor runtime, use the DaprClientOptions. The various parameters and their default values are documented at How-to: Use virtual actors in Dapr.

Note that the timeouts and intervals should be formatted as Go time.ParseDuration strings.

import { CommunicationProtocolEnum, DaprClient, DaprServer } from "@dapr/dapr";

// Configure the actor runtime with the DaprClientOptions.
const clientOptions = {
  daprHost: daprHost,
  daprPort: daprPort,
  communicationProtocol: CommunicationProtocolEnum.HTTP,
  actor: {
    actorIdleTimeout: "1h",
    actorScanInterval: "30s",
    drainOngoingCallTimeout: "1m",
    drainRebalancedActors: true,
    reentrancy: {
      enabled: true,
      maxStackDepth: 32,
    },
    remindersStoragePartitions: 0,
  },
};

// Use the options when creating DaprServer and DaprClient.

// Note, DaprServer creates a DaprClient internally, which needs to be configured with clientOptions.
const server = new DaprServer({ serverHost, serverPort, clientOptions });

const client = new DaprClient(clientOptions);
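The duration strings above ("1h", "30s", "1m") follow Go's time.ParseDuration syntax. The helper below is purely illustrative (it is not part of the SDK) and shows how the common units map to milliseconds:

```typescript
// Hypothetical helper (not part of the SDK): convert simple Go-style
// duration strings like "1h", "30s", "1m", "500ms" to milliseconds.
function durationToMs(d: string): number {
  const units: Record<string, number> = { ms: 1, s: 1000, m: 60_000, h: 3_600_000 };
  const match = /^(\d+(?:\.\d+)?)(ms|s|m|h)$/.exec(d);
  if (!match) throw new Error(`Unsupported duration: ${d}`);
  return Number(match[1]) * units[match[2]];
}

console.log(durationToMs("30s")); // 30000
console.log(durationToMs("1h")); // 3600000
```

Note that Go's real parser also supports compound values such as "1h30m", which this sketch does not handle.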

Registering Actors

Initialize and register your actors by using the DaprServer package:

import { DaprServer } from "@dapr/dapr";
import ParkingSensorImpl from "./ParkingSensorImpl";

const daprHost = "127.0.0.1";
const daprPort = "50000";
const serverHost = "127.0.0.1";
const serverPort = "50001";

const server = new DaprServer({
  serverHost,
  serverPort,
  clientOptions: {
    daprHost,
    daprPort,
  },
});

await server.actor.init(); // Let the server know we need actors
server.actor.registerActor(ParkingSensorImpl); // Register the actor
await server.start(); // Start the server

// To get the registered actors, you can invoke `getRegisteredActors`:
const resRegisteredActors = await server.actor.getRegisteredActors();
console.log(`Registered Actors: ${JSON.stringify(resRegisteredActors)}`);

Invoking Actor Methods

After Actors are registered, create a Proxy object that implements ParkingSensorInterface using the ActorProxyBuilder. You can invoke the actor methods by directly calling methods on the Proxy object. Internally, it translates to making a network call to the Actor API and fetches the result back.

import { ActorId, ActorProxyBuilder, DaprClient } from "@dapr/dapr";
import ParkingSensorImpl from "./ParkingSensorImpl";
import ParkingSensorInterface from "./ParkingSensorInterface";

const daprHost = "127.0.0.1";
const daprPort = "50000";

const client = new DaprClient({ daprHost, daprPort });

// Create a new actor builder. It can be used to create multiple actors of a type.
const builder = new ActorProxyBuilder<ParkingSensorInterface>(ParkingSensorImpl, client);

// Create a new actor instance.
const actor = builder.build(new ActorId("my-actor"));
// Or alternatively, use a random ID
// const actor = builder.build(ActorId.createRandomId());

// Invoke the method.
await actor.carEnter();

Using states with Actor

import { AbstractActor } from "@dapr/dapr";
import ActorStateInterface from "./ActorStateInterface";

export default class ActorStateExample extends AbstractActor implements ActorStateInterface {
  async setState(key: string, value: any): Promise<void> {
    await this.getStateManager().setState(key, value);
    await this.getStateManager().saveState();
  }

  async removeState(key: string): Promise<void> {
    await this.getStateManager().removeState(key);
    await this.getStateManager().saveState();
  }

  // getState with a specific type
  async getState<T>(key: string): Promise<T | null> {
    return await this.getStateManager<T>().getState(key);
  }

  // Alternatively, getState can be declared without a type, returning `any`
  // (a class can only declare one of the two signatures):
  // async getState(key: string): Promise<any> {
  //   return await this.getStateManager().getState(key);
  // }
}

Actor Timers and Reminders

The JS SDK supports actors that can schedule periodic work on themselves by registering either timers or reminders. The main difference between timers and reminders is that the Dapr actor runtime does not retain any information about timers after deactivation, but persists reminders information using the Dapr actor state provider.

This distinction allows users to trade off between light-weight but stateless timers versus more resource-demanding but stateful reminders.

The scheduling interface of timers and reminders is identical. For a more in-depth look at the scheduling configurations, see the actors timers and reminders docs.

Actor Timers

// ...

const actor = builder.build(new ActorId("my-actor"));

// Register a timer
await actor.registerActorTimer(
  "timer-id", // Unique name of the timer.
  "cb-method", // Callback method to execute when timer is fired.
  Temporal.Duration.from({ seconds: 2 }), // DueTime
  Temporal.Duration.from({ seconds: 1 }), // Period
  Temporal.Duration.from({ seconds: 1 }), // TTL
  50, // State to be sent to timer callback.
);

// Delete the timer
await actor.unregisterActorTimer("timer-id");

Actor Reminders

// ...

const actor = builder.build(new ActorId("my-actor"));

// Register a reminder, it has a default callback: `receiveReminder`
await actor.registerActorReminder(
  "reminder-id", // Unique name of the reminder.
  Temporal.Duration.from({ seconds: 2 }), // DueTime
  Temporal.Duration.from({ seconds: 1 }), // Period
  Temporal.Duration.from({ seconds: 1 }), // TTL
  100, // State to be sent to reminder callback.
);

// Delete the reminder
await actor.unregisterActorReminder("reminder-id");

To handle the callback, you need to override the default receiveReminder implementation in your actor. For example, from our original actor implementation:

export default class ParkingSensorImpl extends AbstractActor implements ParkingSensorInterface {
  // ...

  /**
   * @override
   */
  async receiveReminder(state: any): Promise<void> {
    // handle stuff here
  }

  // ...
}

For a full guide on actors, visit How-To: Use virtual actors in Dapr.

4.4 - Logging in JavaScript SDK

Configuring logging in JavaScript SDK

Introduction

The JavaScript SDK comes with an out-of-the-box console-based logger. The SDK emits various internal logs to help users understand the chain of events and troubleshoot problems. A consumer of this SDK can customize the verbosity of the logs, as well as provide their own implementation of the logger.

Configure log level

There are five levels of logging in descending order of importance: error, warn, info, verbose, and debug. Setting the logger to a given level means it emits all logs that are at least as important as that level. For example, setting the level to verbose means the SDK will emit every log except debug-level logs. The default log level is info.
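The filtering rule can be sketched as a small predicate (illustrative only, not SDK code):

```typescript
// Levels in descending order of importance, as described above.
const levels = ["error", "warn", "info", "verbose", "debug"] as const;
type Level = (typeof levels)[number];

// A message is emitted when its level is at least as important as the
// configured level, i.e. it appears at or before it in the list.
function shouldEmit(configured: Level, message: Level): boolean {
  return levels.indexOf(message) <= levels.indexOf(configured);
}

console.log(shouldEmit("verbose", "debug")); // false: debug is less important
console.log(shouldEmit("info", "warn")); // true: warn is more important than info
```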

Dapr Client

import { CommunicationProtocolEnum, DaprClient, LogLevel } from "@dapr/dapr";

// create a client instance with log level set to verbose.
const client = new DaprClient({
  daprHost,
  daprPort,
  communicationProtocol: CommunicationProtocolEnum.HTTP,
  logger: { level: LogLevel.Verbose },
});

For more details on how to use the Client, see JavaScript Client.

DaprServer

import { CommunicationProtocolEnum, DaprServer, LogLevel } from "@dapr/dapr";

// create a server instance with log level set to error.
const server = new DaprServer({
  serverHost,
  serverPort,
  clientOptions: {
    daprHost,
    daprPort,
    logger: { level: LogLevel.Error },
  },
});

For more details on how to use the Server, see JavaScript Server.

Custom LoggerService

The JavaScript SDK uses the in-built Console for logging. To use a custom logger like Winston or Pino, you can implement the LoggerService interface.

Winston based logging:

Create a new implementation of LoggerService.

import { LoggerService } from "@dapr/dapr";
import * as winston from "winston";

export class WinstonLoggerService implements LoggerService {
  private logger;

  constructor() {
    this.logger = winston.createLogger({
      transports: [new winston.transports.Console(), new winston.transports.File({ filename: "combined.log" })],
    });
  }

  error(message: any, ...optionalParams: any[]): void {
    this.logger.error(message, ...optionalParams);
  }
  warn(message: any, ...optionalParams: any[]): void {
    this.logger.warn(message, ...optionalParams);
  }
  info(message: any, ...optionalParams: any[]): void {
    this.logger.info(message, ...optionalParams);
  }
  verbose(message: any, ...optionalParams: any[]): void {
    this.logger.verbose(message, ...optionalParams);
  }
  debug(message: any, ...optionalParams: any[]): void {
    this.logger.debug(message, ...optionalParams);
  }
}

Pass the new implementation to the SDK.

import { CommunicationProtocolEnum, DaprClient, LogLevel } from "@dapr/dapr";
import { WinstonLoggerService } from "./WinstonLoggerService";

const winstonLoggerService = new WinstonLoggerService();

// create a client instance with log level set to verbose and logger service as winston.
const client = new DaprClient({
  daprHost,
  daprPort,
  communicationProtocol: CommunicationProtocolEnum.HTTP,
  logger: { level: LogLevel.Verbose, service: winstonLoggerService },
});

4.6 - How to: Author and manage Dapr Workflow in the JavaScript SDK

How to get up and running with workflows using the Dapr JavaScript SDK

Let’s create a Dapr workflow and invoke it using the console, working through the provided workflow example.

This example uses the default configuration from dapr init in self-hosted mode.

Prerequisites

  • Verify you’re using the latest proto bindings

Set up the environment

Clone the JavaScript SDK repo and navigate into it.

git clone https://github.com/dapr/js-sdk
cd js-sdk

From the JavaScript SDK root directory, navigate to the Dapr Workflow example.

cd examples/workflow/authoring

Run the following command to install the requirements for running this workflow sample with the Dapr JavaScript SDK.

npm install

Run the activity-sequence.ts

The activity-sequence file registers a workflow and an activity with the Dapr Workflow runtime. The workflow is a sequence of activities that are executed in order. We use DaprWorkflowClient to schedule a new workflow instance and wait for it to complete.

import {
  DaprWorkflowClient,
  TWorkflow,
  WorkflowActivityContext,
  WorkflowContext,
  WorkflowRuntime,
} from "@dapr/dapr";

const daprHost = "localhost";
const daprPort = "50001";
const workflowClient = new DaprWorkflowClient({
  daprHost,
  daprPort,
});
const workflowRuntime = new WorkflowRuntime({
  daprHost,
  daprPort,
});

const hello = async (_: WorkflowActivityContext, name: string) => {
  return `Hello ${name}!`;
};

const sequence: TWorkflow = async function* (ctx: WorkflowContext): any {
  const cities: string[] = [];

  const result1 = yield ctx.callActivity(hello, "Tokyo");
  cities.push(result1);
  const result2 = yield ctx.callActivity(hello, "Seattle");
  cities.push(result2);
  const result3 = yield ctx.callActivity(hello, "London");
  cities.push(result3);

  return cities;
};

workflowRuntime.registerWorkflow(sequence).registerActivity(hello);

// Wrap the worker startup in a try-catch block to handle any errors during startup
try {
  await workflowRuntime.start();
  console.log("Workflow runtime started successfully");
} catch (error) {
  console.error("Error starting workflow runtime:", error);
}

// Schedule a new orchestration
try {
  const id = await workflowClient.scheduleNewWorkflow(sequence);
  console.log(`Orchestration scheduled with ID: ${id}`);

  // Wait for orchestration completion
  const state = await workflowClient.waitForWorkflowCompletion(id, undefined, 30);

  console.log(`Orchestration completed! Result: ${state?.serializedOutput}`);
} catch (error) {
  console.error("Error scheduling or waiting for orchestration:", error);
}

In the code above:

  • workflowRuntime.registerWorkflow(sequence) registers sequence as a workflow in the Dapr Workflow runtime.
  • await workflowRuntime.start(); builds and starts the engine within the Dapr Workflow runtime.
  • await workflowClient.scheduleNewWorkflow(sequence) schedules a new workflow instance with the Dapr Workflow runtime.
  • await workflowClient.waitForWorkflowCompletion(id, undefined, 30) waits for the workflow instance to complete.
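The generator pattern used by `sequence` can be illustrated without the Dapr runtime: a driver resumes the generator with each activity's result, which is a simplified, local model of what the workflow engine does (the driver and its inline "activity" are illustrative, not SDK code):

```typescript
// Simplified local model of how a workflow engine drives a generator:
// each `yield` hands back a task (here, just a city name) and the
// driver resumes the generator with the task's result.
function runLocally<T>(workflow: () => Generator<string, T, string>): T {
  const gen = workflow();
  let step = gen.next();
  while (!step.done) {
    const activityResult = `Hello ${step.value}!`; // stand-in for the `hello` activity
    step = gen.next(activityResult);
  }
  return step.value;
}

const cities = runLocally(function* () {
  const results: string[] = [];
  results.push(yield "Tokyo");
  results.push(yield "Seattle");
  results.push(yield "London");
  return results;
});

console.log(cities); // ["Hello Tokyo!", "Hello Seattle!", "Hello London!"]
```

The real engine does the same resumption, but each result comes from a durable activity execution that is recorded in the workflow's history.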

In the terminal, execute the following command to kick off the activity-sequence.ts:

npm run start:dapr:activity-sequence

Expected output

You're up and running! Both Dapr and your app logs will appear here.

...

== APP == Orchestration scheduled with ID: dc040bea-6436-4051-9166-c9294f9d2201
== APP == Waiting 30 seconds for instance dc040bea-6436-4051-9166-c9294f9d2201 to complete...
== APP == Received "Orchestrator Request" work item with instance id 'dc040bea-6436-4051-9166-c9294f9d2201'
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Rebuilding local state with 0 history event...
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Processing 2 new history event(s): [ORCHESTRATORSTARTED=1, EXECUTIONSTARTED=1]
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Waiting for 1 task(s) and 0 event(s) to complete...
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Returning 1 action(s)
== APP == Received "Activity Request" work item
== APP == Activity hello completed with output "Hello Tokyo!" (14 chars)
== APP == Received "Orchestrator Request" work item with instance id 'dc040bea-6436-4051-9166-c9294f9d2201'
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Rebuilding local state with 3 history event...
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Processing 2 new history event(s): [ORCHESTRATORSTARTED=1, TASKCOMPLETED=1]
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Waiting for 1 task(s) and 0 event(s) to complete...
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Returning 1 action(s)
== APP == Received "Activity Request" work item
== APP == Activity hello completed with output "Hello Seattle!" (16 chars)
== APP == Received "Orchestrator Request" work item with instance id 'dc040bea-6436-4051-9166-c9294f9d2201'
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Rebuilding local state with 6 history event...
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Processing 2 new history event(s): [ORCHESTRATORSTARTED=1, TASKCOMPLETED=1]
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Waiting for 1 task(s) and 0 event(s) to complete...
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Returning 1 action(s)
== APP == Received "Activity Request" work item
== APP == Activity hello completed with output "Hello London!" (15 chars)
== APP == Received "Orchestrator Request" work item with instance id 'dc040bea-6436-4051-9166-c9294f9d2201'
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Rebuilding local state with 9 history event...
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Processing 2 new history event(s): [ORCHESTRATORSTARTED=1, TASKCOMPLETED=1]
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Orchestration completed with status COMPLETED
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Returning 1 action(s)
INFO[0006] dc040bea-6436-4051-9166-c9294f9d2201: 'sequence' completed with a COMPLETED status.  app_id=activity-sequence-workflow instance=kaibocai-devbox scope=wfengine.backend type=log ver=1.12.3
== APP == Instance dc040bea-6436-4051-9166-c9294f9d2201 completed
== APP == Orchestration completed! Result: ["Hello Tokyo!","Hello Seattle!","Hello London!"]

Next steps

5 - Dapr PHP SDK

PHP SDK packages for developing Dapr applications

Dapr offers an SDK to help with the development of PHP applications. Using it, you can create PHP clients, servers, and virtual actors with Dapr.

Setting up

Prerequisites

Optional Prerequisites

Initialize your project

In a directory where you want to create your service, run composer init and answer the questions. Install with composer require dapr/php-sdk and any other dependencies you may wish to use.

Configure your service

Create a config.php, copying the contents below:

<?php

use Dapr\Actors\Generators\ProxyFactory;
use Dapr\Middleware\Defaults\{Response\ApplicationJson,Tracing};
use Psr\Log\LogLevel;
use function DI\{env,get};

return [
    // set the log level
    'dapr.log.level'               => LogLevel::WARNING,

    // Generate a new proxy on each request - recommended for development
    'dapr.actors.proxy.generation' => ProxyFactory::GENERATED,
    
    // put any subscriptions here
    'dapr.subscriptions'           => [],
    
    // if this service will be hosting any actors, add them here
    'dapr.actors'                  => [],
    
    // if this service will be hosting any actors, configure how long until dapr should consider an actor idle
    'dapr.actors.idle_timeout'     => null,
    
    // if this service will be hosting any actors, configure how often dapr will check for idle actors 
    'dapr.actors.scan_interval'    => null,
    
    // if this service will be hosting any actors, configure how long dapr will wait for an actor to finish during drains
    'dapr.actors.drain_timeout'    => null,
    
    // if this service will be hosting any actors, configure if dapr should wait for an actor to finish
    'dapr.actors.drain_enabled'    => null,
    
    // you shouldn't have to change this, but the setting is here if you need to
    'dapr.port'                    => env('DAPR_HTTP_PORT', '3500'),
    
    // add any custom serialization routines here
    'dapr.serializers.custom'      => [],
    
    // add any custom deserialization routines here
    'dapr.deserializers.custom'    => [],
    
    // the following matches the defaults and so has no effect; middlewares are processed in the order specified
    'dapr.http.middleware.request'  => [get(Tracing::class)],
    'dapr.http.middleware.response' => [get(ApplicationJson::class), get(Tracing::class)],
];

Create your service

Create index.php and put the following contents:

<?php

require_once __DIR__.'/vendor/autoload.php';

use Dapr\App;

$app = App::create(configure: fn(\DI\ContainerBuilder $builder) => $builder->addDefinitions(__DIR__ . '/config.php'));
$app->get('/hello/{name}', function(string $name) {
    return ['hello' => $name];
});
$app->start();

Try it out

Initialize dapr with dapr init and then start the project with dapr run -a dev -p 3000 -- php -S 0.0.0.0:3000.

You can now open a web browser and point it to http://localhost:3000/hello/world replacing world with your name, a pet’s name, or whatever you want.

Congratulations, you’ve created your first Dapr service! I’m excited to see what you’ll do with it!

More Information

5.1 - Virtual Actors

How to build actors

If you’re new to the actor pattern, the best place to learn about the actor pattern is in the Actor Overview.

In the PHP SDK, there are two sides to an actor, the Client, and the Actor (aka, the Runtime). As a client of an actor, you’ll interact with a remote actor via the ActorProxy class. This class generates a proxy class on-the-fly using one of several configured strategies.

When writing an actor, state can be managed for you. You can hook into the actor lifecycle, and define reminders and timers. This gives you considerable power for handling all types of problems that the actor pattern is suited for.

The Actor Proxy

Whenever you want to communicate with an actor, you’ll need to get a proxy object to do so. The proxy is responsible for serializing your request, deserializing the response, and returning it to you, all while obeying the contract defined by the specified interface.

In order to create the proxy, you’ll first need an interface to define how and what you send and receive from an actor. For example, if you want to communicate with a counting actor that solely keeps track of counts, you might define the interface as follows:

<?php
#[\Dapr\Actors\Attributes\DaprType('Counter')]
interface ICount {
    function increment(int $amount = 1): void;
    function get_count(): int;
}

It’s a good idea to put this interface in a shared library that the actor and clients can both access (if both are written in PHP). The DaprType attribute tells the DaprClient the name of the actor to send to. It should match the implementation’s DaprType, though you can override the type if needed.

<?php
$app->run(function(\Dapr\Actors\ActorProxy $actorProxy) {
    $actor = $actorProxy->get(ICount::class, 'actor-id');
    $actor->increment(10);
});

Writing Actors

To create an actor, you need to implement the interface you defined earlier and also add the DaprType attribute. All actors must implement IActor, however there’s an Actor base class that implements the boilerplate making your implementation much simpler.

Here’s the counter actor:

<?php
#[\Dapr\Actors\Attributes\DaprType('Counter')]
class Counter extends \Dapr\Actors\Actor implements ICount {
    function __construct(string $id, private CountState $state) {
        parent::__construct($id);
    }
    
    function increment(int $amount = 1): void {
        $this->state->count += $amount;
    }
    
    function get_count(): int {
        return $this->state->count;
    }
}

The most important bit is the constructor. It takes at least one argument with the name of id which is the id of the actor. Any additional arguments are injected by the DI container, including any ActorState you want to use.

Actor Lifecycle

An actor is instantiated via the constructor on every request targeting that actor type. You can use it to calculate ephemeral state or handle any kind of request-specific startup you require, such as setting up other clients or connections.

After the actor is instantiated, the on_activation() method may be called. The on_activation() method is called any time the actor “wakes up” or when it is created for the first time. It is not called on every request.

Next, the actor method is called. This may be from a timer, reminder, or from a client. You may perform any work that needs to be done and/or throw an exception.

Finally, the result of the work is returned to the caller. After some time (depending on how you’ve configured the service), the actor will be deactivated and on_deactivation() will be called. This may not be called if the host dies, daprd crashes, or some other error occurs which prevents it from being called successfully.

Actor State

Actor state is a “Plain Old PHP Object” (POPO) that extends ActorState. The ActorState base class provides a couple of useful methods. Here’s an example implementation:

<?php
class CountState extends \Dapr\Actors\ActorState {
    public int $count = 0;
}

Registering an Actor

Dapr expects to know what actors a service may host at startup. You need to add it to the configuration:

If you want to take advantage of pre-compiled dependency injection, you need to use a factory:

<?php
// in config.php

return [
    'dapr.actors' => fn() => [Counter::class],
];

All that is required to start the app:

<?php

require_once __DIR__ . '/vendor/autoload.php';

$app = \Dapr\App::create(
    configure: fn(\DI\ContainerBuilder $builder) => $builder->addDefinitions('config.php')->enableCompilation(__DIR__)
);
$app->start();
Otherwise, a plain array in the configuration is enough:

<?php
// in config.php

return [
    'dapr.actors' => [Counter::class]
];

All that is required to start the app:

<?php

require_once __DIR__ . '/vendor/autoload.php';

$app = \Dapr\App::create(configure: fn(\DI\ContainerBuilder $builder) => $builder->addDefinitions('config.php'));
$app->start();

5.1.1 - Production Reference: Actors

Running PHP actors in production

Proxy modes

There are four different modes in which actor proxies are handled. Each mode presents different trade-offs that you’ll need to weigh during development and in production.

<?php
\Dapr\Actors\Generators\ProxyFactory::GENERATED;
\Dapr\Actors\Generators\ProxyFactory::GENERATED_CACHED;
\Dapr\Actors\Generators\ProxyFactory::ONLY_EXISTING;
\Dapr\Actors\Generators\ProxyFactory::DYNAMIC;

It can be set with dapr.actors.proxy.generation configuration key.

ProxyFactory::GENERATED: This is the default mode. In this mode, a class is generated and eval’d on every request. It’s mostly for development and shouldn’t be used in production.

ProxyFactory::GENERATED_CACHED: This is the same as ProxyFactory::GENERATED, except the class is stored in a tmp file so it doesn’t need to be regenerated on every request. It doesn’t know when to update the cached class, so using it in development is discouraged, but it is offered for when manually generating the files isn’t possible.

ProxyFactory::ONLY_EXISTING: In this mode, an exception is thrown if the proxy class doesn’t exist. This is useful when you don’t want to generate code in production. You’ll have to make sure the class is generated and pre-/autoloaded.

Generating proxies

You can create a composer script to generate proxies on demand to take advantage of the ONLY_EXISTING mode.

Create a ProxyCompiler.php

<?php

class ProxyCompiler {
    private const PROXIES = [
        MyActorInterface::class,
        MyOtherActorInterface::class,
    ];
    
    private const PROXY_LOCATION = __DIR__.'/proxies/';
    
    public static function compile() {
        try {
            $app = \Dapr\App::create();
            foreach(self::PROXIES as $interface) {
                $output = $app->run(function(\DI\FactoryInterface $factory) use ($interface) {
                    return \Dapr\Actors\Generators\FileGenerator::generate($interface, $factory);
                });
                $reflection = new ReflectionClass($interface);
                $dapr_type = $reflection->getAttributes(\Dapr\Actors\Attributes\DaprType::class)[0]->newInstance()->type;
                $filename = 'dapr_proxy_'.$dapr_type.'.php';
                file_put_contents(self::PROXY_LOCATION.$filename, $output);
                echo "Compiled: $interface\n";
            }
        } catch (Exception $ex) {
            echo "Failed to generate proxy for $interface\n{$ex->getMessage()} on line {$ex->getLine()} in {$ex->getFile()}\n";
        }
    }
}

Then add a psr-4 autoloader for the generated proxies and a script in composer.json:

{
  "autoload": {
    "psr-4": {
      "Dapr\\Proxies\\": "path/to/proxies"
    }
  },
  "scripts": {
    "compile-proxies": "ProxyCompiler::compile"
  }
}

And finally, configure dapr to only use the generated proxies:

<?php
// in config.php

return [
    'dapr.actors.proxy.generation' => ProxyFactory::ONLY_EXISTING,
];

ProxyModes::DYNAMIC

In this mode, the proxy satisfies the interface contract; however, it does not actually implement the interface itself (meaning instanceof will be false). This mode takes advantage of a few quirks in PHP and exists for cases where code cannot be eval’d or generated.

Requests

Creating an actor proxy is very inexpensive for any mode. There are no requests made when creating an actor proxy object.

When you call a method on a proxy object, only methods that you implemented are serviced by your actor implementation. get_id() is handled locally, and get_reminder(), delete_reminder(), etc. are handled by the daprd.

Actor implementation

Every actor implementation in PHP must implement \Dapr\Actors\IActor and use the \Dapr\Actors\ActorTrait trait. This allows for fast reflection and some shortcuts. Using the \Dapr\Actors\Actor abstract base class does this for you, but if you need to override the default behavior, you can do so by implementing the interface and using the trait.

Activation and deactivation

When an actor activates, a token file is written to a temporary directory (by default this is '/tmp/dapr_' + sha256(concat(Dapr type, id)) on Linux and '%temp%/dapr_' + sha256(concat(Dapr type, id)) on Windows). This persists until the actor deactivates or the host shuts down, and it allows on_activation to be called once and only once when Dapr activates the actor on the host.
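For illustration, the Linux token path described above could be computed like this (a sketch; the example type and id are placeholders, and the exact concatenation is inferred from the description):

```php
<?php
// Sketch: activation token path as described above (Linux).
$daprType = 'TestActor'; // example value
$id = '123';             // example value
$tokenPath = sys_get_temp_dir() . '/dapr_' . hash('sha256', $daprType . $id);
echo $tokenPath;
```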

Performance

Actor method invocation is very fast on a production setup with php-fpm and nginx, or IIS on Windows. Although the actor is constructed on every request, actor state keys are only loaded on demand, not during each request. However, there is some overhead in loading each key individually. This can be mitigated by storing an array of data in state, trading some usability for speed. Doing so from the start is not recommended; treat it as an optimization when needed.

Versioning state

The names of the variables in the ActorState object directly correspond to key names in the store. This means that if you change the type or name of a variable, you may run into errors. To get around this, you may need to version your state object. In order to do this, you’ll need to override how state is loaded and stored. There are many ways to approach this, one such solution might be something like this:

<?php

class VersionedState extends \Dapr\Actors\ActorState {
    /**
     * @var int The current version of the state in the store. We give a default value of the current version. 
     * However, it may be in the store with a different value. 
     */
    public int $state_version = self::VERSION;
    
    /**
     * @var int The current version of the data
     */
    private const VERSION = 3;
    
    /**
     * Call when your actor activates.
     */
    public function upgrade() {
        if($this->state_version < self::VERSION) {
            $value = parent::__get($this->get_versioned_key('key', $this->state_version));
            // update the value after updating the data structure
            parent::__set($this->get_versioned_key('key', self::VERSION), $value);
            $this->state_version = self::VERSION;
            $this->save_state();
        }
    }
    
    // if you upgrade all keys as needed in the method above, you don't need to walk the previous
    // keys when loading/saving and you can just get the current version of the key.
    
    private function get_previous_version(int $version): int {
        return $this->has_previous_version($version) ? $version - 1 : $version;
    }
    
    private function has_previous_version(int $version): bool {
        return $version >= 0;
    }
    
    private function walk_versions(int $version, callable $callback, callable $predicate): mixed {
        $value = $callback($version);
        if($predicate($value) || !$this->has_previous_version($version)) {
            return $value;
        }
        return $this->walk_versions($this->get_previous_version($version), $callback, $predicate);
    }
    
    private function get_versioned_key(string $key, int $version) {
        return $this->has_previous_version($version) ? $version.$key : $key;
    }
    
    public function __get(string $key): mixed {
        return $this->walk_versions(
            self::VERSION, 
            fn($version) => parent::__get($this->get_versioned_key($key, $version)),
            fn($value) => isset($value)
        );
    }
    
    public function __isset(string $key): bool {
        return $this->walk_versions(
            self::VERSION,
            fn($version) => parent::__isset($this->get_versioned_key($key, $version)),
            fn($isset) => $isset
        );
    }
    
    public function __set(string $key,mixed $value): void {
        // optional: you can unset previous versions of the key
        parent::__set($this->get_versioned_key($key, self::VERSION), $value);
    }
    
    public function __unset(string $key) : void {
        // unset this version and all previous versions
        $this->walk_versions(
            self::VERSION, 
            fn($version) => parent::__unset($this->get_versioned_key($key, $version)), 
            fn() => false
        );
    }
}

There’s a lot that could be optimized, and it wouldn’t be a good idea to use this verbatim in production, but you can get the gist of how it would work. A lot of it depends on your use case, which is why there isn’t something like this in the SDK. For instance, in this example implementation, the previous value is kept in case there’s a bug during an upgrade; keeping the previous value allows running the upgrade again, but you may wish to delete it instead.

5.2 - The App

Using the App Class

In PHP, there is no default router. Thus, the \Dapr\App class is provided. It uses Nikic’s FastRoute under the hood. However, you are free to use any router or framework that you’d like. Just check out the add_dapr_routes() method in the App class to see how actors and subscriptions are implemented.

Every app should start with App::create() which takes two parameters, the first is an existing DI container, if you have one, and the second is a callback to hook into the ContainerBuilder and add your own configuration.

From there, you should define your routes and then call $app->start() to execute the route on the current request.

<?php
// app.php

require_once __DIR__ . '/vendor/autoload.php';

$app = \Dapr\App::create(configure: fn(\DI\ContainerBuilder $builder) => $builder->addDefinitions('config.php'));

// add a controller for GET /test/{id} that returns the id
$app->get('/test/{id}', fn(string $id) => $id);

$app->start();

Returning from a controller

You can return anything from a controller, and it will be serialized into a JSON object. You can also request the PSR-7 Response object and return that instead, allowing you to customize headers and take control over the entire response:

<?php
$app = \Dapr\App::create(configure: fn(\DI\ContainerBuilder $builder) => $builder->addDefinitions('config.php'));

// add a controller for GET /test/{id} that returns the id
$app->get('/test/{id}', 
    fn(
        string $id, 
        \Psr\Http\Message\ResponseInterface $response, 
        \Nyholm\Psr7\Factory\Psr17Factory $factory) => $response->withBody($factory->createStream($id)));

$app->start();

Using the app as a client

When you just want to use Dapr as a client, such as in existing code, you can call $app->run(). In these cases, there’s usually no need for a custom configuration, however, you may want to use a compiled DI container, especially in production:

<?php
// app.php

require_once __DIR__ . '/vendor/autoload.php';

$app = \Dapr\App::create(configure: fn(\DI\ContainerBuilder $builder) => $builder->enableCompilation(__DIR__));
$result = $app->run(fn(\Dapr\DaprClient $client) => $client->get('/invoke/other-app/method/my-method'));

Using in other frameworks

A DaprClient object is provided; in fact, all the sugar used by the App object is built on top of the DaprClient.

<?php

require_once __DIR__ . '/vendor/autoload.php';

$clientBuilder = \Dapr\Client\DaprClient::clientBuilder();

// you can customize (de)serialization or comment out to use the default JSON serializers.
$clientBuilder = $clientBuilder->withSerializationConfig($yourSerializer)->withDeserializationConfig($yourDeserializer);

// you can also pass it a logger
$clientBuilder = $clientBuilder->withLogger($myLogger);

// and change the url of the sidecar, for example, using https
// and change the url of the sidecar, for example, using https
$clientBuilder = $clientBuilder->useHttpClient('https://localhost:3800');

There are several functions you can call to configure the client before finally calling $clientBuilder->build() to get the DaprClient instance.

5.2.1 - Unit Testing

Unit Testing

Unit and integration tests are first-class citizens with the PHP SDK. Using the DI container, mocks, stubs, and the provided \Dapr\Mocks\TestClient allows you to have very fine-grained tests.

Testing Actors

With actors, there are two things we’re interested in while the actor is under test:

  1. The returned result based on an initial state
  2. The resulting state based on the initial state

Here’s an example test of a very simple actor that updates its state and returns a specific value:

<?php

// TestState.php

class TestState extends \Dapr\Actors\ActorState
{
    public int $number;
}

// TestActor.php

#[\Dapr\Actors\Attributes\DaprType('TestActor')]
class TestActor extends \Dapr\Actors\Actor
{
    public function __construct(string $id, private TestState $state)
    {
        parent::__construct($id);
    }

    public function oddIncrement(): bool
    {
        if ($this->state->number % 2 === 0) {
            return false;
        }
        $this->state->number += 1;

        return true;
    }
}

// TheTest.php

class TheTest extends \PHPUnit\Framework\TestCase
{
    private \DI\Container $container;

    public function setUp(): void
    {
        parent::setUp();
        // create a default app and extract the DI container from it
        $app = \Dapr\App::create(
            configure: fn(\DI\ContainerBuilder $builder) => $builder->addDefinitions(
            ['dapr.actors' => [TestActor::class]],
            [\Dapr\DaprClient::class => \DI\autowire(\Dapr\Mocks\TestClient::class)]
        ));
        $app->run(fn(\DI\Container $container) => $this->container = $container);
    }

    public function testIncrementsWhenOdd()
    {
        $id      = uniqid();
        $runtime = $this->container->get(\Dapr\Actors\ActorRuntime::class);
        $client  = $this->getClient();

        // return the current state, per https://docs.dapr.io/reference/api/actors_api/
        $client->register_get("/actors/TestActor/$id/state/number", code: 200, data: 3);

        // ensure it increments, per https://docs.dapr.io/reference/api/actors_api/
        $client->register_post(
            "/actors/TestActor/$id/state",
            code: 204,
            response_data: null,
            expected_request: [
                [
                    'operation' => 'upsert',
                    'request'   => [
                        'key'   => 'number',
                        'value' => 4,
                    ],
                ],
            ]
        );

        $result = $runtime->resolve_actor(
            'TestActor',
            $id,
            fn($actor) => $runtime->do_method($actor, 'oddIncrement', null)
        );
        $this->assertTrue($result);
    }

    private function getClient(): \Dapr\Mocks\TestClient
    {
        return $this->container->get(\Dapr\DaprClient::class);
    }
}
The same actor can also be unit tested directly, without the actor runtime or a mocked client, reusing the TestState and TestActor classes above:

<?php

// TheTest.php

class TheTest extends \PHPUnit\Framework\TestCase
{
    public function testNotIncrementsWhenEven() {
        $container = new \DI\Container();
        $state = new TestState($container, $container);
        $state->number = 4;
        $id = uniqid();
        $actor = new TestActor($id, $state);
        $this->assertFalse($actor->oddIncrement());
        $this->assertSame(4, $state->number);
    }
}

Testing Transactions

When building on transactions, you’ll likely want to test how a failed transaction is handled. In order to do that, you need to inject failures and ensure the transaction matches what you expect.

<?php

// MyState.php
#[\Dapr\State\Attributes\StateStore('statestore', \Dapr\consistency\EventualFirstWrite::class)]
class MyState extends \Dapr\State\TransactionalState {
    public string $value = '';
}

// SomeService.php
class SomeService {
    public function __construct(private MyState $state) {}

    public function doWork() {
        $this->state->begin();
        $this->state->value = "hello world";
        $this->state->commit();
    }
}

// TheTest.php
class TheTest extends \PHPUnit\Framework\TestCase {
    private \DI\Container $container;

    public function setUp(): void
    {
        parent::setUp();
        $app = \Dapr\App::create(configure: fn(\DI\ContainerBuilder $builder)
            => $builder->addDefinitions([\Dapr\DaprClient::class => \DI\autowire(\Dapr\Mocks\TestClient::class)]));
        $this->container = $app->run(fn(\DI\Container $container) => $container);
    }

    private function getClient(): \Dapr\Mocks\TestClient {
        return $this->container->get(\Dapr\DaprClient::class);
    }

    public function testTransactionFailure() {
        $client = $this->getClient();

        // create a response from https://v1-16.docs.dapr.io/reference/api/state_api/
        $client->register_post('/state/statestore/bulk', code: 200, response_data: [
            [
                'key' => 'value',
                // no previous value
            ],
        ], expected_request: [
            'keys' => ['value'],
            'parallelism' => 10
        ]);
        $client->register_post('/state/statestore/transaction',
            code: 200,
            response_data: null,
            expected_request: [
                'operations' => [
                    [
                        'operation' => 'upsert',
                        'request' => [
                            'key' => 'value',
                            'value' => 'hello world'
                        ]
                    ]
                ]
            ]
        );
        $state = new MyState($this->container, $this->container);
        $service = new SomeService($state);
        $service->doWork();
        $this->assertSame('hello world', $state->value);
    }
}
Alternatively, a stub can stand in for the transactional state, reusing the MyState and SomeService classes above:

<?php

// TheTest.php
class TheTest extends \PHPUnit\Framework\TestCase {
    public function testTransactionFailure() {
        $state = $this->createStub(MyState::class);
        $service = new SomeService($state);
        $service->doWork();
        $this->assertSame('hello world', $state->value);
    }
}

5.3 - Custom Serialization

How to configure serialization

Dapr uses JSON serialization and thus (complex) type information is lost when sending/receiving data.

Serialization

When returning an object from a controller, passing an object to the DaprClient, or storing an object in a state store, only public properties are scanned and serialized. You can customize this behavior by implementing \Dapr\Serialization\Serializers\ISerialize. For example, if you wanted to create an ID type that serializes to a string, you might implement it like so:

<?php

class MyId implements \Dapr\Serialization\Serializers\ISerialize 
{
    public string $id;
    
    public function serialize(mixed $value,\Dapr\Serialization\ISerializer $serializer): mixed
    {
        // $value === $this
        return $this->id; 
    }
}

This works for any type that we have full ownership over, however, it doesn’t work for classes from libraries or PHP itself. For that, you need to register a custom serializer with the DI container:

<?php
// in config.php

class SerializeSomeClass implements \Dapr\Serialization\Serializers\ISerialize 
{
    public function serialize(mixed $value,\Dapr\Serialization\ISerializer $serializer) : mixed 
    {
        // serialize $value and return the result
    }
}

return [
    'dapr.serializers.custom'      => [SomeClass::class => new SerializeSomeClass()],
];

Deserialization

Deserialization works exactly the same way, except the interface is \Dapr\Deserialization\Deserializers\IDeserialize.
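Mirroring the serializer example above, a matching deserializer for MyId might look like the following sketch (the method signature and the dapr.deserializers.custom configuration key are assumed to mirror their serialization counterparts):

```php
<?php

class MyIdDeserializer implements \Dapr\Deserialization\Deserializers\IDeserialize
{
    public function deserialize(mixed $value, \Dapr\Deserialization\IDeserializer $deserializer): mixed
    {
        // assumption: $value is the raw string produced by MyId's serializer
        $id = new MyId();
        $id->id = $value;
        return $id;
    }
}
```

As with serialization, register a custom deserializer in the DI container for types you don’t own.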

5.4 - Publish and Subscribe with PHP

How to use

With Dapr, you can publish anything, including cloud events. The SDK contains a simple cloud event implementation, but you can also just pass an array that conforms to the cloud event spec or use another library.

<?php
$app->post('/publish', function(\Dapr\Client\DaprClient $daprClient) {
    $daprClient->publishEvent(pubsubName: 'pubsub', topicName: 'my-topic', data: ['something' => 'happened']);
});

For more information about publish/subscribe, check out the howto.

Data content type

The PHP SDK allows setting the data content type either when constructing a custom cloud event, or when publishing raw data.

<?php
$event = new \Dapr\PubSub\CloudEvent();
$event->data = $xml;
$event->data_content_type = 'application/xml';
Or set the content type when publishing raw data through the client:

<?php
/**
 * @var \Dapr\Client\DaprClient $daprClient 
 */
$daprClient->publishEvent(pubsubName: 'pubsub', topicName: 'my-topic', data: $raw_data, contentType: 'application/octet-stream');

Receiving cloud events

In your subscription handler, you can have the DI Container inject either a Dapr\PubSub\CloudEvent or an array into your controller. The former does some validation to ensure you have a proper event. If you need direct access to the data, or the events do not conform to the spec, use an array.
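For example, a subscription handler might accept the validated event directly (a sketch; the '/receive-event' route is a placeholder for whatever route your subscription registered):

```php
<?php
// The DI container injects a validated CloudEvent into the controller.
$app->post('/receive-event', function(\Dapr\PubSub\CloudEvent $event) {
    // $event->data holds the event payload
    return ['status' => 'SUCCESS'];
});
```

If the incoming events don’t conform to the cloud event spec, type-hint an array instead.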

5.5 - State Management with PHP

How to use

Dapr offers a great modular approach to using state in your application. The best way to learn the basics is to visit the howto.

Metadata

Many state components allow you to pass metadata to the component to control specific aspects of the component’s behavior. The PHP SDK allows you to pass that metadata through:

<?php
// using the state manager
$app->run(
    fn(\Dapr\State\StateManager $stateManager) => 
        $stateManager->save_state('statestore', new \Dapr\State\StateItem('key', 'value', metadata: ['port' => '112'])));

// using the DaprClient
$app->run(fn(\Dapr\Client\DaprClient $daprClient) => $daprClient->saveState(storeName: 'statestore', key: 'key', value: 'value', metadata: ['port' => '112']));

This is an example of how you might pass the port metadata to Cassandra.

Every state operation allows passing metadata.

Consistency/concurrency

In the PHP SDK, there are four classes that represent the four different types of consistency and concurrency in Dapr:

<?php
[
    \Dapr\consistency\StrongLastWrite::class, 
    \Dapr\consistency\StrongFirstWrite::class,
    \Dapr\consistency\EventualLastWrite::class,
    \Dapr\consistency\EventualFirstWrite::class,
] 

Passing one of them to a StateManager method or using the StateStore() attribute allows you to define how the state store should handle conflicts.

Parallelism

When doing a bulk read or beginning a transaction, you can specify the amount of parallelism. If the underlying store has to read one key at a time, Dapr will read at most that many keys in parallel. This can be helpful to control the load on the state store, at the expense of performance. The default is 10.

Prefix

Hardcoded key names are useful, but why not make state objects more reusable? When committing a transaction or saving an object to state, you can pass a prefix that is applied to every key in the object.

<?php
class TransactionObject extends \Dapr\State\TransactionalState {
    public string $key;
}

$app->run(function (TransactionObject $object ) {
    $object->begin(prefix: 'my-prefix-');
    $object->key = 'value';
    // commit to key `my-prefix-key`
    $object->commit();
});
The same prefix can be applied when loading and saving a plain object with the StateManager:

<?php
class StateObject {
    public string $key;
}

$app->run(function(\Dapr\State\StateManager $stateManager) {
    $stateManager->load_object($obj = new StateObject(), prefix: 'my-prefix-');
    // original value is from `my-prefix-key`
    $obj->key = 'value';
    // save to `my-prefix-key`
    $stateManager->save_object($obj, prefix: 'my-prefix-');
});

6 - Dapr Python SDK

Python SDK packages for developing Dapr applications

Dapr offers a variety of subpackages to help with the development of Python applications. Using them you can create Python clients, servers, and virtual actors with Dapr.

Prerequisites

Installation

To get started with the Python SDK, install the main Dapr Python SDK package.

pip install dapr

Note: The development package will contain features and behavior that will be compatible with the pre-release version of the Dapr runtime. Make sure to uninstall any stable versions of the Python SDK before installing the dapr-dev package.

pip install dapr-dev

Available subpackages

SDK imports

Python SDK imports are subpackages included with the main SDK install, but need to be imported when used. The most common imports provided by the Dapr Python SDK are:

Client

Write Python applications to interact with a Dapr sidecar and other Dapr applications, including stateful virtual actors in Python

Actors

Create and interact with Dapr's Actor framework.

Learn more about all of the available Dapr Python SDK imports.

SDK extensions

SDK extensions mainly work as utilities for receiving pub/sub events, programmatically creating pub/sub subscriptions, and handling input binding events. While you can achieve all of these tasks without an extension, using a Python SDK extension proves convenient.

gRPC

Create Dapr services with the gRPC server extension.

FastAPI

Integrate with Dapr Python virtual actors and pub/sub using the Dapr FastAPI extension.

Flask

Integrate with Dapr Python virtual actors using the Dapr Flask extension.

Workflow

Author workflows that work with other Dapr APIs in Python.

Learn more about the Dapr Python SDK extensions.

Try it out

Clone the Python SDK repo.

git clone https://github.com/dapr/python-sdk.git

Walk through the Python quickstarts, tutorials, and examples to see Dapr in action:

SDK samplesDescription
QuickstartsExperience Dapr’s API building blocks in just a few minutes using the Python SDK.
SDK samplesClone the SDK repo to try out some examples and get started.
Bindings tutorialSee how Dapr Python SDK works alongside other Dapr SDKs to enable bindings.
Distributed Calculator tutorialUse the Dapr Python SDK to handle method invocation and state persistent capabilities.
Hello World tutorialLearn how to get Dapr up and running locally on your machine with the Python SDK.
Hello Kubernetes tutorialGet up and running with the Dapr Python SDK in a Kubernetes cluster.
Observability tutorialExplore Dapr’s metric collection, tracing, logging and health check capabilities using the Python SDK.
Pub/sub tutorialSee how Dapr Python SDK works alongside other Dapr SDKs to enable pub/sub applications.

More information

Serialization

Learn more about serialization in Dapr SDKs.

PyPI

Python Package Index

6.1 - Getting started with the Dapr client Python SDK

How to get up and running with the Dapr Python SDK

The Dapr client package allows you to interact with other Dapr applications from a Python application.

Prerequisites

Install the Dapr Python package before getting started.

Import the client package

The dapr package contains the DaprClient, which is used to create and use a client.

from dapr.clients import DaprClient

Initialising the client

You can initialise a Dapr client in multiple ways:

Default values:

When you initialise the client without any parameters it will use the default values for a Dapr sidecar instance (127.0.0.1:50001).

from dapr.clients import DaprClient

with DaprClient() as d:
    ...  # use the client

Specifying an endpoint on initialisation:

When passed as an argument in the constructor, the gRPC endpoint takes precedence over any configuration or environment variable.

from dapr.clients import DaprClient

with DaprClient("mydomain:50051?tls=true") as d:
    ...  # use the client

Configuration options:

Dapr Sidecar Endpoints

You can use the standardised DAPR_GRPC_ENDPOINT environment variable to specify the gRPC endpoint. When this variable is set, the client can be initialised without any arguments:

export DAPR_GRPC_ENDPOINT="mydomain:50051?tls=true"
from dapr.clients import DaprClient

with DaprClient() as d:
    ...  # the client will use the endpoint specified in the environment variables

The legacy environment variables DAPR_RUNTIME_HOST, DAPR_HTTP_PORT and DAPR_GRPC_PORT are also supported, but DAPR_GRPC_ENDPOINT takes precedence.

Dapr API Token

If your Dapr instance is configured to require the DAPR_API_TOKEN environment variable, you can set it in the environment and the client will use it automatically.
You can read more about Dapr API token authentication here.

Health timeout

On client initialisation, a health check is performed against the Dapr sidecar (/healthz/outbound). The client will wait for the sidecar to be up and running before proceeding.

The default healthcheck timeout is 60 seconds, but it can be overridden by setting the DAPR_HEALTH_TIMEOUT environment variable.

Retries and timeout

The Dapr client can retry a request if a specific error code is received from the sidecar. This is configurable through the DAPR_API_MAX_RETRIES environment variable and is picked up automatically, not requiring any code changes. The default value for DAPR_API_MAX_RETRIES is 0, which means no retries will be made.

You can fine-tune more retry parameters by creating a dapr.clients.retry.RetryPolicy object and passing it to the DaprClient constructor:

from grpc import StatusCode
from dapr.clients.retry import RetryPolicy

retry = RetryPolicy(
    max_attempts=5, 
    initial_backoff=1, 
    max_backoff=20, 
    backoff_multiplier=1.5,
    retryable_http_status_codes=[408, 429, 500, 502, 503, 504],
    retryable_grpc_status_codes=[StatusCode.UNAVAILABLE, StatusCode.DEADLINE_EXCEEDED, ]
)

with DaprClient(retry_policy=retry) as d:
    ...

or for actors:

factory = ActorProxyFactory(retry_policy=RetryPolicy(max_attempts=3))
proxy = ActorProxy.create('DemoActor', ActorId('1'), DemoActorInterface, factory)
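To build intuition for the RetryPolicy parameters above, here is a stdlib-only sketch of the backoff schedule they imply, assuming a simple capped exponential progression (the SDK’s exact retry algorithm may differ):

```python
# Illustration only: delays implied by initial_backoff, backoff_multiplier,
# and max_backoff for a given number of attempts.
def backoff_schedule(initial_backoff, backoff_multiplier, max_backoff, max_attempts):
    delay, schedule = initial_backoff, []
    for _ in range(max_attempts):
        schedule.append(min(delay, max_backoff))  # cap each delay at max_backoff
        delay *= backoff_multiplier
    return schedule

print(backoff_schedule(1, 1.5, 20, 5))  # [1, 1.5, 2.25, 3.375, 5.0625]
```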

Timeout can be set for all calls through the environment variable DAPR_API_TIMEOUT_SECONDS. The default value is 60 seconds.

Note: You can control timeouts on service invocation separately, by passing a timeout parameter to the invoke_method method.

Error handling

Initially, errors in Dapr followed the Standard gRPC error model. However, to provide more detailed and informative error messages, in version 1.13 an enhanced error model has been introduced which aligns with the gRPC Richer error model. In response, the Python SDK implemented DaprGrpcError, a custom exception class designed to improve the developer experience.
It’s important to note that the transition to using DaprGrpcError for all gRPC status exceptions is a work in progress. As of now, not every API call in the SDK has been updated to leverage this custom exception. We are actively working on this enhancement and welcome contributions from the community.

Example of handling DaprGrpcError exceptions when using the Dapr Python SDK:

try:
    d.save_state(store_name=storeName, key=key, value=value)
except DaprGrpcError as err:
    print(f'Status code: {err.code()}')
    print(f"Message: {err.message()}")
    print(f"Error code: {err.error_code()}")
    print(f"Error info(reason): {err.error_info.reason}")
    print(f"Resource info (resource type): {err.resource_info.resource_type}")
    print(f"Resource info (resource name): {err.resource_info.resource_name}")
    print(f"Bad request (field): {err.bad_request.field_violations[0].field}")
    print(f"Bad request (description): {err.bad_request.field_violations[0].description}")

Building blocks

The Python SDK allows you to interface with all of the Dapr building blocks.

Invoke a service

The Dapr Python SDK provides a simple API for invoking services via either HTTP or gRPC (deprecated). The protocol can be selected by setting the DAPR_API_METHOD_INVOCATION_PROTOCOL environment variable, defaulting to HTTP when unset. gRPC service invocation in Dapr is deprecated and gRPC proxying is recommended as an alternative.

from dapr.clients import DaprClient

with DaprClient() as d:
    # invoke a method (gRPC or HTTP GET)    
    resp = d.invoke_method('service-to-invoke', 'method-to-invoke', data='{"message":"Hello World"}')

    # for other HTTP verbs the verb must be specified
    # invoke a 'POST' method (HTTP only)    
    resp = d.invoke_method('service-to-invoke', 'method-to-invoke', data='{"id":"100", "FirstName":"Value", "LastName":"Value"}', http_verb='post')

The base endpoint for HTTP API calls is specified in the DAPR_HTTP_ENDPOINT environment variable. If this variable is not set, the endpoint value is derived from the DAPR_RUNTIME_HOST and DAPR_HTTP_PORT variables, whose default values are 127.0.0.1 and 3500 respectively.

The base endpoint for gRPC calls is the one used for the client initialisation (explained above).
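The HTTP endpoint resolution described above can be sketched as follows (a stdlib-only illustration; the environment is passed explicitly for clarity, whereas the SDK reads os.environ):

```python
# Sketch of the documented precedence: DAPR_HTTP_ENDPOINT wins, otherwise
# the endpoint is derived from DAPR_RUNTIME_HOST and DAPR_HTTP_PORT.
def resolve_http_endpoint(env):
    if env.get("DAPR_HTTP_ENDPOINT"):
        return env["DAPR_HTTP_ENDPOINT"]
    host = env.get("DAPR_RUNTIME_HOST", "127.0.0.1")
    port = env.get("DAPR_HTTP_PORT", "3500")
    return f"http://{host}:{port}"

print(resolve_http_endpoint({}))  # http://127.0.0.1:3500
```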

Save & get application state

from dapr.clients import DaprClient

with DaprClient() as d:
    # Save state
    d.save_state(store_name="statestore", key="key1", value="value1")

    # Get state
    data = d.get_state(store_name="statestore", key="key1").data

    # Delete state
    d.delete_state(store_name="statestore", key="key1")

Query application state (Alpha)

from dapr.clients import DaprClient

query = '''
{
    "filter": {
        "EQ": { "state": "CA" }
    },
    "sort": [
        {
            "key": "person.id",
            "order": "DESC"
        }
    ]
}
'''

with DaprClient() as d:
    resp = d.query_state(
        store_name='state_store',
        query=query,
        states_metadata={"metakey": "metavalue"},  # optional
    )

Publish & subscribe

Publish messages

from dapr.clients import DaprClient

with DaprClient() as d:
    resp = d.publish_event(pubsub_name='pubsub', topic_name='TOPIC_A', data='{"message":"Hello World"}')

Send CloudEvents messages with a JSON payload:

from dapr.clients import DaprClient
import json

with DaprClient() as d:
    cloud_event = {
        'specversion': '1.0',
        'type': 'com.example.event',
        'source': 'my-service',
        'id': 'myid',
        'data': {'id': 1, 'message': 'hello world'},
        'datacontenttype': 'application/json',
    }

    # Set the data content type to 'application/cloudevents+json'
    resp = d.publish_event(
        pubsub_name='pubsub',
        topic_name='TOPIC_CE',
        data=json.dumps(cloud_event),
        data_content_type='application/cloudevents+json',
    )

Publish CloudEvents messages with plain text payload:

from dapr.clients import DaprClient
import json

with DaprClient() as d:
    cloud_event = {
        'specversion': '1.0',
        'type': 'com.example.event',
        'source': 'my-service',
        'id': 'myid',
        'data': 'hello world',
        'datacontenttype': 'text/plain',
    }

    # Set the data content type to 'application/cloudevents+json'
    resp = d.publish_event(
        pubsub_name='pubsub',
        topic_name='TOPIC_CE',
        data=json.dumps(cloud_event),
        data_content_type='application/cloudevents+json',
    )

Subscribe to messages

from cloudevents.sdk.event import v1
from dapr.ext.grpc import App, Rule
import json

app = App()

# Default subscription for a topic
@app.subscribe(pubsub_name='pubsub', topic='TOPIC_A')
def mytopic(event: v1.Event) -> None:
    data = json.loads(event.Data())
    print(f'Received: id={data["id"]}, message="{data["message"]}", '
          f'content_type="{event.content_type}"', flush=True)

# Specific handler using Pub/Sub routing
@app.subscribe(pubsub_name='pubsub', topic='TOPIC_A',
               rule=Rule("event.type == \"important\"", 1))
def mytopic_important(event: v1.Event) -> None:
    data = json.loads(event.Data())
    print(f'Received: id={data["id"]}, message="{data["message"]}", '
          f'content_type="{event.content_type}"', flush=True)

Streaming message subscription

You can create a streaming subscription to a PubSub topic using either the subscribe or subscribe_with_handler method.

The subscribe method returns an iterable Subscription object, which allows you to pull messages from the stream by using a for loop (e.g., for message in subscription) or by calling the next_message method. This blocks the main thread while waiting for messages. When done, call the close method to terminate the subscription and stop receiving messages.

The subscribe_with_handler method accepts a callback function that is executed for each message received from the stream. It runs in a separate thread, so it doesn't block the main thread. The callback should return a TopicEventResponse (e.g., TopicEventResponse('success')), indicating whether the message was processed successfully, should be retried, or should be discarded. The method automatically manages message acknowledgements based on the returned status. The call to subscribe_with_handler returns a close function, which should be called to terminate the subscription when you're done.

Here’s an example of using the subscribe method:

import time

from dapr.clients import DaprClient
from dapr.clients.grpc.subscription import StreamInactiveError, StreamCancelledError

counter = 0


def process_message(message):
    global counter
    counter += 1
    # Process the message here
    print(f'Processing message: {message.data()} from {message.topic()}...')
    return 'success'


def main():
    with DaprClient() as client:
        global counter

        subscription = client.subscribe(
            pubsub_name='pubsub', topic='TOPIC_A', dead_letter_topic='TOPIC_A_DEAD'
        )

        try:
            for message in subscription:
                if message is None:
                    print('No message received. The stream might have been cancelled.')
                    continue

                try:
                    response_status = process_message(message)

                    if response_status == 'success':
                        subscription.respond_success(message)
                    elif response_status == 'retry':
                        subscription.respond_retry(message)
                    elif response_status == 'drop':
                        subscription.respond_drop(message)

                    if counter >= 5:
                        break
                except StreamInactiveError:
                    print('Stream is inactive. Retrying...')
                    time.sleep(1)
                    continue
                except StreamCancelledError:
                    print('Stream was cancelled')
                    break
                except Exception as e:
                    print(f'Error occurred during message processing: {e}')

        finally:
            print('Closing subscription...')
            subscription.close()


if __name__ == '__main__':
    main()

And here’s an example of using the subscribe_with_handler method:

import time

from dapr.clients import DaprClient
from dapr.clients.grpc._response import TopicEventResponse

counter = 0


def process_message(message):
    # Process the message here
    global counter
    counter += 1
    print(f'Processing message: {message.data()} from {message.topic()}...')
    return TopicEventResponse('success')


def main():
    with DaprClient() as client:
        # This will start a new thread that will listen for messages
        # and process them in the `process_message` function
        close_fn = client.subscribe_with_handler(
            pubsub_name='pubsub', topic='TOPIC_A', handler_fn=process_message,
            dead_letter_topic='TOPIC_A_DEAD'
        )

        while counter < 5:
            time.sleep(1)

        print("Closing subscription...")
        close_fn()


if __name__ == '__main__':
    main()

Conversation (Alpha)

Since version 1.15, Dapr offers developers the capability to securely and reliably interact with Large Language Models (LLMs) through the Conversation API.

from dapr.clients import DaprClient
from dapr.clients.grpc._request import ConversationInput

with DaprClient() as d:
    inputs = [
        ConversationInput(content="What's Dapr?", role='user', scrub_pii=True),
        ConversationInput(content='Give a brief overview.', role='user', scrub_pii=True),
    ]

    metadata = {
        'model': 'foo',
        'key': 'authKey',
        'cacheTTL': '10m',
    }

    response = d.converse_alpha1(
        name='echo', inputs=inputs, temperature=0.7, context_id='chat-123', metadata=metadata
    )

    for output in response.outputs:
        print(f'Result: {output.result}')

Interact with output bindings

from dapr.clients import DaprClient

with DaprClient() as d:
    resp = d.invoke_binding(binding_name='kafkaBinding', operation='create', data='{"message":"Hello World"}')

Retrieve secrets

from dapr.clients import DaprClient

with DaprClient() as d:
    resp = d.get_secret(store_name='localsecretstore', key='secretKey')

Configuration

Get configuration

from dapr.clients import DaprClient

with DaprClient() as d:
    # Get Configuration
    configuration = d.get_configuration(store_name='configurationstore', keys=['orderId'], config_metadata={})

Subscribe to configuration

import asyncio
from time import sleep
from dapr.clients import DaprClient

async def executeConfiguration():
    with DaprClient() as d:
        storeName = 'configurationstore'

        key = 'orderId'

        # Wait for sidecar to be up within 20 seconds.
        d.wait(20)

        # Subscribe to configuration by key.
        configuration = await d.subscribe_configuration(store_name=storeName, keys=[key], config_metadata={})
        while True:
            if configuration is not None:
                items = configuration.get_items()
                for key, item in items:
                    print(f"Subscribe key={key} value={item.value} version={item.version}", flush=True)
            else:
                print("Nothing yet")
            sleep(5)

asyncio.run(executeConfiguration())

Distributed Lock

from dapr.clients import DaprClient

def main():
    # Lock parameters
    store_name = 'lockstore'  # as defined in components/lockstore.yaml
    resource_id = 'example-lock-resource'
    client_id = 'example-client-id'
    expiry_in_seconds = 60

    with DaprClient() as dapr:
        print('Will try to acquire a lock from lock store named [%s]' % store_name)
        print('The lock is for a resource named [%s]' % resource_id)
        print('The client identifier is [%s]' % client_id)
        print('The lock will expire in %s seconds.' % expiry_in_seconds)

        with dapr.try_lock(store_name, resource_id, client_id, expiry_in_seconds) as lock_result:
            assert lock_result.success, 'Failed to acquire the lock. Aborting.'
            print('Lock acquired successfully!!!')

        # At this point the lock was released - by magic of the `with` clause ;)
        unlock_result = dapr.unlock(store_name, resource_id, client_id)
        print('We already released the lock so unlocking will not work.')
        print('We tried to unlock it anyway and got back [%s]' % unlock_result.status)

Cryptography

from dapr.clients import DaprClient
from dapr.clients.grpc._crypto import EncryptOptions, DecryptOptions

message = 'The secret is "passw0rd"'

def main():
    with DaprClient() as d:
        resp = d.encrypt(
            data=message.encode(),
            options=EncryptOptions(
                component_name='crypto-localstorage',
                key_name='rsa-private-key.pem',
                key_wrap_algorithm='RSA',
            ),
        )
        encrypt_bytes = resp.read()

        resp = d.decrypt(
            data=encrypt_bytes,
            options=DecryptOptions(
                component_name='crypto-localstorage',
                key_name='rsa-private-key.pem',
            ),
        )
        decrypt_bytes = resp.read()

        print(decrypt_bytes.decode())  # The secret is "passw0rd"

Python SDK examples

6.2 - Getting started with the Dapr actor Python SDK

How to get up and running with the Dapr Python SDK

The Dapr actor package allows you to interact with Dapr virtual actors from a Python application.

Pre-requisites

Actor interface

The interface defines the actor contract that is shared between the actor implementation and the clients calling the actor. Because a client may depend on it, it typically makes sense to define it in an assembly that is separate from the actor implementation.

from dapr.actor import ActorInterface, actormethod

class DemoActorInterface(ActorInterface):
    @actormethod(name="GetMyData")
    async def get_my_data(self) -> object:
        ...

Actor services

An actor service hosts the virtual actor. It is implemented as a class that derives from the base type Actor and implements the interfaces defined in the actor interface.

Actors can be created using one of the Dapr actor extensions:

Actor client

An actor client contains the implementation of the actor client which calls the actor methods defined in the actor interface.

import asyncio

from dapr.actor import ActorProxy, ActorId
from demo_actor_interface import DemoActorInterface

async def main():
    # Create proxy client
    proxy = ActorProxy.create('DemoActor', ActorId('1'), DemoActorInterface)

    # Call method on client
    resp = await proxy.GetMyData()

Sample

Visit this page for a runnable actor sample.

Mock Actor Testing

The Dapr Python SDK provides the ability to create mock actors to unit test your actor methods and see how they interact with the actor state.

Sample Usage

from dapr.actor import Actor, ActorInterface, actormethod
from dapr.actor.runtime.mock_actor import create_mock_actor

class MyActorInterface(ActorInterface):
    @actormethod(name="SaveState")
    async def save_state(self, data) -> None:
        ...

class MyActor(Actor, MyActorInterface):
    async def save_state(self, data) -> None:
        await self._state_manager.set_state('mystate', data)
        await self._state_manager.save_state()

mock_actor = create_mock_actor(MyActor, "id")

await mock_actor.save_state(5)
assert mock_actor._state_manager._mock_state['mystate'] == 5  # True

Mock actors are created by passing your actor class and an actor ID (a string) to the create_mock_actor function. This function returns an instance of the actor with many internal methods overridden. Instead of interacting with Dapr for tasks like saving state or managing timers, the mock actor uses in-memory state to simulate these behaviors.

This state can be accessed through the following variables:

IMPORTANT NOTE: Due to type hinting issues as discussed further down, these variables will not be visible to type hinters/linters/etc, who will think they are invalid variables. You will need to use them with #type: ignore in order to satisfy any such systems.

  • _state_manager._mock_state
    A [str, object] dict where all the actor state is stored. Any value saved via _state_manager.set_state(key, value), or any other state manager method, is stored in the dict as that key, value pair. Any value loaded via try_get_state or any other state manager method is taken from this dict.

  • _state_manager._mock_timers
    A [str, ActorTimerData] dict which holds the active actor timers. Any actor method which would add or remove a timer adds or pops the appropriate ActorTimerData object from this dict.

  • _state_manager._mock_reminders
    A [str, ActorReminderData] dict which holds the active actor reminders. Any actor method which would add or remove a reminder adds or pops the appropriate ActorReminderData object from this dict.

Note: The timers and reminders will never actually trigger. The dictionaries exist only so methods that should add or remove timers/reminders can be tested. If you need to test the callbacks they should activate, you should call them directly with the appropriate values:

result = await mock_actor.receive_reminder(name, state, due_time, period, _ttl)
# Test the result directly or test for side effects (like changing state) by querying `_state_manager._mock_state`

Usage and Limitations

To allow for more fine-grained control, the _on_activate method will not be called automatically the way it is when Dapr initializes a new Actor instance. You should call it manually as needed as part of your tests.

A current limitation of the mock actor system is that it does not call the _on_pre_actor_method and _on_post_actor_method methods. You can always call these methods manually as part of a test.

The __init__, register_timer, unregister_timer, register_reminder, unregister_reminder methods are all overwritten by the MockActor class that gets applied as a mixin via create_mock_actor. If your actor itself overwrites these methods, those modifications will themselves be overwritten and the actor will likely not behave as you expect.

Note: __init__ is a special case where you are expected to define it as

    def __init__(self, ctx, actor_id):
        super().__init__(ctx, actor_id)

Mock actors work fine with this, but if you have added any extra logic into __init__, it will be overwritten. It is worth noting that the correct way to apply logic on initialization is via _on_activate (which can also be safely used with mock actors) instead of __init__.

If you have an actor which does override default Dapr actor methods, you can create a custom subclass of the MockActor class (from MockActor.py) which implements whatever custom logic you have along with interacting with _mock_state, _mock_timers, and _mock_reminders as normal, and then applying that custom class as a mixin via a create_mock_actor function you define yourself.

The actor _runtime_ctx variable is set to None. All the normal actor methods have been overwritten such as to not call it, but if your code itself interacts directly with _runtime_ctx, tests may fail.

The actor _state_manager is overwritten with an instance of MockStateManager. This has all the same methods and functionality of the base ActorStateManager, except for using the various _mock variables for storing data instead of the _runtime_ctx. If your code implements its own custom state manager it will be overwritten and tests will likely fail.

Type Hinting

Because of Python’s lack of a unified method for type hinting type intersections (see: python/typing #213), type hinting unfortunately doesn’t work with Mock Actors. The return type is type hinted as “instance of Actor subclass T” when it should really be type hinted as “instance of MockActor subclass T” or “instance of type intersection [Actor subclass T, MockActor]” (where, it is worth noting, MockActor is itself a subclass of Actor).

This means that, for example, if you hover over mockactor._state_manager in a code editor, it will come up as an instance of ActorStateManager (instead of MockStateManager), and various IDE helper functions (like VSCode’s Go to Definition, which will bring you to the definition of ActorStateManager instead of MockStateManager) won’t work properly.

For now, this issue is unfixable, so it’s merely something to be noted because of the confusion it might cause. If in the future it becomes possible to accurately type hint cases like this feel free to open an issue about implementing it.

6.3 - Dapr Python SDK extensions

Python SDK for developing Dapr applications

6.3.1 - Getting started with the Dapr Python gRPC service extension

How to get up and running with the Dapr Python gRPC extension

The Dapr Python SDK provides a built in gRPC server extension, dapr.ext.grpc, for creating Dapr services.

Installation

You can download and install the Dapr gRPC server extension with:

# Install the stable release
pip install dapr-ext-grpc

# Or install the latest development release
pip3 install dapr-ext-grpc-dev

Examples

The App object can be used to create a server.

Listen for service invocation requests

The InvokeMethodRequest and InvokeMethodResponse objects can be used to handle incoming requests.

A simple service that will listen and respond to requests will look like:

from dapr.ext.grpc import App, InvokeMethodRequest, InvokeMethodResponse

app = App()

@app.method(name='my-method')
def mymethod(request: InvokeMethodRequest) -> InvokeMethodResponse:
    print(request.metadata, flush=True)
    print(request.text(), flush=True)

    return InvokeMethodResponse(b'INVOKE_RECEIVED', "text/plain; charset=UTF-8")

app.run(50051)

A full sample can be found here.

Subscribe to a topic

When subscribing to a topic, you can instruct Dapr whether the delivered event was accepted, should be dropped, or should be retried later.

from typing import Optional
from cloudevents.sdk.event import v1
from dapr.ext.grpc import App, Rule
from dapr.clients.grpc._response import TopicEventResponse

app = App()

# Default subscription for a topic
@app.subscribe(pubsub_name='pubsub', topic='TOPIC_A')
def mytopic(event: v1.Event) -> Optional[TopicEventResponse]:
    print(event.Data(),flush=True)
    # Returning None (or not doing a return explicitly) is equivalent
    # to returning a TopicEventResponse("success").
    # You can also return TopicEventResponse("retry") for dapr to log
    # the message and retry delivery later, or TopicEventResponse("drop")
    # for it to drop the message
    return TopicEventResponse("success")

# Specific handler using Pub/Sub routing
@app.subscribe(pubsub_name='pubsub', topic='TOPIC_A',
               rule=Rule("event.type == \"important\"", 1))
def mytopic_important(event: v1.Event) -> None:
    print(event.Data(),flush=True)

# Handler with disabled topic validation
@app.subscribe(pubsub_name='pubsub-mqtt', topic='topic/#', disable_topic_validation=True)
def mytopic_wildcard(event: v1.Event) -> None:
    print(event.Data(),flush=True)

app.run(50051)

A full sample can be found here.

Setup input binding trigger

from dapr.ext.grpc import App, BindingRequest

app = App()

@app.binding('kafkaBinding')
def binding(request: BindingRequest):
    print(request.text(), flush=True)

app.run(50051)

A full sample can be found here.

6.3.2 - Dapr Python SDK integration with FastAPI

How to create Dapr Python virtual actors and pubsub with the FastAPI extension

The Dapr Python SDK provides integration with FastAPI using the dapr-ext-fastapi extension.

Installation

You can download and install the Dapr FastAPI extension with:

# Install the stable release
pip install dapr-ext-fastapi

# Or install the latest development release
pip install dapr-ext-fastapi-dev

Example

Subscribing to events of different types

import uvicorn
from fastapi import Body, FastAPI
from dapr.ext.fastapi import DaprApp
from pydantic import BaseModel

class RawEventModel(BaseModel):
    body: str

class User(BaseModel):
    id: int
    name: str

class CloudEventModel(BaseModel):
    data: User
    datacontenttype: str
    id: str
    pubsubname: str
    source: str
    specversion: str
    topic: str
    traceid: str
    traceparent: str
    tracestate: str
    type: str    
    
    
app = FastAPI()
dapr_app = DaprApp(app)

# Allow handling event with any structure (Easiest, but least robust)
# dapr publish --publish-app-id sample --topic any_topic --pubsub pubsub --data '{"id":"7", "desc": "good", "size":"small"}'
@dapr_app.subscribe(pubsub='pubsub', topic='any_topic')
def any_event_handler(event_data = Body()):
    print(event_data)    

# For robustness choose one of the below based on if publisher is using CloudEvents

# Handle events sent with CloudEvents
# dapr publish --publish-app-id sample --topic cloud_topic --pubsub pubsub --data '{"id":"7", "name":"Bob Jones"}'
@dapr_app.subscribe(pubsub='pubsub', topic='cloud_topic')
def cloud_event_handler(event_data: CloudEventModel):
    print(event_data)   

# Handle raw events sent without CloudEvents
# curl -X "POST" http://localhost:3500/v1.0/publish/pubsub/raw_topic?metadata.rawPayload=true -H "Content-Type: application/json" -d '{"body": "345"}'
@dapr_app.subscribe(pubsub='pubsub', topic='raw_topic')
def raw_event_handler(event_data: RawEventModel):
    print(event_data)    

 

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=30212)

Creating an actor

from fastapi import FastAPI
from dapr.ext.fastapi import DaprActor
from demo_actor import DemoActor

app = FastAPI(title=f'{DemoActor.__name__}Service')

# Add Dapr Actor Extension
actor = DaprActor(app)

@app.on_event("startup")
async def startup_event():
    # Register DemoActor
    await actor.register_actor(DemoActor)

@app.get("/GetMyData")
def get_my_data():
    return "{'message': 'myData'}"

6.3.3 - Dapr Python SDK integration with Flask

How to create Dapr Python virtual actors with the Flask extension

The Dapr Python SDK provides integration with Flask using the flask-dapr extension.

Installation

You can download and install the Dapr Flask extension with:

# Install the stable release
pip install flask-dapr

# Or install the latest development release
pip install flask-dapr-dev

Example

from flask import Flask
from flask_dapr.actor import DaprActor

from dapr.conf import settings
from demo_actor import DemoActor

app = Flask(f'{DemoActor.__name__}Service')

# Enable DaprActor Flask extension
actor = DaprActor(app)

# Register DemoActor
actor.register_actor(DemoActor)

# Setup method route
@app.route('/GetMyData', methods=['GET'])
def get_my_data():
    return {'message': 'myData'}, 200

# Run application
if __name__ == '__main__':
    app.run(port=settings.HTTP_APP_PORT)

6.3.4 - Dapr Python SDK integration with Dapr Workflow extension

How to get up and running with the Dapr Workflow extension

The Dapr Python SDK provides a built-in Dapr Workflow extension, dapr.ext.workflow, for creating Dapr services.

Installation

You can download and install the Dapr Workflow extension with:

# Install the stable release
pip install dapr-ext-workflow

# Or install the latest development release
pip install dapr-ext-workflow-dev

Example

from time import sleep

import dapr.ext.workflow as wf


wfr = wf.WorkflowRuntime()


@wfr.workflow(name='random_workflow')
def task_chain_workflow(ctx: wf.DaprWorkflowContext, wf_input: int):
    try:
        result1 = yield ctx.call_activity(step1, input=wf_input)
        result2 = yield ctx.call_activity(step2, input=result1)
    except Exception as e:
        yield ctx.call_activity(error_handler, input=str(e))
        raise
    return [result1, result2]


@wfr.activity(name='step1')
def step1(ctx, activity_input):
    print(f'Step 1: Received input: {activity_input}.')
    # Do some work
    return activity_input + 1


@wfr.activity
def step2(ctx, activity_input):
    print(f'Step 2: Received input: {activity_input}.')
    # Do some work
    return activity_input * 2

@wfr.activity
def error_handler(ctx, error):
    print(f'Executing error handler: {error}.')
    # Do some compensating work


if __name__ == '__main__':
    wfr.start()
    sleep(10)  # wait for workflow runtime to start

    wf_client = wf.DaprWorkflowClient()
    instance_id = wf_client.schedule_new_workflow(workflow=task_chain_workflow, input=42)
    print(f'Workflow started. Instance ID: {instance_id}')
    state = wf_client.wait_for_workflow_completion(instance_id)
    print(f'Workflow completed! Status: {state.runtime_status}')

    wfr.shutdown()

Next steps

Getting started with the Dapr Workflow Python SDK

6.3.4.1 - Getting started with the Dapr Workflow Python SDK

How to get up and running with workflows using the Dapr Python SDK

Let’s create a Dapr workflow and invoke it using the console. With the provided workflow example, you will:

  • Run a Python console application that demonstrates workflow orchestration with activities, child workflows, and external events
  • Learn how to handle retries, timeouts, and workflow state management
  • Use the Python workflow SDK to start, pause, resume, and purge workflow instances

This example uses the default configuration from dapr init in self-hosted mode.

In the Python example project, the simple.py file contains the setup of the app, including:

  • The workflow definition
  • The workflow activity definitions
  • The registration of the workflow and workflow activities

Prerequisites

Set up the environment

Start by cloning the Python SDK repo.

git clone https://github.com/dapr/python-sdk.git

From the Python SDK root directory, navigate to the Dapr Workflow example.

cd examples/workflow

Run the following command to install the requirements for running this workflow sample with the Dapr Python SDK.

pip3 install -r workflow/requirements.txt

Run the application locally

To run the Dapr application, you need to start the Python program and a Dapr sidecar. In the terminal, run:

dapr run --app-id wf-simple-example --dapr-grpc-port 50001 --resources-path components -- python3 simple.py

Note: Since python3.exe is not defined on Windows, you may need to use python simple.py instead of python3 simple.py.

Expected output

- "== APP == Hi Counter!"
- "== APP == New counter value is: 1!"
- "== APP == New counter value is: 11!"
- "== APP == Retry count value is: 0!"
- "== APP == Retry count value is: 1! This print statement verifies retry"
- "== APP == Appending 1 to child_orchestrator_string!"
- "== APP == Appending a to child_orchestrator_string!"
- "== APP == Appending a to child_orchestrator_string!"
- "== APP == Appending 2 to child_orchestrator_string!"
- "== APP == Appending b to child_orchestrator_string!"
- "== APP == Appending b to child_orchestrator_string!"
- "== APP == Appending 3 to child_orchestrator_string!"
- "== APP == Appending c to child_orchestrator_string!"
- "== APP == Appending c to child_orchestrator_string!"
- "== APP == Get response from hello_world_wf after pause call: Suspended"
- "== APP == Get response from hello_world_wf after resume call: Running"
- "== APP == New counter value is: 111!"
- "== APP == New counter value is: 1111!"
- '== APP == Workflow completed! Result: "Completed"'

What happened?

When you run the application, several key workflow features are shown:

  1. Workflow and Activity Registration: The application uses Python decorators to automatically register workflows and activities with the runtime. This decorator-based approach provides a clean, declarative way to define your workflow components:

    @wfr.workflow(name='hello_world_wf')
    def hello_world_wf(ctx: DaprWorkflowContext, wf_input):
        # Workflow definition...
    
    @wfr.activity(name='hello_act')
    def hello_act(ctx: WorkflowActivityContext, wf_input):
        # Activity definition...
    
  2. Runtime Setup: The application initializes the workflow runtime and client:

    wfr = WorkflowRuntime()
    wfr.start()
    wf_client = DaprWorkflowClient()
    
  3. Activity Execution: The workflow executes a series of activities that increment a counter:

    @wfr.workflow(name='hello_world_wf')
    def hello_world_wf(ctx: DaprWorkflowContext, wf_input):
        yield ctx.call_activity(hello_act, input=1)
        yield ctx.call_activity(hello_act, input=10)
    
  4. Retry Logic: The workflow demonstrates error handling with a retry policy:

    retry_policy = RetryPolicy(
        first_retry_interval=timedelta(seconds=1),
        max_number_of_attempts=3,
        backoff_coefficient=2,
        max_retry_interval=timedelta(seconds=10),
        retry_timeout=timedelta(seconds=100),
    )
    yield ctx.call_activity(hello_retryable_act, retry_policy=retry_policy)
    
  5. Child Workflow: A child workflow is executed with its own retry policy:

    yield ctx.call_child_workflow(child_retryable_wf, retry_policy=retry_policy)
    
  6. External Event Handling: The workflow waits for an external event with a timeout:

    event = ctx.wait_for_external_event(event_name)
    timeout = ctx.create_timer(timedelta(seconds=30))
    winner = yield when_any([event, timeout])
    
  7. Workflow Lifecycle Management: The example demonstrates how to pause and resume the workflow:

    wf_client.pause_workflow(instance_id=instance_id)
    metadata = wf_client.get_workflow_state(instance_id=instance_id)
    # ... check status ...
    wf_client.resume_workflow(instance_id=instance_id)
    
  8. Event Raising: After resuming, the workflow raises an event:

    wf_client.raise_workflow_event(
        instance_id=instance_id,
        event_name=event_name,
        data=event_data
    )
    
  9. Completion and Cleanup: Finally, the workflow waits for completion and cleans up:

    state = wf_client.wait_for_workflow_completion(
        instance_id,
        timeout_in_seconds=30
    )
    wf_client.purge_workflow(instance_id=instance_id)
    

Next steps

7 - Dapr Rust SDK

Rust SDK packages for developing Dapr applications

A client library to help build Dapr applications using Rust. This client is targeting support for all public Dapr APIs while focusing on idiomatic Rust experiences and developer productivity.

Client

Use the Rust Client SDK for invoking public Dapr APIs. Learn more about the Rust Client SDK: https://v1-16.docs.dapr.io/developing-applications/sdks/rust/rust-client/

7.1 - Getting started with the Dapr client Rust SDK

How to get up and running with the Dapr Rust SDK

The Dapr client package allows you to interact with other Dapr applications from a Rust application.

Prerequisites

Import the client package

Add Dapr to your Cargo.toml

[dependencies]
# Other dependencies
dapr = "0.16.0"

You can either reference dapr::Client or bind the full path to a new name as follows:

use dapr::Client as DaprClient;

Instantiating the Dapr client

let addr = "https://127.0.0.1".to_string();

let mut client = dapr::Client::<dapr::client::TonicClient>::connect(addr).await?;

Alternatively if you would like to specify a custom port, this can be done by using this connect method:

let mut client = dapr::Client::<dapr::client::TonicClient>::connect_with_port(addr, "3500".to_string()).await?;

Building blocks

The Rust SDK allows you to interface with the Dapr building blocks.

Service Invocation (gRPC)

To invoke a specific method on another service running with Dapr sidecar, the Dapr client provides two options:

Invoke a (gRPC) service

let response = client
    .invoke_service("service-to-invoke", "method-to-invoke", Some(data))
    .await
    .unwrap();

For a full guide on service invocation, visit How-To: Invoke a service.

State Management

The Dapr Client provides access to these state management methods: save_state, get_state, and delete_state, which can be used like so:

let store_name = String::from("statestore");

let key = String::from("hello");
let val = String::from("world").into_bytes();

// save key-value pair in the state store
client
    .save_state(store_name, key, val, None, None, None)
    .await?;

let get_response = client
    .get_state("statestore", "hello", None)
    .await?;

// delete a value from the state store
client
    .delete_state("statestore", "hello", None)
    .await?;

Multiple states can be sent with the save_bulk_states method.

For a full guide on state management, visit How-To: Save & get state.

Publish Messages

To publish data onto a topic, the Dapr client provides a simple method:

let pubsub_name = "pubsub-name".to_string();
let pubsub_topic = "topic-name".to_string();
let pubsub_content_type = "text/plain".to_string();

let data = "content".to_string().into_bytes();
client
    .publish_event(pubsub_name, pubsub_topic, pubsub_content_type, data, None)
    .await?;

For a full guide on pub/sub, visit How-To: Publish & subscribe.

Rust SDK Examples