The Dapr SDKs are the easiest way for you to get Dapr into your application. Choose your favorite language and get up and running with Dapr in minutes.
SDK packages
Select your preferred language below to learn more about client, server, actor, and workflow packages.
Client: The Dapr client allows you to invoke Dapr building block APIs and perform each building block’s actions
Server extensions: The Dapr service extensions allow you to create services that can be invoked by other services and subscribe to topics
Actor: The Dapr Actor SDK allows you to build virtual actors with methods, state, timers, and persistent reminders
Workflow: Dapr Workflow makes it easy for you to write long running business logic and integrations in a reliable way
.NET SDK packages for developing Dapr applications
Dapr offers a variety of packages to help with the development of .NET applications. Using them you can create .NET clients, servers, and virtual actors with Dapr.
using var client = new DaprClientBuilder()
    .UseTimeout(TimeSpan.FromSeconds(2)) // Optionally, set a timeout
    .Build();

// Invokes a POST method named "deposit" that takes input of type "Transaction"
var data = new { id = "17", amount = 99m };
var account = await client.InvokeMethodAsync<Account>("routing", "deposit", data, cancellationToken);
Console.WriteLine("Returned: id:{0} | Balance:{1}", account.Id, account.Balance);
var client = DaprClient.CreateInvokeHttpClient(appId: "routing");

// To set a timeout on the HTTP client:
client.Timeout = TimeSpan.FromSeconds(2);

var deposit = new Transaction { Id = "17", Amount = 99m };
var response = await client.PostAsJsonAsync("/deposit", deposit, cancellationToken);
var account = await response.Content.ReadFromJsonAsync<Account>(cancellationToken: cancellationToken);
Console.WriteLine("Returned: id:{0} | Balance:{1}", account.Id, account.Balance);
gRPC
You can use the DaprClient to invoke your services over gRPC.
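As a minimal sketch, a gRPC method on another Dapr application can be invoked with InvokeMethodGrpcAsync. The HelloRequest and HelloReply types and the hello-grpc app id below are illustrative placeholders, not part of the SDK.

using var client = new DaprClientBuilder().Build();

// HelloRequest and HelloReply are assumed to be protobuf-generated message types
var request = new HelloRequest { Name = "Alice" };
var reply = await client.InvokeMethodGrpcAsync<HelloRequest, HelloReply>("hello-grpc", "SayHello", request, cancellationToken);
Console.WriteLine($"Received: {reply.Message}");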
Visit .NET SDK examples for code samples and instructions to try out pub/sub
Interact with output bindings
using var client = new DaprClientBuilder().Build();

// Example payload for the Twilio SendGrid binding
var email = new
{
    metadata = new
    {
        emailTo = "customer@example.com",
        subject = "An email from Dapr SendGrid binding",
    },
    data = "<h1>Testing Dapr Bindings</h1>This is a test.<br>Bye!",
};
await client.InvokeBindingAsync("send-email", "create", email);
var client = new DaprClientBuilder().Build();

// Retrieve a key-value-pair-based secret - returns a Dictionary<string, string>
var secrets = await client.GetSecretAsync("mysecretstore", "key-value-pair-secret");
Console.WriteLine($"Got secret keys: {string.Join(",", secrets.Keys)}");
var client = new DaprClientBuilder().Build();

// Retrieve a key-value-pair-based secret - returns a Dictionary<string, string>
var secrets = await client.GetSecretAsync("mysecretstore", "key-value-pair-secret");
Console.WriteLine($"Got secret keys: {string.Join(",", secrets.Keys)}");

// Retrieve a single-valued secret - returns a Dictionary<string, string>
// containing a single value with the secret name as the key
var data = await client.GetSecretAsync("mysecretstore", "single-value-secret");
var value = data["single-value-secret"];
Console.WriteLine("Got a secret value, I'm not going to print it, it's a secret!");
var client = new DaprClientBuilder().Build();

// Retrieve a specific set of keys.
var specificItems = await client.GetConfiguration("configstore", new List<string>() { "key1", "key2" });
Console.WriteLine($"Here are my values:\n{specificItems[0].Key} -> {specificItems[0].Value}\n{specificItems[1].Key} -> {specificItems[1].Value}");

// Retrieve all configuration items by providing an empty list.
var configItems = await client.GetConfiguration("configstore", new List<string>());
Console.WriteLine($"I got {configItems.Count} entries!");
foreach (var item in configItems)
{
    Console.WriteLine($"{item.Key} -> {item.Value}");
}
Subscribe to Configuration Keys
var client = new DaprClientBuilder().Build();

// The Subscribe Configuration API returns a wrapper around an IAsyncEnumerable<IEnumerable<ConfigurationItem>>.
// Iterate through it by accessing its Source in a foreach loop. The loop will end when the stream is severed
// or if the cancellation token is cancelled.
var subscribeConfigurationResponse = await client.SubscribeConfiguration(store, keys, metadata, cts.Token);
await foreach (var items in subscribeConfigurationResponse.Source.WithCancellation(cts.Token))
{
    foreach (var item in items)
    {
        Console.WriteLine($"{item.Key} -> {item.Value}");
    }
}
Distributed lock (Alpha)
Acquire a lock
using System;
using System.Threading.Tasks;
using Dapr.Client;

namespace LockService
{
    class Program
    {
        [Obsolete("Distributed Lock API is in Alpha, this can be removed once it is stable.")]
        static async Task Main(string[] args)
        {
            var daprLockName = "lockstore";
            var fileName = "my_file_name";
            var client = new DaprClientBuilder().Build();

            // Locking with this approach will also unlock it automatically, as this is a disposable object
            await using (var fileLock = await client.Lock(daprLockName, fileName, "random_id_abc123", 60))
            {
                if (fileLock.Success)
                {
                    Console.WriteLine("Success");
                }
                else
                {
                    Console.WriteLine($"Failed to lock {fileName}.");
                }
            }
        }
    }
}
This health endpoint returns true when Dapr has initialized all its components, but may not have finished setting up a communication channel with your application.
This is best used when you want to utilize a Dapr component in your startup path, for instance, loading secrets from a secretstore.
The DaprClient also provides a helper method to wait for the sidecar to become healthy (components only). When using this method, it is recommended to include a CancellationToken to
allow the request to time out. Below is an example of how this is used in the DaprSecretStoreConfigurationProvider.
// Wait for the Dapr sidecar to report healthy before attempting to use Dapr components.
using (var tokenSource = new CancellationTokenSource(sidecarWaitTimeout))
{
    await client.WaitForSidecarAsync(tokenSource.Token);
}

// Perform Dapr component operations here, i.e. fetching secrets.
A DaprClient holds access to networking resources in the form of TCP sockets used to communicate with the Dapr sidecar. DaprClient implements IDisposable to support eager cleanup of resources.
Dependency Injection
The AddDaprClient() method will register the Dapr client with ASP.NET Core dependency injection. This method accepts an optional
options delegate for configuring the DaprClient and a ServiceLifetime argument, allowing you to specify a different lifetime
for the registered resources instead of the default Singleton value.
The following example assumes all default values are acceptable and is sufficient to register the DaprClient.
services.AddDaprClient();
The optional configuration delegates are used to configure DaprClient by specifying options on the provided DaprClientBuilder
as in the following example:
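The following is a minimal sketch; the JSON settings and timeout shown are illustrative values, not required defaults.

services.AddDaprClient(daprBuilder =>
{
    daprBuilder.UseJsonSerializationOptions(new JsonSerializerOptions
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
        PropertyNameCaseInsensitive = true
    });
    daprBuilder.UseTimeout(TimeSpan.FromSeconds(30));
});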
Another optional configuration delegate overload provides access to both the DaprClientBuilder and an IServiceProvider,
allowing for more advanced configurations that may require injecting services from the dependency injection container.
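As a sketch, assuming a hypothetical TokenProvider service registered elsewhere in the container:

services.AddDaprClient((serviceProvider, daprBuilder) =>
{
    // TokenProvider is a hypothetical service used only for illustration
    var tokenProvider = serviceProvider.GetRequiredService<TokenProvider>();
    daprBuilder.UseDaprApiToken(tokenProvider.GetToken());
});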
Rather than using dependency injection, a DaprClient can also be built using the static client builder.
For best performance, create a single long-lived instance of DaprClient and provide access to that shared instance throughout your application. DaprClient instances are thread-safe and intended to be shared.
Avoid creating a DaprClient per-operation and disposing it when the operation is complete.
Configuring DaprClient
A DaprClient can be configured by invoking methods on DaprClientBuilder class before calling .Build() to create the client. The settings for each DaprClient object are separate and cannot be changed after calling .Build().
By default, the DaprClientBuilder will prioritize the following locations, in the following order, to source the configuration
values:
The value provided to a method on the DaprClientBuilder (e.g. UseTimeout(TimeSpan.FromSeconds(30)))
The value pulled from an optionally injected IConfiguration matching the name expected in the associated environment variable
The value pulled from the associated environment variable
Default values
Configuring on DaprClientBuilder
The DaprClientBuilder contains the following methods to set configuration options:
UseHttpEndpoint(string): Sets the HTTP endpoint of the Dapr sidecar
UseGrpcEndpoint(string): Sets the gRPC endpoint of the Dapr sidecar
UseGrpcChannelOptions(GrpcChannelOptions): Sets the gRPC channel options used to connect to the Dapr sidecar
UseHttpClientFactory(IHttpClientFactory): Configures the DaprClient to use a registered IHttpClientFactory when building HttpClient instances
UseJsonSerializationOptions(JsonSerializerOptions): Used to configure JSON serialization
UseDaprApiToken(string): Adds the provided token to every request to authenticate to the Dapr sidecar
UseTimeout(TimeSpan): Specifies a timeout value used by the HttpClient when communicating with the Dapr sidecar
Configuring From IConfiguration
Rather than relying on sourcing configuration values directly from environment variables, or because the values are sourced
from dependency-injected services, another option is to make these values available on IConfiguration.
For example, you might be registering your application in a multi-tenant environment and need to prefix the environment
variables used. The following example shows how these values can be sourced from environment variables into your
IConfiguration when their keys are prefixed with test_:
var builder = WebApplication.CreateBuilder(args);

// Retrieves all environment variables that start with "test_" and removes the prefix when sourced from IConfiguration
builder.Configuration.AddEnvironmentVariables("test_");
builder.Services.AddDaprClient();
Configuring From Environment Variables
The SDK will read the following environment variables to configure the default values:
DAPR_HTTP_ENDPOINT: used to find the HTTP endpoint of the Dapr sidecar, example: https://dapr-api.mycompany.com
DAPR_GRPC_ENDPOINT: used to find the gRPC endpoint of the Dapr sidecar, example: https://dapr-grpc-api.mycompany.com
DAPR_HTTP_PORT: if DAPR_HTTP_ENDPOINT is not set, this is used to find the HTTP local endpoint of the Dapr sidecar
DAPR_GRPC_PORT: if DAPR_GRPC_ENDPOINT is not set, this is used to find the gRPC local endpoint of the Dapr sidecar
DAPR_API_TOKEN: used to set the API Token
Note
If both DAPR_HTTP_ENDPOINT and DAPR_HTTP_PORT are specified, the port value from DAPR_HTTP_PORT will be ignored in favor of the port
implicitly or explicitly defined on DAPR_HTTP_ENDPOINT. The same is true of both DAPR_GRPC_ENDPOINT and DAPR_GRPC_PORT.
Configuring gRPC channel options
Dapr’s use of CancellationToken for cancellation relies on the configuration of the gRPC channel options and this is enabled by default. If you need to configure these options yourself, make sure to enable the ThrowOperationCanceledOnCancellation setting.
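The following sketch shows how ThrowOperationCanceledOnCancellation might be kept enabled when supplying your own channel options (GrpcChannelOptions comes from the Grpc.Net.Client package):

var client = new DaprClientBuilder()
    .UseGrpcChannelOptions(new GrpcChannelOptions
    {
        // Keep this enabled so cancelled calls surface as OperationCanceledException
        ThrowOperationCanceledOnCancellation = true
    })
    .Build();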
The APIs on DaprClient that perform asynchronous operations accept an optional CancellationToken parameter. This follows a standard .NET idiom for cancellable operations. Note that when cancellation occurs, there is no guarantee that the remote endpoint stops processing the request, only that the client has stopped waiting for completion.
When an operation is cancelled, it will throw an OperationCanceledException.
Understanding DaprClient JSON serialization
Many methods on DaprClient perform JSON serialization using the System.Text.Json serializer. Methods that accept an application data type as an argument will JSON serialize it, unless the documentation clearly states otherwise.
It is worth reading the System.Text.Json documentation if you have advanced requirements. The Dapr .NET SDK provides no unique serialization behavior or customizations - it relies on the underlying serializer to convert data to and from the application’s .NET types.
DaprClient is configured to use a serializer options object configured from JsonSerializerDefaults.Web. This means that DaprClient will use camelCase for property names, allow reading quoted numbers ("10.99"), and will bind properties case-insensitively. These are the same settings used with ASP.NET Core and the System.Net.Http.Json APIs, and are designed to follow interoperable web conventions.
As of .NET 5.0, System.Text.Json does not have built-in support for all F# language features. If you are using F#, you may want to use one of the converter packages that add support for F#’s features, such as FSharp.SystemTextJson.
Simple guidance for JSON serialization
Your experience using JSON serialization and DaprClient will be smooth if you use a feature set that maps to JSON’s type system. These are general guidelines that will simplify your code where they can be applied.
Avoid inheritance and polymorphism
Do not attempt to serialize data with cyclic references
Do not put complex or expensive logic in constructors or property accessors
Use .NET types that map cleanly to JSON types (numeric types, strings, DateTime)
Create your own classes for top-level messages, events, or state values so you can add properties in the future
Design types with get/set properties OR use the supported pattern for immutable types with JSON
Polymorphism and serialization
The System.Text.Json serializer used by DaprClient uses the declared type of values when performing serialization.
This section will use DaprClient.SaveStateAsync<TValue>(...) in examples, but the advice is applicable to any Dapr building block exposed by the SDK.
public class Widget
{
    public string Color { get; set; }
}

...

// Storing a Widget value as JSON in the state store
Widget widget = new Widget() { Color = "Green", };
await client.SaveStateAsync("mystatestore", "mykey", widget);
In the example above, the type parameter TValue has its type argument inferred from the type of the widget variable. This is important because the System.Text.Json serializer will perform serialization based on the declared type of the value. The result is that the JSON value { "color": "Green" } will be stored.
Consider what happens when you try to use a derived type of Widget:
public class Widget
{
    public string Color { get; set; }
}

public class SuperWidget : Widget
{
    public bool HasSelfCleaningFeature { get; set; }
}

...

// Storing a SuperWidget value as JSON in the state store
Widget widget = new SuperWidget() { Color = "Green", HasSelfCleaningFeature = true, };
await client.SaveStateAsync("mystatestore", "mykey", widget);
In this example we’re using a SuperWidget but the variable’s declared type is Widget. Since the JSON serializer’s behavior is determined by the declared type, it only sees a simple Widget and will save the value { "color": "Green" } instead of { "color": "Green", "hasSelfCleaningFeature": true }.
If you want the properties of SuperWidget to be serialized, then the best option is to override the type argument with object. This will cause the serializer to include all data as it knows nothing about the type.
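For example, reusing the Widget and SuperWidget types above:

Widget widget = new SuperWidget()
{
    Color = "Green",
    HasSelfCleaningFeature = true,
};

// Passing object as the type argument makes the serializer emit all properties of the runtime type
await client.SaveStateAsync<object>("mystatestore", "mykey", widget);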
In this guide, you will learn how to use IActorProxyFactory.
Tip
For a non-dependency-injected application, you can use the static methods on ActorProxy. These methods are error prone when configuring custom settings, so avoid using them when possible.
Identifying an actor
All of the APIs on IActorProxyFactory will require an actor type and actor id to communicate with an actor. For strongly-typed clients, you also need one of its interfaces.
Actor type uniquely identifies the actor implementation across the whole application.
Actor id uniquely identifies an instance of that type.
If you don’t have an actor id and want to communicate with a new instance, create a random id with ActorId.CreateRandom(). Since the random id is a cryptographically strong identifier, the runtime will create a new actor instance when you interact with it.
You can use the type ActorReference to exchange an actor type and actor id with other actors as part of messages.
Two styles of actor client
The actor client supports two different styles of invocation:
Strongly-typed: Strongly-typed clients are based on .NET interfaces and provide the typical benefits of strong-typing. They don’t work with non-.NET actors.
Weakly-typed: Weakly-typed clients use the ActorProxy class. It is recommended to use these only when required for interop or other advanced reasons.
Using a strongly-typed client
The following example uses the CreateActorProxy<> method to create a strongly-typed client. CreateActorProxy<> requires an actor interface type, and will return an instance of that interface.
// Create a proxy for IOtherActor to type OtherActor with a random id
var proxy = this.ProxyFactory.CreateActorProxy<IOtherActor>(ActorId.CreateRandom(), "OtherActor");

// Invoke a method defined by the interface to invoke the actor
//
// proxy is an implementation of IOtherActor so we can invoke its methods directly
await proxy.DoSomethingGreat();
Using a weakly-typed client
The following example uses the Create method to create a weakly-typed client. Create returns an instance of ActorProxy.
// Create a proxy for type OtherActor with a random id
var proxy = this.ProxyFactory.Create(ActorId.CreateRandom(), "OtherActor");

// Invoke a method by name to invoke the actor
//
// proxy is an instance of ActorProxy.
await proxy.InvokeMethodAsync("DoSomethingGreat");
Since ActorProxy is a weakly-typed proxy, you need to pass in the actor method name as a string.
You can also use ActorProxy to invoke methods with both a request and a response message. Request and response messages will be serialized using the System.Text.Json serializer.
// Create a proxy for type OtherActor with a random id
var proxy = this.ProxyFactory.Create(ActorId.CreateRandom(), "OtherActor");

// Invoke a method on the proxy to invoke the actor
//
// proxy is an instance of ActorProxy.
var request = new MyRequest() { Message = "Hi, it's me.", };
var response = await proxy.InvokeMethodAsync<MyRequest, MyResponse>("DoSomethingGreat", request);
When using a weakly-typed proxy, you must proactively define the correct actor method names and message types. When using a strongly-typed proxy, these names and types are defined for you as part of the interface definition.
Actor method invocation exception details
The actor method invocation exception details are surfaced to the caller and the callee, providing an entry point to track down the issue. Exception details include:
Method name
Line number
Exception type
UUID
You use the UUID to match the exception on the caller and callee side. Below is an example of exception details:
Dapr.Actors.ActorMethodInvocationException: Remote Actor Method Exception, DETAILS: Exception: NotImplementedException, Method Name: ExceptionExample, Line Number: 14, Exception uuid: d291a006-84d5-42c4-b39e-d6300e9ac38b
Learn all about authoring and running actors with the .NET SDK
Author actors
ActorHost
The ActorHost:
Is a required constructor parameter of all actors
Is provided by the runtime
Must be passed to the base class constructor
Contains all of the state that allows that actor instance to communicate with the runtime
internal class MyActor : Actor, IMyActor, IRemindable
{
    public MyActor(ActorHost host) // Accept ActorHost in the constructor
        : base(host) // Pass ActorHost to the base class constructor
    {
    }
}
Since the ActorHost contains state unique to the actor, you don’t need to pass the instance into other parts of your code. It’s recommended to create your own instances of ActorHost only in tests.
Dependency injection
Actors support dependency injection of additional parameters into the constructor. Any other parameters you define will have their values satisfied from the dependency injection container.
internal class MyActor : Actor, IMyActor, IRemindable
{
    public MyActor(ActorHost host, BankService bank) // Accept BankService in the constructor
        : base(host)
    {
        ...
    }
}
An actor type should have a single public constructor. The actor infrastructure uses the ActivatorUtilities pattern for constructing actor instances.
// In Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    ...

    // Register additional types with dependency injection.
    services.AddSingleton<BankService>();
}
Each actor instance has its own dependency injection scope and remains in memory for some time after performing an operation. During that time, the dependency injection scope associated with the actor is also considered live. The scope will be released when the actor is deactivated.
If an actor injects an IServiceProvider in the constructor, the actor will receive a reference to the IServiceProvider associated with its scope. The IServiceProvider can be used to resolve services dynamically in the future.
internal class MyActor : Actor, IMyActor, IRemindable
{
    public MyActor(ActorHost host, IServiceProvider services) // Accept IServiceProvider in the constructor
        : base(host)
    {
        ...
    }
}
When using this pattern, avoid creating many instances of transient services which implement IDisposable. Since the scope associated with an actor could be considered valid for a long time, you can accumulate many services in memory. See the dependency injection guidelines for more information.
IDisposable and actors
Actors can implement IDisposable or IAsyncDisposable. It’s recommended that you rely on dependency injection for resource management rather than implementing dispose functionality in application code. Dispose support is provided in the rare case where it is truly necessary.
Logging
Inside an actor class, you have access to an ILogger instance through a property on the base Actor class. This instance is connected to the ASP.NET Core logging system and should be used for all logging inside an actor. Read more about logging. You can configure a variety of different logging formats and output sinks.
Use structured logging with named placeholders like the example below:
public Task<MyData> GetDataAsync()
{
    this.Logger.LogInformation("Getting state at {CurrentTime}", DateTime.UtcNow);
    return this.StateManager.GetStateAsync<MyData>("my_data");
}
When logging, avoid using format strings like: $"Getting state at {DateTime.UtcNow}"
Logging should use the named placeholder syntax which offers better performance and integration with logging systems.
Using an explicit actor type name
By default, the type of the actor, as seen by clients, is derived from the name of the actor implementation class. The default name will be the class name (without namespace).
If desired, you can specify an explicit type name by attaching an ActorAttribute attribute to the actor implementation class.
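A minimal sketch of the attribute usage (the IMyActor interface is assumed from earlier examples):

[Actor(TypeName = "MyCustomActorTypeName")]
internal class MyActor : Actor, IMyActor
{
    // ...
}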
In the example above, the name will be MyCustomActorTypeName.
No change is needed to the code that registers the actor type with the runtime; providing the value via the attribute is all that is required.
Host actors on the server
Registering actors
Actor registration is part of ConfigureServices in Startup.cs. You can register services with dependency injection via the ConfigureServices method. Registering the set of actor types is part of the registration of actor services.
Inside ConfigureServices you can:
Register the actor runtime (AddActors)
Register actor types (options.Actors.RegisterActor<>)
Configure actor runtime settings options
Register additional service types for dependency injection into actors (services)
// In Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    // Register actor runtime with DI
    services.AddActors(options =>
    {
        // Register actor types and configure actor settings
        options.Actors.RegisterActor<MyActor>();

        // Configure default settings
        options.ActorIdleTimeout = TimeSpan.FromMinutes(10);
        options.ActorScanInterval = TimeSpan.FromSeconds(35);
        options.DrainOngoingCallTimeout = TimeSpan.FromSeconds(35);
        options.DrainRebalancedActors = true;
    });

    // Register additional services for use with actors
    services.AddSingleton<BankService>();
}
You can configure the JsonSerializerOptions as part of ConfigureServices:
// In Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    services.AddActors(options =>
    {
        ...

        // Customize JSON options
        options.JsonSerializerOptions = ...
    });
}
Actors and routing
The ASP.NET Core hosting support for actors uses the endpoint routing system. The .NET SDK provides no support for hosting actors with the legacy routing system from early ASP.NET Core releases.
Since actors use endpoint routing, the actors HTTP handler is part of the middleware pipeline. The following is a minimal example of a Configure method setting up the middleware pipeline with actors.
// in Startup.cs
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    app.UseRouting();

    app.UseEndpoints(endpoints =>
    {
        // Register actors handlers that interface with the Dapr runtime.
        endpoints.MapActorsHandlers();
    });
}
The UseRouting and UseEndpoints calls are necessary to configure routing. Configure actors as part of the pipeline by adding MapActorsHandlers inside the endpoint middleware.
This is a minimal example; it’s valid for Actors functionality to exist alongside:
Controllers
Razor Pages
Blazor
gRPC Services
Dapr pub/sub handler
other endpoints such as health checks
Problematic middleware
Certain middleware may interfere with the routing of Dapr requests to the actors handlers. In particular, the UseHttpsRedirection middleware is problematic for Dapr’s default configuration. Dapr sends requests over unencrypted HTTP by default, which the UseHttpsRedirection middleware will block. This middleware cannot be used with Dapr at this time.
// in Startup.cs
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    // INVALID - this will block non-HTTPS requests
    app.UseHttpsRedirection();

    app.UseRouting();

    app.UseEndpoints(endpoints =>
    {
        // Register actors handlers that interface with the Dapr runtime.
        endpoints.MapActorsHandlers();
    });
}
Necessary steps to serialize your types for remoted and non-remoted Actors in .NET
Actor Serialization
The Dapr actor package enables you to use Dapr virtual actors within a .NET application with either a weakly- or strongly-typed client. Each utilizes a different serialization approach. This document will review the differences and convey a few key ground rules to understand in either scenario.
Please be advised that it is not a supported scenario to use the weakly- or strongly typed actor clients interchangeably because of these different serialization approaches. The data persisted using one Actor client will not be accessible using the other Actor client, so it is important to pick one and use it consistently throughout your application.
Weakly-typed Dapr Actor client
In this section, you will learn how to configure your C# types so they are properly serialized and deserialized at runtime when using a weakly-typed actor client. These clients use string-based names of methods with request and response payloads that are serialized using the System.Text.Json serializer. Please note that this serialization framework is not specific to Dapr and is separately maintained by the .NET team within the .NET GitHub repository.
When using the weakly-typed Dapr Actor client to invoke methods from your various actors, it’s not necessary to independently serialize or deserialize the method payloads as this will happen transparently on your behalf by the SDK.
The client will use the latest version of System.Text.Json available for the version of .NET you’re building against and serialization is subject to all the inherent capabilities provided in the associated .NET documentation.
The serializer will be configured to use the JsonSerializerOptions.Web default options unless overridden with a custom options configuration, which means the following are applied:
Deserialization of the property name is performed in a case-insensitive manner
Serialization of the property name is performed using camel casing unless the property is overridden with a [JsonPropertyName] attribute
Deserialization will read numeric values from number and/or string values
Basic Serialization
In the following example, we present a simple class named Doodad, though it could just as well be a record.
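A minimal sketch of such a type (the member names are illustrative):

public class Doodad
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public int Count { get; set; }
}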
The default property names can be overridden by applying the [JsonPropertyName] attribute to desired properties.
Generally, this isn’t going to be necessary for types you’re persisting to the actor state, as you’re not expected to read or write them independently of Dapr-associated functionality, but
the following is provided just to clearly illustrate that it’s possible.
Override Property Names on Classes
Here’s an example demonstrating the use of JsonPropertyName to change the name for the first property following serialization. Note that the last usage of JsonPropertyName on the Count property
matches what it would be expected to serialize to. This is largely just to demonstrate that applying this attribute won’t negatively impact anything - in fact, it might be preferable if you later
decide to change the default serialization options but still need to consistently access the properties previously serialized before that change as JsonPropertyName will override those options.
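The following sketch reuses the hypothetical Doodad type from above, renaming the first property and explicitly mapping Count to the name it would receive by default anyway:

public class Doodad
{
    [JsonPropertyName("identifier")]
    public Guid Id { get; set; }

    public string Name { get; set; }

    [JsonPropertyName("count")]
    public int Count { get; set; }
}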
Because the argument passed in a primary constructor (introduced in C# 12) can be applied to either a property or field within a record, using the [JsonPropertyName] attribute may
require specifying that you intend the attribute to apply to a property and not a field in some ambiguous cases. Should this be necessary, you’d indicate as much in the primary constructor with:
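For example, on a record using a primary constructor:

public record Doodad(
    [property: JsonPropertyName("identifier")] Guid Id,
    string Name,
    int Count);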
If [property: ] is applied to the [JsonPropertyName] attribute where it’s not necessary, it will not negatively impact serialization or deserialization as the operation will
proceed normally as though it were a property (as it typically would if not marked as such).
Enumeration types
Enumerations, including flag enumerations, are serializable to JSON, but the value persisted may surprise you. Again, it’s not expected that the developer should ever engage
with the serialized data independently of Dapr, but the following information may at least help in diagnosing why a seemingly mild version migration isn’t working as expected.
Take the following enum type providing the various seasons in the year:
public enum Season
{
    Spring,
    Summer,
    Fall,
    Winter
}
We’ll go ahead and use a separate demonstration type that references our Season and simultaneously illustrate how this works with records:
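A sketch of such a demonstration record (the Engagement name is illustrative), along with the JSON produced for a Season.Winter value:

using System.Text.Json;

var myEngagement = new Engagement("Ski Trip", Season.Winter);
var options = new JsonSerializerOptions(JsonSerializerDefaults.Web);
Console.WriteLine(JsonSerializer.Serialize(myEngagement, options)); // prints {"name":"Ski Trip","season":3}

public record Engagement(string Name, Season Season);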
It might be unexpected that our Season.Winter value is represented as a 3, but this is because the serializer automatically uses numeric representations
of the enum values, starting with zero for the first value and incrementing the numeric value for each additional value. Again, if a migration were taking place and
a developer had flipped the order of the enums, this would introduce a breaking change in your solution, as the serialized numeric values would point to different values when deserialized.
Rather, there is a JsonConverter available with System.Text.Json that will opt to use a string-based value instead of the numeric value. The [JsonConverter] attribute needs
to be applied to the enum type itself to enable this, but will then be realized in any downstream serialization or deserialization operation that references the enum.
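A sketch of the Season enum decorated with the string-based converter (the generic JsonStringEnumConverter<T> requires .NET 8 or later; on earlier versions the non-generic JsonStringEnumConverter can be used):

using System.Text.Json.Serialization;

[JsonConverter(typeof(JsonStringEnumConverter<Season>))]
public enum Season
{
    Spring,
    Summer,
    Fall,
    Winter
}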
Using the same values from our myEngagement instance above, this would produce the following JSON instead:
{"name":"Ski Trip","season":"Winter"}
As a result, the enum members can be shifted around without fear of introducing errors during deserialization.
Custom Enumeration Values
The System.Text.Json serialization platform doesn’t, out of the box, support the use of [EnumMember] to allow you to change the value of the enum that’s used during serialization or deserialization, but
there are scenarios where this could be useful. Again, assume that you’re tasked with refactoring the solution to apply some better names to your various
enums. You’re using the JsonStringEnumConverter<TType> detailed above, so you’re saving the name of the enum value instead of a numeric value, but if you change
the enum name, that will introduce a breaking change as the name will no longer match what’s in state.
Do note that if you opt into using this approach, you should decorate all your enum members with the [EnumMember] attribute so that the values are consistently applied for each enum value instead
of haphazardly. Nothing will validate this at build or runtime, but it is considered a best practice.
How can you specify the precise value persisted while still changing the name of the enum member in this scenario? Use a custom JsonConverter with an extension method that can pull the value
out of the attached [EnumMember] attributes where provided. Add the following to your solution:
public sealed class EnumMemberJsonConverter<T> : JsonConverter<T> where T : struct, Enum
{
    /// <summary>Reads and converts the JSON to type <typeparamref name="T" />.</summary>
    /// <param name="reader">The reader.</param>
    /// <param name="typeToConvert">The type to convert.</param>
    /// <param name="options">An object that specifies serialization options to use.</param>
    /// <returns>The converted value.</returns>
    public override T Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        // Get the string value from the JSON reader
        var value = reader.GetString();

        // Loop through all the enum values
        foreach (var enumValue in Enum.GetValues<T>())
        {
            // Get the value from the EnumMember attribute, if any
            var enumMemberValue = GetValueFromEnumMember(enumValue);

            // If the values match, return the enum value
            if (value == enumMemberValue)
            {
                return enumValue;
            }
        }

        // If no match found, throw an exception
        throw new JsonException($"Invalid value for {typeToConvert.Name}: {value}");
    }

    /// <summary>Writes a specified value as JSON.</summary>
    /// <param name="writer">The writer to write to.</param>
    /// <param name="value">The value to convert to JSON.</param>
    /// <param name="options">An object that specifies serialization options to use.</param>
    public override void Write(Utf8JsonWriter writer, T value, JsonSerializerOptions options)
    {
        // Get the value from the EnumMember attribute, if any
        var enumMemberValue = GetValueFromEnumMember(value);

        // Write the value to the JSON writer
        writer.WriteStringValue(enumMemberValue);
    }

    private static string GetValueFromEnumMember(T value)
    {
        MemberInfo[] member = typeof(T).GetMember(value.ToString(), BindingFlags.DeclaredOnly | BindingFlags.Static | BindingFlags.Public);
        if (member.Length == 0)
            return value.ToString();
        object[] customAttributes = member[0].GetCustomAttributes(typeof(EnumMemberAttribute), false);
        if (customAttributes.Length != 0)
        {
            EnumMemberAttribute enumMemberAttribute = (EnumMemberAttribute)customAttributes[0];
            if (enumMemberAttribute != null && enumMemberAttribute.Value != null)
                return enumMemberAttribute.Value;
        }
        return value.ToString();
    }
}
Now let’s add a sample enumeration. We’ll set a value that uses the lower-case version of each enum member to demonstrate this. Don’t forget to decorate the enum with the JsonConverter
attribute and reference our custom converter in place of the numeral-to-string converter used in the last section.
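The following sketch is consistent with the JSON output shown below; the Engagement record and its Event value are illustrative:

using System.Runtime.Serialization;
using System.Text.Json.Serialization;

[JsonConverter(typeof(EnumMemberJsonConverter<Season>))]
public enum Season
{
    [EnumMember(Value = "spring")]
    Spring,
    [EnumMember(Value = "summer")]
    Summer,
    [EnumMember(Value = "fall")]
    Fall,
    [EnumMember(Value = "winter")]
    Winter
}

public record Engagement(string Event, Season Season);

// elsewhere: var myEngagement = new Engagement("Conference", Season.Fall);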
This time, serialization will take into account the values from the attached [EnumMember] attribute providing us a mechanism to refactor our application without necessitating
a complex versioning scheme for our existing enum values in the state.
{"event":"Conference","season":"fall"}
Polymorphic Serialization
When working with polymorphic types in Dapr Actor clients, it is essential to handle serialization and deserialization correctly to ensure that the appropriate
derived types are instantiated. Polymorphic serialization allows you to serialize objects of a base type while preserving the specific derived type information.
To enable polymorphic deserialization, you must use the [JsonPolymorphic] attribute on your base type. Additionally,
it is crucial to include the [AllowOutOfOrderMetadataProperties] attribute to ensure that metadata properties, such as $type
can be processed correctly by System.Text.Json even if they are not the first properties in the JSON object.
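A sketch of what this could look like; the derived record names and type discriminators are illustrative, and the [AllowOutOfOrderMetadataProperties] attribute is the one referenced above:

[JsonPolymorphic(TypeDiscriminatorPropertyName = "$type")]
[AllowOutOfOrderMetadataProperties]
[JsonDerivedType(typeof(SampleIntegerValue), typeDiscriminator: "int")]
[JsonDerivedType(typeof(SampleStringValue), typeDiscriminator: "str")]
public abstract record SampleValueBase;

public sealed record SampleIntegerValue(int Value) : SampleValueBase;

public sealed record SampleStringValue(string Value) : SampleValueBase;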
In this example, the SampleValueBase class is marked with both [JsonPolymorphic] and [AllowOutOfOrderMetadataProperties]
attributes. This setup ensures that the $type metadata property can be correctly identified and processed during
deserialization, regardless of its position in the JSON object.
By following this approach, you can effectively manage polymorphic serialization and deserialization in your Dapr Actor
clients, ensuring that the correct derived types are instantiated and used.
Strongly-typed Dapr Actor client
In this section, you will learn how to configure your classes and records so they are properly serialized and deserialized at runtime when using a strongly-typed actor client. These clients are implemented using .NET interfaces and are not compatible with Dapr Actors written using other languages.
This actor client serializes data using an engine called the Data Contract Serializer which converts your C# types to and from XML documents. This serialization framework is not specific to Dapr and is separately maintained by the .NET team within the .NET GitHub repository.
When sending or receiving primitives (like strings or ints), this serialization happens transparently and there’s no requisite preparation needed on your part. However, when working with complex types such as those you create, there are some important rules to take into consideration so this process works smoothly.
Serializable Types
There are several important considerations to keep in mind when using the Data Contract Serializer:
By default, all types, read/write properties (after construction) and fields marked as publicly visible are serialized
All types must either expose a public parameterless constructor or be decorated with the DataContractAttribute attribute
Init-only setters are only supported with the use of the DataContractAttribute attribute
Read-only fields, properties without both a Get and Set method, and internal properties or properties with private Get and Set methods are ignored during serialization
Serialization is supported for types that use other complex types that are not themselves marked with the DataContractAttribute attribute through the use of the KnownTypesAttribute attribute
If a type is marked with the DataContractAttribute attribute, all members you wish to serialize and deserialize must be decorated with the DataMemberAttribute attribute as well or they’ll be set to their default values
How does deserialization work?
The approach used for deserialization depends on whether or not the type is decorated with the DataContractAttribute attribute. If this attribute isn’t present, an instance of the type is created using the parameterless constructor. Each of the properties and fields are then mapped into the type using their respective setters and the instance is returned to the caller.
If the type is marked with [DataContract], the serializer instead uses reflection to read the metadata of the type and determine which properties or fields should be included based on whether or not they’re marked with the DataMemberAttribute attribute as it’s performed on an opt-in basis. It then allocates an uninitialized object in memory (avoiding the use of any constructors, parameterless or not) and then sets the value directly on each mapped property or field, even if private or uses init-only setters. Serialization callbacks are invoked as applicable throughout this process and then the object is returned to the caller.
Use of the serialization attributes is highly recommended as they grant more flexibility to override names and namespaces and generally use more of the modern C# functionality. While the default serializer can be relied on for primitive types, it’s not recommended for any of your own types, whether they be classes, structs or records. It’s recommended that if you decorate a type with the DataContractAttribute attribute, you also explicitly decorate each of the members you want to serialize or deserialize with the DataMemberAttribute attribute as well.
.NET Classes
Classes are fully supported in the Data Contract Serializer provided that the other rules detailed on this page and the Data Contract Serializer documentation are also followed.
The most important thing to remember here is that you must either have a public parameterless constructor or you must decorate it with the appropriate attributes. Let’s review some examples to really clarify what will and won’t work.
In the following example, we present a simple class named Doodad. We don’t provide an explicit constructor here, so the compiler will provide a default parameterless constructor. Because we’re using supported primitive types (Guid, string and int) and all our members have a public getter and setter, no attributes are required and we’ll be able to use this class without issue when sending and receiving it from a Dapr actor method.
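A minimal sketch of such a type (the member names are illustrative):

public class Doodad
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public int Count { get; set; }
}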
So let’s tweak it - let’s add our own constructor and only use init-only setters on the members. This will fail to serialize and deserialize not because of the use of the init-only setters, but because there’s no parameterless constructor.
// WILL NOT SERIALIZE PROPERLY!
public class Doodad
{
    public Doodad(string name, int count)
    {
        Id = Guid.NewGuid();
        Name = name;
        Count = count;
    }

    public Guid Id { get; set; }
    public string Name { get; init; }
    public int Count { get; init; }
}
If we add a public parameterless constructor to the type, we’re good to go and this will work without further annotations.
But what if we don’t want to add this constructor? Perhaps you don’t want your developers to accidentally create an instance of this Doodad using an unintended constructor. That’s where the more flexible attributes are useful. If you decorate your type with a DataContractAttribute attribute, you can drop your parameterless constructor and it will work once again.
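A sketch of that change, keeping the original constructor, dropping the parameterless one and adding the attribute (per the note that follows, the DataMemberAttribute attributes aren't strictly required here because only supported primitives are used):

[DataContract]
public class Doodad
{
    public Doodad(string name, int count)
    {
        Id = Guid.NewGuid();
        Name = name;
        Count = count;
    }

    public Guid Id { get; set; }
    public string Name { get; init; }
    public int Count { get; init; }
}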
In the above example, we don’t need to also use the DataMemberAttribute attributes because again, we’re using built-in primitives that the serializer supports. But, we do get more flexibility if we use the attributes. From the DataContractAttribute attribute, we can specify our own XML namespace with the Namespace argument and, via the Name argument, change the name of the type as used when serialized into the XML document.
It’s a recommended practice to append the DataContractAttribute attribute to the type and the DataMemberAttribute attributes to all the members you want to serialize anyway - if they’re not necessary and you’re not changing the default values, they’ll just be ignored, but they give you a mechanism to opt into serializing members that wouldn’t otherwise have been included such as those marked as private or that are themselves complex types or collections.
Note that if you do opt into serializing your private members, their values will be serialized into plain text - they can very well be viewed, intercepted and potentially manipulated based on how you’re handling the data once serialized, so it’s an important consideration whether you want to mark these members or not in your use case.
In the following example, we’ll look at using the attributes to change the serialized names of some of the members as well as introduce the IgnoreDataMemberAttribute attribute. As the name indicates, this tells the serializer to skip this property even though it’d be otherwise eligible to serialize. Further, because I’m decorating the type with the DataContractAttribute attribute, it means that I can use init-only setters on the properties.
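A sketch of such a type; the serialized names and the Notes member are illustrative:

[DataContract]
public class Doodad
{
    [DataMember(Name = "identifier")]
    public Guid Id { get; init; }

    [DataMember(Name = "label")]
    public string Name { get; init; }

    [DataMember]
    public int Count { get; init; }

    // Skipped by the serializer even though it would otherwise be eligible
    [IgnoreDataMember]
    public string Notes { get; set; }
}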
When this is serialized, because we’re changing the names of the serialized members, we can expect a new instance of Doodad using the default values to be serialized as:
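Roughly the following XML, assuming the hypothetical type above lives in a CLR namespace called MyApp (data members are ordered alphabetically by their serialized names by default):

<Doodad xmlns="http://schemas.datacontract.org/2004/07/MyApp" xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
  <Count>0</Count>
  <identifier>00000000-0000-0000-0000-000000000000</identifier>
  <label i:nil="true" />
</Doodad>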
C# 12 brought us primary constructors on classes. Use of a primary constructor means the compiler will be prevented from creating the default implicit parameterless constructor. While a primary constructor on a class doesn’t generate any public properties, it does mean that if you pass this primary constructor any arguments or have non-primitive types in your class, you’ll either need to specify your own parameterless constructor or use the serialization attributes.
Here’s an example where we’re using the primary constructor to inject an ILogger to a field and add our own parameterless constructor without the need for any attributes.
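A sketch of that pattern; NullLogger comes from Microsoft.Extensions.Logging.Abstractions and is used here only to give the parameterless constructor something to pass along:

using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;

public class Doodad(ILogger<Doodad> logger)
{
    private readonly ILogger<Doodad> _logger = logger;

    // The primary constructor suppresses the implicit parameterless constructor,
    // so provide one explicitly for the serializer
    public Doodad() : this(NullLogger<Doodad>.Instance)
    {
    }

    public Guid Id { get; set; }
    public string Name { get; set; }
    public int Count { get; set; }
}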
Structs are supported by the Data Contract serializer provided that they are marked with the DataContractAttribute attribute and the members you wish to serialize are marked with the DataMemberAttribute attribute. Further, to support deserialization, the struct will also need to have a parameterless constructor. This works even if you define your own parameterless constructor as enabled in C# 10.
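A minimal sketch of a serializable struct under these rules (the name and members are illustrative):

[DataContract]
public struct Dimensions
{
    [DataMember]
    public double Height { get; set; }

    [DataMember]
    public double Width { get; set; }
}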
Records were introduced in C# 9 and follow precisely the same rules as classes when it comes to serialization. We recommend that you decorate all your records with the DataContractAttribute attribute and members you wish to serialize with DataMemberAttribute attributes so you don’t experience any deserialization issues using this or other newer C# functionality. Because record classes use init-only setters for properties by default and encourage the use of the primary constructor, applying these attributes to your types ensures that the serializer can properly accommodate your types as-is.
Typically records are presented as a simple one-line statement using the new primary constructor concept:
public record Doodad(Guid Id, string Name, int Count);
This will throw an error encouraging the use of the serialization attributes as soon as you use it in a Dapr actor method invocation because there’s no parameterless constructor available nor is it decorated with the aforementioned attributes.
Here we add an explicit parameterless constructor and it won’t throw an error, but none of the values will be set during deserialization since they’re created with init-only setters. Because this doesn’t use the DataContractAttribute attribute or the DataMemberAttribute attribute on any members, the serializer will be unable to map the target members correctly during deserialization.
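A sketch of what that looks like; it compiles and deserializes without error, but the members come back with their default values:

// Deserializes without error, but Id, Name and Count will be left at their default values
public record Doodad(Guid Id, string Name, int Count)
{
    public Doodad() : this(default, default, default)
    {
    }
}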
This approach does without the additional constructor and instead relies on the serialization attributes. Because we mark the type with the DataContractAttribute attribute and decorate each member with its own DataMemberAttribute attribute, the serialization engine will be able to map from the XML document to our type without issue.
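A sketch of that attributed form, applying the attributes to the properties generated by the primary constructor:

[DataContract]
public record Doodad(
    [property: DataMember] Guid Id,
    [property: DataMember] string Name,
    [property: DataMember] int Count);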
There are several types built into .NET that are considered primitive (for example, int, string, bool, double, decimal, DateTime, TimeSpan, Guid and Uri) and eligible for serialization without additional effort on the part of the developer.
Again, if you want to pass these types around via your actor methods, no additional consideration is necessary as they’ll be serialized and deserialized without issue. Further, types that are themselves marked with the SerializableAttribute attribute (https://learn.microsoft.com/en-us/dotnet/api/system.serializableattribute) will be serialized.
Enumeration Types
Enumerations, including flag enumerations are serializable if appropriately marked. The enum members you wish to be serialized must be marked with the EnumMemberAttribute attribute in order to be serialized. Passing a custom value into the optional Value argument on this attribute will allow you to specify the value used for the member in the serialized document instead of having the serializer derive it from the name of the member.
The enum type does not require that the type be decorated with the DataContractAttribute attribute - only that the members you wish to serialize be decorated with the EnumMemberAttribute attributes.
public enum Colors
{
    [EnumMember]
    Red,
    [EnumMember(Value = "g")]
    Green,
    Blue, // Even if used by a type, this value will not be serialized as it's not decorated with the EnumMember attribute
}
Collection Types
With regard to the data contract serializer, all collection types that implement the IEnumerable interface, including arrays and generic collections, are considered collections. Those types that implement IDictionary or the generic IDictionary<TKey, TValue> are considered dictionary collections; all others are list collections.
Not unlike other complex types, collection types must have a parameterless constructor available. Further, they must also have a method called Add so they can be properly serialized and deserialized. The types used by these collection types must themselves be marked with the DataContractAttribute attribute or otherwise be serializable as described throughout this document.
Data Contract Versioning
As the data contract serializer is only used in Dapr with respect to serializing the values in the .NET SDK to and from the Dapr actor instances via the proxy methods, there’s little need to consider versioning of data contracts as the data isn’t being persisted between application versions using the same serializer. For those interested in learning more about data contract versioning visit here.
Known Types
Nesting your own complex types is easily accommodated by marking each of the types with the DataContractAttribute attribute. This informs the serializer as to how deserialization should be performed.
But what if you’re working with polymorphic types and one of your members is a base class or interface with derived classes or other implementations? Here, you’ll use the KnownTypeAttribute attribute to give a hint to the serializer about how to proceed.
When you apply the KnownTypeAttribute attribute to a type, you are informing the data contract serializer about what subtypes it might encounter allowing it to properly handle the serialization and deserialization of these types, even when the actual type at runtime is different from the declared type.
[DataContract]
[KnownType(typeof(DerivedClass))]
public class BaseClass
{
//Members of the base class
}
[DataContract]
public class DerivedClass : BaseClass
{
//Additional members of the derived class
}
In this example, the BaseClass is marked with [KnownType(typeof(DerivedClass))], which tells the data contract serializer that DerivedClass is a possible implementation of BaseClass that it may need to serialize or deserialize. Without this attribute, the serializer would not be aware of DerivedClass when it encounters an instance of BaseClass that is actually of type DerivedClass, and this could lead to a serialization exception because the serializer would not know how to handle the derived type. By specifying all possible derived types as known types, you ensure that the serializer can process the type and its members correctly.
For more information and examples about using [KnownType], please refer to the official documentation.
How to: Run and use virtual actors in the .NET SDK
Try out .NET Dapr virtual actors with this example
The Dapr actor package allows you to interact with Dapr virtual actors from a .NET application. In this guide, you learn how to:
The interface project (\MyActor\MyActor.Interfaces)
This project contains the interface definition for the actor. Actor interfaces can be defined in any project with any name. The interface defines the actor contract shared by:
The actor implementation
The clients calling the actor
Because client projects may depend on it, it’s better to define it in an assembly separate from the actor implementation.
The actor service project (\MyActor\MyActorService)
This project implements the ASP.Net Core web service that hosts the actor. It contains the implementation of the actor, MyActor.cs. An actor implementation is a class that:
Derives from the base type Actor
Implements the interfaces defined in the MyActor.Interfaces project.
An actor class must also implement a constructor that accepts an ActorHost instance and passes it to the base Actor class.
The actor client project (\MyActor\MyActorClient)
This project contains the implementation of the actor client which calls MyActor’s method defined in Actor Interfaces.
Since we’ll be creating 3 projects, choose an empty directory to start from, and open it in your terminal of choice.
Step 1: Create actor interfaces
Actor interface defines the actor contract that is shared by the actor implementation and the clients calling the actor.
Actor interface is defined with the below requirements:
Actor interface must inherit Dapr.Actors.IActor interface
The return type of an actor method must be Task or Task<T>
An actor method can have at most one argument
Create interface project and add dependencies
# Create Actor Interfaces
dotnet new classlib -o MyActor.Interfaces
cd MyActor.Interfaces
# Add Dapr.Actors nuget package. Please use the latest package version from nuget.org
dotnet add package Dapr.Actors
cd ..
Implement IMyActor interface
Define IMyActor interface and MyData data object. Paste the following code into MyActor.cs in the MyActor.Interfaces project.
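The following is a sketch consistent with the implementation and client shown later in this guide; the exact members may differ from the official sample:

using System.Threading.Tasks;
using Dapr.Actors;
using Dapr.Actors.Runtime;

namespace MyActor.Interfaces
{
    public interface IMyActor : IActor
    {
        Task<string> SetDataAsync(MyData data);
        Task<MyData> GetDataAsync();
        Task RegisterReminder();
        Task<IActorReminder> GetReminder();
        Task UnregisterReminder();
        Task RegisterTimer();
        Task UnregisterTimer();
    }

    public class MyData
    {
        public string PropertyA { get; set; }
        public string PropertyB { get; set; }

        public override string ToString()
        {
            return $"PropertyA: {PropertyA ?? "null"}, PropertyB: {PropertyB ?? "null"}";
        }
    }
}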
Step 2: Implement the actor service
Dapr uses an ASP.NET Core web service to host the actor service. This section implements the IMyActor actor interface and registers the actor with the Dapr runtime.
Create actor service project and add dependencies
# Create ASP.NET web service to host the Dapr actor
dotnet new web -o MyActorService
cd MyActorService
# Add Dapr.Actors.AspNetCore nuget package. Please use the latest package version from nuget.org
dotnet add package Dapr.Actors.AspNetCore
# Add Actor Interface reference
dotnet add reference ../MyActor.Interfaces/MyActor.Interfaces.csproj
cd ..
Add actor implementation
Implement the IMyActor interface and derive from the Dapr.Actors.Actor class. The following example also shows how to use actor reminders. For actors to use reminders, they must derive from IRemindable. If you don’t intend to use the reminder feature, you can skip implementing IRemindable and the reminder-specific methods shown in the code below.
Paste the following code into MyActor.cs in the MyActorService project:
using Dapr.Actors;
using Dapr.Actors.Runtime;
using MyActor.Interfaces;
using System;
using System.Threading.Tasks;

namespace MyActorService
{
    internal class MyActor : Actor, IMyActor, IRemindable
    {
        // The constructor must accept ActorHost as a parameter, and can also accept additional
        // parameters that will be retrieved from the dependency injection container
        //
        /// <summary>
        /// Initializes a new instance of MyActor
        /// </summary>
        /// <param name="host">The Dapr.Actors.Runtime.ActorHost that will host this actor instance.</param>
        public MyActor(ActorHost host)
            : base(host)
        {
        }

        /// <summary>
        /// This method is called whenever an actor is activated.
        /// An actor is activated the first time any of its methods are invoked.
        /// </summary>
        protected override Task OnActivateAsync()
        {
            // Provides opportunity to perform some optional setup.
            Console.WriteLine($"Activating actor id: {this.Id}");
            return Task.CompletedTask;
        }

        /// <summary>
        /// This method is called whenever an actor is deactivated after a period of inactivity.
        /// </summary>
        protected override Task OnDeactivateAsync()
        {
            // Provides opportunity to perform optional cleanup.
            Console.WriteLine($"Deactivating actor id: {this.Id}");
            return Task.CompletedTask;
        }

        /// <summary>
        /// Set MyData into actor's private state store
        /// </summary>
        /// <param name="data">the user-defined MyData which will be stored into state store as "my_data" state</param>
        public async Task<string> SetDataAsync(MyData data)
        {
            // Data is saved to configured state store implicitly after each method execution by Actor's runtime.
            // Data can also be saved explicitly by calling this.StateManager.SaveStateAsync();
            // State to be saved must be DataContract serializable.
            await this.StateManager.SetStateAsync<MyData>(
                "my_data",  // state name
                data);      // data saved for the named state "my_data"

            return "Success";
        }

        /// <summary>
        /// Get MyData from actor's private state store
        /// </summary>
        /// <return>the user-defined MyData which is stored into state store as "my_data" state</return>
        public Task<MyData> GetDataAsync()
        {
            // Gets state from the state store.
            return this.StateManager.GetStateAsync<MyData>("my_data");
        }

        /// <summary>
        /// Register MyReminder reminder with the actor
        /// </summary>
        public async Task RegisterReminder()
        {
            await this.RegisterReminderAsync(
                "MyReminder",              // The name of the reminder
                null,                      // User state passed to IRemindable.ReceiveReminderAsync()
                TimeSpan.FromSeconds(5),   // Time to delay before invoking the reminder for the first time
                TimeSpan.FromSeconds(5));  // Time interval between reminder invocations after the first invocation
        }

        /// <summary>
        /// Get MyReminder reminder details with the actor
        /// </summary>
        public async Task<IActorReminder> GetReminder()
        {
            return await this.GetReminderAsync("MyReminder");
        }

        /// <summary>
        /// Unregister MyReminder reminder with the actor
        /// </summary>
        public Task UnregisterReminder()
        {
            Console.WriteLine("Unregistering MyReminder...");
            return this.UnregisterReminderAsync("MyReminder");
        }

        // <summary>
        // Implement IRemindable.ReceiveReminderAsync(), the callback invoked when an actor reminder is triggered.
        // </summary>
        public Task ReceiveReminderAsync(string reminderName, byte[] state, TimeSpan dueTime, TimeSpan period)
        {
            Console.WriteLine("ReceiveReminderAsync is called!");
            return Task.CompletedTask;
        }

        /// <summary>
        /// Register MyTimer timer with the actor
        /// </summary>
        public Task RegisterTimer()
        {
            return this.RegisterTimerAsync(
                "MyTimer",                     // The name of the timer
                nameof(this.OnTimerCallBack),  // Timer callback
                null,                          // User state passed to OnTimerCallback()
                TimeSpan.FromSeconds(5),       // Time to delay before the async callback is first invoked
                TimeSpan.FromSeconds(5));      // Time interval between invocations of the async callback
        }

        /// <summary>
        /// Unregister MyTimer timer with the actor
        /// </summary>
        public Task UnregisterTimer()
        {
            Console.WriteLine("Unregistering MyTimer...");
            return this.UnregisterTimerAsync("MyTimer");
        }

        /// <summary>
        /// Timer callback once timer is expired
        /// </summary>
        private Task OnTimerCallBack(byte[] data)
        {
            Console.WriteLine("OnTimerCallBack is called!");
            return Task.CompletedTask;
        }
    }
}
Register actor runtime with ASP.NET Core startup
The Actor runtime is configured through ASP.NET Core Startup.cs.
The runtime uses the ASP.NET Core dependency injection system to register actor types and essential services. This integration is provided through the AddActors(...) method call in ConfigureServices(...). Use the delegate passed to AddActors(...) to register actor types and configure actor runtime settings. You can register additional types for dependency injection inside ConfigureServices(...). These will be available to be injected into the constructors of your Actor types.
Actors are implemented via HTTP calls with the Dapr runtime. This functionality is part of the application’s HTTP processing pipeline and is registered inside UseEndpoints(...) inside Configure(...).
Paste the following code into Startup.cs in the MyActorService project:
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

namespace MyActorService
{
    public class Startup
    {
        public void ConfigureServices(IServiceCollection services)
        {
            services.AddActors(options =>
            {
                // Register actor types and configure actor settings
                options.Actors.RegisterActor<MyActor>();
            });
        }

        public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
        {
            if (env.IsDevelopment())
            {
                app.UseDeveloperExceptionPage();
            }

            app.UseRouting();

            app.UseEndpoints(endpoints =>
            {
                // Register actors handlers that interface with the Dapr runtime.
                endpoints.MapActorsHandlers();
            });
        }
    }
}
Step 3: Add a client
Create a simple console app to call the actor service. The Dapr SDK provides an actor proxy client to invoke actor methods defined in the actor interface.
Create actor client project and add dependencies
# Create Actor's Client
dotnet new console -o MyActorClient
cd MyActorClient
# Add Dapr.Actors nuget package. Please use the latest package version from nuget.org
dotnet add package Dapr.Actors
# Add Actor Interface reference
dotnet add reference ../MyActor.Interfaces/MyActor.Interfaces.csproj
cd ..
Invoke actor methods with strongly-typed client
You can use ActorProxy.Create<IMyActor>(..) to create a strongly-typed client and invoke methods on the actor.
Paste the following code into Program.cs in the MyActorClient project:
using System;
using System.Threading.Tasks;
using Dapr.Actors;
using Dapr.Actors.Client;
using MyActor.Interfaces;

namespace MyActorClient
{
    class Program
    {
        static async Task Main(string[] args)
        {
            Console.WriteLine("Startup up...");

            // Registered Actor Type in Actor Service
            var actorType = "MyActor";

            // An ActorId uniquely identifies an actor instance
            // If the actor matching this id does not exist, it will be created
            var actorId = new ActorId("1");

            // Create the local proxy by using the same interface that the service implements.
            //
            // You need to provide the type and id so the actor can be located.
            var proxy = ActorProxy.Create<IMyActor>(actorId, actorType);

            // Now you can use the actor interface to call the actor's methods.
            Console.WriteLine($"Calling SetDataAsync on {actorType}:{actorId}...");
            var response = await proxy.SetDataAsync(new MyData()
            {
                PropertyA = "ValueA",
                PropertyB = "ValueB",
            });
            Console.WriteLine($"Got response: {response}");

            Console.WriteLine($"Calling GetDataAsync on {actorType}:{actorId}...");
            var savedData = await proxy.GetDataAsync();
            Console.WriteLine($"Got response: {savedData}");
        }
    }
}
Running the code
The projects that you’ve created can now be used to test the sample.
Run MyActorService
Since MyActorService is hosting actors, it needs to be run with the Dapr CLI.
cd MyActorService
dapr run --app-id myapp --app-port 5000 --dapr-http-port 3500 -- dotnet run
You will see commandline output from both daprd and MyActorService in this terminal. You should see something like the following, which indicates that the application started successfully.
...
ℹ️ Updating metadata for app command: dotnet run
✅ You're up and running! Both Dapr and your app logs will appear here.
== APP == info: Microsoft.Hosting.Lifetime[0]
== APP == Now listening on: https://localhost:5001
== APP == info: Microsoft.Hosting.Lifetime[0]
== APP == Now listening on: http://localhost:5000
== APP == info: Microsoft.Hosting.Lifetime[0]
== APP == Application started. Press Ctrl+C to shut down.
== APP == info: Microsoft.Hosting.Lifetime[0]
== APP == Hosting environment: Development
== APP == info: Microsoft.Hosting.Lifetime[0]
== APP == Content root path: /Users/ryan/actortest/MyActorService
Run MyActorClient
MyActorClient is acting as the client, and it can be run normally with dotnet run.
Open a new terminal and navigate to the MyActorClient directory. Then run the project with:
dotnet run
You should see commandline output like:
Startup up...
Calling SetDataAsync on MyActor:1...
Got response: Success
Calling GetDataAsync on MyActor:1...
Got response: PropertyA: ValueA, PropertyB: ValueB
💡 This sample relies on a few assumptions. The default listening port for an ASP.NET Core web project is 5000, which is being passed to dapr run as --app-port 5000. The default HTTP port for the Dapr sidecar is 3500. We’re telling the sidecar for MyActorService to use 3500 so that MyActorClient can rely on the default value.
Now you have successfully created an actor service and client. See the related links section to learn more.
Get up and running with Dapr Workflow and the Dapr .NET SDK
1.3.1 - DaprWorkflowClient usage
Essential tips and advice for using DaprWorkflowClient
Lifetime management
A DaprWorkflowClient holds access to networking resources in the form of TCP sockets used to communicate with the Dapr sidecar as well
as other types used in the management and operation of Workflows. DaprWorkflowClient implements IAsyncDisposable to support eager
cleanup of resources.
Dependency Injection
The AddDaprWorkflow() method will register the Dapr workflow services with ASP.NET Core dependency injection. This method
requires an options delegate that defines each of the workflows and activities you wish to register and use in your application.
Note
This method will attempt to register a DaprClient instance, but this will only work if it hasn’t already been registered with another
lifetime. For example, an earlier call to AddDaprClient() with a singleton lifetime will always use a singleton regardless of the
lifetime chosen for the workflow client. The DaprClient instance will be used to communicate with the Dapr sidecar and, if it’s not
yet registered, the lifetime provided during the AddDaprWorkflow() registration will be used to register the DaprWorkflowClient
as well as its own dependencies.
Singleton Registration
By default, the AddDaprWorkflow method will register the DaprWorkflowClient and associated services using a singleton lifetime. This means
that the services will be instantiated only a single time.
The following is an example of how registration of the DaprWorkflowClient would appear in a typical Program.cs file:
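A minimal sketch of this default singleton registration, assuming hypothetical YourWorkflow and YourActivity types registered through the options delegate:
using Dapr.Workflow;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprWorkflow(options =>
{
    // Register the workflow and activity types used by the application
    // (YourWorkflow and YourActivity are placeholders for your own types)
    options.RegisterWorkflow<YourWorkflow>();
    options.RegisterActivity<YourActivity>();
});

var app = builder.Build();
app.Run();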
While this may generally be acceptable in your use case, you may instead wish to override the lifetime specified. This is done by passing a ServiceLifetime
argument in AddDaprWorkflow. For example, you may wish to inject another scoped service into your ASP.NET Core processing pipeline
that needs context used by the DaprClient that wouldn’t be available if the former service were registered as a singleton.
Finally, Dapr services can also be registered using a transient lifetime meaning that they will be initialized every time they’re injected. This
is demonstrated in the following example:
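A minimal sketch of the same registration with a transient lifetime, again assuming hypothetical YourWorkflow and YourActivity types:
using Dapr.Workflow;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprWorkflow(options =>
{
    options.RegisterWorkflow<YourWorkflow>();
    options.RegisterActivity<YourActivity>();
}, ServiceLifetime.Transient); // Recreated every time it's injected

var app = builder.Build();
app.Run();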
Workflow activities support the same dependency injection that developers have come to expect of modern C# applications. Assuming a proper
registration at startup, any such type can be injected into the constructor of the workflow activity and available to utilize during
the execution of the workflow. This makes it simple to add logging via an injected ILogger or access to other Dapr
building blocks by injecting DaprClient or DaprJobsClient, for example.
internal sealed class SquareNumberActivity : WorkflowActivity<int, int>
{
    private readonly ILogger _logger;

    public SquareNumberActivity(ILogger logger)
    {
        this._logger = logger;
    }

    public override Task<int> RunAsync(WorkflowActivityContext context, int input)
    {
        this._logger.LogInformation("Squaring the value {number}", input);
        var result = input * input;
        this._logger.LogInformation("Got a result of {squareResult}", result);
        return Task.FromResult(result);
    }
}
Using ILogger in Workflow
Because workflows must be deterministic, it is not possible to inject arbitrary services into them. For example,
if you were able to inject a standard ILogger into a workflow and it needed to be replayed because of an error,
subsequent replays from the event source log would record additional log entries for operations that didn’t actually
run a second or third time, because their results were sourced from the log. This has the potential to introduce
a significant amount of confusion. Rather, a replay-safe logger is made available for use within workflows. It will only
log events the first time the workflow runs and will not log anything while the workflow is being replayed.
This logger can be retrieved from a method present on the WorkflowContext available on your workflow instance and
otherwise used precisely as you might otherwise use an ILogger instance.
An end-to-end sample demonstrating this can be seen in the
.NET SDK repository
but a brief extraction of this sample is available below.
public class OrderProcessingWorkflow : Workflow<OrderPayload, OrderResult>
{
    public override async Task<OrderResult> RunAsync(WorkflowContext context, OrderPayload order)
    {
        string orderId = context.InstanceId;
        var logger = context.CreateReplaySafeLogger<OrderProcessingWorkflow>(); // Use this method to access the logger instance

        logger.LogInformation("Received order {orderId} for {quantity} {name} at ${totalCost}", orderId, order.Quantity, order.Name, order.TotalCost);

        //...
    }
}
1.3.2 - How to: Author and manage Dapr Workflow in the .NET SDK
Learn how to author and manage Dapr Workflow using the .NET SDK
Let’s create a Dapr workflow and invoke it using the console. In the provided order processing workflow example, the console prompts provide directions on how to both purchase and restock items. In this guide, you will start the order processing workflow and check on its status.
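To start the workflow, send an HTTP request to the Dapr sidecar. The exact request depends on your setup; a sketch using the workflows HTTP API, assuming the OrderProcessingWorkflow from the sample, the default sidecar HTTP port of 3500, and an illustrative order payload, might look like:
curl -i -X POST http://localhost:3500/v1.0/workflows/dapr/OrderProcessingWorkflow/start?instanceID=12345678 \
  -H "Content-Type: application/json" \
  -d '{"Name": "Paperclips", "Quantity": 1, "TotalCost": 99.95}'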
If successful, you should see a response like the following:
{"instanceID":"12345678"}
Send an HTTP request to get the status of the workflow that was started:
curl -i -X GET http://localhost:3500/v1.0/workflows/dapr/12345678
The workflow is designed to take several seconds to complete. If the workflow hasn’t completed when you issue the HTTP request, you’ll see a JSON response with the workflow status reported as RUNNING.
When the workflow has completed, the stdout of the workflow app should look like:
info: WorkflowConsoleApp.Activities.NotifyActivity[0]
Received order 12345678 for Paperclips at $99.95
info: WorkflowConsoleApp.Activities.ReserveInventoryActivity[0]
Reserving inventory: 12345678, Paperclips, 1
info: WorkflowConsoleApp.Activities.ProcessPaymentActivity[0]
Processing payment: 12345678, 99.95, USD
info: WorkflowConsoleApp.Activities.NotifyActivity[0]
Order 12345678 processed successfully!
If you have Zipkin configured for Dapr locally on your machine, then you can view the workflow trace spans in the Zipkin web UI (typically at http://localhost:9411/zipkin/).
1.4 - Dapr AI .NET SDK
Get up and running with the Dapr AI .NET SDK
With the Dapr AI package, you can interact with the Dapr AI workloads from a .NET application.
Today, Dapr provides the Conversational API to engage with large language models. To get started with this workload,
walk through the Dapr Conversational AI how-to guide.
1.4.1 - Dapr AI Client
Learn how to create Dapr AI clients
The Dapr AI client package allows you to interact with the AI capabilities provided by the Dapr sidecar.
Lifetime management
A DaprConversationClient is a version of the Dapr client that is dedicated to interacting with the Dapr Conversation
API. It can be registered alongside a DaprClient and other Dapr clients without issue.
It maintains access to networking resources in the form of TCP sockets used to communicate with the Dapr sidecar.
For best performance, create a single long-lived instance of DaprConversationClient and provide access to that shared
instance throughout your application. DaprConversationClient instances are thread-safe and intended to be shared.
This can be aided by utilizing the dependency injection functionality. The registration method supports registration
as a singleton, a scoped instance or as transient (meaning it’s recreated every time it’s injected), but also enables
registration to utilize values from an IConfiguration or other injected service in a way that’s impractical when
creating the client from scratch in each of your classes.
Avoid creating a DaprConversationClient for each operation.
Configuring DaprConversationClient via DaprConversationClientBuilder
A DaprConversationClient can be configured by invoking methods on the DaprConversationClientBuilder class before
calling .Build() to create the client itself. The settings for each DaprConversationClient are separate
and cannot be changed after calling .Build().
var daprConversationClient = new DaprConversationClientBuilder()
    .UseDaprApiToken("abc123") // Specify the API token used to authenticate to other Dapr sidecars
    .Build();
The DaprConversationClientBuilder contains settings for:
The HTTP endpoint of the Dapr sidecar
The gRPC endpoint of the Dapr sidecar
The JsonSerializerOptions object used to configure JSON serialization
The GrpcChannelOptions object used to configure gRPC
The API token used to authenticate requests to the sidecar
The factory method used to create the HttpClient instance used by the SDK
The timeout used for the HttpClient instance when making requests to the sidecar
The SDK will read the following environment variables to configure the default values:
DAPR_HTTP_ENDPOINT: used to find the HTTP endpoint of the Dapr sidecar, example: https://dapr-api.mycompany.com
DAPR_GRPC_ENDPOINT: used to find the gRPC endpoint of the Dapr sidecar, example: https://dapr-grpc-api.mycompany.com
DAPR_HTTP_PORT: if DAPR_HTTP_ENDPOINT is not set, this is used to find the HTTP local endpoint of the Dapr sidecar
DAPR_GRPC_PORT: if DAPR_GRPC_ENDPOINT is not set, this is used to find the gRPC local endpoint of the Dapr sidecar
DAPR_API_TOKEN: used to set the API token
Configuring gRPC channel options
Dapr’s use of CancellationToken for cancellation relies on the configuration of the gRPC channel options. If you need
to configure these options yourself, make sure to enable the ThrowOperationCanceledOnCancellation setting.
The APIs on DaprConversationClient perform asynchronous operations and accept an optional CancellationToken parameter. This
follows a standard .NET practice for cancellable operations. Note that when cancellation occurs, there is no guarantee that
the remote endpoint stops processing the request, only that the client has stopped waiting for completion.
When an operation is cancelled, it will throw an OperationCanceledException.
Configuring DaprConversationClient via dependency injection
Using the built-in extension methods for registering the DaprConversationClient in a dependency injection container can
provide the benefit of registering the long-lived service a single time, centralize complex configuration and improve
performance by ensuring similarly long-lived resources are re-purposed when possible (e.g. HttpClient instances).
There are three overloads available to give the developer the greatest flexibility in configuring the client for their
scenario. Each of these will register the IHttpClientFactory on your behalf if not already registered, and configure
the DaprConversationClientBuilder to use it when creating the HttpClient instance in order to re-use the same instance as
much as possible and avoid socket exhaustion and other issues.
In the first approach, there’s no configuration done by the developer and the DaprConversationClient is configured with the
default settings.
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprConversationClient(); // Registers the `DaprConversationClient` to be injected as needed

var app = builder.Build();
Sometimes the developer will need to configure the created client using the various configuration options detailed
above. This is done through an overload that passes in the DaprConversationClientBuilder and exposes methods for configuring
the necessary options.
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprConversationClient((_, daprConversationClientBuilder) =>
{
    // Set the API token
    daprConversationClientBuilder.UseDaprApiToken("abc123");
    // Specify a non-standard HTTP endpoint
    daprConversationClientBuilder.UseHttpEndpoint("http://dapr.my-company.com");
});

var app = builder.Build();
Finally, it’s possible that the developer may need to retrieve information from another service in order to populate
these configuration values. That value may be provided from a DaprClient instance, a vendor-specific SDK or some
local service, but as long as it’s also registered in DI, it can be injected into this configuration operation via the
last overload:
var builder = WebApplication.CreateBuilder(args);

// Register a fictional service that retrieves secrets from somewhere
builder.Services.AddSingleton<SecretService>();

builder.Services.AddDaprConversationClient((serviceProvider, daprConversationClientBuilder) =>
{
    // Retrieve an instance of the `SecretService` from the service provider
    var secretService = serviceProvider.GetRequiredService<SecretService>();
    var daprApiToken = secretService.GetSecret("DaprApiToken").Value;

    // Configure the `DaprConversationClientBuilder`
    daprConversationClientBuilder.UseDaprApiToken(daprApiToken);
});

var app = builder.Build();
1.4.2 - How to: Create and use Dapr AI Conversations in the .NET SDK
Learn how to create and use the Dapr Conversational AI client using the .NET SDK
To get started with the Dapr AI .NET SDK client, install the Dapr.AI package from NuGet:
dotnet add package Dapr.AI
A DaprConversationClient maintains access to networking resources in the form of TCP sockets used to communicate with the Dapr sidecar.
Dependency Injection
The AddDaprAiConversation() method will register the Dapr client with ASP.NET Core dependency injection and is the recommended approach
for using this package. This method accepts an optional options delegate for configuring the DaprConversationClient and a
ServiceLifetime argument, allowing you to specify a different lifetime for the registered services instead of the default Singleton
value.
The following example assumes all default values are acceptable and is sufficient to register the DaprConversationClient:
services.AddDaprAiConversation();
The optional configuration delegate is used to configure the DaprConversationClient by specifying options on the
DaprConversationClientBuilder as in the following example:
services.AddSingleton<DefaultOptionsProvider>();
services.AddDaprAiConversation((serviceProvider, clientBuilder) =>
{
    // Inject a service to source a value from
    var optionsProvider = serviceProvider.GetRequiredService<DefaultOptionsProvider>();
    var standardTimeout = optionsProvider.GetStandardTimeout();

    // Configure the value on the client builder
    clientBuilder.UseTimeout(standardTimeout);
});
Manual Instantiation
Rather than using dependency injection, a DaprConversationClient can also be built using the static client builder.
For best performance, create a single long-lived instance of DaprConversationClient and provide access to that shared instance throughout
your application. DaprConversationClient instances are thread-safe and intended to be shared.
Avoid creating a DaprConversationClient per-operation.
A DaprConversationClient can be configured by invoking methods on the DaprConversationClientBuilder class before calling .Build()
to create the client. The settings for each DaprConversationClient are separate and cannot be changed after calling .Build().
Clone the SDK repo to try out some examples and get started.
Building Blocks
This part of the .NET SDK allows you to interface with the Conversations API to send and receive messages from
large language models.
1.5 - Dapr Jobs .NET SDK
Get up and running with Dapr Jobs and the Dapr .NET SDK
With the Dapr Job package, you can interact with the Dapr Job APIs from a .NET application to trigger future operations
to run according to a predefined schedule with an optional payload.
1.5.1 - How to: Author and manage Dapr Jobs in the .NET SDK
Learn how to author and manage Dapr Jobs using the .NET SDK
Let’s create an endpoint that will be invoked by Dapr Jobs when it triggers, then schedule the job in the same app. We’ll use the simple example provided here for the following demonstration and walk through it to explain how you can schedule one-time or recurring jobs using either an interval or a Cron expression. In this guide,
you will:
From the .NET SDK root directory, navigate to the Dapr Jobs example.
cd examples/Jobs
Run the application locally
To run the Dapr application, you need to start the .NET program and a Dapr sidecar. Navigate to the JobsSample directory.
cd JobsSample
We’ll run a command that starts both the Dapr sidecar and the .NET program at the same time.
dapr run --app-id jobsapp --dapr-grpc-port 4001 --dapr-http-port 3500 -- dotnet run
Dapr listens for HTTP requests at http://localhost:3500 and internal Jobs gRPC requests at http://localhost:4001.
Register the Dapr Jobs client with dependency injection
The Dapr Jobs SDK provides an extension method to simplify the registration of the Dapr Jobs client. Before completing
the dependency injection registration in Program.cs, add the following line:
var builder = WebApplication.CreateBuilder(args);

// Add anywhere between these two lines
builder.Services.AddDaprJobsClient();

var app = builder.Build();
Note that in today’s implementation of the Jobs API, the app that schedules the job will also be the app that receives the trigger notification. In other words, you cannot schedule a trigger to run in another application. As a result, while you don’t explicitly need the Dapr Jobs client to be registered in your application to schedule a trigger invocation endpoint, your endpoint will never be invoked without the same app also scheduling the job somehow (whether via this Dapr Jobs .NET SDK or an HTTP call to the sidecar).
It’s possible that you may want to provide some configuration options to the Dapr Jobs client that
should be present with each call to the sidecar such as a Dapr API token, or you want to use a non-standard
HTTP or gRPC endpoint. This is possible through use of an overload of the registration method that allows configuration of a
DaprJobsClientBuilder instance:
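A sketch of this overload, mirroring the builder configuration shown later in this document (the API token and endpoint values are illustrative):
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprJobsClient((_, daprJobsClientBuilder) =>
{
    // Set the API token sent with each request to the sidecar
    daprJobsClientBuilder.UseDaprApiToken("abc123");
    // Specify a non-standard HTTP endpoint
    daprJobsClientBuilder.UseHttpEndpoint("http://dapr.my-company.com");
});

var app = builder.Build();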
Still, it’s possible that whatever values you wish to inject need to be retrieved from some other source, itself registered as a dependency. There’s one more overload you can use to inject an IServiceProvider into the configuration action method. In the following example, we register a fictional singleton that can retrieve secrets from somewhere and pass it into the configuration method for AddDaprJobsClient so
we can retrieve our Dapr API token from somewhere else for registration here:
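A sketch of this approach, assuming a fictional SecretService registered in DI that can produce the Dapr API token:
var builder = WebApplication.CreateBuilder(args);

// Register a fictional service that retrieves secrets from somewhere
builder.Services.AddSingleton<SecretService>();

builder.Services.AddDaprJobsClient((serviceProvider, daprJobsClientBuilder) =>
{
    // Retrieve an instance of the `SecretService` from the service provider
    var secretService = serviceProvider.GetRequiredService<SecretService>();
    var daprApiToken = secretService.GetSecret("DaprApiToken").Value;

    // Configure the `DaprJobsClientBuilder`
    daprJobsClientBuilder.UseDaprApiToken(daprApiToken);
});

var app = builder.Build();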
It’s possible to configure the Dapr Jobs client using the values in your registered IConfiguration as well without
explicitly specifying each of the value overrides using the DaprJobsClientBuilder as demonstrated in the previous
section. Rather, by populating an IConfiguration made available through dependency injection, the AddDaprJobsClient()
registration will automatically use these values over their respective defaults.
Start by populating the values in your configuration. This can be done in several different ways as demonstrated below.
Configuration via ConfigurationBuilder
Application settings can be configured without using a configuration source and by instead populating the value in-memory
using a ConfigurationBuilder instance:
var builder = WebApplication.CreateBuilder();

// Create the configuration
var configuration = new ConfigurationBuilder()
    .AddInMemoryCollection(new Dictionary<string, string>
    {
        { "DAPR_HTTP_ENDPOINT", "http://localhost:54321" },
        { "DAPR_API_TOKEN", "abc123" }
    })
    .Build();

builder.Configuration.AddConfiguration(configuration);
builder.Services.AddDaprJobsClient(); // This will automatically populate the HTTP endpoint and API token values from the IConfiguration
Configuration via Environment Variables
Application settings can be accessed from environment variables available to your application.
The following environment variables will be used to populate both the HTTP endpoint and API token used to register the Dapr Jobs client:
DAPR_HTTP_ENDPOINT: http://localhost:54321
DAPR_API_TOKEN: abc123
The Dapr Jobs client will be configured to use both the HTTP endpoint http://localhost:54321 and populate all outbound
requests with the API token header abc123.
Configuration via prefixed Environment Variables
However, in shared-host scenarios where there are multiple applications all running on the same machine without using
containers or in development environments, it’s not uncommon to prefix environment variables. The following example
assumes that both the HTTP endpoint and the API token will be pulled from environment variables prefixed with the
value “myapp_”. The two environment variables used in this scenario are as follows:
myapp_DAPR_HTTP_ENDPOINT: http://localhost:54321
myapp_DAPR_API_TOKEN: abc123
These environment variables will be loaded into the registered configuration in the following example and made available
without the prefix attached.
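A sketch of loading the prefixed variables before registering the client, assuming the standard AddEnvironmentVariables(prefix) configuration extension, which strips the prefix as the values are loaded:
var builder = WebApplication.CreateBuilder(args);

// Load only environment variables prefixed with "myapp_"; the prefix is removed,
// so "myapp_DAPR_HTTP_ENDPOINT" is surfaced as "DAPR_HTTP_ENDPOINT".
builder.Configuration.AddEnvironmentVariables("myapp_");

builder.Services.AddDaprJobsClient(); // Picks up the endpoint and API token from the IConfiguration

var app = builder.Build();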
The Dapr Jobs client will be configured to use both the HTTP endpoint http://localhost:54321 and populate all outbound
requests with the API token header abc123.
Use the Dapr Jobs client without relying on dependency injection
While the use of dependency injection simplifies the use of complex types in .NET and makes it easier to
deal with complicated configurations, you’re not required to register the DaprJobsClient in this way. Rather, you can also elect to create an instance of it from a DaprJobsClientBuilder instance as demonstrated below:
public class MySampleClass
{
    public void DoSomething()
    {
        var daprJobsClientBuilder = new DaprJobsClientBuilder();
        var daprJobsClient = daprJobsClientBuilder.Build();

        // Do something with the `daprJobsClient`
    }
}
Set up an endpoint to be invoked when the job is triggered
It’s easy to set up a jobs endpoint if you’re at all familiar with minimal APIs in ASP.NET Core, as the syntax is the same between the two.
Once dependency injection registration has been completed, configure the application the same way you would to handle mapping an HTTP request via the minimal API functionality in ASP.NET Core. Implemented as an extension method,
pass the name of the job it should be responsive to and a delegate. Services can be injected into the delegate’s arguments as you wish and the job payload can be accessed from the ReadOnlyMemory<byte> originally provided to the
job registration.
There are two delegates you can use here. One provides an IServiceProvider in case you need to inject other services into the handler:
// We have this from the example above
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddDaprJobsClient();

var app = builder.Build();

// Add our endpoint registration
app.MapDaprScheduledJob("myJob", (IServiceProvider serviceProvider, string jobName, ReadOnlyMemory<byte> jobPayload) =>
{
    var logger = serviceProvider.GetService<ILogger>();
    logger?.LogInformation("Received trigger invocation for '{jobName}'", "myJob");

    // Do something...
});

app.Run();
The other overload of the delegate doesn’t require an IServiceProvider if not necessary:
// We have this from the example above
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddDaprJobsClient();

var app = builder.Build();

// Add our endpoint registration
app.MapDaprScheduledJob("myJob", (string jobName, ReadOnlyMemory<byte> jobPayload) =>
{
    // Do something...
});

app.Run();
Support cancellation tokens when processing mapped invocations
You may want to ensure that timeouts are handled on job invocations so that they don’t indefinitely hang and use system resources. When setting up the job mapping, there’s an optional TimeSpan parameter that can be
provided as the last argument to specify a timeout for the request. Every time the job mapping invocation is triggered, a new CancellationTokenSource will be created using this timeout parameter and a CancellationToken
will be created from it to put an upper bound on the processing of the request. If a timeout isn’t provided, this defaults to CancellationToken.None and a timeout will not be automatically applied to the mapping.
// We have this from the example above
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddDaprJobsClient();

var app = builder.Build();

// Add our endpoint registration
app.MapDaprScheduledJob("myJob", (string jobName, ReadOnlyMemory<byte> jobPayload) =>
{
    // Do something...
}, TimeSpan.FromSeconds(15)); // Assigns a maximum timeout of 15 seconds for handling the invocation request

app.Run();
Register the job
Finally, we have to register the job we want scheduled. Note that from here, all SDK methods have cancellation token support and use a default token if not otherwise set.
There are three different ways to set up a job that vary based on how you want to configure the schedule. The following
shows the different arguments available when scheduling a job:
Argument Name | Type | Description | Required
jobName | string | The name of the job being scheduled. | Yes
schedule | DaprJobSchedule | The schedule defining when the job will be triggered. | Yes
payload | ReadOnlyMemory<byte> | Job data provided to the invocation endpoint when triggered. | No
startingFrom | DateTime | The point in time from which the job schedule should start. | No
repeats | int | The maximum number of times the job should be triggered. | No
ttl | | When the job should expire and no longer trigger. | No
overwrite | bool | Whether an existing job with the same name should be overwritten when submitted; if false, an existing job with the same name must be deleted first. | No
cancellationToken | CancellationToken | Used to cancel the operation early, e.g. because of an operation timeout. | No
DaprJobSchedule
All jobs are scheduled via the SDK using the DaprJobSchedule which creates an expression passed to the
runtime to schedule jobs. There are several static methods exposed on the DaprJobSchedule used to facilitate
easy registration of each of the kinds of job schedules available as follows. This separates specifying
the job schedule itself from any additional options like repeating the operation or providing a cancellation token.
One-time job
A one-time job is exactly that; it will run at a single point in time and will not repeat.
This approach requires that you select a job name and specify a time it should be triggered.
Interval-based job
An interval-based job is one that runs on a recurring loop configured as a fixed amount of time, not unlike how reminders work in the Actors building block today.
DaprJobSchedule.FromDuration(TimeSpan interval)
Interval-based jobs can be scheduled from the Dapr Jobs client as in the following example:
public class MyOperation(DaprJobsClient daprJobsClient)
{
    public async Task ScheduleIntervalJobAsync(CancellationToken cancellationToken)
    {
        var hourlyInterval = TimeSpan.FromHours(1);

        // Trigger the job hourly, but a maximum of 5 times
        var schedule = DaprJobSchedule.FromDuration(hourlyInterval);
        await daprJobsClient.ScheduleJobAsync("job", schedule, repeats: 5, cancellationToken: cancellationToken);
    }
}
Cron-based job
A Cron-based job is scheduled using a Cron expression. This gives more calendar-based control over when the job is triggered as it can use calendar-based values in the expression.
There are two different approaches supported to scheduling a Cron-based job in the Dapr SDK.
Provide your own Cron expression
You can provide your own Cron expression as a string via DaprJobSchedule.FromExpression():
public class MyOperation(DaprJobsClient daprJobsClient)
{
    public async Task ScheduleCronJobAsync(CancellationToken cancellationToken)
    {
        // At the top of every other hour on the fifth day of the month
        const string cronSchedule = "0 */2 5 * *";
        var schedule = DaprJobSchedule.FromExpression(cronSchedule);

        // Don't start this until next month
        var now = DateTime.UtcNow;
        var oneMonthFromNow = now.AddMonths(1);
        var firstOfNextMonth = new DateTime(oneMonthFromNow.Year, oneMonthFromNow.Month, 1, 0, 0, 0);

        await daprJobsClient.ScheduleJobAsync("myJobName", schedule, startingFrom: firstOfNextMonth, cancellationToken: cancellationToken);
    }
}
Use the CronExpressionBuilder
Alternatively, you can use our fluent builder to produce a valid Cron expression:
public class MyOperation(DaprJobsClient daprJobsClient)
{
    public async Task ScheduleCronJobAsync(CancellationToken cancellationToken)
    {
        // At the top of every other hour on the fifth day of the month
        var cronExpression = new CronExpressionBuilder()
            .Every(EveryCronPeriod.Hour, 2)
            .On(OnCronPeriod.DayOfMonth, 5)
            .ToString();
        var schedule = DaprJobSchedule.FromExpression(cronExpression);

        // Don't start this until next month
        var now = DateTime.UtcNow;
        var oneMonthFromNow = now.AddMonths(1);
        var firstOfNextMonth = new DateTime(oneMonthFromNow.Year, oneMonthFromNow.Month, 1, 0, 0, 0);

        await daprJobsClient.ScheduleJobAsync("myJobName", schedule, startingFrom: firstOfNextMonth, cancellationToken: cancellationToken);
    }
}
Get details of already-scheduled job
If you know the name of an already-scheduled job, you can retrieve its metadata without waiting for it to
be triggered. The returned JobDetails exposes a few helpful properties for consuming the information from the Dapr Jobs API:
If the Schedule property contains a Cron expression, the IsCronExpression property will be true and the expression will also be available in the CronExpression property.
If the Schedule property contains a duration value, the IsIntervalExpression property will instead be true and the value will be converted to a TimeSpan value accessible from the Interval property.
Essential tips and advice for using DaprJobsClient
Lifetime management
A DaprJobsClient is a version of the Dapr client that is dedicated to interacting with the Dapr Jobs API. It can be
registered alongside a DaprClient and other Dapr clients without issue.
It maintains access to networking resources in the form of TCP sockets used to communicate with the Dapr sidecar and
implements IDisposable to support the eager cleanup of resources.
For best performance, create a single long-lived instance of DaprJobsClient and provide access to that shared instance
throughout your application. DaprJobsClient instances are thread-safe and intended to be shared.
This can be aided by utilizing the dependency injection functionality. The registration method supports registration
as a singleton, a scoped instance or as transient (meaning it’s recreated every time it’s injected), but also enables
registration to utilize values from an IConfiguration or other injected service in a way that’s impractical when
creating the client from scratch in each of your classes.
Avoid creating a DaprJobsClient for each operation and disposing it when the operation is complete.
Configuring DaprJobsClient via the DaprJobsClientBuilder
A DaprJobsClient can be configured by invoking methods on the DaprJobsClientBuilder class before calling .Build()
to create the client itself. The settings for each DaprJobsClient are separate
and cannot be changed after calling .Build().
var daprJobsClient = new DaprJobsClientBuilder()
    .UseDaprApiToken("abc123") // Specify the API token used to authenticate to other Dapr sidecars
    .Build();
The DaprJobsClientBuilder contains settings for:
The HTTP endpoint of the Dapr sidecar
The gRPC endpoint of the Dapr sidecar
The JsonSerializerOptions object used to configure JSON serialization
The GrpcChannelOptions object used to configure gRPC
The API token used to authenticate requests to the sidecar
The factory method used to create the HttpClient instance used by the SDK
The timeout used for the HttpClient instance when making requests to the sidecar
The SDK will read the following environment variables to configure the default values:
DAPR_HTTP_ENDPOINT: used to find the HTTP endpoint of the Dapr sidecar, example: https://dapr-api.mycompany.com
DAPR_GRPC_ENDPOINT: used to find the gRPC endpoint of the Dapr sidecar, example: https://dapr-grpc-api.mycompany.com
DAPR_HTTP_PORT: if DAPR_HTTP_ENDPOINT is not set, this is used to find the HTTP local endpoint of the Dapr sidecar
DAPR_GRPC_PORT: if DAPR_GRPC_ENDPOINT is not set, this is used to find the gRPC local endpoint of the Dapr sidecar
DAPR_API_TOKEN: used to set the API token
Configuring gRPC channel options
Dapr’s use of CancellationToken for cancellation relies on the configuration of the gRPC channel options. If you need
to configure these options yourself, make sure to enable the ThrowOperationCanceledOnCancellation setting.
The APIs on DaprJobsClient perform asynchronous operations and accept an optional CancellationToken parameter. This
follows a standard .NET practice for cancellable operations. Note that when cancellation occurs, there is no guarantee that
the remote endpoint stops processing the request, only that the client has stopped waiting for completion.
When an operation is cancelled, it will throw an OperationCanceledException.
Configuring DaprJobsClient via dependency injection
Using the built-in extension methods for registering the DaprJobsClient in a dependency injection container can
provide the benefit of registering the long-lived service a single time, centralize complex configuration and improve
performance by ensuring similarly long-lived resources are re-purposed when possible (e.g. HttpClient instances).
There are three overloads available to give the developer the greatest flexibility in configuring the client for their
scenario. Each of these will register the IHttpClientFactory on your behalf if not already registered, and configure
the DaprJobsClientBuilder to use it when creating the HttpClient instance in order to re-use the same instance as
much as possible and avoid socket exhaustion and other issues.
In the first approach, there’s no configuration done by the developer and the DaprJobsClient is configured with the
default settings.
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprJobsClient(); // Registers the `DaprJobsClient` to be injected as needed

var app = builder.Build();
Sometimes the developer will need to configure the created client using the various configuration options detailed
above. This is done through an overload that passes in the DaprJobsClientBuilder and exposes methods for configuring
the necessary options.
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprJobsClient((_, daprJobsClientBuilder) =>
{
    // Set the API token
    daprJobsClientBuilder.UseDaprApiToken("abc123");
    // Specify a non-standard HTTP endpoint
    daprJobsClientBuilder.UseHttpEndpoint("http://dapr.my-company.com");
});

var app = builder.Build();
Finally, it’s possible that the developer may need to retrieve information from another service in order to populate
these configuration values. That value may be provided from a DaprClient instance, a vendor-specific SDK or some
local service, but as long as it’s also registered in DI, it can be injected into this configuration operation via the
last overload:
var builder = WebApplication.CreateBuilder(args);

// Register a fictional service that retrieves secrets from somewhere
builder.Services.AddSingleton<SecretService>();

builder.Services.AddDaprJobsClient((serviceProvider, daprJobsClientBuilder) =>
{
    // Retrieve an instance of the `SecretService` from the service provider
    var secretService = serviceProvider.GetRequiredService<SecretService>();
    var daprApiToken = secretService.GetSecret("DaprApiToken").Value;

    // Configure the `DaprJobsClientBuilder`
    daprJobsClientBuilder.UseDaprApiToken(daprApiToken);
});

var app = builder.Build();
Understanding payload serialization on DaprJobsClient
While there are many methods on the DaprClient that automatically serialize and deserialize data using the
System.Text.Json serializer, this SDK takes a different philosophy. Instead, the relevant methods accept an optional
payload of ReadOnlyMemory<byte> meaning that serialization is an exercise left to the developer and is not
generally handled by the SDK.
That said, there are some helper extension methods available for each of the scheduling methods. If you know that you
want to use a type that’s JSON-serializable, you can use the Schedule*WithPayloadAsync method for each scheduling
type that accepts an object as a payload and an optional JsonSerializerOptions to use when serializing the value.
This will convert the value to UTF-8 encoded bytes for you as a convenience. Here’s an example of what this might
look like when scheduling a Cron expression:
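As a sketch of the underlying idea, the following uses the documented ScheduleJobAsync payload argument with manual UTF-8 JSON serialization (the payload shape and values are illustrative); the Schedule*WithPayloadAsync helpers wrap this serialization for you:
// Requires System.Text.Json
// An illustrative JSON-serializable payload
var payload = new { Name = "Paperclips", Quantity = 100 };
var payloadBytes = JsonSerializer.SerializeToUtf8Bytes(payload);

// At the top of every hour
var schedule = DaprJobSchedule.FromExpression("0 * * * *");

await daprJobsClient.ScheduleJobAsync("myJsonJob", schedule, payload: payloadBytes, cancellationToken: cancellationToken);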
In the same vein, if you have a plain string value, you can use an overload of the same method to serialize a
string-typed payload; the JSON serialization step is skipped and the value is simply encoded to an array of
UTF-8 bytes. Here’s an example of what this might look like when scheduling a one-time job:
var now = DateTime.UtcNow;
var oneWeekFromNow = now.AddDays(7);

await daprJobsClient.ScheduleOneTimeJobWithPayloadAsync("myOtherJob", oneWeekFromNow, "This is a test!");
The delegate handling the job invocation expects at least two arguments to be present:
A string that is populated with the jobName, providing the name of the invoked job
A ReadOnlyMemory<byte> that is populated with the bytes originally provided during the job registration.
Because the payload is stored as a ReadOnlyMemory<byte>, the developer has the freedom to serialize and deserialize
as they wish, but there are again two helper extensions included that can deserialize this to either a JSON-compatible
type or a string. Both methods assume that the developer encoded the originally scheduled job (perhaps using the
helper serialization methods) as these methods will not force the bytes to represent something they’re not.
To deserialize the bytes to a string, the following helper method can be used:
var payloadAsString = Encoding.UTF8.GetString(jobPayload.Span); // If successful, returns a string with the value
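For a JSON-compatible payload, a sketch of the equivalent manual deserialization with System.Text.Json, assuming a hypothetical OrderPayload type that matches whatever was originally serialized:
var payload = JsonSerializer.Deserialize<OrderPayload>(jobPayload.Span); // OrderPayload is a hypothetical placeholder type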
Error handling
Methods on DaprJobsClient will throw a DaprJobsServiceException if an issue is encountered between the SDK
and the Jobs API service running on the Dapr sidecar. If a failure is encountered because of a poorly formatted
request made to the Jobs API service through this SDK, a DaprMalformedJobException will be thrown. In case of
illegal argument values, the appropriate standard exception will be thrown (e.g. ArgumentOutOfRangeException
or ArgumentNullException) with the name of the offending argument. And for anything else, a DaprException
will be thrown.
The most common cases of failure will be related to:
Incorrect argument formatting while engaging with the Jobs API
Transient failures such as a networking problem
Invalid data, such as a failure to deserialize a value into a type it wasn’t originally serialized from
In any of these cases, you can examine more exception details through the .InnerException property.
1.6 - Dapr Cryptography .NET SDK
Get up and running with the Dapr Cryptography .NET SDK
With the Dapr Cryptography package, you can perform high-performance encryption and decryption operations with Dapr.
To get started with this functionality, walk through the Dapr Cryptography how-to guide (https://v1-16.docs.dapr.io/developing-applications/sdks/dotnet/dotnet-cryptography/dotnet-cryptography-howto/).
1.6.1 - Dapr Cryptography Client
Learn how to create Dapr Cryptography clients
The Dapr Cryptography package allows you to perform encryption and decryption operations provided by the Dapr sidecar.
Lifetime management
A DaprEncryptionClient is a version of the Dapr client that is dedicated to interacting with the Dapr Cryptography API.
It can be registered alongside a DaprClient and other Dapr clients without issue.
It maintains access to networking resources in the form of TCP sockets used to communicate with the Dapr sidecar.
For best performance, create a single long-lived instance of DaprEncryptionClient and provide access to that shared
instance throughout your application. DaprEncryptionClient instances are thread-safe and intended to be shared.
This can be aided by utilizing the dependency injection functionality. The registration method supports registration
as a singleton, a scoped instance, or as a transient (meaning it’s recreated every time it’s injected), but also enables
registration to utilize values from an IConfiguration or other injected service in a way that’s impractical when creating
the client from scratch in each of your classes.
Avoid creating a DaprEncryptionClient for each operation.
Configuring DaprEncryptionClient via DaprEncryptionClientBuilder
A DaprEncryptionClient can be configured by invoking methods on the DaprEncryptionClientBuilder class before calling
.Build() to create the client itself. The settings for each DaprEncryptionClient are separate and cannot be
changed after calling .Build().
var daprEncryptionClient = new DaprEncryptionClientBuilder()
    .UseDaprApiToken("abc123") // Specify the API token used to authenticate to the Dapr sidecar
    .Build();
The DaprEncryptionClientBuilder contains settings for:
The HTTP endpoint of the Dapr sidecar
The gRPC endpoint of the Dapr sidecar
The JsonSerializerOptions object used to configure JSON serialization
The GrpcChannelOptions object used to configure gRPC
The API token used to authenticate requests to the sidecar
The factory method used to create the HttpClient instance used by the SDK
The timeout used for the HttpClient instance when making requests to the sidecar
The SDK will read the following environment variables to configure the default values:
DAPR_HTTP_ENDPOINT: used to find the HTTP endpoint of the Dapr sidecar, example: https://dapr-api.mycompany.com
DAPR_GRPC_ENDPOINT: used to find the gRPC endpoint of the Dapr sidecar, example: https://dapr-grpc-api.mycompany.com
DAPR_HTTP_PORT: if DAPR_HTTP_ENDPOINT is not set, this is used to find the HTTP local endpoint of the Dapr sidecar
DAPR_GRPC_PORT: if DAPR_GRPC_ENDPOINT is not set, this is used to find the gRPC local endpoint of the Dapr sidecar
DAPR_API_TOKEN: used to set the API token
Configuring gRPC channel options
Dapr’s use of CancellationToken for cancellation relies on the configuration of the gRPC channel options. If you need
to configure these options yourself, make sure to enable the ThrowOperationCanceledOnCancellation setting.
The APIs on DaprEncryptionClient perform asynchronous operations and accept an optional CancellationToken parameter. This
follows a standard .NET practice for cancellable operations. Note that when cancellation occurs, there is no guarantee that
the remote endpoint stops processing the request, only that the client has stopped waiting for completion.
When an operation is cancelled, it will throw an OperationCanceledException.
Configuring DaprEncryptionClient via dependency injection
Using the built-in extension methods for registering the DaprEncryptionClient in a dependency injection container can
provide the benefit of registering the long-lived service a single time, centralize complex configuration and improve
performance by ensuring similarly long-lived resources are re-purposed when possible (e.g. HttpClient instances).
There are three overloads available to give the developer the greatest flexibility in configuring the client for their
scenario. Each of these will register the IHttpClientFactory on your behalf if not already registered, and configure
the DaprEncryptionClientBuilder to use it when creating the HttpClient instance in order to re-use the same instance as
much as possible and avoid socket exhaustion and other issues.
In the first approach, there’s no configuration done by the developer and the DaprEncryptionClient is configured with the
default settings.
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprEncryptionClient(); // Registers the `DaprEncryptionClient` to be injected as needed

var app = builder.Build();
Sometimes the developer will need to configure the created client using the various configuration options detailed
above. This is done through an overload that passes in the DaprEncryptionClientBuilder and exposes methods for configuring
the necessary options.
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprEncryptionClient((_, daprEncryptionClientBuilder) =>
{
    // Set the API token
    daprEncryptionClientBuilder.UseDaprApiToken("abc123");
    // Specify a non-standard HTTP endpoint
    daprEncryptionClientBuilder.UseHttpEndpoint("http://dapr.my-company.com");
});

var app = builder.Build();
Finally, it’s possible that the developer may need to retrieve information from another service in order to populate
these configuration values. That value may be provided from a DaprClient instance, a vendor-specific SDK or some
local service, but as long as it’s also registered in DI, it can be injected into this configuration operation via the
last overload:
var builder = WebApplication.CreateBuilder(args);

// Register a fictional service that retrieves secrets from somewhere
builder.Services.AddSingleton<SecretService>();

builder.Services.AddDaprEncryptionClient((serviceProvider, daprEncryptionClientBuilder) =>
{
    // Retrieve an instance of the `SecretService` from the service provider
    var secretService = serviceProvider.GetRequiredService<SecretService>();
    var daprApiToken = secretService.GetSecret("DaprApiToken").Value;

    // Configure the `DaprEncryptionClientBuilder`
    daprEncryptionClientBuilder.UseDaprApiToken(daprApiToken);
});

var app = builder.Build();
1.6.2 - How to: Create and use Dapr Cryptography in the .NET SDK
Learn how to create and use the Dapr Cryptography client using the .NET SDK
A DaprEncryptionClient maintains access to networking resources in the form of TCP sockets used to communicate with
the Dapr sidecar.
Dependency Injection
The AddDaprEncryptionClient() method will register the Dapr client with dependency injection and is the recommended approach
for using this package. This method accepts an optional options delegate for configuring the DaprEncryptionClient and a
ServiceLifetime argument, allowing you to specify a different lifetime for the registered services instead of the default Singleton
value.
The following example assumes all default values are acceptable and is sufficient to register the DaprEncryptionClient:
services.AddDaprEncryptionClient();
The optional configuration delegate is used to configure the DaprEncryptionClient by specifying options on the
DaprEncryptionClientBuilder as in the following example:
services.AddSingleton<DefaultOptionsProvider>();
services.AddDaprEncryptionClient((serviceProvider, clientBuilder) =>
{
    // Inject a service to source a value from
    var optionsProvider = serviceProvider.GetRequiredService<DefaultOptionsProvider>();
    var standardTimeout = optionsProvider.GetStandardTimeout();

    // Configure the value on the client builder
    clientBuilder.UseTimeout(standardTimeout);
});
Manual Instantiation
Rather than using dependency injection, a DaprEncryptionClient can also be built using the static client builder.
For best performance, create a single long-lived instance of DaprEncryptionClient and provide access to that shared instance throughout
your application. DaprEncryptionClient instances are thread-safe and intended to be shared.
Avoid creating a DaprEncryptionClient per-operation.
A DaprEncryptionClient can be configured by invoking methods on the DaprEncryptionClientBuilder class before calling .Build()
to create the client. The settings for each DaprEncryptionClient are separate and cannot be changed after calling .Build().
Clone the SDK repo to try out some examples and get started.
1.7 - Dapr Messaging .NET SDK
Get up and running with the Dapr Messaging .NET SDK
With the Dapr Messaging package, you can interact with the Dapr messaging APIs from a .NET application. In the
v1.15 release, this package only contains the functionality corresponding to the
streaming PubSub capability.
Future Dapr .NET SDK releases will migrate existing messaging capabilities out from Dapr.Client to this
Dapr.Messaging package. This will be documented in the release notes, documentation and obsolete attributes in advance.
1.7.1 - How to: Author and manage Dapr streaming subscriptions in the .NET SDK
Learn how to author and manage Dapr streaming subscriptions using the .NET SDK
Let’s create a subscription to a pub/sub topic or queue using the streaming capability. We’ll use the
simple example provided here for the following demonstration and walk through it to explain how you can
configure message handlers at runtime without requiring an endpoint to be pre-configured. In this guide, you will:
From the .NET SDK root directory, navigate to the Dapr streaming PubSub example.
cd examples/Client/PublishSubscribe
Run the application locally
To run the Dapr application, you need to start the .NET program and a Dapr sidecar. Navigate to the StreamingSubscriptionExample directory.
cd StreamingSubscriptionExample
We’ll run a command that starts both the Dapr sidecar and the .NET program at the same time.
dapr run --app-id pubsubapp --dapr-grpc-port 4001 --dapr-http-port 3500 -- dotnet run
Dapr listens for HTTP requests at http://localhost:3500 and internal gRPC requests at http://localhost:4001.
Register the Dapr PubSub client with dependency injection
The Dapr Messaging SDK provides an extension method to simplify the registration of the Dapr PubSub client. Before
completing the dependency injection registration in Program.cs, add the following line:
var builder = WebApplication.CreateBuilder(args);

// Add anywhere between these two lines
builder.Services.AddDaprPubSubClient(); // That's it

var app = builder.Build();
It’s possible that you may want to provide some configuration options to the Dapr PubSub client that
should be present with each call to the sidecar such as a Dapr API token, or you want to use a non-standard
HTTP or gRPC endpoint. This is possible through use of an overload of the registration method that allows configuration
of a DaprPublishSubscribeClientBuilder instance:
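A sketch of this overload, assuming the DaprPublishSubscribeClientBuilder exposes the same UseDaprApiToken and UseHttpEndpoint methods shown for the other Dapr client builders in this document (the values are illustrative):
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprPubSubClient((_, daprPubSubClientBuilder) =>
{
    // Set the API token
    daprPubSubClientBuilder.UseDaprApiToken("abc123");
    // Specify a non-standard HTTP endpoint
    daprPubSubClientBuilder.UseHttpEndpoint("http://dapr.my-company.com");
});

var app = builder.Build();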
Still, it’s possible that whatever values you wish to inject need to be retrieved from some other source, itself registered as a dependency. There’s one more overload you can use to inject an IServiceProvider into the configuration action method. In the following example, we register a fictional singleton that can retrieve secrets from somewhere and pass it into the configuration method for AddDaprPubSubClient so
we can retrieve our Dapr API token from somewhere else for registration here:
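A sketch of this approach, assuming a fictional SecretService registered in DI and the same builder methods as above:
var builder = WebApplication.CreateBuilder(args);

// Register a fictional service that retrieves secrets from somewhere
builder.Services.AddSingleton<SecretService>();

builder.Services.AddDaprPubSubClient((serviceProvider, daprPubSubClientBuilder) =>
{
    // Retrieve an instance of the `SecretService` from the service provider
    var secretService = serviceProvider.GetRequiredService<SecretService>();
    var daprApiToken = secretService.GetSecret("DaprApiToken").Value;

    // Configure the `DaprPublishSubscribeClientBuilder`
    daprPubSubClientBuilder.UseDaprApiToken(daprApiToken);
});

var app = builder.Build();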
It’s possible to configure the Dapr PubSub client using the values in your registered IConfiguration as well without
explicitly specifying each of the value overrides using the DaprPublishSubscribeClientBuilder as demonstrated in the previous
section. Rather, by populating an IConfiguration made available through dependency injection, the AddDaprPubSubClient()
registration will automatically use these values over their respective defaults.
Start by populating the values in your configuration. This can be done in several different ways as demonstrated below.
Configuration via ConfigurationBuilder
Application settings can be configured without using a configuration source and by instead populating the value in-memory
using a ConfigurationBuilder instance:
var builder = WebApplication.CreateBuilder();

// Create the configuration
var configuration = new ConfigurationBuilder()
    .AddInMemoryCollection(new Dictionary<string, string>
    {
        { "DAPR_HTTP_ENDPOINT", "http://localhost:54321" },
        { "DAPR_API_TOKEN", "abc123" }
    })
    .Build();

builder.Configuration.AddConfiguration(configuration);
builder.Services.AddDaprPubSubClient(); // This will automatically populate the HTTP endpoint and API token values from the IConfiguration
Configuration via Environment Variables
Application settings can be accessed from environment variables available to your application.
The following environment variables will be used to populate both the HTTP endpoint and API token used to register the Dapr PubSub client:
DAPR_HTTP_ENDPOINT: http://localhost:54321
DAPR_API_TOKEN: abc123
The Dapr PubSub client will be configured to use both the HTTP endpoint http://localhost:54321 and populate all outbound
requests with the API token header abc123.
Configuration via prefixed Environment Variables
However, in shared-host scenarios where there are multiple applications all running on the same machine without using
containers or in development environments, it’s not uncommon to prefix environment variables. The following example
assumes that both the HTTP endpoint and the API token will be pulled from environment variables prefixed with the
value “myapp_”. The two environment variables used in this scenario are as follows:
| Key | Value |
|-----|-------|
| myapp_DAPR_HTTP_ENDPOINT | http://localhost:54321 |
| myapp_DAPR_API_TOKEN | abc123 |
These environment variables will be loaded into the registered configuration in the following example and made available
without the prefix attached.
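For illustration, a minimal sketch of loading the prefixed variables into configuration using the standard environment variable configuration provider:

var builder = WebApplication.CreateBuilder();

//Load environment variables that start with "myapp_" and strip the prefix from the keys
builder.Configuration.AddEnvironmentVariables(prefix: "myapp_");

builder.Services.AddDaprPubSubClient(); //Uses the DAPR_HTTP_ENDPOINT and DAPR_API_TOKEN values from the configuration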
The Dapr PubSub client will be configured to use both the HTTP endpoint http://localhost:54321 and populate all outbound
requests with the API token header abc123.
Use the Dapr PubSub client without relying on dependency injection
While the use of dependency injection simplifies the use of complex types in .NET and makes it easier to
deal with complicated configurations, you’re not required to register the DaprPublishSubscribeClient in this way.
Rather, you can also elect to create an instance of it from a DaprPublishSubscribeClientBuilder instance as
demonstrated below:
public class MySampleClass
{
    public void DoSomething()
    {
        var daprPubSubClientBuilder = new DaprPublishSubscribeClientBuilder();
        var daprPubSubClient = daprPubSubClientBuilder.Build();

        //Do something with the `daprPubSubClient`
    }
}
Set up message handler
The streaming subscription implementation in Dapr gives you greater control over handling backpressure from events by
leaving the messages in the Dapr runtime until your application is ready to accept them. The .NET SDK supports a
high-performance queue for maintaining a local cache of these messages in your application while processing is pending.
These messages will persist in the queue until processing either times out for each one or a response action is taken
for each (typically after processing succeeds or fails). Until this response action is received by the Dapr runtime,
the messages will be persisted by Dapr and made available in case of a service failure.
The various response actions available are as follows:
| Response Action | Description |
|-----------------|-------------|
| Retry | The event should be delivered again in the future. |
| Drop | The event should be deleted (or forwarded to a dead letter queue, if configured) and not attempted again. |
| Success | The event should be deleted as it was successfully processed. |
The handler will receive only one message at a time and if a cancellation token is provided to the subscription,
this token will be provided during the handler invocation.
The handler must be configured to return a Task<TopicResponseAction> indicating one of these operations, even if from
a try/catch block. If an exception is not caught by your handler, the subscription will use the response action configured
in the options during subscription registration.
The following demonstrates the sample message handler provided in the example:
Task<TopicResponseAction> HandleMessageAsync(TopicMessage message, CancellationToken cancellationToken = default)
{
    try
    {
        //Do something with the message
        Console.WriteLine(Encoding.UTF8.GetString(message.Data.Span));
        return Task.FromResult(TopicResponseAction.Success);
    }
    catch
    {
        return Task.FromResult(TopicResponseAction.Retry);
    }
}
Configure and subscribe to the PubSub topic
Configuration of the streaming subscription requires the name of the PubSub component registered with Dapr, the name
of the topic or queue being subscribed to, the DaprSubscriptionOptions providing the configuration for the subscription,
the message handler and an optional cancellation token. The only required argument to the DaprSubscriptionOptions is
the default MessageHandlingPolicy which consists of a per-event timeout and the TopicResponseAction to take when
that timeout occurs.
Other options are as follows:
| Property Name | Description |
|---------------|-------------|
| Metadata | Additional subscription metadata |
| DeadLetterTopic | The optional name of the dead-letter topic to send dropped messages to. |
| MaximumQueuedMessages | By default, there is no maximum boundary enforced for the internal queue, but setting this property would impose an upper limit. |
| MaximumCleanupTimeout | When the subscription is disposed of or the token flags a cancellation request, this specifies the maximum amount of time available to process the remaining messages in the internal queue. |
Subscription is then configured as in the following example:
var messagingClient = app.Services.GetRequiredService<DaprPublishSubscribeClient>();

var cancellationTokenSource = new CancellationTokenSource(TimeSpan.FromSeconds(60)); //Override the default of 30 seconds
var options = new DaprSubscriptionOptions(new MessageHandlingPolicy(TimeSpan.FromSeconds(10), TopicResponseAction.Retry));
var subscription = await messagingClient.SubscribeAsync("pubsub", "mytopic", options, HandleMessageAsync, cancellationTokenSource.Token);
Terminate and clean up subscription
When you’ve finished with your subscription and wish to stop receiving new events, simply await a call to
DisposeAsync() on your subscription instance. This will cause the client to unregister from additional events and
proceed to finish processing all the events still leftover in the backpressure queue, if any, before disposing of any
internal resources. This cleanup will be limited to the timeout interval provided in the DaprSubscriptionOptions when
the subscription was registered and by default, this is set to 30 seconds.
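For example, a brief sketch of cleaning up the subscription created above:

//Stop receiving new events and drain any remaining queued messages before releasing resources
await subscription.DisposeAsync();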
1.7.2 - DaprPublishSubscribeClient usage
Essential tips and advice for using DaprPublishSubscribeClient
Lifetime management
A DaprPublishSubscribeClient is a version of the Dapr client that is dedicated to interacting with the Dapr Messaging API.
It can be registered alongside a DaprClient and other Dapr clients without issue.
It maintains access to networking resources in the form of TCP sockets used to communicate with the Dapr sidecar and implements
IAsyncDisposable to support the eager cleanup of resources.
For best performance, create a single long-lived instance of DaprPublishSubscribeClient and provide access to that shared
instance throughout your application. DaprPublishSubscribeClient instances are thread-safe and intended to be shared.
This can be aided by utilizing the dependency injection functionality. The registration method supports registration
as a singleton, a scoped instance, or as transient (meaning it’s recreated every time it’s injected), and it also enables
registration to utilize values from an IConfiguration or other injected service in a way that’s impractical when
creating the client from scratch in each of your classes.
Avoid creating a DaprPublishSubscribeClient for each operation and disposing it when the operation is complete. It’s
intended that the DaprPublishSubscribeClient should only be disposed when you no longer wish to receive events on the
subscription as disposing it will cancel the ongoing receipt of new events.
Configuring DaprPublishSubscribeClient via the DaprPublishSubscribeClientBuilder
A DaprPublishSubscribeClient can be configured by invoking methods on the DaprPublishSubscribeClientBuilder class
before calling .Build() to create the client itself. The settings for each DaprPublishSubscribeClient are separate
and cannot be changed after calling .Build().
var daprPubsubClient = new DaprPublishSubscribeClientBuilder()
    .UseDaprApiToken("abc123") // Specify the API token used to authenticate to other Dapr sidecars
    .Build();
The DaprPublishSubscribeClientBuilder contains settings for:
The HTTP endpoint of the Dapr sidecar
The gRPC endpoint of the Dapr sidecar
The JsonSerializerOptions object used to configure JSON serialization
The GrpcChannelOptions object used to configure gRPC
The API token used to authenticate requests to the sidecar
The factory method used to create the HttpClient instance used by the SDK
The timeout used for the HttpClient instance when making requests to the sidecar
The SDK will read the following environment variables to configure the default values:
DAPR_HTTP_ENDPOINT: used to find the HTTP endpoint of the Dapr sidecar, example: https://dapr-api.mycompany.com
DAPR_GRPC_ENDPOINT: used to find the gRPC endpoint of the Dapr sidecar, example: https://dapr-grpc-api.mycompany.com
DAPR_HTTP_PORT: if DAPR_HTTP_ENDPOINT is not set, this is used to find the HTTP local endpoint of the Dapr sidecar
DAPR_GRPC_PORT: if DAPR_GRPC_ENDPOINT is not set, this is used to find the gRPC local endpoint of the Dapr sidecar
DAPR_API_TOKEN: used to set the API token
Configuring gRPC channel options
Dapr’s use of CancellationToken for cancellation relies on the configuration of the gRPC channel options. If you
need to configure these options yourself, make sure to enable the ThrowOperationCanceledOnCancellation setting.
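As a rough sketch, and assuming the builder exposes a UseGrpcChannelOptions method like the other Dapr client builders do (this method name is an assumption), the setting could be enabled as follows:

using Grpc.Net.Client;

var daprPubSubClient = new DaprPublishSubscribeClientBuilder()
    .UseGrpcChannelOptions(new GrpcChannelOptions
    {
        //Surfaces CancellationToken cancellation as OperationCanceledException
        ThrowOperationCanceledOnCancellation = true
    })
    .Build();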
Using cancellation with DaprPublishSubscribeClient
The APIs on DaprPublishSubscribeClient perform asynchronous operations and accept an optional CancellationToken
parameter. This follows a standard .NET practice for cancellable operations. Note that when cancellation occurs, there is
no guarantee that the remote endpoint stops processing the request, only that the client has stopped waiting for completion.
When an operation is cancelled, it will throw an OperationCanceledException.
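As a brief illustration (the component and topic names here are placeholders), cancellation could be observed like this:

using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));

try
{
    var subscription = await daprPubSubClient.SubscribeAsync("pubsub", "mytopic", options, HandleMessageAsync, cts.Token);
}
catch (OperationCanceledException)
{
    //The client stopped waiting; the sidecar may still be processing the request
}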
Configuring DaprPublishSubscribeClient via dependency injection
Using the built-in extension methods for registering the DaprPublishSubscribeClient in a dependency injection container
provides the benefit of registering the long-lived service a single time, centralizing complex configuration, and improving
performance by ensuring similarly long-lived resources are reused when possible (e.g. HttpClient instances).
There are three overloads available to give the developer the greatest flexibility in configuring the client for their
scenario. Each of these will register the IHttpClientFactory on your behalf if not already registered, and configure
the DaprPublishSubscribeClientBuilder to use it when creating the HttpClient instance in order to re-use the same
instance as much as possible and avoid socket exhaustion and other issues.
In the first approach, there’s no configuration done by the developer and the DaprPublishSubscribeClient is configured with
the default settings.
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprPubSubClient(); //Registers the `DaprPublishSubscribeClient` to be injected as needed

var app = builder.Build();
Sometimes the developer will need to configure the created client using the various configuration options detailed above. This is done through an overload that passes in the DaprPublishSubscribeClientBuilder and exposes methods for configuring the necessary options.
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDaprPubSubClient((_, daprPubSubClientBuilder) =>
{
    //Set the API token
    daprPubSubClientBuilder.UseDaprApiToken("abc123");
    //Specify a non-standard HTTP endpoint
    daprPubSubClientBuilder.UseHttpEndpoint("http://dapr.my-company.com");
});

var app = builder.Build();
Finally, it’s possible that the developer may need to retrieve information from another service in order to populate these configuration values. That value may be provided from a DaprClient instance, a vendor-specific SDK or some local service, but as long as it’s also registered in DI, it can be injected into this configuration operation via the last overload:
var builder = WebApplication.CreateBuilder(args);

//Register a fictional service that retrieves secrets from somewhere
builder.Services.AddSingleton<SecretService>();

builder.Services.AddDaprPubSubClient((serviceProvider, daprPubSubClientBuilder) =>
{
    //Retrieve an instance of the `SecretService` from the service provider
    var secretService = serviceProvider.GetRequiredService<SecretService>();
    var daprApiToken = secretService.GetSecret("DaprApiToken").Value;

    //Configure the `DaprPublishSubscribeClientBuilder`
    daprPubSubClientBuilder.UseDaprApiToken(daprApiToken);
});

var app = builder.Build();
1.8 - Best Practices for the Dapr .NET SDK
Using Dapr .NET SDK effectively
Building with confidence
The Dapr .NET SDK offers a rich set of capabilities for building distributed applications. This section provides
practical guidance for using the SDK effectively in production scenarios—focusing on reliability, maintainability, and
developer experience.
Topics covered include:
Error handling strategies across Dapr building blocks
Managing experimental features and suppressing related warnings
Leveraging source analyzers and generators to reduce boilerplate and catch issues early
General .NET development practices in Dapr-based applications
Error model guidance
Dapr operations can fail for many reasons—network issues, misconfigured components, or transient faults. The SDK
provides structured error types to help you distinguish between retryable and fatal errors.
Learn how to use DaprException and its derived types effectively here.
Experimental attributes
Some SDK features are marked as experimental and may change in future releases. These are annotated with
[Experimental] and generate build-time warnings by default. You can:
Suppress warnings selectively using #pragma warning disable
Use SuppressMessage attributes for finer control
Track experimental usage across your codebase
Learn more about our use of the [Experimental] attribute here.
Source tooling
The SDK includes Roslyn-based analyzers and source generators to help you write better code with less effort. These tools:
Warn about common misuses of the SDK
Generate boilerplate for actor registration and invocation
Support IDE integration for faster feedback
Read more about how to install and use these analyzers here.
Additional guidance
This section is designed to support a wide range of development scenarios. As your applications grow in complexity, you’ll find increasingly relevant practices and patterns for working with Dapr in .NET—from actor lifecycle management to configuration strategies and performance tuning.
1.8.1 - Error Model in the Dapr .NET SDK
Learn how to use the richer error model in the .NET SDK.
The Dapr .NET SDK supports the richer error model, implemented by the Dapr runtime. This model provides a way for applications to enrich their errors with added context,
allowing consumers of the application to better understand the issue and resolve it faster. You can read more about the richer error model here, and you
can find the Dapr proto file implementing these errors here.
The Dapr .NET SDK implements all details supported by the Dapr runtime, implemented in the Dapr.Common.Exceptions namespace, and is accessible through
the DaprException extension method TryGetExtendedErrorInfo. Currently, this detail extraction is only supported for
RpcExceptions where the details are present.
// Example usage of ExtendedErrorInfo
try
{
    // Perform some action with the Dapr client that throws a DaprException.
}
catch (DaprException daprEx)
{
    if (daprEx.TryGetExtendedErrorInfo(out DaprExtendedErrorInfo errorInfo))
    {
        Console.WriteLine(errorInfo.Code);
        Console.WriteLine(errorInfo.Message);

        foreach (DaprExtendedErrorDetail detail in errorInfo.Details)
        {
            Console.WriteLine(detail.ErrorType);
            switch (detail.ErrorType)
            {
                case ExtendedErrorType.ErrorInfo:
                    Console.WriteLine(detail.Reason);
                    Console.WriteLine(detail.Domain);
                    break;
                default:
                    Console.WriteLine(detail.TypeUrl);
                    break;
            }
        }
    }
}
DaprExtendedErrorInfo
Contains Code (the status code) and Message (the error message) associated with the error, parsed from an inner RpcException.
Also contains a collection of DaprExtendedErrorDetails parsed from the details in the exception.
DaprExtendedErrorDetail
All details implement the abstract DaprExtendedErrorDetail and have an associated DaprExtendedErrorType.
RetryInfo
Information notifying the client how long to wait before they should retry. Provides a DaprRetryDelay with the properties
Second (offset in seconds) and Nano (offset in nanoseconds).
DebugInfo
Debugging information offered by the server. Contains StackEntries (a collection of strings containing the stack trace), and
Detail (further debugging information).
QuotaFailure
Information relating to some quota that may have been reached, such as a daily usage limit on an API. It has one property Violations,
a collection of DaprQuotaFailureViolation, which each contain a Subject (the subject of the request) and Description (further information regarding the failure).
PreconditionFailure
Information informing the client that some required precondition was not met. Has one property Violations, a collection of
DaprPreconditionFailureViolation, which each has Subject (subject where the precondition failure occurred, e.g. “Azure”),
Type (representation of the precondition type, e.g. “TermsOfService”), and Description (further description e.g. “ToS must be accepted.”).
RequestInfo
Information returned by the server that the server can later use to identify the client’s request. Contains the
RequestId and ServingData properties: RequestId is some string (such as a UID) the server can interpret,
and ServingData is some arbitrary data that made up part of the request.
LocalizedMessage
Contains a localized message, along with the locale of the message. Contains Locale (the locale e.g. “en-US”) and Message (the localized message).
BadRequest
Describes a bad request field. Contains collection of DaprBadRequestDetailFieldViolation, which each has Field (the offending field in request, e.g. ‘first_name’) and
Description (further information detailing the reason, e.g. “first_name cannot contain special characters”).
ErrorInfo
Details the cause of an error. Contains three properties, Reason (the reason for the error, which should take the form of UPPER_SNAKE_CASE, e.g. DAPR_INVALID_KEY),
Domain (domain the error belongs to, e.g. ‘dapr.io’), and Metadata, a key/value-based collection with further information.
Help
Provides resources for the client to perform further research into the issue. Contains a collection of DaprHelpDetailLink,
which provides Url (a url to help or documentation), and Description (a description of what the link provides).
ResourceInfo
Provides information relating to an accessed resource. Provides four properties: ResourceType (type of the resource being accessed, e.g. “Azure service bus”),
ResourceName (the name of the resource, e.g. “my-configured-service-bus”), Owner (the owner of the resource, e.g. “subscriptionowner@dapr.io”),
and Description (further information on the resource relating to the error, e.g. “missing permissions to use this resource”).
Unknown
Returned when the detail type url cannot be mapped to the correct DaprExtendedErrorDetail implementation.
Provides one property TypeUrl (the type url that could not be parsed, e.g. “type.googleapis.com/Google.rpc.UnrecognizedType”).
1.8.2 - Experimental Attributes
Learn about why we mark some methods with the [Experimental] attribute
Experimental Attributes
Introduction to Experimental Attributes
With the release of .NET 8, C# 12 introduced the [Experimental] attribute, which provides a standardized way to mark
APIs that are still in development or experimental. This attribute is defined in the System.Diagnostics.CodeAnalysis
namespace and requires a diagnostic ID parameter used to generate compiler warnings when the experimental API
is used.
In the Dapr .NET SDK, we now use the [Experimental] attribute instead of [Obsolete] to mark building blocks and
components that have not yet passed the stable lifecycle certification. This approach provides a clearer distinction
between:
Experimental APIs - Features that are available but still evolving and have not yet been certified as stable
according to the Dapr Component Certification Lifecycle.
Obsolete APIs - Features that are truly deprecated and will be removed in a future release.
Usage in the Dapr .NET SDK
In the Dapr .NET SDK, we apply the [Experimental] attribute at the class level for building blocks that are still in
the Alpha or Beta stages of the Component Certification Lifecycle.
The attribute includes:
A diagnostic ID that identifies the experimental building block
A URL that points to the relevant documentation for that block
The diagnostic IDs follow a naming convention of DAPR_[BUILDING_BLOCK_NAME], such as:
DAPR_CONVERSATION - For the Conversation building block
DAPR_CRYPTOGRAPHY - For the Cryptography building block
DAPR_JOBS - For the Jobs building block
DAPR_DISTRIBUTEDLOCK - For the Distributed Lock building block
Suppressing Experimental Warnings
When you use APIs marked with the [Experimental] attribute, the compiler will generate errors.
To build your solution without marking your own code as experimental, you will need to suppress these errors. Here are
several approaches to do this:
Option 1: Using #pragma directive
You can use the #pragma warning directive to suppress the warning for specific sections of code:
// Disable experimental warning
#pragma warning disable DAPR_CRYPTOGRAPHY

// Your code using the experimental API
var client = new DaprEncryptionClient();

// Re-enable the warning
#pragma warning restore DAPR_CRYPTOGRAPHY
This approach is useful when you want to suppress warnings only for specific sections of your code.
Option 2: Project-level suppression
To suppress warnings for an entire project, add the relevant diagnostic IDs (for example, DAPR_JOBS or DAPR_CONVERSATION) to the <NoWarn> property in your .csproj file.
Alternatively, the same suppressions can be applied across multiple projects at once with a Directory.Build.props file.
This file should be placed in the root directory of your test projects. You can learn more about using
Directory.Build.props files in the
MSBuild documentation.
Lifecycle of Experimental APIs
As building blocks move through the certification lifecycle and reach the “Stable” stage, the [Experimental] attribute will be removed. No migration or code changes will be required from users when this happens, except for the removal of any warning suppressions if they were added.
Conversely, the [Obsolete] attribute will now be reserved exclusively for APIs that are truly deprecated and scheduled for removal. When you see a method or class marked with [Obsolete], you should plan to migrate away from it according to the migration guidance provided in the attribute message.
Best Practices
In application code:
Be cautious when using experimental APIs, as they may change in future releases
Consider isolating usage of experimental APIs to make future updates easier
Document your use of experimental APIs for team awareness
In test code:
Use project-level suppression to avoid cluttering test code with warning suppressions
Regularly review which experimental APIs you’re using and check if they’ve been stabilized
When contributing to the SDK:
Use [Experimental] for new building blocks that haven’t completed certification
Use [Obsolete] only for truly deprecated APIs
Provide clear documentation links in the UrlFormat parameter
1.8.3 - Dapr .NET SDK Analyzers
Dapr supports a growing collection of optional Roslyn analyzers and code fix providers that inspect your code for
code quality issues. Starting with the release of v1.16, developers have the opportunity to install additional projects
from NuGet alongside each of the standard capability packages to enable these analyzers in their solutions.
Note
A future release of the Dapr .NET SDK will include these analyzers by default without requiring a separate package
install.
Rule violations will typically be marked as Info or Warning so that if the analyzer identifies an issue, it won’t
necessarily break builds. All code analysis violations appear with the prefix “DAPR” and are uniquely distinguished
by a number following this prefix.
Note
At this time, the first two digits of the diagnostic identifier map one-to-one to distinct Dapr packages, but this
is subject to change in the future as more analyzers are developed.
Install and configure analyzers
The following packages will be available via NuGet following the v1.16 Dapr release:
Dapr.Actors.Analyzers
Dapr.Jobs.Analyzers
Dapr.Workflow.Analyzers
Install each NuGet package on every project where you want the analyzers to run. The package will be installed as a
project dependency and analyzers will run as you write your code or as part of a CI/CD build. The analyzers will flag
issues in your existing code and warn you about new issues as you build your project.
Many of our analyzers have associated code fixes that can be applied to automatically correct the problem. If your IDE
supports this capability, any available code fixes will show up as an inline menu option in your code.
Further, most of our analyzers should also report a specific line and column number in your code of the syntax that’s
been identified as a key aspect of the rule. If your IDE supports it, double clicking any of the analyzer warnings
should jump directly to the part of your code responsible for violating the analyzer’s rule.
Suppress specific analyzers
If you wish to keep an analyzer from firing against some particular piece of your project, their outputs can be
individually targeted for suppression through a number of ways. Read more about suppressing analyzers in projects
or files in the associated .NET documentation.
Disable all analyzers
If you wish to disable all analyzers in your project without removing any packages providing them, set
the EnableNETAnalyzers property to false in your csproj file.
Available Analyzers
| Diagnostic ID | Dapr Package | Category | Severity | Version Added | Description | Code Fix Available |
|---------------|--------------|----------|----------|---------------|-------------|--------------------|
| DAPR1301 | Dapr.Workflow | Usage | Warning | 1.16 | The workflow type is not registered with the dependency injection provider | Yes |
| DAPR1302 | Dapr.Workflow | Usage | Warning | 1.16 | The workflow activity type is not registered with the dependency injection provider | Yes |
| DAPR1401 | Dapr.Actors | Usage | Warning | 1.16 | Actor timer method invocations require the named callback method to exist on type | No |
| DAPR1402 | Dapr.Actors | Usage | Warning | 1.16 | The actor type is not registered with dependency injection | Yes |
| DAPR1403 | Dapr.Actors | Interoperability | Info | 1.16 | Set options.UseJsonSerialization to true to support interoperability with non-.NET actors | Yes |
| DAPR1404 | Dapr.Actors | Usage | Warning | 1.16 | Call app.MapActorsHandlers to map endpoints for Dapr actors | Yes |
| DAPR1501 | Dapr.Jobs | Usage | Warning | 1.16 | Job invocations require the MapDaprScheduledJobHandler to be set and configured for each anticipated job on IEndpointRouteBuilder | No |
Analyzer Categories
The following are each of the eligible categories that an analyzer can be assigned to and are modeled after the
standard categories used by the .NET analyzers:
Design
Documentation
Globalization
Interoperability
Maintainability
Naming
Performance
Reliability
Security
Usage
1.9 - Developing applications with the Dapr .NET SDK
Deployment integrations with the Dapr .NET SDK
Thinking more than one at a time
Using your favorite IDE or editor to launch an application typically assumes that you only need to run one thing:
the application you’re debugging. However, developing microservices challenges you to think about your local
development process for more than one at a time. A microservices application has multiple services that you might
need running simultaneously, and dependencies (like state stores) to manage.
Adding Dapr to your development process means you need to manage the following concerns:
Each service you want to run
A Dapr sidecar for each service
Dapr component and configuration manifests
Additional dependencies such as state stores
optional: the Dapr placement service for actors
This document assumes that you’re building a production application and want to create a repeatable and robust set of
development practices. The guidance here is generalized, and applies to any .NET server application using
Dapr (including actors).
Managing components
You have two primary methods of storing component definitions for local development with Dapr:
Use the default location (~/.dapr/components)
Use your own location
Creating a folder within your source code repository to store components and configuration will give you a way to
version and share these definitions. The guidance provided here will assume you created a folder next to the
application source code to store these files.
Development options
Choose one of these links to learn about tools you can use in local development scenarios. It’s suggested that
you familiarize yourself with each of them to get a sense of the options provided by the .NET SDK.
The Dapr CLI provides you with a good base to work from by initializing a local Redis container, a Zipkin container, the placement service, and component manifests for Redis. This will enable you to work with the service invocation, state management, and pub/sub building blocks on a fresh install with no additional setup:
You can run .NET services with dapr run as your strategy for developing locally. Plan on running one of these commands per-service in order to launch your application.
Pro: this is easy to set up since it’s part of the default Dapr installation
Con: this uses long-running docker containers on your machine, which might not be desirable
Con: the scalability of this approach is poor since it requires running a separate command per-service
Using the Dapr CLI
For each service you need to choose:
A unique app-id for addressing (app-id)
A unique listening port for HTTP (port)
You should also have decided where you are storing components (components-path).
The following command can be run from multiple terminals to launch each service, with the respective values substituted.
dapr run --app-id <app-id> --app-port <port> --components-path <components-path> -- dotnet run -p <project> --urls http://localhost:<port>
Explanation: this command will use dapr run to launch each service and its sidecar. The first half of the command (before --) passes required configuration to the Dapr CLI. The second half of the command (after --) passes required configuration to the dotnet run command.
💡 Ports
Since you need to configure a unique port for each service, you can use this command to pass that port value to both Dapr and the service. --urls http://localhost:<port> will configure ASP.NET Core to listen for traffic on the provided port. Using configuration at the command line is a more flexible approach than hardcoding a listening port elsewhere.
If any of your services do not accept HTTP traffic, then modify the command above by removing the --app-port and --urls arguments.
Next steps
If you need to debug, then use the attach feature of your debugger to attach to one of the running processes.
If you want to scale up this approach, then consider building a script which automates this process for your whole application.
1.9.2 - Dapr .NET SDK Development with Docker-Compose
docker-compose is a CLI tool included with Docker Desktop that you can use to run multiple containers at a time. It is a way to automate the lifecycle of multiple containers together, and offers a development experience similar to a production environment for applications targeting Kubernetes.
Pro: Since docker-compose manages containers for you, you can make dependencies part of the application definition and stop the long-running containers on your machine.
Con: most investment required, services need to be containerized to get started.
Con: can be difficult to debug and troubleshoot if you are unfamiliar with Docker.
Using docker-compose
From the .NET perspective, there is no specialized guidance needed for docker-compose with Dapr. docker-compose runs containers, and once your service is in a container, configuring it is similar to any other programming technology.
💡 App Port
In a container, an ASP.NET Core app will listen on port 80 by default. Remember this for when you need to configure the --app-port later.
To summarize the approach:
Create a Dockerfile for each service
Create a docker-compose.yaml and check it in to the source code repository
Similar to running locally with dapr run, you need to choose a unique app-id for each service. Choosing the container name as the app-id will make this simple to remember.
The compose file will contain at a minimum:
A network that the containers use to communicate
Each service’s container
A <service>-daprd sidecar container with the service’s port and app-id specified
Additional dependencies that run in containers (redis for example)
optional: Dapr placement container (for actors)
You can also view a larger example from the eShopOnContainers sample application.
1.9.3 - Dapr .NET SDK Development with .NET Aspire
Learn about local development with .NET Aspire
.NET Aspire
.NET Aspire is a development tool
designed to make it easier to include external software into .NET applications by providing a framework that allows
third-party services to be readily integrated, observed and provisioned alongside your own software.
Aspire simplifies local development by providing rich integration with popular IDEs including
Microsoft Visual Studio,
Visual Studio Code,
JetBrains Rider and others
to launch your application with the debugger while automatically launching and provisioning access to other
integrations as well, including Dapr.
While Aspire also assists with deployment of your application to various cloud hosts like Microsoft Azure and
Amazon AWS, deployment is currently outside the scope of this guide. More information can be found in Aspire’s
documentation here.
An end-to-end demonstration showing service invocation between multiple Dapr-enabled
services can be found here.
Prerequisites
Both the Dapr .NET SDK and .NET Aspire are compatible with .NET 8
or .NET 9
We’ll start by creating a brand new .NET application. Open your preferred CLI and navigate to the directory you wish
to create your new .NET solution within. Start by using the following command to install a template that will create
an empty Aspire application:
dotnet new install Aspire.ProjectTemplates
Once that’s installed, proceed to create an empty .NET Aspire application in your current directory. The -n argument
allows you to specify the name of the output solution. If it’s excluded, the .NET CLI will instead use the name
of the output directory, e.g. C:\source\aspiredemo will result in the solution being named aspiredemo. The rest
of this tutorial will assume a solution named aspiredemo.
dotnet new aspire -n aspiredemo
This will create two Aspire-specific directories and one file in your directory:
aspiredemo.AppHost/ contains the Aspire orchestration project that is used to configure each of the integrations
used in your application(s).
aspiredemo.ServiceDefaults/ contains a collection of extensions meant to be shared across your solution to aid in
resilience, service discovery and telemetry capabilities offered by Aspire (these are distinct from the capabilities
offered in Dapr itself).
aspiredemo.sln is the file that maintains the layout of your current solution
We’ll next create two projects that’ll serve as our Dapr applications and demonstrate Dapr functionality. From the same
directory, use the following to create an empty ASP.NET Core project called FrontEndApp and another called
BackEndApp. Each will be created relative to your current directory, in
FrontEndApp\FrontEndApp.csproj and BackEndApp\BackEndApp.csproj, respectively.
dotnet new web --name FrontEndApp
dotnet new web --name BackEndApp
Next we’ll configure the AppHost project to add the necessary package to support local Dapr development. Navigate
into the AppHost directory and install the CommunityToolkit.Aspire.Hosting.Dapr package from NuGet into the project.
We’ll also add a reference to our FrontEndApp project so we can reference it during the registration process.
This package was previously called Aspire.Hosting.Dapr, which has been marked as deprecated.
Next, we need to configure Dapr as a resource to be loaded alongside your project. Open the Program.cs file in that
project within your preferred IDE. It should look similar to the following:
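For reference, the AppHost Program.cs generated by the empty Aspire template is roughly the following (exact contents may vary by template version):

var builder = DistributedApplication.CreateBuilder(args);

builder.Build().Run();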
If you’re familiar with the dependency injection approach used in ASP.NET Core projects or others utilizing the
Microsoft.Extensions.DependencyInjection functionality, you’ll find that this will be a familiar experience.
Because we’ve already added a project reference to FrontEndApp, we need to start by adding a reference in this configuration
as well. Add the following before the builder.Build().Run() line:
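A minimal sketch of that registration, assuming the WithDaprSidecar() extension from the CommunityToolkit.Aspire.Hosting.Dapr package; the resource name "fe" is an arbitrary placeholder:

var myApp = builder
    .AddProject<Projects.FrontEndApp>("fe")
    .WithDaprSidecar();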
Because the project reference has been added to this solution, your project shows up as a type within the Projects
namespace for our purposes here. The name of the variable you assign the project to doesn’t much matter in this tutorial,
but would be used if you wanted to create a reference between this project and another using Aspire’s service discovery
functionality.
Adding .WithDaprSidecar() configures Dapr as a .NET Aspire resource so that when the project runs, the sidecar will be
deployed alongside your application. This accepts a number of different options and could optionally be configured as in
the following example:
DaprSidecarOptions sidecarOptions = new()
{
    AppId = "how-dapr-identifies-your-app",
    AppPort = 8080, //Note that this argument is required if you intend to configure pubsub, actors or workflows as of Aspire v9.0
    DaprGrpcPort = 50001,
    DaprHttpPort = 3500,
    MetricsPort = 9090
};

builder
    .AddProject<Projects.BackEndApp>("be")
    .WithReference(myApp)
    .WithDaprSidecar(sidecarOptions);
As indicated in the example above, as of .NET Aspire 9.0, if you intend to use any functionality in which Dapr needs to
call into your application such as pubsub, actors or workflows, you will need to specify your AppPort as
a configured option as Aspire will not automatically pass it to Dapr at runtime. It’s expected that this behavior will
change in a future release as a fix has been merged and can be tracked here.
Finally, let’s add an endpoint to the back-end app that we can invoke using Dapr’s service invocation and display on a
page to demonstrate that Dapr is working as expected.
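A minimal sketch of such an endpoint in the BackEndApp’s Program.cs; the route and message are placeholders and not part of the original example:

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

//A simple endpoint the front-end app can call through Dapr service invocation
app.MapGet("/hello", () => "Hello from the BackEndApp via Dapr!");

app.Run();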
When you open the solution in your IDE, ensure that aspiredemo.AppHost is configured as your startup project. When you
launch it in a debug configuration, you’ll note that your integrated console should reflect your expected Dapr
logs and that the sidecar will be available to your application.
1.10 - How to troubleshoot and debug with the Dapr .NET SDK
Tips, tricks, and guides for troubleshooting and debugging with the Dapr .NET SDKs
1.10.1 - Troubleshoot Pub/Sub with the .NET SDK
Troubleshoot Pub/Sub with the .NET SDK
Troubleshooting Pub/Sub
The most common problem with pub/sub is that the pub/sub endpoint in your application is not being called.
There are a few layers to this problem with different solutions:
The application is not receiving any traffic from Dapr
The application is not registering pub/sub endpoints with Dapr
The pub/sub endpoints are registered with Dapr, but the request is not reaching the desired endpoint
Step 1: Turn up the logs
This is important. Future steps will depend on your ability to see logging output. ASP.NET Core logs almost nothing with the default log settings, so you will need to change it.
Adjust the logging verbosity to include Information logging for ASP.NET Core as described here. Set the Microsoft key to Information.
Step 2: Verify you can receive traffic from Dapr
Start the application as you would normally (dapr run ...). Make sure that you’re including an --app-port argument in the commandline. Dapr needs to know that your application is listening for traffic. By default an ASP.NET Core application will listen for HTTP on port 5000 in local development.
Wait for Dapr to finish starting
Examine the logs
You should see a log entry like:
info: Microsoft.AspNetCore.Hosting.Diagnostics[1]
Request starting HTTP/1.1 GET http://localhost:5000/.....
During initialization Dapr will make some requests to your application for configuration. If you can’t find these then it means that something has gone wrong. Please ask for help either via an issue or in Discord (include the logs). If you see requests made to your application, then continue to step 3.
Step 3: Verify endpoint registration
Start the application as you would normally (dapr run ...).
Use curl at the command line (or another HTTP testing tool) to access the /dapr/subscribe endpoint.
Here’s an example command assuming your application’s listening port is 5000:
curl http://localhost:5000/dapr/subscribe -v
For a correctly configured application the output should look like the following:
* Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 5000 (#0)
> GET /dapr/subscribe HTTP/1.1
> Host: localhost:5000
> User-Agent: curl/7.64.1
> Accept: */*
>
< HTTP/1.1 200 OK
< Date: Fri, 15 Jan 2021 22:31:40 GMT
< Content-Type: application/json
< Server: Kestrel
< Transfer-Encoding: chunked
<
* Connection #0 to host localhost left intact
[{"topic":"deposit","route":"deposit","pubsubName":"pubsub"},{"topic":"withdraw","route":"withdraw","pubsubName":"pubsub"}]* Closing connection 0
Pay particular attention to the HTTP status code, and the JSON output.
< HTTP/1.1 200 OK
A 200 status code indicates success.
The JSON blob that’s included near the end is the output of /dapr/subscribe that’s processed by the Dapr runtime. In this case it’s using the ControllerSample in this repo - so this is an example of correct output.
With the output of this command in hand, you are ready to diagnose a problem or move on to the next step.
Option 0: The response was a 200 and included some pub/sub entries
If you have entries in the JSON output from this test, then the problem lies elsewhere; move on to step 4.
Option 1: The response was not a 200, or didn’t contain JSON
If the response was not a 200 or did not contain JSON, then the MapSubscribeHandler() endpoint was not reached.
Make sure you have some code like the following in Startup.cs and repeat the test.
app.UseRouting();

app.UseCloudEvents();

app.UseEndpoints(endpoints =>
{
    endpoints.MapSubscribeHandler(); // This is the Dapr subscribe handler
    endpoints.MapControllers();
});
If adding the subscribe handler did not resolve the problem, please open an issue on this repo and include the contents of your Startup.cs file.
Option 2: The response contained JSON but it was empty (like [])
If the JSON output was an empty array (like []) then the subscribe handler is registered, but no topic endpoints were registered.
If you’re using a controller for pub/sub you should have a method like:
[Topic("pubsub", "deposit")][HttpPost("deposit")]publicasyncTask<ActionResult>Deposit(...)// Using Pub/Sub routing[Topic("pubsub", "transactions", "event.type == \"withdraw.v2\"", 1)][HttpPost("withdraw")]publicasyncTask<ActionResult>Withdraw(...)
In this example the Topic and HttpPost attributes are required, but other details might be different.
If you’re using routing for pub/sub you should have an endpoint like:
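For example, a minimal sketch using the WithTopic(...) extension on a routed endpoint (the route and handler body are placeholders):

app.UseEndpoints(endpoints =>
{
    endpoints.MapSubscribeHandler();

    //Registers POST /deposit as a subscriber of the "deposit" topic on the "pubsub" component
    endpoints.MapPost("deposit", context => context.Response.WriteAsync("ok"))
        .WithTopic("pubsub", "deposit");
});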
In this example the call to WithTopic(...) is required but other details might be different.
After correcting this code and re-testing, if the JSON output is still the empty array (like []), then please open an issue on this repository and include the contents of Startup.cs and your pub/sub endpoint.
Step 4: Verify endpoint reachability
In this step we’ll verify that the entries registered with pub/sub are reachable. The last step should have left you with some JSON output like the following:
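For reference, that output is the JSON array returned earlier by /dapr/subscribe, for example:

[{"topic":"deposit","route":"deposit","pubsubName":"pubsub"},{"topic":"withdraw","route":"withdraw","pubsubName":"pubsub"}]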
Publish a test message to one of the registered topics and examine the application logs for the incoming request. If after doing this you still don’t understand the problem, please open an issue on this repo and include the contents of your Startup.cs.
Option 1: Routing did not execute
If you don’t see an entry for Microsoft.AspNetCore.Routing.EndpointMiddleware in the logs, then it means that the request was handled by something other than routing. Usually the problem in this case is a misbehaving middleware. Other logs from the request might give you a clue to what’s happening.
If you need help understanding the problem please open an issue on this repo and include the contents of your Startup.cs.
Option 2: Routing chose the wrong endpoint
If you see an entry for Microsoft.AspNetCore.Routing.EndpointMiddleware in the logs, but it contains the wrong endpoint then it means that you’ve got a routing conflict. The endpoint that was chosen will appear in the logs so that should give you an idea of what’s causing the conflict.
If you need help understanding the problem please open an issue on this repo and include the contents of your Startup.cs.
2 - Dapr Go SDK
Go SDK packages for developing Dapr applications
A client library to help build Dapr applications in Go. This client supports all public Dapr APIs while focusing on idiomatic Go experiences and developer productivity.
Client
Use the Go Client SDK for invoking public Dapr APIs
[**Learn more about the Go Client SDK**](https://v1-16.docs.dapr.io/developing-applications/sdks/go/go-client/)
Service
Use the Dapr Service (Callback) SDK for Go to create services that will be invoked by Dapr.
[**Learn more about the Go Service (Callback) SDK**](https://v1-16.docs.dapr.io/developing-applications/sdks/go/go-service/)
2.1 - Getting started with the Dapr client Go SDK
How to get up and running with the Dapr Go SDK
The Dapr client package allows you to interact with other Dapr applications from a Go application.
Workflows and their activities can be authored and managed using the Dapr Go SDK like so:
import(..."github.com/dapr/go-sdk/workflow"...)funcExampleWorkflow(ctx*workflow.WorkflowContext)(any,error){varoutputstringinput:="world"iferr:=ctx.CallActivity(ExampleActivity,workflow.ActivityInput(input)).Await(&output);err!=nil{returnnil,err}// Print output - "hello world"fmt.Println(output)returnnil,nil}funcExampleActivity(ctxworkflow.ActivityContext)(any,error){varinputintiferr:=ctx.GetInput(&input);err!=nil{return"",err}returnfmt.Sprintf("hello %s",input),nil}funcmain(){// Create a workflow workerw,err:=workflow.NewWorker()iferr!=nil{log.Fatalf("error creating worker: %v",err)}// Register the workfloww.RegisterWorkflow(ExampleWorkflow)// Register the activityw.RegisterActivity(ExampleActivity)// Start workflow runneriferr:=w.Start();err!=nil{log.Fatal(err)}// Create a workflow clientwfClient,err:=workflow.NewClient()iferr!=nil{log.Fatal(err)}// Start a new workflowid,err:=wfClient.ScheduleNewWorkflow(context.Background(),"ExampleWorkflow")iferr!=nil{log.Fatal(err)}// Wait for the workflow to completemetadata,err:=wfClient.WaitForWorkflowCompletion(ctx,id)iferr!=nil{log.Fatal(err)}// Print workflow status post-completionfmt.Println(metadata.RuntimeStatus)// Shutdown Workerw.Shutdown()}
For a more comprehensive guide on workflows visit these How-To guides:
For simple use-cases, Dapr client provides easy to use Save, Get, Delete methods:
ctx := context.Background()

data := []byte("hello")
store := "my-store" // defined in the component YAML

// save state with the key key1, default options: strong, last-write
if err := client.SaveState(ctx, store, "key1", data, nil); err != nil {
    panic(err)
}

// get state for key key1
item, err := client.GetState(ctx, store, "key1", nil)
if err != nil {
    panic(err)
}
fmt.Printf("data [key:%s etag:%s]: %s", item.Key, item.Etag, string(item.Value))

// delete state for key key1
if err := client.DeleteState(ctx, store, "key1", nil); err != nil {
    panic(err)
}
For more granular control, the Dapr Go client exposes the SetStateItem type, which can be used to gain more control over the state operations and allows multiple items to be saved at once:
You can create workflows using the Go SDK. For example, start with a simple workflow activity:
func TestActivity(ctx workflow.ActivityContext) (any, error) {
    var input int
    if err := ctx.GetInput(&input); err != nil {
        return "", err
    }

    // Do something here
    return "result", nil
}
The Dapr client Go SDK allows you to schedule, get, and delete jobs. Jobs enable you to schedule work to be executed at specific times or intervals.
Scheduling a Job
To schedule a new job, use the ScheduleJobAlpha1 method:
import("google.golang.org/protobuf/types/known/anypb")// Create job datadata,err:=anypb.New(&YourDataStruct{Message:"Hello, Job!"})iferr!=nil{panic(err)}// Create a simple job using the builder patternjob:=client.NewJob("my-scheduled-job",client.WithJobData(data),client.WithJobDueTime("10s"),// Execute in 10 seconds)// Schedule the joberr=client.ScheduleJobAlpha1(ctx,job)iferr!=nil{panic(err)}
Job with Schedule and Repeats
You can create recurring jobs using the Schedule field with cron expressions:
job:=client.NewJob("recurring-job",client.WithJobData(data),client.WithJobSchedule("0 9 * * *"),// Run at 9 AM every dayclient.WithJobRepeats(10),// Repeat 10 timesclient.WithJobTTL("1h"),// Job expires after 1 hour)err=client.ScheduleJobAlpha1(ctx,job)
Job with Failure Policy
Configure how jobs should handle failures using failure policies:
// Constant retry policy with max retries and interval
job := client.NewJob("resilient-job",
    client.WithJobData(data),
    client.WithJobDueTime("2024-01-01T10:00:00Z"),
    client.WithJobConstantFailurePolicy(),
    client.WithJobConstantFailurePolicyMaxRetries(3),
    client.WithJobConstantFailurePolicyInterval(30*time.Second),
)

err = client.ScheduleJobAlpha1(ctx, job)
For jobs that should not be retried on failure, use the drop policy:
// MyActor represents an example actor type.
type MyActor struct {
    actors.Actor
}

// MyActorMethod is a method that can be invoked on MyActor.
func (a *MyActor) MyActorMethod(ctx context.Context, req *actors.Message) (string, error) {
    log.Printf("Received message: %s", req.Data)
    return "Hello from MyActor!", nil
}

func main() {
    // Create a Dapr client
    daprClient, err := client.NewClient()
    if err != nil {
        log.Fatal("Error creating Dapr client: ", err)
    }

    // Register the actor type with Dapr
    actors.RegisterActor(&MyActor{})

    // Create an actor client
    actorClient := actors.NewClient(daprClient)

    // Create an actor ID
    actorID := actors.NewActorID("myactor")

    // Get or create the actor
    err = actorClient.SaveActorState(context.Background(), "myactorstore", actorID, map[string]interface{}{"data": "initial state"})
    if err != nil {
        log.Fatal("Error saving actor state: ", err)
    }

    // Invoke a method on the actor
    resp, err := actorClient.InvokeActorMethod(context.Background(), "myactorstore", actorID, "MyActorMethod", &actors.Message{Data: []byte("Hello from client!")})
    if err != nil {
        log.Fatal("Error invoking actor method: ", err)
    }
    log.Printf("Response from actor: %s", resp.Data)

    // Wait for a few seconds before terminating
    time.Sleep(5 * time.Second)

    // Delete the actor
    err = actorClient.DeleteActor(context.Background(), "myactorstore", actorID)
    if err != nil {
        log.Fatal("Error deleting actor: ", err)
    }

    // Close the Dapr client
    daprClient.Close()
}
The Dapr client also provides access to the runtime secrets that can be backed by any number of secret stores (e.g. Kubernetes Secrets, HashiCorp Vault, or Azure KeyVault):
By default, Dapr relies on the network boundary to limit access to its API. If however the target Dapr API is configured with token-based authentication, users can configure the Go Dapr client with that token in two ways:
Environment Variable
If the DAPR_API_TOKEN environment variable is defined, Dapr will automatically use it to augment its Dapr API invocations to ensure authentication.
Explicit Method
In addition, users can also set the API token explicitly on any Dapr client instance. This approach is helpful in cases when the user code needs to create multiple clients for different Dapr API endpoints.
With the Dapr client Go SDK, you can consume configuration items that are returned as read-only key/value pairs, and subscribe to configuration item changes.
With the Dapr client Go SDK, you can use the high-level Encrypt and Decrypt cryptography APIs to encrypt and decrypt files while working on a stream of data.
To encrypt:
// Encrypt the data using Dapr
out, err := client.Encrypt(context.Background(), rf, dapr.EncryptOptions{
    // These are the 3 required parameters
    ComponentName: "mycryptocomponent",
    KeyName:       "mykey",
    Algorithm:     "RSA",
})
if err != nil {
    panic(err)
}
To decrypt:
// Decrypt the data using Dapr
out, err := client.Decrypt(context.Background(), rf, dapr.DecryptOptions{
    // Only required option is the component name
    ComponentName: "mycryptocomponent",
})
2.2 - Getting started with the Dapr Service (Callback) SDK for Go
How to get up and running with the Dapr Service (Callback) SDK for Go
In addition to this Dapr API client, the Dapr Go SDK also provides a service package to bootstrap your Dapr callback services. These services can be developed in either gRPC or HTTP:
Once you create a service instance, you can “attach” to that service any number of event, binding, and service invocation logic handlers as shown below. Once the logic is defined, you are ready to start the service:
The handler method itself can be any method with the expected signature:
func eventHandler(ctx context.Context, e *common.TopicEvent) (retry bool, err error) {
    log.Printf("event - PubsubName:%s, Topic:%s, ID:%s, Data: %v", e.PubsubName, e.Topic, e.ID, e.Data)
    // do something with the event
    return true, nil
}
Optionally, you can use routing rules to send messages to different handlers based on the contents of the CloudEvent.
You can also create a custom type that implements the TopicEventSubscriber interface to handle your events:
type EventHandler struct {
    // any data or references that your event handler needs.
}

func (h *EventHandler) Handle(ctx context.Context, e *common.TopicEvent) (retry bool, err error) {
    log.Printf("event - PubsubName:%s, Topic:%s, ID:%s, Data: %v", e.PubsubName, e.Topic, e.ID, e.Data)
    // do something with the event
    return true, nil
}
The EventHandler can then be added using the AddTopicEventSubscriber method:
sub:=&common.Subscription{PubsubName:"messages",Topic:"topic1",}eventHandler:=&EventHandler{// initialize any fields}iferr:=s.AddTopicEventSubscriber(sub,eventHandler);err!=nil{log.Fatalf("error adding topic subscription: %v",err)}
Service Invocation Handler
To handle service invocations you will need to add at least one service invocation handler before starting the service:
The handler method itself can be any method with the expected signature:
func echoHandler(ctx context.Context, in *common.InvocationEvent) (out *common.Content, err error) {
    log.Printf("echo - ContentType:%s, Verb:%s, QueryString:%s, %+v", in.ContentType, in.Verb, in.QueryString, string(in.Data))
    // do something with the invocation here
    out = &common.Content{
        Data:        in.Data,
        ContentType: in.ContentType,
        DataTypeURL: in.DataTypeURL,
    }
    return
}
Binding Invocation Handler
To handle binding invocations you will need to add at least one binding invocation handler before starting the service:
The handler method itself can be any method with the expected signature:
func runHandler(ctx context.Context, in *common.BindingEvent) (out []byte, err error) {
    log.Printf("binding - Data:%v, Meta:%v", in.Data, in.Metadata)
    // do something with the invocation here
    return nil, nil
}
3 - Dapr Java SDK
Java SDK packages for developing Dapr applications
Dapr offers a variety of packages to help with the development of Java applications. Using them you can create Java clients, servers, and virtual actors with Dapr.
Next, import the Java SDK packages to get started. Select your preferred build tool to learn how to import.
For a Maven project, add the following to your pom.xml file:
<project>
  ...
  <dependencies>
    ...
    <!-- Dapr's core SDK with all features, except Actors. -->
    <dependency>
      <groupId>io.dapr</groupId>
      <artifactId>dapr-sdk</artifactId>
      <version>1.15.0</version>
    </dependency>
    <!-- Dapr's SDK for Actors (optional). -->
    <dependency>
      <groupId>io.dapr</groupId>
      <artifactId>dapr-sdk-actors</artifactId>
      <version>1.15.0</version>
    </dependency>
    <!-- Dapr's SDK integration with SpringBoot (optional). -->
    <dependency>
      <groupId>io.dapr</groupId>
      <artifactId>dapr-sdk-springboot</artifactId>
      <version>1.15.0</version>
    </dependency>
    ...
  </dependencies>
  ...
</project>
For a Gradle project, add the following to your build.gradle file:
dependencies {
    ...
    // Dapr's core SDK with all features, except Actors.
    compile('io.dapr:dapr-sdk:1.15.0')
    // Dapr's SDK for Actors (optional).
    compile('io.dapr:dapr-sdk-actors:1.15.0')
    // Dapr's SDK integration with SpringBoot (optional).
    compile('io.dapr:dapr-sdk-springboot:1.15.0')
}
If you are also using Spring Boot, you may run into a common issue where the OkHttp version that the Dapr SDK uses conflicts with the one specified in the Spring Boot Bill of Materials.
You can fix this by specifying a compatible OkHttp version in your project to match the version that the Dapr SDK uses:
Clone the SDK repo to try out some examples and get started.
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;

try (DaprClient client = (new DaprClientBuilder()).build()) {
  // sending a class with message; BINDING_OPERATION="create"
  client.invokeBinding(BINDING_NAME, BINDING_OPERATION, myClass).block();

  // sending a plain string
  client.invokeBinding(BINDING_NAME, BINDING_OPERATION, message).block();
}
Visit Java SDK examples for code samples and instructions to try out output bindings.
Available packages
Client
Create Java clients that interact with a Dapr sidecar and other Dapr applications.
Workflow
Create and manage workflows that work with other Dapr APIs in Java.
3.1 - AI
With the Dapr Conversation AI package, you can interact with the Dapr AI workloads from a Java application. To get started, walk through the Dapr AI how-to guide.
3.1.1 - How to: Author and manage Dapr Conversation AI in the Java SDK
How to get up and running with Conversation AI using the Dapr Java SDK
As part of this demonstration, we will look at how to use the Conversation API to converse with a Large Language Model (LLM). The API will return the LLM's response to the given prompt. With the provided Conversation AI example, you will:
git clone https://github.com/dapr/java-sdk.git
cd java-sdk
Run the following command to install the requirements for running the Conversation AI example with the Dapr Java SDK.
mvn clean install -DskipTests
From the Java SDK root directory, navigate to the examples directory.
cd examples
Run the Dapr sidecar.
dapr run --app-id conversationapp --dapr-grpc-port 51439 --dapr-http-port 3500 --app-port 8080
Now, Dapr is listening for HTTP requests at http://localhost:3500 and gRPC requests at http://localhost:51439.
Send a prompt with Personally identifiable information (PII) to the Conversation AI API
In the DemoConversationAI there are steps to send a prompt using the converse method under the DaprPreviewClient.
public class DemoConversationAI {
  /**
   * The main method to start the client.
   *
   * @param args Input arguments (unused).
   */
  public static void main(String[] args) {
    try (DaprPreviewClient client = new DaprClientBuilder().buildPreviewClient()) {
      System.out.println("Sending the following input to LLM: Hello How are you? This is the my number 672-123-4567");

      ConversationInput daprConversationInput = new ConversationInput("Hello How are you? "
          + "This is the my number 672-123-4567");

      // Component name is the name provided in the metadata block of the conversation.yaml file.
      Mono<ConversationResponse> responseMono = client.converse(new ConversationRequest("echo",
          List.of(daprConversationInput))
          .setContextId("contextId")
          .setScrubPii(true)
          .setTemperature(1.1d));
      ConversationResponse response = responseMono.block();
      System.out.printf("Conversation output: %s", response.getConversationOutputs().get(0).getResult());
    } catch (Exception e) {
      throw new RuntimeException(e);
    }
  }
}
Run the DemoConversationAI with the following command.
== APP == Conversation output: Hello How are you? This is the my number <ISBN>
As shown in the output, the number sent to the API is obfuscated and returned in the form of <ISBN>.
The example above uses an “echo”
component for testing, which simply returns the input message.
When integrated with LLMs like OpenAI or Claude, you’ll receive meaningful responses instead of echoed input.
This will connect to the default Dapr gRPC endpoint localhost:50001. For information about configuring the client using environment variables and system properties, see Properties.
Error Handling
Initially, errors in Dapr followed the Standard gRPC error model. However, to provide more detailed and informative error
messages, in version 1.13 an enhanced error model has been introduced which aligns with the gRPC Richer error model. In
response, the Java SDK extended the DaprException to include the error details that were added in Dapr.
Example of handling the DaprException and consuming the error details when using the Dapr Java SDK:
...
try {
  client.publishEvent("unknown_pubsub", "mytopic", "mydata").block();
} catch (DaprException exception) {
  System.out.println("Dapr exception's error code: " + exception.getErrorCode());
  System.out.println("Dapr exception's message: " + exception.getMessage());
  // DaprException now contains `getStatusDetails()` to include more details about the error from Dapr runtime.
  System.out.println("Dapr exception's reason: " + exception.getStatusDetails().get(
      DaprErrorDetails.ErrorDetailType.ERROR_INFO,
      "reason",
      TypeRef.STRING));
}
...
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;

try (DaprClient client = (new DaprClientBuilder()).build()) {
  // invoke a 'GET' method (HTTP) skipping serialization: \say with a Mono<byte[]> return type
  // for gRPC set HttpExtension.NONE parameters below
  response = client.invokeMethod(SERVICE_TO_INVOKE, METHOD_TO_INVOKE, "{\"name\":\"World!\"}", HttpExtension.GET, byte[].class).block();

  // invoke a 'POST' method (HTTP) skipping serialization: to \say with a Mono<byte[]> return type
  response = client.invokeMethod(SERVICE_TO_INVOKE, METHOD_TO_INVOKE, "{\"id\":\"100\", \"FirstName\":\"Value\", \"LastName\":\"Value\"}", HttpExtension.POST, byte[].class).block();
  System.out.println(new String(response));

  // invoke a 'POST' method (HTTP) with serialization: \employees with a Mono<Employee> return type
  Employee newEmployee = new Employee("Nigel", "Guitarist");
  Employee employeeResponse = client.invokeMethod(SERVICE_TO_INVOKE, "employees", newEmployee, HttpExtension.POST, Employee.class).block();
}
Visit Java SDK examples for code samples and instructions to try out service invocation
Save & get application state
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.domain.State;
import reactor.core.publisher.Mono;

try (DaprClient client = (new DaprClientBuilder()).build()) {
  // Save state
  client.saveState(STATE_STORE_NAME, FIRST_KEY_NAME, myClass).block();

  // Get state
  State<MyClass> retrievedMessage = client.getState(STATE_STORE_NAME, FIRST_KEY_NAME, MyClass.class).block();

  // Delete state
  client.deleteState(STATE_STORE_NAME, FIRST_KEY_NAME).block();
}
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.DaprPreviewClient;
import io.dapr.client.domain.BulkPublishResponse;
import io.dapr.client.domain.BulkPublishResponseFailedEntry;
import java.util.ArrayList;
import java.util.List;

class Solution {
  public void publishMessages() {
    try (DaprPreviewClient client = (new DaprClientBuilder()).buildPreviewClient()) {
      // Create a list of messages to publish
      List<String> messages = new ArrayList<>();
      for (int i = 0; i < NUM_MESSAGES; i++) {
        String message = String.format("This is message #%d", i);
        messages.add(message);
        System.out.println("Going to publish message : " + message);
      }

      // Publish list of messages using the bulk publish API
      BulkPublishResponse<String> res = client.publishEvents(PUBSUB_NAME, TOPIC_NAME, "text/plain", messages).block();
    }
  }
}
Visit Java SDK examples for code samples and instructions to try out pub/sub
Interact with output bindings
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;

try (DaprClient client = (new DaprClientBuilder()).build()) {
  // sending a class with message; BINDING_OPERATION="create"
  client.invokeBinding(BINDING_NAME, BINDING_OPERATION, myClass).block();

  // sending a plain string
  client.invokeBinding(BINDING_NAME, BINDING_OPERATION, message).block();
}
Visit Java SDK examples for code samples and instructions to try out retrieving secrets
Actors
An actor is an isolated, independent unit of compute and state with single-threaded execution. Dapr provides an actor implementation based on the Virtual Actor pattern, which provides a single-threaded programming model where actors are garbage collected when not in use. With Dapr’s implementation, you write your Dapr actors according to the Actor model, and Dapr leverages the scalability and reliability that the underlying platform provides.
Visit Java SDK examples for code samples and instructions to try actors
Get & Subscribe to application configurations
Note this is a preview API and thus will only be accessible via the DaprPreviewClient interface and not the normal DaprClient interface
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.DaprPreviewClient;
import io.dapr.client.domain.ConfigurationItem;
import io.dapr.client.domain.GetConfigurationRequest;
import io.dapr.client.domain.SubscribeConfigurationRequest;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

try (DaprPreviewClient client = (new DaprClientBuilder()).buildPreviewClient()) {
  // Get configuration for a single key
  ConfigurationItem item = client.getConfiguration(CONFIG_STORE_NAME, CONFIG_KEY).block();

  // Get configurations for multiple keys
  Mono<Map<String, ConfigurationItem>> items =
      client.getConfiguration(CONFIG_STORE_NAME, CONFIG_KEY_1, CONFIG_KEY_2);

  // Subscribe to configuration changes
  Flux<SubscribeConfigurationResponse> outFlux =
      client.subscribeConfiguration(CONFIG_STORE_NAME, CONFIG_KEY_1, CONFIG_KEY_2);
  outFlux.subscribe(configItems -> configItems.forEach(...));

  // Unsubscribe from configuration changes
  Mono<UnsubscribeConfigurationResponse> unsubscribe =
      client.unsubscribeConfiguration(SUBSCRIPTION_ID, CONFIG_STORE_NAME);
}
Visit Java SDK examples for code samples and instructions to try out different configuration operations.
Query saved state
Note this is a preview API and thus will only be accessible via the DaprPreviewClient interface and not the normal DaprClient interface
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.DaprPreviewClient;
import io.dapr.client.domain.QueryStateItem;
import io.dapr.client.domain.QueryStateRequest;
import io.dapr.client.domain.QueryStateResponse;
import io.dapr.client.domain.query.Query;
import io.dapr.client.domain.query.Sorting;
import io.dapr.client.domain.query.filters.EqFilter;

try (DaprClient client = builder.build(); DaprPreviewClient previewClient = builder.buildPreviewClient()) {
  String searchVal = args.length == 0 ? "searchValue" : args[0];

  // Create JSON data
  Listing first = new Listing();
  first.setPropertyType("apartment");
  first.setId("1000");
  ...
  Listing second = new Listing();
  second.setPropertyType("row-house");
  second.setId("1002");
  ...
  Listing third = new Listing();
  third.setPropertyType("apartment");
  third.setId("1003");
  ...
  Listing fourth = new Listing();
  fourth.setPropertyType("apartment");
  fourth.setId("1001");
  ...
  Map<String, String> meta = new HashMap<>();
  meta.put("contentType", "application/json");

  // Save state
  SaveStateRequest request = new SaveStateRequest(STATE_STORE_NAME).setStates(
      new State<>("1", first, null, meta, null),
      new State<>("2", second, null, meta, null),
      new State<>("3", third, null, meta, null),
      new State<>("4", fourth, null, meta, null));
  client.saveBulkState(request).block();

  // Create query and query state request
  Query query = new Query()
      .setFilter(new EqFilter<>("propertyType", "apartment"))
      .setSort(Arrays.asList(new Sorting("id", Sorting.Order.DESC)));
  QueryStateRequest queryStateRequest = new QueryStateRequest(STATE_STORE_NAME).setQuery(query);

  // Use preview client to call query state API
  QueryStateResponse<Listing> result = previewClient.queryState(queryStateRequest, Listing.class).block();

  // View Query state response
  System.out.println("Found " + result.getResults().size() + " items.");
  for (QueryStateItem<Listing> item : result.getResults()) {
    System.out.println("Key: " + item.getKey());
    System.out.println("Data: " + item.getValue());
  }
}
package io.dapr.examples.lock.grpc;

import io.dapr.client.DaprClientBuilder;
import io.dapr.client.DaprPreviewClient;
import io.dapr.client.domain.LockRequest;
import io.dapr.client.domain.UnlockRequest;
import io.dapr.client.domain.UnlockResponseStatus;
import reactor.core.publisher.Mono;

public class DistributedLockGrpcClient {
  private static final String LOCK_STORE_NAME = "lockstore";

  /**
   * Executes various methods to check the different apis.
   *
   * @param args arguments
   * @throws Exception throws Exception
   */
  public static void main(String[] args) throws Exception {
    try (DaprPreviewClient client = (new DaprClientBuilder()).buildPreviewClient()) {
      System.out.println("Using preview client...");
      tryLock(client);
      unlock(client);
    }
  }

  /**
   * Trying to get lock.
   *
   * @param client DaprPreviewClient object
   */
  public static void tryLock(DaprPreviewClient client) {
    System.out.println("*******trying to get a free distributed lock********");
    try {
      LockRequest lockRequest = new LockRequest(LOCK_STORE_NAME, "resouce1", "owner1", 5);
      Mono<Boolean> result = client.tryLock(lockRequest);
      System.out.println("Lock result -> " + (Boolean.TRUE.equals(result.block()) ? "SUCCESS" : "FAIL"));
    } catch (Exception ex) {
      System.out.println(ex.getMessage());
    }
  }

  /**
   * Unlock a lock.
   *
   * @param client DaprPreviewClient object
   */
  public static void unlock(DaprPreviewClient client) {
    System.out.println("*******unlock a distributed lock********");
    try {
      UnlockRequest unlockRequest = new UnlockRequest(LOCK_STORE_NAME, "resouce1", "owner1");
      Mono<UnlockResponseStatus> result = client.unlock(unlockRequest);
      System.out.println("Unlock result -> " + result.block().name());
    } catch (Exception ex) {
      System.out.println(ex.getMessage());
    }
  }
}
package io.dapr.examples.workflows;

import io.dapr.workflows.client.DaprWorkflowClient;
import io.dapr.workflows.client.WorkflowInstanceStatus;
import java.time.Duration;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

/**
 * For setup instructions, see the README.
 */
public class DemoWorkflowClient {

  /**
   * The main method.
   *
   * @param args Input arguments (unused).
   * @throws InterruptedException If program has been interrupted.
   */
  public static void main(String[] args) throws InterruptedException {
    DaprWorkflowClient client = new DaprWorkflowClient();

    try (client) {
      String separatorStr = "*******";
      System.out.println(separatorStr);
      String instanceId = client.scheduleNewWorkflow(DemoWorkflow.class, "input data");
      System.out.printf("Started new workflow instance with random ID: %s%n", instanceId);

      System.out.println(separatorStr);
      System.out.println("**GetInstanceMetadata:Running Workflow**");
      WorkflowInstanceStatus workflowMetadata = client.getInstanceState(instanceId, true);
      System.out.printf("Result: %s%n", workflowMetadata);

      System.out.println(separatorStr);
      System.out.println("**WaitForInstanceStart**");
      try {
        WorkflowInstanceStatus waitForInstanceStartResult =
            client.waitForInstanceStart(instanceId, Duration.ofSeconds(60), true);
        System.out.printf("Result: %s%n", waitForInstanceStartResult);
      } catch (TimeoutException ex) {
        System.out.printf("waitForInstanceStart has an exception:%s%n", ex);
      }

      System.out.println(separatorStr);
      System.out.println("**SendExternalMessage**");
      client.raiseEvent(instanceId, "TestEvent", "TestEventPayload");

      System.out.println(separatorStr);
      System.out.println("** Registering parallel Events to be captured by allOf(t1,t2,t3) **");
      client.raiseEvent(instanceId, "event1", "TestEvent 1 Payload");
      client.raiseEvent(instanceId, "event2", "TestEvent 2 Payload");
      client.raiseEvent(instanceId, "event3", "TestEvent 3 Payload");
      System.out.printf("Events raised for workflow with instanceId: %s\n", instanceId);

      System.out.println(separatorStr);
      System.out.println("** Registering Event to be captured by anyOf(t1,t2,t3) **");
      client.raiseEvent(instanceId, "e2", "event 2 Payload");
      System.out.printf("Event raised for workflow with instanceId: %s\n", instanceId);

      System.out.println(separatorStr);
      System.out.println("**WaitForInstanceCompletion**");
      try {
        WorkflowInstanceStatus waitForInstanceCompletionResult =
            client.waitForInstanceCompletion(instanceId, Duration.ofSeconds(60), true);
        System.out.printf("Result: %s%n", waitForInstanceCompletionResult);
      } catch (TimeoutException ex) {
        System.out.printf("waitForInstanceCompletion has an exception:%s%n", ex);
      }

      System.out.println(separatorStr);
      System.out.println("**purgeInstance**");
      boolean purgeResult = client.purgeInstance(instanceId);
      System.out.printf("purgeResult: %s%n", purgeResult);

      System.out.println(separatorStr);
      System.out.println("**raiseEvent**");
      String eventInstanceId = client.scheduleNewWorkflow(DemoWorkflow.class);
      System.out.printf("Started new workflow instance with random ID: %s%n", eventInstanceId);
      client.raiseEvent(eventInstanceId, "TestException", null);
      System.out.printf("Event raised for workflow with instanceId: %s\n", eventInstanceId);

      System.out.println(separatorStr);
      String instanceToTerminateId = "terminateMe";
      client.scheduleNewWorkflow(DemoWorkflow.class, null, instanceToTerminateId);
      System.out.printf("Started new workflow instance with specified ID: %s%n", instanceToTerminateId);
      TimeUnit.SECONDS.sleep(5);
      System.out.println("Terminate this workflow instance manually before the timeout is reached");
      client.terminateWorkflow(instanceToTerminateId, null);
      System.out.println(separatorStr);

      String restartingInstanceId = "restarting";
      client.scheduleNewWorkflow(DemoWorkflow.class, null, restartingInstanceId);
      System.out.printf("Started new workflow instance with ID: %s%n", restartingInstanceId);
      System.out.println("Sleeping 30 seconds to restart the workflow");
      TimeUnit.SECONDS.sleep(30);
      System.out.println("**SendExternalMessage: RestartEvent**");
      client.raiseEvent(restartingInstanceId, "RestartEvent", "RestartEventPayload");
      System.out.println("Sleeping 30 seconds to terminate the eternal workflow");
      TimeUnit.SECONDS.sleep(30);
      client.terminateWorkflow(restartingInstanceId, null);
    }

    System.out.println("Exiting DemoWorkflowClient.");
    System.exit(0);
  }
}
The DaprClient also provides a helper method to wait for the sidecar to become healthy (components only). When using
this method, be sure to specify a timeout in milliseconds and block() to wait for the result of a reactive operation.
// Wait for the Dapr sidecar to report healthy before attempting to use Dapr components.
try (DaprClient client = new DaprClientBuilder().build()) {
  System.out.println("Waiting for Dapr sidecar ...");
  client.waitForSidecar(10000).block(); // Specify the timeout in milliseconds
  System.out.println("Dapr sidecar is ready.");
  ...
}

// Perform Dapr component operations here i.e. fetching secrets or saving state.
Shutdown the sidecar
try (DaprClient client = new DaprClientBuilder().build()) {
  logger.info("Sending shutdown request.");
  client.shutdown().block();
  logger.info("Ensuring dapr has stopped.");
  ...
}
For a full list of SDK properties and how to configure them, visit Properties.
3.2.1 - Properties
SDK-wide properties for configuring the Dapr Java SDK using environment variables and system properties
Properties
The Dapr Java SDK provides a set of global properties that control the behavior of the SDK. These properties can be configured using environment variables or system properties. System properties can be set using the -D flag when running your Java application.
These properties affect the entire SDK, including clients and runtime. They control aspects such as:
Sidecar connectivity (endpoints, ports)
Security settings (TLS, API tokens)
Performance tuning (timeouts, connection pools)
Protocol settings (gRPC, HTTP)
String encoding
Environment Variables
The following environment variables are available for configuring the Dapr Java SDK:
Sidecar Endpoints
When these variables are set, the client will automatically use them to connect to the Dapr sidecar.
Environment Variable
Description
Default
DAPR_GRPC_ENDPOINT
The gRPC endpoint for the Dapr sidecar
localhost:50001
DAPR_HTTP_ENDPOINT
The HTTP endpoint for the Dapr sidecar
localhost:3500
DAPR_GRPC_PORT
The gRPC port for the Dapr sidecar (legacy, DAPR_GRPC_ENDPOINT takes precedence)
50001
DAPR_HTTP_PORT
The HTTP port for the Dapr sidecar (legacy, DAPR_HTTP_ENDPOINT takes precedence)
3500
For secure gRPC communication, you can configure TLS settings using the following environment variables:
Environment Variable
Description
Default
DAPR_GRPC_TLS_INSECURE
When set to “true”, enables insecure TLS mode which still uses TLS but doesn’t verify certificates. This uses InsecureTrustManagerFactory to trust all certificates. This should only be used for testing or in secure environments.
false
DAPR_GRPC_TLS_CA_PATH
Path to the CA certificate file. This is used for TLS connections to servers with self-signed certificates.
null
DAPR_GRPC_TLS_CERT_PATH
Path to the TLS certificate file for client authentication.
null
DAPR_GRPC_TLS_KEY_PATH
Path to the TLS private key file for client authentication.
null
Keepalive Settings
Configure gRPC keepalive behavior using these environment variables:
Environment Variable
Description
Default
DAPR_GRPC_ENABLE_KEEP_ALIVE
Whether to enable gRPC keepalive
false
DAPR_GRPC_KEEP_ALIVE_TIME_SECONDS
gRPC keepalive time in seconds
10
DAPR_GRPC_KEEP_ALIVE_TIMEOUT_SECONDS
gRPC keepalive timeout in seconds
5
DAPR_GRPC_KEEP_ALIVE_WITHOUT_CALLS
Whether to keep gRPC connection alive without calls
true
Inbound Message Settings
Configure gRPC inbound message settings using these environment variables:
Environment Variable
Description
Default
DAPR_GRPC_MAX_INBOUND_MESSAGE_SIZE_BYTES
Dapr’s maximum inbound message size for gRPC in bytes. This value sets the maximum size of a gRPC message that can be received by the application
4194304
DAPR_GRPC_MAX_INBOUND_METADATA_SIZE_BYTES
Dapr’s maximum inbound metadata size for gRPC in bytes
8192
HTTP Client Configuration
These properties control the behavior of the HTTP client used for communication with the Dapr sidecar:
Environment Variable
Description
Default
DAPR_HTTP_CLIENT_READ_TIMEOUT_SECONDS
Timeout in seconds for HTTP client read operations. This is the maximum time to wait for a response from the Dapr sidecar.
60
DAPR_HTTP_CLIENT_MAX_REQUESTS
Maximum number of concurrent HTTP requests that can be executed. Above this limit, requests will queue in memory waiting for running calls to complete.
1024
DAPR_HTTP_CLIENT_MAX_IDLE_CONNECTIONS
Maximum number of idle connections in the HTTP connection pool. This is the maximum number of connections that can remain idle in the pool.
128
API Configuration
These properties control the behavior of API calls made through the SDK:
Environment Variable
Description
Default
DAPR_API_MAX_RETRIES
Maximum number of retries for retriable exceptions when making API calls to the Dapr sidecar
0
DAPR_API_TIMEOUT_MILLISECONDS
Timeout in milliseconds for API calls to the Dapr sidecar. A value of 0 means no timeout.
0
String Encoding
Environment Variable
Description
Default
DAPR_STRING_CHARSET
Character set used for string encoding/decoding in the SDK. Must be a valid Java charset name.
UTF-8
System Properties
All environment variables can be set as system properties using the -D flag. Here is the complete list of available system properties:
System Property
Description
Default
dapr.sidecar.ip
IP address for the Dapr sidecar
localhost
dapr.http.port
HTTP port for the Dapr sidecar
3500
dapr.grpc.port
gRPC port for the Dapr sidecar
50001
dapr.grpc.tls.cert.path
Path to the gRPC TLS certificate
null
dapr.grpc.tls.key.path
Path to the gRPC TLS key
null
dapr.grpc.tls.ca.path
Path to the gRPC TLS CA certificate
null
dapr.grpc.tls.insecure
Whether to use insecure TLS mode
false
dapr.grpc.endpoint
gRPC endpoint for remote sidecar
null
dapr.grpc.enable.keep.alive
Whether to enable gRPC keepalive
false
dapr.grpc.keep.alive.time.seconds
gRPC keepalive time in seconds
10
dapr.grpc.keep.alive.timeout.seconds
gRPC keepalive timeout in seconds
5
dapr.grpc.keep.alive.without.calls
Whether to keep gRPC connection alive without calls
true
dapr.http.endpoint
HTTP endpoint for remote sidecar
null
dapr.api.maxRetries
Maximum number of retries for API calls
0
dapr.api.timeoutMilliseconds
Timeout for API calls in milliseconds
0
dapr.api.token
API token for authentication
null
dapr.string.charset
String encoding used in the SDK
UTF-8
dapr.http.client.readTimeoutSeconds
Timeout in seconds for HTTP client reads
60
dapr.http.client.maxRequests
Maximum number of concurrent HTTP requests
1024
dapr.http.client.maxIdleConnections
Maximum number of idle HTTP connections
128
Property Resolution Order
Properties are resolved in the following order:
Override values (if provided when creating a Properties instance)
System properties (set via -D)
Environment variables
Default values
The SDK checks each source in order. If a value is invalid for the property type (e.g., non-numeric for a numeric property), the SDK will log a warning and try the next source. For example:
# Invalid boolean value - will be ignored
java -Ddapr.grpc.enable.keep.alive=not-a-boolean -jar myapp.jar

# Valid boolean value - will be used
export DAPR_GRPC_ENABLE_KEEP_ALIVE=false
In this case, the environment variable is used because the system property value is invalid. However, if both values are valid, the system property takes precedence:
# Valid boolean value - will be used
java -Ddapr.grpc.enable.keep.alive=true -jar myapp.jar

# Valid boolean value - will be ignored
export DAPR_GRPC_ENABLE_KEEP_ALIVE=false
Override values can be set using the DaprClientBuilder in two ways:
Using individual property overrides (recommended for most cases):
import io.dapr.config.Properties;

// Set a single property override
DaprClient client = new DaprClientBuilder()
    .withPropertyOverride(Properties.GRPC_ENABLE_KEEP_ALIVE, "true")
    .build();

// Or set multiple property overrides
DaprClient client = new DaprClientBuilder()
    .withPropertyOverride(Properties.GRPC_ENABLE_KEEP_ALIVE, "true")
    .withPropertyOverride(Properties.HTTP_CLIENT_READ_TIMEOUT_SECONDS, "120")
    .build();
Using a Properties instance (useful when you have many properties to set at once):
// Create a map of property overrides
Map<String, String> overrides = new HashMap<>();
overrides.put("dapr.grpc.enable.keep.alive", "true");
overrides.put("dapr.http.client.readTimeoutSeconds", "120");

// Create a Properties instance with overrides
Properties properties = new Properties(overrides);

// Use these properties when creating a client
DaprClient client = new DaprClientBuilder()
    .withProperties(properties)
    .build();
For most use cases, you’ll use system properties or environment variables. Override values are primarily used when you need different property values for different instances of the SDK in the same application.
Proxy Configuration
You can configure proxy settings for your Java application using system properties. These are standard Java system properties that are part of Java’s networking layer (java.net package), not specific to Dapr. They are used by Java’s networking stack, including the HTTP client that Dapr’s SDK uses.
For detailed information about Java’s proxy configuration, including all available properties and their usage, see the Java Networking Properties documentation.
For example, here’s how to configure a proxy:
# Configure HTTP proxy - replace with your actual proxy server details
java -Dhttp.proxyHost=your-proxy-server.com -Dhttp.proxyPort=8080 -jar myapp.jar

# Configure HTTPS proxy - replace with your actual proxy server details
java -Dhttps.proxyHost=your-proxy-server.com -Dhttps.proxyPort=8443 -jar myapp.jar
Replace your-proxy-server.com with your actual proxy server hostname or IP address, and adjust the port numbers to match your proxy server configuration.
These proxy settings will affect all HTTP/HTTPS connections made by your Java application, including connections to the Dapr sidecar.
3.3 - Jobs
With the Dapr Jobs package, you can interact with the Dapr Jobs APIs from a Java application to trigger future operations to run according to a predefined schedule with an optional payload. To get started, walk through the Dapr Jobs how-to guide.
3.3.1 - How to: Author and manage Dapr Jobs in the Java SDK
How to get up and running with Jobs using the Dapr Java SDK
As part of this demonstration we will schedule a Dapr Job. The scheduled job will trigger an endpoint registered in the
same app. With the provided jobs example, you will:
git clone https://github.com/dapr/java-sdk.git
cd java-sdk
Run the following command to install the requirements for running the jobs example with the Dapr Java SDK.
mvn clean install -DskipTests
From the Java SDK root directory, navigate to the examples directory.
cd examples
Run the Dapr sidecar.
dapr run --app-id jobsapp --dapr-grpc-port 51439 --dapr-http-port 3500 --app-port 8080
Now, Dapr is listening for HTTP requests at http://localhost:3500 and internal Jobs gRPC requests at http://localhost:51439.
Schedule and Get a job
In the DemoJobsClient there are steps to schedule a job. Calling scheduleJob using the DaprPreviewClient
will schedule a job with the Dapr Runtime.
public class DemoJobsClient {
  /**
   * The main method of this app to schedule and get jobs.
   */
  public static void main(String[] args) throws Exception {
    try (DaprPreviewClient client = new DaprClientBuilder().withPropertyOverrides(overrides).buildPreviewClient()) {

      // Schedule a job.
      System.out.println("**** Scheduling a Job with name dapr-jobs-1 *****");
      ScheduleJobRequest scheduleJobRequest = new ScheduleJobRequest("dapr-job-1",
          JobSchedule.fromString("* * * * * *")).setData("Hello World!".getBytes());
      client.scheduleJob(scheduleJobRequest).block();

      System.out.println("**** Scheduling job dapr-jobs-1 completed *****");
    }
  }
}
Call getJob to retrieve the job details that were previously created and scheduled.
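The retrieval step is not shown above; a minimal sketch of it (assuming the preview Jobs API exposes GetJobRequest and GetJobResponse types, which are not confirmed by this page) could live inside the same try block:

// Sketch only: retrieve the job scheduled above.
// GetJobRequest/GetJobResponse are assumed names for the preview Jobs API request/response types.
System.out.println("**** Retrieving a Job with name dapr-jobs-1 *****");
GetJobResponse getJobResponse = client.getJob(new GetJobRequest("dapr-job-1")).block();
System.out.println("Job details: " + getJobResponse);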
**** Scheduling a Job with name dapr-jobs-1 *****
**** Scheduling job dapr-jobs-1 completed *****
**** Retrieving a Job with name dapr-jobs-1 *****
Set up an endpoint to be invoked when the job is triggered
The DemoJobsSpringApplication class starts a Spring Boot application that registers the endpoints specified in the JobsController
This endpoint acts like a callback for the scheduled job requests.
@RestController
public class JobsController {

  /**
   * Handles jobs callback from Dapr.
   *
   * @param jobName name of the job.
   * @param payload data from the job if payload exists.
   * @return Empty Mono.
   */
  @PostMapping("/job/{jobName}")
  public Mono<Void> handleJob(@PathVariable("jobName") String jobName,
                              @RequestBody(required = false) byte[] payload) {
    System.out.println("Job Name: " + jobName);
    System.out.println("Job Payload: " + new String(payload));
    return Mono.empty();
  }
}
Parameters:
jobName: The name of the triggered job.
payload: Optional payload data associated with the job (as a byte array).
Run the Spring Boot application with the following command.
public class DemoJobsClient {
  /**
   * The main method of this app deletes a job that was previously scheduled.
   */
  public static void main(String[] args) throws Exception {
    try (DaprPreviewClient client = new DaprClientBuilder().buildPreviewClient()) {

      // Delete a job.
      System.out.println("**** Delete a Job with name dapr-jobs-1 *****");
      client.deleteJob(new DeleteJobRequest("dapr-job-1")).block();
    }
  }
}
git clone https://github.com/dapr/java-sdk.git
cd java-sdk
Run the following command to install the requirements for running this workflow sample with the Dapr Java SDK.
mvn clean install
From the Java SDK root directory, navigate to the Dapr Workflow example.
cd examples
Run the DemoWorkflowWorker
The DemoWorkflowWorker class registers an implementation of DemoWorkflow in Dapr’s workflow runtime engine. In the DemoWorkflowWorker.java file, you can find the DemoWorkflowWorker class and the main method:
public class DemoWorkflowWorker {
  public static void main(String[] args) throws Exception {
    // Register the Workflow with the runtime.
    WorkflowRuntime.getInstance().registerWorkflow(DemoWorkflow.class);
    System.out.println("Start workflow runtime");
    WorkflowRuntime.getInstance().startAndBlock();
    System.exit(0);
  }
}
In the code above:
WorkflowRuntime.getInstance().registerWorkflow() registers DemoWorkflow as a workflow in the Dapr Workflow runtime.
WorkflowRuntime.getInstance().startAndBlock() builds and starts the engine within the Dapr Workflow runtime and blocks until the runtime is stopped.
In the terminal, execute the following command to kick off the DemoWorkflowWorker:
You're up and running! Both Dapr and your app logs will appear here.
...
== APP == Start workflow runtime
== APP == Sep 13, 2023 9:02:03 AM com.microsoft.durabletask.DurableTaskGrpcWorker startAndBlock
== APP == INFO: Durable Task worker is connecting to sidecar at 127.0.0.1:50001.
Run the DemoWorkflowClient
The DemoWorkflowClient starts instances of workflows that have been registered with Dapr.
public class DemoWorkflowClient {

  // ...

  public static void main(String[] args) throws InterruptedException {
    DaprWorkflowClient client = new DaprWorkflowClient();

    try (client) {
      String separatorStr = "*******";
      System.out.println(separatorStr);
      String instanceId = client.scheduleNewWorkflow(DemoWorkflow.class, "input data");
      System.out.printf("Started new workflow instance with random ID: %s%n", instanceId);

      System.out.println(separatorStr);
      System.out.println("**GetInstanceMetadata:Running Workflow**");
      WorkflowInstanceStatus workflowMetadata = client.getInstanceState(instanceId, true);
      System.out.printf("Result: %s%n", workflowMetadata);

      System.out.println(separatorStr);
      System.out.println("**WaitForInstanceStart**");
      try {
        WorkflowInstanceStatus waitForInstanceStartResult =
            client.waitForInstanceStart(instanceId, Duration.ofSeconds(60), true);
        System.out.printf("Result: %s%n", waitForInstanceStartResult);
      } catch (TimeoutException ex) {
        System.out.printf("waitForInstanceStart has an exception:%s%n", ex);
      }

      System.out.println(separatorStr);
      System.out.println("**SendExternalMessage**");
      client.raiseEvent(instanceId, "TestEvent", "TestEventPayload");

      System.out.println(separatorStr);
      System.out.println("** Registering parallel Events to be captured by allOf(t1,t2,t3) **");
      client.raiseEvent(instanceId, "event1", "TestEvent 1 Payload");
      client.raiseEvent(instanceId, "event2", "TestEvent 2 Payload");
      client.raiseEvent(instanceId, "event3", "TestEvent 3 Payload");
      System.out.printf("Events raised for workflow with instanceId: %s\n", instanceId);

      System.out.println(separatorStr);
      System.out.println("** Registering Event to be captured by anyOf(t1,t2,t3) **");
      client.raiseEvent(instanceId, "e2", "event 2 Payload");
      System.out.printf("Event raised for workflow with instanceId: %s\n", instanceId);

      System.out.println(separatorStr);
      System.out.println("**WaitForInstanceCompletion**");
      try {
        WorkflowInstanceStatus waitForInstanceCompletionResult =
            client.waitForInstanceCompletion(instanceId, Duration.ofSeconds(60), true);
        System.out.printf("Result: %s%n", waitForInstanceCompletionResult);
      } catch (TimeoutException ex) {
        System.out.printf("waitForInstanceCompletion has an exception:%s%n", ex);
      }

      System.out.println(separatorStr);
      System.out.println("**purgeInstance**");
      boolean purgeResult = client.purgeInstance(instanceId);
      System.out.printf("purgeResult: %s%n", purgeResult);

      System.out.println(separatorStr);
      System.out.println("**raiseEvent**");
      String eventInstanceId = client.scheduleNewWorkflow(DemoWorkflow.class);
      System.out.printf("Started new workflow instance with random ID: %s%n", eventInstanceId);
      client.raiseEvent(eventInstanceId, "TestException", null);
      System.out.printf("Event raised for workflow with instanceId: %s\n", eventInstanceId);

      System.out.println(separatorStr);
      String instanceToTerminateId = "terminateMe";
      client.scheduleNewWorkflow(DemoWorkflow.class, null, instanceToTerminateId);
      System.out.printf("Started new workflow instance with specified ID: %s%n", instanceToTerminateId);
      TimeUnit.SECONDS.sleep(5);
      System.out.println("Terminate this workflow instance manually before the timeout is reached");
      client.terminateWorkflow(instanceToTerminateId, null);
      System.out.println(separatorStr);

      String restartingInstanceId = "restarting";
      client.scheduleNewWorkflow(DemoWorkflow.class, null, restartingInstanceId);
      System.out.printf("Started new workflow instance with ID: %s%n", restartingInstanceId);
      System.out.println("Sleeping 30 seconds to restart the workflow");
      TimeUnit.SECONDS.sleep(30);
      System.out.println("**SendExternalMessage: RestartEvent**");
      client.raiseEvent(restartingInstanceId, "RestartEvent", "RestartEventPayload");
      System.out.println("Sleeping 30 seconds to terminate the eternal workflow");
      TimeUnit.SECONDS.sleep(30);
      client.terminateWorkflow(restartingInstanceId, null);
    }

    System.out.println("Exiting DemoWorkflowClient.");
    System.exit(0);
  }
}
In a second terminal window, start the workflow by running the following command:
*******
Started new workflow instance with random ID: 0b4cc0d5-413a-4c1c-816a-a71fa24740d4
*******
**GetInstanceMetadata:Running Workflow**
Result: [Name: 'io.dapr.examples.workflows.DemoWorkflow', ID: '0b4cc0d5-413a-4c1c-816a-a71fa24740d4', RuntimeStatus: RUNNING, CreatedAt: 2023-09-13T13:02:30.547Z, LastUpdatedAt: 2023-09-13T13:02:30.699Z, Input: '"input data"', Output: '']
*******
**WaitForInstanceStart**
Result: [Name: 'io.dapr.examples.workflows.DemoWorkflow', ID: '0b4cc0d5-413a-4c1c-816a-a71fa24740d4', RuntimeStatus: RUNNING, CreatedAt: 2023-09-13T13:02:30.547Z, LastUpdatedAt: 2023-09-13T13:02:30.699Z, Input: '"input data"', Output: '']
*******
**SendExternalMessage**
*******
** Registering parallel Events to be captured by allOf(t1,t2,t3) **
Events raised for workflow with instanceId: 0b4cc0d5-413a-4c1c-816a-a71fa24740d4
*******
** Registering Event to be captured by anyOf(t1,t2,t3) **
Event raised for workflow with instanceId: 0b4cc0d5-413a-4c1c-816a-a71fa24740d4
*******
**WaitForInstanceCompletion**
Result: [Name: 'io.dapr.examples.workflows.DemoWorkflow', ID: '0b4cc0d5-413a-4c1c-816a-a71fa24740d4', RuntimeStatus: FAILED, CreatedAt: 2023-09-13T13:02:30.547Z, LastUpdatedAt: 2023-09-13T13:02:55.054Z, Input: '"input data"', Output: '']
*******
**purgeInstance**
purgeResult: true
*******
**raiseEvent**
Started new workflow instance with random ID: 7707d141-ebd0-4e54-816e-703cb7a52747
Event raised for workflow with instanceId: 7707d141-ebd0-4e54-816e-703cb7a52747
*******
Started new workflow instance with specified ID: terminateMe
Terminate this workflow instance manually before the timeout is reached
*******
Started new workflow instance with ID: restarting
Sleeping 30 seconds to restart the workflow
**SendExternalMessage: RestartEvent**
Sleeping 30 seconds to terminate the eternal workflow
Exiting DemoWorkflowClient.
What happened?
When you ran dapr run, the workflow worker registered the workflow (DemoWorkflow) and its activities to the Dapr Workflow engine.
When you ran java, the workflow client started the workflow instance with the following activities. You can follow along with the output in the terminal where you ran dapr run.
The workflow is started, raises three parallel tasks, and waits for them to complete.
The workflow client calls the activity and sends the “Hello Activity” message to the console.
The workflow times out and is purged.
The workflow client starts a new workflow instance with a random ID, starts another instance with the specified ID terminateMe and terminates it, and then starts an instance with the ID restarting, raises a RestartEvent against it, and finally terminates it.
3.5 - Getting started with Dapr and Spring Boot
How to get started with Dapr and Spring Boot
By combining Dapr and Spring Boot, we can create infrastructure-independent Java applications that can be deployed across different environments, supporting a wide range of on-premises and cloud provider services.
First, we will start with a simple integration covering the DaprClient and the Testcontainers integration, and then use the Spring and Spring Boot mechanisms and programming model to leverage the Dapr APIs under the hood. This helps teams remove dependencies such as clients and drivers required to connect to environment-specific infrastructure (databases, key-value stores, message brokers, configuration/secret stores, etc.).
Note
The Spring Boot integration requires Spring Boot 3.x+ to work. This will not work with Spring Boot 2.x.
The Spring Boot integration remains in alpha. We need your help and feedback to graduate it.
Please join the #java-sdk discord channel discussion or open issues in the dapr/java-sdk.
Adding the Dapr and Spring Boot integration to your project
If you already have a Spring Boot application, you can directly add the following dependencies to your project:
Autowire a DaprClient to use inside your applications
Use the Spring Data and Messaging abstractions and programming model that uses the Dapr APIs under the hood
Improve your inner-development loop by relying on Testcontainers to bootstrap Dapr Control plane services and default components
Once these dependencies are in your application, you can rely on Spring Boot autoconfiguration to autowire a DaprClient instance:
@Autowired
private DaprClient daprClient;
This will connect to the default Dapr gRPC endpoint localhost:50001, requiring you to start Dapr outside of your application.
Note
By default, the following properties are preconfigured for DaprClient and DaprWorkflowClient:
dapr.client.httpEndpoint=http://localhost
dapr.client.httpPort=3500
dapr.client.grpcEndpoint=localhost
dapr.client.grpcPort=50001
dapr.client.apiToken=<your remote api token>
These values are used by default, but you can override them in your application.properties file to suit your environment. Please note that both kebab case and camel case are supported.
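As a sketch, an override in application.properties might look like the following; the values are illustrative and the kebab-case spelling shown here is just one of the two supported forms:

# Illustrative overrides only - adjust the values to your environment
dapr.client.http-endpoint=http://localhost
dapr.client.http-port=3500
dapr.client.grpc-endpoint=localhost
dapr.client.grpc-port=50001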
You can use the DaprClient to interact with the Dapr APIs anywhere in your application, for example from inside a REST endpoint:
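As a rough sketch (not the exact example from the repository), a Spring REST controller could use the injected DaprClient like this; the state store name kvstore and the Order type are assumptions:

// Sketch only: a REST endpoint that saves and reads state through the injected DaprClient.
// The state store name "kvstore" and the Order type are illustrative assumptions.
@RestController
public class OrdersRestController {

  @Autowired
  private DaprClient daprClient;

  @PostMapping("/orders/{id}")
  public void storeOrder(@PathVariable("id") String id, @RequestBody Order order) {
    daprClient.saveState("kvstore", id, order).block();
  }

  @GetMapping("/orders/{id}")
  public Order getOrder(@PathVariable("id") String id) {
    return daprClient.getState("kvstore", id, Order.class).block().getValue();
  }
}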
If you want to avoid managing Dapr outside of your Spring Boot application, you can rely on Testcontainers to bootstrap Dapr beside your application for development purposes.
To do this we can create a test configuration that uses Testcontainers to bootstrap all we need to develop our applications using the Dapr APIs.
Using Testcontainers and Dapr integrations, we let the @TestConfiguration bootstrap Dapr for our applications.
Notice that for this example, we are configuring Dapr with a Statestore component called kvstore that connects to an instance of PostgreSQL also bootstrapped by Testcontainers.
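A rough sketch of such a test configuration is shown below; it assumes the DaprContainer and Component types from the io.dapr.testcontainers module and, to stay self-contained, wires kvstore to an in-memory state store instead of the PostgreSQL-backed component used by the repository example:

// Sketch only: a @TestConfiguration that starts a Dapr sidecar with Testcontainers.
// DaprContainer and Component are assumed from the io.dapr.testcontainers module;
// the app name and the in-memory state store are illustrative simplifications.
@TestConfiguration(proxyBeanMethods = false)
public class DaprTestContainersConfig {

  @Bean
  public DaprContainer daprContainer() {
    return new DaprContainer("daprio/daprd:latest")
        .withAppName("producer-app")
        .withComponent(new Component("kvstore", "state.in-memory", "v1", Map.of()));
  }
}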
Running this command will start the application, using the provided test configuration that includes the Testcontainers and Dapr integration. In the logs you should be able to see that the daprd and the placement service containers were started for your application.
Aside from the previous configuration (DaprTestContainersConfig), your tests shouldn't test Dapr itself, just the REST endpoints that your application exposes.
Leveraging Spring & Spring Boot programming model with Dapr
The Java SDK allows you to interface with all of the Dapr building blocks.
But if you want to leverage the Spring and Spring Boot programming model you can use the dapr-spring-boot-starter integration.
This includes implementations of Spring Data (KeyValueTemplate and CrudRepository) as well as a DaprMessagingTemplate for producing and consuming messages
(similar to Spring Kafka, Spring Pulsar and Spring AMQP for RabbitMQ) and Dapr workflows.
Using Spring Data CrudRepository and KeyValueTemplate
You can use well known Spring Data constructs relying on a Dapr-based implementation.
With Dapr, you don’t need to add any infrastructure-related driver or client, making your Spring application lighter and decoupled from the environment where it is running.
Under the hood these implementations use the Dapr Statestore and Binding APIs.
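For example (a sketch; the Order type and repository name are assumptions), a repository backed by the Dapr state store looks like any other Spring Data repository:

// Sketch only: a standard Spring Data repository. The Dapr-based implementation is wired in
// by @EnableDaprRepositories; the Order type and repository name are illustrative.
public interface OrderRepository extends CrudRepository<Order, String> {
}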
Configuration parameters
With Spring Data abstractions you can configure which statestore and bindings will be used by Dapr to connect to the available infrastructure.
This can be done by setting the following properties:
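As a sketch, the configuration typically points Spring Data at a state store and a binding by component name; the exact property keys below are assumptions, so check the starter's reference for the authoritative names:

# Illustrative only - the property keys are assumptions; the values name the Dapr components to use
dapr.statestore.name=kvstore
dapr.statestore.binding=kvbinding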
Notice that the @EnableDaprRepositories annotation does all the magic of wiring the Dapr APIs under the CrudRepository interface.
Because Dapr allows users to interact with different StateStores from the same application, as a user you need to provide the following beans as a Spring Boot @Configuration:
Using Spring Messaging for producing and consuming events
Similar to Spring Kafka, Spring Pulsar and Spring AMQP you can use the DaprMessagingTemplate to publish messages to the configured infrastructure. To consume messages you can use the @Topic annotation (soon to be renamed to @DaprListener).
To publish events/messages you can @Autowired the DaprMessagingTemplate in your Spring application.
For this example we will be publishing Order events and we are sending messages to the topic named topic.
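A minimal sketch of the publishing side might look like the following; the Order type is an assumption and the send method mirrors the other Spring messaging templates, so treat it as illustrative rather than the exact repository example:

// Sketch only: publish an Order event to the "topic" topic using the autowired template.
// The Order type is illustrative.
@Autowired
private DaprMessagingTemplate<Order> messagingTemplate;

public void publishOrder(Order order) {
  messagingTemplate.send("topic", order);
}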
Similarly to the CrudRepository, we need to specify which PubSub broker we want to use to publish and consume our messages.
dapr.pubsub.name=pubsub
Because with Dapr you can connect to multiple PubSub brokers you need to provide the following bean to let Dapr know which PubSub broker your DaprMessagingTemplate will use:
Finally, because Dapr PubSub requires a bidirectional connection between your application and Dapr you need to expand your Testcontainers configuration with a few parameters:
Now, in the Dapr configuration we have included a pubsub component that will connect to an instance of RabbitMQ started by Testcontainers.
We have also set two important parameters, .withAppPort(8080) and .withAppChannelAddress("host.testcontainers.internal"), which allow Dapr to call back into the application when a message is published to the broker.
To listen to events/messages you need to expose an endpoint in the application that will be responsible to receive the messages.
If you expose a REST endpoint you can use the @Topic annotation to let Dapr know where it needs to forward the events/messages to:
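A sketch of such a subscriber is shown below; the Order type and the /subscribe route are assumptions, while the pubsub and topic names match the configuration above:

// Sketch only: a REST endpoint that receives messages forwarded by Dapr.
// The Order type and the /subscribe route are illustrative.
@PostMapping("/subscribe")
@Topic(name = "topic", pubsubName = "pubsub")
public void handleMessages(@RequestBody CloudEvent<Order> event) {
  System.out.println("Received: " + event.getData());
}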
Upon bootstrapping your application, Dapr will register the subscription to messages to be forwarded to the subscribe endpoint exposed by your application.
If you are writing tests for these subscribers you need to ensure that Testcontainers knows that your application will be running on port 8080,
so containers started with Testcontainers know where your application is:
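A sketch of that wiring, run once before the containers start, could look like this:

// Sketch only: expose the application's port to Testcontainers-managed containers,
// so the Dapr sidecar container can call back into the app at host.testcontainers.internal:8080.
org.testcontainers.Testcontainers.exposeHostPorts(8080);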
Following the same approach that we used for Spring Data and Spring Messaging, the dapr-spring-boot-starter brings Dapr Workflow integration for Spring Boot users.
To work with Dapr Workflows you need to define and implement your workflows using code. The Dapr Spring Boot Starter makes your life easier by managing Workflows and WorkflowActivitys as Spring beans.
In order to enable the automatic bean discovery you can annotate your @SpringBootApplication with the @EnableDaprWorkflows annotation:
@SpringBootApplication
@EnableDaprWorkflows
public class MySpringBootApplication {}
By adding this annotation, all the WorkflowActivitys will be automatically managed by Spring and registered to the workflow engine.
By having all WorkflowActivitys as managed beans, we can use Spring's @Autowired mechanism to inject any bean that our workflow activity might need to implement its functionality, for example a RestTemplate:
public class MyWorkflowActivity implements WorkflowActivity {
@Autowired
private RestTemplate restTemplate;
You can also @Autowired the DaprWorkflowClient to create new instances of your workflows.
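For example (a sketch; MyWorkflow and the surrounding method are assumptions), starting a new workflow instance from application code could look like this:

// Sketch only: start a workflow instance via the autowired client.
// MyWorkflow is an illustrative workflow class registered with the runtime.
@Autowired
private DaprWorkflowClient workflowClient;

public String startWorkflow(Order order) {
  return workflowClient.scheduleNewWorkflow(MyWorkflow.class, order);
}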
JavaScript SDK packages for developing Dapr applications
A client library for building Dapr apps in JavaScript and TypeScript. This client abstracts the public Dapr APIs like service to service invocation, state management, pub/sub, secrets, and much more, and provides a simple, intuitive API for building applications.
Installation
To get started with the JavaScript SDK, install the Dapr JavaScript SDK package from NPM:
npm install --save @dapr/dapr
Structure
The Dapr JavaScript SDK contains two major components:
DaprServer: to manage all Dapr sidecar to application communication.
DaprClient: to manage all application to Dapr sidecar communication.
The above communication can be configured to use either gRPC or HTTP.
Getting Started
To help you get started, check out the resources below:
Client
Create a JavaScript client and interact with the Dapr sidecar and other Dapr applications (e.g., publishing events, output binding support, etc.).
Server
Create a JavaScript server and let the Dapr sidecar interact with your application (e.g., subscribing to events, input binding support, etc.).
Actors
Create virtual actors with state, reminders/timers, and methods.
Logging
Configure and customize the SDK logging.
Examples
Clone the JavaScript SDK source code and try out some of the examples to get started quickly.
4.1 - JavaScript Client SDK
JavaScript Client SDK for developing Dapr applications
Introduction
The Dapr Client allows you to communicate with the Dapr Sidecar and get access to its client facing features such as Publishing Events, Invoking Output Bindings, State Management, Secret Management, and much more.
import { DaprClient, DaprServer, HttpMethod, CommunicationProtocolEnum } from "@dapr/dapr";

const daprHost = "127.0.0.1"; // Dapr Sidecar Host
const daprPort = "3500"; // Dapr Sidecar Port of this Example Server
const serverHost = "127.0.0.1"; // App Host of this Example Server
const serverPort = "50051"; // App Port of this Example Server

// HTTP Example
const client = new DaprClient({ daprHost, daprPort });

// GRPC Example
const client = new DaprClient({ daprHost, daprPort, communicationProtocol: CommunicationProtocolEnum.GRPC });
Running
To run the examples, you can use two different protocols to interact with the Dapr sidecar: HTTP (default) or gRPC.
# Using dapr run
dapr run --app-id example-sdk --app-protocol http -- npm run start

# or, using npm script
npm run start:dapr-http
Using gRPC
Since HTTP is the default, you will have to adapt the communication protocol to use gRPC. You can do this by passing an extra argument to the client or server constructor.
# Using dapr run
dapr run --app-id example-sdk --app-protocol grpc -- npm run start

# or, using npm script
npm run start:dapr-grpc
Environment Variables
Dapr Sidecar Endpoints
You can use the DAPR_HTTP_ENDPOINT and DAPR_GRPC_ENDPOINT environment variables to set the Dapr Sidecar's HTTP and gRPC endpoints respectively. When these variables are set, daprHost and daprPort don't have to be set in the options argument of the constructor; the client will parse them automatically from the provided endpoints.
import { DaprClient, CommunicationProtocol } from "@dapr/dapr";

// Using HTTP, when DAPR_HTTP_ENDPOINT is set
const client = new DaprClient();

// Using gRPC, when DAPR_GRPC_ENDPOINT is set
const client = new DaprClient({ communicationProtocol: CommunicationProtocol.GRPC });
If the environment variables are set, but daprHost and daprPort values are passed to the
constructor, the latter will take precedence over the environment variables.
Dapr API Token
You can use the DAPR_API_TOKEN environment variable to set the Dapr API token. When this variable
is set, the daprApiToken doesn’t have to be set in the options argument of the constructor,
the client will get it automatically.
General
Increasing Body Size
You can increase the body size that is used by the application to communicate with the sidecar by using a DaprClient option.
import { DaprClient, CommunicationProtocol } from "@dapr/dapr";

// Allow a body size of 10Mb to be used
// The default is 4Mb
const client = new DaprClient({
  daprHost,
  daprPort,
  communicationProtocol: CommunicationProtocol.HTTP,
  maxBodySizeMb: 10,
});
Proxying Requests
By proxying requests, we can utilize the unique capabilities that Dapr brings with its sidecar architecture such as service discovery, logging, etc., enabling us to instantly “upgrade” our gRPC services. This feature of gRPC proxying was demonstrated in community call 41.
Creating a Proxy
To perform gRPC proxying, simply create a proxy by calling the client.proxy.create() method:
// As always, create a client to our dapr sidecar
// this client takes care of making sure the sidecar is started, that we can communicate, ...
const clientSidecar = new DaprClient({ daprHost, daprPort, communicationProtocol: CommunicationProtocol.GRPC });

// Create a Proxy that allows us to use our gRPC code
const clientProxy = await clientSidecar.proxy.create<GreeterClient>(GreeterClient);
We can now call the methods as defined in our GreeterClient interface (which in this case is from the Hello World example)
Behind the Scenes (Technical Working)
The gRPC service gets started in Dapr. We tell Dapr which port this gRPC server is running on through --app-port and give it a unique Dapr app ID with --app-id <APP_ID_HERE>
We can now call the Dapr Sidecar through a client that will connect to the Sidecar
Whilst calling the Dapr Sidecar, we provide a metadata key named dapr-app-id with the value of our gRPC server booted in Dapr (e.g. server in our example)
Dapr will now forward the call to the gRPC server configured
Building blocks
The JavaScript Client SDK allows you to interface with all of the Dapr building blocks focusing on Client to Sidecar features.
Invocation API
Invoke a Service
import { DaprClient, HttpMethod } from "@dapr/dapr";

const daprHost = "127.0.0.1";
const daprPort = "3500";

async function start() {
  const client = new DaprClient({ daprHost, daprPort });

  const serviceAppId = "my-app-id";
  const serviceMethod = "say-hello";

  // POST Request
  const response = await client.invoker.invoke(serviceAppId, serviceMethod, HttpMethod.POST, { hello: "world" });

  // POST Request with headers
  const response = await client.invoker.invoke(
    serviceAppId,
    serviceMethod,
    HttpMethod.POST,
    { hello: "world" },
    { headers: { "X-User-ID": "123" } },
  );

  // GET Request
  const response = await client.invoker.invoke(serviceAppId, serviceMethod, HttpMethod.GET);
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});
import { DaprClient } from "@dapr/dapr";

const daprHost = "127.0.0.1";
const daprPort = "3500";

async function start() {
  const client = new DaprClient({ daprHost, daprPort });

  const serviceStoreName = "my-state-store-name";

  // Save State
  const response = await client.state.save(
    serviceStoreName,
    [
      {
        key: "first-key-name",
        value: "hello",
        metadata: {
          foo: "bar",
        },
      },
      {
        key: "second-key-name",
        value: "world",
      },
    ],
    {
      metadata: {
        ttlInSeconds: "3", // this should override the ttl in the state item
      },
    },
  );

  // Get State
  const response = await client.state.get(serviceStoreName, "first-key-name");

  // Get Bulk State
  const response = await client.state.getBulk(serviceStoreName, ["first-key-name", "second-key-name"]);

  // State Transactions
  await client.state.transaction(serviceStoreName, [
    {
      operation: "upsert",
      request: {
        key: "first-key-name",
        value: "new-data",
      },
    },
    {
      operation: "delete",
      request: {
        key: "second-key-name",
      },
    },
  ]);

  // Delete State
  const response = await client.state.delete(serviceStoreName, "first-key-name");
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});
import { DaprClient } from "@dapr/dapr";

const daprHost = "127.0.0.1";
const daprPort = "3500";

async function start() {
  const client = new DaprClient({ daprHost, daprPort });

  const pubSubName = "my-pubsub-name";
  const topic = "topic-a";

  // Publish message to topic as text/plain
  // Note, the content type is inferred from the message type unless specified explicitly
  const response = await client.pubsub.publish(pubSubName, topic, "hello, world!");
  // If publish fails, response contains the error
  console.log(response);

  // Publish message to topic as application/json
  await client.pubsub.publish(pubSubName, topic, { hello: "world" });

  // Publish a JSON message as plain text
  const options = { contentType: "text/plain" };
  await client.pubsub.publish(pubSubName, topic, { hello: "world" }, options);

  // Publish message to topic as application/cloudevents+json
  // You can also use the cloudevent SDK to create cloud events https://github.com/cloudevents/sdk-javascript
  const cloudEvent = {
    specversion: "1.0",
    source: "/some/source",
    type: "example",
    id: "1234",
  };
  await client.pubsub.publish(pubSubName, topic, cloudEvent);

  // Publish a cloudevent as raw payload
  const options = { metadata: { rawPayload: true } };
  await client.pubsub.publish(pubSubName, topic, "hello, world!", options);

  // Publish multiple messages to a topic as text/plain
  await client.pubsub.publishBulk(pubSubName, topic, ["message 1", "message 2", "message 3"]);

  // Publish multiple messages to a topic as application/json
  await client.pubsub.publishBulk(pubSubName, topic, [
    { hello: "message 1" },
    { hello: "message 2" },
    { hello: "message 3" },
  ]);

  // Publish multiple messages with explicit bulk publish messages
  const bulkPublishMessages = [
    {
      entryID: "entry-1",
      contentType: "application/json",
      event: { hello: "foo message 1" },
    },
    {
      entryID: "entry-2",
      contentType: "application/cloudevents+json",
      event: { ...cloudEvent, data: "foo message 2", datacontenttype: "text/plain" },
    },
    {
      entryID: "entry-3",
      contentType: "text/plain",
      event: "foo message 3",
    },
  ];
  await client.pubsub.publishBulk(pubSubName, topic, bulkPublishMessages);
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});
import { DaprClient } from "@dapr/dapr";

const daprHost = "127.0.0.1";
const daprPort = "3500";

async function start() {
  const client = new DaprClient({ daprHost, daprPort });

  const secretStoreName = "my-secret-store";
  const secretKey = "secret-key";

  // Retrieve a single secret from secret store
  const response = await client.secret.get(secretStoreName, secretKey);

  // Retrieve all secrets from secret store
  const response = await client.secret.getBulk(secretStoreName);
}

start().catch((e) => {
  console.error(e);
  process.exit(1);
});
import{DaprClient,CommunicationProtocolEnum}from"@dapr/dapr";constdaprHost="127.0.0.1";asyncfunctionstart() {constclient=newDaprClient({daprHost,daprPort: process.env.DAPR_GRPC_PORT,communicationProtocol: CommunicationProtocolEnum.GRPC,});// Subscribes to config store changes for keys "key1" and "key2"
conststream=awaitclient.configuration.subscribeWithKeys("config-store",["key1","key2"],async(data)=>{console.log("Subscribe received updates from config store: ",data);});// Wait for 60 seconds and unsubscribe.
awaitnewPromise((resolve)=>setTimeout(resolve,60000));stream.stop();}start().catch((e)=>{console.error(e);process.exit(1);});
Support for the cryptography API is only available on the gRPC client in the JavaScript SDK.
import{createReadStream,createWriteStream}from"node:fs";import{readFile,writeFile}from"node:fs/promises";import{pipeline}from"node:stream/promises";import{DaprClient,CommunicationProtocolEnum}from"@dapr/dapr";constdaprHost="127.0.0.1";constdaprPort="50050";// Dapr Sidecar Port of this example server
asyncfunctionstart() {constclient=newDaprClient({daprHost,daprPort,communicationProtocol: CommunicationProtocolEnum.GRPC,});// Encrypt and decrypt a message using streams
awaitencryptDecryptStream(client);// Encrypt and decrypt a message from a buffer
awaitencryptDecryptBuffer(client);}asyncfunctionencryptDecryptStream(client: DaprClient){// First, encrypt the message
console.log("== Encrypting message using streams");console.log("Encrypting plaintext.txt to ciphertext.out");awaitpipeline(createReadStream("plaintext.txt"),awaitclient.crypto.encrypt({componentName:"crypto-local",keyName:"symmetric256",keyWrapAlgorithm:"A256KW",}),createWriteStream("ciphertext.out"),);// Decrypt the message
console.log("== Decrypting message using streams");console.log("Encrypting ciphertext.out to plaintext.out");awaitpipeline(createReadStream("ciphertext.out"),awaitclient.crypto.decrypt({componentName:"crypto-local",}),createWriteStream("plaintext.out"),);}asyncfunctionencryptDecryptBuffer(client: DaprClient){// Read "plaintext.txt" so we have some content
constplaintext=awaitreadFile("plaintext.txt");// First, encrypt the message
console.log("== Encrypting message using buffers");constciphertext=awaitclient.crypto.encrypt(plaintext,{componentName:"crypto-local",keyName:"my-rsa-key",keyWrapAlgorithm:"RSA",});awaitwriteFile("test.out",ciphertext);// Decrypt the message
console.log("== Decrypting message using buffers");constdecrypted=awaitclient.crypto.decrypt(ciphertext,{componentName:"crypto-local",});// The contents should be equal
if(plaintext.compare(decrypted)!==0){thrownewError("Decrypted message does not match original message");}}start().catch((e)=>{console.error(e);process.exit(1);});
import{CommunicationProtocolEnum,DaprClient}from"@dapr/dapr";import{LockStatus}from"@dapr/dapr/types/lock/UnlockResponse";constdaprHost="127.0.0.1";constdaprPort="3500";asyncfunctionstart() {constclient=newDaprClient({daprHost,daprPort});conststoreName="redislock";constresourceId="resourceId";constlockOwner="owner1";letexpiryInSeconds=1000;console.log(`Acquiring lock on ${storeName}, ${resourceId} as owner: ${lockOwner}`);constlockResponse=awaitclient.lock.lock(storeName,resourceId,lockOwner,expiryInSeconds);console.log(lockResponse);console.log(`Unlocking on ${storeName}, ${resourceId} as owner: ${lockOwner}`);constunlockResponse=awaitclient.lock.unlock(storeName,resourceId,lockOwner);console.log("Unlock API response: "+getResponseStatus(unlockResponse.status));}functiongetResponseStatus(status: LockStatus){switch(status){caseLockStatus.Success:
return"Success";caseLockStatus.LockDoesNotExist:
return"LockDoesNotExist";caseLockStatus.LockBelongsToOthers:
return"LockBelongsToOthers";default:return"InternalError";}}start().catch((e)=>{console.error(e);process.exit(1);});
import{DaprClient}from"@dapr/dapr";asyncfunctionstart() {constclient=newDaprClient();// Start a new workflow instance
constinstanceId=awaitclient.workflow.start("OrderProcessingWorkflow",{Name:"Paperclips",TotalCost: 99.95,Quantity: 4,});console.log(`Started workflow instance ${instanceId}`);// Get a workflow instance
constworkflow=awaitclient.workflow.get(instanceId);console.log(`Workflow ${workflow.workflowName}, created at ${workflow.createdAt.toUTCString()}, has status ${workflow.runtimeStatus}`,);console.log(`Additional properties: ${JSON.stringify(workflow.properties)}`);// Pause a workflow instance
awaitclient.workflow.pause(instanceId);console.log(`Paused workflow instance ${instanceId}`);// Resume a workflow instance
awaitclient.workflow.resume(instanceId);console.log(`Resumed workflow instance ${instanceId}`);// Terminate a workflow instance
awaitclient.workflow.terminate(instanceId);console.log(`Terminated workflow instance ${instanceId}`);// Purge a workflow instance
awaitclient.workflow.purge(instanceId);console.log(`Purged workflow instance ${instanceId}`);}start().catch((e)=>{console.error(e);process.exit(1);});
JavaScript Server SDK for developing Dapr applications
Introduction
The Dapr Server allows you to receive communication from the Dapr sidecar and access its server-facing features, such as subscribing to events, receiving input bindings, and much more.
import{DaprServer,CommunicationProtocolEnum}from"@dapr/dapr";constdaprHost="127.0.0.1";// Dapr Sidecar Host
constdaprPort="3500";// Dapr Sidecar Port of this Example Server
constserverHost="127.0.0.1";// App Host of this Example Server
constserverPort="50051";// App Port of this Example Server
// HTTP Example
constserver=newDaprServer({serverHost,serverPort,communicationProtocol: CommunicationProtocolEnum.HTTP,// DaprClient to use same communication protocol as DaprServer, in case DaprClient protocol not mentioned explicitly
clientOptions:{daprHost,daprPort,},});// GRPC Example
constserver=newDaprServer({serverHost,serverPort,communicationProtocol: CommunicationProtocolEnum.GRPC,clientOptions:{daprHost,daprPort,},});
Running
To run the examples, you can use two different protocols to interact with the Dapr sidecar: HTTP (default) or gRPC.
Using HTTP (built-in express webserver)
import{DaprServer}from"@dapr/dapr";constserver=newDaprServer({serverHost: appHost,serverPort: appPort,clientOptions:{daprHost,daprPort,},});// initialize subscribtions, ... before server start
// the dapr sidecar relies on these
awaitserver.start();
# Using dapr run
dapr run --app-id example-sdk --app-port 50051 --app-protocol http -- npm run start
# or, using npm script
npm run start:dapr-http
ℹ️ Note: The app-port is required here, as this is the port our server binds to. Dapr checks that the application is bound to this port before finishing start-up.
Using HTTP (bring your own express webserver)
Instead of using the built-in web server for Dapr sidecar-to-application communication, you can also bring your own instance. This is helpful in scenarios such as building a REST API back-end where you want to integrate Dapr directly into it.
Note, this is currently available for express only.
💡 Note: when using a custom web server, the SDK configures server properties such as max body size and adds new routes to it. The routes are named to avoid collisions with your application, but collisions are not guaranteed to be impossible.
import{DaprServer,CommunicationProtocolEnum}from"@dapr/dapr";importexpressfrom"express";constmyApp=express();myApp.get("/my-custom-endpoint",(req,res)=>{res.send({msg:"My own express app!"});});constdaprServer=newDaprServer({serverHost:"127.0.0.1",// App Host
serverPort:"50002",// App Port
serverHttp: myApp,clientOptions:{daprHost,daprPort}});// Initialize subscriptions before the server starts; the Dapr sidecar relies on them.
// This will also initialize the app server itself (removing the need for `app.listen` to be called).
awaitdaprServer.start();
After configuring the above, you can call your custom endpoint as you normally would:
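For example, with the values from the snippet above (port 50002 and the /my-custom-endpoint route; adjust them to your own), a plain HTTP request reaches your express handler:
curl http://127.0.0.1:50002/my-custom-endpoint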
Using gRPC
Since HTTP is the default, you have to switch the communication protocol to gRPC. You can do this by passing an extra argument to the client or server constructor.
import{DaprServer,CommunicationProtocolEnum}from"@dapr/dapr";constserver=newDaprServer({serverHost: appHost,serverPort: appPort,communicationProtocol: CommunicationProtocolEnum.GRPC,clientOptions:{daprHost,daprPort,},});// initialize subscriptions, ... before server start
// the dapr sidecar relies on these
awaitserver.start();
# Using dapr run
dapr run --app-id example-sdk --app-port 50051 --app-protocol grpc -- npm run start
# or, using npm script
npm run start:dapr-grpc
ℹ️ Note: The app-port is required here, as this is the port our server binds to. Dapr checks that the application is bound to this port before finishing start-up.
Building blocks
The JavaScript Server SDK allows you to interface with all of the Dapr building blocks focusing on Sidecar to App features.
Invocation API
Listen to an Invocation
import{DaprServer,DaprInvokerCallbackContent,HttpMethod}from"@dapr/dapr";constdaprHost="127.0.0.1";// Dapr Sidecar Host
constdaprPort="3500";// Dapr Sidecar Port of this Example Server
constserverHost="127.0.0.1";// App Host of this Example Server
constserverPort="50051";// App Port of this Example Server "
asyncfunctionstart() {constserver=newDaprServer({serverHost,serverPort,clientOptions:{daprHost,daprPort,},});constcallbackFunction=(data: DaprInvokerCallbackContent)=>{console.log("Received body: ",data.body);console.log("Received metadata: ",data.metadata);console.log("Received query: ",data.query);console.log("Received headers: ",data.headers);// only available in HTTP
};awaitserver.invoker.listen("hello-world",callbackFunction,{method: HttpMethod.GET});// You can now invoke the service with your app id and method "hello-world"
awaitserver.start();}start().catch((e)=>{console.error(e);process.exit(1);});
Subscribing to messages can be done in several ways, offering flexibility in how you receive messages on your topics:
Direct subscription through the subscribe method
Direct subscription with options through the subscribeWithOptions method
Subscription added afterwards through the subscribeToRoute method
Each time an event arrives, we pass its body as data and the headers as headers, which can contain properties of the event publisher (e.g., a device ID from IoT Hub).
Dapr requires subscriptions to be set up on startup, but in the JS SDK event handlers can also be added afterwards, giving you more flexibility in how you structure your application.
An example is provided below:
import{DaprServer}from"@dapr/dapr";constdaprHost="127.0.0.1";// Dapr Sidecar Host
constdaprPort="3500";// Dapr Sidecar Port of this Example Server
constserverHost="127.0.0.1";// App Host of this Example Server
constserverPort="50051";// App Port of this Example Server "
asyncfunctionstart() {constserver=newDaprServer({serverHost,serverPort,clientOptions:{daprHost,daprPort,},});constpubSubName="my-pubsub-name";consttopic="topic-a";// Configure Subscriber for a Topic
// Method 1: Direct subscription through the `subscribe` method
awaitserver.pubsub.subscribe(pubSubName,topic,async(data: any,headers: object)=>console.log(`Received Data: ${JSON.stringify(data)} with headers: ${JSON.stringify(headers)}`),);// Method 2: Direct subscription with options through the `subscribeWithOptions` method
awaitserver.pubsub.subscribeWithOptions(pubSubName,topic,{callback: async(data: any,headers: object)=>console.log(`Received Data: ${JSON.stringify(data)} with headers: ${JSON.stringify(headers)}`),});// Method 3: Subscription afterwards through the `subscribeToRoute` method
// Note: we use default, since if no route was passed (empty options) we utilize "default" as the route name
awaitserver.pubsub.subscribeWithOptions("pubsub-redis","topic-options-1",{});server.pubsub.subscribeToRoute("pubsub-redis","topic-options-1","default",async(data: any,headers: object)=>{console.log(`Received Data: ${JSON.stringify(data)} with headers: ${JSON.stringify(headers)}`);});// Start the server
awaitserver.start();}
⚠️ The JS SDK allows multiple callbacks on the same topic. Status results are prioritized as RETRY > DROP > SUCCESS, with SUCCESS as the default.
⚠️ Make sure to configure resiliency in your application to handle RETRY messages.
In the JS SDK, these statuses are expressed through the DaprPubSubStatusEnum enum. To ensure Dapr retries the message, we also configure a resiliency policy.
components/resiliency.yaml
apiVersion: dapr.io/v1alpha1
kind: Resiliency
metadata:
  name: myresiliency
spec:
  policies:
    retries:
      # Global Retry Policy for Inbound Component operations
      DefaultComponentInboundRetryPolicy:
        policy: constant
        duration: 500ms
        maxRetries: 10
  targets:
    components:
      messagebus:
        inbound:
          retry: DefaultComponentInboundRetryPolicy
src/index.ts
import{DaprServer,DaprPubSubStatusEnum}from"@dapr/dapr";constdaprHost="127.0.0.1";// Dapr Sidecar Host
constdaprPort="3500";// Dapr Sidecar Port of this Example Server
constserverHost="127.0.0.1";// App Host of this Example Server
constserverPort="50051";// App Port of this Example Server "
asyncfunctionstart() {constserver=newDaprServer({serverHost,serverPort,clientOptions:{daprHost,daprPort,},});constpubSubName="my-pubsub-name";consttopic="topic-a";// Process a message successfully
awaitserver.pubsub.subscribe(pubSubName,topic,async(data: any,headers: object)=>{returnDaprPubSubStatusEnum.SUCCESS;});// Retry a message
// Note: this example will keep on retrying to deliver the message
// Note 2: each component can have their own retry configuration
// e.g., https://docs.dapr.io/reference/components-reference/supported-pubsub/setup-redis-pubsub/
awaitserver.pubsub.subscribe(pubSubName,topic,async(data: any,headers: object)=>{returnDaprPubSubStatusEnum.RETRY;});// Drop a message
awaitserver.pubsub.subscribe(pubSubName,topic,async(data: any,headers: object)=>{returnDaprPubSubStatusEnum.DROP;});// Start the server
awaitserver.start();}
For example, if you are writing an application that needs to handle messages differently depending on their “type”, you can send them to different routes, such as handlerType1 and handlerType2, with the default route being handlerDefault.
import{DaprServer}from"@dapr/dapr";constdaprHost="127.0.0.1";// Dapr Sidecar Host
constdaprPort="3500";// Dapr Sidecar Port of this Example Server
constserverHost="127.0.0.1";// App Host of this Example Server
constserverPort="50051";// App Port of this Example Server "
asyncfunctionstart() {constserver=newDaprServer({serverHost,serverPort,clientOptions:{daprHost,daprPort,},});constpubSubName="my-pubsub-name";consttopic="topic-a";// Configure Subscriber for a Topic with rule set
// Note: the default route and match patterns are optional
awaitserver.pubsub.subscribe("pubsub-redis","topic-1",{default:"/default",rules:[{match:`event.type == "my-type-1"`,path:"/type-1",},{match:`event.type == "my-type-2"`,path:"/type-2",},],});// Add handlers for each route
server.pubsub.subscribeToRoute("pubsub-redis","topic-1","default",async(data)=>{console.log(`Handling Default`);});server.pubsub.subscribeToRoute("pubsub-redis","topic-1","type-1",async(data)=>{console.log(`Handling Type 1`);});server.pubsub.subscribeToRoute("pubsub-redis","topic-1","type-2",async(data)=>{console.log(`Handling Type 2`);});// Start the server
awaitserver.start();}
Subscribe with Wildcards
The popular wildcards * and + are supported (make sure to check that the pub/sub component supports them) and can be subscribed to as follows:
import{DaprServer}from"@dapr/dapr";constdaprHost="127.0.0.1";// Dapr Sidecar Host
constdaprPort="3500";// Dapr Sidecar Port of this Example Server
constserverHost="127.0.0.1";// App Host of this Example Server
constserverPort="50051";// App Port of this Example Server "
asyncfunctionstart() {constserver=newDaprServer({serverHost,serverPort,clientOptions:{daprHost,daprPort,},});constpubSubName="my-pubsub-name";// * Wildcard
awaitserver.pubsub.subscribe(pubSubName,"/events/*",async(data: any,headers: object)=>console.log(`Received Data: ${JSON.stringify(data)}`),);// + Wildcard
awaitserver.pubsub.subscribe(pubSubName,"/events/+/temperature",async(data: any,headers: object)=>console.log(`Received Data: ${JSON.stringify(data)}`),);// Start the server
awaitserver.start();}
Bulk Subscribe to messages
Bulk subscription is supported and is available through the following API:
Bulk subscription through the subscribeBulk method: maxMessagesCount and maxAwaitDurationMs are optional; if not provided, the default values for the related component are used.
While listening for messages, the application receives messages from Dapr in bulk. However, like regular subscribe, the callback function receives a single message at a time, and the user can choose to return a DaprPubSubStatusEnum value to acknowledge successfully, retry, or drop the message. The default behavior is to return a success response.
import{DaprServer}from"@dapr/dapr";constpubSubName="orderPubSub";consttopic="topicbulk";constdaprHost=process.env.DAPR_HOST||"127.0.0.1";constdaprHttpPort=process.env.DAPR_HTTP_PORT||"3502";constserverHost=process.env.SERVER_HOST||"127.0.0.1";constserverPort=process.env.APP_PORT||5001;asyncfunctionstart() {constserver=newDaprServer({serverHost,serverPort,clientOptions:{daprHost,daprPort: daprHttpPort,},});// Publish multiple messages to a topic with default config.
awaitclient.pubsub.subscribeBulk(pubSubName,topic,(data)=>console.log("Subscriber received: "+JSON.stringify(data)),);// Publish multiple messages to a topic with specific maxMessagesCount and maxAwaitDurationMs.
awaitclient.pubsub.subscribeBulk(pubSubName,topic,(data)=>{console.log("Subscriber received: "+JSON.stringify(data));returnDaprPubSubStatusEnum.SUCCESS;// If App doesn't return anything, the default is SUCCESS. App can also return RETRY or DROP based on the incoming message.
},{maxMessagesCount: 100,maxAwaitDurationMs: 40,},);}
Dead Letter Topics
Dapr supports dead letter topics. This means that when a message fails to be processed, it gets sent to a dead letter topic. For example, when a message fails to be handled on /my-queue, it is sent to /my-queue-failed.
You can use the following options with subscribeWithOptions method:
deadletterTopic: Specify a deadletter topic name (note: if none is provided we create one named deadletter)
deadLetterCallback: The method to trigger as the handler for our dead letter topic
Implementing dead letter support in the JS SDK can be done by either:
Passing the deadLetterCallback as an option
Subscribing to the route manually with subscribeToRoute
An example is provided below:
import{DaprServer}from"@dapr/dapr";constdaprHost="127.0.0.1";// Dapr Sidecar Host
constdaprPort="3500";// Dapr Sidecar Port of this Example Server
constserverHost="127.0.0.1";// App Host of this Example Server
constserverPort="50051";// App Port of this Example Server "
asyncfunctionstart() {constserver=newDaprServer({serverHost,serverPort,clientOptions:{daprHost,daprPort,},});constpubSubName="my-pubsub-name";// Method 1 (direct subscribing through subscribeWithOptions)
awaitserver.pubsub.subscribeWithOptions("pubsub-redis","topic-options-5",{callback: async(data: any)=>{thrownewError("Triggering Deadletter");},deadLetterCallback: async(data: any)=>{console.log("Handling Deadletter message");},});// Method 2 (subscribe afterwards)
awaitserver.pubsub.subscribeWithOptions("pubsub-redis","topic-options-1",{deadletterTopic:"my-deadletter-topic",});server.pubsub.subscribeToRoute("pubsub-redis","topic-options-1","default",async()=>{thrownewError("Triggering Deadletter");});server.pubsub.subscribeToRoute("pubsub-redis","topic-options-1","my-deadletter-topic",async()=>{console.log("Handling Deadletter message");});// Start server
awaitserver.start();}
import{DaprServer}from"@dapr/dapr";constdaprHost="127.0.0.1";constdaprPort="3500";constserverHost="127.0.0.1";constserverPort="5051";asyncfunctionstart() {constclient=newDaprClient({daprHost,daprPort,communicationProtocol: CommunicationProtocolEnum.GRPC,});conststream=awaitclient.configuration.subscribeWithKeys("config-redis",["myconfigkey1","myconfigkey2"],()=>{// Received a key update
});// When you are ready to stop listening, call the following
awaitstream.close();}start().catch((e)=>{console.error(e);process.exit(1);});
How to get up and running with Actors using the Dapr JavaScript SDK
The Dapr actors package allows you to interact with Dapr virtual actors from a JavaScript application. The examples below demonstrate how to use the JavaScript SDK for interacting with virtual actors.
The below code examples loosely describe the scenario of a Parking Garage Spot Monitoring System, which can be seen in this video by Mark Russinovich.
A parking garage consists of hundreds of parking spaces, where each parking space includes a sensor that provides updates to a centralized monitoring system. The parking space sensors (our actors) detect if a parking space is occupied or available.
The actor interface defines the contract that is shared between the actor implementation and the clients calling the actor. In the example below, we have created an interface for a parking garage sensor. Each sensor has 2 methods: carEnter and carLeave, which define the state of the parking space:
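A minimal sketch of such an interface (the file name matches the default import used in the implementation that follows):
// ParkingSensorInterface.ts
export default interface ParkingSensorInterface {
  carEnter(): Promise<void>;
  carLeave(): Promise<void>;
}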
An actor implementation defines a class by extending the base type AbstractActor and implementing the actor interface (ParkingSensorInterface in this case).
The following code describes an actor implementation along with a few helper methods.
import{AbstractActor}from"@dapr/dapr";importParkingSensorInterfacefrom"./ParkingSensorInterface";exportdefaultclassParkingSensorImplextendsAbstractActorimplementsParkingSensorInterface{asynccarEnter():Promise<void>{// Implementation that updates state that this parking spaces is occupied.
}asynccarLeave():Promise<void>{// Implementation that updates state that this parking spaces is available.
}privateasyncgetInfo():Promise<object>{// Implementation of requesting an update from the parking space sensor.
}/**
* @override
*/asynconActivate():Promise<void>{// Initialization logic called by AbstractActor.
}}
Configuring Actor Runtime
To configure actor runtime, use the DaprClientOptions. The various parameters and their default values are documented at How-to: Use virtual actors in Dapr.
Note, the timeouts and intervals should be formatted as time.ParseDuration strings.
import{CommunicationProtocolEnum,DaprClient,DaprServer}from"@dapr/dapr";// Configure the actor runtime with the DaprClientOptions.
constclientOptions={daprHost: daprHost,daprPort: daprPort,communicationProtocol: CommunicationProtocolEnum.HTTP,actor:{actorIdleTimeout:"1h",actorScanInterval:"30s",drainOngoingCallTimeout:"1m",drainRebalancedActors: true,reentrancy:{enabled: true,maxStackDepth: 32,},remindersStoragePartitions: 0,},};// Use the options when creating DaprServer and DaprClient.
// Note, DaprServer creates a DaprClient internally, which needs to be configured with clientOptions.
constserver=newDaprServer({serverHost,serverPort,clientOptions});constclient=newDaprClient(clientOptions);
Registering Actors
Initialize and register your actors by using the DaprServer package:
import{DaprServer}from"@dapr/dapr";importParkingSensorImplfrom"./ParkingSensorImpl";constdaprHost="127.0.0.1";constdaprPort="50000";constserverHost="127.0.0.1";constserverPort="50001";constserver=newDaprServer({serverHost,serverPort,clientOptions:{daprHost,daprPort,},});awaitserver.actor.init();// Let the server know we need actors
server.actor.registerActor(ParkingSensorImpl);// Register the actor
awaitserver.start();// Start the server
// To get the registered actors, you can invoke `getRegisteredActors`:
constresRegisteredActors=awaitserver.actor.getRegisteredActors();console.log(`Registered Actors: ${JSON.stringify(resRegisteredActors)}`);
Invoking Actor Methods
After Actors are registered, create a Proxy object that implements ParkingSensorInterface using the ActorProxyBuilder. You can invoke the actor methods by directly calling methods on the Proxy object. Internally, it translates to making a network call to the Actor API and fetches the result back.
import{ActorId,DaprClient}from"@dapr/dapr";importParkingSensorImplfrom"./ParkingSensorImpl";importParkingSensorInterfacefrom"./ParkingSensorInterface";constdaprHost="127.0.0.1";constdaprPort="50000";constclient=newDaprClient({daprHost,daprPort});// Create a new actor builder. It can be used to create multiple actors of a type.
constbuilder=newActorProxyBuilder<ParkingSensorInterface>(ParkingSensorImpl,client);// Create a new actor instance.
constactor=builder.build(newActorId("my-actor"));// Or alternatively, use a random ID
// const actor = builder.build(ActorId.createRandomId());
// Invoke the method.
awaitactor.carEnter();
Using states with Actor
import{AbstractActor}from"@dapr/dapr";importActorStateInterfacefrom"./ActorStateInterface";exportdefaultclassActorStateExampleextendsAbstractActorimplementsActorStateInterface{asyncsetState(key: string,value: any):Promise<void>{awaitthis.getStateManager().setState(key,value);awaitthis.getStateManager().saveState();}asyncremoveState(key: string):Promise<void>{awaitthis.getStateManager().removeState(key);awaitthis.getStateManager().saveState();}// getState with a specific type
asyncgetState<T>(key: string):Promise<T|null>{returnawaitthis.getStateManager<T>().getState(key);}// getState without type as `any`
asyncgetState(key: string):Promise<any>{returnawaitthis.getStateManager().getState(key);}}
Actor Timers and Reminders
The JS SDK supports actors that can schedule periodic work on themselves by registering either timers or reminders. The main difference between timers and reminders is that the Dapr actor runtime does not retain any information about timers after deactivation, but persists reminders information using the Dapr actor state provider.
This distinction allows users to trade off between light-weight but stateless timers versus more resource-demanding but stateful reminders.
The scheduling interface of timers and reminders is identical. For a more in-depth look at the scheduling configurations, see the actors timers and reminders docs.
Actor Timers
// ...
constactor=builder.build(newActorId("my-actor"));// Register a timer
awaitactor.registerActorTimer("timer-id",// Unique name of the timer.
"cb-method",// Callback method to execute when timer is fired.
Temporal.Duration.from({seconds: 2}),// DueTime
Temporal.Duration.from({seconds: 1}),// Period
Temporal.Duration.from({seconds: 1}),// TTL
50,// State to be sent to timer callback.
);// Delete the timer
awaitactor.unregisterActorTimer("timer-id");
Actor Reminders
// ...
constactor=builder.build(newActorId("my-actor"));// Register a reminder, it has a default callback: `receiveReminder`
awaitactor.registerActorReminder("reminder-id",// Unique name of the reminder.
Temporal.Duration.from({seconds: 2}),// DueTime
Temporal.Duration.from({seconds: 1}),// Period
Temporal.Duration.from({seconds: 1}),// TTL
100,// State to be sent to reminder callback.
);// Delete the reminder
awaitactor.unregisterActorReminder("reminder-id");
To handle the callback, you need to override the default receiveReminder implementation in your actor. For example, from our original actor implementation:
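A minimal sketch of such an override, assuming a payload like the one registered in the reminder example above (the exact type of the callback parameter may differ in your SDK version):
import { AbstractActor } from "@dapr/dapr";
import ParkingSensorInterface from "./ParkingSensorInterface";

export default class ParkingSensorImpl extends AbstractActor implements ParkingSensorInterface {
  async carEnter(): Promise<void> {
    // Implementation that updates state that this parking space is occupied.
  }

  async carLeave(): Promise<void> {
    // Implementation that updates state that this parking space is available.
  }

  /**
   * @override
   * Called by the actor runtime whenever a registered reminder fires;
   * `state` is the payload that was passed when the reminder was registered.
   */
  async receiveReminder(state: any): Promise<void> {
    console.log(`Reminder fired with state: ${JSON.stringify(state)}`);
  }
}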
The JavaScript SDK comes with an out-of-the-box console-based logger. The SDK emits various internal logs to help users understand the chain of events and troubleshoot problems. A consumer of this SDK can customize the verbosity of the logs, as well as provide their own logger implementation.
Configure log level
There are five levels of logging in descending order of importance: error, warn, info, verbose, and debug. Setting the log level means the logger emits all logs that are at least as important as that level. For example, setting the level to verbose means the SDK will not emit debug-level logs. The default log level is info.
Dapr Client
import{CommunicationProtocolEnum,DaprClient,LogLevel}from"@dapr/dapr";// create a client instance with log level set to verbose.
constclient=newDaprClient({daprHost,daprPort,communicationProtocol:CommunicationProtocolEnum.HTTP,logger:{level:LogLevel.Verbose},});
import{CommunicationProtocolEnum,DaprServer,LogLevel}from"@dapr/dapr";// create a server instance with log level set to error.
constserver=newDaprServer({serverHost,serverPort,clientOptions:{daprHost,daprPort,logger:{level: LogLevel.Error},},});
import{CommunicationProtocolEnum,DaprClient,LogLevel}from"@dapr/dapr";import{WinstonLoggerService}from"./WinstonLoggerService";constwinstonLoggerService=newWinstonLoggerService();// create a client instance with log level set to verbose and logger service as winston.
constclient=newDaprClient({daprHost,daprPort,communicationProtocol: CommunicationProtocolEnum.HTTP,logger:{level: LogLevel.Verbose,service: winstonLoggerService},});
4.5 - JavaScript Examples
Get started with the Dapr JavaScript SDK through some of our examples!
Quickstarts
State Management: Learn the concept of state management with Dapr
Pub Sub: Create your own Publish / Subscribe system
Clone the JavaScript SDK repo and navigate into it.
git clone https://github.com/dapr/js-sdk
cd js-sdk
From the JavaScript SDK root directory, navigate to the Dapr Workflow example.
cd examples/workflow/authoring
Run the following command to install the requirements for running this workflow sample with the Dapr JavaScript SDK.
npm install
Run the activity-sequence.ts
The activity-sequence file registers a workflow and an activity with the Dapr Workflow runtime. The workflow is a sequence of activities that are executed in order. We use DaprWorkflowClient to schedule a new workflow instance and wait for it to complete.
constdaprHost="localhost";constdaprPort="50001";constworkflowClient=newDaprWorkflowClient({daprHost,daprPort,});constworkflowRuntime=newWorkflowRuntime({daprHost,daprPort,});consthello=async(_: WorkflowActivityContext,name: string)=>{return`Hello ${name}!`;};constsequence: TWorkflow=asyncfunction*(ctx: WorkflowContext):any{constcities: string[]=[];constresult1=yieldctx.callActivity(hello,"Tokyo");cities.push(result1);constresult2=yieldctx.callActivity(hello,"Seattle");cities.push(result2);constresult3=yieldctx.callActivity(hello,"London");cities.push(result3);returncities;};workflowRuntime.registerWorkflow(sequence).registerActivity(hello);// Wrap the worker startup in a try-catch block to handle any errors during startup
try{awaitworkflowRuntime.start();console.log("Workflow runtime started successfully");}catch(error){console.error("Error starting workflow runtime:",error);}// Schedule a new orchestration
try{constid=awaitworkflowClient.scheduleNewWorkflow(sequence);console.log(`Orchestration scheduled with ID: ${id}`);// Wait for orchestration completion
conststate=awaitworkflowClient.waitForWorkflowCompletion(id,undefined,30);console.log(`Orchestration completed! Result: ${state?.serializedOutput}`);}catch(error){console.error("Error scheduling or waiting for orchestration:",error);}
In the code above:
workflowRuntime.registerWorkflow(sequence) registers sequence as a workflow in the Dapr Workflow runtime.
await workflowRuntime.start(); builds and starts the engine within the Dapr Workflow runtime.
await workflowClient.scheduleNewWorkflow(sequence) schedules a new workflow instance with the Dapr Workflow runtime.
await workflowClient.waitForWorkflowCompletion(id, undefined, 30) waits for the workflow instance to complete.
In the terminal, execute the following command to kick off the activity-sequence.ts:
npm run start:dapr:activity-sequence
Expected output
You're up and running! Both Dapr and your app logs will appear here.
...
== APP == Orchestration scheduled with ID: dc040bea-6436-4051-9166-c9294f9d2201
== APP == Waiting 30 seconds for instance dc040bea-6436-4051-9166-c9294f9d2201 to complete...
== APP == Received "Orchestrator Request" work item with instance id 'dc040bea-6436-4051-9166-c9294f9d2201'
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Rebuilding local state with 0 history event...
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Processing 2 new history event(s): [ORCHESTRATORSTARTED=1, EXECUTIONSTARTED=1]
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Waiting for 1 task(s) and 0 event(s) to complete...
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Returning 1 action(s)
== APP == Received "Activity Request" work item
== APP == Activity hello completed with output "Hello Tokyo!" (14 chars)
== APP == Received "Orchestrator Request" work item with instance id 'dc040bea-6436-4051-9166-c9294f9d2201'
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Rebuilding local state with 3 history event...
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Processing 2 new history event(s): [ORCHESTRATORSTARTED=1, TASKCOMPLETED=1]
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Waiting for 1 task(s) and 0 event(s) to complete...
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Returning 1 action(s)
== APP == Received "Activity Request" work item
== APP == Activity hello completed with output "Hello Seattle!" (16 chars)
== APP == Received "Orchestrator Request" work item with instance id 'dc040bea-6436-4051-9166-c9294f9d2201'
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Rebuilding local state with 6 history event...
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Processing 2 new history event(s): [ORCHESTRATORSTARTED=1, TASKCOMPLETED=1]
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Waiting for 1 task(s) and 0 event(s) to complete...
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Returning 1 action(s)
== APP == Received "Activity Request" work item
== APP == Activity hello completed with output "Hello London!" (15 chars)
== APP == Received "Orchestrator Request" work item with instance id 'dc040bea-6436-4051-9166-c9294f9d2201'
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Rebuilding local state with 9 history event...
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Processing 2 new history event(s): [ORCHESTRATORSTARTED=1, TASKCOMPLETED=1]
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Orchestration completed with status COMPLETED
== APP == dc040bea-6436-4051-9166-c9294f9d2201: Returning 1 action(s)
INFO[0006] dc040bea-6436-4051-9166-c9294f9d2201: 'sequence' completed with a COMPLETED status. app_id=activity-sequence-workflow instance=kaibocai-devbox scope=wfengine.backend type=log ver=1.12.3
== APP == Instance dc040bea-6436-4051-9166-c9294f9d2201 completed
== APP == Orchestration completed! Result: ["Hello Tokyo!","Hello Seattle!","Hello London!"]
In a directory where you want to create your service, run composer init and answer the questions.
Install with composer require dapr/php-sdk and any other dependencies you may wish to use.
Configure your service
Create a config.php, copying the contents below:
<?phpuseDapr\Actors\Generators\ProxyFactory;useDapr\Middleware\Defaults\{Response\ApplicationJson,Tracing};usePsr\Log\LogLevel;usefunctionDI\{env,get};return[// set the log level
'dapr.log.level'=>LogLevel::WARNING,// Generate a new proxy on each request - recommended for development
'dapr.actors.proxy.generation'=>ProxyFactory::GENERATED,// put any subscriptions here
'dapr.subscriptions'=>[],// if this service will be hosting any actors, add them here
'dapr.actors'=>[],// if this service will be hosting any actors, configure how long until dapr should consider an actor idle
'dapr.actors.idle_timeout'=>null,// if this service will be hosting any actors, configure how often dapr will check for idle actors
'dapr.actors.scan_interval'=>null,// if this service will be hosting any actors, configure how long dapr will wait for an actor to finish during drains
'dapr.actors.drain_timeout'=>null,// if this service will be hosting any actors, configure if dapr should wait for an actor to finish
'dapr.actors.drain_enabled'=>null,// you shouldn't have to change this, but the setting is here if you need to
'dapr.port'=>env('DAPR_HTTP_PORT','3500'),// add any custom serialization routines here
'dapr.serializers.custom'=>[],// add any custom deserialization routines here
'dapr.deserializers.custom'=>[],// the following has no effect, as it is the default middlewares and processed in order specified
'dapr.http.middleware.request'=>[get(Tracing::class)],'dapr.http.middleware.response'=>[get(ApplicationJson::class),get(Tracing::class)],];
If you’re new to the actor pattern, the best place to learn about it is the Actor Overview.
In the PHP SDK, there are two sides to an actor, the Client, and the Actor (aka, the Runtime). As a client of an actor,
you’ll interact with a remote actor via the ActorProxy class. This class generates a proxy class on-the-fly using one
of several configured strategies.
When writing an actor, state can be managed for you. You can hook into the actor lifecycle, and define reminders and
timers. This gives you considerable power for handling all types of problems that the actor pattern is suited for.
The Actor Proxy
Whenever you want to communicate with an actor, you’ll need to get a proxy object to do so. The proxy is responsible for
serializing your request, deserializing the response, and returning it to you, all while obeying the contract defined by
the specified interface.
In order to create the proxy, you’ll first need an interface to define how and what you send and receive from an actor.
For example, if you want to communicate with a counting actor that solely keeps track of counts, you might define the
interface as follows:
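A minimal sketch of such a counting interface (the interface and method names are illustrative, not prescribed by the SDK):
<?php

#[\Dapr\Actors\Attributes\DaprType('Counter')]
interface ICounter
{
    // Increase the stored count by the given amount.
    public function increment(int $amount): void;

    // Return the current count.
    public function get_count(): int;
}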
It’s a good idea to put this interface in a shared library that the actor and clients can both access (if both are written in PHP). The DaprType
attribute tells the DaprClient the name of the actor to send to. It should match the implementation’s DaprType, though
you can override the type if needed.
To create an actor, you need to implement the interface you defined earlier and also add the DaprType attribute. All
actors must implement IActor, however there’s an Actor base class that implements the boilerplate making your
implementation much simpler.
The most important bit is the constructor. It takes at least one argument, named id, which is the id of the actor. Any additional arguments are injected by the DI container, including any ActorState you want to use.
Actor Lifecycle
An actor is instantiated via the constructor on every request targeting that actor type. You can use it to calculate
ephemeral state or handle any kind of request-specific startup you require, such as setting up other clients or
connections.
After the actor is instantiated, the on_activation() method may be called. The on_activation() method is called any
time the actor “wakes up” or when it is created for the first time. It is not called on every request.
Next, the actor method is called. This may be from a timer, reminder, or from a client. You may perform any work that
needs to be done and/or throw an exception.
Finally, the result of the work is returned to the caller. After some time (depending on how you’ve configured the
service), the actor will be deactivated and on_deactivation() will be called. This may not be called if the host dies,
daprd crashes, or some other error occurs which prevents it from being called successfully.
Actor State
Actor state is a “Plain Old PHP Object” (POPO) that extends ActorState. The ActorState base class provides a couple
of useful methods. Here’s an example implementation:
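A minimal sketch, assuming a simple counter state (only public properties are persisted, and each one maps to its own key in the state store):
<?php

class CountState extends \Dapr\Actors\ActorState
{
    // Stored under the key "count" in the configured state store.
    // The base class provides helpers such as save_state(), used in the versioning example later.
    public int $count = 0;
}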
There are four different modes in which actor proxies are handled. Each mode presents different trade-offs that you’ll need to weigh during development and in production. The mode can be set with the dapr.actors.proxy.generation configuration key.
This is the default mode. In this mode, a class is generated and eval’d on every request. It’s mostly for development
and shouldn’t be used in production.
This is the same as ProxyModes::GENERATED except the class is stored in a tmp file so it doesn’t need to be
regenerated on every request. It doesn’t know when to update the cached class, so using it in development is discouraged
but is offered for when manually generating the files isn’t possible.
In this mode, an exception is thrown if the proxy class doesn’t exist. This is useful for when you don’t want to
generate code in production. You’ll have to make sure the class is generated and pre-/autoloaded.
Generating proxies
You can create a composer script to generate proxies on demand to take advantage of the ONLY_EXISTING mode.
Create a ProxyCompiler.php
<?phpclassProxyCompiler{privateconstPROXIES=[MyActorInterface::class,MyOtherActorInterface::class,];privateconstPROXY_LOCATION=__DIR__.'/proxies/';publicstaticfunctioncompile(){try{$app=\Dapr\App::create();foreach(self::PROXIESas$interface){$output=$app->run(function(\DI\FactoryInterface$factory)use($interface){return\Dapr\Actors\Generators\FileGenerator::generate($interface,$factory);});$reflection=newReflectionClass($interface);$dapr_type=$reflection->getAttributes(\Dapr\Actors\Attributes\DaprType::class)[0]->newInstance()->type;$filename='dapr_proxy_'.$dapr_type.'.php';file_put_contents(self::PROXY_LOCATION.$filename,$output);echo"Compiled: $interface";}}catch(Exception$ex){echo"Failed to generate proxy for $interface\n{$ex->getMessage()} on line {$ex->getLine()} in {$ex->getFile()}\n";}}}
Then add a psr-4 autoloader for the generated proxies and a script in composer.json:
And finally, configure dapr to only use the generated proxies:
<?php// in config.php
return['dapr.actors.proxy.generation'=>ProxyFactory::ONLY_EXISTING,];
In this mode, the proxy satisfies the interface contract, however, it does not actually implement the interface itself
(meaning instanceof will be false). This mode takes advantage of a few quirks in PHP to work and exists for cases
where code cannot be eval’d or generated.
Requests
Creating an actor proxy is very inexpensive for any mode. There are no requests made when creating an actor proxy object.
When you call a method on a proxy object, only methods that you implemented are serviced by your actor implementation.
get_id() is handled locally, and get_reminder(), delete_reminder(), etc. are handled by the daprd.
Actor implementation
Every actor implementation in PHP must implement \Dapr\Actors\IActor and use the \Dapr\Actors\ActorTrait trait. This
allows for fast reflection and some shortcuts. Using the \Dapr\Actors\Actor abstract base class does this for you, but
if you need to override the default behavior, you can do so by implementing the interface and using the trait.
Activation and deactivation
When an actor activates, a token file is written to a temporary directory (by default this is in
'/tmp/dapr_' + sha256(concat(Dapr type, id)) in linux and '%temp%/dapr_' + sha256(concat(Dapr type, id)) on Windows).
This is persisted until the actor deactivates, or the host shuts down. This allows for on_activation to be called once
and only once when Dapr activates the actor on the host.
Performance
Actor method invocation is very fast on a production setup with php-fpm and nginx, or IIS on Windows. Even though
the actor is constructed on every request, actor state keys are only loaded on-demand and not during each request.
However, there is some overhead in loading each key individually. This can be mitigated by storing an array of data in state, trading some usability for speed. Doing this from the start is not recommended; treat it as an optimization when needed.
Versioning state
The names of the variables in the ActorState object directly correspond to key names in the store. This means that if
you change the type or name of a variable, you may run into errors. To get around this, you may need to version your state
object. In order to do this, you’ll need to override how state is loaded and stored. There are many ways to approach this,
one such solution might be something like this:
<?phpclassVersionedStateextends\Dapr\Actors\ActorState{/**
* @var int The current version of the state in the store. We give a default value of the current version.
* However, it may be in the store with a different value.
*/publicint$state_version=self::VERSION;/**
* @var int The current version of the data
*/privateconstVERSION=3;/**
* Call when your actor activates.
*/publicfunctionupgrade(){if($this->state_version<self::VERSION){$value=parent::__get($this->get_versioned_key('key',$this->state_version));// update the value after updating the data structure
parent::__set($this->get_versioned_key('key',self::VERSION),$value);$this->state_version=self::VERSION;$this->save_state();}}// if you upgrade all keys as needed in the method above, you don't need to walk the previous
// keys when loading/saving and you can just get the current version of the key.
privatefunctionget_previous_version(int$version):int{return$this->has_previous_version($version)?$version-1:$version;}privatefunctionhas_previous_version(int$version):bool{return$version>=0;}privatefunctionwalk_versions(int$version,callable$callback,callable$predicate):mixed{$value=$callback($version);if($predicate($value)||!$this->has_previous_version($version)){return$value;}return$this->walk_versions($this->get_previous_version($version),$callback,$predicate);}privatefunctionget_versioned_key(string$key,int$version){return$this->has_previous_version($version)?$version.$key:$key;}publicfunction__get(string$key):mixed{return$this->walk_versions(self::VERSION,fn($version)=>parent::__get($this->get_versioned_key($key,$version)),fn($value)=>isset($value));}publicfunction__isset(string$key):bool{return$this->walk_versions(self::VERSION,fn($version)=>parent::__isset($this->get_versioned_key($key,$version)),fn($isset)=>$isset);}publicfunction__set(string$key,mixed$value):void{// optional: you can unset previous versions of the key
parent::__set($this->get_versioned_key($key,self::VERSION),$value);}publicfunction__unset(string$key):void{// unset this version and all previous versions
$this->walk_versions(self::VERSION,fn($version)=>parent::__unset($this->get_versioned_key($key,$version)),fn()=>false);}}
There’s a lot to be optimized, and it wouldn’t be a good idea to use this verbatim in production, but you can get the
gist of how it would work. A lot of it will depend on your use case which is why there’s not something like this in
the SDK. For instance, in this example implementation, the previous value is kept in case there is a bug during an upgrade; keeping the previous value allows the upgrade to be run again, but you may wish to delete the previous value instead.
5.2 - The App
Using the App Class
In PHP, there is no default router. Thus, the \Dapr\App class is provided. It uses
Nikic’s FastRoute under the hood. However, you are free to use any router or
framework that you’d like. Just check out the add_dapr_routes() method in the App class to see how actors and
subscriptions are implemented.
Every app should start with App::create() which takes two parameters, the first is an existing DI container, if you
have one, and the second is a callback to hook into the ContainerBuilder and add your own configuration.
From there, you should define your routes and then call $app->start() to execute the route on the current request.
<?php// app.php
require_once__DIR__.'/vendor/autoload.php';$app=\Dapr\App::create(configure:fn(\DI\ContainerBuilder$builder)=>$builder->addDefinitions('config.php'));// add a controller for GET /test/{id} that returns the id
$app->get('/test/{id}',fn(string$id)=>$id);$app->start();
Returning from a controller
You can return anything from a controller, and it will be serialized into a JSON object. You can also request the PSR-7 Response object and return that instead, allowing you to customize headers and control the entire response:
<?php$app=\Dapr\App::create(configure:fn(\DI\ContainerBuilder$builder)=>$builder->addDefinitions('config.php'));// add a controller for GET /test/{id} that returns the id
$app->get('/test/{id}',fn(string$id,\Psr\Http\Message\ResponseInterface$response,\Nyholm\Psr7\Factory\Psr17Factory$factory)=>$response->withBody($factory->createStream($id)));$app->start();
Using the app as a client
When you just want to use Dapr as a client, such as in existing code, you can call $app->run(). In these cases, there’s
usually no need for a custom configuration, however, you may want to use a compiled DI container, especially in production:
A DaprClient object is provided; in fact, all the sugar used by the App object is built on the DaprClient.
<?phprequire_once__DIR__.'/vendor/autoload.php';$clientBuilder=\Dapr\Client\DaprClient::clientBuilder();// you can customize (de)serialization or comment out to use the default JSON serializers.
$clientBuilder=$clientBuilder->withSerializationConfig($yourSerializer)->withDeserializationConfig($yourDeserializer);// you can also pass it a logger
$clientBuilder=$clientBuilder->withLogger($myLogger);// and change the url of the sidecar, for example, using https
$clientBuilder=$clientBuilder->useHttpClient('https://localhost:3800');
There are several functions you can call to configure the client builder before building the client.
5.2.1 - Unit Testing
Unit Testing
Unit and integration tests are first-class citizens with the PHP SDK. Using the DI container, mocks, stubs,
and the provided \Dapr\Mocks\TestClient allows you to have very fine-grained tests.
Testing Actors
With actors, there are two things we’re interested in while the actor is under test:
The returned result based on an initial state
The resulting state based on the initial state
Here’s an example test of a very simple actor that updates its state and returns a specific value:
<?php// TestState.php
classTestStateextends\Dapr\Actors\ActorState{publicint$number;}// TestActor.php
#[\Dapr\Actors\Attributes\DaprType('TestActor')]
classTestActorextends\Dapr\Actors\Actor{publicfunction__construct(string$id,privateTestState$state){parent::__construct($id);}publicfunctionoddIncrement():bool{if($this->state->number%2===0){returnfalse;}$this->state->number+=1;returntrue;}}// TheTest.php
classTheTestextends\PHPUnit\Framework\TestCase{private\DI\Container$container;publicfunctionsetUp():void{parent::setUp();// create a default app and extract the DI container from it
$app=\Dapr\App::create(configure:fn(\DI\ContainerBuilder$builder)=>$builder->addDefinitions(['dapr.actors'=>[TestActor::class]],[\Dapr\DaprClient::class=>\DI\autowire(\Dapr\Mocks\TestClient::class)]));$app->run(fn(\DI\Container$container)=>$this->container=$container);}publicfunctiontestIncrementsWhenOdd(){$id=uniqid();$runtime=$this->container->get(\Dapr\Actors\ActorRuntime::class);$client=$this->getClient();// return the current state from http://localhost:1313/reference/api/actors_api/
$client->register_get("/actors/TestActor/$id/state/number",code:200,data:3);// ensure it increments from http://localhost:1313/reference/api/actors_api/
$client->register_post("/actors/TestActor/$id/state",code:204,response_data:null,expected_request:[['operation'=>'upsert','request'=>['key'=>'number','value'=>4,],],]);$result=$runtime->resolve_actor('TestActor',$id,fn($actor)=>$runtime->do_method($actor,'oddIncrement',null));$this->assertTrue($result);}privatefunctiongetClient():\Dapr\Mocks\TestClient{return$this->container->get(\Dapr\DaprClient::class);}}
When building on transactions, you’ll likely want to test how a failed transaction is handled. In order to do that, you
need to inject failures and ensure the transaction matches what you expect.
<?php// MyState.php
#[\Dapr\State\Attributes\StateStore('statestore', \Dapr\consistency\EventualFirstWrite::class)]
classMyStateextends\Dapr\State\TransactionalState{publicstring$value='';}// SomeService.php
classSomeService{publicfunction__construct(privateMyState$state){}publicfunctiondoWork(){$this->state->begin();$this->state->value="hello world";$this->state->commit();}}// TheTest.php
classTheTestextends\PHPUnit\Framework\TestCase{private\DI\Container$container;publicfunctionsetUp():void{parent::setUp();$app=\Dapr\App::create(configure:fn(\DI\ContainerBuilder$builder)=>$builder->addDefinitions([\Dapr\DaprClient::class=>\DI\autowire(\Dapr\Mocks\TestClient::class)]));$this->container=$app->run(fn(\DI\Container$container)=>$container);}privatefunctiongetClient():\Dapr\Mocks\TestClient{return$this->container->get(\Dapr\DaprClient::class);}publicfunctiontestTransactionFailure(){$client=$this->getClient();// create a response from https://v1-16.docs.dapr.io/reference/api/state_api/
$client->register_post('/state/statestore/bulk',code:200,response_data:[['key'=>'value',// no previous value
],],expected_request:['keys'=>['value'],'parallelism'=>10]);$client->register_post('/state/statestore/transaction',code:200,response_data:null,expected_request:['operations'=>[['operation'=>'upsert','request'=>['key'=>'value','value'=>'hello world']]]]);$state=newMyState($this->container,$this->container);$service=newSomeService($state);$service->doWork();$this->assertSame('hello world',$state->value);}}
Dapr uses JSON serialization and thus (complex) type information is lost when sending/receiving data.
Serialization
When returning an object from a controller, passing an object to the DaprClient, or storing an object in a state store,
only public properties are scanned and serialized. You can customize this behavior by implementing \Dapr\Serialization\ISerialize.
For example, if you wanted to create an ID type that serialized to a string, you may implement it like so:
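A minimal sketch, assuming the interface named above uses the same serialize() signature as the custom serializer example shown further down (the MyId class is illustrative):
<?php

class MyId implements \Dapr\Serialization\ISerialize
{
    public string $id;

    public function serialize(mixed $value, \Dapr\Serialization\ISerializer $serializer): mixed
    {
        // Serialize this ID as a plain string instead of an object with an "id" property.
        return $this->id;
    }
}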
This works for any type that we have full ownership over, however, it doesn’t work for classes from libraries or PHP itself.
For that, you need to register a custom serializer with the DI container:
<?php// in config.php
classSerializeSomeClassimplements\Dapr\Serialization\Serializers\ISerialize{publicfunctionserialize(mixed$value,\Dapr\Serialization\ISerializer$serializer):mixed{// serialize $value and return the result
}}return['dapr.serializers.custom'=>[SomeClass::class=>newSerializeSomeClass()],];
Deserialization
Deserialization works exactly the same way, except the interface is \Dapr\Deserialization\Deserializers\IDeserialize.
5.4 - Publish and Subscribe with PHP
How to use
With Dapr, you can publish anything, including cloud events. The SDK contains a simple cloud event implementation, but
you can also just pass an array that conforms to the cloud event spec or use another library.
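As a sketch of publishing from PHP using the client builder shown later in this document (the publishEvent method name, its named arguments, and the builder’s build() call are assumptions here; the pubsub component and topic names are placeholders):
<?php
require_once __DIR__ . '/vendor/autoload.php';

$client = \Dapr\Client\DaprClient::clientBuilder()->build();

// Publish a plain array; the SDK serializes it to JSON (assumed publishEvent signature).
$client->publishEvent(pubsubName: 'pubsub', topicName: 'my-topic', data: ['value' => 123]);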
Only application/octet-stream is supported for binary data.
Receiving cloud events
In your subscription handler, you can have the DI Container inject either a Dapr\PubSub\CloudEvent or an array into
your controller. The former does some validation to ensure you have a proper event. If you need direct access to the
data, or the events do not conform to the spec, use an array.
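A minimal sketch of a handler using the App routing shown later in this document (the /receive-message route is illustrative, the post() helper is assumed to exist alongside the get() helper shown below, and the subscription itself is registered through dapr.subscriptions in config.php):
<?php
$app = \Dapr\App::create(configure: fn(\DI\ContainerBuilder $builder) => $builder->addDefinitions('config.php'));

// The DI container injects a validated CloudEvent into the handler.
$app->post('/receive-message', function (\Dapr\PubSub\CloudEvent $event) {
    // Access the event payload (property name assumed for illustration).
    error_log('Received event: ' . json_encode($event->data));
});

$app->start();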
5.5 - State Management with PHP
How to use
Dapr offers a great modular approach to using state in your application. The best way to learn the basics is to visit
the howto.
Metadata
Many state components allow you to pass metadata to the component to control specific aspects of the component’s
behavior. The PHP SDK allows you to pass that metadata through:
<?php// using the state manager
$app->run(fn(\Dapr\State\StateManager$stateManager)=>$stateManager->save_state('statestore',new\Dapr\State\StateItem('key','value',metadata:['port'=>'112'])));// using the DaprClient
$app->run(fn(\Dapr\Client\DaprClient$daprClient)=>$daprClient->saveState(storeName:'statestore',key:'key',value:'value',metadata:['port'=>'112']))
This is an example of how you might pass the port metadata to Cassandra.
Every state operation allows passing metadata.
Consistency/concurrency
In the PHP SDK, there are four classes that represent the four different types of consistency and concurrency in Dapr:
Passing one of them to a StateManager method or using the StateStore() attribute allows you to define how the state
store should handle conflicts.
Parallelism
When doing a bulk read or beginning a transaction, you can specify the amount of parallelism. Dapr will read “at most” that many keys at a time from the underlying store, if it has to read keys one at a time. This can be helpful for controlling the load on the state store, at the expense of performance. The default is 10. A sketch is shown below.
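A sketch of passing a parallelism hint when loading an object in bulk (the parallelism parameter name is an assumption; check the StateManager signature in your SDK version):
<?php
class BulkData
{
    public string $key1 = '';
    public string $key2 = '';
    public string $key3 = '';
}

$app->run(function (\Dapr\State\StateManager $stateManager) {
    // Read at most 2 keys at a time from the underlying store (assumed parameter name).
    $stateManager->load_object($obj = new BulkData(), parallelism: 2);
});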
Prefix
Hardcoded key names are useful, but why not make state objects more reusable? When committing a transaction or saving an
object to state, you can pass a prefix that is applied to every key in the object.
<?phpclassTransactionObjectextends\Dapr\State\TransactionalState{publicstring$key;}$app->run(function(TransactionObject$object){$object->begin(prefix:'my-prefix-');$object->key='value';// commit to key `my-prefix-key`
$object->commit();});
<?php

class StateObject
{
    public string $key;
}

$app->run(function (\Dapr\State\StateManager $stateManager) {
    // original value is from `my-prefix-key`
    $stateManager->load_object($obj = new StateObject(), prefix: 'my-prefix-');
    $obj->key = 'value';
    // save to `my-prefix-key`
    $stateManager->save_object($obj, prefix: 'my-prefix-');
});
6 - Dapr Python SDK
Python SDK packages for developing Dapr applications
Dapr offers a variety of subpackages to help with the development of Python applications. Using them you can create Python clients, servers, and virtual actors with Dapr.
To get started with the Python SDK, install the main Dapr Python SDK package.
pip install dapr
Note: The development package will contain features and behavior that will be compatible with the pre-release version of the Dapr runtime. Make sure to uninstall any stable versions of the Python SDK before installing the dapr-dev package.
pip install dapr-dev
Available subpackages
SDK imports
Python SDK imports are subpackages included with the main SDK install, but need to be imported when used. The most common imports provided by the Dapr Python SDK are:
Client
Write Python applications to interact with a Dapr sidecar and other Dapr applications, including stateful virtual actors in Python
SDK extensions mainly work as utilities for receiving pub/sub events, programmatically creating pub/sub subscriptions, and handling input binding events. While you can achieve all of these tasks without an extension, using a Python SDK extension proves convenient.
gRPC
Create Dapr services with the gRPC server extension.
FastAPI
Integrate with Dapr Python virtual actors and pub/sub using the Dapr FastAPI extension.
Flask
Integrate with Dapr Python virtual actors using the Dapr Flask extension.
Workflow
Author workflows that work with other Dapr APIs in Python.
The dapr package contains the DaprClient, which is used to create and use a client.
from dapr.clients import DaprClient
Initialising the client
You can initialise a Dapr client in multiple ways:
Default values:
When you initialise the client without any parameters it will use the default values for a Dapr
sidecar instance (127.0.0.1:50001).
from dapr.clients import DaprClient

with DaprClient() as d:
    # use the client
Specifying an endpoint on initialisation:
When passed as an argument in the constructor, the gRPC endpoint takes precedence over any
configuration or environment variable.
from dapr.clients import DaprClient

with DaprClient("mydomain:50051?tls=true") as d:
    # use the client
Configuration options:
Dapr Sidecar Endpoints
You can use the standardised DAPR_GRPC_ENDPOINT environment variable to
specify the gRPC endpoint. When this variable is set, the client can be initialised
without any arguments:
from dapr.clients import DaprClient

with DaprClient() as d:
    # the client will use the endpoint specified in the environment variables
The legacy environment variables DAPR_RUNTIME_HOST, DAPR_HTTP_PORT and DAPR_GRPC_PORT are
also supported, but DAPR_GRPC_ENDPOINT takes precedence.
Dapr API Token
If your Dapr instance is configured to require the DAPR_API_TOKEN environment variable, you can
set it in the environment and the client will use it automatically. You can read more about Dapr API token authentication here.
Health timeout
On client initialisation, a health check is performed against the Dapr sidecar (/healthz/outbound).
The client will wait for the sidecar to be up and running before proceeding.
The default healthcheck timeout is 60 seconds, but it can be overridden by setting the DAPR_HEALTH_TIMEOUT
environment variable.
Retries and timeout
The Dapr client can retry a request if a specific error code is received from the sidecar. This is
configurable through the DAPR_API_MAX_RETRIES environment variable and is picked up automatically,
not requiring any code changes.
The default value for DAPR_API_MAX_RETRIES is 0, which means no retries will be made.
You can fine-tune more retry parameters by creating a dapr.clients.retry.RetryPolicy object and
passing it to the DaprClient constructor:
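The snippet below is a minimal sketch of that pattern. The import path dapr.clients.retry.RetryPolicy comes from the text above, but the specific keyword arguments shown (max_attempts, initial_backoff, max_backoff, backoff_multiplier) and the retry_policy constructor parameter name are assumptions; check the SDK reference for the exact signature.

from dapr.clients import DaprClient
from dapr.clients.retry import RetryPolicy

# Parameter names below are illustrative; consult the SDK reference for the exact names.
retry_policy = RetryPolicy(max_attempts=5, initial_backoff=1, max_backoff=20, backoff_multiplier=1.5)

with DaprClient(retry_policy=retry_policy) as d:
    # Calls made through this client follow the retry policy configured above.
    ...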
Timeout can be set for all calls through the environment variable DAPR_API_TIMEOUT_SECONDS. The default value is 60 seconds.
Note: You can control timeouts on service invocation separately, by passing a timeout parameter to the invoke_method method.
Error handling
Initially, errors in Dapr followed the Standard gRPC error model. However, to provide more detailed and informative error messages, version 1.13 introduced an enhanced error model which aligns with the gRPC Richer error model. In response, the Python SDK implemented DaprGrpcError, a custom exception class designed to improve the developer experience. It's important to note that the transition to using DaprGrpcError for all gRPC status exceptions is a work in progress. As of now, not every API call in the SDK has been updated to leverage this custom exception. We are actively working on this enhancement and welcome contributions from the community.
Example of handling DaprGrpcError exceptions when using the Dapr Python SDK:
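The following is a minimal sketch, assuming DaprGrpcError is importable from dapr.clients.exceptions (verify the import path for your SDK version); the state store call is just an illustrative operation that may raise the error.

from dapr.clients import DaprClient
from dapr.clients.exceptions import DaprGrpcError  # assumed import path

with DaprClient() as d:
    try:
        # Any sidecar call may raise DaprGrpcError, e.g. reading from a misconfigured state store.
        d.get_state(store_name='statestore', key='some-key')
    except DaprGrpcError as err:
        # Inspect the enriched error; the available details depend on the SDK version.
        print(f'Caught a DaprGrpcError: {err}')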
The Dapr Python SDK provides a simple API for invoking services via either HTTP or gRPC (deprecated). The protocol can be selected by setting the DAPR_API_METHOD_INVOCATION_PROTOCOL environment variable, defaulting to HTTP when unset. gRPC service invocation in Dapr is deprecated and gRPC proxying is recommended as an alternative.
from dapr.clients import DaprClient

with DaprClient() as d:
    # invoke a method (gRPC or HTTP GET)
    resp = d.invoke_method('service-to-invoke', 'method-to-invoke', data='{"message":"Hello World"}')

    # for other HTTP verbs the verb must be specified
    # invoke a 'POST' method (HTTP only)
    resp = d.invoke_method(
        'service-to-invoke',
        'method-to-invoke',
        data='{"id":"100", "FirstName":"Value", "LastName":"Value"}',
        http_verb='post',
    )
The base endpoint for HTTP api calls is specified in the DAPR_HTTP_ENDPOINT environment variable.
If this variable is not set, the endpoint value is derived from the DAPR_RUNTIME_HOST and DAPR_HTTP_PORT variables, whose default values are 127.0.0.1 and 3500 respectively.
The base endpoint for gRPC calls is the one used for the client initialisation (explained above).
Visit Python SDK examples for code samples and instructions to try out service invocation.
Save & get application state
from dapr.clients import DaprClient

with DaprClient() as d:
    # Save state
    d.save_state(store_name="statestore", key="key1", value="value1")

    # Get state
    data = d.get_state(store_name="statestore", key="key1").data

    # Delete state
    d.delete_state(store_name="statestore", key="key1")
from dapr.clients import DaprClient
import json

with DaprClient() as d:
    cloud_event = {
        'specversion': '1.0',
        'type': 'com.example.event',
        'source': 'my-service',
        'id': 'myid',
        'data': {'id': 1, 'message': 'hello world'},
        'datacontenttype': 'application/json',
    }

    # Set the data content type to 'application/cloudevents+json'
    resp = d.publish_event(
        pubsub_name='pubsub',
        topic_name='TOPIC_CE',
        data=json.dumps(cloud_event),
        data_content_type='application/cloudevents+json',
    )
Publish CloudEvents messages with plain text payload:
from dapr.clients import DaprClient
import json

with DaprClient() as d:
    cloud_event = {
        'specversion': '1.0',
        'type': 'com.example.event',
        'source': 'my-service',
        'id': 'myid',
        'data': 'hello world',
        'datacontenttype': 'text/plain',
    }

    # Set the data content type to 'application/cloudevents+json'
    resp = d.publish_event(
        pubsub_name='pubsub',
        topic_name='TOPIC_CE',
        data=json.dumps(cloud_event),
        data_content_type='application/cloudevents+json',
    )
Subscribe to messages
import json

from cloudevents.sdk.event import v1
from dapr.ext.grpc import App, Rule

app = App()

# Default subscription for a topic
@app.subscribe(pubsub_name='pubsub', topic='TOPIC_A')
def mytopic(event: v1.Event) -> None:
    data = json.loads(event.Data())
    print(f'Received: id={data["id"]}, message="{data["message"]}"'
          f' content_type="{event.content_type}"', flush=True)

# Specific handler using Pub/Sub routing
@app.subscribe(pubsub_name='pubsub', topic='TOPIC_A', rule=Rule("event.type == \"important\"", 1))
def mytopic_important(event: v1.Event) -> None:
    data = json.loads(event.Data())
    print(f'Received: id={data["id"]}, message="{data["message"]}"'
          f' content_type="{event.content_type}"', flush=True)
Visit Python SDK examples for code samples and instructions to try out pub/sub.
Streaming message subscription
You can create a streaming subscription to a PubSub topic using either the subscribe
or subscribe_with_handler methods.
The subscribe method returns an iterable Subscription object, which allows you to pull messages from the
stream by using a for loop (ex. for message in subscription) or by
calling the next_message method. This will block on the main thread while waiting for messages.
When done, you should call the close method to terminate the
subscription and stop receiving messages.
The subscribe_with_handler method accepts a callback function that is executed for each message
received from the stream.
It runs in a separate thread, so it doesn’t block the main thread. The callback should return a
TopicEventResponse (ex. TopicEventResponse('success')), indicating whether the message was
processed successfully, should be retried, or should be discarded. The method will automatically
manage message acknowledgements based on the returned status. The call to subscribe_with_handler
method returns a close function, which should be called to terminate the subscription when you’re
done.
Here’s an example of using the subscribe method:
import time

from dapr.clients import DaprClient
from dapr.clients.grpc.subscription import StreamInactiveError, StreamCancelledError

counter = 0

def process_message(message):
    global counter
    counter += 1
    # Process the message here
    print(f'Processing message: {message.data()} from {message.topic()}...')
    return 'success'

def main():
    with DaprClient() as client:
        global counter

        subscription = client.subscribe(
            pubsub_name='pubsub', topic='TOPIC_A', dead_letter_topic='TOPIC_A_DEAD'
        )

        try:
            for message in subscription:
                if message is None:
                    print('No message received. The stream might have been cancelled.')
                    continue

                try:
                    response_status = process_message(message)

                    if response_status == 'success':
                        subscription.respond_success(message)
                    elif response_status == 'retry':
                        subscription.respond_retry(message)
                    elif response_status == 'drop':
                        subscription.respond_drop(message)

                    if counter >= 5:
                        break
                except StreamInactiveError:
                    print('Stream is inactive. Retrying...')
                    time.sleep(1)
                    continue
                except StreamCancelledError:
                    print('Stream was cancelled')
                    break
                except Exception as e:
                    print(f'Error occurred during message processing: {e}')

        finally:
            print('Closing subscription...')
            subscription.close()

if __name__ == '__main__':
    main()
And here’s an example of using the subscribe_with_handler method:
import time

from dapr.clients import DaprClient
from dapr.clients.grpc._response import TopicEventResponse

counter = 0

def process_message(message):
    # Process the message here
    global counter
    counter += 1
    print(f'Processing message: {message.data()} from {message.topic()}...')
    return TopicEventResponse('success')

def main():
    with DaprClient() as client:
        # This will start a new thread that will listen for messages
        # and process them in the `process_message` function
        close_fn = client.subscribe_with_handler(
            pubsub_name='pubsub',
            topic='TOPIC_A',
            handler_fn=process_message,
            dead_letter_topic='TOPIC_A_DEAD',
        )

        while counter < 5:
            time.sleep(1)

        print("Closing subscription...")
        close_fn()

if __name__ == '__main__':
    main()
Visit Python SDK examples for code samples and instructions to try out streaming pub/sub.
Conversation (Alpha)
Note
The Dapr Conversation API is currently in alpha.
Since version 1.15, Dapr offers developers the capability to securely and reliably interact with Large Language Models (LLMs) through the Conversation API.
from dapr.clients import DaprClient
from dapr.clients.grpc._request import ConversationInput

with DaprClient() as d:
    inputs = [
        ConversationInput(content="What's Dapr?", role='user', scrub_pii=True),
        ConversationInput(content='Give a brief overview.', role='user', scrub_pii=True),
    ]

    metadata = {
        'model': 'foo',
        'key': 'authKey',
        'cacheTTL': '10m',
    }

    response = d.converse_alpha1(
        name='echo', inputs=inputs, temperature=0.7, context_id='chat-123', metadata=metadata
    )

    for output in response.outputs:
        print(f'Result: {output.result}')
Visit Python SDK examples for code samples and instructions to try out retrieving secrets.
Configuration
Get configuration
from dapr.clients import DaprClient

with DaprClient() as d:
    # Get Configuration
    configuration = d.get_configuration(store_name='configurationstore', keys=['orderId'], config_metadata={})
Subscribe to configuration
import asyncio
from time import sleep
from dapr.clients import DaprClient

async def executeConfiguration():
    with DaprClient() as d:
        storeName = 'configurationstore'
        key = 'orderId'

        # Wait for sidecar to be up within 20 seconds.
        d.wait(20)

        # Subscribe to configuration by key.
        configuration = await d.subscribe_configuration(store_name=storeName, keys=[key], config_metadata={})
        while True:
            if configuration != None:
                items = configuration.get_items()
                for key, item in items:
                    print(f"Subscribe key={key} value={item.value} version={item.version}", flush=True)
            else:
                print("Nothing yet")
            sleep(5)

asyncio.run(executeConfiguration())
Visit Python SDK examples for code samples and instructions to try out configuration.
Distributed Lock
from dapr.clients import DaprClient

def main():
    # Lock parameters
    store_name = 'lockstore'  # as defined in components/lockstore.yaml
    resource_id = 'example-lock-resource'
    client_id = 'example-client-id'
    expiry_in_seconds = 60

    with DaprClient() as dapr:
        print('Will try to acquire a lock from lock store named [%s]' % store_name)
        print('The lock is for a resource named [%s]' % resource_id)
        print('The client identifier is [%s]' % client_id)
        print('The lock will expire in %s seconds.' % expiry_in_seconds)

        with dapr.try_lock(store_name, resource_id, client_id, expiry_in_seconds) as lock_result:
            assert lock_result.success, 'Failed to acquire the lock. Aborting.'
            print('Lock acquired successfully!!!')

        # At this point the lock was released - by magic of the `with` clause ;)
        unlock_result = dapr.unlock(store_name, resource_id, client_id)
        print('We already released the lock so unlocking will not work.')
        print('We tried to unlock it anyway and got back [%s]' % unlock_result.status)
Visit Python SDK examples for code samples and instructions to try out distributed lock.
Cryptography
from dapr.clients import DaprClient
from dapr.clients.grpc._crypto import EncryptOptions, DecryptOptions

message = 'The secret is "passw0rd"'

def main():
    with DaprClient() as d:
        resp = d.encrypt(
            data=message.encode(),
            options=EncryptOptions(
                component_name='crypto-localstorage',
                key_name='rsa-private-key.pem',
                key_wrap_algorithm='RSA',
            ),
        )
        encrypt_bytes = resp.read()

        resp = d.decrypt(
            data=encrypt_bytes,
            options=DecryptOptions(
                component_name='crypto-localstorage',
                key_name='rsa-private-key.pem',
            ),
        )
        decrypt_bytes = resp.read()

        print(decrypt_bytes.decode())  # The secret is "passw0rd"
The interface defines the actor contract that is shared between the actor implementation and the clients calling the actor. Because a client may depend on it, it typically makes sense to define it in an assembly that is separate from the actor implementation.
An actor service hosts the virtual actor. It is implemented as a class that derives from the base type Actor and implements the interfaces defined in the actor interface.
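As a rough illustration of that split, the sketch below defines a hypothetical DemoActorInterface and a DemoActor implementation; the class names, method name, and returned data are placeholders, not part of the SDK.

from dapr.actor import Actor, ActorInterface, actormethod

class DemoActorInterface(ActorInterface):
    @actormethod(name="GetMyData")
    async def get_my_data(self) -> object:
        ...

class DemoActor(Actor, DemoActorInterface):
    async def get_my_data(self) -> object:
        # A real actor would typically read this from its state manager.
        return {'data': 'some data'}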
Actors can be created using one of the Dapr actor extensions:
Mock actors are created by passing your actor class and an actor ID (a string) to the create_mock_actor function. This function returns an instance of the actor with many internal methods overridden. Instead of interacting with Dapr for tasks like saving state or managing timers, the mock actor uses in-memory state to simulate these behaviors.
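For example, here is a minimal sketch of creating and exercising a mock actor; the import path for create_mock_actor and the DemoActor class under test are assumptions, so adjust them to your project.

import asyncio

# Assumed import path; check the SDK for the module that exposes create_mock_actor.
from dapr.actor.runtime.mock_actor import create_mock_actor

from demo_actor import DemoActor  # hypothetical actor class under test

mock_actor = create_mock_actor(DemoActor, "test-actor-id")

async def test_get_my_data():
    # _on_activate is not called automatically for mock actors (see Usage and Limitations below).
    await mock_actor._on_activate()
    result = await mock_actor.get_my_data()
    # Assert on the result, or on side effects via mock_actor._state_manager._mock_state()

asyncio.run(test_get_my_data())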
This state can be accessed through the following variables:
IMPORTANT NOTE: Due to the type hinting issues discussed further down, these variables will not be visible to type hinters/linters/etc., which will treat them as invalid variables. You will need to use them with # type: ignore in order to satisfy any such tools.
_state_manager._mock_state() A [str, object] dict where all the actor state is stored. Any variable saved via _state_manager.save_state(key, value), or any other state manager method, is stored in the dict as that key-value pair. Any value loaded via try_get_state or any other state manager method is taken from this dict.
_state_manager._mock_timers() A [str, ActorTimerData] dict which holds the active actor timers. Any actor method which would add or remove a timer adds or pops the appropriate ActorTimerData object from this dict.
_state_manager._mock_reminders() A [str, ActorReminderData] dict which holds the active actor reminders. Any actor method which would add or remove a reminder adds or pops the appropriate ActorReminderData object from this dict.
Note: The timers and reminders will never actually trigger. The dictionaries exist only so methods that should add or remove timers/reminders can be tested. If you need to test the callbacks they should activate, you should call them directly with the appropriate values:
result = await mock_actor.receive_reminder(name, state, due_time, period, _ttl)
# Test the result directly or test for side effects (like changing state) by querying `_state_manager._mock_state`
Usage and Limitations
To allow for more fine-grained control, the _on_activate method will not be called automatically the way it is when Dapr initializes a new Actor instance. You should call it manually as needed as part of your tests.
A current limitation of the mock actor system is that it does not call the _on_pre_actor_method and _on_post_actor_method methods. You can always call these methods manually as part of a test.
The __init__, register_timer, unregister_timer, register_reminder, and unregister_reminder methods are all overwritten by the MockActor class that gets applied as a mixin via create_mock_actor. If your actor itself overrides these methods, those modifications will themselves be overwritten and the actor will likely not behave as you expect.
Note: __init__ is a special case, where you are expected to define it as shown in the sketch below.
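A minimal sketch of the presumed expected form, in which the constructor simply forwards its arguments to the Actor base class (the MyActor class name is a placeholder):

from dapr.actor import Actor

class MyActor(Actor):  # placeholder class name
    def __init__(self, ctx, actor_id):
        # Forward the runtime context and actor ID straight to the base class.
        super().__init__(ctx, actor_id)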
Mock actors work fine with this, but if you have added any extra logic into __init__, it will be overwritten. It is worth noting that the correct way to apply logic on initialization is via _on_activate (which can also be safely used with mock actors) instead of __init__.
If you have an actor which does override the default Dapr actor methods, you can create a custom subclass of the MockActor class (from MockActor.py) that implements your custom logic, interacts with _mock_state, _mock_timers, and _mock_reminders as normal, and is then applied as a mixin via a create_mock_actor function you define yourself.
The actor _runtime_ctx variable is set to None. All the normal actor methods have been overwritten so that they do not call it, but if your code interacts directly with _runtime_ctx, tests may fail.
The actor _state_manager is overwritten with an instance of MockStateManager. This has all the same methods and functionality of the base ActorStateManager, except for using the various _mock variables for storing data instead of the _runtime_ctx. If your code implements its own custom state manager it will be overwritten and tests will likely fail.
Type Hinting
Because of Python’s lack of a unified method for type hinting type intersections (see: python/typing #213), type hinting unfortunately doesn’t work with Mock Actors. The return type is type hinted as “instance of Actor subclass T” when it should really be type hinted as “instance of MockActor subclass T” or “instance of type intersection [Actor subclass T, MockActor]” (where, it is worth noting, MockActor is itself a subclass of Actor).
This means that, for example, if you hover over mockactor._state_manager in a code editor, it will come up as an instance of ActorStateManager (instead of MockStateManager), and various IDE helper functions (like VSCode’s Go to Definition, which will bring you to the definition of ActorStateManager instead of MockStateManager) won’t work properly.
For now, this issue is unfixable, so it's merely something to be noted because of the confusion it might cause. If it becomes possible to accurately type hint cases like this in the future, feel free to open an issue about implementing it.
6.3 - Dapr Python SDK extensions
Python SDK for developing Dapr applications
6.3.1 - Getting started with the Dapr Python gRPC service extension
How to get up and running with the Dapr Python gRPC extension
The Dapr Python SDK provides a built-in gRPC server extension, dapr.ext.grpc, for creating Dapr services.
Installation
You can download and install the Dapr gRPC server extension with:
pip install dapr-ext-grpc
Note
The development package will contain features and behavior that will be compatible with the pre-release version of the Dapr runtime. Make sure to uninstall any stable versions of the Python SDK extension before installing the dapr-ext-grpc-dev package.
pip3 install dapr-ext-grpc-dev
Examples
The App object can be used to create a server.
Listen for service invocation requests
The InvokeMethodRequest and InvokeMethodResponse objects can be used to handle incoming requests.
A simple service that will listen and respond to requests will look like:
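The following is a minimal sketch based on the App, InvokeMethodRequest, and InvokeMethodResponse types mentioned above; the method name 'my-method', the response payload, and port 50051 are placeholders.

from dapr.ext.grpc import App, InvokeMethodRequest, InvokeMethodResponse

app = App()

@app.method(name='my-method')
def mymethod(request: InvokeMethodRequest) -> InvokeMethodResponse:
    # Inspect the incoming request metadata and body
    print(request.metadata, flush=True)
    print(request.text(), flush=True)

    # Respond with a plain-text payload
    return InvokeMethodResponse(b'INVOKE_RECEIVED', 'text/plain; charset=UTF-8')

app.run(50051)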
When subscribing to a topic, you can instruct Dapr whether the delivered event was accepted, or whether it should be dropped or retried later.
from typing import Optional

from cloudevents.sdk.event import v1
from dapr.ext.grpc import App, Rule
from dapr.clients.grpc._response import TopicEventResponse

app = App()

# Default subscription for a topic
@app.subscribe(pubsub_name='pubsub', topic='TOPIC_A')
def mytopic(event: v1.Event) -> Optional[TopicEventResponse]:
    print(event.Data(), flush=True)
    # Returning None (or not doing a return explicitly) is equivalent
    # to returning a TopicEventResponse("success").
    # You can also return TopicEventResponse("retry") for dapr to log
    # the message and retry delivery later, or TopicEventResponse("drop")
    # for it to drop the message
    return TopicEventResponse("success")

# Specific handler using Pub/Sub routing
@app.subscribe(pubsub_name='pubsub', topic='TOPIC_A', rule=Rule("event.type == \"important\"", 1))
def mytopic_important(event: v1.Event) -> None:
    print(event.Data(), flush=True)

# Handler with disabled topic validation
@app.subscribe(pubsub_name='pubsub-mqtt', topic='topic/#', disable_topic_validation=True)
def mytopic_wildcard(event: v1.Event) -> None:
    print(event.Data(), flush=True)

app.run(50051)
6.3.2 - Dapr Python SDK integration with FastAPI
How to create Dapr Python virtual actors and pub/sub with the FastAPI extension
The Dapr Python SDK provides integration with FastAPI using the dapr-ext-fastapi extension.
Installation
You can download and install the Dapr FastAPI extension with:
pip install dapr-ext-fastapi
Note
The development package will contain features and behavior that will be compatible with the pre-release version of the Dapr runtime. Make sure to uninstall any stable versions of the Python SDK extension before installing the dapr-ext-fastapi-dev package.
pip install dapr-ext-fastapi-dev
Example
Subscribing to events of different types
import uvicorn
from fastapi import Body, FastAPI
from dapr.ext.fastapi import DaprApp
from pydantic import BaseModel

class RawEventModel(BaseModel):
    body: str

class User(BaseModel):
    id: int
    name: str

class CloudEventModel(BaseModel):
    data: User
    datacontenttype: str
    id: str
    pubsubname: str
    source: str
    specversion: str
    topic: str
    traceid: str
    traceparent: str
    tracestate: str
    type: str

app = FastAPI()
dapr_app = DaprApp(app)

# Allow handling event with any structure (Easiest, but least robust)
# dapr publish --publish-app-id sample --topic any_topic --pubsub pubsub --data '{"id":"7", "desc": "good", "size":"small"}'
@dapr_app.subscribe(pubsub='pubsub', topic='any_topic')
def any_event_handler(event_data=Body()):
    print(event_data)

# For robustness choose one of the below based on if publisher is using CloudEvents

# Handle events sent with CloudEvents
# dapr publish --publish-app-id sample --topic cloud_topic --pubsub pubsub --data '{"id":"7", "name":"Bob Jones"}'
@dapr_app.subscribe(pubsub='pubsub', topic='cloud_topic')
def cloud_event_handler(event_data: CloudEventModel):
    print(event_data)

# Handle raw events sent without CloudEvents
# curl -X "POST" http://localhost:3500/v1.0/publish/pubsub/raw_topic?metadata.rawPayload=true -H "Content-Type: application/json" -d '{"body": "345"}'
@dapr_app.subscribe(pubsub='pubsub', topic='raw_topic')
def raw_event_handler(event_data: RawEventModel):
    print(event_data)

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=30212)
Creating an actor
from fastapi import FastAPI
from dapr.ext.fastapi import DaprActor
from demo_actor import DemoActor

app = FastAPI(title=f'{DemoActor.__name__}Service')

# Add Dapr Actor Extension
actor = DaprActor(app)

@app.on_event("startup")
async def startup_event():
    # Register DemoActor
    await actor.register_actor(DemoActor)

@app.get("/GetMyData")
def get_my_data():
    return "{'message': 'myData'}"
6.3.3 - Dapr Python SDK integration with Flask
How to create Dapr Python virtual actors with the Flask extension
The Dapr Python SDK provides integration with Flask using the flask-dapr extension.
Installation
You can download and install the Dapr Flask extension with:
pip install flask-dapr
Note
The development package will contain features and behavior that will be compatible with the pre-release version of the Dapr runtime. Make sure to uninstall any stable versions of the Python SDK extension before installing the development package.
6.3.4 - Dapr Python SDK integration with Dapr Workflow extension
How to get up and running with the Dapr Workflow extension
The Dapr Python SDK provides a built-in Dapr Workflow extension, dapr.ext.workflow, for creating Dapr services.
Installation
You can download and install the Dapr Workflow extension with:
pip install dapr-ext-workflow
Note
The development package will contain features and behavior that will be compatible with the pre-release version of the Dapr runtime. Make sure to uninstall any stable versions of the Python SDK extension before installing the dapr-ext-workflow-dev package.
pip install dapr-ext-workflow-dev
Example
from time import sleep

import dapr.ext.workflow as wf

wfr = wf.WorkflowRuntime()

@wfr.workflow(name='random_workflow')
def task_chain_workflow(ctx: wf.DaprWorkflowContext, wf_input: int):
    try:
        result1 = yield ctx.call_activity(step1, input=wf_input)
        result2 = yield ctx.call_activity(step2, input=result1)
    except Exception as e:
        yield ctx.call_activity(error_handler, input=str(e))
        raise
    return [result1, result2]

@wfr.activity(name='step1')
def step1(ctx, activity_input):
    print(f'Step 1: Received input: {activity_input}.')
    # Do some work
    return activity_input + 1

@wfr.activity
def step2(ctx, activity_input):
    print(f'Step 2: Received input: {activity_input}.')
    # Do some work
    return activity_input * 2

@wfr.activity
def error_handler(ctx, error):
    print(f'Executing error handler: {error}.')
    # Do some compensating work

if __name__ == '__main__':
    wfr.start()
    sleep(10)  # wait for workflow runtime to start

    wf_client = wf.DaprWorkflowClient()
    instance_id = wf_client.schedule_new_workflow(workflow=task_chain_workflow, input=42)
    print(f'Workflow started. Instance ID: {instance_id}')

    state = wf_client.wait_for_workflow_completion(instance_id)
    print(f'Workflow completed! Status: {state.runtime_status}')

    wfr.shutdown()
Learn more about authoring and managing workflows:
Note: Since python3.exe is not defined on Windows, you may need to use python simple.py instead of python3 simple.py.
Expected output
- "== APP == Hi Counter!"
- "== APP == New counter value is: 1!"
- "== APP == New counter value is: 11!"
- "== APP == Retry count value is: 0!"
- "== APP == Retry count value is: 1! This print statement verifies retry"
- "== APP == Appending 1 to child_orchestrator_string!"
- "== APP == Appending a to child_orchestrator_string!"
- "== APP == Appending a to child_orchestrator_string!"
- "== APP == Appending 2 to child_orchestrator_string!"
- "== APP == Appending b to child_orchestrator_string!"
- "== APP == Appending b to child_orchestrator_string!"
- "== APP == Appending 3 to child_orchestrator_string!"
- "== APP == Appending c to child_orchestrator_string!"
- "== APP == Appending c to child_orchestrator_string!"
- "== APP == Get response from hello_world_wf after pause call: Suspended"
- "== APP == Get response from hello_world_wf after resume call: Running"
- "== APP == New counter value is: 111!"
- "== APP == New counter value is: 1111!"
- "== APP == Workflow completed! Result: "Completed"
What happened?
When you run the application, several key workflow features are shown:
Workflow and Activity Registration: The application uses Python decorators to automatically register workflows and activities with the runtime. This decorator-based approach provides a clean, declarative way to define your workflow components:
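For reference, this is the decorator-based registration pattern from the example above, shown here in a trimmed-down, self-contained form:

import dapr.ext.workflow as wf

wfr = wf.WorkflowRuntime()

@wfr.workflow(name='random_workflow')
def task_chain_workflow(ctx: wf.DaprWorkflowContext, wf_input: int):
    # Registered as a workflow; orchestrates activities via the workflow context.
    result = yield ctx.call_activity(step1, input=wf_input)
    return result

@wfr.activity(name='step1')
def step1(ctx, activity_input):
    # Registered as an activity; does the actual work.
    return activity_input + 1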
Workflow Lifecycle Management: The example demonstrates how to pause and resume the workflow:
wf_client.pause_workflow(instance_id=instance_id)
metadata = wf_client.get_workflow_state(instance_id=instance_id)
# ... check status ...
wf_client.resume_workflow(instance_id=instance_id)
Event Raising: After resuming, the workflow raises an event:
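A minimal sketch of that call, reusing the wf_client and instance_id from the example above; the event name and payload are placeholders.

# Raise an external event into the running workflow instance.
wf_client.raise_workflow_event(instance_id=instance_id, event_name='approval_received', data='approved')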
7 - Dapr Rust SDK
Rust SDK packages for developing Dapr applications
Note
The Dapr Rust SDK is currently in Alpha. Work is underway to bring it to a stable release, which will likely involve breaking changes.
A client library to help build Dapr applications using Rust. This client is targeting support for all public Dapr APIs while focusing on idiomatic Rust experiences and developer productivity.
Client
Use the Rust Client SDK for invoking public Dapr APIs
Learn more about the Rust Client SDK: https://v1-16.docs.dapr.io/developing-applications/sdks/rust/rust-client/
7.1 - Getting started with the Dapr client Rust SDK
How to get up and running with the Dapr Rust SDK
The Dapr client package allows you to interact with other Dapr applications from
a Rust application.
Note
The Dapr Rust SDK is currently in Alpha. Work is underway to bring it to a
stable release, which will likely involve breaking changes.
The Dapr Client provides access to these state management methods: save_state, get_state, and delete_state. They can be used like so:
let store_name = String::from("statestore");
let key = String::from("hello");
let val = String::from("world").into_bytes();

// save key-value pair in the state store
client.save_state(store_name, key, val, None, None, None).await?;

let get_response = client.get_state("statestore", "hello", None).await?;

// delete a value from the state store
client.delete_state("statestore", "hello", None).await?;
Multiple states can be sent with the save_bulk_states method.