Dapr Java SDK

Java SDK packages for developing Dapr applications

Dapr offers a variety of packages to help with the development of Java applications. Using them you can create Java clients, servers, and virtual actors with Dapr.

Prerequisites

Import Dapr’s Java SDK

Next, import the Java SDK packages to get started. Select your preferred build tool to learn how to import.

For a Maven project, add the following to your pom.xml file:

<project>
  ...
  <dependencies>
    ...
    <!-- Dapr's core SDK with all features, except Actors. -->
    <dependency>
      <groupId>io.dapr</groupId>
      <artifactId>dapr-sdk</artifactId>
      <version>1.14.1</version>
    </dependency>
    <!-- Dapr's SDK for Actors (optional). -->
    <dependency>
      <groupId>io.dapr</groupId>
      <artifactId>dapr-sdk-actors</artifactId>
      <version>1.14.1</version>
    </dependency>
    <!-- Dapr's SDK integration with SpringBoot (optional). -->
    <dependency>
      <groupId>io.dapr</groupId>
      <artifactId>dapr-sdk-springboot</artifactId>
      <version>1.14.1</version>
    </dependency>
    ...
  </dependencies>
  ...
</project>

For a Gradle project, add the following to your build.gradle file:

dependencies {
...
    // Dapr's core SDK with all features, except Actors.
    implementation('io.dapr:dapr-sdk:1.14.1')
    // Dapr's SDK for Actors (optional).
    implementation('io.dapr:dapr-sdk-actors:1.14.1')
    // Dapr's SDK integration with SpringBoot (optional).
    implementation('io.dapr:dapr-sdk-springboot:1.14.1')
}

If you are also using Spring Boot, you may run into a common issue where the OkHttp version that the Dapr SDK uses conflicts with the one specified in the Spring Boot Bill of Materials.

You can fix this by specifying a compatible OkHttp version in your project to match the version that the Dapr SDK uses:

<dependency>
  <groupId>com.squareup.okhttp3</groupId>
  <artifactId>okhttp</artifactId>
  <!-- Match the OkHttp version your Dapr SDK release depends on; a 4.x version is shown as an example. -->
  <version>4.9.0</version>
</dependency>
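If you build with Gradle, the same pin applies; a minimal sketch, assuming the Groovy DSL (adjust the version to match your Dapr SDK release):

```groovy
dependencies {
    // Pin OkHttp to the version your Dapr SDK release depends on
    // (a 4.x version is shown as an example).
    implementation('com.squareup.okhttp3:okhttp:4.9.0')
}
```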

Try it out

Put the Dapr Java SDK to the test. Walk through the Java quickstarts and tutorials to see Dapr in action:

  • Quickstarts: Experience Dapr’s API building blocks in just a few minutes using the Java SDK.
  • SDK samples: Clone the SDK repo to try out some examples and get started.

Available packages

Client

Create Java clients that interact with a Dapr sidecar and other Dapr applications.

Workflow

Create and manage workflows that work with other Dapr APIs in Java.

1 - AI

With the Dapr Conversation AI package, you can interact with the Dapr AI workloads from a Java application. To get started, walk through the Dapr AI how-to guide.

1.1 - How to: Author and manage Dapr Conversation AI in the Java SDK

How to get up and running with Conversation AI using the Dapr Java SDK

In this guide, you’ll learn how to use the Conversation API to converse with a Large Language Model (LLM). The API returns the LLM’s response to the given prompt. With the provided Conversation AI example, you will:

  • Provide a prompt using the Conversation AI example
  • Filter out Personally Identifiable Information (PII)

This example uses the default configuration from dapr init in self-hosted mode.

Prerequisites

Set up the environment

Clone the Java SDK repo and navigate into it.

git clone https://github.com/dapr/java-sdk.git
cd java-sdk

Run the following command to install the requirements for running the Conversation AI example with the Dapr Java SDK.

mvn clean install -DskipTests

From the Java SDK root directory, navigate to the examples directory.

cd examples

Run the Dapr sidecar.

dapr run --app-id conversationapp --dapr-grpc-port 51439 --dapr-http-port 3500 --app-port 8080

Now, Dapr is listening for HTTP requests at http://localhost:3500 and gRPC requests at localhost:51439.

Send a prompt with Personally identifiable information (PII) to the Conversation AI API

The DemoConversationAI example sends a prompt using the converse method on the DaprPreviewClient.

public class DemoConversationAI {
  /**
   * The main method to start the client.
   *
   * @param args Input arguments (unused).
   */
  public static void main(String[] args) {
    try (DaprPreviewClient client = new DaprClientBuilder().buildPreviewClient()) {
      System.out.println("Sending the following input to LLM: Hello How are you? This is the my number 672-123-4567");

      ConversationInput daprConversationInput = new ConversationInput("Hello How are you? "
              + "This is the my number 672-123-4567");

      // Component name is the name provided in the metadata block of the conversation.yaml file.
      Mono<ConversationResponse> responseMono = client.converse(new ConversationRequest("echo",
              List.of(daprConversationInput))
              .setContextId("contextId")
              .setScrubPii(true).setTemperature(1.1d));
      ConversationResponse response = responseMono.block();
      System.out.printf("Conversation output: %s", response.getConversationOutputs().get(0).getResult());
    } catch (Exception e) {
      throw new RuntimeException(e);
    }
  }
}

Run the DemoConversationAI with the following command.

java -jar target/dapr-java-sdk-examples-exec.jar io.dapr.examples.conversation.DemoConversationAI

Sample output

== APP == Conversation output: Hello How are you? This is the my number <ISBN>

As shown in the output, the number sent to the API is obfuscated and returned as the <ISBN> placeholder. The example above uses an “echo” component for testing, which simply returns the input message. When integrated with LLMs like OpenAI or Claude, you’ll receive meaningful responses instead of echoed input.

Next steps

2 - Getting started with the Dapr client Java SDK

How to get up and running with the Dapr Java SDK

The Dapr client package allows you to interact with other Dapr applications from a Java application.

Prerequisites

Complete initial setup and import the Java SDK into your project

Initializing the client

You can initialize a Dapr client as so:

DaprClient client = new DaprClientBuilder().build();

This will connect to the default Dapr gRPC endpoint localhost:50001. For information about configuring the client using environment variables and system properties, see Properties.
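For instance, the sidecar endpoints can be overridden without any code change; a minimal sketch, where my-app.jar is a hypothetical application jar:

```shell
# Point the SDK at a specific sidecar via environment variables
# (my-app.jar is a hypothetical application jar).
export DAPR_GRPC_ENDPOINT=localhost:50001
export DAPR_HTTP_ENDPOINT=localhost:3500
java -jar my-app.jar
```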

Error Handling

Initially, errors in Dapr followed the Standard gRPC error model. However, to provide more detailed and informative error messages, version 1.13 introduced an enhanced error model aligned with the gRPC Richer error model. In response, the Java SDK extended DaprException to include the error details added in Dapr.

Example of handling the DaprException and consuming the error details when using the Dapr Java SDK:

...
try {
  client.publishEvent("unknown_pubsub", "mytopic", "mydata").block();
} catch (DaprException exception) {
  System.out.println("Dapr exception's error code: " + exception.getErrorCode());
  System.out.println("Dapr exception's message: " + exception.getMessage());
  // DaprException exposes getStatusDetails() with more details about the error from the Dapr runtime.
  System.out.println("Dapr exception's reason: " + exception.getStatusDetails().get(
      DaprErrorDetails.ErrorDetailType.ERROR_INFO,
      "reason",
      TypeRef.STRING));
}
...

Building blocks

The Java SDK allows you to interface with all of the Dapr building blocks.

Invoke a service

import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.domain.HttpExtension;

try (DaprClient client = (new DaprClientBuilder()).build()) {
  // invoke a 'GET' method (HTTP) skipping serialization: /say with a Mono<byte[]> return type
  // for gRPC, pass HttpExtension.NONE instead
  byte[] response = client.invokeMethod(SERVICE_TO_INVOKE, METHOD_TO_INVOKE, "{\"name\":\"World!\"}", HttpExtension.GET, byte[].class).block();

  // invoke a 'POST' method (HTTP) skipping serialization: /say with a Mono<byte[]> return type
  response = client.invokeMethod(SERVICE_TO_INVOKE, METHOD_TO_INVOKE, "{\"id\":\"100\", \"FirstName\":\"Value\", \"LastName\":\"Value\"}", HttpExtension.POST, byte[].class).block();

  System.out.println(new String(response));

  // invoke a 'POST' method (HTTP) with serialization: /employees with a Mono<Employee> return type
  Employee newEmployee = new Employee("Nigel", "Guitarist");
  Employee employeeResponse = client.invokeMethod(SERVICE_TO_INVOKE, "employees", newEmployee, HttpExtension.POST, Employee.class).block();
}

Save & get application state

import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.domain.State;
import reactor.core.publisher.Mono;

try (DaprClient client = (new DaprClientBuilder()).build()) {
  // Save state
  client.saveState(STATE_STORE_NAME, FIRST_KEY_NAME, myClass).block();

  // Get state
  State<MyClass> retrievedMessage = client.getState(STATE_STORE_NAME, FIRST_KEY_NAME, MyClass.class).block();

  // Delete state
  client.deleteState(STATE_STORE_NAME, FIRST_KEY_NAME).block();
}

Publish & subscribe to messages

Publish messages

import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.domain.Metadata;
import static java.util.Collections.singletonMap;

try (DaprClient client = (new DaprClientBuilder()).build()) {
  client.publishEvent(PUBSUB_NAME, TOPIC_NAME, message, singletonMap(Metadata.TTL_IN_SECONDS, MESSAGE_TTL_IN_SECONDS)).block();
}

Subscribe to messages

import com.fasterxml.jackson.databind.ObjectMapper;
import io.dapr.Rule;
import io.dapr.Topic;
import io.dapr.client.domain.BulkSubscribeAppResponse;
import io.dapr.client.domain.BulkSubscribeAppResponseEntry;
import io.dapr.client.domain.BulkSubscribeAppResponseStatus;
import io.dapr.client.domain.BulkSubscribeMessage;
import io.dapr.client.domain.BulkSubscribeMessageEntry;
import io.dapr.client.domain.CloudEvent;
import io.dapr.springboot.annotations.BulkSubscribe;
import java.util.ArrayList;
import java.util.List;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Mono;

@RestController
public class SubscriberController {

  private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();

  @Topic(name = "testingtopic", pubsubName = "${myAppProperty:messagebus}")
  @PostMapping(path = "/testingtopic")
  public Mono<Void> handleMessage(@RequestBody(required = false) CloudEvent<?> cloudEvent) {
    return Mono.fromRunnable(() -> {
      try {
        System.out.println("Subscriber got: " + cloudEvent.getData());
        System.out.println("Subscriber got: " + OBJECT_MAPPER.writeValueAsString(cloudEvent));
      } catch (Exception e) {
        throw new RuntimeException(e);
      }
    });
  }

  @Topic(name = "testingtopic", pubsubName = "${myAppProperty:messagebus}",
          rule = @Rule(match = "event.type == 'myevent.v2'", priority = 1))
  @PostMapping(path = "/testingtopicV2")
  public Mono<Void> handleMessageV2(@RequestBody(required = false) CloudEvent<?> cloudEvent) {
    return Mono.fromRunnable(() -> {
      try {
        System.out.println("Subscriber got: " + cloudEvent.getData());
        System.out.println("Subscriber got: " + OBJECT_MAPPER.writeValueAsString(cloudEvent));
      } catch (Exception e) {
        throw new RuntimeException(e);
      }
    });
  }

  @BulkSubscribe()
  @Topic(name = "testingtopicbulk", pubsubName = "${myAppProperty:messagebus}")
  @PostMapping(path = "/testingtopicbulk")
  public Mono<BulkSubscribeAppResponse> handleBulkMessage(
          @RequestBody(required = false) BulkSubscribeMessage<CloudEvent<String>> bulkMessage) {
    return Mono.fromCallable(() -> {
      if (bulkMessage.getEntries().size() == 0) {
        return new BulkSubscribeAppResponse(new ArrayList<BulkSubscribeAppResponseEntry>());
      }

      System.out.println("Bulk Subscriber received " + bulkMessage.getEntries().size() + " messages.");

      List<BulkSubscribeAppResponseEntry> entries = new ArrayList<BulkSubscribeAppResponseEntry>();
      for (BulkSubscribeMessageEntry<?> entry : bulkMessage.getEntries()) {
        try {
          System.out.printf("Bulk Subscriber message has entry ID: %s\n", entry.getEntryId());
          CloudEvent<?> cloudEvent = (CloudEvent<?>) entry.getEvent();
          System.out.printf("Bulk Subscriber got: %s\n", cloudEvent.getData());
          entries.add(new BulkSubscribeAppResponseEntry(entry.getEntryId(), BulkSubscribeAppResponseStatus.SUCCESS));
        } catch (Exception e) {
          e.printStackTrace();
          entries.add(new BulkSubscribeAppResponseEntry(entry.getEntryId(), BulkSubscribeAppResponseStatus.RETRY));
        }
      }
      return new BulkSubscribeAppResponse(entries);
    });
  }
}

Bulk Publish Messages

Note: API is in Alpha stage

import io.dapr.client.DaprClientBuilder;
import io.dapr.client.DaprPreviewClient;
import io.dapr.client.domain.BulkPublishResponse;
import io.dapr.client.domain.BulkPublishResponseFailedEntry;
import java.util.ArrayList;
import java.util.List;
class Solution {
  public void publishMessages() {
    try (DaprPreviewClient client = (new DaprClientBuilder()).buildPreviewClient()) {
      // Create a list of messages to publish
      List<String> messages = new ArrayList<>();
      for (int i = 0; i < NUM_MESSAGES; i++) {
        String message = String.format("This is message #%d", i);
        messages.add(message);
        System.out.println("Going to publish message : " + message);
      }

      // Publish list of messages using the bulk publish API
      BulkPublishResponse<String> res = client.publishEvents(PUBSUB_NAME, TOPIC_NAME, "text/plain", messages).block();
    }
  }
}

Interact with output bindings

import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;

try (DaprClient client = (new DaprClientBuilder()).build()) {
  // sending a class with message; BINDING_OPERATION="create"
  client.invokeBinding(BINDING_NAME, BINDING_OPERATION, myClass).block();

  // sending a plain string
  client.invokeBinding(BINDING_NAME, BINDING_OPERATION, message).block();
}

Interact with input bindings

import org.springframework.web.bind.annotation.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import reactor.core.publisher.Mono;

@RestController
@RequestMapping("/")
public class CheckoutController {
  private static final Logger log = LoggerFactory.getLogger(CheckoutController.class);

  @PostMapping(path = "/checkout")
  public Mono<String> getCheckout(@RequestBody(required = false) byte[] body) {
    return Mono.fromRunnable(() ->
        log.info("Received Message: " + new String(body)));
  }
}

Retrieve secrets

import com.fasterxml.jackson.databind.ObjectMapper;
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
import java.util.Map;

ObjectMapper JSON_SERIALIZER = new ObjectMapper();
try (DaprClient client = (new DaprClientBuilder()).build()) {
  Map<String, String> secret = client.getSecret(SECRET_STORE_NAME, secretKey).block();
  System.out.println(JSON_SERIALIZER.writeValueAsString(secret));
}

Actors

An actor is an isolated, independent unit of compute and state with single-threaded execution. Dapr provides an actor implementation based on the Virtual Actor pattern, which offers a single-threaded programming model and garbage-collects actors when they are not in use. With Dapr’s implementation, you write your Dapr actors according to the Actor model, and Dapr leverages the scalability and reliability that the underlying platform provides.

import io.dapr.actors.ActorMethod;
import io.dapr.actors.ActorType;
import reactor.core.publisher.Mono;

@ActorType(name = "DemoActor")
public interface DemoActor {

  void registerReminder();

  @ActorMethod(name = "echo_message")
  String say(String something);

  void clock(String message);

  @ActorMethod(returns = Integer.class)
  Mono<Integer> incrementAndGet(int delta);
}

Get & Subscribe to application configurations

Note: this is a preview API and is therefore only accessible via the DaprPreviewClient interface, not the normal DaprClient interface.

import io.dapr.client.DaprClientBuilder;
import io.dapr.client.DaprPreviewClient;
import io.dapr.client.domain.ConfigurationItem;
import io.dapr.client.domain.GetConfigurationRequest;
import io.dapr.client.domain.SubscribeConfigurationRequest;
import io.dapr.client.domain.SubscribeConfigurationResponse;
import io.dapr.client.domain.UnsubscribeConfigurationResponse;
import java.util.Map;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

try (DaprPreviewClient client = (new DaprClientBuilder()).buildPreviewClient()) {
  // Get configuration for a single key
  Mono<ConfigurationItem> item = client.getConfiguration(CONFIG_STORE_NAME, CONFIG_KEY);

  // Get configurations for multiple keys
  Mono<Map<String, ConfigurationItem>> items =
          client.getConfiguration(CONFIG_STORE_NAME, CONFIG_KEY_1, CONFIG_KEY_2);

  // Subscribe to configuration changes
  Flux<SubscribeConfigurationResponse> outFlux = client.subscribeConfiguration(CONFIG_STORE_NAME, CONFIG_KEY_1, CONFIG_KEY_2);
  outFlux.subscribe(configItems -> configItems.forEach(...));

  // Unsubscribe from configuration changes
  Mono<UnsubscribeConfigurationResponse> unsubscribe = client.unsubscribeConfiguration(SUBSCRIPTION_ID, CONFIG_STORE_NAME);
}

Query saved state

Note: this is a preview API and is therefore only accessible via the DaprPreviewClient interface, not the normal DaprClient interface.

import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;
import io.dapr.client.DaprPreviewClient;
import io.dapr.client.domain.QueryStateItem;
import io.dapr.client.domain.QueryStateRequest;
import io.dapr.client.domain.QueryStateResponse;
import io.dapr.client.domain.SaveStateRequest;
import io.dapr.client.domain.State;
import io.dapr.client.domain.query.Query;
import io.dapr.client.domain.query.Sorting;
import io.dapr.client.domain.query.filters.EqFilter;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

try (DaprClient client = builder.build(); DaprPreviewClient previewClient = builder.buildPreviewClient()) {
        String searchVal = args.length == 0 ? "searchValue" : args[0];
        
        // Create JSON data
        Listing first = new Listing();
        first.setPropertyType("apartment");
        first.setId("1000");
        ...
        Listing second = new Listing();
        second.setPropertyType("row-house");
        second.setId("1002");
        ...
        Listing third = new Listing();
        third.setPropertyType("apartment");
        third.setId("1003");
        ...
        Listing fourth = new Listing();
        fourth.setPropertyType("apartment");
        fourth.setId("1001");
        ...
        Map<String, String> meta = new HashMap<>();
        meta.put("contentType", "application/json");

        // Save state
        SaveStateRequest saveRequest = new SaveStateRequest(STATE_STORE_NAME).setStates(
          new State<>("1", first, null, meta, null),
          new State<>("2", second, null, meta, null),
          new State<>("3", third, null, meta, null),
          new State<>("4", fourth, null, meta, null)
        );
        client.saveBulkState(saveRequest).block();
        
        
        // Create query and query state request

        Query query = new Query()
          .setFilter(new EqFilter<>("propertyType", "apartment"))
          .setSort(Arrays.asList(new Sorting("id", Sorting.Order.DESC)));
        QueryStateRequest request = new QueryStateRequest(STATE_STORE_NAME)
          .setQuery(query);

        // Use preview client to call the query state API
        QueryStateResponse<Listing> result = previewClient.queryState(request, Listing.class).block();

        // View query state response
        System.out.println("Found " + result.getResults().size() + " items.");
        for (QueryStateItem<Listing> item : result.getResults()) {
          System.out.println("Key: " + item.getKey());
          System.out.println("Data: " + item.getValue());
        }
}

Distributed lock

package io.dapr.examples.lock.grpc;

import io.dapr.client.DaprClientBuilder;
import io.dapr.client.DaprPreviewClient;
import io.dapr.client.domain.LockRequest;
import io.dapr.client.domain.UnlockRequest;
import io.dapr.client.domain.UnlockResponseStatus;
import reactor.core.publisher.Mono;

public class DistributedLockGrpcClient {
  private static final String LOCK_STORE_NAME = "lockstore";

  /**
   * Executes various methods to exercise the different APIs.
   *
   * @param args arguments
   * @throws Exception throws Exception
   */
  public static void main(String[] args) throws Exception {
    try (DaprPreviewClient client = (new DaprClientBuilder()).buildPreviewClient()) {
      System.out.println("Using preview client...");
      tryLock(client);
      unlock(client);
    }
  }

  /**
   * Trying to get lock.
   *
   * @param client DaprPreviewClient object
   */
  public static void tryLock(DaprPreviewClient client) {
    System.out.println("*******trying to get a free distributed lock********");
    try {
      LockRequest lockRequest = new LockRequest(LOCK_STORE_NAME, "resource1", "owner1", 5);
      Mono<Boolean> result = client.tryLock(lockRequest);
      System.out.println("Lock result -> " + (Boolean.TRUE.equals(result.block()) ? "SUCCESS" : "FAIL"));
    } catch (Exception ex) {
      System.out.println(ex.getMessage());
    }
  }

  /**
   * Unlock a lock.
   *
   * @param client DaprPreviewClient object
   */
  public static void unlock(DaprPreviewClient client) {
    System.out.println("*******unlock a distributed lock********");
    try {
      UnlockRequest unlockRequest = new UnlockRequest(LOCK_STORE_NAME, "resource1", "owner1");
      Mono<UnlockResponseStatus> result = client.unlock(unlockRequest);
      System.out.println("Unlock result -> " + result.block().name());
    } catch (Exception ex) {
      System.out.println(ex.getMessage());
    }
  }
}

Workflow

package io.dapr.examples.workflows;

import io.dapr.workflows.client.DaprWorkflowClient;
import io.dapr.workflows.client.WorkflowInstanceStatus;

import java.time.Duration;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

/**
 * For setup instructions, see the README.
 */
public class DemoWorkflowClient {

  /**
   * The main method.
   *
   * @param args Input arguments (unused).
   * @throws InterruptedException If program has been interrupted.
   */
  public static void main(String[] args) throws InterruptedException {
    DaprWorkflowClient client = new DaprWorkflowClient();

    try (client) {
      String separatorStr = "*******";
      System.out.println(separatorStr);
      String instanceId = client.scheduleNewWorkflow(DemoWorkflow.class, "input data");
      System.out.printf("Started new workflow instance with random ID: %s%n", instanceId);

      System.out.println(separatorStr);
      System.out.println("**GetInstanceMetadata:Running Workflow**");
      WorkflowInstanceStatus workflowMetadata = client.getInstanceState(instanceId, true);
      System.out.printf("Result: %s%n", workflowMetadata);

      System.out.println(separatorStr);
      System.out.println("**WaitForInstanceStart**");
      try {
        WorkflowInstanceStatus waitForInstanceStartResult =
            client.waitForInstanceStart(instanceId, Duration.ofSeconds(60), true);
        System.out.printf("Result: %s%n", waitForInstanceStartResult);
      } catch (TimeoutException ex) {
        System.out.printf("waitForInstanceStart has an exception:%s%n", ex);
      }

      System.out.println(separatorStr);
      System.out.println("**SendExternalMessage**");
      client.raiseEvent(instanceId, "TestEvent", "TestEventPayload");

      System.out.println(separatorStr);
      System.out.println("** Registering parallel Events to be captured by allOf(t1,t2,t3) **");
      client.raiseEvent(instanceId, "event1", "TestEvent 1 Payload");
      client.raiseEvent(instanceId, "event2", "TestEvent 2 Payload");
      client.raiseEvent(instanceId, "event3", "TestEvent 3 Payload");
      System.out.printf("Events raised for workflow with instanceId: %s\n", instanceId);

      System.out.println(separatorStr);
      System.out.println("** Registering Event to be captured by anyOf(t1,t2,t3) **");
      client.raiseEvent(instanceId, "e2", "event 2 Payload");
      System.out.printf("Event raised for workflow with instanceId: %s\n", instanceId);


      System.out.println(separatorStr);
      System.out.println("**WaitForInstanceCompletion**");
      try {
        WorkflowInstanceStatus waitForInstanceCompletionResult =
            client.waitForInstanceCompletion(instanceId, Duration.ofSeconds(60), true);
        System.out.printf("Result: %s%n", waitForInstanceCompletionResult);
      } catch (TimeoutException ex) {
        System.out.printf("waitForInstanceCompletion has an exception:%s%n", ex);
      }

      System.out.println(separatorStr);
      System.out.println("**purgeInstance**");
      boolean purgeResult = client.purgeInstance(instanceId);
      System.out.printf("purgeResult: %s%n", purgeResult);

      System.out.println(separatorStr);
      System.out.println("**raiseEvent**");

      String eventInstanceId = client.scheduleNewWorkflow(DemoWorkflow.class);
      System.out.printf("Started new workflow instance with random ID: %s%n", eventInstanceId);
      client.raiseEvent(eventInstanceId, "TestException", null);
      System.out.printf("Event raised for workflow with instanceId: %s\n", eventInstanceId);

      System.out.println(separatorStr);
      String instanceToTerminateId = "terminateMe";
      client.scheduleNewWorkflow(DemoWorkflow.class, null, instanceToTerminateId);
      System.out.printf("Started new workflow instance with specified ID: %s%n", instanceToTerminateId);

      TimeUnit.SECONDS.sleep(5);
      System.out.println("Terminate this workflow instance manually before the timeout is reached");
      client.terminateWorkflow(instanceToTerminateId, null);
      System.out.println(separatorStr);

      String restartingInstanceId = "restarting";
      client.scheduleNewWorkflow(DemoWorkflow.class, null, restartingInstanceId);
      System.out.printf("Started new workflow instance with ID: %s%n", restartingInstanceId);
      System.out.println("Sleeping 30 seconds to restart the workflow");
      TimeUnit.SECONDS.sleep(30);

      System.out.println("**SendExternalMessage: RestartEvent**");
      client.raiseEvent(restartingInstanceId, "RestartEvent", "RestartEventPayload");

      System.out.println("Sleeping 30 seconds to terminate the eternal workflow");
      TimeUnit.SECONDS.sleep(30);
      client.terminateWorkflow(restartingInstanceId, null);
    }

    System.out.println("Exiting DemoWorkflowClient.");
    System.exit(0);
  }
}

Sidecar APIs

Wait for sidecar

The DaprClient also provides a helper method to wait for the sidecar to become healthy (components only). When using this method, be sure to specify a timeout in milliseconds and block() to wait for the result of a reactive operation.

// Wait for the Dapr sidecar to report healthy before attempting to use Dapr components.
try (DaprClient client = new DaprClientBuilder().build()) {
  System.out.println("Waiting for Dapr sidecar ...");
  client.waitForSidecar(10000).block(); // Specify the timeout in milliseconds
  System.out.println("Dapr sidecar is ready.");

  // Perform Dapr component operations here, i.e. fetching secrets or saving state.
  ...
}

Shutdown the sidecar

try (DaprClient client = new DaprClientBuilder().build()) {
  logger.info("Sending shutdown request.");
  client.shutdown().block();
  logger.info("Ensuring dapr has stopped.");
  ...
}

Learn more about the Dapr Java SDK packages available to add to your Java applications.

For a full list of SDK properties and how to configure them, visit Properties.

2.1 - Properties

SDK-wide properties for configuring the Dapr Java SDK using environment variables and system properties

Properties

The Dapr Java SDK provides a set of global properties that control the behavior of the SDK. These properties can be configured using environment variables or system properties. System properties can be set using the -D flag when running your Java application.

These properties affect the entire SDK, including clients and runtime. They control aspects such as:

  • Sidecar connectivity (endpoints, ports)
  • Security settings (TLS, API tokens)
  • Performance tuning (timeouts, connection pools)
  • Protocol settings (gRPC, HTTP)
  • String encoding
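For instance, these settings can be supplied as system properties on the java command line; a minimal sketch with illustrative values (my-app.jar is a hypothetical application jar):

```shell
# Tune the API timeout and enable gRPC keepalive via system properties.
java -Ddapr.api.timeoutMilliseconds=5000 \
     -Ddapr.grpc.enable.keep.alive=true \
     -jar my-app.jar
```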

Environment Variables

The following environment variables are available for configuring the Dapr Java SDK:

Sidecar Endpoints

When these variables are set, the client will automatically use them to connect to the Dapr sidecar.

  • DAPR_GRPC_ENDPOINT: The gRPC endpoint for the Dapr sidecar (default: localhost:50001)
  • DAPR_HTTP_ENDPOINT: The HTTP endpoint for the Dapr sidecar (default: localhost:3500)
  • DAPR_GRPC_PORT: The gRPC port for the Dapr sidecar; legacy, DAPR_GRPC_ENDPOINT takes precedence (default: 50001)
  • DAPR_HTTP_PORT: The HTTP port for the Dapr sidecar; legacy, DAPR_HTTP_ENDPOINT takes precedence (default: 3500)

API Token

  • DAPR_API_TOKEN: API token for authentication between the app and the Dapr sidecar. This is the same token used by the Dapr runtime for API authentication. For more details, see Dapr API token authentication and Environment variables reference. (default: null)

gRPC Configuration

TLS Settings

For secure gRPC communication, you can configure TLS settings using the following environment variables:

  • DAPR_GRPC_TLS_INSECURE: When set to “true”, enables insecure TLS mode, which still uses TLS but doesn’t verify certificates. This uses InsecureTrustManagerFactory to trust all certificates. Use only for testing or in secure environments. (default: false)
  • DAPR_GRPC_TLS_CA_PATH: Path to the CA certificate file, used for TLS connections to servers with self-signed certificates (default: null)
  • DAPR_GRPC_TLS_CERT_PATH: Path to the TLS certificate file for client authentication (default: null)
  • DAPR_GRPC_TLS_KEY_PATH: Path to the TLS private key file for client authentication (default: null)

Keepalive Settings

Configure gRPC keepalive behavior using these environment variables:

  • DAPR_GRPC_ENABLE_KEEP_ALIVE: Whether to enable gRPC keepalive (default: false)
  • DAPR_GRPC_KEEP_ALIVE_TIME_SECONDS: gRPC keepalive time in seconds (default: 10)
  • DAPR_GRPC_KEEP_ALIVE_TIMEOUT_SECONDS: gRPC keepalive timeout in seconds (default: 5)
  • DAPR_GRPC_KEEP_ALIVE_WITHOUT_CALLS: Whether to keep the gRPC connection alive without calls (default: true)

HTTP Client Configuration

These properties control the behavior of the HTTP client used for communication with the Dapr sidecar:

  • DAPR_HTTP_CLIENT_READ_TIMEOUT_SECONDS: Timeout in seconds for HTTP client read operations; the maximum time to wait for a response from the Dapr sidecar (default: 60)
  • DAPR_HTTP_CLIENT_MAX_REQUESTS: Maximum number of concurrent HTTP requests. Above this limit, requests queue in memory waiting for running calls to complete. (default: 1024)
  • DAPR_HTTP_CLIENT_MAX_IDLE_CONNECTIONS: Maximum number of idle connections in the HTTP connection pool (default: 128)

API Configuration

These properties control the behavior of API calls made through the SDK:

  • DAPR_API_MAX_RETRIES: Maximum number of retries for retriable exceptions when making API calls to the Dapr sidecar (default: 0)
  • DAPR_API_TIMEOUT_MILLISECONDS: Timeout in milliseconds for API calls to the Dapr sidecar; 0 means no timeout (default: 0)

String Encoding

| Environment Variable | Description | Default |
| --- | --- | --- |
| DAPR_STRING_CHARSET | Character set used for string encoding/decoding in the SDK. Must be a valid Java charset name. | UTF-8 |

System Properties

All environment variables can be set as system properties using the -D flag. Here is the complete list of available system properties:

| System Property | Description | Default |
| --- | --- | --- |
| dapr.sidecar.ip | IP address for the Dapr sidecar | localhost |
| dapr.http.port | HTTP port for the Dapr sidecar | 3500 |
| dapr.grpc.port | gRPC port for the Dapr sidecar | 50001 |
| dapr.grpc.tls.cert.path | Path to the gRPC TLS certificate | null |
| dapr.grpc.tls.key.path | Path to the gRPC TLS key | null |
| dapr.grpc.tls.ca.path | Path to the gRPC TLS CA certificate | null |
| dapr.grpc.tls.insecure | Whether to use insecure TLS mode | false |
| dapr.grpc.endpoint | gRPC endpoint for remote sidecar | null |
| dapr.grpc.enable.keep.alive | Whether to enable gRPC keepalive | false |
| dapr.grpc.keep.alive.time.seconds | gRPC keepalive time in seconds | 10 |
| dapr.grpc.keep.alive.timeout.seconds | gRPC keepalive timeout in seconds | 5 |
| dapr.grpc.keep.alive.without.calls | Whether to keep gRPC connection alive without calls | true |
| dapr.http.endpoint | HTTP endpoint for remote sidecar | null |
| dapr.api.maxRetries | Maximum number of retries for API calls | 0 |
| dapr.api.timeoutMilliseconds | Timeout for API calls in milliseconds | 0 |
| dapr.api.token | API token for authentication | null |
| dapr.string.charset | String encoding used in the SDK | UTF-8 |
| dapr.http.client.readTimeoutSeconds | Timeout in seconds for HTTP client reads | 60 |
| dapr.http.client.maxRequests | Maximum number of concurrent HTTP requests | 1024 |
| dapr.http.client.maxIdleConnections | Maximum number of idle HTTP connections | 128 |
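
For example, several of these system properties can be combined on a single command line (the endpoint value below is a placeholder; substitute your own sidecar address):

```shell
# Point the SDK at a remote sidecar, enable gRPC keepalive,
# and cap API calls at 5 seconds.
java -Ddapr.grpc.endpoint=my-dapr-host:50001 \
     -Ddapr.grpc.enable.keep.alive=true \
     -Ddapr.api.timeoutMilliseconds=5000 \
     -jar myapp.jar
```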

Property Resolution Order

Properties are resolved in the following order:

  1. Override values (if provided when creating a Properties instance)
  2. System properties (set via -D)
  3. Environment variables
  4. Default values

The SDK checks each source in order. If a value is invalid for the property type (e.g., non-numeric for a numeric property), the SDK will log a warning and try the next source. For example:

# Invalid boolean value - will be ignored
java -Ddapr.grpc.enable.keep.alive=not-a-boolean -jar myapp.jar

# Valid boolean value - will be used
export DAPR_GRPC_ENABLE_KEEP_ALIVE=false

In this case, the environment variable is used because the system property value is invalid. However, if both values are valid, the system property takes precedence:

# Valid boolean value - will be used
java -Ddapr.grpc.enable.keep.alive=true -jar myapp.jar

# Valid boolean value - will be ignored
export DAPR_GRPC_ENABLE_KEEP_ALIVE=false
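
The precedence rules above (ignoring the validity fallback) can be sketched in plain Java. Note that `ResolutionDemo` and `resolve` are illustrative helpers written for this sketch, not SDK classes:

```java
import java.util.Map;

public class ResolutionDemo {
  // Illustrative lookup mirroring the documented order:
  // overrides -> system properties -> environment variables -> default.
  static String resolve(String sysProp, String envVar, String defaultValue,
                        Map<String, String> overrides, Map<String, String> env) {
    if (overrides.containsKey(sysProp)) {
      return overrides.get(sysProp);
    }
    String fromSystem = System.getProperty(sysProp);
    if (fromSystem != null) {
      return fromSystem;
    }
    String fromEnv = env.get(envVar);
    if (fromEnv != null) {
      return fromEnv;
    }
    return defaultValue;
  }

  public static void main(String[] args) {
    System.setProperty("dapr.grpc.port", "50051");
    // The system property wins over the environment variable and the default.
    String port = resolve("dapr.grpc.port", "DAPR_GRPC_PORT", "50001",
        Map.of(), Map.of("DAPR_GRPC_PORT", "4001"));
    System.out.println(port); // prints 50051
  }
}
```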

Override values can be set using the DaprClientBuilder in two ways:

  1. Using individual property overrides (recommended for most cases):
import io.dapr.config.Properties;

// Set a single property override
DaprClient client = new DaprClientBuilder()
    .withPropertyOverride(Properties.GRPC_ENABLE_KEEP_ALIVE, "true")
    .build();

// Or set multiple property overrides
DaprClient client = new DaprClientBuilder()
    .withPropertyOverride(Properties.GRPC_ENABLE_KEEP_ALIVE, "true")
    .withPropertyOverride(Properties.HTTP_CLIENT_READ_TIMEOUT_SECONDS, "120")
    .build();
  2. Using a Properties instance (useful when you have many properties to set at once):
// Create a map of property overrides
Map<String, String> overrides = new HashMap<>();
overrides.put("dapr.grpc.enable.keep.alive", "true");
overrides.put("dapr.http.client.readTimeoutSeconds", "120");

// Create a Properties instance with overrides
Properties properties = new Properties(overrides);

// Use these properties when creating a client
DaprClient client = new DaprClientBuilder()
    .withProperties(properties)
    .build();

For most use cases, you’ll use system properties or environment variables. Override values are primarily used when you need different property values for different instances of the SDK in the same application.

Proxy Configuration

You can configure proxy settings for your Java application using system properties. These are standard Java system properties that are part of Java’s networking layer (java.net package), not specific to Dapr. They are used by Java’s networking stack, including the HTTP client that Dapr’s SDK uses.

For detailed information about Java’s proxy configuration, including all available properties and their usage, see the Java Networking Properties documentation.

For example, here’s how to configure a proxy:

# Configure HTTP proxy - replace with your actual proxy server details
java -Dhttp.proxyHost=your-proxy-server.com -Dhttp.proxyPort=8080 -jar myapp.jar

# Configure HTTPS proxy - replace with your actual proxy server details
java -Dhttps.proxyHost=your-proxy-server.com -Dhttps.proxyPort=8443 -jar myapp.jar

Replace your-proxy-server.com with your actual proxy server hostname or IP address, and adjust the port numbers to match your proxy server configuration.

These proxy settings will affect all HTTP/HTTPS connections made by your Java application, including connections to the Dapr sidecar.

3 - Jobs

With the Dapr Jobs package, you can interact with the Dapr Jobs APIs from a Java application to trigger future operations to run according to a predefined schedule with an optional payload. To get started, walk through the Dapr Jobs how-to guide.

3.1 - How to: Author and manage Dapr Jobs in the Java SDK

How to get up and running with Jobs using the Dapr Java SDK

As part of this demonstration, we will schedule a Dapr Job that triggers an endpoint registered in the same app. With the provided jobs example, you will schedule a job, retrieve its details, handle its trigger through a callback endpoint, and delete it.

This example uses the default configuration from dapr init in self-hosted mode.

Prerequisites

Set up the environment

Clone the Java SDK repo and navigate into it.

git clone https://github.com/dapr/java-sdk.git
cd java-sdk

Run the following command to install the requirements for running the jobs example with the Dapr Java SDK.

mvn clean install -DskipTests

From the Java SDK root directory, navigate to the examples directory.

cd examples

Run the Dapr sidecar.

dapr run --app-id jobsapp --dapr-grpc-port 51439 --dapr-http-port 3500 --app-port 8080

Now, Dapr is listening for HTTP requests at http://localhost:3500 and for internal Jobs gRPC requests on port 51439.

Schedule and Get a job

In the DemoJobsClient there are steps to schedule a job. Calling scheduleJob using the DaprPreviewClient will schedule a job with the Dapr Runtime.

public class DemoJobsClient {

  /**
   * The main method of this app to schedule and get jobs.
   */
  public static void main(String[] args) throws Exception {
    try (DaprPreviewClient client = new DaprClientBuilder()
        .withPropertyOverride(Properties.GRPC_PORT, "51439")
        .buildPreviewClient()) {

      // Schedule a job.
      System.out.println("**** Scheduling a Job with name dapr-job-1 *****");
      ScheduleJobRequest scheduleJobRequest = new ScheduleJobRequest("dapr-job-1",
          JobSchedule.fromString("* * * * * *")).setData("Hello World!".getBytes());
      client.scheduleJob(scheduleJobRequest).block();

      System.out.println("**** Scheduling job dapr-job-1 completed *****");
    }
  }
}

Call getJob to retrieve the job details that were previously created and scheduled.

client.getJob(new GetJobRequest("dapr-job-1")).block()

Run the DemoJobsClient with the following command.

java -jar target/dapr-java-sdk-examples-exec.jar io.dapr.examples.jobs.DemoJobsClient

Sample output

**** Scheduling a Job with name dapr-job-1 *****
**** Scheduling job dapr-job-1 completed *****
**** Retrieving a Job with name dapr-job-1 *****

Set up an endpoint to be invoked when the job is triggered

The DemoJobsSpringApplication class starts a Spring Boot application that registers the endpoints specified in the JobsController class. This endpoint acts as a callback for the scheduled job requests.

@RestController
public class JobsController {

  /**
   * Handles jobs callback from Dapr.
   *
   * @param jobName name of the job.
   * @param payload data from the job if payload exists.
   * @return Empty Mono.
   */
  @PostMapping("/job/{jobName}")
  public Mono<Void> handleJob(@PathVariable("jobName") String jobName,
                              @RequestBody(required = false) byte[] payload) {
    System.out.println("Job Name: " + jobName);
    System.out.println("Job Payload: " + (payload == null ? "<empty>" : new String(payload)));

    return Mono.empty();
  }
}

Parameters:

  • jobName: The name of the triggered job.
  • payload: Optional payload data associated with the job (as a byte array).

Run the Spring Boot application with the following command.

java -jar target/dapr-java-sdk-examples-exec.jar io.dapr.examples.jobs.DemoJobsSpringApplication

Sample output

Job Name: dapr-job-1
Job Payload: Hello World!
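
With the worker and the Spring Boot app running, Dapr POSTs to /job/{jobName} every time the job fires. You can also exercise the callback by hand; the port (8080) comes from the dapr run command above, and the payload here is arbitrary:

```shell
# Manually simulate a job trigger against the local Spring Boot app.
curl -X POST http://localhost:8080/job/dapr-job-1 \
     -H "Content-Type: text/plain" \
     -d 'Hello World!'
```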

Delete a scheduled job

public class DemoJobsClient {

  /**
   * The main method of this app deletes a job that was previously scheduled.
   */
  public static void main(String[] args) throws Exception {
    try (DaprPreviewClient client = new DaprClientBuilder().buildPreviewClient()) {

      // Delete a job.
      System.out.println("**** Delete a Job with name dapr-job-1 *****");
      client.deleteJob(new DeleteJobRequest("dapr-job-1")).block();
    }
  }
}

Next steps

4 - Workflow

How to get up and running with the Dapr Workflow extension

4.1 - How to: Author and manage Dapr Workflow in the Java SDK

How to get up and running with workflows using the Dapr Java SDK

Let’s create a Dapr workflow and invoke it using the console. With the provided workflow example, you will register and start a workflow runtime, then use the workflow client to schedule, query, raise events to, terminate, and purge workflow instances.

This example uses the default configuration from dapr init in self-hosted mode.

Prerequisites

  • Verify you’re using the latest proto bindings

Set up the environment

Clone the Java SDK repo and navigate into it.

git clone https://github.com/dapr/java-sdk.git
cd java-sdk

Run the following command to install the requirements for running this workflow sample with the Dapr Java SDK.

mvn clean install

From the Java SDK root directory, navigate to the Dapr Workflow example.

cd examples

Run the DemoWorkflowWorker

The DemoWorkflowWorker class registers an implementation of DemoWorkflow in Dapr’s workflow runtime engine. In the DemoWorkflowWorker.java file, you can find the DemoWorkflowWorker class and the main method:

public class DemoWorkflowWorker {

  public static void main(String[] args) throws Exception {
    // Register the Workflow with the runtime.
    WorkflowRuntime.getInstance().registerWorkflow(DemoWorkflow.class);
    System.out.println("Start workflow runtime");
    WorkflowRuntime.getInstance().startAndBlock();
    System.exit(0);
  }
}

In the code above:

  • WorkflowRuntime.getInstance().registerWorkflow() registers DemoWorkflow as a workflow in the Dapr Workflow runtime.
  • WorkflowRuntime.getInstance().startAndBlock() builds and starts the engine within the Dapr Workflow runtime and blocks until the runtime is shut down.
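
The DemoWorkflow implementation itself lives in the examples module. As a rough sketch (not the full example), a workflow extends Workflow and returns its orchestration logic from create(); the context methods used below are assumptions based on the SDK's workflow API:

```java
import io.dapr.workflows.Workflow;
import io.dapr.workflows.WorkflowStub;

import java.time.Duration;

// Simplified sketch of a workflow; the real DemoWorkflow in the examples
// module also fans out parallel tasks, timers, and anyOf/allOf combinators.
public class SketchWorkflow extends Workflow {
  @Override
  public WorkflowStub create() {
    return ctx -> {
      ctx.getLogger().info("Starting Workflow: " + ctx.getName());
      // Block until an external event named "TestEvent" arrives (or time out).
      String payload = ctx.waitForExternalEvent("TestEvent",
          Duration.ofSeconds(10), String.class).await();
      ctx.complete(payload);
    };
  }
}
```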

In the terminal, execute the following command to kick off the DemoWorkflowWorker:

dapr run --app-id demoworkflowworker --resources-path ./components/workflows --dapr-grpc-port 50001 -- java -jar target/dapr-java-sdk-examples-exec.jar io.dapr.examples.workflows.DemoWorkflowWorker

Expected output

You're up and running! Both Dapr and your app logs will appear here.

...

== APP == Start workflow runtime
== APP == Sep 13, 2023 9:02:03 AM com.microsoft.durabletask.DurableTaskGrpcWorker startAndBlock
== APP == INFO: Durable Task worker is connecting to sidecar at 127.0.0.1:50001.

Run the DemoWorkflowClient

The DemoWorkflowClient starts instances of workflows that have been registered with Dapr.

public class DemoWorkflowClient {

  // ...
  public static void main(String[] args) throws InterruptedException {
    DaprWorkflowClient client = new DaprWorkflowClient();

    try (client) {
      String separatorStr = "*******";
      System.out.println(separatorStr);
      String instanceId = client.scheduleNewWorkflow(DemoWorkflow.class, "input data");
      System.out.printf("Started new workflow instance with random ID: %s%n", instanceId);

      System.out.println(separatorStr);
      System.out.println("**GetInstanceMetadata:Running Workflow**");
      WorkflowInstanceStatus workflowMetadata = client.getInstanceState(instanceId, true);
      System.out.printf("Result: %s%n", workflowMetadata);

      System.out.println(separatorStr);
      System.out.println("**WaitForInstanceStart**");
      try {
        WorkflowInstanceStatus waitForInstanceStartResult =
            client.waitForInstanceStart(instanceId, Duration.ofSeconds(60), true);
        System.out.printf("Result: %s%n", waitForInstanceStartResult);
      } catch (TimeoutException ex) {
        System.out.printf("waitForInstanceStart has an exception:%s%n", ex);
      }

      System.out.println(separatorStr);
      System.out.println("**SendExternalMessage**");
      client.raiseEvent(instanceId, "TestEvent", "TestEventPayload");

      System.out.println(separatorStr);
      System.out.println("** Registering parallel Events to be captured by allOf(t1,t2,t3) **");
      client.raiseEvent(instanceId, "event1", "TestEvent 1 Payload");
      client.raiseEvent(instanceId, "event2", "TestEvent 2 Payload");
      client.raiseEvent(instanceId, "event3", "TestEvent 3 Payload");
      System.out.printf("Events raised for workflow with instanceId: %s\n", instanceId);

      System.out.println(separatorStr);
      System.out.println("** Registering Event to be captured by anyOf(t1,t2,t3) **");
      client.raiseEvent(instanceId, "e2", "event 2 Payload");
      System.out.printf("Event raised for workflow with instanceId: %s\n", instanceId);


      System.out.println(separatorStr);
      System.out.println("**WaitForInstanceCompletion**");
      try {
        WorkflowInstanceStatus waitForInstanceCompletionResult =
            client.waitForInstanceCompletion(instanceId, Duration.ofSeconds(60), true);
        System.out.printf("Result: %s%n", waitForInstanceCompletionResult);
      } catch (TimeoutException ex) {
        System.out.printf("waitForInstanceCompletion has an exception:%s%n", ex);
      }

      System.out.println(separatorStr);
      System.out.println("**purgeInstance**");
      boolean purgeResult = client.purgeInstance(instanceId);
      System.out.printf("purgeResult: %s%n", purgeResult);

      System.out.println(separatorStr);
      System.out.println("**raiseEvent**");

      String eventInstanceId = client.scheduleNewWorkflow(DemoWorkflow.class);
      System.out.printf("Started new workflow instance with random ID: %s%n", eventInstanceId);
      client.raiseEvent(eventInstanceId, "TestException", null);
      System.out.printf("Event raised for workflow with instanceId: %s\n", eventInstanceId);

      System.out.println(separatorStr);
      String instanceToTerminateId = "terminateMe";
      client.scheduleNewWorkflow(DemoWorkflow.class, null, instanceToTerminateId);
      System.out.printf("Started new workflow instance with specified ID: %s%n", instanceToTerminateId);

      TimeUnit.SECONDS.sleep(5);
      System.out.println("Terminate this workflow instance manually before the timeout is reached");
      client.terminateWorkflow(instanceToTerminateId, null);
      System.out.println(separatorStr);

      String restartingInstanceId = "restarting";
      client.scheduleNewWorkflow(DemoWorkflow.class, null, restartingInstanceId);
      System.out.printf("Started new  workflow instance with ID: %s%n", restartingInstanceId);
      System.out.println("Sleeping 30 seconds to restart the workflow");
      TimeUnit.SECONDS.sleep(30);

      System.out.println("**SendExternalMessage: RestartEvent**");
      client.raiseEvent(restartingInstanceId, "RestartEvent", "RestartEventPayload");

      System.out.println("Sleeping 30 seconds to terminate the eternal workflow");
      TimeUnit.SECONDS.sleep(30);
      client.terminateWorkflow(restartingInstanceId, null);
    }

    System.out.println("Exiting DemoWorkflowClient.");
    System.exit(0);
  }
}

In a second terminal window, start the workflow by running the following command:

java -jar target/dapr-java-sdk-examples-exec.jar io.dapr.examples.workflows.DemoWorkflowClient

Expected output

*******
Started new workflow instance with random ID: 0b4cc0d5-413a-4c1c-816a-a71fa24740d4
*******
**GetInstanceMetadata:Running Workflow**
Result: [Name: 'io.dapr.examples.workflows.DemoWorkflow', ID: '0b4cc0d5-413a-4c1c-816a-a71fa24740d4', RuntimeStatus: RUNNING, CreatedAt: 2023-09-13T13:02:30.547Z, LastUpdatedAt: 2023-09-13T13:02:30.699Z, Input: '"input data"', Output: '']
*******
**WaitForInstanceStart**
Result: [Name: 'io.dapr.examples.workflows.DemoWorkflow', ID: '0b4cc0d5-413a-4c1c-816a-a71fa24740d4', RuntimeStatus: RUNNING, CreatedAt: 2023-09-13T13:02:30.547Z, LastUpdatedAt: 2023-09-13T13:02:30.699Z, Input: '"input data"', Output: '']
*******
**SendExternalMessage**
*******
** Registering parallel Events to be captured by allOf(t1,t2,t3) **
Events raised for workflow with instanceId: 0b4cc0d5-413a-4c1c-816a-a71fa24740d4
*******
** Registering Event to be captured by anyOf(t1,t2,t3) **
Event raised for workflow with instanceId: 0b4cc0d5-413a-4c1c-816a-a71fa24740d4
*******
**WaitForInstanceCompletion**
Result: [Name: 'io.dapr.examples.workflows.DemoWorkflow', ID: '0b4cc0d5-413a-4c1c-816a-a71fa24740d4', RuntimeStatus: FAILED, CreatedAt: 2023-09-13T13:02:30.547Z, LastUpdatedAt: 2023-09-13T13:02:55.054Z, Input: '"input data"', Output: '']
*******
**purgeInstance**
purgeResult: true
*******
**raiseEvent**
Started new workflow instance with random ID: 7707d141-ebd0-4e54-816e-703cb7a52747
Event raised for workflow with instanceId: 7707d141-ebd0-4e54-816e-703cb7a52747
*******
Started new workflow instance with specified ID: terminateMe
Terminate this workflow instance manually before the timeout is reached
*******
Started new  workflow instance with ID: restarting
Sleeping 30 seconds to restart the workflow
**SendExternalMessage: RestartEvent**
Sleeping 30 seconds to terminate the eternal workflow
Exiting DemoWorkflowClient.

What happened?

  1. When you ran dapr run, the workflow worker registered the workflow (DemoWorkflow) and its activities with the Dapr Workflow engine.
  2. When you ran java, the workflow client started the workflow instance with the following activities. You can follow along with the output in the terminal where you ran dapr run.
    1. The workflow is started, raises three parallel tasks, and waits for them to complete.
    2. The workflow client calls the activity and sends the “Hello Activity” message to the console.
    3. The workflow times out and is purged.
    4. The workflow client starts a new workflow instance with a random ID, uses another workflow instance called terminateMe to terminate it, and restarts it with the workflow called restarting.
    5. The workflow client then exits.

Next steps

5 - Getting started with Dapr and Spring Boot

How to get started with Dapr and Spring Boot

By combining Dapr and Spring Boot, we can create infrastructure-independent Java applications that can be deployed across different environments, supporting a wide range of on-premises and cloud provider services.

First, we will start with a simple integration covering the DaprClient and the Testcontainers integration, then use the Spring and Spring Boot mechanisms and programming model to leverage the Dapr APIs under the hood. This helps teams remove dependencies such as clients and drivers required to connect to environment-specific infrastructure (databases, key-value stores, message brokers, configuration/secret stores, etc.).

Adding the Dapr and Spring Boot integration to your project

If you already have a Spring Boot application, you can directly add the following dependencies to your project:

<dependency>
  <groupId>io.dapr.spring</groupId>
  <artifactId>dapr-spring-boot-starter</artifactId>
  <version>0.x.x</version> <!-- see below for the latest version -->
</dependency>
<dependency>
  <groupId>io.dapr.spring</groupId>
  <artifactId>dapr-spring-boot-starter-test</artifactId>
  <version>0.x.x</version> <!-- see below for the latest version -->
  <scope>test</scope>
</dependency>

You can find the latest released version here.

By adding these dependencies, you can:

  • Autowire a DaprClient to use inside your applications
  • Use the Spring Data and Messaging abstractions and programming model that uses the Dapr APIs under the hood
  • Improve your inner-development loop by relying on Testcontainers to bootstrap Dapr Control plane services and default components

Once these dependencies are in your application, you can rely on Spring Boot autoconfiguration to autowire a DaprClient instance:

@Autowired
private DaprClient daprClient;

This will connect to the default Dapr gRPC endpoint localhost:50001, requiring you to start Dapr outside of your application.

You can use the DaprClient to interact with the Dapr APIs anywhere in your application, for example from inside a REST endpoint:

@RestController
public class DemoRestController {
  @Autowired
  private DaprClient daprClient;

  @PostMapping("/store")
  public void storeOrder(@RequestBody Order order){
    daprClient.saveState("kvstore", order.orderId(), order).block();
  }
}

record Order(String orderId, Integer amount){}
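
The same client can read the state back. The endpoint below is a hypothetical addition for illustration; it uses DaprClient.getState with the kvstore component from the example above:

```java
@GetMapping("/store/{orderId}")
public Order getOrder(@PathVariable String orderId) {
  // getState returns a Mono<State<Order>>; block for simplicity in this sketch.
  return daprClient.getState("kvstore", orderId, Order.class)
      .block()
      .getValue();
}
```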

If you want to avoid managing Dapr outside of your Spring Boot application, you can rely on Testcontainers to bootstrap Dapr beside your application for development purposes. To do this we can create a test configuration that uses Testcontainers to bootstrap all we need to develop our applications using the Dapr APIs.

Using Testcontainers and Dapr integrations, we let the @TestConfiguration bootstrap Dapr for our applications. Notice that for this example, we are configuring Dapr with a Statestore component called kvstore that connects to an instance of PostgreSQL also bootstrapped by Testcontainers.

@TestConfiguration(proxyBeanMethods = false)
public class DaprTestContainersConfig {
  @Bean
  @ServiceConnection
  public DaprContainer daprContainer(Network daprNetwork, PostgreSQLContainer<?> postgreSQLContainer){
    
    return new DaprContainer("daprio/daprd:1.15.4")
            .withAppName("producer-app")
            .withNetwork(daprNetwork)
            .withComponent(new Component("kvstore", "state.postgresql", "v1", STATE_STORE_PROPERTIES))
            .withComponent(new Component("kvbinding", "bindings.postgresql", "v1", BINDING_PROPERTIES))
            .dependsOn(postgreSQLContainer);
  }
}

Inside the test classpath you can add a new Spring Boot Application that uses this configuration for tests:

@SpringBootApplication
public class TestProducerApplication {

  public static void main(String[] args) {

    SpringApplication
            .from(ProducerApplication::main)
            .with(DaprTestContainersConfig.class)
            .run(args);
  }
  
}

Now you can start your application with:

mvn spring-boot:test-run

Running this command will start the application, using the provided test configuration that includes the Testcontainers and Dapr integration. In the logs you should be able to see that the daprd and the placement service containers were started for your application.

Beyond the previous configuration (DaprTestContainersConfig), your tests shouldn’t test Dapr itself, just the REST endpoints that your application exposes.

Leveraging Spring & Spring Boot programming model with Dapr

The Java SDK allows you to interface with all of the Dapr building blocks. But if you want to leverage the Spring and Spring Boot programming model you can use the dapr-spring-boot-starter integration. This includes implementations of Spring Data (KeyValueTemplate and CrudRepository) as well as a DaprMessagingTemplate for producing and consuming messages (similar to Spring Kafka, Spring Pulsar and Spring AMQP for RabbitMQ) and Dapr workflows.

Using Spring Data CrudRepository and KeyValueTemplate

You can use well known Spring Data constructs relying on a Dapr-based implementation. With Dapr, you don’t need to add any infrastructure-related driver or client, making your Spring application lighter and decoupled from the environment where it is running.

Under the hood these implementations use the Dapr Statestore and Binding APIs.

Configuration parameters

With Spring Data abstractions you can configure which statestore and bindings will be used by Dapr to connect to the available infrastructure. This can be done by setting the following properties:

dapr.statestore.name=kvstore
dapr.statestore.binding=kvbinding

Then you can @Autowire a KeyValueTemplate or a CrudRepository like this:

@RestController
@EnableDaprRepositories
public class OrdersRestController {
  @Autowired
  private OrderRepository repository;
  
  @PostMapping("/orders")
  public void storeOrder(@RequestBody Order order){
    repository.save(order);
  }

  @GetMapping("/orders")
  public Iterable<Order> getAll(){
    return repository.findAll();
  }


}

Where OrderRepository is defined in an interface that extends the Spring Data CrudRepository interface:

public interface OrderRepository extends CrudRepository<Order, String> {}

Notice that the @EnableDaprRepositories annotation does all the magic of wiring the Dapr APIs under the CrudRepository interface. Because Dapr allows users to interact with different StateStores from the same application, you need to provide the following beans as a Spring Boot @Configuration:

@Configuration
@EnableConfigurationProperties({DaprStateStoreProperties.class})
public class ProducerAppConfiguration {
  
  @Bean
  public KeyValueAdapterResolver keyValueAdapterResolver(DaprClient daprClient, ObjectMapper mapper, DaprStateStoreProperties daprStatestoreProperties) {
    String storeName = daprStatestoreProperties.getName();
    String bindingName = daprStatestoreProperties.getBinding();

    return new DaprKeyValueAdapterResolver(daprClient, mapper, storeName, bindingName);
  }

  @Bean
  public DaprKeyValueTemplate daprKeyValueTemplate(KeyValueAdapterResolver keyValueAdapterResolver) {
    return new DaprKeyValueTemplate(keyValueAdapterResolver);
  }
  
}

Using Spring Messaging for producing and consuming events

Similar to Spring Kafka, Spring Pulsar and Spring AMQP you can use the DaprMessagingTemplate to publish messages to the configured infrastructure. To consume messages you can use the @Topic annotation (soon to be renamed to @DaprListener).

To publish events/messages, you can autowire the DaprMessagingTemplate in your Spring application. In this example, we publish Order events to a topic named topic.

@Autowired
private DaprMessagingTemplate<Order> messagingTemplate;

@PostMapping("/orders")
public void storeOrder(@RequestBody Order order){
  repository.save(order);
  messagingTemplate.send("topic", order);
}

Similarly to the CrudRepository, we need to specify which PubSub broker we want to use to publish and consume our messages.

dapr.pubsub.name=pubsub

Because with Dapr you can connect to multiple PubSub brokers you need to provide the following bean to let Dapr know which PubSub broker your DaprMessagingTemplate will use:

@Bean
public DaprMessagingTemplate<Order> messagingTemplate(DaprClient daprClient,
                                                             DaprPubSubProperties daprPubSubProperties) {
  return new DaprMessagingTemplate<>(daprClient, daprPubSubProperties.getName());
}

Finally, because Dapr PubSub requires a bidirectional connection between your application and Dapr you need to expand your Testcontainers configuration with a few parameters:

@Bean
@ServiceConnection
public DaprContainer daprContainer(Network daprNetwork, PostgreSQLContainer<?> postgreSQLContainer, RabbitMQContainer rabbitMQContainer){
    
    return new DaprContainer("daprio/daprd:1.15.4")
            .withAppName("producer-app")
            .withNetwork(daprNetwork)
            .withComponent(new Component("kvstore", "state.postgresql", "v1", STATE_STORE_PROPERTIES))
            .withComponent(new Component("kvbinding", "bindings.postgresql", "v1", BINDING_PROPERTIES))
            .withComponent(new Component("pubsub", "pubsub.rabbitmq", "v1", rabbitMqProperties))
            .withAppPort(8080)
            .withAppChannelAddress("host.testcontainers.internal")
            .dependsOn(rabbitMQContainer)
            .dependsOn(postgreSQLContainer);
}

Now, in the Dapr configuration we have included a pubsub component that will connect to an instance of RabbitMQ started by Testcontainers. We have also set two important parameters, .withAppPort(8080) and .withAppChannelAddress("host.testcontainers.internal"), which allow Dapr to call back into the application when a message is published to the broker.

To listen for events/messages, you need to expose an endpoint in the application that is responsible for receiving the messages. If you expose a REST endpoint, you can use the @Topic annotation to let Dapr know where it needs to forward the events/messages to:

@PostMapping("subscribe")
@Topic(pubsubName = "pubsub", name = "topic")
public void subscribe(@RequestBody CloudEvent<Order> cloudEvent){
    events.add(cloudEvent);
}

Upon bootstrapping your application, Dapr registers the subscription, and incoming messages are forwarded to the subscribe endpoint exposed by your application.

If you are writing tests for these subscribers you need to ensure that Testcontainers knows that your application will be running on port 8080, so containers started with Testcontainers know where your application is:

@BeforeAll
public static void setup(){
  org.testcontainers.Testcontainers.exposeHostPorts(8080);
}

You can check and run the full example source code here.

Using Dapr Workflows with Spring Boot

Following the same approach that we used for Spring Data and Spring Messaging, the dapr-spring-boot-starter brings Dapr Workflow integration for Spring Boot users.

To work with Dapr Workflows, you need to define and implement your workflows using code. The Dapr Spring Boot Starter makes your life easier by managing Workflow and WorkflowActivity implementations as Spring beans.

In order to enable the automatic bean discovery you can annotate your @SpringBootApplication with the @EnableDaprWorkflows annotation:

@SpringBootApplication
@EnableDaprWorkflows
public class MySpringBootApplication {}

By adding this annotation, all the WorkflowActivity beans will be automatically managed by Spring and registered with the workflow engine.

By having all WorkflowActivity implementations as managed beans, we can use Spring's @Autowired mechanism to inject any bean that a workflow activity might need to implement its functionality, for example a RestTemplate:

public class MyWorkflowActivity implements WorkflowActivity {

  @Autowired
  private RestTemplate restTemplate;

You can also autowire the DaprWorkflowClient to create new instances of your workflows.

@Autowired
private DaprWorkflowClient daprWorkflowClient;

This enables applications to schedule new workflow instances and raise events.

String instanceId = daprWorkflowClient.scheduleNewWorkflow(MyWorkflow.class, payload);

and

daprWorkflowClient.raiseEvent(instanceId, "MyEvent", event);

Check the Dapr Workflow documentation for more information about how to work with Dapr Workflows.

Next steps

Learn more about the Dapr Java SDK packages available to add to your Java applications.