Now I will share a working service example of how to use the event sourcing toolkit starter in practice. In this example I will show the following:
- How to configure and use the event sourcing starter with a Spring Boot web application
- How to implement your aggregate entity using the API of the toolkit
- How to define your entity flow using the execution flow API
- How to configure your entity
- How to configure your Akka system with Spring Boot
- How to call your aggregates from your service and connect them to your DDD service REST API
- How to use Google Protobuf to serialize your events instead of Java serialization
- How to use Apache Ignite as your persistence event store with Akka persistence
- Part 4 will cover the summary, possible designs and some special remarks
How to configure and use the event sourcing starter with a Spring Boot web application:
In your Spring Boot app, add the event sourcing toolkit Maven dependency:
<dependency>
    <groupId>spring-akka-event-sourcing</groupId>
    <artifactId>springboot-akka-event-sourcing-starter</artifactId>
    <version>1.0</version>
</dependency>
How to implement your aggregate entity (OrderManager) using the API of the toolkit and define its flow using the toolkit DSL abstraction
Your order aggregate (OrderManager) flow implementation will be as follows:
The order manager aggregate class extends the toolkit's persistent entity class and defines the flow logic for command and event handlers inside your custom entity, using the flow execution DSL exposed by the persistent entity.
The code of the order manager class, with documentation of the flow DSL, is on GitHub: OrderManager Java Code
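For orientation only, the flow DSL wraps the command handler / event handler split that you would otherwise wire by hand with plain Akka persistence. The sketch below is a minimal plain-Akka persistent actor, not the OrderManager or the toolkit's API; the class and message names are illustrative:

import akka.persistence.AbstractPersistentActor;

// Minimal plain-Akka persistent actor, shown only to illustrate the
// command-handler / event-handler split that the toolkit's flow DSL wraps.
// Class and message names are illustrative, not the toolkit's API.
public class PlainOrderActor extends AbstractPersistentActor {

    // illustrative command and event types
    public static final class CreateOrder {
        public final String orderId;
        public CreateOrder(String orderId) { this.orderId = orderId; }
    }

    public static final class OrderCreated {
        public final String orderId;
        public OrderCreated(String orderId) { this.orderId = orderId; }
    }

    private String currentOrderId; // the aggregate state

    @Override
    public String persistenceId() {
        return "plain-order-" + getSelf().path().name();
    }

    @Override
    public Receive createReceive() {
        // command handler: validate, persist an event, then apply it to the state
        return receiveBuilder()
                .match(CreateOrder.class, cmd ->
                        persist(new OrderCreated(cmd.orderId), evt -> {
                            applyEvent(evt);
                            getSender().tell("order created", getSelf());
                        }))
                .build();
    }

    @Override
    public Receive createReceiveRecover() {
        // event handler: replayed on recovery to rebuild the state
        return receiveBuilder()
                .match(OrderCreated.class, this::applyEvent)
                .build();
    }

    private void applyEvent(OrderCreated evt) {
        this.currentOrderId = evt.orderId;
    }
}

The toolkit's DSL lets you declare the same command and event handlers as a flow instead of building the receive blocks yourself.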
How to configure your persistent aggregate entity via the toolkit API:
As explained before, you just need to implement the PersistentEntityProperties interface and the toolkit will auto-discover it for your entity's cluster sharding and persistence configuration. The code reference for the config in the working sample is: The entity configuration
How to configure your Akka system with Spring Boot
You just need to add a reference to your Akka system config file in your Spring Boot application config file (application.yml) with the proper property names, and the toolkit will pick it up:
spring:
  jackson:
    default-property-inclusion: non_null

akka:
  config: eventSourcing.conf
  system-name: orderManagerSystem
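The referenced eventSourcing.conf is a standard Akka HOCON file. A minimal sketch, assuming a single local node, might look like the following (hostname, port and seed node are placeholders; the system name must match the one in application.yml):

akka {
  actor {
    provider = "cluster"              # cluster sharding requires the cluster actor provider
  }
  remote {
    netty.tcp {
      hostname = "127.0.0.1"          # placeholder
      port = 2551                     # placeholder
    }
  }
  cluster {
    seed-nodes = ["akka.tcp://orderManagerSystem@127.0.0.1:2551"]
  }
  # persistence plugin keys are shown in the Apache Ignite section below
}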
How to call your aggregates from your service and connect that to your DDD service REST API
- First, implement your order broker service, which holds a reference to the PersistentEntityBroker provided by the toolkit to abstract the cluster sharding lookup for your entity. The order broker is here: Order Broker
- Then use the non-blocking asynchronous PatternsCS ask to call the target entity obtained from the PersistentEntityBroker. A code snippet to show this:
/**
 * Get order state service API
 *
 * @param getOrderStatusCmd get order state command
 * @return order state
 */
public CompletableFuture<OrderState> getOrderStatus(OrderCmd.GetOrderStatusCmd getOrderStatusCmd) {
    return PatternsCS.ask(getOrderEntity(), getOrderStatusCmd, timeout).toCompletableFuture()
            .thenApply(handleGetState);
}

/**
 * @return persistent entity actor reference based on Akka cluster sharding
 */
private ActorRef getOrderEntity() {
    return persistentEntityBroker.findPersistentEntity(OrderManager.class);
}
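The snippet above references a timeout and a handleGetState function that live elsewhere in the broker class; roughly, they could be defined like this (illustrative, not copied from the sample):

// Illustrative field definitions assumed by the snippet above
// (imports: akka.util.Timeout, java.util.concurrent.TimeUnit, java.util.function.Function)
private final Timeout timeout = new Timeout(5, TimeUnit.SECONDS);

// maps the untyped ask reply back to the expected state type
private final Function<Object, OrderState> handleGetState = reply -> (OrderState) reply;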
- Then, from your REST API resource, you can call your broker to invoke the target command or query in an async, non-blocking way as well. The REST API class reference is here (OrderRestController); a small code snippet to show how it is done:
/**
 * The main order domain REST API
 *
 * @author romeh
 */
@RestController
@RequestMapping("/orders")
@Api(value = "Order Manager REST API demo")
public class OrderRestController {

    @Autowired
    private OrdersBroker ordersBroker;

    /**
     * @param orderRequest JSON order request
     * @return async generic JSON response
     */
    @RequestMapping(method = RequestMethod.POST)
    public CompletableFuture<Response> createOrder(@RequestBody @Valid OrderRequest orderRequest) {
        return ordersBroker.createOrder(new OrderCmd.CreateCmd(UUID.randomUUID().toString(), orderRequest.getOrderDetails()));
    }
}
- When you run the app, there is runtime Swagger documentation and testing for the application's REST APIs at http://localhost:9595/swagger-ui.html, where you can test order create, validate, sign, and state query.
How to use Protobuf to serialize your events instead of Java serialization
As we know, Java serialization is not optimal for performance, so here I also show how you can use the Protobuf protocol to serialize the events, as it is more efficient. The implementation points to check:
- You need to implement SerializerWithStringManifest from Akka serialization, which in our application is OrderManagerSerializer; it uses the protobuf builder classes generated from the file below (a sketch of such a serializer is shown after the plugin configuration below).
- The protobuf definition for the event classes is in the proto folder of the project: EventsAndCommands.proto
- Add the needed Maven build plugins to generate the code from the schema definition above; the plugin configuration will be as follows:
<plugin>
    <groupId>org.xolstice.maven.plugins</groupId>
    <artifactId>protobuf-maven-plugin</artifactId>
    <version>0.5.1</version>
    <configuration>
        <protocExecutable>/usr/local/opt/protobuf/bin/protoc</protocExecutable>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>compile</goal>
                <goal>test-compile</goal>
            </goals>
            <configuration>
                <protocExecutable>/usr/local/opt/protobuf/bin/protoc</protocExecutable>
            </configuration>
        </execution>
    </executions>
</plugin>
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>build-helper-maven-plugin</artifactId>
    <executions>
        <execution>
            <id>add-protobuf-generate-sources</id>
            <phase>generate-sources</phase>
            <goals>
                <goal>add-source</goal>
            </goals>
            <configuration>
                <sources>
                    <source>target/generated-sources/protobuf/java</source>
                </sources>
            </configuration>
        </execution>
        <execution>
            <id>add-protobuf-generate-test-sources</id>
            <phase>generate-sources</phase>
            <goals>
                <goal>add-test-source</goal>
            </goals>
            <configuration>
                <sources>
                    <source>target/generated-test-sources/protobuf/java</source>
                </sources>
            </configuration>
        </execution>
    </executions>
</plugin>
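For reference, the serializer mentioned above follows the standard Akka SerializerWithStringManifest contract. The code below is only a sketch: OrderCreatedEvent and the generated Events.OrderCreated protobuf class are stand-in names, not the exact classes used in the sample project.

import akka.serialization.SerializerWithStringManifest;

// Sketch of a protobuf-based Akka serializer. OrderCreatedEvent and the
// generated protobuf message Events.OrderCreated are stand-ins for the
// classes generated from EventsAndCommands.proto.
public class OrderManagerSerializerSketch extends SerializerWithStringManifest {

    private static final String ORDER_CREATED_MANIFEST = "orderCreated";

    @Override
    public int identifier() {
        // must be unique per serializer in the actor system (low ids are reserved by Akka)
        return 584287551;
    }

    @Override
    public String manifest(Object o) {
        if (o instanceof OrderCreatedEvent) {
            return ORDER_CREATED_MANIFEST;
        }
        throw new IllegalArgumentException("Unknown type: " + o.getClass());
    }

    @Override
    public byte[] toBinary(Object o) {
        if (o instanceof OrderCreatedEvent) {
            OrderCreatedEvent event = (OrderCreatedEvent) o;
            // build the generated protobuf message from the domain event
            return Events.OrderCreated.newBuilder()
                    .setOrderId(event.getOrderId())
                    .build()
                    .toByteArray();
        }
        throw new IllegalArgumentException("Cannot serialize: " + o.getClass());
    }

    @Override
    public Object fromBinary(byte[] bytes, String manifest) {
        try {
            if (ORDER_CREATED_MANIFEST.equals(manifest)) {
                // rebuild the domain event from the protobuf message
                Events.OrderCreated proto = Events.OrderCreated.parseFrom(bytes);
                return new OrderCreatedEvent(proto.getOrderId());
            }
            throw new IllegalArgumentException("Unknown manifest: " + manifest);
        } catch (com.google.protobuf.InvalidProtocolBufferException e) {
            throw new IllegalArgumentException(e);
        }
    }
}

The serializer is then bound to the event classes in the Akka configuration via akka.actor.serializers and akka.actor.serialization-bindings.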
The usage of Apache Ignite as your persistence event store with Akka persistence:
Here I am going to use a custom Akka persistence plugin for Apache Ignite that I created before: https://github.com/Romeh/akka-persistance-ignite . You can check it out for more technical details.
So I just added the needed Maven dependency plus the needed Apache Ignite grid configuration for Akka persistence, and that is it. Now when you build and run the application, it will start an Apache Ignite server node as well, which will be used to store the events and snapshots in its own journals.
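As an illustration, wiring the journal into eventSourcing.conf amounts to pointing the Akka persistence keys at the plugin's ids; the ids below are placeholders, the real ones are documented in the Ignite plugin's reference.conf:

akka.persistence {
  # placeholder ids - take the actual ones from the Ignite plugin's reference.conf
  journal.plugin = "akka.persistence.journal.ignite"
  snapshot-store.plugin = "akka.persistence.snapshot.ignite"
}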
In Part 4, we will go through some remarks and possible architectures using this toolkit for event sourcing and CQRS services.
References:
- Part 4: https://mromeh.com/2018/04/27/spring-boot-akka-event-sourcing-starter-part-4-final/
- GitHub toolkit project: https://github.com/Romeh/spring-boot-akka-event-sourcing-starter
- Akka persistence: https://doc.akka.io/docs/akka/2.5/persistence.html
- Spring Boot: https://projects.spring.io/spring-boot/