[LIVE CODING] Event streaming applications with Kafka Streams, Spring Kafka and Actuator
Hi, my name is Ko Turk, and I am a Senior Java Developer at Blue4IT.
I am working as a full-stack engineer at Rabobank, creating features in Kotlin and doing some TypeScript ;).
As a speaker, I like talking at international conferences and JUGs about Micrometer, Kafka, and Kotlin!
My favourite talks to watch?! The ones with live coding in them! That’s the reason I’m doing it more and more often! 😉
We often build applications in which REST (HTTP) predominates, but is this a smart choice? Couldn’t it be faster, or rather asynchronous and event-based? Especially when you work with BIG DATA, Kafka is usually the better option.
You get a whole platform where scalability, fault tolerance and replayability are very important (you don’t want your messages to be lost, and preferably you want to process them again if your system has been down).
There are three libraries we will discuss during live coding:
– Spring Kafka, where we will create a consumer and a producer and test them with Testcontainers (without having to start a whole cluster yourself).
– Kafka Streams, a perfect fit for functional programming! With the Streams API we will cover the best-known and most-used patterns, such as branching, joining and mapping. We will create a Topology (stream) that handles fraud detection.
– Spring Boot Actuator, a great match when you want to monitor your stream of events; you already get a lot of metrics for free!
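To give a taste of the first part: a minimal Spring Kafka producer and consumer could look like the sketch below. The topic name `payments`, the group id `payment-app` and the class names are placeholders invented for this example, not part of the talk.

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;
import org.springframework.stereotype.Service;

// Producer: KafkaTemplate is auto-configured by Spring Boot
// when spring-kafka is on the classpath.
@Service
class PaymentProducer {
    private final KafkaTemplate<String, String> kafkaTemplate;

    PaymentProducer(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    void send(String key, String payload) {
        // Asynchronous send to the (hypothetical) "payments" topic.
        kafkaTemplate.send("payments", key, payload);
    }
}

// Consumer: Spring creates a listener container behind this annotation.
@Component
class PaymentConsumer {
    @KafkaListener(topics = "payments", groupId = "payment-app")
    void listen(ConsumerRecord<String, String> record) {
        System.out.println("Received " + record.key() + ": " + record.value());
    }
}
```

In a test, Testcontainers can spin up a single throwaway Kafka broker in Docker, so the listener above can be exercised without managing a cluster yourself.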
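For the Kafka Streams part, branching is one of the patterns mentioned above. A sketch of what a fraud-detection branch might look like, assuming a simplified model where the record value is just the payment amount and the threshold and topic names are made up for illustration:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Branched;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Named;

StreamsBuilder builder = new StreamsBuilder();
KStream<String, Long> payments = builder.stream("payments",
        Consumed.with(Serdes.String(), Serdes.Long()));

// Route suspicious payments (arbitrary example threshold) to their own topic,
// everything else to the approved topic.
payments.split(Named.as("fraud-"))
        .branch((key, amount) -> amount > 10_000L,
                Branched.withConsumer(suspicious -> suspicious.to("suspicious-payments")))
        .defaultBranch(Branched.withConsumer(ok -> ok.to("approved-payments")));

Topology topology = builder.build();
```

Each branch is just another `KStream`, so the same joining and mapping operations apply to it; a `Topology` like this can also be unit-tested with `TopologyTestDriver`, again without a running cluster.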
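And for the monitoring part: with spring-kafka and Actuator on the classpath, Micrometer registers Kafka client metrics automatically; a minimal configuration sketch (endpoint names as in a standard Spring Boot setup) only needs to expose them:

```properties
# Expose the health and metrics endpoints over HTTP;
# Kafka consumer/producer/streams meters then appear under /actuator/metrics
management.endpoints.web.exposure.include=health,metrics
```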
After the session, you will be able to create your own application with consumers, producers and streams!