About this video
Today's applications are built on the microservices architecture. Having many microservices that need to communicate with each other can be problematic, as they quickly become tightly coupled. Apache Kafka allows us to create services that are loosely coupled and operate in an event-driven way.
We can build components that process events, apply business logic as events arrive, and publish those events further down the processing chain. We will learn how to use Apache Kafka to create applications that work in the publish-subscribe model.
We will delve into the Kafka architecture and its Producer and Consumer APIs. We will learn how replication and fault tolerance are achieved in Kafka, and how to leverage Kafka to build truly resilient, scalable, event-driven applications.
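The publish-subscribe model mentioned above can be illustrated with a minimal in-memory sketch. This is not Kafka code; the `Broker` class and its methods are purely illustrative, showing why publishers and subscribers stay loosely coupled: the publisher never knows who consumes its events.

```python
# Minimal in-memory sketch of the publish-subscribe model.
# Illustrative only -- not part of any Kafka API.
from collections import defaultdict


class Broker:
    def __init__(self):
        # Map each topic name to the callbacks subscribed to it.
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, event):
        # Every subscriber of the topic receives the event; the
        # publisher does not reference any consumer directly.
        for callback in self.subscribers[topic]:
            callback(event)


broker = Broker()
received = []
broker.subscribe("orders", received.append)
broker.publish("orders", {"id": 1, "amount": 42})
print(received)  # [{'id': 1, 'amount': 42}]
```

In Kafka the broker additionally persists events durably and replicates them across servers, but the coupling story is the same: producers and consumers only agree on a topic, never on each other.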
Style and Approach
This video course will show us how to write applications in an event-driven way using the publish-subscribe model, and how to leverage that technology to create scalable and resilient applications. When thinking about the scalability of an application built on Kafka, we need to carefully plan how our data should be partitioned and how many partitions each topic should have.
We will learn how to choose a proper number of partitions. Then we will explore the Producer and Consumer APIs and how the number of producers and consumers affects the performance and scalability of our solution.
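The key-based partitioning idea described above can be sketched as follows. Kafka's default partitioner hashes the record key and takes it modulo the partition count; the sketch below uses `zlib.crc32` purely for illustration (Kafka actually uses a murmur2 hash), and the function name is an assumption, not a Kafka API.

```python
# Illustrative sketch of key-based partition selection.
# The hash function here is an assumption for demonstration;
# Kafka's default partitioner uses murmur2, not CRC32.
import zlib


def choose_partition(key: bytes, num_partitions: int) -> int:
    # Records with the same key always map to the same partition,
    # which preserves per-key ordering within that partition.
    return zlib.crc32(key) % num_partitions


p1 = choose_partition(b"user-42", num_partitions=6)
p2 = choose_partition(b"user-42", num_partitions=6)
assert p1 == p2  # same key -> same partition, so ordering is preserved
```

Note that changing the partition count changes the key-to-partition mapping for existing keys, which is one reason the course stresses planning the number of partitions up front.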
- Publication date:
- August 2017
- 1 hour 02 minutes