Overview
In this 5-hour course, you will explore Kafka data streaming with a focus on data contracts enforced through Schema Registry. You'll learn how to use AVRO for data serialization, build and configure Kafka Producers and Consumers, and integrate Schema Registry to manage evolving data schemas safely.
What I will be able to do after this course
- Gain a thorough understanding of AVRO as a data serialization format.
- Implement Kafka Producers and Consumers that utilize Schema Registry.
- Develop low-latency, robust data streaming applications with Spring Boot.
- Master Schema Registry configuration for reliable, centralized schema management.
- Understand schema evolution and its importance in data-driven systems.
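As a taste of the Schema Registry integration covered in the course, a typical AVRO-enabled producer is configured with properties like the following. This is a minimal sketch: the host names and port values are placeholder assumptions, but the property keys are the standard Kafka and Confluent client settings.

```properties
# Kafka broker and Schema Registry endpoints (placeholder hosts/ports)
bootstrap.servers=localhost:9092
schema.registry.url=http://localhost:8081

# Serialize record keys as strings and values as AVRO via Schema Registry
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer

# Commonly disabled outside development so schemas are registered
# deliberately and the registry's compatibility rules are enforced
auto.register.schemas=false
```

With this setup, the serializer registers (or looks up) the record's AVRO schema in Schema Registry and embeds only a schema ID in each message, keeping payloads compact while enforcing the data contract.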
Course Instructor(s)
Dilip Sundarraj is an experienced software engineer and educator specializing in distributed systems and Java-based application development. He brings a practical approach to teaching, ensuring learners gain hands-on expertise in Kafka and related technologies. With a rich technical background, Dilip is committed to helping developers upskill effectively.
Who is it for?
This course is ideal for experienced Java developers who aim to specialize in Kafka application development. If you are familiar with building Kafka Producers and want to deepen your knowledge of AVRO, Schema Registry, and integration techniques, this course is for you. Developers who need to enforce data contracts and understand schema evolution will find invaluable insights here. Step up your Kafka skills and master AVRO-based data serialization and schema management.