Data Contracts in Kafka with Schema Registry and AVRO Serialization
Instructor: Packt - Course Instructors
Recommended experience
Intermediate level
For software developers and data engineers with basic knowledge of Java and Kafka. Experience with Docker and REST APIs is a plus.
What you'll learn:
Create and manage data contracts in Kafka using Schema Registry.
Implement AVRO schemas for scalable and efficient data serialization.
Build real-time Kafka applications using Gradle, Maven, and Spring Boot.
Ensure compatibility with schema evolution strategies.
Unlock the power of data contracts in Kafka with this comprehensive course focusing on Schema Registry and AVRO serialization. You'll explore how to create robust data pipelines, ensuring compatibility and scalability across producer-consumer applications. By the end, you'll master tools and techniques that empower efficient data processing with seamless schema evolution.
Start with the fundamentals of data serialization in Kafka, diving deep into popular formats like AVRO, Protobuf, and Thrift. Gradually, you'll build hands-on expertise by setting up Kafka in a local environment using Docker, creating custom AVRO schemas, and generating Java records for real-world applications.

The course includes practical exercises, such as building an end-to-end Coffee Shop order service and exploring schema evolution strategies in Schema Registry. You'll also learn naming conventions, logical schema types, and compatibility strategies that ensure smooth upgrades in production environments.

Designed for software developers and data engineers, this course assumes basic knowledge of Java and Kafka. Whether you're new to Schema Registry or looking to deepen your existing Kafka expertise, this course is your gateway to mastering data contracts.
In this module, we will set the foundation for the course by providing an overview of its objectives, structure, and prerequisites. You’ll gain a clear understanding of what to expect and how to prepare for success in this learning journey.
2 videos, 1 reading
In this module, we will dive into the intricacies of data contracts and serialization in Kafka. You’ll explore how serialization enhances Kafka's architecture and examine different serialization formats, such as AVRO, Protobuf, and Thrift, to understand their schema compatibility and applications.
2 videos, 1 assignment
In this module, we will introduce you to AVRO, one of the most popular serialization systems. You’ll explore the reasons behind its popularity and learn how to build a simple AVRO schema to get hands-on experience with its functionality.
2 videos, 1 assignment
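To give a flavor of what this module builds, here is a minimal sketch of an AVRO schema (a .avsc file). The record name matches the Greeting App used later in the course, but the namespace and fields are illustrative assumptions rather than the course's exact files:

```json
{
  "type": "record",
  "name": "Greeting",
  "namespace": "com.example.avro",
  "fields": [
    {"name": "greeting", "type": "string"},
    {"name": "createdAt", "type": "long"}
  ]
}
```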
In this module, we will guide you through setting up a local Kafka environment using Docker Compose. You’ll practice producing and consuming messages with CLI tools and delve into AVRO serialization by leveraging the AVRO console producer and consumer for hands-on experience.
3 videos, 1 assignment
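As a rough sketch of the CLI workflow this module practices, the Confluent tools below produce and consume AVRO records against a local setup. The broker and registry addresses, the topic name, and the inline schema are assumptions about the Docker environment, and flag names can vary slightly between Confluent Platform versions:

```sh
# Produce AVRO records from the terminal; each line you type is validated
# against the inline schema, which is registered with the Schema Registry
kafka-avro-console-producer \
  --topic greetings \
  --bootstrap-server localhost:9092 \
  --property schema.registry.url=http://localhost:8081 \
  --property value.schema='{"type":"record","name":"Greeting","fields":[{"name":"greeting","type":"string"}]}'

# Read the same topic back, decoding the binary AVRO payloads to JSON
kafka-avro-console-consumer \
  --topic greetings \
  --bootstrap-server localhost:9092 \
  --property schema.registry.url=http://localhost:8081 \
  --from-beginning
```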
In this module, we will set up the foundational components for a Greeting App using Gradle. You’ll learn how to configure the project for AVRO support and generate Java records from schema files, preparing the groundwork for seamless AVRO integration.
2 videos, 1 assignment
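As a sketch of what that Gradle configuration can look like, the snippet below uses the community gradle-avro-plugin, which generates Java classes from .avsc files under src/main/avro during the build. The plugin and library versions are assumptions, not necessarily what the course pins:

```groovy
// build.gradle — minimal AVRO code-generation setup (versions are assumptions)
plugins {
    id 'java'
    // Community plugin that adds a generateAvroJava task wired into compileJava
    id 'com.github.davidmc24.gradle.plugin.avro' version '1.9.1'
}

repositories {
    mavenCentral()
}

dependencies {
    // Runtime library the generated record classes depend on
    implementation 'org.apache.avro:avro:1.11.3'
}
```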
In this module, we will set up the Greeting App project using Maven as the build tool. You’ll learn how to configure Maven for AVRO support and generate Java records from schema files, ensuring the project is ready for AVRO-based serialization.
2 videos, 1 assignment
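A comparable Maven setup typically relies on the official avro-maven-plugin; the excerpt below is a sketch with an assumed version and directory layout:

```xml
<!-- pom.xml excerpt: generate Java records from src/main/avro at build time -->
<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.11.3</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>schema</goal>
      </goals>
      <configuration>
        <sourceDirectory>${project.basedir}/src/main/avro</sourceDirectory>
        <outputDirectory>${project.build.directory}/generated-sources/avro</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```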
In this module, we will guide you through building an AVRO-based Kafka producer and consumer using Java. You’ll learn how to implement serialization and deserialization for seamless data exchange within Kafka topics.
2 videos, 1 assignment
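The producer side of such a pair can look roughly like the sketch below, which serializes a generated record with Confluent's KafkaAvroSerializer. The addresses, the topic name, and the Greeting class (generated from the schema sketched earlier) are assumptions for illustration:

```java
import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class GreetingProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // assumed local broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", KafkaAvroSerializer.class.getName());
        props.put("schema.registry.url", "http://localhost:8081"); // assumed local registry

        // Greeting is the record class generated from the .avsc schema
        try (KafkaProducer<String, Greeting> producer = new KafkaProducer<>(props)) {
            Greeting greeting = Greeting.newBuilder()
                    .setGreeting("Hello, Kafka!")
                    .setCreatedAt(System.currentTimeMillis())
                    .build();
            producer.send(new ProducerRecord<>("greetings", greeting));
        }
    }
}
```

The consumer mirrors this with KafkaAvroDeserializer, typically with the specific.avro.reader property set to true so payloads deserialize into the generated class rather than a GenericRecord.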
In this module, we will build a real-time Coffee Shop Order Service using AVRO and Kafka. You’ll start with an application overview and progress through project setup, schema creation, and AVRO class generation. Finally, you’ll create producers and consumers to simulate and process coffee shop orders in a real-time streaming scenario.
8 videos, 1 assignment
In this module, we will explore logical schema types in AVRO and their practical applications. You’ll learn to enhance the CoffeeOrder schema by adding logical types like timestamps, decimals, UUIDs, and dates to improve data accuracy and functionality in real-world scenarios.
4 videos, 1 assignment
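As an illustration of those logical types in schema form, the sketch below decorates assumed CoffeeOrder fields; the field names, precision, and scale are illustrative, not the course's exact schema:

```json
{
  "type": "record",
  "name": "CoffeeOrder",
  "namespace": "com.example.avro",
  "fields": [
    {"name": "id",         "type": {"type": "string", "logicalType": "uuid"}},
    {"name": "orderedAt",  "type": {"type": "long",   "logicalType": "timestamp-millis"}},
    {"name": "pickupDate", "type": {"type": "int",    "logicalType": "date"}},
    {"name": "total",      "type": {"type": "bytes",  "logicalType": "decimal",
                                    "precision": 10,  "scale": 2}}
  ]
}
```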
In this module, we will delve into the inner workings of an AVRO record. You’ll uncover how data is stored, organized, and encoded, gaining insights into its efficiency and compatibility with evolving schemas.
1 video, 1 assignment
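One way to see this for yourself is to serialize a record by hand and inspect the bytes. The sketch below uses AVRO's generic API with an assumed single-field schema; note how the binary output carries no field names or types, which is exactly why the writer's schema must travel with the data:

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;

import java.io.ByteArrayOutputStream;

public class AvroEncodingDemo {
    public static void main(String[] args) throws Exception {
        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Greeting\",\"fields\":"
                + "[{\"name\":\"greeting\",\"type\":\"string\"}]}");

        GenericRecord record = new GenericData.Record(schema);
        record.put("greeting", "Hello");

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(record, encoder);
        encoder.flush();

        // A string field is just a zig-zag varint length plus UTF-8 bytes:
        // 0x0A (length 5) followed by "Hello" — 6 bytes, no field names at all
        System.out.println(out.size() + " bytes");
    }
}
```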
In this module, we will explore the effects of schema changes in AVRO, focusing on the consumer application's failure to process data with an updated schema. You’ll gain practical insights into schema evolution challenges and why a schema registry is crucial for maintaining compatibility.
1 video, 1 assignment
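A hedged illustration of the kind of change that can trigger this failure: fields are positional in AVRO binary, so a consumer decoding with an outdated schema reads the wrong bytes, which can throw a deserialization error or silently corrupt values. The schemas below are illustrative, not the course's exact files:

```
// v1 — the schema the consumer was compiled against
{"type": "record", "name": "Greeting", "fields": [
  {"name": "greeting", "type": "string"}
]}

// v2 — the producer inserts a new field ahead of the existing one; a
// consumer still decoding with v1 interprets the "language" bytes as
// "greeting" because it has no way to learn the writer's schema changed
{"type": "record", "name": "Greeting", "fields": [
  {"name": "language", "type": "string"},
  {"name": "greeting", "type": "string"}
]}
```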
In this module, we will introduce you to the schema registry and its critical role in handling AVRO schemas effectively. You’ll learn how to publish and consume records using the schema registry, interact with its REST API, and work with "key" fields as AVRO records to enhance data management and compatibility.
4 videos, 1 assignment
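The registry's REST API is plain HTTP. A few representative calls are sketched below; the localhost address is an assumption, and the subject name assumes the default topic-name-based convention:

```sh
# List every subject the registry knows about
curl http://localhost:8081/subjects

# List the versions registered under one subject
curl http://localhost:8081/subjects/greetings-value/versions

# Fetch a specific version, including the full schema definition
curl http://localhost:8081/subjects/greetings-value/versions/1
```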
In this module, we will focus on data evolution through the lens of schema changes managed by a schema registry. You’ll learn to update project configurations, explore compatibility types like backward, forward, and full, and understand the consequences of modifying AVRO schemas, enabling you to manage data evolution effectively.
7 videos, 1 assignment
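Compatibility is itself managed through the registry's REST API. As a sketch (the address and subject name are assumptions):

```sh
# Inspect the registry-wide default compatibility level
curl http://localhost:8081/config

# Pin one subject to BACKWARD compatibility: new schema versions must be
# readable by consumers still using the previous version
curl -X PUT http://localhost:8081/config/greetings-value \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  -d '{"compatibility": "BACKWARD"}'
```

Roughly: BACKWARD lets you upgrade consumers first, FORWARD lets you upgrade producers first, and FULL requires compatibility in both directions.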
In this module, we will cover different schema naming strategies in AVRO and their practical applications. You’ll create a schema for Coffee Update events and learn how to utilize RecordNameStrategy to manage schema versions while ensuring seamless data processing.
3 videos, 1 assignment
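In Confluent's serializers this is a single configuration switch. The sketch below shows the producer property that swaps the default TopicNameStrategy for RecordNameStrategy, so the subject is derived from the record's fully qualified name rather than the topic:

```properties
# Subject becomes e.g. com.example.avro.CoffeeUpdate instead of <topic>-value,
# letting CoffeeOrder and CoffeeUpdate events share a single topic
value.subject.name.strategy=io.confluent.kafka.serializers.subject.RecordNameStrategy
```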
In this module, we will create a coffee order service application using Spring Boot, Kafka, and Schema Registry. We'll cover every step, from setting up the project using Gradle or Maven to building RESTful endpoints for creating and updating coffee orders. Additionally, we'll explore how to configure Kafka producers and consumers to publish and process coffee order events, leveraging AVRO for structured data serialization.
10 videos, 2 assignments
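For orientation, the Kafka wiring for such a service can look roughly like the application.yml sketch below. The property keys come from Spring for Apache Kafka and the Confluent serializers; the addresses and group id are assumptions:

```yaml
spring:
  kafka:
    bootstrap-servers: localhost:9092       # assumed local broker
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
    consumer:
      group-id: coffee-order-service        # assumed consumer group
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
      properties:
        specific.avro.reader: true          # deserialize into generated classes
    properties:
      schema.registry.url: http://localhost:8081
```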
Packt helps tech professionals put software to work by distilling and sharing the working knowledge of their peers. Packt is an established global technical learning content provider, founded in Birmingham, UK, with over twenty years of experience delivering premium, rich content from groundbreaking authors on a wide range of emerging and popular technologies.
Frequently asked questions

Can I preview the course before I enroll?
Yes, you can preview the first video and view the syllabus before you enroll. You must purchase the course to access content not included in the preview.
When will I have access to the lectures and assignments?
If you decide to enroll in the course before the session start date, you will have access to all of the lecture videos and readings for the course. You'll be able to submit assignments once the session starts.
Once you enroll and your session begins, you will have access to all videos and other resources, including reading items and the course discussion forum. You’ll be able to view and submit practice assessments, and complete required graded assignments to earn a grade and a Course Certificate.
What do I get when I complete the course?
If you complete the course successfully, your electronic Course Certificate will be added to your Accomplishments page - from there, you can print your Course Certificate or add it to your LinkedIn profile.
Can I audit this course for free?
No. This course is one of a few offered on Coursera that are currently available only to learners who have paid or received financial aid, when available.
What is the refund policy?
You will be eligible for a full refund until two weeks after your payment date, or (for courses that have just launched) until two weeks after the first session of the course begins, whichever is later. You cannot receive a refund once you've earned a Course Certificate, even if you complete the course within the two-week refund period. See our full refund policy.
Is financial aid available?
Yes. In select learning programs, you can apply for financial aid or a scholarship if you can't afford the enrollment fee. If financial aid or a scholarship is available for your learning program selection, you'll find a link to apply on the description page.