Writing a Kafka Avro Producer in Java

Stephane Maarek | AWS Certified Cloud Practitioner, Solutions Architect, Developer
Best Selling Instructor, Kafka Guru, 9x AWS Certified
4.7 instructor rating • 41 courses • 937,881 students

Lecture description

Learn how to use Java to write your first Avro Producer

Learn more from the full course

Apache Kafka Series - Confluent Schema Registry & REST Proxy

Kafka - Master Avro, the Confluent Schema Registry and Kafka REST Proxy. Build Avro Producers/Consumers, Evolve Schemas

04:23:56 of on-demand video • Updated August 2021

  • Write simple and complex Avro Schemas
  • Create, Write and Read Avro objects in Java
  • Write a Java Producer and Consumer leveraging Avro data and the Schema Registry
  • Learn about Schema Evolution
  • Perform Schema evolution using the command line and in Java
  • Utilize the REST Proxy using a REST Client
OK, so in this lecture we are going to write an Avro producer in Java. This is exciting because, while the console producer and consumer are very helpful for debugging, now we're going to write something in Java, something that looks more like an application you would deploy in your production environment. Everything we do here leverages the knowledge acquired before, as you'll see, and we'll be using SpecificRecord as the way to create Avro objects. So that's really exciting, right? Let's get started.

OK, so we are in our IDE, and what I'm going to do now is create a new project. It's going to be Maven; I click Next. The package, again, can be whatever you want; I'll just use com.example for now, and the project is kafka-avro-v1. I'll be coding along with you, but all the code, again, is available in the GitHub repository. In this window, as we get started, we definitely want to enable auto-import, and we are in our Maven project just like usual.

The first thing I want to do is get the pom.xml right. For this, I will go directly to the kafka-avro-v1 folder from the code we downloaded from GitHub. Here it is, with all its contents. Copy everything from the properties block all the way down, and we'll go over it right now and analyze what it means. OK, so I'll go ahead and paste it. Everything's been copied now, so let's go over the code and understand what this pom.xml contains. The first thing is that we have an Avro version, 1.8.2, a Kafka version, 0.11.0.1, and a Confluent version, 3.3.1. Those are the latest at the time I'm recording this course. Actually, there is Kafka 1.0 now, but because we want to match up with Confluent, and Confluent 4.0 is not out yet, I just use the older Kafka version, 0.11.0.1.
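The pom.xml content being pasted here can be sketched as follows. The coordinates follow the standard Apache and Confluent artifacts, and 1.8.2 is assumed as the Avro release matching Confluent 3.3.1; prefer the exact pom from the course's GitHub repository over this sketch:

```xml
<properties>
    <avro.version>1.8.2</avro.version>
    <kafka.version>0.11.0.1</kafka.version>
    <confluent.version>3.3.1</confluent.version>
</properties>

<!-- Confluent artifacts are not on Maven Central -->
<repositories>
    <repository>
        <id>confluent</id>
        <url>http://packages.confluent.io/maven/</url>
    </repository>
</repositories>

<dependencies>
    <!-- Avro itself, used to generate and work with SpecificRecord classes -->
    <dependency>
        <groupId>org.apache.avro</groupId>
        <artifactId>avro</artifactId>
        <version>${avro.version}</version>
    </dependency>
    <!-- the plain Kafka client, same as in earlier lectures -->
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>${kafka.version}</version>
    </dependency>
    <!-- the serializer that talks to the Schema Registry -->
    <dependency>
        <groupId>io.confluent</groupId>
        <artifactId>kafka-avro-serializer</artifactId>
        <version>${confluent.version}</version>
    </dependency>
</dependencies>
```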
But that's plenty for this course. As we are going to resolve dependencies from Confluent, you need to add a repository. For this, we have a repositories block containing a repository whose id is confluent and whose URL is the Confluent Maven repository. Then, for the dependencies, we have avro, which is what we'll use to create our specific records. Then we have our kafka-clients dependency, just like before, with the reference to the Kafka version from before, 0.11.0.1. And finally we have, from the group io.confluent, the artifact kafka-avro-serializer. So here it is: that's how we will write data to the Schema Registry and into Kafka in Avro. Regarding the build plugins, we have Java 8 enabled right here, we still have the avro-maven-plugin to generate the SpecificRecord code that we'll be using, and the build-helper plugin, again, to force the discovery of the generated sources. So, fairly simple; the only things that have changed are these two blocks.

So let's get started. In my main sources, I'm going to create a class, com.example.KafkaAvroProducerV1, and we add a public static void main. So let's go ahead and create our first producer. First of all, same as before, we create some Properties, and these properties contain what you would expect. We have bootstrap.servers, and please set it to your IP, port 9092. Then we can set some properties such as acks, set to 1, and retries, set to 10. And you know that we have to set the key.serializer; for this, we'll use StringSerializer.class.getName(), so the keys will be strings. Similarly, we have a value.serializer, and that one is going to be KafkaAvroSerializer.class.getName(). So here we have a new value serializer, the KafkaAvroSerializer, and you can see from the import statements that it comes from the io.confluent package.
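The full set of producer properties from this lecture can be sketched like this. The serializer class names are written as string literals so the sketch compiles without the Kafka jars on the classpath, and the IPs and ports are the lecture's local defaults; change them for your own setup:

```java
import java.util.Properties;

// Producer configuration as described in the lecture.
class ProducerConfigSketch {
    static Properties buildProperties() {
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "127.0.0.1:9092");
        properties.setProperty("acks", "1");
        properties.setProperty("retries", "10");
        // keys are plain strings...
        properties.setProperty("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // ...while values are Avro-encoded via Confluent's serializer
        properties.setProperty("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        // required by the Avro serializer to register and fetch schemas
        properties.setProperty("schema.registry.url", "http://127.0.0.1:8081");
        return properties;
    }
}
```

Note that schema.registry.url is a property of the KafkaAvroSerializer, not of the core Kafka client, which is why it only appears once we switch the value serializer.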
So because we have this value serializer, we also need to set the schema.registry.url: http://127.0.0.1:8081. OK, make sure you have it like this, or again, change the IP based on your setup. Next up, we go ahead and create a KafkaProducer, where the key is going to be String and the value is going to be Customer. We don't have the Customer class yet; it's a new KafkaProducer that takes the properties as an input. We still don't have the Customer, but we'll have it very soon. By the way, the String topic that we write to will be customer-avro. So that's perfect. Now we need to create a customer, and we haven't done it yet, so we'll just put question marks. Then what we'll do is create a ProducerRecord<String, Customer>, so producerRecord = new ProducerRecord<>(), and as an input the topic is going to be topic. As for the key, we won't set one; we'll just put the customer as the value. OK, so far so good, but there is something missing: the Customer, which we still don't have. So for now, just comment out this code so that it compiles.

And so we need to create that Customer class. Do you know how to create that Customer class? The answer is yes, you do, you do know: just like before, in resources we create an avro directory, and in it we create a customer-v1.avsc file. Now, because I'm lazy, I'll just reuse the one from my previous project: copy this, paste that. So here's our customer, just like before: first name, last name, age, height, weight, automated email. You see there is no target directory yet, so I'll go into Maven and, in the lifecycle, run clean first and then package, and it will go ahead and compile my .avsc schema into a class. So you see, now we have target and generated-sources, and we have our Customer class. If it didn't work, make sure that your project can compile; that's why I commented out that code.
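For reference, a customer-v1.avsc along the lines of what's described here might look like the following. The exact field names, types, and doc strings are assumptions; use the schema from the course's GitHub repository:

```json
{
  "type": "record",
  "namespace": "com.example",
  "name": "Customer",
  "fields": [
    { "name": "first_name", "type": "string" },
    { "name": "last_name", "type": "string" },
    { "name": "age", "type": "int" },
    { "name": "height", "type": "float", "doc": "height in cm" },
    { "name": "weight", "type": "float", "doc": "weight in kg" },
    { "name": "automated_email", "type": "boolean", "default": true }
  ]
}
```

Running mvn clean package then lets the avro-maven-plugin turn this schema into a generated com.example.Customer class with a builder.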
Now I'm going to uncomment my code right here, and as you can see, the Customer is now being imported. That's because I'm in the same package, com.example; if you had another package, you would just need to write the import statement. OK, so here we go: we have a KafkaProducer with a String key, and now we need to create a Customer. For this, we use Customer.newBuilder(), with .build() at the very end. And so we set the first name, our good old John; we set the last name, Doe; we set his age, he's still 26; we set his height, he's grown a little bit; and we set his weight, he also gained some weight. And we'll set automated email to false. OK, so here's our customer, and now we have a producer record which takes the topic and the customer as the value.

So now we can do kafkaProducer.send() with the producer record, and this time we'll add a callback: a new Callback, and in onCompletion we check the exception. If the exception is null, we'll just println "Success", and maybe we'll do another println with metadata.toString(); that's going to print the metadata, basically the record offset and partition. And if there was an error, we'll do exception.printStackTrace(). OK, finally, just to make sure that the producer record does indeed get sent, we'll flush() and then close() the producer.

And here we go, we have our first Kafka producer that's using Avro. But, you know, let's be convinced about it: let's run it. So I click here and run it, and again, an error strikes: StringSerializer is not an instance of Serializer. So obviously something is wrong with the StringSerializer right here, and it's amazing to get errors, because you have to learn to work through them as well. Here, look at the import: the wrong StringSerializer class was imported, and that's not the right one. I remove it, trigger the import again, and then I get a choice; I will use the first one, from org.apache.kafka.common.serialization.
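Putting the pieces together, the producer built in this lecture can be sketched as below. The Customer class is the one generated from customer-v1.avsc, so its setter names and the illustrative height/weight values are assumptions, and running it requires a broker on 127.0.0.1:9092 and a Schema Registry on port 8081:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class KafkaAvroProducerV1 {
    public static void main(String[] args) {
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "127.0.0.1:9092");
        properties.setProperty("acks", "1");
        properties.setProperty("retries", "10");
        properties.setProperty("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        properties.setProperty("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        properties.setProperty("schema.registry.url", "http://127.0.0.1:8081");

        KafkaProducer<String, Customer> kafkaProducer = new KafkaProducer<>(properties);
        String topic = "customer-avro";

        // Customer is generated from customer-v1.avsc by the avro-maven-plugin
        Customer customer = Customer.newBuilder()
                .setFirstName("John")
                .setLastName("Doe")
                .setAge(26)
                .setHeight(185.5f)   // illustrative value
                .setWeight(85.6f)    // illustrative value
                .setAutomatedEmail(false)
                .build();

        // no key: topic and value only
        ProducerRecord<String, Customer> producerRecord =
                new ProducerRecord<>(topic, customer);

        kafkaProducer.send(producerRecord, new Callback() {
            @Override
            public void onCompletion(RecordMetadata metadata, Exception exception) {
                if (exception == null) {
                    System.out.println("Success!");
                    // prints topic, partition and offset of the written record
                    System.out.println(metadata.toString());
                } else {
                    exception.printStackTrace();
                }
            }
        });

        kafkaProducer.flush();
        kafkaProducer.close();
    }
}
```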
OK, much better. Let's run our code again. And I'll keep these errors in the recording, because errors are very nice; even though I make errors all the time, they keep you learning. That's why I keep them on record; the course is not perfect, and I'm a programmer like everyone else. So here we go: Success, customer-avro-0 at offset 0. That means partition 0 and offset 0. If I run this again, we'll get partition 0 and offset 1. All right, amazing.

So it's been written; let's double-check that, maybe using the UI. If I refresh the Schema Registry UI, we now have a customer-avro-value subject, and here's our record, our schema, right here. And if you want to check in some detail, we can go to Topics, then customer-avro, and after a second we should see... here we go, here's our customer, John Doe; it's been written there.

There's one last way of checking, and as you know, the best way of checking is to use a terminal. For this, I use docker run to open a bash shell in a schema-registry container. All right, and then I run kafka-avro-console-consumer: the bootstrap server, being for me 127.0.0.1:9092, but change it to your IP; the topic is customer-avro; we want it from the beginning; and we want to add a property. I'm lazy, so I'm just going to copy this from before: the property is the schema.registry.url. So let's just go ahead and run this. Here we go, Enter, and we see our John Doe using the console consumer. Amazing. Now, if we run the producer once more and go back to the terminal, we see that a third John Doe has appeared.

So congratulations, you just wrote your first Java Kafka Avro producer, and that was fairly easy, right? Everything we learned before just applied: a producer record, and then we have a callback, flush, close. OK, hope you loved the learning, hope you remember to add the schema.registry.url as a property, and I will see you in the next lecture.
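The terminal check can be sketched like this. The Docker image name and networking flag are assumptions about the course's docker-compose setup, while the kafka-avro-console-consumer flags are the standard ones:

```shell
# open a shell in a container that has the Confluent CLI tools
# (image name and tag are assumptions; adapt to your setup)
docker run -it --rm --net=host confluentinc/cp-schema-registry:3.3.1 bash

# inside the container: read the topic, decoding Avro values
# via the Schema Registry
kafka-avro-console-consumer \
    --bootstrap-server 127.0.0.1:9092 \
    --topic customer-avro \
    --from-beginning \
    --property schema.registry.url=http://127.0.0.1:8081
```

Without the schema.registry.url property, the console consumer cannot fetch the schema and will fail to deserialize the records.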