Writing a Kafka Avro Producer in Java

A free video tutorial from Stephane Maarek | AWS Certified Solutions Architect & Developer Associate
Best Selling Instructor, Kafka Guru, 9x AWS Certified
4.7 instructor rating • 38 courses • 727,600 students

Lecture description

Learn how to use Java to write your first Avro Producer

Learn more from the full course

Apache Kafka Series - Confluent Schema Registry & REST Proxy

Kafka - Master Avro, the Confluent Schema Registry and Kafka REST Proxy. Build Avro Producers/Consumers, Evolve Schemas

04:23:56 of on-demand video • Updated February 2021

  • Write simple and complex Avro Schemas
  • Create, Write and Read Avro objects in Java
  • Write a Java Producer and Consumer leveraging Avro data and the Schema Registry
  • Learn about Schema Evolution
  • Perform Schema evolution using the command line and in Java
  • Utilize the REST Proxy using a REST Client
English [Auto] OK, so in this lecture we are going to write an Avro producer in Java. This is exciting because, while the console producer and the console consumer are very helpful for debugging, we're now going to write something in Java that looks more like an application you would deploy in your production environment. Everything we do here leverages the knowledge from before, as you'll see, and we'll be using SpecificRecord as the way to create Avro objects. So that's really exciting. Let's get started.

OK, so we are in our IDE, and what I'm going to do now is create a new project. It's going to be Maven; click Next. The package can be whatever you want; I'll just use com.example for now, and the project is kafka-avro-v1. I'll be coding along with you, but all the code, again, is available in the GitHub repository. Close this window as we get started. We definitely want to enable auto-import. And so we are in our Maven project, just like usual, and the first thing I want to do is set up the pom.xml. For this I'll go directly to the kafka-avro-v1 folder from the code download on GitHub, and here it is. I'll copy everything from the properties block all the way down, and we'll go over right now and analyze what it means. OK, so I'll go ahead and paste it; everything has been copied now.

Let's go over the code and understand what this pom.xml contains. First, we have an Avro version, 1.8.2, a Kafka version, 0.11.0.1, and a Confluent version, 3.3.1. Those are the latest at the time I'm writing this course; actually, there is Kafka 1.0, but because we want to match up with the Confluent platform, and Confluent 4.0 is not out yet, I'm using the older Kafka 0.11.0.1. That's plenty for this course.
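As a reference, the two new blocks described above might look roughly like this in the pom.xml. This is a sketch using the versions mentioned in the lecture (Avro 1.8.2, Kafka 0.11.0.1, Confluent 3.3.1); the repository URL is Confluent's published Maven repository, and the build-plugin section is omitted here for brevity:

```xml
<repositories>
  <repository>
    <id>confluent</id>
    <url>https://packages.confluent.io/maven/</url>
  </repository>
</repositories>

<dependencies>
  <!-- Avro core, used to work with the generated SpecificRecord classes -->
  <dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro</artifactId>
    <version>1.8.2</version>
  </dependency>
  <!-- Plain Kafka client, same as in the previous lectures -->
  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.11.0.1</version>
  </dependency>
  <!-- Confluent's Avro serializer, which talks to the Schema Registry -->
  <dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-avro-serializer</artifactId>
    <version>3.3.1</version>
  </dependency>
</dependencies>
```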
Because we are going to resolve dependencies from Confluent, we need to add a repository, and for this we have a repositories block with a repository whose id is confluent and whose URL is the Maven repository from Confluent. As for the dependencies, we have avro, which is what we'll use to create our specific records; then we have our kafka-clients, just like before, referencing the Kafka version from before, 0.11.0.1; and finally we have, from the groupId io.confluent, the artifact kafka-avro-serializer. That's how we'll register schemas with the Schema Registry and write Avro data into Kafka. Regarding the build plugins: we have Java 8 enabled right here, we still have the Avro plugin to generate the SpecificRecord code that we'll be using, and the build-helper plugin, again, to force the discovery of the generated sources. So, fairly simple; the only thing that has changed now is these two blocks.

So let's get started. In my main sources I'm going to create a class, com.example.KafkaAvroProducerV1, with a public static void main, and write our first producer. First of all, same as before, we create some Properties, and these properties contain what you would expect. We have bootstrap.servers, and please set it to your IP on port 9092. Then we can set some properties such as acks being 1, and retries being 10. Next we have to set the key serializer; for this we'll use StringSerializer.class.getName(), so we'll serialize our keys as strings. Similarly we have the value serializer, and that one is going to be KafkaAvroSerializer.class.getName(). So here we have a new serializer, the KafkaAvroSerializer, and you can see from the import statements that it comes from the io.confluent.kafka.serializers package.
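The configuration described above can be sketched as follows. The serializer class names are written here as string literals so the snippet compiles without the Kafka jars on the classpath; in the lecture they are set via `Class.getName()` instead, and the class name is my own placeholder:

```java
import java.util.Properties;

public class ProducerConfigDemo {
    // Builds the producer configuration from the lecture.
    public static Properties producerProperties() {
        Properties properties = new Properties();
        // Kafka broker to connect to (change to your IP)
        properties.setProperty("bootstrap.servers", "127.0.0.1:9092");
        properties.setProperty("acks", "1");
        properties.setProperty("retries", "10");
        // Keys are serialized as plain strings
        properties.setProperty("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // Values are serialized as Avro via Confluent's serializer
        properties.setProperty("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        // Required by KafkaAvroSerializer: where to register/fetch schemas
        properties.setProperty("schema.registry.url", "http://127.0.0.1:8081");
        return properties;
    }

    public static void main(String[] args) {
        System.out.println(producerProperties());
    }
}
```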
Because we have that value serializer, we also need to set schema.registry.url to http://127.0.0.1:8081. Make sure you have it exactly like this, or again, change the IP based on your setup. Next up, we go ahead and create a KafkaProducer, where the key type is going to be String and the value type is going to be Customer. We don't have the Customer class yet, but our producer is a new KafkaProducer that takes the properties as an input. By the way, the String topic we'll write to is customer-avro. So now we need to create a Customer, and since we haven't generated it yet, we'll just put question marks. Then we create a ProducerRecord<String, Customer>: as input, the topic is going to be the topic, the key can be whatever you want, and we just put the customer as the value. So far so good, but there is still the Customer class that we don't have, so for now I'll comment out this code so it compiles, and we'll create that Customer class. Do you know how to create the Customer class? The answer is yes, you do. Just like before, under resources we create an avro directory, and in it a customer-v1.avsc file. I'm lazy, so I'll just copy the one from my project. So here's our customer, just like before: first name, last name, age, height, weight, automated email. You'll see there's no target directory yet, so I'll go to the Maven panel, and in the lifecycle I'll run clean first and then package, and that will compile my Avro schema into a class. You can see that under target/generated-sources/avro we now have our Customer class. Excellent. If it didn't work, make sure that your producer code is commented out so the project compiles. OK, so I had commented out this code; now I'm going to uncomment it right here.
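The customer-v1.avsc schema described above looks roughly like this; the exact field types, the default value, and the doc strings are my assumptions based on the field names the lecture lists (first name, last name, age, height, weight, automated email):

```json
{
  "type": "record",
  "namespace": "com.example",
  "name": "Customer",
  "fields": [
    { "name": "first_name", "type": "string" },
    { "name": "last_name", "type": "string" },
    { "name": "age", "type": "int" },
    { "name": "height", "type": "float" },
    { "name": "weight", "type": "float" },
    { "name": "automated_email", "type": "boolean", "default": true }
  ]
}
```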
And as you can see, the Customer class is now resolved; that's because I'm in the same package, com.example. If you were in another package, you would just need to add the import statement. OK, so here we go: we have our KafkaProducer<String, Customer>, and now we need to create a Customer. For this we use Customer.newBuilder(), call build() at the very end, and in between we set the first name, our good old John; we set the last name, Doe; we set his age, still 26; we set his height, which has grown a little bit; we set his weight, since he also gained some; and we set the automated email to false. OK, so here is our Customer, and now we have the ProducerRecord, which takes the topic and the customer as the value. Now we can call the producer's send() with the producer record, and this time we'll add a callback. So: a new Callback, and in onCompletion we check the exception. If the exception is null, we'll just println "Success", and then maybe do another println of metadata.toString(), which is going to print the metadata: basically the record's partition and offset. And if there was an error, we'll do exception.printStackTrace(). Finally, just to make sure that the producer record does indeed get sent, we'll flush() and then close() the producer. And here we go: we have our first Kafka producer that's using Avro. But, you know, let's be convinced about it; let's run it. So click here and run it... and we get an error: "StringSerializer is not an instance of Serializer". Obviously this brings us right here, and it's good that you get to see errors, because you will run into them as well. Look at the import: I imported the wrong StringSerializer. I'll remove it and re-import, and then I get a choice; I'll use the first one, org.apache.kafka.common.serialization.StringSerializer. OK, much better. Let's run our code again, and I'll keep these errors in, because errors are very instructive.
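Putting the steps above together, the finished producer looks roughly like this. This is a sketch, not verbatim course code: the class name and the height/weight values are my own placeholders, and it needs kafka-clients and kafka-avro-serializer on the classpath plus the generated Customer class, so it won't compile standalone:

```java
import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.kafka.clients.producer.*;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class KafkaAvroProducerV1 {
    public static void main(String[] args) {
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "127.0.0.1:9092");
        properties.setProperty("acks", "1");
        properties.setProperty("retries", "10");
        properties.setProperty("key.serializer", StringSerializer.class.getName());
        properties.setProperty("value.serializer", KafkaAvroSerializer.class.getName());
        properties.setProperty("schema.registry.url", "http://127.0.0.1:8081");

        KafkaProducer<String, Customer> kafkaProducer = new KafkaProducer<>(properties);
        String topic = "customer-avro";

        // Customer is the SpecificRecord class generated from customer-v1.avsc
        Customer customer = Customer.newBuilder()
                .setFirstName("John")
                .setLastName("Doe")
                .setAge(26)
                .setHeight(178f)   // placeholder value
                .setWeight(75f)    // placeholder value
                .setAutomatedEmail(false)
                .build();

        ProducerRecord<String, Customer> producerRecord =
                new ProducerRecord<>(topic, customer);

        kafkaProducer.send(producerRecord, new Callback() {
            @Override
            public void onCompletion(RecordMetadata metadata, Exception exception) {
                if (exception == null) {
                    System.out.println("Success!");
                    System.out.println(metadata.toString()); // topic-partition@offset
                } else {
                    exception.printStackTrace();
                }
            }
        });

        // flush to make sure the record is actually sent before we exit
        kafkaProducer.flush();
        kafkaProducer.close();
    }
}
```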
Even I make errors all the time, and they keep you learning; that's why I keep them on record. The course is not perfect; I'm a programmer like everyone else. So here we go: success, customer-avro-0 at offset 0. That means partition zero and offset zero. If I run this again, we'll get partition zero and offset one. All right, amazing, so it's been written. Let's double-check that, maybe using the UI. If I refresh the schemas, we now have a customer-avro-value schema, and here is our record schema right here. If you want to check the actual data, we can go to Topics and then customer-avro, and after a second we should see, here we go, our customer John Doe; it's been written there. There's one last way of checking, and as you know, that's to use a terminal. For this I open a shell in my Kafka container. All right, I'm in it, and then I run kafka-avro-console-consumer with the bootstrap server being, for me, 127.0.0.1:9092 (but change this to your IP); the topic is customer-avro; we want it from the beginning; and we want to add one property. I'm lazy, so I'm just going to copy it from here: the property to use is the schema.registry.url. So let's go ahead and run this. Here we go, Enter, and we see our John Doe using the console consumer. Amazing. Now, if you run the producer once more and go back to the terminal, we see that a third John Doe has appeared. So congratulations, you just wrote your first Java Kafka Avro producer, and that was fairly easy, right? Everything you learned before just applies: your producer record, and then a callback, flush, close. The new things are that you add the schema.registry.url as a property and use the Avro serializer. I hope you enjoyed the learning, and I will see you in the next lecture.
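For reference, the console-consumer invocation used at the end of the lecture looks roughly like this (it assumes a running Kafka broker and Schema Registry at the addresses shown, so adjust the IPs to your setup):

```sh
kafka-avro-console-consumer \
  --bootstrap-server 127.0.0.1:9092 \
  --topic customer-avro \
  --from-beginning \
  --property schema.registry.url=http://127.0.0.1:8081
```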