In this course, Apache Kafka and Spring Boot will be used to establish communication between microservices.
51:34 of on-demand video • Updated May 2019
You will learn how to create a Kafka Consumer using Spring Boot
You will learn how to create a Kafka Producer using Spring Boot
Hello, my name is Darby Lizzie, and in this video I'm going to show you how to create a simple producer and consumer for Kafka in Spring Boot. So first of all, we need to download and install Kafka on our machine. Let's search for "kafka download" and go to the official Apache Kafka page, where you can download one of these two tarballs; basically just pick one. I have already downloaded it, and after you download it you should unzip it anywhere on your machine. Preferably you should use either a Mac or a Linux-based machine, because on Windows you are going to encounter various problems. I have already unzipped it in my home folder, and for easier use I have renamed the folder from its versioned name to simply kafka. After that I'm going to start the Kafka server. So let's go to the terminal, into the folder where you decompressed the Kafka download. I go inside the kafka folder and type bin/zookeeper-server-start.sh config/zookeeper.properties. The .sh stands for a shell script; if you're using a Mac you may not need the extension, but if you're using Ubuntu or other Linux distributions you should put it there. By typing this I have basically started the ZooKeeper server, on top of which the Kafka server will run. Now let's keep it running like this, and in a new tab or a new window of the console let's start the actual Kafka server. I type bin/kafka-server-start.sh config/server.properties, and as you'll see, my Kafka server has now started. Now we need to create our own microservice for producing and consuming to a Kafka topic using a pretty simple Spring Boot application.
So let's go to IntelliJ. I use IntelliJ, but I believe you can use whatever you want: Spring Tool Suite, or Eclipse with the necessary plugins, or, if you've mastered the usage of Maven or Gradle, you can do it even from the command line. So I'll go to New Project and choose Spring Initializr, which is going to help me create the project with the necessary dependencies via the web. I'll just name the group com.simple.example and the artifact kafka-producer-consumer. I'm going to use Maven for dependency management and Java 8 for actually running this project. Next, as dependencies, I'm going to choose Web, and then, under Messaging, I'm going to choose Kafka. Next, Finish, and let's wait for the dependencies to be downloaded and imported. Right now this project is a plain Spring Boot application with nothing in it. So in order to build our Kafka producer I'm going to need two things. First of all, a REST controller through which to pass the information from the client side. I just create KafkaSimpleController and add the proper annotations: @RestController and @RequestMapping("/api/kafka"). In here I'm going to pass a model, so let's create our model, SimpleModel, and this SimpleModel will simply have a private String field1 and a private String field2. I'm going to create also the constructor, SimpleModel(String field1, String field2). I could have used Project Lombok, but instead I'm going to generate the getters and setters using the IDE's generator; Eclipse also has such a generator. So basically this is the facade, the simple object that is going to be transferred via Kafka.
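Dictated code can be hard to follow, so here is a sketch of the SimpleModel class as described above. The class, field, and accessor names follow the video; the exact formatting is my reconstruction:

```java
// SimpleModel.java - the payload object transferred via Kafka.
// Plain Java: two String fields, constructors, and generated getters/setters.
public class SimpleModel {

    private String field1;
    private String field2;

    // No-arg constructor so a JSON (de)serializer can instantiate the class.
    public SimpleModel() {
    }

    public SimpleModel(String field1, String field2) {
        this.field1 = field1;
        this.field2 = field2;
    }

    public String getField1() {
        return field1;
    }

    public void setField1(String field1) {
        this.field1 = field1;
    }

    public String getField2() {
        return field2;
    }

    public void setField2(String field2) {
        this.field2 = field2;
    }
}
```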
Now that I have created the controller, I have to create an endpoint to actually receive this simple object. I'm going to receive it as JSON, so: public void post(@RequestBody SimpleModel simpleModel), with a @PostMapping annotation. This simple model will be sent via Kafka, so let's create our Kafka configuration. Let's name the class KafkaConfig and annotate it with the @Configuration annotation, in order for it to be recognized and for its beans to be created at startup. So now let's declare a ProducerFactory<String, SimpleModel>; the String is the key and SimpleModel is the actual value type. This is going to be a bean, so I annotate the method with @Bean (the Spring one; I need to import it). In here I need a Map<String, Object> in order to hold the actual configuration of this producer factory, which is what lets the KafkaTemplate actually produce to a topic on the Kafka broker. So let's name it config = new HashMap<>(); the name doesn't really matter. Now let's put in the actual properties. The first is ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, and the server is going to be 127.0.0.1:9092; that is the default port for Kafka, but you can also change it. Then ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, which decides which serializer is going to be used for the key; I'm going to use StringSerializer.class from the Apache Kafka serialization package. The other one is the value serializer, ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, and here I'm going to use JsonSerializer.class from the spring-kafka support package. Then I'm simply going to return this config.
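Putting the dictated configuration together, here is a minimal sketch of the KafkaConfig class, assuming the spring-kafka dependency that the Initializr pulls in; the package layout and method names are my assumptions:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;

// KafkaConfig.java - beans created at startup thanks to @Configuration.
@Configuration
public class KafkaConfig {

    // Producer factory: String keys, SimpleModel values serialized as JSON.
    @Bean
    public ProducerFactory<String, SimpleModel> producerFactory() {
        Map<String, Object> config = new HashMap<>();
        // Address and default port of the local Kafka broker.
        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
        // Serializer used for the record key.
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // Serializer used for the record value (JSON, from spring-kafka support).
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(config);
    }

    // The template the controller will use to send messages to a topic.
    @Bean
    public KafkaTemplate<String, SimpleModel> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
```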
I'm going to return a new DefaultKafkaProducerFactory<>(config), which takes the config as an argument via its constructor. After that we also need the actual KafkaTemplate to perform the producing, the sending to a topic, from our application. And the KafkaTemplate bean is going to be... yes, of course, these methods should be public. Here it's not a problem because they are in the same package, but they should be public. So: KafkaTemplate<String, SimpleModel>, and I return new KafkaTemplate<>, which takes the actual producer factory as an argument. So now let's go back to KafkaSimpleController and inject the actual KafkaTemplate via the constructor: KafkaSimpleController(KafkaTemplate<String, SimpleModel> kafkaTemplate), and create the actual field for the template. And there it is, we've successfully injected this template. Now let's use this template to actually send to a topic in Kafka. So let's choose a name for the topic, let's call it myTopic, and let's pass the actual message, whatever we want to send to Kafka. After we've done this, let's run the application and test it using Postman. Hmm, I forgot the @PostMapping annotation here; let me add it and restart. Using Postman, I'm going to post a dummy JSON body to localhost:8080/api/kafka, the endpoint of my Spring Boot application. The body is going to be a simple one: "field1": "field1", just like it is, and the other data field, "field2": "field2". I'm now going to post it, and it returns 200 OK. As you can see, the actual value is posted to Kafka, and we can check that it works by opening a new tab and running bin/kafka-console-consumer.sh, choosing the bootstrap server localhost:9092.
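The controller wiring described above might look like the sketch below. The spelling of the topic name myTopic is my assumption from the spoken "my topic" in the video:

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// KafkaSimpleController.java - receives JSON and forwards it to Kafka.
@RestController
@RequestMapping("/api/kafka")
public class KafkaSimpleController {

    private final KafkaTemplate<String, SimpleModel> kafkaTemplate;

    // Constructor injection of the template bean defined in KafkaConfig.
    public KafkaSimpleController(KafkaTemplate<String, SimpleModel> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @PostMapping
    public void post(@RequestBody SimpleModel simpleModel) {
        // Send the deserialized request body to the chosen topic.
        kafkaTemplate.send("myTopic", simpleModel);
    }
}
```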
And we pass the topic, myTopic, as we have named it right here, and that's all. So if we do the posting again, we can see that we actually got field1 and field2 in the consumer. Let's try it again: I change the field values to something else and send it to the Kafka topic, and as you can see, the output here changed. And that's all: that's producing with a Spring Boot application and Apache Kafka. This is just a simple producer; it would be more instructive to demonstrate sending data through a Kafka setup that has multiple partitions spread across different machines. Anyway, that's all for this video. Thank you for watching, and if you liked it, please subscribe.