Realtime Data Processing with Pentaho and Apache Kafka
What you'll learn
- Realtime data processing using Pentaho Data Integration and Apache Kafka
- Techniques demonstrated through a demo project based on the banking domain

Requirements
- A computer running Windows, Linux, or macOS
- Java 1.8 installed
- A basic understanding of Pentaho Data Integration and Apache Kafka
Learn how to define a Pentaho Kafka Producer and Consumer to implement a quick solution for deriving insights. The course is built around a demo project in the banking domain: as a student, you will see a practical application of how Apache Kafka and Pentaho can be used to implement a realtime data streaming solution that discovers the market demand for loans or the total page visit count in real time.
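Pentaho's Kafka Producer and Consumer steps wrap the standard Kafka client, so the consume-and-aggregate logic behind the page-visit-count use case can be sketched without a broker. The following is a minimal, hypothetical Python illustration that uses an in-memory queue as a stand-in for a Kafka topic; the event shape and page names are assumptions for illustration only, not part of the course material.

```python
from collections import Counter
from queue import Queue

def consume_page_visits(topic: Queue) -> Counter:
    """Drain the stand-in 'topic' and tally visits per page,
    mimicking what a Kafka Consumer + aggregation step would do."""
    counts = Counter()
    while not topic.empty():
        event = topic.get()          # e.g. {"page": "/loans"}
        counts[event["page"]] += 1
    return counts

# Producer side: publish a few visit events to the stand-in topic.
topic = Queue()
for page in ["/loans", "/home", "/loans", "/loans"]:
    topic.put({"page": page})

counts = consume_page_visits(topic)
print(counts["/loans"])  # 3 visits to the (hypothetical) loan page
```

In a real deployment the queue is replaced by a Kafka topic, the producer loop by the Pentaho Kafka Producer step, and the tally by a streaming transformation fed from the Kafka Consumer step.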
Content and Overview
Through this course, comprising several lectures with English subtitles/captions, quiz chapters, and additional resources, you will:
- Understand what realtime data processing is, and when and how to implement such a solution
- Gain confidence in implementing realtime data processing solutions using Pentaho and Kafka
You can test the knowledge gained from the sessions by attending the quizzes, and every use case mentioned in the course is explained with a demo session, enabling you to practice your newly learned skills.
I will add more content to this course as and when possible.
You can download the Pentaho transformations used in the demo sessions (attached as resource material to a lecture in this course) to practice at your end.
Learners who complete this course will gain the knowledge and confidence to implement a realtime data streaming solution with Pentaho and Apache Kafka in their own projects.
Who this course is for:
- ETL developers
- Realtime data streaming solution developers
- Engineering managers
- Anyone who wishes to understand how to use a UI-based ETL tool to integrate with Apache Kafka
My name is Rajkumar, and I am excited to share what I have learned from my industry experience.
With more than a decade of experience in IT, I have spent the majority of my time working with "DATA": data modeling, profiling, cleansing, transformation, storage, retrieval, optimization, governance, mining, and reporting.
I have played various roles in my career, including Developer, Data Modeler, Tester, Project Lead, Product Consultant, Data Architect, ETL Specialist, Solution Architect, and Release Manager.
To sum up, I am absolutely passionate about anything to do with "DATA", and I look forward to sharing my passion and knowledge with you!