Mastering Data Integration (ETL) with Pentaho Kettle (PDI)
Why should I take this course?
Isn't it obvious? Don't you want to be the best ETL and Pentaho Kettle developer?
The course is the outcome of my 10 years of experience with IT projects, business intelligence, and data integration with Pentaho Kettle.
I developed the course because I want to share my knowledge with you.
The best way to learn software and technical concepts is via an online course,
structured by a real developer with actual experience who guides you through his (my) path to knowledge.
I will help you master ETL with Pentaho Kettle.
What is the course about? The course takes you from the very beginning and turns you into a master of Pentaho Kettle.
The main dish of the course is a hands-on walk-through of a real Pentaho Kettle project, with a case study and tips, taking you from easy steps to more and more complex ones, layer by layer, as you go forward. That way you can learn Pentaho Kettle as a beginner and also become an expert as you go along (and practice).
Also I cover
Structure of the course
The course is divided into 4 main sections:
Section 1: Theory and concepts of data integration in general
(if you are already an ETL developer you can skip this)
Section 2: Setting up the environment
Install and operate Pentaho Kettle for data integration,
including database management and profiling the database as a source.
We install PDI, Navicat (to manage the database), JDBC drivers, the JRE, the Sakila example database, MySQL and more.
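As a taste of the kind of environment check you will end up doing in this section, here is a minimal sketch (not part of the course material) of verifying the Sakila database from Java over JDBC. It assumes a local MySQL server with the Sakila sample schema loaded and the MySQL Connector/J driver on the classpath; the URL, user and password are placeholders you would replace with your own.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SakilaConnectionCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder URL and credentials -- adjust to your local MySQL install.
        String url = "jdbc:mysql://localhost:3306/sakila";
        try (Connection con = DriverManager.getConnection(url, "root", "password");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM film")) {
            if (rs.next()) {
                // A row count confirms the Sakila schema is reachable as a source.
                System.out.println("film rows: " + rs.getInt(1));
            }
        }
    }
}
```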
Section 3: The main dish
The walk-through continues until the successful end of the project, covering some 80% of the steps used by Pentaho Kettle, so you can master data integration.
You can see all the steps in the curriculum (there are too many to list here).
Just as an example (a short code sketch follows this list):
a. Connect to various data sources (databases, files…)
b. Manipulate the data:
changing strings, dates and calculations, joins, lookups, slowly changing dimensions, and considerations of when and how to use different steps.
c. Work with variables
d. Output steps (bulk load, table output, update/insert, file output…)
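To give a feel for how a finished transformation can be run outside of Spoon, here is a minimal sketch using the PDI Java API. It assumes the kettle-core and kettle-engine libraries are on the classpath; the file name dim_date.ktr and the variable OUTPUT_DIR are illustrative placeholders, not files from the course.

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunTransformation {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();                        // initialize the Kettle engine
        TransMeta meta = new TransMeta("dim_date.ktr");  // placeholder .ktr file
        Trans trans = new Trans(meta);
        trans.setVariable("OUTPUT_DIR", "/tmp");         // example kettle variable (placeholder)
        trans.execute(null);                             // no extra command-line arguments
        trans.waitUntilFinished();
        if (trans.getErrors() > 0) {
            throw new RuntimeException("Transformation finished with errors");
        }
    }
}
```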
Section 4: Wrapping up - go to production
You will learn how to wrap everything into a final job, run it with a Kitchen batch file, and validate and secure the stream.
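The Kitchen batch file covered in the production section essentially launches a job from the command line; a rough equivalent in Java, sketched below under the assumption that the PDI libraries are on the classpath, runs a .kjb job file directly (final_job.kjb is a placeholder name).

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobMeta;

public class RunFinalJob {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();                              // initialize the Kettle engine
        JobMeta jobMeta = new JobMeta("final_job.kjb", null);  // placeholder .kjb file, no repository
        Job job = new Job(null, jobMeta);                      // run the job outside a repository
        job.start();
        job.waitUntilFinished();
        if (job.getErrors() > 0) {
            throw new RuntimeException("Job finished with errors");
        }
    }
}
```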
|Section 1: Introduction|
course promo intro
|Section 2: Installations|
The list of software we require in order to run and work with Pentaho ETL
The JRE is required by Pentaho in order to run
This lecture shows how to install Pentaho Data Integration
This lecture shows how to install Navicat
Install the Sakila database (and Notepad++)
This lecture shows how to install Power Architect - a profiling tool for databases
This lecture shows how to install Expresso, a tool that acts as a wizard for creating regular expressions
|Section 3: Hands on - Pentaho|
Pentaho PDI getting started
kettle variables part 1
kettle variables part 2
kettle database connection
|Section 4: Software Walkthroughs|
This lecture is about profiling a database with Power Architect
|Section 5: The Date Dimension|
dim date intro
generate rows part 1
generate rows part 2
generate rows part 3
the add sequence
the select values
the mapping / string cut / string concat
the table output
the string operation
dim date summary
|Section 6: dim time|
dim time intro
arrange steps and create hours and minutes
the Cartesian step
Cartesian customer example
the modified java script value
the field set / filter rows / dummy steps
dim time summary
|Section 7: dim staff|
dim staff intro
the table input
the data grid / value mapper
consideration 1 - historical data in dimensions
consideration 2 - truncate or update table
consideration 3 - be like mike - deleted rows on dimension
|Section 8: dim store|
dim store intro
the database lookup
the stream lookup
the insert /update step
the system info
|Section 9: dim customer|
dim customer intro
control "changed data only" input
down it goes with the stream
slow changing dimension - concept
slow changing dimension - example
|Section 10: dim film|
dim film intro
the number range
the merge join / sort rows / value null
the denormaliser / split fields to rows
|Section 11: fact rentals|
fact rental intro
the inventory - film and store id
slow changing dimension on fact table
counter and date diff calculation
key date handling
the time dimension check
error handling step
|Section 12: Go to production|
production steps intro
the final job
kitchen batch file
validation - secure the stream part 1
validation - secure the stream part 2
|Section 13: ETL concepts and sources|
what is ETL
the data warehouse concept
ETL tools comparison
data sources part 1
data sources part 2
|Section 14: What's next...|
need more input
this is a goodbye
My name is Itamar Steinberg.
I have been in the field of information technology for more than 15 years now.
I have a Master's degree (MBA) in information technology and management.
My BA is also in the area of information technology, from the University of Manchester.
During those years I served as a manager and information technology expert,
especially in the field of business intelligence and data integration.
All that time I used Pentaho Kettle as my leading data integration tool.
I started as a developer and moved up through development team manager and head of the application department - responsible for all software, business processes and implementations -
as well as project manager; in my last role as an employee I was the CIO of a large company.
I dealt with ERP, CRM and, of course, business intelligence across all aspects of running a business: sales, transportation, customer service, imports, inventory, suppliers and manufacturing.
All of these areas gave me a unique perspective on business processes and how to analyze a company by looking at one large picture – that is the BI.
Of course, in order to combine all of those systems together you'll need data integration.
I have more than eight years of experience working with Pentaho Kettle.
Six years ago I decided to start my own company - Inflow Systems - and focus on business intelligence and data integration.
During those six years I, as CEO, led large business intelligence projects.
I hired several very gifted employees, specifically oriented to data integration and business intelligence. We have developed dozens of projects from scratch at large companies like Alcatel-Lucent (embedded solution), online gaming, binary options (stock market) and traditional businesses like food supplements and organic food.
Today, I want to pass my knowledge on to you, because I believe that sharing is the right way to go. During my own learning phase I read a lot of books and struggled with the technology, and I think I can make your life easier.