Building a Data Mart with Pentaho Data Integration

A step-by-step tutorial that takes you through the creation of an ETL process to populate a Kimball-style star schema
3.9 (7 ratings)
180 students enrolled
Created by Packt Publishing
Last updated 1/2015
English
30-Day Money-Back Guarantee
Includes:
  • 2 hours on-demand video
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What Will I Learn?
  • Create a star schema
  • Populate and maintain slowly changing dimensions of type 1 and type 2
  • Load fact and dimension tables in an efficient manner
  • Use a columnar database to store the data for the star schema
  • Analyze the quality of the data in an agile manner
  • Implement logging and scheduling for the ETL process
  • Get an overview of the whole process: from source data to the end user analyzing the data
  • Learn how to auto-generate data for a date dimension
Requirements
  • You need to have a basic understanding of star schemas and Pentaho Data Integration to take the next step: putting everything into practice.
Description

Companies store a lot of data, but in most cases, it is not available in a format that makes it easily accessible for analysis and reporting tools. Ralph Kimball realized this a long time ago, so he paved the way for the star schema.
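
As a rough illustration of the target structure, a star schema pairs one central fact table with the dimension tables it references through surrogate keys. The SQL sketch below uses hypothetical table and column names, not the ones from this course:

    CREATE TABLE dim_date (
        date_tk     INT PRIMARY KEY,   -- surrogate key
        the_date    DATE,
        year        INT,
        month       INT,
        day         INT
    );

    CREATE TABLE dim_customer (
        customer_tk INT PRIMARY KEY,   -- surrogate key
        customer_id INT,               -- natural (business) key
        name        VARCHAR(100),
        city        VARCHAR(100)
    );

    CREATE TABLE fact_sales (
        date_tk     INT REFERENCES dim_date,
        customer_tk INT REFERENCES dim_customer,
        quantity    INT,
        amount      DECIMAL(10, 2)
    );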

Building a Data Mart with Pentaho Data Integration walks you through the creation of an ETL process that populates a data mart for a fictional company. This course will show you, step by step, how to source the raw data and prepare it for the star schema. The practical approach of this course will get you up and running quickly and will explain the key concepts in an easy-to-understand manner.

Building a Data Mart with Pentaho Data Integration teaches you how to source raw data with Pentaho Kettle and transform it so that the output can be a Kimball-style star schema. After sourcing the raw data with our ETL process, you will quality-check the data using an agile approach. Next, you will learn how to load slowly changing dimensions and the fact table. The star schema will reside in a column-oriented database, so you will learn about bulk-loading the data whenever possible. You will also learn how to create an OLAP schema and easily analyze the output of your ETL process.

By covering all the essential topics in a hands-on approach, this course will put you in a position to create your own ETL processes within a short span of time.

Who is the target audience?
  • If you are eager to learn how to create an ETL process to populate a star schema, and you want to finish the course in a position to apply your new knowledge to your specific business requirements, then Building a Data Mart with Pentaho Data Integration is for you.
Curriculum For This Course
25 Lectures 02:01:31
Getting Started
3 Lectures 18:25

Get an insight into the raw data that we will be working with in this video tutorial.

The Second-hand Lens Store Sample Data
06:49

Create a star schema derived from the raw data.

The Derived Star Schema
04:29

We will create the required databases for our project, add JDBC drivers to PDI, and create JNDI connections.

Setting up Our Development Environment
07:07
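
For reference, PDI reads JNDI connection definitions from the simple-jndi/jdbc.properties file in its installation directory. The entry below is a hypothetical PostgreSQL example, not the course's actual connection details:

    # Hypothetical entry in data-integration/simple-jndi/jdbc.properties
    dwh/type=javax.sql.DataSource
    dwh/driver=org.postgresql.Driver
    dwh/url=jdbc:postgresql://localhost:5432/dwh
    dwh/user=etl_user
    dwh/password=etl_password
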
Agile BI – Creating ETLs to Prepare Joined Data Set
3 Lectures 12:27

Create an ETL transformation that imports your raw data so that you can apply further manipulations downstream and output the data to the data mart.

Importing Raw Data
03:22

We will learn how to easily make sure that the data types of the ETL output step are in sync with the database table column types.

Exporting Data Using the Standard Table Output Step
04:33

Loading huge amounts of data in the traditional way takes too long; speed it up by using the bulk loader.

Preview 04:32
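
The course demonstrates this with PDI's bulk loader step. As a rough SQL equivalent, many databases expose a bulk COPY command that bypasses row-by-row INSERTs; PostgreSQL-style syntax with hypothetical names:

    -- Bulk-load a flat file into the fact table in a single statement.
    COPY fact_sales FROM '/tmp/fact_sales.csv' WITH (FORMAT csv, HEADER true);
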
Agile BI – Building OLAP Schema, Analyzing Data, and Implementing Required ETL Improvements
3 Lectures 11:29

In this first step to Agile ETL development, you will learn how to create a Pentaho Analysis Model so that you can analyze the data later on in Pentaho Analyzer.

Creating a Pentaho Analysis Model
03:25

A very important point is to understand the quality of the data: are there any duplicates, misspellings, and so on? We will find such problems and feed this new knowledge back into the ETL design.

Analyzing the Data Using the Pentaho Analyzer
03:49
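
The course does this check interactively in Pentaho Analyzer. As a minimal SQL sketch of the same kind of quality check (hypothetical staging table), duplicates can be surfaced with a grouped count:

    -- Natural keys that occur more than once are candidates
    -- for de-duplication in the ETL.
    SELECT customer_id, COUNT(*) AS occurrences
    FROM stg_customer
    GROUP BY customer_id
    HAVING COUNT(*) > 1;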

Learn how to implement ETL improvements to iron out the data problems found.

Improving Your ETL for Better Data Quality
04:15
Slowly Changing Dimensions
3 Lectures 17:03

Learn how to populate a simple dimension.

Creating a Slowly Changing Dimension of Type 1 Using the Insert/Update Step
06:47
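
Conceptually, PDI's Insert/Update step compares incoming rows with the dimension table and either inserts them or overwrites the changed attributes in place. A hedged PostgreSQL-style sketch with hypothetical names:

    -- SCD Type 1: overwrite changed attributes, keeping no history.
    UPDATE dim_customer d
    SET    city = s.city
    FROM   stg_customer s
    WHERE  d.customer_id = s.customer_id
      AND  d.city <> s.city;
    -- Rows whose customer_id is not yet in dim_customer would be inserted.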

Learn how to populate a simple dimension and make it future-proof.

Creating a Slowly Changing Dimension of Type 1 Using the Dimension Lookup/Update Step
04:58

Learn how to keep historic versions in your dimension table.

Preview 05:18
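
PDI's Dimension Lookup/Update step manages this versioning automatically. In plain SQL terms (hypothetical names), a type 2 change expires the current row and inserts a new version:

    -- Close off the current version of the changed row...
    UPDATE dim_customer
    SET    valid_to = CURRENT_DATE, current_flag = 'N'
    WHERE  customer_id = 42 AND current_flag = 'Y';

    -- ...and insert the new version with an open-ended validity range.
    INSERT INTO dim_customer
        (customer_id, city, valid_from, valid_to, current_flag)
    VALUES
        (42, 'New City', CURRENT_DATE, DATE '9999-12-31', 'Y');
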
Populating the Date Dimension
3 Lectures 16:10

To make our date dimension transformation more dynamic, we will allow users to define a start and an end date to specify the period.

Defining Start and End Date Parameters
05:17

Based on the provided parameters, the number of days between the start and end date will be calculated. This figure will be used to generate a data set with the same number of rows.

Preview 04:26
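
In PDI this is typically done with a Generate Rows step. As a SQL-side sketch of the same idea (PostgreSQL-style, with hypothetical parameter values), one row per day can be produced like this:

    -- One row for every day between the start and end date parameters.
    SELECT d::date AS the_date
    FROM generate_series(DATE '2010-01-01',
                         DATE '2015-12-31',
                         INTERVAL '1 day') AS g(d);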

In this part, you will learn how to derive various date attributes, such as year, week, day, and so on, from the input date.

Auto-generating Year, Month, and Day
06:27
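
The course derives these attributes with PDI calculator steps; the SQL below sketches equivalent derivations, assuming a hypothetical staging table with a the_date column:

    -- Derive the usual date-dimension attributes from a single date column.
    SELECT the_date,
           EXTRACT(YEAR  FROM the_date) AS year,
           EXTRACT(MONTH FROM the_date) AS month,
           EXTRACT(DAY   FROM the_date) AS day,
           EXTRACT(WEEK  FROM the_date) AS week   -- ISO week number
    FROM   dim_date_staging;
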
Creating the Fact Transformation
3 Lectures 14:28

Learn how to efficiently create an input query for your fact transformation.

Sourcing Raw Data for the Fact Table
03:52

Learn how to configure the step to look up the SCD type 1 keys.

Preview 04:28

Learn how to configure the step to look up the SCD type 2 keys.

Looking Up the Slowly Changing Dimension Type 2 Key
06:08
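
For a type 2 dimension, the lookup must match both the natural key and the validity window that contains the transaction date. A hedged SQL sketch with hypothetical names:

    -- Pick the dimension version that was valid when the sale happened.
    SELECT s.sale_id,
           d.customer_tk
    FROM   stg_sales    s
    JOIN   dim_customer d
      ON   d.customer_id = s.customer_id
     AND   s.sale_date  >= d.valid_from
     AND   s.sale_date  <  d.valid_to;
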
Orchestration
2 Lectures 10:29

In our setup, dimensions can be loaded in parallel; therefore, we can create an ETL job that runs their transformations simultaneously.

Loading Dimensions in Parallel
06:20

We will create the main job, which runs all the required child jobs and transformations.

Creating Master Jobs
04:09
ID-based Change Data Capture
2 Lectures 09:46

In this section, you will learn how new data can be loaded into the data mart automatically using the Change Data Capture (CDC) approach.

Implementing Change Data Capture (CDC)
04:58
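
An ID-based CDC pass fetches only the source rows whose key is higher than the highest key already loaded. A minimal SQL sketch with hypothetical names (in PDI, the maximum is typically read in one step and passed to the input query as a variable):

    -- Fetch only the rows that arrived since the last load.
    SELECT *
    FROM   src_sales
    WHERE  sale_id > (SELECT COALESCE(MAX(sale_id), 0) FROM fact_sales);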

We will define the order of execution for all the transformations involved.

Creating a CDC Job Flow
04:48
Final Touches: Logging and Scheduling
3 Lectures 11:14

We will create a dedicated database schema for logging.

Setting up a Dedicated DB Schema
01:22

Pentaho Kettle features built-in logging. You will learn how to configure it.

Setting up Built-in Logging
04:22

Learn how to schedule a daily run of your ETL process.

Scheduling on the Command Line
05:30
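
PDI jobs are run from the command line with the kitchen script. A hypothetical crontab entry scheduling a nightly run at 2 a.m. might look like this (the paths and job name are assumptions):

    # Run the master ETL job every night at 02:00, appending output to a log file.
    0 2 * * * /opt/pdi/kitchen.sh -file=/opt/etl/master_job.kjb -level=Basic >> /var/log/etl_run.log 2>&1
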
About the Instructor
Packt Publishing
3.9 Average rating
5,146 Reviews
40,483 Students
408 Courses
Tech Knowledge in Motion

Packt has been committed to developer learning since 2004. A lot has changed in software since then, but Packt has remained responsive to these changes, continuing to look forward at the trends and tools defining the way we work and live, and at how to put them to work.

With an extensive library of content - more than 4000 books and video courses - Packt's mission is to help developers stay relevant in a rapidly changing world. From new web frameworks and programming languages to cutting-edge data analytics and DevOps, Packt takes software professionals in every field to what's important to them now.

From skills that will help you develop and future-proof your career to immediate solutions to everyday tech challenges, Packt is a go-to resource to make you a better, smarter developer.

Packt Udemy courses continue this tradition, bringing you comprehensive yet concise video courses straight from the experts.