From 0 to 1: The Oozie Orchestration Framework

A first-principles guide to working with Workflows, Coordinators and Bundles in Oozie
4.1 (198 ratings)
Course Ratings are calculated from individual students’ ratings and a variety of other signals, like age of rating and reliability, to ensure that they reflect course quality fairly and accurately.
3,611 students enrolled
Created by Loony Corn
Last updated 2/2018
English [Auto]
This course includes
  • 4 hours on-demand video
  • 36 downloadable resources
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What you'll learn
  • Install and set up Oozie
  • Configure Workflows to run jobs on Hadoop
  • Configure time-triggered and data-triggered Workflows
  • Configure data pipelines using Bundles
Course content
24 lectures, 04:01:46 total
+ A Brief Overview Of Oozie
2 lectures 22:01

A very first principles discussion of why you would want to use Oozie.

Preview 11:16

Basic Oozie component overview, and where Oozie fits in the Hadoop ecosystem.

Oozie architectural components
+ Oozie Install And Set Up
1 lecture 16:28

Time to install Oozie and run some workflows. Do use the attached text file which has detailed instructions and all the commands you'll need. 

Installing Oozie on your machine
+ Workflows: A Directed Acyclic Graph Of Tasks
7 lectures 01:00:41

Run a simple MapReduce job using the command line. If you're comfortable running MR jobs you can simply skip this!

The attached zip file has a lot of MR examples; we run just the simplest one.

Preview 04:40

Workflows are the basic Oozie building blocks; this is a brief introduction to how they work.

Preview 06:12

It's real when you can run stuff! Running our very first MapReduce Workflow on Oozie.

Running our first Oozie Workflow MapReduce application

The properties specified to configure a Workflow.

The file
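Workflow runtime properties are typically supplied in a `job.properties` file passed to the Oozie CLI. A minimal sketch of such a file, assuming a local pseudo-distributed cluster (host names, ports and paths below are placeholders, not from the course):

```properties
# Minimal job.properties sketch -- adjust hosts and paths to your cluster
nameNode=hdfs://localhost:8020
jobTracker=localhost:8032
queueName=default

# HDFS directory that contains the workflow.xml for this job
oozie.wf.application.path=${nameNode}/user/${user.name}/apps/map-reduce
```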

The actual code (well, it's XML, but that's code as far as Oozie is concerned).

The workflow.xml file
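A minimal sketch of what a MapReduce `workflow.xml` might look like: a start node, one map-reduce action with ok/error transitions, a kill node and an end node. Class names and paths are illustrative placeholders, not the course's example:

```xml
<workflow-app name="first-mr-wf" xmlns="uri:oozie:workflow:0.5">
    <start to="mr-node"/>
    <action name="mr-node">
        <map-reduce>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.mapper.class</name>
                    <value>org.example.MyMapper</value>
                </property>
                <property>
                    <name>mapred.reducer.class</name>
                    <value>org.example.MyReducer</value>
                </property>
                <property>
                    <name>mapred.input.dir</name>
                    <value>/user/${wf:user()}/input</value>
                </property>
                <property>
                    <name>mapred.output.dir</name>
                    <value>/user/${wf:user()}/output</value>
                </property>
            </configuration>
        </map-reduce>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>MR job failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
```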
A Shell action Workflow
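A Shell action follows the same action-node pattern; a sketch, assuming a script shipped alongside the workflow (the script name is a placeholder):

```xml
<workflow-app name="shell-wf" xmlns="uri:oozie:workflow:0.5">
    <start to="shell-node"/>
    <action name="shell-node">
        <shell xmlns="uri:oozie:shell-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <exec>myscript.sh</exec>
            <argument>hello</argument>
            <!-- ship the script from the workflow directory to the task -->
            <file>myscript.sh#myscript.sh</file>
            <capture-output/>
        </shell>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Shell action failed</message>
    </kill>
    <end name="end"/>
</workflow-app>
```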

Workflows have advanced control structures to determine which action to execute and ways to specify global configuration for all actions.

Control nodes, Action nodes and Global configurations within Workflows
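As an example of a control structure, a decision node picks the next action with a switch; this sketch branches on whether an input directory has been fully written (node names and the path are hypothetical):

```xml
<!-- Sits between other nodes inside a <workflow-app> -->
<decision name="check-input">
    <switch>
        <!-- run the processing action only if the success marker exists -->
        <case to="process-data">${fs:exists('/data/input/_SUCCESS')}</case>
        <default to="skip"/>
    </switch>
</decision>
```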
+ Coordinators: Managing Workflows
6 lectures 01:00:07

Coordinators manage Workflows, running them at a specified time and frequency provided the input data is available.

Running our first Coordinator application

A time-triggered Coordinator is very similar to a Unix cron job.

A time-triggered Coordinator definition
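A sketch of a purely time-triggered `coordinator.xml` that materializes a Workflow once an hour for a day (the dates and the application path are placeholders):

```xml
<coordinator-app name="hourly-coord" frequency="${coord:hours(1)}"
                 start="2018-01-01T00:00Z" end="2018-01-02T00:00Z"
                 timezone="UTC" xmlns="uri:oozie:coordinator:0.4">
    <action>
        <workflow>
            <app-path>${nameNode}/user/${user.name}/apps/my-workflow</app-path>
        </workflow>
    </action>
</coordinator-app>
```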

Oozie allows pretty fine-grained control over the running of Workflows: you can specify timeouts, throttling, concurrency and the execution order of Workflows materialized by the same Coordinator.

Coordinator control mechanisms
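These knobs live in a `<controls>` block directly inside `<coordinator-app>`; the values here are illustrative, not recommendations:

```xml
<controls>
    <!-- minutes a materialized action waits for its input before timing out -->
    <timeout>10</timeout>
    <!-- how many actions may run at the same time -->
    <concurrency>2</concurrency>
    <!-- order for queued actions: FIFO, LIFO or LAST_ONLY -->
    <execution>FIFO</execution>
    <!-- maximum number of actions allowed to sit in WAITING state -->
    <throttle>5</throttle>
</controls>
```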

Workflow actions might depend on input data. Coordinators can be configured so that Workflows are not launched until the right data is available for them. Such triggers are called data availability triggers.

Data availability triggers

A running example of a Coordinator which launches multiple Workflows, some of which have input data available and others which do not.

Running a Coordinator which waits for input data

Configuring data input triggers is slightly complicated. We have to make sure that we specify the right data instances that the Workflow is interested in.

Coordinator configuration to use data triggers
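A sketch of a data-triggered Coordinator: a dataset declares where instances live and what marks them complete, an input event picks the instance the Workflow cares about, and the resolved path is handed to the Workflow. All paths and dates are placeholders:

```xml
<coordinator-app name="data-coord" frequency="${coord:days(1)}"
                 start="2018-01-01T00:00Z" end="2018-01-31T00:00Z"
                 timezone="UTC" xmlns="uri:oozie:coordinator:0.4">
    <datasets>
        <dataset name="logs" frequency="${coord:days(1)}"
                 initial-instance="2018-01-01T00:00Z" timezone="UTC">
            <uri-template>${nameNode}/data/logs/${YEAR}/${MONTH}/${DAY}</uri-template>
            <!-- instance is considered available once this flag file exists -->
            <done-flag>_SUCCESS</done-flag>
        </dataset>
    </datasets>
    <input-events>
        <data-in name="input" dataset="logs">
            <!-- the instance for the current materialization time -->
            <instance>${coord:current(0)}</instance>
        </data-in>
    </input-events>
    <action>
        <workflow>
            <app-path>${nameNode}/user/${user.name}/apps/my-workflow</app-path>
            <configuration>
                <property>
                    <name>inputDir</name>
                    <value>${coord:dataIn('input')}</value>
                </property>
            </configuration>
        </workflow>
    </action>
</coordinator-app>
```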
+ Bundles: A Collection Of Coordinators For Data Pipelines
2 lectures 20:26

Bundles can be used to define data pipelines where multiple Coordinators need to be managed together as a single Oozie job.

Bundles and why we need them

The Bundle kick-off time determines when the Bundle's Coordinators start running on Oozie.

The Bundle kick-off time
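A sketch of a `bundle.xml` grouping two Coordinators under one kick-off time (names, paths and the date are placeholders):

```xml
<bundle-app name="my-bundle" xmlns="uri:oozie:bundle:0.2">
    <controls>
        <!-- no Coordinator in this Bundle starts before this time -->
        <kick-off-time>2018-02-01T00:00Z</kick-off-time>
    </controls>
    <coordinator name="coord-1">
        <app-path>${nameNode}/user/${user.name}/apps/coord-1</app-path>
    </coordinator>
    <coordinator name="coord-2">
        <app-path>${nameNode}/user/${user.name}/apps/coord-2</app-path>
    </coordinator>
</bundle-app>
```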
+ Installing Hadoop in a Local Environment
3 lectures 36:02

Hadoop has 3 different install modes - Standalone, Pseudo-distributed and Fully Distributed. Get an overview of when to use each.

Hadoop Install Modes

How to set up Hadoop in the standalone mode. Windows users need to install a Virtual Linux instance before this video. 

Hadoop Install Step 1 : Standalone Mode

Set up Hadoop in the Pseudo-Distributed mode. All Hadoop services will be up and running! 

Hadoop Install Step 2 : Pseudo-Distributed Mode
+ Appendix
2 lectures 24:23

If you are unfamiliar with software that requires working in a shell/command-line environment, this video will be helpful for you. It explains how to update the PATH environment variable, which is needed to set up most Linux/Mac shell-based software.

[For Linux/Mac OS Shell Newbies] Path and other Environment Variables
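Updating PATH boils down to appending the tool's `bin` directory; a sketch, assuming Hadoop was unpacked to `/usr/local/hadoop` (adjust to your install location):

```shell
# Append Hadoop's bin directory to PATH for the current shell session.
# /usr/local/hadoop is an assumed install location, not from the course.
export PATH="$PATH:/usr/local/hadoop/bin"

# Verify the directory is now on the PATH (prints it if present)
echo "$PATH" | tr ':' '\n' | grep -x '/usr/local/hadoop/bin'

# To make this permanent, add the export line above to ~/.bashrc
# (Linux) or ~/.bash_profile (Mac).
```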

Hadoop is built for Linux/Unix systems. If you are on Windows, you can set up a Linux virtual machine on your computer and use that for the install.

Setting up a Virtual Linux Instance - For Windows Users
  • Students should have basic knowledge of the Hadoop eco-system and should be able to run MapReduce jobs on Hadoop

Prerequisites: Working with Oozie requires some basic knowledge of the Hadoop eco-system and running MapReduce jobs

Taught by a team which includes 2 Stanford-educated ex-Googlers and 2 ex-Flipkart Lead Analysts. This team has decades of practical experience working with large-scale data processing jobs.

Oozie is like the formidable, yet super-efficient admin assistant who can get things done for you, if you know how to ask

Let's parse that 

formidable, yet super-efficient: Oozie is formidable because it is configured entirely in XML, which is hard to debug when things go wrong. However, once you've figured out how to work with it, it's like magic. Complex dependencies, managing a multitude of jobs on different time schedules, and managing entire data pipelines are all made easy with Oozie.

get things done for you: Oozie allows you to manage Hadoop jobs as well as Java programs, scripts and any other executable with the same basic set-up. It manages your dependencies cleanly and logically.

if you know how to ask: knowing the right configuration parameters that get the job done is the key to mastering Oozie.

What's Covered: 

Workflow Management: Workflow specifications, Action nodes, Control nodes, Global configuration, real examples with MapReduce and Shell actions which you can run and tweak

Time-based and data-based triggers for Workflows: Coordinator specification, mimicking simple cron jobs, specifying time and data availability triggers for Workflows, dealing with backlog, running time-triggered and data-triggered Coordinator actions

Data Pipelines using Bundles: Bundle specification, the kick-off time for bundles, running a bundle on Oozie

Who this course is for:
  • Yep! Engineers, analysts and sysadmins who are interested in big data processing on Hadoop
  • Nope! Beginners who have no knowledge of the Hadoop eco-system