Hadoop MAPREDUCE in Depth | A Real-Time course on Mapreduce

A to Z of Hadoop MAPREDUCE - From Scratch to its Real Time Implementation with HANDS-ON Coding of every component of MR
Bestseller
4.3 (363 ratings)
3,211 students enrolled
Last updated 3/2020
English
This course includes
  • 6 hours on-demand video
  • 23 downloadable resources
  • Full lifetime access
  • Access on mobile and TV
  • Assignments
  • Certificate of Completion
What you'll learn
  • Every concept that comes under the Hadoop MapReduce framework, from SCRATCH to LIVE PROJECT implementation.
  • Learn to write MapReduce code in a real-time working environment.
  • Understand the working of each and every component of Hadoop MapReduce with HANDS-ON practicals.
  • Override the default implementations of Java classes in MapReduce and code them according to your requirements.
  • ADVANCED-level MapReduce concepts that are not even readily available on the Internet.
  • Real-time MapReduce case studies asked in Hadoop interviews, each with its MapReduce code run on a cluster.
Course content
54 lectures • 05:55:41
+ Introduction
6 lectures 23:20

This first lecture of the course introduces Hadoop's core processing framework, MapReduce.

Preview 03:11

An announcement lecture asking you to rate this Hadoop MapReduce course.

Announcement
01:17

This video explains the difference between the traditional approach and the Hadoop approach to parallel processing of big data, and shows how Hadoop handles most of these tasks by itself.

Preview 02:26

This lecture explains the basic flow of a Hadoop MapReduce program.

Basic Flow of a Mapreduce program
04:41

Continuing from the previous lecture, this video explains the basic flow of a Hadoop MapReduce program with an example.

Mapreduce Program flow with Example
04:48

This video explains the basic file input format types in Hadoop MapReduce. Hadoop provides six FileInputFormat types by default, and they can be used directly in a MapReduce program.

Types of File Input formats in Mapreduce
06:57
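
For orientation, here is a minimal, hedged sketch of how one of these built-in input formats might be selected in a driver; the class name InputFormatDemo and the argument paths are illustrative, not taken from the course:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Illustrative driver: swaps the default TextInputFormat for
    // KeyValueTextInputFormat, which splits each line on the first tab into a
    // key and a value instead of delivering (byte offset, whole line).
    public class InputFormatDemo {
        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "input format demo");
            job.setJarByClass(InputFormatDemo.class);
            job.setInputFormatClass(KeyValueTextInputFormat.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory
            FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }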
+ Default structure of various classes in Mapreduce
6 lectures 31:18

This lecture explains the default structure of the Mapper class in a Hadoop MapReduce program.

Mapper Class structure
06:51
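
As a rough sketch of what such a Mapper subclass looks like (the course's own code is in the resources; the name MyMapper and the word-count-style tokenizing logic are illustrative assumptions):

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Generic parameters: <input key, input value, output key, output value>.
    public class MyMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // 'key' is the byte offset of the line, 'value' is the line itself.
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);   // emit (word, 1)
                }
            }
        }
    }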

This lecture explains the default structure of the Reducer class in a Hadoop MapReduce program.

Reducer Class structure
03:23
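
A matching minimal Reducer sketch (again illustrative, summing the integer values for each key; MyReducer is not the course's class name):

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Reducer;

    // Generic parameters: <input key, input value, output key, output value>;
    // the input types must match the Mapper's output types.
    public class MyReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

        private final IntWritable result = new IntWritable();

        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {   // all values grouped under one key
                sum += v.get();
            }
            result.set(sum);
            context.write(key, result);      // emit (key, aggregated value)
        }
    }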

This lecture explains the default structure of the Driver class in a Hadoop MapReduce program.

Driver Class structure
05:55
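
A hedged sketch of a typical Driver class wiring the pieces together; MyDriver, MyMapper, and MyReducer refer to the illustrative sketches above, not to the course's actual classes:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class MyDriver {
        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "my job");
            job.setJarByClass(MyDriver.class);      // jar containing the job classes
            job.setMapperClass(MyMapper.class);     // illustrative Mapper from above
            job.setReducerClass(MyReducer.class);   // illustrative Reducer from above
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));    // HDFS input path
            FileOutputFormat.setOutputPath(job, new Path(args[1]));  // must not already exist
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }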

This lecture explains the default structure of the Partitioner class in a Hadoop MapReduce program. By default, Hadoop uses the HashPartitioner class in a MapReduce program.

Partitioner Class structure
03:48
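
For reference, a minimal Partitioner sketch that reproduces the HashPartitioner-style behaviour described above (MyPartitioner is an illustrative name); a custom partitioner like this is registered in the driver with job.setPartitionerClass(MyPartitioner.class):

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Partitioner;

    // Decides which reducer each (key, value) pair is sent to; writing one is
    // only needed when custom routing is required.
    public class MyPartitioner extends Partitioner<Text, IntWritable> {
        @Override
        public int getPartition(Text key, IntWritable value, int numReduceTasks) {
            // Mask off the sign bit so the result is never negative.
            return (key.hashCode() & Integer.MAX_VALUE) % numReduceTasks;
        }
    }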

A detailed lecture on how shuffling, sorting, and partitioning are done internally in the Hadoop architecture. Hadoop performs all three of these steps by itself.

Shuffling, Sorting & Partitioning in Detail
03:04

A step-by-step installation guide (PDF) for installing Hadoop and MapReduce on your system.

Hadoop Installation
08:17
+ Word Count program in Mapreduce
5 lectures 36:04

The last lecture before the practicals: it explains which data types Hadoop uses, and how to use those Hadoop data types in a MapReduce program.

What are Writables in Hadoop
05:25
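
A small illustrative snippet of how the common Writable wrappers map onto plain Java types (WritableDemo is an assumed name, not course code):

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;

    public class WritableDemo {
        public static void main(String[] args) {
            // Hadoop ships its own serializable wrappers instead of using
            // java.lang types, because Writables are compact and reusable.
            IntWritable count = new IntWritable(42);    // wraps int
            LongWritable offset = new LongWritable(0L); // wraps long
            Text word = new Text("hadoop");             // wraps a UTF-8 string

            int plainInt = count.get();                 // unwrap back to Java types
            String plainString = word.toString();
            System.out.println(plainInt + " " + plainString + " " + offset.get());
        }
    }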

This lecture explains the basic word count program in MapReduce.

Word Count program in Mapreduce
10:31

Having seen the Hadoop MapReduce code for word count, this lecture shows how to actually write that MapReduce code in Eclipse, how to create a JAR file out of it, and finally how to run it on a Hadoop cluster.

Word count program Code run
09:11

This lecture explains an optimization technique in Hadoop: the Combiner. It covers what a Combiner is, at which phase of the MapReduce program flow it is used, and how it works.

What is Combiner in Hadoop Mapreduce
07:13

As explained in the theory, a Combiner in Hadoop can give us better optimization, so in this video we learn how to implement a Combiner class in a MapReduce program.

Implementing Combiner in WordCount Mapreduce program
03:44
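
As a hedged sketch of that wiring, reusing the illustrative word-count Mapper and Reducer from earlier (and assuming the reduce function is associative and commutative, which is what makes it safe to reuse as a Combiner):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCountWithCombiner {
        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word count with combiner");
            job.setJarByClass(WordCountWithCombiner.class);
            job.setMapperClass(MyMapper.class);     // illustrative word-count Mapper
            // The Combiner pre-aggregates map output locally before the shuffle,
            // cutting network traffic; here the Reducer class doubles as the Combiner.
            job.setCombinerClass(MyReducer.class);
            job.setReducerClass(MyReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }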
+ Set of Mapreduce programs
5 lectures 46:26

This lecture explains how to compute the sum of even and odd numbers using a Hadoop MapReduce program.

Calculate Sum of Even Odd numbers
08:29

In this video we use a MapReduce program to calculate the average success rate of Facebook ads of different categories, city-wise.

Calculate success rate of Facebook ads
14:08

Hadoop provides us with predefined data types, but it also gives us the freedom to create our own data types in the form of Writables.

This video explains how to create our own Hadoop-recognized data types and use them in a MapReduce program.

Preview 07:25
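
A minimal sketch of what such a custom Writable can look like; the class AdStatsWritable and its fields are invented for illustration and are not the course's code:

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import org.apache.hadoop.io.Writable;

    // A composite value type that Hadoop can shuffle between map and reduce:
    // only write() and readFields() are needed, and the field order must match.
    public class AdStatsWritable implements Writable {
        private int clicks;
        private double spend;

        public AdStatsWritable() { }                    // no-arg constructor required

        public AdStatsWritable(int clicks, double spend) {
            this.clicks = clicks;
            this.spend = spend;
        }

        @Override
        public void write(DataOutput out) throws IOException {
            out.writeInt(clicks);
            out.writeDouble(spend);
        }

        @Override
        public void readFields(DataInput in) throws IOException {
            clicks = in.readInt();
            spend = in.readDouble();
        }

        public int getClicks()   { return clicks; }
        public double getSpend() { return spend; }

        @Override
        public String toString() { return clicks + "\t" + spend; }
    }

If a type like this were used as a map output key, it would additionally need to implement WritableComparable and provide compareTo().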

One more example of implementing Hadoop Writables in MapReduce.

In this lecture we identify the fraud customers of an e-commerce website. The full MapReduce code is attached in the resources tab.

Fraud customers of an Ecommerce website - part 1
08:59

One more example of implementing Hadoop Writables in MapReduce.

In this lecture we identify the fraud customers of an e-commerce website. The full MapReduce code is attached in the resources tab.

Fraud customers of an Ecommerce website - part 2
07:25
Create a Mapreduce program
Assignment 1
1 question
+ Distributed Cache Implementation
2 lectures 15:55

This lecture explains the theory of the Distributed Cache in Hadoop: what the Distributed Cache is, what it is used for, and so on.

What is Distributed Cache and its uses in the Mapreduce framework
04:01

Using this knowledge of the Distributed Cache in Hadoop, in this video we implement it in a MapReduce program.

All the MapReduce code used in the video is attached.

Using Distributed cache calculate average salary
11:54
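
A hedged sketch of the usual pattern: the driver registers a small lookup file with job.addCacheFile(new URI("...")), and each Mapper loads it in setup(). The class name, the file layouts (deptId,deptName and empId,deptId,salary) and the averaging use case are assumptions for illustration; the sketch also assumes the Hadoop 2.x behaviour of symlinking cached files into the task's working directory:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.net.URI;
    import java.util.HashMap;
    import java.util.Map;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class CacheLookupMapper extends Mapper<LongWritable, Text, Text, Text> {

        private final Map<String, String> deptNames = new HashMap<>();

        @Override
        protected void setup(Context context) throws IOException {
            // Files registered in the driver with job.addCacheFile(...) are copied
            // to every node and exposed to the task via getCacheFiles().
            URI[] cacheFiles = context.getCacheFiles();
            if (cacheFiles != null && cacheFiles.length > 0) {
                // Read through the symlink created in the task's working directory.
                String localName = new Path(cacheFiles[0].getPath()).getName();
                try (BufferedReader reader = new BufferedReader(new FileReader(localName))) {
                    String line;
                    while ((line = reader.readLine()) != null) {
                        String[] parts = line.split(",");  // assumed: deptId,deptName
                        deptNames.put(parts[0], parts[1]);
                    }
                }
            }
        }

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",");  // assumed: empId,deptId,salary
            String dept = deptNames.getOrDefault(fields[1], "UNKNOWN");
            // Emit (department name, salary); a reducer can then average per department.
            context.write(new Text(dept), new Text(fields[2]));
        }
    }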
+ Dealing with Input Split Class
2 lectures 07:39

This lecture explains what input splits are in Hadoop.

What are Input splits in Hadoop
05:54

This lecture shows the default MapReduce code of the input split class in Hadoop.

Input split Class in Mapreduce
01:45
+ Multiple Inputs & Output class
2 lectures 17:21

Hadoop also gives us the provision to read more than one input file at a time; this lecture shows exactly how to do this in a MapReduce program.

Multiple Inputs class and its Implementation
08:49
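
A self-contained, illustrative sketch of that wiring: two hypothetical mappers (CustomersMapper and OrdersMapper, with assumed CSV layouts) read two different paths and feed one reducer via MultipleInputs; none of the names below are the course's own:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.MultipleInputs;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class MultipleInputsDriver {

        // Hypothetical mapper for a customers file, assumed format: custId,name
        public static class CustomersMapper extends Mapper<LongWritable, Text, Text, Text> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] f = value.toString().split(",");
                context.write(new Text(f[0]), new Text("CUST|" + f[1]));
            }
        }

        // Hypothetical mapper for an orders file, assumed format: orderId,custId,amount
        public static class OrdersMapper extends Mapper<LongWritable, Text, Text, Text> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] f = value.toString().split(",");
                context.write(new Text(f[1]), new Text("ORD|" + f[2]));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "multiple inputs");
            job.setJarByClass(MultipleInputsDriver.class);
            // Each input path gets its own input format and its own Mapper,
            // so two differently formatted files can feed a single Reducer.
            MultipleInputs.addInputPath(job, new Path(args[0]),
                    TextInputFormat.class, CustomersMapper.class);
            MultipleInputs.addInputPath(job, new Path(args[1]),
                    TextInputFormat.class, OrdersMapper.class);
            job.setReducerClass(Reducer.class);   // identity reducer, just for the sketch
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(Text.class);
            FileOutputFormat.setOutputPath(job, new Path(args[2]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }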

Hadoop also gives us the provision to create more than one output directory; this lecture shows exactly how to do this in a MapReduce program.

Multiple Output class and its Implementation
08:32
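
A hedged reducer-side sketch using MultipleOutputs; the named outputs "high" and "low" and the threshold are invented for illustration, and each would have to be registered in the driver with MultipleOutputs.addNamedOutput(job, "high", TextOutputFormat.class, Text.class, IntWritable.class) (and likewise for "low"):

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;

    public class SplitOutputReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

        private MultipleOutputs<Text, IntWritable> mos;

        @Override
        protected void setup(Context context) {
            mos = new MultipleOutputs<>(context);
        }

        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            // Route the record to one of the named outputs registered in the driver.
            String namedOutput = (sum >= 100) ? "high" : "low";   // assumed threshold
            mos.write(namedOutput, key, new IntWritable(sum));
        }

        @Override
        protected void cleanup(Context context) throws IOException, InterruptedException {
            mos.close();   // flush the additional output files
        }
    }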

Quiz 1

Quiz 1
5 questions
+ Joins in Mapreduce
5 lectures 34:58

A lecture showing the pseudo-code flow of a joins MapReduce program. It explains the thinking process behind doing joins in MapReduce.

Pseudo code flow of Joins Mapreduce program
05:16

This video shows how to join two files in a Hadoop MapReduce program.

Join 2 files in a Mapreduce program
09:11
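
As an illustrative sketch of the reduce side of such a join (not the course's code): assuming each mapper tags its values with their source, e.g. "CUST|" or "ORD|" as in the MultipleInputs sketch earlier, the reducer can separate the two record sets for a key and emit their combinations:

    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Reducer;

    // Reduce-side join: the shuffle groups all customer and order records that
    // share the same join key, so the reducer sees them together.
    public class JoinReducer extends Reducer<Text, Text, Text, Text> {

        @Override
        protected void reduce(Text key, Iterable<Text> values, Context context)
                throws IOException, InterruptedException {
            List<String> customers = new ArrayList<>();
            List<String> orders = new ArrayList<>();
            for (Text value : values) {
                String v = value.toString();
                if (v.startsWith("CUST|")) {          // assumed tag from the customers mapper
                    customers.add(v.substring(5));
                } else if (v.startsWith("ORD|")) {    // assumed tag from the orders mapper
                    orders.add(v.substring(4));
                }
            }
            // Inner join: emit one record per (customer, order) pair sharing the key.
            for (String customer : customers) {
                for (String order : orders) {
                    context.write(key, new Text(customer + "\t" + order));
                }
            }
        }
    }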

After performing an inner join in MapReduce, in this lecture we perform an outer join.

Performing Outer Join in Mapreduce
09:04

What a map join is, and in what scenarios it should be used in MapReduce.

What is Map Join and Where it is Used
05:29

Having covered the map join theory, we implement it in a MapReduce program with an example.

Implementing Map Join in a Mapreduce program
05:58
+ Counters in Mapreduce
3 lectures 23:06

What counters are in Hadoop, what purpose they serve in the Hadoop architecture, and the various types of counters supported by Hadoop.

What are Counters in Hadoop
06:31

What job counters are in Hadoop, and how to check the counter values after a MapReduce program run.

Job Counters
05:42

We can create custom counters according to our requirements. There are two types of custom counters in the Hadoop MapReduce framework:

Static counters

Dynamic counters

In this lecture we create custom counters to count the number of records processed, based on a condition, in a store's sales file. The MapReduce code is attached.

Preview 10:53
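
A minimal sketch of both flavours inside a mapper; the enum, the sales-file layout (storeId,product,amount) and the 500 threshold are assumptions for illustration, not the course's code:

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class SalesCounterMapper extends Mapper<LongWritable, Text, Text, Text> {

        // Static counters: the names are fixed in advance as an enum.
        public enum SalesCounter { HIGH_VALUE, LOW_VALUE, MALFORMED }

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",");   // assumed: storeId,product,amount
            if (fields.length < 3) {
                context.getCounter(SalesCounter.MALFORMED).increment(1);
                return;
            }
            double amount;
            try {
                amount = Double.parseDouble(fields[2]);
            } catch (NumberFormatException e) {
                context.getCounter(SalesCounter.MALFORMED).increment(1);
                return;
            }
            // Increment a static counter depending on a condition on the record.
            if (amount >= 500.0) {                            // assumed threshold
                context.getCounter(SalesCounter.HIGH_VALUE).increment(1);
            } else {
                context.getCounter(SalesCounter.LOW_VALUE).increment(1);
            }
            // Dynamic counter: group and name are plain strings built at runtime,
            // here one counter per store id.
            context.getCounter("StoreRecords", fields[0]).increment(1);
            context.write(new Text(fields[0]), new Text(fields[2]));
        }
    }

Counter totals appear in the job's console output and in the job history UI after the run completes.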
Create a Mapreduce program on custom counters
Assignment 2
1 question
+ Creating Custom Input Formatter
5 lectures 41:23

What the FileInputFormat class is in Hadoop MapReduce, which methods it contains, and which of them should be overridden in a MapReduce program to create your own input formatter.

File Input format Class's default structure in Mapreduce
07:30
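
A compilable skeleton of such a subclass; XmlInputFormat is an illustrative name, and the built-in LineRecordReader is used only as a placeholder where a real implementation would plug in its own RecordReader:

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.InputSplit;
    import org.apache.hadoop.mapreduce.JobContext;
    import org.apache.hadoop.mapreduce.RecordReader;
    import org.apache.hadoop.mapreduce.TaskAttemptContext;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.input.LineRecordReader;

    // Skeleton of a custom input format. The two methods usually overridden are
    // createRecordReader(), which supplies the class that turns raw bytes into
    // (key, value) pairs, and isSplitable(), which controls whether a single
    // file may be divided across several mappers.
    public class XmlInputFormat extends FileInputFormat<LongWritable, Text> {

        @Override
        protected boolean isSplitable(JobContext context, Path file) {
            // Keep each file in one split so an XML record is never cut in half.
            return false;
        }

        @Override
        public RecordReader<LongWritable, Text> createRecordReader(
                InputSplit split, TaskAttemptContext context) {
            // Placeholder so the skeleton compiles: a real XML input format would
            // return its own RecordReader that scans for a start tag and emits
            // everything up to the matching end tag as one record.
            return new LineRecordReader();
        }
    }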

This lecture explains why and when there is a need to create a custom input format class in MapReduce. Hadoop gives us the option to create our own input format class to read the input file.

Custom Input Formatter Need & Problem statement
07:34

In this lecture we create a custom input format class to read an XML file. Four MapReduce Java classes are used for this, and a proper run on the Hadoop cluster is also shown.

Create custom Input Format class to read XML file | Part 1
09:29

In this lecture we create a custom input format class to read an XML file. Four MapReduce Java classes are used for this, and a proper run on the Hadoop cluster is also shown.

Create custom Input Format class to read XML file | Part 2
12:17

In this lecture we create a custom input format class to read an XML file. Four MapReduce Java classes are used for this, and a proper run on the Hadoop cluster is also shown.

Create custom Input Format class to read XML file | Part 3
04:33

Quiz 2

Quiz 2
5 questions
Requirements
  • Basic knowledge of HDFS.
  • Basic knowledge of Core Java.
  • Everything else about Hadoop MapReduce is covered in this course, with practicals.
Description

The MapReduce framework is Hadoop's native engine for processing big data. It is considered the atomic processing unit in Hadoop, which is why it is never going to become obsolete.

Knowing only the basics of MapReduce (Mapper, Reducer, etc.) is not at all sufficient to work on any real-time Hadoop MapReduce project in a company. These basics are just the tip of the iceberg in MapReduce programming; real-time MapReduce is much more than that. In live big data projects we have to override many of the default implementations of the MapReduce framework to make them work according to our requirements.

This course is an answer to the question: "Which concepts of Hadoop MapReduce are used in live big data projects, and how do you implement them in a program?" To answer this, every MapReduce concept in the course is explained practically via a MapReduce program.

Every lecture in this course is explained in two steps.

Step 1: Explanation of a Hadoop component | Step 2: Practicals, showing how to implement that component in a MapReduce program.

The overall inclusions and benefits of this course:

  • Complete Hadoop MapReduce, explained from scratch to real-time implementation.

  • Each and every Hadoop concept is backed by HANDS-ON MapReduce code.

  • Advanced-level MapReduce concepts that are not even readily available on the Internet.

  • To help those without a Java background, all MapReduce Java code is explained line by line in such a way that even a non-technical person can understand it.

  • The MapReduce code and datasets used in the lectures are attached for your convenience.

  • Includes a 'Case Studies' section covering questions generally asked in Hadoop interviews.

Who this course is for:
  • Students who want to learn Hadoop MapReduce from SCRATCH to its live project implementation.
  • Techies who have only basic theoretical knowledge of MapReduce and need in-depth knowledge of it to work on real-time projects.
  • Techies who fear Java should take this course, as the Java MapReduce code is explained in a very simple and easy manner.
  • Engineers who want to switch their career to Hadoop.