Getting Started with Hadoop 2.x
0.0 (0 ratings)
0 students enrolled


Build a strong foundation by exploring the Hadoop ecosystem with real-world examples.
Created by Packt Publishing
Last updated 5/2017
  • 2.5 hours on-demand video
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What Will I Learn?
  • Get in-depth knowledge of the Hadoop 2.7 architecture
  • See how to implement your hypothesis/algorithms on big data
  • Understand the Hadoop 2.x Architecture
  • Discover the process to set up an HDFS cluster along with formatting and data transfer in between your local storage and the Hadoop filesystem
  • Get to know all about the Hadoop UI
  • Create Map-reduce jobs
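The HDFS setup, formatting, and data-transfer steps listed above can be sketched with the standard HDFS command line. This is a minimal, hedged illustration assuming a single-node Hadoop 2.x installation with the Hadoop binaries on your PATH; the file names and HDFS paths are placeholders, not values from the course.

```shell
# Format the NameNode (one-time step; this erases any existing HDFS metadata)
hdfs namenode -format

# Start the HDFS daemons (NameNode, DataNode, SecondaryNameNode)
start-dfs.sh

# Copy a file from local storage into HDFS
hdfs dfs -put localfile.txt /user/hadoop/input/

# List what landed in HDFS, then copy a result file back to local storage
hdfs dfs -ls /user/hadoop/input/
hdfs dfs -get /user/hadoop/input/localfile.txt ./copy-of-localfile.txt
```

These commands only work against a running Hadoop installation, which the course walks you through setting up.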
Requirements
  • Getting Started with Hadoop 2.x goes through the detailed steps of building scalable computational processing on the Hadoop framework.
  • To fully benefit from the coverage included in this course, you will need a good knowledge of Java and comfort working in the terminal/bash.
  • This course has the following software requirements:
  • Ubuntu machine / machine with bash (in windows you can install bash as well)
  • Java
  • Text Editor (to write code for the course)
  • Data set from the provided data source
  • This course has been tested on the following system configuration:
  • OS: Ubuntu 16.04
  • Processor: Core i7 4770HQ
  • Memory: 16GB
  • HDD: 500GB

Hadoop emerged in response to the proliferation of masses and masses of data collected by organizations, offering a strong solution to store, process, and analyze what has commonly become known as Big Data. It comprises a comprehensive stack of components designed to enable these tasks on a distributed scale, across multiple servers and thousands of machines.

This course introduces you to the powerful system synonymous with Big Data, demonstrating how to create an instance and leverage the Hadoop ecosystem's many components to store, process, manage, and query massive data sets with confidence.

The video course opens with an introduction to the world of Hadoop, where we discuss nodes, data sets, and operations such as map and reduce. The second section deals with HDFS, Hadoop's file system used to store data. Further on, you’ll discover the differences between jobs and tasks, and get to know the Hadoop UI. After this, we turn our attention to storing data in HDFS and data transformations. Lastly, we will learn how to implement an algorithm in the Hadoop map-reduce way and analyze its overall performance.
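To make the map and reduce operations concrete before the videos, here is a Hadoop-free sketch in plain Python that mimics the classic word-count job: the map step emits (word, 1) pairs, a shuffle step groups them by key, and the reduce step sums each group. Hadoop distributes these phases across many nodes; this single-process version only illustrates the data flow, and all names in it are illustrative.

```python
from collections import defaultdict

def map_phase(lines):
    """Emit a (word, 1) pair for every word, like a Hadoop Mapper."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    """Group values by key, like Hadoop's shuffle-and-sort step."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Sum the values for each key, like a Hadoop Reducer."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["Hadoop stores big data", "Hadoop processes big data"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
print(counts["hadoop"])  # 2
```

Because each map call touches only its own lines and each reduce call touches only one key's group, the phases can run in parallel across machines, which is the core idea behind Hadoop's scalability.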

About The Author

A K M Zahiduzzaman is a software engineer with NewsCred Dhaka. He is a software developer and technology enthusiast. He was a Ruby on Rails developer, but now works with NodeJS, AngularJS, and Python. He is also working with a much wider vision as part of a technology company; the next goal is introducing SOA within the current applications to scale development via microservices.

Zahiduzzaman has a lot of experience with Spark and is passionate about it. He is also a guitarist and has a band. He has spoken at an international event in Dhaka, and he loves to share his knowledge.

Who is the target audience?
  • This course is for budding data scientists and data analysts with a firm understanding of Java.
Curriculum For This Course
18 Lectures
Intro to the Hadoop World
6 Lectures 48:09

This video gives an overview of the entire course.

Preview 03:43

In this video we’ll learn how to install Hadoop on our local system.

Installing Hadoop in Local

The important part of selecting the Hadoop framework for your own solution is to understand why it is a good fit for your application.


Bring Process to Data

Understand the difference between the two node types in HDFS: DataNode and NameNode.

NameNode Versus DataNode

The new term Map-Reduce… what does it mean and how does it solve a problem?

Map and Reduce Operations

When jumping to parallel programming from serial programming, it is always hard to plan the computation.

Order of Execution and Parallel Thinking
File System Overdrive with HDFS
4 Lectures 25:58

Prepare your hard drive for HDFS.

Preview 06:38

Copy data to/from HDFS.

Formatting an HDFS

Using the HDFS commands in the shell.

Some Helpful Commands to Communicate with the HDFS

How do we access HDFS files from a Java program?

HDFS Protocol and Using It in Applications
Let's Run Some Hadoop Jobs
4 Lectures 26:27

What are Hadoop jobs and tasks?

Preview 04:47

How to see the process flow and progress of a Hadoop job.

The Hadoop UI for Task Progress

Run Hadoop jobs.

Running a Couple of Example Jobs

In this video, we are going to look at how the map and reduce operations get executed.

Analyze the Work Flow/Data Flow/Process Flow
It's Show Time
4 Lectures 36:44

Understand the provided dataset.

Preview 04:04

Prepare the data to be fit for our algorithm

Data Transformation and Storing to HDFS

Devise a simple algorithm for recommendation

Devise a Simple Algorithm for Recommendation

Implement the map-reduce transformation of the movie -> genre context.

Implement the Algorithm in Hadoop Map-Reduce Way and Analyze Performance
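The movie -> genre transformation in this final section can be pictured as another map-reduce pass. The sketch below is a hypothetical single-process Python illustration — the record fields and movie titles are invented for the example, not taken from the course's dataset: the map step inverts each (movie, genres) record into (genre, movie) pairs, and the reduce step collects each genre's movie list, the context a simple recommender can then work from.

```python
from collections import defaultdict

# Hypothetical records: (movie title, list of genres) -- not the course's dataset.
movies = [
    ("The Matrix", ["Action", "Sci-Fi"]),
    ("Toy Story", ["Animation", "Comedy"]),
    ("Alien", ["Horror", "Sci-Fi"]),
]

def map_phase(records):
    """Invert each record: emit one (genre, movie) pair per genre."""
    for title, genres in records:
        for genre in genres:
            yield (genre, title)

def reduce_phase(pairs):
    """Collect the full movie list for each genre."""
    by_genre = defaultdict(list)
    for genre, title in pairs:
        by_genre[genre].append(title)
    return dict(by_genre)

genre_context = reduce_phase(map_phase(movies))
print(genre_context["Sci-Fi"])  # ['The Matrix', 'Alien']
```

In a real Hadoop job the same inversion would be split across mappers and the per-genre grouping would happen in the shuffle, so the structure carries over directly.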
About the Instructor
Packt Publishing
3.9 Average rating
8,229 Reviews
58,948 Students
687 Courses
Tech Knowledge in Motion

Packt has been committed to developer learning since 2004. A lot has changed in software since then - but Packt has remained responsive to these changes, continuing to look forward at the trends and tools defining the way we work and live. And how to put them to work.

With an extensive library of content - more than 4000 books and video courses - Packt's mission is to help developers stay relevant in a rapidly changing world. From new web frameworks and programming languages, to cutting edge data analytics, and DevOps, Packt takes software professionals in every field to what's important to them now.

From skills that will help you to develop and future proof your career to immediate solutions to every day tech challenges, Packt is a go-to resource to make you a better, smarter developer.

Packt Udemy courses continue this tradition, bringing you comprehensive yet concise video courses straight from the experts.