Hadoop installation|Install Hadoop on your own system

Learn to install Hadoop, Hive, Pig, HBase, Spark, Sqoop, and Flume
4.2 (30 ratings)
2,071 students enrolled
Created by Sudhanshu Saxena
Last updated 9/2019
English
English [Auto]
Current price: $12.99 Original price: $19.99 Discount: 35% off
30-Day Money-Back Guarantee
This course includes
  • 2 hours on-demand video
  • 9 downloadable resources
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What you'll learn
  • Your system will have a working Hadoop installation
Requirements
  • Zeal to learn Hadoop and a basic understanding of operating systems
Description

This is a no-theory Hadoop installation course with a heavy emphasis on practice. Hands-on demonstration throughout is the solid foundation of the course, which is designed so that you learn the easiest, most reliable way to perform the installation. It is a practice-oriented course that walks you through installing Hadoop from scratch on your own system, along with its sub-projects: Hive, Pig, HBase, Spark, Sqoop, and Flume.


Hive: Hive works on structured and semi-structured data sets and gives Hadoop an SQL-like query capability.
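
For a quick feel of that SQL-like layer, here is a minimal sketch; the table name, columns, and query are invented purely for illustration and assume the hive command is already on your PATH:

  # Create a simple table and query it with familiar SQL syntax (illustrative names only)
  hive -e "CREATE TABLE IF NOT EXISTS employees (id INT, name STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';"
  hive -e "SELECT name FROM employees WHERE id > 100;"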

PIG: Apache Pig is a versatile tool that can work on almost every kind of data: structured, semi-structured, and unstructured. Pig uses Pig Latin, a scripting language that is easy to learn and adopt.
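
As a rough sketch of how Pig Latin reads, here is a tiny data flow run in local mode; the input file and filter pattern are placeholders, not anything from the course:

  # Load a text file, keep only the lines containing ERROR, and print them
  pig -x local -e "logs = LOAD 'input.txt' AS (line:chararray); errs = FILTER logs BY line MATCHES '.*ERROR.*'; DUMP errs;"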

SQOOP: The name Sqoop combines SQL and Hadoop. Sqoop helps you bring big data into HDFS from any RDBMS; because its sources are relational databases, Sqoop fetches structured data only.
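
A hedged example of what a typical import looks like; the connection string, credentials, table, and target directory below are all placeholders:

  # Copy one relational table into HDFS using a single mapper
  sqoop import \
    --connect jdbc:mysql://localhost:3306/salesdb \
    --username dbuser -P \
    --table customers \
    --target-dir /user/hadoop/customers \
    -m 1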

FLUME: Flume is a tool for moving data of almost any kind from almost any source, other than an RDBMS, into HDFS.
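
Roughly, a Flume deployment is one command pointed at a config file that wires a source, a channel, and a sink together; the file name and agent name here are placeholders:

  # Start a single agent whose config sends incoming events into an HDFS sink
  flume-ng agent --conf $FLUME_HOME/conf --conf-file my-agent.conf --name a1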

HBASE: HBase is a NoSQL database that is part of the Hadoop ecosystem and helps store any kind of data on top of HDFS.
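
A small sketch of the HBase shell in action; the table and column-family names are made up for illustration and assume HBase is already running:

  # Create a table, write one cell, and scan it back
  echo "create 'users', 'info'" | hbase shell
  echo "put 'users', 'row1', 'info:name', 'Asha'" | hbase shell
  echo "scan 'users'" | hbase shell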

SPARK: Spark is one of the fastest data-processing engines available. Much like Hadoop, it is built on distributed computing, but it relies on in-memory processing techniques.
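
Two quick ways to see Spark in action once it is unpacked, assuming SPARK_HOME points at the installation; these are generic checks, not the course's exact steps:

  # Run a bundled example job, then open the interactive shell
  $SPARK_HOME/bin/run-example SparkPi 10
  $SPARK_HOME/bin/spark-shell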


People often struggle with the theory and get confused about which guidelines to follow. This course cuts the learning time: most of the program shows you, in practice, what system resources you need, where to get the software, and how to connect everything step by step. It also includes direct links to Cloudera, VMware Player, Ubuntu, Apache Hadoop, Spark, and the other downloads.

The entire focus of this course is on a complete Hadoop installation, not on Hadoop administration, although you will pick up some administration skills along the way.

Who this course is for:
  • Beginners who want hands-on experience with Hadoop on their own system
Course content
Introduction
18 lectures 01:57:10

An introduction to what we will do and how we will do it.

Preview 01:39

What system configuration do you need to install Hadoop?

Preview 04:24

What software do you need, and how do you download it?

Preview 08:11

How to install the various pieces of software that support the Hadoop installation.

Preparing the system and installing Ubuntu
06:21

Get familiar with the Ubuntu operating system.

Getting yourself familiar with Ubuntu
11:11

Prepare your system for the installation of Hadoop: run the various commands that get the system ready (a rough sketch of them follows below).

Preparing our system for Hadoop
08:20
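
As a rough idea of what this step involves on an Ubuntu system (package names and the Java version are typical choices, not necessarily the lecture's exact ones):

  # Install Java and SSH, then enable passwordless SSH to localhost for the Hadoop scripts
  sudo apt-get update
  sudo apt-get install -y openjdk-8-jdk ssh
  ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
  cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys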

We are done with the Ubuntu installation; now let's learn how to install Hadoop (the unpack-and-configure steps are sketched below).

Hadoop Installation
14:31
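
A minimal sketch of the unpack-and-configure pattern; the version number and install path are placeholders, and the exports normally go into ~/.bashrc:

  # Unpack the downloaded archive and point the environment at it
  tar -xzf hadoop-2.7.3.tar.gz
  sudo mv hadoop-2.7.3 /usr/local/hadoop
  export HADOOP_HOME=/usr/local/hadoop
  export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
  export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64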

Let's start Hadoop for the very first time, understand what to look for, and see how to check whether Hadoop is working (typical commands are sketched below).

Start Hadoop first time
02:56
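
The first start usually looks something like this, assuming Hadoop's bin and sbin directories are on the PATH:

  # Format the NameNode once, start the daemons, and check which Java processes came up
  hdfs namenode -format
  start-dfs.sh
  start-yarn.sh
  jps    # expect NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager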

Let's verify that Hadoop is working, have a look at HDFS, and try the basic commands (sketched below).

Testing Hadoop and HDFS
02:47
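
A few generic HDFS checks, with placeholder file and directory names:

  # Create a directory in HDFS, copy a local file into it, and read it back
  hdfs dfs -mkdir -p /user/hadoop/input
  hdfs dfs -put sample.txt /user/hadoop/input
  hdfs dfs -ls /user/hadoop/input
  hdfs dfs -cat /user/hadoop/input/sample.txt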

Let's prepare the system for the Hive installation; the untar-and-move steps are sketched below.

Untarring and moving Hive
03:24
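
The untar-and-move pattern, sketched with a placeholder version and target path:

  # Unpack the Hive archive, move it to a permanent location, and expose it on the PATH
  tar -xzf apache-hive-2.1.1-bin.tar.gz
  sudo mv apache-hive-2.1.1-bin /usr/local/hive
  export HIVE_HOME=/usr/local/hive
  export PATH=$PATH:$HIVE_HOME/bin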

Let's install Hive and verify that it is working.

Installing and testing Hive
13:12

Let's install Pig from scratch.

Installing PIG
04:38

Let's see how Sqoop can be installed.

Installing SQOOP
05:41

Let's install Flume.

Installing FLUME
08:50

Let's install this NoSQL database on top of Hadoop.

Installing HBase
07:06

Initialization of HBase is a little tricky, so please pay attention; the startup commands are sketched below.

Initializing HBase
04:25
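
A minimal startup check, assuming HBase's bin directory is on the PATH and hbase-site.xml already points at a valid root directory:

  # Start HBase, confirm the master process is up, then ask the shell for cluster status
  start-hbase.sh
  jps                          # HMaster should now appear in the list
  echo "status" | hbase shell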

Let's install something that the industry likes so much.

Installing SPARK
05:59

Verifying and testing Spark (a quick check is sketched below).

Testing SPARK
03:35
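
A quick generic verification, assuming the Spark bin directory is on the PATH; the little sum job is only a sanity check:

  # Print the version, then run a tiny in-memory job through the shell
  spark-submit --version
  echo 'println(sc.parallelize(1 to 1000).sum())' | spark-shell    # should print 500500.0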