Learn Ingestion in Hadoop Using Sqoop and Flume Tool

Complete Reference for Apache Sqoop and Flume Tool
2.8 (9 ratings)
52 students enrolled
Last updated 5/2016
$15 (originally $200, 92% off)
30-Day Money-Back Guarantee
  • 1.5 hours on-demand video
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion

Apache Sqoop is a tool designed to transfer data between Apache Hadoop and relational databases (RDBMS). It imports data from traditional databases such as MySQL and Oracle into the Hadoop Distributed File System (HDFS), and exports data from HDFS back into an RDBMS. This course covers the following topics of the Apache Sqoop and Flume tools:

  • Overview of Apache Hadoop
  • Sqoop Import Process
  • Basic Sqoop Commands
  • Using Different File Formats in the Import and Export Process
  • Compressing Imported Data
  • Concept of the Staging Table
  • Architecture and Features of the Sqoop2 Tool
  • Flume Architecture
  • Flume Events
  • Interceptors and Channel Selectors
  • Sink Processors
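
The import/export workflow described above can be sketched with Sqoop's command-line tool. The connection string, credentials, table names, and HDFS paths below are illustrative placeholders, not values from the course:

```shell
# Import the "employees" table from a MySQL database into HDFS.
# dbhost, shop, dbuser, and the paths are hypothetical.
sqoop import \
  --connect jdbc:mysql://dbhost/shop \
  --username dbuser -P \
  --table employees \
  --target-dir /user/hadoop/employees

# Export the same HDFS directory back into an RDBMS table.
sqoop export \
  --connect jdbc:mysql://dbhost/shop \
  --username dbuser -P \
  --table employees_copy \
  --export-dir /user/hadoop/employees
```

By default the imported rows land in HDFS as comma-separated text files, one set of part files per map task.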

Who is the target audience?
  • Professionals aspiring to a career in Big Data analytics using the Hadoop framework with Sqoop
  • Students who want to learn the Sqoop tool
  • ETL developers
  • Analytics professionals
What Will I Learn?
  • Understand the Sqoop tool at a high level
  • How to import data into HDFS
  • How to use Sqoop within the Hadoop ecosystem
  • Understand the components of Apache Flume
  • Understand Flume events
  • Sqoop2 architecture and features
  • The Sqoop export process
  • Staging tables in Sqoop
Requirements
  • Basic knowledge of Apache Hadoop is helpful but not mandatory
  • Basic programming knowledge in Java
  • Basics of the Linux operating system
  • Basics of SQL
Curriculum For This Course
26 Lectures 01:24:02
Module-2 Getting Started with Sqoop
5 Lectures 14:28
2.1 Apache Hadoop

2.2 Traditional Database Applications

2.3 Sqoop as an Import/Export Tool

2.4 Sqoop Import Process

2.5 Basic Sqoop Commands
Module-3 Importing Data in HDFS using Sqoop
7 Lectures 40:45
3.1 Transferring an Entire Table

3.2 Import MySQL Data with Different Options

3.3 Importing a Subset of Data and Join Operations

3.4 Using a File Format Other Than CSV

3.5 Compressing Imported data

3.6 Speeding Up Transfers

3.7 Sqoop Codegen
Module-4 Exporting Data from HDFS
3 Lectures 11:25
4.1 Sqoop Export Process

4.2 Inserting Data in Batches

4.3 Staging Table
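
The staging-table idea from lecture 4.3 can be sketched as follows: Sqoop first writes all exported rows into an empty staging table, then moves them into the target table in a single transaction, so a failed export never leaves the target half-filled. All names and paths here are hypothetical placeholders:

```shell
# Export HDFS data into "orders" via the staging table "orders_stage".
# --clear-staging-table empties the staging table before the export runs.
sqoop export \
  --connect jdbc:mysql://dbhost/shop \
  --username dbuser -P \
  --table orders \
  --staging-table orders_stage \
  --clear-staging-table \
  --export-dir /user/hadoop/orders
```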
Module-5 Sqoop2
3 Lectures 02:59
5.1 Sqoop1's Challenges

5.2 Sqoop2 Architecture

5.3 Sqoop2 Features
Module-6 Apache Flume
3 Lectures 04:14
6.1 Components of Apache Flume

6.2 Flume Events

6.3 Interceptors, Channels, and Sink Processors
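
The Flume components listed in this module fit together in an agent's properties file. A minimal sketch of a single agent, with all component names illustrative: a netcat source produces events, a timestamp interceptor stamps each one, a memory channel buffers them, and a logger sink consumes them.

```properties
# Name this agent's components (agent name "a1" is hypothetical).
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Source: read lines from a TCP port; each line becomes a Flume event.
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# Interceptor: add a timestamp header to every event.
a1.sources.r1.interceptors = i1
a1.sources.r1.interceptors.i1.type = timestamp

# Channel: buffer events in memory between source and sink.
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000

# Sink: log events; wire source and sink to the channel.
a1.sinks.k1.type = logger
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```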
About the Instructor
3.4 Average rating
141 Reviews
1,448 Students
9 Courses
Make Learning Smarter

Digitorious Technologies is a publisher of development courses that provide in-depth knowledge and high-quality training. Its mission is to give the right direction to people seeking a career in the IT/software industry, and to make new technologies easy to learn and understand.
