Learn Ingestion in Hadoop Using Sqoop and Flume Tool

Complete Reference for Apache Sqoop and Flume Tool
2.8 (9 ratings)
52 students enrolled
Last updated 5/2016
English
$15 (92% off the $200 list price)
30-Day Money-Back Guarantee
Includes:
  • 1.5 hours on-demand video
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
Description

Apache Sqoop is a tool designed to transfer data between Apache Hadoop and relational databases (RDBMS). Sqoop imports data from traditional databases such as MySQL and Oracle into the Hadoop Distributed File System (HDFS), and exports data from HDFS back to an RDBMS. This course covers the following topics for the Apache Sqoop and Flume tools:

Overview of Apache Hadoop

Sqoop Import Process

Basic Sqoop Commands

Using Different File Formats in the Import and Export Process

Compressing Imported Data

Concept of Staging Table

Architecture and Features of Sqoop2 Tool

Flume Architecture

Flume Events

Interceptors and Channel Selectors

Sink Processors
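To give a feel for the Sqoop workflow covered above, here is a minimal command-line sketch of an import followed by an export through a staging table. The connection string, credentials, table names, and paths are placeholders for illustration only, not examples from the course.

```shell
# Import a MySQL table into HDFS (hypothetical database and table names):
sqoop import \
  --connect jdbc:mysql://dbhost/retail_db \
  --username sqoop_user -P \
  --table orders \
  --target-dir /user/hadoop/orders \
  --num-mappers 4

# Export results from HDFS back to an RDBMS table, routing rows through
# a staging table so a failed export leaves the target table untouched:
sqoop export \
  --connect jdbc:mysql://dbhost/retail_db \
  --username sqoop_user -P \
  --table orders_summary \
  --staging-table orders_summary_stage \
  --export-dir /user/hadoop/orders_summary
```

The `--staging-table` option illustrates the staging-table concept from the topic list: data is first inserted into the staging table and moved to the target table only if the whole export succeeds.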


Who is the target audience?
  • Professionals aspiring to build a career in Big Data analytics using the Hadoop framework with Sqoop
  • Students willing to learn the Sqoop tool
  • ETL developers
  • Analytics professionals
What Will I Learn?
Understand an overview of the Sqoop tool
How to import data with Sqoop
How to use Sqoop in the Hadoop ecosystem
Understand the components of Apache Flume
Understand Flume events
Sqoop2 architecture and features
The Sqoop export process
Staging tables in Sqoop
Requirements
  • Basic knowledge of Apache Hadoop is helpful but not mandatory.
  • Basic programming knowledge in Java
  • Basics of Linux Operating System
  • Basics of SQL
Curriculum For This Course
26 Lectures, 01:24:02 total
Module-2 Getting Started with Sqoop
5 Lectures 14:28
2.1 Apache Hadoop
02:03

2.2 Traditional Database Applications
05:10

2.3 Sqoop as an Import/Export Tool
02:06

2.4 Sqoop Import Process
02:00

2.5 Basic Sqoop Commands
03:09
Module-3 Importing Data in HDFS using Sqoop
7 Lectures 40:45
3.1 Transferring an Entire Table
14:14

3.2 Import MySQL Data with Different Options
04:55

3.3 Importing a Subset of Data and the Join Operation
06:14

3.4 Using a File Format Other Than CSV
03:50

3.5 Compressing Imported Data
03:08

3.6 Speeding Up Transfers
01:51

3.7 Sqoop Codegen
06:33
Module-4 Exporting Data from HDFS
3 Lectures 11:25
4.1 Sqoop Export Process
08:00

4.2 Inserting Data in Batches
02:04

4.3 Staging Table
01:21
Module-5 Sqoop2
3 Lectures 02:59
5.1 Sqoop1's Challenges
00:51

5.2 Sqoop2 Architecture
01:31

5.3 Sqoop2 Features
00:37
Module-6 Apache Flume
3 Lectures 04:14
6.1 Components of Apache Flume
01:40

6.2 Flume Events
00:49

6.3 Interceptors, Channel Selectors, and Sink Processors
01:45
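The Flume components covered in Module-6 fit together as a source → channel → sink pipeline within a single agent. A minimal sketch of such an agent configuration is below; the agent and component names (`agent1`, `src1`, `ch1`, `sink1`) and the file paths are placeholders, not configurations from the course.

```properties
# Name the components of this hypothetical agent
agent1.sources  = src1
agent1.channels = ch1
agent1.sinks    = sink1

# Source: tail an application log via the exec source
agent1.sources.src1.type = exec
agent1.sources.src1.command = tail -F /var/log/app.log
agent1.sources.src1.channels = ch1

# Channel: buffer events in memory between source and sink
agent1.channels.ch1.type = memory
agent1.channels.ch1.capacity = 1000

# Sink: write events to HDFS, partitioned by date
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.hdfs.path = /flume/events/%Y-%m-%d
agent1.sinks.sink1.channel = ch1
```

Each event flows from the source into the channel and is drained by the sink; interceptors and channel selectors, covered in lecture 6.3, would be attached to the source to transform or route events before they reach the channel.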
About the Instructor
3.4 Average rating
141 Reviews
1,448 Students
9 Courses
Make Learning Smarter

Digitorious Technologies is a publisher of development courses that provide in-depth knowledge and high-quality training. Its mission is to give direction to people seeking a career in the IT/software industry, and to make new technologies easy to understand through virtual learning.
