Ingestion of Big Data Using Apache Sqoop & Apache Flume
Learn how to import data into HDFS, HBase, and Hive from many sources, including Twitter and MySQL
2.5 (2 ratings)
159 students enrolled
Created by Edionik Solutions
Last updated 6/2018
English
Current price: $11.99 Original price: $199.99 Discount: 94% off
30-Day Money-Back Guarantee
This course includes
  • 4.5 hours on-demand video
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What you'll learn
  • Hive Import with Sqoop
  • Hive Export with Sqoop
  • Import Data into HBase
  • Sqoop2 Architecture
  • Twitter Data in HDFS
  • Twitter Data in HBase using Flume
  • Interceptors, Channel Selectors, Sink Processors, and more
Course content
35 lectures 04:32:35
Introduction
2 Introduction to Flume
01:39
3 Introduction to Sqoop
02:17
4 Prerequisites
01:20
5 What you will learn
03:56
6 Apache Hadoop
02:55
8 Import Export with Sqoop
02:38
9 Sqoop Import process
02:56
12 Secure the Password
03:10
15 Speeding up Transfer
11:33
16 Join two Tables
14:34
17 Existing Record Generator Class
10:10
18 Transfer from RDBMS to HDFS
10:02
19 Transfer Data from Hadoop
12:37
20 Data Transfer into Batches
12:49
21 Updating and Inserting at the Same Time
13:10
22 Apache Hive
02:19
24 Hive Import with Sqoop
15:58
24 Hive Export with Sqoop
10:47
25 Apache HBase
01:51
26 Import Data into HBase
11:06
27 Sqoop's Challenges
01:13
28 Sqoop2 Architecture
01:36
29 Sqoop2 Features
00:48
30 Components of Flume
02:03
31 Flume Events
01:06
32 Interceptors, Channel Selectors, and Sink Processors
02:08
33 Telnet as a Source and HBase as Sink
19:21
34 Twitter Data in HBase using Flume
13:51
35 Twitter Data in HDFS
28:21
Requirements
  • Basic Knowledge of Big Data
  • Basic Knowledge of Hadoop
  • Basic Knowledge of MySQL
Description

This course covers both the basic and advanced concepts of Sqoop, and it is designed for beginners and professionals alike.

Sqoop is an open-source framework provided by Apache. It is a command-line application for transferring data between relational databases and Hadoop.
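
For example, a basic Sqoop import from a MySQL table into HDFS looks like the sketch below; the connection string, credentials, table name, and target directory are placeholders, not values used in the course.

    sqoop import \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username dbuser \
      -P \
      --table customers \
      --target-dir /user/hadoop/customers \
      --num-mappers 4

The -P flag prompts for the database password at run time instead of placing it on the command line; securing the password is one of the topics covered in the lectures.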

This course covers all of the main topics of Apache Sqoop, including Sqoop features, Sqoop installation, starting Sqoop, Sqoop import, the import --where clause, Sqoop export, and Sqoop integration with the Hadoop ecosystem.
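
To give a sense of those topics, the sketch below shows a filtered import using a --where clause and a corresponding export back to the database; the database, tables, and directories are illustrative assumptions only.

    # Import only a subset of rows using a --where clause
    sqoop import \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username dbuser \
      -P \
      --table orders \
      --where "order_status = 'COMPLETE'" \
      --target-dir /user/hadoop/orders_complete

    # Export data from an HDFS directory back into a MySQL table
    sqoop export \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username dbuser \
      -P \
      --table order_summary \
      --export-dir /user/hadoop/order_summary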

Flume is a standard, simple, robust, flexible, and extensible tool for ingesting data from various data producers (such as web servers) into Hadoop. In this course, we use simple, illustrative examples to explain the basics of Apache Flume and how to use it in practice.
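
To illustrate how a Flume agent is wired together from a source, a channel, and a sink, here is a minimal configuration sketch; the agent name, component names, port, and HDFS path are assumptions for illustration, not the course's own setup.

    # Write a minimal agent configuration: a netcat source feeding
    # an HDFS sink through an in-memory channel.
    cat > minimal-agent.conf <<'EOF'
    agent1.sources  = src1
    agent1.channels = ch1
    agent1.sinks    = sink1

    agent1.sources.src1.type = netcat
    agent1.sources.src1.bind = localhost
    agent1.sources.src1.port = 44444

    agent1.channels.ch1.type = memory
    agent1.channels.ch1.capacity = 1000

    agent1.sinks.sink1.type = hdfs
    agent1.sinks.sink1.hdfs.path = /user/hadoop/flume/events
    agent1.sinks.sink1.hdfs.fileType = DataStream

    agent1.sources.src1.channels = ch1
    agent1.sinks.sink1.channel   = ch1
    EOF

    # Start the agent with the flume-ng launcher
    flume-ng agent --conf conf --conf-file minimal-agent.conf --name agent1

The course builds up this same source-channel-sink pattern to Twitter sources and HBase and HDFS sinks in the later lectures.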

This detailed online course will help you understand all of the important concepts and topics of Sqoop.

Through this training, you will learn how Sqoop enables fast import and export of data between Hadoop and structured data stores such as relational databases, NoSQL systems, and enterprise data warehouses.
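
For instance, the Hive and HBase imports listed under "What you'll learn" are variations of the same import command; the table, Hive table, column family, and row key below are placeholders.

    # Import a MySQL table directly into a Hive table
    sqoop import \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username dbuser \
      -P \
      --table products \
      --hive-import \
      --hive-table products

    # Import the same table into an HBase table instead
    sqoop import \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username dbuser \
      -P \
      --table products \
      --hbase-table products \
      --column-family cf \
      --hbase-row-key product_id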

Who this course is for:
  • Professionals aspiring to build a career in Big Data analytics using the Hadoop framework and Sqoop
  • ETL developers, and analytics professionals in general, will also find this course useful