Apache Sqoop for Certifications - CCA and HDPCD

A deep dive into all the nuances of Apache Sqoop
Rating: 4.3 out of 5 (339 ratings)
8,693 students
Data ingestion from RDBMS to Hadoop using Apache Sqoop
Prepare for the Sqoop portion of CCA 175 Spark and Hadoop Developer
Prepare for the Sqoop portion of HDPCD with Hive, Sqoop and Flume

Requirements

  • Basic Linux skills
  • Basic programming skills
  • Cloudera QuickStart VM, Hortonworks Sandbox, a valid account for ITVersity Big Data labs, or any Hadoop cluster where Sqoop and Hive are integrated
  • A 64-bit operating system with at least the minimum memory required by the environment you choose
Description

As part of this course, we will:

  • Review the various setup options for exploring Sqoop
  • Understand how to import data from a MySQL database into Hadoop HDFS/Hive
  • Cover all the important control arguments used when performing imports
  • Export data from Hive/HDFS back to MySQL

After the course, you will be able to confidently execute Sqoop scenarios on the certification exams and make better decisions when building data integration frameworks with Sqoop.
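
As an illustration, here is a minimal sketch of the core workflow the course covers, assuming a hypothetical MySQL database retail_db reachable on localhost with user retail_user (substitute your own connection details and paths):

    # Import the orders table from MySQL into HDFS
    sqoop import \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username retail_user \
      --password itversity \
      --table orders \
      --target-dir /user/training/orders

    # Export the imported files from HDFS back into a MySQL table
    sqoop export \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username retail_user \
      --password itversity \
      --table orders_export \
      --export-dir /user/training/orders

Note that passing --password on the command line is insecure; the Sqoop Jobs part of the curriculum covers password files as the alternative.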

Who this course is for:
  • Any IT aspirant/professional willing to learn Sqoop for certifications or projects
Curriculum
7 sections • 80 lectures • 6h 28m total length
  • Introduction
  • Setup Options
  • Setup Cloudera QuickStart VM
  • Setup Hortonworks Sandbox
  • Data Sets and Big Data labs for practicing Sqoop - from ITVersity
  • Using Windows - Putty
  • Using Windows - Cygwin
  • Introduction to Sqoop
  • Validate Source Database - MySQL
  • Review JDBC Jar file to connect to MySQL
  • Getting Help for Sqoop from the Command Line
  • Overview of Sqoop User Guide
  • Validate Sqoop and MySQL integration using "sqoop list-databases"
  • List tables in MySQL using "sqoop list-tables"
  • Run Queries in MySQL using "sqoop eval"
  • Understanding Logs in Sqoop
  • Redirecting Sqoop Logs into files
  • Overview of Sqoop Import Command
  • Perform Sqoop Import of orders - --table and --target-dir
  • Perform Sqoop import of order_items - --warehouse-dir
  • Sqoop Import - Managing HDFS Directories - append or overwrite or fail
  • Sqoop Import - Execution Flow
  • Reviewing logs of Sqoop Import
  • Sqoop Import - Specifying Number of Mappers
  • Review the Output Files
  • Sqoop Import - Supported File Formats
  • Validating Avro Files using avro-tools
  • Sqoop Import - Using Compression
  • Sqoop Import - Customizing - Introduction
  • Sqoop Import - Specifying Columns
  • Sqoop Import - Using boundary query
  • Sqoop Import - Filter unnecessary data
  • Sqoop Import - Using Split By
  • Sqoop Import - Importing Query Results
  • Sqoop Import - Dealing with Composite Keys
  • Sqoop Import - Dealing with Primary Key or Split By using Non Numeric Field
  • Sqoop Import - Dealing with Tables without a Primary Key
  • Sqoop Import - Autoreset to One Mapper
  • Sqoop Import - Default Delimiters using Text File Format
  • Sqoop Import - Specifying Delimiters - Import NYSE Data with \t as delimiter
  • Sqoop Import - Dealing with NULL Values
  • Sqoop Import - import-all-tables
  • Quick Overview of Hive
  • Sqoop Import - Create Hive Database
  • Creating empty Hive Table using create-hive-table
  • Sqoop Import - Import orders table to Hive Database
  • Sqoop Import - Managing Table using Hive Import - Overwrite
  • Sqoop Import - Managing Table using Hive Import - Error out - create-hive-table
  • Sqoop Import - Understanding Execution Flow while importing into Hive Table
  • Sqoop Import - Review files in Hive Tables
  • Sqoop Delimiters vs. Hive Delimiters - Text Files
  • Sqoop Import - Hive File Formats
  • Sqoop Import all tables - Hive
  • Introduction
  • Prepare data for Export
  • Creating Table in MySQL
  • Sqoop Export - Perform Simple Export - --table and --export-dir
  • Sqoop Export - Execution Flow
  • Sqoop Export - Specifying Number of Mappers
  • Sqoop Export - Troubleshooting the issues
  • Sqoop Export - Merging or Upserting Overview
  • Sqoop Export - Quick Overview of MySQL for Upsert
  • Sqoop Export - Using update-mode - update-only (default)
  • Sqoop Export - Using update-mode - allow-insert
  • Sqoop Export - Specifying Columns
  • Sqoop Export - Specifying Delimiters
  • Sqoop Export - Using Stage Table
  • Overview of Sqoop Jobs
  • Adding Password File
  • Creating Sqoop Job
  • Running Sqoop Job
  • Overview of Incremental Imports
  • Incremental Import - Using where
  • Incremental Import - Append Mode
  • Incremental Import - Create training_orders_incr in retail_export
  • Incremental Import - Create Job
  • Incremental Import - Execute Job
  • Incremental Import - Add additional data (order_id > 30000)
  • Incremental Import - Rerun the job and validate results
  • Incremental Import - Using mode lastmodified
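
The sketches below illustrate representative commands behind the lecture titles above; the connection details (localhost, retail_db, retail_user, passwords and paths) are hypothetical placeholders throughout. First, validating the Sqoop and MySQL integration, as covered in the "sqoop list-databases", "sqoop list-tables" and "sqoop eval" lectures:

    # List the databases visible to this MySQL user
    sqoop list-databases \
      --connect jdbc:mysql://localhost:3306 \
      --username retail_user \
      --password itversity

    # List the tables in retail_db
    sqoop list-tables \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username retail_user \
      --password itversity

    # Run an ad hoc query against MySQL without importing anything
    sqoop eval \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username retail_user \
      --password itversity \
      --query 'SELECT count(1) FROM orders'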
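
A customized import combining several of the control arguments from the import lectures (columns, filtering, split-by, number of mappers, file format, compression, delimiters and NULL handling); a sketch under the same assumptions, not the course's exact commands:

    # Parallel import of selected columns in compressed Avro format
    sqoop import \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username retail_user \
      --password itversity \
      --table order_items \
      --columns order_item_id,order_item_order_id,order_item_subtotal \
      --where 'order_item_order_id > 10000' \
      --split-by order_item_order_id \
      --num-mappers 4 \
      --as-avrodatafile \
      --compress \
      --warehouse-dir /user/training/retail_db

    # Text-format import with explicit delimiters and NULL handling
    sqoop import \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username retail_user \
      --password itversity \
      --table orders \
      --fields-terminated-by '\t' \
      --null-string 'NA' \
      --null-non-string '-1' \
      --delete-target-dir \
      --target-dir /user/training/orders_tsv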
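
Importing straight into a Hive table, as in the Hive import lectures (the Hive database and table names here are made up for the example):

    # Import orders into a Hive table, replacing any existing data
    sqoop import \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username retail_user \
      --password itversity \
      --table orders \
      --hive-import \
      --hive-database training_retail \
      --hive-table orders \
      --hive-overwrite \
      --num-mappers 2

With --create-hive-table instead of --hive-overwrite, Sqoop errors out if the table already exists, which is the behavior contrasted in the two "Managing Table using Hive Import" lectures.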
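
An upsert-style export matching the update-mode lectures: updateonly (the default) only updates rows whose key already exists, while allowinsert also inserts rows whose key is not present. Table and key names are again assumptions:

    # Update existing rows on order_id and insert new ones
    sqoop export \
      --connect jdbc:mysql://localhost:3306/retail_export \
      --username retail_user \
      --password itversity \
      --table orders_export \
      --export-dir /user/training/orders \
      --update-key order_id \
      --update-mode allowinsert

For plain insert-only exports, --staging-table and --clear-staging-table load the data into a staging table first so the target table is populated in a single transaction; staging is not supported together with update modes.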
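
Finally, a sketch combining the Sqoop Jobs and Incremental Imports sections: a password file plus a reusable job doing an incremental append import (file names and the check column are assumptions):

    # Store the password in an HDFS file readable only by the owner;
    # echo -n avoids a trailing newline, which Sqoop would treat as part of the password
    echo -n 'itversity' > sqoop.password
    hdfs dfs -put sqoop.password /user/training/sqoop.password
    hdfs dfs -chmod 400 /user/training/sqoop.password

    # Create a reusable job that imports only rows with order_id above the stored last-value
    sqoop job --create orders_incr \
      -- import \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username retail_user \
      --password-file /user/training/sqoop.password \
      --table orders \
      --target-dir /user/training/orders_incr \
      --incremental append \
      --check-column order_id \
      --last-value 0

    # Execute the job; Sqoop's metastore updates the last-value after each run,
    # so rerunning the job picks up only newly added rows
    sqoop job --exec orders_incr

With --incremental lastmodified, the check column is a timestamp instead, and --merge-key tells Sqoop how to collapse updated rows into the existing data set.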

Instructors
Durga Viswanatha Raju Gadiraju
Technology Adviser and Evangelist
  • 4.1 Instructor Rating
  • 8,037 Reviews
  • 140,918 Students
  • 19 Courses

13+ years of experience executing complex projects using a vast array of technologies, including Big Data and Cloud.

I founded ITVersity, LLC, a US-based startup, to provide quality training for IT professionals as well as staffing and consulting solutions for enterprise clients. I have trained thousands of IT professionals in a vast array of technologies, including Big Data and Cloud.

Building IT careers for people and providing quality services to clients are paramount to our organization.

As an entry strategy, ITVersity provides quality training in the areas of ABCD:

* Application Development
* Big Data and Business Intelligence
* Cloud
* Datawarehousing, Databases

Itversity Support
Support Account for ITVersity Courses
  • 4.1 Instructor Rating
  • 7,783 Reviews
  • 138,836 Students
  • 18 Courses

We have built a team to provide ongoing support. If you send messages to this account about our courses, they will go to our Helpdesk, from where they will be routed to our team.