Big Data Internship Program - Foundation
A Complete Guide to Learning Big Data and Hadoop from Scratch.
Created by Big Data Trunk
After completing this course, students will have the foundational knowledge that is vital to becoming a successful big data professional.
Requirements
- Basic Linux/Unix commands
- Knowledge of at least one programming language
Description
This course is part of the “Big Data Internship Program,” which is aligned with the stages of a typical big data project life cycle:
- Foundation
- Ingestion
- Storage
- Processing
- Visualization
This Part 1 course focuses on the foundations of big data. It covers technical and non-technical topics such as:
Technical Foundation
- Refresh your knowledge of Unix
- Learn Java as it is used in big data
- Understand Git and GitHub, which most companies use for source control
- Hadoop Installation
Non-technical Foundation
- Understand the big data project life cycle
- Understand the roles in a big data implementation
- Understand the real-life project
Big data Topics
- Learn about the Hadoop ecosystem
- HDFS
- MapReduce
- Why Spark?
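As a concrete preview of the MapReduce topic listed above, here is a minimal sketch of the classic word-count job written against the Hadoop MapReduce Java API (Java is the language this course builds on). The class name, job name, and input/output paths are illustrative assumptions, not material taken from the course itself.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Illustrative word-count job: counts how often each word appears in the input files.
public class WordCount {

  // Map step: split each input line into words and emit (word, 1) pairs.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer tokens = new StringTokenizer(value.toString());
      while (tokens.hasMoreTokens()) {
        word.set(tokens.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce step: sum all the 1s emitted for each word.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  // Driver: wires the mapper and reducer into a job. args[0] is the HDFS input
  // path and args[1] the output path (both placeholders here).
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Assuming a working Hadoop installation (covered under Technical Foundation), a jar built from this class could be submitted with the standard `hadoop jar` launcher, e.g. `hadoop jar wordcount.jar WordCount /input /output`; the jar name and paths are placeholders.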
Who this course is for:
- This course is for anyone who wants to learn about Big Data technologies.
- Anyone who wants to learn and work with big data through an internship program.
Course content
9 sections • 47 lectures • 4h 37m total length
- Welcome to Internship Course (02:34)
- Intro to Big Data Trunk (01:47)
- Project Overview (04:43)
Instructor
All about Big Data and Hadoop
Big Data Trunk is a leading Big Data-focused consulting and training firm founded by industry veterans in the data domain. It helps its customers gain a competitive advantage from open source, big data, cloud, and advanced analytics. It provides services such as strategy consulting, advisory consulting, and high-quality classroom, individual, and corporate training.