PySpark - Python Spark Hadoop coding framework & testing
What you'll learn
- Industry-standard PySpark coding practices: logging, error handling, reading configuration, and unit testing
- Building a data pipeline using Hive, Spark and PostgreSQL
- PySpark and Hadoop development using PyCharm
Requirements
- Basic programming skills
- Basic database skills
- Entry-level Hadoop knowledge
Description
This course will bridge the gap between your academic and real-world knowledge and prepare you for an entry-level Big Data Python Spark developer role. You will learn the following:
- Python Spark coding best practices
- Logging
- Error handling
- Reading configuration from a properties file (these three practices are sketched in the first example after this list)
- Doing development work using PyCharm
- Using your local environment as a Hadoop Hive environment
- Reading from and writing to a Postgres database using Spark (second example below)
- The Python unit testing framework (third example below)
- Building a data pipeline using Hadoop, Spark, and Postgres
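To make the first few bullets concrete, here is a minimal sketch of logging, error handling, and configuration reading combined in a single PySpark job. The file name (pipeline.ini), the [SPARK] section, and the keys are illustrative assumptions, not code from the course:

```python
# Minimal sketch: logging, error handling, and config reading in a PySpark job.
# pipeline.ini, the [SPARK] section, and its keys are hypothetical placeholders.
import configparser
import logging

from pyspark.sql import SparkSession

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(name)s: %(message)s")
logger = logging.getLogger(__name__)


def read_config(path="pipeline.ini"):
    """Read job settings from a properties-style file."""
    config = configparser.ConfigParser()
    config.read(path)
    return config


def main():
    config = read_config()
    app_name = config.get("SPARK", "app_name", fallback="demo-pipeline")
    try:
        spark = (SparkSession.builder
                 .appName(app_name)
                 .enableHiveSupport()  # use the local Hive metastore
                 .getOrCreate())
        logger.info("Spark session started for %s", app_name)
        df = spark.sql("SHOW DATABASES")  # sanity check against Hive
        logger.info("Found %d databases", df.count())
    except Exception:
        logger.exception("Pipeline failed")
        raise


if __name__ == "__main__":
    main()
```

Keeping settings in a properties file rather than in code means the same job can run unchanged against a local sandbox or a cluster.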
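Reading from and writing to Postgres is typically done through Spark's built-in JDBC data source. In this sketch the URL, table names, and credentials are placeholders, and the PostgreSQL JDBC driver is pulled in via spark.jars.packages:

```python
# Sketch of PostgreSQL I/O with Spark's JDBC data source.
# URL, tables, and credentials below are placeholders, not course values.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("postgres-io-demo")
         .config("spark.jars.packages", "org.postgresql:postgresql:42.7.3")
         .getOrCreate())

jdbc_url = "jdbc:postgresql://localhost:5432/demo_db"
props = {"user": "demo_user", "password": "demo_pass",
         "driver": "org.postgresql.Driver"}

# Read a table into a DataFrame.
orders = spark.read.jdbc(url=jdbc_url, table="orders", properties=props)

# Transform, then write the result back to another table.
daily = orders.groupBy("order_date").count()
daily.write.jdbc(url=jdbc_url, table="daily_order_counts",
                 mode="overwrite", properties=props)
```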
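For unit testing, a common pattern is to start a local SparkSession once per test class and assert on small in-memory DataFrames. The transformation under test here (add_total) is a hypothetical example, not from the course:

```python
# Sketch of unit testing a PySpark transformation with the standard
# unittest module. add_total is a made-up function for illustration.
import unittest

from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def add_total(df):
    """Example transformation: add a total = price * quantity column."""
    return df.withColumn("total", F.col("price") * F.col("quantity"))


class AddTotalTest(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # local[2] keeps the test self-contained; no cluster needed.
        cls.spark = (SparkSession.builder
                     .master("local[2]")
                     .appName("tests")
                     .getOrCreate())

    @classmethod
    def tearDownClass(cls):
        cls.spark.stop()

    def test_total_column(self):
        df = self.spark.createDataFrame([(2.0, 3)], ["price", "quantity"])
        result = add_total(df).collect()[0]
        self.assertAlmostEqual(result["total"], 6.0)


if __name__ == "__main__":
    unittest.main()
```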
Prerequisites:
- Basic programming skills
- Basic database knowledge
- Entry-level Hadoop knowledge
Who this course is for:
- Students looking to move from an academic Big Data Spark background to a real-world developer role
Instructor
Welcome to FutureXSkills, where we specialize in creating high-quality video content to empower Data Engineers and Data Scientists. With over 50,000 students on Udemy, we take pride in offering top-rated courses that use a simplified, step-by-step approach to learning.
Our courses and videos cater to individuals of all levels, from beginners to experts, and are designed to help you deepen your understanding of data engineering and data science concepts.
Our team of experienced instructors and data professionals works tirelessly to create videos that are easy to follow and comprehend. We achieve this by using clear explanations, visual aids, and real-world examples to make complex topics more accessible. Our video content covers a broad range of topics, including Machine Learning and Big Data technologies, and is designed to provide a comprehensive understanding of each topic.