A Crash Course In PySpark

Learn all the fundamentals of PySpark
New
Rating: 4.6 out of 5 (25 ratings)
2,655 students
PySpark, Apache Spark, Big Data Analytics, Big Data Processing, Python

Requirements

  • Python Familiarity, which can be learned through my 'No Nonsense Python' course
Description

Spark is one of the most in-demand Big Data processing frameworks right now.


This course will take you through the core concepts of PySpark. We will work to enable you to do most of the things you’d do in SQL or with Python’s Pandas library, that is:

  • Getting hold of data
  • Handling missing data and cleaning it up
  • Aggregating your data
  • Filtering it
  • Pivoting it
  • Writing it back out

All of these things will enable you to leverage Spark on large datasets and start getting value from your data, as sketched below.
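To give a flavour of what that workflow looks like in code, here is a minimal sketch of those steps in PySpark. The file name (sales.csv) and column names (region, product, amount) are made up for illustration; the course builds these steps up gradually on its own dataset.

```python
# Minimal PySpark workflow sketch: read, clean, filter, aggregate, pivot, write.
# File path and column names are hypothetical, for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pyspark-crash-course-sketch").getOrCreate()

# Getting hold of data: read a CSV file into a DataFrame
df = spark.read.csv("sales.csv", header=True, inferSchema=True)

# Handling missing data and cleaning it up
df = df.dropDuplicates().na.drop(subset=["region"]).na.fill({"amount": 0})

# Filtering it
df = df.filter(F.col("amount") > 0)

# Aggregating and pivoting it: total amount per region, one column per product
summary = df.groupBy("region").pivot("product").agg(F.sum("amount"))

# Writing it back out: save the result as Parquet
summary.write.mode("overwrite").parquet("sales_summary.parquet")
```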

Let’s get started.

Who this course is for:
  • People wanting to leverage their big data with Spark
Curriculum
5 sections • 19 lectures • 1h 15m total length
  • Introduction
  • How is this course structured
  • Introduction to our development environment
  • Introduction to our dataset & dataframes
  • Environment configuration code snippet
  • Ingesting & Cleaning Data
  • Answering our scenario questions
  • Bringing data into dataframes
  • Inspecting A Dataframe
  • Handling Null & Duplicate Values
  • Selecting & Filtering Data
  • Applying Multiple Filters
  • Running SQL on Dataframes
  • Adding Calculated Columns
  • Group By And Aggregation
  • Writing Dataframe To Files
  • Challenge Overview
  • Challenge Solution
  • Thanks for joining me to learn PySpark!

Instructor
Kieran Keene
Data Engineer at Kodey
  • 4.5 Instructor Rating
  • 156 Reviews
  • 12,475 Students
  • 3 Courses

Hey guys! I am a data engineer by trade and specialize in Python, SQL, Spark, Hive, MongoDB and more. I've come to Udemy to make simple, short crash courses on these technologies, as I personally find longer courses too drawn out and often lose interest. The idea is to keep it short and sharp!


For loads of advanced Spark, Python & Big Data topics, please visit my website (the button on this page will take you there), where I talk about scaling up to enterprise-grade solutions.