Data Analysis with Pandas and Python

Analyze data quickly and easily with Python's powerful pandas library! All datasets included --- beginners welcome!
Bestselling
4.6 (749 ratings)
21,879 students enrolled
Created by Boris Paskhaver
Last updated 8/2017
English
Current price: $10 Original price: $60 Discount: 83% off
30-Day Money-Back Guarantee
Includes:
  • 19 hours on-demand video
  • 1 Article
  • 2 Supplemental Resources
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What Will I Learn?
  • Perform a multitude of data operations in Python's popular "pandas" library including grouping, pivoting, joining and more!
  • Learn hundreds of methods and attributes across numerous pandas objects
  • Possess a strong understanding of manipulating 1D, 2D, and 3D data sets
  • Resolve common issues in broken or incomplete data sets
Requirements
  • Basic / intermediate experience with Microsoft Excel or other spreadsheet software (common functions, VLOOKUPs, pivot tables, etc.)
  • Basic experience with the Python programming language
  • Strong knowledge of data types (strings, integers, floating-point numbers, booleans), etc.
Description

Student Testimonials:

  • The instructor knows the material, and has detailed explanation on every topic he discusses. Has clarity too, and warns students of potential pitfalls. He has a very logical explanation, and it is easy to follow him. I highly recommend this class, and would look into taking a new class from him. - Diana

  • This is excellent, and I cannot complement the instructor enough. Extremely clear, relevant, and high quality - with helpful practical tips and advice. Would recommend this to anyone wanting to learn pandas. Lessons are well constructed. I'm actually surprised at how well done this is. I don't give many 5 stars, but this has earned it so far. - Michael

  • This course is very thorough, clear, and well thought out. This is the best Udemy course I have taken thus far. (This is my third course.) The instruction is excellent! - James


Welcome to the most comprehensive Pandas course available on Udemy! An excellent choice for both beginners and experts looking to expand their knowledge of one of the most popular Python libraries in the world!

Data Analysis with Pandas and Python offers 19+ hours of in-depth video tutorials on the most powerful data analysis toolkit available today. Lessons include:

  • installing
  • sorting
  • filtering
  • grouping
  • aggregating
  • de-duplicating
  • pivoting
  • munging
  • deleting
  • merging
  • visualizing

and more!

Why learn pandas?

If you've spent time in spreadsheet software like Microsoft Excel, Apple Numbers, or Google Sheets and are eager to take your data analysis skills to the next level, this course is for you! 

Data Analysis with Pandas and Python introduces you to the popular Pandas library built on top of the Python programming language. 

Pandas is a powerhouse tool that allows you to do anything and everything with colossal data sets -- analyzing, organizing, sorting, filtering, pivoting, aggregating, munging, cleaning, calculating, and more! 

I call it "Excel on steroids"!

Over the course of more than 19 hours, I'll take you step-by-step through Pandas, from installation to visualization! We'll cover hundreds of different methods, attributes, features, and functionalities packed away inside this awesome library. We'll dive into tons of different datasets, short and long, broken and pristine, to demonstrate the incredible versatility and efficiency of this package.

Data Analysis with Pandas and Python is bundled with dozens of datasets for you to use. Dive right in and follow along with my lessons to see how easy it is to get started with pandas!

Whether you're a new data analyst or have spent years (*cough* too long *cough*) in Excel, Data Analysis with Pandas and Python offers you an incredible introduction to one of the most powerful data toolkits available today!

Who is the target audience?
  • Data analysts and business analysts
  • Excel users looking to learn a more powerful software for data analysis
Curriculum For This Course
174 Lectures
18:47:30
Installation and Setup
21 Lectures 02:08:50
  • Introduces Python, pandas, Anaconda, Jupyter Notebook, and the course prerequisites
  • Explores sample Jupyter Notebooks to showcase the power of pandas for data analysis
  • The pandas.zip file with the working files for this course is attached to this lesson. 
  • Download and unpack the pandas.zip file in the directory of your choice.
Preview 12:15

Completed Course Files
00:25

The next batch of lessons focuses on the installation and configuration process for pandas on a Mac machine. In this lesson, we download the Anaconda distribution from the Continuum Analytics website.

If you're new to Python, choose the 3.5 version of the distribution.

Preview 03:28

In this lesson, we install the Anaconda distribution on a Mac OS machine from the executable package we downloaded. The process installs Python and over 100 of the most popular libraries for data science in a central directory on your computer.

Preview 07:04

The Terminal is an application for communicating with your Mac with text-based commands. In this lesson, you'll learn two ways to access the Terminal on a Mac OS machine.

Mac OS - Access the Terminal
01:55

We need to install and update some Python libraries to ensure a smooth process with Jupyter Notebooks and pandas. In this lesson, we use the Terminal to complete the update process.

Mac OS - Update Anaconda Libraries
11:18

This course is bundled with a collection of .csv and .xlsx files for you to use. I strongly recommend following along with my tutorials by practicing the syntax on your end.

In this lesson, I'll explain the startup and shutdown process for a Jupyter Notebook session. Follow this process every time you come back to the course.

Mac OS - Unpack Course Materials + The Startup and Shutdown Process
10:01

The Windows operating system comes in 32-bit and 64-bit versions. In this lesson, we'll access the Control Panel to determine what category your computer falls into and then download the proper version of the Anaconda distribution on the Continuum Analytics website.

Preview 03:47

Run the Anaconda installer package on a Windows computer. The executable installs Python, pandas, Jupyter Notebook and over 100 popular libraries for data analysis.

Preview 05:16

Access the Command Prompt on a Windows machine. The prompt (also known as the command line) is used to interact with the computer with text-based commands. We'll use it to download additional Python libraries for the course and update all installed Anaconda libraries.

Windows - Access the Command Prompt and Update Anaconda Libraries
10:11

This course is bundled with .csv and .xlsx files. The primary .zip file is attached to the first lesson of this course. In this lesson, we'll unpack the course materials and learn the startup and shutdown process for a Jupyter Notebook. Follow this process as you proceed throughout the course.

Windows - Unpack Course Materials + The Startup and Shutdown Process
08:49

Explore the Jupyter Notebook interface including the toolbars and buttons. We'll also dive into the Kernel > Restart options, which reset the connection between the server and the Notebook.

Intro to the Jupyter Notebook Interface
05:14

Learn about the two different modes (Edit Mode and Command Mode) within a Jupyter Notebook. Edit Mode modifies the contents of a cell and Command Mode enables keyboard shortcuts to work on the entire Notebook as a whole.

Cell Types and Cell Modes
07:03

Learn the multiple keyboard shortcuts to execute code cells and Markdown cells. We'll also learn how Jupyter Notebook chooses what to output below a cell that has multiple commands.

Code Cell Execution
04:47

Memorize some popular keyboard shortcuts for adding and deleting cells in a Jupyter Notebook.

Popular Keyboard Shortcuts
03:06

Use the import keyword to import Python libraries into a Jupyter Notebook. This lesson covers most of the libraries we will utilize throughout the course including pandas, numpy, and matplotlib.

Import Libraries into Jupyter Notebook
07:09

This next batch of lessons offers a quick crash course on the Python programming language. In this lesson, we'll review Python comments, the built-in type function, and variables. 

Python Crash Course, Part 1 - Data Types and Variables
07:05

In this lesson, we'll review Python lists and how to extract values from them by index position. A list is the equivalent of an array in other programming languages. It is used to store an ordered collection of objects.

Python Crash Course, Part 2 - Lists
05:06

Review the Python dictionary object which associates keys with values. The keys must be unique; the values can be duplicated. Dictionaries are created with curly braces and comma-separated key-value pairs.

Python Crash Course, Part 3 - Dictionaries
04:19

Review Python's mathematical and equality operators. These will be critical for pandas filtering processes later in the course.

Python Crash Course, Part 4 - Operators
04:30

Define and call a sample Python function. A function is a reusable chunk of code that can accept inputs (arguments) and return outputs. We'll use custom functions later on our pandas object to apply operations to all values in a dataset.

Python Crash Course, Part 5 - Functions
06:02
Series
22 Lectures 02:06:12

Create a Jupyter Notebook for the Series module. The Series is a one-dimensional pandas object that combines the best features of a Python list and a Python dictionary.

Create Jupyter Notebook for the Series Module
02:12

A pandas Series can be created with the pd.Series() constructor method. In this lesson, we'll practice creating a few sample Series by feeding in Python lists as inputs to the constructor method.

Preview 10:32
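
If you'd like a feel for what this looks like before starting the module, here's a minimal sketch (the sample values are my own, not from the course files):

    import pandas as pd

    # A Python list becomes the Series values; pandas attaches a default
    # numeric index (0, 1, 2, ...) alongside them
    flavors = pd.Series(["Chocolate", "Vanilla", "Strawberry"])
    lottery = pd.Series([4, 8, 15, 16, 23, 42])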

The pd.Series() constructor method accepts a variety of inputs. In this lesson, we'll create a Series from a Python dictionary. We'll also explore the differences between the pandas Series and Python's built-in objects, and understand how the index operates in a Series.

Create A Series Object from a Python Dictionary
03:06

Objects in pandas have attributes and methods. Methods actively interact with and modify the object while attributes return information about the object's state. In this lesson, we'll use the .values, .index, and .dtype attributes on a Series object.

Intro to Attributes
07:17

In this lesson, we'll continue our exploration of methods on pandas objects. We'll utilize the .sum(), .product(), and .mean() mathematical methods on a sample Series.

Preview 04:42

Parameters are the options that a method has. Arguments are the choices we choose for those options. In this lesson, we'll learn the syntax of supplying arguments to parameters on pandas methods.

Parameters and Arguments
10:10

The time has come to import our first datasets into our Jupyter Notebook work environment. We'll use the pd.read_csv() method to import 2 CSV files, then modify the squeeze parameter's argument to import the data as a Series object instead of a DataFrame.

Import Series with the .read_csv() Method
10:23
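
As a rough sketch of the import step (the pokemon.csv file and Pokemon column are placeholders for whichever course file you follow along with; note that squeeze was removed in pandas 2.0, where you'd call .squeeze("columns") on the DataFrame instead):

    import pandas as pd

    # usecols limits the import to a single column; squeeze=True (pandas < 2.0)
    # collapses that one-column result into a Series rather than a DataFrame
    pokemon = pd.read_csv("pokemon.csv", usecols=["Pokemon"], squeeze=True)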

Use the .head() and .tail() methods to return a specified number of rows from the beginning or end of a Series. The methods return a brand new Series.

The .head() and .tail() Methods
03:42

See how the Series interacts with Python's built-in functions including len, type, sorted, list, dict, max, and min. pandas works seamlessly with all of them.

Python Built-In Functions
05:20

Explore some new attributes on the pandas Series object, including .size, .name, and .is_unique. Attributes return information about the object's state; methods directly interact with and modify the object.

More Series Attributes
06:13

Call the .sort_values() method on a Series to sort the values in ascending or descending order. We'll see how this command operates on both a numeric and alphabetical dataset.

Preview 06:04

Modify the argument to the inplace parameter on a Series method to permanently modify the object it is called on. This is an alternative to reassigning the new object to the same variable.

The inplace Parameter
05:07

Call the .sort_index() method on a pandas Series to sort it by the index instead of its values.

The .sort_index() Method
04:38

Use Python's in keyword and attributes to check if a value exists in either the values or index of a Series. If the .index or .values attribute is not included, pandas will default to searching among the Series index.

Python's in Keyword
04:00

Use bracket notation to extract Series values by their index position.

Extract Series Values by Index Position
04:15

Use bracket notation to extract Series values by their index labels.

Extract Series Values by Index Label
07:22

Call the .get() method on a Series to extract values from a Series. This is alternative syntax to the traditional bracket syntax.

The .get() Method on a Series
05:03

Call popular mathematical methods including .count(), .sum(), and .mean() on a Series. There are additional statistical methods available in the official pandas documentation.

Math Methods on Series Objects
05:39

Call the .idxmax() and .idxmin() methods to extract the index labels of the highest or lowest values in a Series. We'll see how these can be used to extract the highest / lowest values as well.

The .idxmax() and .idxmin() Methods
03:10

Call the .value_counts() method to count the number of the times each unique value occurs in a Series. The result will be a brand new Series where each unique value from the original Series serves as an index label.

The .value_counts() Method
03:39
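
A tiny illustration with made-up data:

    import pandas as pd

    colors = pd.Series(["red", "blue", "red", "green", "red"])

    # Each unique value becomes an index label; the values are occurrence counts
    colors.value_counts()                 # red: 3, blue: 1, green: 1
    colors.value_counts(ascending=True)   # smallest counts first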

Call the .apply() method and feed it a Python function as an argument to use the function on every Series value. This is helpful for executing custom operations that are not included in pandas or numpy.

The .apply() Method
06:46
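
A small sketch with an invented classify() function (not from the course files):

    import pandas as pd

    prices = pd.Series([2.99, 14.50, 7.25, 99.00])

    def classify(price):
        # Any plain Python function works; it receives one value at a time
        return "cheap" if price < 10 else "expensive"

    prices.apply(classify)   # returns a new Series of "cheap" / "expensive" labels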

Call the .map() method to tie together the values from one object to another. We'll practice with (a) two Series and (b) a Series and a dictionary object.

The .map() Method
06:52

Review the pandas Series concepts you explored in this module with this action-packed quiz!

A Review of the Series Module
7 questions
DataFrames I
15 Lectures 01:36:37

Let's create a Jupyter Notebook for this first DataFrame-focused module. We'll import the pandas library and introduce the nba.csv dataset that we'll be using for the next couple of lessons.

Intro to DataFrames I Module
07:24

The pandas Series and DataFrame object share many attributes and methods in common. In this lesson, we'll review popular attributes like .index, .values, .shape, .ndim, and .dtypes and see how they work on a 2-D DataFrame. We'll also introduce new attributes including .columns and .axes that are exclusive to DataFrames.

Shared Methods and Attributes between Series and DataFrames
07:37

Series and DataFrame may share attributes and methods but they are still different objects. In this lesson, we'll see how identical methods operate differently depending on the pandas object they are called on.

Differences between Shared Methods
06:48

Use two syntactical options to extract a single column from a pandas DataFrame. I prefer the square bracket approach because it works 100% of the time. The alternative option is using dot syntax, which treats the columns as attributes of the larger DataFrame object.

Preview 07:57
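
A quick sketch of both options, assuming the nba.csv dataset from the course materials has a Team column:

    import pandas as pd

    nba = pd.read_csv("nba.csv")

    nba["Team"]   # bracket syntax works for every column name
    nba.Team      # dot syntax breaks on names with spaces or names that clash with DataFrame attributes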

In this lesson, we'll select two or more columns from a pandas DataFrame. We'll still need bracket syntax to extract but now we'll include a Python list to specify the specific columns we'd like to pull out. The result will be a new DataFrame.

Select Two or More Columns from a DataFrame
05:12

In addition to extracting existing columns, bracket syntax can be used to create a new column at the right end of a DataFrame and populate it with values. In this lesson, we'll also dive into the alternate .insert() method to insert a column into the middle of a DataFrame.

Add New Column to DataFrame
08:03

A broadcasting operation performs an operation on all values within a pandas object. In this lesson, we'll apply several mathematical operations to values in a DataFrame column (i.e. a Series) including the .add(), .sub(), .mul() and .div() methods. We'll also cover the operator shortcuts for these methods.

Broadcasting Operations
09:07
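
Here's the general shape of these calls, assuming a numeric Salary column in the nba.csv dataset:

    import pandas as pd

    nba = pd.read_csv("nba.csv")

    # Each method applies the operation to every value in the column
    nba["Salary"].add(500000)   # same as nba["Salary"] + 500000
    nba["Salary"].sub(500000)   # same as nba["Salary"] - 500000
    nba["Salary"].mul(1.10)     # same as nba["Salary"] * 1.10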

Refresh your memory on the .value_counts() Series method, which counts the number of times each unique value occurs within the Series. The result is a brand new Series.

A Review of the .value_counts() Method
03:54

Null values are represented with a NaN marker in pandas. In this lesson, we'll delete rows with null (NaN) values by calling the .dropna() method. We'll also modify the arguments of the method to specify how to select the rows to be deleted.

Drop Rows with Null Values
06:41

One alternative to dropping null values is populating them with a predefined value. In this lesson, we'll call the .fillna() method to accomplish this. We'll practice the method on both DataFrame and Series objects.

Fill in Null Values with the .fillna() Method
04:25

Data types in a Series will not always be the types we want or the types that are best for efficiency. In this lesson, we'll convert the data types in a Series with the .astype() method. We'll also show how to overwrite an old Series with a Series of new data values.

The .astype() Method
10:38

Call the .sort_values() method to sort the values in a DataFrame based on the values in a single column. The method is a bit more complex than when called on a single-dimensional pandas Series.

Preview 05:46

In this lesson, we'll explore additional parameters to the .sort_values() method to sort the values in a DataFrame based on the values in multiple columns. We'll also cover how to specify different sort orders (ascending vs. descending) on different columns.

Sort a DataFrame with the .sort_values() Method, Part II
04:13

Call the .sort_index() method to sort the values in a DataFrame based on their index positions or labels instead of their values.

Sort DataFrame with the .sort_index() Method
02:59

Values in a Series can be ranked in order with the .rank() method. In this lesson, we'll practice this method on a numeric Series and then confirm the results through our own sort test.

Rank Values with the .rank() Method
05:53
DataFrames II
10 Lectures 01:16:56

Create the Jupyter Notebook for this second DataFrame-focused module. The focus of this module is filtering -- how we extract rows from a DataFrame that fit one or more conditions. We'll be using an employees.csv dataset consisting of workers from a fictional company.

This Module's Dataset + Memory Optimization
10:45

In this lesson, we'll filter rows from the DataFrame based on a single condition. The logic involves creating a Boolean Series of True and False values, then passing it in square brackets after our DataFrame.

Preview 12:57
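
The pattern looks roughly like this, assuming Gender and Salary columns in the employees.csv dataset:

    import pandas as pd

    employees = pd.read_csv("employees.csv")

    # The comparison builds a Boolean Series of True / False values;
    # passing it inside square brackets keeps only the rows marked True
    mask = employees["Gender"] == "Male"
    employees[mask]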

In this lesson, we'll explore more complex row filtering based on multiple conditions. The syntax requires some additional symbols (&) to specify that we want to check the truthiness of multiple conditions.

Filter with More than One Condition (AND - &)
04:41

In this lesson, we'll continue filtering rows from the DataFrame based on multiple conditions. However, this time we'll use a new symbol ( | ) to specify an OR check. This requires only one of the tested conditions to evaluate to True in order to include the row.

Filter with More than One Condition (OR - |)
08:35

A common data challenge is extracting a value only if it is in a collection of values. Instead of creating multiple OR statements, we can invoke the .isin() method to extract rows from a DataFrame where a column value exists in a predefined collection such as a Python list.

The .isin() Method
06:17

Call the .isnull() and .notnull() methods to create Boolean Series for extracting rows with null or non-null values. Both methods return a Boolean Series object, which can be passed within square brackets after the DataFrame to filter it.

The .isnull() and .notnull() Methods
05:07

Call the .between() method to extract rows where a column value falls within a predefined range. This is another method that returns a Boolean Series object, which can be passed within square brackets after the DataFrame to filter it.

The .between() Method
06:51
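
For example, assuming a numeric Salary column in the employees.csv dataset:

    import pandas as pd

    employees = pd.read_csv("employees.csv")

    # .between() is inclusive of both endpoints and returns a Boolean Series
    employees[employees["Salary"].between(60000, 70000)]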

Call the .duplicated() method to create a Boolean Series and use it to extract rows that have duplicate values. This is another example of a method that returns a Boolean Series object, which can be passed within square brackets after the DataFrame to filter it.

The .duplicated() Method
09:05

An alternative option to identifying duplicate rows and removing them through filtering is the .drop_duplicates() method. In this lesson, we'll invoke the method to remove rows with duplicate values in a DataFrame. We'll also provide custom arguments to modify how the method operates.

The .drop_duplicates() Method
08:16

Call the .unique() and .nunique() methods on a Series to extract the unique values and a count of the unique values. These methods are one letter apart but return completely different results. In addition, the .nunique() method requires an additional argument to include null values in its count.

The .unique() and .nunique() Methods
04:22
DataFrames III
17 Lectures 01:52:12

Create the Jupyter Notebook for this third DataFrame-focused module. These lessons cover how to:

  • set and reset an index in a DataFrame
  • retrieve rows by index position or index label
  • set new values for one or more cells in the DataFrame
  • rename or delete rows or columns
  • create a random sample of rows / columns

and more!

Intro to the DataFrames III Module + Import Dataset
03:23

By default, pandas adds a numeric index starting at 0. In this lesson, we'll call the .set_index() and .reset_index() methods to alter the index of a DataFrame.

The .set_index() and .reset_index() Methods
05:37

One or more rows can be extracted from a DataFrame based on index position or index labels. In this lesson, we'll use the .loc[] method to retrieve rows based on index label.

Retrieve Rows by Index Label with .loc[]
09:43
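
A sketch of the syntax, assuming the jamesbond.csv dataset with its Film column set as the index (the labels below are illustrative):

    import pandas as pd

    bond = pd.read_csv("jamesbond.csv", index_col="Film")

    bond.loc["Goldfinger"]                  # one row by its index label
    bond.loc["Dr. No":"Goldfinger"]         # label slices include both endpoints
    bond.loc[["Moonraker", "Octopussy"]]    # a list pulls several labels at once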

One or more rows can be extracted from a DataFrame based on index position or index labels. In this lesson, we'll use the .iloc[] method to retrieve rows based on index position.

Retrieve Rows by Index Position with .iloc[]
06:07

Use the .ix[] method to retrieve DataFrame rows based on either index label or index position. This is a catch-all method that combines the best features of the .loc[] and .iloc[] methods.

The Catch-All .ix[] Method
08:44

The .loc[], .iloc[], and .ix[] methods can take second arguments to specify the column(s) that should be extracted. In this lesson, we'll practice extracting movies from our dataset with this syntax.

Second Arguments to .loc[], .iloc[], and .ix[] Methods
06:21

In this lesson, we'll discuss how to assign a new value to one cell in a DataFrame. We first extract the cell value by using the .ix[] method with a row and column argument, then reset its value with the assignment operator (=).

Set New Values for a Specific Cell or Row
04:27

We can assign a new value to multiple cells in a DataFrame. In this lesson, we'll use the .ix[] method to extract a subset from a DataFrame, then reassign all column values in that subset.

Set Multiple Values in DataFrame
09:16

In this lesson, we'll call the .rename() method on a DataFrame to change the names of the index labels or column names. The method takes an argument of a Python dictionary where the key represents the current column name and the value represents the new column name. We'll also discuss an alternative syntax (the .columns attribute) for changing the column names.

Preview 06:49

Practice three different syntactical options to delete rows or columns from a DataFrame. They include the .drop() method, the .pop() method, and Python's built in del keyword.

Delete Rows or Columns from a DataFrame
07:29

In this lesson, we'll call the .sample() method to pull out a random sample of rows or columns from a DataFrame. We'll specify the number of values to include by modifying the n parameter.

Create Random Sample with the .sample() Method
04:43

There is a shortcut available to pull out the rows with the smallest or largest values in a column. Instead of sorting the rows and using the .head() method, we can call the .nsmallest() and .nlargest() methods. We'll dive into these methods and their parameters in this lesson.

The .nsmallest() and .nlargest() Methods
05:36

Sometimes, you'll want to retain the structure of the original DataFrame when you extract a subset. In this lesson, we'll call the .where() method to return a modified DataFrame that holds NaN values for all rows that don't match our provided condition.

Filtering with the .where() Method
05:03

Our filtration process so far has involved using official pandas syntax. In this lesson, I'll introduce the .query() method, an alternate string-based syntax for extracting a subset from a DataFrame.

The .query() Method
09:07
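
A sketch of the string-based syntax, assuming Salary and Team columns (placeholder names):

    import pandas as pd

    employees = pd.read_csv("employees.csv")

    # The entire condition lives inside one string; column names are written bare
    employees.query("Salary > 100000 and Team == 'Legal'")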

In this review of a lesson from our Series Module, we'll call the .apply() method on a Series to apply a Python function on every value within it. This will act as a foundation for the next lesson, where we'll invoke the same method on a DataFrame.

A Review of the .apply() Method on Single Columns
05:53

The .apply() method applies a Python function on a row-by-row basis in a DataFrame. In this example, we'll create a custom ranking function for our films, then demonstrate how it can be applied to a DataFrame.

The .apply() Method with Row Values
06:49

The default bracket syntax extracts a component of the larger DataFrame. Any operations on that component will affect the larger DataFrame. If we want to separate the two objects, we can use the .copy() method, which creates an independent copy of a pandas object.

The .copy() Method
07:05
Working with Text Data
9 Lectures 59:42

Datasets can arrive with plenty of poorly formatted data. The Working with Text Data module introduces the string methods available in pandas to clean your data. In this introductory lesson, we'll create the Jupyter Notebook for this module and import a CSV file with public data on Chicago employees. We'll also optimize the DataFrame for speed and efficiency.

Intro to the Working with Text Data Module
05:55

String methods in pandas require a .str prefix to operate properly. In this lesson, we'll explore four popular string methods:

  • .str.lower() to convert a string's characters to lowercase
  • .str.upper() to convert a string's characters to uppercase
  • .str.title() to capitalize the first letter of every word in a string
  • .str.len() to return a count of the number of characters in a string
Preview 07:14
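
A quick sketch, assuming the chicago.csv dataset has a Name column of employee names:

    import pandas as pd

    chicago = pd.read_csv("chicago.csv")

    chicago["Name"].str.lower()   # all lowercase
    chicago["Name"].str.title()   # capitalize each word
    chicago["Name"].str.len()     # character count per value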

The .str.replace() method replaces a substring within a string with another value that the user provides. In this lesson, we'll practice calling the method on a Series of string values. We'll use the method to convert our Employee Annual Salary column to a proper numeric column.

The .str.replace() Method
08:07

In this lesson, we'll introduce the .str.contains(), .str.startswith(), and .str.endswith() methods. All three create a Boolean Series, which can be used to extract rows from a DataFrame. We'll also discuss case normalization to increase the accuracy of our results.

Filtering with String Methods
06:43

In this lesson, we'll invoke the .str.strip() family of methods to remove leading and trailing whitespace from strings in a Series. The .str.lstrip() method removes whitespace from the left side (beginning) of a string, the .str.rstrip() method removes whitespace from the right side (end), and the .str.strip() method does both.

More String Methods - strip, lstrip, and rstrip
04:31

The past few lessons focused on calling string methods on the values in a column of our dataset. In this lesson, we'll familiarize ourselves with calling the same string methods on the index labels and column names of a DataFrame.

String Methods on Index and Columns
05:30

Strings can often contain multiple pieces of information that are separated by a common delimiter. In this lesson, we'll introduce the .str.split() method, which can split a string value based on an occurrence of a user-specified value. This is equivalent to the Text to Columns feature in Microsoft Excel.

Split Strings by Characters with .str.split() Method
08:41

In this lesson, we'll utilize additional parameters on the .str.split() method to modify its performance. We'll extract the first names of all the employees in our dataset, a slightly more challenging puzzle than the one in the previous lesson.

More Practice with Splits
06:01

In this lesson, we'll explore even more parameters on the .str.split() method. The expand parameter allows us to expand the generated Python list into DataFrame columns while the n parameter limits the total number of splits.

The expand and n Parameters of the .str.split() Method
07:00
MultiIndex
15 Lectures 01:30:52

The index of a pandas object can include multiple levels or layers. The object that stores this index is called a MultiIndex. In this lesson, we'll create a Jupyter Notebook for this module and explore our dataset.

Intro to the MultiIndex Module
04:26

In this lesson, we'll create a multi-layer MultiIndex on a DataFrame with the .set_index() method. The method can be passed a list instead of a string to transfer multiple columns to the index.

Create a MultiIndex with the set_index() Method
09:50
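
A sketch of the idea, assuming the bigmac.csv dataset used later in this module has Date and Country columns:

    import pandas as pd

    bigmac = pd.read_csv("bigmac.csv", parse_dates=["Date"])

    # A list of columns creates a two-level MultiIndex (Date outer, Country inner)
    bigmac = bigmac.set_index(keys=["Date", "Country"])
    bigmac.sort_index()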

The .index attribute will return the object that makes up the index of a DataFrame. In this lesson, we'll combine this attribute with the .get_level_values() method to extract the values from one of its layers.

The .get_level_values() Method
07:52

The levels or layers of a MultiIndex can be changed. In this lesson, we'll call the .set_names() method on a MultiIndex object to rename its levels.

The .set_names() Method
03:08

In this lesson, we'll explore how the .sort_index() method operates on a MultiIndex DataFrame. We'll provide a list of arguments to the ascending parameter to modify how each level is sorted.

The sort_index() Method
04:56

In this lesson, we'll review the familiar .loc[] and .ix[] methods for extracting rows from a MultiIndex DataFrame. This time around, we'll feed a tuple argument to specify a value to search for in every layer of the MultiIndex.

Extract Rows from a MultiIndex DataFrame
08:32

In this lesson, we'll call the .transpose() method on a MultiIndex DataFrame to swap its row and column axes. This is a convenience method that avoids having to reset and set each index manually.

The .transpose() Method and MultiIndex on Column Level
05:48

The .swaplevel() method swaps two levels within a MultiIndex. In this lesson, we'll practice this method with our bigmac dataset. If the MultiIndex consists of only two levels, no additional arguments are required.

The .swaplevel() Method
02:34

The .stack() method stacks an index from the column axis to the row axis. It essentially transfers the columns to the row index. In this lesson, we'll see a live example on our bigmac dataset.

Preview 06:01

The .unstack() method does the exact opposite of the .stack() method. It moves an index level from the rows to the columns. In this lesson, we'll call the method without any arguments.

The .unstack() Method, Part 1
03:38

In this lesson, we'll continue our exploration of the .unstack() method. We'll introduce the numerous arguments we can feed it, including positive integers, negative integers, and index level names.

The .unstack() Method, Part 2
06:09

Multiple levels of the row-based MultiIndex can be shifted with the .unstack() method. In this lesson, we'll explore how to provide a list argument to the level parameter to move multiple layers at a time. We'll also introduce the fill_value parameter to plug in missing values in the resulting DataFrame.

The .unstack() Method, Part 3
05:09

In this lesson, we'll reorganize the unique values in a DataFrame column as the column headers with the .pivot() method. This can be a particularly effective method for shortening the length of the DataFrame.

The .pivot() Method
06:34

In this lesson, we'll emulate Excel's Pivot Table functionality with the .pivot_table() method. We'll explore the values, index, columns, and aggfunc parameters. We'll also discuss the variety of aggregation functions that we can use including sum, count, max, and min.

The .pivot_table() Method
10:16
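
A sketch with invented column names (Salesman, Item, Revenue), just to show the shape of the call:

    import pandas as pd

    sales = pd.read_csv("salesmen.csv", parse_dates=["Date"])

    # One row per Salesman, one column per Item, each cell holding the summed Revenue
    sales.pivot_table(values="Revenue", index="Salesman", columns="Item", aggfunc="sum")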

The pd.melt() method can effectively perform anti-pivot operations. In this lesson, we'll call the method on a DataFrame to convert its current data structure into a more tabular format. We'll also explore the optional parameters available to modify the resulting column names in the new DataFrame.

Preview 05:59
GroupBy
7 Lectures 49:33

The pandas DataFrameGroupBy object allows us to create groupings of data based on common values in one or more DataFrame columns. In this lesson, we'll set up a new Jupyter Notebook in preparation for this module.

Intro to the Groupby Module
07:42

The GroupBy object does not offer us much of substance until we call a method on it. In this lesson, we'll call the .first(), .last(), and .size() methods on a GroupBy object to gain a better understanding of its internal data structure.

First Operations with groupby Object
09:33

The .get_group() method extracts a grouping from a GroupBy object. In this lesson, we'll practice pulling out a few groups from our companies dataset.

Retrieve A Group with the .get_group() Method
03:47

Aggregation methods allow us to perform calculations on all groupings within a GroupBy object. In this lesson, we'll call some mathematical methods on the groups, including the .sum(), .mean(), and .max() methods.

Methods on the Groupby Object and DataFrame Columns
08:41
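
A sketch of the pattern, assuming the companies dataset has Sector, Revenue, and Profits columns:

    import pandas as pd

    companies = pd.read_csv("fortune1000.csv")

    sectors = companies.groupby("Sector")
    sectors["Revenue"].sum()    # total revenue per sector
    sectors["Profits"].max()    # largest profit figure within each sector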

A GroupBy object does not have to be made up of values from a single column. In this lesson, we'll create a new GroupBy object based on unique value combinations from two of our DataFrame columns.

Grouping by Multiple Columns
04:35

Certain situations may require different aggregation methods on different columns within our groupings. In this lesson, we'll invoke the .agg() method on our GroupBy object to apply a different aggregation operation to each inner column.

Preview 06:11

A standard Python for loop can be used to iterate over the groups in a pandas GroupBy object. In this lesson, we'll loop over all of our groupings to extract selected rows from each inner DataFrame. We'll append these rows to a running DataFrame and then view the final result.

Iterating through Groups
09:04
Merging, Joining, and Concatenating
12 Lectures 01:26:22

The Merging, Joining, and Concatenating module focuses on combining data from multiple DataFrames into one. In this introductory lesson, we'll set up a new Jupyter Notebook for this module and import the CSV files that we will use.

Intro to the Merging, Joining, and Concatenating Module
05:47

The pd.concat() method is used to concatenate two or more DataFrames together. The process is simple when the DataFrames have an identical structure. In this lesson, we'll also explore how to overwrite the concatenated index with a new one.

The pd.concat() Method, Part 1
05:39

In this lesson, we'll use the keys parameter on the pd.concat() method to identify what DataFrame the rows came from. This will create a MultiIndex DataFrame where the most outer layer will hold the keys we pass as identifiers for each DataFrame.

The pd.concat() Method, Part 2
06:35

In this lesson, we'll call the .append() method on a DataFrame to concatenate another DataFrame to the end. This is an alternative syntax to the .concat() method which is called directly on the pandas library.

The .append() Method on a DataFrame
02:03

An inner join merges the values in two DataFrames based on common values across one or more columns. In this lesson, we'll explore the concept by merging on identical values in a single column.

Inner Joins, Part 1
09:18

This lesson continues our exploration of the .merge() method. This time, we'll merge the values in two DataFrames based on common values in multiple columns. We'll also validate the data with some filtering.

Inner Joins, Part 2
09:01

An outer join combines values that exist in either DataFrame into a central DataFrame. In this lesson, we'll invoke the .merge() method with a modified argument to the how parameter to perform an outer join on our weekly sales data sets.

Outer Joins
12:23

A left join establishes one of the DataFrames as the base dataset for the merge. It attempts to find each value in another DataFrame and drag over that DataFrame's rows when there's a value match. In this lesson, we'll practice executing this join with the .merge() method.

Preview 09:19

DataFrames may come equipped with different names for columns that represent the same data. In this lesson, we'll talk about how to utilize the left_on and right_on parameters to specify how to match values in differently named columns across two DataFrames.

The left_on and right_on Parameters
08:54
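
A sketch with hypothetical customer and order tables, where the shared key carries a different name in each file:

    import pandas as pd

    customers = pd.read_csv("customers.csv")   # has an "ID" column
    orders = pd.read_csv("orders.csv")         # has a "Customer ID" column

    # left_on names the key column in the left DataFrame, right_on the key in the right one
    customers.merge(orders, how="left", left_on="ID", right_on="Customer ID")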

Our merges so far have involved matches based on common column values. In this lesson, we'll explore how to merge DataFrames based on common index labels.

Merging by Indexes with the left_index and right_index Parameters
11:02

Call the .join() method, a simple way to combine the columns of two DataFrames when they share the same index. This is a shortcut to a more explicit .merge() call.

The .join() Method
03:15

Call the pd.merge() method on the pandas library to merge two DataFrames. This is an alternate syntax to calling the .merge() method directly on a DataFrame.

The pd.merge() Method
03:06
Working with Dates and Times
18 Lectures 02:27:44

The Working with Dates and Times module offers a review of Python's built-in objects for working with dates and times as well as a comprehensive introduction to similar tools in the pandas library. In this lesson, we'll create our Jupyter Notebook for this module and import Python's datetime module.

Intro to the Working with Dates and Times Module
03:44

Python includes built-in date and datetime objects for working with dates and times. This lesson offers a review of how we can create these objects as well as some of the attributes (.year, .month, .day etc) that are available on them.

Review of Python's datetime Module
09:31

The pandas library includes its own Timestamp object to represent moments in time. In this lesson, we'll use the pd.Timestamp() constructor method with a variety of inputs (strings, date objects, datetime objects) to create some Timestamp objects.

The pandas Timestamp Object
07:15

DatetimeIndex is a pandas object for storing multiple Timestamp objects. In this lesson, we'll create a few DatetimeIndex objects from Python lists.

The pandas DateTimeIndex Object
05:23

The pd.to_datetime() method is a convenience method to convert various inputs to pandas-focused objects. In this lesson, we'll pass a variety of inputs (date objects, datetime objects, strings, lists) to the constructor method to see what it returns.

The pd.to_datetime() Method
11:11

Over the course of the next three lessons, we'll call the pd.date_range() method to generate a DatetimeIndex of Timestamp objects. This constructor method includes 3 critical parameters (start, end, and periods); we need to provide 2 of these 3 for it to function. In this lesson, we'll see how the pd.date_range() method operates with arguments for the start and end parameters.

Preview 10:22
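
A sketch of this first combination (the freq codes reflect the pandas versions current when the course was recorded):

    import pandas as pd

    # start and end bound the range; freq controls the spacing (daily by default)
    pd.date_range(start="2017-01-01", end="2017-01-10")
    pd.date_range(start="2017-01-01", end="2017-12-31", freq="M")   # month-end dates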

In this lesson, we'll see how the pd.date_range() method operates with arguments for the start and periods parameters. This approach creates a set number of dates beginning from a specific point.

Create Range of Dates with the pd.date_range() Method, Part 2
09:04

In this lesson, we'll see how the pd.date_range() method operates with arguments for the end and periods parameters. This approach creates a set number of dates, proceeding backwards from a specified date point. We'll also continue our exploration of the freq parameter to vary the durations between each Timestamp.

Create Range of Dates with the pd.date_range() Method, Part 3
07:50

The .dt accessor on a Series of Timestamp objects allows us to access specific datetime properties, much like the .str accessor allows us to call specific methods on a Series of strings. In this lesson, we'll explore popular attributes like .day, .weekday_name, and .month.

The .dt Accessor
07:29

This module and future modules in the course rely on the pandas-datareader library to fetch financial datasets from Google Finance. In this lesson, we'll use the Terminal (Mac OS) to install the pandas-datareader library.

On Mac, the commands are:

  1. source activate root
  2. conda install pandas-datareader


If you're following along on a Windows machine, open the Command Prompt and execute the following commands:

  1. activate root
  2. conda install pandas-datareader
Install pandas-datareader Library
02:30

In this lesson, we'll import our pandas_datareader library and fetch a financial dataset from Google Finance. This is a real-life example of a dataset with a DatetimeIndex.

Import Financial Data Set with pandas_datareader Library
10:43

The process for extracting rows from a DataFrame with a DatetimeIndex is no different than in previous modules. In this lesson, we'll review the familiar .loc[], .iloc[], and .ix[] methods. As a reminder, these methods conclude with square brackets, not parentheses.

Selecting Rows from a DataFrame with a DateTimeIndex
08:01

In this lesson, we'll access some of the datetime attributes available on a pandas Timestamp object. We'll also use the .insert() method to add new columns at the beginning of our DataFrame.

Timestamp Object Attributes
07:27

The .truncate() method is a convenience method for slicing operations on objects with a DatetimeIndex. It includes two parameters -- before and after -- to specify the start and end of our date range. In this lesson, we'll practice calling the method on our financial dataset.

The .truncate() Method
02:59

In this lesson, we'll use the pd.DateOffset object to add hours, days, weeks, months, and years to a DatetimeIndex. This is a powerful but slightly hidden feature of the pandas library.

pd.DateOffset Objects
12:00

Some of the DateOffset objects are hidden in a semi-secret location in the pandas library - pandas.tseries.offsets. In this lesson, we'll import these objects into our namespace to perform more specific time calculations on our DataFrames.

More Fun with pd.DateOffset Objects
14:06

Over the next two lessons, we'll explore the pandas Timedelta object which represents durations. A Timedelta represents a distance of time while a Timestamp represents a specific moment in time.

The pandas Timedelta Object
08:39

In this lesson, we'll create a Series of Timedelta objects by calculating the duration differences between two columns of Timestamps. Time difference operations can be easily performed with the subtraction ( - ) sign.

Timedeltas in a Dataset
09:30
5 More Sections
About the Instructor
Boris Paskhaver
4.6 Average rating
897 Reviews
24,688 Students
2 Courses
Software Engineer

Hi there, I'm an NYC-based web developer with experience building apps in React / Redux and Ruby on Rails.

Raised in New Jersey, I graduated from the Stern School of Business at New York University in 2013 with a double major in Business Economics and Marketing. Since graduation, my work has taken me in a wide variety of directions -- I spent a year in marketing, then financial services, and now the tech industry. I've worked everywhere from a 50-person digital agency to an international tech powerhouse with thousands of employees.

I've always had a love of learning but have struggled with the traditional resources available for education. I've used that as inspiration for my work here. My goal is to create comprehensive courses that break down the complex details into small, digestible pieces. I hope to see you in a course soon.