

Data Science: Deep Learning in Python

A guide to writing your own neural network in Python and Numpy, and to doing the same in Google's TensorFlow.
4.6 (1,015 ratings)
8,571 students enrolled
Last updated 2/2017
30-Day Money-Back Guarantee
  • 5 hours on-demand video
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What Will I Learn?
Code a neural network from scratch in Python and numpy
Code a neural network using Google's TensorFlow
Describe the various terms related to neural networks, such as "activation", "backpropagation", and "feedforward"
Describe different types of neural networks and the different types of problems they are used for
Derive the backpropagation rule from first principles
Create a neural network with an output that has K > 2 classes using softmax
Install TensorFlow
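As a taste of the coding style in the course, here is a minimal sketch of softmax for K > 2 classes in Numpy. This is my own illustrative version, not the course's exact code:

```python
import numpy as np

def softmax(a):
    # subtract the max for numerical stability; the result is unchanged
    # because softmax is invariant to adding a constant to every input
    expa = np.exp(a - a.max(axis=-1, keepdims=True))
    return expa / expa.sum(axis=-1, keepdims=True)

# one sample with K = 3 class activations
a = np.array([2.0, 1.0, 0.1])
p = softmax(a)
print(p)        # a probability for each of the 3 classes
print(p.sum())  # sums to 1 (up to floating point)
```

The max-subtraction trick is worth internalizing early: without it, large activations overflow `np.exp` even though the final probabilities are perfectly well defined.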
Requirements
  • How to take partial derivatives and log-likelihoods (e.g., finding the maximum likelihood estimate for a die)
  • Install Python and Numpy (roughly the latest version of Numpy as of Jan 2016)
  • Don't worry about installing TensorFlow; we will do that in the lectures.
  • Familiarity with the content of my logistic regression course (cross-entropy cost, gradient descent, neurons, XOR, donut) will give you the proper context for this course.

This course will get you started in building your FIRST artificial neural network using deep learning techniques. Following my previous course on logistic regression, we take this basic building block, and build full-on non-linear neural networks right out of the gate using Python and Numpy. All the materials for this course are FREE.

We extend the previous binary classification model to multiple classes using the softmax function, and we derive the very important training method called "backpropagation" using first principles. I show you how to code backpropagation in Numpy, first "the slow way", and then "the fast way" using Numpy features.
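The contrast between "the slow way" and "the fast way" can be sketched like this. The shapes and variable names are my own assumptions for the sketch (Z for hidden activations, T for one-hot targets, Y for softmax outputs), not necessarily the course's notation:

```python
import numpy as np

# N samples, M hidden units, K output classes
N, M, K = 100, 5, 3
rng = np.random.default_rng(0)
Z = rng.standard_normal((N, M))              # hidden activations
T = np.eye(K)[rng.integers(0, K, N)]         # one-hot targets
Y = np.abs(rng.standard_normal((N, K)))      # fake softmax outputs
Y /= Y.sum(axis=1, keepdims=True)

# the slow way: explicit for-loops over every index of the
# gradient of the cross-entropy cost w.r.t. the hidden-to-output weights
slow = np.zeros((M, K))
for n in range(N):
    for m in range(M):
        for k in range(K):
            slow[m, k] += (T[n, k] - Y[n, k]) * Z[n, m]

# the fast way: the same triple sum as one matrix multiplication
fast = Z.T.dot(T - Y)

print(np.allclose(slow, fast))  # True
```

Writing the loops out first, then collapsing them into a matrix product, is exactly the kind of "slow way then fast way" exercise the course walks through.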

Next, we implement a neural network using Google's new TensorFlow library.

You should take this course if you are interested in starting your journey toward becoming a master at deep learning, or if you are interested in machine learning and data science in general. We go beyond basic models like logistic regression and linear regression and I show you something that automatically learns features.

This course provides you with many practical examples so that you can really see how deep learning can be used on anything. Throughout the course, we'll do a course project, which will show you how to predict user actions on a website given user data like whether or not that user is on a mobile device, the number of products they viewed, how long they stayed on your site, whether or not they are a returning visitor, and what time of day they visited.
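As a rough illustration of the kind of preprocessing such a project needs, numerical features can be normalized and a categorical feature like time of day one-hot encoded. The column layout and values below are invented for this sketch, not the course's actual dataset:

```python
import numpy as np

# toy rows: [is_mobile, n_products_viewed, visit_duration, is_returning, time_of_day]
# time_of_day is categorical with 4 values (0-3)
X = np.array([
    [1, 3, 5.2, 0, 2],
    [0, 1, 0.7, 1, 0],
    [1, 0, 2.4, 1, 3],
], dtype=float)

# normalize the two numerical columns (products viewed, visit duration)
for col in (1, 2):
    X[:, col] = (X[:, col] - X[:, col].mean()) / X[:, col].std()

# one-hot encode time_of_day into 4 new indicator columns
N = X.shape[0]
tod = X[:, 4].astype(int)
onehot = np.zeros((N, 4))
onehot[np.arange(N), tod] = 1

# final design matrix: binary + normalized columns, then the one-hot block
X2 = np.concatenate([X[:, :4], onehot], axis=1)
print(X2.shape)  # (3, 8)
```

One-hot encoding matters here because time of day is not ordinal: feeding the raw category number into a neural network would impose a spurious ordering on it.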

Another project at the end of the course shows you how you can use deep learning for facial expression recognition. Imagine being able to predict someone's emotions just based on a picture!

After getting your feet wet with the fundamentals, I provide a brief overview of some of the newest developments in neural networks - slightly modified architectures and what they are used for.


If you already know about softmax and backpropagation, and you want to skip over the theory and speed things up using more advanced techniques along with GPU-optimization, check out my follow-up course on this topic, Data Science: Practical Deep Learning Concepts in Theano and TensorFlow.

I have other courses that cover more advanced topics, such as Convolutional Neural Networks, Restricted Boltzmann Machines, Autoencoders, and more! But you will want to be very comfortable with the material in this course before moving on to more advanced subjects.

This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.

All the code for this course can be downloaded from my github: /lazyprogrammer/machine_learning_examples

In the directory: ann_class

Make sure you always "git pull" so you have the latest version!


This course assumes you already know:

  • calculus
  • linear algebra
  • probability
  • Python coding: if/else, loops, lists, dicts, sets
  • Numpy coding: matrix and vector operations, loading a CSV file
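To gauge whether you meet the Numpy prerequisite, you should find code like the following routine (the CSV path is only a placeholder, not a file from the course):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
v = np.array([0.5, -1.0])

print(A.dot(v))  # matrix-vector product: [-1.5, -2.5]
print(A * A)     # elementwise square (NOT a matrix product)
print(A.T)       # transpose

# loading a CSV file (placeholder path):
# data = np.loadtxt("data.csv", delimiter=",", skiprows=1)
```

If the difference between `A.dot(A)` and `A * A` is not second nature yet, spend some time with my free Numpy course first.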

TIPS (for getting through the course):

  • Watch it at 2x.
  • Take handwritten notes. This will drastically increase your ability to retain the information.
  • Ask lots of questions on the discussion board. The more the better!
  • Realize that most exercises will take you days or weeks to complete.


Recommended ordering for my courses:

  • (The Numpy Stack in Python)
  • Linear Regression in Python
  • Logistic Regression in Python
  • (Supervised Machine Learning in Python)
  • (Bayesian Machine Learning in Python: A/B Testing)
  • Deep Learning in Python
  • Practical Deep Learning in Theano and TensorFlow
  • (Supervised Machine Learning in Python 2: Ensemble Methods)
  • Convolutional Neural Networks in Python
  • (Easy NLP)
  • (Cluster Analysis and Unsupervised Machine Learning)
  • Unsupervised Deep Learning
  • (Hidden Markov Models)
  • Recurrent Neural Networks in Python
  • Natural Language Processing with Deep Learning in Python
Who is the target audience?
  • Students interested in machine learning - you'll get all the tidbits you need to do well in a neural networks course
  • Professionals who want to use neural networks in their machine learning and data science pipeline. You will be able to apply more powerful models and understand their drawbacks.
  • People who already know how to take partial derivatives and log-likelihoods. Since we cover this in more detail in my logistic regression class, it is not covered quite as thoroughly here.
  • People who already know how to code in Python and Numpy. You will need some familiarity because we go through it quite fast. Don't worry, it's not that hard.
Curriculum For This Course
49 Lectures 05:12:03
What is a neural network?
5 Lectures 27:27

Overview of the course and prerequisites.

Preview 03:45

Deep Learning Readiness Test

An almost purely qualitative description of neural networks.

Preview 04:20

Introduction to the E-Commerce Course Project
Classifying more than 2 things at a time
12 Lectures 01:08:28
Prediction: Section Introduction and Outline

From Logistic Regression to Neural Networks

What's the function we use to classify more than 2 things?


Sigmoid vs. Softmax

Feedforward in Slow-Mo (part 1)

Feedforward in Slow-Mo (part 2)

Where to get the code for this course

How do we code the softmax in Python?

Softmax in Code

Let's extend softmax and code the entire calculation from input to output.

Building an entire feedforward neural network in Python

E-Commerce Course Project: Pre-Processing the Data

E-Commerce Course Project: Making Predictions

What do you get if you don't use a non-linearity such as sigmoid, tanh, rectifier, or softmax?

Absence of non-linearities
1 question


Calculate neural network output
1 question

Prediction: Section Summary
Training a neural network
11 Lectures 01:18:16
Training: Section Introduction and Outline

What do all these symbols and letters mean?

What does it mean to "train" a neural network?

Derivation of backpropagation from first principles. Defining the objective function, taking the log, and differentiating the log with respect to weights in each layer.

Backpropagation Intro

A further look into backpropagation.

Backpropagation - what does the weight update depend on?

Backpropagation for deeper networks, exposing the structure, and how to code it more efficiently.

Backpropagation - recursiveness

How to code backpropagation in Python using Numpy operations vs. slow for loops.

Backpropagation in code

The WRONG Way to Learn Backpropagation

E-Commerce Course Project: Training Logistic Regression with Softmax

E-Commerce Course Project: Training a Neural Network

Backpropagation for binary output
1 question

Training: Section Summary
Practical Machine Learning
7 Lectures 23:04
Practical Issues: Section Introduction and Outline

What are the donut and XOR problems again?

Donut and XOR Review

We look again at the XOR and donut problem from logistic regression. The features are now learned automatically.

Donut and XOR Revisited

Try the Donut and XOR yourself
1 question

Sigmoid, tanh, and ReLU, along with their derivatives

Common nonlinearities and their derivatives

Tips on choosing learning rate, regularization penalty, number of hidden units, and number of hidden layers.

Hyperparameters and Cross-Validation

Manually Choosing Learning Rate and Regularization Penalty

Practical Issues: Section Summary
TensorFlow, exercises, practice, and what to learn next
5 Lectures 32:46

A look at Google's new TensorFlow library.

TensorFlow plug-and-play example

Visualizing what a neural network has learned using TensorFlow Playground

What did you learn? What didn't you learn? Where can you learn more?
Where to go from here

You know more than you think you know

How to get good at deep learning + exercises
Project: Facial Expression Recognition
6 Lectures 56:01
Facial Expression Recognition Problem Description

The class imbalance problem

Utilities walkthrough

Facial Expression Recognition in Code (Binary / Sigmoid)

Facial Expression Recognition in Code (Logistic Regression Softmax)

Facial Expression Recognition in Code (ANN Softmax)
3 Lectures 26:01
Gradient Descent Tutorial

Help with Softmax Derivative

How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow
About the Instructor
4.6 Average rating
5,154 Reviews
30,331 Students
17 Courses
Data scientist and big data engineer

I am a data scientist, big data engineer, and full stack software engineer.

For my master's thesis I worked on brain-computer interfaces using machine learning. These help non-verbal and non-mobile persons communicate with their family and caregivers.

I have worked in online advertising and digital media as both a data scientist and big data engineer, and built various high-throughput web services around said data. I've created new big data pipelines using Hadoop/Pig/MapReduce. I've built machine learning models to predict click-through rate, and news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering, and validated the results using A/B testing.

I have taught undergraduate and graduate students in data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics at universities such as Columbia University, NYU, Humber College, and The New School.

Multiple businesses have benefitted from my web programming expertise. I do all the backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Some of the technologies I've used are: Python, Ruby/Rails, PHP, Bootstrap, jQuery (Javascript), Backbone, and Angular. For storage/databases I've used MySQL, Postgres, Redis, MongoDB, and more.
