Data Science: Deep Learning in Python

The MOST in-depth look at neural network theory, and how to code one with pure Python and TensorFlow
4.6 (4,760 ratings)
33,902 students enrolled
Last updated 10/2018
English
English [Auto-generated], Portuguese [Auto-generated], Spanish [Auto-generated]
Current price: $11.99 Original price: $179.99 Discount: 93% off
30-Day Money-Back Guarantee
This course includes
  • 9.5 hours on-demand video
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What you'll learn
  • Learn how Deep Learning REALLY works (not just some diagrams and magical black box code)
  • Learn how a neural network is built from basic building blocks (the neuron)
  • Code a neural network from scratch in Python and numpy
  • Code a neural network using Google's TensorFlow
  • Describe different types of neural networks and the different types of problems they are used for
  • Derive the backpropagation rule from first principles
  • Create a neural network with an output that has K > 2 classes using softmax (see the short sketch after this list)
  • Describe the various terms related to neural networks, such as "activation", "backpropagation" and "feedforward"
  • Install TensorFlow
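
To make the K > 2 softmax bullet above concrete, here is a minimal NumPy sketch of a softmax output layer. The shapes and variable names are illustrative only, not taken from the course code:

    import numpy as np

    def softmax(a):
        # Subtract the row-wise max before exponentiating for numerical stability.
        expa = np.exp(a - a.max(axis=1, keepdims=True))
        return expa / expa.sum(axis=1, keepdims=True)

    # Illustrative shapes: N samples, D input features, K = 3 classes.
    N, D, K = 5, 4, 3
    X = np.random.randn(N, D)   # input data
    W = np.random.randn(D, K)   # output-layer weights
    b = np.zeros(K)             # output-layer bias

    P = softmax(X.dot(W) + b)   # N x K matrix of class probabilities
    print(P.sum(axis=1))        # each row sums to 1
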
Requirements
  • Know how to take partial derivatives and compute log-likelihoods (e.g., finding the maximum likelihood estimate for a die)
  • Install Numpy and Python (approx. latest version of Numpy as of Jan 2016)
  • Don't worry about installing TensorFlow, we will do that in the lectures.
  • Being familiar with the content of my logistic regression course (cross-entropy cost, gradient descent, neurons, XOR, donut) will give you the proper context for this course
Description

This course will get you started in building your FIRST artificial neural network using deep learning techniques. Following my previous course on logistic regression, we take this basic building block, and build full-on non-linear neural networks right out of the gate using Python and Numpy. All the materials for this course are FREE.

We extend the previous binary classification model to multiple classes using the softmax function, and we derive the very important training method called "backpropagation" from first principles. I show you how to code backpropagation in Numpy, first "the slow way", and then "the fast way" using Numpy features.
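
As a rough illustration of the "slow way" versus "fast way" idea (a sketch with made-up shapes, not the course's exact code), here is the gradient of the log-likelihood with respect to the hidden-to-output weights computed both ways:

    import numpy as np

    # Made-up shapes: Z is N x M hidden activations, T is N x K one-hot targets,
    # Y is N x K softmax outputs.
    N, M, K = 100, 5, 3
    Z = np.random.randn(N, M)
    T = np.zeros((N, K))
    T[np.arange(N), np.random.randint(K, size=N)] = 1
    Y = np.random.rand(N, K)
    Y /= Y.sum(axis=1, keepdims=True)

    # The slow way: explicit loops over samples and weight indices.
    dW_slow = np.zeros((M, K))
    for n in range(N):
        for m in range(M):
            for k in range(K):
                dW_slow[m, k] += (T[n, k] - Y[n, k]) * Z[n, m]

    # The fast way: the same gradient as a single matrix product.
    dW_fast = Z.T.dot(T - Y)

    print(np.allclose(dW_slow, dW_fast))  # True
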

Next, we implement a neural network using Google's new TensorFlow library.
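
For orientation only: a comparable feedforward network can be sketched in a few lines with today's tf.keras API. The course's own TensorFlow code predates this style, so treat the following as an illustration rather than the course's implementation:

    import tensorflow as tf

    # A small feedforward net with a softmax output, sketched with Keras layers.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(8, activation='relu', input_shape=(4,)),
        tf.keras.layers.Dense(3, activation='softmax'),
    ])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    # model.fit(X_train, y_train, epochs=10)  # given some (X_train, y_train) data
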

You should take this course if you are interested in starting your journey toward becoming a master at deep learning, or if you are interested in machine learning and data science in general. We go beyond basic models like logistic regression and linear regression and I show you something that automatically learns features.

This course provides you with many practical examples so that you can really see how deep learning can be used on anything. Throughout the course, we'll do a course project, which will show you how to predict user actions on a website given user data like whether or not that user is on a mobile device, the number of products they viewed, how long they stayed on your site, whether or not they are a returning visitor, and what time of day they visited.
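
To give a feel for that kind of data, here is a small pre-processing sketch. The column names are hypothetical; the course's own project files may be organized differently:

    import pandas as pd

    # Hypothetical user-action data like that described above.
    df = pd.DataFrame({
        'is_mobile':            [1, 0, 1, 1],
        'n_products_viewed':    [0, 3, 1, 2],
        'visit_duration':       [0.7, 12.4, 3.1, 5.0],
        'is_returning_visitor': [0, 1, 1, 0],
        'time_of_day':          [0, 1, 2, 3],  # categorical: 4 buckets of the day
    })

    # One-hot encode the categorical time-of-day column; keep the rest numeric.
    X = pd.get_dummies(df, columns=['time_of_day']).to_numpy(dtype=float)
    print(X.shape)  # (4, 8): 4 numeric features + 4 one-hot time-of-day columns
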

Another project at the end of the course shows you how you can use deep learning for facial expression recognition. Imagine being able to predict someone's emotions just based on a picture!

After getting your feet wet with the fundamentals, I provide a brief overview of some of the newest developments in neural networks - slightly modified architectures and what they are used for.

NOTE:

If you already know about softmax and backpropagation, and you want to skip over the theory and speed things up using more advanced techniques along with GPU-optimization, check out my follow-up course on this topic, Data Science: Practical Deep Learning Concepts in Theano and TensorFlow.

I have other courses that cover more advanced topics, such as Convolutional Neural Networks, Restricted Boltzmann Machines, Autoencoders, and more! But you want to be very comfortable with the material in this course before moving on to more advanced subjects.

This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.



HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:

  • calculus

  • linear algebra

  • probability

  • Python coding: if/else, loops, lists, dicts, sets

  • Numpy coding: matrix and vector operations, loading a CSV file
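
    A quick illustration of the Numpy level assumed by the last prerequisite (the file name is hypothetical):

    import numpy as np

    A = np.random.randn(3, 4)
    v = np.random.randn(4)
    print(A.dot(v))    # matrix-vector product
    print(A * 2 + 1)   # elementwise operations

    # data = np.genfromtxt('my_data.csv', delimiter=',', skip_header=1)
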


TIPS (for getting through the course):

  • Watch it at 2x.

  • Take handwritten notes. This will drastically increase your ability to retain the information.

  • Write down the equations. If you don't, I guarantee it will just look like gibberish.

  • Ask lots of questions on the discussion board. The more the better!

  • Realize that most exercises will take you days or weeks to complete.

  • Write code yourself, don't just sit there and look at my code.


WHAT ORDER SHOULD I TAKE YOUR COURSES IN?:

  • Check out the lecture "What order should I take your courses in?" (available in the Appendix of any of my courses, including the free Numpy course)


Who this course is for:
  • Students interested in machine learning - you'll get all the tidbits you need to do well in a neural networks course
  • Professionals who want to use neural networks in their machine learning and data science pipeline. Be able to apply more powerful models, and know their drawbacks.
  • People who already know how to take partial derivatives and log-likelihoods. Since we cover this in more detail in my logistic regression class, it is not covered quite as thoroughly here.
  • People who already know how to code in Python and Numpy. You will need some familiarity because we go through it quite fast. Don't worry, it's not that hard.
Course content
82 lectures, 09:35:31 total
+ Welcome
4 lectures 22:05

Overview of the course and prerequisites.

Preview 03:45
Where to get the code
04:24
How to Succeed in this Course
03:13
+ Review
6 lectures 30:38
Review Section Introduction
01:58
What does machine learning do?
05:28
Neuron Predictions
05:00
Neuron Training
08:47
Deep Learning Readiness Test
05:33
Review Section Summary
03:52
+ Preliminaries: From Neurons to Neural Networks
2 lectures 13:12

An almost purely qualitative description of neural networks.

Neural Networks with No Math
04:20
Introduction to the E-Commerce Course Project
08:52
+ Classifying more than 2 things at a time
14 lectures 01:19:58
Prediction: Section Introduction and Outline
05:39
From Logistic Regression to Neural Networks
05:12
Interpreting the Weights of a Neural Network
08:05

What's the function we use to classify more than 2 things?

Softmax
02:54
Sigmoid vs. Softmax
01:30
Feedforward in Slow-Mo (part 1)
19:42
Feedforward in Slow-Mo (part 2)
10:55
Where to get the code for this course
01:30

How do we code the softmax in Python?

Softmax in Code
03:39

Let's extend softmax and code the entire calculation from input to output.

Building an entire feedforward neural network in Python
06:23
E-Commerce Course Project: Pre-Processing the Data
05:24
E-Commerce Course Project: Making Predictions
03:55
Prediction Quizzes
03:25
Prediction: Section Summary
01:45
+ Training a neural network
13 lectures 01:31:24
Training: Section Introduction and Outline
02:49
What do all these symbols and letters mean?
09:45
What does it mean to "train" a neural network?
06:15
How to Brace Yourself to Learn Backpropagation
07:38

Derivation of backpropagation from first principles. Defining the objective function, taking the log, and differentiating the log with respect to weights in each layer.

Backpropagation Intro
11:53

A further look into backpropagation.

Backpropagation - what does the weight update depend on?
04:47

Backpropagation for deeper networks, exposing the structure, and how to code it more efficiently.

Backpropagation - recursiveness
04:37

How to code backpropagation in Python using numpy operations vs. slow for loops.

Backpropagation in code
17:07
The WRONG Way to Learn Backpropagation
03:52
E-Commerce Course Project: Training Logistic Regression with Softmax
08:11
E-Commerce Course Project: Training a Neural Network
06:19
Training Quiz
05:30
Training: Section Summary
02:41
+ Practical Machine Learning
9 lectures 42:27
Practical Issues: Section Introduction and Outline
01:43

What are the donut and XOR problems again?

Donut and XOR Review
01:06

We look again at the XOR and donut problems from logistic regression. The features are now learned automatically.

Donut and XOR Revisited
04:21
Neural Networks for Regression
11:38

sigmoid, tanh, relu along with their derivatives

Common nonlinearities and their derivatives
01:26
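
(For reference, a minimal sketch of these three nonlinearities and their derivatives, written here for convenience rather than copied from the lecture:)

    import numpy as np

    def sigmoid(a):
        return 1 / (1 + np.exp(-a))

    def sigmoid_prime(a):
        s = sigmoid(a)
        return s * (1 - s)          # derivative expressed via the output

    def tanh_prime(a):
        return 1 - np.tanh(a) ** 2  # np.tanh is the function itself

    def relu(a):
        return np.maximum(0, a)

    def relu_prime(a):
        return (a > 0).astype(float)
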
Practical Considerations for Choosing Activation Functions
07:45

Tips on choosing learning rate, regularization penalty, number of hidden units, and number of hidden layers.

Hyperparameters and Cross-Validation
04:10
Manually Choosing Learning Rate and Regularization Penalty
04:08
Practical Issues: Section Summary
06:10
+ TensorFlow, exercises, practice, and what to learn next
6 lectures 41:35

A look at Google's new TensorFlow library.

TensorFlow plug-and-play example
07:31
Visualizing what a neural network has learned using TensorFlow Playground
11:35
What did you learn? What didn't you learn? Where can you learn more?
Where to go from here
03:41
You know more than you think you know
04:52
How to get good at deep learning + exercises
05:07
Deep neural networks in just 3 lines of code with Sci-Kit Learn
08:49
+ Project: Facial Expression Recognition
8 lectures 01:02:12
Facial Expression Recognition Project Introduction
04:51
Facial Expression Recognition Problem Description
12:21
The class imbalance problem
06:01
Utilities walkthrough
05:45
Facial Expression Recognition in Code (Binary / Sigmoid)
12:13
Facial Expression Recognition in Code (Logistic Regression Softmax)
08:57
Facial Expression Recognition in Code (ANN Softmax)
10:44
Facial Expression Recognition Project Summary
01:20
+ Backpropagation Supplementary Lectures
5 lectures 30:30
Backpropagation Supplementary Lectures Introduction
01:03
Why Learn the Ins and Outs of Backpropagation?
08:53
Gradient Descent Tutorial
04:30
Help with Softmax Derivative
04:09
Backpropagation with Softmax Troubleshooting
11:55
+ Appendix
15 lectures 02:41:30
What is the Appendix?
02:48
What's the difference between "neural networks" and "deep learning"?
07:58
Windows-Focused Environment Setup 2018
20:20
How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow
17:32
How to Code by Yourself (part 1)
15:54
How to Code by Yourself (part 2)
09:23
How to Succeed in this Course (Long Version)
10:24
Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced?
22:04
Proof that using Jupyter Notebook is the same as not using it
12:29
How to Uncompress a .tar.gz file
03:18
BONUS: Where to get Udemy coupons and FREE deep learning material
02:20
Python 2 vs Python 3
04:38
Where does this course fit into your deep learning studies? (Old Version)
04:57
What order should I take your courses in? (part 1)
11:18
What order should I take your courses in? (part 2)
16:07