Data Science: Practical Deep Learning in Theano + TensorFlow

Take deep learning to the next level with SGD, Nesterov momentum, RMSprop, Theano, TensorFlow, and using the GPU on AWS.
4.6 (98 ratings)
2,380 students enrolled
$19 (regular price $120, 84% off)
Take This Course
  • Lectures 24
  • Length 3 hours
  • Skill Level All Levels
  • Languages English
  • Includes Lifetime access
    30 day money back guarantee!
    Available on iOS and Android
    Certificate of Completion

About This Course

Published 2/2016 English

Course Description

This course continues where my first course, Deep Learning in Python, left off. You already know how to build an artificial neural network in Python, and you have a plug-and-play script that you can use for TensorFlow. Neural networks are one of the staples of machine learning, and they are always a top contender in Kaggle contests. If you want to improve your skills with neural networks and deep learning, this is the course for you.

You already learned about backpropagation (and because of that, this course contains basically NO MATH), but there were a lot of unanswered questions. How can you modify it to improve training speed? In this course you will learn about batch and stochastic gradient descent, two commonly used techniques that allow you to train on just a small sample of the data at each iteration, greatly speeding up training time.
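
To give you a rough idea of what this looks like before you take the course, here is a minimal numpy-style sketch of a mini-batch training loop; the model, gradient function, and hyperparameter values are hypothetical placeholders, not the course's actual code:

    import numpy as np

    def sgd_train(X, Y, W, grad_fn, lr=0.01, epochs=10, batch_size=128):
        # Minimal mini-batch SGD: update W using only a small sample of
        # (X, Y) at each step instead of the full dataset.
        N = len(X)
        for epoch in range(epochs):
            idx = np.random.permutation(N)       # reshuffle every epoch
            X, Y = X[idx], Y[idx]
            for start in range(0, N, batch_size):
                Xb = X[start:start + batch_size]
                Yb = Y[start:start + batch_size]
                W = W - lr * grad_fn(W, Xb, Yb)  # gradient from this batch only
        return W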

You will also learn about momentum, which can help carry you through local minima and keep you from having to be too conservative with your learning rate. You will then learn about adaptive learning rate techniques, like AdaGrad and RMSprop, which can further speed up your training.
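
As a loose sketch (not the course's implementation), the momentum and RMSprop updates for a single weight array might look like this in numpy; the hyperparameter defaults are illustrative assumptions:

    import numpy as np

    def momentum_step(w, grad, velocity, lr=0.01, mu=0.9):
        # Classical momentum: accumulate a velocity vector, then step with it,
        # which helps the weights coast through shallow local minima.
        velocity = mu * velocity - lr * grad
        return w + velocity, velocity

    def rmsprop_step(w, grad, cache, lr=0.001, decay=0.999, eps=1e-8):
        # RMSprop: keep a running average of squared gradients and divide by
        # its square root, giving each weight its own effective learning rate.
        cache = decay * cache + (1 - decay) * grad * grad
        return w - lr * grad / (np.sqrt(cache) + eps), cache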

Because you already know about the fundamentals of neural networks, we are going to talk about more modern techniques, like dropout regularization, which we will implement in both TensorFlow and Theano. The course is constantly being updated and more advanced regularization techniques are coming in the near future.
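
For reference, the core idea of dropout can be sketched in a few lines of numpy. This is the common "inverted dropout" formulation, shown only as an illustration and not as the course's exact code:

    import numpy as np

    def dropout_forward(Z, p_keep=0.8, train=True):
        # At train time, randomly zero out units and rescale by 1/p_keep so the
        # expected activation matches test time; at test time, pass Z through.
        if not train:
            return Z
        mask = (np.random.rand(*Z.shape) < p_keep) / p_keep
        return Z * mask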

In my last course, I just wanted to give you a little sneak peek at TensorFlow. In this course we are going to start from the basics so you understand exactly what's going on - what are TensorFlow variables and expressions, and how can you use these building blocks to create a neural network? We are also going to look at a library that's been around much longer and is very popular for deep learning - Theano. With this library we will also examine the basic building blocks - variables, expressions, and functions - so that you can build neural networks in Theano with confidence.
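
As a small taste of those building blocks, here is a minimal sketch of defining a symbolic expression and evaluating it in each library; it assumes the graph-style TensorFlow API that was current when this course was published, and it is an illustration rather than the course's code:

    import theano
    import theano.tensor as T
    import tensorflow as tf

    # Theano: declare a symbolic variable, build an expression, compile a function
    x = T.vector('x')
    y = (x ** 2).sum()                    # symbolic expression, nothing computed yet
    f = theano.function(inputs=[x], outputs=y)
    print(f([1.0, 2.0, 3.0]))             # 14.0

    # TensorFlow (graph-style API): placeholder, expression, run inside a session
    a = tf.placeholder(tf.float32, shape=(None,))
    b = tf.reduce_sum(a ** 2)
    with tf.Session() as sess:
        print(sess.run(b, feed_dict={a: [1.0, 2.0, 3.0]}))  # 14.0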

Because one of the main advantages of TensorFlow and Theano is the ability to use the GPU to speed up training, I will show you how to set up a GPU instance on AWS and compare the speed of the CPU vs. the GPU for training a deep neural network.
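
The full AWS setup is covered in the lectures; as one hedged example of the general idea, Theano can be pointed at the GPU by setting THEANO_FLAGS before it is imported. The device name and float settings below are assumptions that depend on your Theano version and driver setup:

    import os

    # Must be set before Theano is imported; 'device=gpu' vs 'device=cuda' and
    # float32 vs float64 depend on your Theano version and CUDA installation.
    os.environ['THEANO_FLAGS'] = 'device=gpu,floatX=float32'

    import theano
    print(theano.config.device)  # reports the device Theano actually selected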

With all this extra speed, we are going to look at a real dataset - the famous MNIST dataset (images of handwritten digits) - and compare our results against various known benchmarks.

This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.


NOTES:

All the code for this course can be downloaded from my github: /lazyprogrammer/machine_learning_examples

In the directory: ann_class2

Make sure you always "git pull" so you have the latest version!


HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:

  • calculus
  • linear algebra
  • probability
  • Python coding: if/else, loops, lists, dicts, sets
  • Numpy coding: matrix and vector operations, loading a CSV file


TIPS (for getting through the course):

  • Watch it at 2x.
  • Take handwritten notes. This will drastically increase your ability to retain the information.
  • Ask lots of questions on the discussion board. The more the better!
  • Realize that most exercises will take you days or weeks to complete.


USEFUL COURSE ORDERING:

  • (The Numpy Stack in Python)
  • Linear Regression in Python
  • Logistic Regression in Python
  • (Supervised Machine Learning in Python)
  • Deep Learning in Python
  • Practical Deep Learning in Theano and TensorFlow
  • Convolutional Neural Networks in Python
  • (Easy NLP)
  • (Cluster Analysis and Unsupervised Machine Learning)
  • Unsupervised Deep Learning
  • (Hidden Markov Models)
  • Recurrent Neural Networks in Python
  • Natural Language Processing with Deep Learning in Python


What are the requirements?

  • Be comfortable with Python, Numpy, and Matplotlib. Install Theano and TensorFlow.
  • If you do not yet know about gradient descent, backprop, and softmax, take my earlier course, Deep Learning in Python, and then return to this course.

What am I going to get from this course?

  • Apply momentum to backpropagation to train neural networks
  • Apply adaptive learning rate procedures like AdaGrad and RMSprop to backpropagation to train neural networks
  • Understand the basic building blocks of Theano
  • Build a neural network in Theano
  • Understand the basic building blocks of TensorFlow
  • Build a neural network in TensorFlow
  • Build a neural network that performs well on the MNIST dataset
  • Understand the difference between full gradient descent, batch gradient descent, and stochastic gradient descent
  • Understand and implement dropout regularization in Theano and TensorFlow

What is the target audience?

  • Students and professionals who want to deepen their machine learning knowledge
  • Data scientists who want to learn more about deep learning
  • Data scientists who already know about backpropagation and gradient descent and want to improve it with stochastic batch training, momentum, and adaptive learning rate procedures like RMSprop
  • Those who do not yet know about backpropagation or softmax should take my earlier course, Deep Learning in Python, first

What do you get with this course?

Not for you? No problem.
30 day money back guarantee.

Forever yours.
Lifetime access.

Learn on the go.
Desktop, iOS and Android.

Get rewarded.
Certificate of completion.

Curriculum

Section 1: Outline, the MNIST dataset, and Linear (Logistic Regression) Benchmark

  • In the previous course you learned about softmax and backpropagation. What will you learn in this course? (02:42)
  • Where does this course fit into your deep learning studies? (03:48)
  • Where to get the MNIST dataset, and where to put it to run the code from this course correctly. I run through util.py, which contains functions we'll be using throughout the course, and I run a logistic regression benchmark to show the accuracy we should aim to beat with deep learning. (04:31)

Section 2: Gradient Descent: Full vs Batch vs Stochastic

  • What are full, batch, and stochastic gradient descent? (02:45)
  • Full vs Batch vs Stochastic Gradient Descent in code (05:38)

Section 3: Momentum and adaptive learning rates

  • How can you use momentum to speed up neural network training and get out of local minima? (01:56)
  • Code for training a neural network using momentum (06:41)
  • Learn about periodic decay of learning rate, exponential decay, 1/t decay, AdaGrad, and RMSprop. (03:13)
  • Constant learning rate vs. RMSProp in code (04:05)
  • Hyperparameter Optimization: Cross-validation, Grid Search, and Random Search (Preview) (03:19)

Section 4: Theano

  • Theano Basics: Variables, Functions, Expressions, Optimization (07:47)
  • Building a neural network in Theano (09:17)

Section 5: TensorFlow

  • TensorFlow Basics: Variables, Functions, Expressions, Optimization (07:27)
  • Building a neural network in TensorFlow (09:43)

Section 6: Modern Regularization Techniques

  • Dropout Regularization (11:38)

Section 7: GPU Speedup and Homework

  • I show you how to start a GPU instance on Amazon Web Services (AWS) and prove to you that training a neural network using Theano on the GPU can be much faster than on the CPU. (07:06)
  • Here are some things you can do to make yourself more confident with Theano and TensorFlow coding. They are exercises that extend the material taught in this class. I also mention a handful of topics you can look forward to hearing about in future courses. (02:13)

Section 8: Project: Facial Expression Recognition

  • Facial Expression Recognition Problem Description (12:21)
  • The class imbalance problem (06:01)
  • Utilities walkthrough (05:45)
  • Class-Based ANN in Theano (19:09)
  • Class-Based ANN in TensorFlow (15:28)

Section 9: Appendix

  • Manually Choosing Learning Rate and Regularization Penalty (04:08)
  • How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:22)

Instructor Biography

Lazy Programmer Inc., Data scientist and big data engineer

I am a data scientist, big data engineer, and full stack software engineer.

For my masters thesis I worked on brain-computer interfaces using machine learning. These help non-verbal and non-mobile persons communicate with their families and caregivers.

I have worked in online advertising and digital media as both a data scientist and big data engineer, and built various high-throughput web services around said data. I've created new big data pipelines using Hadoop/Pig/MapReduce. I've created machine learning models to predict click-through rate, news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering and validated the results using A/B testing.

I have taught undergraduate and graduate courses in data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics to students attending universities such as Columbia University, NYU, Humber College, and The New School.

Multiple businesses have benefited from my web programming expertise. I do all the backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Some of the technologies I've used are: Python, Ruby/Rails, PHP, Bootstrap, jQuery (JavaScript), Backbone, and Angular. For storage/databases I've used MySQL, Postgres, Redis, MongoDB, and more.

Ready to start learning?
Take This Course