Unsupervised Deep Learning in Python

Theano / Tensorflow: Autoencoders, Restricted Boltzmann Machines, Deep Neural Networks, t-SNE and PCA
4.6 (1,022 ratings)
12,618 students enrolled
Last updated 10/2018
English
English [Auto-generated]
Current price: $11.99 (original price $119.99, 90% off)
30-Day Money-Back Guarantee
This course includes
  • 10.5 hours on-demand video
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What you'll learn
  • Understand the theory behind principal components analysis (PCA)
  • Know why PCA is useful for dimensionality reduction, visualization, de-correlation, and denoising

  • Derive the PCA algorithm by hand

  • Write the code for PCA
  • Understand the theory behind t-SNE
  • Use t-SNE in code
  • Understand the limitations of PCA and t-SNE
  • Understand the theory behind autoencoders
  • Write an autoencoder in Theano and Tensorflow
  • Understand how stacked autoencoders are used in deep learning
  • Write a stacked denoising autoencoder in Theano and Tensorflow
  • Understand the theory behind restricted Boltzmann machines (RBMs)
  • Understand why RBMs are hard to train
  • Understand the contrastive divergence algorithm to train RBMs
  • Write your own RBM and deep belief network (DBN) in Theano and Tensorflow
  • Visualize and interpret the features learned by autoencoders and RBMs
Requirements
  • Knowledge of calculus and linear algebra
  • Python coding skills
  • Some experience with Numpy, Theano, and Tensorflow
  • Know how gradient descent is used to train machine learning models
  • Install Python, Numpy, and Theano
  • Some probability and statistics knowledge
  • Code a feedforward neural network in Theano or Tensorflow
Description

This course is the next logical step in my deep learning, data science, and machine learning series. I’ve made a lot of courses about deep learning, and I just released a course about unsupervised learning, where I talked about clustering and density estimation. So what do you get when you put these two together? Unsupervised deep learning!

In this course we’ll start with some very basic stuff: principal components analysis (PCA), and a popular nonlinear dimensionality reduction technique known as t-SNE (t-distributed stochastic neighbor embedding).
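
To make that concrete, here is a minimal Numpy sketch of the PCA projection (the names are mine, not the course's); t-SNE is shown through scikit-learn purely for illustration, since the course derives and implements these techniques itself:

    import numpy as np
    from sklearn.manifold import TSNE

    def pca(X, k):
        # Center the data, then diagonalize its covariance matrix.
        X = X - X.mean(axis=0)
        eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
        # eigh returns eigenvalues in ascending order; keep the top k.
        W = eigvecs[:, np.argsort(eigvals)[::-1][:k]]
        return X @ W  # project onto the top-k principal components

    # Z2 = pca(X, 2)                              # linear 2-D embedding
    # Z2 = TSNE(n_components=2).fit_transform(X)  # nonlinear 2-D embedding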

Next, we’ll look at a special type of unsupervised neural network called the autoencoder. After describing how an autoencoder works, I’ll show you how you can link a bunch of them together to form a deep stack of autoencoders, which leads to better performance of a supervised deep neural network. Autoencoders are like a nonlinear form of PCA.
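
As a rough sketch of the idea (written with tf.keras for brevity; the course builds these layer classes itself in lower-level Theano and Tensorflow, and the sizes below are placeholders):

    import tensorflow as tf

    D, M = 784, 300  # e.g. flattened MNIST input size, hidden code size (assumed)
    x = tf.keras.Input(shape=(D,))
    z = tf.keras.layers.Dense(M, activation='relu')(x)         # encoder: compress
    x_hat = tf.keras.layers.Dense(D, activation='sigmoid')(z)  # decoder: reconstruct

    autoencoder = tf.keras.Model(x, x_hat)
    autoencoder.compile(optimizer='adam', loss='binary_crossentropy')
    # Train the network to reproduce its own input:
    # autoencoder.fit(X, X, epochs=10)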

Last, we’ll look at restricted Boltzmann machines (RBMs). These are yet another popular type of unsupervised neural network, which you can use in the same way as autoencoders to pretrain your supervised deep neural network. I’ll show you an interesting way of training RBMs based on Gibbs sampling, a special case of Markov Chain Monte Carlo, and I’ll demonstrate that even though this method is only a rough approximation, it still ends up reducing other cost functions, such as the one used for autoencoders. This method is known as Contrastive Divergence or CD-k. As in physical systems, we define a quantity called free energy and attempt to minimize it.
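
A minimal Numpy sketch of one CD-1 update for a Bernoulli RBM (the function and variable names are mine, not the course's):

    import numpy as np

    def sigmoid(a):
        return 1 / (1 + np.exp(-a))

    def cd1_update(v0, W, b, c, lr=0.01):
        # v0: batch of visible vectors; W: weights; b, c: visible/hidden biases.
        # Positive phase: hidden probabilities given the data.
        h0 = sigmoid(v0 @ W + c)
        # One step of Gibbs sampling: sample h, then reconstruct the visibles.
        h = (np.random.rand(*h0.shape) < h0).astype(v0.dtype)
        v1 = sigmoid(h @ W.T + b)
        h1 = sigmoid(v1 @ W + c)
        # Approximate gradient: data statistics minus reconstruction statistics.
        W += lr * (v0.T @ h0 - v1.T @ h1) / len(v0)
        b += lr * (v0 - v1).mean(axis=0)
        c += lr * (h0 - h1).mean(axis=0)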

Finally, we’ll bring all these concepts together and I’ll show you visually what happens when you use PCA and t-SNE on the features that the autoencoders and RBMs have learned, and we’ll see that even without labels the results suggest that a pattern has been found.
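
For instance (a hypothetical continuation of the autoencoder sketch above, where X is the data and Y holds labels that training never sees):

    from sklearn.manifold import TSNE
    import matplotlib.pyplot as plt

    # Extract the learned hidden code and embed it in 2-D.
    encoder = tf.keras.Model(autoencoder.input, autoencoder.layers[1].output)
    Z2 = TSNE(n_components=2).fit_transform(encoder.predict(X))
    # Labels are used only to color the plot, never during training.
    plt.scatter(Z2[:, 0], Z2[:, 1], c=Y, s=5)
    plt.show()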

All the materials used in this course are FREE. Since this course is the 4th in the deep learning series, I will assume you already know calculus, linear algebra, and Python coding. You'll want to install Numpy, Theano, and Tensorflow for this course. These are essential items in your data analytics toolbox.

If you are interested in deep learning and you want to learn about modern deep learning developments beyond just plain backpropagation, including using unsupervised neural networks to interpret what features can be automatically and hierarchically learned in a deep learning system, this course is for you.

This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.



HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:

  • calculus

  • linear algebra

  • probability

  • Python coding: if/else, loops, lists, dicts, sets

  • Numpy coding: matrix and vector operations, loading a CSV file

  • can write a feedforward neural network in Theano or Tensorflow


TIPS (for getting through the course):

  • Watch it at 2x.

  • Take handwritten notes. This will drastically increase your ability to retain the information.

  • Write down the equations. If you don't, I guarantee it will just look like gibberish.

  • Ask lots of questions on the discussion board. The more the better!

  • Realize that most exercises will take you days or weeks to complete.

  • Write code yourself, don't just sit there and look at my code.


WHAT ORDER SHOULD I TAKE YOUR COURSES IN?:

  • Check out the lecture "What order should I take your courses in?" (available in the Appendix of any of my courses, including the free Numpy course)



Who this course is for:
  • Students and professionals looking to enhance their deep learning repertoire
  • Students and professionals who want to improve the training capabilities of deep neural networks
  • Students and professionals who want to learn about the more modern developments in deep learning
Course content
84 lectures 10:21:02
+ Introduction and Outline
6 lectures 22:50
Where to get the code and data
05:02
Tensorflow or Theano - Your Choice!
04:09
What are the practical applications of unsupervised deep learning?
05:34
+ Principal Components Analysis
10 lectures 01:05:13
How does PCA work?
11:21
Why does PCA work? (PCA derivation)
10:12
PCA only rotates
05:29
MNIST visualization, finding the optimal number of principal components
03:39
PCA implementation
03:29
PCA for NLP
03:37
PCA objective function
02:05
PCA Application: Naive Bayes
09:51
SVD (Singular Value Decomposition)
10:58
+ t-SNE (t-distributed Stochastic Neighbor Embedding)
5 lectures 21:40
t-SNE Theory
04:28
t-SNE Visualization
04:33
t-SNE on the Donut
05:51
t-SNE on XOR
04:36
t-SNE on MNIST
02:12
+ Autoencoders
12 lectures 01:10:46
Autoencoders
03:20
Denoising Autoencoders
01:55
Stacked Autoencoders
03:32
Writing the autoencoder class in code (Theano)
11:55
Testing our Autoencoder (Theano)
03:05
Writing the deep neural network class in code (Theano)
12:42
Autoencoder in Code (Tensorflow)
08:29
Testing greedy layer-wise autoencoder training vs. pure backpropagation
03:33
Cross Entropy vs. KL Divergence
04:39
Deep Autoencoder Visualization Description
01:32
Deep Autoencoder Visualization in Code
11:14
An Autoencoder in 1 Line of Code
04:50
+ Restricted Boltzmann Machines
11 lectures 01:20:26
Basic Outline for RBMs
04:51
Introduction to RBMs
08:21
Motivation Behind RBMs
06:51
Intractability
03:11
Neural Network Equations
07:43
Training an RBM (part 1)
11:34
Training an RBM (part 2)
06:18
Training an RBM (part 3) - Free Energy
07:20
RBM Greedy Layer-Wise Pretraining
04:50
RBM in Code (Theano) with Greedy Layer-Wise Training on MNIST
14:24
RBM in Code (Tensorflow)
05:03
+ The Vanishing Gradient Problem
2 lectures 15:24
The Vanishing Gradient Problem Description
03:07
The Vanishing Gradient Problem Demo in Code
12:17
+ Extras + Visualizing what features a neural network has learned
1 lecture 02:07
Exercises on feature visualization and interpretation
02:07
+ Applications to NLP (Natural Language Processing)
3 lectures 21:16

We use SVD to visualize the words in book titles. You'll see how related words can be made to appear close together in 2 dimensions using the SVD transformation.
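
A rough sketch of the idea with scikit-learn (the titles below are placeholders; the course builds its own term-document matrix from real book titles):

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import TruncatedSVD

    titles = [
        "machine learning in python",
        "deep learning in python",
        "bayesian statistics in python",
    ]
    vec = CountVectorizer()
    X = vec.fit_transform(titles)  # documents x terms counts
    # Transpose so each word is a row, then reduce each word to 2-D with SVD.
    Z = TruncatedSVD(n_components=2).fit_transform(X.T)
    words = vec.get_feature_names_out()  # Z[i] is the 2-D location of words[i]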

Application of PCA and SVD to NLP (Natural Language Processing)
02:30
Latent Semantic Analysis in Code
10:08
Application of t-SNE + K-Means: Finding Clusters of Related Words
08:38
+ Applications to Recommender Systems
10 lectures 01:28:49
Recommender Systems Section Introduction
12:30
Why Autoencoders and RBMs work
05:58
Data Preparation and Logistics
05:33
AutoRec
10:14
AutoRec in Code
11:45
Categorical RBM for Recommender System Ratings
11:32
Recommender RBM Code pt 1
07:26
Recommender RBM Code pt 2
04:16
Recommender RBM Code pt 3
11:42
Recommender RBM Code Speedup
07:53
+ Basics Review
7 lectures 52:05
(Review) Theano Basics
07:47
(Review) Theano Neural Network in Code
09:17
(Review) Tensorflow Basics
07:27
(Review) Tensorflow Neural Network in Code
09:43
(Review) Keras Basics
06:48
(Review) Keras in Code pt 1
06:37
(Review) Keras in Code pt 2
04:26