Deep Learning: GANs and Variational Autoencoders

Generative Adversarial Networks and Variational Autoencoders in Python, Theano, and Tensorflow
4.7 (19 ratings)
495 students enrolled
Last updated 7/2017
English
30-Day Money-Back Guarantee
Includes:
  • 5.5 hours on-demand video
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What Will I Learn?
  • Learn the basic principles of generative models
  • Build a variational autoencoder in Theano and Tensorflow
  • Build a GAN (Generative Adversarial Network) in Theano and Tensorflow
Requirements
  • Know how to build a neural network in Theano and/or Tensorflow
  • Probability
  • Multivariate Calculus
  • Numpy, etc.
Description

Variational autoencoders and GANs have been two of the most interesting recent developments in deep learning and machine learning.

Yann LeCun, a deep learning pioneer, has said that the most important development in recent years has been adversarial training, referring to GANs.

GAN stands for generative adversarial network, an architecture in which two neural networks, a generator and a discriminator, compete with each other.
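
To make that concrete, here is a minimal sketch of the adversarial setup in TensorFlow 1.x-style code (the layer sizes, 784-dimensional inputs, and learning rates are my own illustrative assumptions, not the course's code). The discriminator is trained to label real data as 1 and generated data as 0, while the generator is trained to make the discriminator output 1 on its samples:

    import tensorflow as tf  # assumes TensorFlow 1.x, the API used in this course

    X = tf.placeholder(tf.float32, shape=(None, 784))  # real data (e.g. flattened images)
    Z = tf.placeholder(tf.float32, shape=(None, 100))   # random noise fed to the generator

    def generator(z):
        with tf.variable_scope("generator"):
            h = tf.layers.dense(z, 128, activation=tf.nn.relu)
            return tf.layers.dense(h, 784, activation=tf.nn.sigmoid)

    def discriminator(x, reuse=False):
        with tf.variable_scope("discriminator", reuse=reuse):
            h = tf.layers.dense(x, 128, activation=tf.nn.relu)
            return tf.layers.dense(h, 1)  # logit: is this sample real?

    G = generator(Z)
    D_logits_real = discriminator(X)
    D_logits_fake = discriminator(G, reuse=True)

    # discriminator loss: push real samples toward 1 and fake samples toward 0
    d_loss = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(
            logits=D_logits_real, labels=tf.ones_like(D_logits_real)) +
        tf.nn.sigmoid_cross_entropy_with_logits(
            logits=D_logits_fake, labels=tf.zeros_like(D_logits_fake)))

    # generator loss: push the discriminator's output on fakes toward 1
    g_loss = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(
            logits=D_logits_fake, labels=tf.ones_like(D_logits_fake)))

    d_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope="discriminator")
    g_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope="generator")
    d_train = tf.train.AdamOptimizer(2e-4).minimize(d_loss, var_list=d_vars)
    g_train = tf.train.AdamOptimizer(2e-4).minimize(g_loss, var_list=g_vars)
    # training alternates: run d_train on a batch, then g_train, inside a tf.Session

We will build this up carefully in the GAN section (including the DCGAN architecture); the point here is just that the two losses pull against each other.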

What is unsupervised learning?

Unsupervised learning means we’re not trying to map input data to targets, we’re just trying to learn the structure of that input data.
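
For example, here is a tiny sketch of learning structure from unlabeled data and then drawing new samples from what was learned. It uses scikit-learn's GaussianMixture, which is my choice for illustration and not code from this course:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    # unlabeled data: two blobs in 2-D, with no targets anywhere in sight
    X = np.vstack([
        np.random.randn(500, 2) + np.array([3.0, 3.0]),
        np.random.randn(500, 2) - np.array([3.0, 3.0]),
    ])

    gmm = GaussianMixture(n_components=2)
    gmm.fit(X)                        # learn the structure of the inputs

    new_points, _ = gmm.sample(50)    # generate new data that resembles X
    print(new_points[:5])

A Gaussian mixture model like this is also the warm-up used in the Generative Modeling Review section below, before we move on to neural networks.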

Once we’ve learned that structure, we can do some pretty cool things.

One example is generating poetry - we’ve done examples of this in the past.

But poetry is a very specific thing. What about writing in general?

If we can learn the structure of language, we can generate any kind of text. In fact, big companies are investing heavily in research on having machines write the news.

But what if we go back to poetry and take away the words?

Well then we get art, in general.

By learning the structure of art, we can create more art.

How about art as sound?

If we learn the structure of music, we can create new music.

Imagine the top 40 hits you hear on the radio are songs written by robots rather than humans.

The possibilities are endless!

You might be wondering, "how is this course different from the first unsupervised deep learning course?"

In that first course, we also tried to learn the structure of data, but for different reasons.

We wanted to learn the structure of data in order to improve supervised training, which we demonstrated was possible.

In this new course, we want to learn the structure of data in order to produce more stuff that resembles the original data.

This by itself is really cool, but we'll also be incorporating ideas from Bayesian Machine Learning, Reinforcement Learning, and Game Theory. That makes it even cooler!
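
To give a rough preview of the other half of the course, here is what a variational autoencoder can look like in TensorFlow 1.x-style code. This is only a sketch under my own assumptions (arbitrary layer sizes, a 2-dimensional latent space, binary-pixel data); the course derives the cost function and the reparameterization trick properly and implements the model in both Theano and Tensorflow:

    import tensorflow as tf  # assumes TensorFlow 1.x

    X = tf.placeholder(tf.float32, shape=(None, 784))

    # encoder: map the input to the mean and log-variance of a Gaussian over z
    h_enc = tf.layers.dense(X, 200, activation=tf.nn.relu)
    z_mean = tf.layers.dense(h_enc, 2)
    z_log_var = tf.layers.dense(h_enc, 2)

    # reparameterization trick: z = mean + std * noise, so gradients can flow
    eps = tf.random_normal(tf.shape(z_mean))
    z = z_mean + tf.exp(0.5 * z_log_var) * eps

    # decoder: map the latent code back to Bernoulli logits over the pixels
    h_dec = tf.layers.dense(z, 200, activation=tf.nn.relu)
    x_logits = tf.layers.dense(h_dec, 784)

    # negative ELBO = reconstruction cross-entropy + KL( q(z|x) || N(0, I) )
    recon = tf.reduce_sum(
        tf.nn.sigmoid_cross_entropy_with_logits(logits=x_logits, labels=X), axis=1)
    kl = -0.5 * tf.reduce_sum(
        1 + z_log_var - tf.square(z_mean) - tf.exp(z_log_var), axis=1)
    loss = tf.reduce_mean(recon + kl)
    train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)

Once trained, feeding random standard-normal vectors through the decoder produces brand new samples that resemble the training data.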

Thanks for reading and I’ll see you in class. =)


NOTES:

All the code for this course can be downloaded from my github:

/lazyprogrammer/machine_learning_examples

In the directory: unsupervised_class3

Make sure you always "git pull" so you have the latest version!


HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:

  • Calculus
  • Probability
  • Object-oriented programming
  • Python coding: if/else, loops, lists, dicts, sets
  • Numpy coding: matrix and vector operations
  • Linear regression
  • Gradient descent
  • Know how to build a feedforward and convolutional neural network in Theano and TensorFlow


TIPS (for getting through the course):

  • Watch it at 2x.
  • Take handwritten notes. This will drastically increase your ability to retain the information.
  • Write down the equations. If you don't, I guarantee it will just look like gibberish.
  • Ask lots of questions on the discussion board. The more the better!
  • Realize that most exercises will take you days or weeks to complete.
  • Write code yourself, don't just sit there and look at my code.


USEFUL COURSE ORDERING:

  • (The Numpy Stack in Python)
  • Linear Regression in Python
  • Logistic Regression in Python
  • (Supervised Machine Learning in Python)
  • (Bayesian Machine Learning in Python: A/B Testing)
  • Deep Learning in Python
  • Practical Deep Learning in Theano and TensorFlow
  • (Supervised Machine Learning in Python 2: Ensemble Methods)
  • Convolutional Neural Networks in Python
  • (Easy NLP)
  • (Cluster Analysis and Unsupervised Machine Learning)
  • Unsupervised Deep Learning
  • (Hidden Markov Models)
  • Recurrent Neural Networks in Python
  • Artificial Intelligence: Reinforcement Learning in Python
  • Natural Language Processing with Deep Learning in Python
  • Advanced AI: Deep Reinforcement Learning in Python
  • Deep Learning: GANs and Variational Autoencoders
Who is the target audience?
  • Anyone who wants to improve their deep learning knowledge
Curriculum For This Course
41 Lectures, 05:16:29

Introduction and Outline (4 Lectures, 18:43)

Generative Modeling Review (8 Lectures, 51:06)
  • What does it mean to Sample? (04:57)
  • Sampling Demo: Bayes Classifier (03:57)
  • Gaussian Mixture Model Review (10:31)
  • Sampling Demo: Bayes Classifier with GMM (03:54)
  • Why do we care about generating samples? (11:20)
  • Neural Network and Autoencoder Review (07:26)
  • Tensorflow Warmup (04:07)
  • Theano Warmup (04:54)

Variational Autoencoders (13 Lectures, 01:25:18)
  • Variational Autoencoder Architecture (05:57)
  • Parameterizing a Gaussian with a Neural Network (08:00)
  • The Latent Space, Predictive Distributions and Samples (05:13)
  • Cost Function (07:28)
  • Tensorflow Implementation (pt 1) (07:18)
  • Tensorflow Implementation (pt 2) (02:29)
  • Tensorflow Implementation (pt 3) (09:55)
  • The Reparameterization Trick (05:05)
  • Theano Implementation (10:52)
  • Visualizing the Latent Space (03:09)
  • Bayesian Perspective (10:11)
  • Variational Autoencoder Section Summary (04:02)

Generative Adversarial Networks (GANs) (11 Lectures, 01:50:18)
  • GAN - Basic Principles (05:13)
  • GAN Cost Function (pt 1) (07:23)
  • GAN Cost Function (pt 2) (04:56)
  • DCGAN (07:38)
  • Batch Normalization Review (08:01)
  • Fractionally-Strided Convolution (08:35)
  • Tensorflow Implementation Notes (13:23)
  • Tensorflow Implementation (18:13)
  • Theano Implementation Notes (07:26)
  • Theano Implementation (19:47)
  • GAN Summary (09:43)

Appendix (5 Lectures, 51:04)
  • How to install Numpy, Theano, Tensorflow, etc... (17:32)
  • How to Succeed in this Course (Long Version) (05:55)
  • How to Code by Yourself (part 1) (15:54)
  • How to Code by Yourself (part 2) (09:23)
  • Where to get discount coupons and FREE deep learning material (02:20)
About the Instructor
Lazy Programmer Inc.
4.6 Average rating
12,373 Reviews
65,538 Students
19 Courses
Data scientist and big data engineer

I am a data scientist, big data engineer, and full stack software engineer.

For my master's thesis I worked on brain-computer interfaces using machine learning. These help non-verbal and non-mobile persons communicate with their families and caregivers.

I have worked in online advertising and digital media as both a data scientist and big data engineer, and built various high-throughput web services around that data. I've created new big data pipelines using Hadoop/Pig/MapReduce. I've built machine learning models to predict click-through rate, and news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering, and validated the results using A/B testing.

I have taught data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics to undergraduate and graduate students at universities such as Columbia University, NYU, Humber College, and The New School.

Multiple businesses have benefitted from my web programming expertise. I do all the backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Some of the technologies I've used are: Python, Ruby/Rails, PHP, Bootstrap, jQuery (Javascript), Backbone, and Angular. For storage/databases I've used MySQL, Postgres, Redis, MongoDB, and more.