This course is the next logical step in my deep learning, data science, and machine learning series. I've made many courses about deep learning, and I just released a course about unsupervised learning, where I talked about clustering and density estimation. So what do you get when you put these two together? Unsupervised deep learning!
In this course we'll start with the basics: principal components analysis (PCA), and a popular nonlinear dimensionality reduction technique known as t-SNE (t-distributed stochastic neighbor embedding).
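If you've never seen these before, here's a minimal sketch of both techniques on toy data. It uses scikit-learn purely for brevity (in the course we build everything from scratch with Numpy), and the data and hyperparameters below are just illustrative:

import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

# toy data: 500 points in 10 dimensions (stand-in for something like MNIST)
X = np.random.randn(500, 10)

# PCA: linear projection onto the directions of greatest variance
pca = PCA(n_components=2)
X_pca = pca.fit_transform(X)
print("explained variance ratios:", pca.explained_variance_ratio_)

# t-SNE: nonlinear embedding that tries to preserve local neighborhoods
tsne = TSNE(n_components=2, perplexity=30.0)
X_tsne = tsne.fit_transform(X)
print("t-SNE output shape:", X_tsne.shape)  # (500, 2), ready to scatter-plot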
Next, we'll look at a special type of unsupervised neural network called the autoencoder. After describing how an autoencoder works, I'll show you how you can link a bunch of them together to form a deep stack of autoencoders, which leads to better performance of a supervised deep neural network. Autoencoders are like a nonlinear form of PCA.
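To make the idea concrete, here is a minimal sketch of a single-hidden-layer autoencoder with tied weights, written in plain Numpy rather than the Theano code we use in the course; the class name, hidden size, learning rate, and epoch count are arbitrary choices for illustration:

import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

class Autoencoder:
    def __init__(self, D, M):
        # tied weights: the decoder reuses W.T, like a nonlinear PCA
        self.W = np.random.randn(D, M) / np.sqrt(D)
        self.bh = np.zeros(M)  # hidden bias
        self.bo = np.zeros(D)  # output (reconstruction) bias

    def forward(self, X):
        Z = sigmoid(X.dot(self.W) + self.bh)  # hidden representation
        X_hat = Z.dot(self.W.T) + self.bo     # linear reconstruction
        return Z, X_hat

    def fit(self, X, lr=0.01, epochs=50):  # illustrative hyperparameters
        N = len(X)
        for epoch in range(epochs):
            Z, X_hat = self.forward(X)
            E = X_hat - X  # reconstruction error
            # backprop through both paths that use the tied weights
            dZ = E.dot(self.W) * Z * (1 - Z)
            self.W -= lr * (X.T.dot(dZ) + E.T.dot(Z)) / N
            self.bo -= lr * E.mean(axis=0)
            self.bh -= lr * dZ.mean(axis=0)
            print("epoch %d, cost %.4f" % (epoch, (E * E).mean()))

# toy usage: compress 20-dimensional data down to 5 hidden units
X = np.random.randn(200, 20)
ae = Autoencoder(D=20, M=5)
ae.fit(X)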
Last, we'll look at restricted Boltzmann machines (RBMs). These are yet another popular unsupervised neural network that you can use in the same way as autoencoders to pretrain your supervised deep neural network. I'll show you an interesting way of training RBMs known as contrastive divergence (or CD-k), which works via Gibbs sampling, a special case of Markov chain Monte Carlo. I'll demonstrate that even though this method is only a rough approximation, it still ends up reducing other cost functions, such as the one used for autoencoders. As in physical systems, we define a concept called free energy and attempt to minimize this quantity.
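As a rough preview of what those lectures derive properly, here is a minimal Bernoulli-Bernoulli RBM trained with CD-1 (one Gibbs step per update) in plain Numpy, not the course's Theano implementation; sizes and learning rate are illustrative, and free_energy computes F(v) = -b.v - sum_j log(1 + exp(c_j + (vW)_j)), the quantity mentioned above:

import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

class RBM:
    def __init__(self, D, M):
        self.W = np.random.randn(D, M) * 0.01  # visible-to-hidden weights
        self.b = np.zeros(D)                   # visible bias
        self.c = np.zeros(M)                   # hidden bias

    def free_energy(self, V):
        # F(v) = -v.b - sum_j log(1 + exp(c_j + (vW)_j)), computed stably
        return -V.dot(self.b) - np.logaddexp(0, V.dot(self.W) + self.c).sum(axis=1)

    def fit(self, X, lr=0.1, epochs=10):  # illustrative hyperparameters
        N = len(X)
        for epoch in range(epochs):
            # positive phase: hidden probabilities given the data
            ph = sigmoid(X.dot(self.W) + self.c)
            h = (np.random.rand(*ph.shape) < ph).astype(X.dtype)
            # negative phase: one Gibbs step back to a "fantasy" visible vector
            pv = sigmoid(h.dot(self.W.T) + self.b)
            v = (np.random.rand(*pv.shape) < pv).astype(X.dtype)
            ph2 = sigmoid(v.dot(self.W) + self.c)
            # CD-1 update: data correlations minus model correlations
            self.W += lr * (X.T.dot(ph) - v.T.dot(ph2)) / N
            self.b += lr * (X - v).mean(axis=0)
            self.c += lr * (ph - ph2).mean(axis=0)
            print("epoch %d, mean free energy %.4f" % (epoch, self.free_energy(X).mean()))

# toy usage: binary data, 20 visible units, 10 hidden units
X = (np.random.rand(100, 20) > 0.5).astype(np.float64)
rbm = RBM(D=20, M=10)
rbm.fit(X)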
Finally, we'll bring all these concepts together: I'll show you visually what happens when you use PCA and t-SNE on the features that the autoencoders and RBMs have learned, and we'll see that even without labels, the results suggest that a pattern has been found.
All the materials used in this course are FREE. Since this course is the 4th in the deep learning series, I will assume you already know calculus, linear algebra, and Python coding. You'll want to install Numpy and Theano for this course. These are essential items in your data analytics toolbox.
If you are interested in deep learning and you want to learn about modern deep learning developments beyond just plain backpropagation, including how unsupervised neural networks can reveal the features that are automatically and hierarchically learned in a deep learning system, then this course is for you.
This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.
All the code for this course can be downloaded from my GitHub: /lazyprogrammer/machine_learning_examples
In the directory: unsupervised_class2
Make sure you always "git pull" so you have the latest version!
HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:
calculus
linear algebra
Python coding
Numpy coding
TIPS (for getting through the course):
USEFUL COURSE ORDERING:
|Section 1: Introduction and Outline|
Introduction and Outline
Where does this course fit into your deep learning studies?
|Section 2: Principal Components Analysis|
What does PCA do?
MNIST visualization, finding the optimal number of principal components
PCA objective function
|Section 3: t-SNE (t-distributed Stochastic Neighbor Embedding)|
t-SNE on the Donut
t-SNE on XOR
t-SNE on MNIST
|Section 4: Autoencoders|
Writing the autoencoder class in code
Writing the deep neural network class in code
Testing greedy layer-wise autoencoder training vs. pure backpropagation
Cross Entropy vs. KL Divergence
Deep Autoencoder Visualization Description
Deep Autoencoder Visualization in Code
|Section 5: Restricted Boltzmann Machines|
What is a restricted Boltzmann machine? How is it related to neural networks? Why is it difficult to train an RBM?
Deriving Conditional Probabilities from Joint Probability
Learn how to train an RBM using contrastive divergence / Gibbs sampling
RBM in Code + Testing a greedily pre-trained deep belief network on MNIST
|Section 6: The Vanishing Gradient Problem|
The Vanishing Gradient Problem Description
The Vanishing Gradient Problem Demo in Code
|Section 7: Extras + Visualizing what features a neural network has learned|
Exercises on feature visualization and interpretation
BONUS: Where to get Udemy coupons and FREE deep learning material
BONUS: How to derive the free energy formula
|Section 8: BONUS: Application of PCA / SVD to NLP (Natural Language Processing)|
We use SVD to visualize the words in book titles. You'll see how related words can be made to appear close together in 2 dimensions using the SVD transformation (a minimal sketch follows this section's lecture listing).
BONUS: Latent Semantic Analysis in Code
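Here is the sketch promised above, on a handful of made-up titles rather than the real dataset from the lecture; the titles and the choice of 2 dimensions are just for illustration:

import numpy as np

# toy "book titles" standing in for the lecture's dataset
titles = [
    "the art of war",
    "the war of the worlds",
    "modern art history",
    "a history of war",
]

# build the vocabulary and a term-document count matrix X (words x titles)
vocab = sorted(set(w for t in titles for w in t.split()))
word2idx = {w: i for i, w in enumerate(vocab)}
X = np.zeros((len(vocab), len(titles)))
for j, t in enumerate(titles):
    for w in t.split():
        X[word2idx[w], j] += 1

# SVD: project each word onto the first 2 left-singular directions
U, s, Vt = np.linalg.svd(X, full_matrices=False)
word_vectors = U[:, :2] * s[:2]  # 2-D coordinates, ready to scatter-plot
for w in vocab:
    print(w, word_vectors[word2idx[w]])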
|Section 9: Appendix|
How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow
I am a data scientist, big data engineer, and full stack software engineer.
For my master's thesis I worked on brain-computer interfaces using machine learning. These help non-verbal and non-mobile persons communicate with their family and caregivers.
I have worked in online advertising and digital media as both a data scientist and big data engineer, and built various high-throughput web services around said data. I've created new big data pipelines using Hadoop/Pig/MapReduce. I've created machine learning models to predict click-through rate, news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering and validated the results using A/B testing.
I have taught undergraduate and graduate students in data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics at universities such as Columbia University, NYU, Humber College, and The New School.