Deep Learning: Recurrent Neural Networks in Python

GRU, LSTM, + more modern deep learning, machine learning, and data science for sequences
4.5 (556 ratings)
7,742 students enrolled
Last updated 8/2017
English
English [Auto-generated]
Includes:
  • 5 hours on-demand video
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What Will I Learn?
  • Understand the simple recurrent unit (Elman unit)
  • Understand the GRU (gated recurrent unit)
  • Understand the LSTM (long short-term memory unit)
  • Write various recurrent networks in Theano
  • Understand backpropagation through time
  • Understand how to mitigate the vanishing gradient problem
  • Solve the XOR and parity problems using a recurrent neural network
  • Use recurrent neural networks for language modeling
  • Use RNNs for generating text, like poetry
  • Visualize word embeddings and look for patterns in word vector representations
Requirements
  • Calculus
  • Linear algebra
  • Python, Numpy, Matplotlib
  • Be able to write a neural network in Theano
  • Understand backpropagation
  • Probability (conditional and joint distributions)
  • Be able to write a neural network in TensorFlow
Description

Like the course I just released on Hidden Markov Models, Recurrent Neural Networks are all about learning sequences. But whereas Markov Models are limited by the Markov assumption, Recurrent Neural Networks are not. As a result, they are more expressive and more powerful than anything we’ve seen before, and they have driven progress on tasks that had stalled for decades.

So what’s going to be in this course and how will it build on the previous neural network courses and Hidden Markov Models?

In the first section of the course we are going to add the concept of time to our neural networks.

I’ll introduce you to the Simple Recurrent Unit, also known as the Elman unit.
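
To give you a rough idea of what that looks like, here is a minimal Numpy sketch of an Elman-unit forward pass over a sequence. The names (Wx, Wh, bh) and dimensions are illustrative only, not the exact code from the course repo.

```python
import numpy as np

def elman_forward(X, Wx, Wh, bh, h0):
    """Run a simple (Elman) recurrent unit over a sequence.

    X  : (T, D)  sequence of input vectors
    Wx : (D, M)  input-to-hidden weights
    Wh : (M, M)  hidden-to-hidden weights
    bh : (M,)    hidden bias
    h0 : (M,)    initial hidden state
    """
    T = X.shape[0]
    M = h0.shape[0]
    H = np.zeros((T, M))
    h = h0
    for t in range(T):
        # the hidden state depends on the current input AND the previous hidden state
        h = np.tanh(X[t].dot(Wx) + h.dot(Wh) + bh)
        H[t] = h
    return H

# tiny usage example with random weights
D, M, T = 4, 5, 10
X = np.random.randn(T, D)
H = elman_forward(X,
                  np.random.randn(D, M) * 0.1,
                  np.random.randn(M, M) * 0.1,
                  np.zeros(M), np.zeros(M))
print(H.shape)  # (10, 5)
```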

We are going to revisit the XOR problem, but we’re going to extend it so that it becomes the parity problem. You’ll see that regular feedforward neural networks have trouble solving it, but recurrent networks can, because the key is to treat the input as a sequence.
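
To see why treating the input as a sequence helps, note that the parity of a bit string is just a running XOR. Here is a small sketch (variable names are mine, not from the course code) that builds the per-step targets a recurrent network can be trained on, rather than mapping the whole N-bit vector to its parity in one shot:

```python
import numpy as np

def parity_sequence_targets(bits):
    """Return the running parity of a bit sequence at each time step.

    As a sequence, the target at step t is XOR(previous parity, current bit),
    an easy recurrence; a feedforward net must learn the whole mapping at once.
    """
    targets = np.zeros(len(bits), dtype=int)
    parity = 0
    for t, b in enumerate(bits):
        parity ^= b           # running XOR
        targets[t] = parity
    return targets

bits = [1, 0, 1, 1, 0, 1]
print(parity_sequence_targets(bits))  # [1 1 0 1 1 0]
```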

In the next section of the course, we are going to revisit one of the most popular applications of recurrent neural networks - language modeling.

You saw when we studied Markov Models that we could do things like generate poetry, and it didn’t look too bad. We could even discriminate between two different poets just from the sequences of parts-of-speech tags they used.

In this course, we are going to extend our language model so that it no longer makes the Markov assumption.

Another popular application of neural networks for language is word vectors, or word embeddings. The most common technique for this is called Word2Vec, but I’ll show you how recurrent neural networks can also be used to create word vectors.
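
One common way this works in an RNN language model: the input word-embedding matrix is just a learned weight matrix indexed by word ID, so training the language model learns word vectors as a side effect. A minimal sketch (names like We and the word IDs are hypothetical):

```python
import numpy as np

V, D = 10000, 50                      # vocabulary size, embedding dimension
We = np.random.randn(V, D) * 0.01     # word embedding matrix (learned during training)

def embed(sentence_ids):
    """Map a sequence of word IDs to a sequence of word vectors.

    This lookup is the first layer of an RNN language model, so the
    rows of We become the word embeddings once the model is trained.
    """
    return We[sentence_ids]           # shape (T, D)

sentence_ids = np.array([42, 7, 1337])   # hypothetical IDs for "the", "cat", "sat"
print(embed(sentence_ids).shape)          # (3, 50)
```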

In the section after that, we’ll look at the very popular LSTM, or long short-term memory unit, and the more modern and efficient GRU, or gated recurrent unit, which has been shown to yield comparable performance.
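
For reference, here is a minimal Numpy sketch of a single GRU step using the standard update-gate / reset-gate equations. The weight names are illustrative, not the course code:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x, h_prev, Wxz, Whz, bz, Wxr, Whr, br, Wxh, Whh, bh):
    """One step of a gated recurrent unit (GRU).

    z : update gate -- how much of the new candidate state to take
    r : reset gate  -- how much of the old state to use in the candidate
    """
    z = sigmoid(x.dot(Wxz) + h_prev.dot(Whz) + bz)
    r = sigmoid(x.dot(Wxr) + h_prev.dot(Whr) + br)
    h_hat = np.tanh(x.dot(Wxh) + (r * h_prev).dot(Whh) + bh)
    return (1 - z) * h_prev + z * h_hat

# tiny usage with random weights
D, M = 4, 6
x, h = np.random.randn(D), np.zeros(M)
Wxz, Wxr, Wxh = (np.random.randn(D, M) * 0.1 for _ in range(3))
Whz, Whr, Whh = (np.random.randn(M, M) * 0.1 for _ in range(3))
bz, br, bh = np.zeros(M), np.zeros(M), np.zeros(M)
h = gru_step(x, h, Wxz, Whz, bz, Wxr, Whr, br, Wxh, Whh, bh)
print(h.shape)  # (6,)
```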

We’ll apply these to some more practical problems, such as learning a language model from Wikipedia data and visualizing the word embeddings we get as a result.
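
As a taste of the visualization step, once a language model is trained you can project its embedding matrix down to 2D and plot it. This is just one possible approach, using scikit-learn's TSNE (which is not a stated course requirement) and random placeholder data:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

# We: (V, D) learned embedding matrix; idx2word: list of V words (placeholders here)
V, D = 500, 50
We = np.random.randn(V, D)
idx2word = ["word%d" % i for i in range(V)]

# project the D-dimensional embeddings down to 2 dimensions
Z = TSNE(n_components=2).fit_transform(We)

plt.scatter(Z[:, 0], Z[:, 1], s=5)
for i in np.random.choice(V, 20, replace=False):   # label a few random words
    plt.annotate(idx2word[i], (Z[i, 0], Z[i, 1]))
plt.show()
```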

All of the materials required for this course can be downloaded and installed for FREE. We will do most of our work in Numpy, Matplotlib, and Theano. I am always available to answer your questions and help you along your data science journey.

This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.

See you in class!


NOTES:

All the code for this course can be downloaded from my github: /lazyprogrammer/machine_learning_examples

In the directory: rnn_class

Make sure you always "git pull" so you have the latest version!

HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:

  • calculus
  • linear algebra
  • probability (conditional and joint distributions)
  • Python coding: if/else, loops, lists, dicts, sets
  • Numpy coding: matrix and vector operations, loading a CSV file
  • Deep learning: backpropagation, XOR problem
  • Can write a neural network in Theano and TensorFlow


TIPS (for getting through the course):

  • Watch it at 2x.
  • Take handwritten notes. This will drastically increase your ability to retain the information.
  • Write down the equations. If you don't, I guarantee it will just look like gibberish.
  • Ask lots of questions on the discussion board. The more the better!
  • Realize that most exercises will take you days or weeks to complete.
  • Write code yourself, don't just sit there and look at my code.


USEFUL COURSE ORDERING:

  • (The Numpy Stack in Python)
  • Linear Regression in Python
  • Logistic Regression in Python
  • (Supervised Machine Learning in Python)
  • (Bayesian Machine Learning in Python: A/B Testing)
  • Deep Learning in Python
  • Practical Deep Learning in Theano and TensorFlow
  • (Supervised Machine Learning in Python 2: Ensemble Methods)
  • Convolutional Neural Networks in Python
  • (Easy NLP)
  • (Cluster Analysis and Unsupervised Machine Learning)
  • Unsupervised Deep Learning
  • (Hidden Markov Models)
  • Recurrent Neural Networks in Python
  • Artificial Intelligence: Reinforcement Learning in Python
  • Natural Language Processing with Deep Learning in Python
Who is the target audience?
  • If you want to level up with deep learning, take this course.
  • If you are a student or professional who wants to apply deep learning to time series or sequence data, take this course.
  • If you want to learn about word embeddings and language modeling, take this course.
  • If you want to improve the performance you got with Hidden Markov Models, take this course.
  • If you're interested in the techniques that led to new developments in machine translation, take this course.
  • If you have no idea about deep learning, don't take this course, take the prerequisites.
Curriculum For This Course
39 Lectures
04:51:16
Introduction and Outline
4 Lectures 14:09
The Simple Recurrent Unit
9 Lectures 01:04:52

Prediction and Relationship to Markov Models
05:14

Unfolding a Recurrent Network
01:56

Backpropagation Through Time (BPTT)
04:17
We discuss how to do gradient descent when time is involved, pitfalls like the vanishing gradient problem and the exploding gradient problem, the gradient clipping technique, and truncated backpropagation through time.

The Parity Problem - XOR on Steroids
04:32

The Parity Problem in Code using a Feedforward ANN
15:05

Theano Scan Tutorial
12:40

The Parity Problem in Code using a Recurrent Neural Network
15:13

On Adding Complexity
01:16
Recurrent Neural Networks for NLP
8 Lectures 59:21
Word Embeddings and Recurrent Neural Networks
05:01

Word Analogies with Word Embeddings
02:25

Representing a sequence of words as a sequence of word embeddings
03:14

Generating Poetry
04:23

Generating Poetry in Code (part 1)
19:23

Generating Poetry in Code (part 2)
04:34

Classifying Poetry
03:39

Classifying Poetry in Code
16:42
Advanced RNN Units
11 Lectures 01:27:21
Rated RNN Unit
03:24

RRNN in Code - Revisiting Poetry Generation
08:49

Gated Recurrent Unit (GRU)
05:17

GRU in Code
06:28

Long Short-Term Memory (LSTM)
04:30

LSTM in Code
08:14

Learning from Wikipedia Data
06:57

Alternative to Wikipedia Data: Brown Corpus
06:03

Learning from Wikipedia Data in Code (part 1)
17:56

Learning from Wikipedia Data in Code (part 2)
08:37

Visualizing the Word Embeddings
11:06
Batch Training
1 Lecture 10:25
Batch Training for Simple RNN
10:25
TensorFlow
1 Lecture 07:38
Simple RNN in TensorFlow
07:38
Appendix
5 Lectures 47:30
How to install wp2txt or WikiExtractor.py
02:21

How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow
17:32

How to Code by Yourself (part 1)
15:54

How to Code by Yourself (part 2)
09:23

BONUS: Where to get Udemy coupons and FREE deep learning material
02:20
About the Instructor
Lazy Programmer Inc.
4.6 Average rating
14,165 Reviews
75,408 Students
19 Courses
Data scientist and big data engineer

I am a data scientist, big data engineer, and full stack software engineer.

For my master's thesis I worked on brain-computer interfaces using machine learning. These help non-verbal and non-mobile persons communicate with their family and caregivers.

I have worked in online advertising and digital media as both a data scientist and big data engineer, and built various high-throughput web services around said data. I've created new big data pipelines using Hadoop/Pig/MapReduce. I've built machine learning models to predict click-through rate, built news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering, and validated the results using A/B testing.

I have taught undergraduate and graduate students in data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics at universities such as Columbia University, NYU, Humber College, and The New School.

Multiple businesses have benefitted from my web programming expertise. I do all the backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Some of the technologies I've used are: Python, Ruby/Rails, PHP, Bootstrap, jQuery (Javascript), Backbone, and Angular. For storage/databases I've used MySQL, Postgres, Redis, MongoDB, and more.