Natural Language Processing with Deep Learning in Python

Complete guide to deriving and implementing word2vec, GLoVe, word embeddings, and sentiment analysis with recursive nets
Bestselling
4.6 (432 ratings)
7,052 students enrolled
Last updated 5/2017
English
Current price: $10 Original price: $120 Discount: 92% off
30-Day Money-Back Guarantee
Includes:
  • 5 hours on-demand video
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What Will I Learn?
  • Understand and implement word2vec
  • Understand the CBOW method in word2vec
  • Understand the skip-gram method in word2vec
  • Understand the negative sampling optimization in word2vec
  • Understand and implement GLoVe using gradient descent and alternating least squares
  • Use recurrent neural networks for parts-of-speech tagging
  • Use recurrent neural networks for named entity recognition
  • Understand and implement recursive neural networks for sentiment analysis
  • Understand and implement recursive neural tensor networks for sentiment analysis
Requirements
  • Install Numpy, Matplotlib, Scikit-Learn, Theano, and TensorFlow (should be extremely easy by now)
  • Understand backpropagation and gradient descent, and be able to derive and implement them on your own
  • Code a recurrent neural network in Theano
  • Code a feedforward neural network in Theano
Description

In this course we are going to look at advanced NLP.

Previously, you learned about some of the basics, like how many NLP problems are just regular machine learning and data science problems in disguise, and simple, practical methods like bag-of-words and term-document matrices.

These allowed us to do some pretty cool things, like detect spam emails, write poetry, spin articles, and group together similar words.
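
(Quick refresher, in case it has been a while: a term-document matrix is nothing more than a table of word counts per document. Here is a minimal sketch, not taken from the course code, using scikit-learn's CountVectorizer on a tiny made-up corpus; note that it returns the document-by-term orientation, i.e. one row per document.)

    # Quick refresher (not course code): bag-of-words as a count matrix.
    from sklearn.feature_extraction.text import CountVectorizer

    docs = [
        "the cat sat on the mat",          # a tiny made-up corpus
        "the dog sat on the log",
        "spin this article about cats and dogs",
    ]

    vectorizer = CountVectorizer()
    X = vectorizer.fit_transform(docs)         # sparse matrix, shape (num documents, vocabulary size)

    print(vectorizer.get_feature_names_out())  # the learned vocabulary
    print(X.toarray())                         # raw word counts, one row per document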

In this course, I’m going to show you how to do even more awesome things. We’ll learn not just one, but four new architectures.

First up is word2vec.

In this course, I’m going to show you exactly how word2vec works, from theory to implementation, and you’ll see that it’s merely the application of skills you already know.

Word2vec is interesting because it magically maps words to a vector space where you can find analogies, like:

  • king - man = queen - woman
  • France - Paris = England - London
  • December - November = July - June
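
And these analogies really are just vector arithmetic plus a nearest-neighbor search. Here is a minimal sketch (not the course's code), assuming word_vectors is a dict that maps words to 1-D Numpy arrays loaded from some trained word2vec or GLoVe model:

    import numpy as np

    def closest_word(query_vec, word_vectors, exclude=()):
        # Return the word whose vector has the highest cosine similarity to query_vec.
        best_word, best_sim = None, -np.inf
        for word, vec in word_vectors.items():
            if word in exclude:
                continue
            sim = query_vec @ vec / (np.linalg.norm(query_vec) * np.linalg.norm(vec))
            if sim > best_sim:
                best_word, best_sim = word, sim
        return best_word

    # Usage (word_vectors must come from a trained model; it is not defined here):
    #   v = word_vectors["king"] - word_vectors["man"] + word_vectors["woman"]
    #   closest_word(v, word_vectors, exclude=("king", "man", "woman"))  # ideally "queen"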


We are also going to look at the GLoVe method, which also finds word vectors but uses a different technique called matrix factorization, a popular approach in recommender systems.

Amazingly, the word vectors produced by GLoVe are just as good as the ones produced by word2vec, and it’s way easier to train.
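
To give you a feel for the matrix-factorization view before the lectures derive it properly, here is a toy sketch (not the course's implementation): it factorizes a made-up co-occurrence matrix X into word vectors W and context vectors U with plain squared-error gradient descent. The real GLoVe objective adds a log transform, a weighting function, and bias terms, and the course covers both the gradient descent and the alternating least squares versions.

    import numpy as np

    # Toy example only: X stands in for a word-word co-occurrence matrix,
    # which in practice is built by scanning a corpus with a context window.
    np.random.seed(0)
    V, D = 10, 5                         # vocabulary size, embedding dimension
    X = np.random.randint(0, 20, size=(V, V)).astype(float)

    W = np.random.randn(V, D) * 0.1      # word vectors
    U = np.random.randn(V, D) * 0.1      # context vectors
    lr = 0.001                           # learning rate

    for epoch in range(500):
        error = W @ U.T - X              # reconstruction error for every (word, context) pair
        W_new = W - lr * (error @ U)     # squared-error gradient w.r.t. W (up to a constant factor)
        U_new = U - lr * (error.T @ W)   # squared-error gradient w.r.t. U
        W, U = W_new, U_new

    print("final squared error:", np.sum((W @ U.T - X) ** 2))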

We will also look at some classical NLP problems, like parts-of-speech tagging and named entity recognition, and use recurrent neural networks to solve them. You’ll see that just about any problem can be solved using neural networks, but you’ll also learn the dangers of having too much complexity.
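
To make that concrete, here is a bare-bones Numpy sketch of what a recurrent tagger computes: one hidden state per word, carried forward through the sentence, and one tag distribution per word. All the sizes here are made up, the weights are random and untrained, and training it with backpropagation through time is exactly what the lectures build up to.

    import numpy as np

    np.random.seed(1)
    V, D, M, K = 100, 10, 20, 5        # vocab size, embedding dim, hidden units, number of tags

    We = np.random.randn(V, D) * 0.1   # word embeddings
    Wx = np.random.randn(D, M) * 0.1   # input-to-hidden weights
    Wh = np.random.randn(M, M) * 0.1   # hidden-to-hidden (recurrent) weights
    Wo = np.random.randn(M, K) * 0.1   # hidden-to-output weights

    def softmax(a):
        e = np.exp(a - a.max())
        return e / e.sum()

    def tag_sentence(word_indexes):
        # Return a (T x K) matrix of tag probabilities, one row per word.
        h = np.zeros(M)
        probs = []
        for t in word_indexes:
            h = np.tanh(We[t] @ Wx + h @ Wh)   # recurrence: new state from current word + previous state
            probs.append(softmax(h @ Wo))      # tag distribution for this word
        return np.array(probs)

    print(tag_sentence([3, 17, 42]).shape)     # (3, 5): 3 words, 5 candidate tags each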

Lastly, you’ll learn about recursive neural networks, which finally help us solve the problem of negation in sentiment analysis. Recursive neural networks exploit the fact that sentences have a tree structure, and we can finally get away from naively using bag-of-words.
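
To make the "tree structure" idea concrete, here is a minimal sketch (again, not the course's code): every node's vector is built from its children's vectors with one shared weight matrix, applied recursively, and the root vector is what you would feed to a sentiment classifier. The tree format, dimensions, and "word vectors" below are made up for illustration.

    import numpy as np

    np.random.seed(2)
    D = 4                                    # embedding / hidden dimension
    W = np.random.randn(2 * D, D) * 0.1      # shared composition weights
    b = np.zeros(D)

    # A tree is either a leaf (a word vector) or a (left_subtree, right_subtree) pair.
    def compose(tree):
        if isinstance(tree, np.ndarray):     # leaf: already a word vector
            return tree
        left, right = tree
        children = np.concatenate([compose(left), compose(right)])
        return np.tanh(children @ W + b)     # parent vector computed from its two children

    # "not (very good)" as a toy binary parse tree with random stand-in word vectors
    not_, very, good = (np.random.randn(D) * 0.1 for _ in range(3))
    root = compose((not_, (very, good)))
    print(root.shape)                        # (4,): one vector representing the whole phrase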

All of the materials required for this course can be downloaded and installed for FREE. We will do most of our work in Numpy, Matplotlib, and Theano. I am always available to answer your questions and help you along your data science journey.

This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.

See you in class!


NOTES:

All the code for this course can be downloaded from my github: /lazyprogrammer/machine_learning_examples

In the directory: nlp_class2

Make sure you always "git pull" so you have the latest version!


TIPS (for getting through the course):

  • Watch it at 2x.
  • Take handwritten notes. This will drastically increase your ability to retain the information.
  • Write down the equations. If you don't, I guarantee it will just look like gibberish.
  • Ask lots of questions on the discussion board. The more the better!
  • Realize that most exercises will take you days or weeks to complete.
  • Write code yourself, don't just sit there and look at my code.


USEFUL COURSE ORDERING:

  • (The Numpy Stack in Python)
  • Linear Regression in Python
  • Logistic Regression in Python
  • (Supervised Machine Learning in Python)
  • (Bayesian Machine Learning in Python: A/B Testing)
  • Deep Learning in Python
  • Practical Deep Learning in Theano and TensorFlow
  • (Supervised Machine Learning in Python 2: Ensemble Methods)
  • Convolutional Neural Networks in Python
  • (Easy NLP)
  • (Cluster Analysis and Unsupervised Machine Learning)
  • Unsupervised Deep Learning
  • (Hidden Markov Models)
  • Recurrent Neural Networks in Python
  • Artificial Intelligence: Reinforcement Learning in Python
  • Natural Language Processing with Deep Learning in Python


HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:

  • calculus
  • linear algebra
  • probability (conditional and joint distributions)
  • Python coding: if/else, loops, lists, dicts, sets
  • Numpy coding: matrix and vector operations, loading a CSV file
  • neural networks and backpropagation
  • Can write a feedforward neural network in Theano and TensorFlow
  • Can write a recurrent neural network / LSTM / GRU in Theano and TensorFlow


Who is the target audience?
  • Students and professionals who want to create word vector representations for various NLP tasks
  • Students and professionals who are interested in state-of-the-art neural network architectures like recursive neural networks
  • SHOULD NOT: Anyone who is not comfortable with the prerequisites.
Curriculum For This Course
45 Lectures, 05:06:28

Outline, Review, and Logistical Things (3 Lectures, 10:37)
  • Where to get the code / data for this course (02:00)
  • How to Succeed in this Course (05:55)

Word Embeddings and Word2Vec (15 Lectures, 01:35:04)
  • Alternative to Wikipedia Data: Brown Corpus (06:03)
  • Using pre-trained word embeddings (02:17)
  • Word analogies using word embeddings (03:51)
  • TF-IDF and t-SNE experiment (12:24)
  • Word2Vec introduction (05:07)
  • CBOW (02:19)
  • Skip-Gram (03:30)
  • Negative Sampling (07:36)
  • Why do I have 2 word embedding matrices and what do I do with them? (01:36)
  • Word2Vec in Code with Numpy (part 1) (19:49)
  • Word2Vec in Code with Numpy (part 2) (01:53)
  • Converting a sequence of word indexes to a sequence of word vectors (03:14)
  • How to update only part of a Theano shared variable (05:29)
  • Word2Vec in Code with Theano (09:57)

Word Embeddings using GLoVe (7 Lectures, 47:23)
  • Recommender systems and matrix factorization tutorial (11:08)
  • GLoVe - Global Vectors for Word Representation (04:12)
  • GLoVe in Code - Numpy Gradient Descent (16:48)
  • GLoVe in Code - Theano Gradient Descent (03:50)
  • GLoVe in Code - Alternating Least Squares (04:42)
  • Visualizing country analogies with t-SNE (04:24)
  • Hyperparameter Challenge (02:19)

Using Neural Networks to Solve NLP Problems (8 Lectures, 52:48)
  • Parts-of-Speech (POS) Tagging (05:00)
  • Parts-of-Speech Tagging Baseline (15:18)
  • Parts-of-Speech Tagging Recurrent Neural Network (13:05)
  • Parts-of-Speech Tagging Hidden Markov Model (HMM) (05:58)
  • Named Entity Recognition (NER) (03:01)
  • Named Entity Recognition Baseline (05:54)
  • Named Entity Recognition RNN (02:19)
  • Hyperparameter Challenge II (02:13)

Recursive Neural Networks (Tree Neural Networks) (7 Lectures, 53:06)
  • Data Description for Recursive Neural Networks (06:52)
  • What are Recursive Neural Networks / Tree Neural Networks (TNNs)? (05:41)
  • Building a TNN with Recursion (04:47)
  • Trees to Sequences (06:38)
  • Recursive Neural Network in Theano (18:34)
  • Recursive Neural Tensor Networks (06:22)
  • Recursive Neural Network in TensorFlow with Recursion (04:12)

Appendix (5 Lectures, 47:30)
  • How to install wp2txt or WikiExtractor.py (02:21)
  • How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:32)
  • How to Code by Yourself (part 1) (15:54)
  • How to Code by Yourself (part 2) (09:23)
  • BONUS: Where to get Udemy coupons and FREE deep learning material (02:20)
About the Instructor
Lazy Programmer Inc.
4.6 Average rating
12,507 Reviews
66,188 Students
19 Courses
Data scientist and big data engineer

I am a data scientist, big data engineer, and full stack software engineer.

For my master's thesis, I worked on brain-computer interfaces using machine learning. These help non-verbal and non-mobile persons communicate with their families and caregivers.

I have worked in online advertising and digital media as both a data scientist and big data engineer, and built various high-throughput web services around said data. I've created new big data pipelines using Hadoop/Pig/MapReduce, built machine learning models to predict click-through rate, built news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering, and validated the results using A/B testing.

I have taught data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics to undergraduate and graduate students at universities such as Columbia University, NYU, Humber College, and The New School.

Multiple businesses have benefited from my web programming expertise. I do all the backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Some of the technologies I've used are: Python, Ruby/Rails, PHP, Bootstrap, jQuery (JavaScript), Backbone, and Angular. For storage/databases I've used MySQL, Postgres, Redis, MongoDB, and more.