In this course we are going to look at advanced NLP.
Previously, you learned some of the basics, such as how many NLP problems are just regular machine learning and data science problems in disguise, along with simple, practical methods like bag-of-words and term-document matrices.
These allowed us to do some pretty cool things, like detect spam emails, write poetry, spin articles, and group together similar words.
In this course, I’m going to show you how to do even more awesome things. We’ll learn not just one, but four new architectures.
First up is word2vec.
In this course, I’m going to show you exactly how word2vec works, from theory to implementation, and you’ll see that it’s merely the application of skills you already know.
Word2vec is interesting because it magically maps words to a vector space where you can find analogies, like “king is to man as queen is to woman” (king - man + woman ≈ queen).
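To make that concrete, here’s a minimal sketch of how an analogy query works once you have word vectors. The 2-dimensional vectors below are made up purely for illustration; real word2vec embeddings are learned from a corpus and typically have 100-300 dimensions.

```python
import numpy as np

# Toy 2-D word vectors, made up for illustration only; real word2vec
# embeddings are learned from data and have far more dimensions.
vectors = {
    "king":  np.array([0.8, 0.9]),
    "queen": np.array([0.8, 0.1]),
    "man":   np.array([0.2, 0.9]),
    "woman": np.array([0.2, 0.1]),
    "apple": np.array([0.5, 0.5]),
}

def closest(target, exclude):
    # Return the word nearest to `target` by cosine similarity,
    # skipping the words used in the query.
    best, best_sim = None, -np.inf
    for word, v in vectors.items():
        if word in exclude:
            continue
        sim = v @ target / (np.linalg.norm(v) * np.linalg.norm(target))
        if sim > best_sim:
            best, best_sim = word, sim
    return best

# king - man + woman should land near queen
target = vectors["king"] - vectors["man"] + vectors["woman"]
print(closest(target, exclude={"king", "man", "woman"}))  # -> queen
```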
We are also going to look at the GloVe method, which also finds word vectors, but uses matrix factorization, a technique popular in recommender systems.
Amazingly, the word vectors produced by GloVe are just as good as the ones produced by word2vec, and it’s way easier to train.
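To give you a taste of the idea, here’s a toy sketch of matrix factorization by gradient descent: approximate a matrix as the product of two low-rank factors. The random matrix below is just a stand-in; GloVe’s actual objective is a weighted least-squares fit to log co-occurrence counts, which we cover in the course.

```python
import numpy as np

# Toy matrix factorization by gradient descent: approximate X
# (think: log word co-occurrence counts) as W @ U.T with rank D.
np.random.seed(0)
V, D = 10, 3                      # "vocabulary" size, embedding dim
X = np.random.randn(V, V)         # stand-in for the target matrix
W = np.random.randn(V, D) * 0.1   # word vectors
U = np.random.randn(V, D) * 0.1   # context vectors

lr = 0.01
for epoch in range(500):
    delta = W @ U.T - X           # reconstruction residual
    gW = delta @ U                # gradient of 0.5 * ||delta||^2 wrt W
    gU = delta.T @ W              # ...and wrt U
    W -= lr * gW
    U -= lr * gU

print("mean squared error:", np.mean((W @ U.T - X) ** 2))
```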
We will also look at some classical NLP problems, like part-of-speech (POS) tagging and named entity recognition, and use recurrent neural networks to solve them. You’ll see that just about any problem can be solved using neural networks, but you’ll also learn the dangers of having too much complexity.
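As a flavor of the baseline lectures, here’s a sketch of the classic most-frequent-tag baseline for POS tagging. The two training sentences are made-up toy data; the baseline itself is a standard reference point, not the course’s RNN.

```python
from collections import Counter, defaultdict

# Baseline POS tagger: tag each word with the tag it was most often
# seen with in training. The training data below is a toy example.
train = [
    [("the", "DET"), ("dog", "NOUN"), ("runs", "VERB")],
    [("the", "DET"), ("cat", "NOUN"), ("runs", "VERB")],
]

counts = defaultdict(Counter)
for sentence in train:
    for word, tag in sentence:
        counts[word][tag] += 1

def tag(sentence, unknown="NOUN"):
    # Unseen words get a default tag (nouns are a common fallback).
    return [counts[w].most_common(1)[0][0] if w in counts else unknown
            for w in sentence]

print(tag(["the", "dog", "runs"]))  # ['DET', 'NOUN', 'VERB']
```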
Lastly, you’ll learn about recursive neural networks, which finally let us handle negation in sentiment analysis. Recursive neural networks exploit the fact that sentences have a tree structure, so we can get away from naively using bag-of-words.
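To show what “exploiting the tree structure” means, here’s a toy sketch of the recursion: every internal node combines its children’s vectors with one shared weight matrix. The weights and the hand-built two-word tree below are placeholders, not a trained model.

```python
import numpy as np

# Toy recursive (tree) network: a tree is either a word (leaf) or a
# (left, right) pair; each internal node combines its children.
np.random.seed(1)
D = 4
W = np.random.randn(D, 2 * D) * 0.1   # maps [left; right] -> parent
b = np.zeros(D)
word_vec = {"not": np.random.randn(D), "good": np.random.randn(D)}

def node_vector(tree):
    if isinstance(tree, str):          # leaf: look up the word vector
        return word_vec[tree]
    left, right = tree                 # internal node: combine children
    children = np.concatenate([node_vector(left), node_vector(right)])
    return np.tanh(W @ children + b)

# "not good": the parent vector sees the negation together with its
# target, which a bag-of-words representation cannot do.
print(node_vector(("not", "good")))
```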
All of the materials required for this course can be downloaded and installed for FREE. We will do most of our work in Numpy, Matplotlib, and Theano. I am always available to answer your questions and help you along your data science journey.
This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.
See you in class!
All the code for this course can be downloaded from my GitHub: /lazyprogrammer/machine_learning_examples
In the directory: nlp_class2
Make sure you always "git pull" so you have the latest version!
TIPS (for getting through the course):
USEFUL COURSE ORDERING:
|Section 1: Outline, Review, and Logistical Things|
Introduction, Outline, and Review
Where to get the code / data for this course
|Section 2: Word Embeddings and Word2Vec|
What is a word embedding?
Using pre-trained word embeddings
Word analogies using word embeddings
TF-IDF and t-SNE experiment
Why do I have 2 word embedding matrices and what do I do with them?
Word2Vec in Code with Numpy (part 1)
Word2Vec in Code with Numpy (part 2)
Converting a sequence of word indexes to a sequence of word vectors
How to update only part of a Theano shared variable
Word2Vec in Code with Theano
|Section 3: Word Embeddings using GloVe|
Recommender systems and matrix factorization tutorial
GloVe - Global Vectors for Word Representation
GloVe in Code - Numpy Gradient Descent
GloVe in Code - Theano Gradient Descent
GloVe in Code - Alternating Least Squares
Visualizing country analogies with t-SNE
|Section 4: Using Neural Networks to Solve NLP Problems|
Parts-of-Speech (POS) Tagging
Parts-of-Speech Tagging Baseline
Parts-of-Speech Tagging Recurrent Neural Network
Parts-of-Speech Tagging Hidden Markov Model (HMM)
Named Entity Recognition (NER)
Named Entity Recognition Baseline
Named Entity Recognition RNN
Hyperparameter Challenge II
|Section 5: Recursive Neural Networks (Tree Neural Networks)|
Data Description for Recursive Neural Networks
What are Recursive Neural Networks / Tree Neural Networks (TNNs)?
Building a TNN with Recursion
Trees to Sequences
Recursive Neural Network in Theano
Recursive Neural Tensor Networks
Recursive Neural Network in TensorFlow with Recursion
|Section 6: Appendix|
How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow
How to install wp2txt or WikiExtractor.py
BONUS: Where to get Udemy coupons and FREE deep learning material
I am a data scientist, big data engineer, and full stack software engineer.
For my master’s thesis I worked on brain-computer interfaces using machine learning. These help non-verbal and non-mobile persons communicate with their families and caregivers.
I have worked in online advertising and digital media as both a data scientist and big data engineer, and built various high-throughput web services around said data. I've created new big data pipelines using Hadoop/Pig/MapReduce. I've created machine learning models to predict click-through rate, and news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering, validating the results with A/B testing.
I have taught data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics to undergraduate and graduate students at universities such as Columbia University, NYU, Humber College, and The New School.