Unsupervised Machine Learning Hidden Markov Models in Python

HMMs for stock price analysis, language modeling, web analytics, biology, and PageRank.
4.6 (183 ratings)
3,002 students enrolled
$120
  • Lectures: 41
  • Length: 4.5 hours
  • Skill Level: All Levels
  • Languages: English
  • Includes: lifetime access, 30 day money back guarantee, available on iOS and Android, certificate of completion


About This Course

Published 6/2016 · English

Course Description

The Hidden Markov Model or HMM is all about learning sequences.

A lot of the data that would be very useful for us to model is in sequences. Stock prices are sequences of prices. Language is a sequence of words. Credit scoring involves sequences of borrowing and repaying money, and we can use those sequences to predict whether or not you’re going to default. In short, sequences are everywhere, and being able to analyze them is an important skill in your data science toolbox.

The easiest way to appreciate the kind of information you get from a sequence is to consider what you are reading right now. If I had written the previous sentence backwards, it wouldn’t make much sense to you, even though it contained all the same words. So order is important.

While the current fad in deep learning is to use recurrent neural networks to model sequences, I want to first introduce you guys to a machine learning algorithm that has been around for several decades now - the Hidden Markov Model.

This course follows directly from my first unsupervised machine learning course, Cluster Analysis and Unsupervised Machine Learning in Python, where you learned how to measure the probability distribution of a random variable. In this course, you’ll learn to measure the probability distribution of a sequence of random variables.
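
To make that concrete, here is a minimal sketch of the forward algorithm, which is how an HMM assigns a probability to an entire observation sequence. All of the parameters below are made up for illustration; the course derives and implements the real thing step by step.

    import numpy as np

    # Hypothetical model: 2 hidden states, 3 possible observed symbols.
    pi = np.array([0.6, 0.4])            # initial hidden state distribution
    A = np.array([[0.7, 0.3],            # hidden state transition matrix
                  [0.4, 0.6]])
    B = np.array([[0.5, 0.4, 0.1],       # p(observed symbol | hidden state)
                  [0.1, 0.3, 0.6]])
    obs = [0, 1, 2, 1]                   # an observed sequence of symbol indices

    # forward pass: alpha[i] = p(observations so far, current hidden state = i)
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]

    print(alpha.sum())                   # probability of the whole sequence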

You guys know how much I love deep learning, so there is a little twist in this course. We’ve already covered gradient descent and you know how central it is for solving deep learning problems. I claimed that gradient descent could be used to optimize any objective function. In this course I will show you how you can use gradient descent to solve for the optimal parameters of an HMM, as an alternative to the popular expectation-maximization algorithm.
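
To give a feel for that idea, here is a hedged sketch (not the course's implementation) that fits the transition matrix of a plain, non-hidden Markov chain by gradient descent on the negative log-likelihood, using the softmax re-parameterization that keeps each row a valid probability distribution. The two-state setup and all numbers are made up.

    import numpy as np

    np.random.seed(0)

    # simulate a toy state sequence from a "true" transition matrix
    A_true = np.array([[0.9, 0.1],
                       [0.3, 0.7]])
    T_len = 5000
    states = [0]
    for _ in range(T_len - 1):
        states.append(np.random.choice(2, p=A_true[states[-1]]))

    def softmax_rows(W):
        e = np.exp(W - W.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    W = np.zeros((2, 2))    # unconstrained parameters; A = softmax of each row
    lr = 0.5
    for _ in range(200):
        A = softmax_rows(W)
        grad = np.zeros_like(W)
        for s, s_next in zip(states[:-1], states[1:]):
            # d(-log A[s, s_next]) / dW[s, :] = softmax(W[s]) - onehot(s_next)
            grad[s] += A[s]
            grad[s, s_next] -= 1.0
        W -= lr * grad / (T_len - 1)

    print(np.round(softmax_rows(W), 2))  # should be close to A_true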

We’re going to do it in Theano, which is a popular library for deep learning. This is also going to teach you how to work with sequences in Theano, which will be very useful when we cover recurrent neural networks and LSTMs.
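
For readers who have not met it, theano.scan is Theano's loop-over-a-sequence construct, and it is the same pattern a scaled forward pass uses at every time step. A tiny, hypothetical example (just a running product over the input):

    import numpy as np
    import theano
    import theano.tensor as T

    x = T.dvector('x')

    def step(x_t, running):
        # called once per element of the sequence; 'running' carries the result forward
        return running * x_t

    results, _ = theano.scan(fn=step,
                             sequences=x,
                             outputs_info=T.as_tensor_variable(np.float64(1.0)))
    running_product = theano.function(inputs=[x], outputs=results)

    print(running_product(np.array([0.5, 0.5, 0.5])))  # [0.5, 0.25, 0.125]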

This course is also going to go through the many practical applications of Markov models and hidden Markov models. We’re going to look at a model of sickness and health, and calculate how long you can expect to stay sick if you get sick. We’re going to talk about how Markov models can be used to analyze how people interact with your website, and fix problem areas like a high bounce rate, which could be hurting your SEO. We’ll build language models that can be used to identify a writer and even generate text - imagine a machine doing your writing for you.
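
As a taste of the sickness-and-health example: in a two-state Markov chain the length of a "sick streak" is geometric, so the expected number of consecutive sick days is 1 / (1 - p(sick -> sick)). The small simulation below, with a made-up transition probability, just sanity-checks that formula.

    import numpy as np

    np.random.seed(1)
    p_stay_sick = 0.8
    expected = 1.0 / (1.0 - p_stay_sick)    # 5 days

    streaks = []
    for _ in range(100000):
        days = 1                            # you are sick on day one
        while np.random.rand() < p_stay_sick:
            days += 1                       # stay sick with probability 0.8
        streaks.append(days)

    print(expected, np.mean(streaks))       # both close to 5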

We’ll look at what is possibly the most famous and prolific application of Markov models - Google’s PageRank algorithm. And finally we’ll discuss even more practical applications of Markov models, including generating images, smartphone autosuggestions, and using HMMs to answer one of the most fundamental questions in biology: how is DNA, the code of life, translated into the physical or behavioral attributes of an organism?
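
PageRank itself is just the stationary distribution of a Markov chain: a "random surfer" follows links with probability 0.85 and teleports to a random page otherwise. Here is a hedged sketch on a made-up four-page link structure, using power iteration:

    import numpy as np

    links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}   # toy web graph
    N, damping = 4, 0.85

    # column-stochastic matrix: M[j, i] = probability of moving from page i to page j
    M = np.zeros((N, N))
    for page, outlinks in links.items():
        for dest in outlinks:
            M[dest, page] = 1.0 / len(outlinks)

    G = damping * M + (1 - damping) / N           # the "Google matrix"

    rank = np.ones(N) / N
    for _ in range(100):                          # power iteration
        rank = G @ rank

    print(np.round(rank, 3))                      # sums to 1; page 2 ranks highest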

All of the materials for this course can be downloaded and installed for FREE. We will do most of our work in Numpy and Matplotlib, along with a little bit of Theano. I am always available to answer your questions and help you along your data science journey.

This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.

See you in class!


NOTES:

All the code for this course can be downloaded from my GitHub: /lazyprogrammer/machine_learning_examples

In the directory: hmm_class

Make sure you always "git pull" so you have the latest version!

HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:

  • calculus
  • linear algebra
  • probability
  • Be comfortable with the multivariate Gaussian distribution
  • Python coding: if/else, loops, lists, dicts, sets
  • Numpy coding: matrix and vector operations, loading a CSV file (a quick self-check sketch follows this list)
  • Cluster Analysis and Unsupervised Machine Learning in Python will provide you with sufficient background
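
If you are unsure whether you meet the Numpy and probability prerequisites, here is a quick, hypothetical self-check; if every line below makes sense, you should be fine (the CSV filename is a stand-in):

    import numpy as np
    from scipy.stats import multivariate_normal

    A = np.random.randn(3, 3)
    v = np.random.randn(3)
    print(A.dot(v), A.T.dot(A))          # matrix-vector and matrix-matrix products

    mu, cov = np.zeros(2), np.eye(2)
    print(multivariate_normal.pdf([0.5, -0.5], mean=mu, cov=cov))  # Gaussian density

    # X = np.loadtxt('my_data.csv', delimiter=',')   # loading a CSV file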


TIPS (for getting through the course):

  • Watch it at 2x.
  • Take handwritten notes. This will drastically increase your ability to retain the information.
    • Write down the equations. If you don't, I guarantee it will just look like gibberish.
  • Ask lots of questions on the discussion board. The more the better!
  • Realize that most exercises will take you days or weeks to complete.


USEFUL COURSE ORDERING:

  • (The Numpy Stack in Python)
  • Linear Regression in Python
  • Logistic Regression in Python
  • (Supervised Machine Learning in Python)
  • (Bayesian Machine Learning in Python: A/B Testing)
  • Deep Learning in Python
  • Practical Deep Learning in Theano and TensorFlow
  • (Supervised Machine Learning in Python 2: Ensemble Methods)
  • Convolutional Neural Networks in Python
  • (Easy NLP)
  • (Cluster Analysis and Unsupervised Machine Learning)
  • Unsupervised Deep Learning
  • (Hidden Markov Models)
  • Recurrent Neural Networks in Python
  • Natural Language Processing with Deep Learning in Python

What are the requirements?

  • Familiarity with probability and statistics
  • Understand Gaussian mixture models
  • Be comfortable with Python and Numpy

What am I going to get from this course?

  • Understand and enumerate the various applications of Markov Models and Hidden Markov Models
  • Understand how Markov Models work
  • Write a Markov Model in code
  • Apply Markov Models to any sequence of data
  • Understand the mathematics behind Markov chains
  • Apply Markov models to language
  • Apply Markov models to website analytics
  • Understand how Google's PageRank works
  • Understand Hidden Markov Models
  • Write a Hidden Markov Model in code
  • Write a Hidden Markov Model using Theano
  • Understand how gradient descent, which is normally used in deep learning, can be used for HMMs

Who is the target audience?

  • Students and professionals who do data analysis, especially on sequence data
  • Professionals who want to optimize their website experience
  • Students who want to strengthen their machine learning knowledge and practical skillset
  • Students and professionals interested in DNA analysis and gene expression
  • Students and professionals interested in modeling language and generating text from a model

What do you get with this course?

Not for you? No problem.
30 day money back guarantee.

Forever yours.
Lifetime access.

Learn on the go.
Desktop, iOS and Android.

Get rewarded.
Certificate of completion.

Curriculum

Section 1: Introduction and Outline
Introduction and Outline: Why would you want to use an HMM?
04:04
Unsupervised or Supervised?
02:58
Where to get the Code and Data
01:19
Section 2: Markov Models
The Markov Property
04:39
Markov Models
04:50
The Math of Markov Chains
05:13
Section 3: Markov Models: Example Problems and Applications
Example Problem: Sick or Healthy
03:26
Example Problem: Expected number of continuously sick days
02:53
Example application: SEO and Bounce Rate Optimization
08:53
Example Application: Build a 2nd-order language model and generate phrases
13:06
Example Application: Google’s PageRank algorithm
05:04
Section 4: Hidden Markov Models for Discrete Observations
From Markov Models to Hidden Markov Models
06:02
HMMs are Doubly Embedded
01:59
How can we choose the number of hidden states?
04:22
The Forward-Backward Algorithm
04:27
Visual Intuition for the Forward Algorithm
03:32
The Viterbi Algorithm
02:57
Visual Intuition for the Viterbi Algorithm
03:16
The Baum-Welch Algorithm
02:38
Baum-Welch Explanation and Intuition
06:34
Baum-Welch Updates for Multiple Observations
04:53
Discrete HMM in Code
20:33
The underflow problem and how to solve it
05:05
Discrete HMM Updates in Code with Scaling
11:53
Scaled Viterbi Algorithm in Log Space
03:38
Gradient Descent Tutorial
04:30
Theano Scan Tutorial
12:40
Discrete HMM in Theano
11:42
Section 5: HMMs for Continuous Observations
Gaussian Mixture Models with Hidden Markov Models
04:12
Generating Data from a Real-Valued HMM
06:35
Continuous-Observation HMM in Code (part 1)
18:37
Continuous-Observation HMM in Code (part 2)
05:12
Continuous HMM in Theano
16:32
Section 6: HMMs for Classification
Generative vs. Discriminative Classifiers
02:30
HMM Classification on Poetry Data (Robert Frost vs. Edgar Allan Poe)
10:36
Section 7: Bonus Example: Parts-of-Speech Tagging
05:00

Note:

Data is from: http://www.cnts.ua.ac.be/conll2000/chunking/

Code is in the same repo as this course, but in the folder nlp_class2

POS Tagging with an HMM
05:58
Section 8: Appendix
Review of Gaussian Mixture Models
03:04
Theano Tutorial
07:47
How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow
17:22
BONUS: Where to get Udemy coupons and FREE deep learning material
02:20


Instructor Biography

Lazy Programmer Inc., Data scientist and big data engineer

I am a data scientist, big data engineer, and full stack software engineer.

For my master's thesis, I worked on brain-computer interfaces using machine learning. These help non-verbal and non-mobile persons communicate with their family and caregivers.

I have worked in online advertising and digital media as both a data scientist and big data engineer, and built various high-throughput web services around that data. I've created new big data pipelines using Hadoop/Pig/MapReduce. I've created machine learning models to predict click-through rate, and news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering, validating the results with A/B testing.

I have taught undergraduate and graduate courses in data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics to students attending universities such as Columbia University, NYU, Humber College, and The New School.

Multiple businesses have benefitted from my web programming expertise. I do all the backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Some of the technologies I've used are: Python, Ruby/Rails, PHP, Bootstrap, jQuery (Javascript), Backbone, and Angular. For storage/databases I've used MySQL, Postgres, Redis, MongoDB, and more.

Ready to start learning?
Take This Course