Unsupervised Machine Learning Hidden Markov Models in Python
HMMs for stock price analysis, language modeling, web analytics, biology, and PageRank.
4.6 (396 ratings)
6,143 students enrolled
Last updated 8/2017
English
Includes:
  • 6 hours on-demand video
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What Will I Learn?
  • Understand and enumerate the various applications of Markov Models and Hidden Markov Models
  • Understand how Markov Models work
  • Write a Markov Model in code
  • Apply Markov Models to any sequence of data
  • Understand the mathematics behind Markov chains
  • Apply Markov models to language
  • Apply Markov models to website analytics
  • Understand how Google's PageRank works
  • Understand Hidden Markov Models
  • Write a Hidden Markov Model in Code
  • Write a Hidden Markov Model using Theano and Tensorflow
  • Understand how gradient descent, which is normally used in deep learning, can be used for HMMs
Requirements
  • Be familiar with probability and statistics
  • Understand Gaussian mixture models
  • Be comfortable with Python and Numpy
Description

The Hidden Markov Model or HMM is all about learning sequences.

A lot of the data that would be very useful for us to model is in sequences. Stock prices are sequences of prices. Language is a sequence of words. Credit scoring involves sequences of borrowing and repaying money, and we can use those sequences to predict whether or not you’re going to default. In short, sequences are everywhere, and being able to analyze them is an important skill in your data science toolbox.

The easiest way to appreciate the kind of information you get from a sequence is to consider what you are reading right now. If I had written the previous sentence backwards, it wouldn’t make much sense to you, even though it would contain all the same words. So order is important.

While the current fad in deep learning is to use recurrent neural networks to model sequences, I want to first introduce you guys to a machine learning algorithm that has been around for several decades now - the Hidden Markov Model.

This course follows directly from my first course on unsupervised machine learning, Cluster Analysis and Unsupervised Machine Learning in Python, where you learned how to measure the probability distribution of a random variable. In this course, you’ll learn how to measure the probability distribution of a sequence of random variables.
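
To make that concrete, here is a minimal sketch (not the course's code; the two-state model and its numbers are invented for illustration) of what the probability of a sequence looks like under a first-order Markov assumption: an initial-state probability times a product of one-step transition probabilities.

    import numpy as np

    # Hypothetical 2-state Markov model; all numbers are made up.
    pi = np.array([0.6, 0.4])         # initial state distribution p(x_1)
    A = np.array([[0.7, 0.3],         # A[i, j] = p(x_t = j | x_{t-1} = i)
                  [0.4, 0.6]])

    def sequence_log_prob(x, pi, A):
        """log p(x_1, ..., x_T) = log pi[x_1] + sum_t log A[x_{t-1}, x_t]"""
        logp = np.log(pi[x[0]])
        for t in range(1, len(x)):
            logp += np.log(A[x[t-1], x[t]])
        return logp

    print(sequence_log_prob([0, 0, 1, 1, 0], pi, A))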

You guys know how much I love deep learning, so there is a little twist in this course. We’ve already covered gradient descent and you know how central it is for solving deep learning problems. I claimed that gradient descent could be used to optimize any objective function. In this course I will show you how you can use gradient descent to solve for the optimal parameters of an HMM, as an alternative to the popular expectation-maximization algorithm.
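
As a preview of that idea, here is a minimal sketch of plain gradient descent on a toy quadratic objective; in the course the objective would instead be the HMM's negative log-likelihood (computed by the forward algorithm), with the probability parameters kept valid by optimizing unconstrained weights pushed through a softmax.

    import numpy as np

    # Toy objective: for an HMM this would be the negative log-likelihood
    # of the observed sequences, and theta would be the (unconstrained)
    # weights behind pi, A, and B.
    def cost(theta):
        return np.sum((theta - 3.0) ** 2)

    def grad(theta):
        return 2.0 * (theta - 3.0)

    theta = np.zeros(4)
    learning_rate = 0.1
    for _ in range(100):
        theta -= learning_rate * grad(theta)

    print(theta, cost(theta))  # theta approaches 3, cost approaches 0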

We’re going to do it in Theano and Tensorflow, which are popular libraries for deep learning. This is also going to teach you how to work with sequences in Theano and Tensorflow, which will be very useful when we cover recurrent neural networks and LSTMs.
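
For a taste of the pattern, here is a tiny tf.scan example (written for TensorFlow's eager mode; the lectures predate TensorFlow 2, so the exact calls there may look different). scan threads an accumulator along a sequence, which is exactly the dependence structure the HMM forward pass needs: alpha_t is computed from alpha_{t-1}.

    import tensorflow as tf

    # Toy example: running product of a sequence of numbers.
    x = tf.constant([1.0, 2.0, 3.0, 4.0])
    running_product = tf.scan(lambda acc, x_t: acc * x_t,
                              x,
                              initializer=tf.constant(1.0))
    print(running_product)  # [1., 2., 6., 24.]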

This course is also going to go through the many practical applications of Markov models and hidden Markov models. We’re going to look at a model of sickness and health, and calculate how long you can expect to stay sick if you get sick. We’re going to talk about how Markov models can be used to analyze how people interact with your website, and fix problem areas like a high bounce rate, which could be affecting your SEO. We’ll build language models that can be used to identify a writer and even generate text - imagine a machine doing your writing for you. HMMs have been very successful in natural language processing, or NLP.
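
For example, here is a minimal sketch of the sick/healthy idea with invented transition probabilities (not the numbers used in the lectures): once you are sick, the number of consecutive sick days is geometric, so its expectation follows directly from the self-transition probability.

    import numpy as np

    # States: 0 = healthy, 1 = sick. A[i, j] = p(tomorrow = j | today = i).
    # These probabilities are made up for illustration.
    A = np.array([[0.95, 0.05],
                  [0.30, 0.70]])

    # Expected consecutive sick days = 1 / (1 - p(sick -> sick))
    print(1.0 / (1.0 - A[1, 1]))  # ~3.33 days with these numbers

    # Sanity check by simulation
    rng = np.random.default_rng(0)
    runs = []
    for _ in range(100_000):
        days = 1
        while rng.random() < A[1, 1]:
            days += 1
        runs.append(days)
    print(np.mean(runs))  # close to the analytic answer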

We’ll look at what is possibly the most recent and prolific application of Markov models - Google’s PageRank algorithm. And finally we’ll discuss even more practical applications of Markov models, including generating images, smartphone autosuggestions, and using HMMs to answer one of the most fundamental questions in biology - how is DNA, the code of life, translated into physical or behavioral attributes of an organism?
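
As a preview, here is a minimal PageRank sketch: the three-page web graph is invented, 0.85 is the commonly quoted damping factor, and the ranks are simply the stationary distribution of the resulting Markov chain, found by power iteration.

    import numpy as np

    # A[i, j] = probability that a surfer on page i clicks through to page j.
    # This tiny "web" is made up for illustration.
    A = np.array([[0.0, 0.5, 0.5],
                  [1.0, 0.0, 0.0],
                  [0.5, 0.5, 0.0]])
    N = A.shape[0]
    d = 0.85  # damping factor

    # With probability d follow a link, otherwise jump to a random page.
    G = d * A + (1.0 - d) / N

    # Power iteration: the PageRank vector is the stationary distribution.
    r = np.ones(N) / N
    for _ in range(100):
        r = r @ G
    print(r, r.sum())  # ranks sum to 1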

All of the materials for this course can be downloaded and installed for FREE. We will do most of our work in Numpy and Matplotlib, along with a little bit of Theano and Tensorflow. I am always available to answer your questions and help you along your data science journey.

This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.

See you in class!


NOTES:

All the code for this course can be downloaded from my github: /lazyprogrammer/machine_learning_examples

In the directory: hmm_class

Make sure you always "git pull" so you have the latest version!

HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:

  • calculus
  • linear algebra
  • probability
  • Be comfortable with the multivariate Gaussian distribution
  • Python coding: if/else, loops, lists, dicts, sets
  • Numpy coding: matrix and vector operations, loading a CSV file
  • Cluster Analysis and Unsupervised Machine Learning in Python will provide you with sufficient background


TIPS (for getting through the course):

  • Watch it at 2x.
  • Take handwritten notes. This will drastically increase your ability to retain the information.
  • Write down the equations. If you don't, I guarantee it will just look like gibberish.
  • Ask lots of questions on the discussion board. The more the better!
  • Realize that most exercises will take you days or weeks to complete.
  • Write code yourself, don't just sit there and look at my code.


USEFUL COURSE ORDERING:

  • (The Numpy Stack in Python)
  • Linear Regression in Python
  • Logistic Regression in Python
  • (Supervised Machine Learning in Python)
  • (Bayesian Machine Learning in Python: A/B Testing)
  • Deep Learning in Python
  • Practical Deep Learning in Theano and TensorFlow
  • (Supervised Machine Learning in Python 2: Ensemble Methods)
  • Convolutional Neural Networks in Python
  • (Easy NLP)
  • (Cluster Analysis and Unsupervised Machine Learning)
  • Unsupervised Deep Learning
  • (Hidden Markov Models)
  • Recurrent Neural Networks in Python
  • Artificial Intelligence: Reinforcement Learning in Python
  • Natural Language Processing with Deep Learning in Python
Who is the target audience?
  • Students and professionals who do data analysis, especially on sequence data
  • Professionals who want to optimize their website experience
  • Students who want to strengthen their machine learning knowledge and practical skillset
  • Students and professionals interested in DNA analysis and gene expression
  • Students and professionals interested in modeling language and generating text from a model
Curriculum For This Course
49 Lectures
05:45:57
Introduction and Outline
4 Lectures 13:49


Where to get the Code and Data
01:19

How to Succeed in this Course
05:28
Markov Models
3 Lectures 14:42

Markov Models
04:50

The Math of Markov Chains
05:13
Markov Models: Example Problems and Applications
5 Lectures 33:22
Example Problem: Sick or Healthy
03:26

Example Problem: Expected number of continuously sick days
02:53

Example application: SEO and Bounce Rate Optimization
08:53

Example Application: Build a 2nd-order language model and generate phrases
13:06

Example Application: Google’s PageRank algorithm
05:04
Hidden Markov Models for Discrete Observations
14 Lectures 01:21:49
From Markov Models to Hidden Markov Models
06:02

HMMs are Doubly Embedded
01:59

How can we choose the number of hidden states?
04:22

The Forward-Backward Algorithm
04:27

Visual Intuition for the Forward Algorithm
03:32

The Viterbi Algorithm
02:57

Visual Intuition for the Viterbi Algorithm
03:16

The Baum-Welch Algorithm
02:38

Baum-Welch Explanation and Intuition
06:34

Baum-Welch Updates for Multiple Observations
04:53

Discrete HMM in Code
20:33

The underflow problem and how to solve it
05:05

Discrete HMM Updates in Code with Scaling
11:53

Scaled Viterbi Algorithm in Log Space
03:38
Discrete HMMs Using Deep Learning Libraries
6 Lectures 54:10
Gradient Descent Tutorial
04:30

Theano Scan Tutorial
12:40

Discrete HMM in Theano
11:42

Improving our Gradient Descent-Based HMM
05:09

Tensorflow Scan Tutorial
12:42

Discrete HMM in Tensorflow
07:27
HMMs for Continuous Observations
6 Lectures 01:00:34
Gaussian Mixture Models with Hidden Markov Models
04:12

Generating Data from a Real-Valued HMM
06:35

Continuous-Observation HMM in Code (part 1)
18:37

Continuous-Observation HMM in Code (part 2)
05:12

Continuous HMM in Theano
16:32

Continuous HMM in Tensorflow
09:26
HMMs for Classification
2 Lectures 13:06
Generative vs. Discriminative Classifiers
02:30

HMM Classification on Poetry Data (Robert Frost vs. Edgar Allan Poe)
10:36
Bonus Example: Parts-of-Speech Tagging
2 Lectures 10:58

Note:

Data is from: http://www.cnts.ua.ac.be/conll2000/chunking/

Code is in the same repo as this course, but in the folder nlp_class2

Parts-of-Speech Tagging Concepts
05:00

POS Tagging with an HMM
05:58
Appendix
7 Lectures 01:03:27
Review of Gaussian Mixture Models
03:04

Theano Tutorial
07:47

Tensorflow Tutorial
07:27

How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow
17:32

How to Code by Yourself (part 1)
15:54

How to Code by Yourself (part 2)
09:23

BONUS: Where to get Udemy coupons and FREE deep learning material
02:20
About the Instructor
Lazy Programmer Inc.
4.6 Average rating
12,471 Reviews
65,996 Students
19 Courses
Data scientist and big data engineer

I am a data scientist, big data engineer, and full stack software engineer.

For my master's thesis I worked on brain-computer interfaces using machine learning. These help non-verbal and non-mobile persons communicate with their family and caregivers.

I have worked in online advertising and digital media as both a data scientist and big data engineer, and built various high-throughput web services around said data. I've created new big data pipelines using Hadoop/Pig/MapReduce. I've created machine learning models to predict click-through rate and news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering, and I validated the results using A/B testing.

I have taught data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics to undergraduate and graduate students attending universities such as Columbia University, NYU, Humber College, and The New School.

Multiple businesses have benefitted from my web programming expertise. I do all the backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Some of the technologies I've used are: Python, Ruby/Rails, PHP, Bootstrap, jQuery (Javascript), Backbone, and Angular. For storage/databases I've used MySQL, Postgres, Redis, MongoDB, and more.