Introduction and Outline: Why would you want to use an HMM?

Lazy Programmer Inc.
A free video tutorial from Lazy Programmer Inc.
Artificial intelligence and machine learning engineer
4.6 instructor rating • 28 courses • 404,605 students

Learn more from the full course

Unsupervised Machine Learning Hidden Markov Models in Python

HMMs for stock price analysis, language modeling, web analytics, biology, and PageRank.

09:04:53 of on-demand video • Updated July 2020

  • Understand and enumerate the various applications of Markov Models and Hidden Markov Models
  • Understand how Markov Models work
  • Write a Markov Model in code
  • Apply Markov Models to any sequence of data
  • Understand the mathematics behind Markov chains
  • Apply Markov models to language
  • Apply Markov models to website analytics
  • Understand how Google's PageRank works
  • Understand Hidden Markov Models
  • Write a Hidden Markov Model in code
  • Write a Hidden Markov Model using Theano
  • Understand how gradient descent, which is normally used in deep learning, can be used for HMMs
Hey guys, and welcome to Unsupervised Machine Learning: Hidden Markov Models in Python. In this class, we're of course going to learn about Hidden Markov models, which are used for modeling sequences of data. Sequences appear everywhere: stock prices, language, credit scoring, and web page visits. A lot of the time we're dealing with sequences in machine learning and we don't even realize it, or we ignore the fact that the data came from a sequence.

Consider the following sentence: "like and the cats dog." Of course this sentence doesn't make any sense, and that's what happens when you use a model like bag of words. The fact that it becomes very hard to tell what a sentence means when you take away the time aspect tells you that there's a lot of information carried there. The original sentence was "I like cats and dogs," and you could have probably decoded that yourself, but you can imagine how this would get progressively harder as the sentence gets longer.

In this course, we're going to start with the very basic Markov model: nothing hidden. These are by themselves very useful for modeling sequences, as you'll see. We'll talk about the mathematical properties of a Markov model and go through a ton of examples so you can really see how they are used. Google's PageRank algorithm is based on Markov models, so you know that despite being old technology, Markov models are still very useful and very relevant today. We'll also talk about how to model language and how to analyze web visitor data so you can fix problems like a high bounce rate.

Next, we'll look at the Hidden Markov model. This will be much more complex mathematically, but the first section should prepare you. We'll look at the three basic problems for Hidden Markov models, which are: number one, predicting the probability of a sequence; number two, predicting the most likely sequence of hidden states given an observed sequence; and number three, how to train a Hidden Markov model.

Typical courses that teach Hidden Markov models stop there, but I know a lot of you guys are interested in deep learning, and this course is basically a lead-in to my next course on sequences, which will teach you about recurrent neural networks. You know that gradient descent is the main technique we use to train our neural networks, but I've said that it can be used to optimize any function. We'll see how this can be true by using gradient descent to train our HMMs. Typically the expectation-maximization algorithm is used, and we're going to do this too, but we'll see how gradient descent makes things much easier. To write this code we'll make use of the deep learning library called Theano, which can automatically calculate gradients. I'm going to show you the very useful scan function in Theano, which is a critical part of recurrent neural networks. It will be good to learn it now so you can be more comfortable with it later.

Finally, after looking at plain HMMs, which are used for modeling discrete observations like rolling a die or flipping a coin, we'll look at how HMMs can be used to model continuous observations, specifically by combining HMMs and Gaussian mixture models, or GMMs. We learned about these in my previous unsupervised learning course on cluster analysis. We will end this course by looking at even more practical examples of how HMMs can be used.

If you have a question, I am always around to help; just write your question on the discussion board and I will answer it there. This is very useful to other students who may have the same question.
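As a quick preview of the kind of code we'll write in this course, here is a minimal NumPy sketch of the two ideas above. The two-state weather example, the numbers, and the function names are invented purely for illustration; the course builds all of this up properly from scratch. The first function scores a fully observed sequence under a plain Markov model; the second uses the forward algorithm to answer problem number one for a Hidden Markov model.

import numpy as np

# Hypothetical two-state "weather" example; states, probabilities, and
# function names are made up for illustration only.
states = ["sunny", "rainy"]
pi = np.array([0.6, 0.4])        # initial state distribution
A = np.array([[0.8, 0.2],        # A[i, j] = p(next state = j | current state = i)
              [0.5, 0.5]])

def markov_logprob(seq):
    # Log-probability of a fully observed state sequence under the plain Markov model.
    logp = np.log(pi[seq[0]])
    for prev, curr in zip(seq[:-1], seq[1:]):
        logp += np.log(A[prev, curr])
    return logp

# For a *hidden* Markov model the states are not observed; each hidden state
# emits an observation according to B. The forward algorithm computes the
# probability of an observed sequence by summing over all hidden paths.
B = np.array([[0.7, 0.3],        # B[i, k] = p(observation k | hidden state i)
              [0.1, 0.9]])

def forward_logprob(obs):
    alpha = pi * B[:, obs[0]]            # alpha_1(j) = pi_j * B[j, o_1]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]    # alpha_{t+1}(j) = sum_i alpha_t(i) * A[i, j] * B[j, o_{t+1}]
    return np.log(alpha.sum())

print(markov_logprob([0, 0, 1, 1]))      # e.g. sunny -> sunny -> rainy -> rainy
print(forward_logprob([0, 1, 1]))        # e.g. observation symbols 0, 1, 1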
Please also write to me with your suggestions for the course. Every single one of my courses has been updated since I first created it, thanks to feedback from my students. I plan to add more examples, more explanations if something is confusing, and more exercises so you can test whether or not you really know your stuff. See you in the next lecture.