Introduction, Outline, and Review

Lazy Programmer Team
A free video tutorial from Lazy Programmer Team
Artificial Intelligence and Machine Learning Engineer
4.6 instructor rating • 14 courses • 152,170 students

Learn more from the full course

Natural Language Processing with Deep Learning in Python

Complete guide on deriving and implementing word2vec, GloVe, word embeddings, and sentiment analysis with recursive nets

12:06:49 of on-demand video • Updated January 2021

  • Understand and implement word2vec
  • Understand the CBOW method in word2vec
  • Understand the skip-gram method in word2vec
  • Understand the negative sampling optimization in word2vec
  • Understand and implement GloVe using gradient descent and alternating least squares
  • Use recurrent neural networks for parts-of-speech tagging
  • Use recurrent neural networks for named entity recognition
  • Understand and implement recursive neural networks for sentiment analysis
  • Understand and implement recursive neural tensor networks for sentiment analysis
  • Use Gensim to obtain pretrained word vectors and compute similarities and analogies (see the short sketch after this list)
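As a taste of the last learning point above, here is a minimal sketch of the Gensim workflow for pretrained vectors, similarities, and analogies. The model name is one of several available through gensim-data and is my example choice, not necessarily the one used in the course.

```python
# Minimal sketch: load pretrained GloVe vectors via gensim-data and
# query similarities and analogies. Downloads on first use.
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")  # example model, ~66 MB

# Cosine similarity between two words.
print(vectors.similarity("dog", "cat"))

# The classic analogy: king - man + woman ~ queen.
print(vectors.most_similar(positive=["king", "woman"],
                           negative=["man"], topn=1))
```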
Welcome, everyone, to Natural Language Processing with Deep Learning in Python. In this lecture, I'm going to give you a high-level overview of this course.

Firstly, let's talk about what this course is all about. This course is all about specific developments in the field of deep learning that have led to massive increases in performance in NLP, as you may have heard. Deep learning, which is the study of neural networks, has revolutionized our understanding of machine learning in the past decade. NLP, being a subset of machine learning, has also been a beneficiary of this revolution. Before deep learning, the main techniques in NLP were the bag-of-words model and techniques like TF-IDF, naive Bayes, and support vector machines. Granted, these techniques are not bad at all. In fact, if I had to build a quick, robust, simple system today, these would be options worth considering. In advanced areas of NLP, we used techniques like hidden Markov models to do things like speech recognition and parts-of-speech tagging.

However, there is a problem with techniques like bag of words. Consider the phrases "dog toy" and "toy dog." You and I know that these are different things, but in a bag-of-words model, order doesn't matter, and so these would be treated the same. Modeling sentences as sequences and as hierarchies has led to state-of-the-art improvements over the previous go-to techniques. Embeddings are another concept that has revolutionized NLP. These give words a neural representation, so that words can be plugged into a neural network just like any other feature vector. In this course, we will look at a few word embedding techniques pioneered at world-leading institutions such as Google and Stanford.

Thanks to advancements in deep learning, we now have state-of-the-art systems in speech-to-text, machine translation, sentiment analysis, text generation, and even speech generation. Recently, Google announced that they were able to automate entire phone calls without the person on the other end even knowing that they were talking to a robot. So, for example, you might instruct your Google Assistant to make a reservation at a restaurant.

This course starts out with one of the most important advances in deep NLP research, which is word embeddings. Of course, before doing this, we'll do a quick review to get you up to speed on all the knowledge you'll need to make it through the rest of the course. Word embeddings allow you to map words into a vector space. Once you can represent something as a vector, you can perform arithmetic on it. This is where the famous "king minus man equals queen minus woman" comes from. We'll be looking at two of the most popular algorithms for finding word embeddings: word2vec and GloVe. Next, we'll look at how word2vec and GloVe, although developed independently and seemingly totally different on the surface, are actually very similar.

Next, we'll move on to using deep neural networks for NLP. Of course, the central architecture used in deep NLP is the RNN, the recurrent neural network. Recurrent neural networks are special kinds of neural networks that allow us to model sequences. And the reason why that's useful, of course, is because a sentence is nothing but a sequence of words; the short sketch below illustrates the idea.
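Here is a minimal numpy sketch of the recurrent idea, assuming toy sizes and random weights of my own choosing (not anything from the course): a single hidden state is updated once per word, so the sentence is genuinely processed as an ordered sequence.

```python
# Toy recurrent cell: one hidden-state update per word in the sentence.
import numpy as np

D, H = 4, 3                          # embedding size, hidden size (toy values)
rng = np.random.default_rng(0)
Wx = rng.normal(size=(D, H))         # input-to-hidden weights
Wh = rng.normal(size=(H, H))         # hidden-to-hidden weights

sentence = rng.normal(size=(5, D))   # stand-in for 5 word-embedding vectors
h = np.zeros(H)
for x in sentence:                   # one update per word, in order
    h = np.tanh(x @ Wx + h @ Wh)
print(h)                             # final state summarizes the whole sequence
```

Because the same weights are reused at every step, the loop handles sentences of any length, which is exactly why RNNs suit variable-length text.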
After that, we'll look at an even more powerful type of deep neural network for NLP. You might notice that sentences come in all shapes and sizes. Some are very short, like one or two words long, but you can have very long sentences, like one that's 30 words long. How do we as humans make sense of such long sentences? Well, the truth is we don't conceptualize sentences as sequences of words; rather, we give them a hierarchical structure. A sentence is made up of a group of phrases, and each of those phrases could be made up of smaller phrases. We don't make sense of sentences by thinking of each word from start to end, but rather by the relationships between these phrases. In other words, a sentence is more like a tree, and therefore one might infer that a neural network shaped like a tree, a recursive neural network, might be the best type of neural network for tasks such as text classification. In fact, recursive neural networks have led to state-of-the-art results (there's a minimal sketch at the end of this transcript). This is one of the most technically challenging sections of any of my deep learning courses, so it's really going to test your limits when it comes to understanding algorithms and your TensorFlow coding ability.

Just to summarize this lecture and give you a short overview of the outline of this course: first, we're going to look at word embeddings, which involves learning about famous algorithms like word2vec and GloVe. Second, we're going to look at how RNNs can be used to solve deep NLP tasks. And finally, we're going to look at how recursive neural networks, a much more powerful but technically challenging type of model, can be used to obtain superior results on deep NLP tasks. Thanks for listening, and I'll see you in class.
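As promised above, here is a minimal sketch of the recursive (tree-structured) idea, assuming a toy binary parse where one shared weight matrix composes two child vectors into a parent vector; all sizes and weights are made up for illustration, not course code.

```python
# Toy recursive composition: a parent vector is built from its children.
import numpy as np

D = 4                                # dimensionality of every node vector
rng = np.random.default_rng(1)
W = rng.normal(size=(2 * D, D))      # shared composition weights (toy values)

def compose(left, right):
    # Combine two child vectors into one parent vector of the same size,
    # so composition can be applied recursively up the parse tree.
    return np.tanh(np.concatenate([left, right]) @ W)

toy = rng.normal(size=D)
dog = rng.normal(size=D)

# Unlike bag of words, order matters here: "toy dog" and "dog toy"
# get different vectors because concatenation is not symmetric.
print(compose(toy, dog))
print(compose(dog, toy))
```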