
Logistic Regression in R - Step 4

A free video tutorial from Kirill Eremenko
Data Scientist
Rating: 4.5 out of 5 (instructor rating)
59 courses
2,618,168 students

Lecture description

Evaluate the performance of the logistic regression model by building a confusion matrix, which counts the number of correct and incorrect predictions and can be created in a single line of code with R's 'table' function.
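For illustration, a one-line confusion matrix with R's table function could look like this (a minimal sketch using toy vectors, not the course dataset):

```r
# Toy example: count correct and incorrect predictions in one line
real <- c(0, 0, 1, 1, 1, 0)   # real classes
pred <- c(0, 1, 1, 1, 0, 0)   # predicted classes
table(real, pred)             # rows = real classes, columns = predicted classes
```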

Learn more from the full course

Machine Learning A-Z™: AI, Python & R + ChatGPT Bonus [2023]

Learn to create Machine Learning Algorithms in Python and R from two Data Science experts. Code templates included.

42:14:38 of on-demand video • Updated November 2023

Master Machine Learning on Python & R
Have a great intuition of many Machine Learning models
Make accurate predictions
Make powerful analyses
Make robust Machine Learning models
Create strong added value for your business
Use Machine Learning for personal purposes
Handle specific topics like Reinforcement Learning, NLP and Deep Learning
Handle advanced techniques like Dimensionality Reduction
Know which Machine Learning model to choose for each type of problem
Build an army of powerful Machine Learning models and know how to combine them to solve any problem
Transcript (English)
Hello and welcome to this R tutorial. In the previous tutorial we predicted our test set results, and now we are going to evaluate those predictions by making the confusion matrix, which will count the number of correct predictions and the number of incorrect predictions.

So let's make the matrix. It's very simple, it will only take us one line. Let's call it cm and assign it using the table function in R. In the table function I first input the real values, which are in the test set, so I write test_set and then select the column of the real results. If we look at the test set, that column contains the real results, whether the user bought the SUV, yes or no, and it has index 3. So here I put a comma and then 3. All right, that's my first argument, the real values. Then as the second argument I input the predicted values, which is of course the y_pred vector. So here I add y_pred, and I don't have to select an index because y_pred is already the vector of predictions. In short, the first argument is the vector of real values for all the observations, and the second is the vector of predictions for those same observations.

OK, and that's it, the confusion matrix is ready. So now let's select this line and execute it; the table is created. Now let's go to the console and have a look at it by typing cm. Here it is. The most important thing to understand here is that the 57 and the 26 are the correct predictions, and the 10 and the 7 are the incorrect predictions. So what's interesting at first sight is that the classifier made 57 plus 26 equals 83 correct predictions and 10 plus 7 equals 17 incorrect predictions on the test set. That's not bad, but we can do better, and we will do better with other classifiers. You'll see that in the next sections.

OK, so we're done with the confusion matrix. And now it's time for the best part, because in the next tutorial we will look at our results graphically: we will plot a very cool chart that will allow us to make an awesome interpretation of the results. So I look forward to seeing you in the next tutorial, where we will make this chart. Until then, enjoy machine learning!
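For reference, here is a minimal sketch of this step, assuming the variable names used earlier in the course template: a test_set data frame whose third column holds the real results, and a y_pred vector of 0/1 predictions built in the previous step.

```r
# Confusion matrix of real values vs. predicted values.
# Assumes test_set (real results in column 3) and y_pred (vector of
# predictions for the same observations) already exist from the
# previous steps of this section.
cm = table(test_set[, 3], y_pred)
cm
# Rows are the real classes, columns are the predicted classes:
# the diagonal entries are the correct predictions, the
# off-diagonal entries are the incorrect ones.

# Optional: overall accuracy on the test set, e.g. 83 / 100 = 0.83
accuracy = (cm[1, 1] + cm[2, 2]) / sum(cm)
accuracy
```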