
# Machine Learning and AI: Support Vector Machines in Python

Artificial Intelligence and Data Science Algorithms in Python for Classification and Regression
4.6 (452 ratings)
4,645 students enrolled
Last updated 6/2020
English
English [Auto-generated]
30-Day Money-Back Guarantee
This course includes
• 9 hours on-demand video
• Access on mobile and TV
• Certificate of Completion

What you'll learn
• Apply SVMs to practical applications: image recognition, spam detection, medical diagnosis, and regression analysis
• Understand the theory behind SVMs from scratch (basic geometry)
• Use Lagrangian Duality to derive the Kernel SVM
• Understand how Quadratic Programming is applied to SVM
• Implement Support Vector Regression (SVR)
• Use the Polynomial Kernel, Gaussian Kernel, and Sigmoid Kernel
• Build your own RBF Network and other Neural Networks based on SVM
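To give a taste of the kernel material above, here is a minimal numpy sketch (my own illustration for this page, not the course's code) of the Gaussian (RBF) kernel, plus a quick numerical check of Mercer's condition — any kernel matrix it produces should be symmetric positive semidefinite:

```python
import numpy as np

# Gaussian (RBF) kernel: K(x, z) = exp(-gamma * ||x - z||^2).
# It corresponds to an inner product in an infinite-dimensional
# feature space (derived in full in the course).
def rbf_kernel(X, Z, gamma=0.5):
    # Pairwise squared distances via ||x - z||^2 = x.x - 2 x.z + z.z
    sq = (X**2).sum(1)[:, None] - 2 * X @ Z.T + (Z**2).sum(1)[None, :]
    return np.exp(-gamma * sq)

rng = np.random.default_rng(42)
X = rng.normal(size=(20, 3))
K = rbf_kernel(X, X)

# Mercer's condition, checked numerically: symmetric and PSD
print(np.allclose(K, K.T))                   # True
print(np.linalg.eigvalsh(K).min() >= -1e-9)  # True
```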
Course content
73 lectures • 08:53:22 total length
+ Welcome (4 lectures, 18:51)
• Preview (02:20)
• Preview (04:54)
• Preview (05:49)
• Where to get the code and data (05:48)
+ Beginner's Corner (8 lectures, 49:38)
• Beginner's Corner: Section Introduction (05:18)
• Image Classification with SVMs (06:00)
• Spam Detection with SVMs (11:47)
• Medical Diagnosis with SVMs (05:15)
• Regression with SVMs (05:35)
• Cross-Validation (07:20)
• How do you get the data? How do you process the data? (05:21)
• Suggestion Box (03:02)
+ Review of Linear Classifiers (7 lectures, 50:14)
• Basic Geometry (10:51)
• Normal Vectors (03:41)
• Logistic Regression Review (09:45)
• Loss Function and Regularization (04:09)
• Prediction Confidence (07:25)
• Nonlinear Problems (09:58)
• Linear Classifiers Section Conclusion (04:25)
+ Linear SVM (10 lectures, 01:06:27)
• Linear SVM Section Introduction and Outline (03:18)
• Linear SVM Problem Setup and Definitions (04:30)
• Margins (08:51)
• Linear SVM Objective (11:00)
• (12:31)
• Slack Variables (07:25)
• Hinge Loss (and its Relationship to Logistic Regression) (06:22)
• (03:10)
• Linear SVM with Gradient Descent (Code) (05:06)
• Linear SVM Section Summary (04:14)
+ Duality (7 lectures, 43:47)
• Duality Section Introduction (03:43)
• Duality and Lagrangians (part 1) (13:01)
• Lagrangian Duality (part 2) (07:08)
• Relationship to Linear Programming (04:19)
• Predictions and Support Vectors (09:16)
• Why Transform Primal to Dual? (03:26)
• Duality Section Conclusion (02:54)
+ Kernel Methods (9 lectures, 51:14)
• Kernel Methods Section Introduction (03:47)
• The Kernel Trick (08:11)
• Polynomial Kernel (06:06)
• Gaussian Kernel (05:13)
• Using the Gaussian Kernel (07:09)
• Why does the Gaussian Kernel correspond to infinite-dimensional features? (04:39)
• Other Kernels (07:04)
• Mercer's Condition (06:24)
• Kernel Methods Section Summary (02:41)
+ Implementations and Extensions (8 lectures, 54:21)
• Dual with Slack Variables (10:40)
• Simple Approaches to Implementation (06:25)
• SVM with Projected Gradient Descent Code (08:19)
• Kernel SVM Gradient Descent with Primal (Theory) (04:30)
• Kernel SVM Gradient Descent with Primal (Code) (04:55)
• SMO (Sequential Minimal Optimization) (09:32)
• Support Vector Regression (05:26)
• Multiclass Classification (04:34)
+ Neural Networks (Beginner's Corner 2) (8 lectures, 50:24)
• Neural Networks Section Introduction (02:41)
• RBF Networks (15:38)
• RBF Approximations (08:38)
• What Happened to Infinite Dimensionality? (02:53)
• (03:53)
• Relationship to Deep Learning Neural Networks (06:50)
• Neural Network-SVM Mashup (07:15)
• Neural Networks Section Conclusion (02:36)
+ Appendix / FAQ (12 lectures, 02:28:26)
• What is the Appendix? (02:48)
• Windows-Focused Environment Setup 2018 (20:20)
• How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:30)
• Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced? (22:04)
• How to Succeed in this Course (Long Version) (10:24)
• How to Code by Yourself (part 1) (15:54)
• How to Code by Yourself (part 2) (09:23)
• Proof that using Jupyter Notebook is the same as not using it (12:29)
• Python 2 vs Python 3 (04:38)
• What order should I take your courses in? (part 1) (11:18)
• What order should I take your courses in? (part 2) (16:07)
• [Bonus] Where to get discount coupons and FREE deep learning material (05:31)
Requirements
• Calculus, Matrix Arithmetic / Geometry, Basic Probability
• Python and Numpy coding
• Logistic Regression
Description

Support Vector Machines (SVMs) are among the most powerful machine learning models around, and they are a topic that students have requested ever since I started making courses.

These days, everyone seems to be talking about deep learning, but in fact there was a time when support vector machines were seen as superior to neural networks. One of the things you’ll learn in this course is that a support vector machine is actually a type of neural network: draw both as diagrams and they look essentially identical.

The toughest obstacle to overcome when you’re learning about support vector machines is that they are very theoretical. That theory easily scares a lot of people away, and it can make learning about support vector machines feel beyond your ability. Not so!

In this course, we take a very methodical, step-by-step approach to build up all the theory you need to understand how the SVM really works. Our starting point is Logistic Regression, one of the very first things you learn as a student of machine learning. So to follow this course, you just need a good intuition about Logistic Regression and, by extension, a good understanding of the geometry of lines, planes, and hyperplanes.

This course will cover the critical theory behind SVMs:

• Linear SVM derivation

• Hinge loss (and its relation to the Cross-Entropy loss)

• Quadratic programming (and Linear programming review)

• Slack variables

• Lagrangian Duality

• Kernel SVM (nonlinear SVM)

• Polynomial Kernels, Gaussian Kernels, Sigmoid Kernels, and String Kernels

• Infinite-dimensional feature expansions (and how to achieve them)

• SMO (Sequential Minimal Optimization)

• RBF Networks (Radial Basis Function Neural Networks)

• Support Vector Regression (SVR)

• Multiclass Classification
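To give a flavor of the hinge-loss item above, here is a tiny numpy sketch (my own illustration, not the course's code) comparing the hinge loss with logistic regression's cross-entropy loss, both written in terms of the margin m = y·f(x) for labels y in {-1, +1}:

```python
import numpy as np

def hinge_loss(m):
    """max(0, 1 - m): zero once a point is classified correctly with
    margin at least 1; linear penalty otherwise."""
    return np.maximum(0.0, 1.0 - m)

def logistic_loss(m):
    """log(1 + exp(-m)): logistic regression's cross-entropy loss,
    expressed in terms of the same margin m."""
    return np.log1p(np.exp(-m))

m = np.array([-2.0, 0.0, 1.0, 3.0])
print(hinge_loss(m))     # [3. 1. 0. 0.]
print(logistic_loss(m))  # smooth and strictly positive, but a similar
                         # shape: both heavily penalize small margins
```

The full relationship between these two losses is derived in the course; this snippet only shows the functions side by side.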

For those of you who are thinking, "theory is not for me", there’s lots of material in this course for you too!

In this course, there will be not just one but two full sections devoted to the practical aspects of making effective use of the SVM.

We’ll do end-to-end examples of real, practical machine learning applications, such as:

• Image recognition

• Spam detection

• Medical diagnosis

• Regression analysis
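As a rough preview of what an end-to-end example can look like, here is a minimal scikit-learn sketch (the course builds up its own code and datasets; the built-in dataset and parameters here are just placeholders standing in for "medical diagnosis") that cross-validates an RBF-kernel SVM:

```python
# Minimal end-to-end classification sketch with scikit-learn's SVC.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# A built-in dataset used as a stand-in for a medical diagnosis task
X, y = load_breast_cancer(return_X_y=True)

# Scale features, then fit an RBF-kernel SVM
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))

# 5-fold cross-validation, as covered in the Beginner's Corner section
scores = cross_val_score(model, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```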

For more advanced students, there are also plenty of coding exercises where you will get to try different approaches to implementing SVMs.
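For instance, one of the simplest such exercises is to train a linear SVM by (sub)gradient descent on the primal objective, (1/2)‖w‖² + C·Σ max(0, 1 − yᵢ(w·xᵢ + b)). The sketch below is my own minimal illustration on toy data, not the course's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data with labels in {-1, +1}
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(+2, 1, (50, 2))])
y = np.array([-1] * 50 + [+1] * 50)

w, b, C, lr = np.zeros(2), 0.0, 1.0, 0.01
for _ in range(1000):
    margins = y * (X @ w + b)
    active = margins < 1  # points that violate the margin
    # Subgradient of (1/2)||w||^2 + C * sum of hinge losses
    grad_w = w - C * (y[active, None] * X[active]).sum(axis=0)
    grad_b = -C * y[active].sum()
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = np.mean(np.sign(X @ w + b) == y)
print(f"training accuracy: {accuracy:.2f}")  # high on this easy toy problem
```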

These are implementations that you won't find anywhere else in any other course.

Thanks for reading, and I’ll see you in class!

Suggested Prerequisites:

• Calculus

• Matrix Arithmetic / Geometry

• Basic Probability

• Logistic Regression

• Python coding: if/else, loops, lists, dicts, sets

TIPS (for getting through the course):

• Watch it at 2x.

• Take handwritten notes. This will drastically increase your ability to retain the information.

• Write down the equations. If you don't, I guarantee it will just look like gibberish.

• Ask lots of questions on the discussion board. The more the better!

• The best exercises will take you days or weeks to complete.

• Write code yourself, don't just sit there and look at my code. This is not a philosophy course!

WHAT ORDER SHOULD I TAKE YOUR COURSES IN?:

• Check out the lecture "What order should I take your courses in?" (available in the Appendix of any of my courses, including the free Numpy course)

Who this course is for:
• Beginners who want to know how to use the SVM for practical problems
• Experts who want to know all the theory behind the SVM
• Professionals who want to know how to effectively tune the SVM for their application