Find online courses made by experts from around the world.
Take your courses with you and learn anywhere, anytime.
Learn and practice real-world skills and achieve your goals.
This course covers the fundamental concepts of machine learning, focusing on neural networks, SVMs and decision trees. These topics are in high demand nowadays because such learning algorithms can be applied in many fields, from software engineering to investment banking. Learning algorithms can recognize patterns, which can help detect cancer, for example, or we can construct algorithms that make a reasonably good guess about stock price movements in the market.
In each section we will discuss the theoretical background of these algorithms, and then we will implement the solutions together.
The first chapter is about regression: a very simple yet powerful and widely used machine learning technique. We will then talk about Naive Bayes classification and tree-based algorithms such as decision trees and random forests. These are more sophisticated algorithms that sometimes work well and sometimes do not. The last chapters cover SVMs and neural networks: among the most important approaches in machine learning.
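To give a taste of the implementation side, here is a minimal sketch of the course's first topic, linear regression, using scikit-learn (which ships with the Anaconda distribution installed in Lecture 3). The dataset here is invented purely for illustration and is not from the course materials.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy dataset (illustrative only): apartment sizes in m^2 and their prices
X = np.array([[50], [60], [80], [100], [120]])
y = np.array([150, 180, 240, 300, 360])

# Fit a straight line y = slope * x + intercept to the data
model = LinearRegression()
model.fit(X, y)

print(model.coef_[0], model.intercept_)  # learned slope and intercept
print(model.predict([[90]])[0])          # predicted price for a 90 m^2 apartment
```

The same `fit`/`predict` pattern carries over to the classifiers covered later in the course (k-nearest neighbors, Naive Bayes, SVMs, random forests), which is what makes scikit-learn a convenient playground for comparing them.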
Not for you? No problem.
30-day money-back guarantee.
Forever yours.
Lifetime access.
Learn on the go.
Desktop, iOS and Android.
Get rewarded.
Certificate of completion.
Section 1: Introduction
Lecture 1: Introduction (01:41)
Lecture 2: Introduction to machine learning (06:57)
Lecture 3: Installing Anaconda (02:08)
Lecture 4: Datasets we will use (00:56)

Section 2: Regression
Lecture 5: Linear regression introduction (09:24)
Lecture 6: Linear regression with gradient descent (08:41)
Lecture 7: Linear regression example I (08:17)
Lecture 8: Linear regression example II (03:56)
Lecture 9: Logistic regression introduction (10:15)
Lecture 10: Logistic regression introduction II - illustration (04:30)
Lecture 11: Cross validation (03:29)
Lecture 12: Logistic regression example I - sigmoid function (11:22)
Lecture 13: Logistic regression example II (05:48)
Lecture 14: Logistic regression example III - credit scoring (09:59)

Section 3: K-Nearest Neighbor Classifier
Lecture 15: K-nearest neighbor introduction (11:23)
Lecture 16: K-nearest neighbor introduction - normalizing data (03:40)
Lecture 17: K-nearest neighbor example I - simple problem (04:55)
Lecture 18: K-nearest neighbor example II - credit scoring (04:18)

Section 4: Naive Bayes Classifier
Lecture 19: Naive Bayes introduction (09:03)
Lecture 20: Naive Bayes example I - simple example (02:40)
Lecture 21: Naive Bayes example II - credit scoring (02:52)
Lecture 22: Naive Bayes example III - text clustering (19:29)

Section 5: Support Vector Machine (SVM)
Lecture 23: Support vector machine introduction I - linear case (08:50)
Lecture 24: Support vector machine introduction II - non-linear case (07:18)
Lecture 25: Support vector machine introduction III - kernels (04:23)
Lecture 26: Support vector machine example I - simple (04:07)
Lecture 27: Support vector machine example II - iris dataset (08:24)
Lecture 28: Support vector machine example III - character recognition (11:58)

Section 6: Tree Based Algorithms
Lecture 29: Decision trees introduction (09:03)
Lecture 30: Decision trees example I (02:20)
Lecture 31: Decision trees example II - iris data (06:11)
Lecture 32: Pruning and bagging (05:07)
Lecture 33: Random forests introduction (03:43)
Lecture 34: Boosting (02:55)
Lecture 35: Random forests example I - simple example (02:18)
Lecture 36: Random forests example II - credit scoring (03:14)
Lecture 37: Random forests example III - iris dataset (03:05)

Section 7: Clustering
Lecture 38: Principal component analysis introduction (03:47)
Lecture 39: Principal component analysis example (04:32)
Lecture 40: K-means clustering introduction I (06:10)
Lecture 41: K-means clustering introduction II (04:03)
Lecture 42: K-means clustering example (05:55)
Lecture 43: DBSCAN introduction (04:56)
Lecture 44: Hierarchical clustering introduction (06:07)
Lecture 45: Hierarchical clustering example (05:16)

Section 8: Neural Networks
Lecture 46: NEURAL NETWORKS INTRODUCTION (00:01)
Lecture 47: Axons and neurons in the human brain (08:22)
Lecture 48: Modeling the human brain (07:24)
Lecture 49: Learning paradigms (02:58)
Lecture 50: Artificial neurons - the model (06:57)
Lecture 51: Artificial neurons - activation functions (06:16)
Lecture 52: Artificial neurons - an example (05:00)
Lecture 53: Neural networks - the big picture (04:33)
Lecture 54: Applications of neural networks (02:12)
Lecture 55: BACKPROPAGATION (00:01)
Lecture 56: Feedforward neural networks (08:10)
Lecture 57: Optimization - cost function (10:40)
Lecture 58: Simplified feedforward network (08:07)
Lecture 59: Feedforward neural network topology (06:04)
Lecture 60: The learning algorithm (05:17)
Lecture 61: Error calculation (06:06)
Lecture 62: Gradient calculation I - output layer (08:21)
Lecture 63: Gradient calculation II - hidden layer (03:49)
Lecture 64: Backpropagation (05:18)
Lecture 65: Backpropagation II (01:59)
Lecture 66: Applications of neural networks I - character recognition (04:06)
Lecture 67: Applications of neural networks II - stock market forecast (04:10)
Lecture 68: Deep learning (04:11)
Lecture 69: IMPLEMENTATION (00:01)
Lecture 70: Building networks (06:06)
Lecture 71: Building networks II (05:53)
Lecture 72: Handling datasets (03:20)
Lecture 73: Neural network example I - XOR problem (07:46)
Lecture 74: Neural network example II - iris dataset (07:24)

Section 9: Face Detection
Lecture 75: Face detection introduction (04:00)
Lecture 76: Installing OpenCV (04:22)
Lecture 77: CascadeClassifier (08:48)
Lecture 78: CascadeClassifier parameters (04:07)
Lecture 79: Tuning the parameters (03:38)

Section 10: Source Code & Data
Lecture 80: Source code (00:02)
Lecture 81: Data (00:02)
Lecture 82: Slides (00:01)
Lecture 83: Coupon codes - get any of my courses for a discounted price (00:04)
Hi!
My name is Balazs Holczer. I am from Budapest, Hungary. I am qualified as a physicist, and I later decided to get a master's degree in applied mathematics. At the moment I am working as a simulation engineer at a multinational company. I have been interested in algorithms and data structures, and their implementations especially in Java, since university. Later on I became acquainted with machine learning techniques, artificial intelligence, and numerical methods and recipes such as solving differential equations, linear algebra, interpolation and extrapolation. These skills can prove very important in several fields: software engineering, research and development, and investment banking. I have a particular interest in quantitative models such as the Black-Scholes model and the Merton model. Quantitative analysts use these algorithms and numerical techniques on a daily basis, so in my opinion these topics are definitely worth learning.
Take a look at my website and join my email list if you are interested in these topics!