Regression Machine Learning with Python
4.5 (4 ratings)
195 students enrolled
Learn regression machine learning from basic to expert level through a practical course with Python programming language
Created by Diego Fernandez
Last updated 10/2016
English
Current price: $10 Original price: $50 Discount: 80% off
30-Day Money-Back Guarantee
Includes:
  • 5.5 hours on-demand video
  • 9 Articles
  • 20 Supplemental Resources
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What Will I Learn?
  • Read data files and perform regression machine learning operations by installing related packages and running code on the Python IDE.
  • Assess bias-variance prediction errors trade-off potentially leading to model under-fitting or over-fitting.
  • Avoid model over-fitting using cross-validation for optimal parameter selection. Evaluate goodness-of-fit through coefficient of determination metric.
  • Test forecasting accuracy through scale-dependent and scale-independent metrics such as mean absolute error, symmetric mean absolute percentage error and mean absolute scaled error.
  • Compute generalized linear models such as linear regression and improve their prediction accuracy through coefficient shrinkage with Ridge and Lasso regression.
  • Calculate similarity methods such as k nearest neighbors regression and increase their forecasting accuracy by selecting the optimal number of nearest neighbors.
  • Estimate frequency methods such as decision tree regression and improve their precision by selecting the ideal number of tree splits.
  • Approximate ensemble methods such as random forest regression and gradient boosting machine regression to improve on decision tree regression accuracy.
  • Explore maximum margin methods such as support vector machine regression with linear and non-linear kernels and improve their accuracy by selecting the best error-term penalty.
Requirements
  • The Python programming language is required. Downloading instructions are included.
  • A Python Distribution (PD) and Integrated Development Environment (IDE) are recommended. Downloading instructions are included.
  • Python code files are provided by the instructor.
  • Prior basic knowledge of the Python programming language is useful but not required.
  • Mathematical formulae are kept to the minimum level essential for understanding the main concepts.
Description

Learn regression machine learning through a practical course with the Python programming language using real-world data. It explores the main concepts from basic to expert level, which can help you achieve better grades, develop your academic career, apply your knowledge at work or make business forecasting decisions. All of this while exploring the wisdom of the best academics and practitioners in the field.

Become a Regression Machine Learning Expert in this Practical Course with Python

  • Read data files and perform regression machine learning operations by installing related packages and running code on the Python IDE.
  • Assess model bias-variance prediction errors trade-off potentially leading to under-fitting or over-fitting.
  • Avoid model over-fitting using cross-validation for optimal parameter selection.
  • Evaluate goodness-of-fit through coefficient of determination metric.
  • Test forecasting accuracy through scale-dependent and scale-independent metrics.
  • Compute generalized linear models such as linear regression, Ridge regression and Lasso regression.
  • Calculate similarity methods such as k nearest neighbors regression with the optimal number of neighbors.
  • Estimate frequency methods such as decision tree regression with the ideal number of splits.
  • Approximate ensemble methods such as random forest regression and gradient boosting machine regression to enhance decision tree regression prediction accuracy.
  • Explore maximum margin methods such as support vector machine regression with linear and non-linear kernels and the best error-term penalty.

Become a Regression Machine Learning Expert and Put Your Knowledge in Practice

Learning regression machine learning is indispensable for data mining applications in areas such as consumer analytics, finance, banking, health care, science, e-commerce and social media. It is also essential for academic careers in data mining, applied statistical learning or artificial intelligence. And it is necessary for any business forecasting decision.

But as the learning curve can become steep as complexity grows, this course helps by leading you step by step through real-world practical examples for greater effectiveness.

Content and Overview

This practical course contains 41 lectures and 5 hours of content. It's designed for all regression machine learning knowledge levels, and a basic understanding of the Python programming language is useful but not required.

At first, you’ll learn how to read data files and perform regression machine learning computing operations by installing related packages and running code on the Python IDE. Next, you’ll assess the model bias-variance prediction errors trade-off, which can potentially lead to under-fitting or over-fitting. After that, you’ll avoid model over-fitting by using cross-validation for optimal parameter selection. Later, you’ll evaluate goodness-of-fit through the coefficient of determination. Then, you’ll test forecasting accuracy through scale-dependent metrics such as mean absolute error and scale-independent ones such as symmetric mean absolute percentage error and mean absolute scaled error.

After that, you’ll compute generalized linear models such as linear regression and improve their prediction accuracy through the coefficient shrinkage done by Ridge regression and Lasso regression. Next, you’ll calculate similarity methods such as k nearest neighbors regression and increase their forecasting accuracy by selecting the optimal number of nearest neighbors. Later, you’ll estimate frequency methods such as decision tree regression and improve their estimation precision by selecting the ideal number of splits.

Then, you’ll approximate ensemble methods such as random forest regression and gradient boosting machine regression in order to improve on decision tree regression accuracy. Finally, you’ll explore maximum margin methods such as support vector machine regression, using linear and non-linear (radial basis function) kernels, and improve their accuracy by selecting the best error-term penalty.

Who is the target audience?
  • Undergraduates or postgraduates at any knowledge level who want to learn about regression machine learning using Python programming language.
  • Academic researchers who wish to deepen their knowledge in data mining, applied statistical learning or artificial intelligence.
  • Business data scientists who desire to apply this knowledge in areas such as consumer analytics, finance, banking, health care, e-commerce or social media.
Curriculum For This Course
41 Lectures
05:21:41
Course Overview
7 Lectures 30:10

In this lecture you will view the course disclaimer and learn about the course objectives, how you will benefit from the course, its prerequisites and my profile as instructor.

Preview 04:11

In this lecture you will learn that it is recommended to take the course in order, as each section builds on the previous one and its complexity increases accordingly. You will also study the course structure and main sections (course overview, bias-variance trade-off, generalized linear models, similarity methods, frequency methods, ensemble methods, maximum margin methods and summary results).

Preview 02:30

In this lecture you will learn the definition of regression machine learning and the download websites for the Miniconda Distribution for Python 3.5 64-bit (PD) and the Python PyCharm Integrated Development Environment (IDE).

Regression Machine Learning
05:39

In this lecture you will learn how to read the regression machine learning data (a .TXT file in .CSV format) into the Python PyCharm Integrated Development Environment (IDE), the data sources, the code files (originally in .TXT format, to be converted to .PY format) containing the regression machine learning computation instructions, the installation of Python packages (numpy, pandas, scipy, matplotlib, scikit-learn) into the Miniconda Distribution for Python 3.5 64-bit (PD), and the related code (import <package> as <name>, read_csv(<path>), plot(<y=data>), figure.add_subplot(<plot coordinate>) functions).

Regression Machine Learning Data
17:41
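The read-and-plot workflow this lecture describes can be sketched as follows. This is a minimal illustration, not the course's own code file: the file name "course_data.csv" and its columns are hypothetical stand-ins, and a tiny data file is generated in place of the one the course provides.

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; no display window needed
import matplotlib.pyplot as plt

# Stand-in for the data file provided with the course.
pd.DataFrame({"t": range(10), "y": [i * 2.0 for i in range(10)]}).to_csv(
    "course_data.csv", index=False)

data = pd.read_csv("course_data.csv")   # read_csv(<path>)
fig = plt.figure()
ax = fig.add_subplot(1, 1, 1)           # figure.add_subplot(<plot coordinate>)
ax.plot(data["t"], data["y"])           # plot(<y=data>)
fig.savefig("course_data.png")
print(data.shape)
```

The same pattern applies unchanged once the real course data file is in place of the stand-in.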

Before starting the course, please download the .TXT data file in .CSV format from the additional resources.

Course Data File
00:03

Before starting the course, please download the .TXT Python code files from the additional resources.

Course Code Files
00:03

You can download .PDF section slides file as additional resources.

Course Overview Slides
00:02
Bias-Variance Trade-Off
6 Lectures 01:10:20

You can download .PDF section slides file as additional resources.

Bias-Variance Trade-Off Slides
00:02

In this lecture you will learn section lectures’ details and main themes to be covered related to bias-variance trade-off (bias-variance errors, optimal parameter selection, goodness-of-fit score and forecasting accuracy).

Preview 10:01

In this lecture you will learn bias and variance errors definitions and main calculations (reshape(<data>), LinearRegression(<parameters>), Ridge(<fixed alpha>, <solver>), fit(<input data>, <output data>).predict(<input data>), for(<loop>), list.insert(<data>), mean(<data>), print(<data>) functions).

Bias-Variance Errors
16:22
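The bias and variance errors this lecture covers can be sketched by refitting a model on repeated noisy resamples of synthetic data. The data, noise level and loop size below are illustrative assumptions, not the course's own values.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50).reshape(-1, 1)   # reshape(<data>)
true_y = 3.0 * x.ravel() + 1.0             # noiseless target

preds = []
for _ in range(200):                       # for(<loop>)
    noisy_y = true_y + rng.normal(0, 0.5, size=true_y.shape)
    model = LinearRegression().fit(x, noisy_y)
    preds.append(model.predict(x))         # fit(...).predict(...)

preds = np.array(preds)
bias_sq = np.mean((preds.mean(axis=0) - true_y) ** 2)  # squared bias error
variance = np.mean(preds.var(axis=0))                  # variance error
print(bias_sq, variance)
```

Since the model family matches the true relationship here, bias is near zero and variance is small; a too-simple model would raise bias, a too-flexible one variance.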

In this lecture you will learn optimal parameter selection, cross-validation definition and main calculations (reshape(<data>), GridSearchCV(<model>, <fixed parameters>, <variable parameters grid>), fit(<input data>, <output data>).best_estimator_, LinearRegression(<parameters>), Ridge(<fixed alpha>, <solver>), Ridge(<optimal alpha>, <solver>), print(<data>) functions).

Optimal Parameter Selection
15:42
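The GridSearchCV pattern named above can be sketched as follows; the synthetic data and the alpha grid are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 60).reshape(-1, 1)   # reshape(<data>)
y = 2.0 * x.ravel() + rng.normal(0, 0.1, 60)

grid = {"alpha": [0.01, 0.1, 1.0, 10.0]}   # variable parameters grid
search = GridSearchCV(Ridge(), grid, cv=5)
best = search.fit(x, y).best_estimator_    # fit(...).best_estimator_
print(best.alpha)                          # cross-validated optimal alpha
```

Cross-validation scores each candidate alpha on held-out folds, so the selected model is chosen for out-of-sample performance rather than training fit.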

In this lecture you will learn goodness-of-fit score, coefficient of determination definition and main calculations (reshape(<data>), LinearRegression(<parameters>), Ridge(<fixed alpha>, <solver>), Ridge(<optimal alpha>, <solver>), fit(<input data>, <output data>).score(<input data>, <output data>), print(<data>) functions).

Goodness-Of-Fit Score
11:15
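The goodness-of-fit score this lecture covers is the coefficient of determination (R²) returned by score(<input data>, <output data>); a minimal sketch on synthetic data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

x = np.arange(20, dtype=float).reshape(-1, 1)  # reshape(<data>)
y = 4.0 * x.ravel() + 2.0                      # exact linear relationship
r2 = LinearRegression().fit(x, y).score(x, y)  # fit(...).score(...)
print(r2)
```

An exact linear relationship yields R² = 1.0; noisier data or a worse model pushes the score toward 0 (or below, for fits worse than predicting the mean).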

In this lecture you will learn forecasting accuracy, mean absolute error, symmetric mean absolute percentage error, mean absolute scaled error definitions and main calculations (reshape(<data>), LinearRegression(<parameters>), Ridge(<fixed alpha>, <solver>), Ridge(<optimal alpha>, <solver>), for(<loop>), list.insert(<data>), absolute(<data>), mean(<data>), print(<data>) functions).

Forecasting Accuracy
16:58
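The three accuracy metrics named above can be coded directly from their standard definitions. The toy actual/forecast values are illustrative, and the sMAPE and MASE forms below are one common variant each (definitions vary in the literature).

```python
import numpy as np

actual = np.array([3.0, 5.0, 2.5, 7.0])
forecast = np.array([2.5, 5.0, 4.0, 8.0])

# Mean absolute error: scale-dependent.
mae = np.mean(np.absolute(actual - forecast))

# Symmetric mean absolute percentage error: scale-independent.
smape = np.mean(2 * np.absolute(forecast - actual)
                / (np.absolute(actual) + np.absolute(forecast)))

# Mean absolute scaled error: MAE scaled by the in-sample
# one-step naive forecast error.
naive_mae = np.mean(np.absolute(np.diff(actual)))
mase = mae / naive_mae
print(mae, smape, mase)
```

MASE below 1 means the forecast beats the naive one-step benchmark on average.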
Generalized Linear Models
8 Lectures 01:02:00

You can download .PDF section slides file as additional resources.

Generalized Linear Models Slides
00:02

In this lecture you will learn section lectures’ details and main themes to be covered related to generalized linear models definition, fitting and forecasting (linear regression, Ridge regression and Lasso regression). 

Generalized Linear Models Overview
07:06

In this lecture you will learn linear regression definition, characteristics, essential mathematical formulae and graphical representation. 

Linear Regression Definition
02:16

In this lecture you will learn linear regression fitting, forecasting coding and main calculations (reshape(<data>), LinearRegression(<parameters>), fit(<input data>, <output data>).predict(<input data>), fit(<input data>, <output data>).score(<input data>, <output data>), for(<loop>), list.insert(<data>), absolute(<data>), mean(<data>), print(<data>) functions).

Linear Regression Fitting and Forecasting
13:31
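The linear regression fit-and-forecast pattern described above, sketched on synthetic data standing in for the course's data file:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
x = np.arange(30, dtype=float).reshape(-1, 1)  # reshape(<data>)
y = 1.5 * x.ravel() + rng.normal(0, 0.2, 30)

model = LinearRegression().fit(x, y)
forecast = model.predict(x)                    # fit(...).predict(...)
mae = np.mean(np.absolute(y - forecast))       # mean absolute error
print(model.coef_[0], mae)
```

The fitted slope recovers the true coefficient (1.5) up to noise, and the in-sample MAE is on the order of the noise level.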

In this lecture you will learn Ridge regression definition, characteristics and essential mathematical formulae.

Ridge Regression Definition
02:34

In this lecture you will learn Ridge regression fitting, forecasting coding and main calculations (reshape(<data>), Ridge(<optimal alpha>, <solver>), fit(<input data>, <output data>).predict(<input data>), fit(<input data>, <output data>).score(<input data>, <output data>), for(<loop>), list.insert(<data>), absolute(<data>), mean(<data>), print(<data>) functions).

Ridge Regression Fitting and Forecasting
18:05

In this lecture you will learn Lasso regression definition, characteristics and essential mathematical formulae.

Lasso Regression Definition
02:41

In this lecture you will learn Lasso regression fitting, forecasting coding and main calculations (reshape(<data>), Lasso(<optimal alpha>), fit(<input data>, <output data>).predict(<input data>), fit(<input data>, <output data>).score(<input data>, <output data>), for(<loop>), list.insert(<data>), absolute(<data>), mean(<data>), print(<data>) functions).

Lasso Regression Fitting and Forecasting
15:45
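The coefficient shrinkage that distinguishes the Ridge and Lasso lectures from plain linear regression can be sketched side by side; alpha=1.0 and the synthetic data are illustrative choices, not the course's tuned values.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 3))
y = 5.0 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(0, 0.5, 100)

ols = LinearRegression().fit(X, y).coef_
ridge = Ridge(alpha=1.0).fit(X, y).coef_   # shrinks all coefficients toward 0
lasso = Lasso(alpha=1.0).fit(X, y).coef_   # can set weak coefficients exactly to 0
print(ols, ridge, lasso)
```

Ridge always produces a smaller coefficient norm than ordinary least squares, while Lasso additionally zeroes out the weak predictors, performing variable selection.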
Similarity Methods
4 Lectures 24:23

You can download .PDF section slides file as additional resources.

Similarity Methods Slides
00:02

In this lecture you will learn section lectures’ details and main themes to be covered related to similarity methods definition, fitting and forecasting (k nearest neighbors’ regression).

Similarity Methods Overview
06:18

In this lecture you will learn k nearest neighbors regression definition, characteristics, essential mathematical formulae and graphical representation.

K Nearest Neighbors Regression Definition
04:36

In this lecture you will learn k nearest neighbors regression fitting, forecasting coding and main calculations (reshape(<data>), KNeighborsRegressor(<optimal n_neighbors>, <weights>, <algorithm>, <distance metric>), fit(<input data>, <output data>).predict(<input data>), fit(<input data>, <output data>).score(<input data>, <output data>), for(<loop>), list.insert(<data>), absolute(<data>), mean(<data>), print(<data>) functions).

K Nearest Neighbors Regression Fitting and Forecasting
13:27
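The KNeighborsRegressor call named above can be sketched as follows; k=3 and the synthetic data are illustrative assumptions rather than the course's tuned values.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

x = np.arange(12, dtype=float).reshape(-1, 1)  # reshape(<data>)
y = x.ravel() ** 2

knn = KNeighborsRegressor(n_neighbors=3, weights="uniform",
                          algorithm="auto", metric="minkowski")
fitted = knn.fit(x, y)
pred = fitted.predict(np.array([[5.0]]))  # mean of the 3 nearest targets
print(pred)
```

For the query point 5.0, the three nearest training points are 4, 5 and 6, so the prediction is the mean of 16, 25 and 36.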
Frequency Methods
4 Lectures 24:15

You can download .PDF section slides file as additional resources.

Frequency Methods Slides
00:02

In this lecture you will learn section lectures’ details and main themes to be covered related to frequency methods definition, fitting and forecasting (decision tree regression).

Frequency Methods Overview
07:14

In this lecture you will learn decision trees regression definition, characteristics, essential mathematical formulae and graphical representation.

Decision Tree Regression Definition
02:51

In this lecture you will learn decision trees regression fitting, forecasting coding and main calculations (reshape(<data>), DecisionTreeRegressor(<optimal max depth>, <criterion>, <best splitter>), fit(<input data>, <output data>).predict(<input data>), fit(<input data>, <output data>).score(<input data>, <output data>), for(<loop>), list.insert(<data>), absolute(<data>), mean(<data>), print(<data>) functions).

Decision Tree Regression Fitting and Forecasting
14:08
Ensemble Methods
6 Lectures 50:27

You can download .PDF section slides file as additional resources.

Ensemble Methods Slides
00:02

In this lecture you will learn section lectures’ details and main themes to be covered related to ensemble methods definition, fitting and forecasting (random forest regression and gradient boosting machine regression).

Ensemble Methods Overview
09:41

In this lecture you will learn random forest regression definition, characteristics and essential mathematical formulae.

Random Forest Regression Definition
02:25

In this lecture you will learn random forest regression fitting, forecasting coding and main calculations (reshape(<data>), RandomForestRegressor(<optimal max depth>, <criterion>, <n estimators>), fit(<input data>, <output data>).predict(<input data>), fit(<input data>, <output data>).score(<input data>, <output data>), for(<loop>), list.insert(<data>), absolute(<data>), mean(<data>), print(<data>) functions).

Random Forest Regression Fitting and Forecasting
16:38

In this lecture you will learn gradient boosting machine regression definition, characteristics, essential mathematical formulae and graphical representation.

Gradient Boosting Machine Regression Definition
03:52

In this lecture you will learn gradient boosting machine regression fitting, forecasting coding and main calculations (reshape(<data>), GradientBoostingRegressor(<optimal max depth>, <loss>, <learning rate>, <n estimators>), fit(<input data>, <output data>).predict(<input data>), fit(<input data>, <output data>).score(<input data>, <output data>), for(<loop>), list.insert(<data>), absolute(<data>), mean(<data>), print(<data>) functions).

Gradient Boosting Machine Regression Fitting and Forecasting
17:49
Maximum Margin Methods
5 Lectures 44:39

You can download .PDF section slides file as additional resources.

Maximum Margin Methods Slides
00:02

In this lecture you will learn section lectures’ details and main themes to be covered related to maximum margin methods definition, fitting and forecasting (linear and non-linear support vector machine regression).

Maximum Margin Methods Overview
08:35

In this lecture you will learn support vector machine regression definition, characteristics, essential mathematical formulae and graphical representation.

Support Vector Machine Regression Definition
04:42

In this lecture you will learn linear support vector machine regression fitting, forecasting coding and main calculations (reshape(<data>), SVR(<optimal error penalty>, <epsilon>, <linear kernel>), fit(<input data>, <output data>).predict(<input data>), fit(<input data>, <output data>).score(<input data>, <output data>), for(<loop>), list.insert(<data>), absolute(<data>), mean(<data>), print(<data>) functions).

Linear Support Vector Machine Regression Fitting and Forecasting
15:26

In this lecture you will learn non-linear support vector machine regression fitting, forecasting coding and main calculations (reshape(<data>), SVR(<optimal error penalty>, <epsilon>, <radial basis function kernel>), fit(<input data>, <output data>).predict(<input data>), fit(<input data>, <output data>).score(<input data>, <output data>), for(<loop>), list.insert(<data>), absolute(<data>), mean(<data>), print(<data>) functions).

Non-Linear Support Vector Machine Regression Fitting and Forecasting
15:54
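The linear versus radial basis function kernel comparison of these two lectures can be sketched as follows; C (the error-term penalty) and epsilon are illustrative values, and the non-linear synthetic target stands in for the course's data.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(5)
x = np.linspace(-2, 2, 80).reshape(-1, 1)     # reshape(<data>)
y = x.ravel() ** 2 + rng.normal(0, 0.1, 80)   # non-linear target

linear_r2 = SVR(C=10.0, epsilon=0.1, kernel="linear").fit(x, y).score(x, y)
rbf_r2 = SVR(C=10.0, epsilon=0.1, kernel="rbf").fit(x, y).score(x, y)
print(linear_r2, rbf_r2)
```

On a curved target the linear kernel's score stays low while the radial basis function kernel fits well, which is why the course treats the kernel choice (and the penalty C) as the key tuning decisions for support vector machine regression.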
Summary Results
1 Lecture 15:22

In this lecture you will learn goodness-of-fit and forecasting accuracy summary results for all methods, models and algorithms.

Regression Machine Learning Summary Results
15:22
About the Instructor
Diego Fernandez
3.8 Average rating
429 Reviews
2,966 Students
21 Courses
Exfinsis

Diego Fernandez is the author of high-quality online courses and ebooks at Exfinsis for anyone who wants to become an expert in financial data analysis.

His main areas of expertise are financial analysis and data science. Within financial analysis he has focused on computational finance, quantitative finance and trading strategies analysis. Within data science he has concentrated on machine learning, applied statistics and econometrics. For all of this he has become proficient in Microsoft Excel®, R statistical software® and Python programming language® analysis tools. 

He has significant online business development experience at fast-growing startups and blue-chip companies in several European countries. He has always exceeded expected professional objectives by starting with a comprehensive analysis of the business environment and then efficiently executing the formulated strategy.

He also achieved outstanding performance in his undergraduate and postgraduate degrees at world-class academic institutions. This outperformance allowed him to become a teaching assistant for specialized subjects and a constant student leader within study groups.

His motivation is a lifelong passion for financial data analysis, which he intends to transmit in all of his courses.