
# LEARNING PATH: MATLAB: Powerful Machine Learning with MATLAB

Level up your machine learning skills to extract patterns and knowledge from your data with ease using MATLAB
4.2 (32 ratings)
180 students enrolled
Created by Packt Publishing
Last updated 2/2018
English
English [Auto-generated]
This course includes
• 4 hours on-demand video
• Access on mobile and TV
• Certificate of Completion

What you'll learn
• Learn the introductory concepts of machine learning
• Explore different ways to transform data using MATLAB's import and export tools, including support for formats such as SAS XPORT
• Discover the basics of classification methods and how to implement the Naive Bayes algorithm and decision trees in the MATLAB environment
• Use clustering methods such as hierarchical clustering to group data using similarity measures
• Perform data fitting, pattern recognition, and clustering analysis with the help of the MATLAB neural network toolbox
Course content
28 lectures • 03:58:54
+ Getting Started with MATLAB Machine Learning
10 lectures 01:48:24

This video will give you an overview about the course.

Preview 03:23

MATLAB will show us a desktop with all the items necessary for its proper and smooth operation.

Familiarizing Yourself with the MATLAB Desktop
10:41

In this video, we will start with data collection and import into MATLAB for whatever analysis we are going to do. We will also finish our activities by exporting the results.

Importing Data into MATLAB
15:37
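As a minimal sketch of what this lecture covers (the file names here are hypothetical), importing tabular and numeric data into MATLAB might look like:

```matlab
% Import a CSV file into a table (hypothetical file name)
T = readtable('measurements.csv');

% Inspect the first rows and the variable names
head(T)
T.Properties.VariableNames

% Alternatives for other formats
S = load('results.mat');          % variables saved in MATLAB's MAT format
M = readmatrix('matrix.csv');     % purely numeric data (R2019a and later)
```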

Many of the functions we have used to import data into MATLAB have a corresponding function that allows us to export data. In this video, we will see those functions.

Exporting Data from MATLAB
08:32
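The export counterparts mirror the import functions. A short sketch, assuming a table `T` already exists in the workspace:

```matlab
% Build a small example table to export
T = table([1;2;3], {'a';'b';'c'}, 'VariableNames', {'Id','Label'});

writetable(T, 'out.csv');         % counterpart of readtable
save('session.mat', 'T');         % MATLAB's native binary format

A = magic(4);
writematrix(A, 'out.txt');        % counterpart of readmatrix (R2019a and later)
```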

So far, for data organization, we have mostly used standard arrays, which are useful for storing a large number of objects of the same type, such as a matrix of numbers or characters. However, such arrays cannot store both numbers and strings in the same object. This problem is solved by cell arrays, structure arrays, and, more generally, the richer data structures that the MATLAB programming environment provides.

Data Organization
14:59
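For example, cell arrays and structure arrays let you mix numbers and strings in a single object, which standard arrays cannot do (the field names below are illustrative):

```matlab
% A cell array can hold mixed types in one object
C = {1, 'Rossi', [70 72 68]; 2, 'Bianchi', [65 66]};
name = C{1,2};            % brace indexing extracts the content

% A structure array groups named fields per record
s(1).id = 1; s(1).name = 'Rossi';
s(2).id = 2; s(2).name = 'Bianchi';
allNames = {s.name};      % collect one field across the whole array
```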

Before passing our data to machine learning algorithms, we need to take a first look at what we've imported into MATLAB to see if there are any issues. To get started, it's good practice to keep your original data intact, so every change will be performed on a copy of the dataset.

Preview 11:55

In the exploratory phase of a study, we try to gather a first set of information needed to derive features that can guide us in choosing the right tools to extract knowledge from the data.

Exploratory Statistics – Numerical Measures
12:37
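A brief sketch of the kind of numerical measures covered here, on a toy sample:

```matlab
x = [2 4 4 4 5 5 7 9];             % toy sample

m  = mean(x);                      % central tendency
md = median(x);
sd = std(x);                       % dispersion
rg = range(x);
q  = quantile(x, [0.25 0.5 0.75]); % quartiles (Statistics and Machine Learning Toolbox)
```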

We will begin our visual analysis with an example in which we draw a simple diagram to extract statistical indicators, which helps us calculate and plot descriptive statistics for the data.

Exploratory Visualization
16:41

To introduce the key concepts, we will get started with a simple linear regression example, using a spreadsheet that contains the number of vehicles registered in Italy and the population of its regions.

Searching Linear Relationships
06:56

In this video, we will learn to create a linear regression model.

Creating a Linear Regression Model
07:03
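A minimal sketch of fitting such a model with `fitlm` (the numbers below are illustrative, not the course's actual dataset):

```matlab
% Hypothetical data: regional population vs. registered vehicles (millions)
population = [1.3 4.4 5.8 10.0 4.9]';
vehicles   = [0.9 2.9 3.7  6.1 3.2]';

mdl = fitlm(population, vehicles);  % simple linear regression
disp(mdl.Coefficients)              % intercept and slope estimates
yhat = predict(mdl, 6.0);           % predict vehicles for a new region
```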
+ Mastering Machine Learning with MATLAB
18 lectures 02:10:30

This video gives an overview of the entire course.

Preview 03:36

We will explore decision tree methods. Then, we will learn concepts like nodes, branches, and leaf nodes. We will see how to classify objects into a finite number of classes by repeatedly dividing the records into subsets that are homogeneous with respect to the target attribute.

Predicting a Response by Decision Trees
13:08
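In MATLAB, growing and inspecting such a tree can be sketched with `fitctree` on a built-in example dataset:

```matlab
load fisheriris                            % built-in example dataset
tree = fitctree(meas, species);            % grow a classification tree
view(tree, 'Mode', 'graph');               % inspect nodes, branches, leaf nodes
label = predict(tree, [5.8 2.8 5.1 1.5]);  % classify a new sample
```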

We will learn the basic concepts of probability theory: classical probability definition, dependent and independent events, joint probability and conditional probability, which is the basis of these methods.

Probabilistic Classification Algorithms – Naive Bayes
06:15
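A minimal Naive Bayes sketch using MATLAB's `fitcnb` (Gaussian conditionals by default) on the same example dataset:

```matlab
load fisheriris
nb = fitcnb(meas, species);     % Naive Bayes with Gaussian conditionals
cvnb = crossval(nb);            % 10-fold cross-validation by default
err = kfoldLoss(cvnb);          % estimated misclassification rate
```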

In this video we will explore discriminant analysis methodologies; several examples will be analyzed to compare the relative results. We will also learn how to create models to minimize the expected classification cost.

Describing Differences by Discriminant Analysis
10:33

With nearest neighbor classifiers, we will learn how to identify the class of a sample based on its distance from other classified objects. We will discover how to set the distance metric, how to choose the optimal value for K, and how to improve model performance through cross-validation.

Find Similarities Using Nearest Neighbor Classifiers
09:00
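Setting the metric and K, then estimating performance via cross-validation, can be sketched with `fitcknn`:

```matlab
load fisheriris
knn = fitcknn(meas, species, ...
    'NumNeighbors', 5, ...        % the K to tune
    'Distance', 'euclidean');     % the distance metric to fix
cvknn = crossval(knn);            % 10-fold cross-validation
err = kfoldLoss(cvknn);           % compare this loss across K values
```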

In this video, we will analyze the Classification Learner app and see how it guides us through a step-by-step classification analysis. With the help of this app, importing and exploring data, selecting features, specifying validation schemes, training models, and evaluating results become extremely simple and fast.

Classification Learner App
05:36

We will look at a couple of methods for grouping objects: hierarchical clustering and partitioning clustering. In the first method, clusters are constructed by recursively partitioning the instances in either a top-down or bottom-up fashion. The second one decomposes a dataset into a set of disjoint clusters.

Introduction to Clustering
04:17

To determine the proximity of objects to each other, we will use the linkage function. With the cluster function, we will cut the ramifications from the bottom of the hierarchy tree and assign all the objects below each cut to a single cluster.

Hierarchical Clustering
05:37
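The `linkage`/`cluster` workflow described above can be sketched as follows:

```matlab
load fisheriris
Z = linkage(meas, 'ward');         % build the hierarchy (merge tree)
dendrogram(Z);                     % visualize the ramifications
idx = cluster(Z, 'Maxclust', 3);   % cut the tree into 3 clusters
```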

In this video, we will explore partitioning clustering through the k-means method. Then we will learn how to locate K centroids, one for each cluster, using an iterative procedure. We will discover how well these clusters are separated and how to make a silhouette plot using the cluster indices returned by k-means.

Partitioning-Based Clustering Methods – K-means Algorithm
06:58
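A minimal k-means sketch, including the silhouette plot mentioned above:

```matlab
load fisheriris
rng(1);                            % reproducible centroid initialization
[idx, Cent] = kmeans(meas, 3);     % K = 3 centroids, iterative refinement
silhouette(meas, idx);             % how well the clusters are separated
```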

K-medoids is more robust to noise and outliers than k-means, because a mean is easily influenced by extreme values. We will learn to use the kmedoids function, which partitions the observations of a matrix into k clusters and returns a vector containing the cluster index of each observation.

Partitioning around the Actual Center – K-medoids Clustering
05:21
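The call mirrors `kmeans`, but the returned centers are actual observations from the data:

```matlab
load fisheriris
rng(1);                              % reproducible initialization
[idx, medoids] = kmedoids(meas, 3);  % medoids are actual data points
```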

The models are composed of k (positive integer) multivariate normal density components. Each component has an n-dimensional mean, n-by-n covariance matrix, and mixing proportion. We will use the fitgmdist function to return a Gaussian mixture distribution model with k components (fixed by the user) fitted to the input dataset.

Clustering Using Gaussian Mixture Models
08:38
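A short sketch of fitting such a mixture with `fitgmdist` (the regularization value here is an illustrative choice to keep covariances well-conditioned):

```matlab
load fisheriris
rng(1);
gm  = fitgmdist(meas, 2, 'RegularizationValue', 0.01);  % k = 2 components
idx = cluster(gm, meas);       % hard assignment from the fitted model
p   = posterior(gm, meas);     % soft membership probabilities per component
```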

Neural networks work in parallel and are therefore able to deal with lots of data at the same time, as opposed to serial computers, in which data items are processed individually and in succession.

Getting Started with Neural Networks
03:55

Neuron input is a set of fibers called dendrites; they are in contact with the axons of other neurons, from which they receive electrical potentials. The connection point between the axon of one neuron and the dendrite of another is called a synapse.

Basic Elements of a Neural Network
04:48

The Neural Network Toolbox provides algorithms, pre-trained models, and apps to create, train, visualize, and simulate neural networks with one hidden layer as well as networks with several hidden layers. Let's see how to do this.

Neural Network Toolbox
04:11

To make the application of neural networks as simple as possible, the toolbox gives us a series of GUIs. In this video, we will check out the neural network getting started GUI, the starting point for our neural network fitting, pattern recognition, clustering, and time series analysis.

Exploring Neural Network Start GUI
02:51

In this video, we will focus on fitting data with a neural network. We will see how to use the Neural Fitting app (nftool). Then, we will run a script analysis to learn how to use neural network functions from the command line.

Data Fitting with Neural Networks
11:43
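The command-line counterpart of the Neural Fitting app can be sketched with `fitnet` on a toolbox example dataset:

```matlab
[x, t] = simplefit_dataset;     % example data shipped with the toolbox
net = fitnet(10);               % one hidden layer with 10 neurons
[net, tr] = train(net, x, t);   % train (Levenberg-Marquardt by default)
y = net(x);                     % network predictions
perf = perform(net, t, y);      % mean squared error on the data
```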

Feature selection is necessary to create a functional model, achieving a reduction in cardinality by imposing a limit on the number of features considered during model creation. It is based on finding a subset of the original variables, usually iteratively, detecting new combinations of variables and comparing their prediction errors.

Feature Selection
11:28
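Such an iterative search over feature subsets can be sketched with `sequentialfs`; the criterion function below (counting kNN misclassifications on held-out folds) is one illustrative choice, not necessarily the one used in the course:

```matlab
load fisheriris
y = grp2idx(species);                      % numeric class labels

% Criterion: misclassifications of a kNN model on the held-out fold
fun = @(XT, yT, Xt, yt) ...
    sum(yt ~= predict(fitcknn(XT, yT), Xt));

opts = statset('Display', 'iter');
[keep, history] = sequentialfs(fun, meas, y, 'Options', opts);
% keep is a logical mask of the selected feature subset
```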

The process of transforming the input data into a set of features is called feature extraction. If the features extracted are carefully chosen, it is expected that the features set will perform the desired task using the reduced representation instead of the full size input.

Feature Extraction
12:35
Requirements
• Basic knowledge of MATLAB is needed
• Basic mathematical and statistical background is assumed
• Basic programming knowledge of C, C++, Java, and Python is needed
Description

How do you deal with data that’s messy, incomplete, or in varied formats? How do you choose the right model for the data?

The solution to these questions is MATLAB.

MATLAB is the language of choice for many researchers and mathematics experts when it comes to machine learning. Engineers and data scientists work with large amounts of data in a variety of formats such as sensor, image, video, telemetry, databases, and much more. They use machine learning to find patterns in data and to build models that predict future outcomes based on historical data. With MATLAB, you have immediate access to prebuilt functions, extensive toolboxes, and specialized apps for classification, regression, and clustering. This Learning Path is designed to give you fluency in the MATLAB programming language. Problem-based MATLAB examples are presented in a simple, easy way to make your learning fast and effective. If you're interested in learning and implementing powerful machine learning techniques using MATLAB, then go for this Learning Path.

Packt’s Video Learning Paths are a series of individual video products put together in a logical and stepwise manner such that each video builds on the skills learned in the video before it.

The highlights of this Learning Path are:

• Explore the different types of regression techniques such as simple and multiple linear regression, ordinary least squares estimation, correlations, and how to apply them to your data
• Perform data fitting, pattern recognition, and clustering analysis with the help of the MATLAB neural network toolbox.
• Use feature selection and extraction for dimensionality reduction, leading to improved performance.

Let’s take a quick look at your learning journey. This Learning Path will help you build a foundation in machine learning using MATLAB. You'll start by getting your system ready with the MATLAB environment for machine learning and see how to easily interact with the MATLAB workspace. You'll then move on to data cleansing, mining, and analyzing various data types in machine learning. You’ll also learn to display data values on a plot. Next, you'll learn about the different types of regression techniques and how to apply them to your data using the MATLAB functions. You'll understand the basic concepts of neural networks and perform data fitting, pattern recognition, and clustering analysis. You'll also explore feature selection and extraction techniques for dimensionality reduction to improve performance. Finally, you’ll learn to put it all together through real-world use cases covering major machine learning algorithms and will now be an expert in performing machine learning with MATLAB.

By the end of this Learning Path, you'll have acquired a complete knowledge of powerful machine learning techniques in MATLAB.