Learning Path: R: Complete Machine Learning & Deep Learning

Unleash the true potential of R to unlock the hidden layers of data
3.7 (24 ratings)
304 students enrolled
Created by Packt Publishing
Last updated 6/2017
English
Curiosity Sale
Current price: $10 Original price: $200 Discount: 95% off
30-Day Money-Back Guarantee
Includes:
  • 17.5 hours on-demand video
  • 1 Supplemental Resource
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What Will I Learn?
  • Develop R packages and extend the functionality of your model
  • Perform pre-model building steps
  • Understand the working behind core machine learning algorithms
  • Build recommendation engines using multiple algorithms
  • Incorporate R and Hadoop to solve machine learning problems on Big Data
  • Understand advanced strategies that help speed up your R code
  • Learn the basics of deep learning and artificial neural networks
  • Learn the intermediate and advanced concepts of artificial and recurrent neural networks
Requirements
  • Basic knowledge of R would be beneficial
  • Knowledge of linear algebra and statistics is required
Description

Are you looking to gain in-depth knowledge of machine learning and deep learning? If yes, then this Learning Path is just right for you.

Packt’s Video Learning Paths are a series of individual video products put together in a logical and stepwise manner such that each video builds on the skills learned in the video before it.

R is one of the leading technologies in the field of data science. Starting out at a basic level, this Learning Path will teach you how to develop and implement machine learning and deep learning algorithms using R in real-world scenarios.

The Learning Path begins by covering some basic concepts of R to refresh your knowledge before we deep-dive into the advanced techniques. You will start by setting up the environment and then perform data ETL in R. You will then learn important machine learning topics, including data classification, regression, clustering, association rule mining, and dimensionality reduction. Next, you will understand the basics of deep learning and artificial neural networks, and then move on to exploring topics such as ANNs, RNNs, and CNNs. Finally, you will learn about the applications of deep learning in various fields and understand the practical implementations of scalability, HPC, and feature engineering.

By the end of the Learning Path, you will have a solid knowledge of all these algorithms and techniques and be able to implement them efficiently in your data science projects.

Do not worry if this seems too far-fetched right now; we have combined the best works of the following esteemed authors to ensure that your learning journey is smooth:

About the Authors

Selva Prabhakaran is a data scientist with a large e-commerce organization. In his 7 years of experience in data science, he has tackled complex real-world data science problems and delivered production-grade solutions for top multinational companies.

Yu-Wei, Chiu (David Chiu) is the founder of LargitData, a startup company that mainly focuses on providing Big Data and machine learning products. He has previously worked for Trend Micro as a software engineer, where he was responsible for building Big Data platforms for business intelligence and customer relationship management systems. In addition to being a startup entrepreneur and data scientist, he specializes in using Spark and Hadoop to process Big Data and apply data mining techniques for data analysis.

Vincenzo Lomonaco is a deep learning PhD student at the University of Bologna and founder of ContinuousAI, an open source project that aims to connect people and reorganize resources in the context of continuous learning and AI. He is also the PhD students' representative at the Department of Computer Science and Engineering (DISI) and a teaching assistant for the Machine Learning and Computer Architectures courses in the same department.

Who is the target audience?
  • The Learning Path is for machine learning engineers, statisticians, and data scientists who want to create cutting-edge machine learning and deep learning models using R
Curriculum For This Course
213 Lectures
17:36:05
Mastering R Programming
54 Lectures 05:12:00

This video gives an overview of the entire course.

Preview 07:44

In this video, we will take a look at how to perform univariate analysis.

Performing Univariate Analysis
05:22

The goal of this video is to perform bivariate analysis in R using three cases.

Bivariate Analysis – Correlation, Chi-Sq Test, and ANOVA
05:42

In this video, we will see how to detect and treat outliers.

Detecting and Treating Outliers
03:20

The goal of this video is to see how to treat missing values in R.

Treating Missing Values with `mice`
03:59
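
For a taste of what this looks like in practice, here is a minimal sketch of mice-based imputation; the built-in airquality dataset and the pmm method are illustrative choices, not necessarily the ones used in the video.

    # Illustrative sketch: impute missing values with the mice package
    library(mice)

    data(airquality)                                          # built-in dataset with NAs in Ozone and Solar.R
    imp <- mice(airquality, m = 5, method = "pmm", seed = 42) # 5 imputations via predictive mean matching
    completed <- complete(imp, 1)                             # extract the first completed dataset
    summary(completed)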

In this video we'll see what linear regression is, its purpose, when to use it, and how to implement it in R.

Preview 07:35

We'll see how to interpret regression results and interaction effects in this video.

Interpreting Regression Results and Interaction Terms
05:19

In this video we will discuss what residual analysis is and how to detect multivariate outliers using Cook's distance.

Performing Residual Analysis & Extracting Extreme Observations with Cook's Distance
03:25

The goal of this video is to understand how to do model selection and comparison using best subsets, stepwise regression and ANOVA.

Extracting Better Models with Best Subsets, Stepwise Regression, and ANOVA
04:39

In this video we will see how to do k-fold cross validation in R.

Validating Model Performance on New Data with k-Fold Cross Validation
02:29
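
As a rough illustration of k-fold cross-validation in R, the sketch below uses the caret package; the iris dataset and the rpart learner are assumptions for the example rather than the video's own choices.

    # Illustrative sketch: 10-fold cross-validation with caret
    library(caret)

    ctrl <- trainControl(method = "cv", number = 10)       # 10-fold cross-validation
    fit  <- train(Species ~ ., data = iris,
                  method = "rpart", trControl = ctrl)      # decision tree as an example learner
    print(fit)                                             # accuracy averaged across the 10 folds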

The goal of this video is to check out how to build non-linear regression models using splines and GAMs.

Building Non-Linear Regressors with Splines and GAMs
05:19

Our goal in this video would be to understand logistic regression, evaluation metrics of binary classification problems, and interpretation of the ROC curve.

Preview 12:38

In this video, we will understand the concept and working of naïve Bayes classifier and how to implement the R code.

Understanding the Concept and Building Naive Bayes Classifier
09:23
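
A minimal naive Bayes sketch with the e1071 package might look like the following; the train/test split on iris is purely illustrative.

    # Illustrative sketch: naive Bayes classification with e1071
    library(e1071)

    set.seed(1)
    idx   <- sample(nrow(iris), 100)                       # simple train/test split
    model <- naiveBayes(Species ~ ., data = iris[idx, ])
    preds <- predict(model, iris[-idx, ])
    table(preds, iris$Species[-idx])                       # confusion matrix on held-out rows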

In this video, we will look at what the k-nearest neighbors algorithm is, how it works, and how to implement it in R.

Building k-Nearest Neighbors Classifier
07:01

The goal of this video is to understand how decision trees work, what they are used for, and how to implement them.

Building Tree Based Models Using RPart, cTree, and C5.0
06:32

The goal of this video is to know what the various features of the caret package are and how to build predictive models.

Building Predictive Models with the caret Package
08:11

The goal of this video is to know how to do feature selection before building predictive models.

Selecting Important Features with RFE, varImp, and Boruta
05:19

In this video, we will look at how support vector machines work.

Preview 08:03

In this video, we will look at the concept behind bagging and random forests and how to implement them to solve problems.

Understanding Bagging and Building Random Forest Classifier
05:06

Let's understand what boosting is and how stochastic gradient boosting works with GBM.

Implementing Stochastic Gradient Boosting with GBM
05:18

In this video, we will look at what regularization is, ridge and lasso regression, and how to implement it.

Regularization with Ridge, Lasso, and Elasticnet
08:52
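
A compact sketch of ridge, lasso, and elastic net with the glmnet package is shown below; mtcars is an illustrative dataset, and alpha = 0, 1, and 0.5 correspond to ridge, lasso, and elastic net respectively.

    # Illustrative sketch: regularized regression with glmnet
    library(glmnet)

    x <- as.matrix(mtcars[, -1])           # predictors
    y <- mtcars$mpg                        # response
    ridge <- cv.glmnet(x, y, alpha = 0)    # ridge with cross-validated lambda
    lasso <- cv.glmnet(x, y, alpha = 1)    # lasso
    enet  <- cv.glmnet(x, y, alpha = 0.5)  # elastic net
    coef(lasso, s = "lambda.min")          # coefficients at the best lambda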

Let's look at how XGBoost works and how to implement it in this video.

Building Classifiers and Regressors with XGBoost
10:10
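
As a hedged illustration, a minimal XGBoost classifier in R could look like this; the binary label built from iris is an assumption for the example.

    # Illustrative sketch: binary classification with xgboost
    library(xgboost)

    df <- iris[iris$Species != "setosa", ]
    x  <- as.matrix(df[, 1:4])
    y  <- as.numeric(df$Species == "virginica")            # 0/1 label
    model <- xgboost(data = x, label = y, nrounds = 50,
                     objective = "binary:logistic", verbose = 0)
    head(predict(model, x))                                # predicted probabilities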

Our goal in this video would be to reduce the dimensionality of data with principal components, and understand the concept and how to implement it in R.

Preview 05:04

In this video, we will understand the k-means clustering algorithm and implement it using the principal components.

Clustering with k-means and Principal Components
03:16
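
A small sketch of k-means on the first two principal components is given below; iris and k = 3 are illustrative choices.

    # Illustrative sketch: k-means clustering on principal components
    pca    <- prcomp(iris[, 1:4], scale. = TRUE)
    scores <- pca$x[, 1:2]                                 # first two principal components
    set.seed(42)
    km <- kmeans(scores, centers = 3, nstart = 25)
    table(km$cluster, iris$Species)                        # compare clusters with known species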

In this video, we will analyze the clustering tendency of a dataset and identify the ideal number of clusters or groups.

Determining Optimum Number of Clusters
05:24

The goal of this video is to understand the logic of hierarchical clustering, types, and how to implement it in R.

Understanding and Implementing Hierarchical Clustering
02:36

How to use affinity propagation to cluster data points? How is it different from conventional algorithms?

Clustering with Affinity Propagation
05:24

How to build recommendation engines to recommend products/movies to new and existing users?

Building Recommendation Engines
09:00

The goal of this video is to understand what a time series is, how to create time series of various frequencies, and the enhanced facilities available in the xts package.

Preview 05:41

The goal of this video is to understand the characteristics of a time series: stationarity and how to de-trend and de-seasonalize a time series.

Stationarity, De-Trend, and De-Seasonalize
04:07

In this video, we will introduce the characteristics of time series such as ACF, PACF, and CCF; why they matter; and how to interpret them.

Understanding the Significance of Lags, ACF, PACF, and CCF
03:49

Our goal in this video would be to understand moving average and exponential smoothing and use them to forecast.

Forecasting with Moving Average and Exponential Smoothing
02:25

In this video, we will understand how double exponential smoothing and Holt-Winters forecasting work, when to use them, and how to implement them in R.

Forecasting with Double Exponential and Holt Winters
03:22

Let's look at what ARIMA forecasting is, understand the concepts, and learn how ARIMA modelling works in this video.

Forecasting with ARIMA Modelling
05:26
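
For illustration, ARIMA forecasting with the forecast package can be as short as the sketch below; AirPassengers is a built-in series, not necessarily the one used in the video.

    # Illustrative sketch: automatic ARIMA modelling and forecasting
    library(forecast)

    fit <- auto.arima(AirPassengers)     # automatically select the (p, d, q)(P, D, Q) orders
    fc  <- forecast(fit, h = 12)         # forecast the next 12 months
    plot(fc)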

In this video, we'll take a look at how to scrape data from web pages and how to clean and process raw web and other textual data.

Preview 09:24

Our goal in this video is to know how to process texts using the tm package and understand the significance of TF-IDF and its implementation. Finally, we see how to draw a word cloud in R.

Corpus, TDM, TF-IDF, and Word Cloud
09:06
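
A rough sketch of the corpus, term-document matrix, TF-IDF, and word cloud pipeline with the tm and wordcloud packages follows; the toy documents are made up for the example.

    # Illustrative sketch: corpus, TF-IDF weighted TDM, and word cloud
    library(tm)
    library(wordcloud)

    docs   <- c("R makes machine learning fun",
                "deep learning in R",
                "machine learning with R")
    corpus <- VCorpus(VectorSource(docs))
    corpus <- tm_map(corpus, content_transformer(tolower))
    corpus <- tm_map(corpus, removeWords, stopwords("english"))
    tdm <- TermDocumentMatrix(corpus, control = list(weighting = weightTfIdf))
    m   <- as.matrix(tdm)
    wordcloud(rownames(m), rowSums(m))   # word size proportional to total TF-IDF weight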

Let's see how to use cosine similarity and latent semantic analysis to find and map similar documents.

Cosine Similarity and Latent Semantic Analysis
07:20

In this video, we will see how to extract the underlying topics in a document, the keywords related to each topic and the proportion of topics in each document.

Extracting Topics with Latent Dirichlet Allocation
05:07

Let's check out how to perform sentiment analysis and scoring in R.

Sentiment Scoring with tidytext and Syuzhet
04:23

How to classify texts with machine learning algorithms using the RTextTools package?

Classifying Texts with RTextTools
03:57

The goal of this video is to understand the basic structure of making charts with ggplot, how to customize the aesthetics, and how to manipulate the theme elements.

Preview 07:18
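
As a quick illustration of that layered structure (data, aesthetics, geom, labels, theme), here is a minimal ggplot sketch; mtcars is an illustrative dataset.

    # Illustrative sketch: the layered grammar of ggplot2
    library(ggplot2)

    ggplot(mtcars, aes(x = wt, y = mpg, colour = factor(cyl))) +
      geom_point(size = 2) +
      labs(title = "Fuel efficiency vs. weight", colour = "Cylinders") +
      theme_minimal()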

In this video, we will see how to manipulate the legend the way we want and how to add texts and annotation in ggplot.

Manipulating Legend, Adding Text, and Annotation
03:31

The goal of this video is to understand how to plot multiple plots in the same chart and how to change the layouts of ggplot.

Drawing Multiple Plots with Faceting and Changing Layouts
03:18

Learn how to make various types of plots in ggplot, such as bar charts, time series, boxplots, ribbon charts, and so on.

Creating Bar Charts, Boxplots, Time Series, and Ribbon Plots
05:25

In this video, we will understand what the popular ggplot extensions are, where to find them, and what their applications are.

ggplot2 Extensions and ggplotly
03:11

We will discuss the best practices that should be followed to minimize code runtime in this video.

Preview 05:46

Let's tackle the implementation of parallel computing in R.

Implementing Parallel Computing with doParallel and foreach
04:22
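
A minimal parallel-computing sketch with doParallel and foreach is shown below; the bootstrap loop is an illustrative workload, not taken from the video.

    # Illustrative sketch: parallel loop with doParallel and foreach
    library(doParallel)
    library(foreach)

    cl <- makeCluster(2)                 # start a 2-worker cluster
    registerDoParallel(cl)

    means <- foreach(i = 1:100, .combine = c) %dopar% {
      mean(sample(mtcars$mpg, replace = TRUE))   # bootstrap mean of mpg
    }
    stopCluster(cl)
    summary(means)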

The goal of this video is to understand how to work with dplyr and pipes.

Writing Readable and Fast R Code with Pipes and dplyr
05:39

In this video, we will discuss how to manipulate data with the data.table package, how to achieve maximum speed, and what the various features of data.table are.

Writing Super Fast R Code with Minimal Keystrokes Using Data.Table
06:38
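
A short data.table sketch of grouped aggregation and keyed filtering is given below; mtcars is used purely for illustration.

    # Illustrative sketch: fast grouped aggregation and keyed lookup with data.table
    library(data.table)

    dt <- as.data.table(mtcars)
    dt[, .(avg_mpg = mean(mpg), n = .N), by = cyl]   # mean mpg and count per cylinder group
    setkey(dt, gear)
    dt[.(4)]                                         # fast keyed lookup of 4-gear cars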

Our main focus in this video is to understand how to write C++ code and make it work in R: we will leverage the speed of C++ in R, interface Rcpp with R, and write Rcpp code.

Interface C++ in R with Rcpp
11:09

We'll take a look at the components of an R package in this video.

Preview 05:02

In this video, we will look at how to create an R Package so that it can be submitted to CRAN.

Build, Document, and Host an R Package on GitHub
07:09

We will understand the mandatory checks and common problems faced by developers when creating R packages in this video.

Performing Important Checks Before Submitting to CRAN
04:05

The goal of this video is to show how to submit an R package to CRAN.

Submitting an R Package to CRAN
03:10
R Machine Learning solutions
124 Lectures 08:19:36

This will give you brief information about the course.

Preview 04:38

R must be first installed on your system to work on it.

Downloading and Installing R
06:10

RStudio makes the process of development with R easier.

Downloading and Installing RStudio
03:10

R packages are an essential part of R, as they are required in all our programs. Let's learn how to install and load them.

Installing and Loading Packages
05:46

To work with data, you must know how to read it into R and write it back out. You will learn that here.

Reading and Writing Data
05:54

Data manipulation is time consuming and hence needs to be done with the help of built-in R functions.

Using R to Manipulate Data
05:46

R is widely used for statistical applications. Hence it is necessary to learn about the built-in functions of R.

Applying Basic Statistics
04:47

To communicate information effectively and make data easier to comprehend, we need graphical representation. You will learn to plot figures in this section.

Visualizing Data
03:33

Because of some limitations, it is a good practice to get data from external repositories. You will be able to do just that after this video.

Getting a Dataset for Machine Learning
02:38

Reading a dataset is the first and foremost step in data exploration. We need to learn how to do that.

Preview 08:36

In R, since nominal, ordinal, interval, and ratio variables are treated differently in statistical modeling, we have to convert a nominal variable from a character into a factor.

Converting Types on Character Variables
03:05

Missing values affect the inference of a dataset. Thus it is important to detect them.

Detecting Missing Values
03:18

After detecting missing values, we need to impute them as their absence may affect the conclusion.

Imputing Missing Values
04:30

After imputing the missing values, you should perform an exploratory analysis to summarize the data characteristics.

Exploring and Visualizing Data
04:24

The exploratory analysis helps users gain insights into how single or multiple variables may affect the survival rate. However, it does not determine what combinations may generate a prediction model. We need to use a decision tree for that.

Predicting Passenger Survival with a Decision Tree
03:58

After constructing the prediction model, it is important to validate how the model performs while predicting the labels.

Validating the Power of Prediction with a Confusion Matrix
02:08

Another way of measuring performance is the ROC curve.

Assessing performance with the ROC curve
02:32

When there are huge datasets, we can find the characteristics of the entire dataset with a part or sample of the data. Hence data sampling is essential.

Preview 03:30

Probability distributions and statistics are interdependent; to justify statistical conclusions, we need probability.

Operating a Probability Distribution in R
05:41

Univariate statistics deals with a single variable and hence is very simple.

Working with Univariate Descriptive Statistics in R
05:09

To analyze the relation among more than two variables, multivariate analysis is done.

Performing Correlations and Multivariate Analysis
03:01

Assessing the relation between dependent and independent variables is carried out through linear regression.

Operating Linear Regression and Multivariate Analysis
03:25

To validate that the experiment results are significant, hypothesis testing is done.

Conducting an Exact Binomial Test
03:48

To compare means of two different groups, one- and two-sample t-tests are conducted.

Performing Student's t-test
03:13
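
For example, one- and two-sample t-tests in base R look like this; mtcars is an illustrative dataset.

    # Illustrative sketch: one-sample and two-sample t-tests in base R
    t.test(mtcars$mpg, mu = 20)          # one-sample: is mean mpg different from 20?
    t.test(mpg ~ am, data = mtcars)      # two-sample: automatic vs. manual transmission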

Comparing a sample with a reference probability distribution, or comparing the cumulative distributions of two datasets, calls for a Kolmogorov-Smirnov test.

Performing the Kolmogorov-Smirnov Test
04:43

The Wilcoxon test is a non-parametric test of a null hypothesis.

Understanding the Wilcoxon Rank Sum and Signed Rank Test
02:04

To check the distribution of categorical variables of two groups, Pearson's chi-squared test is used.

Working with Pearson's Chi-Squared Test
05:08

To examine the relationship between categorical independent variables and a continuous dependent variable, ANOVA is used. When there is a single categorical variable, one-way ANOVA is used.

Conducting a One-Way ANOVA
04:15

When there are two categorical factors to be compared, two-way ANOVA is used.

Performing a Two-Way ANOVA
04:02

Linear regression is the simplest model in regression and can be used when there is one predictor value.

Preview 04:53

To obtain summarized information of a fitted model, we need to learn how to summarize linear model fits.

Summarizing Linear Model Fits
05:20

It would be really convenient for us if we could predict unknown values. You can do that using linear regression.

Using Linear Regression to Predict Unknown Values
02:51

To check if the fitted model adequately represents the data, we perform diagnostics.

Generating a Diagnostic Plot of a Fitted Model
03:57

In the case of a non-linear relationship between predictor and response variables, a polynomial regression model is formed. We need to fit the model. This video will enable you to do that.

Fitting a Polynomial Regression Model with lm
02:16
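
A minimal polynomial-regression sketch with lm and poly is shown below; the built-in cars dataset is an illustrative choice.

    # Illustrative sketch: quadratic fit of stopping distance on speed
    fit <- lm(dist ~ poly(speed, 2), data = cars)
    summary(fit)
    plot(cars$speed, cars$dist)
    lines(cars$speed, predict(fit), col = "blue")   # fitted curve over the data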

An outlier will cause diversion from the slope of the regression line. In order to avoid that, we need to fit a robust linear regression model.

Fitting a Robust Linear Regression Model with rlm
02:15

We will perform linear regression on a real-life example, the SLID dataset.

Studying a case of linear regression on SLID data
06:38

GLM generalizes linear regression by allowing the linear model to be related to the response variable via a link function and by allowing the magnitude of the variance of each measurement to be a function of its predicted value.

Applying the Gaussian Model for Generalized Linear Regression
02:11

GLM allows response variables with error distribution other than a normal distribution. We apply the Poisson model to see how that is done.

Applying the Poisson model for Generalized Linear Regression
01:33

When a variable is binary, we apply the binomial model.

Applying the Binomial Model for Generalized Linear Regression
02:02

GAM has the ability to deal with non-linear relationships between dependent and independent variables. We learn to fit a regression using GAM.

Fitting a Generalized Additive Model to Data
03:13

Visualizing a GAM helps us understand it better.

Visualizing a Generalized Additive Model
01:26

You can also diagnose a GAM model to analyze it.

Diagnosing a Generalized Additive Model
03:38

Training and testing datasets are both essential for building a classification model.

Preview 03:44

A partitioning tree works on the basis of split conditions, starting from the root node down to the terminal nodes.

Building a Classification Model with Recursive Partitioning Trees
06:10

Plotting the classification tree will make analyzing the data easier. You will learn to do this now.

Visualizing a Recursive Partitioning Tree
03:03

Before making a prediction, it is essential to compute the prediction performance of the model.

Measuring the Prediction Performance of a Recursive Partitioning Tree
02:48

There can be parts in a dataset which are not essential for classification. In order to remove these parts, we have to prune the dataset.

Pruning a Recursive Partitioning Tree
02:37

Conditional inference trees are better than traditional classification trees because they adapt the test procedures for selecting the output.

Building a Classification Model with a Conditional Inference Tree
01:56

Visualizing a conditional inference tree will make it easier to extract and analyze data from the dataset.

Visualizing a Conditional Inference Tree
02:38

Like the prediction performance of a traditional classification tree, we can also evaluate the performance of a conditional inference tree.

Measuring the Prediction Performance of a Conditional Inference Tree
02:10

The k-nearest neighbor classifier is a non-parametric, lazy learning method, so it combines the advantages of both types of methods.

Classifying Data with the K-Nearest Neighbor Classifier
05:31

Classification in logistic regression is done based on one or more features. It is more robust and doesn't have as many conditions as the traditional classification model.

Classifying Data with Logistic Regression
04:37

The Naïve Bayes classifier is based on applying Bayes' theorem with a strong independence assumption.

Classifying data with the Naïve Bayes Classifier
06:16

Support vector machines are better at classification because they can capture complex relations between data points and provide both linear and non-linear classification.

Preview 05:57

To control our training errors and margins, we use the cost function. The SVM classifier is affected by the cost.

Choosing the Cost of an SVM
02:56

To visualize the SVM fit, we can use the plot function.

Visualizing an SVM Fit
03:33

We can use the trained SVM model to predict labels for new data.

Predicting Labels Based on a Model Trained by an SVM
03:48

According to the desired output, you may need to generate different combinations of gamma and cost to train different SVMs. This is called tuning.

Tuning an SVM
02:47
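
For illustration, a grid search over gamma and cost with e1071's tune.svm might look like the following; the grid values and the iris dataset are assumptions for the example.

    # Illustrative sketch: tuning gamma and cost for an SVM
    library(e1071)

    tuned <- tune.svm(Species ~ ., data = iris,
                      gamma = 10^(-3:0), cost = 10^(0:2))  # grid over gamma and cost
    summary(tuned)
    tuned$best.model                                       # SVM refit with the best parameters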

A neural network is used in classification, clustering and prediction. Its efficiency depends on how well you train it. Let's learn to do that.

Training a Neural Network with neuralnet
04:07
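
A minimal neuralnet training sketch is given below; the toy two-input problem is made up for illustration.

    # Illustrative sketch: training and plotting a small network with neuralnet
    library(neuralnet)

    set.seed(1)
    df   <- data.frame(x1 = runif(200), x2 = runif(200))
    df$y <- as.numeric(df$x1 + df$x2 > 1)                  # simple separable target
    net  <- neuralnet(y ~ x1 + x2, data = df,
                      hidden = c(3), linear.output = FALSE)
    plot(net)                                              # visualize weights and structure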

We can visualize the neural network trained by neuralnet to inspect its structure and connection weights.

Visualizing a Neural Network Trained by neuralnet
02:21

Similar to other classification models, we can predict labels using neural networks and also validate performance using a confusion matrix.

Predicting Labels based on a Model Trained by neuralnet
03:07

The nnet package provides the functionality to train feed-forward neural networks with backpropagation.

Training a Neural Network with nnet
02:45

As we have already trained the neural network using nnet, we can use the model to predict labels.

Predicting labels based on a model trained by nnet
02:49

The k-fold cross-validation technique is a common technique used to estimate the performance of a classifier as it overcomes the problem of over-fitting. In this video we will illustrate how to perform a k-fold cross-validation.

Preview 03:42

In this video, we will illustrate how to use tune.svm to perform 10-fold cross-validation and obtain the optimum classification model.

Performing Cross Validation with the e1071 Package
03:22

In this video we will demonstrate how to perform k-fold cross validation using the caret package.

Performing Cross Validation with the caret Package
02:59

This video will show you how to rank the variable importance with the caret package.

Ranking the Variable Importance with the caret Package
02:21

In this video, we will illustrate how to use rminer to obtain the variable importance of a fitted model.

Ranking the Variable Importance with the rminer Package
02:30

In this video we will show how to find highly correlated features using the caret package.

Finding Highly Correlated Features with the caret Package
02:13

In this video, we will demonstrate how to use the caret package to perform feature selection.

Selecting Features Using the Caret Package
04:58

To measure the performance of a regression model, we can calculate the distance between the predicted output and the actual output as a quantifier of the performance of the model. In this video we will illustrate how to compute these measurements from a built regression model.

Measuring the Performance of the Regression Model
03:57

In this video we will demonstrate how to retrieve a confusion matrix using the caret package.

Measuring Prediction Performance with a Confusion Matrix
02:07

In this video, we will demonstrate how to illustrate an ROC curve and calculate the AUC to measure the performance of a classification model.

Measuring Prediction Performance Using ROCR
02:46
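
A short sketch of an ROC curve and AUC with ROCR follows; the logistic model on mtcars is an illustrative stand-in for any classifier that outputs probabilities.

    # Illustrative sketch: ROC curve and AUC with ROCR
    library(ROCR)

    probs <- predict(glm(am ~ mpg + wt, data = mtcars, family = binomial),
                     type = "response")
    pred  <- prediction(probs, mtcars$am)
    perf  <- performance(pred, "tpr", "fpr")
    plot(perf)                                             # ROC curve
    performance(pred, "auc")@y.values[[1]]                 # area under the curve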

In this video we will use the function provided by the caret package to compare different algorithm-trained models on the same dataset.

Comparing an ROC Curve Using the Caret Package
03:43

In this video we will see how to measure performance differences between fitted models with the caret package.

Measuring Performance Differences between Models with the caret Package
03:41

The adabag package implements both boosting and bagging methods. For the bagging method, the package first generates multiple versions of classifiers, and then obtains an aggregated classifier. Let's learn the bagging method from adabag to generate a classification model.

Preview 07:53

To assess the prediction power of a classifier, you can run a cross validation method to test the robustness of the classification model. This video will show how to use bagging.cv to perform cross validation with the bagging method.

Performing Cross Validation with the Bagging Method
01:56

Boosting starts with a simple or weak classifier and gradually improves it by reweighting the misclassified samples. Thus, the new classifier can learn from previous classifiers. One can use the boosting method to perform ensemble learning. Let's see how to use the boosting method to classify the telecom churn dataset.

Classifying Data with the Boosting Method
06:04

Similar to the bagging function, adabag provides a cross validation function for the boosting method, named boosting.cv. In this video, we will learn how to perform cross-validation using boosting.cv.

Performing Cross Validation with the Boosting Method
02:06

Gradient boosting creates a new base learner that maximally correlates with the negative gradient of the loss function. One may apply this method on either regression or classification problems. But first, we need to learn how to use gbm.

Classifying Data with Gradient Boosting
07:09

A margin is a measure of certainty of a classification. It calculates the difference between the support of a correct class and the maximum support of an incorrect class. This video will show us how to calculate the margins of the generated classifiers.

Calculating the Margins of a Classifier
05:30

The adabag package provides the errorevol function for a user to estimate the ensemble method errors in accordance with the number of iterations. Let's explore how to use errorevol to show the evolution of errors of each ensemble classifier.

Calculating the Error Evolution of the Ensemble Method
02:18

Random forest grows multiple decision trees which will output their own prediction results. The forest will use the voting mechanism to select the most voted class as the prediction result. In this video, we illustrate how to classify data using the randomForest package.

Classifying Data with Random Forest
07:01

At the beginning of this section, we discussed why we use ensemble learning and how it can improve the prediction performance. Let's now validate whether the ensemble model performs better than a single decision tree by comparing the performance of each method.

Estimating the Prediction Errors of Different Classifiers
04:35

Hierarchical clustering adopts either an agglomerative or a divisive method to build a hierarchy of clusters. This video shows us how to cluster data with the help of hierarchical clustering.

Preview 08:40

In this video we demonstrate how to use the cutree function to separate the data into a given number of clusters.

Cutting Trees into Clusters
03:30

In this video, we will demonstrate how to perform k-means clustering on the customer dataset.

Clustering Data with the k-Means Method
04:10

We will now illustrate how to create a bivariate cluster plot.

Drawing a Bivariate Cluster Plot
03:32

In this video we will see how to compare different clustering methods using cluster.stats from the fpc package.

Comparing Clustering Methods
04:15

In this video we will see how to compute silhouette information.

Extracting Silhouette Information from Clustering
02:40

In this video we will discuss how to find the optimum number of clusters for the k-means clustering method.

Obtaining the Optimum Number of Clusters for k-Means
02:48
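
One common approach is the elbow method; a rough sketch is shown below, with iris as an illustrative dataset.

    # Illustrative sketch: elbow method for choosing k in k-means
    set.seed(42)
    wss <- sapply(1:10, function(k)
      kmeans(iris[, 1:4], centers = k, nstart = 25)$tot.withinss)
    plot(1:10, wss, type = "b",
         xlab = "Number of clusters k",
         ylab = "Total within-cluster sum of squares")     # look for the 'elbow'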

In this video, we will demonstrate how to use DBSCAN to perform density-based clustering.

Clustering Data with the Density-Based Method
06:42

In this video, we will demonstrate how to use the model-based method to determine the most likely number of clusters.

Clustering Data with the Model-Based Method
04:37

A dissimilarity matrix can be used as a measurement for the quality of a cluster. In this video, we will discuss some techniques that are useful to visualize a dissimilarity matrix.

Visualizing a Dissimilarity Matrix
03:23

In this video, we will demonstrate how clustering methods differ with regard to data with known clusters.

Validating Clusters Externally
04:12

Before starting with association rule mining, you need to transform the data into transactions. This video will show how to transform any of a list, matrix, or data frame into transactions.

Preview 03:35

The arules package uses its own transactions class to store transaction data. As such, we must use the generic functions provided by arules to display transactions and association rules. Let's see how to display transactions and association rules via various functions in the arules package.

Displaying Transactions and Associations
02:14

Association mining is a technique that can discover interesting relationships hidden in transaction datasets. This approach first finds all frequent itemsets and then generates strong association rules from frequent itemsets. In this video, we see how to perform association analysis using the apriori rule.

Mining Associations with the Apriori Rule
07:24
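
A minimal association-rule sketch with arules is given below; the Groceries dataset ships with the package, and the support and confidence thresholds are illustrative.

    # Illustrative sketch: mining association rules with apriori
    library(arules)

    data(Groceries)
    rules <- apriori(Groceries,
                     parameter = list(supp = 0.01, conf = 0.5))  # minimum support and confidence
    inspect(head(sort(rules, by = "lift"), 5))                   # top 5 rules by lift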

This video will show you how to find and prune redundant association rules.

Pruning Redundant Rules
02:26

Besides listing rules as text, you can visualize association rules, making it easier to find the relationship between itemsets. In this video, we will learn how to use the arulesViz package to visualize the association rules.

Visualizing Association Rules
05:06

The Apriori algorithm performs a breadth-first search to scan the database, so support counting becomes time consuming. Alternatively, if the database fits into memory, you can use the Eclat algorithm, which performs a depth-first search to count the supports. Let's see how to use the Eclat algorithm.

Mining Frequent Itemsets with Eclat
03:36

In addition to mining interesting associations within the transaction database, we can mine interesting sequential patterns using transactions with temporal information. This video demonstrates how to create transactions with temporal information.

Creating Transactions with Temporal Information
02:41

In contrast to association mining, we should explore patterns shared among transactions where a set of itemsets occurs sequentially. One of the most famous frequent sequential pattern mining algorithms is the Sequential Pattern Discovery using Equivalence classes (SPADE) algorithm. Let's see how to use SPADE to mine frequent sequential patterns.

Mining Frequent Sequential Patterns with cSPADE
04:15

This video will give you an introduction on how to perform feature selection with the FSelector package.

Preview 07:37

Principal component analysis (PCA) is the most widely used linear method in dealing with dimension reduction problems. This video will show you how to use it.

Performing Dimension Reduction with PCA
07:18

This video demonstrates how to determine the number of principal components using a scree plot. Let's have a look at it.

Determining the Number of Principal Components Using the Scree Test
03:34

This video will show you how to determine the number of principal components using the Kaiser method.

Determining the Number of Principal Components Using the Kaiser Method
02:05

Let's see how to use biplot to plot both the variables and the data.

Visualizing Multivariate Data Using biplot
03:16

In MDS, you can either use a metric or a nonmetric solution. This video illustrates how to perform MDS on the swiss dataset.

Performing Dimension Reduction with MDS
05:37

You may often need to reduce the dimensions of matrices while working on datasets. Let us see how we can do this with SVD.

Reducing Dimensions with SVD
03:18

Let's see how to perform SVD on the classic image processing material, Lenna.

Compressing Images with SVD
03:05

This video will show you how to perform a nonlinear dimension reduction with ISOMAP. This is one of the approaches to manifold learning and generalizes linear frameworks to nonlinear data structures.

Performing Nonlinear Dimension Reduction with ISOMAP
04:34

This video will give you a short introduction to using LLE on S-curve data.

Performing Nonlinear Dimension Reduction with Local Linear Embedding
04:54

In order to prepare the RHadoop environment, we need to download the Cloudera QuickStart VM.

Preview 05:35

Installing R and the rmr2 package is essential for performing MapReduce, which is used for data processing and analysis.

Installing rmr2
03:52

In order to access HDFS resources you need to install rhdfs on every task node.

Installing rhdfs
04:15

You can easily operate HDFS from the R console with the help of rhdfs.

Operating HDFS with rhdfs
05:47

You will understand how rmr2 is used for word count in this video.

Implementing a Word Count Problem with RHadoop
05:26

Comparing a Hadoop MapReduce program with a standard R program can help us decide which approach is best suited to our needs.

Comparing the Performance between an R MapReduce Program & a Standard R Program
05:03

Since running a MapReduce program will require a considerable amount of time, testing and debugging become very important.

Testing and Debugging the rmr2 Program
03:48

Plyrmr makes data manipulation operations easy.

Installing plyrmr
03:12

Writing a MapReduce program can be difficult for non-developers. Hence plyr-like operations can be used to manipulate data.

Manipulating Data with plyrmr
03:52

You can perform machine learning operations with RHadoop.

Conducting Machine Learning with RHadoop
04:38

Multinode clusters can be deployed with the help of Amazon EMR on RHadoop.

Configuring RHadoop Clusters on Amazon EMR
05:28
Deep Learning with R
35 Lectures 04:04:29

This video provides an overview of the entire course.

Preview 05:22

The main objective is to understand the fundamental concepts and key features that make deep learning so special and different from the classical Machine Learning approach.

Fundamental Concepts in Deep Learning
07:42

The goal of this video is to learn more about Artificial Neural Networks and their vast world of variations, explore the basic architectures of ANNs in detail and talk about their possible implementations in R.

Introduction to Artificial Neural Networks
07:57

Applying what you have learned about the Multilayer Perceptron algorithm to a real-world application, which classifies handwritten digits in images.

Classification with Two-Layer Artificial Neural Networks
08:02

To get probabilistic predictions using Artificial Neural Networks and specifically in the context of a multi-class classification problem.

Probabilistic Predictions with Two-Layer ANNs
06:33

To add multiple hidden layers to the basic Multilayer Perceptron algorithm in order to build more complex models of the world and increase the accuracy of our predictions.

Preview 04:31

The goal of this video is to learn the best practices for tuning the hyper-parameters of an ANN and being able to generalize well on data we have never seen before. This is the last essential skill to acquire in order to get the best out of our ANN solution.

Tuning ANNs Hyper-Parameters and Best Practices
06:12

The goal of this video is to learn more about Multi-hidden-layer Neural Networks and how to use them in order to solve the practical problem of classifying handwritten digits within the R language.

Neural Network Architectures
04:57

The goal of this video is to apply what we have learned about Multi-hidden-layer ANNs to a new real-world problem and get more confidence in the use of the H2O package.

Neural Network Architectures Continued
08:02
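
As a hedged illustration of a multi-hidden-layer network with the H2O package, a sketch could look like the following; iris and the layer sizes are assumptions for the example, not the video's own setup.

    # Illustrative sketch: a two-hidden-layer network with h2o
    library(h2o)

    h2o.init(nthreads = -1)
    train <- as.h2o(iris)
    model <- h2o.deeplearning(x = 1:4, y = "Species", training_frame = train,
                              hidden = c(32, 32),          # two hidden layers
                              epochs = 50)
    h2o.confusionMatrix(model)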

The main objective is to understand the optimization process that lies behind, and is common to, every Deep Learning model, with a more formal definition than the one previously introduced.

Preview 05:35

To explore in more detail the most common algorithm used to minimize the loss function: Stochastic Gradient Descent.

Optimization Algorithms and Stochastic Gradient Descent
08:11

The goal of this video is to understand how to actually learn the weights of our Deep Learning model using Stochastic Gradient Descent through Backpropagation, the standard way of computing the gradient for Artificial Neural Networks.

Backpropagation
06:44

To learn how to tune the hyper-parameters automatically in order to minimize the error on the validation set.

Hyper-Parameters Optimization
07:17

This first video is an introduction to the fundamental concepts behind Convolutional Neural Networks. The main objective is to motivate their use by highlighting the differences from classical feed-forward neural networks.

Preview 09:57

The goal of this video is to learn more about Convolutional Neural Networks, concluding our discussion of the layer-wise structure of a CNN, and to understand how to design an architecture suitable for your specific problem.

Introduction to Convolutional Neural Networks Continued
10:35

The aim of this video is to understand how to actually implement CNNs in R and use them to solve real-world problems.

CNNs in R
10:41

The goal of this video is to learn about the concept of transfer learning and how we can use and exchange pre-trained DL models to solve new tasks with very little computational overhead.

Classifying Real-World Images with Pre-Trained Models
08:29

The aim of this video is to introduce the fundamental concepts behind Recurrent Neural Networks. The main objective is to underline their main differences from classical feed-forward neural networks and CNNs.

Preview 11:57

The aim of this video is to learn more about a specific type of Recurrent Neural Networks, called Long Short-Term Memories, a natural extension of classical RNNs for dealing with long-term dependencies.

Introduction to Long Short-Term Memory
08:07

The aim of this video is to understand how to actually implement RNNs in R and use them to solve real-world problems.

RNNs in R
08:55

The aim of this video is to learn how to train and use an LSTM to solve a complex problem like predicting the next character in a sentence given the occurrences of the previous characters.

Use-Case – Learning How to Spell English Words from Scratch
06:35

The aim of this video is to understand the main differences between unsupervised and classical supervised learning, and how the two can be combined.

Preview 06:44

In this video, you will learn more about a specific unsupervised learning algorithm called the autoencoder. This type of artificial neural network is a simple and effective solution for learning efficient representations of data without any supervision.

Autoencoders
04:56

The aim of this video is to learn about two very important unsupervised Deep Learning algorithms for learning feature hierarchies: Restricted Boltzmann Machines and Deep Belief Networks.

Restricted Boltzmann Machines and Deep Belief Networks
07:44

The aim of this video is to get a quick picture of the main approaches for solving reinforcement learning tasks with Deep Learning.

Reinforcement Learning with ANNs
07:22

The aim of this video is to learn how to train and use an autoencoder in R with the H2O package to solve a real-world anomaly detection task.

Use-Case – Anomaly Detection through Denoising Autoencoders
06:52

The aim of this video is to spark new inspiration for creatively applying Deep Learning techniques to real-world problems in Computer Vision.

Preview 07:19

The aim of this video is to creatively apply Deep Learning techniques to real-world problems in Natural Language Processing (NLP).

Deep Learning for Natural Language Processing
06:04

In this video, we'll creatively apply Deep Learning techniques to real-world problems in audio signal processing (ASP).

Deep Learning for Audio Signal Processing
05:01

The aim of this video is to introduce some of the most successful applications of Deep Learning for complex multimodal tasks.

Deep Learning for Complex Multimodal Tasks
04:32

In this video, let's take a look at some of the most successful applications in other fields we didn't mention before.

Other Important Applications of Deep Learning
05:24

The aim of this first video is to learn how to deal with models which do not behave as they should.

Preview 05:56

In this video, you will learn how to speed up the training and deployment of complex DL models.

GPU and MGPU Computing for Deep Learning
04:57

The aim of this video is to present a complete overview of every available R package for Deep Learning and Neural Networks.

A Complete Comparison of Every DL Package in R
04:40

In this video, you will learn about the most interesting research directions and open questions for the long-term development of Deep Learning toward truly intelligent agents.

Research Directions and Open Questions
04:37
About the Instructor
Packt Publishing
3.9 Average rating
7,297 Reviews
52,180 Students
616 Courses
Tech Knowledge in Motion

Packt has been committed to developer learning since 2004. A lot has changed in software since then - but Packt has remained responsive to these changes, continuing to look forward at the trends and tools defining the way we work and live. And how to put them to work.

With an extensive library of content - more than 4000 books and video courses - Packt's mission is to help developers stay relevant in a rapidly changing world. From new web frameworks and programming languages to cutting-edge data analytics and DevOps, Packt takes software professionals in every field to what's important to them now.

From skills that will help you to develop and future-proof your career to immediate solutions to everyday tech challenges, Packt is a go-to resource for making you a better, smarter developer.

Packt Udemy courses continue this tradition, bringing you comprehensive yet concise video courses straight from the experts.