Introduction to Artificial Neural Network and Deep Learning

The Best Machine Learning Techniques for Data Science in Java and Neuroph with Application in Image Recognition
4.3 (320 ratings)
1,367 students enrolled
Last updated 4/2020
English
English [Auto]
Current price: $139.99 Original price: $199.99 Discount: 30% off
30-Day Money-Back Guarantee
This course includes
  • 7 hours on-demand video
  • 1 article
  • 29 downloadable resources
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What you'll learn
  • The structure of Neural Networks
  • The learning process of Neural Networks
  • Visualization in Neural Networks
  • Deep learning and deep Neural Networks
  • How to do classification using Neural Networks
  • How to do regression and prediction using Neural Networks
  • Implementing Neural Networks in Java
  • Using Neuroph to design, test, and analyze Neural Networks
Course content
36 lectures • 07:02:48 total length
+ Preliminaries and Essential Definitions in Artificial Neural Networks
3 lectures 24:05

Let's start with a quick and intuitive analogy to see what the purpose of a neuron is.

Preview 11:24

This lesson shows the mathematical model of an artificial neuron.

Preview 05:38

This lesson shows how we turn the mathematical equations from the last video into a model of a neuron.

A model of an artificial neuron
07:03
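The neuron model described in these lectures can be sketched in a few lines of Java (the course's implementation language). This is a hedged illustration under standard definitions; the class and method names are mine, not the course's:

```java
// A minimal sketch of the neuron model described above: a weighted sum
// of the inputs plus a bias, passed through a simple activation.
// All names are illustrative and not taken from the course code.
class Neuron {
    private final double[] weights;
    private final double bias;

    Neuron(double[] weights, double bias) {
        this.weights = weights;
        this.bias = bias;
    }

    // Net input: z = w1*x1 + w2*x2 + ... + wn*xn + b
    double netInput(double[] inputs) {
        double z = bias;
        for (int i = 0; i < weights.length; i++) {
            z += weights[i] * inputs[i];
        }
        return z;
    }

    // Step activation: the neuron "fires" (1) when the net input is non-negative.
    int activate(double[] inputs) {
        return netInput(inputs) >= 0 ? 1 : 0;
    }
}
```

For example, with weights (1, 1) and bias -1.5, this neuron implements a logical AND of two binary inputs.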
+ An Artificial Neuron (Perceptron)
4 lectures 39:12

This lecture shows an artificial neuron in action. I have written a program that lets you interactively change the weights and bias of a neuron to see how they change the shape of the output line.

An artificial neuron in action: live example
08:07

This lecture introduces the terminologies in the areas of Machine Learning and Neural Networks.

Terminologies in the field of Machine Learning and Neural Networks
11:13

A perceptron is a neuron with a special transfer (activation) function. This lecture shows how to mathematically model a perceptron with more than two inputs.


A mathematical model of perceptron for problems with more than two features
09:40

Now you know how a single neuron and a perceptron work with two or more inputs. It is time to learn about their inspiration.

 

Time to learn the inspiration of perceptron and Neural Networks
10:12
+ Learning: How to train a Perceptron
8 lectures 02:23:30

This lecture covers the concepts of training and learning in Neural Networks. You will learn that the problem of training/learning in Neural Networks boils down to minimizing a cost function.

Learning and training in Neural Networks: minimizing a cost (error) function
18:30

In the last lecture, we realized that we have to minimize the cost function in Neural Networks to classify a data set. There are different cost functions in the field of Neural Networks, and we will learn about the most popular ones in this video.

Different cost/error functions
09:42
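As an illustration, here are Java sketches of two widely used cost functions, mean squared error and binary cross-entropy. The formulas are standard; the selection and names are mine, not necessarily the ones the lecture covers:

```java
// Hedged sketches of two common cost (error) functions. The formulas are
// the standard definitions; names and selection are illustrative.
class CostFunctions {
    // Mean squared error: average of (output - target)^2 over all samples.
    static double meanSquaredError(double[] outputs, double[] targets) {
        double sum = 0.0;
        for (int i = 0; i < outputs.length; i++) {
            double diff = outputs[i] - targets[i];
            sum += diff * diff;
        }
        return sum / outputs.length;
    }

    // Binary cross-entropy: penalizes confident wrong predictions heavily.
    // Assumes outputs are strictly between 0 and 1 (e.g. sigmoid outputs).
    static double crossEntropy(double[] outputs, double[] targets) {
        double sum = 0.0;
        for (int i = 0; i < outputs.length; i++) {
            sum += -targets[i] * Math.log(outputs[i])
                   - (1.0 - targets[i]) * Math.log(1.0 - outputs[i]);
        }
        return sum / outputs.length;
    }
}
```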

This video shows you the process of minimizing a cost function using the Gradient Descent algorithm.

How to find the minimum of a cost function? Yes, this is learning!
22:11

To better understand the Gradient Descent algorithm, this video takes you through a numerical example. In this lecture, we focus on finding optimal values for the connection weights.

Gradient Descent algorithm: Numerical example for optimizing weights
15:39
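The idea behind these lectures can be sketched with a tiny Java example, assuming a squared-error cost E(w) = (w·x − t)² for a single training pair (x, t). The numbers and names are illustrative, not the course's numerical example:

```java
// A sketch of gradient descent on a single weight, assuming the cost
// E(w) = (w*x - t)^2 for one training pair (x, t).
class GradientDescentDemo {
    // Derivative of the cost with respect to w: dE/dw = 2 * x * (w*x - t)
    static double gradient(double w, double x, double t) {
        return 2.0 * x * (w * x - t);
    }

    // Repeatedly step opposite the gradient: w <- w - learningRate * dE/dw
    static double optimize(double w, double x, double t,
                           double learningRate, int steps) {
        for (int i = 0; i < steps; i++) {
            w -= learningRate * gradient(w, x, t);
        }
        return w;
    }
}
```

With x = 2 and t = 4, starting from w = 0, the weight converges to w = 2, where the cost is zero.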

This lecture shows how to find the optimal values for the biases in Neural Networks using the Gradient Descent algorithm.

Gradient Descent algorithm: Numerical example for optimizing biases
13:54

The learning rate has a significant impact on the performance of the Gradient Descent algorithm. This video shows its impact and gives some recommendations for choosing a good value.

The impact of the learning rate
08:01

There are several challenges when training Neural Networks. This video discusses the most important ones to be considered when solving real-world problems.

Challenges in training/learning Neural Networks
09:35

This lecture takes you through the steps of implementing a Perceptron in Java.

Coding a simple Perceptron in Java
45:58
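As a taste of what this lecture builds, here is a minimal, self-contained sketch assuming the classic perceptron learning rule w ← w + η·(target − output)·x. It is an illustration under that standard rule, not the course's implementation:

```java
// A minimal sketch of perceptron training using the classic perceptron
// learning rule. Names are illustrative, not the course's code.
class SimplePerceptron {
    double[] weights;
    double bias;
    double learningRate;

    SimplePerceptron(int numInputs, double learningRate) {
        this.weights = new double[numInputs];
        this.bias = 0.0;
        this.learningRate = learningRate;
    }

    // Step activation over the weighted sum of inputs plus bias.
    int predict(double[] inputs) {
        double z = bias;
        for (int i = 0; i < weights.length; i++) {
            z += weights[i] * inputs[i];
        }
        return z >= 0 ? 1 : 0;
    }

    // Nudge the weights toward the target whenever a prediction is wrong.
    void train(double[][] inputs, int[] targets, int epochs) {
        for (int e = 0; e < epochs; e++) {
            for (int i = 0; i < inputs.length; i++) {
                int error = targets[i] - predict(inputs[i]);
                for (int j = 0; j < weights.length; j++) {
                    weights[j] += learningRate * error * inputs[i][j];
                }
                bias += learningRate * error;
            }
        }
    }
}
```

Trained on the four input/output pairs of a logical AND (which is linearly separable), this perceptron converges in a handful of epochs.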
+ A Perceptron Network, Deep Neural Networks, and deep learning
5 lectures 01:11:39

We have been using one transfer function so far for our models. The step transfer function is good for binary classification problems. For other types of problems, we might need different transfer functions. This lecture introduces a wide range of transfer functions.

Different activation functions
12:58
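For reference, here are the standard definitions of a few common transfer (activation) functions in Java; which ones the lecture covers is up to the course, so treat this as a hedged cheat sheet:

```java
// Standard definitions of common activation (transfer) functions.
class Activations {
    // Squashes any input into (0, 1); common for binary outputs.
    static double sigmoid(double z) { return 1.0 / (1.0 + Math.exp(-z)); }
    // Squashes any input into (-1, 1).
    static double tanh(double z)    { return Math.tanh(z); }
    // Passes positive inputs through, zeroes out negative ones.
    static double relu(double z)    { return Math.max(0.0, z); }
    // The step function used for binary classification with a perceptron.
    static int step(double z)       { return z >= 0 ? 1 : 0; }
}
```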

After mastering the Perceptron and the process of training it, it is time to see how to make a network of neurons. Yes, this is what we call a Neural Network. We will see what happens when we add one more neuron.

Multiple neurons: a perceptron network
09:47

In this lecture, we will learn the impact of adding a new layer in a Multi-Layer Perceptron (MLP).

MLP: A Multi-layer Perceptron
23:27

This lecture shows you the impact of changing the weights and biases of an MLP on the shape of its output.

MLP in Action: A Live Demo
13:21

In the MLP model, we can add as many layers as we like, but the question is: what happens when we include more layers? Let's find out by watching this video.

Deep Neural Networks and Deep Learning
12:06
+ BP: Backpropagation Algorithm
2 lectures 36:39

The Backpropagation (BP) algorithm is a gradient-based method for training MLPs. This video takes you through the theory and steps of this algorithm.

The theory of the Backpropagation Algorithm
26:32

Momentum is a new parameter in the BP algorithm, in addition to the learning rate. It helps BP escape locally optimal solutions. In this video, we will see its impact on the performance of BP.

The impact of the momentum
10:07
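The momentum idea can be sketched in Java, assuming the common update form Δw_new = −η·gradient + α·Δw_old, so part of the previous weight change carries over into the next step. The names are illustrative:

```java
// A sketch of the momentum update rule:
//   deltaNew = -learningRate * gradient + momentum * deltaOld
// Part of the previous weight change carries over, which smooths the
// trajectory and helps the search roll past shallow local optima.
class MomentumUpdate {
    // Returns { updated weight, weight change } for one update step.
    static double[] step(double weight, double previousDelta,
                         double gradient, double learningRate, double momentum) {
        double delta = -learningRate * gradient + momentum * previousDelta;
        return new double[] { weight + delta, delta };
    }
}
```

Note that with momentum = 0 this reduces to plain gradient descent.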
+ Regression using Neural Networks
6 lectures 40:48

This lecture takes you through the process of solving regression and prediction problems using MLPs. We will be learning about both linear and logistic regression using MLPs.

Linear and logistic (non-linear) regression using MLPs
08:21

This video shows you a live example of an MLP designed for doing regression. You will learn about the impact of the connection weights and biases on the output shape of MLPs.

Regression in action: live demo
06:30

This lesson covers examples and issues in the process of doing regression using MLPs.

Regression examples and issues
08:55

In the previous video, we learned about regression problems with one independent variable, which require an MLP with one input in the first layer. But what if we have more than one independent variable? This video answers that question.

Multiple regression
07:20

Do you want to see a live demo of how an MLP solves problems that require multiple regression? Well, let's watch this video then.

Multiple regression in action: live demo
06:31

There is a theorem in the field of Neural Networks called the Universal Approximation Theorem. This video is about this theorem.

MLP as a universal approximator
03:11
+ Neuroph
7 lectures 01:06:54

This video shows the Neuroph website and its user interface.

Introduction to Neuroph
05:55

This lesson covers the steps to create, train, and test an artificial neuron in Neuroph.

Let's create an artificial neuron in Neuroph
11:29

This lesson shows the steps of designing, training, and testing MLPs in Neuroph.

Creating an MLP in Neuroph
07:43

Neuroph has a large number of sample projects, which are very good for learning. This video shows where to find and how to use them.

A sample project in Neuroph
03:16

Neuroph offers a wide range of visualization methods and is very user-friendly. This video takes you through the steps of using one of the visualization tools to see the output of an MLP.

Visualizations in Neuroph
12:42

This lesson shows the process of hand-written character recognition in Neuroph.

Hand-written character recognition in Neuroph
13:00

In this lesson, you will learn how to recognize images using MLPs in Neuroph.

Image recognition in Neuroph
12:49
+ Free e-book
1 lecture 00:00

Download my book on NNs below.

My book on NNs
00:00
Requirements
  • Basic programming skills
  • A basic understanding of calculus, particularly partial derivatives
Description

Machine learning is an extremely hot area in Artificial Intelligence and Data Science. There is no doubt that Neural Networks are among the most well-regarded and widely used machine learning techniques.

A lot of Data Scientists use Neural Networks without understanding their internal structure. However, understanding the internal structure and mechanism of such machine learning techniques will allow them to solve problems more efficiently. This also allows them to tune, tweak, and even design new Neural Networks for different projects.

This course is the easiest way to understand how Neural Networks work in detail. It also puts you ahead of a lot of data scientists. You will potentially have a higher chance of joining a small pool of well-paid data scientists.


Why learn Neural Networks as a Data Scientist?

Machine learning is becoming more popular in every industry every single month, with the main purpose of improving revenue and decreasing costs. Neural Networks are extremely practical machine learning techniques for a wide range of projects. You can use them to automate and optimize the process of solving challenging tasks.


What does a data scientist need to learn about Neural Networks?  

The first thing you need to learn is the mathematical models behind them. You will be surprised by how easy and intuitive the mathematical models and equations are. This course starts with intuitive examples to take you through the most fundamental mathematical models of all Neural Networks. There is no equation in this course without an in-depth explanation and visual examples. If you hate math, then sit back, relax, and enjoy the videos to learn the math behind Neural Networks with minimum effort.

It is also important to know what types of problems can be solved with Neural Networks. This course shows different types of problems to solve using Neural Networks including classification, regression, and prediction. There will be several examples to practice how to solve such problems as well.  


What does this course cover?

As discussed above, this course starts right away with an intuitive example to show what a single Neuron is: the most fundamental component of Neural Networks. It also shows you the mathematical and conceptual model of a Neuron. After learning how easy and simple the mathematical models of a single Neuron are, you will see how it performs live in action.

The second part of this course covers terminologies in the field of machine learning, a mathematical model of a special type of neuron called the Perceptron, and its inspiration. We will go through the main components of a perceptron as well.

In the third part, we will work through the process of training and learning in Neural Networks. This includes different error/cost functions, minimizing a cost function with the Gradient Descent algorithm, the impact of the learning rate, and the challenges in this area.

In the first three parts of this course, you master how a single neuron (e.g., a Perceptron) works. This prepares you for the fourth part, where we will learn how to make a network of these neurons. You will see how powerful even connecting two neurons is. We will learn the impact of multiple neurons and multiple layers on the outputs of a Neural Network. The main model here is the Multi-Layer Perceptron (MLP), one of the most well-regarded Neural Networks in both science and industry. This part of the course also includes Deep Neural Networks (DNNs).

In the fifth section of this course, we will learn about the Backpropagation (BP) algorithm to train a multi-layer perceptron. The theory, mathematical model, and numerical example of this algorithm will be discussed in detail.

All the problems used in Sections 1-5 are classification problems, which form a very important task with a wide range of real-world applications. For instance, you can classify customers based on their interest in a certain product category. However, there are problems that require prediction. Such problems are solved by regression models, and Neural Networks can play the role of a regression method as well. This is exactly what we will be learning in Section 6 of this course. We start with an intuitive example of doing regression using a single neuron. There is a live demo as well to show how a neuron plays the role of a regression model. Other things that you will learn in this section are: linear regression, logistic (non-linear) regression, regression examples and issues, multiple regression, and an MLP with three layers to solve any type of regression problem.

The last part of this course covers problem-solving using Neural Networks. We will be using Neuroph, a Java-based framework, to see examples of Neural Networks in the areas of hand-written character recognition and image processing. If you have never used Neuroph before, there is nothing to worry about. There are several videos showing you the steps to create and run projects in Neuroph.

By the end of this course, you will have a comprehensive understanding of Neural Networks and will be able to easily use them in your projects. You will also be able to analyze, tune, and improve the performance of Neural Networks for your own project.


Does this course suit you?

This course is an introduction to Neural Networks, so you need absolutely no prior knowledge of Artificial Intelligence, Machine Learning, or Neural Networks. However, you need a basic understanding of programming, especially in Java, to easily follow the coding videos. If you just want to learn the mathematical models and the problem-solving process using Neural Networks, you can skip the coding videos.


Who is the instructor?

I am a leading researcher in the field of Machine Learning with expertise in Neural Networks and Optimization. I have more than 150 publications, including 80 journal articles, 3 books, and 20 conference papers. These publications have been cited over 13,000 times around the world. As a leading researcher in this field with over 10 years of experience, I have prepared this course to make everything easy for those interested in Machine Learning and Neural Networks. I have also consulted for big companies like Facebook and Google during my career. As a rising-star Udemy instructor with more than 5,000 students and 1,000 5-star reviews, I have designed and developed this course to facilitate the process of learning Neural Networks for those who are interested in this area. You will have my full support throughout your Neural Networks journey in this course.


There is no RISK!

I have some preview videos, so make sure to watch them to see if this course is for you. This course comes with a full 30-day money-back guarantee, which means that if you are not happy after your purchase, you can get a 100% refund, no questions asked.


What are you waiting for?

Enroll now using the “Add to Cart” button on the right and get started today.


Who this course is for:
  • Beginner data scientists interested in using Artificial Neural Networks and deep learning
  • Expert data scientists interested in expanding their knowledge of how Neural Networks work internally
  • Researchers who want to design and analyze current and new Neural Networks