Unleash Machine Learning: Build Artificial Neuron in Python

A journey into Machine Learning concepts using your very own Artificial Neural Network: Load, Train, Predict, Evaluate
4.2 (75 ratings)
1,406 students enrolled
Last updated 3/2016
English
Includes:
  • 3 hours on-demand video
  • 2 Articles
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
Description
  • Cars that drive themselves hundreds of miles with no accidents?
  • Algorithms that recognize objects and faces from images with better performance than humans?

All possible thanks to Machine Learning!

In this beginner-friendly course you will start Machine Learning by implementing and using your own Artificial Neural Network.

In this Artificial Neural Network course you will:

  1. understand, intuitively and mathematically, the fundamentals of ANNs
  2. implement a multilayer neural network in Python from scratch
  3. load and visually explore different datasets
  4. transform the data
  5. train your network and use it to make predictions
  6. measure the accuracy of your predictions
  7. use machine learning tools and techniques


Jump in directly:

  • All source code and notebooks on public GitHub
  • Apply Machine Learning: section 4
  • Implement the ANN: section 3
  • Full ride: section 1, 2, 3, 4


Who is the target audience?
  • SHOULD NOT: beginners in Python
  • SHOULD NOT: experts in Machine Learning
  • SHOULD: students who want to begin Machine Learning with concepts and tools
  • SHOULD: students who want to learn and gain insights into why Artificial Neural Networks are such a powerful and unique tool
What Will I Learn?
Build from scratch your own Artificial Neural Network
Know the fundamentals of Machine Learning and ANN
Train your ANN using 3 different datasets of increasing complexity
Predict the correct output using your trained ANN
Evaluate the accuracy of your predictions
Use scikit-learn, NumPy, and OpenCV
Requirements
  • Install scikit-learn (on Windows, use Anaconda)
  • A working Python 2.7.x installation
  • A working IPython Notebook installation
Curriculum For This Course
27 Lectures 02:56:21
Introduction
2 Lectures 02:08

Overview of the Artificial Neural Network course:

  • implement a simple and clean artificial neural network in Python
  • load datasets
  • visualize high-dimensional data
  • transform data
  • train different ANN classifiers
  • predict, and evaluate the quality of our predictions

Preview 02:04

GitHub ANN Course repository
00:04
Neuron
9 Lectures 47:08

Structure and function of a biological neuron.

A neuron is a cell that processes and transmits information through electrical and chemical signals.

  • Soma (Greek for "body")
  • Dendrite (Greek for "tree")
  • Nucleus
  • Axon
  • Axon hillock
  • Myelin sheath
  • Schwann cell
  • Nodes of Ranvier
  • Synapse
  • Chemical synapse

Biological Neuron
06:09

  • input X
  • output Y
  • weight W
  • sum of inputs
  • activation function g: step, ReLU, logistic


Preview 03:23

Step-by-step calculation of the output, given the input, the weights, and the activation function.

What logical function is it?

Compute a logical function
04:44
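To make the calculation concrete, here is a sketch of a single neuron with a step activation; the weights and threshold below are illustrative choices, not necessarily the lecture's values, and with them the neuron computes logical AND:

def step(weighted_sum, threshold=1.5):
    # fire only when the weighted sum reaches the threshold
    return 1 if weighted_sum >= threshold else 0

def neuron_output(inputs, weights):
    return step(sum(x * w for x, w in zip(inputs, weights)))

# truth table: weights [1, 1] with threshold 1.5 give AND
for pattern in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(pattern, '->', neuron_output(pattern, [1, 1]))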

Put the results in a table.

Draw the results.

Observe that the outputs are linearly separable.

Linear Separability
02:42

Students should solve this exercise on their own.

Compute another logical function
02:08

You can simplify the representation of a neuron if you transform the threshold into a bias input with the value 1 and a trainable bias weight.

Trick to remove the inside threshold
04:01
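The trick in code, as a minimal sketch (the function names here are illustrative):

def output_with_threshold(x, w, threshold):
    return 1 if sum(xi * wi for xi, wi in zip(x, w)) >= threshold else 0

def output_with_bias(x, w, threshold):
    # prepend a constant bias input of 1 with trainable weight -threshold
    x_b = [1] + list(x)
    w_b = [-threshold] + list(w)
    return 1 if sum(xi * wi for xi, wi in zip(x_b, w_b)) >= 0 else 0

# the two formulations agree
print(output_with_threshold([1, 1], [1, 1], 1.5) == output_with_bias([1, 1], [1, 1], 1.5))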

One neuron differs from another only in its weights.

With different weights, a neuron behaves differently and can approximate a different function.

Neurons draw decision boundaries... literally.

Weights
02:59

The weight vector W = [w1 w2] is actually perpendicular to the decision boundary.

Decision boundary
07:44
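The perpendicularity claim is easy to verify numerically; here is a throwaway sketch (the weights and threshold are arbitrary):

# a neuron with weights W = [2, 1] and threshold 2 has the
# decision boundary 2*x1 + 1*x2 = 2
w = [2.0, 1.0]

# two points on that boundary ...
p1 = (0.0, 2.0)   # 2*0 + 1*2 = 2
p2 = (1.0, 0.0)   # 2*1 + 1*0 = 2

# ... so this direction lies along the boundary
along = (p2[0] - p1[0], p2[1] - p1[1])

# the dot product with W is 0: W is perpendicular to the boundary
print(w[0] * along[0] + w[1] * along[1])   # 0.0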

Perceptron learning algorithm:

1. initialize W randomly

2. pick a sample pattern p and compute the output y

3. if y != target, change W

4. repeat steps 2 and 3

Perceptron learning
13:18
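A minimal sketch of those four steps (the update rule is the standard perceptron rule; the names and constants are mine, not the course's):

import random

def perceptron_learn(patterns, targets, n_inputs, lr=0.1, n_epochs=100):
    # 1. initialize W randomly (the last entry is the bias weight)
    w = [random.uniform(-0.5, 0.5) for _ in range(n_inputs + 1)]
    for _ in range(n_epochs):                  # 4. repeat steps 2 and 3
        for p, target in zip(patterns, targets):
            x = list(p) + [1]                  # 2. pick a sample pattern p ...
            y = 1 if sum(xi * wi for xi, wi in zip(x, w)) >= 0 else 0
            if y != target:                    # 3. if y != target, change W
                for i in range(len(w)):
                    w[i] += lr * (target - y) * x[i]
    return w

# example: learn logical AND
w = perceptron_learn([(0, 0), (0, 1), (1, 0), (1, 1)], [0, 0, 0, 1], 2)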

Review the material

Quiz 1
4 questions
Implementation
11 Lectures 01:23:28

Do we want to design it using OOP principles, detailing all core objects and the relationships between them? For example:

  • a neuron class, a synapse class, a signal class, an interface for different activation functions, a layer class, the entire network as a collection of layers

OR

  • Do we want a terse, brain-twisting, mathematical, matrix formulation that does everything in under 10 lines of code?

I have decided to pick an in-between form that I believe to be simple, clean and clear.


We will have 2 classes:

  • a class representing the entire artificial neural network
  • a class representing a single layer of neurons

Layer:

def make_matrix(n_rows, n_cols):
    # helper assumed by the snippet below: an n_rows x n_cols matrix of zeros
    return [[0] * n_cols for _ in range(n_rows)]

class Layer(object):
    def __init__(self, id, layer_size, prev_layer_size, use_bias=1):
        self.id = id
        self.n_neurons = layer_size
        self.bias_val = 1
        self.input = [0] * self.n_neurons
        # one extra output slot holds the constant bias value
        self.output = [0] * (self.n_neurons + use_bias)
        self.output[0] = self.bias_val
        self.error = [0] * self.n_neurons
        # one weight per (incoming connection, neuron) pair
        self.weight = make_matrix(prev_layer_size + use_bias, self.n_neurons)

Top down design
10:54

def predict(self, input):
    """
    Return the network prediction for this input.
    """
    self.set_input(input)
    self.forward_propagate()
    return self.get_output()

Predict (forward)
18:44

def train(self, inputs, targets, n_epochs):
    for epoch in range(n_epochs):
        for i in range(len(inputs)):
            self.set_input(inputs[i])
            self.forward_propagate()
            self.update_error_output(targets[i])
            self.backward_propagate()
            self.update_weights()
Preview 07:11

def backward_propagate(self):
    """
    Backprop. Propagate the error from the output layer
    backwards to the input layer.
    """

Train part 2
09:04
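The lecture walks through the real implementation; as a hedged sketch of what the backward pass typically computes for a layered design like this one (the layer fields follow the Top down design snippet, the tanh derivative comes from a later lecture, and self.layers is an assumed field, so this is not the course's exact code):

def backward_propagate(self):
    # walk from the last hidden layer back towards the input layer
    for l in range(len(self.layers) - 2, 0, -1):
        layer, next_layer = self.layers[l], self.layers[l + 1]
        for i in range(layer.n_neurons):
            # weighted sum of the errors in the layer above
            # (row i + 1 skips the bias row of the weight matrix)
            downstream = sum(next_layer.weight[i + 1][j] * next_layer.error[j]
                             for j in range(next_layer.n_neurons))
            # scale by the activation derivative at this neuron's input
            layer.error[i] = deriv_hyperbolic_tangent(layer.input[i]) * downstream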

Run the network on XOR. Why doesn't it work with just 2 layers?

The XOR problem
07:20
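As a usage sketch (the constructor shape here is hypothetical, not the course's exact API): a network without a hidden layer cannot fit XOR, because no single line separates the two classes, while one hidden layer of 2 neurons can:

# hypothetical API: NN takes the layer sizes [n_inputs, n_hidden, n_outputs]
xor_inputs = [[0, 0], [0, 1], [1, 0], [1, 1]]
xor_targets = [[0], [1], [1], [0]]

nn = NN([2, 2, 1])   # 2 hidden neurons: enough to learn XOR
nn.train(xor_inputs, xor_targets, 5000)
for x in xor_inputs:
    print(x, nn.predict(x))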

import math

def hyperbolic_tangent(x):
    return math.tanh(x)

def deriv_hyperbolic_tangent(x):
    # d/dx tanh(x) = 1 - tanh(x)**2
    th = math.tanh(x)
    return 1 - th * th
Preview 02:15
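A quick sanity check of the derivative identity, comparing it against a central finite difference (a throwaway sketch using the two functions just defined):

h = 1e-6
x = 0.7
numeric = (hyperbolic_tangent(x + h) - hyperbolic_tangent(x - h)) / (2 * h)
print(abs(numeric - deriv_hyperbolic_tangent(x)) < 1e-8)   # True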

Refactor the activation function to make the code clear.

Refactor activation function
02:37

# keep initial weights small, scaled by the layer's fan-in
good_range = 1.0 / math.sqrt(prev_layer_size + 1)

# between(a, b) is the course helper drawing a uniform value in [a, b]
self.weight[i][j] = between(-good_range, good_range)

Improve weight initialization
01:34

Intuition, using visual decision boundaries, for why XOR can't be solved with 1 output neuron but requires 2 extra hidden neurons.

Intuition XOR is hard
06:19

Power:

  • translating English to Chinese is a function
  • outputting driving commands given pixels or radar information is a function
  • suggesting what a customer should buy given previous purchases and navigation history is a function

Idea 1: you can create a step function

Idea 2: you can approximate any function using steps

Preview 08:26
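Idea 2 can be demonstrated directly; this sketch (purely illustrative) approximates sin(x) with a staircase of step functions, and the error shrinks as the number of steps grows:

import numpy as np

def step(x, at):
    # a unit step that switches on at position `at`
    return (x >= at).astype(float)

xs = np.linspace(0, 2 * np.pi, 200)

# add, at each knot, a step scaled by how much sin changes there
knots = np.linspace(0, 2 * np.pi, 20)
approx = np.full_like(xs, np.sin(knots[0]))
for a, b in zip(knots[:-1], knots[1:]):
    approx += (np.sin(b) - np.sin(a)) * step(xs, b)

print(np.max(np.abs(approx - np.sin(xs))))   # shrinks as you add knots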

Use a strange function:

import numpy as np

def f(x):
    return 0.3*x*x + 2*np.sin(x) - 2*x + 0.5

Use our ANN as a function approximator to match the given function.

Approximate a strange function example
09:04
Applications
5 Lectures 43:40

from sklearn.datasets import load_iris

# load the Iris dataset and inspect its metadata
iris = load_iris()
iris.feature_names
iris.target_names


  • each row: sample, example, observation, record, instance
  • each column: feature, predictor, attribute, independent variable, input, regressor, covariate
  • target: response, outcome, label, dependent variable

import numpy as np
from sklearn.preprocessing import MinMaxScaler, LabelBinarizer
from sklearn.cross_validation import train_test_split  # sklearn.model_selection in newer versions
from sklearn.metrics import confusion_matrix, classification_report

X, y = iris.data, iris.target

# scale every feature into [0, 1]
xsc = MinMaxScaler(feature_range=(0, 1), copy=True)
xsc.fit(X)

# one-hot encode the class labels for the network's output layer
ylb = LabelBinarizer()

X_train, X_test, y_train, y_test = train_test_split(xsc.transform(X), y)

# X_train_l / y_train_l: the course's list-formatted training data
nn.train(X_train_l, y_train_l, 5000)

y_pred = np.array([nn.predict(record) for record in X_test_l])

confusion_matrix(y_test, y_pred)
classification_report(y_test, y_pred)

Preview 15:41

from sklearn.datasets import load_digits

digits = load_digits()
X = digits.data
y = digits.target

Extend the dataset:

print('adding more images by moving the original images up, left, right, down by 1 pixel')

nn.train(X_train, Y_train, 10)

Visualize the confusion matrix.

# inspect the bad predictions
failed = y_pred != y_test
X2_test = X_test[failed]
y2_pred = y_pred[failed]
y2_test = y_test[failed]

print_digits(X2_test, y2_test, y2_pred)

Digits Classifier
10:57

import pickle

def serialize(nn, fname):
    with open(fname, 'wb') as f:
        # use protocol 2; the default protocol 0 has problems
        pickle.dump(nn, f, 2)

def deserialize(fname):
    with open(fname, 'rb') as f:
        nn = pickle.load(f)
    return nn
Save and Load functionality
11:26
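Usage sketch for the two helpers above (the file name is illustrative):

# persist a trained network, then restore it later
serialize(nn, 'models/nn_digits.pickle')
nn = deserialize('models/nn_digits.pickle')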

Save and Load FIX
00:05

# The MNIST database of handwritten digits has a training set of
# 60,000 examples and a test set of 10,000 examples.
# Each image is 28 x 28 pixels.

# load a trained nn model, e.g. models/nn_mnist_iter1600000.pickle
model_fname = 'nn_mnist_iter1600000.pickle'
nn = ann_util.deserialize('models/' + model_fname)
Preview 05:31
About the Instructor
4.3 Average rating
152 Reviews
4,450 Students
3 Courses
Source Code Painter

I am a Machine Learning Engineer, Deep Learning Engineer and even an Indie Game Developer with a Major in Compilers and a Master's degree in Artificial Intelligence from University Politehnica of Bucharest.

I am passionate about Games and Artificial Intelligence. I love giving life to A.I. agents in my projects and my friends' projects, and I want to teach you how, too.
