Unleash Machine Learning: Build an Artificial Neuron in Python

A journey into Machine Learning concepts using your very own Artificial Neural Network: Load, Train, Predict, Evaluate
3.7 (58 ratings)
1,244 students enrolled
$19
$60
68% off
Take This Course
  • Lectures 27
  • Length 3 hours
  • Skill Level Beginner Level
  • Languages English
  • Includes Lifetime access
    30 day money back guarantee!
    Available on iOS and Android
    Certificate of Completion

How taking a course works

Discover

Find online courses made by experts from around the world.

Learn

Take your courses with you and learn anywhere, anytime.

Master

Learn and practice real-world skills and achieve your goals.

About This Course

Published 2/2016 English

Course Description

  • Cars that drive themselves hundreds of miles with no accidents?
  • Algorithms that recognize objects and faces from images with better performance than humans?

All possible thanks to Machine Learning!

In this course, designed for beginners, you will begin Machine Learning by implementing and using your own Artificial Neural Network.

In this Artificial Neural Network course you will:

  1. understand, both intuitively and mathematically, the fundamentals of ANNs
  2. implement a multi-layer neural network from scratch in Python
  3. load and visually explore different datasets
  4. transform the data
  5. train your network and use it to make predictions
  6. measure the accuracy of your predictions
  7. use machine learning tools and techniques


Jump in directly:

  • All source code and notebooks are on a public GitHub repository
  • Apply Machine Learning: section 4
  • Implement the ANN: section 3
  • Full ride: section 1, 2, 3, 4


What are the requirements?

  • Install scikit-learn (on Windows, use Anaconda)
  • A working Python 2.7.x installation
  • A working IPython Notebook setup

What am I going to get from this course?

  • Build from scratch your own Artificial Neural Network
  • Know the fundamentals of Machine Learning and ANN
  • Train your ANN using 3 different datasets with increasing complexity
  • Predict the correct output using your trained ANN
  • Evaluate the accuracy of your predictions
  • Use scikit-learn, numpy and opencv

What is the target audience?

  • SHOULD NOT: beginners in Python
  • SHOULD NOT: experts in Machine Learning
  • SHOULD: students who want to begin Machine Learning with concepts and tools
  • SHOULD: students who want to learn and gain insights into why Artificial Neural Networks are such a powerful and unique tool

What you get with this course?

Not for you? No problem.
30 day money back guarantee.

Forever yours.
Lifetime access.

Learn on the go.
Desktop, iOS and Android.

Get rewarded.
Certificate of completion.

Curriculum

Section 1: Introduction
02:04

Overview of the Artificial Neural Network course.

  • implement a simple and clean artificial neural network in Python
  • load datasets
  • visualize high-dimensional data
  • transform data
  • train different ANN classifiers
  • predict, and evaluate the quality of our predictions

Github ANN Course repository
Article
Section 2: Neuron
06:09

STRUCTURE and FUNCTION of a biological neuron

A neuron is a cell that processes and transmits information through electrical and chemical signals.

Soma - Greek for body

Dendrite - Greek for tree

Nucleus

Axon

Axon hillock

Myelin sheath

Schwann cell

Nodes of Ranvier

Synapse

Chemical synapse

03:23

  • input X
  • output Y
  • weight W
  • sum of the inputs
  • activation function g: step, ReLU, logistic
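The notation above can be sketched as a tiny forward pass. This is a minimal illustration, not the course's exact code; the names `step`, `logistic` and `neuron_output` are my own:

```python
import math

def step(a):
    # fires 1 if the weighted sum reaches the threshold 0
    return 1 if a >= 0 else 0

def logistic(a):
    # smooth activation squashing any real value into (0, 1)
    return 1.0 / (1.0 + math.exp(-a))

def neuron_output(X, W, g):
    # y = g(sum of w_i * x_i)
    a = sum(w * x for w, x in zip(W, X))
    return g(a)

# example: 2 inputs, weights favoring the first input
y = neuron_output([1, 0], [0.7, -0.2], step)  # weighted sum is 0.7, so the neuron fires
```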


04:44

Step-by-step calculation of the output, given the input, the weights and the activation function.

What logical function is it?

02:42

Put the results in a table.

Draw the results.

Observe that the outputs are linearly separable.

02:08

Students should solve this exercise on their own.

04:01

You can simplify the representation of a neuron if you transform the threshold into a bias input with the value 1 and a trainable bias weight.
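The threshold-to-bias trick can be checked numerically. The function names below are illustrative, not from the course:

```python
def step(a):
    return 1 if a >= 0 else 0

def neuron_with_threshold(X, W, theta):
    # classic form: fire when the weighted sum reaches the threshold theta
    s = sum(w * x for w, x in zip(W, X))
    return step(s - theta)

def neuron_with_bias(X, W, theta):
    # equivalent form: prepend a constant input of 1 whose weight is -theta
    Xb = [1] + X
    Wb = [-theta] + W
    s = sum(w * x for w, x in zip(Wb, Xb))
    return step(s)

# with W = [1, 1] and theta = 1.5 both forms compute the AND function
```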

02:59

One neuron is different from another neuron if it has different weights.

With different weights a neuron behaves differently and can approximate a different function.

Neurons draw decision boundaries... literally.

07:44

The weight vector W = [w1 w2] is actually perpendicular to the decision boundary.
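A quick numeric check of the perpendicularity claim (the weight values are arbitrary examples): any direction along the boundary w1*x1 + w2*x2 = theta is proportional to (w2, -w1), and its dot product with W is zero.

```python
w1, w2 = 0.4, -1.3            # arbitrary example weights
W = (w1, w2)
# a vector pointing along the boundary w1*x1 + w2*x2 = theta
along_boundary = (w2, -w1)
dot = W[0] * along_boundary[0] + W[1] * along_boundary[1]
# dot == 0, so W is perpendicular to the decision boundary
```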

13:18

Perceptron learning algorithm:

1. initialize W randomly

2. pick a sample pattern p and compute the output y

3. if y != target change W

4. repeat 2,3
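The four steps above can be sketched as a minimal perceptron learner. The AND dataset, learning rate and update rule details are my choices for illustration:

```python
import random

def step(a):
    return 1 if a >= 0 else 0

def predict(W, x):
    # x already carries the constant bias input 1 in front
    return step(sum(w * xi for w, xi in zip(W, x)))

def train_perceptron(samples, targets, lr=0.1, n_epochs=50, seed=0):
    rng = random.Random(seed)
    W = [rng.uniform(-0.5, 0.5) for _ in samples[0]]   # 1. initialize W randomly
    for _ in range(n_epochs):                          # 4. repeat 2, 3
        for x, t in zip(samples, targets):             # 2. pick a sample, compute y
            y = predict(W, x)
            if y != t:                                 # 3. if y != target, change W
                W = [w + lr * (t - y) * xi for w, xi in zip(W, x)]
    return W

# AND gate, with the bias input 1 prepended to each sample
samples = [[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]]
targets = [0, 0, 0, 1]
W = train_perceptron(samples, targets)
```

Because AND is linearly separable, the algorithm is guaranteed to converge to a separating weight vector.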

4 questions

Review the material

Section 3: Implementation
10:54

Do we want to design it using OOP principles, detailing all core objects and the relationships between them? For example:

  • a neuron class, a synapse class, a signal class, an interface for different activation functions, a layer class, and the entire network as a collection of layers

OR

  • Do we want a terse, brain-twisting, mathematical matrix formulation that does everything in under 10 lines of code?

I have decided to pick an in-between form that I believe to be simple, clean and clear.


We will have 2 classes:

  • a class representing the entire artificial neural network
  • a class representing a single layer of neurons

Layer:

    def __init__(self, id, layer_size, prev_layer_size, use_bias):
        self.id = id
        self.n_neurons = layer_size
        self.bias_val = 1
        self.input = [0] * self.n_neurons
        self.output = [0] * (self.n_neurons + use_bias)
        self.output[0] = self.bias_val
        self.error = [0] * self.n_neurons
        self.weight = make_matrix(prev_layer_size + use_bias, self.n_neurons)
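The constructor references two helpers, make_matrix and between (the latter is used for weight initialization). Here are minimal best-guess versions; their signatures are assumed from usage, not taken from the course:

```python
import random

def make_matrix(n_rows, n_cols, fill=0.0):
    # assumed helper: an n_rows x n_cols matrix as a list of lists
    return [[fill] * n_cols for _ in range(n_rows)]

def between(low, high):
    # assumed helper: a uniform random number in [low, high]
    return random.uniform(low, high)

m = make_matrix(3, 2)   # 3 rows of 2 zeros each
```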

18:44

    def predict(self, input):
        """
        Return the network prediction for this input.
        """
        self.set_input(input)
        self.forward_propagate()
        return self.get_output()

07:11

    def train(self, inputs, targets, n_epochs):
        # one epoch = one pass over all training samples
        for epoch in range(n_epochs):
            for i in range(len(inputs)):
                self.set_input(inputs[i])
                self.forward_propagate()
                self.update_error_output(targets[i])
                self.backward_propagate()
                self.update_weights()
09:04

    def backward_propagate(self):
        """
        Backprop. Propagate the error from the output layer
        backwards to the input layer.
        """

07:20

Run it for XOR - why doesn't it work with just 2 layers?

02:15

    def hyperbolic_tangent(x):
        return math.tanh(x)

    def deriv_hyperbolic_tangent(x):
        th = math.tanh(x)
        return 1 - th * th
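A quick finite-difference sanity check of that derivative; the check scaffolding (test point, epsilon) is mine:

```python
import math

def hyperbolic_tangent(x):
    return math.tanh(x)

def deriv_hyperbolic_tangent(x):
    th = math.tanh(x)
    return 1 - th * th

x = 0.37     # arbitrary test point
eps = 1e-6
# central difference approximates the true derivative very closely
numeric = (hyperbolic_tangent(x + eps) - hyperbolic_tangent(x - eps)) / (2 * eps)
analytic = deriv_hyperbolic_tangent(x)
```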
02:37

Refactor the activation function to make the code clear.

01:34

good_range = 1.0 / math.sqrt(prev_layer_size + 1)

self.weight[i][j] = between(-good_range, good_range)

06:19

Intuition, using the visual decision boundary, for why XOR can't be solved with 1 output neuron but requires 2 extra hidden neurons.

08:26

Power:

  • translating English to Chinese is a function
  • outputting driving commands given pixels or radar information is a function
  • suggesting what the customer should buy given previous purchases and navigation history is a function

Idea 1: you can create a step function

Idea 2: you can approximate any function using steps
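Idea 2 can be demonstrated directly: sample a smooth function on a grid and replace it with a piecewise-constant (step) approximation. The target function and grid sizes below are illustrative:

```python
import math

def f(x):
    # an arbitrary smooth target function
    return math.sin(x) + 0.3 * x

def make_step_approximation(f, lo, hi, n_steps):
    # sample f on a grid; the approximation is constant on each sub-interval
    width = (hi - lo) / n_steps
    heights = [f(lo + i * width) for i in range(n_steps)]
    def approx(x):
        i = min(int((x - lo) / width), n_steps - 1)
        return heights[i]
    return approx

approx = make_step_approximation(f, 0.0, 3.0, 300)
# the max error over a fine grid shrinks as n_steps grows
err = max(abs(f(x / 100.0) - approx(x / 100.0)) for x in range(300))
```

With 300 steps on [0, 3] each step is 0.01 wide, so the error is bounded by the function's slope times the step width.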

09:04

Use a strange function:

    def f(x):
        return 0.3*x*x + 2*np.sin(x) - 2*x + 0.5

Use our ANN as a function approximator to match the given function.

Section 4: Applications
15:41

    from sklearn.datasets import load_iris

    iris = load_iris()
    iris.feature_names
    iris.target_names


  • each row: sample, example, observation, record, instance
  • each column: feature, predictor, attribute, independent variable, input, regressor, covariate
  • target: response, outcome, label, dependent variable

    xsc = MinMaxScaler(feature_range=(0, 1), copy=True)
    xsc.fit(X)
    ylb = LabelBinarizer()
    X_train, X_test, y_train, y_test = train_test_split(xsc.transform(X), y)

    nn.train(X_train_l, y_train_l, 5000)
    y_pred = np.array([nn.predict(record) for record in X_test_l])

    confusion_matrix(y_test, y_pred)
    classification_report(y_test, y_pred)

10:57

    X = digits.data
    y = digits.target

Extend the dataset:

    print('adding more images by moving the original images up, left, right, down by 1 pixel')

    nn.train(X_train, Y_train, 10)

Visualize the confusion matrix.

    # see bad predictions
    failed = y_pred != y_test
    X2_test = X_test[failed]
    y2_pred = y_pred[failed]
    y2_test = y_test[failed]
    print_digits(X2_test, y2_test, y2_pred)

11:26

    def serialize(nn, fname):
        with open(fname, 'wb') as f:
            # use protocol 2, the default protocol 0 has problems
            pickle.dump(nn, f, 2)

    def deserialize(fname):
        with open(fname, 'rb') as f:
            nn = pickle.load(f)
            return nn
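A round-trip check of this save-and-load pattern; the plain dict stands in for the course's trained network object, and the file name is my own:

```python
import os
import pickle
import tempfile

def serialize(nn, fname):
    with open(fname, 'wb') as f:
        # use protocol 2, the default protocol 0 has problems
        pickle.dump(nn, f, 2)

def deserialize(fname):
    with open(fname, 'rb') as f:
        return pickle.load(f)

nn = {'weights': [0.1, 0.2]}   # stand-in for a trained network
fname = os.path.join(tempfile.gettempdir(), 'nn_demo.pickle')
serialize(nn, fname)
restored = deserialize(fname)  # identical to what was saved
```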
Save and Load FIX
Article
05:31

    # The MNIST database of handwritten digits has a training set of
    # 60,000 examples and a test set of 10,000 examples.
    # Each image is 28 x 28 pixels.

    # load a nn model, ex: /models/nn_mnist_iter1600000.pickle
    model_fname = 'nn_mnist_iter1600000.pickle'
    nn = ann_util.deserialize('models/' + model_fname)


Instructor Biography

Razvan Pistolea, Source Code Painter

I am a Machine Learning Engineer, Deep Learning Engineer and even an Indie Game Developer with a Major in Compilers and a Master's degree in Artificial Intelligence from University Politehnica of Bucharest.

I am passionate about Games and Artificial Intelligence. I love to give life to A.I. agents in my projects and my friends' projects, and I want to teach you to do the same.

Ready to start learning?
Take This Course