Find online courses made by experts from around the world.
Take your courses with you and learn anywhere, anytime.
Learn and practice real-world skills and achieve your goals.
All possible thanks to Machine Learning!
In this beginner-friendly course you will get started with Machine Learning by implementing and using your own Artificial Neural Network.
In this Artificial Neural Network course you will:
Jump in directly:
Not for you? No problem.
30-day money-back guarantee.
Learn on the go.
Desktop, iOS and Android.
Certificate of completion.
|Section 1: Introduction|
Overview of the Artificial Neural Network course.
implement a simple and clean artificial neural network in Python
visualize high-dimensional data
train different ANN classifiers
predict, and evaluate the quality of our predictions
Github ANN Course repository
|Section 2: Neuron|
STRUCTURE and FUNCTION of a biological neuron
A neuron is a cell that processes and transmits information through electrical and chemical signals.
Soma - Greek for "body"
Dendrite - Greek for "tree"
Nodes of Ranvier
activation function g: step, ReLU, logistic
Step-by-step calculation of the output given the input, the weights, and the activation function
What logical function is it?
Put the results in a table.
Draw the results.
Observe that the outputs are linearly separable.
Students should solve this exercise on their own.
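The step-by-step output calculation above can be sketched in a few lines of plain Python. The weights [1, 1] and threshold 1.5 are illustrative choices (they happen to make the neuron compute logical AND), not values from the course:

```python
def step(x, threshold=0.0):
    """Step activation: fire 1 if the weighted sum reaches the threshold."""
    return 1 if x >= threshold else 0

def neuron_output(inputs, weights, threshold):
    # Weighted sum of the inputs, passed through the step activation.
    weighted_sum = sum(i * w for i, w in zip(inputs, weights))
    return step(weighted_sum, threshold)

# Truth table for w = [1, 1], threshold = 1.5 -> logical AND
for pattern in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(pattern, neuron_output(pattern, [1, 1], 1.5))
```

Tabulating and plotting these four outputs shows they are linearly separable, as the exercise asks you to observe.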
You can simplify the representation of a neuron if you transform the threshold into a bias input with the value 1 and a trainable bias weight.
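A small sketch of that bias transformation, with illustrative weights and threshold (not the course's values): appending a constant input of 1 with weight -threshold gives the same neuron, but now the threshold is just another trainable weight.

```python
def step(x):
    return 1 if x >= 0 else 0

def with_threshold(inputs, weights, threshold):
    # Original form: compare the weighted sum against an explicit threshold.
    return step(sum(i * w for i, w in zip(inputs, weights)) - threshold)

def with_bias(inputs, weights):
    # Bias form: the last input is fixed to 1 and its weight plays
    # the role of -threshold.
    return step(sum(i * w for i, w in zip(inputs, weights)))

# The two forms agree on every input pattern:
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert with_threshold(x, [1, 1], 1.5) == with_bias(x + (1,), [1, 1, -1.5])
```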
One neuron is different from another neuron if it has different weights.
With different weights a neuron behaves differently and can approximate a different function.
Neurons draw decision boundaries...literally.
The weight vector W = [w1 w2] is actually perpendicular to the decision boundary.
Perceptron learning algorithm:
1. initialize W randomly
2. pick a sample pattern p and compute the output y
3. if y != target change W
4. repeat 2,3
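The four steps above can be sketched in plain Python. The OR patterns, learning rate, epoch cap, and seed are illustrative choices, not the course's code:

```python
import random

def perceptron_train(patterns, targets, lr=0.1, max_epochs=100):
    """Perceptron learning: random init, pick a sample, update on error, repeat.
    Each pattern is assumed to already include the constant bias input 1."""
    n = len(patterns[0])
    w = [random.uniform(-0.5, 0.5) for _ in range(n)]   # 1. initialize W randomly
    for _ in range(max_epochs):
        errors = 0
        for p, t in zip(patterns, targets):              # 2. pick a sample, compute y
            y = 1 if sum(pi * wi for pi, wi in zip(p, w)) >= 0 else 0
            if y != t:                                   # 3. if y != target, change W
                errors += 1
                w = [wi + lr * (t - y) * pi for wi, pi in zip(w, p)]
        if errors == 0:                                  # 4. repeat 2,3 until no mistakes
            break
    return w

random.seed(0)  # seeded only for reproducibility
# Learn logical OR (linearly separable, so the perceptron converges):
patterns = [(0, 0, 1), (0, 1, 1), (1, 0, 1), (1, 1, 1)]  # last component = bias input
w = perceptron_train(patterns, [0, 1, 1, 1])
```

Because OR is linearly separable, the perceptron convergence theorem guarantees this loop terminates with zero errors.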
|Quiz 1||4 questions|
Review the material
|Section 3: Implementation|
Do we want to design it using OOP principles, detailing all core objects and the relationships between them? For example:
I have decided to pick an in-between form that I believe to be simple, clean and clear.
We will have 2 classes:
self.id = id
self.n_neurons = layer_size
self.bias_val = 1
self.input = [0.0] * self.n_neurons
self.output = [0.0] * (self.n_neurons + use_bias)
if use_bias:
    self.output[-1] = self.bias_val  # the extra output slot holds the constant bias
self.error = [0.0] * self.n_neurons
self.weight = make_matrix(prev_layer_size + use_bias, self.n_neurons)
def predict(self, input):
Return the network prediction for this input.
def train(self, inputs, targets, n_epochs):
Backprop. Propagate the error from the output layer backwards to the input layer.
Run it for XOR - why doesn't it work with just 2 layers?
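A minimal NumPy sketch (separate from the course's classes; the layer sizes, learning rate, and epoch count are illustrative) showing that adding a hidden layer and propagating the error backwards lets the network learn XOR, which no single neuron can represent:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.uniform(-1, 1, (2, 4)); b1 = np.zeros(4)  # hidden layer (4 units)
W2 = rng.uniform(-1, 1, (4, 1)); b2 = np.zeros(1)  # output layer

lr = 1.0
for _ in range(5000):
    H = sigmoid(X @ W1 + b1)        # forward pass
    Y = sigmoid(H @ W2 + b2)
    dY = (Y - T) * Y * (1 - Y)      # output-layer delta (MSE loss, sigmoid)
    dH = (dY @ W2.T) * H * (1 - H)  # error propagated back to the hidden layer
    W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(axis=0)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0)

print((Y > 0.5).astype(int).ravel())  # compare against the XOR targets 0 1 1 0
```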
Refactor the activation function to make the code clear.
good_range = 1.0 / math.sqrt(prev_layer_size + 1)
self.weight[i][j] = between(-good_range, good_range)
Intuition, using visual decision boundaries, for why XOR can't be solved with 1 output neuron but requires 2 extra hidden neurons.
Idea 1: you can create a step function
Idea 2: you can approximate any function using steps
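Idea 2 can be illustrated directly: carve the input interval into steps and output a constant on each one. The function (sin), interval, and step count below are illustrative choices; the point is that the maximum error shrinks as the number of steps grows, which is the intuition behind using a hidden layer as a function approximator:

```python
import math

def staircase(f, x, n_steps=100, lo=0.0, hi=1.0):
    """Approximate f on [lo, hi] by a staircase: the value of f at the
    left edge of the step that contains x."""
    width = (hi - lo) / n_steps
    k = min(int((x - lo) / width), n_steps - 1)
    return f(lo + k * width)

f = math.sin
max_err = max(abs(staircase(f, x / 1000) - f(x / 1000)) for x in range(1000))
# max_err is bounded by the step width here, since |sin'| <= 1
```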
Use a strange function:
Use our ANN as a function approximator to match the given function.
|Section 4: Applications|
from sklearn.datasets import load_iris
from sklearn.preprocessing import MinMaxScaler, LabelBinarizer
from sklearn.model_selection import train_test_split
import numpy as np

iris = load_iris()
X, y = iris.data, iris.target
xsc = MinMaxScaler(feature_range=(0, 1), copy=True)
ylb = LabelBinarizer()
X_train, X_test, y_train, y_test = train_test_split(xsc.fit_transform(X), ylb.fit_transform(y))
# list versions of the split arrays for our pure-Python network
X_train_l, y_train_l = X_train.tolist(), y_train.tolist()
X_test_l, y_test_l = X_test.tolist(), y_test.tolist()
nn.train(X_train_l, y_train_l, 5000)
preds = np.array([nn.predict(record) for record in X_test_l])
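One way to score such predictions, assuming (as above) that the network returns one score per class and the targets are one-hot vectors from `LabelBinarizer`: take the argmax on both sides and compare. The `accuracy` helper and the dummy arrays are illustrative, not part of the course code:

```python
import numpy as np

def accuracy(preds, targets):
    """preds, targets: arrays of shape (n_samples, n_classes).
    Argmax recovers the predicted / true class label for each row."""
    return float(np.mean(preds.argmax(axis=1) == targets.argmax(axis=1)))

# Example with dummy arrays: first row correct, second row wrong.
preds = np.array([[0.9, 0.1, 0.0], [0.2, 0.7, 0.1]])
targets = np.array([[1, 0, 0], [0, 0, 1]])
print(accuracy(preds, targets))  # 0.5
```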
X = digits.data
y = digits.target
print('adding more images by moving the original images up, left, right, down by 1 pixel')
nn.train(X_train, Y_train, 10)
visualize the confusion matrix
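A minimal sketch of what the confusion matrix computes (in practice `sklearn.metrics.confusion_matrix` does the same thing); the tiny label lists are illustrative. Rows are true classes, columns are predicted classes, so everything off the diagonal is a misclassification worth inspecting:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t][p] += 1  # row = true class, column = predicted class
    return cm

cm = confusion_matrix([0, 0, 1, 2, 2], [0, 1, 1, 2, 2], 3)
# cm[0] == [1, 1, 0]: one 0 classified correctly, one misread as class 1
```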
# see bad predictions
failed = y_pred != y_test
X2_test = X_test[failed]
y2_pred = y_pred[failed]
y2_test = y_test[failed]
print_digits(X2_test, y2_test, y2_pred)
import pickle

def serialize(nn, fname):
    with open(fname, 'wb') as f:
        pickle.dump(nn, f)

def deserialize(fname):
    with open(fname, 'rb') as f:
        return pickle.load(f)
Save and Load FIX
# The MNIST database of handwritten digits, available from this page,
# has a training set of 60,000 examples, and a test set of 10,000 examples.
# 28 x 28 pixels
# load a nn model ex: /models/nn_mnist_iter1600000.pickle
model_fname = 'nn_mnist_iter1600000.pickle'
nn = ann_util.deserialize('models/' + model_fname)
I am a Machine Learning Engineer, Deep Learning Engineer, and even an Indie Game Developer, with a major in Compilers and a Master's degree in Artificial Intelligence from University Politehnica of Bucharest.
I am passionate about Games and Artificial Intelligence. I love giving life to A.I. agents in my own projects and in my friends' projects, and I want to teach you to do the same.