Find online courses made by experts from around the world.
Take your courses with you and learn anywhere, anytime.
Learn and practice real-world skills and achieve your goals.
This course will get you started in building your FIRST artificial neural network using deep learning techniques. Following my previous course on logistic regression, we take this basic building block, and build full-on non-linear neural networks right out of the gate using Python and Numpy. All the materials for this course are FREE.
We extend the previous binary classification model to multiple classes using the softmax function, and we derive the very important training method called "backpropagation" using first principles. I show you how to code backpropagation in Numpy, first "the slow way", and then "the fast way" using Numpy features.
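To give a flavor of what this looks like, here is a minimal sketch of the softmax function in Numpy (the function and variable names are illustrative, not the course's actual code):

```python
import numpy as np

def softmax(a):
    # subtract the max for numerical stability; it cancels out and
    # does not change the result
    e = np.exp(a - a.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

a = np.array([2.0, 1.0, 0.1])
p = softmax(a)
# p sums to 1 and can be read as probabilities over 3 classes
```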
Next, we implement a neural network using Google's new TensorFlow library.
You should take this course if you are interested in starting your journey toward becoming a master at deep learning, or if you are interested in machine learning and data science in general. We go beyond basic models like logistic regression and linear regression and I show you something that automatically learns features.
This course provides you with many practical examples so that you can really see how deep learning can be used on anything. Throughout the course, we'll do a course project, which will show you how to predict user actions on a website given user data like whether or not that user is on a mobile device, the number of products they viewed, how long they stayed on your site, whether or not they are a returning visitor, and what time of day they visited.
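As a rough sketch of the kind of pre-processing involved (the column layout and bucket count below are made up for illustration), a categorical feature like time of day gets one-hot encoded so the network doesn't treat it as an ordered number:

```python
import numpy as np

# Hypothetical rows: [is_mobile, n_products_viewed, visit_duration,
#                     is_returning, time_of_day]
# time_of_day is a category in {0, 1, 2, 3}
X = np.array([
    [1, 3, 12.5, 0, 2],
    [0, 1,  4.0, 1, 0],
])
N, D = X.shape
T = 4  # number of time-of-day buckets (assumed)
X2 = np.zeros((N, D - 1 + T))
X2[:, :D-1] = X[:, :D-1]                            # copy the numeric columns
X2[np.arange(N), D - 1 + X[:, -1].astype(int)] = 1  # one-hot the last column
```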
Another project at the end of the course shows you how you can use deep learning for facial expression recognition. Imagine being able to predict someone's emotions just based on a picture!
After getting your feet wet with the fundamentals, I provide a brief overview of some of the newest developments in neural networks - slightly modified architectures and what they are used for.
If you already know about softmax and backpropagation, and you want to skip over the theory and speed things up using more advanced techniques along with GPU-optimization, check out my follow-up course on this topic, Data Science: Practical Deep Learning Concepts in Theano and TensorFlow.
I have other courses that cover more advanced topics, such as Convolutional Neural Networks, Restricted Boltzmann Machines, Autoencoders, and more! But you'll want to be very comfortable with the material in this course before moving on to more advanced subjects.
This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.
All the code for this course can be downloaded from my github: /lazyprogrammer/machine_learning_examples
In the directory: ann_class
Make sure you always "git pull" so you have the latest version!
HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:
TIPS (for getting through the course):
USEFUL COURSE ORDERING:
Not for you? No problem.
30 day money back guarantee.
Learn on the go.
Desktop, iOS and Android.
Certificate of completion.
|Section 1: What is a neural network?|
Overview of the course and prerequisites.
An almost purely qualitative description of neural networks.
Where does this course fit into your deep learning studies?
Deep Learning Readiness Test
Introduction to the E-Commerce Course Project
|Section 2: Classifying more than 2 things at a time|
From Logistic Regression to Neural Networks
What's the function we use to classify more than 2 things?
Sigmoid vs. Softmax
Where to get the code for this course
How do we code the softmax in Python?
Let's extend softmax and code the entire calculation from input to output.
E-Commerce Course Project: Pre-Processing the Data
E-Commerce Course Project: Making Predictions
|Quiz 1||1 question|
What do you get if you don't use a non-linearity such as sigmoid, tanh, rectifier, or softmax?
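This quiz question can be checked numerically. The sketch below (with arbitrary, made-up shapes) shows that stacking layers without a non-linearity collapses into a single linear model:

```python
import numpy as np

np.random.seed(0)
W1 = np.random.randn(4, 5)
W2 = np.random.randn(5, 3)
x = np.random.randn(4)

# two "layers" with no non-linearity in between...
out_two_layers = (x @ W1) @ W2
# ...are exactly one linear layer with weight matrix W1 @ W2
out_one_layer = x @ (W1 @ W2)
assert np.allclose(out_two_layers, out_one_layer)
```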
|Quiz 2||1 question|
|Section 3: Training a neural network|
What does it mean to "train" a neural network?
Derivation of backpropagation from first principles: defining the objective function, taking the log, and differentiating it with respect to the weights in each layer.
A further look into backpropagation.
Backpropagation for deeper networks, exposing the structure, and how to code it more efficiently.
How to code backpropagation in Python using Numpy operations vs. slow for loops.
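A sketch of the "slow way" vs. "fast way" idea, using the softmax cross-entropy gradient for the hidden-to-output weights (shapes and variable names are illustrative):

```python
import numpy as np

np.random.seed(1)
N, M, K = 10, 5, 3          # samples, hidden units, classes
Z = np.random.randn(N, M)   # hidden-layer activations
T = np.zeros((N, K))
T[np.arange(N), np.random.randint(K, size=N)] = 1  # one-hot targets
Y = np.random.rand(N, K)    # predicted probabilities (placeholder values)

# "the slow way": explicit for loops over every index
slow = np.zeros((M, K))
for n in range(N):
    for m in range(M):
        for k in range(K):
            slow[m, k] += Z[n, m] * (T[n, k] - Y[n, k])

# "the fast way": one vectorized Numpy expression
fast = Z.T @ (T - Y)
```

Both compute the same gradient; the vectorized form is dramatically faster because the loops run inside optimized C code rather than in Python.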
The WRONG Way to Learn Backpropagation
E-Commerce Course Project: Training Logistic Regression with Softmax
E-Commerce Course Project: Training a Neural Network
Backpropagation for binary output
|Section 4: Practical Machine Learning|
What are the donut and XOR problems again?
We look again at the XOR and donut problems from logistic regression. The features are now learned automatically.
Try the Donut and XOR yourself
sigmoid, tanh, relu along with their derivatives
Tips on choosing learning rate, regularization penalty, number of hidden units, and number of hidden layers.
Manually Choosing Learning Rate and Regularization Penalty
|Section 5: TensorFlow, exercises, practice, and what to learn next|
A look at Google's new TensorFlow library.
Visualizing what a neural network has learned using TensorFlow Playground
|What did you learn? What didn't you learn? Where can you learn more?|
You know more than you think you know
How to get good at deep learning + exercises
|Section 6: Project: Facial Expression Recognition|
Facial Expression Recognition Problem Description
The class imbalance problem
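One common remedy for class imbalance, sketched below with made-up data, is to repeat the minority class until the classes are roughly balanced (other options include class weights or undersampling the majority class):

```python
import numpy as np

np.random.seed(2)
X = np.random.randn(100, 3)
y = (np.random.rand(100) < 0.1).astype(int)  # ~10% positive: imbalanced

# repeat the minority-class rows until the classes are roughly balanced
pos = np.where(y == 1)[0]
neg = np.where(y == 0)[0]
reps = len(neg) // max(len(pos), 1)
idx = np.concatenate([neg] + [pos] * reps)
Xb, yb = X[idx], y[idx]
```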
Facial Expression Recognition in Code (Binary / Sigmoid)
Facial Expression Recognition in Code (Logistic Regression Softmax)
Facial Expression Recognition in Code (ANN Softmax)
|Section 7: Appendix|
How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow
Gradient Descent Tutorial
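The core idea of gradient descent fits in a few lines; this toy sketch (not from the course materials) minimizes a simple one-dimensional quadratic:

```python
# minimize f(w) = (w - 3)^2 with plain gradient descent
w = 0.0
lr = 0.1            # learning rate (arbitrary choice for this toy problem)
for _ in range(100):
    grad = 2 * (w - 3)   # derivative of (w - 3)^2
    w -= lr * grad
# w converges toward the minimum at w = 3
```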
I am a data scientist, big data engineer, and full stack software engineer.
For my master's thesis I worked on brain-computer interfaces using machine learning. These help non-verbal and non-mobile persons communicate with their family and caregivers.
I have worked in online advertising and digital media as both a data scientist and big data engineer, and built various high-throughput web services around said data. I've created new big data pipelines using Hadoop/Pig/MapReduce. I've created machine learning models to predict click-through rate, built news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering, and validated the results using A/B testing.
I have taught undergraduate and graduate students in data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics at universities such as Columbia University, NYU, Humber College, and The New School.