Deep Learning: Convolutional Neural Networks in Python
What you'll learn
- Understand convolution and why it's useful for Deep Learning
- Understand and explain the architecture of a convolutional neural network (CNN)
- Implement a CNN in TensorFlow 2
- Apply CNNs to challenging Image Recognition tasks
- Apply CNNs to Natural Language Processing (NLP) for Text Classification (e.g. Spam Detection, Sentiment Analysis)
Requirements
- Basic math (taking derivatives, matrix arithmetic, probability) is helpful
- Python, Numpy, Matplotlib
*** NOW IN TENSORFLOW 2 and PYTHON 3 ***
Learn about one of the most powerful Deep Learning architectures yet!
The Convolutional Neural Network (CNN) has been used to obtain state-of-the-art results in computer vision tasks such as object detection, image segmentation, and generating photo-realistic images of people and things that don't exist in the real world!
This course will teach you the fundamentals of convolution and why it's useful for deep learning and even NLP (natural language processing).
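To give a taste of those fundamentals, here is a minimal sketch (not taken from the course materials) of a 2D "valid" cross-correlation in plain Numpy — the operation deep learning libraries call "convolution". The edge-detector kernel and tiny image are illustrative examples:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """'Valid' 2D cross-correlation (what DL libraries call convolution):
    slide the kernel over the image and sum elementwise products."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A tiny image with a vertical edge (0s on the left, 1s on the right)
img = np.array([[0., 0., 1., 1.],
                [0., 0., 1., 1.],
                [0., 0., 1., 1.]])

# A simple vertical-edge-detecting kernel
edge_kernel = np.array([[1., -1.],
                        [1., -1.]])

print(conv2d_valid(img, edge_kernel))
```

The output is zero everywhere except at the column where the image jumps from 0 to 1, which is exactly why convolution is useful as a local pattern detector.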
You will learn about modern techniques such as data augmentation and batch normalization, and build modern architectures such as VGG yourself.
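For a rough idea of what a VGG-style architecture looks like, here is a hedged sketch in TensorFlow 2's Keras functional API. The layer counts, filter sizes, and input shape are illustrative assumptions, not the course's exact model:

```python
import tensorflow as tf
from tensorflow.keras import layers

def vgg_block(x, filters):
    """One VGG-style block: stacked 3x3 convolutions, batch normalization,
    then 2x2 max-pooling. (Sizes here are illustrative.)"""
    x = layers.Conv2D(filters, 3, padding='same', activation='relu')(x)
    x = layers.Conv2D(filters, 3, padding='same', activation='relu')(x)
    x = layers.BatchNormalization()(x)
    return layers.MaxPooling2D(2)(x)

inputs = tf.keras.Input(shape=(32, 32, 3))   # e.g. CIFAR-10-sized images
x = vgg_block(inputs, 32)
x = vgg_block(x, 64)
x = layers.Flatten()(x)
outputs = layers.Dense(10, activation='softmax')(x)  # 10 classes, assumed
model = tf.keras.Model(inputs, outputs)
```

The pattern to notice is how the spatial dimensions shrink (via pooling) while the number of filters grows — the defining shape of VGG-family networks.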
This course will teach you:
- The basics of machine learning and neurons (just a review to get you warmed up!)
- Neural networks for classification and regression (just a review to get you warmed up!)
- How to model image data in code
- How to model text data for NLP (including preprocessing steps for text)
- How to build a CNN using TensorFlow 2
- How to use batch normalization and dropout regularization in TensorFlow 2
- How to do image classification in TensorFlow 2
- How to do data preprocessing for your own custom image dataset
- How to use Embeddings in TensorFlow 2 for NLP
- How to build a Text Classification CNN for NLP (examples: spam detection, sentiment analysis, part-of-speech tagging, named entity recognition)
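The text-classification items above can be sketched as a small Conv1D model in TensorFlow 2. This is an illustrative outline, not the course's actual code; the vocabulary size, sequence length, and layer sizes are hypothetical, and binary output stands in for a task like spam detection:

```python
import tensorflow as tf
from tensorflow.keras import layers

V = 20000  # assumed vocabulary size
T = 100    # assumed (padded) sequence length

inputs = tf.keras.Input(shape=(T,))
x = layers.Embedding(V, 32)(inputs)            # token IDs -> dense vectors
x = layers.Conv1D(64, 3, activation='relu')(x) # 1-D convolution over time
x = layers.GlobalMaxPooling1D()(x)             # pool away the time dimension
x = layers.Dropout(0.2)(x)                     # dropout regularization
outputs = layers.Dense(1, activation='sigmoid')(x)  # e.g. spam probability
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='binary_crossentropy',
              metrics=['accuracy'])
```

Note how the same convolution idea from images carries over: a Conv1D filter slides along the token sequence instead of across pixel rows and columns.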
All of the materials required for this course can be downloaded and installed for FREE. We will do most of our work in Numpy, Matplotlib, and TensorFlow. I am always available to answer your questions and help you along your data science journey.
This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.
Suggested prerequisites:
- matrix addition and multiplication
- basic probability (conditional and joint distributions)
- Python coding: if/else, loops, lists, dicts, sets
- Numpy coding: matrix and vector operations, loading a CSV file
WHAT ORDER SHOULD I TAKE YOUR COURSES IN?:
Check out the lecture "Machine Learning and AI Prerequisite Roadmap" (available in the FAQ of any of my courses, including the free Numpy course)
Unique features of this course:
- Every line of code explained in detail - email me any time if you disagree
- No wasted time "typing" on the keyboard like other courses - let's be honest, nobody can really write code worth learning about in just 20 minutes from scratch
- Not afraid of university-level math - get important details about algorithms that other courses leave out
Who this course is for:
- Students, professionals, and anyone else interested in Deep Learning, Computer Vision, or NLP
- Software Engineers and Data Scientists who want to level up their career
Today, I spend most of my time as an artificial intelligence and machine learning engineer with a focus on deep learning, although I have also been known as a data scientist, big data engineer, and full stack software engineer.
I received my first master's degree over a decade ago in computer engineering with a specialization in machine learning and pattern recognition. I received my second master's degree in statistics with applications to financial engineering.
My experience includes online advertising and digital media, as both a data scientist (optimizing click and conversion rates) and a big data engineer (building data processing pipelines). Some big data technologies I frequently use are Hadoop, Pig, Hive, MapReduce, and Spark.
I've created deep learning models to predict click-through rate and user behavior, as well as for image and signal processing and modeling text.
My work in recommendation systems has applied Reinforcement Learning and Collaborative Filtering, and we validated the results using A/B testing.
I have taught data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics to undergraduate and graduate students at universities such as Columbia University, NYU, Hunter College, and The New School.