Self-Supervised Learning A-Z: Theory & Hands-On Python
What you'll learn
- Self-Supervised Learning | Representation Learning | Contrastive Learning | SimCLR of Chen et al. (2020)
- Pretext Model | Downstream Model | Transfer Learning | Fine-Tuning
- Machine Learning | Deep Learning | Supervised Learning | Unsupervised Learning
- Python 3+ | TensorFlow | Google Colab
Requirements
- Python 3+ Programming Skills
- Machine Learning Knowledge
- Familiarity with TensorFlow 2.x
- A Gmail Account & a Web Browser (Preferably Google Chrome)
“If intelligence is a cake, the bulk is self-supervised learning, the icing on the cake is supervised learning, and the cherry on the cake is reinforcement learning.”
Yann André LeCun
Chief AI Scientist at Meta
Some “Musts” Before Starting
You must be familiar with deep learning architectures, including stacks of convolutional, recurrent, dense, pooling (e.g., average pooling), and normalization layers, using the TensorFlow library in Python 3+.
You must know how to develop, train, and test multi-layer deep learning models using the TensorFlow library in Python 3+.
You must know that this is a “100% Money Back Guarantee” course under Udemy rules.
My name is Mohammad H. Rafiei, Ph.D. I am honored and humbled to serve as your instructor.
I am a machine learning engineer, researcher, and instructor at Johns Hopkins University, College of Engineering, and Georgia State University, Department of Computer Science. I am also the founder of MHR Group LLC in Georgia.
Subject & Materials
This course teaches you “Self-Supervised Learning” (SSL), also known as “Representation Learning.”
SSL is a relatively new and active area of machine learning for learning useful representations from repositories with limited labeled data.
There are two general families of SSL techniques: contrastive and generative. This course focuses exclusively on supervised and unsupervised contrastive models.
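To give a concrete feel for the contrastive family, here is a minimal NumPy sketch (not taken from the course notebooks) of the NT-Xent objective used by SimCLR of Chen et al. (2020): embeddings of two augmented views of the same image are pulled together, while all other embeddings in the batch act as negatives. The function name and toy data below are illustrative assumptions.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent (normalized temperature-scaled cross-entropy) loss.

    z1[i] and z2[i] are embeddings of two augmented views of the same
    example (the positive pair); every other embedding in the batch
    serves as a negative.
    """
    z = np.concatenate([z1, z2], axis=0)               # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalize rows
    sim = z @ z.T / temperature                        # scaled cosine similarities
    np.fill_diagonal(sim, -np.inf)                     # exclude self-similarity
    n = len(z1)
    # The positive partner of row i is row i+n, and vice versa.
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # Log-softmax over each row, then pick out the positive's log-probability.
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

rng = np.random.default_rng(0)
z1 = rng.normal(size=(4, 8))
loss_random = nt_xent_loss(z1, rng.normal(size=(4, 8)))   # unrelated "views"
loss_aligned = nt_xent_loss(z1, z1)                       # identical "views"
print(loss_aligned, loss_random)
```

Identical views produce a lower loss than unrelated ones, which is exactly the behavior a contrastive pretext model exploits during pretraining.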
There are several examples and experiments throughout this course to help you fully grasp the ideas behind SSL.
Our domain of focus is the image domain, but you can apply what you learn to other domains, including time-series records and natural language processing (NLP).
In every lecture, you can access the corresponding Python .ipynb notebook. The notebooks run best with a GPU accelerator. Watch the following lecture for more details.
If the videos are too fast or too slow, you can always adjust their playback speed. You can also turn on captions.
It is best to watch the videos at 1080p quality with captions on.
The lectures are created to work best on Google Colab with GPU accelerators.
The TensorFlow version used in these lectures is 2.8.2. You may place %tensorflow_version 2.x in the very first cell of your Python notebook to select the 2.x runtime on Colab.
Machine learning libraries in Python, including TensorFlow, are evolving. As such, you must keep up with changes and update your code accordingly.
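As a sketch of the setup the lectures assume, a first Colab notebook cell might pin the runtime and confirm the TensorFlow version and GPU availability. Note the %tensorflow_version magic is Colab-only and has been removed from newer Colab runtimes (which default to TensorFlow 2.x), so it is commented out here.

```python
# Sketch of a first Colab notebook cell (environment check, not course code):
# %tensorflow_version 2.x   # Colab-only magic; newer runtimes no longer
#                           # support it and already default to TensorFlow 2.x.
import tensorflow as tf

print(tf.__version__)                          # the lectures were recorded with 2.8.2
print(tf.config.list_physical_devices("GPU"))  # non-empty when a GPU runtime is attached
```

If the GPU list is empty, switch the Colab runtime type to GPU before running the notebooks; training the pretext models on CPU is far slower.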
Four Sections and Ten Lectures:
Section 01: Introduction.
Lecture 01: An Introduction to the Course.
Lecture 02: Python Notebooks.
Section 02: Supervised Models.
Lecture 03: Supervised Learning.
Lecture 04: Transfer Learning & Fine-Tuning.
Section 03: Labeling Task.
Lecture 05: Labeling Challenges.
Section 04: Self-Supervised Learning.
Lecture 06: Self-Supervised Learning.
Lecture 07: Supervised Contrastive Pretext, Experiment 1.
Lecture 08: Supervised Contrastive Pretext, Experiment 2.
Lecture 09: SimCLR, An Unsupervised Contrastive Pretext Model.
Lecture 10: SimCLR Experiment.
Who this course is for:
- Machine Learning Students and Enthusiasts
- Those Who Want to Learn Self-Supervised Learning and Practice It in Python 3+
Dr. Mohammad H. Rafiei obtained his Ph.D. in civil engineering from The Ohio State University in 2016. Since then, he has worked as a postdoc and research scientist faculty member in civil, mechanical, and materials engineering, computer science, and medicine in different departments at The Ohio State University, Johns Hopkins University, and Georgia State University. Dr. Rafiei’s research focuses on applying advances in computer technology, such as advanced deep learning techniques, to bring automation to different areas of engineering and medicine.