What you'll learn
- Python
- PyTorch
- BERT
- Deep Learning
- Image Processing
- Natural Language Processing
- Neural Network
- Gradient Descent
- Transformers
- Hugging Face
- T5
Requirements
- Basic Python syntax
- Basic programming skills
Description
PyTorch & Hugging Face Deep Learning Course (Colab Hands-On)
Welcome to the PyTorch Deep Learning From Zero To Hero series.
If you have already mastered the basic syntax of Python and are not sure what to do next, this course will be the rocket booster that takes your programming skills to a business-ready level.
In this course, you will learn to implement deep neural networks yourself in Colab, from the very beginning (a simple perceptron) all the way to BERT transfer learning and Google's T5, using PyTorch and Hugging Face. Each section includes one assignment for you to think through and code yourself.
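To give a concrete flavor of the early sections, below is a minimal sketch of a single perceptron (one linear unit) trained with gradient descent in PyTorch. It is an illustrative example only, using an assumed toy AND dataset rather than the course's actual notebook code; a second sketch covering the Hugging Face side follows the agenda below.

```python
# Minimal illustrative sketch (not the course notebook): one linear unit
# trained with plain gradient descent in PyTorch on a toy AND dataset.
import torch

# Toy data: the logical AND function.
X = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = torch.tensor([[0.], [0.], [0.], [1.]])

model = torch.nn.Linear(2, 1)                            # weights + bias = one perceptron
loss_fn = torch.nn.BCEWithLogitsLoss()                   # binary classification loss
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # plain gradient descent

for epoch in range(1000):
    optimizer.zero_grad()          # reset gradients from the previous step
    logits = model(X)              # forward pass
    loss = loss_fn(logits, y)      # measure the error
    loss.backward()                # backpropagate gradients
    optimizer.step()               # update the weights

print(torch.sigmoid(model(X)).round())  # predictions after training
```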
The Agenda is below.
Agenda:
Introduction
Google Colaboratory
Neuron
Perceptron
Make Your Perceptron Trainable
Normalize Data
Activation Function
Loss Function
Gradient Descent
Elegant PyTorch Gradient Descent
Final Project
Final Project Explained
Multi-Layer Perceptron (MLP)
One Hot Encoding
Prepare Data for MLP
Define MLP
Train & Evaluate MLP
Final Project for MLP
FCNN Explained
FCNN LOVE Letters Classification using MLP
Final Project For FCNN
CNN Explained
CNN Prepare Data (Fashion MNIST)
CNN Define Model
CNN Train & Evaluate Model
CNN Inference
Final Project For CNN
RNN Explained
RNN Prepare Data
RNN Define Model
RNN Train Model
RNN Inference
BERT Sesame Street
BERT Prepare Data IMDB
BERT Model Definition
BERT Model Training
BERT Model Evaluation
BERT Model Prediction
BERT Final Project
T5 Prepare Data
T5 Model Definition
T5 Model Training
T5 Model Evaluation
T5 Model Prediction
T5 Final Project
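To illustrate where the course ends up, here is a minimal sketch of the Hugging Face Transformers pipeline API for BERT-style sentiment analysis and T5 summarization. The model names and example texts are assumptions chosen for illustration; the BERT and T5 sections of the course go further and fine-tune the models themselves (for example on IMDB) rather than only running pretrained pipelines.

```python
# Minimal illustrative sketch (assumed models and texts, not the course notebooks):
# pretrained pipelines from the Hugging Face transformers library.
from transformers import pipeline

# BERT-family sentiment classifier (the course fine-tunes BERT on IMDB itself).
classifier = pipeline("sentiment-analysis")
print(classifier("This course was a rocket booster for my PyTorch skills!"))

# T5-family abstractive summarization (the course trains a T5 summarizer itself).
summarizer = pipeline("summarization", model="t5-small")
print(summarizer(
    "PyTorch is an open source machine learning framework that accelerates "
    "the path from research prototyping to production deployment.",
    max_length=20, min_length=5,
))
```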
Let's start our journey together.
Beautiful is better than ugly.
Explicit is better than implicit.
Simple is better than complex.
Complex is better than complicated.
Flat is better than nested.
Sparse is better than dense.
Readability counts.
Special cases aren’t special enough to break the rules.
Although practicality beats purity.
Errors should never pass silently.
Unless explicitly silenced.
In the face of ambiguity, refuse the temptation to guess.
There should be one-- and preferably only one --obvious way to do it.
Although that way may not be obvious at first unless you’re Dutch.
Now is better than never.
Although never is often better than *right* now.
If the implementation is hard to explain, it’s a bad idea.
If the implementation is easy to explain, it may be a good idea.
Namespaces are one honking great idea — let’s do more of those!
Who this course is for:
- Beginner Python developers who are curious about deep learning
- Beginner Python developers who are curious about PyTorch
- Beginner Python developers who are curious about Natural Language Processing
- Beginner Python developers who are curious about Hugging Face
- Python developers who are curious about implementing a BERT transfer learning model
- Python developers who are curious about implementing a T5 generative summarization model
Instructor
English Name: Joshua K. Cage
Data Scientist/NLPer at a global research institute.
Author of the Amazon Kindle books below:
1) Python Natural Language Processing (NLP) Exercises : From Basics to BERT
2) Python Data Analysis for Newbies: Numpy/pandas/matplotlib/scikit-learn/keras
3) Python Numpy 101 Exercises: Skyrocket your Python skill
==
Graduated from the Faculty of Science and Engineering at Waseda University and completed graduate school at Waseda University. Currently engaged in research and development on natural language processing and machine learning at a company.
His books include 『Python自然言語処理101本ノック:: ~基礎からBERTまで~』, 『初心者向けPythonデータ分析入門: Numpy/Pandas/Matplotlib/Scikit-learn/Keras対応』, 『実践で理解する G検定 ディープラーニング教本: G検定合格者が教える最短で合格する秘法』, and 『詳解!実践で理解するG検定 Web模試 解説書 Kindle版』.