Natural Language Processing: NLP With Transformers in Python
What you'll learn
- Industry-standard NLP using transformer models
- Build full-stack question-answering transformer models
- Perform sentiment analysis with transformer models in PyTorch and TensorFlow
- Advanced search technologies like Elasticsearch and Facebook AI Similarity Search (FAISS)
- Create fine-tuned transformer models for specialized use-cases
- Measure performance of language models using advanced metrics like ROUGE
- Vector-building techniques like BM25 and dense passage retrieval (DPR)
- An overview of recent developments in NLP
- Understand attention and other key components of transformers
- Learn about key transformer models such as BERT
- Preprocess text data for NLP
- Named entity recognition (NER) using spaCy and transformers
- Fine-tune language classification models
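To give a flavor of one of the metrics listed above, here is a minimal pure-Python sketch of ROUGE-1, which scores a generated text against a reference by unigram overlap. This is an illustration only, not the course's implementation; in practice you would use a maintained package rather than hand-rolling the metric.

```python
from collections import Counter

def rouge_1(candidate: str, reference: str) -> dict:
    """Compute ROUGE-1 precision, recall, and F1 from unigram overlap.

    Simplified sketch: whitespace tokenization, no stemming. Production
    code would use a maintained ROUGE package instead.
    """
    cand_counts = Counter(candidate.lower().split())
    ref_counts = Counter(reference.lower().split())
    # Clipped overlap: each unigram counts at most as often as it
    # appears in the reference.
    overlap = sum((cand_counts & ref_counts).values())
    precision = overlap / max(sum(cand_counts.values()), 1)
    recall = overlap / max(sum(ref_counts.values()), 1)
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"precision": precision, "recall": recall, "f1": f1}

scores = rouge_1("the cat sat on the mat", "the cat lay on the mat")
```

Here five of the six candidate unigrams appear in the reference, so precision, recall, and F1 all come out to 5/6.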
Requirements
- Knowledge of Python
- Experience in data science a plus
- Experience in NLP a plus
Description
Transformer models are the de facto standard in modern NLP. They have proven themselves to be the most expressive, powerful models for language by a large margin, beating all major language benchmarks time and time again.
In this course, we cover everything you need to start building high-performance NLP applications using transformer models like Google AI's BERT and Facebook AI's DPR.
We cover several key NLP frameworks including:
HuggingFace's Transformers
TensorFlow 2
PyTorch
spaCy
NLTK
Flair
And learn how to apply transformers to some of the most popular NLP use-cases:
Language classification/sentiment analysis
Named entity recognition (NER)
Question answering (QA)
Similarity/comparative learning
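The similarity use-case above typically boils down to comparing embedding vectors. As a rough sketch (with made-up toy vectors, not output from a real model), cosine similarity between two embeddings can be computed like this:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional "sentence embeddings"; real transformer models
# produce vectors with hundreds of dimensions.
emb_query = [0.2, 0.9, 0.1, 0.4]
emb_doc_a = [0.25, 0.85, 0.05, 0.5]
emb_doc_b = [0.9, 0.1, 0.8, 0.0]

sim_a = cosine_similarity(emb_query, emb_doc_a)  # points in a similar direction
sim_b = cosine_similarity(emb_query, emb_doc_b)  # points in a different direction
```

Vectors that point in similar directions score near 1, which is how dense retrieval systems like DPR (paired with FAISS) rank passages against a query.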
Throughout each of these use-cases we work through a variety of examples to ensure we understand what transformers are, how they work, and why they are so important. Alongside these sections we also work through two full-size NLP projects: one performing sentiment analysis on financial Reddit data, and another building a fully-fledged open-domain question-answering application.
All of this is supported by several other sections that help us learn how to better design, implement, and measure the performance of our models, such as:
History of NLP and where transformers come from
Common preprocessing techniques for NLP
The theory behind transformers
How to fine-tune transformers
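The attention mechanism at the heart of transformer theory can be sketched in a few lines. This is a pure-Python illustration of scaled dot-product attention over toy vectors, not the batched tensor implementation used in PyTorch or TensorFlow:

```python
import math

def scaled_dot_product_attention(queries, keys, values):
    """Scaled dot-product attention over lists of vectors.

    Educational sketch only; real frameworks implement this with
    batched matrix multiplications on tensors.
    """
    d_k = len(keys[0])
    outputs = []
    for q in queries:
        # Score each key against the query, scaled by sqrt(d_k)
        # to keep the softmax well-behaved.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in keys]
        # Softmax turns scores into weights that sum to 1.
        max_s = max(scores)
        exps = [math.exp(s - max_s) for s in scores]
        total = sum(exps)
        weights = [e / total for e in exps]
        # Output is the attention-weighted average of the value vectors.
        out = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
        outputs.append(out)
    return outputs

out = scaled_dot_product_attention(
    queries=[[1.0, 0.0]],
    keys=[[1.0, 0.0], [0.0, 1.0]],
    values=[[1.0, 2.0], [3.0, 4.0]],
)
```

The query attends most to the key it aligns with, so the output is pulled toward that key's value vector, which is the core idea behind how transformers weigh context.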
We cover all this and more. I look forward to seeing you in the course!
Who this course is for:
- Aspiring data scientists and ML engineers interested in NLP
- Practitioners looking to upgrade their skills
- Developers looking to implement NLP solutions
- Data scientists
- Machine learning engineers
- Python developers
Instructor
James is an ML engineer with experience working with Silicon Valley startups, the big four accountancy firms, and other leading financial institutions.
Since entering the world of data science and machine learning, James has specialized in natural language, working on many successful, production-level NLP projects with industry-standard technologies.
Aside from his wide-ranging industry experience, James is a prolific writer and content creator, with the goal of sharing the fascinating world of machine learning (and in particular NLP) with all those who will listen. James' articles alone have gathered more than two million readers.
Coming from a self-taught background, James understands the difficult and winding path towards becoming a data scientist or machine learning engineer. His goal is to deliver content that illuminates that path for others and helps them on their own journey.