NLP Certification: BERT, GPTs, HMTL & Multimodal Large Models
What you'll learn
- Understanding of Transformers, from scratch to BERT to GPT-3
- Language translation using Transformers in NLP
- Text classification and implementation of chatbots in Rasa and spaCy
- GPTs as few-shot learners & multilingual NLP
- GPT-4: what to expect?
- 50+ NLP coding exercises with complete solutions
- Attention and Multi-Head Attention in NLP Transformers
- Implement a Transformer for an NLP-based task
- Google MUM as a multilingual unified platform
- Basic familiarity with Natural Language Processing is recommended but not essential
This course introduces you to the fundamentals of Transformers in NLP. Topics include:
1. Recurrent Neural Networks & LSTM
2. Bidirectional Encoder Representations from Transformers (BERT).
3. Masked Language Modelling.
4. Next Sentence Prediction using Transformers.
5. Generative Pre-trained Transformers and their implementation in Rasa and spaCy.
6. Complete Code for Online Fraud Detection System.
7. Complete Code for Text Classification.
8. Complete Code for Language Translation System.
9. Complete Code for Movie Recommender System.
10. Complete Code for Speech-to-Text Conversion using GPT-2.
11. Complete Code for a Chatbot using GPT-3.
12. Complete Code for a Text Summary System using GPT-3.
13. Automated Essay Scoring using Transformer Models.
14. Sentiment Analysis using Pre-trained Transformers.
15. Training and Testing a GPT-2 for Novel Writing.
16. Game Design using AlphaGo and Transformers.
17. 50+ NLP coding exercises, along with complete solutions, to earn this certification.
Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) provides thousands of pre-trained models to perform tasks on different modalities such as text, vision, and audio.
These models can be applied on:
Text, for tasks like text classification, information extraction, question answering, summarization, translation, text generation, in over 100 languages.
Images, for tasks like image classification, object detection, and segmentation.
Audio, for tasks like speech recognition and audio classification.
Transformer models excel at sequential data and come pre-trained, which makes them versatile and capable, and gives you out-of-the-box functionality. They let you take a large-scale language model trained on a massive amount of text (for example, the complete works of Shakespeare) and then update the model for a specific conceptual task, far beyond mere “reading,” such as sentiment analysis or even predictive analysis.
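The mechanism that lets these models handle sequential data is attention, covered in the course. As a minimal illustration (plain Python, not library code), single-head scaled dot-product attention scores each query against every key, turns the scores into weights with a softmax, and returns a weighted average of the value vectors:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Single-head attention: each query row attends over all key rows
    and returns a weighted average of the value rows."""
    d_k = len(K[0])  # key dimension, used to scale the dot products
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)  # weights are non-negative and sum to 1
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# With identical keys, the weights are uniform, so the output is the
# plain average of the value rows.
print(scaled_dot_product_attention([[1.0, 1.0]],
                                   [[0.0, 0.0], [0.0, 0.0]],
                                   [[2.0, 0.0], [4.0, 0.0]]))  # → [[3.0, 0.0]]
```

Multi-head attention, also covered in the course, simply runs several such attention computations in parallel on learned projections of Q, K, and V, and concatenates the results.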
Who this course is for:
- Beginner students interested in learning NLP via Transformers
Prof. Dr. Engr. Junaid Zafar is currently Chairperson of the Department of Electrical and Computer Engineering at Government College University, Lahore. He is also Director of the Office of Research, Innovation and Commercialization. He completed his PhD in Electrical and Electronics Engineering at The University of Manchester, UK, and his BSc in Electrical Engineering at U.E.T Lahore. He has been an academic visitor to the University of Cambridge, UK, MMU, UK, and the National University of Ireland, and served as Dual Degree Programme Coordinator with Lancaster University, UK. Dr. Engr. Junaid Zafar received a Roll of Honour from the National Education Commission and Outstanding Teacher/Researcher Awards from the Higher Education Commission, Pakistan. He leads the Machine Learning and Artificial Intelligence Centre at GC University, Lahore. He is a member of the Universal Association of Electronics & Computer Engineers, the International Association of Computer Science & Information, the International Association of Engineers, the IAENG Societies of Artificial Intelligence, Electrical Engineering, Imaging Engineering, and Wireless Networks, the Science & Engineering Institute, and the Institute of Research Engineers & Doctors. He serves on the editorial board of the Journal of Future Technologies & Communications and on the technical programme committees of Frontiers of Information & Technologies and the Multi-Conference on Sciences & Technology. He is also a reviewer for IEEE Transactions on Microwave Theory & Techniques, IEEE Transactions on Antennas, IEEE Antenna & Wireless Propagation Letters, IEEE Transactions on Plasma Science, IEEE Transactions on Magnetics, the International Journal of Electronics, and IET Antennas & Radio-wave Propagation. He has so far taught over twenty different online courses based on outcome-based, student-oriented models, and has supervised more than 100 Masters/MPhil theses.
He has published over 50 high impact factor publications and presented his work at several national and international renowned platforms.