Master Designing, Integrating & Deploying Enterprise AI Apps
What you'll learn
- A complete end-to-end solution consisting of 3 distributed applications using asyncio, FlatBuffers, NATS and Docker
- Translate the requirements of a big and complex machine learning project into a scalable solution
- How to divide a complex problem into simple & manageable parts using a microservices-style architecture
- Foundations, insights and practical usage of Asynchronous IO in Python
- How to design high-performance, low-resource and future-proof data formats & protocols using FlatBuffers
- Loosely coupled distributed app development using a message bus (NATS)
- Packaging, Deploying & Upgrading applications using Docker & Docker Compose
- Practical code examples to support the concepts taught in this course and the fully developed final solution
Requirements
- A good working knowledge of the Python language
Description
Target Audience
Machine Learning Engineers & Data Scientists
What is unique about this course & What will you learn?
The why, what & how of designing, integrating & deploying enterprise-level Data Science/AI/ML applications
How to translate requirements into scalable architectural components?
How to break a big, complex problem into simple & manageable parts using a microservices-style architecture?
An End-to-End real-world enterprise-level machine learning solution
Asynchronous IO - Foundations & writing I/O-bound applications in Python 3 (a short sketch follows this list)
NATS - A Cloud Native Computing Foundation open-source project to connect distributed applications (see the sketch after this list)
FlatBuffers - A language-independent, compact and fast binary serialization format (see the sketch after this list)
Docker & Docker Compose - The gold standard for deploying and orchestrating applications
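To give you a flavor of the asyncio material, here is a minimal sketch (not course code) of running several simulated I/O-bound calls concurrently in Python 3; the task names and delays are placeholders:

```python
import asyncio

async def fetch(name: str, delay: float) -> str:
    # Simulate an I/O-bound call (network, disk, message bus) with a sleep.
    await asyncio.sleep(delay)
    return f"{name}: done after {delay}s"

async def main() -> None:
    # gather() runs all three "requests" concurrently, not one after another,
    # so the total time is ~2s (the longest delay) instead of ~3.5s.
    results = await asyncio.gather(
        fetch("sensor-a", 1.0),
        fetch("sensor-b", 0.5),
        fetch("sensor-c", 2.0),
    )
    for line in results:
        print(line)

asyncio.run(main())
```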
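In the same spirit, a minimal NATS publish/subscribe sketch, assuming the nats-py client package and a NATS server reachable on localhost; the subject name and payload are made up:

```python
import asyncio
import nats  # pip install nats-py

async def main() -> None:
    # Assumes a NATS server is running locally (e.g. started via Docker).
    nc = await nats.connect("nats://localhost:4222")

    async def handler(msg) -> None:
        # NATS payloads are opaque bytes; decode only for display.
        print(f"received on '{msg.subject}': {msg.data.decode()}")

    await nc.subscribe("predictions", cb=handler)  # consumer side
    await nc.publish("predictions", b"cat: 0.97")  # producer side

    await nc.flush()           # ensure the publish reached the server
    await asyncio.sleep(0.1)   # give the handler a chance to run
    await nc.drain()           # unsubscribe and close cleanly

asyncio.run(main())
```

Note how the publisher and subscriber never reference each other, only the subject - that is the loose coupling a message bus buys you.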
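And a FlatBuffers sketch: the `MySchema.Detection` module and its field names below are hypothetical stand-ins for code that `flatc --python` would generate from a two-field schema; only the `flatbuffers` builder API itself is taken as given:

```python
import flatbuffers
# Hypothetical generated module, produced by `flatc --python` from a schema like:
#   table Detection { label: string; score: float; }
#   root_type Detection;
from MySchema.Detection import (
    Detection, DetectionStart, DetectionAddLabel, DetectionAddScore, DetectionEnd,
)

def encode(label: str, score: float) -> bytes:
    builder = flatbuffers.Builder(256)
    label_off = builder.CreateString(label)  # strings are created before the table
    DetectionStart(builder)
    DetectionAddLabel(builder, label_off)
    DetectionAddScore(builder, score)
    builder.Finish(DetectionEnd(builder))
    return bytes(builder.Output())

buf = encode("cat", 0.97)
det = Detection.GetRootAs(buf, 0)  # zero-copy access: no parse/unpack step
print(det.Label().decode(), det.Score())
```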
Why should you learn all this?
A statistical or deep learning model is not an application; rather, it is an important component of a solution to a real-world problem. A sophisticated solution to a complex problem generally consists of multiple applications, written in different languages and running on a cluster of machines.
Your role as a Data Scientist or Machine Learning Engineer is not limited to building a model or tuning its performance. At the very minimum, you are expected to design your applications so that they integrate easily with the other applications of a larger solution and are easily deployable using modern DevOps methodologies.
Mastering how to integrate AI applications with other applications, while ensuring scalability and upgradability, will give you a competitive advantage.
The good news is that mastering them is not difficult at all!
How is this course taught?
My teaching style covers 3 key aspects of mastering any technology:
Intuition
Theory
Code
For any solution, I first describe the overall goal, its associated challenges, and how to break the big, complex problem into manageable components. This process of simplifying the problem into components will guide you in identifying & selecting the best technology to use. I then explain the why, what & how of the selected technologies (AsyncIO, NATS, FlatBuffers, Docker) with code examples. These code examples start simple, and I iteratively add features to bring them to the level of real-world applications.
I have taken immense care in preparing the material, including animations that help you develop intuition for the solutions.
I have made sure that the coding sessions follow an iterative development style and, more importantly, are clear & delightful.
All the source code from the iterative cycles, as well as the full end-to-end solution, is provided in the resources.
Who this course is for:
- Machine Learning engineers
- Data Scientists
- Software Engineers
Instructor
I am an engineer and applied research scientist with 20+ years of experience and expertise in diverse domains - embedded operating systems, language runtimes, security, cryptography & high-performance computing on edge devices.
I am very well versed in C, C++, Java, C#, TypeScript & Python. I have been fortunate to design & develop real-world products with millions of users worldwide. However, these days Python has become my main language because of machine learning.
In the last few years, I have mastered deep learning with a special focus on computer vision, probabilistic programming, and generative networks. I have been fortunate to work on real-world problems in this space as well, and the pace of research and discoveries in this area fascinates me.
After working on computer vision products, I was troubled by two main challenges: the lack of explainability of modern deep models and how to include domain information in models. The latter led me to discover Bayesian probabilistic models, and ever since I have been on a journey to master the intricacies of this field.
My passion and strength lie in breaking complex subjects into simple-to-understand parts. I have always believed that sharing knowledge is a gift one gives to oneself, and I hope that with modern platforms for sharing knowledge I can now reach more people.
All this being said, even after many years in the industry and experience across multiple domains, I consider myself a perpetual student of science and technology, still excited by both theoretical underpinnings and real-world implementation details.