Hands-On Guide to Argo Workflows on Kubernetes
What you'll learn
- You will learn the core Argo Workflows concepts and features and how to apply them in practice.
- You will be able to create complex workflows, with and without cron triggers, using the different concepts and workflow functionalities.
- You will be able to create workflow templates that serve as reusable building blocks for complex workflows.
- Basic knowledge of Kubernetes is desirable, but not essential.
Argo Workflows is a container-native workflow engine for orchestrating jobs on Kubernetes. This means that complex workflows can be created and executed entirely inside a Kubernetes cluster.
It provides a mature user interface that makes operation and monitoring easy and clear. There is native artifact support, allowing a wide range of artifact repositories to be used (MinIO, AWS S3, Artifactory, HDFS, OSS, HTTP, Git, Google Cloud Storage, raw).
Templates and cron workflows can be created, with which individual components are defined and then combined into complex workflows, so workflows are fully composable. Furthermore, workflows can be archived, and Argo provides a REST API and a CLI tool that make communication with the Argo server easy.
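To illustrate the composability described above, here is a minimal sketch of a reusable WorkflowTemplate and a Workflow that references it. All names (hello-template, say-hello, the message parameter) are hypothetical placeholders, not course material:

```yaml
# A reusable building block (hypothetical example).
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: hello-template
spec:
  templates:
    - name: say-hello
      inputs:
        parameters:
          - name: message
      container:
        image: alpine:3.18
        command: [echo]
        args: ["{{inputs.parameters.message}}"]
---
# A Workflow composing the template above via templateRef.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-
spec:
  entrypoint: main
  templates:
    - name: main
      steps:
        - - name: greet
            templateRef:
              name: hello-template
              template: say-hello
            arguments:
              parameters:
                - name: message
                  value: "Hello Argo"
```

Because the template lives in its own resource, any number of workflows can reuse it without duplicating the step definition.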
It is also worth mentioning that Argo Workflows can manage thousands of parallel pods and workflows within a Kubernetes cluster, and robust retry mechanisms ensure a high level of reliability.
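The retry behavior mentioned above is configured per template via a retryStrategy. The following is a minimal sketch with made-up values, not a definitive recommendation:

```yaml
# Hypothetical example: retry a failing step up to three times
# with exponential backoff.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: retry-demo-
spec:
  entrypoint: flaky-step
  templates:
    - name: flaky-step
      retryStrategy:
        limit: "3"              # at most three retries
        retryPolicy: OnFailure  # retry only on failure, not on error
        backoff:
          duration: "10s"       # wait 10s before the first retry
          factor: "2"           # double the wait on each attempt
      container:
        image: alpine:3.18
        command: [sh, -c]
        args: ["exit 1"]        # always fails, to demonstrate retries
```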
There is already a large, steadily growing global community, including companies such as IBM, SAP and NVIDIA. It is mainly used for machine learning, ETL, batch and data processing, and CI/CD. Also very important: it is open source and a project of the Cloud Native Computing Foundation.
Upon successful completion of the course, you will be able to create complex workflows with and without cron triggers using the different concepts and workflow functionalities, build workflow templates and use them as reusable building blocks for complex workflows, and apply the Argo Workflows features in practice.
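As a taste of the cron triggers referred to above, a scheduled workflow is expressed as a CronWorkflow resource. This is a minimal sketch with hypothetical names and schedule:

```yaml
# Hypothetical example: run a workflow every night at 02:00.
apiVersion: argoproj.io/v1alpha1
kind: CronWorkflow
metadata:
  name: nightly-job
spec:
  schedule: "0 2 * * *"        # standard cron syntax
  concurrencyPolicy: Replace   # replace a still-running previous run
  workflowSpec:
    entrypoint: main
    templates:
      - name: main
        container:
          image: alpine:3.18
          command: [echo]
          args: ["nightly run"]
```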
What can you expect in the course?
You will receive more than 50 primarily practical lessons, comprising more than 6 hours of video material. From the course materials you can download the associated workflow definitions as .yaml files, and the instructions as well as the PowerPoint slides as .pdf files.
Each chapter ends with an exercise for you to solve yourself. Of course, the solutions are also made available to you, both as videos and as .yaml files.
You will get access to the online Q&A forum, where either other course participants or I will answer your questions.
And finally, if you successfully complete the course, you will also receive a certificate that will look good on your CV.
30-day money-back guarantee
If you are not satisfied with the course, you can return it without any problems within 30 days and get your money back.
Who this course is for:
- Anyone who wants to use a Kubernetes-native orchestration tool to create simple and complex workflows.
- Everyone who wants to get to know most of the Argo Workflows features for building large workflows with a hands-on, practical approach.
There are so many cool tools out there, especially in the small/large/big data area. One life is not enough to know all of them and be proficient with them. But even with a small, well-chosen toolset, you can implement great projects with real value.
In 2012 I graduated from university as a mechatronics engineer (Dipl.-Ing.), with programming, especially in the embedded field, playing an important role. During my first years working as an engineer, I discovered more and more my passion for Python, especially for small/large/big data.
After a few hobby projects, I finally took the step in 2016 to work professionally in this field. I have now been working successfully as a data engineer for several years and have had the opportunity to contribute to great projects.
I would like to pass this knowledge on through courses in data engineering and data science with a strong focus on practice.