Argo Workflows on Kubernetes - Core Concepts

Learn how to orchestrate Kubernetes-native workflows with Argo Workflows.
Rating: 4.6 out of 5 (78 ratings)
3,233 students
English
English [Auto]

You will learn how to install a minikube cluster with Argo Workflows on your local machine.
You will get to know the core concepts of Argo Workflows and how to use them to create workflows.
You will be able to communicate with the Argo server using the kubectl CLI, and you will learn how to use the Argo Server UI.

Requirements

  • Basic knowledge of Kubernetes is desirable, but not essential.

Description

This is an introductory course to the full course Hands-On Guide to Argo Workflows on Kubernetes.

Argo Workflows is a container-native workflow engine for orchestrating jobs on Kubernetes. This means that complex workflows can be created and executed entirely within a Kubernetes cluster.
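As an illustrative sketch of what "workflows as Kubernetes resources" means, here is a minimal Workflow manifest; the image, names, and message are placeholder assumptions, not course material:

```yaml
# A minimal Argo Workflow: one template running a single container.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-world-   # Argo appends a random suffix per run
spec:
  entrypoint: hello            # the template to start with
  templates:
    - name: hello
      container:
        image: busybox
        command: [echo]
        args: ["hello from Argo Workflows"]
```

Because it is an ordinary Kubernetes resource, such a manifest can be submitted, listed, and inspected like any other object in the cluster.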

It provides a mature user interface that makes operation and monitoring easy and clear. There is native artifact support for a wide range of artifact repositories (MinIO, AWS S3, Artifactory, HDFS, OSS, HTTP, Git, Google Cloud Storage, raw).
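To sketch what native artifact support looks like, here is a hypothetical template fragment that exposes a file as an output artifact; where the artifact is stored (for example an S3 or MinIO bucket) is determined by the artifact repository configured in the cluster, and all names and paths below are assumptions:

```yaml
# Sketch: a template that saves a file from the container as an output artifact.
- name: produce
  container:
    image: busybox
    command: [sh, -c]
    args: ["echo result > /tmp/result.txt"]
  outputs:
    artifacts:
      - name: result            # logical artifact name for downstream steps
        path: /tmp/result.txt   # file collected from the container filesystem
```

Downstream templates can then consume the artifact by name, which is how data moves between steps without the containers knowing anything about the storage backend.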

Individual components can be defined as templates and combined into complex workflows, and cron workflows allow scheduled execution, so composability is built in. Furthermore, workflows can be archived, and Argo provides a REST API and a CLI tool that make communication with the Argo server easy.
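The composability described above can be sketched as follows: a reusable leaf template is invoked twice from a steps template (all names and values here are illustrative assumptions):

```yaml
# Sketch of composing templates: a multi-step workflow reusing one template.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: steps-demo-
spec:
  entrypoint: main
  templates:
    - name: main
      steps:
        - - name: step-a              # first step
            template: print
            arguments:
              parameters: [{name: message, value: "A"}]
        - - name: step-b              # runs after step-a completes
            template: print
            arguments:
              parameters: [{name: message, value: "B"}]
    - name: print                     # reusable leaf template
      inputs:
        parameters:
          - name: message
      container:
        image: busybox
        command: [echo]
        args: ["{{inputs.parameters.message}}"]
```

The same pattern scales up: templates can call other templates, so small building blocks are combined into large workflows.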

It is also worth mentioning that Argo Workflows can manage thousands of parallel pods and workflows within a Kubernetes cluster, and robust retry mechanisms ensure a high level of reliability.

There is already a large, global community that is growing steadily; users include IBM, SAP, and NVIDIA. Argo Workflows is mainly used for machine learning, ETL, batch and data processing, and CI/CD. Just as important, it is open source and a project of the Cloud Native Computing Foundation.


Upon successful completion of the course, you will be able to create workflows using the core concepts of Argo Workflows. You will be confident using the kubectl CLI and the Argo Server UI to communicate with the Argo server and manage your workflows.

Who this course is for:

  • Anyone who wants to use a Kubernetes native orchestration tool to create simple and complex workflows.
  • Everyone who wants to get to know most of the features of Argo Workflows for the creation of large workflows with a practical approach.

Course content

4 sections • 23 lectures • 1h 2m total length
  • Introduction
    05:15

Instructor

Data engineer with a passion
Jan Schwarzlose
  • 4.5 Instructor Rating
  • 339 Reviews
  • 8,116 Students
  • 7 Courses


There are so many cool tools out there - especially in the small/large/big data area. One life is not enough to know all of them and be proficient with them. But even with a small, well-chosen toolset, you can implement great projects with real value.

In 2012 I graduated as an engineer in mechatronics; programming, especially in the embedded area, was an important part of my education. During my first years as an engineer, I discovered more and more my passion for Python, especially for small/large/big data.

After a few hobby projects, I finally took the step in 2016 to work professionally in this area. I have now been working successfully as a data engineer for several years and have been involved in great projects.

I would like to pass this knowledge on through courses in data engineering and data science with a strong focus on practice.