Apache Airflow using Google Cloud Composer: Introduction

Learn Apache Airflow with Google Cloud Composer, without any local installation, so the focus stays on Airflow topics.
4.1 (41 ratings)
195 students enrolled
Last updated 10/2019
English
English [Auto]
30-Day Money-Back Guarantee
This course includes
  • 4 hours on-demand video
  • 18 downloadable resources
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What you'll learn
  • Understand how task workflows are automated with Airflow
  • Airflow architecture - on-premises (local install), cloud, single node, multi node
  • How to use Airflow's connection functionality to connect to different systems and automate data pipelines
  • What Google Cloud BigQuery is and, briefly, how it can be used in data warehousing as well as in an Airflow DAG
  • Master core functionality such as DAGs, Operators, and Tasks through hands-on demonstrations (see the minimal DAG sketch after this list)
  • Understand advanced functionality such as XCom, branching, and SubDAGs through hands-on demonstrations
  • Get an overview of SLAs and the Kubernetes executor in Apache Airflow
  • The source files of the Python DAG programs used in the demonstrations (9 .py files) are available for download so students can practice
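
To give a flavor of the core concepts listed above (DAGs, Operators, Tasks), here is a minimal sketch of an Airflow DAG in the style used by the course activities. The DAG id, task ids, and schedule are illustrative placeholders, not taken from the course files; the import path follows the Airflow 1.x convention that Cloud Composer used around the time of this course.

# A minimal illustrative DAG: two BashOperator tasks with one dependency.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # Airflow 1.x import path

default_args = {
    "owner": "airflow",
    "start_date": datetime(2019, 10, 1),
}

# The DAG groups tasks; each Operator instance below becomes one task.
with DAG(dag_id="hello_composer",           # placeholder name
         default_args=default_args,
         schedule_interval="@daily",
         catchup=False) as dag:

    say_hello = BashOperator(
        task_id="say_hello",
        bash_command="echo 'Hello from Airflow'",
    )

    all_done = BashOperator(
        task_id="all_done",
        bash_command="echo 'Workflow finished'",
    )

    # ">>" declares the dependency: say_hello must succeed before all_done runs.
    say_hello >> all_done

Copied into a Composer environment's DAGs folder, a file like this appears in the Airflow web UI, where it can be triggered and monitored.
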
Course content
38 lectures 03:57:27
+ Introduction
4 lectures 21:34
Use cases for Apache Airflow
04:25
What is Apache Airflow?
07:30
Environment Options - On-Premise and Google Cloud offering
06:23
+ Apache Airflow architecture
2 lectures 13:58
Apache Airflow architecture
09:28
Apache Airflow - Single Node vs Multinode
04:30
+ Google Cloud Platform: Cloud Composer used as Apache Airflow
2 lectures 15:46
Creation of a Google Cloud Composer - Airflow environment
06:36
Navigation - Cloud Composer (Apache Airflow) web UI
09:10
Quiz time 1
6 questions
+ Understanding Apache Airflow program structure
1 lecture 04:01
Understanding Apache Airflow program structure
04:01
+ Activity 1: Create and submit an Apache Airflow DAG program
1 lecture 12:40
Activity 1: Create and submit an Apache Airflow DAG program
12:40
Quiz time 2
4 questions
+ Activity 2: Using templating functionality in an Apache Airflow program
2 lectures 10:45
Activity 2: Using templating functionality in an Apache Airflow program - Part 1
06:11
Activity 2: Using templating functionality in an Apache Airflow program - Part 2
04:34
+ Activity 4: Calling a Bash script in a different folder / on a different machine
2 lectures 11:23
Activity 4: Calling a Bash script in a different folder / on a different machine - Part 1
05:10
Activity 4: Calling a Bash script in a different folder / on a different machine - Part 2
06:13
Requirements
  • Google Cloud Platform account (even a free trial account) - no install required
  • A good understanding of Python and some exposure to Bash shell scripting will help.
Description

Apache Airflow is an open-source platform to programmatically author, schedule, and monitor workflows.

Cloud Composer  is a fully managed workflow orchestration service that empowers you to author, schedule, and monitor pipelines that span across clouds and on-premises data centers. Built on the popular Apache Airflow open source project and operated using the Python programming language, Cloud Composer is free from lock-in and easy to use.

Because Apache Airflow is hosted in the cloud (Google Cloud Composer), learners can focus on Apache Airflow's functionality and learn quickly, without the hassle of installing Apache Airflow locally on their machine.

Cloud Composer pipelines are configured as directed acyclic graphs (DAGs) using Python, making it easy for users of any experience level to author and schedule a workflow. One-click deployment yields instant access to a rich library of connectors and multiple graphical representations of your workflow in action, increasing pipeline reliability by making troubleshooting easy.
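
As an example of how such pipelines are authored in Python, the sketch below shows the kind of Jinja templating covered in the course's templating activity: the BashOperator command references {{ ds }}, Airflow's built-in template variable for the run date. The DAG and task names are illustrative, not taken from the course material.

# Illustrative templated task: Airflow renders {{ ds }} to the run's execution date (YYYY-MM-DD).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

with DAG(dag_id="templating_example",        # placeholder name
         start_date=datetime(2019, 10, 1),
         schedule_interval="@daily",
         catchup=False) as dag:

    print_run_date = BashOperator(
        task_id="print_run_date",
        bash_command="echo 'Run date: {{ ds }}'",
    )
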

This course is designed with beginners in mind, that is, first-time users of Cloud Composer / Apache Airflow. Each topic begins with a presentation that discusses the concepts and then follows with a hands-on demonstration to reinforce understanding.

The source files of the Python DAG programs used in the demonstrations (9 Python files) are available for download for further practice by students.

Happy learning!!!

Who this course is for:
  • People interested in data warehousing, big data, and data engineering
  • People interested in automated tools for task workflow scheduling
  • Students interested in learning about Airflow
  • Professionals who wish to explore how Apache Airflow can be used for task scheduling and building data pipelines