From the course: Apache Airflow Essential Training
DAG using TaskFlow - Apache Airflow Tutorial
- [Narrator] Now let's see how we can rewrite this same simple DAG using the TaskFlow API. The TaskFlow API was introduced in version 2.0 to make it simpler for you to write your DAGs using Python code. So if most of your DAGs use plain Python code, you can eliminate a lot of the boilerplate involved with Python operators by using the TaskFlow API. The TaskFlow API is a newer way to define workflows using a more Pythonic and intuitive syntax, and it aims to simplify the process of creating complex workflows by providing a higher-level abstraction compared to the traditional operator-based approach. If you notice, on lines 11, 19, and 23 we make use of the @dag and @task decorators to identify the DAG and tasks in our workflow. On line 17, I've defined a function called dag_with_taskflow_api that I have decorated using the @dag decorator. The @dag decorator has parameters, which span lines 11 through 16. These contain the same…
Contents
- Prerequisites (39s)
- Quick Airflow setup overview (3m 27s)
- DAG using PythonOperators (3m 33s)
- DAG using TaskFlow (3m 55s)
- Passing data using XCom with operators (5m 37s)
- Passing data using the TaskFlow API (4m 41s)
- Tasks with multiple outputs (5m 40s)
- Passing multiple outputs in TaskFlow (1m 47s)