A complete end-to-end pipeline with PostgreSQL - Apache Airflow Tutorial
From the course: Apache Airflow Essential Training
- [Instructor] I'm now going to make one last change to the DAG that we've been working with so far, to write out the filtered data to a CSV file. You can see I have a Python function here, saving_to_csv, which takes in a task instance as an input argument. I use ti.xcom_pull to get the data from the filtering_customers task. We then open the filtered_customer_data.csv file in the output folder, and I use a CSV writer to write out this data. This is the code on lines 21 through 26: I instantiate the CSV writer, write out the header row, and then write out the individual records in the filtered data. The rest of the code for the DAG is the same as before. I have the same operators here, and at the very bottom I have the filtering_customers PostgresOperator, which filters the complete customer details table for all products with a price between the lower bound of five and the upper bound of nine. And then of course, I…
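Based on the narration, here is a minimal sketch of what this final DAG might look like. The connection ID (postgres_conn), DAG id, table and column names, and output path are assumptions rather than the instructor's exact code, and the sketch assumes a postgres provider version whose PostgresOperator pushes SELECT results to XCom by default.

import csv
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.operators.postgres import PostgresOperator


def saving_to_csv(ti):
    # Pull the rows that the filtering_customers task pushed to XCom.
    # Assumes the operator pushes the SELECT result (a list of tuples).
    filtered_data = ti.xcom_pull(task_ids="filtering_customers")

    # Write the records to a CSV file in the output folder
    # (the path and column names here are assumptions).
    with open("output/filtered_customer_data.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["customer_name", "product", "price"])  # header row
        writer.writerows(filtered_data)  # one row per filtered record


with DAG(
    dag_id="complete_postgres_pipeline",  # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule=None,  # Airflow 2.4+ spelling; older versions use schedule_interval
    catchup=False,
) as dag:

    # Filter the complete customer details table for products priced
    # between the lower bound (5) and the upper bound (9).
    filtering_customers = PostgresOperator(
        task_id="filtering_customers",
        postgres_conn_id="postgres_conn",  # hypothetical connection id
        sql="""
            SELECT customer_name, product, price
            FROM complete_customer_details
            WHERE price BETWEEN %(lower_bound)s AND %(upper_bound)s;
        """,
        parameters={"lower_bound": 5, "upper_bound": 9},
    )

    saving_to_csv_task = PythonOperator(
        task_id="saving_to_csv",
        python_callable=saving_to_csv,
    )

    filtering_customers >> saving_to_csv_task

Because the filtered rows travel between tasks through XCom, this pattern suits small result sets; larger extracts are usually written to storage directly from the SQL task instead.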
Contents
- Installing PostgreSQL on macOS (2m 28s)
- Installing PostgreSQL on WSL (2m 27s)
- Connecting to PostgreSQL (4m 19s)
- Using the PostgreSQL operator (3m 55s)
- Performing PostgreSQL insert operations (2m 44s)
- Performing PostgreSQL join operations (3m 2s)
- A complete end-to-end pipeline with PostgreSQL (3m 48s)
- Configuring PostgreSQL as a metadata database and using the LocalExecutor (6m 6s)