From the course: Apache Airflow Essential Training


A complete end-to-end pipeline with PostgreSQL


- [Instructor] I'm now going to make one last change to the DAG that we've been working with so far: writing the filtered data out to a CSV file. You can see I have a Python function here for saving to CSV, which takes in a task instance as an input argument. I use ti.xcom_pull to get the data from the filtering_customers task. Then we open the filtered_customer_data.csv file in the output folder, and I use a CSV writer to write out this data. That's the code on lines 21 through 26: I instantiate the CSV writer, write out the header row, and then write out the individual records in the filtered data. The rest of the code for the DAG is the same as before. I have the same operators here, and at the very bottom I have the filtering_customers PostgresOperator, which filters the complete customer details table for all products with a price between the lower bound of five and the upper bound of nine. And then of course, I…
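A minimal sketch of the pieces the instructor describes, not the course's exact code: the task ID filtering_customers, the output filename, the table name, and the price bounds come from the narration, while the DAG ID, connection ID, column names, and output path are assumptions added here for illustration. It assumes a recent Postgres provider, where PostgresOperator pushes its query result to XCom so the downstream task can pull it.

```python
import csv
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.operators.postgres import PostgresOperator


def save_to_csv(ti):
    # Pull the rows that the filtering_customers task returned via XCom.
    filtered_data = ti.xcom_pull(task_ids='filtering_customers')

    # Write a header row, then one row per filtered record.
    with open('output/filtered_customer_data.csv', 'w', newline='') as f:  # assumed path
        writer = csv.writer(f)
        writer.writerow(['name', 'product', 'price'])  # assumed column names
        for row in filtered_data:
            writer.writerow(row)


with DAG(
    dag_id='postgres_pipeline',  # assumed DAG id
    start_date=datetime(2023, 1, 1),
    schedule=None,
    catchup=False,
) as dag:

    # Select products priced between the lower bound (5) and upper bound (9).
    filtering_customers = PostgresOperator(
        task_id='filtering_customers',
        postgres_conn_id='postgres_conn',  # assumed connection id
        sql="""
            SELECT name, product, price
            FROM complete_customer_details
            WHERE price BETWEEN %(lower_bound)s AND %(upper_bound)s;
        """,
        parameters={'lower_bound': 5, 'upper_bound': 9},
    )

    saving_to_csv = PythonOperator(
        task_id='saving_to_csv',
        python_callable=save_to_csv,
    )

    filtering_customers >> saving_to_csv
```

Because the callable declares a ti parameter, Airflow injects the running task instance automatically, which is what makes the xcom_pull call inside the function work.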
