Apache Airflow has become a trusted choice for scheduling and orchestrating workflows. It allows you to automate repetitive processes, manage dependencies, and monitor tasks seamlessly. Many teams use it for moving data, generating reports, or triggering downstream processes. One common use case is dynamically fetching data from a source—whether it’s an API, a database, or a file system—and sending the results by email. This approach saves hours of manual effort while keeping stakeholders informed in real time. Let’s explore how you can implement this efficiently using Apache Airflow.
Setting Up the Airflow DAG
Before you begin fetching data and emailing it, ensure you have a working Airflow environment and a DAG file. A Directed Acyclic Graph (DAG) defines the sequence of tasks you want Airflow to execute. Place your DAG Python script in the dags/ folder of your Airflow installation. In this case, you'll define a DAG with two main tasks: one to dynamically fetch the data and another to email it.
The DAG's configuration includes the schedule, default arguments such as retries and start date, and a clear definition of task dependencies. Start by importing the necessary modules. For dynamic data fetching, you'll typically rely on Python functions, possibly using the requests library for APIs or psycopg2 for a PostgreSQL database. Airflow's PythonOperator is a natural choice here because it allows you to define any Python function as a task.
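Here is a minimal sketch of what such a DAG might look like. The dag_id, schedule, recipient address, and the fetch_report_data placeholder are illustrative, not prescriptive:

```python
# Minimal DAG sketch with two tasks: fetch data, then email it.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.operators.email import EmailOperator


def fetch_report_data(**context):
    # Placeholder: replace with your own API or database query logic.
    ...


default_args = {
    "owner": "data-team",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_report_email",
    default_args=default_args,
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    fetch_data = PythonOperator(
        task_id="fetch_data",
        python_callable=fetch_report_data,
    )

    send_report = EmailOperator(
        task_id="send_report",
        to="stakeholders@example.com",
        subject="Daily report for {{ ds }}",
        html_content="Please find the latest report attached.",
    )

    # The email task only runs after the fetch succeeds.
    fetch_data >> send_report
```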
Dynamically Fetching Data
Dynamic data fetching involves pulling fresh data when the DAG runs, rather than working with static or pre-saved files. The source could be an API endpoint, a production database query, or even a web page scrape. Write a Python function that connects to your data source, queries the current data, and saves it to a temporary location or variable. For APIs, handle authentication, timeouts, and errors carefully, ensuring the response is validated.
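As a rough sketch, assuming a JSON API that returns a list of records and a bearer token for authentication, the fetch function might look like this:

```python
# Sketch of an API fetch function; the URL, token, and expected
# JSON shape are assumptions for illustration.
import requests


def fetch_from_api(api_url: str, token: str) -> list[dict]:
    response = requests.get(
        api_url,
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,  # avoid hanging the task on a slow endpoint
    )
    response.raise_for_status()  # fail loudly on 4xx/5xx responses

    records = response.json()
    if not isinstance(records, list):
        raise ValueError(f"Unexpected response shape: {type(records)}")
    return records
```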
In your Airflow DAG, define a PythonOperator that calls this function. You can pass parameters to the function or let it determine the current date and other context from Airflow's built-in variables. For databases, use connection hooks like PostgresHook or MySqlHook to avoid hard-coding credentials. These hooks let you retrieve secure connection details stored in Airflow's admin interface. After fetching, consider saving the results as a CSV file in a shared folder, pushing it to XCom (Airflow's inter-task communication mechanism), or returning it directly if it's small.
Logging what you fetch—such as the number of records, the timestamp, or any relevant metadata—is a good practice for tracking in Airflow’s UI. The dynamic nature comes from running this at regular intervals with fresh parameters each time, ensuring you’re never relying on stale inputs.
Sending the Data via Email
Once the data is fetched and stored, the next step is to email it to the intended recipients. Airflow offers a built-in EmailOperator that makes this process simple. You can attach files, set the subject and body, and list one or more recipients. If your fetched data was saved as a file (like a CSV), you can attach it directly. If it's small enough and stored in XCom, you can retrieve it in the email task and include it in the email body itself.
Configure your SMTP settings in the airflow.cfg file or through environment variables so that Airflow knows how to connect to your email server. Test this outside of Airflow first to ensure the credentials and server settings are correct. In your DAG, define the EmailOperator with a clear subject line, often including a date for easy identification, and a concise body message.
For more advanced email needs, create a custom Python function to send emails and use a PythonOperator for that. This is useful for HTML formatting, inline tables, or conditional logic about what to send. Keep emails concise and avoid sending overly large attachments, which might get blocked or delay delivery. If your data is very large, consider uploading it to a shared storage service and emailing the link instead.
Monitoring and Handling Errors
Despite their simplicity, fetching and emailing can encounter errors. The API may time out, the database might reject a connection, or the SMTP server could refuse the message. Airflow helps you monitor these failures through its web interface, logs, and email alerts. Use retry policies in your DAG to handle transient failures, and implement thorough error checking in your Python functions. Logging meaningful, descriptive error messages will save time when troubleshooting unexpected issues.
It’s also helpful to validate the data after fetching—such as checking that the number of rows isn’t unexpectedly low—and fail the task if validation fails. This prevents sending out incomplete, incorrect, or misleading data. Airflow’s ability to automatically retry tasks and send clear failure notifications means you can rely on it to recover from minor hiccups without constant manual intervention.
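For instance, a small validation step could fail the task loudly when the row count looks suspiciously low, which triggers Airflow's retries and failure notifications; the threshold here is purely illustrative:

```python
# Validation sketch: fail the task if the fetched row count is too low.
from airflow.exceptions import AirflowException

MIN_EXPECTED_ROWS = 100  # illustrative threshold


def validate_fetch(**context):
    row_count = context["ti"].xcom_pull(task_ids="fetch_data", key="row_count")
    if row_count is None or row_count < MIN_EXPECTED_ROWS:
        raise AirflowException(
            f"Validation failed: expected at least {MIN_EXPECTED_ROWS} rows, got {row_count}"
        )
```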
Conclusion
Automating dynamic data fetching and emailing with Apache Airflow streamlines routine reporting and data delivery. By building a DAG with clear tasks, you can query fresh data from APIs or databases, save it securely, and distribute it to stakeholders on schedule without unnecessary delays. Using the built-in operators for Python functions and email makes the implementation simple yet flexible, and Airflow's monitoring features help keep everything running smoothly over time. As your needs grow, this approach can be extended to handle more data sources, advanced formatting, or smarter error handling, making it a reliable and scalable foundation for your workflows.
For more information on Apache Airflow, consider visiting Apache Airflow’s official documentation.