Using Airflow to Manage Talend ETL Jobs


Learn how to schedule and execute Talend jobs with Airflow, an open-source platform that programmatically orchestrates workflows as directed acyclic graphs of tasks.


Airflow is an open-source platform used to programmatically orchestrate workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler schedules workflows and data processing pipelines, while the Airflow user interface makes it easy to visualize pipelines running in a production environment, monitor their progress, and troubleshoot issues when needed. Rich command-line utilities make it easy to perform complex surgeries on DAGs.
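
For example, a few of the Airflow 1.x command-line utilities (my_dag, my_task, and the dates below are placeholder values):

# Run a single task for a given execution date, without recording its state
airflow test my_dag my_task 2017-01-01

# Re-run a DAG over a historical date range
airflow backfill my_dag -s 2017-01-01 -e 2017-01-07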

In this blog, let's discuss scheduling and executing Talend jobs with Airflow.

Prerequisites

  • Airflow 1.7 or above
  • Python 2.7
  • Talend Open Studio (Big Data or Data Integration)

Use Case

Schedule and execute Talend ETL jobs with Airflow. This involves the following:

  • Author Talend jobs
  • Schedule Talend jobs
  • Monitor workflows in the web UI

Job Description

Talend ETL jobs are created by:

  • Joining application_id from applicant_loan_info and loan_info as shown in the below diagram:


  • Loading matched data into the loan_application_analysis table.
  • Applying a filter on the LoanDecisionType field in the loan_application_analysis table to segregate the values as Approved, Denied, and Withdrawn.
  • Applying another filter on the above-segregated values to segregate the LoanType values as Personal, Auto, Credit, and Home.

The created Talend job is built and moved to the server. A DAG file named Loan_Application_Analysis.py is created with the corresponding paths of the scripts so that the flow can be executed as and when required.
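
Below is a minimal sketch of what such a DAG file might look like, assuming the built Talend jobs are exposed as shell launcher scripts; all paths, task IDs, and the schedule are placeholders for illustration:

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'airflow',
    'start_date': datetime(2017, 1, 1),
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG('Loan_Application_Analysis',
          default_args=default_args,
          schedule_interval='@daily')

# Each task wraps the launcher script that Talend generates when a job is built.
# The trailing space in bash_command stops Airflow from treating the .sh path
# as a Jinja template file.
load_analysis = BashOperator(
    task_id='load_loan_application_analysis',
    bash_command='sh /home/ubuntu/talend/loan_application_analysis/run.sh ',
    dag=dag)

filter_decision = BashOperator(
    task_id='filter_loan_decision_type',
    bash_command='sh /home/ubuntu/talend/filter_loan_decision/run.sh ',
    dag=dag)

filter_loan_type = BashOperator(
    task_id='filter_loan_type',
    bash_command='sh /home/ubuntu/talend/filter_loan_type/run.sh ',
    dag=dag)

# The filters run only after the matched data has been loaded.
filter_decision.set_upstream(load_analysis)
filter_loan_type.set_upstream(filter_decision)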

Creating DAG Folder and Restarting Airflow Webserver

After installing Airflow, perform the following:

  • Create a DAG folder (/home/ubuntu/airflow/dags) in the Airflow path.
  • Move all the .py files into the DAG folder.
  • Restart the Airflow webserver and scheduler using the below commands so that the DAG appears in the web UI list:

# Log into the AIRFLOW_HOME path, e.g. /home/ubuntu/airflow
cd /home/ubuntu/airflow

# Restart the webserver
airflow webserver

# Restart the scheduler
airflow scheduler

After restarting the webserver, all the .py files (DAGs) in the folder are picked up and loaded into the web UI DAG list.
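
To confirm that the new DAG has been picked up, you can also list all DAGs from the command line:

airflow list_dags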

Scheduling Jobs

The created Talend jobs can be scheduled using the Airflow scheduler. For code, see the Reference section.
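
Scheduling is controlled by the schedule_interval argument of the DAG, which accepts cron expressions as well as presets such as '@daily'; the 6 a.m. schedule below is just an example:

# Run the DAG every day at 6:00 AM
dag = DAG('Loan_Application_Analysis',
          default_args=default_args,
          schedule_interval='0 6 * * *')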

Note: The job can be manually triggered by clicking the Run button under the Links column, as shown below.
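
The same manual trigger can also be issued from the command line:

airflow trigger_dag Loan_Application_Analysis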

Both the auto-scheduled and manually triggered jobs can be viewed in the UI as follows.

Monitoring Jobs

On executing the jobs, the upstream and downstream tasks are started as defined in the DAG. Upon clicking a particular DAG, its corresponding job statuses, such as success, failure, retry, queued, and so on, can be visualized in different ways in the UI.

Graph View

The statuses of the jobs are represented in a graphical format, as shown below.

Tree View

The statuses of the jobs, along with their execution dates, are represented in a tree format, as shown below:


Gantt View

The statuses of the jobs, along with their execution dates, are represented in a Gantt chart format, as shown below.

Viewing Task Duration

Upon clicking the Task Duration tab, you can view the task durations of the whole process or of individual DAGs in a graphical format, as shown below.

Viewing Task Instances

By clicking Browse > Task Instances, you can view the instances on which the tasks are running, as shown below.

Viewing Jobs

By clicking Browse > Jobs, you can view details such as the start time, end time, and executors of the jobs, as shown in the below diagram.

Viewing Logs

By clicking Browse > View Log, you can view the details of the logs, as shown in the below diagram.

Data Profiling

Airflow provides a simple SQL query interface to query the data and a chart UI to visualize the tasks.

To profile your data, click Admin > Connections to select the database connection type, as shown in the below diagram.

Ad Hoc Query

To write and run queries against the data, click Data Profiling > Ad Hoc Query.
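
For instance, a query like the following, using the loan_application_analysis table from this job, can be run directly in the Ad Hoc Query page:

SELECT LoanDecisionType, COUNT(*) AS applications
FROM loan_application_analysis
GROUP BY LoanDecisionType;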


Charts

Different types of visualizations can be created for task duration and task status using charts.

To generate charts such as bar, line, area, and so on for a particular DAG using a SQL query, click Data Profiling > Charts > DAG_id, as shown in the below diagram.

All the DAGs are graphically represented, as shown in the below diagram.

Email Notification

Email notifications can be enabled through flags such as email_on_failure and email_on_retry to keep track of job status.
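
These flags live in the DAG's default_args; a minimal sketch (the address is a placeholder):

default_args = {
    'owner': 'airflow',
    'email': ['alerts@example.com'],
    'email_on_failure': True,  # mail when a task fails
    'email_on_retry': True,    # mail when a task is retried
}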

To enable the notification, perform the following:

  • Configure settings in the airflow.cfg file in the airflow_home path, as shown below:
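
A minimal sketch of the [smtp] block in airflow.cfg, assuming a Gmail account; all values are placeholders to adjust for your mail provider:

[smtp]
smtp_host = smtp.gmail.com
smtp_starttls = True
smtp_ssl = False
smtp_user = your_account@gmail.com
smtp_password = your_app_password
smtp_port = 587
smtp_mail_from = your_account@gmail.com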


  • In your Gmail settings, turn Allow less secure apps ON to receive email alerts from Airflow.

Note: You may get an authentication error if the email settings are not properly configured. To overcome this issue, confirm the login attempt in Gmail's device review by selecting Yes, That Was Me.

A job failure email is shown below.

Upon clicking the link in the email, you will be redirected to the Logs page.


Conclusion

In this blog, we discussed authoring, scheduling, and monitoring workflows from the Airflow web UI, as well as triggering Talend jobs on demand using the BashOperator. You can also transfer data from one database to another using the GenericTransfer operator.
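
A minimal sketch of such a transfer, assuming connections named source_db and dest_db have been defined under Admin > Connections; the connection IDs and table names are placeholders:

from airflow.operators.generic_transfer import GenericTransfer

# 'dag' is the DAG object defined earlier in the file.
transfer = GenericTransfer(
    task_id='copy_loan_data',
    source_conn_id='source_db',
    destination_conn_id='dest_db',
    sql='SELECT * FROM loan_application_analysis',
    destination_table='loan_application_analysis_copy',
    dag=dag)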

