Apache Airflow is the leading open-source platform to programmatically author, schedule, and monitor data pipelines and workflows using Python. Workflows are defined as code (DAGs), making them version-controlled, testable, and reusable. With a rich UI, 100+ built-in operators, dynamic task generation, and native support for cloud providers, Airflow powers ETL/ELT, ML pipelines, and batch jobs at companies like Airbnb, Netflix, and Spotify.
Airflow Application Layout

Airflow-Provider-IRIS Package
Airflow-Provider-IRIS enables seamless integration between Airflow workflows and the InterSystems IRIS data platform. It provides native connection support and operators for executing IRIS SQL and automating IRIS-driven tasks within modern ETL/ELT pipelines. Designed for reliability and ease of use, this provider helps data engineers and developers build scalable, production-ready workflows.
🚀 Features
- ✔️ IrisHook – for managing IRIS connections (a usage sketch follows this list)
- ✔️ IrisSQLOperator – for running SQL queries
- ✔️ Support for both SELECT/CTE and DML statements
- ✔️ Native Airflow connection UI customization
- ✔️ Examples for real-world ETL patterns
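IrisHook can also be used directly from Python, for example inside a PythonOperator callable. The snippet below is a minimal sketch rather than the provider's documented API: the hook module path and the get_conn() accessor are assumptions based on Airflow's usual hook conventions, so check the package documentation for the exact names.

from airflow_provider_iris.hooks.iris_hook import IrisHook  # module path assumed, by analogy with the operator import

def fetch_messages():
    # "ContainerInstance" is the connection id created later in this article
    hook = IrisHook(iris_conn_id="ContainerInstance")
    conn = hook.get_conn()  # DB-API style accessor, standard for Airflow hooks (assumed)
    cursor = conn.cursor()
    cursor.execute("SELECT Message FROM Test.AirflowDemo")
    for (message,) in cursor.fetchall():
        print(message)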
📦 Installation
The airflow-provider-iris package can be installed separately in any Airflow environment using the following command:
pip install airflow-provider-iris
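After installation, you can confirm that Airflow has registered the provider:
airflow providers list | grep iris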
For detailed documentation, usage examples, and a complete list of operators/hooks, see the published provider package on PyPI.
iris-airflow-provider Application
iris-airflow-provider is an Open Exchange application that demonstrates the capabilities and usage of the airflow-provider-iris Python package through ready-to-run examples and sample DAGs.
Navigate to http://localhost:8080/ to access the application [Credentials: airflow/airflow]
Add IRIS connection
Go to Admin → Connections and click on the Add Connection button.
Fill in the fields and click the Save button at the bottom of the form to create the connection.
Use your InterSystems IRIS connection by setting the iris_conn_id parameter in any of the provided operators.
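If you prefer to script the setup instead of using the form, the same connection can be created with the standard Airflow CLI. The values below are illustrative assumptions (the iris connection type is assumed to be registered by the provider, iris as a docker-compose service name, 1972 as the default IRIS superserver port, USER as the namespace); match them to whatever you entered in the form above.

airflow connections add 'ContainerInstance' \
    --conn-type 'iris' \
    --conn-host 'iris' \
    --conn-port '1972' \
    --conn-schema 'USER' \
    --conn-login '_SYSTEM' \
    --conn-password 'SYS'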
In the Airflow DAG example below, the IrisSQLOperator uses the iris_conn_id parameter to connect to the IRIS instance:
from datetime import datetime

from airflow import DAG
from airflow_provider_iris.operators.iris_operator import IrisSQLOperator

with DAG(
    dag_id="01_IRIS_Raw_SQL_Demo_Local",
    start_date=datetime(2025, 12, 1),
    schedule=None,
    catchup=False,
    tags=["iris-contest"],
) as dag:
    create_table = IrisSQLOperator(
        task_id="create_table",
        iris_conn_id="ContainerInstance",
        sql="""
            CREATE TABLE IF NOT EXISTS Test.AirflowDemo (
                ID INTEGER IDENTITY PRIMARY KEY,
                Message VARCHAR(200),
                RunDate TIMESTAMP DEFAULT CURRENT_TIMESTAMP
            )
        """,
    )
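The full demo DAG chains two more tasks after create_table (see the task list later in this article). As a rough sketch of the same pattern, with SQL that may differ from what ships in the sample, the insert and retrieve steps could look like this inside the same with DAG block:

    insert_data = IrisSQLOperator(
        task_id="insert_data",
        iris_conn_id="ContainerInstance",
        sql="INSERT INTO Test.AirflowDemo (Message) VALUES ('Hello from Airflow')",
    )

    retrieve_data = IrisSQLOperator(
        task_id="retrieve_data",
        iris_conn_id="ContainerInstance",
        sql="SELECT ID, Message, RunDate FROM Test.AirflowDemo",
    )

    # Declare the execution order: create the table, insert a row, then read it back
    create_table >> insert_data >> retrieve_data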
A DAG (Directed Acyclic Graph) is an Airflow workflow defined in a Python script as a collection of tasks, their dependencies, and an execution schedule. Airflow automatically discovers and loads any Python file placed in the designated DAGs folder.
View/Run Sample DAGs
The application comes with three pre-loaded DAGs.
- Open the Airflow UI and click on the DAGs tab.
- Use the toggle button next to each DAG to enable or disable it.
To run a DAG manually, click the Trigger DAG button (▶ arrow) on the right side of the DAG row.
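Alternatively, a DAG can be triggered from the command line with the standard Airflow CLI (run inside the Airflow environment):
airflow dags trigger 01_IRIS_Raw_SQL_Demo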
Click the name of a DAG (e.g., 01_IRIS_Raw_SQL_Demo) to view its details, graph, and run history.
The 01_IRIS_Raw_SQL_Demo DAG consists of three tasks:
- Create Table
- Insert Data
- Retrieve Data
Select a task by clicking its box, then click on the Details tab to inspect it.
Click on the Code tab to see the task’s source code.
Click on the Log tab to view the task's log output.
If the DAG runs successfully, verify the results in the InterSystems Management Portal.
Navigate to http://localhost:32783/csp/sys/exp/%25CSP.UI.Portal.SQL.Home.zen?$NAMESPACE=USER [Credentials: _SYSTEM/SYS]
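In the SQL page, a simple query against the demo table (columns taken from the CREATE TABLE statement above) confirms the rows written by the DAG run:
SELECT ID, Message, RunDate FROM Test.AirflowDemo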
For more details, please visit the iris-airflow-provider Open Exchange application page.
Thanks