Triggering Pipelines
Calabi Pipelines runs most pipelines automatically on their configured schedule. However, there are many situations where you need to kick off a run on demand — to test a change, reprocess data for a specific date, or respond to an external event. This page covers every way to trigger a pipeline run.
Manual Trigger from the UI
The quickest way to trigger a run is through the Calabi Pipelines web interface.
- Open Calabi Pipelines from the Calabi navigation sidebar.
- On the DAGs list page, locate your pipeline using the search bar or tag filters.
- Ensure the pipeline is unpaused — the toggle on the left of the row must be blue/active.
- Click the Play button (▶) on the right side of the pipeline row.
- A dialog appears offering two options:
  - Trigger DAG — start a run immediately with no additional input.
  - Trigger DAG w/ config — start a run and pass a JSON configuration object (see below).
- Click Trigger to confirm.
The new run appears in the Grid view within a few seconds with the state queued.
If the pipeline toggle is grey (paused), you can still trigger a run: it is created and queued immediately, but it will not start executing until you unpause the pipeline.
Triggering with Parameters (conf JSON)
You can pass runtime parameters to a pipeline run using the conf dictionary. Tasks can read these values at execution time using the dag_run.conf context variable.
Passing conf from the UI
In the trigger dialog, click Trigger DAG w/ config and enter a JSON object in the configuration field:
{
  "target_date": "2024-06-15",
  "region": "us-east-1",
  "dry_run": false
}
Reading conf in a task
from airflow.decorators import task, dag
from datetime import datetime

@dag(schedule=None, start_date=datetime(2024, 1, 1), catchup=False)
def parameterised_pipeline():
    @task()
    def process(**context):
        conf = context["dag_run"].conf or {}
        target_date = conf.get("target_date", "today")
        region = conf.get("region", "us-west-2")
        dry_run = conf.get("dry_run", True)
        print(f"Processing {region} for {target_date} | dry_run={dry_run}")

    process()

dag_instance = parameterised_pipeline()
Always call .get("key", default) when reading conf values. If the pipeline is triggered without a config, dag_run.conf will be an empty dict and your task should not fail.
Defining param schemas (Params API)
For validated, documented parameters, declare them in the @dag definition:
from airflow.models.param import Param

@dag(
    schedule=None,
    start_date=datetime(2024, 1, 1),
    catchup=False,
    params={
        "target_date": Param(default="2024-01-01", type="string", description="ISO date to process"),
        "region": Param(default="us-east-1", type="string", enum=["us-east-1", "eu-west-1"]),
        "dry_run": Param(default=True, type="boolean"),
    },
)
def parameterised_pipeline():
    ...
With declared params, the trigger dialog in the UI renders a form with validation instead of a raw JSON text box.
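Conceptually, the scheduler merges any trigger-time conf over the declared defaults and rejects values that fail the schema. A rough plain-Python sketch of that resolution and validation idea (the validate_conf helper and SCHEMA dict below are illustrative, not Airflow's actual implementation):

```python
def validate_conf(conf, schema):
    """Merge trigger-time conf over declared defaults and check each value.

    Illustrative only: `schema` maps a param name to a dict with a "default",
    the expected Python "type", and optionally an "enum" of allowed values.
    """
    resolved = {}
    for name, spec in schema.items():
        value = conf.get(name, spec["default"])
        if not isinstance(value, spec["type"]):
            raise TypeError(f"{name}: expected {spec['type'].__name__}, got {type(value).__name__}")
        if "enum" in spec and value not in spec["enum"]:
            raise ValueError(f"{name}: {value!r} not one of {spec['enum']}")
        resolved[name] = value
    return resolved

# Mirrors the params declared in the @dag example above.
SCHEMA = {
    "target_date": {"type": str, "default": "2024-01-01"},
    "region": {"type": str, "default": "us-east-1", "enum": ["us-east-1", "eu-west-1"]},
    "dry_run": {"type": bool, "default": True},
}
```

With this sketch, an empty conf resolves to the declared defaults, while an out-of-enum region raises before any task runs — which is the behaviour the Params API gives you for free.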
Triggering via REST API
Calabi Pipelines exposes a stable REST API that you can call from any external system — CI/CD pipelines, webhooks, scripts, or other services.
Authentication
All API calls require a valid session token or Basic Auth credentials. Replace <TOKEN> with your API token from the Calabi Pipelines settings page.
Trigger a run (no config)
curl -X POST \
  "https://<your-calabi-host>/api/v1/dags/customer_orders_daily/dagRuns" \
  -H "Authorization: Bearer <TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{
    "logical_date": "2024-06-15T00:00:00Z",
    "note": "Manual backfill triggered by CI"
  }'
Trigger a run with conf
curl -X POST \
  "https://<your-calabi-host>/api/v1/dags/parameterised_pipeline/dagRuns" \
  -H "Authorization: Bearer <TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{
    "conf": {
      "target_date": "2024-06-15",
      "region": "eu-west-1",
      "dry_run": false
    },
    "note": "Triggered from deployment pipeline"
  }'
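The same call can be made from Python with only the standard library. A minimal sketch, assuming the endpoint shown above; the hostname and token are placeholders, and build_trigger_request is our helper name, not part of any Calabi SDK:

```python
import json
from urllib import request

def build_trigger_request(host, dag_id, token, conf=None, note=None):
    """Assemble the POST /dags/{dag_id}/dagRuns request shown above."""
    url = f"https://{host}/api/v1/dags/{dag_id}/dagRuns"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = {}
    if conf is not None:
        body["conf"] = conf
    if note:
        body["note"] = note
    return request.Request(url, data=json.dumps(body).encode(), headers=headers, method="POST")

req = build_trigger_request(
    "your-calabi-host",        # placeholder: substitute your deployment's hostname
    "parameterised_pipeline",
    "<TOKEN>",                 # placeholder: substitute a real API token
    conf={"target_date": "2024-06-15", "region": "eu-west-1", "dry_run": False},
    note="Triggered from deployment pipeline",
)
# Submitting the request returns the JSON run details described below:
# request.urlopen(req)
```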
Response
A successful trigger returns HTTP 200 with the new run details:
{
  "dag_id": "customer_orders_daily",
  "dag_run_id": "manual__2024-06-15T10:00:00+00:00",
  "logical_date": "2024-06-15T00:00:00Z",
  "state": "queued",
  "conf": { "region": "eu-west-1" }
}
Check run status via API
curl -X GET \
  "https://<your-calabi-host>/api/v1/dags/customer_orders_daily/dagRuns/manual__2024-06-15T10:00:00+00:00" \
  -H "Authorization: Bearer <TOKEN>"
| state | Meaning |
|---|---|
| queued | Run is scheduled, waiting for a worker slot |
| running | At least one task is executing |
| success | All tasks completed successfully |
| failed | One or more tasks failed |
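A client script typically polls this endpoint until the run settles into one of the terminal states above. A small sketch of that loop; fetch_state stands in for a function that performs the GET request and returns the state field, and wait_for_run is our name, not part of any Calabi SDK:

```python
import time

# Terminal run states from the table above.
TERMINAL_STATES = {"success", "failed"}

def wait_for_run(fetch_state, interval=10.0, timeout=600.0):
    """Call fetch_state() every `interval` seconds until the run reaches a
    terminal state, or raise TimeoutError once `timeout` seconds have passed."""
    deadline = time.monotonic() + timeout
    while True:
        state = fetch_state()
        if state in TERMINAL_STATES:
            return state
        if time.monotonic() >= deadline:
            raise TimeoutError(f"run still {state!r} after {timeout}s")
        time.sleep(interval)
```

Choose interval and timeout to suit your pipeline's typical duration; polling a long backfill every few seconds just wastes API calls.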
Monitoring Triggered Runs
After triggering, you can track the run's progress in the UI:
- Navigate to the pipeline by clicking its name in the DAGs list.
- The Grid view shows each run as a column. The most recent run is on the right.
- Click a column to expand run details, including start time, duration, and the triggering user.
- Click any individual cell (task instance) to open the task's log viewer.
The run duration counter at the top of the Grid updates in real time while tasks are executing.
Enable the Auto-refresh toggle (top right of the Grid view) to have the page update every few seconds without manually reloading.
Pausing and Unpausing Pipelines
Pausing a pipeline prevents the scheduler from creating new automatic runs. Any runs already queued or executing continue to completion.
From the UI
Toggle the on/off switch to the left of the pipeline name in the DAGs list.
From the REST API
# Pause a pipeline
curl -X PATCH \
  "https://<your-calabi-host>/api/v1/dags/customer_orders_daily" \
  -H "Authorization: Bearer <TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{"is_paused": true}'

# Unpause a pipeline
curl -X PATCH \
  "https://<your-calabi-host>/api/v1/dags/customer_orders_daily" \
  -H "Authorization: Bearer <TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{"is_paused": false}'
When a new pipeline version is deployed to Calabi Pipelines, the pipeline may briefly pause during the parse cycle. Confirm it is active before expecting scheduled runs to resume.
Backfilling Historic Runs
To reprocess data for a past date range, use the backfill command from the Calabi Pipelines CLI (available in the worker container):
airflow dags backfill \
  --start-date 2024-05-01 \
  --end-date 2024-05-31 \
  customer_orders_daily
This creates one run per schedule interval between the two dates, in chronological order.
Backfilling many intervals concurrently can consume significant compute. Use --max-active-runs 2 to throttle the backfill.
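If you cannot reach the worker container, an alternative is to trigger one run per interval through the REST API, passing each date as the logical_date. For a daily schedule, generating those dates is straightforward; the helper below is illustrative, and you would still want to throttle how quickly you POST the resulting runs:

```python
from datetime import date, timedelta

def daily_logical_dates(start, end):
    """Yield one ISO-8601 logical date per day, inclusive of both endpoints."""
    current = start
    while current <= end:
        yield current.isoformat() + "T00:00:00Z"
        current += timedelta(days=1)

# One entry per day of May 2024; each could be sent as "logical_date"
# in a POST to the dagRuns endpoint shown earlier.
may_dates = list(daily_logical_dates(date(2024, 5, 1), date(2024, 5, 31)))
```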
Next Steps
- Monitoring Pipelines — Understand run states, read task logs, and clear failures
- Variables & Connections — Parameterise pipelines with environment-safe config