For a deeper understanding of how workflows relate to the jobs system, see Workflow Orchestration. For complete API details, see the Workflows API Reference.
Prerequisites
- An API token with workflow permissions (read and write access to workflows)
- A data plane configured for your account
- Familiarity with NQL syntax and materialized views
What you’ll learn
- How to define a workflow specification in YAML
- How to create and trigger workflows via the API
- How to chain multiple tasks with dependencies
- How to pass data between tasks using export and variable expressions
- How to schedule workflows for automatic execution
- How to monitor workflow runs and troubleshoot failures
Defining a workflow
Workflows use the Serverless Workflow DSL in YAML format. A workflow is a list of tasks that execute sequentially in the order you define them.
Minimal example
Here’s the simplest possible workflow: a single task that creates a materialized view. The specification has two top-level sections:
| Section | Purpose |
|---|---|
| document | Metadata: DSL version, namespace, name, and version |
| do | Ordered list of tasks to execute |
Each task entry uses `call` to specify which task to run, and `with` to provide the task’s parameters.
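A minimal specification along these lines might look as follows. The `document` and `do` sections match the table above; the parameter names under `with` and the NQL query are illustrative assumptions, so check the Workflows API Reference for the exact schema.

```yaml
document:
  dsl: '1.0.0'            # DSL version; confirm the version your platform expects
  namespace: company_data
  name: create-active-users
  version: '1.0.0'
do:
  - createActiveUsers:
      call: CreateMaterializedViewIfNotExists
      with:
        # Parameter names here are illustrative, not the documented schema
        name: company_data.active_users
        query: SELECT user_id, last_seen FROM events WHERE active = true
```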
Supported tasks
Workflows support three operations:
| Task | Description |
|---|---|
| CreateMaterializedViewIfNotExists | Creates a new materialized view from an NQL query |
| RefreshMaterializedView | Refreshes an existing materialized view with the latest data |
| ExecuteDml | Executes a DML statement (INSERT, UPDATE, DELETE) |
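The other two tasks follow the same `call`/`with` shape. A sketch, with illustrative parameter names and an illustrative DML statement:

```yaml
do:
  - refreshView:
      call: RefreshMaterializedView
      with:
        name: company_data.active_users   # view to refresh (parameter name assumed)
  - pruneOldRows:
      call: ExecuteDml
      with:
        # Statement and parameter name are illustrative
        statement: DELETE FROM company_data.audit_log WHERE created_at < now() - interval '90 days'
```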
Direct `ExecuteNQL` is not supported. To execute a SELECT query and persist its results, use `CreateMaterializedViewIfNotExists`.
Creating and running a workflow
Step 1: Create the workflow
Send your YAML specification to the API. The response includes a `workflow_id` that you’ll use for all subsequent operations.
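A sketch of the request; the host, path, and content type below are placeholders rather than the documented route, so take the real values from the Workflows API Reference.

```shell
curl -X POST "https://api.example.com/v1/workflows" \
  -H "Authorization: Bearer $API_TOKEN" \
  -H "Content-Type: application/yaml" \
  --data-binary @workflow.yaml
# The response body includes the workflow_id used in later steps.
```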
Step 2: Trigger a run
Execute the workflow on a specific data plane. The response includes a `run_id`. The workflow’s tasks execute sequentially: each task waits for the previous one to complete before starting.
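A sketch of the trigger request; the endpoint path and payload field names are placeholders, not the documented schema.

```shell
curl -X POST "https://api.example.com/v1/workflows/$WORKFLOW_ID/runs" \
  -H "Authorization: Bearer $API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"data_plane": "my-data-plane"}'
# The response body includes the run_id for this execution.
```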
Step 3: Check the status
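A status poll might look like this; the endpoint path is a placeholder rather than the documented route.

```shell
curl "https://api.example.com/v1/workflows/$WORKFLOW_ID/runs/$RUN_ID" \
  -H "Authorization: Bearer $API_TOKEN"
```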
Monitor the run to see how it’s progressing.
Chaining dependent tasks
The real power of workflows is chaining operations that depend on each other. Tasks run in order, so later tasks can reference datasets created by earlier ones.
Example: Create a source dataset, then derive from it
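A sketch of such a two-task chain. The `with` parameter names and queries are illustrative assumptions; only the dataset name `company_data.active_users_source` comes from this guide.

```yaml
do:
  - createSource:
      call: CreateMaterializedViewIfNotExists
      with:
        name: company_data.active_users_source
        query: SELECT user_id, last_seen FROM events WHERE active = true
  - deriveSummary:
      call: CreateMaterializedViewIfNotExists
      with:
        name: company_data.active_users_summary
        # References the dataset created by the previous task
        query: SELECT count(*) AS total FROM company_data.active_users_source
```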
The second task’s query references `company_data.active_users_source`, the dataset created by the first task. The workflow ensures the first task completes before the second one starts.
Example: Refresh multiple views in sequence
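A sketch of this pattern, assuming a `name` parameter for RefreshMaterializedView:

```yaml
do:
  - refreshRawEvents:
      call: RefreshMaterializedView
      with:
        name: raw_events          # refreshed first
  - refreshAggregates:
      call: RefreshMaterializedView
      with:
        name: event_aggregates    # runs only after raw_events completes
```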
This ordering ensures `raw_events` is refreshed before `event_aggregates`, which may depend on it.
Passing data between tasks
Beyond referencing datasets by name, workflows can pass structured data between tasks using export and variable expressions. This is useful when a downstream task needs metadata from an upstream task, such as a dataset ID, row count, or status value.
For the complete syntax reference, see Task output, Export, and Variable expressions in the specification reference.
Exporting task output to workflow context
Every task produces a JSON output object after execution. Use `export.as` with a jq expression to capture values into the workflow context (`$context`):
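A sketch of an export, assuming the task’s output object contains a `dataset_id` field and that `with` parameter names match your schema:

```yaml
do:
  - createView:
      call: CreateMaterializedViewIfNotExists
      with:
        name: company_data.active_users
        query: SELECT user_id FROM events
      export:
        # jq expression: merge dataset_id from this task's output into the context
        as: '$context + { datasetId: .dataset_id }'
```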
After this task runs, `$context.datasetId` contains the ID of the created dataset.
Using variable expressions in parameters
Use `${…}` in parameter values to inject data from the previous task’s output (`.`) or the workflow context (`$context`). Use jq string interpolation `\(expr)` to embed values inside strings:
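For example (the parameter names and statement are illustrative, and the task assumes the previous task’s output contains `dataset_id`):

```yaml
  - logCreatedDataset:
      call: ExecuteDml
      with:
        # "." is the previous task's output; \(...) interpolates it into the string
        statement: ${ "INSERT INTO company_data.audit_log (dataset_id) VALUES ('\(.dataset_id)')" }
        # A standalone expression preserves the JSON type of the referenced value
        rowLimit: ${ $context.maxRows }
```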
A standalone `${…}` expression (used as the entire parameter value) preserves the JSON type of the result. When used inside a string or with `${"..."}`, the result is always a string.
Complete example
This workflow creates a materialized view, captures its dataset ID in the workflow context, then logs the operation using both direct output access and context:
- `createView` creates the materialized view. Its output includes `dataset_id`. The `export.as` expression merges this into `$context` as `datasetId`.
- `logCreatedDataset` uses `.dataset_id`, the direct output of the previous task, in a variable expression to insert a log record.
- `logFromContext` uses `$context.datasetId` to access the same value via the accumulated workflow context. This is useful when the value was exported several tasks ago and is no longer in the immediately preceding output.
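Putting these pieces together, a sketch of the full specification; the `with` parameter names, the queries, and the audit table are illustrative assumptions.

```yaml
document:
  dsl: '1.0.0'
  namespace: company_data
  name: create-and-log
  version: '1.0.0'
do:
  - createView:
      call: CreateMaterializedViewIfNotExists
      with:
        name: company_data.active_users
        query: SELECT user_id FROM events
      export:
        # Merge the new dataset's ID into the workflow context
        as: '$context + { datasetId: .dataset_id }'
  - logCreatedDataset:
      call: ExecuteDml
      with:
        # Uses the previous task's output directly via "."
        statement: ${ "INSERT INTO company_data.audit_log (dataset_id, source) VALUES ('\(.dataset_id)', 'output')" }
  - logFromContext:
      call: ExecuteDml
      with:
        # Uses the value exported to the workflow context earlier
        statement: ${ "INSERT INTO company_data.audit_log (dataset_id, source) VALUES ('\($context.datasetId)', 'context')" }
```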
Scheduling workflows
Instead of triggering runs manually, you can set workflows to run on a schedule using cron expressions. Add a `schedule` block to your workflow specification:
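For example, to run daily at midnight. The placement and field names of the `schedule` block here are assumptions; confirm the exact shape against the specification reference.

```yaml
document:
  dsl: '1.0.0'
  namespace: company_data
  name: daily-audience-refresh
  version: '1.0.0'
schedule:
  cron: '0 0 * * *'   # daily at midnight
do:
  - refreshAudience:
      call: RefreshMaterializedView
      with:
        name: company_data.audience   # illustrative parameter
```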
Common schedules
| Expression | Schedule |
|---|---|
| 0 * * * * | Every hour |
| 0 0 * * * | Daily at midnight |
| 0 0 * * 0 | Weekly on Sunday |
| 0 0 1 * * | Monthly on the 1st |
Handling errors
Workflows use fail-fast behavior: if any task fails, execution stops immediately and remaining tasks are skipped. The workflow run status becomes `failed`.
To investigate a failure:
- Check the workflow run status via the runs endpoint
- Identify which task failed from the run details
- Query the individual job using its `job_id` for detailed error information
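For the last step, a job lookup might look like this; the endpoint path is a placeholder, not the documented route.

```shell
curl "https://api.example.com/v1/jobs/$JOB_ID" \
  -H "Authorization: Bearer $API_TOKEN"
```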
Common errors
| Error | Cause | Solution |
|---|---|---|
| Unsupported task | Using a task not in the supported list | Use only CreateMaterializedViewIfNotExists, RefreshMaterializedView, or ExecuteDml |
| Missing required field | Incomplete YAML specification | Ensure document, dsl, namespace, name, version, and do are all present |
| Invalid workflow specification | Malformed YAML | Validate your YAML syntax before submitting |
Best practices
- Name workflows clearly. Use descriptive names that explain what the workflow does (e.g., `daily-audience-refresh` rather than `workflow-1`).
- Version your workflows. Increment the version when you change the specification to maintain a clear history.
- Start with a single task. Build and test one task at a time before assembling multi-step pipelines.
- Test before scheduling. Run workflows manually to confirm they work before setting up automatic execution.
Limitations
- Sequential only — tasks execute one at a time in order; parallel execution is not supported
- No conditional logic — all tasks run unconditionally; there is no if/else branching
- No loops — iterative operations are not supported
- No automatic retries — failed tasks are not retried automatically
Related content
Workflows API Reference
Complete API documentation for workflow endpoints
Creating Materialized Views
How to create and manage materialized views used in workflow tasks
API Keys
Create and manage the API tokens required for workflow access
NQL Syntax Reference
Full NQL syntax reference for writing workflow queries

