Prerequisites
- Snowflake ACCOUNTADMIN role or equivalent privileges
- A Narrative account (create one here)
- A Narrative API token with the following permissions (manage API keys):

  | Permission      | Access level   |
  | --------------- | -------------- |
  | Compute Pools   | Read and Write |
  | Data Planes     | Read and Write |
  | Data Plane Logs | Read and Write |
  | Datasets        | Read and Write |
  | Jobs            | Read and Write |
  | Models          | Read           |

- A Snowflake virtual warehouse
- Access to the Narrative Data Collaboration listing on the Snowflake Marketplace
- At least one Snowflake table or view to register as a dataset
Installation steps
Install the Snowflake Native App
Find and install the Narrative Data Collaboration app from the Snowflake Marketplace. Click Get and follow the prompts to install it in your Snowflake account.
Grant privileges
Click Grant Privileges to grant the app the following privileges:
- EXECUTE TASK
- CREATE COMPUTE POOL
- BIND SERVICE ENDPOINT
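The Grant Privileges button performs these grants for you. As a rough sketch, the equivalent worksheet statements would look like the following (the application name NARRATIVE_DATA_COLLABORATION is a placeholder — substitute the name your installation uses):

```sql
-- Illustrative equivalent of the Grant Privileges button; app name is a placeholder.
GRANT EXECUTE TASK ON ACCOUNT TO APPLICATION NARRATIVE_DATA_COLLABORATION;
GRANT CREATE COMPUTE POOL ON ACCOUNT TO APPLICATION NARRATIVE_DATA_COLLABORATION;
GRANT BIND SERVICE ENDPOINT ON ACCOUNT TO APPLICATION NARRATIVE_DATA_COLLABORATION;
```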
Register compute warehouses
In the app configuration screen, expand Warehouse Management and click Select Warehouses to Register, then select the warehouses you want to use as compute pools. Next, grant the app usage on each selected warehouse from a Snowflake worksheet.

Each granted warehouse is registered as a compute pool on your data plane, giving you control over which warehouse handles each workload. You can register additional warehouses later and manage them from the Compute Pools tab on the data plane detail page.
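The usage grant can be sketched as follows, once per selected warehouse (both names below are placeholders — use your actual warehouse and installed application names):

```sql
-- Run once for each warehouse selected above; names are placeholders.
GRANT USAGE ON WAREHOUSE MY_WAREHOUSE
  TO APPLICATION NARRATIVE_DATA_COLLABORATION;
```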
Configure external access integration
- Click Setup External Access Integration.
- Click Next.
- Paste your Narrative API token under Secret value.
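The wizard provisions the equivalent of a Snowflake secret and an external access integration so the app can reach Narrative's API. As a hypothetical sketch only — the object names and the API host below are assumptions, not the ones the app actually creates:

```sql
-- Illustrative only: the setup wizard creates equivalents of these objects.
CREATE OR REPLACE NETWORK RULE narrative_api_rule  -- name is illustrative
  MODE = EGRESS
  TYPE = HOST_PORT
  VALUE_LIST = ('api.narrative.io');  -- assumed API host; check Narrative's docs

CREATE OR REPLACE SECRET narrative_api_token  -- name is illustrative
  TYPE = GENERIC_STRING
  SECRET_STRING = '<your Narrative API token>';

CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION narrative_access  -- name is illustrative
  ALLOWED_NETWORK_RULES = (narrative_api_rule)
  ALLOWED_AUTHENTICATION_SECRETS = (narrative_api_token)
  ENABLED = TRUE;
```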
Select a source table or view
Select the Snowflake table or view you want to register as a dataset in Narrative.
Narrative is granted read-only access to the selected object. You can change the selection or add more tables later.
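The read-only access described above corresponds to standard Snowflake grants on the selected object. A hedged sketch, with placeholder database, schema, table, and application names:

```sql
-- Placeholder names; the app requests grants like these when you select an object.
GRANT USAGE ON DATABASE MY_DB TO APPLICATION NARRATIVE_DATA_COLLABORATION;
GRANT USAGE ON SCHEMA MY_DB.MY_SCHEMA TO APPLICATION NARRATIVE_DATA_COLLABORATION;
GRANT SELECT ON TABLE MY_DB.MY_SCHEMA.MY_TABLE
  TO APPLICATION NARRATIVE_DATA_COLLABORATION;
```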
Name your dataset
Set the DATASET UNIQUE NAME for your registered dataset.
Dataset names must use uppercase alphanumeric characters and underscores only (e.g., MY_DATASET_NAME).

Upload sample data (optional)
Upload a sample of your data for display in the Narrative UI. This helps collaborators understand the structure and content of your dataset. You can skip this step and upload sample data later from within the platform.
What happens next
The app now operates as a data plane operator: it continuously polls Narrative's control plane for pending jobs and executes NQL queries directly in your Snowflake environment. Your data stays within your Snowflake security perimeter throughout this process.

Your registered warehouses appear as compute pools on the data plane detail page. You can set a default compute pool, add or remove warehouses, and configure collaboration policies from the Compute Pools tab. To choose which compute pool runs a specific query, use the context selector.

If your data plane was set up before compute pools were available, see Migrate to Compute Pools to transition from the legacy single-warehouse model.
Recommended: Enable AI features
To enable optional Narrative features that use advanced AI models hosted by Snowflake Cortex (such as Ask Rosetta), grant the app access to Cortex after installation — for example, by granting it the SNOWFLAKE.CORTEX_USER database role in a Snowflake worksheet.

Related content
Snowflake Native App Reference
Capabilities, requirements, and configuration details
Data Planes
Understand how data planes and operators work
API Keys
Manage API tokens and permissions
Execution Context
Learn how queries are executed across data planes

