Google BigQuery
Direction: Bidirectional
Environment: Web, iOS, Android, CTV, API Direct
Capability: Connectivity, Routing
SDK Required: No
Product(s) Required: Core Platform, Routing
Google BigQuery enables publishers to analyze large datasets for audience insights, campaign performance, and ad optimization.
Overview
The BigQuery integration enables publishers to leverage Permutive's bi-directional data capabilities with their BigQuery data warehouse. The integration operates in two modes:
- Routing (Destination): Stream first-party event data from Permutive to BigQuery. Read more in the Routing documentation. The Routing capability requires the Routing package in addition to Core Platform; contact your Customer Success Manager to enable Routing.
- Connectivity: See the Connectivity Setup section under Setup below.
Key features:
- Automatic schema generation and updates for new event types and properties
- Day-partitioned tables for efficient querying
- Support for all Permutive event data including user events, identities, and segment metadata
- Self-service setup through the Permutive Dashboard
Environment Compatibility
| Environment | Supported | Notes |
|---|---|---|
| Web | Yes | — |
| iOS | Yes | — |
| Android | Yes | — |
| CTV | Yes | — |
| API Direct | Yes | — |
Prerequisites
For Routing (exporting data to BigQuery):
- Google Cloud Platform project: An active GCP project where BigQuery will store your data
- BigQuery API enabled: The BigQuery API must be enabled on your GCP project (enabled by default for new projects)
- IAM permissions: Ability to grant IAM roles to service accounts at the project level in Google Cloud Console
- Project-level access: Permissions to add service accounts with the BigQuery User role to your GCP project
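If you want to confirm the first two prerequisites before starting setup, a quick check from the google-cloud-bigquery Python client works; the project ID below is a placeholder for your own.

```python
# Sanity check: confirm the BigQuery API is enabled and your credentials can
# reach the project. Requires `pip install google-cloud-bigquery` and
# application-default credentials (e.g. `gcloud auth application-default login`).
from google.cloud import bigquery

PROJECT_ID = "your-gcp-project-id"  # placeholder: replace with your GCP project ID

client = bigquery.Client(project=PROJECT_ID)

# Listing datasets fails with a permission error if the BigQuery API is
# disabled or your credentials lack access to the project.
datasets = list(client.list_datasets())
print(f"BigQuery API reachable; {len(datasets)} dataset(s) visible in {PROJECT_ID}")
```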
Setup
- Routing Streaming Setup
- Connectivity Setup
Overview
BigQuery Routing enables self-service setup through the Permutive Dashboard. The setup involves configuring a BigQuery destination in the Dashboard and granting a Permutive service account the BigQuery User role at the project level in your GCP IAM settings. Permutive automatically creates the dataset and tables; no manual dataset creation is required.
Prerequisites
- An active Google Cloud Platform project with the BigQuery API enabled
- Your GCP Project ID, a unique dataset name, and your preferred data location (US or EU)
- Permissions to grant IAM roles to service accounts at the project level in Google Cloud Console
What Happens After Setup
Once routing is active:
- Tables are created automatically for each event type (e.g., Pageview_events, VideoView_events)
- Events stream in near real-time with approximately 5-minute latency
- Schemas update automatically when you add new event types or properties
- Daily partitions organize data by event date for efficient querying
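As a quick way to confirm events are arriving, you can run an aggregate query against one of the generated tables. This is a sketch only: the project ID and dataset name are placeholders for the values you chose at setup, and the timestamp column name (time) is an assumption; check the schema BigQuery generates in your project.

```python
# Sketch: count yesterday's Pageview events to confirm routing is active.
# Assumed (illustrative) names: dataset "permutive_events" and timestamp
# column "time"; adjust to match your own project.
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project-id")  # placeholder

query = """
SELECT
  DATE(time) AS event_date,
  COUNT(*)   AS events
FROM `your-gcp-project-id.permutive_events.Pageview_events`
WHERE DATE(time) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
GROUP BY event_date
"""

for row in client.query(query).result():
    print(row.event_date, row.events)
```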
Data Types
The BigQuery Routing integration creates the following tables in your dataset to store different types of data:
Event Tables
Permutive creates separate tables for each event type. Tables are named {event_name}_events and contain daily partitions based on event date.
Example tables: Pageview_events, VideoView_events, AffiliateLinkClick_events
Each event table contains the following fields:
- Event timestamp in UTC
- Unique identifier for the event
- Permutive user identifier
- Session identifier
- Page view identifier
- Array of segment IDs the user belongs to at the time of the event
- Event-specific properties stored as a JSON record; the schema varies by event type and is automatically updated when new properties are added
- Workspace identifier
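To illustrate how these fields are typically used together, the sketch below unnests the segments array for one day of Pageview events. The project ID, dataset name, and column names (time, segments) are assumptions for illustration; confirm them against your generated schema.

```python
# Sketch: find which segments appear most often on Pageview events for one day.
# Assumed (illustrative) names: dataset "permutive_events", columns "time" and
# "segments"; verify against the schema generated in your project.
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project-id")  # placeholder

query = """
SELECT
  segment_id,
  COUNT(*) AS events
FROM `your-gcp-project-id.permutive_events.Pageview_events`,
  UNNEST(segments) AS segment_id
WHERE DATE(time) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
GROUP BY segment_id
ORDER BY events DESC
LIMIT 20
"""

for row in client.query(query).result():
    print(row.segment_id, row.events)
```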
Identities Table
Stores identity resolution data. The table is named identities and contains daily partitions.
Schema:
- Timestamp when the identity was captured
- Type of identity event
- Permutive user identifier
- External identity value
- Identity tag or namespace
- Workspace identifier
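As an example of working with this table, the sketch below counts captured identities per namespace over the last week. The project ID, dataset name, and column names (time, tag) are assumptions; verify them against the identities table in your dataset.

```python
# Sketch: count captured identities per namespace (tag) for the last 7 days.
# Assumed (illustrative) column names: "time" and "tag"; confirm against the
# generated "identities" table.
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project-id")  # placeholder

query = """
SELECT
  tag,
  COUNT(*) AS identities_captured
FROM `your-gcp-project-id.permutive_events.identities`
WHERE DATE(time) >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
GROUP BY tag
ORDER BY identities_captured DESC
"""

for row in client.query(query).result():
    print(row.tag, row.identities_captured)
```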
Segment Metadata
Stores segment definitions and metadata. Permutive creates two objects:
- Table segment_metadata_snapshots: Raw snapshots of segment metadata
- View segment_metadata: Deduplicated view of the latest segment metadata
Fields:
- Segment ID number
- Segment name
- Array of tags associated with the segment
- JSON string containing segment configuration and metadata
- Workspace identifier
Use the segment_metadata view for queries to automatically get deduplicated, up-to-date segment information; an example query is shown after the list below.
- Automatic Schema Management: New event types and properties are automatically added to tables without manual intervention
- Partitioning: Event and identity tables are partitioned by day for efficient querying and cost optimization
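For example, segment IDs stored on events can be labelled by joining against the segment_metadata view. In this sketch the project ID and dataset name are placeholders, and the join columns (number on the view, segments and time on the event table) are assumed names; confirm them against your schemas.

```python
# Sketch: label yesterday's Pageview segment activity with segment names by
# joining the (assumed) "number"/"name" columns of the segment_metadata view
# to the unnested "segments" array on the event table.
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project-id")  # placeholder

query = """
SELECT
  sm.name  AS segment_name,
  COUNT(*) AS events
FROM `your-gcp-project-id.permutive_events.Pageview_events` AS e,
  UNNEST(e.segments) AS segment_id
JOIN `your-gcp-project-id.permutive_events.segment_metadata` AS sm
  ON sm.number = segment_id
WHERE DATE(e.time) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
GROUP BY segment_name
ORDER BY events DESC
"""

for row in client.query(query).result():
    print(row.segment_name, row.events)
```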
Troubleshooting
Integration Status Shows 'Failed'
Cause: The Permutive service account was not granted the correct permissions, or permissions were granted after clicking "Confirm account access granted".
Solution:
- Verify the service account has been granted the BigQuery User role at the PROJECT level in the Google Cloud IAM console
- Ensure you selected the correct GCP project
- If permissions were granted incorrectly, you will need to restart the integration configuration from the beginning
- Contact Support if the issue persists after verifying permissions
Dataset Already Exists Error
Cause: A dataset with the same name already exists in your BigQuery project.
Solution:
- Option 1 (Recommended): Use a different dataset name that doesn’t already exist in your project
- Option 2: If you must use an existing dataset, grant the Permutive service account the BigQuery Data Owner role at the dataset level (not just BigQuery User at project level)
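If you do reuse an existing dataset, Option 2's dataset-level grant can be applied in the Google Cloud console, or programmatically along these lines (a sketch using the google-cloud-bigquery client; the project ID, dataset name, and service account email are placeholders for the values shown during setup):

```python
# Sketch: grant dataset-level owner access (the dataset equivalent of
# BigQuery Data Owner) on an existing dataset to the Permutive service account.
from google.cloud import bigquery

PROJECT_ID = "your-gcp-project-id"        # placeholder
DATASET_ID = "your_existing_dataset"      # placeholder
SERVICE_ACCOUNT = "service-account-email-shown-in-permutive-dashboard"  # placeholder

client = bigquery.Client(project=PROJECT_ID)
dataset = client.get_dataset(f"{PROJECT_ID}.{DATASET_ID}")

entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="OWNER",               # dataset-level owner access
        entity_type="userByEmail",  # service accounts are granted via userByEmail
        entity_id=SERVICE_ACCOUNT,
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
print(f"Granted dataset-level OWNER on {DATASET_ID} to {SERVICE_ACCOUNT}")
```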
Service Account Permission Denied
Cause: Your organization's IAM policies may restrict adding external service accounts, or permissions were not granted at the correct level.
Solution:
- Check with your GCP administrator about organization policies that may block external service accounts
- Ensure the BigQuery User role was granted at the PROJECT level, not the dataset level
- Verify your user account has permissions to grant IAM roles in the GCP project
- If organization policies block external service accounts, work with your security team to add an exception for the @permutive-routing-production.iam.gserviceaccount.com domain
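To verify the project-level grant programmatically, you can inspect the project's IAM policy for the BigQuery User role (roles/bigquery.user). This sketch uses the Resource Manager Python client with a placeholder project ID:

```python
# Sketch: list project-level members holding roles/bigquery.user, to verify the
# Permutive service account's grant. Requires `pip install google-cloud-resource-manager`.
from google.cloud import resourcemanager_v3

PROJECT_ID = "your-gcp-project-id"  # placeholder

client = resourcemanager_v3.ProjectsClient()
policy = client.get_iam_policy(request={"resource": f"projects/{PROJECT_ID}"})

for binding in policy.bindings:
    if binding.role == "roles/bigquery.user":
        for member in binding.members:
            print(member)  # look for the Permutive service account in this list
```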
Changelog
No changes listed yet. For detailed changelog information, visit our Changelog.