Automatically sync your Saber sales intelligence and signals to BigQuery, and enrich your account research with internal data from BigQuery. This connector:
Exports Saber data to your BigQuery warehouse daily to centralise account- and contact-level intelligence
Pulls in internal account and contact data from BigQuery to enrich your research
Combines external research with your internal data for AI-powered insights
Enables personalised messaging based on your company's historical interactions
Supports custom SQL queries to fetch specific data for account intelligence
The BigQuery connector is only available on the Team and Enterprise plans.
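To illustrate the custom SQL support mentioned above, here is a minimal sketch of how an import query for account intelligence might be assembled. The dataset, table, and column names (`internal_crm`, `accounts`, `domain`, `account_owner`, `lifetime_value`) are hypothetical placeholders, not part of Saber's schema — substitute your own.

```python
# Sketch: build a custom SQL query that fetches internal account data
# for a set of domains. All dataset/table/column names below are
# assumed placeholders -- replace them with your own schema.
def build_account_query(dataset: str, table: str, domains: list[str]) -> str:
    """Return a query selecting internal fields for the given account
    domains, lowercased so they match Saber's domain-based matching."""
    quoted = ", ".join(f"'{d.lower()}'" for d in domains)
    return (
        f"SELECT domain, account_owner, lifetime_value "
        f"FROM `{dataset}.{table}` "
        f"WHERE LOWER(domain) IN ({quoted})"
    )

query = build_account_query("internal_crm", "accounts", ["Acme.com", "globex.io"])
print(query)
```

Generating the query from a domain list like this keeps the matching field lowercase, which matters for the domain-matching checks described under Import Issues below.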
Prerequisites
You will need:
Admin access within Saber
A Google Cloud Project with BigQuery enabled
BigQuery Admin or Data Editor permissions
A service account with appropriate permissions
Initial Setup
In your Saber account, navigate to Settings and click Connectors
Select BigQuery under Data Connectors
Click the "Connect BigQuery" button
Enter your BigQuery configuration details:
Project ID: Your Google Cloud Project ID
Dataset Name: The dataset where Saber will export data (we'll create it if it doesn't exist)
Service Account Email: The email of your service account
Service Account Key: Upload the JSON key file
Click "Test Connection" to verify access
Click "Save Configuration"
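Before uploading the service account key in step 4, it can help to sanity-check the JSON file locally. The sketch below checks for the standard fields of a Google service account key; it is a pre-flight check only, not part of Saber's connector.

```python
import json

# Fields present in every Google service account key file.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def validate_key_file(raw: str) -> list[str]:
    """Return a list of problems found in the key JSON (empty if it looks OK)."""
    try:
        key = json.loads(raw)
    except json.JSONDecodeError:
        return ["file is not valid JSON"]
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - key.keys())]
    if key.get("type") != "service_account":
        problems.append("'type' should be 'service_account'")
    return problems

with_no_key = json.dumps({"type": "service_account", "project_id": "my-project",
                          "client_email": "saber@my-project.iam.gserviceaccount.com"})
print(validate_key_file(with_no_key))  # flags the missing private_key
```

If the returned list is empty, the file at least has the expected shape; "Test Connection" in the connector settings remains the authoritative check.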
After a successful connection, we will have set up:
Authentication between Saber and BigQuery
A dedicated dataset for Saber data (if it didn't exist)
Daily automated exports of your Saber data
The ability to query your BigQuery data from within Saber
You can now configure how data is exported from Saber and imported into Saber.
Troubleshooting
Data Not Appearing in BigQuery
Verify your service account has BigQuery Data Editor role
Check that the dataset location matches your requirements
Ensure your Saber plan includes BigQuery export
The initial export may take up to 24 hours to complete
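To verify the first point — that the service account actually holds a suitable role — you can inspect the project's IAM policy, e.g. the JSON produced by `gcloud projects get-iam-policy PROJECT_ID --format=json`. The sketch below assumes that standard policy shape (a `bindings` list of `role`/`members` entries); the email used is a placeholder.

```python
# Roles that satisfy the connector's "BigQuery Admin or Data Editor" requirement.
NEEDED_ROLES = {"roles/bigquery.admin", "roles/bigquery.dataEditor"}

def has_bigquery_access(policy: dict, service_account_email: str) -> bool:
    """Check an IAM policy dict (gcloud get-iam-policy JSON shape) for a
    binding granting the service account one of the needed roles."""
    member = f"serviceAccount:{service_account_email}"
    return any(
        binding.get("role") in NEEDED_ROLES and member in binding.get("members", [])
        for binding in policy.get("bindings", [])
    )

policy = {"bindings": [
    {"role": "roles/bigquery.dataEditor",
     "members": ["serviceAccount:saber@my-project.iam.gserviceaccount.com"]},
]}
print(has_bigquery_access(policy, "saber@my-project.iam.gserviceaccount.com"))  # True
```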
Import Issues
Confirm your SQL query returns data in the expected format
Verify domain/company matching fields are properly formatted
Check that your service account can read from source tables
Review query performance (queries time out after 30 seconds)
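The first two checks above can be scripted against a sample of your query's results before wiring the query into Saber. This is a local sketch, not a Saber API: it assumes the rows arrive as dicts and that the matching field is named `domain` (adjust to your own matching field).

```python
# Sanity-check rows returned by an import query: every row should have
# the same columns, and the matching field should be a bare lowercase
# domain (no scheme, no path) so it can be matched to accounts.
def check_import_rows(rows: list[dict], match_field: str = "domain") -> list[str]:
    if not rows:
        return ["query returned no rows"]
    problems = []
    columns = set(rows[0])
    for i, row in enumerate(rows):
        if set(row) != columns:
            problems.append(f"row {i}: inconsistent columns")
        value = str(row.get(match_field, ""))
        if value != value.lower() or "://" in value or "/" in value:
            problems.append(f"row {i}: {match_field} is not a bare lowercase domain")
    return problems

rows = [{"domain": "acme.com", "owner": "jo"},
        {"domain": "https://Globex.io", "owner": "sam"}]
print(check_import_rows(rows))  # flags row 1's domain format
```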
Common Errors
"Permission denied": Add required BigQuery permissions to service account
"Dataset not found": Verify dataset name and project ID
"Query timeout": Optimise your SQL query or reduce data volume
"Invalid schema": Ensure query returns columns in expected format
FAQ
How much historical data is exported?
Initial export includes all historical data. Subsequent exports are incremental with full refreshes weekly.
Can I customise the export schema?
Yes, contact support to configure custom fields and tables for your specific needs.
Is data encrypted?
Yes, all data is encrypted in transit and at rest. We use Google Cloud's encryption standards.
What's the data refresh frequency?
Exports to BigQuery: Daily at 2 AM UTC
Imports from BigQuery: Configurable per data source (real-time to daily)
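Given the fixed 2 AM UTC export schedule, a small helper can tell you when the next export will land, for example when deciding how fresh the data in your warehouse is:

```python
from datetime import datetime, timedelta, timezone

def next_export(now: datetime) -> datetime:
    """Return the next daily 02:00 UTC export time after `now` (UTC)."""
    run = now.replace(hour=2, minute=0, second=0, microsecond=0)
    if run <= now:          # today's run already happened
        run += timedelta(days=1)
    return run

# At 14:30 UTC the next export is 02:00 UTC the following day.
print(next_export(datetime(2024, 5, 1, 14, 30, tzinfo=timezone.utc)))
```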