SaaS Manual Resource Creation (Google Cloud + BigQuery)
To integrate Masthead with your BigQuery data warehouse, you need to create a few integration resources in your Google Cloud project.
Manage resources with Terraform
Add Masthead agent module to your Terraform project
module "masthead-agent" {
  source  = "masthead-data/masthead-agent/google"
  version = "~> 0.2.0"

  project_id = "YOUR_PROJECT_ID"

  # Enable only specific modules
  enable_modules = {
    bigquery      = true
    dataform      = false
    dataplex      = true
    analytics_hub = true
  }
}
More details about the module are available on the Terraform Registry.
Continue integration on the UI
Go back to the Deployment page, click Manual Deployment, and enter the project ID again.
Then click Check permissions and connect.
You will be forwarded to the overview dashboard for your newly integrated project.
Manual resource creation
Create Pub/Sub resources
In the selected project, navigate to Pub/Sub.
Click Create Topic. Set the topic ID to
masthead-topic
and uncheck the Add a default subscription checkbox.

Navigate into the newly created topic and click Create Subscription.
Set the subscription ID:
masthead-agent-subscription
Set the Cloud Pub/Sub topic ID:
masthead-topic
Then scroll down and set the Acknowledgement deadline:
60 seconds
Click Create. The Pub/Sub topic and subscription are now ready.
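If you prefer the command line, the console steps above can also be performed with gcloud. This is a sketch; YOUR_PROJECT_ID is a placeholder for your own project ID:

```shell
# Create the topic (gcloud does not add a default subscription).
gcloud pubsub topics create masthead-topic \
  --project=YOUR_PROJECT_ID

# Create the pull subscription with a 60-second acknowledgement deadline.
gcloud pubsub subscriptions create masthead-agent-subscription \
  --project=YOUR_PROJECT_ID \
  --topic=masthead-topic \
  --ack-deadline=60
```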

Create Logs Router
Navigate to Logs Router from the Logging menu. Click the Create Sink button in the upper right corner and fill in the required fields:
Name:
masthead-agent-sink

Set Sink Destination: Cloud Pub/Sub topic -> from the dropdown, choose the newly created Pub/Sub topic
masthead-topic

Choose the logs to include in the sink. Copy the filter below and paste it into the inclusion filter field:
(protoPayload.methodName="google.cloud.bigquery.storage.v1.BigQueryWrite.AppendRows"
OR protoPayload.methodName="google.cloud.bigquery.v2.JobService.InsertJob"
OR protoPayload.methodName="google.cloud.bigquery.v2.TableService.InsertTable"
OR protoPayload.methodName="google.cloud.bigquery.v2.JobService.Query")
(resource.type="bigquery_table" OR resource.type="bigquery_dataset" OR resource.type="bigquery_project")

Hit Create Sink to complete the creation.
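Alternatively, the sink can be created from the command line. A sketch using the standard gcloud logging sinks create command, with YOUR_PROJECT_ID as a placeholder:

```shell
# Create a log sink that routes matching BigQuery audit logs to the topic.
gcloud logging sinks create masthead-agent-sink \
  pubsub.googleapis.com/projects/YOUR_PROJECT_ID/topics/masthead-topic \
  --project=YOUR_PROJECT_ID \
  --log-filter='(protoPayload.methodName="google.cloud.bigquery.storage.v1.BigQueryWrite.AppendRows" OR protoPayload.methodName="google.cloud.bigquery.v2.JobService.InsertJob" OR protoPayload.methodName="google.cloud.bigquery.v2.TableService.InsertTable" OR protoPayload.methodName="google.cloud.bigquery.v2.JobService.Query") (resource.type="bigquery_table" OR resource.type="bigquery_dataset" OR resource.type="bigquery_project")'
```

Note that when a sink is created via the CLI, its writer identity still needs the Pub/Sub Publisher role on the topic; the console offers to grant this automatically, but with gcloud it must be granted separately.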

Grant Masthead Service Account roles
Grant [email protected]
next roles:
BigQuery Metadata Viewer
BigQuery Resource Viewer
Pub/Sub Subscriber
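The three grants above can also be made with gcloud. This is a sketch; the role IDs are the standard identifiers corresponding to the console names listed above, and YOUR_PROJECT_ID is a placeholder:

```shell
# Grant each required role to the Masthead agent service account.
for role in roles/bigquery.metadataViewer \
            roles/bigquery.resourceViewer \
            roles/pubsub.subscriber; do
  gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
    --member="serviceAccount:[email protected]" \
    --role="$role"
done
```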
Grant the Masthead Service Account a role to quickly onboard from retrospective data
The Masthead platform can surface insights within a few hours by parsing retrospective logs and building a data model from them. To do so, grant the service account
[email protected]
the
Private Logs Viewer
role.
Navigate to IAM & Admin -> IAM.
Click the GRANT ACCESS button at the top left of the screen.
Specify
[email protected]
in the New principals field.
Click Select a role and type
Private Logs Viewer
, then select the Private Logs Viewer role.
Click SAVE.
This will enable the Masthead agent to look up only recently produced events in Google Cloud that correspond to the filter defined in the Cloud Logs sink.
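The same grant can be made with a single gcloud command; roles/logging.privateLogViewer is the role ID behind the Private Logs Viewer console name, and YOUR_PROJECT_ID is a placeholder:

```shell
# Grant the Private Logs Viewer role to the Masthead agent service account.
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="serviceAccount:[email protected]" \
  --role="roles/logging.privateLogViewer"
```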
Continue integration on the UI
Go back to the Deployment page, click Manual Deployment, and enter the project ID again.
Then click Check permissions and connect.
You will be forwarded to the overview dashboard within your newly integrated project.
Route retrospective logs
Get observability insights within a few hours of project integration by exporting retrospective logs to Masthead for analysis.
Masthead will grant the
Storage Object Creator
permission to your service account so it can write the exported retrospective logs into Masthead's Cloud Storage bucket.
The account running the export must have the following roles in your project:
Logging Admin (roles/logging.admin)
Logs Configuration Writer (roles/logging.configWriter)
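If the account running the export is missing these roles, they can be granted with gcloud. This sketch assumes a user account; YOUR_PROJECT_ID and YOUR_ACCOUNT_EMAIL are placeholders:

```shell
# Grant the roles needed to run the log export.
for role in roles/logging.admin roles/logging.configWriter; do
  gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
    --member="user:YOUR_ACCOUNT_EMAIL" \
    --role="$role"
done
```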
Copy the command below, insert your project ID, and run it to start the export operation:
gcloud logging copy _Default storage.googleapis.com/masthead_retro_logs_{PROJECT_ID} \
--location=global \
--log-filter='protoPayload.methodName="google.cloud.bigquery.storage.v1.BigQueryWrite.AppendRows" OR protoPayload.methodName="google.cloud.bigquery.v2.JobService.InsertJob" OR protoPayload.methodName="google.cloud.bigquery.v2.TableService.InsertTable" OR protoPayload.methodName="google.cloud.bigquery.v2.JobService.Query" resource.type="bigquery_table" OR resource.type="bigquery_dataset" OR resource.type="bigquery_project" timestamp > "2025-06-18T10:00:00.0Z"'