SaaS Manual Resource Creation (Google Cloud + BigQuery)

To integrate Masthead with your BigQuery data warehouse, you need to create a few integration resources in your Google Cloud project.

The account creating the resources should have the Owner role on the Google Cloud project, or the following permissions:

Manage resources with Terraform

1

Add Masthead agent module to your Terraform project

module "masthead-agent" {
  source  = "masthead-data/masthead-agent/google"
  version = "~> 0.2.0"
  
  project_id = "YOUR_PROJECT_ID"
  
  # Enable only specific modules
  enable_modules = {
    bigquery      = true
    dataform      = false
    dataplex      = true
    analytics_hub = true
  }
}

More details about the module are available on the Terraform Registry.

2

Apply the new resource changes

Update your dependencies:

terraform init

Plan and verify the resource changes:

terraform plan

Apply the changes to your cloud project:

terraform apply

3

Continue integration on the UI

Go back to the Deployment page, click Manual Deployment, and enter the project ID again.

Then click Check permissions and connect.

You will be forwarded to the overview dashboard for your newly integrated project.

Manual resource creation

1

Select Google Cloud project

Choose the project that contains the BigQuery datasets you would like Masthead to monitor. You will create the integration resources in this project.

Copy the project ID; you will need it later for the Manual deployment option.

2

Create Pub/Sub resources

In the selected project, navigate to Pub/Sub and complete the steps below (a gcloud equivalent follows the list).

  1. Click Create Topic and set the topic ID to masthead-topic (uncheck the Add a default subscription checkbox).

  2. Open the newly created topic and click Create Subscription.

  3. Set the subscription ID: masthead-agent-subscription

  4. Set the Cloud Pub/Sub topic ID: masthead-topic

  5. Set the Acknowledgement deadline: 60 seconds

  6. Click Create. The Pub/Sub topic and subscription are now ready.
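If you prefer the command line, the same resources can be created with gcloud. This is a sketch that assumes gcloud is already authenticated against the selected project:

# Create the topic (gcloud does not add a default subscription)
gcloud pubsub topics create masthead-topic

# Create the pull subscription with a 60-second acknowledgement deadline
gcloud pubsub subscriptions create masthead-agent-subscription \
  --topic=masthead-topic \
  --ack-deadline=60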

3

Create Logs Router

Navigate to Logs Router from the Logging menu. Click the Create Sink button in the upper right corner and fill in the required fields (a gcloud equivalent follows the list):

  1. Name: masthead-agent-sink

  2. Set the sink destination: Cloud Pub/Sub topic -> choose the newly created Pub/Sub topic masthead-topic from the dropdown.

  3. Choose logs to include in the sink. Copy the filter below and paste it into the filter field.

protoPayload.methodName="google.cloud.bigquery.storage.v1.BigQueryWrite.AppendRows" 
OR protoPayload.methodName="google.cloud.bigquery.v2.JobService.InsertJob" 
OR protoPayload.methodName="google.cloud.bigquery.v2.TableService.InsertTable" 
OR protoPayload.methodName="google.cloud.bigquery.v2.JobService.Query" 
resource.type="bigquery_table" OR resource.type="bigquery_dataset" OR resource.type="bigquery_project"

  4. Click Create Sink to complete the creation.
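Alternatively, the sink can be created from the command line. A sketch, assuming YOUR_PROJECT_ID is replaced with the project ID you copied earlier:

# Create the sink that routes matching BigQuery logs to masthead-topic
gcloud logging sinks create masthead-agent-sink \
  pubsub.googleapis.com/projects/YOUR_PROJECT_ID/topics/masthead-topic \
  --log-filter='protoPayload.methodName="google.cloud.bigquery.storage.v1.BigQueryWrite.AppendRows" OR protoPayload.methodName="google.cloud.bigquery.v2.JobService.InsertJob" OR protoPayload.methodName="google.cloud.bigquery.v2.TableService.InsertTable" OR protoPayload.methodName="google.cloud.bigquery.v2.JobService.Query" resource.type="bigquery_table" OR resource.type="bigquery_dataset" OR resource.type="bigquery_project"'

Note that a sink created with gcloud, unlike one created in the console, does not automatically get permission to publish: grant the sink's writer identity (shown in the command output) the Pub/Sub Publisher role on masthead-topic.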

4

Grant Masthead Service Account roles

Grant [email protected] next roles:

  • BigQuery Metadata Viewer

  • BigQuery Resource Viewer

  • Pub/Sub Subscriber
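The same roles can be granted from the command line. A sketch, assuming YOUR_PROJECT_ID is replaced with your project ID; the role IDs correspond to the console role names above:

SA="serviceAccount:[email protected]"

# BigQuery Metadata Viewer
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="$SA" --role="roles/bigquery.metadataViewer"

# BigQuery Resource Viewer
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="$SA" --role="roles/bigquery.resourceViewer"

# Pub/Sub Subscriber
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="$SA" --role="roles/pubsub.subscriber"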

5

Grant the Masthead Service Account access to retrospective data for quick onboarding

The Masthead platform can gain insights within a few hours by parsing retrospective logs and building a data model from them. To do so, grant the service account [email protected] the Private Logs Viewer role:

  1. Navigate to IAM & Admin -> IAM

  2. Click the GRANT ACCESS button at the top of the screen

  3. Specify [email protected] in the New principals field

  4. Click Select a role, type Private Logs Viewer, and select the Private Logs Viewer role

  5. Click SAVE

This enables the Masthead Agent to look up only recently produced events in Google Cloud that correspond to the filter defined in the Cloud Logs Sink. The same grant can be made from the command line, as sketched below.
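A minimal gcloud sketch, assuming YOUR_PROJECT_ID is replaced with your project ID; Private Logs Viewer corresponds to the role ID roles/logging.privateLogViewer:

# Grant the Private Logs Viewer role to the Masthead service account
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="serviceAccount:[email protected]" \
  --role="roles/logging.privateLogViewer"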

6

Continue integration on the UI

Go back to the Deployment page, click Manual Deployment, and enter the project ID again.

Then click Check permissions and connect.

You will be forwarded to the overview dashboard for your newly integrated project.

Route retrospective logs

Get observability insights within a few hours of project integration by exporting retrospective logs to Masthead for analysis.

1

Masthead will grant the Storage Object Creator permission to your service account so it can write the exported retrospective logs into Masthead's Cloud Storage bucket.

The account must have the following permissions in your project:

2

Copy the command below, insert your project ID, and run it to start the export operation:

gcloud logging copy _Default storage.googleapis.com/masthead_retro_logs_{PROJECT_ID} \
--location=global \
--log-filter='protoPayload.methodName="google.cloud.bigquery.storage.v1.BigQueryWrite.AppendRows" OR protoPayload.methodName="google.cloud.bigquery.v2.JobService.InsertJob" OR protoPayload.methodName="google.cloud.bigquery.v2.TableService.InsertTable" OR protoPayload.methodName="google.cloud.bigquery.v2.JobService.Query" resource.type="bigquery_table" OR resource.type="bigquery_dataset" OR resource.type="bigquery_project" timestamp > "2025-06-18T10:00:00.0Z"'

How to batch and route logs retrospectively
