
Integrate using Terraform

To connect Masthead with your BigQuery data warehouse, you’ll need to create a few integration resources in your Google Cloud project.

Add Masthead agent module to your IaC project

module "masthead-agent" {
source = "masthead-data/masthead-agent/google"
project_id = "YOUR_PROJECT_ID"
# Enable modules for the used services
enable_modules = {
bigquery = true
dataform = true
dataplex = true
analytics_hub = true
}
}

More details about the module are available on the Terraform Registry.

```bash
terraform init
terraform plan
terraform apply
```

Go back to the Deployment page, click Verify custom deployment, enter the project ID, and click Check permissions and connect.

The Masthead installation will create your workspace and open your dashboard once setup is complete.

You’ll receive an email notification with the link to the workspace. We’ll notify you once all the metadata is collected and the insights are ready.

All done! Thank you for completing the installation!

Masthead allows you to get observability insights within a few hours after project integration.

By default, Masthead uses the Private Logs Viewer role to automatically export 30 days of retrospective logs. If you set enable_privatelogviewer_role = false in the Terraform module, you can still get quick insights by exporting the retrospective logs to Masthead for analysis manually, as described below.
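As a reference, here is a minimal sketch of the module block with that flag disabled, assuming the input is accepted alongside the options shown earlier (check the module's documented inputs on the Terraform Registry):

```hcl
module "masthead-agent" {
  source     = "masthead-data/masthead-agent/google"
  project_id = "YOUR_PROJECT_ID"

  # Assumption: skips granting Private Logs Viewer, so retrospective logs
  # must be exported manually as described in the steps below
  enable_privatelogviewer_role = false

  enable_modules = {
    bigquery = true
  }
}
```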

Masthead will grant the Storage Object Creator role to your service account so it can write the exported retrospective logs into Masthead’s Cloud Storage bucket.

The account must have the following permissions in your project:

Copy the command below, update the placeholders, and run it to start the export operation:

  • PROJECT_ID - your project ID
  • YYYY-MM-DD - export start date, 30 days ago
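To compute that start date instead of filling it in by hand, something like the following works with GNU date (on macOS/BSD the equivalent is `date -u -v-30d +%Y-%m-%d`):

```bash
# Print the date 30 days ago as YYYY-MM-DD (GNU coreutils date)
date -u -d "30 days ago" +%Y-%m-%d
```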
```bash
gcloud logging copy _Default storage.googleapis.com/masthead_retro_logs_{PROJECT_ID} \
  --location=global \
  --log-filter='protoPayload.methodName="google.cloud.bigquery.storage.v1.BigQueryWrite.AppendRows" OR protoPayload.methodName="google.cloud.bigquery.v2.JobService.InsertJob" OR protoPayload.methodName="google.cloud.bigquery.v2.TableService.InsertTable" OR protoPayload.methodName="google.cloud.bigquery.v2.JobService.Query" resource.type="bigquery_table" OR resource.type="bigquery_dataset" OR resource.type="bigquery_project" timestamp > "{YYYY-MM-DD}T00:00:00.0Z"'
```

For more details, see How to batch and route logs retrospectively.

[!NOTE] Check the progress of the operation you started:

```bash
gcloud logging operations describe OPERATION_ID \
  --location=global --project=PROJECT_ID
```
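The copy command starts a long-running operation and prints its name, which contains the OPERATION_ID. If you did not capture it, you can list recent copy operations; this is a sketch assuming the same global location used above:

```bash
# List log-copy operations to recover the OPERATION_ID
gcloud logging operations list \
  --location=global \
  --operation-filter='request_type=CopyLogEntries' \
  --project=PROJECT_ID
```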

Your Masthead account will use the exported data to generate compute and cost insights right away, without waiting weeks for new logs to accumulate.

If your Google Cloud project is within a VPC service perimeter, you must configure ingress and egress policies to ensure that the necessary resources are accessible to Masthead Data.

Suggested policy configuration:
```hcl
restricted_services = [
  "bigquery.googleapis.com",
  "pubsub.googleapis.com",
  "logging.googleapis.com",
  "dataplex.googleapis.com",
  "analyticshub.googleapis.com"
]

ingress_policies {
  ingress_from {
    identities = [
      "serviceAccount:masthead-data@masthead-prod.iam.gserviceaccount.com",
      "serviceAccount:retro-data@masthead-prod.iam.gserviceaccount.com",
      "serviceAccount:masthead-dataplex@masthead-prod.iam.gserviceaccount.com",
      "serviceAccount:masthead-dataform@masthead-prod.iam.gserviceaccount.com"
    ]
    sources {
      resource = "projects/431544431936" # masthead-prod
    }
  }
  ingress_to {
    resources = ["*"]
    roles = [
      "roles/bigquery.metadataViewer",
      "roles/bigquery.resourceViewer",
      "roles/logging.privateLogViewer",
      "roles/pubsub.subscriber",
      "roles/analyticshub.viewer",
      "projects/YOUR_PROJECT_ID/roles/analyticsHubSubscriptionViewer"
    ]
  }
}

egress_policies {
  egress_to {
    resources = ["*"]
    roles = [
      "roles/bigquery.metadataViewer",
      "roles/bigquery.resourceViewer"
    ]
  }
  egress_from {
    identities = [
      "serviceAccount:masthead-data@masthead-prod.iam.gserviceaccount.com"
    ]
    sources {
      resource = "projects/431544431936" # masthead-prod
    }
  }
}
```
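If the perimeter is managed outside of Terraform, the same rules can be applied with gcloud by expressing them in its ingress/egress YAML format; the file names below are placeholders:

```bash
# Apply ingress and egress rules to an existing service perimeter
gcloud access-context-manager perimeters update YOUR_PERIMETER_NAME \
  --set-ingress-policies=ingress.yaml \
  --set-egress-policies=egress.yaml \
  --policy=YOUR_ACCESS_POLICY_ID
```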