Cloud Storage (GCS) Source
Onboarding Google Cloud Storage (GCS) as a Data Transport log source in the Panther Console
With Google Cloud Storage (GCS) as a log source, Panther can pull log data directly from your GCS buckets, letting you write rules and run queries against the processed data.
Data can be sent compressed (or uncompressed). Learn more about compression specifications in Ingesting compressed data in Panther.
Panther uses Pub/Sub to get notified of new data to consume in your bucket. Configure a bucket to send notifications for new files to the topic you created.
In the left-hand navigation bar of your Panther Console, click Configure > Log Sources.
In the upper-right corner, click Create New.
Click the Custom Log Formats tile.
On the Google Cloud Storage tile, click Start.
On the "Configure your source" page, fill in the fields:
Source's Name: Enter a descriptive name for the GCS log source.
Log Stream Type: Select the log format Panther should use to parse your GCS logs.
GCS Prefix Filter: Define a prefix to tell Panther which folders to include if your bucket contains multiple data types. Leave this field blank to allow ingestion of all files.
Click Setup.
On the "Infrastructure & Credentials" page, follow the steps to create the infrastructure component with a Terraform template. If you do not want to use a Terraform Template, you can follow our alternative documentation to complete the infrastructure components process manually.
Download and complete the Terraform template
Download the Terraform Template.
Fill out the fields in the panther.tfvars file with your configuration.
Initialize a working directory containing Terraform configuration files by running the Terraform command provided.
Copy the corresponding Terraform or gcloud command provided and run it in your CLI.
Generate a JSON keyfile by replacing the value for your service account email in the gcloud command code listed.
You can find the key file in the output of the Terraform run.
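Taken together, the Terraform steps above typically boil down to a short sequence of commands. This is a sketch only; the exact variable names expected in panther.tfvars come from the downloaded template:

```shell
# Run from the directory containing the downloaded Terraform template
terraform init                           # initialize the working directory and providers
terraform plan -var-file=panther.tfvars  # preview the infrastructure to be created
terraform apply -var-file=panther.tfvars # create the topic, subscription, and service account
```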
Provide pulling configuration & JSON Keyfile
Drag and drop or upload the JSON key into the correct field in Step 2.
Paste in your GCS Bucket Name and Pub/Sub Subscription ID, found in the Subscriptions section of your Google Cloud account.
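If you prefer the CLI to the Cloud console, the bucket name and subscription ID can also be looked up with standard listing commands (a sketch; both commands list resources in the currently active project):

```shell
gsutil ls                        # list GCS buckets in the active project
gcloud pubsub subscriptions list # list Pub/Sub subscriptions and their IDs
```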
Click Setup. You will be directed to a success screen.
You can optionally enable one or more Detection Packs.
The Trigger an alert when no events are processed setting defaults to YES. We recommend leaving this enabled, as you will be alerted if data stops flowing from the log source after a certain period of time. The timeframe is configurable, with a default of 24 hours.
If you choose to create the infrastructure components manually rather than using a Terraform template during the GCS setup above, follow the instructions below.
Log in to your Google Cloud console.
Determine which bucket Panther will pull logs from.
If you have not created a bucket yet, please see Google's documentation on creating a bucket.
Create a topic for the notifications.
You can create a topic using the gcloud CLI tool with the following command format:
gcloud pubsub topics create $TOPIC_ID
Configure the bucket to send notifications for new files to the topic you created.
You can create a notification using the gcloud CLI tool with the following command format:
gsutil notification create -t $TOPIC_NAME -e OBJECT_FINALIZE -f json gs://$BUCKET_NAME
Note: Panther only requires the OBJECT_FINALIZE event type.
Create a subscription to be used with the topic you created. Note: This subscription should not be used by any service other than Panther.
You can create a subscription using the gcloud CLI tool with the following command format:
gcloud pubsub subscriptions create $SUBSCRIPTION_ID --topic $TOPIC_ID --topic-project $PROJECT_ID
Create a new Google Cloud service account. Make sure to take note of the account email address, as Panther will use this to access the infrastructure created for this GCS integration.
To create the account using the gcloud CLI tool, use the following command format:
gcloud iam service-accounts create $SERVICE_ACCOUNT_ID --description="$DESCRIPTION" --display-name="$DISPLAY_NAME"
Assign the required IAM roles to the account.
The following permissions are required in the project where the Pub/Sub subscription and topic live:
Note: You can grant these roles using the gcloud CLI tool:
gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/storage.objectViewer"
gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/pubsub.subscriber"
gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/pubsub.viewer"
gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/monitoring.viewer"
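The four bindings above can also be applied in a single loop. This is a sketch; the project ID and service account email below are placeholders you would replace with your own values:

```shell
PROJECT_ID="my-project"                                                # placeholder
SERVICE_ACCOUNT_EMAIL="panther-gcs@my-project.iam.gserviceaccount.com" # placeholder

# Bind each required role to the service account in turn
for role in roles/storage.objectViewer roles/pubsub.subscriber \
            roles/pubsub.viewer roles/monitoring.viewer; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" \
    --role="$role"
done
```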
Generate a JSON key file for the service account, which will be used in Panther to authenticate to the GCP infrastructure.
To create a JSON key file using the gcloud CLI tool, use the following command format:
gcloud iam service-accounts keys create $KEYFILE_PATH --iam-account=$SERVICE_ACCOUNT_EMAIL
Download the key file.
Open the GCP terminal ("Activate Cloud Shell").
Click the three-dot menu icon in the top right, then click Download.
Click the folder icon for Browse.
Navigate to the key file and select it, then click Download.
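Optionally, you can sanity-check the downloaded key file before uploading it to Panther (a sketch; `keyfile.json` is a placeholder path):

```shell
# Authenticate the gcloud CLI as the service account using the key file;
# success confirms the key is valid and well-formed
gcloud auth activate-service-account --key-file=keyfile.json
```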
After GCS log sources are fully configured, you can search your data in Data Explorer. For more information and for example queries, please see the documentation on Data Explorer.
Note: You can set conditions or IAM policies on permissions for specific resources. This can be done either on the IAM page of the service account or on the specific resource's page.
| Permissions required | Role | Scope |
| --- | --- | --- |
| storage.objects.get, storage.objects.list | roles/storage.objectViewer | bucket-name |
| pubsub.subscriptions.consume | roles/pubsub.subscriber | subscription-name |
| pubsub.subscriptions.get | roles/pubsub.viewer | subscription-name |
| monitoring.timeSeries.list | roles/monitoring.viewer | project |