Cloud Storage (GCS) Source
Onboarding GCS as a Data Transport log source in the Panther Console
Overview
Panther supports configuring Google Cloud Storage (GCS) as a Data Transport, letting you pull log data directly from GCS buckets, write rules against it, and run queries on the processed data. Panther uses Pub/Sub to be notified when new data in your bucket is ready to be consumed.
Panther can authenticate against your source using Google Cloud Workload Identity Federation or a service account.
Data can be sent compressed (or uncompressed). Learn more about compression specifications in Ingesting compressed data in Panther.
How to set up a GCS log source in Panther
Step 1: Begin creating the GCS source in Panther
In the left-hand navigation bar of your Panther Console, click Configure > Log Sources.
In the upper-right corner, click Create New.
Click the Google Cloud Storage tile.
On the Basic Info page, fill in the fields:
Name: Enter a descriptive name for the GCS log source.
Prefixes & Schemas: Define combinations of prefixes, schemas, and exclusion filters, according to the structure of your data storage in GCS.
To attach one or more schemas to all data in the bucket, leave the GCS Prefix field blank. This will create a wildcard (*) prefix.
Click Setup.
On the Log Format page, select the stream type of the incoming logs:
Auto
Lines
JSON
JSON Array
Click Continue.
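The stream type tells Panther how to split each file into individual events (Auto attempts to detect the format for you). As a rough sketch, the three explicit types correspond to file layouts like the following; the file names and event fields below are made up for illustration:

```shell
# Illustrative only: the same two events written in each stream-type layout.

# Lines: one event per newline-delimited line (e.g. NDJSON or syslog-style)
printf '%s\n' '{"action":"login"}' '{"action":"logout"}' > lines.ndjson

# JSON: whole JSON objects, not necessarily separated by newlines
printf '%s' '{"action":"login"}{"action":"logout"}' > objects.json

# JSON Array: a single JSON array wrapping all events
printf '%s' '[{"action":"login"},{"action":"logout"}]' > array.json
```

If your files consistently match one of these layouts, selecting the explicit type avoids relying on auto-detection.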
Step 2: Create required Google Cloud Platform (GCP) infrastructure
Before creating GCP infrastructure, you'll need to decide:
The authentication method: You can use Google Cloud Workload Identity Federation with AWS or a service account—see the top-level tabs below.
The creation method: You can use Terraform to create the infrastructure, or create it manually in the GCP console—see the sub-tabs within each top-level tab below.
If you'd like Panther to authenticate using a Google Cloud service account, follow the instructions in one of the tabs below.
To create GCP infrastructure using Terraform (authenticating with a service account):
On the Infrastructure & Credentials page, click the Service Account tab.

Click Terraform Template to download the Terraform template.
You can also find the Terraform template at this GitHub link.
Fill out the fields in the panther.tfvars file with your configuration. Set authentication_method to "service_account".
Initialize a working directory containing the Terraform configuration files by running terraform init.
Copy the corresponding Terraform Command provided and run it in your CLI.
Generate a JSON key file by copying the gcloud Command provided, replacing the value for your service account email address, and running it in your CLI.
You can find the service account email in the output of the Terraform Command.
To create the GCP infrastructure components manually in the GCP console (authenticating with a service account):
In your Google Cloud console, determine which bucket Panther will pull logs from.
If you have not created a bucket yet, please see Google's documentation on creating a bucket.
Create a topic for the notifications.
You can create a topic using the gcloud CLI tool with the following command format:
gcloud pubsub topics create $TOPIC_ID
Configure the bucket to send notifications for new files to the topic you created.
You can create a notification using the gsutil CLI tool with the following command format:
gsutil notification create -t $TOPIC_NAME -e OBJECT_FINALIZE -f json gs://$BUCKET_NAME
Note: Panther only requires the OBJECT_FINALIZE event type.
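To sanity-check this step, you can list the notification configurations attached to the bucket; this is a sketch assuming gsutil is authenticated against the project that owns the bucket:

```shell
# Sketch: confirm the notification configuration is attached to the bucket.
# $BUCKET_NAME is the bucket chosen above.
gsutil notification list gs://$BUCKET_NAME
# The output should reference your topic and the OBJECT_FINALIZE event type.
```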
Create a subscription to be used with the topic you created.
This subscription should not be used by any other service or source.
You can create a subscription using the gcloud CLI tool with the following command format:
gcloud pubsub subscriptions create $SUBSCRIPTION_ID --topic $TOPIC_ID --topic-project $PROJECT_ID
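To confirm the subscription is wired to the right topic before moving on, you can describe it; a sketch assuming the same $SUBSCRIPTION_ID, $TOPIC_ID, and $PROJECT_ID placeholders as above:

```shell
# Sketch: verify the subscription exists and points at the topic created above.
gcloud pubsub subscriptions describe $SUBSCRIPTION_ID --project $PROJECT_ID
# The "topic:" field in the output should read
# projects/$PROJECT_ID/topics/$TOPIC_ID
```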
Create a new Google Cloud service account. To create the account using the gcloud CLI tool, use the following command format:
gcloud iam service-accounts create SA-NAME \
  --description="DESCRIPTION" \
  --display-name="DISPLAY_NAME"
Make sure to take note of the account's email address, as Panther will use it to access the infrastructure created for this GCS integration.
Assign the required IAM roles to the account.
The following permissions are required in the project where the Pub/Sub subscription and topic live:
Permissions required | Role | Scope
storage.objects.get, storage.objects.list | roles/storage.objectViewer | bucket-name
pubsub.subscriptions.consume | roles/pubsub.subscriber | subscription-name
pubsub.subscriptions.get | roles/pubsub.viewer | subscription-name
monitoring.timeSeries.list | roles/monitoring.viewer | project
Note: You can set conditions or IAM policies on permissions for specific resources. This can be done either in the IAM page of the service account (as seen in the example screenshot below) or in the specific resource's page.

Note: You can create the role bindings using the gcloud CLI tool:
gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/storage.objectViewer"
gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/pubsub.subscriber"
gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/pubsub.viewer"
gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/monitoring.viewer"
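After granting the roles, you can double-check what the service account actually holds at the project level. This is a sketch using gcloud's documented flatten/filter pattern; $SERVICE_ACCOUNT_EMAIL is the address noted earlier:

```shell
# Sketch: list the project-level roles granted to the service account.
gcloud projects get-iam-policy $PROJECT_ID \
  --flatten="bindings[].members" \
  --filter="bindings.members:serviceAccount:$SERVICE_ACCOUNT_EMAIL" \
  --format="table(bindings.role)"
```

Roles bound with resource-level conditions or directly on the bucket or subscription will not appear here; check those on the resource's own page.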
Generate a JSON key file for the service account, which will be used in Panther to authenticate to the GCP infrastructure.
To create a JSON key file using the gcloud CLI tool, use the following command format:
gcloud iam service-accounts keys create $KEYFILE_PATH --iam-account=$SERVICE_ACCOUNT_EMAIL
Alternatively, you can run the above command in GCP's Cloud Shell terminal instead of locally, then download the resulting file:
Click the three dots icon menu in the top right, then click Download.
Click the folder icon to Browse.
Navigate to the key file and select it, then click Download.
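Before uploading the key file to Panther, a quick local sanity check can catch a truncated or corrupted download. The file written below is a dummy with a subset of the top-level fields a real key contains (all values are fake); point the checks at your real file instead:

```shell
# Dummy key file for illustration only -- a real one also contains
# private_key and other fields, and must never be committed or shared.
cat > keyfile.json <<'EOF'
{
  "type": "service_account",
  "project_id": "my-project",
  "private_key_id": "abc123",
  "client_email": "panther-sa@my-project.iam.gserviceaccount.com"
}
EOF

# A usable key file is valid JSON with type "service_account"
python3 -m json.tool keyfile.json > /dev/null && echo "valid JSON"
grep -q '"type": "service_account"' keyfile.json && echo "service account key"
```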
If you'd like Panther to authenticate using Google Cloud Workload Identity Federation, follow the instructions in one of the tabs below.
To create the GCP infrastructure components using Terraform (authenticating with Workload Identity Federation):
On the Infrastructure & Credentials page, click the Workload Identity Federation tab.

Click Terraform Template to download the Terraform template.
You can also find the Terraform template at this GitHub link.
Fill out the fields in the panther.tfvars file with your configuration. Set authentication_method to "workload_identity_federation". Provide values for panther_workload_identity_pool_id, panther_workload_identity_pool_provider_id, and panther_aws_account_id.
Initialize a working directory containing the Terraform configuration files by running terraform init.
Copy the corresponding Terraform Command provided and run it in your CLI.
Generate a credential configuration file for the pool by copying the gcloud Command provided, replacing the values for the project number, pool ID, and provider ID, and running it in your CLI.
You can find the project number, the pool ID and the provider ID in the output of the Terraform Command.
To create the GCP infrastructure components manually in the GCP console (authenticating with Workload Identity Federation):
In your Google Cloud console, determine which bucket Panther will pull logs from.
If you have not created a bucket yet, please see Google's documentation on creating a bucket.
Uniform bucket-level access must be enabled on the target bucket in order to grant Workload Identity Federation entities access to cloud storage resources.
Create a topic for the notifications.
You can create a topic using the gcloud CLI tool with the following command format:
gcloud pubsub topics create $TOPIC_ID
Configure the bucket to send notifications for new files to the topic you created.
You can create a notification using the gsutil CLI tool with the following command format:
gsutil notification create -t $TOPIC_NAME -e OBJECT_FINALIZE -f json gs://$BUCKET_NAME
Note: Panther only requires the OBJECT_FINALIZE event type.
Create a subscription to be used with the topic you created.
This subscription should not be used by any other service or source.
You can create a subscription using the gcloud CLI tool with the following command format:
gcloud pubsub subscriptions create $SUBSCRIPTION_ID --topic $TOPIC_ID --topic-project $PROJECT_ID
Configure Workload Identity Federation with AWS by following the Configure Workload Identity Federation with AWS or Azure documentation.
As you define your attribute mappings and condition, note the following examples:
Example attribute mappings:
Google | AWS
google.subject | assertion.arn.extract('arn:aws:sts::{account_id}:')+":"+assertion.arn.extract('assumed-role/{role_and_session}').extract('/{session}')
attribute.account | assertion.account

The value of the google.subject attribute cannot exceed 127 characters. You may use Common Expression Language (CEL) expressions to transform or combine attributes from the token issued by AWS. The expression suggested in the table above takes this limit into account, and is an attempt at transforming the ARN into a value that uniquely identifies Panther entities. For more information on the AWS attributes, see "Example 2 - Called by user created with AssumeRole" on this AWS documentation page.

Example attribute condition:
attribute.account=="<PANTHER_AWS_ACCOUNT_ID>"
When you are adding a provider to your identity pool, select AWS.
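If you prefer the CLI over the console for this step, the mappings and condition above can be passed to gcloud's create-aws provider command. This is a sketch only: $POOL_ID, $PROVIDER_ID, and $PANTHER_AWS_ACCOUNT_ID are placeholders, and you should verify the flags against the current gcloud reference before running:

```shell
# Sketch: create the AWS provider on an existing pool with the example
# attribute mapping and condition from the table above.
gcloud iam workload-identity-pools providers create-aws $PROVIDER_ID \
  --location="global" \
  --workload-identity-pool="$POOL_ID" \
  --account-id="$PANTHER_AWS_ACCOUNT_ID" \
  --attribute-mapping="google.subject=assertion.arn.extract('arn:aws:sts::{account_id}:')+\":\"+assertion.arn.extract('assumed-role/{role_and_session}').extract('/{session}'),attribute.account=assertion.account" \
  --attribute-condition="attribute.account=='$PANTHER_AWS_ACCOUNT_ID'"
```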
Assign the required IAM roles to the account.
The following permissions are required in the project where the Pub/Sub subscription and topic live:
Permissions required | Role | Scope
storage.objects.get, storage.objects.list | roles/storage.objectViewer | bucket-name
pubsub.subscriptions.consume | roles/pubsub.subscriber | subscription-name
pubsub.subscriptions.get | roles/pubsub.viewer | subscription-name
monitoring.timeSeries.list | roles/monitoring.viewer | project
Note: You can set conditions or IAM policies on permissions for specific resources. This can be done either in the IAM section in GCP (as seen in the example screenshot below) or in the specific resource's page.

Note: You can create the role bindings using the gcloud CLI tool, where the $PRINCIPAL_ID may be something like:
principalSet://iam.googleapis.com/projects/<THE_ACTUAL_GOOGLE_PROJECT_NUMBER>/locations/global/workloadIdentityPools/<THE_ACTUAL_POOL_ID>/attribute.account/<THE_ACTUAL_PANTHER_AWS_ACCOUNT_ID>
gcloud projects add-iam-policy-binding $PROJECT_ID --member="$PRINCIPAL_ID" --role="roles/storage.objectViewer"
gcloud projects add-iam-policy-binding $PROJECT_ID --member="$PRINCIPAL_ID" --role="roles/pubsub.subscriber"
gcloud projects add-iam-policy-binding $PROJECT_ID --member="$PRINCIPAL_ID" --role="roles/pubsub.viewer"
gcloud projects add-iam-policy-binding $PROJECT_ID --member="$PRINCIPAL_ID" --role="roles/monitoring.viewer"
Download the credential configuration file, which will be used in Panther to authenticate to the GCP infrastructure.
To generate a credential configuration file using the gcloud CLI tool, use the following command format:
gcloud iam workload-identity-pools create-cred-config projects/$PROJECT_NUMBER/locations/global/workloadIdentityPools/$POOL_ID/providers/$PROVIDER_ID --aws --output-file=config.json
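Before uploading config.json to Panther, it can be worth a quick look at its shape. The file written below is an illustrative approximation of what create-cred-config typically emits for an AWS provider (the IDs are placeholders, and the exact fields may vary by gcloud version); point the check at your real file instead:

```shell
# Illustrative approximation of a generated AWS credential configuration file.
cat > config.json <<'EOF'
{
  "type": "external_account",
  "audience": "//iam.googleapis.com/projects/PROJECT_NUMBER/locations/global/workloadIdentityPools/POOL_ID/providers/PROVIDER_ID",
  "subject_token_type": "urn:ietf:params:aws:token-type:aws4_request",
  "credential_source": {
    "environment_id": "aws1"
  }
}
EOF

# A WIF credential configuration is an "external_account" file that
# references your pool and provider in its audience.
grep -q '"type": "external_account"' config.json && echo "looks like a credential config"
```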
Step 3: Provide credential file and configuration values to Panther
If you are using a Google Cloud service account to authenticate:
Under Provide pulling configuration & JSON Keyfile, upload your JSON key file.
Enter your GCS Bucket Name and Pub/Sub Subscription ID, found in the Subscriptions section of your Google Cloud account.

Click Setup. You will be directed to a success screen:

You can optionally enable one or more Detection Packs.
If you have not done so already, click Attach or Infer Schemas to attach one or more schemas to the source.
The Trigger an alert when no events are processed setting defaults to YES. We recommend leaving this enabled, as you will be alerted if data stops flowing from the log source after a certain period of time. The timeframe is configurable, with a default of 24 hours.

If you are using Google Cloud Workload Identity Federation to authenticate:
On the Infrastructure & Credentials page, if you have not already, click the Workload Identity Federation tab.

Under Provide pulling configuration & credential configuration file, upload your credential configuration file.
Enter your Project ID, GCS Bucket Name, and Pub/Sub Subscription ID found in the Subscriptions section of your Google Cloud account.

Click Setup. You will be directed to a success screen:

You can optionally enable one or more Detection Packs.
If you have not done so already, click Attach or Infer Schemas to attach one or more schemas to the source.
The Trigger an alert when no events are processed setting defaults to YES. We recommend leaving this enabled, as you will be alerted if data stops flowing from the log source after a certain period of time. The timeframe is configurable, with a default of 24 hours.

Viewing ingested logs
After your log source is configured, you can search ingested data using Search or Data Explorer.