Onboarding GCS as a Data Transport log source in the Panther Console
Overview
Panther supports configuring Google Cloud Storage (GCS) as a Data Transport to pull log data directly from GCS buckets, write rules, and run queries on this processed data. Panther uses Pub/Sub to be notified of new data in your bucket that is ready to be consumed.
Creation method: you can use Terraform to create the infrastructure, or create it manually in the GCP console—see the sub-tabs within each top-level tab below.
Workload Identity Federation authentication is in open beta starting with Panther version 1.112, and is available to all customers. Please share any bug reports and feature requests with your Panther support team.
If you'd like Panther to authenticate using a Google Cloud service account, follow the instructions in one of the tabs below.
To create GCP infrastructure using Terraform (authenticating with a service account):
Click Terraform Template to download the Terraform template.
Fill out the fields in the panther.tfvars file with your configuration.
Set authentication_method to "service_account".
Initialize a working directory containing Terraform configuration files and run terraform init.
Copy the corresponding Terraform Command provided and run it in your CLI.
Generate a JSON key file by copying the gcloud Command provided, replacing the value for your service account email address, and running it in your CLI.
You can find the service account email in the output of the Terraform Command.
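The Terraform steps above amount to a short CLI session. The commands below are an illustrative sketch only—the variable file name, service account email, and key file name are placeholder assumptions, and the Panther Console provides the exact commands to run:

```shell
# Initialize the working directory containing the downloaded template
terraform init

# Apply the template with your filled-in variables
terraform apply -var-file=panther.tfvars

# Generate a JSON key for the service account created by Terraform;
# the real service account email appears in the Terraform output
gcloud iam service-accounts keys create panther-keyfile.json \
  --iam-account=panther-log-source@my-project.iam.gserviceaccount.com
```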
To create the GCP infrastructure components manually (authenticating with a service account):
In your Google Cloud console, determine which bucket Panther will pull logs from.
You can create a notification using the gcloud CLI tool with the following command format: gsutil notification create -t $TOPIC_NAME -e OBJECT_FINALIZE -f json gs://$BUCKET_NAME
Note: Panther only requires the OBJECT_FINALIZE type.
This subscription should not be used by any other service or source.
You can create a subscription using the gcloud CLI tool with the following command format: gcloud pubsub subscriptions create $SUBSCRIPTION_ID --topic $TOPIC_ID --topic-project $PROJECT_ID
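Putting the notification and subscription commands together, a complete sketch might look like the following. The bucket, topic, subscription, and project names are placeholder assumptions—substitute your own:

```shell
# Publish a Pub/Sub message whenever a new object finishes uploading to the bucket
gsutil notification create -t panther-gcs-topic -e OBJECT_FINALIZE -f json gs://my-log-bucket

# Create a dedicated subscription for Panther on that topic
gcloud pubsub subscriptions create panther-gcs-sub \
  --topic panther-gcs-topic \
  --topic-project my-project
```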
Generate a JSON key file for the service account, which will be used in Panther to authenticate to the GCP infrastructure.
To create a JSON key file using the gcloud CLI tool, run the following command format:
gcloud iam service-accounts keys create $KEYFILE_PATH --iam-account=$SERVICE_ACCOUNT_EMAIL
Alternatively, you can run the above command in GCP's Cloud Shell terminal instead of locally.
Click the three-dot menu icon in the top right, then click Download.
Click the folder icon for Browse.
Navigate to the key file and select it, then click Download.
Fill out the fields in the panther.tfvars file with your configuration.
Set authentication_method to "workload_identity_federation".
Provide values for panther_workload_identity_pool_id, panther_workload_identity_pool_provider_id, and panther_aws_account_id.
Initialize a working directory containing Terraform configuration files and run terraform init.
Copy the corresponding Terraform Command provided and run it in your CLI.
Generate a credential configuration file for the pool by copying the gcloud Command provided, replacing the value for the project number, pool ID, and provider ID, and running it in your CLI.
You can find the project number, the pool ID and the provider ID in the output of the Terraform Command.
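As with the service account flow, the Workload Identity Federation Terraform steps can be sketched as a short CLI session. The project number, pool ID, provider ID, and file names below are placeholder assumptions—use the values from the Terraform output and the commands provided in the Panther Console:

```shell
# Initialize the working directory containing the downloaded template
terraform init

# Apply the template with your filled-in variables
terraform apply -var-file=panther.tfvars

# Generate a credential configuration file for the pool; replace the
# placeholder project number, pool ID, and provider ID with the values
# from the Terraform output
gcloud iam workload-identity-pools create-cred-config \
  projects/123456789012/locations/global/workloadIdentityPools/panther-pool/providers/panther-provider \
  --aws \
  --output-file=panther-credentials.json
```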
To create the GCP infrastructure components manually (authenticating with Workload Identity Federation):
In your Google Cloud console, determine which bucket Panther will pull logs from.
Uniform bucket-level access must be enabled on the target bucket in order to grant Workload Identity Federation entities access to cloud storage resources.
You can create a notification using the gcloud CLI tool with the following command format: gsutil notification create -t $TOPIC_NAME -e OBJECT_FINALIZE -f json gs://$BUCKET_NAME
Note: Panther only requires the OBJECT_FINALIZE type.
This subscription should not be used by any other service or source.
You can create a subscription using the gcloud CLI tool with the following command format: gcloud pubsub subscriptions create $SUBSCRIPTION_ID --topic $TOPIC_ID --topic-project $PROJECT_ID
The value of the google.subject attribute cannot exceed 127 characters. You may use Common Expression Language (CEL) expressions to transform or combine attributes from the token issued by AWS. The expression suggested in the table above takes this limit into account, and is an attempt at transforming the ARN into a value that uniquely identifies Panther entities. For more information on the AWS attributes, see "Example 2 - Called by user created with AssumeRole" on this AWS documentation page.
The following permissions are required in the project where the Pub/Sub subscription and topic live:

| Permissions required | Role | Scope |
| --- | --- | --- |
| storage.objects.get, storage.objects.list | roles/storage.objectViewer | bucket-name |
| pubsub.subscriptions.consume | roles/pubsub.subscriber | subscription-name |
| pubsub.subscriptions.get | roles/pubsub.viewer | subscription-name |
| monitoring.timeSeries.list | roles/monitoring.viewer | project |
Note: You can create the permissions using the gcloud CLI tool, where the $PRINCIPAL_ID may be something like:
principalSet://iam.googleapis.com/projects/<THE_ACTUAL_GOOGLE_PROJECT_NUMBER>/locations/global/workloadIdentityPools/<THE_ACTUAL_POOL_ID>/attribute.account/<THE_ACTUAL_PANTHER_AWS_ACCOUNT_ID>
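As a sketch, the role grants listed above could be made with commands like the following. The bucket, subscription, and project names are placeholder assumptions, and $PRINCIPAL_ID uses an illustrative principalSet string in the format shown above:

```shell
PRINCIPAL_ID="principalSet://iam.googleapis.com/projects/123456789012/locations/global/workloadIdentityPools/panther-pool/attribute.account/111122223333"

# Bucket-scoped object read access
gcloud storage buckets add-iam-policy-binding gs://my-log-bucket \
  --member="$PRINCIPAL_ID" --role="roles/storage.objectViewer"

# Subscription-scoped consume and read access
gcloud pubsub subscriptions add-iam-policy-binding panther-gcs-sub \
  --member="$PRINCIPAL_ID" --role="roles/pubsub.subscriber"
gcloud pubsub subscriptions add-iam-policy-binding panther-gcs-sub \
  --member="$PRINCIPAL_ID" --role="roles/pubsub.viewer"

# Project-scoped monitoring read access
gcloud projects add-iam-policy-binding my-project \
  --member="$PRINCIPAL_ID" --role="roles/monitoring.viewer"
```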
To generate a credential configuration file using the gcloud CLI tool, use the following command format:
gcloud iam workload-identity-pools create-cred-config projects/$PROJECT_NUMBER/locations/global/workloadIdentityPools/$POOL_ID/providers/$PROVIDER_ID --aws --output-file=config.json
Step 3: Provide credential file and configuration values to Panther
If you have not done so already, click Attach or Infer Schemas to attach one or more schemas to the source.
The Trigger an alert when no events are processed setting defaults to YES. We recommend leaving this enabled, as you will be alerted if data stops flowing from the log source after a certain period of time. The timeframe is configurable, with a default of 24 hours.
Viewing ingested logs
After your log source is configured, you can search ingested data using Search or Data Explorer.
On the Infrastructure & Credentials page, click the Service Account tab.
Note: You can set conditions or IAM policies on permissions for specific resources. This can be done either in the IAM page of the service account (as seen in the example screenshot below) or in the specific resource's page.
On the Infrastructure & Credentials page, click the Workload Identity Federation tab.
Note: You can set conditions or IAM policies on permissions for specific resources. This can be done either in the IAM section in GCP (as seen in the example screenshot below) or in the specific resource's page.
Enter your GCS Bucket Name and Pub/Sub Subscription ID, found in the Subscriptions section of your Google Cloud account.
On the Infrastructure & Credentials page, if you have not already, click the Workload Identity Federation tab.
Enter your Project ID, GCS Bucket Name, and Pub/Sub Subscription ID, found in the Subscriptions section of your Google Cloud account.