Cloud Storage (GCS) Source
Onboarding GCS as a Data Transport log source in the Panther Console
Overview
Panther supports configuring Google Cloud Storage (GCS) as a Data Transport to pull log data directly from GCS buckets, write rules, and run queries on this processed data. Panther uses Pub/Sub to be notified of new data in your bucket that is ready to be consumed.
Panther can authenticate against your source using Google Cloud Workload Identity Federation or a service account.
Data can be sent compressed (or uncompressed). Learn more about compression specifications in Ingesting compressed data in Panther.
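For illustration, here is how a log file might be gzip-compressed before upload. The file and bucket names are made up, and the upload command is shown commented out since it requires the gsutil CLI and an existing bucket:

```shell
# Create a small newline-delimited JSON log file (file name is illustrative)
printf '%s\n' '{"msg":"start"}' '{"msg":"stop"}' > app-logs.jsonl

# gzip the file before upload; -k keeps the original, -f overwrites any prior .gz
gzip -kf app-logs.jsonl

# Upload the compressed object (requires gsutil; bucket name is illustrative):
# gsutil cp app-logs.jsonl.gz gs://my-panther-logs/app/
```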
How to set up a GCS log source in Panther
Step 1: Begin creating the GCS source in Panther
In the left-hand navigation bar of your Panther Console, click Configure > Log Sources.
In the upper-right corner, click Create New.
Click the Custom Log Formats tile.
On the Google Cloud Storage tile, click Start.
On the Basic Info page, fill in the fields:
Name: Enter a descriptive name for the GCS log source.
Prefixes & Schemas: Define combinations of prefixes, schemas, and exclusion filters, according to the structure of your data storage in GCS.
To attach one or more schemas to all data in the bucket, leave the GCS Prefix field blank. This will create a wildcard (*) prefix.
Click Setup.
On the Log Format page, select the stream type of the incoming logs:
Auto
Lines
JSON
JSON Array
Click Continue.
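As a sketch of the difference between stream types, the commands below write the same two events in the two JSON-based layouts. The file names and event fields are made up for illustration:

```shell
# "Lines" / "JSON": one complete JSON event per line
printf '%s\n' '{"level":"info","msg":"start"}' '{"level":"warn","msg":"retry"}' > events.jsonl

# "JSON Array": a single top-level array containing all events
printf '[{"level":"info","msg":"start"},{"level":"warn","msg":"retry"}]\n' > events.json

# With "Auto", Panther infers the stream type from the incoming data itself.
```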
Step 2: Create required Google Cloud Platform (GCP) infrastructure
Before creating GCP infrastructure, you'll need to decide:
The authentication method: You can use Google Cloud Workload Identity Federation with AWS or a service account—see the top-level tabs below.
The creation method: You can use Terraform to create the infrastructure, or create it manually in the GCP console—see the sub-tabs within each top-level tab below.
If you'd like Panther to authenticate using a Google Cloud service account, follow the instructions in one of the tabs below.
To create GCP infrastructure using Terraform (authenticating with a service account):
On the Infrastructure & Credentials page, click the Service Account tab.
Click Terraform Template to download the Terraform template.
You can also find the Terraform template at this GitHub link.
Fill out the fields in the panther.tfvars file with your configuration. Set authentication_method to "service_account".
Initialize a working directory containing Terraform configuration files and run terraform init.
Copy the corresponding Terraform Command provided and run it in your CLI.
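As a minimal sketch, a panther.tfvars file might look like the following. Only authentication_method is confirmed by the steps above; the other variable names and values are illustrative assumptions and may differ from the template's actual inputs:

```hcl
# panther.tfvars — illustrative; only authentication_method is confirmed above,
# the remaining variable names and values are assumptions about the template's inputs
authentication_method = "service_account"
project_id            = "my-gcp-project"
bucket_name           = "my-panther-logs"
```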
Generate a JSON key file by copying the gcloud Command provided, replacing the value for your service account email address, and running it in your CLI.
You can find the service account email in the output of the Terraform Command.
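The gcloud step above can be sketched as follows. The key file name and service account email are placeholders; substitute the email shown in your Terraform Command output:

```shell
# Replace with the service account email from the Terraform output (placeholder value)
SA_EMAIL="panther-gcs@my-gcp-project.iam.gserviceaccount.com"

# Generate a JSON key file for that service account (requires the gcloud CLI)
gcloud iam service-accounts keys create panther-keyfile.json \
  --iam-account="${SA_EMAIL}"
```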
Step 3: Provide credential file and configuration values to Panther
If you are using a Google Cloud service account to authenticate:
Under Provide pulling configuration & JSON Keyfile, upload your JSON key file.
Enter your GCS Bucket Name and Pub/Sub Subscription ID. The Subscription ID can be found in the Pub/Sub > Subscriptions section of the Google Cloud console.
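If you prefer to look these values up from the CLI, the following gcloud commands (run in an authenticated gcloud session for the correct project) list bucket and subscription names:

```shell
# List GCS buckets and Pub/Sub subscriptions in the current project
gcloud storage buckets list --format="value(name)"
gcloud pubsub subscriptions list --format="value(name)"
```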
Click Setup. You will be directed to a success screen.
You can optionally enable one or more Detection Packs.
If you have not done so already, click Attach or Infer Schemas to attach one or more schemas to the source.
The Trigger an alert when no events are processed setting defaults to YES. We recommend leaving this enabled, as you will be alerted if data stops flowing from the log source after a certain period of time. The timeframe is configurable, with a default of 24 hours.
Viewing ingested logs
After your log source is configured, you can search ingested data using Search or Data Explorer.