Cloud Storage (GCS) Source

Onboarding Google Cloud Storage (GCS) as a Data Transport log source in the Panther Console

Overview

With Google Cloud Storage (GCS) as a log source, Panther can pull log data directly from your GCS buckets, letting you write rules against and run queries on the processed data.

Data can be sent compressed or uncompressed. Learn more about compression specifications in Ingesting compressed data in Panther.
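For example, a compressed upload might look like the following sketch, assuming gzip is among the formats covered in that compression documentation (the bucket name is a placeholder):

    # Compress a log file with gzip
    gzip app.log

    # Upload the compressed file to the GCS bucket Panther will read from
    gsutil cp app.log.gz gs://example-panther-logs/logs/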

Prerequisites

Panther uses Pub/Sub to be notified of new data to consume from your bucket. Your bucket must be configured to send notifications for new files to a Pub/Sub topic; the Terraform template in the setup flow below, or the manual steps that follow it, will create this infrastructure.
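As a quick sketch, the topic and bucket notification can be created with the gcloud and gsutil CLI tools (the same commands are covered step by step in the manual configuration section below; all names here are placeholders):

    # Create the Pub/Sub topic that will receive new-file notifications
    gcloud pubsub topics create panther-notifications

    # Configure the bucket to publish OBJECT_FINALIZE (new file) events to the topic
    gsutil notification create -t panther-notifications -e OBJECT_FINALIZE -f json gs://example-panther-logs

    # Verify the bucket's notification configuration
    gsutil notification list gs://example-panther-logs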

How to connect GCS as a Data Transport log source in Panther

  1. In the left-hand navigation bar of your Panther Console, click Configure > Log Sources.

  2. In the upper-right corner, click Create New.

  3. Click the Custom Log Formats tile.

  4. On the Google Cloud Storage tile, click Start.

  5. On the "Configure your source" page, fill in the fields:

    • Source's Name: Enter a descriptive name for the GCS log source.

    • Log Stream Type: Select the log format Panther should use to parse your GCS logs.

  6. Click Setup.

  7. On the "Infrastructure & Credentials" page, follow the steps to create the infrastructure component with a Terraform template. If you do not want to use a Terraform Template, you can follow our alternative documentation to complete the infrastructure components process manually.

    1. Download and complete the Terraform template

      • Download the Terraform Template.

      • Fill out the fields in the panther.tfvars file with your configuration.

      • Initialize a working directory containing the Terraform configuration files by running the terraform init command provided.

      • Copy the corresponding Terraform or gcloud command schema provided and run it in your CLI.

      • Generate a JSON key file by replacing the service account email value in the gcloud command listed, then running it.

        • You can find the key file in the output of the Terraform run.

    2. Provide pulling configuration & JSON Keyfile

      • Drag and drop or upload the JSON key into the correct field in Step 2.

  8. Click Setup. You will be directed to a success screen:

    • You can optionally enable one or more Detection Packs.

    • The Trigger an alert when no events are processed setting defaults to YES. We recommend leaving this enabled so that you are alerted if data stops flowing from the log source after a certain period of time. The timeframe is configurable and defaults to 24 hours.
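To illustrate the Terraform workflow in step 7, here is a minimal sketch. The variable names shown for panther.tfvars and the key file path are hypothetical; use the fields defined in the downloaded template and the service account email reported in its output:

    # Example contents of panther.tfvars after editing (variable names are
    # hypothetical; keep the ones the downloaded template defines)
    #   project_id  = "my-gcp-project"
    #   bucket_name = "example-panther-logs"

    # Initialize a working directory containing the Terraform configuration files
    terraform init

    # Create the topic, subscription, bucket notification, and service account
    terraform apply -var-file=panther.tfvars

    # Generate a JSON key file, substituting the service account email from the
    # Terraform output
    gcloud iam service-accounts keys create keyfile.json \
        --iam-account="$SERVICE_ACCOUNT_EMAIL"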

Alternative to Terraform template: Configuring your integration manually in Google Cloud Platform (GCP)

If you choose to create the infrastructure components manually rather than using a Terraform template during the GCS setup above, follow the instructions below. A consolidated command sketch appears after these steps.

  1. Log in to your Google Cloud console.

  2. Determine which bucket Panther will pull logs from.

  3. Create a topic for the notifications.

    • You can create a topic using the gcloud CLI tool with the following command format: gcloud pubsub topics create $TOPIC_ID

  4. Configure the bucket to send notifications for new files to the topic you created.

    • You can create a notification using the gcloud CLI tool with the following command format: gsutil notification create -t $TOPIC_NAME -e OBJECT_FINALIZE -f json gs://$BUCKET_NAME

    • Note: Panther only requires the OBJECT_FINALIZE type.

  5. Create a subscription to be used with the topic you created. Note: This subscription should not be used by any service other than Panther.

    • You can create a subscription using the gcloud CLI tool with the following command format: gcloud pubsub subscriptions create $SUBSCRIPTION_ID --topic $TOPIC_ID --topic-project $PROJECT_ID

  6. Create a new Google Cloud service account. Make sure to take note of the account email address, as Panther will use this to access the infrastructure created for this GCS integration.

    • To create the account using the gcloud CLI tool, use the following command format: gcloud iam service-accounts create $SERVICE_ACCOUNT_ID --description="$DESCRIPTION" --display-name="$DISPLAY_NAME"

  7. Assign the required IAM roles to the account.

    • The following permissions are required in the project where the Pub/Sub subscription and topic live:

      Permissions required                         Role                          Scope
      storage.objects.get, storage.objects.list    roles/storage.objectViewer    bucket-name
      pubsub.subscriptions.consume                 roles/pubsub.subscriber       subscription-name
      pubsub.subscriptions.get                     roles/pubsub.viewer           subscription-name
      monitoring.timeSeries.list                   roles/monitoring.viewer       project

      • Note: You can grant these roles using the gcloud CLI tool:

        • gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/storage.objectViewer"

        • gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/pubsub.subscriber"

        • gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/pubsub.viewer"

        • gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/monitoring.viewer"

  8. Generate a JSON key file for the service account, which will be used in Panther to authenticate to the GCP infrastructure.

    • To create a JSON key file using the gcloud CLI tool, use the following command format: gcloud iam service-accounts keys create $KEYFILE_PATH --iam-account=$SERVICE_ACCOUNT_EMAIL

  9. Download the key file.

    1. Open the GCP terminal ("Activate Cloud Shell").

    2. Click the three-dot menu in the top right, then click Download.

    3. Click the folder icon to browse.

    4. Navigate to the key file, select it, then click Download.
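Putting the manual steps together, the following sketch runs them end to end. All names are placeholders; substitute your own project, bucket, topic, subscription, and service account values:

    # Placeholder names; substitute your own values
    PROJECT_ID="my-gcp-project"
    BUCKET_NAME="example-panther-logs"
    TOPIC_ID="panther-notifications"
    SUBSCRIPTION_ID="panther-subscription"
    SA_NAME="panther-gcs-reader"
    SERVICE_ACCOUNT_EMAIL="${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"

    # Step 3: create the notification topic
    gcloud pubsub topics create "$TOPIC_ID"

    # Step 4: publish OBJECT_FINALIZE (new file) events from the bucket to the topic
    gsutil notification create -t "$TOPIC_ID" -e OBJECT_FINALIZE -f json "gs://$BUCKET_NAME"

    # Step 5: create the subscription reserved for Panther
    gcloud pubsub subscriptions create "$SUBSCRIPTION_ID" --topic "$TOPIC_ID" --topic-project "$PROJECT_ID"

    # Step 6: create the service account Panther will authenticate as
    gcloud iam service-accounts create "$SA_NAME" --description="Panther GCS ingest" --display-name="Panther GCS reader"

    # Step 7: grant the required roles
    for ROLE in roles/storage.objectViewer roles/pubsub.subscriber roles/pubsub.viewer roles/monitoring.viewer; do
      gcloud projects add-iam-policy-binding "$PROJECT_ID" \
        --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="$ROLE"
    done

    # Step 8: generate the JSON key file to upload in the Panther Console
    gcloud iam service-accounts keys create keyfile.json --iam-account="$SERVICE_ACCOUNT_EMAIL"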

View collected logs

After your GCS log source is fully configured, you can search the incoming data in Data Explorer. For more information and example queries, see the Data Explorer documentation.
