Pub/Sub Source

Onboarding Google Cloud Pub/Sub as a Data Transport log source in the Panther Console

Overview

Panther supports configuring Google Cloud Pub/Sub as a Data Transport to pull log data directly from Pub/Sub topics.

Panther can authenticate against your source using Google Cloud Workload Identity Federation or a service account.

How to set up a Cloud Pub/Sub log source in Panther

Step 1: Configure the Pub/Sub source in Panther

  1. In the left-hand navigation bar of your Panther Console, click Configure > Log Sources.

  2. In the upper-right corner, click Create New.

  3. Click the Custom Log Formats tile.

  4. On the Google Cloud Pub/Sub tile, click Start.

  5. On the Basic Info page, fill in the fields:

    • Name: Enter a descriptive name for the log source.

    • Log Types: Select the log types Panther should use to parse your logs.

  6. Click Setup.

  7. On the Log Format page, select the stream type of the incoming logs (illustrative publish commands follow this list):

    • Auto: Panther detects the format of each message automatically.

    • Lines: each line of the message body is a separate event.

    • JSON: the message body contains JSON object events.

    • JSON Array: events arrive as elements of a JSON array.

  8. Click Continue.

  9. On the Configuration page, follow the steps to create the required infrastructure components.
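The stream type selected in step 7 describes how events are packed into the body of each Pub/Sub message. As a rough illustration only (the topic name is hypothetical and the payloads are made up), the following gcloud commands publish messages suited to the JSON and Lines stream types, respectively:

  # Hypothetical topic name; substitute your own.
  TOPIC="panther-logs-topic"

  # A message whose body is a single JSON object -- matches the JSON stream type.
  gcloud pubsub topics publish "$TOPIC" \
    --message='{"time":"2024-01-01T00:00:00Z","action":"login","user":"alice"}'

  # A message whose body is newline-delimited text -- matches the Lines stream type.
  # ($'...' is bash ANSI-C quoting, used here to embed the newline.)
  gcloud pubsub topics publish "$TOPIC" \
    --message=$'host sshd[101]: Accepted publickey for alice\nhost sshd[102]: session opened for user alice'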

Step 2: Create the required infrastructure in your Google Cloud project

Before creating GCP infrastructure, you'll need to decide how Panther should authenticate: with Google Cloud Workload Identity Federation or with a service account.

Workload Identity Federation authentication is in open beta starting with Panther version 1.112, and is available to all customers. Please share any bug reports and feature requests with your Panther support team.

If you'd like Panther to authenticate using a Google Cloud service account, follow the instructions below.

To create GCP infrastructure using Terraform (authenticating with a service account):

  1. On the Infrastructure & Credentials page, click the Service Account tab.

  2. Click Terraform Template to download the Terraform template.

  3. Fill out the fields in the panther.tfvars file with your configuration.

    • Set authentication_method to "service_account".

  4. Initialize the working directory containing the Terraform configuration files by running terraform init.

  5. Copy the corresponding Terraform Command provided and run it in your CLI.

  6. Generate a JSON key file by copying the gcloud Command provided, replacing the placeholder with your service account email address, and running it in your CLI (a sketch of steps 4 through 6 follows this list).

    • You can find the service account email in the output of the Terraform Command.
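Steps 4 through 6 roughly amount to the following shell session. This is a sketch, not the exact commands shown in the Console: the key file name and the service account email below are placeholders, and the Console's Terraform Command may differ from the generic apply shown here.

  # Step 4: initialize the working directory that contains the downloaded template.
  terraform init

  # Step 5: apply the template with your filled-out variables file.
  # (The Panther Console provides the exact command; this generic form is an assumption.)
  terraform apply -var-file=panther.tfvars

  # Step 6: generate a JSON key for the service account created by the template.
  # Replace the email below with the service account email from the Terraform output.
  gcloud iam service-accounts keys create panther-pubsub-key.json \
    --iam-account=panther-pubsub@your-project.iam.gserviceaccount.com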

Step 3: Finish the source setup in Panther

If you are using a Google Cloud service account to authenticate:

  1. Under Provide pulling configuration & JSON Keyfile, upload your JSON key file.

  2. Enter your Pub/Sub Subscription ID, found in the Subscriptions section of the Google Cloud console (or listed with the gcloud command shown after this list).

  3. Click Setup. You will be directed to a success screen:

    The success screen reads, "Everything looks good! Panther will now automatically pull & process logs from your account."
    • You can optionally enable one or more Detection Packs.

    • If you have not done so already, click Attach or Infer Schemas to attach one or more schemas to the source.

    • The Trigger an alert when no events are processed setting defaults to YES. We recommend leaving this enabled, as you will be alerted if data stops flowing from the log source after a certain period of time. The timeframe is configurable, with a default of 24 hours.

      The "Trigger an alert when no events are processed" toggle is set to YES. The "How long should Panther wait before it sends you an alert that no events have been processed" setting is set to 1 Day

Viewing ingested logs

After your log source is configured, you can search ingested data using Search or Data Explorer.
