Pub/Sub Source

Onboarding Google Cloud Pub/Sub as a Data Transport log source in the Panther Console

Overview

Panther supports configuring Google Cloud Pub/Sub as a Data Transport to pull log data directly from Pub/Sub topics.

Panther pulls from a dedicated subscription and authenticates by way of a Service Account.

How to set up a Cloud Pub/Sub log source in Panther

Step 1: Configure the Pub/Sub source in Panther

  1. In the left-hand navigation bar of your Panther Console, click Configure > Log Sources.

  2. In the upper-right corner, click Create New.

  3. Click the Custom Log Formats tile.

  4. On the Google Cloud Pub/Sub tile, click Start.

  5. On the Basic Info page, fill in the fields:

    • Name: Enter a descriptive name for the log source.

    • Log Types: Select the log types Panther should use to parse your logs.

  6. Click Setup.

  7. On the Log Format page, select the stream type of the incoming logs:

    • Auto

    • Lines

    • JSON

    • JSON Array

  8. Click Continue.

  9. On the Configuration page, follow the steps to create the required infrastructure components.

Step 2: Create the required infrastructure in your GCP cloud

  1. On the "Infrastructure & Credentials" page, click Terraform Template to download the template.

  2. Fill out the fields in the panther.tfvars file with your configuration:

    • project_id: The ID of your GCP project.

    • topic_name: The name of the topic to create. Your log data must be published to this topic.

    • subscription_id: Name for the subscription Panther will use to consume from the topic.

    • gcp_region: The GCP region where the infrastructure will be created.

    • panther_service_account_id: The ID of the service account that Panther will use.

    • panther_service_account_display_name: The display name of the service account that Panther will use.
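As an illustration, a filled-in panther.tfvars might look like the following. Every value below is a placeholder; substitute your own names and region.

```hcl
# Example panther.tfvars -- all values are placeholders.
project_id                           = "my-gcp-project"
topic_name                           = "panther-logs-topic"
subscription_id                      = "panther-logs-subscription"
gcp_region                           = "us-west1"
panther_service_account_id           = "panther-pubsub-sa"
panther_service_account_display_name = "Panther Pub/Sub Service Account"
```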

  3. Initialize the working directory by running terraform init.

  4. Run terraform apply -var-file="panther.tfvars" to create the resources.

    • Note: If the topic you want to pull from already exists in your infrastructure, you can remove the corresponding resources from the Terraform template.

  5. Generate a JSON key file for the newly created service account: gcloud iam service-accounts keys create keyfile.json --iam-account=$SERVICE_ACCOUNT_EMAIL

    • You can find the service account email in the output of the Terraform run.

Step 3: Finish the source setup in Panther

  1. Navigate back to the Panther Console. Provide the configuration details and the JSON Keyfile in Panther.

    • Drag and drop or upload the JSON key into the correct field in Step 2.

    • Paste in your Pub/Sub Subscription ID, found in the Subscriptions section of your Google Cloud account.

  2. Click Setup. You will be directed to a success screen:

    • You can optionally enable one or more Detection Packs.

    • If you have not done so already, click Attach or Infer Schemas to attach one or more schemas to the source.

    • The Trigger an alert when no events are processed setting defaults to YES. We recommend leaving this enabled, as you will be alerted if data stops flowing from the log source after a certain period of time. The timeframe is configurable, with a default of 24 hours.
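Once the source reports healthy, you can optionally verify the end-to-end flow by publishing a test message to the topic. The sketch below uses a DRY_RUN guard so it only prints the gcloud command it would run; TOPIC_ID is a placeholder for your own topic name.

```shell
# Sketch: publish a sample message to the Pub/Sub topic so Panther can
# ingest it. DRY_RUN=1 prints the command instead of executing it.
DRY_RUN=1
TOPIC_ID="panther-logs-topic"   # placeholder: use your topic name

# Print the command when DRY_RUN=1, otherwise execute it.
run() {
  if [ "${DRY_RUN:-0}" = "1" ]; then echo "+ $*"; else "$@"; fi
}

run gcloud pubsub topics publish "$TOPIC_ID" \
  --message '{"message":"panther smoke test"}'
```

Set DRY_RUN=0 to actually publish; the message should appear in Panther within a few minutes if the pipeline is healthy.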

Alternative to Terraform template: Configuring your integration manually in Google Cloud Platform (GCP)

If you choose to create the infrastructure components manually rather than using a Terraform template during the Pub/Sub setup above, follow the instructions below.

  1. Log in to your Google Cloud console.

  2. Create a topic for the data.

    • You can create a topic using the gcloud CLI tool with the following command format: gcloud pubsub topics create $TOPIC_ID

  3. Create a subscription to be used with the topic you created. Note: This subscription should not be used by any other service or source.

    • You can create a subscription using the gcloud CLI tool with the following command format: gcloud pubsub subscriptions create $SUBSCRIPTION_ID --topic $TOPIC_ID --topic-project $PROJECT_ID

  4. Create a new Google Cloud service account. Make sure to take note of the account's email address, as Panther will use it to access the infrastructure created for this Pub/Sub integration.

    • To create the account using the gcloud CLI tool, use the following command format: gcloud iam service-accounts create $SERVICE_ACCOUNT_ID --description="$DESCRIPTION" --display-name="$DISPLAY_NAME"

  5. Assign the required IAM roles to the account.

    • The following permissions are required for the project where the Pub/Sub subscription and topic live:

      Permission                    Role                     Scope

      pubsub.subscriptions.consume  roles/pubsub.subscriber  subscription-name
      pubsub.subscriptions.get      roles/pubsub.viewer      subscription-name

      The following monitoring permission is optional, but is recommended for improved autoscaling:

      Permission                  Role                     Scope

      monitoring.timeSeries.list  roles/monitoring.viewer  project

      • Alternatively, you can grant the roles using the gcloud CLI tool:

        • gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/pubsub.subscriber"

        • gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/pubsub.viewer"

        • gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/monitoring.viewer"

  6. Generate a JSON key file for the service account, which will be used in Panther to authenticate to the GCP infrastructure.

    • To create a JSON key file using the gcloud CLI tool, use the following command format: gcloud iam service-accounts keys create $KEYFILE_PATH --iam-account=$SERVICE_ACCOUNT_EMAIL

    • Alternatively, you can download the key file from the Cloud Console:

      1. Open the GCP terminal ("Activate Cloud Shell")

      2. Click the three-dot menu icon in the top right, then click Download.

      3. Click the folder icon for Browse.

      4. Navigate to the key file and select it, then click Download.
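The manual steps above can be collected into a single script. This is only a sketch: every value is a placeholder, and the DRY_RUN guard makes it print each gcloud command for review instead of executing it.

```shell
# Consolidated sketch of the manual GCP setup. All values are placeholders;
# set DRY_RUN=0 to actually execute the gcloud commands.
DRY_RUN=1
PROJECT_ID="my-gcp-project"
TOPIC_ID="panther-logs-topic"
SUBSCRIPTION_ID="panther-logs-subscription"
SA_ID="panther-pubsub-sa"
SA_EMAIL="${SA_ID}@${PROJECT_ID}.iam.gserviceaccount.com"
KEYFILE_PATH="keyfile.json"

# Print the command when DRY_RUN=1, otherwise execute it.
run() {
  if [ "${DRY_RUN:-0}" = "1" ]; then echo "+ $*"; else "$@"; fi
}

# 1. Topic and the dedicated subscription Panther will consume from
run gcloud pubsub topics create "$TOPIC_ID"
run gcloud pubsub subscriptions create "$SUBSCRIPTION_ID" \
  --topic "$TOPIC_ID" --topic-project "$PROJECT_ID"

# 2. Service account for Panther
run gcloud iam service-accounts create "$SA_ID" \
  --display-name="Panther Pub/Sub Service Account"

# 3. IAM role bindings (roles/monitoring.viewer is optional)
for role in roles/pubsub.subscriber roles/pubsub.viewer roles/monitoring.viewer; do
  run gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${SA_EMAIL}" --role="$role"
done

# 4. JSON key file to upload in the Panther Console
run gcloud iam service-accounts keys create "$KEYFILE_PATH" \
  --iam-account="$SA_EMAIL"
```

Review the printed commands, then rerun with DRY_RUN=0 once they match your environment.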

Viewing ingested logs

After your log source is configured, you can search ingested data using Search or Data Explorer.
