Google Cloud Pub/Sub Source

Onboarding Google Cloud Pub/Sub as a Data Transport log source in the Panther Console

Overview

Google Cloud Pub/Sub as a direct log source is in open beta as of version 1.45. Please share any bug reports and feature requests with your account team.
With Google Cloud Pub/Sub as a log source, Panther can pull log data directly from Pub/Sub topics.

How to connect Cloud Pub/Sub as a Data Transport log source in Panther

  1. Log in to your Panther Console.
  2. In the left sidebar menu, click Configure > Log Sources.
  3. In the upper-right corner, click Create New.
  4. Click the Custom Log Formats tile.
  5. Click Google Cloud Pub/Sub.
  6. On the "Configure your source" page, fill in the fields:
    • Name: Enter a descriptive name for the log source.
    • Log Types: Select the Log Types Panther should use to parse your logs.
      • Note: At least one Log Type must be selected from the dropdown menu.
  7. Click Continue Setup.
  8. On the "Infrastructure & Credentials" page, follow the steps to create the required infrastructure components. You can create them with the provided Terraform template, or follow the alternative documentation below to create them manually.
    1. Create the required infrastructure in your GCP project.
      1. Download the Terraform template.
      2. Fill out the fields in the production.tfvars file with your configuration.
      3. Initialize the directory with terraform init
      4. Run terraform apply -var-file="production.tfvars" to create the resources.
        • Note: If the topic you want to pull from already exists in your infrastructure, you can remove the relevant parts from the Terraform template.
      5. Generate a JSON key file for the newly created service account: gcloud iam service-accounts keys create keyfile.json --iam-account=[email protected]
        • You can find the key file in the output of the Terraform run.
    2. Provide the configuration details and the JSON key file to Panther.
      • Drag and drop or upload the JSON key file into the correct field in Step 2.
      • Paste in your Pub/Sub Subscription ID, found in the Subscriptions section of your Google Cloud account.
  9. Click Continue Setup.
    • The page will display a success message that says "Everything looks good!"
  10. To finish the source setup:
    1. Optionally configure a log drop-off alarm.
      • Before you finish the setup, we recommend creating a log drop-off alarm to alert you if data stops flowing from the log source. Be sure to set an appropriate time interval after which Panther should alert you that the log source has stopped sending data.
    2. Optionally enable a Detection Pack.
    3. Click Finish Setup.
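As a sketch, a filled-in production.tfvars for step 8 above might look like the following. Every value here is a hypothetical placeholder, and the actual variable names are defined by the template you downloaded, so check them against your copy:

```hcl
# Hypothetical production.tfvars -- variable names depend on the downloaded template.
project_id           = "my-gcp-project"   # GCP project to create the resources in
topic_name           = "panther-logs"     # Pub/Sub topic Panther pulls from
subscription_name    = "panther-logs-sub" # subscription dedicated to Panther
service_account_name = "panther-pubsub"   # service account Panther authenticates as
```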

Alternative to Terraform template: Configuring your integration manually in Google Cloud Platform (GCP)

If you choose to create the infrastructure components manually rather than using a Terraform template during the Pub/Sub setup above, follow the instructions below.
  1. Log in to your Google Cloud console.
  2. Create a topic for the data.
    • You can create a topic using the gcloud CLI tool with the following command format: gcloud pubsub topics create $TOPIC_ID
  3. Create a subscription to be used with the topic you created. Note: This subscription should not be used by any service other than Panther.
    • You can create a subscription using the gcloud CLI tool with the following command format: gcloud pubsub subscriptions create $SUBSCRIPTION_ID --topic $TOPIC_ID --topic-project $PROJECT_ID
  4. Create a new Google Cloud service account. Make sure to take note of the service account's email address, as Panther will use it to access the infrastructure created for this Pub/Sub integration.
    • To create the account using the gcloud CLI tool, use the following command format: gcloud iam service-accounts create $SERVICE_ACCOUNT_ID --description="$DESCRIPTION" --display-name="$DISPLAY_NAME"
  5. Assign the required IAM roles to the service account.
    • The following roles are required in the project where the Pub/Sub subscription and topic live:

      Permission required            Role                       Scope
      pubsub.subscriptions.consume   roles/pubsub.subscriber    subscription-name
      pubsub.subscriptions.get       roles/pubsub.viewer        subscription-name
      monitoring.timeSeries.list     roles/monitoring.viewer    project

      • Note: You can set conditions or IAM policies on permissions for specific resources, either on the IAM page of the service account or on the specific resource's page. The "monitoring" permission is not mandatory, but is recommended for improved autoscaling.
      • Alternatively, you can create the permissions using the gcloud CLI tool:
        • gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/pubsub.subscriber"
        • gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/pubsub.viewer"
        • gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/monitoring.viewer"
  6. Generate a JSON key file for the service account; Panther will use it to authenticate to the GCP infrastructure.
    • To create a JSON key file using the gcloud CLI tool, use the following command format: gcloud iam service-accounts keys create $KEYFILE_PATH --iam-account=$SERVICE_ACCOUNT_EMAIL
    • Alternatively, you can download the key file from the Cloud Console:
      1. Open the GCP terminal ("Activate Cloud Shell").
      2. Click the three-dots icon menu in the top right, then click Download.
      3. Click the folder icon to Browse.
      4. Navigate to the key file and select it, then click Download.
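The manual steps above can be sketched end-to-end as a single script. This is a dry run: every gcloud command is printed with a leading echo so you can review it first, and all names (project, topic, subscription, service account) are hypothetical placeholders.

```shell
# Dry run of the manual setup: each command is printed, not executed.
# Remove the leading "echo" on each command to apply it for real.
# All names below are hypothetical placeholders.
PROJECT_ID="my-gcp-project"
TOPIC_ID="panther-logs"
SUBSCRIPTION_ID="panther-logs-sub"   # dedicate this subscription to Panther
SERVICE_ACCOUNT_ID="panther-pubsub"
SERVICE_ACCOUNT_EMAIL="${SERVICE_ACCOUNT_ID}@${PROJECT_ID}.iam.gserviceaccount.com"
KEYFILE_PATH="keyfile.json"

# Step 2: create the topic
echo gcloud pubsub topics create "$TOPIC_ID" --project "$PROJECT_ID"

# Step 3: create the Panther-only subscription
echo gcloud pubsub subscriptions create "$SUBSCRIPTION_ID" \
  --topic "$TOPIC_ID" --topic-project "$PROJECT_ID"

# Step 4: create the service account
echo gcloud iam service-accounts create "$SERVICE_ACCOUNT_ID" \
  --description="Panther Pub/Sub log puller" --display-name="Panther Pub/Sub"

# Step 5: role bindings (roles/monitoring.viewer is optional but recommended)
for role in roles/pubsub.subscriber roles/pubsub.viewer roles/monitoring.viewer; do
  echo gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${SERVICE_ACCOUNT_EMAIL}" \
    --role="$role"
done

# Step 6: generate the JSON key file Panther will use
echo gcloud iam service-accounts keys create "$KEYFILE_PATH" \
  --iam-account="$SERVICE_ACCOUNT_EMAIL"
```

Keeping the commands in a script like this also documents the integration, so the same resources can be recreated or audited later.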

View collected logs

After Pub/Sub log sources are fully configured, you can search your data in Data Explorer. For more information and for example queries, please see the documentation on Data Explorer.
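If data does not appear, one quick sanity check (a sketch; the subscription name is a hypothetical placeholder) is to confirm that the subscription Panther uses actually exists and points at the expected topic:

```shell
# Dry run: print a describe command for the Panther subscription.
# SUBSCRIPTION_ID is a hypothetical placeholder; drop the "echo" to run it.
# The real command's output includes the backing topic name.
SUBSCRIPTION_ID="panther-logs-sub"
echo gcloud pubsub subscriptions describe "$SUBSCRIPTION_ID"
```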