Cloud Storage (GCS) Source

Onboarding GCS as a Data Transport log source in the Panther Console


Overview

Panther supports configuring Google Cloud Storage (GCS) as a Data Transport to pull log data directly from GCS buckets, write rules, and run queries on this processed data. Panther uses Pub/Sub to be notified of new data in your bucket that is ready to be consumed.

Panther can authenticate against your source using Google Cloud Workload Identity Federation or a service account.

Data can be sent compressed or uncompressed. Learn more about compression specifications in Ingesting compressed data in Panther.

How to set up a GCS log source in Panther

Step 1: Begin creating the GCS source in Panther

  1. In the left-hand navigation bar of your Panther Console, click Configure > Log Sources.

  2. In the upper-right corner, click Create New.

  3. Click the Custom Log Formats tile.

  4. On the Google Cloud Storage tile, click Start.

  5. On the Basic Info page, fill in the fields:

    • Name: Enter a descriptive name for the GCS log source.

    • Prefixes & Schemas: Define combinations of prefixes, schemas, and exclusion filters, according to the structure of your data storage in GCS (see the short example after this list).

      • To attach one or more schemas to all data in the bucket, leave the GCS Prefix field blank. This will create a wildcard (*) prefix.

  6. Click Setup.

  7. On the Log Format page, select the stream type of the incoming logs:

    • Auto

    • Lines

    • JSON

    • JSON Array

  8. Click Continue.
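
For illustration, a Prefixes & Schemas layout might look like the following; the prefix values and the Custom.MyAppLogs schema name are hypothetical examples:

    GCS Prefix          Schema(s)
    okta/               Okta.SystemLog
    app/                Custom.MyAppLogs
    (blank, i.e. *)     schemas applied to all data in the bucket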

Step 2: Create required Google Cloud Platform (GCP) infrastructure

Before creating GCP infrastructure, you'll need to decide:

  • The authentication method: You can use Google Cloud Workload Identity Federation or a service account—see the top-level tabs below.

  • The creation method: You can use Terraform to create the infrastructure, or create it manually in the GCP console—see the sub-tabs within each top-level tab below.

Workload Identity Federation authentication is in open beta starting with Panther version 1.112, and is available to all customers. Please share any bug reports and feature requests with your Panther support team.

If you'd like Panther to authenticate using a Google Cloud service account, on the Infrastructure & Credentials page click the Service Account tab, then follow the instructions in one of the tabs below.

To create GCP infrastructure using Terraform (authenticating with a service account):

  1. Click Terraform Template to download the Terraform template.

    • You can also find the Terraform template at this GitHub link.

  2. Fill out the fields in the panther.tfvars file with your configuration (see the sketch after this list).

    • Set authentication_method to "service_account".

  3. Initialize a working directory containing Terraform configuration files and run terraform init.

  4. Copy the corresponding Terraform Command provided and run it in your CLI.

  5. Generate a JSON key file by copying the gcloud Command provided, replacing the value for your service account email address, and running it in your CLI.

    • You can find the service account email in the output of the Terraform Command.
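
For orientation, here is a minimal sketch of how steps 2-5 can look from a terminal once the template is saved in the current directory. The key file path is a hypothetical placeholder, the apply command is only an illustrative form (the authoritative Terraform Command and gcloud Command are the ones the Panther Console provides), and the keys create format matches the manual instructions further below:

    # panther.tfvars: set at least the authentication method; the remaining
    # fields come from the downloaded template
    #   authentication_method = "service_account"

    terraform init
    terraform apply -var-file=panther.tfvars   # illustrative; use the Console-provided command

    # Generate the JSON key file for the service account created by Terraform
    gcloud iam service-accounts keys create ./panther-gcs-key.json \
        --iam-account="$SERVICE_ACCOUNT_EMAIL"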

To create the GCP infrastructure components manually (authenticating with a service account):

  1. In your Google Cloud console, determine which bucket Panther will pull logs from.

    • If you have not created a bucket yet, please see Google's documentation on creating a bucket.

  2. Create a topic for the notifications.

    • You can create a topic using the gcloud CLI tool with the following command format: gcloud pubsub topics create $TOPIC_ID

  3. Configure the bucket to send notifications for new files to the topic you created.

    • You can create a notification using the gcloud CLI tool with the following command format: gsutil notification create -t $TOPIC_NAME -e OBJECT_FINALIZE -f json gs://$BUCKET_NAME

    • Note: Panther only requires the OBJECT_FINALIZE event type.

  4. Create a subscription to be used with the topic you created. This subscription should not be used by any other service or source.

    • You can create a subscription using the gcloud CLI tool with the following command format: gcloud pubsub subscriptions create $SUBSCRIPTION_ID --topic $TOPIC_ID --topic-project $PROJECT_ID

  5. Enable the IAM API (one way to do this with gcloud is shown in the sketch after this list).

  6. Create a new Google Cloud service account. To create the account using the gcloud CLI tool, use the following command format:

     gcloud iam service-accounts create SA-NAME \
         --description="DESCRIPTION" \
         --display-name="DISPLAY_NAME"

    • Make sure to take note of the account email address, as Panther will use this to access the infrastructure created for this GCS integration.

  7. Assign the required IAM roles to the account.

    • The following permissions are required for the project where the Pub/Sub subscription and topic live:

      Permissions required                          Role                          Scope
      storage.objects.get, storage.objects.list     roles/storage.objectViewer    bucket-name
      pubsub.subscriptions.consume                  roles/pubsub.subscriber       subscription-name
      pubsub.subscriptions.get                      roles/pubsub.viewer           subscription-name
      monitoring.timeSeries.list                    roles/monitoring.viewer       project

    • Note: You can create the role bindings using the gcloud CLI tool:

      • gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/storage.objectViewer"

      • gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/pubsub.subscriber"

      • gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/pubsub.viewer"

      • gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" --role="roles/monitoring.viewer"

    • Note: You can set conditions or IAM policies on permissions for specific resources. This can be done either on the IAM page of the service account or on the specific resource's page.

  8. Generate a JSON key file for the service account, which will be used in Panther to authenticate to the GCP infrastructure.

    • To create a JSON key file using the gcloud CLI tool, run the following command format: gcloud iam service-accounts keys create $KEYFILE_PATH --iam-account=$SERVICE_ACCOUNT_EMAIL

    • Alternatively, you can run the above command in GCP's Cloud Shell instead of locally, then download the key file:

      1. Click the 3 dots icon menu in the top right, then click Download.

      2. Click the folder icon for Browse.

      3. Navigate to the key file and select it, then click Download.
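
The command formats above reference a handful of shell variables. As a convenience, here is one way to define them, with hypothetical placeholder values, along with one way to perform the Enable the IAM API step using gcloud:

    # Hypothetical placeholder values for the variables used in the commands above
    export PROJECT_ID="my-gcp-project"
    export BUCKET_NAME="my-gcs-log-bucket"
    export TOPIC_ID="panther-gcs-notifications"
    export TOPIC_NAME="$TOPIC_ID"
    export SUBSCRIPTION_ID="panther-gcs-subscription"
    export SERVICE_ACCOUNT_EMAIL="panther-gcs@${PROJECT_ID}.iam.gserviceaccount.com"
    export KEYFILE_PATH="./panther-gcs-key.json"

    # One way to enable the IAM API for the project (step 5)
    gcloud services enable iam.googleapis.com --project="$PROJECT_ID"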

If you'd like Panther to authenticate using Google Cloud Workload Identity Federation, on the Infrastructure & Credentials page click the Workload Identity Federation tab, then follow the instructions in one of the tabs below.

To create the GCP infrastructure components using Terraform (authenticating with Workload Identity Federation):

  1. Click Terraform Template to download the Terraform template.

    • You can also find the Terraform template at this GitHub link.

  2. Fill out the fields in the panther.tfvars file with your configuration (see the sketch after this list).

    • Set authentication_method to "workload_identity_federation".

    • Provide values for panther_workload_identity_pool_id, panther_workload_identity_pool_provider_id, and panther_aws_account_id.

  3. Initialize a working directory containing Terraform configuration files and run terraform init.

  4. Copy the corresponding Terraform Command provided and run it in your CLI.

  5. Generate a credential configuration file for the pool by copying the gcloud Command provided, replacing the values for the project number, pool ID, and provider ID, and running it in your CLI.

    • You can find the project number, the pool ID, and the provider ID in the output of the Terraform Command.
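
As with the service account tab, here is a minimal illustrative panther.tfvars sketch for this path. The values are placeholders, and any fields not shown come from the downloaded template:

    # panther.tfvars (illustrative placeholder values)
    authentication_method                      = "workload_identity_federation"
    panther_workload_identity_pool_id          = "panther-gcs-pool"
    panther_workload_identity_pool_provider_id = "panther-aws"
    panther_aws_account_id                     = "123456789012"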

To create the GCP infrastructure components manually (authenticating with Workload Identity Federation):

  1. In your Google Cloud console, determine which bucket Panther will pull logs from.

    • If you have not created a bucket yet, please see Google's documentation on creating a bucket.

    • Uniform bucket-level access must be enabled on the target bucket in order to grant Workload Identity Federation entities access to cloud storage resources.

  2. Create a topic for the notifications.

    • You can create a topic using the gcloud CLI tool with the following command format: gcloud pubsub topics create $TOPIC_ID

  3. Configure the bucket to send notifications for new files to the topic you created.

    • You can create a notification using the gcloud CLI tool with the following command format: gsutil notification create -t $TOPIC_NAME -e OBJECT_FINALIZE -f json gs://$BUCKET_NAME

    • Note: Panther only requires the OBJECT_FINALIZE event type.

  4. Create a subscription to be used with the topic you created. This subscription should not be used by any other service or source.

    • You can create a subscription using the gcloud CLI tool with the following command format: gcloud pubsub subscriptions create $SUBSCRIPTION_ID --topic $TOPIC_ID --topic-project $PROJECT_ID

  5. Enable the IAM API.

  6. Configure Workload Identity Federation with AWS by following the Configure Workload Identity Federation with AWS or Azure documentation.

    • As you are defining the attribute mapping(s) and condition, take note of the following examples (a worked example of the mapping follows this list):

      Example attribute mappings:

      Google               AWS
      google.subject       assertion.arn.extract('arn:aws:sts::{account_id}:')+":"+assertion.arn.extract('assumed-role/{role_and_session}').extract('/{session}')
      attribute.account    assertion.account

      The value of the google.subject attribute cannot exceed 127 characters. You may use Common Expression Language (CEL) expressions to transform or combine attributes from the token issued by AWS. The expression suggested in the table above takes this limit into account, and is an attempt at transforming the ARN into a value that uniquely identifies Panther entities. For more information on the AWS attributes, see "Example 2 - Called by user created with AssumeRole" on this AWS documentation page.

      Example attribute condition: attribute.account=="<PANTHER_AWS_ACCOUNT_ID>"

    • When you are adding a provider to your identity pool, select AWS.

  7. Assign the required IAM roles.

    • The following permissions are required for the project where the Pub/Sub subscription and topic live:

      Permissions required                          Role                          Scope
      storage.objects.get, storage.objects.list     roles/storage.objectViewer    bucket-name
      pubsub.subscriptions.consume                  roles/pubsub.subscriber       subscription-name
      pubsub.subscriptions.get                      roles/pubsub.viewer           subscription-name
      monitoring.timeSeries.list                    roles/monitoring.viewer       project

    • Note: You can create the role bindings using the gcloud CLI tool, where $PRINCIPAL_ID may be something like: principalSet://iam.googleapis.com/projects/<THE_ACTUAL_GOOGLE_PROJECT_NUMBER>/locations/global/workloadIdentityPools/<THE_ACTUAL_POOL_ID>/attribute.account/<THE_ACTUAL_PANTHER_AWS_ACCOUNT_ID>

      • gcloud projects add-iam-policy-binding $PROJECT_ID --member="$PRINCIPAL_ID" --role="roles/storage.objectViewer"

      • gcloud projects add-iam-policy-binding $PROJECT_ID --member="$PRINCIPAL_ID" --role="roles/pubsub.subscriber"

      • gcloud projects add-iam-policy-binding $PROJECT_ID --member="$PRINCIPAL_ID" --role="roles/pubsub.viewer"

      • gcloud projects add-iam-policy-binding $PROJECT_ID --member="$PRINCIPAL_ID" --role="roles/monitoring.viewer"

    • Note: You can set conditions or IAM policies on permissions for specific resources. This can be done either in the IAM section in GCP or on the specific resource's page.

  8. Download the credential configuration file, which will be used in Panther to authenticate to the GCP infrastructure.

    • To generate a credential configuration file using the gcloud CLI tool, use the following command format: gcloud iam workload-identity-pools create-cred-config projects/$PROJECT_NUMBER/locations/global/workloadIdentityPools/$POOL_ID/providers/$PROVIDER_ID --aws --output-file=config.json
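
To make the suggested google.subject mapping concrete, here is a worked example; the account ID, role, and session names are made up:

    # Suppose the AWS token Panther presents carries:
    #   assertion.arn = "arn:aws:sts::111122223333:assumed-role/PantherLogPuller/session-1"
    #
    # The suggested mapping then evaluates roughly as:
    #   assertion.arn.extract('arn:aws:sts::{account_id}:')        -> "111122223333"
    #   assertion.arn.extract('assumed-role/{role_and_session}')   -> "PantherLogPuller/session-1"
    #       .extract('/{session}')                                 -> "session-1"
    #
    # giving google.subject = "111122223333:session-1", well under the
    # 127-character limit.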

Step 3: Provide credential file and configuration values to Panther

If you are using a Google Cloud service account to authenticate:

  1. Under Provide pulling configuration & JSON Keyfile, upload your JSON key file.

  2. Enter your GCS Bucket Name and Pub/Sub Subscription ID, found in the Subscriptions section of your Google Cloud account.

    • You can optionally enable one or more Detection Packs.

  3. Click Setup. You will be directed to a success screen:

    • If you have not done so already, click Attach or Infer Schemas to attach one or more schemas to the source.

    • The Trigger an alert when no events are processed setting defaults to YES. We recommend leaving this enabled, as you will be alerted if data stops flowing from the log source after a certain period of time. The timeframe is configurable, with a default of 24 hours.

If you are using Google Cloud Workload Identity Federation to authenticate:

  1. On the Infrastructure & Credentials page, if you have not already, click the Workload Identity Federation tab.

  2. Under Provide pulling configuration & credential configuration file, upload your credential configuration file.

  3. Enter your Project ID, GCS Bucket Name, and Pub/Sub Subscription ID, found in the Subscriptions section of your Google Cloud account.

    • You can optionally enable one or more Detection Packs.

  4. Click Setup. You will be directed to a success screen:

    • If you have not done so already, click Attach or Infer Schemas to attach one or more schemas to the source.

    • The Trigger an alert when no events are processed setting defaults to YES. We recommend leaving this enabled, as you will be alerted if data stops flowing from the log source after a certain period of time. The timeframe is configurable, with a default of 24 hours.
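
If you want to sanity-check the pipeline end to end, one optional approach is to upload a small test object to the bucket and confirm Panther picks it up. The file name and contents below are placeholders; use data that matches one of the schemas you attached, otherwise the object may surface as a parsing error on the source rather than as events:

    # Uploading any new object emits an OBJECT_FINALIZE notification to the
    # topic, which Panther consumes through the subscription.
    echo '{"message": "panther gcs smoke test"}' > sample.json
    gsutil cp sample.json gs://$BUCKET_NAME/

The result should then be visible through the options described under Viewing ingested logs below.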

Viewing ingested logs

After your log source is configured, you can search ingested data using Search or Data Explorer.
