Snowflake Audit Logs (Beta)

Panther supports pulling Audit Logs directly from Snowflake's ACCOUNT_USAGE schema


Overview

This feature is in open beta starting with Panther version 1.111, and is available to all customers. Please share any bug reports and feature requests with your Panther support team.

Panther can fetch audit information by querying the views in the ACCOUNT_USAGE schema in the SNOWFLAKE database.

You can use this integration to monitor any Snowflake instance. However, to monitor your Panther-connected Snowflake instance, it's recommended to instead use Scheduled Searches (see Scheduled Search Examples).

Databases in any Snowflake cloud or region may be monitored, but these factors could affect generated cost.

The available views include:

  • ACCESS_HISTORY

  • DATA_TRANSFER_HISTORY

  • LOGIN_HISTORY

  • QUERY_HISTORY

  • SESSIONS

The ACCESS_HISTORY view requires the Enterprise Edition of Snowflake or higher.
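
If you'd like to preview what Panther will ingest before onboarding, you can query any of these views directly in Snowsight. A minimal sketch, assuming a role with access to the SNOWFLAKE database (for example, one granted IMPORTED PRIVILEGES on it) and a running warehouse:

-- Preview recent login events from the ACCOUNT_USAGE schema.
-- Results lag real time; see the Latency section below.
SELECT event_timestamp,
       user_name,
       client_ip,
       reported_client_type,
       is_success
FROM snowflake.account_usage.login_history
WHERE event_timestamp > DATEADD(day, -1, CURRENT_TIMESTAMP())
ORDER BY event_timestamp DESC
LIMIT 100;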

Latency

Total data latency is a combination of Snowflake and Panther latency:

  • Latency varies for each of the available Snowflake views, and can, in certain cases, be as high as three hours. To verify latency for each view, consult the Latency column of the ACCOUNT_USAGE views table in this Snowflake documentation.

  • Panther adds at least one hour of latency.

Cost considerations

Snowflake compute costs incurred by using this integration are affected by various factors, including:

  • The warehouse you select for Panther to use

    • Panther must execute queries to pull data, thus it needs to use an active warehouse. Learn more in Understanding overall cost in Snowflake's documentation.

    • You can minimize costs by: selecting a warehouse that is already running.

  • The data refresh interval

    • When setting up the log source in Panther, you will choose how often you'd like to pull data from Snowflake. This can be as frequent as every one minute, up to as long as every 24 hours. You should set this interval based on your desired latency-to-cost balance.

    • You can minimize costs by: choosing a longer refresh interval.

  • Whether the cloud and region of the Snowflake instance you're monitoring is the same as your Panther Snowflake instance

    • You can minimize costs by: monitoring an instance in the same cloud and region as your Panther Snowflake instance.
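
To keep an eye on what the integration is actually costing you, you can periodically check the credits consumed by the warehouse you assigned to Panther. A hedged sketch using Snowflake's standard WAREHOUSE_METERING_HISTORY view (PANTHER_WH is a hypothetical warehouse name; substitute your own):

-- Daily credit consumption for the integration's warehouse over 30 days.
SELECT TO_DATE(start_time)              AS usage_day,
       SUM(credits_used)                AS total_credits,
       SUM(credits_used_compute)        AS compute_credits,
       SUM(credits_used_cloud_services) AS cloud_services_credits
FROM snowflake.account_usage.warehouse_metering_history
WHERE warehouse_name = 'PANTHER_WH'  -- hypothetical; use your warehouse
  AND start_time > DATEADD(day, -30, CURRENT_TIMESTAMP())
GROUP BY 1
ORDER BY 1;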

How to onboard Snowflake Audit Logs to Panther

Prerequisites

To configure this integration, you must:

  • Have your Snowflake account identifier. It should be formatted with a hyphen (not a period), like: <org_name>-<account_name>

  • Have a Snowflake warehouse Panther can use to execute queries to pull data

  • In Snowflake, have CREATE USER, CREATE ROLE, and GRANT USAGE permissions

    • This is only required if you will be creating a service user in Snowflake for Panther to use. If you already have a service user Panther can use, you do not need to have these permissions.

Step 1: Create a worksheet in Snowsight

This step is only required if you need to create a service user in Snowflake that Panther can use to pull data. If you already have a service user Panther can use, skip this step.

In Snowsight, create a worksheet with the CREATE USER, CREATE ROLE, and GRANT USAGE permissions.

Step 2: Create a new Snowflake log source in Panther

  1. In the left-hand navigation bar of your Panther Console, click Configure > Log Sources.

  2. Click Create New.

  3. Search for “Snowflake Audit Logs,” then click its tile.

  4. On the slide-out panel, click Start Setup.

  5. On the Configure page, enter values for the following fields:

    • Name: Enter a descriptive name for the source, e.g. Snowflake Prod.

    • Account Identifier: Enter your Snowflake account identifier in the format <org_name>-<account_name>.

      • Use a hyphen, not a period, between the org and account names.

    • Warehouse: Enter the Snowflake warehouse Panther will use to execute queries to pull data.

    • Run Every: Use the Number and Period fields to choose the interval on which you'd like Panther to pull data from Snowflake. See Cost considerations to learn about how the interval can affect compute costs.

    • Monitored Log Types: Select the Snowflake views you'd like Panther to fetch.

  6. Click Setup.

  7. On the Set Credentials page, fill in the form fields:

    • Username: The username of the Snowflake user Panther will use to pull data. The default value is PANTHER_AUDIT_VIEW_USER, but you may customize this.

      • If you already have a service user for Panther to use (and don't need to create a new one), enter its username here.

    • Role: The name of the role possessed by the Snowflake user that Panther will use to pull data. The default value is PANTHER_AUDIT_VIEW_ROLE, but you may customize this.

      • If you already have a service role for Panther to use (and don't need to create a new one), enter its name here.

    • If you already have a service user for Panther to use (and don't need to create a new one), click I want to use my own RSA key, then upload your RSA key file.

  8. Click Next.

  9. If you did not upload your own RSA key, create a service user for Panther to use with the generated SQL snippet (a representative sketch of what such a snippet contains appears after these steps):

    1. Copy the generated SQL snippet.

    2. Run the SQL snippet in a Snowsight worksheet.

    3. Click Setup.

  10. If everything is correct, you will be directed to a success screen:

    • The Trigger an alert when no events are processed setting defaults to YES. We recommend leaving this enabled, as you will be alerted if data stops flowing from the log source after a certain period of time. The timeframe is configurable, with a default of 24 hours.

    • You can optionally enable one or more Detection Packs.
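
For reference, the following is a representative sketch of what the SQL snippet generated in step 9 typically contains. It is illustrative only: the Panther Console generates the exact statements for you. PANTHER_WH is a hypothetical warehouse name, and the user and role names assume the defaults shown on the Set Credentials page:

-- Create the role Panther will assume and grant it read access to the
-- SNOWFLAKE database plus usage on the query warehouse.
USE ROLE ACCOUNTADMIN;

CREATE ROLE IF NOT EXISTS PANTHER_AUDIT_VIEW_ROLE;
GRANT IMPORTED PRIVILEGES ON DATABASE SNOWFLAKE TO ROLE PANTHER_AUDIT_VIEW_ROLE;
GRANT USAGE ON WAREHOUSE PANTHER_WH TO ROLE PANTHER_AUDIT_VIEW_ROLE;  -- hypothetical warehouse

-- Create the key-pair-authenticated service user and attach the role.
CREATE USER IF NOT EXISTS PANTHER_AUDIT_VIEW_USER
  DEFAULT_ROLE = PANTHER_AUDIT_VIEW_ROLE
  DEFAULT_WAREHOUSE = PANTHER_WH
  RSA_PUBLIC_KEY = '<public_key>';  -- key material comes from the generated snippet

GRANT ROLE PANTHER_AUDIT_VIEW_ROLE TO USER PANTHER_AUDIT_VIEW_USER;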

Supported log types

Snowflake.AccessHistory

schema: Snowflake.AccessHistory
description: Snowflake access history log
fields:
    - name: BASE_OBJECTS_ACCESSED
      required: true
      description: List of base objects accessed during the query
      type: array
      element:
        type: json
    - name: DIRECT_OBJECTS_ACCESSED
      description: List of direct objects accessed during the query
      type: array
      element:
        type: json
    - name: OBJECTS_MODIFIED
      description: List of objects modified during the query
      type: array
      element:
        type: json
    - name: POLICIES_REFERENCED
      description: List of policies referenced during the query
      type: array
      element:
        type: json
    - name: OBJECT_MODIFIED_BY_DDL
      description: Object modified by DDL during the query
      type: json
    - name: QUERY_ID
      description: Unique identifier for the query
      type: string
    - name: QUERY_START_TIME
      required: true
      description: The start time of the query
      type: timestamp
      timeFormats:
        - '%Y-%m-%d %H:%M:%S.%f %z'
      isEventTime: true
    - name: USER_NAME
      description: Name of the user who executed the query
      type: string
      indicators:
        - username
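
Because the objects-accessed fields are JSON arrays, hunting over this log type usually involves flattening them. A minimal sketch run directly against Snowflake (PROD_DB.PII.CUSTOMERS is a hypothetical table name):

-- Find queries in the last 7 days that read a specific sensitive table.
SELECT ah.query_start_time,
       ah.user_name,
       ah.query_id,
       obj.value:"objectName"::string AS object_name
FROM snowflake.account_usage.access_history AS ah,
     LATERAL FLATTEN(input => ah.base_objects_accessed) AS obj
WHERE obj.value:"objectName"::string = 'PROD_DB.PII.CUSTOMERS'  -- hypothetical table
  AND ah.query_start_time > DATEADD(day, -7, CURRENT_TIMESTAMP())
ORDER BY ah.query_start_time DESC;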

Snowflake.DataTransferHistory

schema: Snowflake.DataTransferHistory
description: Snowflake History Of Data Transfers
fields:
    - name: ORGANIZATION_NAME
      required: true
      type: string
    - name: ACCOUNT_NAME
      required: true
      type: string
    - name: ACCOUNT_LOCATOR
      required: true
      type: string
    - name: REGION
      type: string
    - name: USAGE_DATE
      required: true
      type: timestamp
      timeFormats:
        - '%Y-%m-%d %H:%M:%S.%f %z'
      isEventTime: true
    - name: SOURCE_CLOUD
      type: string
    - name: SOURCE_REGION
      type: string
    - name: TARGET_CLOUD
      type: string
    - name: TARGET_REGION
      type: string
    - name: BYTES_TRANSFERRED
      required: true
      type: bigint
    - name: TRANSFER_TYPE
      type: string
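
The field list above (note ORGANIZATION_NAME and USAGE_DATE) mirrors Snowflake's organization-level DATA_TRANSFER_HISTORY view. As a sketch, you might summarize cross-cloud egress by route to spot unexpected transfers; adjust the view path to whichever variant your account exposes:

-- Total bytes transferred per cloud/region route over the last 30 days.
SELECT source_cloud, source_region,
       target_cloud, target_region,
       transfer_type,
       SUM(bytes_transferred) AS total_bytes
FROM snowflake.organization_usage.data_transfer_history
WHERE usage_date > DATEADD(day, -30, CURRENT_DATE())
GROUP BY 1, 2, 3, 4, 5
ORDER BY total_bytes DESC;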

Snowflake.LoginHistory

schema: Snowflake.LoginHistory
description: Snowflake login history log
fields:
    - name: CLIENT_IP
      description: IP address of the client initiating the login
      type: string
      indicators:
        - ip
    - name: EVENT_ID
      required: true
      description: Unique identifier for the event
      type: string
    - name: EVENT_TIMESTAMP
      required: true
      description: Timestamp of the event
      type: timestamp
      timeFormats:
        - '%Y-%m-%d %H:%M:%S.%f %z'
      isEventTime: true
    - name: EVENT_TYPE
      description: Type of the event (e.g., LOGIN, LOGOUT)
      type: string
    - name: FIRST_AUTHENTICATION_FACTOR
      description: The first authentication factor used
      type: string
    - name: IS_SUCCESS
      description: Indicates if the event was successful (YES/NO)
      type: string
    - name: RELATED_EVENT_ID
      description: Identifier for a related event, if any
      type: string
    - name: REPORTED_CLIENT_TYPE
      description: Type of the client reported
      type: string
    - name: REPORTED_CLIENT_VERSION
      description: Version of the client reported
      type: string
    - name: USER_NAME
      description: Name of the user involved in the event
      type: string
      indicators:
        - username
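
As an example of a detection-style query over this log type, one could aggregate failed logins by user and source IP; the threshold below is an arbitrary placeholder:

-- Repeated failed logins in the last 24 hours (possible brute force).
SELECT user_name,
       client_ip,
       COUNT(*) AS failures
FROM snowflake.account_usage.login_history
WHERE is_success = 'NO'
  AND event_timestamp > DATEADD(hour, -24, CURRENT_TIMESTAMP())
GROUP BY user_name, client_ip
HAVING COUNT(*) >= 5  -- arbitrary threshold; tune to your environment
ORDER BY failures DESC;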

Snowflake.QueryHistory

schema: Snowflake.QueryHistory
description: Snowflake query history log
fields:
    - name: BYTES_DELETED
      description: Number of bytes deleted
      type: bigint
    - name: BYTES_READ_FROM_RESULT
      description: Number of bytes read from the result
      type: bigint
    - name: BYTES_SCANNED
      description: Number of bytes scanned
      type: bigint
    - name: BYTES_SENT_OVER_THE_NETWORK
      description: Number of bytes sent over the network
      type: bigint
    - name: BYTES_SPILLED_TO_LOCAL_STORAGE
      description: Number of bytes spilled to local storage
      type: bigint
    - name: BYTES_SPILLED_TO_REMOTE_STORAGE
      description: Number of bytes spilled to remote storage
      type: bigint
    - name: BYTES_WRITTEN
      description: Number of bytes written
      type: bigint
    - name: BYTES_WRITTEN_TO_RESULT
      description: Number of bytes written to the result
      type: bigint
    - name: CHILD_QUERIES_WAIT_TIME
      description: Wait time for child queries
      type: int
    - name: CLUSTER_NUMBER
      description: Number of the cluster
      type: int
    - name: COMPILATION_TIME
      description: Time taken for query compilation
      type: int
    - name: CREDITS_USED_CLOUD_SERVICES
      description: Credits used for cloud services
      type: float
    - name: DATABASE_ID
      description: Database identifier
      type: string
    - name: DATABASE_NAME
      description: Name of the database
      type: string
    - name: END_TIME
      description: The end time of the query
      type: timestamp
      timeFormats:
        - '%Y-%m-%d %H:%M:%S.%f %z'
    - name: EXECUTION_STATUS
      description: Status of query execution
      type: string
    - name: EXECUTION_TIME
      description: Time taken for query execution
      type: int
    - name: EXTERNAL_FUNCTION_TOTAL_INVOCATIONS
      description: Total invocations of external functions
      type: int
    - name: EXTERNAL_FUNCTION_TOTAL_RECEIVED_BYTES
      description: Total bytes received by external functions
      type: int
    - name: EXTERNAL_FUNCTION_TOTAL_RECEIVED_ROWS
      description: Total rows received by external functions
      type: int
    - name: EXTERNAL_FUNCTION_TOTAL_SENT_BYTES
      description: Total bytes sent by external functions
      type: int
    - name: EXTERNAL_FUNCTION_TOTAL_SENT_ROWS
      description: Total rows sent by external functions
      type: int
    - name: INBOUND_DATA_TRANSFER_BYTES
      description: Inbound data transfer in bytes
      type: int
    - name: IS_CLIENT_GENERATED_STATEMENT
      description: Whether the statement was generated by a client
      type: boolean
    - name: LIST_EXTERNAL_FILES_TIME
      description: Time taken to list external files
      type: int
    - name: OUTBOUND_DATA_TRANSFER_BYTES
      description: Outbound data transfer in bytes
      type: int
    - name: PARTITIONS_SCANNED
      description: Number of partitions scanned
      type: int
    - name: PARTITIONS_TOTAL
      description: Total number of partitions
      type: int
    - name: PERCENTAGE_SCANNED_FROM_CACHE
      description: Percentage of data scanned from cache
      type: float
    - name: QUERY_ACCELERATION_BYTES_SCANNED
      description: Bytes scanned for query acceleration
      type: int
    - name: QUERY_ACCELERATION_PARTITIONS_SCANNED
      description: Partitions scanned for query acceleration
      type: int
    - name: QUERY_ACCELERATION_UPPER_LIMIT_SCALE_FACTOR
      description: Upper limit scale factor for query acceleration
      type: int
    - name: QUERY_HASH
      description: Hash of the query string
      type: string
    - name: QUERY_HASH_VERSION
      description: Hash version
      type: string
    - name: QUERY_ID
      required: true
      description: Unique identifier for the query
      type: string
    - name: QUERY_LOAD_PERCENT
      description: Load percentage during the query
      type: float
    - name: QUERY_PARAMETERIZED_HASH
      description: Hash of the parameterized query
      type: string
    - name: QUERY_PARAMETERIZED_HASH_VERSION
      description: Hash version of the parameterized query
      type: string
    - name: QUERY_TAG
      description: Tag associated with the query
      type: string
    - name: QUERY_TEXT
      description: Text of the query
      type: string
    - name: QUERY_TYPE
      description: Type of the query
      type: string
    - name: QUEUED_OVERLOAD_TIME
      description: Time spent in queue due to overload
      type: int
    - name: QUEUED_PROVISIONING_TIME
      description: Time spent in queue for provisioning
      type: int
    - name: QUEUED_REPAIR_TIME
      description: Time spent in queue for repair
      type: int
    - name: RELEASE_VERSION
      description: Version of the release
      type: string
    - name: ROLE_NAME
      description: Name of the role
      type: string
    - name: ROLE_TYPE
      description: Type of the role
      type: string
    - name: ROWS_DELETED
      description: Number of rows deleted
      type: int
    - name: ROWS_INSERTED
      description: Number of rows inserted
      type: int
    - name: ROWS_UNLOADED
      description: Number of rows unloaded
      type: int
    - name: ROWS_UPDATED
      description: Number of rows updated
      type: int
    - name: ROWS_WRITTEN_TO_RESULT
      description: Number of rows written to the result
      type: int
    - name: SCHEMA_ID
      description: Identifier for the schema
      type: string
    - name: SCHEMA_NAME
      description: Name of the schema
      type: string
    - name: SECONDARY_ROLE_STATS
      description: Secondary role stats
      type: string
    - name: SESSION_ID
      description: Identifier for the session
      type: string
    - name: START_TIME
      required: true
      description: The start time of the query
      type: timestamp
      timeFormats:
        - '%Y-%m-%d %H:%M:%S.%f %z'
      isEventTime: true
    - name: TOTAL_ELAPSED_TIME
      description: Total elapsed time for the query
      type: int
    - name: TRANSACTION_BLOCKED_TIME
      description: Time the transaction was blocked
      type: int
    - name: TRANSACTION_ID
      description: Identifier for the transaction
      type: string
    - name: USER_NAME
      description: Name of the user
      type: string
      indicators:
        - username
    - name: WAREHOUSE_ID
      description: Identifier for the warehouse
      type: string
    - name: WAREHOUSE_NAME
      description: Name of the warehouse
      type: string
    - name: WAREHOUSE_SIZE
      description: Size of the warehouse
      type: string
    - name: WAREHOUSE_TYPE
      description: Type of the warehouse
      type: string
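
One hunting pattern this log type supports is spotting large data unloads, since ROWS_UNLOADED is populated for unload queries. A sketch; the row threshold is an arbitrary placeholder:

-- Queries that unloaded an unusually large number of rows in 24 hours.
SELECT start_time,
       user_name,
       warehouse_name,
       query_type,
       rows_unloaded,
       query_text
FROM snowflake.account_usage.query_history
WHERE rows_unloaded > 100000  -- arbitrary; tune to your baseline
  AND start_time > DATEADD(hour, -24, CURRENT_TIMESTAMP())
ORDER BY rows_unloaded DESC;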

Snowflake.Sessions

schema: Snowflake.Sessions
description: Snowflake session history log
fields:
    - name: AUTHENTICATION_METHOD
      description: Method used for authentication
      type: string
    - name: CLIENT_APPLICATION_ID
      description: ID of the client application
      type: string
    - name: CLIENT_APPLICATION_VERSION
      description: Version of the client application
      type: string
    - name: CLIENT_BUILD_ID
      description: Build ID of the client application
      type: string
    - name: CLIENT_ENVIRONMENT
      description: Environment information of the client application (e.g., OS, version)
      type: json
      isEmbeddedJSON: true
    - name: CLIENT_VERSION
      description: Version of the client
      type: string
    - name: CLOSED_REASON
      description: Reason why the session was closed
      type: string
    - name: CREATED_ON
      required: true
      description: Timestamp when the session was created
      type: timestamp
      timeFormats:
        - '%Y-%m-%d %H:%M:%S.%f %z'
      isEventTime: true
    - name: LOGIN_EVENT_ID
      description: Unique identifier for the login event
      type: string
    - name: SESSION_ID
      required: true
      description: Unique identifier for the session
      type: string
    - name: USER_NAME
      description: Name of the user
      type: string
