Detection Packs

Use Packs to group detections and enable updates via the Panther Console


Last updated 9 months ago


Overview

Detection Packs (also called, simply, Packs) are used to logically group Panther-managed detections, as well as to enable detection updates in the Panther Console. Packs are defined in the open-source panther-labs/panther-analysis repository.

A single Pack can group any number of Detections, Queries, Global Helpers, Data Models, and Lookup Tables. Panther Universal Detections, for example, groups all rules that rely on Data Models, along with all of their dependencies.

Detection Packs are versioned, and Panther periodically releases new versions with updates to core detection logic for the detections contained within. When an update for a Pack is available, an UPDATE AVAILABLE label will be displayed on the Pack's tile on the Pack list page in your Panther Console (under Detections > Packs).

Detections that are part of an enabled Detection Pack will be labeled as MANAGED, and detections that are not part of a Detection Pack will be labeled as UNMANAGED.

Note: Managing Detections via the Panther Console is not recommended if you are already using a Git-based workflow to manage and upload detections with the Panther Analysis Tool. Managing detections via both methods simultaneously may result in unexpected behavior.

You can use the Developer Workflow setting to disable the ability to turn on Packs in the UI.

Panther-managed Detection Packs

Panther provides several Detection Packs by default. There are Packs that group all of the Panther-managed detections related to a particular log source, as well as Detection Packs grouped around a particular focus (e.g., generic rules that leverage unified data models, or a core set of detections for AWS). Some popular examples include:

| Display Name | Description |
| --- | --- |
| Panther Universal Detections | This pack groups the standard rules that leverage unified data models |
| Panther Core AWS Pack | Group of the most critical and high-value detections pertinent to the AWS environment |
| Panther Okta Pack | Group of all Panther-created detections for Okta |

How to use Detection Packs

Viewing Detection Packs

You can view a list of Panther-provided Detection Packs in your Panther Console by clicking Detections in the navigation bar, then clicking the Packs tab.

Click on a Pack to view its details, including a description, the enabled status, the currently enabled version, and which detections are in the pack.

Enabling and disabling Detection Packs

Packs can be enabled or disabled in your Panther Console. When you enable a Pack, each of the detections contained in that Pack is enabled. You can disable one or more detections within an enabled Pack (on a one-by-one basis) without having to disable the entire Pack.

If a Pack is disabled, but detections contained within it are enabled, those detections will function normally (i.e., they will not be affected by their Pack being disabled).

When you update a Pack that has disabled detections, the detections will be updated but they will remain disabled.

To enable or disable a Pack:

  1. In the left-hand navigation bar of your Panther Console, click Detections.

  2. Click the Packs tab.

  3. Locate the Pack you want to enable or disable.

  4. On the right-hand side of that Pack's tile, toggle the Enabled slider on or off.

The Enabled slider also appears on each Pack's details page.

Update or roll back Detection Pack

New updates to Detection Packs are periodically released to the panther-analysis repository. Panther detects these updates automatically, and the Packs list page will show an UPDATE AVAILABLE label next to the relevant Packs.

To update pack detections:

  1. In the left-hand navigation bar of your Panther Console, click Detections.

  2. Click the Packs tab.

  3. Locate the Pack you want to update.

  4. Within the Pack's tile, in the Version dropdown, select a version number.

  5. Click Update Pack.

You will receive an in-console notification confirming whether the Pack update was successful.

To revert to a previous Pack version:

  1. In the left-hand navigation bar of your Panther Console, click Detections.

  2. Click the Packs tab.

  3. Locate the Pack you want to revert.

  4. Within the Pack's tile, in the Version dropdown, select a version number.

  5. Click Revert Pack.

Managing Detections with Packs

Editing Managed Detections

After a Pack has been enabled, there is a subset of fields that you can manually edit in the Panther Console:

  • Enabled / Disabled

  • Severity

  • Deduplication Period

  • Events Threshold

  • Destination Overrides

  • Runbook

Any changes made to these fields in the Panther Console will be preserved when the pack is updated or reverted to a different version. All other fields will be greyed out in the Panther Console, and the "Functions and Tests" editor will be read-only.

Note: For enabled Packs, the above fields can only be edited manually in the Panther Console. Editing these fields in the Detection .yml files will not override these values.

You can make changes to the editable fields in the Panther Console:

  1. In the left-hand navigation bar of your Panther Console, click Detections.

  2. Click the Packs tab.

  3. Click the name of the Pack that contains the detection you want to edit, then click the name of the detection.

  4. Make any desired changes to the detection.

    • Fields that are not editable will be greyed out.

  5. Click Update in the upper right corner of the page to save your changes.

Cloning and editing a managed detection

If a rule or policy included in a Detection Pack does not fit your needs, you can clone it and then customize the cloned copy:

  1. In the left-hand navigation bar of your Panther Console, click Detections.

  2. Click the Packs tab.

  3. Click the name of the Pack that contains the detection you want to clone, then click the name of the detection.

Then follow the How to clone a Panther-managed detection instructions on Using Panther-managed Detections.

Pack Sources (Legacy)

Custom Pack Sources is a legacy feature that is no longer supported for new customers. If you wish to use custom detections created and edited outside of the Panther Console, see the Panther Developer Workflows Overview to understand supported workflows.

Pack Sources provide a way to configure custom GitHub sources for Detection Packs. Once a Pack Source is configured, Panther will check the source repository for new tagged releases every 24 hours. In order for Panther to find your custom Pack(s) from your Pack Source, you must:

  • Ensure that your release is finalized, and not in a draft state

  • Ensure that your release is named according to the SemVer format, and that the tag of the release is the same as the name of the release

  • Ensure that the artifact of the release is named panther-analysis-all.zip (and a corresponding panther-analysis-all.sig if you are signing your release)

  • Ensure that your panther-analysis-all.zip contains at least one Pack manifest file (see the Pack manifests section below for more information)

You can use the panther_analysis_tool (PAT) to generate the required release assets, as well as publish a draft release (see Creating a GitHub release - Panther Analysis Tool below for additional details). You can manage custom Packs using the same functionality as Panther-provided Packs.

Pack source fields are described in the following table.

| Field Name | Required | Description | Expected Value |
| --- | --- | --- | --- |
| Owner | Yes | The owner/organization of the target repository | String |
| Repository | Yes | The name of the repository | String |
| kmsKey | No | The ARN of a sign/verify KMS key used to validate release signatures | String |
| AccessToken | No | Personal access token used to access a private repository | String |

Pack manifests

Packs are defined by creating Pack manifest YAML files, which contain metadata about your Pack (such as its name, description, and the detections/files that are included in your Pack).

Your panther-analysis-all.zip release artifact can contain many different Pack manifests along with other files from your repository such as detections, global helpers, data models, etc. If you add your GitHub repository as a Pack Source in the Panther Console, then each of these Pack manifest files will show up as a Pack in the Panther Console that can be separately enabled/disabled.

The following table is a reference of the keys that are valid in your Pack manifest. You can find additional examples of the Pack manifests that Panther uses in the Panther-provided Packs on GitHub.

| Field Name | Required | Description | Expected Value |
| --- | --- | --- | --- |
| AnalysisType | true | Indicates the analysis type of this file | pack |
| PackID | true | The unique ID of this Pack | String |
| Description | false | Extra information about this Pack that will be displayed in the Panther Console | String |
| PackDefinition | true | A mapping with a single field called IDs, which is a list of strings. Each string in the IDs list should be the unique ID of a file that is included in this Pack | { IDs: [string] } |
| DisplayName | true | The user-friendly title that will be displayed for this Pack in the Panther Console | String |
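Putting these fields together, a minimal Pack manifest might look like the following sketch. The PackID, DisplayName, and the detection/helper IDs listed under PackDefinition are hypothetical placeholders, not real Panther content:

```yaml
# Example Pack manifest (all IDs below are illustrative placeholders)
AnalysisType: pack
PackID: Custom.Example.Pack
DisplayName: Example Custom Pack
Description: Groups a few example rules and their shared helper
PackDefinition:
  IDs:
    - Custom.Rule.SuspiciousLogin
    - Custom.Rule.AdminActivity
    - Custom.Helper.SharedFunctions
```

Each manifest file like this one that is included in your panther-analysis-all.zip will appear as a separately toggleable Pack in the Panther Console.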

Accessing private repositories

In order for Panther to have access to poll a private repository, you must configure the Pack Source with a personal access token. See the GitHub documentation for further details on creating a token.

A personal access token grants access to all the repositories the account owner can access. We recommend creating a "machine user" that you can add as an outside collaborator to the repository containing the Detection Packs. This way, the access token can be scoped to a particular use and repository.

Release signatures

Panther-managed Packs are signed using an asymmetric AWS KMS key. Prior to importing any detections from the Panther pack source, Panther validates the signature using the release asset panther-analysis.sig. This ensures that any detections being imported have not been tampered with or modified. If you would like to use similar functionality, create a sign/verify KMS key and modify its key policy to allow Panther to run kms:Verify using that key.

In this example entry to add to the key policy, replace {accountid} with the account ID where Panther is running:

{
    "Sid": "Enable KMS Verify",
    "Effect": "Allow",
    "Principal": {
        "AWS": "arn:aws:iam::{accountid}:root"
    },
    "Action": "kms:Verify",
    "Resource": "*"
}

Managing Pack Sources

To add a pack source:

  1. In the left-hand navigation bar of your Panther Console, click Detections.

  2. Click the Packs tab.

  3. Click the Detection Pack Sources tab.

  4. Click Create New in the upper right.

    • Enter a value for each input field.

  5. Click Save.

To modify the kmsKey or AccessToken fields for a pack source:

  1. In the left-hand navigation bar of your Panther Console, click Detections.

  2. Click the Packs tab.

  3. Click the Detection Pack Sources tab.

  4. Click ... next to a Pack Source, then click Edit.

    • Edit the fields on this page.

  5. Click Save.

To delete a pack source:

  1. In the left-hand navigation bar of your Panther Console, click Detections.

  2. Click the Packs tab.

  3. Click the Detection Pack Sources tab.

  4. Click ... next to a Pack Source then click Delete.

Deleting a Pack Source will delete the Packs originating from it, along with all of the detections in those Packs.

Creating a GitHub release - Panther Analysis Tool

The panther_analysis_tool (PAT) can streamline the process of creating an appropriate GitHub release, with or without an associated signature file.

To generate the release assets, use the release command.

% panther_analysis_tool release --help
usage: panther_analysis_tool release [-h] [--aws-profile AWS_PROFILE]
                                     [--filter KEY=VALUE [KEY=VALUE ...]]
                                     [--kms-key KMS_KEY]
                                     [--minimum-tests MINIMUM_TESTS]
                                     [--out OUT] [--path PATH] [--skip-tests]

optional arguments:
  -h, --help            show this help message and exit
  --aws-profile AWS_PROFILE
                        The AWS profile to use when updating the AWS Panther
                        deployment.
  --filter KEY=VALUE [KEY=VALUE ...]
  --kms-key KMS_KEY     The key id to use to sign the release asset.
  --minimum-tests MINIMUM_TESTS
                        The minimum number of tests in order for a detection
                        to be considered passing. If a number greater than 1
                        is specified, at least one True and one False test is
                        required.
  --out OUT             The path to store output files.
  --path PATH           The relative path to Panther policies and rules.
  --skip-tests

To automatically create a draft release in your GitHub repository, first set the GITHUB_TOKEN environment variable to a personal access token with appropriate permissions to access the target repository. Then, use the publish command.

Note: Using the panther_analysis_tool publish command creates a draft release. Before Panther can pull in this release artifact, you must go to your GitHub repository and manually finalize the draft into a release.

% panther_analysis_tool publish --help
usage: panther_analysis_tool publish [-h] [--body BODY]
                                     [--github-branch GITHUB_BRANCH]
                                     [--github-owner GITHUB_OWNER]
                                     [--github-repository GITHUB_REPOSITORY]
                                     --github-tag GITHUB_TAG
                                     [--aws-profile AWS_PROFILE]
                                     [--filter KEY=VALUE [KEY=VALUE ...]]
                                     [--kms-key KMS_KEY]
                                     [--minimum-tests MINIMUM_TESTS]
                                     [--out OUT] [--skip-tests]

optional arguments:
  -h, --help            show this help message and exit
  --body BODY           The text body for the release
  --github-branch GITHUB_BRANCH
                        The branch to base the release on
  --github-owner GITHUB_OWNER
                        The github owner of the repsitory
  --github-repository GITHUB_REPOSITORY
                        The github repsitory name
  --github-tag GITHUB_TAG
                        The tag name for this release
  --aws-profile AWS_PROFILE
                        The AWS profile to use when updating the AWS Panther
                        deployment.
  --filter KEY=VALUE [KEY=VALUE ...]
  --kms-key KMS_KEY     The key id to use to sign the release asset.
  --minimum-tests MINIMUM_TESTS
                        The minimum number of tests in order for a detection
                        to be considered passing. If a number greater than 1
                        is specified, at least one True and one False test is
                        required.
  --out OUT             The path to store output files.
  --skip-tests
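As a sketch of how these flags fit together, the following invocation publishes a signed draft release. The owner, repository, tag, key ID, and profile names are all hypothetical placeholders to substitute with your own values:

```shell
# All values below are placeholders -- substitute your own.
export GITHUB_TOKEN="<your-personal-access-token>"

panther_analysis_tool publish \
  --github-owner example-org \
  --github-repository example-detections \
  --github-tag v1.0.0 \
  --kms-key example-key-id \
  --aws-profile panther-admin
```

Remember that this produces a draft release: finalize it in GitHub before Panther can pull in the artifact.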

The --kms-key argument is optional; use it to generate a signature file. If you want to use this argument, be sure to run panther_analysis_tool with AWS credentials that are permitted to call kms:Sign on the specified key.

