Testing

Use unit tests to ensure your detections are working as expected


Overview

Testing your detections ensures that once deployed, detections will behave as expected, generating alerts appropriately. Testing also promotes reliability and protects against regressions as code evolves over time.

Testing works by defining a test log event for a certain detection, and indicating whether or not you'd expect an alert to be generated when the test event is processed by that detection. Panther doesn't enforce a required minimum or maximum number of tests for each detection, but it's recommended to configure at least two—one false positive and one true positive.

You can create, edit, and delete tests for custom detections (i.e., those that aren't Panther-managed). Tests for Panther-managed detections are maintained by Panther and are read-only.

Panther's Data Replay feature, which allows you to run historical log data through a rule to preview the outcome, is also useful while testing.

Using tests

How to create a test

You can create tests for detections that are not Panther-managed in the Panther Console, or in the CLI workflow with the Panther Analysis Tool (PAT).

How to create a test in the Panther Console

  1. In the left-hand navigation bar of your Panther Console, click Detections.

  2. Click the name of an existing detection, or create a new detection.

  3. Scroll down to the Test section.

  4. On the right-hand side of the Unit Test tile, click Add New.

    • If the detection is Panther-managed, Add New will be disabled, as you cannot create new tests.

  5. The newly created test is given a placeholder name. To its right, click the three dots icon and select Rename.

  6. Provide a meaningful name for the test, and press enter, or return, to save it.

  7. Set the "The detection should trigger based on the example event" toggle to YES or NO.

  8. Compose a test JSON event in the text editor.

  9. To see whether your test passes, click Run Test.

  10. When you are finished, in the upper-right corner of the page, click Update or Save.

In the Panther Console, you can create a new unit test directly from an alert, using an actual event associated with the alert. Follow the instructions in Add filters from an alert event, taking note of steps 9 and 10.

Note that this functionality is only available for rules.

How to create a test in the CLI workflow

See the related documentation for more information on using the Panther Analysis Tool and CI/CD workflows.

Rules and scheduled rules

In your YAML file (for both Python and Simple Detections), add a Tests key with the following fields:

Tests:
  -
    Name: Name to describe our first test
    LogType: LogType.GoesHere
    ExpectedResult: true # or false
    Log:
      {
        "hostName": "test-01.prod.acme.io",
        "user": "martin_smith",
        "eventTime": "June 22 5:50:52 PM"
      }
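
Since at least one true positive and one false positive test are recommended, a Tests key will typically hold multiple entries. Here's a minimal sketch, with placeholder names, log type, and fields:

Tests:
  -
    Name: True positive - alert expected
    LogType: LogType.GoesHere
    ExpectedResult: true
    Log:
      {
        "hostName": "test-01.prod.acme.io",
        "user": "root"
      }
  -
    Name: False positive - no alert expected
    LogType: LogType.GoesHere
    ExpectedResult: false
    Log:
      {
        "hostName": "test-01.prod.acme.io",
        "user": "martin_smith"
      }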

Policies

In your YAML file, add a Tests key. The value of Resource can be a JSON object copied directly from the Policies > Resources explorer.

Tests:
  -
    Name: Name to describe our first test.
    ResourceType: AWS.S3.Bucket
    ExpectedResult: true
    Resource:
      {
        "PublicAccessBlockConfiguration": null,
        "Region": "us-east-1",
        "Policy": null,
        "AccountId": "123456789012",
        "LoggingPolicy": {
          "TargetBucket": "access-logs-us-east-1-100",
          "TargetGrants": null,
          "TargetPrefix": "acmecorp-fiancial-data/"
        },
        "EncryptionRules": [
          {
            "ApplyServerSideEncryptionByDefault": {
              "SSEAlgorithm": "AES256",
              "KMSMasterKeyID": null
            }
          }
        ],
        "Arn": "arn:aws:s3:::acmecorp-fiancial-data",
        "Name": "acmecorp-fiancial-data",
        "LifecycleRules": null,
        "ResourceType": "AWS.S3.Bucket",
        "Grants": [
          {
            "Permission": "FULL_CONTROL",
            "Grantee": {
              "URI": null,
              "EmailAddress": null,
              "DisplayName": "admins",
              "Type": "CanonicalUser",
              "ID": "013ae1034i130431431"
            }
          }
        ],
        "Versioning": "Enabled",
        "ResourceId": "arn:aws:s3:::acmecorp-fiancial-data",
        "Tags": {
          "aws:cloudformation:logical-id": "FinancialDataBucket"
        },
        "Owner": {
          "ID": "013ae1034i130431431",
          "DisplayName": "admins"
        },
        "TimeCreated": "2020-06-13T17:16:36.000Z",
        "ObjectLockConfiguration": null,
        "MFADelete": null
      }

How to rename or delete a test in the Panther Console

You can rename or delete tests for detections that are not Panther-managed. If a detection is Panther-managed, its tests cannot be renamed or deleted; instead of a three dots icon next to the name of each test, a Panther icon will appear.

  1. In the left-hand navigation bar of your Panther Console, click Detections.

  2. Click the name of a detection.

  3. Scroll down to the Test section.

  4. Within the Unit Test tile, locate the test you'd like to rename or delete.

  5. To the right of the test's name, click the three dot icon.

  6. Click Rename or Delete.

    • If renaming, enter the new name and press enter, or return, to save.

    • If deleting, a Delete Test confirmation modal will pop up.

      • Select Confirm.

Test example

Keeping with the previous example, let's write two tests for this detection:

  • Click Edit in the upper right corner of the page.

  • Scroll down, below the Rule Function text editor, to the Unit Test text editor.

Python:

def rule(event):
  # Default to '' so a missing "request" field doesn't raise a TypeError
  return event.get('status') == 200 and 'admin-panel' in event.get('request', '')


def title(event):
  return f"Successful admin panel login detected from {event.get('remoteAddr')}"


def dedup(event):
  return event.get('remoteAddr')
Simple Detection:

Detection:
  - KeyPath: status
    Condition: Equals
    Value: 200
  - KeyPath: request
    Condition: Contains
    Value: 'admin-panel'

AlertTitle: "Successful admin panel login detected from {remoteAddr}"

GroupBy:
  - KeyPath: remoteAddr

Name: Successful admin-panel Logins

Test event should trigger an alert: Yes

{
  "httpReferer": "https://domain1.com/?p=1",
  "httpUserAgent": "Chrome/80.0.3987.132 Safari/537.36",
  "remoteAddr": "180.76.15.143",
  "request": "GET /admin-panel/ HTTP/1.1",
  "status": 200,
  "time": "2019-02-06 00:00:38 +0000 UTC"
}

Name: Errored requests to the access-policy page

Test event should trigger an alert: No

{
  "httpReferer": "https://domain1.com/?p=1",
  "httpUserAgent": "Chrome/80.0.3987.132 Safari/537.36",
  "remoteAddr": "180.76.15.143",
  "request": "GET /access-policy/ HTTP/1.1",
  "status": 500,
  "time": "2019-02-06 00:00:38 +0000 UTC"
}
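
In the CLI workflow, the same two tests could be expressed with the Tests key described above. This is a sketch; the LogType value assumes this rule runs over Nginx access logs:

Tests:
  -
    Name: Successful admin-panel Logins
    LogType: Nginx.Access
    ExpectedResult: true
    Log:
      {
        "httpReferer": "https://domain1.com/?p=1",
        "httpUserAgent": "Chrome/80.0.3987.132 Safari/537.36",
        "remoteAddr": "180.76.15.143",
        "request": "GET /admin-panel/ HTTP/1.1",
        "status": 200,
        "time": "2019-02-06 00:00:38 +0000 UTC"
      }
  -
    Name: Errored requests to the access-policy page
    LogType: Nginx.Access
    ExpectedResult: false
    Log:
      {
        "httpReferer": "https://domain1.com/?p=1",
        "httpUserAgent": "Chrome/80.0.3987.132 Safari/537.36",
        "remoteAddr": "180.76.15.143",
        "request": "GET /access-policy/ HTTP/1.1",
        "status": 500,
        "time": "2019-02-06 00:00:38 +0000 UTC"
      }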

Use as many combinations as you like to make your detections as reliable as possible.

Mocks

Panther's testing framework also allows for basic Python call mocking. Both policy and rule tests support unit test mocking.

It is highly discouraged to make external API requests from within your detections in Panther: detections are processed at a very high scale, and API requests can overload receiving systems and cause your rules to exceed the 15-second runtime limit. When a detection does require an external API call, mocks can be utilized to mimic the server response in unit tests without actually executing the call.

Mocks are defined with a Mock Name and Return Value (or objectName and returnValue, in the CLI workflow) which respectively denote the name of the object to patch and the string to be returned when the patched object is invoked.

Mocks are defined on the unit test level, allowing you to define different mocks for each unit test.

How to use mocks

To add a mock to a unit test in the Console:

  1. Within the Unit Test tile, locate the Mock Testing section, below the test event editor.

  2. Add values for Mock Name and Return Value.

  3. To test that your mock is behaving as expected, click Run Test.

To configure this using a CI/CD workflow, add the Mocks key to your test case. The Mocks key is used to define a list of functions you want to mock, and the value that should be returned when that function is called. Multiple functions can be mocked in a single test.

For example, if we have a rule test and want to mock the function get_counter to always return a 1 and the function geoinfo_from_ip to always return a specific set of geo IP info, we could write our unit test like this:

Tests:
  -
    Name: Test With Mock
    LogType: LogType.Custom
    ExpectedResult: true
    Mocks:
      - objectName: get_counter
        returnValue: 1
      - objectName: geoinfo_from_ip
        returnValue: >-
                          {
                          "region": "UnitTestRegion",
                          "city": "UnitTestCityNew",
                          "country": "UnitTestCountry"
                          }
    Log:
      {
        "hostName": "test-01.prod.acme.io",
        "user": "martin_smith",
        "eventTime": "June 22 5:50:52 PM"
      }
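
For context, here is a minimal sketch of a rule body that these two mocks could exercise. The import paths and helper signatures shown are illustrative assumptions, not part of the example above:

import json

# Illustrative imports: helpers like these live in panther-analysis global
# helper modules, but the exact module names here are assumptions.
from panther_oss_helpers import get_counter
from panther_ipinfo_helpers import geoinfo_from_ip


def rule(event):
    # get_counter is mocked to return 1, so the test never touches the
    # real key-value store. int() guards against a string return value.
    if int(get_counter(f"host-{event.get('hostName')}")) > 1:
        return False

    # geoinfo_from_ip is mocked too, so its argument is never actually
    # resolved during the unit test.
    geoinfo = geoinfo_from_ip(event.get('remoteAddr'))

    # Mock return values are provided as strings; parse JSON payloads
    # before use (see the Example mock section below).
    if isinstance(geoinfo, str):
        geoinfo = json.loads(geoinfo)

    return geoinfo.get('country') == 'UnitTestCountry'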

Mocking allows us to emulate network calls without requiring API keys or network access in our CI/CD pipeline, and without muddying the state of external tracking systems (such as the Panther KV store).

Mocks are allowed on the global and built-in Python namespaces. This includes:

  1. Imported Modules and Functions

  2. Global Variables

  3. Built-in Python Functions

  4. User-Defined Functions

Python provides two distinct forms of importing, which are mocked as follows:

  • import package

    • Mock Name: package

  • from package import module

    • Mock Name: module
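
A quick illustration, using resource_lookup from the example below and boto3 as a stand-in package:

# import package -- patch the package name itself:
import boto3                                      # Mock Name: boto3

# from package import module -- patch the imported name directly:
from panther_aws_helpers import resource_lookup   # Mock Name: resource_lookup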

Example mock

This example is based on the AWS Config Global Resources detection, which utilizes the global helper function resource_lookup from panther_aws_helpers to query the resources-api and return a resource's attributes. The unit test, however, should be able to run without any external API calls.

Without a mock, the test fails, as there is no real resource corresponding to the generic example data.

Diving into the detection:

# --- Snipped ---
from panther_aws_helpers import resource_lookup
# --- Snipped ---
def policy(resource):
    # --- Snipped ---
    for recorder_name in resource.get("Recorders", []):
        recorder = resource_lookup(recorder_name)
        resource_records_global_resources = bool(
            deep_get(recorder, "RecordingGroup", "IncludeGlobalResourceTypes")
            and deep_get(recorder, "Status", "Recording")
        )
        if resource_records_global_resources:
            return True
    return False
    # --- Snipped ---

The detection uses the from panther_aws_helpers import resource_lookup convention, which means the mock should be defined on the resource_lookup function.

Mocks provide a way to leverage real-world data to test the detection logic.

The return value used:

 { "AccountId": "012345678910", "Name": "Default", "RecordingGroup": { "AllSupported": true, "IncludeGlobalResourceTypes": true, "ResourceTypes": null }, "Region": "us-east-1", "ResourceId": "012345678910:us-east-1:AWS.Config.Recorder", "ResourceType": "AWS.Config.Recorder", "RoleARN": "arn:aws:iam::012345678910:role/PantherAWSConfig", "Status": { "LastErrorCode": null, "LastErrorMessage": null, "LastStartTime": "2018-10-05T22:45:01.838Z", "LastStatus": "SUCCESS", "LastStatusChangeTime": "2021-05-28T17:45:14.916Z", "LastStopTime": null, "Name": "Default", "Recording": true }, "Tags": null, "TimeCreated": null }

While this resource should be compliant, the unit test fails: mocked return values are always strings, so detections that expect a non-string return value require a small tweak to work with mocks.

To get this unit test working as expected, the following modification is needed:

# --- Snipped ---
import json
# Another option is to use: from ast import literal_eval
# --- Snipped ---
def policy(resource):
    # --- Snipped ---
        recorder = resource_lookup(recorder_name)
        if isinstance(recorder, str):
            recorder = json.loads(recorder)
    # --- Snipped ---

Once this modification is added, you can test the detection logic with real data.

Mocks from the CLI

Unit test mocking is also supported in CLI-based workflows for writing detections. For details on adding unit test mocks to a CLI-based detection, see the unit test mocking section of the PAT documentation.

Enrich test data

It's possible to enrich test events with p_enrichment in both the Panther Console and the CLI workflow.

In the Panther Console: if you have configured Lookup Table(s) and created a detection that leverages them, click Enrich Test Data in the upper-right side of the JSON event editor to ensure that p_enrichment is populated correctly. If you are using the lookup() function in your detection, instead follow the instructions in Unit testing detections that use lookup().

In the CLI workflow: use PAT's enrich-test-data command to populate p_enrichment for your test events.
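
As a rough sketch of what enrichment adds (the table name, selector, and row fields here are illustrative), an enriched test event carries matched Lookup Table rows under p_enrichment, keyed by table name and then by selector field:

{
  "remoteAddr": "180.76.15.143",
  "status": 200,
  "p_enrichment": {
    "my_lookup_table": {
      "remoteAddr": {
        "is_known_scanner": true
      }
    }
  }
}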
