Modifying Detections with Inline Filters (Beta)

Modify an existing rule without writing code

Overview

Inline filters are in open beta as of Panther version 1.54. Please share any bug reports and feature requests with your account team.

You can easily tune existing rules, including Panther-managed rules, by adding Inline Filters. An Inline Filter is a condition that must pass in order for the detection logic to run. Inline Filters are available only on rules, not on scheduled rules or policies.

In the Panther Console, you can create Inline Filters using a no-code builder. In the CLI workflow, you can create Inline Filters by adding the InlineFilters YAML key.

A common use case for filters is to add an allowlist or denylist.
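For instance, an allowlist-style filter in the CLI workflow might look like this (the field name and values are hypothetical, and the Values list form is assumed from Simple Detection match expressions):

```yaml
# Hypothetical allowlist: run the rule only for users NOT on the approved list
InlineFilters:
  - KeyPath: user_name
    Condition: IsNotIn
    Values:
      - svc-backup
      - svc-monitoring
```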

How Inline Filters work

Filter statements are evaluated before a detection's logic. A filter must return true (i.e., match the event) for the detection logic to run.

In both the Console and CLI workflow, filters can be grouped using AND or OR logic.

If an event does not contain the field a filter is evaluating, the filter will pass. If that field has a value of null, the filter will return false for positive comparators and for comparators that don't apply, and true for inverse comparators.
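The evaluation order described above can be sketched in Python (a minimal illustration of the documented behavior, not Panther's implementation; the environment and severity fields are hypothetical):

```python
def starts_with(key, prefix):
    """Build a filter function. Per the docs: a missing field passes the filter."""
    def check(event):
        if key not in event:
            return True  # event lacks the field, so the filter passes
        return str(event[key]).startswith(prefix)
    return check

def evaluate(event, filters, rule):
    """Filters run first; the rule body runs only if every filter matches."""
    if not all(f(event) for f in filters):
        return False  # a filter failed, so the detection logic never runs
    return rule(event)

filters = [starts_with("environment", "Sandbox")]
rule = lambda event: event.get("severity") == "HIGH"

print(evaluate({"environment": "Sandbox-1", "severity": "HIGH"}, filters, rule))  # True
print(evaluate({"environment": "prod", "severity": "HIGH"}, filters, rule))       # False
print(evaluate({"severity": "HIGH"}, filters, rule))                              # True: field absent
```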

In the Console, filters are not available during new rule creation. In the CLI workflow, you can include InlineFilters on new rules.

While it is broadly discouraged to manage detection content using both CLI workflows and the Console simultaneously, it is possible to use Inline Filters in the Console alongside the CLI workflow. Filters created in the Console will not be overwritten or deleted when an update to detection content is made in the CLI workflow.

Creating filters in the Panther Console

You can add filters to a rule from its edit page, or within an alert triggered by that rule.

Add filters from a rule's edit page

  1. In the left-hand navigation bar of your Panther Console, click Detections.

  2. In the list of detections, click a rule's name to view its details page.

  3. Within the Detect section, under Filter to only include events: and to the right of Where, click +.

    • In the menu that appears, select either Add Filter or Add Filter Group.

  4. For each filter (either on its own or within a group), define the logic:

    1. Click Key, then select an event key the condition will apply to.

      • To indicate a nested field, use JSON path notation.

      • Some options may include [*], indicating the key is an array of objects. Learn more about indexing an array of objects below.

    2. Click Condition, then select a condition.

    3. If the selected Condition requires one or more input values (e.g., is or contains), provide a value or list of values.

      • If the value(s) field takes in an array, see the Inputting array values instructions below.

  5. Between each filter and filter group, ensure the correct combinator (either and or or) is selected.

  6. Run unit tests to ensure they pass with the added filter(s).

  7. In the upper-right corner of the page, click Deploy to save your changes.
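As an example of the JSON path notation mentioned in step 4 (the event shape here is hypothetical), dots reference nested fields and [*] references an array of objects:

```json
{
  "userIdentity": {"arn": "arn:aws:iam::123456789012:user/alice"},
  "resources": [{"type": "AWS::IAM::Role"}]
}
```

Here userIdentity.arn selects the nested field, and resources[*].type selects type across every object in the resources array.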

Add filters from an alert event

You can add Inline Filters to a rule directly from an event in an associated alert. This is particularly helpful if you've received a false positive alert, and want to tune the triggered detection so it won't match on similar events in the future.

  1. In the left-hand navigation bar of your Panther Console, click Alerts.

  2. Locate the alert whose associated rule you'd like to tune, and click its name.

  3. On the alert's detail page, scroll down to the Event section.

  4. In the event's JSON, hover over the indicator you'd like the new filter to target, and click the target icon.

    • The Add Filter slide-out panel will open on the right-hand side of the window.

  5. In the Add Filter slide-out panel, a new filter will be pre-populated in the following way:

    • Key: defaults to the field on which you clicked the target icon in the event JSON.

    • Condition: defaults to is not, assuming you would not like to receive alerts for events like this in the future.

    • String: defaults to the value of the selected field in the event JSON.

  6. Make any desired changes to the filter. All pre-populated fields (i.e., Key, Condition and String) are editable.

  7. Locate the Unit Test section near the bottom of the panel. If the rule is not Panther-managed and you'd like to create a new unit test for the rule using the current event, click the checkbox labeled Add the current alert event as a unit test.

    • The toggle labeled The detection should trigger based on the example event is editable. It defaults to No, as you are likely trying to prevent alerts like this in the future.

    • If the rule is Panther-managed, this option will be greyed out.

  8. Click Save & Run Test.

    • This runs all of the target rule's unit tests. If you created a new unit test in step 7, it is also run.

    • In order for the new filter(s) to be saved, all of the rule's unit tests must pass. If any of the unit tests fail:

      • If the rule is not Panther-managed, click View Detection to be taken to the rule's detail page to edit unit tests. From there, you can click Update to save your changes to the rule.

      • If the rule is Panther-managed, its unit tests are read-only, meaning you can't alter failing tests to make them pass. To be able to add the filter successfully, instead follow the Working with failed unit tests with filters workflow below.

Inputting array values

If the Rule Filter operator you've selected requires the value field to take in an array (such as the is in operator), you'll input the array values in a modal that pops up when you click into the value field.

To add values to an array:

  1. After selecting a Key and Condition for your Filter, click into the values field.

    • This will open the array input modal.

  2. In the modal, enter the array value(s) in the input field.

    • If your input is comma-delimited, check the Values entered above are comma-delimited checkbox.

      • When this checkbox is checked, the text entered in the values field will be split (on the comma delimiter) into multiple values. For example, entering "User 1,User 2,User 3" will result in three values being added.

    • If your input is not comma-delimited, leave Values entered above are comma-delimited unchecked.

      • When it is unchecked, you can add values that contain commas one at a time. For example, entering "1,000" will add just one value.

  3. Click Add.

  4. Repeat steps 2-3 as needed, until all values have been added to the array.

  5. Click Apply.

Indexing an array of objects

While creating a filter expression, if there is an event key whose value is an array of objects, that key will be shown in the dropdown selector with the array indexing symbol [*], along with the fields in the object. You can use [*] to target the selected field in all objects in the array, or replace * with an integer to index the array, targeting a single field.

Example

Take the resources field in the AWS.CloudTrail schema: its value is an array of objects, each containing fields such as arn and type. It will be represented in the event field selector as resources[*], along with the fields in each object (for example, resources[*].type).

By default, Panther applies a wildcard array index ([*]) that will search across values for the chosen field in all objects in the array. When [*] is used, an array of these values is created and searched. Because of this, only the array conditions are available: is empty, is not empty, contains, does not contain.

For example, the filter expression resources[*].type contains AWS::IAM::Role means an event will match if any value of type in the resources array is AWS::IAM::Role.
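As an illustration (the values are hypothetical), consider an event whose resources field is an array of objects:

```json
{
  "resources": [
    {"arn": "arn:aws:iam::123456789012:role/ExampleRole", "type": "AWS::IAM::Role"},
    {"arn": "arn:aws:s3:::example-bucket", "type": "AWS::S3::Bucket"}
  ]
}
```

With the wildcard index, resources[*].type produces the array ["AWS::IAM::Role", "AWS::S3::Bucket"], which is what the contains condition searches; with resources[0].type, only AWS::IAM::Role would be evaluated.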

However, you can replace the * with an integer to index the array, which specifies a single object in the array. In this case, Panther will only evaluate the value of the nested field at that index.

The conditions shown will be updated to those which are applicable to the data type of the nested field chosen:

Creating filters in the CLI workflow

In addition to creating no-code rule filters in the Panther Console, you can also create YAML filters on rules written as Simple Detections or Python detections. Like the filters created in the Console, YAML filters are evaluated before the detection logic of a rule. If the filter returns true, the detection logic will be executed. If the filter returns false, evaluation of the detection stops, and the detection returns false altogether.

YAML InlineFilter syntax

A YAML filter is denoted by the InlineFilters key. Within InlineFilters, list one or more match expressions. You can use the All and Any combinators to specify AND or OR logic, respectively, and nest combinators to create filter groups. If a combinator is not specified directly under InlineFilters, All is assumed. See the Simple Detection Match Expression Reference to learn how to construct different types of match expressions.

Example:

InlineFilters: 
  - KeyPath: environment
    Condition: StartsWith
    Value: "Sandbox"
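As a sketch of a filter group (the field names are hypothetical, and the combinator shape is assumed to match Simple Detection match expressions):

```yaml
# Hypothetical: match events from either a Sandbox or a Dev environment
InlineFilters:
  - Any:
      - KeyPath: environment
        Condition: Equals
        Value: "Sandbox"
      - KeyPath: environment
        Condition: StartsWith
        Value: "Dev"
```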

Limitations of YAML Inline Filters

  • InlineFilters cannot be used on scheduled rules or policies, only rules.

Some match expression functionality described in the Simple Detection Match Expression Reference is not possible in InlineFilters. These limitations include:

  • The following match expression types cannot be used within InlineFilters: Multi-key match expressions, List comprehension match expressions, Absolute match expressions, and Enrichment match expressions.

  • The Key and DeepKey key specifiers cannot be used within InlineFilters; only KeyPath may be used.

  • The OnlyOne and None combinators cannot be used within InlineFilters; only All and Any may be used.

  • Many Condition values cannot be used within InlineFilters. Only the following Condition values may be used:

    • Equals

    • DoesNotEqual

    • IsGreaterThan

    • IsGreaterThanOrEquals

    • IsLessThan

    • IsLessThanOrEquals

    • Contains

    • DoesNotContain

    • StartsWith

    • EndsWith

    • IsIPAddressInCIDR

    • IsIPAddressNotInCIDR

    • CIDRContainsIPAddresses

    • CIDRDoesNotContainIPAddresses

    • IsIn

    • IsNotIn

    • IsIPAddressPublic

    • IsIPAddressPrivate

    • IsNullOrEmpty

    • IsNotNullOrEmpty

How to create an Inline Filter in the CLI workflow

To create an Inline Filter in the CLI workflow on a rule created as either a Python or Simple Detection, include the InlineFilters key in the detection's YAML file. Within InlineFilters, include one or more match expressions.
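A minimal sketch of a rule file carrying an Inline Filter (the RuleID, filename, and filter values are hypothetical):

```yaml
AnalysisType: rule
RuleID: Custom.CloudTrail.ExampleWithFilter
Enabled: true
Filename: example_rule.py
LogTypes:
  - AWS.CloudTrail
Severity: Medium
InlineFilters:
  - KeyPath: recipientAccountId
    Condition: Equals
    Value: "123456789012"
```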

Working with failed unit tests with filters

For rules with filters, you currently cannot add or edit unit tests, and you cannot save a rule if a unit test does not pass. If a unit test fails, take the following steps:

  1. Clone the Panther-managed rule.

  2. Add your filter(s) to the cloned rule.

  3. Edit the unit tests for the cloned rule so that they pass.

Inline filter reference

Refer to the below operators and value types when building out your filters in the Console.

Supported Console operators

Each operator below is listed with its usage guidelines, supported event field types, and example usage.

is / is not

An event will match when the field matches/does not match the value in the filter

string, ip, bool, int

username is “root”

is in / is not in

An event will match when the field matches/does not match an entry in the list of values in the filter

string, int

username is in [ “root”, “admin” ]

port is in [25, 553]

is empty

An event will match when the field's value is not specified. The operator tests only for the absence of data

string, int array, ip array, float array, bool array, string array

errors_list is empty

is not empty

An event will match when the field's value is specified. The operator tests only for the presence of data

string, int array, ip array, float array, bool array, string array

errors_list is not empty

contains

An event will match when the value of the field specified contains the value provided

When the event value is a string or a string array, partial matching is supported

string, int array, ip array, bool array, string array

domain contains “.google.com”

p_any_port contains 22

does not contain

An event will match when the value of the field specified does not contain the value provided

When the event value is a string or a string array, partial matching is supported

string, int array, ip array, bool array, string array

domain !contains “.google.com”

p_any_port !contains 22

starts with

An event will match when the value of the field specified begins with the value provided

string

role starts with “admin_”

ends with

An event will match when the value of the field specified ends with the value provided

string

domain ends with “.cc”

is greater than

An event will match when the field's value is greater than the value provided in the filter

int, float

port > 1023

is less than

An event will match when the field's value is less than the value provided in the filter

int, float

port < 1024

is greater than or equal

An event will match when the field's value is greater than or equal to the value provided in the filter

int

count ≥ 1

is less than or equal

An event will match when the field's value is less than or equal to the value provided in the filter

int

count ≤ 100

is private

An event will match when the IP address specified is private

IP

dst_ip is_private

is public

An event will match when the IP address specified is public

IP

src_ip is_public

is in CIDR / is not in CIDR

An event will match when the IP address specified is/is not within a provided CIDR (Classless Inter-Domain Routing) block

IP

src_ip in_cidr 192.168.0.0/16

does not contain IP in CIDR

An event will match when the IP array specified does not contain any IP address within the CIDR block provided

ip array

p_any_ip_address !contains_ip 8.8.0.0/16

p_any_ip_address !contains_ip 1.1.1.1/32

contains IP in CIDR

An event will match when the IP array specified contains an IP address within the CIDR block provided

ip array

p_any_ip_address contains_ip 8.8.0.0/16

p_any_ip_address contains_ip 1.1.1.1/32
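The CIDR operators above follow standard CIDR membership. As an illustration (not Panther internals), contains IP in CIDR can be sketched with Python's ipaddress module:

```python
import ipaddress

def contains_ip_in_cidr(ip_list, cidr):
    """True if any address in ip_list falls inside the CIDR block."""
    network = ipaddress.ip_network(cidr)
    return any(ipaddress.ip_address(ip) in network for ip in ip_list)

print(contains_ip_in_cidr(["8.8.8.8", "1.2.3.4"], "8.8.0.0/16"))  # True
print(contains_ip_in_cidr(["9.9.9.9"], "1.1.1.1/32"))             # False
```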

Supported value types

Each supported value type is described below.

string

A string value

int

A 32-bit integer number in the range -2147483648 to 2147483647

float

A 64-bit floating point number

boolean

A boolean value true / false

array

A JSON array where each element is of the same type

ip

A single valid IPv4 or IPv6 address

CIDR

A classless inter-domain routing block
