

Search

Construct a data query without writing SQL


Overview

In the Search tool in Panther, search across all of your data—including log events, rule matches, and more—without writing SQL. You can use dropdown fields to create filter expressions, and group them using AND and OR functionality. It's also possible to execute PantherFlow queries in Search.

Filter expressions can be constructed in different ways: as key/value pairs, a free text search, or a regular expression search. Each of these can also use wildcard characters. You can combine different types of filter expressions in one search.

When a search is run, a results table is displayed below a histogram visualizing the distribution of result events over time. The results table is customizable—you can add or remove event fields as columns. Also from the results table, you can add inclusive or exclusive filters to your search, pivot, and look up related enrichment data. You can use Panther AI to summarize your results set.

You can collaborate with your team by downloading the results table, or by sharing a link to your specific search in Panther.

Search is only available to customers with a Snowflake data lake. It is not available to Panther instances with an Athena data lake.

How to use Search

You can effectively search your data using:

  • A combination of filters: Start by making selections in the database, table, and date range filters—then create your own filter expressions.

  • PantherFlow: Learn how to use PantherFlow in Search below.

Using database, table, and date range filters

Use the database, table, and date range filters to narrow the scope of your search. Using these controls is optional, but can significantly improve search performance when searching over large data sets. Learn more about each of these filters below.

Database filter

Use the database filter to narrow your search to certain databases, such as only Logs or Rule Matches.

The default value of this filter is Logs. The options contained in the database filter are:

  • Rule Matches

  • Logs

  • Lookups

  • Monitor

  • Cloud Security

  • Rule Errors

  • Signals

Table filter

Use the table filter to narrow your search to certain tables, within the databases indicated by the database filter.

The default value of this filter is All tables, which includes all tables for each included database. You can narrow the search by selecting only certain tables in this dropdown.

Date range filter

Use the date range filter to narrow your search to a certain period of time.

The default value of this filter is Last 20 mins. You can use one of the preset relative options (like Last hour or Last week), set your own relative window with Relative time, remove the time constraint with All time, or set a specific window with Custom range.

Creating filter expressions

A filter expression is a clause containing your key/value search logic, free search terms, or match patterns. To create a filter expression, click the Add search filter bar or use the add filter expression keyboard shortcut.

Key/value filter expression

With a key/value filter expression, you will select an event key and provide a value (if necessary).

To create a key/value filter expression:

  1. Click the Add search filter bar, or use the add filter expression keyboard shortcut.

  2. Select an event key from the dropdown list. The dropdown menu contains options grouped into the following categories:

    • Panther Fields: Includes Indicator Fields (also known as p_any fields) and Core Fields (p_udm fields), which are useful when searching across log types.

    • Multiple tables: Fields that are found in more than one log type.

    • All remaining tables with a matching field(s), displayed in alphabetical order.

  3. Select an operator (also known as a condition) from the dropdown menu.

    • The dropdown options are limited to those applicable to the selected field's data type. See a full list of available operators on Search Filter Operators.

  4. Enter a value, if the selected operator requires one.

  5. If you would like to create another filter expression:

    • To create an AND filter, click outside the expression you just created (but within the same horizontal bar), or press TAB.

    • To create an OR filter, click + Add OR Condition.

  6. When you are ready to execute your search, click Search or press ENTER.

    • If there are more than two rows of filters, they will collapse. To expand all filters into view, click Show +n conditions.

Learn more about using the wildcard character below.

You can also quickly create key/value filter expressions from the results set of an initial search.

Free text filter expression

In a free text filter expression, you will enter a string.

Free text filter expressions may cause your search to take a long time to execute, as they search every field in every event (within the database, table, and date constraints), including fields nested in complex objects. To improve search performance while using a free text filter expression, select a subset of tables to search.

To create a free text filter expression:

  1. Click the Add search filter bar, or use the add filter expression keyboard shortcut.

  2. Enter the text value.

  3. If you would like to create another filter expression:

    • To create an AND filter, click outside the expression you just created (but within the same horizontal bar), or press TAB.

    • To create an OR filter, click + Add OR Condition.

  4. When you are ready to execute your search, click Search or press ENTER.

    • If there are more than two rows of filters, they will collapse. To expand all filters into view, click Show +n conditions.

Learn more about using the wildcard character below.

Regular expression (regex) filter expression

Using regex in Search can be powerful for dynamic text-based searches across logs. Search supports POSIX-extended regular expressions.

To create a regex filter expression:

  1. Click the Add search filter bar, or use the add filter expression keyboard shortcut.

  2. To enter regex mode, use the regex keyboard shortcut. To exit regex mode, repeat the same keyboard shortcut.

  3. Enter the regular expression you wish to search, e.g., .*aws:.*admin.*.

  4. If you would like to create another filter expression:

    • To create an AND filter, click outside the expression you just created (but within the same horizontal bar), or press TAB.

    • To create an OR filter, click + Add OR Condition.

  5. When you are ready to execute your search, click Search or press ENTER.

    • If there are more than two rows of filters, they will collapse. To expand all filters into view, click Show +n conditions.

Learn more about using the wildcard character below.
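For instance, the example pattern above (.*aws:.*admin.*) matches any field value that contains aws: followed later by admin, such as these purely illustrative values (not drawn from a specific log schema):

arn:aws:iam::123456789012:role/admin-access
arn:aws:sts::123456789012:assumed-role/administrator/session-1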

Using wildcards in filter expressions

The wildcard character (*) may be used as a placeholder at the beginning, middle, or end of a string or expression. It may be used within a key/value filter expression (only where the key has type string and the operator is LIKE), a free text filter expression, or a regex filter expression.

The position of the wildcard character determines which data is returned as a match:

  • Beginning: Any character(s) at or preceding the * are considered a match.

  • Middle: Any character(s) at the * are considered a match.

  • End: Any character(s) at or following the * are considered a match.

This wildcard functionality is not available when searching with PantherFlow.
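As an illustration, consider a key/value LIKE filter on a hypothetical userName field (the patterns and matching values below are examples only):

admin*   matches administrator, admin-01
*admin   matches sysadmin, cloud-admin
ad*in    matches admin, ad-sign-in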

Searching Indicators of Compromise

When responding to a public breach disclosure or threat hunting generally, you may need to quickly find out whether any values in a list of Indicators of Compromise (IoCs) are found across any of your organization's event logs.

To search IoCs in Search:

  1. Make your database, table, and date range filter selections.

  2. Click the Add search filter bar, or use the add filter expression keyboard shortcut.

  3. Type or paste in the indicator or list of indicators.

  4. From the dropdown options that appear, select the (auto-detect) option.

    • Search parses the inputted string where there are spaces, commas, and semicolons—then detects whether each value matches a Panther Indicator Field. For each detected indicator field, Search creates a key/value filter expression; other values remain as free text expressions.

    • Each filter expression is joined by an OR.

  5. Click Search or press ENTER.
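For example, pasting a list like the following (hypothetical values) with the (auto-detect) option creates one filter expression per value, joined by OR; values recognized as Panther Indicator Fields become key/value filters, and any unrecognized values remain free text:

198.51.100.23, 203.0.113.77; malicious-example.com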

Video walkthrough

Keyboard shortcuts for filter expressions

Use the keyboard shortcuts below when building filter expressions in Search:

  • Add filter expression / enter or exit regex mode: ⌘/ (Mac), ⌃/ (Windows/Linux)

  • Select all filters in the current group: ⌘A (Mac), ⌃A (Windows/Linux)

  • Copy selected filters: ⌘C (Mac), ⌃C (Windows/Linux)

  • Paste: ⌘V (Mac), ⌃V (Windows/Linux)

  • Undo: ⌘Z (Mac), ⌃Z (Windows/Linux)

  • Redo: ⇧⌘Z (Mac), ⇧⌃Z (Windows/Linux)

  • Delete selected filters: ⌫ (Mac and Windows/Linux)

Using PantherFlow in Search

PantherFlow is in open beta starting with Panther version 1.110, and is available to all customers. Please share any bug reports and feature requests with your Panther support team.

To execute a PantherFlow query in Search:

  1. In the left-hand navigation bar of your Panther Console, click Investigate > Search.

  2. On the left side of the search bar, click </> to toggle to PantherFlow mode.

    • This will replace the filter expression builder with a PantherFlow code editor.

  3. Enter your PantherFlow query.

    • Learn how to construct a PantherFlow query in the PantherFlow documentation, and note the current limitations of PantherFlow.

    • If your PantherFlow query specifies a database/table, the database, table, and date range filters in the upper-right corner of the Search page are ignored. If your PantherFlow query does not specify a database/table, those filters are all applied. In this scenario, if your PantherFlow query also includes a date/time range (with a | where p_event_time ... statement), both date/time ranges are applied—i.e., returned data must fall within the date/time range set in both the date range filter and the range defined by the | where p_event_time ... statement.

  4. Click Search.
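For illustration only, a query entered in PantherFlow mode might look like the sketch below. The table name (panther_logs.public.aws_cloudtrail), the eventName and sourceIPAddress fields, and the time.ago() helper are assumptions made for this example; consult the PantherFlow documentation for the operators and functions available in your environment.

panther_logs.public.aws_cloudtrail
| where p_event_time > time.ago(1d)
| where eventName == 'ConsoleLogin'
| summarize count() by sourceIPAddress
| limit 10

Because this sketch names a table, the database, table, and date range filters would be ignored; if the table reference were omitted, those filters and the | where p_event_time condition would both apply.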

Creating a Saved Search

Creating a Saved Search means you can quickly reuse commonly run searches. Learn more on Saved and Scheduled Searches.

While it's possible to create a Saved Search in Search, it's not possible to schedule it (i.e., to create a Scheduled Search).

To create a Saved Search:

  1. Create a search by following the instructions in How to use Search, above.

  2. Under the Add search filter box, click Save As.

  3. Enter values for the fields in the popup modal:

    • Search Name: Add a descriptive name.

    • Tags (optional): Add tags. Tags can be helpful to group related searches.

    • Description (optional): Describe the purpose of the search.

  4. Click Save Search.

    • See the next section to learn how to open and reuse Saved Searches.

Open and reuse a Saved Search in the Search tool

After creating a Saved Search in the Search tool, you can view and reuse it. It can be opened from the Search page, or from the Saved Searches page.

Open a Saved Search from the Search page:

  1. In the left-hand navigation bar of your Panther Console, click Investigate > Search.

  2. In the upper right corner, click the three dots icon, then Open Saved Search.

    • An Open a Search modal will pop up, displaying previously saved searches.

  3. Find the search you'd like to open, select it, then click Open Search.

    • The Saved Search will populate in Search.

Open a Saved Search from the Saved Searches page:

  1. In the left-hand navigation bar of your Panther Console, click Investigate > Saved Searches.

  2. Find the search you'd like to open, utilizing the search bar and Filters at the top, if necessary.

  3. In the top right corner of the search's tile, click the three dots icon.

  4. Click View in Search.

    • You will be redirected to Search, where the Saved Search will populate.

Analyzing Search results

The results of a Search contain a histogram, a table of result events, and summary visualizations.

Search results histogram

The results histogram displays the distribution of events within the search's date and time window, to help quickly contextualize results.

After a search is run, the histogram is shown collapsed by default. You can expand it by clicking the diagonal arrows button in the upper-right corner of the chart. Clicking the button again will collapse the chart.

Interacting with the histogram

To see additional data insights into the counts by log type for any of the time periods, hover over a bar within the chart.

To drill down and create a new search (in a new browser tab) with a time period set to that of one of the histogram bars, click the bar.

This drill-down functionality is only available for bars in the chart that represent timeframes longer than one minute.

Adding, removing, and reordering fields in the results table

You can customize a search's results table by adding, removing, and reordering columns. You can also create a new filter directly from the results table, replace filter expressions with a results table value, and explore enrichment data for a results table value.

How to add a column in the Search results table

You can add a column to the Search results table using the Available Fields list on the left-hand side of the table, or from the JSON event view. It is only possible to add nested fields to the table from the JSON event view.

Add a column to the Search results table from the Available Fields list

Only top-level fields are shown in this list. If you'd like to add a nested field to the table, you can do so from the JSON event view.

  1. In the field list on the left-hand side of the results table, within the Available Fields header, locate the field you'd like to add to the results table.

  2. To the right of the field, click + (the plus symbol).

    • The field will be added as a column in the results table, and listed on the left-hand side of the table within Selected Fields.

Add a column to the Search results table from the JSON event view

  1. In the results table, click on a row to open the JSON event view slide-out panel.

  2. Locate the field you'd like to add to the results table.

  3. While hovering over the field, click + (the plus symbol).

    • The field will be added as a column in the results table, and listed on the left-hand side of the table within Selected Fields.

How to remove a column in the Search results table

You can remove a column from the Search results table using the Selected Fields list on the left-hand side of the table, from the JSON event view, or from the table header row.

Remove a column from the Search results table from the Selected Fields list

  1. In the field list on the left-hand side of the results table, within the Selected Fields header, locate the field you'd like to remove from the results table.

  2. To the right of the field, click - (the minus symbol).

    • The field's column will be removed from the results table, and the field will be listed on the left-hand side of the table within Available Fields.

Remove a column from the Search results table from the JSON event view

  1. In the results table, click on a row to open the JSON event view slide-out panel.

  2. Locate the field you'd like to remove from the results table.

  3. While hovering over the field, click - (the minus symbol).

    • The field's column will be removed from the results table, and the field will be listed on the left-hand side of the table within Available Fields.

Remove a column from the Search results table from the header row

  1. In the results table, hover over the header of the column you'd like to remove.

  2. On the right side of the column header, click X.

    • The field's column will be removed from the results table, and the field will be listed on the left-hand side of the table within Available Fields.

How to reorder columns in the Search results table

  • Reorder the columns in the results table by clicking on a column header and dragging it to the desired position.

Viewing full result events

The results table is loaded, by default, in a compact view. This view displays all log fields, including the often sizable EVENT field, in a single row. To view the EVENT value in this view, scroll horizontally.

To more easily view the full event data, you can use the detailed table view or the JSON event slide-out panel. An added benefit of using the slide-out panel is the ability to show or hide Panther fields.

Detailed results table view

The results table can alternatively display logs in a detailed view. This view displays the EVENT field value in a new row below the event's other fields, with text-wrapping.

To enable detailed view, click the toggle button in the upper-right corner of the table.

JSON event slide-out panel

It's possible to view the full event data, in JSON format, by clicking an event row. This will open a slide-out panel on the right side of the browser window.

In the JSON event view, Panther fields are displayed at the top of the JSON object, followed by the event fields. You can hide or reveal these fields by clicking the Show Panther fields toggle.

Hovering over fields in the slide-out panel will display icons with which you can perform additional actions, like adding a filter. Learn more in Iterating on a Search below.

Search results summary charts

Within the results of a Search, the Visualizations tab displays bar charts for field values, which can help provide quick insights into your data. To view these charts, click Visualizations.

You can also create a filter directly from a summary chart, replace filters with a value from a summary chart, and explore enrichment data for a summary chart value (described in Iterating on a Search, below).

How to add or remove summary charts

Add or remove visualizations for event fields using the Available Fields and Selected Fields lists on the left-hand side of the results panel. Adding or removing a field shows or hides the field both as a chart and as a column in the results table.

To add a visualization:

  • Within the Available Fields list, to the right of a field's name, click +.

To remove a visualization:

  • Within the Selected Fields list, to the right of a field's name, click –.

How to set the sort order of a chart

To sort results in ascending order (lowest to highest):

  • In the upper-right corner of a visualization, click the icon with an arrow pointing downward.

To sort results in descending order (highest to lowest):

  • In the upper-right corner of a visualization, click the icon with an arrow pointing upward.

How to expand or condense the number of values shown in a chart

  • To view the first 25 values in a visualization, in its lower-right corner, click Show Top 25. If there are fewer than 25 values available, the text will read Show all <number> rows instead. To view only the first five values in a visualization, in its lower-right corner, click Hide additional rows.

Panther AI Search results summary

AI event summaries in Search are in open beta starting with Panther version 1.113, and are available to all customers. Please share any bug reports and feature requests with your Panther support team.

Use of Panther AI features is subject to the AI disclaimer found on the Legal page.

After running a search that generates results, you can view an AI-generated summary of the result events. AI event summaries are likely to describe the action(s) the log(s) represent, which may include identifying actors, naming resources accessed, making an evaluation of the security risk posed, connecting actions to MITRE ATT&CK tactics, and more. Learn more about Panther AI, including how to configure AI response length and manage AI responses, on Panther AI and Managing Panther AI Response History.

To view an AI-generated summary for the events visible in the results table:

  1. After running a search, in the upper-right corner of the results table, click Summarize with AI.

    • The summary is shown in a slide-out panel.

    • The events summarized are the events loaded into view in the results table—by default, this is 25 events. If you scroll to the end of the results table and load more events, then click Summarize with AI, a higher number of events will be summarized.

  2. (Optional) In the prompt box at the top of the slide-out panel, ask follow-up questions or direct Panther AI to take some action. These prompts and their responses are preserved in the AI response history. For example:

    • Who is this user?

    • Run a search to see if this user generated any CloudTrail logs today.

Iterating on a Search

Directly from the JSON event slide-out panel and summary charts, you can create inclusive/exclusive filters, replace filter expressions with a results value, and explore enrichment data.

How to create an inclusive or exclusive filter expression from results

How to create a filter expression from the JSON event slide-out panel

  1. In the results table, locate the event row of interest, and click it.

    • The JSON event slide-out panel will be shown.

  2. In the JSON event slide-out panel, hover over the value you'd like to create an inclusive or exclusive filter expression for.

  3. To create an inclusive filter, click the filter icon. To create an exclusive filter, click the filter-out icon.

  4. View the new filter expression in the search bar at the top of the window.

  5. To refresh the search results, click Search.

How to create a filter expression from a summary chart

  1. Within a summary chart, hover over a row value.

  2. To create an inclusive filter, click the filter icon. To create an exclusive filter, click the filter-out icon.

How to replace filter expressions with a value from results

How to replace filter expressions with a value from the results table

  1. In the results table, locate the event row of interest, and click it.

    • The JSON event slide-out panel will be shown.

  2. In the JSON event slide-out panel, hover over the field on which you'd like to pivot.

  3. Click the replace icon.

    • All existing filters are replaced with a filter expression representing only the key/value you pivoted on.

  4. To refresh the search results, click Search.

How to replace filter expressions with a value from summary charts

  1. Within a summary chart, hover over the value you would like to replace filter expression values with.

  2. Click the replace icon.

    • All existing filters are replaced with a filter expression representing only the key/value you pivoted on.

  3. To refresh the search results, click Search.

How to explore enrichment data for a value from results

How to explore enrichment data for a value from the JSON event slide-out panel

  1. In the results table, locate the event row of interest, and click it.

    • The JSON event slide-out panel will be shown.

  2. In the JSON event slide-out panel, hover over the value you'd like to explore enrichment data for.

  3. Click the enrichment icon.

  4. In the Lookup Enrichment pop-up modal, use the LOOKUP TABLE column to locate the row of the enrichment source you would like to explore, then click View JSON→.

    • The enrichment entry will be shown.

How to explore enrichment data for a value from a summary chart

  1. Within a summary chart, hover over a value for which you would like to explore enrichment data.

  2. Click the enrichment icon.

  3. In the Lookup Enrichment pop-up modal, use the LOOKUP TABLE column to locate the row of the enrichment source you would like to explore, then click View JSON→.

    • The enrichment entry will be shown.

Sharing a Search

While investigating or threat hunting, it may be useful to share a Search or a results set with your team. To do this:

  1. In the upper-right corner of the results table, click Share.

  2. Select one of the menu options:

    • Copy link to view: Copies a URL to this specific Search to your clipboard.

    • Download CSV: Downloads a CSV of the results table.

Note that the downloaded CSV file contains only the first 1,000 results. To download the full results set, use Data Explorer. (You can click Copy as SQL in Search to quickly recreate your search in Data Explorer.)

