Detections

Use detections to analyze data, run queries, and trigger alerts on suspicious behavior

Overview

Detections are Python functions that analyze your data, whether log events or cloud resource configurations, to identify suspicious behavior and trigger alerts. There are three types:
  • Rules: Python functions that detect suspicious activity in security logs in real time. For more information, see Rules and Scheduled Rules.
  • Scheduled Rules: Python functions that run against the results of scheduled queries on your data lake. For more information, see Rules and Scheduled Rules.
  • Policies: Python functions that scan and evaluate cloud infrastructure configurations to identify misconfigurations. For more information, see Policies.
When any of these detection types finds a match, it triggers an alert. Alerts are then routed to destinations based on the configuration of both the detection and the destination.
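For example, a real-time rule is a Python module that defines a rule(event) function returning True when an event should trigger an alert. The sketch below is illustrative only: the field names (eventName, mfaUsed, userName) are hypothetical and not tied to any particular log schema.

```python
# Minimal, illustrative rule. rule() receives one parsed log event and returns
# True to trigger an alert. The field names below are hypothetical examples.

def rule(event):
    # Alert on console logins that did not use MFA (illustrative condition)
    return event.get("eventName") == "ConsoleLogin" and not event.get("mfaUsed")


def title(event):
    # Optional: customize the alert title shown in Panther
    return f"Console login without MFA by {event.get('userName', 'unknown user')}"
```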

Getting started with detections

Choose a detection management workflow

You can create and manage Panther detections using one of the following methods:
  • The Panther Console: write, edit, and enable detections directly in the web UI.
  • A CI/CD workflow: manage detections as code in a Git repository and upload them to Panther.

Migrating to a CI/CD workflow

You can get started quickly by enabling Panther-managed Detection Packs in the Panther Console, but later on you may want to start using a CI/CD workflow. To migrate your workflow to CI/CD, follow the steps in CI/CD for Panther Content.
Managing detections via both the Panther Console and a Git-based workflow simultaneously may result in unexpected behavior.

Enable or write detections

The quickest way to start detecting threats with Panther is to turn on the already written Panther-managed detections that come with your Panther instance. These built-in rules and policies are applicable to various log sources, and Panther periodically releases improvements to their core detection logic. Panther-managed rules can be customized using Rule Filters, or you can clone them and edit the detection logic of the cloned version to suit your exact needs.
If you'd rather write your own detections from scratch, visit Writing and Editing Detections to learn how.

Working with your data in Panther

Schema definitions

Panther’s schemas describe the fields contained in your data, which makes it easier to understand how to interact with that data when writing a detection (a brief sketch follows the list below).
Schema definitions can be found in:
  • Panther's documentation
    • See schemas for each integration within the Supported Logs section of the documentation. Find more information about Custom Log schemas in Custom Logs.
  • The Panther Console
    • Log in to the Panther Console and navigate to Data > Schemas.
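Because the schema tells you how fields are named and nested, you know exactly what your detection code can reference. As a hedged sketch, assuming a log type whose schema defines a nested userIdentity object with an arn field (both names are illustrative):

```python
# Hypothetical sketch: "userIdentity" and "arn" stand in for whatever nested
# fields the schema of your log type actually defines.

def rule(event):
    # The schema shows how fields are nested, so you know how deep to reach
    return event.get("userIdentity", {}).get("arn", "").endswith(":root")
```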

Pulling samples out of Data Explorer

Data Explorer makes it easier to understand your data, where it is stored, and its types when writing Python code. It contains all the data Panther parses from your log sources, stored in tables.
To find example log events, search the table for the log type you want to write a detection for.
You can preview example table data without writing SQL: to generate a sample SQL query for a log source, click the eye icon next to the table type on the left side of Data Explorer.
When the query has produced results, you will see the example log events in the Results table. You can download these as a CSV file.
To copy the log event to be used in Unit Tests while writing detections, click View JSON.
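The copied JSON can be used directly as the test event for a unit test. The snippet below is only a quick local sanity check under assumed names (the console_login_without_mfa module and its fields are hypothetical); Panther's own unit test runner, in the Console or via panther_analysis_tool, is the supported way to test detections.

```python
import json

# Hypothetical: import the rule() function from the detection module sketched earlier
from console_login_without_mfa import rule

# JSON copied from Data Explorer via "View JSON" (illustrative event)
sample_event = json.loads("""
{
  "eventName": "ConsoleLogin",
  "mfaUsed": false,
  "userName": "alice"
}
""")

# Quick local sanity check: this event should trigger the rule
assert rule(sample_event) is True
```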

Detection features

Panther-managed Detections
Panther comes with a number of Panther-managed detections: detections whose core logic Panther has written and periodically updates. Using Panther-managed detections saves you the effort of writing your own, and provides the ongoing benefit of receiving improvements to the core detection logic over time as Panther releases new versions. For more information, see Using Panther-managed detections.
Detection Packs
Panther packs logically group and update detections via the Panther Console. Detection packs can group any number of Panther features, including but not limited to detections, queries, global helpers, data models, and Lookup Tables. Packs are defined in the open source panther-labs/panther-analysis repository. For more information, see Detection Packs.
Rule Filters
Rule Filters are conditional statements that are evaluated before a detection's rule function. You can apply Rule Filters to easily tune detections, including Panther-managed ones. A filter must return true (i.e., match the event) for the rule function to run. Based on the detection's log type, you select a field to filter on, then specify an operator and, if applicable, a value. To learn more, see Modifying Detections with Rule Filters.
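Conceptually, a filter behaves like an early-exit check that runs before your rule logic. Real filters are configured in the Console (field, operator, value), not written in Python; the sketch below, with hypothetical field names, only illustrates the effect a filter has:

```python
# Conceptual illustration only: a filter of the form
# "sourceIPAddress does not equal 10.0.0.1" behaves roughly like this early return.

def rule(event):
    # Filter stage: events that do not match are dropped before the rule logic runs
    if event.get("sourceIPAddress") == "10.0.0.1":
        return False

    # Rule logic runs only for events that passed the filter
    return event.get("eventName") == "DeleteTrail"
```
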
Testing
Panther's detection testing ensures that detections behave as expected and generate alerts correctly once deployed. Test inputs are used to determine whether an alert would be generated, which promotes reliability as code evolves and protects against regressions. For more information, see Testing.
Data Replay
Data Replay (Beta) allows rules to be tested against historical log data to preview the outcome of a rule before enabling it. Data Replay can simulate what types of alerts you are likely to receive before deploying the detection. For more information, see Data Replay.
Caching
Panther examines events one-by-one and provides a way to cache results across invocations. To accommodate stateful checks, Panther rules can cache values by using built-in helper functions. For more information, see Caching.
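As a hedged sketch of a stateful rule: the helper names below (increment_counter, set_key_expiration) are taken from the panther_oss_helpers global shipped with Panther's open source detection content, and the field names and thresholds are illustrative; confirm the exact module and function signatures in the Caching documentation for your Panther version.

```python
import time

# Assumed helper names from the panther_oss_helpers global; see the Caching docs
# for the exact module and signatures in your Panther version.
from panther_oss_helpers import increment_counter, set_key_expiration

THRESHOLD = 5          # alert after this many failed logins
WINDOW_SECONDS = 3600  # cached counter expires after one hour


def rule(event):
    # Illustrative field names; use the ones your log schema defines
    if event.get("eventName") != "ConsoleLogin" or event.get("loginStatus") != "FAILURE":
        return False

    # One counter per user; the cache persists across rule invocations
    key = f"failed-logins:{event.get('userName', 'unknown')}"
    count = increment_counter(key)
    set_key_expiration(key, int(time.time()) + WINDOW_SECONDS)

    return count >= THRESHOLD
```
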
Data Models
Panther's data models provide a way to configure a set of unified fields across all log types. Data models allow you to monitor particular fields across many log types at once, avoiding cumbersome and complex individual log monitoring. For more information, see Data Models.
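For instance, a rule can reference unified field names through the event's udm() accessor rather than hard-coding per-log-type field names. The data model field names below (actor_user, source_ip) are illustrative; your configured data models define which names are available.

```python
# Hedged sketch: event.udm() resolves a unified data model field to the
# log-type-specific field it maps to. Field names here are illustrative.

ALLOWED_IPS = {"203.0.113.10", "203.0.113.11"}


def rule(event):
    # Works across any log type whose data model maps these fields
    return event.udm("actor_user") == "root" and event.udm("source_ip") not in ALLOWED_IPS
```
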
Global Helper Functions
Panther supports the common programming pattern of extracting repeated code into helper functions via the global analysis type. Import global helper functions in your detections by adding the relevant import statements at the top of your detection code, then call the global function as if it were any other Python library. For more information, see Global Helper Functions.
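As a brief sketch, assuming the panther_base_helpers global and its deep_get() helper from Panther's open source detection content (the nested field path shown is illustrative):

```python
# Import a global helper the same way you would any other Python module
from panther_base_helpers import deep_get


def rule(event):
    # deep_get safely walks nested fields, returning the default if any key is missing
    # (the field path below is illustrative)
    return deep_get(event, "userIdentity", "sessionContext", "mfaAuthenticated", default="false") == "false"
```
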
Framework Mapping
Panther supports the ability to map rules, policies, and scheduled rules to compliance frameworks (including MITRE ATT&CK®) to track coverage against that framework. Reports can be mapped to your detection within the Detections > All Detections navigation section of the Panther Console. For more information, see Framework Mapping and MITRE ATT&CK® Matrix.