Writing and Editing Detections
You can write your own Python detections in the Panther Console or locally, following the CI/CD workflow. This page contains detection writing examples and best practices, available auxiliary functions, and guidance on how to configure a detection dynamically.
Before you write a new detection, see if there's a Panther-managed detection that meets your needs (or almost meets your needs—Panther-managed rules can be tuned with Rule Filters). Leveraging a Panther-managed detection not only saves you from the effort of writing one yourself, but also provides the ongoing benefit of continuous updates to core detection logic, as Panther releases new versions.
Python Enhancement Proposals (PEPs) publish guidance on writing clean, effective, and consistently styled Python code. For example, you can use autopep8 to automatically format your detections so they all follow a consistent style.
The following Python libraries are available in Panther, in addition to boto3, which is provided by AWS Lambda:

Package | Version | Description | License |
jsonpath-ng | 1.5.2 | JSONPath Implementation | Apache v2 |
policyuniverse | 1.3.3.20210223 | Parse AWS ARNs and Policies | Apache v2 |
requests | 2.23.0 | Easy HTTP Requests | Apache v2 |
See the Detection Functions section below to view all available functions within a Panther Detection.
The only required function is def rule(event), but other functions make your Alerts more dynamic. See the section Configuring Detection functions dynamically for examples and more information about the different functions.

def rule(event):
    if event.get("Something"):
        return True
    return False
return True triggers an alert, while return False does not.

Before enabling new detections, it is recommended that you write tests defining scenarios in which alerts should or should not be generated. Best practice is to include at least one positive and one negative test for reliability.
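Conceptually, each test passes a sample event to the rule() function and checks the expected result. The stand-alone sketch below (not Panther's actual test format; it simply reuses the rule() from the example above) shows one positive and one negative case:

# Positive test: an event that should trigger an alert
assert rule({"Something": "value"}) is True

# Negative test: an event that should not trigger an alert
assert rule({"other_field": "value"}) is False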
Lookups for event fields are not case sensitive: event.get("Event_Type") and event.get("event_type") return the same result.

Top-level fields are the parent fields in a nested data structure. For example, a record may contain a field called user, under which there are other fields such as ip_address. In this case, user is the top-level field and ip_address is a nested field underneath it. Nesting can occur many layers deep, so it is valuable to understand the schema structure and know how to access a given field in a detection.
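For example, a hypothetical event with that shape might look like:

{
    "user": {
        "name": "alice",
        "ip_address": "192.0.2.1"
    }
}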
Basic rules match a field's value in the event, and a best practice to avoid errors is to use Python's built-in get() function. get() looks for a field and, if the field doesn't exist, returns None instead of raising an error, which causes the detection to return False.

The example below follows this best practice:

def rule(event):
    return event.get('field') == 'value'
In the example below, if the field exists, its value is returned; otherwise, False is returned:

def rule(event):
    if event.get('field'):
        return event.get('field')
    return False
Bad practice example
The example below is bad practice because it accesses the field directly with bracket notation. If the field doesn't exist, Python raises a KeyError:

def rule(event):
    return event['field'] == 'value'
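To illustrate the difference in plain Python (the field names here are hypothetical):

event = {"other_field": "value"}

event.get("field")   # returns None, so a comparison against it simply evaluates to False
event["field"]       # raises KeyError, so the detection errors instead of returning False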
Once many detections are written, a set of patterns and repeated code will begin to emerge. This is a great use case for Global Helper functions, which provide a centralized location for this logic across all detections. For example, see the deep_get() function referenced in the next section.

If the field is nested deep within the event, use the Panther-supplied deep_get() function to safely access the field's value. deep_get() must be imported from the panther_base_helpers library.

deep_get() takes two or more arguments:
- The event object itself (required)
- The top-level field name (required)
- Any nested field names, in order (as many as needed)
Example:
AWS CloudTrail logs nest the type of user accessing the console underneath userIdentity.

JSON for CloudTrail root activity:
{
    "eventVersion": "1.05",
    "userIdentity": {
        "type": "Root",
        "principalId": "1111",
        "arn": "arn:aws:iam::123456789012:root",
        "accountId": "123456789012",
        "userName": "root"
    },
    ...
}
Here is how you could check that value safely with deep_get:

from panther_base_helpers import deep_get

def rule(event):
    return deep_get(event, "userIdentity", "type") == "Root"
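For reference, a deep_get-style lookup behaves roughly like the simplified sketch below. This is only an illustration of the idea, not the actual helper implementation:

def deep_get_sketch(event, *keys, default=None):
    # Walk the nested dictionaries one key at a time,
    # returning the default as soon as a level is missing.
    value = event
    for key in keys:
        if not isinstance(value, dict):
            return default
        value = value.get(key)
        if value is None:
            return default
    return value

# For the JSON above, deep_get_sketch(event, "userIdentity", "type") returns "Root".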
You may want to know when a specific event has occurred. If it did occur, then the detection should trigger an alert. Since Panther stores everything as normalized JSON, you can check the value of a field against the criteria you specify.
For example, to detect the action of granting Box technical support access to your Box account, the Python below matches events where the event_type equals ACCESS_GRANTED:

def rule(event):
    return event.get("event_type") == "ACCESS_GRANTED"
If the event_type field equals ACCESS_GRANTED, the rule function returns True and an Alert is created.

You may need to compare the value of a field against integers. This lets you use any of Python's built-in comparisons against your events.
For example, you can create an alert based on HTTP response status codes:
# returns True if 'status_code' equals 404
def rule(event):
    if event.get("status_code"):
        return event.get("status_code") == 404
    else:
        return False
# returns True if 'status_code' is greater than 404
def rule(event):
    if event.get("status_code"):
        return event.get("status_code") > 404
    else:
        return False
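The same check can also be written more compactly by giving get() a default value, so a missing field simply fails the comparison:

# returns True if 'status_code' is greater than 404; a missing field defaults to 0
def rule(event):
    return event.get("status_code", 0) > 404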
Data Models provide a way to configure a set of unified fields across all log types. By default, Panther comes with built-in Data Models for several log types. Custom Data Models can be added in the Panther Console or via the Panther Analysis Tool.
event.udm() can only be used with log types that have an existing Data Model in your Panther environment.

Example:
import panther_event_type_helpers as event_type
def rule(event):
    # filter events on the unified data model field 'event_type'
    return event.udm("event_type") == event_type.FAILED_LOGIN
The and keyword is a logical operator used to combine conditional statements. Matching multiple fields in an event often requires the and keyword. When using and, all statements must be true:

string_a == "this" and string_b == "that"
Example:
To track down successful root user access to the AWS console, you need to look at several fields:
from panther_base_helpers import deep_get
def rule(event):
    return (event.get("eventName") == "ConsoleLogin" and
            deep_get(event, "userIdentity", "type") == "Root" and
            deep_get(event, "responseElements", "ConsoleLogin") == "Success")
The or keyword is a logical operator used to combine conditional statements. When using or, at least one of the statements must be true:

string_a == "this" or string_b == "that"
Example:
This example detects whether the port number is either 80 or 22:
# returns True if 'port_number' is 80 or 22
def rule(event):
    return event.get("port_number") == 80 or event.get("port_number") == 22
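The same check can also be written as a membership test, which reads well as the list of allowed values grows:

# returns True if 'port_number' is 80 or 22
def rule(event):
    return event.get("port_number") in {80, 22}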
Comparing events against a list of IP addresses, domains, users, and so on is straightforward in Python. A common pattern is to suppress the alert when the field being checked appears in an allow list, which reduces false positives for known behavior in your environment.

Example: If you have a list of allowed IP addresses but want to know when an IP address outside that list appears, we recommend using a Python set. Sets are similar to Python lists and tuples, but offer much faster membership checks.
# Set - Recommended over tuples or lists for performance
ALLOW_IP = {'192.0.0.1', '192.0.0.2', '192.0.0.3'}
def rule(event):
    return event.get("ip_address") not in ALLOW_IP
In the example below, we use the Panther helper pattern_match_list:

from panther_base_helpers import pattern_match_list
USER_CREATE_PATTERNS = [
    "chage",   # user password expiry
    "passwd",  # change passwords for users
    "user*",   # create, modify, and delete users
]
def rule(event):
    # Filter the events
    if event.get("event") != "session.command":
        return False
    # Check that the program matches our list above
    return pattern_match_list(event.get("program", ""), USER_CREATE_PATTERNS)
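pattern_match_list checks a value against a list of shell-style wildcard patterns, so "user*" above matches commands such as useradd or usermod. A rough stand-in, assuming fnmatch-style matching (an illustration, not the helper's actual implementation), would be:

from fnmatch import fnmatch

def pattern_match_list_sketch(value, patterns):
    # Return True if the value matches any of the wildcard patterns.
    return any(fnmatch(value, pattern) for pattern in patterns)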
If you want to match events using regular expressions (for example, to match subdomains, file paths, or a prefix/suffix of a string), import Python's re library and search for a matching value.

In the example below, the regex pattern matches Administrator or administrator against the nested value of the privilegeGranted field.
import re
from panther_base_helpers import deep_get

# The regex pattern is stored in a variable.
# Note: compiling it once here performs better than compiling it inside the rule function,
# which is evaluated on each event.
ADMIN_PATTERN = re.compile(r"[aA]dministrator")

def rule(event):
    # Use deep_get to pull out the nested value under the "privilegeGranted" field,
    # defaulting to an empty string if the field is missing.
    value_to_search = deep_get(event, "debugContext", "debugData", "privilegeGranted", default="")
    # Use the compiled regex to check the value; return True if there is a match.
    return bool(ADMIN_PATTERN.search(value_to_search))
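As a design note, the same case-insensitive intent can be expressed with a regex flag instead of a character class:

# Matches "administrator" in any letter case
ADMIN_PATTERN = re.compile(r"administrator", re.IGNORECASE)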
In the example below, we use the Panther helper pattern_match:

from panther_base_helpers import pattern_match
def rule(event):
    return pattern_match(event.get("operation", ""), "REST.*.OBJECT")