pantherlog Tool
Panther provides a CLI tool, `pantherlog`, to help you work with custom logs. It parses logs using built-in or custom schemas, and it can infer custom schemas from sample logs. For information on working with custom logs in the Panther Console, see the Custom Logs documentation.
Note the following limitations:

- It will not mark any timestamp field as `isEventTime: true`. Make sure to select the appropriate timestamp field and mark it as `isEventTime: true`.
- It can infer only the following types of indicators: `ip`, `aws_arn`, `url`, and `mac`. Make sure to review the fields and add more indicators as appropriate.
- Make sure to review the generated schema and edit it appropriately before deploying it to your production environment.
Download the latest version at the following links:
You can use pantherlog's `list-schemas` command to list Panther's managed schemas:

```shell
./pantherlog list-schemas
```
You can use pantherlog's `export-schemas` command to export Panther's managed schemas into a local directory, or print them to the terminal:

```shell
./pantherlog export-schemas --path directory-name
```

- If `directory-name` does not exist, it will be created.
- Note that `-p` may be used in place of `--path`.
To print schemas to `stdout` instead of exporting to a local directory, use a dash:

```shell
./pantherlog export-schemas -p -
```
You can filter the schemas to be exported by using the `-s` option with the names of the schemas you'd like to export, separated by commas:

```shell
./pantherlog export-schemas --path ./managed-schemas -s 'AWS.ALB,Slack.AuditLogs'
```
You can use pantherlog to generate a schema file from sample files in newline-delimited JSON format. The tool scans the provided logs and prints the inferred schema to `stdout`. For example, to infer the schema of the logs in `sample_logs.jsonl` and write the output to `schema.yml`, use:

```shell
$ ./pantherlog infer sample_logs.jsonl > schema.yml
```

Note that YAML keys and values are case-sensitive. The tool will attempt to infer multiple timestamp formats.
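As an illustration of the expected input format, a `sample_logs.jsonl` file contains one JSON object per line. The events below are made up for the sake of the example:

```json
{"method": "GET", "path": "/-/metrics", "status": 200, "time": "2019-11-14T13:12:46.156Z"}
{"method": "POST", "path": "/users", "status": 201, "time": "2019-11-14T13:13:02.001Z"}
```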
*Figure: the workflow of inferring a schema from sample logs*
You can use the tool to validate a schema file and use it to parse log files. Note that the events in the log files need to be separated by newlines. Processed logs are written to `stdout` and errors to `stderr`. For example, to parse logs in `sample_logs.jsonl` with the log schema in `schema.yml`, use:

```shell
$ ./pantherlog parse --path schema.yml --schemas Schema.Name sample_logs.jsonl
```

The tool can also accept input via `stdin`, so it can be used in a pipeline:

```shell
$ cat sample_logs.jsonl | ./pantherlog parse --path schema.yml
```
You can use the tool to run unit tests, which you define for your custom schema in YAML files. To run tests defined in a `schema_tests.yml` file against a custom schema defined in `schema.yml`, use:

```shell
$ ./pantherlog test schema.yml schema_tests.yml
```

The first argument is a file or directory containing schema YAML files. The remaining arguments are test files to run. If you don't specify any test file arguments and the first argument is a directory, the tool will look for tests in YAML files with a `_tests.yml` suffix. For an example of writing multiple tests for one schema, see How can I write multiple pantherlog tests for a schema?.
Below is an example of a test using the previous JSON log sample, run against our inferred schema with the flag `isEventTime: true` added under the `time` field to ensure the correct timestamp is used.

schema_tests.yml:
```yaml
# Make sure to use camelCase when naming the schema or log type
name: Custom Log Test Name
logType: Custom.SampleLog.V1
input: |
  {
    "method": "GET",
    "path": "/-/metrics",
    "format": "html",
    "controller": "MetricsController",
    "action": "index",
    "status": 200,
    "params": [],
    "remote_ip": "1.1.1.1",
    "user_id": null,
    "username": null,
    "ua": null,
    "queue_duration_s": null,
    "correlation_id": "c01ce2c1-d9e3-4e69-bfa3-b27e50af0268",
    "cpu_s": 0.05,
    "db_duration_s": 0,
    "view_duration_s": 0.00039,
    "duration_s": 0.0459,
    "tag": "test",
    "time": "2019-11-14T13:12:46.156Z"
  }
result: |
  {
    "action": "index",
    "controller": "MetricsController",
    "correlation_id": "c01ce2c1-d9e3-4e69-bfa3-b27e50af0268",
    "cpu_s": 0.05,
    "db_duration_s": 0,
    "duration_s": 0.0459,
    "format": "html",
    "method": "GET",
    "path": "/-/metrics",
    "remote_ip": "1.1.1.1",
    "status": 200,
    "tag": "test",
    "time": "2019-11-14T13:12:46.156Z",
    "view_duration_s": 0.00039,
    "p_log_type": "Custom.SampleLog.V1",
    "p_row_id": "acde48001122a480ca9eda991001",
    "p_event_time": "2019-11-14T13:12:46.156Z",
    "p_parse_time": "2022-04-04T16:12:41.059224Z",
    "p_any_ip_addresses": [
      "1.1.1.1"
    ]
  }
```
schema.yml:

```yaml
version: 0
schema: Custom.SampleLog.V1
fields:
  - name: action
    required: true
    type: string
  - name: controller
    required: true
    type: string
  - name: correlation_id
    required: true
    type: string
  - name: cpu_s
    required: true
    type: float
  - name: db_duration_s
    required: true
    type: bigint
  - name: duration_s
    required: true
    type: float
  - name: format
    required: true
    type: string
  - name: method
    required: true
    type: string
  - name: path
    required: true
    type: string
  - name: remote_ip
    required: true
    type: string
    indicators:
      - ip
  - name: status
    required: true
    type: bigint
  - name: tag
    required: false
    type: string
  - name: time
    required: true
    type: timestamp
    timeFormats:
      - rfc3339
    isEventTime: true
  - name: view_duration_s
    required: true
    type: float
```
For information on uploading schemas via Panther Analysis Tool (PAT), see Custom Logs: Uploading log schemas with the Panther Analysis Tool.