pantherlog Tool

pantherlog is a CLI tool to help you work with custom logs.

Overview

You can use pantherlog, a CLI tool, to work with Custom Logs. It parses logs using Panther-managed or custom schemas, and uses sample logs to infer custom schemas.

For information on working with custom logs in the Panther Console instead, see the Custom Logs documentation.

pantherlog limitations

Note the following limitations:

  • It will not mark any timestamp field as isEventTime:true. Make sure to select the appropriate timestamp field and mark it as isEventTime:true.

    • For more information regarding isEventTime:true see timestamp.

  • It is able to infer only the following types of indicators: ip, aws_arn, url, email, hash digests (MD5, SHA1, and SHA2), and mac. Make sure to review the fields and add more indicators as appropriate.

  • Make sure to review the generated schema and edit it appropriately before deploying it to your production environment.

Download

Download the latest version at the following links:

  • Windows

  • Darwin/macOS

    • amd64 (Intel)

    • arm64 (Apple Silicon)

list-schemas: List Panther-managed schemas

You can use pantherlog's list-schemas command to list Panther's managed schemas.
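For example (assuming the pantherlog binary is on your PATH):

```shell
# List all Panther-managed schemas by name
pantherlog list-schemas
```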

export-schemas: Export Panther-managed schemas

You can use pantherlog's export-schemas command to export Panther-managed schemas into a local directory, or print them in the terminal.

Export schemas to local directory

  • If directory-name does not exist, it will be created.

  • Note that -p may be used in place of --path.
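Putting the options above together, an export to a local directory might look like this (the directory name is illustrative):

```shell
# Export all Panther-managed schemas into ./schemas
# (the directory is created if it does not exist)
pantherlog export-schemas --path ./schemas
```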

To print schemas to stdout instead of exporting to a local directory, use a dash (-) as the path.
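For example (assuming the dash is passed as the path value):

```shell
# Print all Panther-managed schemas to stdout
pantherlog export-schemas --path -
```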

Export select schemas

You can filter the schemas to be exported by using the -s option with the names of the schemas you'd like to export, separated by commas.
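A sketch of a filtered export; the schema names here are illustrative:

```shell
# Export only the named schemas, comma-separated
pantherlog export-schemas -s "AWS.ALB,AWS.VPCFlow" --path ./schemas
```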

infer: Generate a schema from JSON log samples

You can use pantherlog to generate a schema file out of sample files in newline-delimited JSON (NDJSON) format. The tool will scan the provided logs and print the inferred schema to stdout.

For example, to infer the schema of the logs in sample_logs.jsonl and output it to schema.yml, use:
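Following the filenames given above:

```shell
# Infer a schema from newline-delimited JSON samples
# and redirect the output to a file
pantherlog infer sample_logs.jsonl > schema.yml
```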

Note that YAML keys and values are case-sensitive. The tool will attempt to infer multiple timestamp formats.

The workflow of inferring a schema from sample logs

parse: Validate a schema

You can use the tool to validate a schema file and use it to parse log files. Note that the events in the log files must be separated by newlines. Processed logs are written to stdout and errors to stderr.

For example, to parse logs in sample_logs.jsonl with the log schema in schema.yml, use:
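Following the filenames above (the exact flag set may vary by pantherlog version):

```shell
# Validate schema.yml and use it to parse the sample logs;
# parsed events go to stdout, errors to stderr
pantherlog parse --path schema.yml sample_logs.jsonl
```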

The tool can also accept input via stdin so it can be used in a pipeline:
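For example, as a sketch (flags may vary by version):

```shell
# Feed events to pantherlog via stdin as part of a pipeline
cat sample_logs.jsonl | pantherlog parse --path schema.yml
```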

test: Run tests for a schema

You can use pantherlog to run unit tests for your custom schema. To run tests defined in a schema_tests.yml file for a custom schema defined in schema.yml, you would run:
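Following the filenames above:

```shell
# Run the tests in schema_tests.yml against the schema in schema.yml
pantherlog test schema.yml schema_tests.yml
```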

The first argument is a file or directory containing schema YAML files. The rest of the arguments are test files to run. If you don't specify any test file arguments and the first argument is a directory, the tool will look for tests in YAML files with a _tests.yml or _tests.yaml suffix.

For an example of writing multiple tests for one schema, see this article in Panther's Knowledge Base: How can I write multiple pantherlog tests for a schema?

In your test file, include an input key containing the event to parse, and a result key containing the expected result. The test command checks that the schema can parse the event without error, and that the normalized event matches your expected result.

Example:
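A minimal test file might look like the following sketch. Only the input and result keys are described above; the name and logType keys, the log type, the field names, and the Panther-added p_ fields shown are illustrative assumptions:

```yaml
# Illustrative test case; keys other than input/result are assumptions
name: parses a sample event
logType: Custom.SampleAPI
input: |
  {"time": "2023-01-01T00:00:00Z", "remote_ip": "10.0.0.1"}
result: |
  {
    "time": "2023-01-01T00:00:00Z",
    "remote_ip": "10.0.0.1",
    "p_log_type": "Custom.SampleAPI",
    "p_event_time": "2023-01-01T00:00:00Z",
    "p_any_ip_addresses": ["10.0.0.1"]
  }
```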

Uploading schemas via PAT

For information on uploading schemas via Panther Analysis Tool (PAT), see Custom Logs: Uploading log schemas with the Panther Analysis Tool.
