# Panther Analysis Tool Commands

## Overview

You can manage your Panther detection content using the Panther Analysis Tool (PAT). PAT lets you [upload](#upload-uploading-packages-to-panther-directly), [test](#test-running-tests-with-pat), and [delete](#delete-deleting-rules-policies-or-saved-queries-with-pat) assets, among other actions.

Each of the PAT commands accepts certain [options](#pat-command-options-sub-commands). For example, you can use [--filter](#filtering-pat-commands) with several of the commands to narrow the scope of the action.

## PAT commands

The following codeblock shows the full list of available PAT commands. Beneath it, you can find additional information about several of the commands.

To understand which Panther permissions you need to execute each PAT command, see [Permissions required per command](#permissions-required-per-command).

```
% panther_analysis_tool -h
usage: panther_analysis_tool [-h] [--version] [--debug] {release,test,publish,upload,delete,update-custom-schemas,test-lookup-table,validate,zip,check-connection,sdk} ...

Panther Analysis Tool: A command line tool for managing Panther policies and rules.

positional arguments:
  {release,test,publish,upload,delete,update-custom-schemas,test-lookup-table,validate,zip,check-connection,sdk}
    release             Create release assets for repository containing panther detections. Generates a file called panther-analysis-all.zip and optionally generates
                        panther-analysis-all.sig
    test                Validate analysis specifications and run policy and rule tests.
    publish             Publishes a new release, generates the release assets, and uploads them. Generates a file called panther-analysis-all.zip and optionally generates
                        panther-analysis-all.sig
    upload              Upload specified policies and rules to a Panther deployment.
    delete              Delete policies, rules, or saved queries from a Panther deployment
    update-custom-schemas
                        Update or create custom schemas on a Panther deployment.
    test-lookup-table   Validate a Lookup Table spec file.
    validate            Validate your bulk uploads against your panther instance
    zip                 Create an archive of local policies and rules for uploading to Panther.
    check-connection    Check your Panther API connection
    sdk                 Perform operations using the Panther SDK exclusively (pass sdk --help for more)

options:
  -h, --help            show this help message and exit
  --version             show program's version number and exit
  --debug
```

### `test`: Running tests with PAT

Use PAT to load the defined specification files and evaluate unit tests locally:

```bash
panther_analysis_tool test --path <folder-name>
```

To filter rules or policies based on certain attributes:

```bash
panther_analysis_tool test --path <folder-name> --filter RuleID=Category.Behavior.MoreInfo
```
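For example, in a local clone of the `panther-analysis` repository you might run the following (the directory names below are illustrative, not required by PAT):

```shell
# Run all unit tests under a single rules directory (example path)
panther_analysis_tool test --path rules/aws_cloudtrail_rules/

# Narrow the same run to high-severity detections only
panther_analysis_tool test --path rules/ --filter Severity=High
```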

### `validate`: Ensuring detection content is ready to be uploaded

The `validate` command verifies your detection content is ready to be uploaded to your Panther instance by running the same checks that happen during the upload process. Because some of these checks require configuration information in your Panther instance, `validate` makes an API call.

To validate your detections against your Panther instance using PAT:

1. If you have not already, [generate an API token in your Panther Console](https://www.notion.so/panther-developer-workflows/api#step-2-create-an-api-token).
2. Run the following command:

   ```bash
   panther_analysis_tool validate --path <path-to-your-detections> --api-token <your-api-token> --api-host https://api.<your-panther-instance-name>.runpanther.net/public/graphql
   ```

   * You may exclude the `--api-token` and `--api-host` options if you are [setting configuration values](https://www.notion.so/panther-developer-workflows/ci-cd/deployment-workflows/pat/install-configure-and-authenticate-with-pat#configuring-pat) another way, e.g., by using environment variables or a configuration file.
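If you configure PAT through environment variables, a minimal sketch looks like the following. The variable names `PANTHER_API_TOKEN` and `PANTHER_API_HOST` are assumptions here; confirm the exact names against the PAT configuration documentation linked above.

```shell
# Assumed environment variable names -- verify against the PAT configuration docs
export PANTHER_API_TOKEN="<your-api-token>"
export PANTHER_API_HOST="https://api.<your-panther-instance-name>.runpanther.net/public/graphql"

# With the variables set, the command line no longer needs the API options
panther_analysis_tool validate --path <path-to-your-detections>
```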

### `zip`: Creating a package to upload to the Panther Console

To create a package for uploading manually to the Panther Console, run the following command:

```
$ panther_analysis_tool zip --path tests/fixtures/valid_policies/ --out tmp
[INFO]: Testing analysis packs in tests/fixtures/valid_policies/

AWS.IAM.MFAEnabled
	[PASS] Root MFA not enabled fails compliance
	[PASS] User MFA not enabled fails compliance

[INFO]: Zipping analysis packs in tests/fixtures/valid_policies/ to tmp
[INFO]: <current working directory>/tmp/panther-analysis-2020-03-23T12-48-18.zip
```

#### Uploading content in the Panther Console

1. In the left-hand navigation of the Panther Console, click **Build > Bulk Uploader**.
2. Drag and drop your .zip file onto the page, or click **Select file**.

<figure><img src="https://4011785613-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LgdiSWdyJcXPahGi9Rs-2910905616%2Fuploads%2FvaC3D84AQgDrJeU7234q%2FScreenshot%202023-06-23%20at%2011.31.12%20AM.png?alt=media&#x26;token=bad69bfd-73ca-4678-9ee5-2b72e9fd4eb0" alt="In the Panther Console&#x27;s Bulk Uploader, you can drag and drop a zip file or select a file."><figcaption></figcaption></figure>

### `upload`: Uploading packages to Panther directly

{% hint style="info" %}
Starting with PAT version 0.22.0, if you have authenticated with an API token, the `upload` command automatically performs an asynchronous bulk upload to prevent timeout issues.

If you did not authenticate with an API token, you can use the `--batch` option instead, which is available in PAT versions after 0.19.0.
{% endhint %}
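On a PAT version after 0.19.0 where you have not authenticated with an API token, a batched upload might look like the following sketch (the path is illustrative):

```shell
# Break the upload into multiple zip files to avoid timeouts (example path)
panther_analysis_tool upload --path rules/ --batch
```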

To upload detections to your Panther instance using PAT:

1. If you have not already, [generate an API token in your Panther Console](https://docs.panther.com/~/changes/15ann7vKLltCCAGHtdQr/api#step-2-create-an-api-token).
2. Run `panther_analysis_tool test` to ensure your unit tests are passing.
3. Run the following command:\
   `panther_analysis_tool upload --path <path-to-your-detections> --api-token <your-api-token> --api-host https://api.<your-panther-instance-name>.runpanther.net/public/graphql`
   * You may exclude the `--api-token` and `--api-host` options if you are [setting configuration values](https://docs.panther.com/~/changes/15ann7vKLltCCAGHtdQr/panther-developer-workflows/ci-cd/deployment-workflows/install-configure-and-authenticate-with-pat#configuring-pat) another way, e.g., by using environment variables or a configuration file.

When using `upload`, detections and Lookup Tables with existing IDs are overwritten. Locally deleted detections are not automatically deleted in your Panther instance; they must be removed with the [`delete`](#delete-deleting-rules-policies-or-saved-queries-with-pat) command (or deleted manually in your Panther Console). For CLI-driven workflows, it's recommended to set a detection's `Enabled` property to `false` instead of deleting it.

### `delete`: Deleting Rules, Policies, or Saved Queries with PAT

While `panther_analysis_tool upload --path <directory>` uploads everything from `<directory>`, removing a local file from `<directory>` does not delete the corresponding content from your Panther instance. Instead, use the `panther_analysis_tool delete` command to explicitly delete detections from your Panther instance.

To delete a specific detection, run the following command:

```
panther_analysis_tool delete --analysis-id MyRuleId
```

This command interactively asks for confirmation before deleting the detection. To delete without confirming, use the following command:

```
panther_analysis_tool delete --analysis-id MyRuleId --no-confirm
```

You can delete up to 1000 detections at once with PAT.
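Because `--analysis-id` and `--query-id` both accept space-separated lists, you can delete several items in a single call. A sketch with illustrative IDs:

```shell
# Delete two detections and one saved query without prompting (example IDs)
panther_analysis_tool delete --analysis-id My.Rule.One My.Rule.Two --query-id My.Saved.Query --no-confirm
```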

### Permissions required per command

Below is a mapping of permissions required for each command.

<table><thead><tr><th width="313">Command</th><th>Required permission(s)</th></tr></thead><tbody><tr><td>check-connection</td><td>Read Panther Settings Info</td></tr><tr><td>upload</td><td>Bulk Upload</td></tr><tr><td>delete</td><td>Manage Policies<br>Manage Rules<br>Manage Saved Queries</td></tr><tr><td>update-custom-schemas</td><td>View Log Sources<br>Manage Log Sources</td></tr></tbody></table>

## PAT command options (sub commands)

See the options for each of the PAT commands in the tabs below.

{% tabs %}
{% tab title="release" %}

```
$ panther_analysis_tool release -h
usage: panther_analysis_tool release [-h] [--api-token API_TOKEN] [--api-host API_HOST] [--aws-profile AWS_PROFILE] [--filter KEY=VALUE [KEY=VALUE ...]]
                                     [--ignore-files IGNORE_FILES [IGNORE_FILES ...]] [--kms-key KMS_KEY] [--minimum-tests MINIMUM_TESTS] [--out OUT] [--path PATH]
                                     [--skip-tests] [--skip-disabled-tests] [--available-destination AVAILABLE_DESTINATION] [--sort-test-results]
                                     [--ignore-table-names]

optional arguments:
  -h, --help            show this help message and exit
  --api-token API_TOKEN
                        The Panther API token to use. See: https://docs.panther.com/api-beta
  --api-host API_HOST   The Panther API host to use. See: https://docs.panther.com/api-beta
  --aws-profile AWS_PROFILE
                        The AWS profile to use when updating the AWS Panther deployment.
  --filter KEY=VALUE [KEY=VALUE ...]
  --ignore-files IGNORE_FILES [IGNORE_FILES ...]
                        Relative path to files in this project to be ignored by panther-analysis tool, space separated. Example ./foo.yaml ./bar/baz.yaml
  --kms-key KMS_KEY     The key id to use to sign the release asset.
  --minimum-tests MINIMUM_TESTS
                        The minimum number of tests in order for a detection to be considered passing. If a number greater than 1 is specified, at least one True and
                        one False test is required.
  --out OUT             The path to store output files.
  --path PATH           The relative path to Panther policies and rules.
  --skip-tests
  --skip-disabled-tests
  --available-destination AVAILABLE_DESTINATION
                        A destination name that may be returned by the destinations function. Repeat the argument to define more than one name.
  --sort-test-results   Sort test results by whether the test passed or failed (passing tests first), then by rule ID
  --ignore-table-names  Allows skipping of table names from schema validation. Useful when querying non-Panther or non-Snowflake tables
  --valid-table-names VALID_TABLE_NAMES [VALID_TABLE_NAMES ...]
                        Fully qualified table names that should be considered valid during schema validation (in addition to standard Panther/Snowflake tables), space
                        separated. Accepts '*' as wildcard character matching 0 or more characters. Example foo.bar.baz bar.baz.* foo.*bar.baz baz.* *.foo.*
```

{% endtab %}

{% tab title="test" %}

```
$ panther_analysis_tool test -h   
usage: panther_analysis_tool test [-h] [--filter KEY=VALUE [KEY=VALUE ...]] [--minimum-tests MINIMUM_TESTS] [--path PATH] [--ignore-extra-keys IGNORE_EXTRA_KEYS]
                                  [--ignore-files IGNORE_FILES [IGNORE_FILES ...]] [--skip-disabled-tests] [--available-destination AVAILABLE_DESTINATION]
                                  [--sort-test-results] [--ignore-table-names]

optional arguments:
  -h, --help            show this help message and exit
  --filter KEY=VALUE [KEY=VALUE ...]
  --minimum-tests MINIMUM_TESTS
                        The minimum number of tests in order for a detection to be considered passing. If a number greater than 1 is specified, at least one True and
                        one False test is required.
  --path PATH           The relative path to Panther policies and rules.
  --ignore-extra-keys IGNORE_EXTRA_KEYS
                        Meant for advanced users; allows skipping of extra keys from schema validation.
  --ignore-files IGNORE_FILES [IGNORE_FILES ...]
                        Relative path to files in this project to be ignored by panther-analysis tool, space separated. Example ./foo.yaml ./bar/baz.yaml
  --skip-disabled-tests
  --available-destination AVAILABLE_DESTINATION
                        A destination name that may be returned by the destinations function. Repeat the argument to define more than one name.
  --sort-test-results   Sort test results by whether the test passed or failed (passing tests first), then by rule ID
  --ignore-table-names  Allows skipping of table names from schema validation. Useful when querying non-Panther or non-Snowflake tables
  --valid-table-names VALID_TABLE_NAMES [VALID_TABLE_NAMES ...]
                        Fully qualified table names that should be considered valid during schema validation (in addition to standard Panther/Snowflake tables), space
                        separated. Accepts '*' as wildcard character matching 0 or more characters. Example foo.bar.baz bar.baz.* foo.*bar.baz baz.* *.foo.*
```

{% endtab %}

{% tab title="upload" %}

```
$ panther_analysis_tool upload -h
usage: panther_analysis_tool upload [-h] [--max-retries MAX_RETRIES] [--api-token API_TOKEN] [--api-host API_HOST] [--aws-profile AWS_PROFILE]
                                    [--filter KEY=VALUE [KEY=VALUE ...]] [--minimum-tests MINIMUM_TESTS] [--out OUT] [--path PATH] [--skip-tests]
                                    [--skip-disabled-tests] [--ignore-extra-keys IGNORE_EXTRA_KEYS] [--ignore-files IGNORE_FILES [IGNORE_FILES ...]]
                                    [--available-destination AVAILABLE_DESTINATION] [--sort-test-results] [--batch] [--no-async]
                                    [--ignore-table-names] [--valid-table-names VALID_TABLE_NAMES [VALID_TABLE_NAMES ...]]

optional arguments:
  -h, --help            show this help message and exit
  --max-retries MAX_RETRIES
                        Retry to upload on a failure for a maximum number of times
  --api-token API_TOKEN
                        The Panther API token to use. See: https://docs.panther.com/api-beta
  --api-host API_HOST   The Panther API host to use. See: https://docs.panther.com/api-beta
  --aws-profile AWS_PROFILE
                        The AWS profile to use when updating the AWS Panther deployment.
  --filter KEY=VALUE [KEY=VALUE ...]
  --minimum-tests MINIMUM_TESTS
                        The minimum number of tests in order for a detection to be considered passing. If a number greater than 1 is specified, at
                        least one True and one False test is required.
  --out OUT             The path to store output files.
  --path PATH           The relative path to Panther policies and rules.
  --skip-tests
  --skip-disabled-tests
  --ignore-extra-keys IGNORE_EXTRA_KEYS
                        Meant for advanced users; allows skipping of extra keys from schema validation.
  --ignore-files IGNORE_FILES [IGNORE_FILES ...]
                        Relative path to files in this project to be ignored by panther-analysis tool, space separated. Example ./foo.yaml
                        ./bar/baz.yaml
  --available-destination AVAILABLE_DESTINATION
                        A destination name that may be returned by the destinations function. Repeat the argument to define more than one name.
  --sort-test-results   Sort test results by whether the test passed or failed (passing tests first), then by rule ID
  --batch               When set your upload will be broken down into multiple zip files
  --no-async            When set your upload will be synchronous
  --ignore-table-names  Allows skipping of table name validation from schema validation. Useful when querying non-Panther or non-Snowflake tables
  --valid-table-names VALID_TABLE_NAMES [VALID_TABLE_NAMES ...]
                        Fully qualified table names that should be considered valid during schema validation (in addition to standard
                        Panther/Snowflake tables), space separated. Accepts '*' as wildcard character matching 0 or more characters. Example
                        foo.bar.baz bar.baz.* foo.*bar.baz baz.* *.foo.*
```

{% endtab %}

{% tab title="delete" %}

```
$ panther_analysis_tool delete -h
usage: panther_analysis_tool delete [-h] [--no-confirm] [--athena-datalake] [--api-token API_TOKEN] [--api-host API_HOST] [--aws-profile AWS_PROFILE] [--analysis-id ANALYSIS_ID [ANALYSIS_ID ...]] [--query-id QUERY_ID [QUERY_ID ...]]

optional arguments:
  -h, --help            show this help message and exit
  --no-confirm          Skip manual confirmation of deletion
  --athena-datalake     Instance DataLake is backed by Athena
  --api-token API_TOKEN
                        The Panther API token to use. See: https://docs.panther.com/api-beta
  --api-host API_HOST   The Panther API host to use. See: https://docs.panther.com/api-beta
  --aws-profile AWS_PROFILE
                        The AWS profile to use when updating the AWS Panther deployment.
  --analysis-id ANALYSIS_ID [ANALYSIS_ID ...]
                        Space separated list of Detection IDs
  --query-id QUERY_ID [QUERY_ID ...]
                        Space separated list of Saved Queries

```

{% endtab %}

{% tab title="update-custom-schemas" %}

```
panther_analysis_tool update-custom-schemas -h
usage: panther_analysis_tool update-custom-schemas [-h] [--api-token API_TOKEN] [--api-host API_HOST] [--aws-profile AWS_PROFILE] [--path PATH]

optional arguments:
  -h, --help            show this help message and exit
  --api-token API_TOKEN
                        The Panther API token to use. See: https://docs.panther.com/api-beta
  --api-host API_HOST   The Panther API host to use. See: https://docs.panther.com/api-beta
  --aws-profile AWS_PROFILE
                        The AWS profile to use when updating the AWS Panther deployment.
  --path PATH           The relative or absolute path to Panther custom schemas.

```

{% endtab %}

{% tab title="test-lookup-table" %}

```
panther_analysis_tool test-lookup-table -h
usage: panther_analysis_tool test-lookup-table [-h]
                                               [--aws-profile AWS_PROFILE]
                                               --path PATH

optional arguments:
  -h, --help            show this help message and exit
  --aws-profile AWS_PROFILE
                        The AWS profile to use when updating the AWS Panther
                        deployment.
  --path PATH           The relative path to a lookup table input file.

```

{% endtab %}

{% tab title="validate" %}

```
panther_analysis_tool validate -h
usage: panther_analysis_tool validate [-h] [--api-token API_TOKEN] [--api-host API_HOST] [--filter KEY=VALUE [KEY=VALUE ...]] [--ignore-files IGNORE_FILES [IGNORE_FILES ...]] [--path PATH]

options:
  -h, --help            show this help message and exit
  --api-token API_TOKEN
                        The Panther API token to use. See: https://docs.panther.com/api-beta
  --api-host API_HOST   The Panther API host to use. See: https://docs.panther.com/api-beta
  --filter KEY=VALUE [KEY=VALUE ...]
  --ignore-files IGNORE_FILES [IGNORE_FILES ...]
                        Relative path to files in this project to be ignored by panther-analysis tool, space separated. Example ./foo.yaml ./bar/baz.yaml
  --path PATH           The relative path to Panther policies and rules.
```

{% endtab %}

{% tab title="zip" %}

```
$ panther_analysis_tool zip -h
usage: panther_analysis_tool zip [-h] [--filter KEY=VALUE [KEY=VALUE ...]] [--ignore-files IGNORE_FILES [IGNORE_FILES ...]] [--minimum-tests MINIMUM_TESTS]
                                 [--out OUT] [--path PATH] [--skip-tests] [--skip-disabled-tests] [--available-destination AVAILABLE_DESTINATION]
                                 [--sort-test-results] [--ignore-table-names]

optional arguments:
  -h, --help            show this help message and exit
  --filter KEY=VALUE [KEY=VALUE ...]
  --ignore-files IGNORE_FILES [IGNORE_FILES ...]
                        Relative path to files in this project to be ignored by panther-analysis tool, space separated. Example ./foo.yaml ./bar/baz.yaml
  --minimum-tests MINIMUM_TESTS
                        The minimum number of tests in order for a detection to be considered passing. If a number greater than 1 is specified, at least one True and
                        one False test is required.
  --out OUT             The path to store output files.
  --path PATH           The relative path to Panther policies and rules.
  --skip-tests
  --skip-disabled-tests
  --available-destination AVAILABLE_DESTINATION
                        A destination name that may be returned by the destinations function. Repeat the argument to define more than one name.
  --sort-test-results   Sort test results by whether the test passed or failed (passing tests first), then by rule ID
  --ignore-table-names  Allows skipping of table names from schema validation. Useful when querying non-Panther or non-Snowflake tables
  --valid-table-names VALID_TABLE_NAMES [VALID_TABLE_NAMES ...]
                        Fully qualified table names that should be considered valid during schema validation (in addition to standard Panther/Snowflake tables), space
                        separated. Accepts '*' as wildcard character matching 0 or more characters. Example foo.bar.baz bar.baz.* foo.*bar.baz baz.* *.foo.*
```

{% endtab %}

{% tab title="check-connection" %}

```
panther_analysis_tool check-connection -h
usage: panther_analysis_tool check-connection [-h] [--api-token API_TOKEN] [--api-host API_HOST]

optional arguments:
  -h, --help            show this help message and exit
  --api-token API_TOKEN
                        The Panther API token to use. See: https://docs.panther.com/api-beta
  --api-host API_HOST   The Panther API host to use. See: https://docs.panther.com/api-beta
```

{% endtab %}
{% endtabs %}

### `--filter`: Filtering PAT commands

The `test`, `zip`, `upload`, and `release` commands all support filtering. Filtering works by passing the `--filter` argument with a list of filters specified in the format `KEY=VALUE1,VALUE2`. The keys can be any valid field in a policy or rule. When using a filter, only analysis items that match every specified filter are considered: multiple values for the same key are treated as alternatives (OR), while separate filters must all match (AND).

For example, the following command will test only items with the AnalysisType of policy AND the severity of High:

```
panther_analysis_tool test --path tests/fixtures/valid_policies --filter AnalysisType=policy Severity=High
[INFO]: Testing analysis packs in tests/fixtures/valid_policies

AWS.IAM.BetaTest
	[PASS] Root MFA not enabled fails compliance
	[PASS] User MFA not enabled fails compliance
```

The following command will test items with the AnalysisType policy OR rule, AND the severity High:

```
panther_analysis_tool test --path tests/fixtures/valid_policies --filter AnalysisType=policy,rule Severity=High
[INFO]: Testing analysis packs in tests/fixtures/valid_policies

AWS.IAM.BetaTest
	[PASS] Root MFA not enabled fails compliance
	[PASS] User MFA not enabled fails compliance

AWS.CloudTrail.MFAEnabled
	[PASS] Root MFA not enabled fails compliance
	[PASS] User MFA not enabled fails compliance
```

When writing policies or rules that refer to `global` analysis types, be sure to include them in your filter. You can include an empty string as a value in a filter; items that do not define the field will then also match, so the filter only constrains items where the field exists.

The following command returns an error because the policy in question imports a global, but the global has no severity and is therefore excluded by the filter:

```
panther_analysis_tool test --path tests/fixtures/valid_policies --filter AnalysisType=policy,global Severity=Critical
[INFO]: Testing analysis packs in tests/fixtures/valid_policies

AWS.IAM.MFAEnabled
	[ERROR] Error loading module, skipping

Invalid: tests/fixtures/valid_policies/example_policy.yml
	No module named 'panther'

[ERROR]: [('tests/fixtures/valid_policies/example_policy.yml', ModuleNotFoundError("No module named 'panther'"))]
```

For this command to work as expected, you need to allow for the severity field to be absent:

```
panther_analysis_tool test --path tests/fixtures/valid_policies --filter AnalysisType=policy,global Severity=Critical,""
[INFO]: Testing analysis packs in tests/fixtures/valid_policies

AWS.IAM.MFAEnabled
	[PASS] Root MFA not enabled fails compliance
	[PASS] User MFA not enabled fails compliance
```

Filters work for the `zip`, `upload`, and `release` commands in the same way they work for the `test` command.
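For example, the same filter syntax shown above for `test` can scope an upload (the path is illustrative, and `--api-token`/`--api-host` are omitted here on the assumption they are configured elsewhere):

```shell
# Upload only high- and critical-severity rules (example filter)
panther_analysis_tool upload --path rules/ --filter AnalysisType=rule Severity=High,Critical
```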

In addition to filtering, you can set a minimum number of unit tests with the `--minimum-tests` flag. Detections that lack the minimum number of tests are considered failing; if `--minimum-tests` is set to 2 or greater, it also enforces that at least one test returns True and one returns False.

In the example below, even though the rules passed all their tests, they're still considered failing because they do not have the correct test coverage:

```
% panther_analysis_tool test --path okta_rules --minimum-tests 2
[INFO]: Testing analysis packs in okta_rules

Okta.AdminRoleAssigned
	[PASS] Admin Access Assigned

Okta.BruteForceLogins
	[PASS] Failed login

Okta.GeographicallyImprobableAccess
	[PASS] Non Login
	[PASS] Failed Login

--------------------------
Panther CLI Test Summary
	Path: okta_rules
	Passed: 0
	Failed: 3
	Invalid: 0

--------------------------
Failed Tests Summary
	Okta.AdminRoleAssigned
		['Insufficient test coverage, 2 tests required but only 1 found.', 'Insufficient test coverage: expected at least one passing and one failing test.']

	Okta.BruteForceLogins
		['Insufficient test coverage, 2 tests required but only 1 found.', 'Insufficient test coverage: expected at least one passing and one failing test.']

	Okta.GeographicallyImprobableAccess
		['Insufficient test coverage: expected at least one passing and one failing test.']
```
