Customer-configured Snowflake Integration (Legacy)
Panther does not support this method for new accounts, and will be migrating existing customers towards one of the supported methods in the future.
Overview
In this configuration, Panther has no administrative access to your Snowflake instance and requires a Database Administrator to run commands on Panther's behalf.
This guide assumes you already have a Snowflake instance in AWS.
Ideally, your Panther deployment and Snowflake instance are in the same AWS region. Keeping both in the same region lowers latency for queries and data movement compared to cross-region communication.
Panther uses two Snowflake users/roles to access your Snowflake instance:
A read only user/role for queries
An admin user/role with permissions scoped strictly to the Panther databases, used to create tables when new log sources are onboarded into Panther.
In Snowflake, it is possible to share table access. This allows your business data and security data to be queried in Panther (via the PANTHER_READ_ONLY role).
When you manage your own Snowflake instance, you can create tables and views with data ingested by Panther. Do not place these custom objects inside Panther databases. Unexpected tables and views will cause errors. Instead, create them in non-Panther databases, and share them with Panther.
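For example, here is a minimal sketch of exposing a custom view over Panther data from a separate database; the database name my_security_db, the view name, the aws_cloudtrail table, and the p_event_time field are illustrative assumptions:

```sql
-- Illustrative only: keep custom objects outside the Panther databases and grant the
-- Panther read-only role access so the view can be queried from Panther.
CREATE DATABASE IF NOT EXISTS my_security_db;

CREATE OR REPLACE VIEW my_security_db.public.recent_cloudtrail AS
  SELECT *
  FROM panther_logs.public.aws_cloudtrail
  WHERE p_event_time > DATEADD(day, -7, CURRENT_TIMESTAMP());

GRANT USAGE ON DATABASE my_security_db TO ROLE PANTHER_READ_ONLY;
GRANT USAGE ON SCHEMA my_security_db.public TO ROLE PANTHER_READ_ONLY;
GRANT SELECT ON VIEW my_security_db.public.recent_cloudtrail TO ROLE PANTHER_READ_ONLY;
```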
Do not create users or any other database objects with the prefix PANTHER_.
How to configure the legacy customer-managed Snowflake integration
1. Gather configuration information from Panther
Log in to the Panther Console.
Click the gear icon in the upper right.
In the dropdown menu, click General.
There you will find:
Snowflake ReadOnly Lambda Role ARN
Snowflake Admin Lambda Role ARN
Lookup Tables Lambda Role ARN
Keep these ARNs handy; we will use them later.
2. Gather configuration information from Snowflake
In order to configure Panther, you need to get the SNOWFLAKE_IAM_USER from Snowflake.
In a Snowflake SQL shell, execute the SQL below, replacing myaccountid with your AWS account ID and myaccountregion with the account's region:
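A sketch of the statement, assuming Panther's Snowpipe ingestion is driven by an SNS topic in your account; the topic name below is a placeholder, not the exact name Panther uses:

```sql
-- Illustrative only: returns the SNS topic policy Snowflake requires; the Snowflake-managed
-- IAM user appears under the "AWS" principal in the result.
SELECT system$get_aws_sns_iam_policy(
  'arn:aws:sns:myaccountregion:myaccountid:<panther-notifications-topic>'
);
```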
You should see a response similar to:
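An illustrative sketch of the response shape, assuming the SNS policy query above; only the AWS principal matters for this step:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "1",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::87654321XXXX:user/k7m2-s-v2st0722" },
      "Action": ["sns:Subscribe"],
      "Resource": ["arn:aws:sns:myaccountregion:myaccountid:<panther-notifications-topic>"]
    }
  ]
}
```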
In the above example, the SNOWFLAKE_IAM_USER is the AWS attribute arn:aws:iam::87654321XXXX:user/k7m2-s-v2st0722. Keep this handy; we will use it in a later step.
3. Create the Panther databases in Snowflake
Execute in Snowflake SQL shell:
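A minimal sketch, assuming the seven databases listed in the validation section at the end of this guide; your Panther deployment's setup instructions are authoritative:

```sql
-- Illustrative only: create the Panther databases referenced throughout this guide.
CREATE DATABASE IF NOT EXISTS panther_logs;
CREATE DATABASE IF NOT EXISTS panther_rule_matches;
CREATE DATABASE IF NOT EXISTS panther_rule_errors;
CREATE DATABASE IF NOT EXISTS panther_cloudsecurity;
CREATE DATABASE IF NOT EXISTS panther_views;
CREATE DATABASE IF NOT EXISTS panther_stored_procedures;
CREATE DATABASE IF NOT EXISTS panther_monitor;
```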
4. Create a read only role and an administrative role in Snowflake
For customers with self-hosted Snowflake deployments who are upgrading to 1.18
Self-hosted customers using Snowflake data cloud should have their Database Administrator add the following permission set, or update their automation scripts to reflect the latest version of the setup instructions:
NOTE: be sure to update <your warehouse> in the first line of the SQL block below to the Snowflake warehouse name that you wish Panther to use.
We recommend you create a dedicated Panther warehouse (e.g., PANTHER_WH) so that you can easily track costs and resize capacity independently of other Snowflake resources.
Execute in Snowflake SQL shell:
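A sketch of the kind of statements this step runs, assuming the role names PANTHER_READ_ONLY (referenced earlier in this guide) and PANTHER_ADMIN (a hypothetical name), and showing grants for one database only; the full script repeats the database-level grants for every Panther database:

```sql
-- Illustrative only: a read only role for queries and an admin role scoped to the Panther databases.
USE WAREHOUSE <your warehouse>;

CREATE ROLE IF NOT EXISTS PANTHER_READ_ONLY;
CREATE ROLE IF NOT EXISTS PANTHER_ADMIN;  -- hypothetical role name

GRANT USAGE ON WAREHOUSE <your warehouse> TO ROLE PANTHER_READ_ONLY;
GRANT USAGE ON WAREHOUSE <your warehouse> TO ROLE PANTHER_ADMIN;

-- Repeat the grants below for each Panther database (panther_rule_matches, panther_rule_errors, ...).
GRANT USAGE ON DATABASE panther_logs TO ROLE PANTHER_READ_ONLY;
GRANT USAGE ON ALL SCHEMAS IN DATABASE panther_logs TO ROLE PANTHER_READ_ONLY;
GRANT SELECT ON ALL TABLES IN DATABASE panther_logs TO ROLE PANTHER_READ_ONLY;
GRANT SELECT ON FUTURE TABLES IN DATABASE panther_logs TO ROLE PANTHER_READ_ONLY;

GRANT USAGE ON DATABASE panther_logs TO ROLE PANTHER_ADMIN;
GRANT USAGE, CREATE TABLE ON ALL SCHEMAS IN DATABASE panther_logs TO ROLE PANTHER_ADMIN;
GRANT SELECT, INSERT ON ALL TABLES IN DATABASE panther_logs TO ROLE PANTHER_ADMIN;
GRANT SELECT, INSERT ON FUTURE TABLES IN DATABASE panther_logs TO ROLE PANTHER_ADMIN;
```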
5. Create a read only user and an administrative user in Snowflake
NOTE: set <your_readonly_password> and <your_admin_password> below. Execute in Snowflake SQL shell:
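A minimal sketch, assuming the roles from the previous step; the user names panther_readonly and panther_admin match the names used in the rest of this guide:

```sql
-- Illustrative only: create the two users and attach the roles created in step 4.
CREATE USER IF NOT EXISTS panther_readonly
  PASSWORD = '<your_readonly_password>'
  DEFAULT_ROLE = PANTHER_READ_ONLY
  DEFAULT_WAREHOUSE = '<your warehouse>';
GRANT ROLE PANTHER_READ_ONLY TO USER panther_readonly;

CREATE USER IF NOT EXISTS panther_admin
  PASSWORD = '<your_admin_password>'
  DEFAULT_ROLE = PANTHER_ADMIN  -- hypothetical role name from the step 4 sketch
  DEFAULT_WAREHOUSE = '<your warehouse>';
GRANT ROLE PANTHER_ADMIN TO USER panther_admin;
```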
6. Create a stored procedure to make creating AWS Secrets easier (Optional)
Optionally, define this stored procedure, which builds a JSON document you can copy and paste into AWS Secrets Manager (saving some typing). Execute in Snowflake SQL shell:
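A sketch of what such a helper could look like, assuming the six secret fields described in step 8; the procedure name and its four parameters are hypothetical, and account and port are filled in automatically:

```sql
-- Illustrative only: build the key/value JSON expected by the AWS secret.
-- The name panther_secret_json and its parameters are hypothetical.
CREATE OR REPLACE PROCEDURE panther_secret_json(
    p_user VARCHAR, p_password VARCHAR, p_host VARCHAR, p_warehouse VARCHAR)
  RETURNS VARCHAR
  LANGUAGE JAVASCRIPT
AS
$$
  // account is taken from the current session; port defaults to 443.
  var rs = snowflake.createStatement({sqlText: "SELECT CURRENT_ACCOUNT()"}).execute();
  rs.next();
  var doc = {
    account:   rs.getColumnValue(1),
    host:      P_HOST,
    port:      "443",
    user:      P_USER,
    password:  P_PASSWORD,
    warehouse: P_WAREHOUSE
  };
  return JSON.stringify(doc);
$$;
```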
7. Create a KMS key in your AWS account for Panther Snowflake Secrets
You will use this key to encrypt the Snowflake secrets that we will store in your AWS account as part of Step 8.
Log in to your AWS account
(Optional) Go to the same region that your Snowflake account is in
Go to KMS service
Click on Create a key.
Pick Symmetric for the type and click Next.
Set the alias to panther-secret. Click Next. On the next page click Next (accept defaults).
Click on Add another AWS account and enter the account ID where Panther is installed.
Click Next and then click Finish.
8. Create a read only user AWS Secret and an administrative user AWS Secret
You will use AWS Secrets Manager to store the Snowflake user passwords. The secrets will be configured to only allow access from specific Lambda functions in the Panther account.
Repeat the process below once for the panther_readonly user and once for the panther_admin user.
Access AWS Secrets Manager via the console and select the Store a new secret button on the page. You will be presented with a page titled Store a new secret. Select Other type of secrets from the list of types. Specify the following key/value pairs:
| Field | Description |
| --- | --- |
| account | The name of your Snowflake account. It can be found by executing SELECT CURRENT_ACCOUNT() in a Snowflake SQL shell. |
| user | The Snowflake user you created earlier, either panther_readonly or panther_admin |
| password | The Snowflake user password that you created earlier |
| host | This is usually <your account>.snowflakecomputing.com |
| port | Use 443 |
| warehouse | The name of your Snowflake active warehouse |
You can enter the above by hand OR run the following command in a Snowflake SQL shell, typing in the appropriate values for the 4 specified parameters (account and port should autopopulate). Do this once for the panther_readonly user and once for the panther_admin user:
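For example, using the hypothetical helper sketched in step 6 (its name and signature are assumptions, not Panther's exact procedure):

```sql
-- Illustrative only: produces the JSON for the panther_readonly secret; repeat with the
-- panther_admin user and password for the second secret.
CALL panther_secret_json(
  'panther_readonly',
  '<your_readonly_password>',
  '<your account>.snowflakecomputing.com',
  '<your warehouse>'
);
```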
You can then copy and paste the result into the Plaintext editor tab of each of the two secrets.
NOTE: Check to make sure that all 6 fields (account, host, password, port, user, warehouse) are filled out and have the correct values; otherwise the Panther lambdas may encounter issues connecting to Snowflake.
Under "Select the encryption key," select
panther-secret
from the dropdown.Fill in the Secret key/values.
Click
Next
.
You will be presented with a screen asking for the name and description of the secret. Fill these in and click
Next
.
Configure how often you want AWS Secrets Manager to rotate your secret, then click Next.
Finally, you will be presented with an overview screen. Scroll to the bottom and click the Store button.
Update Permissions for the Secrets
We need to configure the permissions for the two Panther AWS secrets such that only the specific Panther lambdas have access to the Snowflake secret.
The Panther panther-snowflake-api will use the panther_readonly user for user queries, while the panther-snowflake-admin-api will use the panther_admin user to create tables when new log sources are onboarded.
The panther-lookup-tables-api will use its permissions to manage Lookup Tables in Snowflake.
Go to the console and select each of the secrets you created above. On the overview screen, click the Edit Permissions button. Copy the policy JSON below, substituting the appropriate <snowflake lambda role>, either:
the panther-snowflake-api role collected in the first step, or
the panther-snowflake-admin-api role collected in the first step.
Substitute <lookup tables lambda role> with the panther-lookup-tables-api role collected in the first step.
For the value of <secret ARN>, use the ARN of the secret you are updating.
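A minimal sketch of such a resource policy, assuming the Panther Lambdas only need secretsmanager:GetSecretValue; your Panther deployment documentation is authoritative for the exact actions:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": [
          "<snowflake lambda role>",
          "<lookup tables lambda role>"
        ]
      },
      "Action": "secretsmanager:GetSecretValue",
      "Resource": "<secret ARN>"
    }
  ]
}
```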
Then click Save.
Make a note of the ARN for the secret. We will use this later.
9. Deploy Panther with Snowflake enabled
SaaS Customer-managed Deployment Users
Send to your Panther point of contact (POC):
The SNOWFLAKE_IAM_USER collected in step 2
The ARN for the panther_readonly user AWS Secret
The ARN for the panther_admin user AWS Secret
Your Panther POC will re-deploy Panther with these settings to enable Snowflake.
CloudPrem Users
Customers running Panther in their own accounts (we call that CloudPrem) need to first deploy the master template to perform an initial setup of Panther. After deploying the master template, configure the master stack parameters as follows:
Update the SnowflakeAPISecretARN parameter with the ARN of the secret created above for the panther_readonly user.
Update the SnowflakeAdminAPISecretARN parameter with the ARN of the secret created above for the panther_admin user.
Update the SnowflakeDestinationClusterARNs parameter with the value of <SNOWFLAKE_IAM_USER> from step 2 above.
Execute an update to the CloudFormation stack.
Validation of Snowpipe Processing
Once Panther is configured for Snowflake, you should have seven databases:
panther_logs
panther_rule_matches
panther_rule_errors
panther_cloudsecurity
panther_views
panther_stored_procedures
panther_monitor
These are the same database names used in AWS Athena and queries should behave similarly.
Assuming data is being processed regularly, there should be data in the tables within a few minutes, depending on your rate of log ingestion.
You can quickly test if the data ingestion is working by running a simple query:
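For example (a minimal sketch; aws_cloudtrail is an assumption, so substitute a table for a log source you have onboarded):

```sql
-- Illustrative only: confirm rows are arriving in a table you expect to have data.
SELECT COUNT(*) FROM panther_logs.public.aws_cloudtrail;
```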
The configuration can also be tested from the Data Explorer. Run some sample queries over a table that you know has data (check via the Snowflake console).
Rotating Secrets
To rotate secrets, create a NEW user and edit the secret, replacing the old user and password with the new user and password. Wait one hour before deleting or disabling the old user in Snowflake.