Data Sources & Transports
Onboard your data sources into Panther to normalize and retain logs
Panther offers built-in integrations for common data sources and data mapping for custom log sources. This page describes available data source options, how to monitor log source ingestion and health, how to request support for a new log source, and how to configure an Event Threshold alarm.
For information on ingesting Panther Console audit logs, see the Panther Audit Logs page.
You can create an HTTP (webhook) source, or leverage cloud services like S3 buckets, CloudWatch, SQS, SNS, Azure Blob Storage, or Google Cloud Storage (GCS) to push data to Panther. For more information, see Data Transports.
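Data Transports accept data that you push to Panther, either over HTTP or through cloud storage and queue services. As a rough illustration of the push model only, the sketch below sends a single JSON event to an HTTP (webhook) source; the URL and auth header are placeholders, since Panther provides the real endpoint and authentication settings when you create the HTTP source.

```python
# Illustrative sketch only: the endpoint URL, header name, and secret below are
# placeholders, not real Panther values. When you create an HTTP (webhook)
# source, Panther provides the actual URL and expected auth configuration.
import json
import urllib.request

PANTHER_WEBHOOK_URL = "https://<your-panther-http-source-url>"  # placeholder
AUTH_HEADER_NAME = "x-api-key"                                  # placeholder
AUTH_HEADER_VALUE = "<shared-secret>"                           # placeholder

event = {
    "timestamp": "2024-01-01T00:00:00Z",
    "actor": "example-user",
    "action": "login",
}

request = urllib.request.Request(
    PANTHER_WEBHOOK_URL,
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json", AUTH_HEADER_NAME: AUTH_HEADER_VALUE},
    method="POST",
)

# A 2xx response indicates the event was accepted for ingestion.
with urllib.request.urlopen(request) as response:
    print(response.status)
```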
Panther supports pulling logs from vendors via direct integrations that query the vendor's API, as well as via AWS EventBridge. In addition, logs with supported schemas but no direct API integration can be pushed to Panther through a common Data Transport source. For a full list of supported vendors, see the Supported Logs page.
In addition to onboarding AWS as a log source to configure Detections and receive alerts, we recommend configuring Cloud Security Scanning for your AWS account. Cloud Security Scanning works by scanning AWS accounts, modeling the Resources within them, and using Policies to detect misconfigurations. For more information, see Cloud Security Scanning.
If you have a log type that is not yet supported, you can build a custom schema, which tells Panther how to parse its events correctly. For more information, see Custom Logs.
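Custom schemas are written in YAML. As a hedged sketch only, the snippet below models a minimal schema for a hypothetical application log as a Python dictionary and renders it to YAML; the field names are illustrative, and the exact keys Panther expects are defined in the Custom Logs documentation. It assumes PyYAML is installed.

```python
# Illustrative sketch only: the fields below describe a hypothetical application
# log, and the exact schema keys are specified in Panther's Custom Logs docs.
# Assumes PyYAML is available (pip install pyyaml).
import yaml

custom_schema = {
    "version": 0,
    "fields": [
        {
            "name": "time",
            "type": "timestamp",
            "timeFormats": ["rfc3339"],
            "isEventTime": True,   # marks this field as the event timestamp
            "required": True,
        },
        {"name": "user", "type": "string", "required": True},
        {"name": "action", "type": "string"},
        {"name": "sourceIp", "type": "string"},
    ],
}

# Render to YAML for review before entering it in the Console schema editor.
print(yaml.safe_dump(custom_schema, sort_keys=False))
```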
When your log source is onboarded in Panther, you can monitor its individual data processing metrics and health within the log source's operations page, attach new schemas, and view raw data associated with the log source. You can also monitor overall log source ingestion metrics on the Log Source Overview page. For more information, see Monitoring Log Sources.
Ingestion filters let you define conditions under which incoming data is dropped (that is, not ingested into Panther). Dropped data does not count toward your ingestion quota, which makes filters useful for partially ingesting high-volume log sources that might otherwise be cost-prohibitive to connect to Panther.
For more information, see Ingestion Filters.
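Ingestion filters are configured in the Panther Console rather than in code, but the conceptual sketch below models the underlying idea: a condition is evaluated against each incoming event, and matching events are dropped before they are ingested or counted against your quota. The health-check example is hypothetical.

```python
# Conceptual illustration only: ingestion filters are defined in the Panther
# Console, not in Python. This models "drop events that match a condition
# before they are ingested."
def should_drop(raw_event: str) -> bool:
    """Hypothetical filter: drop noisy load-balancer health checks from a high-volume source."""
    return "ELB-HealthChecker" in raw_event


incoming = [
    '{"path": "/healthz", "userAgent": "ELB-HealthChecker/2.0"}',
    '{"path": "/login", "userAgent": "Mozilla/5.0"}',
]

# Only events that do not match the filter condition are ingested (and billed).
ingested = [event for event in incoming if not should_drop(event)]
print(ingested)
```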
On the final step of configuring your log source, you can optionally create an alarm that fires if the source does not process any events within a configurable period of time. For example, if you set the threshold to 15 minutes, you will receive an alert when no events are processed for 15 minutes.
For instructions, see Configuring log drop-off alarms for log sources.
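Panther evaluates the threshold for you; the sketch below only illustrates the logic behind a 15-minute drop-off alarm, assuming a hypothetical per-source timestamp of when the last event was received.

```python
# Conceptual illustration only: Panther performs this check internally. This
# sketch shows the logic of a 15-minute event drop-off alarm.
from datetime import datetime, timedelta, timezone

THRESHOLD = timedelta(minutes=15)  # the configurable "no events processed" window


def source_is_silent(last_event_received_at: datetime, now: datetime = None) -> bool:
    """Return True if the source should alarm because no events arrived within the threshold."""
    now = now or datetime.now(timezone.utc)
    return now - last_event_received_at > THRESHOLD


# Example: the last event arrived 20 minutes ago, so the alarm condition is met.
last_seen = datetime.now(timezone.utc) - timedelta(minutes=20)
print(source_is_silent(last_seen))  # True -> a drop-off alert would be sent
```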
If you do not see the log source you want in the list at Configure > Log Sources, you can request support for a new log source:
Log in to your Panther Console.
Navigate to Configure > Log Sources.
Click Create New.
Scroll to the bottom of the page and click the Request it here hyperlink.
Enter the Log Source name you want to request and the use case it will address.
Click Create Request.
If you no longer want to collect logs from a particular log source, you can delete it in the Panther Console or using the Panther API.
After you delete a log source, all events that have already been collected by that source will remain accessible in your Data Lake (meaning they can be queried with Data Explorer and Search).
To delete a log source:
In the left-hand navigation bar of your Panther Console, click Configure > Log Sources.
In the table of log sources, locate the one you would like to delete. On the right side of its row, click the three dots icon.
Click Delete.
In the pop-up confirmation modal, click Yes, Delete.
Visit the Panther Knowledge Base to view articles about data sources and transports that answer frequently asked questions and help you resolve common errors and issues.