Tech Partner Log Source Integrations
Integrate your product with Panther as a Tech Partner
This page provides instructions for Tech Partners who are integrating their product with Panther by sending logs to one of Panther's Data Transports, for example, to an HTTP source or an S3 bucket. If you need to create a log pulling integration instead, please work directly with the Panther Tech Partner team.
If you would instead like to create an Alert Destination integration, see the Tech Partner Alert Destination documentation. If you are a Panther customer looking for information on ingesting custom logs, please see the Custom Logs documentation.
Contact our Tech Partner team to get started.
You will work with our Tech Partner team to get access to an NFR (Not for Resale) Panther instance and a shared Slack channel.
If your application can export events to an HTTP URL (webhook), see the HTTP source instructions.
The HTTP (webhook) option is not recommended if your log source is high-volume (i.e., it emits at least one GB per hour).
If your application can export events to an S3 bucket, see the S3 source instructions.
If your data can use one of our other transport options, see the individual Data Transport pages. (A sketch of pushing test events to an HTTP or S3 source appears after this list.)
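The snippet below is a minimal, illustrative sketch of pushing a few test events to either transport from Python. The URL, token, bucket, and object key are placeholders, not real Panther endpoints; use the values generated when you configure the HTTP or S3 source in your Panther instance, along with whichever authentication method that source was set up with.

```python
# Minimal sketch: send sample events to a Panther Data Transport.
# All names below (URL, token, bucket, key) are placeholders for illustration.
import json

import boto3     # pip install boto3
import requests  # pip install requests

events = [
    {"time": "2024-01-01T00:00:00Z", "action": "user.login", "actor": "alice@example.com"},
    {"time": "2024-01-01T00:05:00Z", "action": "user.logout", "actor": "alice@example.com"},
]
payload = "\n".join(json.dumps(e) for e in events)  # newline-delimited JSON

# Option 1: HTTP (webhook) source - POST the events to the source's unique URL.
resp = requests.post(
    "https://<your-panther-http-source-url>",             # placeholder URL
    headers={"Authorization": "Bearer <shared-secret>"},   # placeholder auth header
    data=payload,
    timeout=10,
)
resp.raise_for_status()

# Option 2: S3 source - write the events to the bucket Panther is monitoring.
s3 = boto3.client("s3")
s3.put_object(
    Bucket="my-app-log-export",             # placeholder bucket
    Key="exports/2024/01/01/events.jsonl",  # placeholder key
    Body=payload.encode("utf-8"),
)
```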
Generate one or more schemas for your data:
Generate sample data.
Determine how many log schemas you will need to create; see the guidance in the Custom Logs documentation.
Infer your schema(s) using your sample data.
You can infer schemas in the Panther Console or with the pantherlog CLI tool.
If you are inferring more than one schema, it's recommended to use a method that can infer multiple schemas from the same data set.
Review the inferred schema(s) for the following (a sketch of an example schema appears after these steps):
If you generated more than one schema and the schemas share a common set of required properties, events may be misclassified, as the event classification process decides which schema an incoming event belongs to based on the schema's required properties. If your schemas have the same required properties and you can't differentiate them, consider merging the schemas.
If a timestamp field can be used to define the time the event occurred, mark it with isEventTime: true. Otherwise, the event's p_parse_time will be used as its event time, which can be misleading.
Consider adding any indicator fields that may help make the events easier to reference or manipulate in detections or searches.
Export your schema(s).
You can download your schema(s), or you can copy them from the Panther Console and paste them into a text file.
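As a reference point for the review items above, here is a minimal sketch of what an inferred and cleaned-up schema might look like. The schema name, field names, and sample values are hypothetical; consult the Custom Logs documentation for the full list of supported field attributes.

```yaml
# Hypothetical schema for an application audit log; all names are placeholders.
# A matching sample event might look like:
#   {"eventTime": "2024-01-01T00:00:00Z", "action": "user.login",
#    "actorEmail": "alice@example.com", "sourceIp": "203.0.113.10"}
schema: Custom.MyApp.Audit
description: Audit events exported from MyApp
version: 0
fields:
  - name: eventTime
    type: timestamp
    timeFormat: rfc3339
    isEventTime: true      # marks this field as the event time (not p_parse_time)
    required: true
  - name: action
    type: string
    required: true         # required fields drive event classification
  - name: actorEmail
    type: string
    indicators:
      - email              # indicator fields make pivoting in searches easier
  - name: sourceIp
    type: string
    indicators:
      - ip
```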
Please create a text file with the following information, which will be used to describe your platform in the Panther Console and to generate a documentation page for this integration:
A description of the application
The supported integration method(s)
Step-by-step instructions on how to make any necessary configurations in your application to forward logs from your service to the Data Transport source
If these instructions are outlined in your public documentation, feel free to share a link to that instead. For an example, see existing log source integration pages that use Data Transports.
Any caveats or limitations
Common use cases for your integration. When thinking about how customers might use your log integration in Panther, you might consider:
Using correlation rules to correlate security signals from your system with those from other log sources, to identify complex threat behavior
Using Panther's Indicator Search to pivot off of an identifier found in your events (e.g., an email address, IP address, or AWS ARN) and search for it across logs from other systems (e.g., Okta) during an investigation
Create a zip file containing the following (a sketch of bundling these files appears after this step):
The text file of information from Step 4
A square .svg file of the application’s logo
Your test data
The schema
Send the zipped file to Panther via your shared Slack channel.
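If it is helpful, here is a small sketch of bundling the deliverables with Python's standard library; the file names are placeholders for your actual text file, logo, test data, and schema file(s).

```python
# Minimal sketch: bundle the Tech Partner deliverables into a single zip file.
# The file names below are placeholders.
import zipfile

deliverables = [
    "integration_overview.txt",  # text file of information from Step 4
    "logo.svg",                  # square .svg logo
    "sample_events.jsonl",       # test data
    "custom_schema.yml",         # exported schema(s)
]

with zipfile.ZipFile("panther_tech_partner_submission.zip", "w") as bundle:
    for path in deliverables:
        bundle.write(path)
```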
After submitting your zip file, the Tech Partner team will work with you to coordinate next steps.
Creating detection content for your log source is strongly encouraged, as having detections available will promote adoption of your integration.
At this stage, before a log source tile for your organization has been added in Panther, you may wish to test your integration by setting up a log source using your chosen Data Transport. After you have configured the source, you can verify that data is being ingested into Panther by using the Search tool.
You can learn more about Search in its documentation, but at a high level:
In the left-hand navigation bar of your Panther Console, click Investigate > Search.
In the schema filter in the upper-right corner, click the name(s) of your log source's schema(s).
Adjust the time range, if needed.
Click Search.
Look for events in the results table at the bottom of the page.
Write Python detections for your log source. (A sketch of a minimal rule appears after these steps.)
See the detection writing documentation to learn how to get started, and find full examples in the panther-analysis repository.
Open a Pull Request with your detection content against the panther-analysis repository.
Please follow the repository's contributing and style guidelines.
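To illustrate, here is a minimal sketch of what a Python rule for a hypothetical log type might look like. The field names match the hypothetical schema sketched earlier and are placeholders; in the panther-analysis repository, each rule's Python file is paired with a metadata file that declares details such as its RuleID, LogTypes (your Custom.* schema name), and Severity.

```python
# Minimal sketch of a Panther Python rule for a hypothetical Custom.MyApp.Audit log type.
# Field names (action, actorEmail) are placeholders for your own event fields.

SUSPICIOUS_ACTIONS = {"user.password_reset", "api_key.created"}

def rule(event):
    # Return True to trigger an alert for this event.
    return event.get("action") in SUSPICIOUS_ACTIONS

def title(event):
    # Optional: customize the alert title shown in Panther.
    actor = event.get("actorEmail", "unknown actor")
    return f"Suspicious MyApp action {event.get('action')} by {actor}"
```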