Tech Partner Log Source Integrations
Integrate your product with Panther as a Tech Partner
This page provides instructions for Tech Partners who are integrating their product with Panther by sending logs to one of Panther's Data Transports (for example, an Amazon S3 bucket or an HTTP source). If you need to create a log pulling integration instead, please work directly with the Panther Tech Partner team.
If you would instead like to create an Alert Destination integration, see the Alert Destinations documentation. If you are a Panther customer looking for information on ingesting custom logs, please see the Custom Logs documentation.
To get started, contact our Tech Partner team.
You will work with our Tech Partner team to get access to an NFR (Not for Resale) Panther instance and a shared Slack channel.
If your application can export events to an S3 bucket, please see the S3 source documentation.
If your data can be sent using one of our other transport options, please see the individual Data Transport pages.
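For the S3 route, the exporter on your side only needs to batch events and write them as newline-delimited JSON. Here is a minimal sketch using boto3; the bucket name, key prefix, and event fields are illustrative placeholders, not Panther requirements:

```python
import json

def to_ndjson(events):
    """Serialize a list of event dicts as newline-delimited JSON,
    the format S3-based log ingestion pipelines handle well."""
    return "\n".join(json.dumps(e, separators=(",", ":")) for e in events) + "\n"

def upload_batch(events, bucket="example-partner-logs", prefix="myapp/"):
    """Upload one batch of events to S3. Requires boto3 and AWS credentials;
    bucket and prefix here stand in for your customer's configuration."""
    import boto3  # third-party: pip install boto3

    body = to_ndjson(events).encode("utf-8")
    key = f"{prefix}events-0001.jsonl"  # a real exporter would use a timestamped key
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=body)
```

Newline-delimited JSON keeps each event independently parseable, which matters when files are split or streamed during ingestion.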
Generate one or more schemas for your data:
Generate sample data.
Determine how many log schemas you will need to create; see the guidance in the Custom Logs documentation.
Infer your schema(s) using your sample data.
You can infer schemas in the Panther Console or with the pantherlog CLI tool.
If you are inferring more than one schema, it's recommended to use a method that supports inferring multiple schemas at once.
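Sample data for inference can be as simple as a file of representative JSON events, one per line. A hedged sketch (the event shape and file name are invented for illustration); the goal is to give the inference step enough variety, including optional fields that appear in only some events:

```python
import json
import random
from datetime import datetime, timedelta, timezone

def make_sample_events(n=50):
    """Generate n fake events covering the field variety that schema
    inference should see: timestamps, enum-like strings, optional fields."""
    base = datetime(2024, 1, 1, tzinfo=timezone.utc)
    events = []
    for i in range(n):
        event = {
            "event_time": (base + timedelta(seconds=i)).isoformat(),
            "action": random.choice(["login", "logout", "update"]),
            "user": f"user{i % 5}",
        }
        if i % 3 == 0:  # optional fields should appear in some, not all, events
            event["ip"] = "192.0.2.1"
        events.append(event)
    return events

def write_ndjson(events, path="sample_logs.jsonl"):
    """Write events as newline-delimited JSON, one event per line."""
    with open(path, "w") as f:
        for e in events:
            f.write(json.dumps(e) + "\n")
```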
Review the inferred schema(s) for the following:
If you generated more than one schema and they share a common set of required properties, events may be misclassified: the event classification process decides which schema an incoming event belongs to based on the schema's required properties. If your schemas have the same required properties and you can't otherwise differentiate them, consider merging the schemas.
If a timestamp property can be used to define the time the event occurred, mark it with isEventTime: true; otherwise p_parse_time will be used as the event time, which may lead to inaccurate event timestamps.
Consider any schema features, such as indicator fields, that may help make the events easier to reference or manipulate in detections or searches.
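To make these review points concrete, here is a hypothetical schema fragment in Panther's custom-log YAML style; the schema name and field names are illustrative:

```yaml
schema: Custom.MyApp.Audit
fields:
  - name: event_time
    type: timestamp
    timeFormat: rfc3339
    isEventTime: true   # used as the event's time instead of p_parse_time
    required: true
  - name: action        # required fields drive event classification,
    type: string        # so keep them distinct across your schemas
    required: true
  - name: ip
    type: string
    indicators:
      - ip              # indicator fields aid cross-source searching
```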
Export your schema(s).
You can export your schema(s) directly, or you can copy them from the Panther Console and paste them into a text file.
In the left-hand navigation bar of your Panther Console, click Investigate > Search.
Click Search.
Look for events in the results table at the bottom of the page.
Please create a text file with the following information, which will be used to describe your platform in the Panther Console and to generate a documentation page for this integration:
A description of the application
Common use cases
The supported integration method(s)
Any caveats or limitations
Step-by-step instructions on how to make any necessary configurations in your application to forward logs from your service to the Data Transport source
If these instructions are outlined on your public documentation, feel free to share a link to that instead
Zip the files containing the following:
The text file of information from Step 4
A square .svg file of the application’s logo
Your test data
The schema
Send the zipped file to Panther via your shared Slack channel.
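Bundling the artifacts listed above can be scripted. A small sketch using Python's standard zipfile module; the archive and file names are hypothetical:

```python
import zipfile

def bundle(zip_path, contents):
    """Write the submission artifacts into a single zip archive.

    contents: mapping of archive name -> str/bytes payload. Using writestr
    keeps the sketch self-contained; with real files on disk you would use
    zf.write(path) instead."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in contents.items():
            zf.writestr(name, data)
    return zip_path

# Example archive members mirroring the list above (names are placeholders):
#   integration_info.txt  - the text file of information from Step 4
#   logo.svg              - a square logo
#   sample_logs.jsonl     - your test data
#   myapp_schema.yml      - the schema
```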
After submitting your zip file, the Tech Partner team will work with you to coordinate next steps.
Writing detections for your log source is optional but strongly encouraged, as having detections available will promote adoption of your integration.
At this stage, before a log source tile for your organization has been added in Panther, you may wish to test your integration by setting up a Data Transport source. After you have configured the source, you can verify that data is being ingested into Panther by using the Search tool.
You can learn more on the Search documentation page, but at a high level:
In the filter controls in the upper-right corner, click the name(s) of your log source's schema(s).
Adjust the time range, if needed.
For an example, see the documentation pages for existing log sources that use Data Transports.
Write Python detections for your log source.
See the detection writing documentation to learn how to get started, and find full examples in the panther-analysis repository.
Open a Pull Request with your detection content against the panther-analysis repository.
Please follow the repository's contribution and style guidelines.
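A Panther Python detection is, at its core, a module exposing a rule(event) function that returns True when the event should alert. A minimal sketch; the field names and threat logic are invented for illustration:

```python
# Hypothetical detection for an audit-log style event.
# rule() is the required entry point; title() customizes the alert title.

def rule(event):
    # Alert on failed logins by admin accounts.
    # .get() keeps missing fields from raising on sparse events.
    return (
        event.get("action") == "login"
        and event.get("status") == "failure"
        and event.get("user", "").startswith("admin")
    )

def title(event):
    return f"Failed admin login by {event.get('user', '<unknown>')}"
```

In Panther, the event argument behaves like a dictionary, so the detection can be exercised locally with plain dicts before submitting it.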