Tech Partner Log Source Integrations
Integrate your product with Panther as a Tech Partner
This page provides instructions for Panther Technology Partners who are integrating their product with Panther by sending logs to one of Panther's Data Transport sources—for example, to an S3 bucket or an HTTP endpoint. If you need to create a log pulling integration instead, please work directly with the Panther Tech Partner team.
If you would instead like to create an Alert Destination integration, see Tech Partner Alert Destination Integrations. If you are a Panther customer looking for information on ingesting custom logs, please see the Custom Logs documentation.
Step 1: Contact Panther’s Tech Partner team
Fill out this form to contact our Tech Partner team.
You will work with our Tech Partner team to get access to an NFR (Not for Resale) Panther instance and a shared Slack channel.
Step 2: Determine the integration method(s)
If your application can export events to an S3 bucket, please see the S3 Source instructions.
If your data can use one of our other transport options, please see the individual Data Transport documentation pages.
The HTTP source is not recommended if your log source is high-volume (i.e., it emits at least one GB per hour) and/or its payload size exceeds the HTTP payload limit.
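For an HTTP source within those limits, forwarding an event is a standard authenticated JSON POST. The sketch below is illustrative only: the endpoint URL, bearer-token auth, and event fields are placeholders, since the real URL and auth method come from your HTTP source configuration in Panther.

```python
"""Sketch: forwarding one event to a Panther HTTP Data Transport source.

The endpoint URL, token, and event shape are illustrative placeholders;
the actual values come from your HTTP source configuration in Panther.
"""
import json
import urllib.request


def build_request(endpoint_url: str, token: str, event: dict) -> urllib.request.Request:
    """Build an authenticated JSON POST carrying a single log event."""
    body = json.dumps(event).encode("utf-8")
    return urllib.request.Request(
        endpoint_url,
        data=body,
        headers={
            "Content-Type": "application/json",
            # Auth method depends on how the HTTP source is configured.
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )


req = build_request(
    "https://example-panther-endpoint.invalid/http/source-id",  # placeholder URL
    "example-token",                                            # placeholder secret
    {"action": "user.login", "actor": "alice"},
)
# urllib.request.urlopen(req) would actually send it.
```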
Step 3: Generate schema(s)
Generate one or more schemas for your data:
Generate sample data.
Determine how many log schemas you will need to create—see Determine how many custom schemas you need on Custom Logs.
Infer your schema(s) using your sample data.
If you are inferring more than one schema, it's recommended to use either the Inferring a custom schema from sample logs method or the Inferring custom schemas from historical S3 data method.
Review the inferred schema(s) for the following:
- If you generated more than one schema and they share a common set of `required` properties, events may be misclassified, because the event classification process decides which schema an incoming event belongs to based on the schema's `required` properties. If your schemas have the same `required` properties and you can't differentiate them, consider merging the schemas.
- If a `timestamp` property can be used to define the time the event occurred, mark it with `isEventTime: true`; otherwise, the event's `p_parse_time` will be used as its event time, which may lead to inaccurate event timestamps.
- Consider any transformations that may make the events easier to reference or manipulate in detections or searches.
Export your schema(s).
You can export your schema(s) from the CLI using pantherlog, or you can copy them from the Panther Console and paste them into a text file.
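To make the review points above concrete, here is a minimal, hypothetical schema sketch. The schema name and fields are invented for illustration, and exact YAML keys may vary by Panther version; follow the Custom Logs documentation for the authoritative format.

```yaml
# Hypothetical schema for an invented "MyApp" audit log.
schema: Custom.MyAppAudit
description: Audit events emitted by MyApp (example only)
fields:
  - name: eventTime
    type: timestamp
    timeFormat: rfc3339
    isEventTime: true   # marks this field as the event's occurrence time
  - name: action
    type: string
    required: true      # required fields drive event classification
  - name: actor
    type: string
```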
Verifying your data is flowing into Panther
At this stage—before a log source tile for your organization has been added in Panther—you may wish to test your integration by setting up a Data Transport source. After you have configured the source, you can verify that data is being ingested into Panther by using the Search tool.
You can learn more about Search on its documentation page, but on a high level:
In the left-hand navigation bar of your Panther Console, click Investigate > Search.
In the table dropdown filter in the upper-right corner, select the name(s) of your log source's schema(s).
Adjust the date/time range filter, if needed.
Click Search.
Look for events in the results table at the bottom of the page.
Step 4: Write instructional information about the integration
Please create a text file with the following information, which will be used to describe your platform in the Panther Console and to generate a documentation page for this integration:
A description of the application
Common use cases
The supported integration method(s)
Any caveats or limitations
Step-by-step instructions on how to make any necessary configurations in your application to forward logs from your service to the Data Transport source
For an example, see pages under Supported Logs that use Data Transports, such as Auth0 Logs and GitLab Logs
If these instructions are outlined on your public documentation, feel free to share a link to that instead
Step 5: Submit to Panther for review
Create a zip file containing the following:
The text file of information from Step 4
A square .svg file of the application’s logo
Your test data
The schema
Send the zipped file to Panther via your shared Slack channel.
After submitting your zip file, the Tech Partner team will work with you to coordinate next steps.
Step 6 (Optional): Create detections for your log source
Write Python detections for your log source.
This is strongly encouraged, as having detections available will promote adoption of your integration.
See Writing Python Detections to learn how to get started, and find full examples in the panther-analysis GitHub repository.
Open a Pull Request with your detection content against the public panther-analysis GitHub repository. Please follow the contribution guidelines and style guide.