Tech Partner Log Source Integrations

Integrate your product with Panther as a Tech Partner

This page provides instructions for Panther Technology Partners who are integrating their product with Panther by sending logs to one of Panther's Data Transport sources—for example, to an S3 bucket or an HTTP endpoint. If you need to create a log pulling integration instead, please work directly with the Panther Tech Partner team.

If you would instead like to create an Alert Destination integration, see Tech Partner Alert Destination Integrations. If you are a Panther customer looking for information on ingesting custom logs, please see the Custom Logs documentation.

Step 1: Contact Panther’s Tech Partner team

  • Fill out this form to contact our Tech Partner team.

    • You will work with our Tech Partner team to get access to an NFR (Not for Resale) Panther instance and a shared Slack channel.

Step 2: Determine the integration method(s)

Decide which of Panther's Data Transport sources your integration will send logs to (for example, an S3 bucket or an HTTP endpoint). If you are unsure which method best fits your product, discuss the options with the Tech Partner team in your shared Slack channel.

Step 3: Generate schema(s) and tests

  1. Generate one or more schemas for your data (a minimal example schema appears after this list):

    1. Generate or gather all sample data you'd like to be able to parse in Panther.

    2. Determine how many log schemas you will need to create—see Determine how many custom schemas you need on Custom Logs.

    3. Infer your schema(s) using your sample data.

    4. Review the inferred schema(s) for the following:

      • If you generated more than one schema and they share the same set of required fields, events may be misclassified: Panther's event classification process matches each incoming event to a schema based on that schema's required fields. If your schemas have identical required fields and you can't otherwise differentiate them, consider merging them into one schema.

      • If a timestamp field represents the time the event occurred, mark it with isEventTime: true. Otherwise, the event's p_parse_time (the time Panther parsed the event) will be used as its event time, which can be misleading.

      • Consider any transformations that may help make the events easier to reference or manipulate in detections or searches.

    5. Export your schema(s).

  2. For each schema, create a <schema_name>_tests.yml file (a sketch of this file's format appears below).
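
For illustration, here is a minimal sketch of what an exported schema might look like. The schema name, field names, and indicator values below are hypothetical; your actual schema should reflect the fields inferred from your sample data:

```yaml
# Hypothetical schema for an example application's audit log
schema: Custom.SampleApp.Audit
description: Audit events from SampleApp (hypothetical example)
fields:
  - name: time
    description: When the event occurred
    type: timestamp
    timeFormat: rfc3339
    isEventTime: true # marks this field as the event time, per the review step above
    required: true
  - name: action
    description: The action performed, e.g. user.login
    type: string
    required: true
  - name: actor_email
    description: Email address of the user who performed the action
    type: string
    indicators:
      - email
```

A corresponding tests file pairs a raw input event with the output you expect after parsing. The sketch below assumes the hypothetical schema above; the exact shape of the expected result, including the Panther-added p_ fields such as p_log_type, p_event_time, and p_any_emails, depends on Panther's schema testing tooling:

```yaml
# Hypothetical test case for the schema above
name: SampleApp login event parses correctly
logType: Custom.SampleApp.Audit
input: |
  {"time": "2023-01-01T12:00:00Z", "action": "user.login", "actor_email": "jane@example.com"}
result: |
  {
    "time": "2023-01-01T12:00:00Z",
    "action": "user.login",
    "actor_email": "jane@example.com",
    "p_log_type": "Custom.SampleApp.Audit",
    "p_event_time": "2023-01-01T12:00:00Z",
    "p_any_emails": ["jane@example.com"]
  }
```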

Verifying your data is flowing into Panther

At this stage—before a log source tile for your organization has been added in Panther—you may wish to test your integration by setting up a Data Transport source. After you have configured the source, you can verify that data is being ingested into Panther by using the Search tool.

You can learn more about Search on its documentation page, but at a high level:

  1. In the left-hand navigation bar of your Panther Console, click Investigate > Search.

  2. In the table dropdown filter in the upper-right corner, select the name(s) of your log source's schema(s).

  3. Adjust the date/time range filter, if needed.

  4. Click Search.

    • Look for events in the results table at the bottom of the page.

Step 4: Write instructional information about the integration

Please create a text file with the following information, which will be used to describe your platform in the Panther Console and to generate a documentation page for this integration:

  • A description of the application

  • The supported integration method(s)

  • Step-by-step instructions on how to make any necessary configurations in your application to forward logs from your service to the Data Transport source

    • For an example, see pages under Supported Logs that use Data Transports, such as Auth0 Logs and GitLab Logs

    • If these instructions are outlined on your public documentation, feel free to share a link to that instead

  • Any caveats or limitations

  • Common use cases for your integration. When thinking about how customers might use your log integration in Panther, you might consider:

    • Using Correlation Rules to correlate security signals from your system with those from other log sources, to identify complex threat behavior

    • Using Panther's Search to pivot off of an identifier found in your events (e.g., an email address, IP address, or AWS ARN) and search for it across logs from other systems (e.g., Okta) during an investigation

Step 5: Submit to Panther for review

  1. Zip the following files:

    • The text file of information from Step 4

    • The schema(s) and corresponding <schema_name>_tests.yml file(s)

    • Your raw test data

    • A square .svg file of the application’s logo

  2. Send the zipped file to Panther via your shared Slack channel.

After submitting your zip file, the Tech Partner team will work with you to coordinate next steps.

Step 6 (Optional): Create detections for your log source

  1. Write Python detections for your log source.

  2. Open a Pull Request with your detection content against the public panther-analysis GitHub repository.
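
For illustration, here is a minimal sketch of a Python detection. The event field names (action, actor_email) are hypothetical and assume the example schema from Step 3; real detections should reference the fields defined in your schema(s):

```python
# Hypothetical detection: alert on failed logins in SampleApp audit logs
def rule(event):
    # Return True to trigger an alert for this event
    return event.get("action") == "user.login.failed"


def title(event):
    # Used as the generated alert's title
    return f"Failed SampleApp login for {event.get('actor_email', 'unknown user')}"
```

Note that in the panther-analysis repository, each Python detection is accompanied by a YAML metadata file; see the repository's existing rules for the expected layout.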
