# Tech Partner Log Source Integrations

This page provides instructions for [Panther Technology Partners](https://panther.com/partners/) who are integrating their product with Panther by sending logs to one of Panther's [Data Transport sources](https://docs.panther.com/data-onboarding/data-transports)—for example, to an [S3 bucket](https://docs.panther.com/data-onboarding/data-transports/aws/s3) or an [HTTP endpoint](https://docs.panther.com/data-onboarding/data-transports/http). If you need to create a log pulling integration instead, please work directly with the Panther Tech Partner team.

If you would instead like to create an Alert Destination integration, see [Tech Partner Alert Destination Integrations](https://docs.panther.com/alerts/tech-partner). If you are a Panther customer looking for information on ingesting custom logs, please see the [Custom Logs documentation](https://docs.panther.com/data-onboarding/custom-log-types).

## Step 1: Contact Panther’s Tech Partner team

* [Fill out this form](https://panther.com/partners/request/) to contact our Tech Partner team.
  * You will work with our Tech Partner team to get access to an NFR (Not for Resale) Panther instance and a shared Slack channel.

## Step 2: Determine the integration method(s)

* If your application can export events to an HTTP URL (webhook), see the [HTTP Source](https://docs.panther.com/data-onboarding/data-transports/http) instructions.
  * The [HTTP source](https://docs.panther.com/data-onboarding/data-transports/http) is not recommended if your log source is high-volume (i.e., it emits 1 GB or more per hour) or if its [payload size exceeds the HTTP payload limit](https://docs.panther.com/data-onboarding/data-transports/http#payload-requirements).
* If your application can export events to an S3 bucket, see the [S3 Source instructions](https://docs.panther.com/data-onboarding/data-transports/aws/s3).
* If your data can use one of our other transport options, see the individual [Data Transport documentation](https://docs.panther.com/data-onboarding/data-transports) pages.
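To illustrate the webhook pattern at a high level, the sketch below batches events as newline-delimited JSON and POSTs them to an HTTP source URL. This is a minimal sketch, not a reference client: the endpoint URL, the `x-api-key` header name, and the event fields are placeholders, and the correct auth header depends on how the HTTP source is configured in Panther.

```python
import json
import urllib.request


def build_payload(events):
    """Encode a batch of events as newline-delimited JSON (NDJSON)."""
    return "\n".join(json.dumps(e) for e in events).encode("utf-8")


def send_events(url, secret, events):
    """POST a batch of events to an HTTP log source endpoint.

    The header name below is a placeholder; use whatever auth scheme
    (bearer token, shared secret header, etc.) your HTTP source expects.
    """
    req = urllib.request.Request(
        url,
        data=build_payload(events),
        headers={
            "Content-Type": "application/x-ndjson",
            "x-api-key": secret,  # placeholder auth header
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


# Example batch; keep the total payload under the documented size limit.
events = [
    {"timestamp": "2024-01-01T00:00:00Z", "action": "login", "user": "alice"},
    {"timestamp": "2024-01-01T00:00:05Z", "action": "logout", "user": "alice"},
]
# send_events("https://<your-http-source-url>", "<secret>", events)
```

In practice you would also batch retries and back off on non-2xx responses; this sketch omits error handling for brevity.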

## Step 3: Generate schema(s) and tests

1. Generate one or more schemas for your data:
   1. Generate or gather all sample data you'd like to be able to parse in Panther.
   2. Determine how many log schemas you will need to create—see [Determine how many custom schemas you need](https://docs.panther.com/data-onboarding/custom-log-types#determine-how-many-custom-schemas-you-need) on the Custom Logs page.
   3. Infer your schema(s) using your sample data.
      * You can [infer a schema in the Panther Console](https://docs.panther.com/data-onboarding/custom-log-types#how-to-define-a-custom-schema) or [from the CLI using pantherlog](https://docs.panther.com/panther-developer-workflows/pantherlog#infer-generate-a-schema-from-json-log-samples).
      * If you are inferring more than one schema, it's recommended to use either the [Inferring a custom schema from sample logs](https://docs.panther.com/data-onboarding/custom-log-types#inferring-a-custom-schema-from-sample-logs) method or the [Inferring custom schemas from historical S3 data](https://docs.panther.com/data-onboarding/custom-log-types#inferring-custom-schemas-from-historical-s3-data) method.
   4. Review the inferred schema(s) for the following:
      * If you generated more than one schema and they share a common set of `required` properties, events may be misclassified: the event classification process uses a schema's `required` properties to decide which schema an incoming event belongs to. If your schemas have the same `required` properties and you can't otherwise differentiate them, consider merging them.
      * If a `timestamp` field can be used to define the time the event occurred, mark it with `isEventTime: true`. Otherwise, the event's `p_parse_time` (the time Panther parsed the event) will be used as its event time, which can be misleading.
      * Consider any [transformations](https://docs.panther.com/data-onboarding/custom-log-types/transformations) that may help make the events easier to reference or manipulate in detections or searches.
   5. Export your schema(s).
      * You can export your schema(s) [from the CLI using pantherlog](https://docs.panther.com/panther-developer-workflows/pantherlog#export-schemas-export-panther-managed-schemas), or you can copy and paste them from the Panther Console into a text file.
2. For each schema, create a `<schema_name>_tests.yml` file.
   * Learn how to create schema test files in [Creating a schema test file](https://docs.panther.com/panther-developer-workflows/pantherlog#creating-a-schema-test-file).
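As a point of reference, a minimal schema incorporating the review items above might look like the following. This is an illustrative sketch only: the schema name and field names are invented, and the exact timestamp options may differ by Panther version—consult the Custom Logs documentation for the authoritative field reference.

```yaml
# Illustrative schema — names and fields are placeholders, not from a real integration.
schema: Custom.SampleApp.Audit
description: Audit events from SampleApp
fields:
  - name: timestamp
    type: timestamp
    timeFormat: rfc3339
    isEventTime: true   # marks this field as the event's occurrence time
    required: true
  - name: action
    type: string
    required: true      # required properties drive event classification
  - name: user
    type: string
```

The companion `<schema_name>_tests.yml` file then pairs raw sample log lines with their expected parsed output, as described in the link above.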

### Verifying your data is flowing into Panther

At this stage—before a log source tile for your organization has been added in Panther—you may wish to test your integration by setting up a [Data Transport](https://docs.panther.com/data-onboarding/data-transports) source. After you have configured the source, you can verify that data is being ingested into Panther by using the [Search](https://docs.panther.com/search/search-tool) tool.

You can learn more about Search on [its documentation page](https://docs.panther.com/search/search-tool), but at a high level:

1. In the left-hand navigation bar of your Panther Console, click **Investigate** > **Search**.
2. In the [table dropdown filter](https://docs.panther.com/search/search-tool#table-filter) in the upper-right corner, select the name(s) of your log source's schema(s).
3. Adjust the [date/time range filter](https://docs.panther.com/search/search-tool#date-range-filter), if needed.
4. Click **Search**.
   * Look for events in the results table at the bottom of the page.

## Step 4: Write instructional information about the integration

Please create a text file with the following information, which will be used to describe your platform in the Panther Console and to generate a documentation page for this integration:

* A description of the application
* The supported integration method(s)
* Step-by-step instructions on how to make any necessary configurations in your application to forward logs from your service to the Data Transport source
  * For an example, see pages under [Supported Logs](https://docs.panther.com/data-onboarding/supported-logs) that use Data Transports, such as [Auth0 Logs](https://docs.panther.com/data-onboarding/supported-logs/auth0) and [GitLab Logs](https://docs.panther.com/data-onboarding/supported-logs/gitlab)
  * If these instructions are outlined on your public documentation, feel free to share a link to that instead
* Any caveats or limitations
* Common use cases for your integration. When thinking about how customers might use your log integration in Panther, you might consider:
  * Using [Correlation Rules](https://docs.panther.com/detections/correlation-rules) to correlate security signals from your system with those from other log sources, to identify complex threat behavior
  * Using Panther's [Search](https://docs.panther.com/search/search-tool) to pivot off of an identifier found in your events (e.g., an email address, IP address, or AWS ARN) and search for it across logs from other systems (e.g., Okta) during an investigation

## Step 5: Submit to Panther for review

1. Zip the following files:
   * The text file of information from Step 4
   * The schema(s) and corresponding `<schema_name>_tests.yml` file(s)
   * Your raw test data
   * A square `.svg` file of the application’s logo
2. Send the zipped file to Panther via your shared Slack channel.

After you submit your zip file, the Tech Partner team will work with you to coordinate next steps.

## Step 6 (Optional): Create detections for your log source

1. Write Python [detections](https://docs.panther.com/detections) for your log source.
   * This is strongly encouraged, as having detections available will promote adoption of your integration.
   * See [Writing Python Detections](https://docs.panther.com/detections/rules/python) to learn how to get started, and find full examples in the [`panther-analysis` GitHub repository](https://github.com/panther-labs/panther-analysis/tree/release/rules).
2. Open a Pull Request with your detection content against the [public `panther-analysis` GitHub repository](https://github.com/panther-labs/panther-analysis/pulls).
   * Please follow the [contribution guidelines](https://github.com/panther-labs/panther-analysis/blob/release/CONTRIBUTING.md) and [style guide](https://github.com/panther-labs/panther-analysis/blob/release/STYLE_GUIDE.md).
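As a rough sketch of the shape of a Python rule, the example below fires on a hypothetical failed-login event. The log type, field names (`action`, `user`), and logic are invented for illustration—replace them with fields from your actual schema, and pair the rule with the YAML metadata (rule ID, log types, severity) described in the style guide.

```python
# Illustrative Panther-style rule — the event fields used here are
# hypothetical placeholders, not from a real schema.


def rule(event):
    # Fire when the event represents a failed login attempt.
    return event.get("action") == "failed_login"


def title(event):
    # Used as the alert title when the rule fires.
    return f"Failed login for user {event.get('user', 'UNKNOWN')}"
```

Rules in the `panther-analysis` repository follow this two-function pattern, with unit tests defined alongside the rule metadata.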
