# Chronosphere Onboarding Guide

## Overview

[Chronosphere Telemetry Pipeline](https://docs.chronosphere.io/pipelines) is a flexible telemetry pipeline that can stream logs from a variety of sources to destinations such as an [HTTP Source](https://docs.panther.com/data-onboarding/data-transports/http) in Panther.

While this guide only explains how to configure Chronosphere Telemetry Pipeline with a Panther HTTP Source, it is also possible to stream logs to an [S3 Source](https://docs.panther.com/data-onboarding/data-transports/aws/s3) in Panther. If you would like to stream logs to an S3 Source, use the [Amazon S3 destination plugin](https://docs.chronosphere.io/pipeline-data/route/plugins/destination-plugins/amazon-s3) in Chronosphere Telemetry Pipeline.

## How to route logs to Panther using Chronosphere Telemetry Pipeline

### Prerequisite

* Ensure you have followed the [Chronosphere Telemetry Pipeline installation documentation](https://docs.chronosphere.io/pipeline-install), which includes creating a [Core Instance](https://docs.chronosphere.io/pipeline-install/kubernetes-cluster/ui). Chronosphere Telemetry Pipeline can run in Linux and Kubernetes environments.

### Step 1 (Optional): Decide where to filter and/or transform logs

If your raw logs need to be filtered or transformed, you can perform those actions in either Chronosphere or Panther.

| <p>In the Chronosphere Telemetry Pipeline web interface, you can filter or transform logs by:</p><ul><li>Using a <a href="https://docs.chronosphere.io/pipeline-data/process/parsers">parser</a></li><li>Defining <a href="https://docs.chronosphere.io/pipeline-data/process/processing-rules">processing rules</a></li></ul><p>If you'd like to use these tools in Chronosphere, you will configure them in <a href="#step-3-configure-chronosphere-telemetry-pipeline-to-forward-to-the-http-endpoint">Step 3</a>, below.</p> | <p>In Panther, you can filter or transform logs by:</p><ul><li>Using <a href="../ingestion-filters">ingestion filters</a></li><li>Using a parser in the associated <a href="../custom-log-types">log schema</a></li><li>Using <a href="../custom-log-types/transformations">transformations</a> in the associated <a href="../custom-log-types">log schema</a></li></ul><p>If you'd like to use a parser or transformations in Panther, <a href="../custom-log-types#how-to-define-a-custom-schema">create a custom log schema</a> now.<br><br>If you'd like to use ingestion filters in Panther, you'll configure them in <a href="#step-2-create-an-http-source-in-panther">Step 2</a>, below.</p> |
| -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
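If you choose to create a custom log schema in Panther now, a minimal sketch might look like the following. The schema name, field names, and time format are placeholders for illustration; see Panther's custom log schema documentation for the full field reference and supported types.

```yaml
# Hypothetical schema for logs streamed from Chronosphere Telemetry Pipeline.
# Adjust field names and types to match the shape of your actual events.
schema: Custom.ChronosphereLogs
description: Logs streamed from Chronosphere Telemetry Pipeline
fields:
  - name: timestamp
    type: timestamp
    timeFormat: rfc3339
    isEventTime: true
  - name: message
    type: string
    required: true
```

If you prefer not to define fields up front, you can skip this and infer a schema from received data after the pipeline is streaming, as noted in Step 2.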

### Step 2: Create an HTTP source in Panther

1. Follow [Panther's instructions for configuring an HTTP Source](https://docs.panther.com/data-onboarding/data-transports/http).
   * For the authentication method, select [Shared secret](https://docs.panther.com/data-onboarding/data-transports/http#shared-secret).
   * If you created a schema in Panther in Step 1, attach it to the source. If you haven't created a schema yet, you can [infer one after data has been received](https://docs.panther.com/custom-log-types#inferring-a-custom-schema-from-http-data-received-in-panther).
2. If you'd like to use [ingestion filters](https://docs.panther.com/data-onboarding/ingestion-filters), follow one of the instruction sets below:
   * [How to create a raw event filter](https://docs.panther.com/ingestion-filters/raw-event#how-to-create-a-raw-event-filter)
   * [How to create a normalized event filter](https://docs.panther.com/ingestion-filters/normalized-event#how-to-create-a-normalized-event-filter)
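Once the HTTP Source exists, Chronosphere will send each batch of events as an HTTPS POST with your shared-secret header attached. The sketch below shows the shape of that request using Python's standard library. The URL, header key, and secret value are placeholders; substitute the values you generated when creating the source.

```python
# Sketch of the request Chronosphere's HTTP destination makes to a Panther
# HTTP Source. All values below are placeholders, not real credentials.
import json
import urllib.request

HTTP_SOURCE_URL = "https://logs.example.runpanther.net/http/cb015ee4-543c-4489-9f4b-testaa16d7a"
SECRET_HEADER = "x-api-key"            # the shared-secret header key you chose in Panther
SECRET_VALUE = "example-shared-secret"  # the shared-secret value from Panther

event = {"message": "chronosphere pipeline test", "time": "2024-01-01T00:00:00Z"}

# Build a POST request carrying one JSON event and the shared-secret header.
request = urllib.request.Request(
    HTTP_SOURCE_URL,
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json", SECRET_HEADER: SECRET_VALUE},
    method="POST",
)

# Uncomment to send a test event once your source is live:
# with urllib.request.urlopen(request) as response:
#     print(response.status)
```

Sending a single hand-built event like this is a quick way to confirm the shared secret and URL are correct before wiring up the full pipeline in Step 3.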

### Step 3: Configure Chronosphere Telemetry Pipeline to forward to the HTTP endpoint

1. In the Chronosphere Telemetry Pipeline web interface, navigate to your Core Instance.
2. Under **Kubernetes Namespaces**, click **Create a custom pipeline**. ![Under a "saved-mentor-b3a4." title, there are various panels. There are Edit, Automate Logging, and Create a custom pipeline buttons.](https://4011785613-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LgdiSWdyJcXPahGi9Rs-2910905616%2Fuploads%2Fgit-blob-5d61b48d2facc229ab2ec3fdf36dcee49cfb4f81%2Fimage.png?alt=media)
3. Click **+ Source** to add a source.\
   ![Under a "test" header, a "+Source" button is circled.](https://4011785613-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LgdiSWdyJcXPahGi9Rs-2910905616%2Fuploads%2Fgit-blob-127b20241638f929b3feb68447629e7f91b6456e%2Fimage.png?alt=media)
   1. In the **Add or Edit Source** slide-out panel, select a source tile.
   2. Configure the source as desired, then click **Save**.
4. Click **+ Destination** to add an [HTTP destination plugin](https://docs.chronosphere.io/pipeline-data/route/plugins/destination-plugins/http).\
   ![Under a "test" header, a "+Destination" button is circled.](https://4011785613-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LgdiSWdyJcXPahGi9Rs-2910905616%2Fuploads%2Fgit-blob-e8dd3edf62946fc06c18b2f441890acf4ff3c6f3%2Fimage.png?alt=media)
   1. On the **Add or Edit Destination** slide-out panel, under **Network Based**, click **HTTP**.\
      ![On a page titled "Add or Edit Destination," an "HTTP" tile is circled.](https://4011785613-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LgdiSWdyJcXPahGi9Rs-2910905616%2Fuploads%2Fgit-blob-0925c89aaf580774683aa1621791a82c4a3322ee%2Fimage.png?alt=media)
   2. Under **General**, set the following fields:
      1. **Host**: Enter the **HTTP Source URL** you generated in Panther in Step 2.
      2. **Port**: Enter `443`.
      3. **URI**: Enter the end of the **HTTP Source URL** you generated in Panther in Step 2, starting with `/http/`.
         * Example: `/http/cb015ee4-543c-4489-9f4b-testaa16d7a`\
           ![Under an "Add or Edit Destination" header, various form fields are shown. Host, Port, and URI are circled.](https://4011785613-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LgdiSWdyJcXPahGi9Rs-2910905616%2Fuploads%2Fgit-blob-b5a75776a167362801eccef8b88fc2f5a12e2d55%2Fimage.png?alt=media)
   3. Under **Advanced**, add a Key/Value pair under **Headers**.
      * **Key**: Enter the Shared Secret key you entered in Panther in Step 2.
      * **Value**: Enter the Shared Secret value you generated or entered in Panther in Step 2.\
        ![Under an "Add or Edit Destination" header, Key and Value fields are circled.](https://4011785613-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LgdiSWdyJcXPahGi9Rs-2910905616%2Fuploads%2Fgit-blob-ce8a45fcf74065919e2238a8b5544d3ead6408bd%2Fimage.png?alt=media)
   4. Under **Security and TLS**, click the **TLS** checkbox and set TLS Certificate Validation to **on**.\
      ![Under an "Add or Edit Destination" header, a "TLS Certificate Validation" field is circled.](https://4011785613-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LgdiSWdyJcXPahGi9Rs-2910905616%2Fuploads%2Fgit-blob-4825bd705d1b71e578db07bec938f6aeeb8bfede%2Fimage.png?alt=media)
   5. Click **Save**.
5. (Optional) Add processing rules to your pipeline by following the Chronosphere [Add processing rules to your pipeline documentation](https://docs.chronosphere.io/pipeline-data/process/processing-rules#add-processing-rules-to-your-pipeline).
6. Click **Save and deploy**.\
   ![Under a "test" header, there are +Source and +Destination buttons. There are boxes titled "Fluent Bit" and "HTTP," with a line drawn between them.](https://4011785613-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LgdiSWdyJcXPahGi9Rs-2910905616%2Fuploads%2Fgit-blob-42bd1a027e3bb93350fe0249e11389b7453484a0%2Fimage.png?alt=media)
7. Configure your log sources to route to the endpoint or port defined by the pipeline’s source(s).
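The **Host**, **Port**, and **URI** values entered in the destination form above are simply the pieces of the Panther **HTTP Source URL**. The split can be sketched as follows, using a placeholder URL with the same shape as the example in the steps above:

```python
# Deriving the Chronosphere HTTP destination's Host, Port, and URI fields
# from a Panther HTTP Source URL (placeholder URL shown).
from urllib.parse import urlparse

http_source_url = "https://logs.example.runpanther.net/http/cb015ee4-543c-4489-9f4b-testaa16d7a"

parsed = urlparse(http_source_url)
host = parsed.hostname     # goes in the "Host" field
port = parsed.port or 443  # goes in the "Port" field; HTTPS defaults to 443
uri = parsed.path          # goes in the "URI" field, starting with /http/

print(host, port, uri)
```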

