Azure Blob Storage Source

Onboarding Azure Blob Storage as a Data Transport log source in the Panther Console

Overview

Panther supports configuring Azure Blob Storage as a Data Transport to pull log data directly from your Azure container, allowing you to then write detections and perform investigations on this processed data.

Data can be sent compressed or uncompressed. Learn more about compression specifications in Ingesting compressed data in Panther.
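For example, a log file can be compressed with gzip before being uploaded to the container. The following sketch assumes the Azure CLI is installed and authenticated; the storage account and container names are hypothetical placeholders:

```shell
# Compress the log file before upload (optional; uncompressed data is also accepted).
gzip -k app.log   # produces app.log.gz and keeps the original

# Upload the compressed blob (account and container names are placeholders):
az storage blob upload \
  --account-name mypantherlogs \
  --container-name logs \
  --name app.log.gz \
  --file app.log.gz \
  --auth-mode login
```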

How to set up an Azure Blob Storage log source in Panther

To ingest logs from Azure Blob Storage, first verify in Azure that the required resource providers are registered for your subscription. Then begin setting up the source in Panther, and create the necessary Azure infrastructure, either by using a provided Terraform template or manually in the Azure Console.

Prerequisite

To create an Azure Blob Storage log source, you must first ensure that the Microsoft.EventGrid and Microsoft.Storage resource providers are registered in your Azure subscription. Verify this by following these steps:

  1. In your Azure Console, navigate to Subscriptions.

  2. Select the subscription you will be creating your Azure resources in.

  3. In the Filter by name field, search for and locate Microsoft.EventGrid and Microsoft.Storage.

    • For each of these providers, ensure the Status column has a value of Registered.
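Alternatively, the checks above can be performed from the command line. This is a sketch using the Azure CLI, assuming you have already run az login and selected the correct subscription:

```shell
# Check the registration status of each required provider:
az provider show --namespace Microsoft.EventGrid --query registrationState --output tsv
az provider show --namespace Microsoft.Storage --query registrationState --output tsv

# If either command prints "NotRegistered", register the provider:
az provider register --namespace Microsoft.EventGrid
az provider register --namespace Microsoft.Storage
```

Note that provider registration is asynchronous; re-run the `az provider show` commands until both report `Registered`.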

Step 1: Configure Azure Blob Storage in Panther

  1. In the left-hand navigation bar of your Panther Console, click Configure > Log Sources.

  2. In the upper right corner, click Create New.

  3. Click the Custom Log Formats tile.

  4. On the Basic Info page, fill in the following:

    • Name: Enter a descriptive name for your log source.

    • Log Types: Select one or more log types to associate with this log source.

  5. Click Setup.

  6. On the Log Format page, select the stream type of the incoming logs:

    • Auto

    • Lines

    • JSON

    • JSON Array

  7. Click Continue.

    • The Configuration page will load.
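To illustrate the difference between the stream types, here are two hypothetical blob payloads (field names are placeholders): with the JSON stream type, each event is a standalone JSON object, while JSON Array wraps all events in a single array. Auto attempts to detect the format, and Lines treats each newline-delimited line as one event.

```
# JSON (one object per event, newline-delimited):
{"time": "2024-01-01T00:00:00Z", "msg": "event one"}
{"time": "2024-01-01T00:00:01Z", "msg": "event two"}

# JSON Array (a single array containing all events):
[{"time": "2024-01-01T00:00:00Z", "msg": "event one"}, {"time": "2024-01-01T00:00:01Z", "msg": "event two"}]
```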

Step 2: Create required Azure infrastructure

On the Infrastructure & Configuration page, you'll create required Azure infrastructure (either by using a Panther-provided Terraform template, or manually configuring resources in the Azure Console) and provide configuration values to Panther.

Using the Terraform template to create Azure infrastructure

Once you have created the Azure resources using the Terraform template, Panther will ingest all logs written to any container in the created storage account. Ensure that the created Azure application has permission to read from each container.

  1. If you do not already have the Azure CLI installed, install it by following Azure's How to install the Azure CLI documentation.

  2. In a terminal, run az login.

  3. Move the Terraform template to a new directory, and navigate to that directory.

  4. Edit the panther.tfvars file to customize your deployment, e.g., by changing the region the infrastructure will be created in, and providing a custom storage account name.

  5. Run the following Terraform commands to create the Azure resources:

    1. terraform init

    2. terraform apply -var-file="panther.tfvars"

  6. After Terraform has finished creating the resources, copy the output values into the following fields in the Provide Azure configuration section of the Panther Console:

    • Tenant ID

    • Client ID

    • Storage Account Name

    • Storage Queue Name

    • Client Secret

      • The client secret value will be redacted in your terminal. To view it, run terraform output secret, and copy the value without quotation marks.

        • If you're running macOS, execute terraform output -raw secret | pbcopy to copy the value without printing it.

  7. Select whether your Azure Blob Storage is in Azure Government Cloud or Public Cloud.

  8. Click Setup, then continue to Step 3: Verify setup in Panther.
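As an illustration of the customization in step 4, a panther.tfvars file might look like the following. The variable names here are hypothetical; match them to the variables actually declared in the provided template:

```hcl
# Hypothetical variable names -- adjust to match the template's declarations.
location             = "eastus"          # Azure region the infrastructure is created in
storage_account_name = "pantherlogsacct" # must be globally unique; 3-24 lowercase letters and numbers
```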

Step 3: Verify setup in Panther

You will be directed to a success screen:

  • You can optionally enable one or more Detection Packs.

  • If you have not done so already, click Attach or Infer Schemas to attach one or more schemas to the source.

  • The Trigger an alert when no events are processed setting defaults to YES. We recommend leaving this enabled, as you will be alerted if data stops flowing from the log source after a certain period of time. The timeframe is configurable, with a default of 24 hours.

Viewing ingested logs

After your log source is configured, you can search ingested data using Search or Data Explorer.
