Azure Blob Storage Source
Onboarding Azure Blob Storage as a Data Transport log source in the Panther Console
Overview
Panther supports configuring Azure Blob Storage as a Data Transport to pull log data directly from your Azure container, allowing you to then write detections and perform investigations on this processed data.
Data can be sent compressed or uncompressed. Learn more about compression specifications in Ingesting compressed data in Panther.
How to set up an Azure Blob Storage log source in Panther
To ingest logs from Azure Blob Storage, you will first verify in Azure that certain resource providers are registered for your subscription. You'll then begin setting up the source in Panther and create the necessary Azure infrastructure, either by using a provided Terraform template or manually in the Azure Console.
Prerequisite
To create an Azure Blob Storage log source, you must first ensure that within your Azure subscription settings, Microsoft.EventGrid and Microsoft.Storage are registered resource providers. Verify this by following these steps:
In your Azure Console, navigate to Subscriptions.
Select the subscription you will be creating your Azure resources in.
Within the subscription settings, click Resource providers.

In the Filter by name field, search for and locate Microsoft.EventGrid and Microsoft.Storage.
For each of these providers, ensure the Status column has a value of Registered.
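You can also check and register these providers from the command line. A minimal sketch, assuming the Azure CLI is installed and you are logged in to the correct subscription:

# Check the registration status of each provider
az provider show --namespace Microsoft.EventGrid --query registrationState --output tsv
az provider show --namespace Microsoft.Storage --query registrationState --output tsv

# Register a provider if its status is not "Registered"
az provider register --namespace Microsoft.EventGrid
az provider register --namespace Microsoft.Storage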
Step 1: Configure Azure Blob Storage in Panther
In the left-hand navigation bar of your Panther Console, click Configure > Log Sources.
In the upper right corner, click Create New.
Click the Custom Log Formats tile.
In the Azure Blob Storage tile on the slide-out panel, click Start.

On the Basic Info page, fill in the following:
Name: Enter a descriptive name for your log source.
Log Types: Select one or more log types to associate with this log source.
Click Setup.
On the Log Format page, select the stream type of the incoming logs:
Auto
Lines
JSON
JSON Array
Click Continue.
The Configuration page will load.
Step 2: Create required Azure infrastructure
On the Infrastructure & Configuration page, you'll create the required Azure infrastructure (either by using a Panther-provided Terraform template or by manually configuring resources in the Azure Console) and provide configuration values to Panther.
Using the Terraform template to create Azure infrastructure
Click Terraform Template to download the Terraform template.
You can also find the Terraform template at this GitHub link.

If you do not already have the Azure CLI installed, install it by following Azure's How to install the Azure CLI documentation.
In a terminal, run az login.
Move the Terraform template to a new directory, and navigate to that directory.
Edit the panther.tfvars file to customize your deployment, e.g., by changing the region the infrastructure will be created in or providing a custom storage account name.
Run the following Terraform commands to create the Azure resources:
terraform init
terraform apply -var-file="panther.tfvars"
After Terraform has finished creating the resources, copy the output values into the following fields in the Provide Azure configuration section of the Panther Console:
Tenant ID
Client ID
Storage Account Name
Storage Queue Name
Client Secret
The client secret value will be redacted in your terminal. To view it, run terraform output secret, and copy the value without quotation marks.
If you're running macOS, run terraform output -raw secret | pbcopy to copy the value without printing it.

Select whether your Azure Blob Storage is in Azure Government Cloud or Public Cloud.
Click Setup, then continue to Step 3: Verify setup in Panther.
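For reference, the full command sequence from the Terraform steps above might look like the following sketch. It assumes the Azure CLI is installed and that the template and panther.tfvars are in your working directory; adjust values to your environment.

# Authenticate to Azure
az login

# Initialize and apply the Panther-provided template
terraform init
terraform apply -var-file="panther.tfvars"

# The client secret output is redacted by default; print it when you need it
terraform output secret

# On macOS, copy the secret to the clipboard without printing it
terraform output -raw secret | pbcopy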
Manually creating infrastructure in the Azure Console
Step 1: Create resource group and storage account
In your Azure Console, navigate to Subscriptions.
Select the subscription you will be creating your Azure resources in.
Click Resource groups.
Click +Create.

Provide values for Name and Region.
Copy down or remember the value you provide for Name, as you'll need it later in this process.
Click Review and create.
Click Create.
Click the name of your newly created resource group.
Click Create.
In the search bar, enter "storage account" and within the Storage account tile that returns, click Create.

On the Create a storage account page, in the Instance details section, enter values for Storage account name and Region.
Click Review.

Click Create.
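If you prefer scripting these resources, a rough Azure CLI equivalent is sketched below. The resource group name, storage account name, and region are placeholders, not values from this guide.

# Create a resource group
az group create --name panther-logs-rg --location eastus

# Create a storage account in that resource group
az storage account create \
  --name pantherlogsstorage \
  --resource-group panther-logs-rg \
  --location eastus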
Step 2: Add app registration and client secret
In the top search bar, search for "Microsoft Entra ID" and click on Microsoft Entra ID.
Click +Add, and in the dropdown menu that populates, click App registration.

Enter a Name.
Click Register.
Securely copy and store the Application (client) ID value, as you'll need it later in this process.
Click on your newly registered app.
On the right-hand side, click Add a certificate or secret.

Click +New client secret.
Provide a Description.
Click Add.
Securely copy and store the Client Secret value, as you'll need it later in this process.
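A command-line alternative for this step is sketched below. The display name is a placeholder, and depending on how the app is created you may also need to create a service principal for it before assigning roles in later steps.

# Register the application and note the returned client (application) ID
az ad app create --display-name panther-blob-reader --query appId --output tsv

# If a service principal wasn't created automatically, create one so roles can be assigned later
az ad sp create --id <client-id>

# Generate a client secret for the app; the command prints the secret, so store it securely
az ad app credential reset --id <client-id> --append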
Step 3: Create queue and add permission
Navigate to your newly created storage account.
In the left-hand navigation bar, select Queues.
Click +Queue to create a new queue.

Enter a Name for the queue.
Copy down or remember the value you provide for Name, as you'll need it later in this process.
Click OK.
Click on your newly created queue, then in the left-hand navigation bar, click Access Control (IAM).
Click +Add, then Add Role Assignment.
Search for "Storage Queue Data Message Processor" and select the matching role that populates.
Click on the Members tab.
Click +Select Members.
Search for the name of your registered app created in Step 2, and click Select.
Click Review+Assign.
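A roughly equivalent Azure CLI sketch for this step, reusing the placeholder names from the earlier sketches and your app's client ID. The scope shown follows the standard queue resource ID format; verify it against your subscription before assigning the role.

# Create the queue
az storage queue create --name panther-queue --account-name pantherlogsstorage

# Grant the registered app the Storage Queue Data Message Processor role on the queue
az role assignment create \
  --assignee <client-id> \
  --role "Storage Queue Data Message Processor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/panther-logs-rg/providers/Microsoft.Storage/storageAccounts/pantherlogsstorage/queueServices/default/queues/panther-queue"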
Step 4: Create system topic and event subscription
In the top search bar, search for "Event Grid System Topics" and click on the matching page that populates.
Click +Create.
On the Create Event Grid System Topic page, fill in the required fields.
Click Review+create.
Click Create.
Navigate back to your storage account.
In the left-hand navigation bar, click Events then +Event Subscription.
On the Create Event Subscription page, provide values for the following fields:
In the Event Subscription Details section, enter a Name.
In the Event Types section, for the Filter to Event Types field, select Blob Created.
In the Endpoint Details section, make the following selections:
Endpoint Type: Select Storage Queue.
Endpoint: Select the queue you created in Step 3.

Click Create.
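If you prefer scripting this step, the Azure CLI also has commands for Event Grid system topics and event subscriptions. The following is a sketch only, using the placeholder names from the earlier sketches; the exact flags and the storage-queue endpoint format can vary by CLI version, so verify them against the az eventgrid documentation before relying on this.

# Create the Event Grid system topic for the storage account
az eventgrid system-topic create \
  --name panther-system-topic \
  --resource-group panther-logs-rg \
  --location eastus \
  --topic-type Microsoft.Storage.StorageAccounts \
  --source "/subscriptions/<subscription-id>/resourceGroups/panther-logs-rg/providers/Microsoft.Storage/storageAccounts/pantherlogsstorage"

# Create an event subscription that delivers Blob Created events to the queue
az eventgrid system-topic event-subscription create \
  --name panther-blob-created \
  --resource-group panther-logs-rg \
  --system-topic-name panther-system-topic \
  --endpoint-type storagequeue \
  --endpoint "/subscriptions/<subscription-id>/resourceGroups/panther-logs-rg/providers/Microsoft.Storage/storageAccounts/pantherlogsstorage/queueServices/default/queues/panther-queue" \
  --included-event-types Microsoft.Storage.BlobCreated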
Step 5: Create container and add permission
Navigate to your newly created storage account.
In the left-hand navigation bar, select Containers.
Click +Container to create a new container.

Enter a Name for the container.
Copy down or remember the value you provide for Name, as you'll need it later in this process.
Click Create.
Click on your newly created container, then in the left-hand navigation bar, click Access Control (IAM).
Click +Add, then Add Role Assignment.
Search for "Storage Blob Data Reader" and select the matching role that populates.

Click on the Members tab.
Click +Select Members.
Search for the name of the registered app you created in Step 2, and click Select.
Click Review+Assign.
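A roughly equivalent Azure CLI sketch for this step, again using placeholder names and your app's client ID; the container scope shown follows the standard blob container resource ID format, which you should verify in your subscription.

# Create the container
az storage container create --name panther-container --account-name pantherlogsstorage

# Grant the registered app the Storage Blob Data Reader role on the container
az role assignment create \
  --assignee <client-id> \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/panther-logs-rg/providers/Microsoft.Storage/storageAccounts/pantherlogsstorage/blobServices/default/containers/panther-container"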
Step 6: Copy Azure configuration values back into the Panther Console
Return to the Infrastructure & Configuration page in your Panther Console.
In the Provide Azure configuration section, copy in values for the following fields:
Tenant ID: This value can be found on your Azure Console's Microsoft Entra ID home page.
Client ID: The application (client) ID generated in Step 2.
Storage Account Name: The name you gave your storage account in Step 1.
Storage Queue Name: The name you gave your queue in Step 3.
Client Secret: The client secret value generated in Step 2.
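If the Azure CLI is installed and you're logged in, you can also retrieve the Tenant ID for your current login from the command line:

az account show --query tenantId --output tsv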

Select whether your Azure Blob Storage is in Azure Government Cloud or Public Cloud.
Click Setup, then continue to Step 3: Verify setup in Panther.
Step 3: Verify setup in Panther
You will be directed to a success screen.

You can optionally enable one or more Detection Packs.
If you have not done so already, click Attach or Infer Schemas to attach one or more schemas to the source.
The Trigger an alert when no events are processed setting defaults to YES. We recommend leaving this enabled, as you will be alerted if data stops flowing from the log source after a certain period of time. The timeframe is configurable, with a default of 24 hours.

Viewing ingested logs
After your log source is configured, you can search ingested data using Search or Data Explorer.