Azure Blob Storage Source
Onboarding Azure Blob Storage as a Data Transport log source in the Panther Console
Panther supports configuring Azure Blob Storage as a Data Transport to pull log data directly from your Azure container, allowing you to then write detections and perform investigations on this processed data.
Data can be sent compressed or uncompressed. Learn more about compression specifications in Ingesting compressed data in Panther.
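As a minimal sketch of sending compressed data, the snippet below gzips a small newline-delimited log payload using only the Python standard library. The payload contents are hypothetical placeholders, not values from this guide; the resulting compressed object is what you would upload to your container.

```python
import gzip

# Hypothetical newline-delimited log payload.
payload = b'{"event":"login","user":"alice"}\n{"event":"logout","user":"alice"}\n'

# Compress with gzip before uploading the blob; Panther can ingest
# the compressed object directly.
compressed = gzip.compress(payload)

# Round-trip check: decompressing recovers the original bytes.
assert gzip.decompress(compressed) == payload
```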
To ingest logs from Azure Blob Storage, you will first verify in Azure that certain resource providers are registered for your subscription. You'll begin setting up the source in Panther, then create necessary Azure infrastructure, either using a provided Terraform template or manually in the Azure Console.
To create an Azure Blob Storage log source, you must first ensure that Microsoft.EventGrid and Microsoft.Storage are registered resource providers within your Azure subscription settings. Verify this by following these steps:
In your Azure Console, navigate to Subscriptions.
Select the subscription you will be creating your Azure resources in.
Within the subscription settings, click Resource providers.
In the Filter by name field, search for and locate Microsoft.EventGrid and Microsoft.Storage.
For each of these providers, ensure the Status column has a value of Registered.
In the left-hand navigation bar of your Panther Console, click Configure > Log Sources.
In the upper right corner, click Create New.
Click the Custom Log Formats tile.
In the Azure Blob Storage tile on the slide-out panel, click Start.
On the Basic Info page, fill in the following:
Name: Enter a descriptive name for your log source.
Log Types: Select one or more log types to associate with this log source.
Click Setup.
On the Log Format page, select the stream type of the incoming logs:
Auto
Lines
JSON
JSON Array
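The stream types above determine how Panther splits a blob into individual events. As an illustration of the distinction (a sketch with made-up payloads, not Panther's internal parser): Lines treats each line as one event, JSON treats each JSON object as one event, and JSON Array treats each element of a single top-level array as one event.

```python
import json

# "Lines": each line is one event (not necessarily JSON).
lines_payload = "event one\nevent two\n"
line_events = lines_payload.splitlines()

# "JSON": a stream of JSON objects, one event per object.
json_payload = '{"id": 1}\n{"id": 2}\n'
json_events = [json.loads(line) for line in json_payload.splitlines()]

# "JSON Array": one JSON array whose elements are the events.
array_payload = '[{"id": 1}, {"id": 2}]'
array_events = json.loads(array_payload)

assert line_events == ["event one", "event two"]
assert [e["id"] for e in json_events] == [1, 2]
assert [e["id"] for e in array_events] == [1, 2]
```

Auto lets Panther detect which of these formats the incoming data matches.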
Click Continue.
The Configuration page will load.
On the Infrastructure & Configuration page, you'll create required Azure infrastructure (either by using a Panther-provided Terraform template, or manually configuring resources in the Azure Console) and provide configuration values to Panther.
After you create the Azure resources using the Terraform template, Panther will ingest all logs written to any container in the created storage account. Ensure that the created Azure application has permission to read from each container.
Click Terraform Template to download the Terraform template.
If you do not already have the Azure CLI installed, install it by following Azure's How to install the Azure CLI documentation.
In a terminal, run az login.
Move the Terraform template to a new directory, and navigate to that directory.
Edit the panther.tfvars file to customize your deployment, e.g., to change the region the infrastructure will be created in, or to provide a custom storage account name.
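An illustrative panther.tfvars might look like the fragment below. The variable names shown here are placeholders; check the variables actually defined in the template you downloaded before editing.

```hcl
# Illustrative values only — confirm the variable names against the
# downloaded template before applying.
location             = "eastus"              # Azure region for the new resources
storage_account_name = "pantherlogsexample"  # must be globally unique, 3-24 lowercase letters and numbers
```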
Run the following Terraform commands to create the Azure resources:
terraform init
terraform apply -var-file="panther.tfvars"
After Terraform has finished creating the resources, copy the outputted values into the following fields in the Provide Azure configuration section of the Panther Console:
Tenant ID
Client ID
Storage Account Name
Storage Queue Name
Client Secret
The client secret value will be redacted in your terminal. To view it, run terraform output secret, and copy the value without the quotation marks.
If you're running macOS, execute terraform output -raw secret | pbcopy to copy the value without printing it.
Select whether your Azure Blob storage is in Azure Government Cloud or Public Cloud.
Click Setup, then continue to Step 3: Verify setup in Panther.
You will be directed to a success screen.
You can optionally enable one or more Detection Packs.
If you have not done so already, click Attach or Infer Schemas to attach one or more schemas to the source.
The Trigger an alert when no events are processed setting defaults to YES. We recommend leaving this enabled, as you will be alerted if data stops flowing from the log source after a certain period of time. The timeframe is configurable, with a default of 24 hours.
After your log source is configured, you can search ingested data using Search or Data Explorer.
You can also find the Terraform template at this GitHub link.
If you instead configure the resources manually in the Azure Console, the key steps include:
Click + Create.
In the search bar, enter "storage account" and in the Storage account tile that appears, click Create.
Click Review.
Click + Add, and in the dropdown menu that appears, click App registration.
On the right-hand side, click Add a certificate or secret.
Click + Queue to create a new queue.
Name: Enter a descriptive topic name.
Click + Container to create a new container.
Click + Add.
Search for "Storage Blob Data Reader" and select the matching role that appears.
Client Secret: The client secret value generated in Step 2.