Zscaler ZIA

Connecting Zscaler ZIA logs to your Panther Console

Overview

Panther supports ingesting Zscaler Internet and SaaS Access (ZIA) logs by using either an HTTP Source or an AWS S3 Source.

In order to onboard Zscaler ZIA logs in Panther, you must have a subscription to Zscaler ZIA.

How to onboard Zscaler ZIA logs to Panther

To onboard Zscaler ZIA logs in Panther, you will first create a Zscaler ZIA source in Panther, then configure a Cloud NSS Feed in Zscaler.

If you are onboarding more than one Zscaler ZIA log type in Panther:

  • In Zscaler, you must create a different Cloud NSS Feed for each log type.

  • In Panther, you may create either one Zscaler ZIA source for all of your Cloud NSS Feeds, or one Zscaler ZIA source for each Cloud NSS Feed.

Prerequisites

  • You must have permission to access the Zscaler Admin Console.

Step 1: Set up the Zscaler ZIA source in Panther

  1. In the left-hand navigation bar of your Panther Console, click Configure > Log Sources.

  2. In the top right, click Create New.

  3. Search for "Zscaler ZIA", then click its tile.

  4. In the Transport Mechanism drop-down, select the method you wish to use for this integration: HTTP or AWS S3 Bucket.

    • You can configure Zscaler to stream ZIA logs either directly to a Panther HTTP endpoint or to an S3 bucket in your environment, from which Panther will then pull.

  5. Click Start Setup.

  6. Follow Panther's instructions for configuring the Data Transport method you chose:

    • HTTP: Follow Panther's instructions for configuring an HTTP Source, beginning at Step 5.

      • During setup, on the Configure page, you will be required to use shared secret authentication.

      • Payloads sent to this source are subject to the payload requirements for all HTTP sources.

      • Do not proceed to the next step until the creation of your HTTP endpoint has completed.

    • S3: Follow Panther's instructions for configuring an S3 Source.

      • Follow the instructions on setting up an S3 source in Panther, beginning at Step 1.5.

Step 2 (for S3 ingest only): Set up an S3 bucket

  • In this Zscaler SaaS Security API and Amazon S3 Deployment Guide, follow the Integrating Zscaler Cloud NSS with Amazon S3 instructions, beginning on page 18.

    • Stop when you reach Add a Cloud NSS Feed in the ZIA Admin Portal (on page 37), as you will complete that in the next step.

Step 3: Configure a Cloud NSS Feed in the Zscaler admin console

If you are onboarding more than one Zscaler ZIA log type, you must create a different Cloud NSS Feed for each log type. Repeat this step for each log type.

If you are using HTTP as your Data Transport:

  • Follow the Adding Cloud NSS Feeds guide within the Zscaler documentation based on the log type you are onboarding:

    • For Zscaler.ZIA.AdminAuditLog, follow Adding Cloud NSS Feeds for Admin Audit Logs.

    • For Zscaler.ZIA.WebLog, follow Adding Cloud NSS Feeds for Web Logs.

    • For Zscaler.ZIA.FWLog, follow Adding Cloud NSS Feeds for Firewall Logs.

    • For Zscaler.ZIA.DNSLog, follow Adding Cloud NSS Feeds for DNS Logs.

  • When configuring the Cloud NSS Feed, take note of the following:

    • NSS Type: Select NSS for Web to onboard the Admin Audit or Web log types, or NSS for Firewall to onboard the Firewall or DNS log types.

    • SIEM Rate: Leave as Unlimited.

    • SIEM Type: Select Other.

    • OAuth 2.0 Authentication: This setting should be disabled.

    • Max Batch Size: Leave as-is.

    • API URL: Enter the HTTP Source URL you generated in the Panther Console in Step 1.

    • HTTP Headers: In the Key field, enter x-panther-zscaler. In the Value field, enter the Shared Secret Value you generated or entered in Panther in Step 1. (A test request using this header is sketched after this list.)

    • Log Type: Select the log type for which you want to send logs to Panther, and leave the rest of the fields as they are.
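
Before pointing the Cloud NSS Feed at Panther, you can optionally sanity-check the HTTP endpoint by sending it a single test event. The sketch below is not part of Panther's or Zscaler's documentation: the URL, shared secret, and sample event are placeholders you must replace, and it assumes your HTTP source accepts a single JSON object per request (see the payload requirements for all HTTP sources) with the x-panther-zscaler shared secret header described above.

import json
import urllib.request

# Placeholder values -- replace with the HTTP Source URL and Shared Secret
# Value generated in Step 1.
PANTHER_HTTP_SOURCE_URL = "https://REPLACE-ME/http/REPLACE-ME"
SHARED_SECRET_VALUE = "REPLACE-ME"

# A hypothetical payload shaped like the Zscaler.ZIA.AdminAuditLog schema below.
sample_event = {
    "sourcetype": "zscalernss-audit",
    "event": {
        "time": "Mon Jan  6 10:15:00 2025",
        "recordid": "1234567",
        "action": "SIGN_IN",
        "result": "SUCCESS",
        "adminid": "admin@example.com",
        "clientip": "203.0.113.10",
    },
}

request = urllib.request.Request(
    PANTHER_HTTP_SOURCE_URL,
    data=json.dumps(sample_event).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # The header key and value must match the shared secret
        # configuration on your Panther HTTP source.
        "x-panther-zscaler": SHARED_SECRET_VALUE,
    },
    method="POST",
)

with urllib.request.urlopen(request) as response:
    print(response.status)  # Expect a 2xx status if the source accepted the event.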

If you are using S3 as your Data Transport:

  • In this Zscaler SaaS Security API and Amazon S3 Deployment Guide, follow the Add a Cloud NSS Feed in the ZIA Admin Portal instructions, beginning on page 37.

Supported log types

Zscaler.ZIA.AdminAuditLog

The Admin Audit log records key events in the Zscaler admin console, such as logins, logouts, and resource actions (like create and update). The Admin Audit log is primarily used to investigate potentially suspicious activity or diagnose and troubleshoot errors.

References:

  • General information about Audit Logs

  • Format of Admin Audit logs

schema: Zscaler.ZIA.AdminAuditLog
description: Zscaler ZIA Admin Audit Log
referenceURL: https://help.zscaler.com/zia/nss-feed-output-format-admin-audit-logs
fields:
  - name: sourcetype
    required: true
    description: The type of source generating the log event.
    type: string
  - name: event
    required: true
    description: The audit log event.
    type: object
    fields:
      - name: time
        required: true
        description: The timestamp of the audit log.
        type: timestamp
        timeFormats:
          - '%a %b %e %H:%M:%S %Y'
        isEventTime: true
      - name: recordid
        required: true
        description: The unique identifier for the log.
        type: string
      - name: action
        required: true
        description: The action performed.
        type: string
      - name: category
        description: The location in the portal where the action was performed.
        type: string
      - name: subcategory
        description: The sub-location in the portal where the action was performed.
        type: string
      - name: resource
        description: The specific location within a sub-category.
        type: string
      - name: interface
        description: The means by which the user performed their actions.
        type: string
      - name: adminid
        description: The login id of the admin who performed the action.
        type: string
        indicators:
          - email
          - actor_id
      - name: clientip
        description: The source IP address for the admin.
        type: string
        indicators:
          - ip
      - name: result
        description: The outcome of an action.
        type: string
      - name: errorcode
        description: The error code if the action failed.
        type: string
      - name: auditlogtype
        description: The Admin Audit log type.
        type: string
      - name: preaction
        description: Data before any policy or configuration changes.
        type: json
      - name: postaction
        description: Data after any policy or configuration changes.
        type: json
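
To show how the nested event object above is typically accessed in a detection, here is a minimal Panther Python rule sketch that alerts on failed admin actions. It is not a Panther-managed detection, and the "FAILURE" result value is an assumption; check the values your tenant actually emits.

# Hypothetical rule for Zscaler.ZIA.AdminAuditLog events.
def rule(event):
    # The audit payload is nested under the top-level "event" key.
    audit = event.get("event", {})
    # Alert when the action failed outright or returned an error code.
    # "FAILURE" is an assumed value for the result field.
    return bool(audit.get("errorcode")) or audit.get("result", "").upper() == "FAILURE"


def title(event):
    audit = event.get("event", {})
    admin = audit.get("adminid", "<unknown admin>")
    action = audit.get("action", "<unknown action>")
    return f"Zscaler ZIA admin action failed: {admin} - {action}"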

Zscaler.ZIA.WebLog

The Web Log records detailed information about user internet activity through Zscaler, including allowed and blocked requests to websites. It tracks URL categories, risk levels, and policy enforcement actions, making it essential for monitoring browsing behavior, enforcing compliance, and detecting potential threats.

References:

  • Format of Web logs

schema: Zscaler.ZIA.WebLog
description: Zscaler ZIA Web Log
referenceURL: https://help.zscaler.com/zia/nss-feed-output-format-web-logs
fields:
  - name: sourcetype
    required: true
    description: The type of source generating the log event.
    type: string
  - name: event
    required: true
    description: The web log event.
    type: object
    fields:
      - name: datetime
        required: true
        type: timestamp
        description: The time and date of the transaction
        timeFormats:
          - '%Y-%m-%d %H:%M:%S'
        isEventTime: true
      - name: reason
        type: string
        description: The action that the service took and the policy that was applied, if the transaction was blocked
      - name: event_id
        description: The unique record identifier for each log
        type: string
      - name: protocol
        description: The protocol type of the transaction
        type: string
      - name: action
        description: The action that the service took on the transaction
        type: string
      - name: transactionsize
        description: The total size of the HTTP transaction in bytes
        type: bigint
      - name: responsesize
        description: The total size of the HTTP response, including the header and payload, in bytes
        type: bigint
      - name: requestsize
        description: The request size in bytes
        type: bigint
      - name: ClientIP
        description: The IP address of the user
        type: string
        indicators:
          - ip
      - name: appclass
        description: The web application class of the application that was accessed.
        type: string
      - name: appname
        description: The name of the cloud application
        type: string
      - name: bwthrottle
        description: Indicates whether the transaction was throttled due to a configured bandwidth policy
        type: string
      - name: clientpublicIP
        description: The client public IP address
        type: string
        indicators:
          - ip
      - name: contenttype
        description: The name of the content type
        type: string
      - name: department
        description: The department of the user
        type: string
      - name: devicehostname
        description: The hostname of the device
        type: string
      - name: deviceowner
        description: The owner of the device
        type: string
      - name: dlpdictionaries
        description: The DLP dictionaries that were matched, if any
        type: string
      - name: dlpengine
        description: The DLP engine that was matched, if any
        type: string
      - name: fileclass
        description: The class of file downloaded during the transaction
        type: string
      - name: filetype
        description: Type of file involved in the transaction.
        type: string
      - name: hostname
        description: Hostname of the URL being accessed.
        indicators:
          - hostname
        type: string
      - name: keyprotectiontype
        description: Indicates whether an HSM Protection or a Software Protection intermediate CA certificate is used
        type: string
      - name: location
        description: The gateway location or sub-location of the source.
        type: string
      - name: pagerisk
        description: The Page Risk Index score of the destination URL.
        type: string
      - name: product
        description: The product name
        type: string
      - name: refererURL
        description: The referer URL
        type: string
      - name: requestmethod
        description: The request method
        type: string
      - name: serverip
        description: The destination server IP address. This displays 0.0.0.0 if the request was blocked.
        type: string
        indicators:
          - ip
      - name: status
        description: The response code
        type: string
      - name: threatcategory
        description: The category of malware that was detected in the transaction, if any
        type: string
      - name: threatclass
        description: The class of malware that was detected in the transaction, if any
        type: string
      - name: threatname
        description: The name of the threat that was detected in the transaction, if any
        type: string
      - name: unscannabletype
        description: The unscannable file type
        type: string
      - name: url
        required: true
        description: The destination URL
        type: string
        indicators:
          - url
      - name: urlcategory
        description: The category of the destination URL
        type: string
      - name: urlclass
        description: The class of the destination URL
        type: string
      - name: urlsupercategory
        description: The super category of the destination URL
        type: string
      - name: user
        required: true
        description: The user's login name in email address format
        type: string
        indicators:
          - email
          - actor_id
          - username
      - name: useragent
        description: The user agent
        type: string
      - name: vendor
        description: The vendor name
        type: string
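
As an example of working with these fields, the following is a minimal Panther Python rule sketch that alerts when Zscaler attaches a threat classification to a web transaction. It is illustrative only; the "None" sentinel and the "allowed"/"blocked" action values are assumptions about how your feed populates those fields.

# Hypothetical rule for Zscaler.ZIA.WebLog events.
def rule(event):
    web = event.get("event", {})
    threat = web.get("threatname", "")
    # Alert whenever a malware classification was recorded for the transaction.
    return bool(threat) and threat.lower() != "none"


def severity(event):
    # Treat allowed transactions that carried a threat as more severe than
    # ones Zscaler already blocked.
    action = event.get("event", {}).get("action", "").lower()
    return "HIGH" if action == "allowed" else "MEDIUM"


def title(event):
    web = event.get("event", {})
    return (
        f"Zscaler ZIA detected {web.get('threatname', 'a threat')} for "
        f"{web.get('user', '<unknown user>')} at {web.get('hostname', '<unknown host>')}"
    )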

Zscaler.ZIA.FWLog

The firewall log records non-web traffic events managed by Zscaler ZIA firewall, including details about source and destination IPs, protocols, ports, and session actions. It is used to monitor application usage, enforce network policies, and identify suspicious or unauthorized traffic patterns.

References:

  • Format of Firewall logs

schema: Zscaler.ZIA.FWLog
description: Zscaler ZIA Firewall Log
referenceURL: https://help.zscaler.com/zia/nss-feed-output-format-firewall-logs
fields:
  - name: sourcetype
    required: true
    description: The type of source generating the log event.
    type: string
  - name: event
    required: true
    description: The firewall log event.
    type: object
    fields:
      - name: datetime
        required: true
        type: timestamp
        description: The time and date of the transaction
        timeFormats:
          - '%a %b %e %H:%M:%S %Y'
        isEventTime: true
      - name: action
        description: The action that the service took on the transaction, Allowed or Blocked
        type: string
      - name: aggregate
        description: Indicates whether the Firewall session is aggregated
        type: string
      - name: avgduration
        description: The average session duration, in milliseconds, if the sessions were aggregated
        type: bigint
      - name: cdip
        description: The client destination IP address
        type: string
        indicators:
          - ip
      - name: cdport
        description: The client destination port
        type: bigint
      - name: csip
        required: true
        description: The client source IP address
        type: string
        indicators:
          - ip
      - name: csport
        description: The client source port
        type: bigint
      - name: department
        description: The department of the user
        type: string
      - name: destcountry
        description: The abbreviated code of the country of the destination IP address
        type: string
      - name: devicehostname
        description: The hostname of the device
        type: string
      - name: deviceowner
        description: The owner of the device
        type: string
      - name: dnat
        description: Indicates if the destination NAT policy was applied
        type: string
      - name: duration
        description: The session or request duration in seconds
        type: bigint
      - name: durationms
        description: The session or request duration in milliseconds
        type: bigint
      - name: inbytes
        description: The number of bytes sent from the server to the client
        type: bigint
      - name: ipcat
        description: The URL category that corresponds to the server IP address
        type: string
      - name: ipsrulelabel
        description: The name of the IPS policy that was applied to the Firewall session
        type: string
      - name: locationname
        description: The name of the location from which the session was initiated
        type: string
      - name: numsessions
        description: The number of sessions that were aggregated
        type: bigint
      - name: nwapp
        description: The network application that was accessed
        type: string
      - name: nwsvc
        description: The network service that was used
        type: string
      - name: outbytes
        description: The number of bytes sent from the client to the server
        type: bigint
      - name: proto
        description: The type of IP protocol
        type: string
      - name: rulelabel
        description: The name of the rule that was applied to the transaction
        type: string
      - name: sdip
        description: The server destination IP address
        type: string
        indicators:
          - ip
      - name: sdport
        description: The server destination port
        type: bigint
      - name: ssip
        description: The server source IP address
        type: string
        indicators:
          - ip
      - name: ssport
        description: The server source port
        type: bigint
      - name: stateful
        description: Indicates if the Firewall session is stateful
        type: string
      - name: threatcat
        description: The category of the threat in the Firewall session by the IPS engine
        type: string
      - name: threatname
        description: The name of the threat detected in the Firewall session by the IPS engine
        type: string
      - name: tsip
        description: The tunnel IP address of the client (source)
        type: string
        indicators:
          - ip
      - name: tunsport
        description: The tunnel port
        type: bigint
      - name: tuntype
        description: The traffic forwarding method used to send the traffic to the Firewall
        type: string
      - name: user
        required: true
        description: The user's login name in email address format
        type: string
        indicators:
          - email
          - username
          - actor_id
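
For illustration, here is a minimal Panther Python rule sketch over these fields that alerts when the IPS engine flags a Firewall session, grouping repeated sessions from the same client into one alert. The "None" sentinel value is an assumption.

# Hypothetical rule for Zscaler.ZIA.FWLog events.
def rule(event):
    fw = event.get("event", {})
    threat = fw.get("threatname") or fw.get("threatcat") or ""
    # Alert when the IPS engine attached a threat name or category.
    return bool(threat) and threat.lower() != "none"


def dedup(event):
    fw = event.get("event", {})
    # Collapse repeated sessions from the same client source IP and threat.
    return f"{fw.get('csip', '')}-{fw.get('threatname', '')}"


def title(event):
    fw = event.get("event", {})
    return (
        f"Zscaler ZIA firewall threat {fw.get('threatname', '<unknown threat>')} "
        f"from {fw.get('csip', '<unknown source>')}"
    )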

Zscaler.ZIA.DNSLog

The DNS Log captures all DNS queries and responses processed by Zscaler ZIA, including allowed, blocked, or unresolved domains. It provides visibility into domain usage, helps detect malicious or suspicious activity (such as DNS tunneling), and supports policy enforcement for secure DNS filtering.

References:

  • Format of DNS logs

schema: Zscaler.ZIA.DNSLog
description: Zscaler ZIA DNS Log
referenceURL: https://help.zscaler.com/zia/nss-feed-output-format-dns-logs
fields:
  - name: sourcetype
    required: true
    description: The type of source generating the log event.
    type: string
  - name: event
    required: true
        description: The DNS log event.
    type: object
    fields:
      - name: datetime
        required: true
        type: timestamp
        description: The time and date of the transaction
        timeFormats:
          - '%a %b %e %H:%M:%S %Y'
        isEventTime: true
      - name: category
        type: string
        description: The event category
      - name: clt_sip
        description: The IP address of the user
        type: string
        indicators:
          - ip
      - name: department
        description: The department of the user
        type: string
      - name: devicehostname
        description: The hostname of the device
        type: string
      - name: deviceowner
        description: The owner of the device
        type: string
      - name: dns_req
        description: The DNS request
        type: string
        indicators:
          - domain
      - name: dns_reqtype
        required: true
        description: The DNS request type
        type: string
      - name: dns_resp
        description: The DNS response
        type: string
      - name: durationms
        description: The duration of the DNS request in milliseconds
        type: bigint
      - name: location
        description: The event location
        type: string
      - name: reqaction
        description: The name of the action that was applied to the DNS request
        type: string
      - name: reqrulelabel
        description: The name of the rule that was applied to the DNS request
        type: string
      - name: resaction
        description: The name of the action that was applied to the DNS response
        type: string
      - name: respipcategory
        description: The response IP category
        type: string
      - name: resrulelabel
        description: The name of the rule that was applied to the DNS response
        type: string
      - name: srv_dip
        description: The server IP
        type: string
        indicators:
          - ip
      - name: srv_dport
        description: The server port
        type: bigint
      - name: user
        required: true
        description: The login name in email address format
        type: string
        indicators:
          - email
          - username
          - actor_id
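
As a final illustration, here is a minimal Panther Python rule sketch that alerts on DNS requests or responses blocked by a DNS Control policy and surfaces the most useful pivot fields. The "block" substring match is an assumption about the action names used in your tenant.

# Hypothetical rule for Zscaler.ZIA.DNSLog events.
def rule(event):
    dns = event.get("event", {})
    actions = (dns.get("reqaction", ""), dns.get("resaction", ""))
    # Alert when either the request or the response action indicates a block.
    return any("block" in action.lower() for action in actions)


def alert_context(event):
    dns = event.get("event", {})
    # Include the key pivot fields in the alert for faster triage.
    return {
        "user": dns.get("user"),
        "dns_req": dns.get("dns_req"),
        "dns_reqtype": dns.get("dns_reqtype"),
        "rule": dns.get("reqrulelabel") or dns.get("resrulelabel"),
    }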
