CrowdStrike Event Streams

Panther supports connecting to CrowdStrike's Event Streams API

Overview

Panther can fetch CrowdStrike events by querying the CrowdStrike Event Streams API. Panther queries for new events every minute.

CrowdStrike Event Streams exports only non-sensor data, which includes SaaS audit activity and CrowdStrike Detection Summary events. To ingest device telemetry, a CrowdStrike Falcon Data Replicator (FDR) source is required.

The act of Panther querying the Event Streams API for new events itself generates additional logs. If this creates unwanted noise in your integration, you can configure an ingestion filter to filter out these logs.
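To make the polling flow concrete, below is a minimal sketch of the documented Event Streams consumption pattern: exchange credentials for an OAuth2 token, discover the available streams, then read events from a stored offset. This is an illustration of CrowdStrike's public API, not Panther's implementation; the base URL, App Id, credentials, and offset are placeholders you would supply.

import requests

BASE_URL = "https://api.crowdstrike.com"  # substitute your region's base URL
APP_ID = "pantherdemo"                    # your App Id (max 20 alphanumeric characters)
OFFSET = 0                                # resume point; 0 replays the stream from the start

# 1. Exchange the API client's Client ID and Secret for an OAuth2 bearer token.
token = requests.post(
    f"{BASE_URL}/oauth2/token",
    data={"client_id": "YOUR_CLIENT_ID", "client_secret": "YOUR_SECRET"},
    timeout=10,
).json()["access_token"]

# 2. Discover the event stream(s) available to this appId.
streams = requests.get(
    f"{BASE_URL}/sensors/entities/datafeed/v2",
    headers={"Authorization": f"Bearer {token}"},
    params={"appId": APP_ID},
    timeout=10,
).json()["resources"]

# 3. Read newline-delimited JSON events, resuming from the stored offset.
for stream in streams:
    feed_url = f"{stream['dataFeedURL']}&offset={OFFSET}"
    session_token = stream["sessionToken"]["token"]
    with requests.get(
        feed_url,
        headers={"Authorization": f"Token {session_token}"},
        stream=True,
        timeout=30,
    ) as resp:
        for line in resp.iter_lines():
            if line:  # skip keep-alive blank lines
                print(line.decode("utf-8"))

Each event envelope carries a metadata.offset field (see the schema below); a consumer persists the last offset it processed so it can resume without replaying earlier events.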

How to onboard CrowdStrike Event Streams logs to Panther

Prerequisite

  • Before you can use this method, you'll need to contact your CrowdStrike support team to enable streaming APIs on your CrowdStrike account.

Step 1: Create CrowdStrike Falcon API client

  1. Log into the Falcon console using an account with administrator-level permissions.

  2. In the navigation bar, click Support and resources > API clients and keys.

  3. Within the OAuth2 API clients tab, click Create API client.

  4. Fill in the Create API client form:

    • Client name: Enter a descriptive name.

    • Description: Enter a useful description.

    • In the table of scopes, in the Event streams row, select the Read checkbox.

  5. Click Create.

  6. The API client created pop-up modal will display Client ID, Secret, and Base URL values. Copy these values and store them in a secure location, as you will need them in the next step. This is the only time the Secret will be shown.

  7. Click Done.
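Before moving on, you may want to confirm the new API client can authenticate. A quick sanity-check sketch using Python's requests library; the URL below assumes the US-1 region, so substitute the Base URL shown in the modal:

import requests

resp = requests.post(
    "https://api.crowdstrike.com/oauth2/token",  # use your region's Base URL
    data={
        "client_id": "YOUR_CLIENT_ID",
        "client_secret": "YOUR_SECRET",
    },
    timeout=10,
)
resp.raise_for_status()  # a 201-range response means the credentials are valid
print("Token acquired; expires in", resp.json()["expires_in"], "seconds")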

Step 2: Create a new CrowdStrike Event Streams source in Panther

  1. In the left-hand navigation bar of your Panther Console, click Configure > Log Sources.

  2. Click Create New.

  3. Search for "CrowdStrike Event Streams," then click its tile. In the slide-out panel, click Start Setup.

  4. On the Configure page, enter a descriptive Name for the source.

  5. Click Setup.

  6. On the Credentials page, fill in the form:

    • Client Id: Enter the Client ID you generated in CrowdStrike in the previous step.

    • Client Secret: Enter the Secret you generated in CrowdStrike in the previous step.

    • Client Cloud: Select the region shown in the Base URL you generated in CrowdStrike in the previous step.

    • App Id: Enter a label to identify your connection.

      • There is a maximum of 20 alphanumeric characters (a-z, A-Z, 0-9).

    • Member Cid (Optional): Enter the Customer ID (CID) selector, for cases when the CrowdStrike Client Id and Secret have access to multiple CIDs.

  7. Click Setup. You will be directed to a success screen:

    • The Trigger an alert when no events are processed setting defaults to YES. We recommend leaving this enabled, as you will be alerted if data stops flowing from the log source after a certain period of time. The timeframe is configurable, with a default of 24 hours.

You can optionally enable one or more Detection Packs.
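If you generate the App Id value programmatically, a small pre-flight check of the constraint noted in step 6 can save a failed setup. A sketch, illustrative only:

import re

# The App Id must be 1-20 alphanumeric characters (a-z, A-Z, 0-9).
def valid_app_id(app_id: str) -> bool:
    return re.fullmatch(r"[A-Za-z0-9]{1,20}", app_id) is not None

assert valid_app_id("pantherEventStreams")
assert not valid_app_id("panther-event-streams")  # hyphens are rejected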

Supported log types

Crowdstrike.EventStreams

Crowdstrike.EventStreams logs represent activity observed on your hosts by the Falcon sensor and shown in the Falcon console's Investigate dashboards and searches.

schema: Crowdstrike.EventStreams
description: Events related to activity that's observed on your hosts by the Falcon sensor and shown in the Falcon console's Investigate dashboards and searches
referenceURL: https://developer.crowdstrike.com/crowdstrike/docs/streaming-api-events
fields:
  - name: event
    required: true
    description: The data for the detection or audit event
    type: object
    fields:
      - name: OperationName
        description: The operation name
        type: string
      - name: ServiceName
        description: The service name
        type: string
      - name: UTCTimestamp
        description: Time when the operation took place in UNIX EPOCH time
        type: timestamp
        timeFormats:
          - unix_auto
        isEventTime: true
      - name: UserId
        type: string
        indicators:
          - email
      - name: UserIp
        type: string
        indicators:
          - ip
      - name: Success
        type: boolean
      - name: ComputerName
        description: Host name
        type: string
        indicators:
          - hostname
      - name: DetectDescription
        description: |
          A description of what an adversary was trying to do in the environment and guidance on how to begin an investigation. NOTE: While these descriptions are robust and drive a helpful console experience, we encourage you to not use this field to drive workflows, as values are updated and added regularly
        type: string
      - name: Description
        type: string
      - name: DetectId
        description: The Detection ID for the detection. Can be used in other APIs, such as Detection Resolution and ThreatGraph
        type: string
      - name: CompositeId
        type: string
      - name: FalconHostLink
        description: Link to view detection event in Falcon console
        type: string
        indicators:
          - url
      - name: IOARuleInstanceId
        type: string
      - name: IOARuleInstanceVersion
        type: string
      - name: IOARuleName
        type: string
      - name: IOARuleGroupName
        type: string
      - name: FileName
        type: string
      - name: FilePath
        type: string
      - name: ProcessStartTime
        description: Timestamp of when a process started in UNIX EPOCH time
        type: timestamp
        timeFormats:
          - unix_auto
      - name: ProcessEndTime
        description: Timestamp of when a process ended in UNIX EPOCH time
        type: timestamp
        timeFormats:
          - unix_auto
      - name: ProcessId
        description: Process ID
        type: string
      - name: UserName
        description: User name
        type: string
        indicators:
          - username
      - name: DetectName
        description: |
          NOTE: The DetectName field has been replaced by Objective, Tactic, and Technique as we have aligned with MITRE’s ATT&CK. DetectName will be deprecated January 16, 2019
        type: string
      - name: Name
        type: string
      - name: CommandLine
        description: The command line used to create this process
        type: string
      - name: MD5String
        description: MD5 hash
        type: string
        indicators:
          - md5
      - name: SHA1String
        type: string
        indicators:
          - sha1
      - name: SHA256String
        description: SHA256 hash
        type: string
        indicators:
          - sha256
      - name: MachineDomain
        description: The Windows Domain Name to which the machine is currently joined
        type: string
      - name: SensorId
        description: Falcon sensor Agent ID
        type: string
      - name: LocalIP
        type: string
        indicators:
          - ip
      - name: MACAddress
        type: string
        indicators:
          - mac
      - name: Objective
        description: The name of the objective associated to the behavior
        type: string
      - name: PatternDispositionDescription
        description: The description of the pattern associated to the action taken on the behavior
        type: string
      - name: PatternDispositionValue
        description: The numerical ID of the pattern associated to the action taken on the behavior
        type: bigint
      - name: PatternDispositionFlags
        type: json
      - name: DocumentsAccessed
        type: array
        element:
          type: object
          fields:
            - name: Timestamp
              description: Time the document was accessed in UNIX EPOCH time
              type: timestamp
              timeFormats:
                - unix_auto
            - name: Filename
              description: |
                Name of file accessed. Note: A Detection Summary can have 0 or more DocumentsAccessed_FileName entries, and there is a timestamp for each DocumentsAccessed_FileName entry.
              type: string
            - name: Filepath
              description: |
                File path, if a document was accessed. Note: A Detection Summary can have 0 or more DocumentsAccessed_FilePath entries.
              type: string
      - name: Commands
        type: array
        element:
          type: string
      - name: ParentProcessId
        description: Parent Process ID
        type: string
      - name: ParentCommandLine
        type: string
      - name: ParentImageFileName
        type: string
      - name: GrandparentCommandLine
        type: string
      - name: GrandparentImageFilename
        type: string
      - name: NetworkAccesses
        type: array
        element:
          type: object
          fields:
            - name: ConnectionDirection
              description: Whether the connection is inbound (1), outbound (0), or neither (2)
              type: int
            - name: LocalAddress
              description: Local IP address
              type: string
              indicators:
                - ip
            - name: LocalPort
              description: Local port of a network connection, as the normal port number (e.g., an incoming SSH connection is 22)
              type: bigint
            - name: Protocol
              description: RFC-1700 IP protocol identifier
              type: string
            - name: RemoteAddress
              description: Remote IP address
              type: string
              indicators:
                - ip
            - name: RemotePort
              description: Remote port
              type: bigint
      - name: Severity
        description: 0 (N/A), 1 (Informational), 2 (Low), 3 (Medium), 4 (High), 5 (Critical)
        type: float
      - name: SeverityName
        description: 0 (N/A), 1 (Informational), 2 (Low), 3 (Medium), 4 (High), 5 (Critical)
        type: string
      - name: Tactic
        description: The name of the tactic associated to the behavior
        type: string
      - name: Technique
        description: The name of the technique associated to the behavior
        type: string
      - name: AuditKeyValues
        type: array
        element:
          type: json
      - name: IncidentType
        type: string
      - name: IncidentStartTime
        type: timestamp
        timeFormats:
          - unix_auto
      - name: IncidentEndTime
        type: timestamp
        timeFormats:
          - unix_auto
      - name: IncidentId
        type: string
      - name: State
        type: string
      - name: FineScore
        type: float
      - name: LateralMovement
        type: string
      - name: SessionId
        type: string
        indicators:
          - trace_id
      - name: HostnameField
        type: string
        indicators:
          - hostname
      - name: StartTimestamp
        type: timestamp
        timeFormats:
          - unix_auto
      - name: EndTimestamp
        type: timestamp
        timeFormats:
          - unix_auto
  - name: metadata
    required: true
    description: The metadata for this detection or audit event
    type: object
    fields:
      - name: customerIDString
        description: Unique ID assigned by CS for each customer
        type: string
      - name: offset
        required: true
        description: |
          Starts at offset=0. Each new event (AuthActivityAuditEvent, DetectionSummaryEvent, UserActivityAuditEvent) would increase the offset counter by one. When reconnecting to Falcon Streaming API, you can specify the offset value to tell the API the starting point where you’d like to receive the events. If omitted, the API would return all previous Detection Summary or Authentication events starting with offset=0
        type: bigint
      - name: version
        type: string
      - name: eventType
        type: string
      - name: eventCreationTime
        required: true
        description: Time when this event was generated in UNIX EPOCH time
        type: timestamp
        timeFormats:
          - unix_auto
        isEventTime: true
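Once events are flowing, you can write Panther Python detections against this log type. Below is a hedged sketch of a rule that alerts on high-severity Detection Summary events. The field names come from the schema above; the SeverityName string values ("High", "Critical") and the availability of the event.deep_get helper are assumptions about your rule runtime.

HIGH_SEVERITIES = {"High", "Critical"}

def rule(event):
    # Only fire on Detection Summary events, not audit activity.
    if event.deep_get("metadata", "eventType") != "DetectionSummaryEvent":
        return False
    return event.deep_get("event", "SeverityName") in HIGH_SEVERITIES

def title(event):
    host = event.deep_get("event", "ComputerName", default="<unknown host>")
    objective = event.deep_get("event", "Objective", default="unknown objective")
    severity = event.deep_get("event", "SeverityName", default="High")
    return f"CrowdStrike {severity} detection on {host}: {objective}"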
