IPinfo


Overview

Panther has partnered with IPinfo, a trusted source for IP address data, to provide integrated IP-related enrichment to Panther customers. The IPinfo integration is an Enrichment Provider, also known as a Panther-managed Lookup Table.

Use IPinfo enrichment data in your Panther detections to reduce false-positive alerts by:

  • Cross-examining the current IP geolocation details of suspicious users to discover irregularities in their profile information and block them.

  • Preemptively identifying and blocking traffic from high-risk locations or networks before it reaches your environment.

  • Accurately and reliably discovering other entities related to the target that may pose a security risk.

The IPinfo datasets are available to all Panther accounts at no additional cost and are disabled by default. Learn how to enable the IPinfo datasets and how to query IPinfo data in the data lake below.

How IPinfo works

Alert events are automatically enriched with IPinfo data within the p_enrichment field in JSON events.

IPinfo data can be accessed in detections with deep_get (and pre-built Python helpers).

IPinfo datasets are stored in bulk as Panther-managed Lookup Tables, so there is no need to make API calls to leverage this enrichment in your detection logic or alerts.

The data from IPinfo is updated once a day.
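Because enrichment is attached under p_enrichment, you can also read it directly with deep_get. The snippet below is a minimal sketch, assuming the standard p_enrichment nesting of lookup table name followed by matched field name; the specific ipinfo_location keys shown are an assumption for illustration, not copied from this page.

from panther_base_helpers import deep_get

def rule(event):
    # Assumed layout: p_enrichment.<lookup_table>.<matched_field>.<attribute>
    country = deep_get(
        event, "p_enrichment", "ipinfo_location", "sourceIPAddress", "country"
    )
    return country is not None and country != "US"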

IPinfo datasets

There are three data types available from IPinfo that add contextual information about IP addresses:

  • ASN Data — see IPinfo's ASN API documentation.

  • Geolocation Data — see IPinfo's IP Geolocation API documentation.

  • Privacy Data — see IPinfo's Privacy Detection API documentation.

How to enable IPinfo datasets

To view and manage IPinfo packs in the Panther Console, Analyst roles must be assigned the View Lookups and Manage Lookups permissions.

To enable IPinfo Panther-managed Lookup Tables:

  1. In the left-hand navigation bar of your Panther Console, click Detections.

  2. Click the Packs tab.

    • On this page, you can see built-in packs available for IPinfo.

  3. On the right side of the IPinfo tile you wish to enable, click the toggle to enable the pack.

  4. Click Continue in the dialog that appears.

  5. To verify that the IPinfo datasets are enabled, click Configure > Enrichment Providers in the left sidebar menu.

    • On this page, you can see Panther-managed enrichment sources (such as IPinfo). You can also see whether the sources are currently enabled or disabled and when a source’s data was last refreshed.

    • The six IPinfo source tables are visible, as well as the time they were last refreshed. Disabled datasets will not be refreshed.

      • The ipinfo_asn, ipinfo_location, and ipinfo_privacy tables are used for real-time lookups in the detection engine.

      • The ipinfo_asn_datalake, ipinfo_location_datalake, and ipinfo_privacy_datalake tables are used for querying and joining to IPinfo data in the data lake.

If you are using a CI/CD workflow, see the CI/CD users section below to learn about additional considerations.

CI/CD users

Please note the following considerations:

  • CI/CD users do not need to use Detection Packs to get the IPinfo Lookup Tables: you can pull in the latest release of panther-analysis and use the panther_analysis_tool (PAT) to upload the IPinfo Lookup Tables. To enable the IPinfo tables using the panther-analysis repo, open each corresponding YAML configuration file and set enabled: true.

  • It is possible for CI/CD users to enable IPinfo Lookup Tables via Detection Packs, as long as you do not customize the IPinfo tables using PAT.

    • If you choose to manage IPinfo through PAT after enabling it in the Panther Console, you must first disable the Detection Packs in the Panther Console. Simultaneous use of both the Panther Console and PAT to manage IPinfo is not supported.

  • For more information on managing IPinfo Lookup Tables, see Managing Lookup Tables and Enrichment Providers with the Panther Analysis Tool. If you'd like to make additional changes through CI/CD with PAT, please contact your Panther representative.
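For illustration, the enabled-flag change looks roughly like the following. This is a hedged sketch: the file path, key names, and key casing are assumptions, and only the enabled flag itself is taken from this page — check the actual YAML files in the panther-analysis repo for the exact schema.

# lookup_tables/ipinfo/ipinfo_location.yml (illustrative path and keys)
AnalysisType: lookup_table   # assumed analysis type for lookup tables
LookupName: ipinfo_location  # illustrative name
Enabled: true                # the flag this page says to set (casing may differ)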

How to query IPinfo data in the data lake

There are three IPinfo tables in the data lake:

  • ipinfo_asn_datalake

  • ipinfo_location_datalake

  • ipinfo_privacy_datalake

For each of the above tables, there is also a <table>_history table that records all changes.
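As a hedged sketch (assuming the history tables carry the same Panther p_ timestamp columns shown on the base tables in the join example below), you could review recent changes to the ASN data like this:

SELECT *
FROM panther_lookups.public.ipinfo_asn_datalake_history
WHERE p_parse_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
LIMIT 100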

Using a joinkey

When querying the data lake for IPinfo data, you must use a joinkey to make the queries efficient. The following user-defined functions make setting a joinkey easier:

  • PANTHER_LOOKUPS.PUBLIC.IPINFO_RANGE_TO_CIDR

  • PANTHER_LOOKUPS.PUBLIC.IPINFO_TO_INT

  • PANTHER_LOOKUPS.PUBLIC.IPINFO_TO_IP

    • Note: IPinfo's code for TO_IP supports IPv4 only.

  • PANTHER_LOOKUPS.PUBLIC.IPINFO_TO_JOIN_KEY

See the example showing how to query the data lake using a joinkey below.
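To see what these functions produce, the sketch below simply evaluates two of them against a literal IP address, mirroring how the joinkey example further down uses them:

SELECT
   PANTHER_LOOKUPS.PUBLIC.IPINFO_TO_JOIN_KEY('71.114.47.25') AS join_key,
   PANTHER_LOOKUPS.PUBLIC.IPINFO_TO_INT('71.114.47.25') AS ip_as_int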

Examples

Example: Alert based on IPinfo location data

In this example, we create a rule that alerts on every AWS Console login from an unexpected country.

from panther_ipinfo_helpers import IPInfoLocation

def rule(event):
    ipinfo_location = IPInfoLocation(event)
    match_field = ""
    if event.get("p_log_type") == "AWS.CloudTrail":
        match_field = "sourceIPAddress"
    if not match_field:
        return False

    # Alert on console logins from outside the US
    return (
        event.get("eventName") == "ConsoleLogin"
        and ipinfo_location.country(match_field) != "US"
    )

Example: Query the data lake for IPinfo data using a joinkey

To look up the IP address 71.114.47.25, specify a joinkey and a range:

SELECT
   *
FROM
   panther_lookups.public.ipinfo_asn_datalake
WHERE 
   PANTHER_LOOKUPS.PUBLIC.IPINFO_TO_JOIN_KEY(joinkey) = PANTHER_LOOKUPS.PUBLIC.IPINFO_TO_JOIN_KEY('71.114.47.25')
   AND 
   PANTHER_LOOKUPS.PUBLIC.IPINFO_TO_INT('71.114.47.25') BETWEEN 
     PANTHER_LOOKUPS.PUBLIC.IPINFO_TO_INT(startip) 
       AND PANTHER_LOOKUPS.PUBLIC.IPINFO_TO_INT(endip)

To join IPinfo data to another table, follow the same pattern as above, but use the IP address column from the log table.

SELECT
   log.*, 
   ipinfo.* EXCLUDE (p_schema_version, p_event_time, p_parse_time, p_log_type, p_row_id, p_source_id, p_source_label)
FROM
   panther_logs.public.panther_audit log 
     LEFT OUTER JOIN panther_lookups.public.ipinfo_asn_datalake ipinfo
   ON (
     PANTHER_LOOKUPS.PUBLIC.IPINFO_TO_JOIN_KEY(joinkey) = PANTHER_LOOKUPS.PUBLIC.IPINFO_TO_JOIN_KEY(log.sourceIP)
     AND 
     PANTHER_LOOKUPS.PUBLIC.IPINFO_TO_INT(log.sourceIP) BETWEEN 
       PANTHER_LOOKUPS.PUBLIC.IPINFO_TO_INT(startip) AND PANTHER_LOOKUPS.PUBLIC.IPINFO_TO_INT(endip)
   )
WHERE
   p_occurs_since('1 day', log)
LIMIT 10

IPinfo Python helper function usage and methods

Panther has integrated helper functions to streamline the use of IPinfo data in the real-time detection engine.

Creating IPinfo objects in a Python rule

Helper functions create objects whose methods can be called to return relevant data from each dataset.

Below is an example code snippet that shows the creation of these objects:

from panther_ipinfo_helpers import (IPInfoASN, IPInfoLocation, geoinfo_from_ip)

def rule(event):
    global ipinfo_location
    global ipinfo_asn
    ipinfo_location = IPInfoLocation(event) 
    ipinfo_asn = IPInfoASN(event)

The global keyword is only needed if you intend to use the objects outside of the function in which they are declared.

Calling methods on the IPinfo objects

The various components of the IPinfo datasets are available via methods on the _location and _asn objects. One event that your rule is processing may have multiple IP address fields (such as the source and destination IPs in a network log), so when calling methods on the IPinfo objects, make sure to specify which field you are looking up.

The example below demonstrates calling all helper methods on the ipinfo_location and ipinfo_asn objects we created in the previous example, retrieving all the enrichment information available to the detection rule.

def rule(event):
    # Continues the previous example, where ipinfo_location and ipinfo_asn
    # were created from the event
    match_field = ""
    if event.get("p_log_type") == "AWS.CloudTrail":
        match_field = "sourceIPAddress"

    if ipinfo_location:
        city = ipinfo_location.city(match_field)
        country = ipinfo_location.country(match_field)
        latitude = ipinfo_location.latitude(match_field)
        longitude = ipinfo_location.longitude(match_field)
        postal_code = ipinfo_location.postal_code(match_field)
        region = ipinfo_location.region(match_field)
        region_code = ipinfo_location.region_code(match_field)
        timezone = ipinfo_location.timezone(match_field)

    if ipinfo_asn:
        asn = ipinfo_asn.asn(match_field)
        domain = ipinfo_asn.domain(match_field)
        name = ipinfo_asn.name(match_field)
        route = ipinfo_asn.route(match_field)
        asn_type = ipinfo_asn._type(match_field)

The next example uses the geoinfo_from_ip() function, which returns a dictionary with geolocation information in the same format as panther_oss_helpers.geoinfo_from_ip(), except it does not provide the hostname and anycast fields.

result = geoinfo_from_ip(event, "sourceIPAddress")
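For instance, the returned dictionary can be used to build a human-readable location string. The city and country keys below are an assumption modeled on the location methods documented later on this page:

result = geoinfo_from_ip(event, "sourceIPAddress")
if result:
    # Assumed keys mirroring the location methods below
    location = f"{result.get('city')}, {result.get('country')}"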

If the event field being referenced is an array, then the helper function will return an array of the matching values. For example:

countries_of_all_ips = ipinfo_location.country('p_any_ip_addresses')
for country in countries_of_all_ips:
    if country == 'some unusual country':
        return True

Available methods

The following tables show the available methods on the IPinfo Location, ASN, and Privacy objects, along with their return types and example values.

All methods take as their argument the name of the event field you are matching on.

Location

| Location method | Return type | Example |
| --- | --- | --- |
| city | String | "San Francisco" |
| country | String | "US" |
| latitude | String | "37.7812" |
| longitude | String | "-122.4614" |
| postal_code | String | "94118" |
| region | String | "California" |
| region_code | String | "CA" |
| timezone | String | "America/Los_Angeles" |
| context | Object | A dictionary containing all of the above fields, keyed by capitalized method name, e.g. {"City": "San Francisco", ...} |

ASN

| ASN method | Return type | Example |
| --- | --- | --- |
| asn | String | "AS7018" |
| domain | String | "att.com" |
| name | String | "AT&T Services, Inc." |
| route | String | "107.128.0.0/12" |
| type | String | "isp" |
| context | Object | A dictionary containing all of the above fields, keyed by capitalized method name, e.g. {"ASN": "AS7018", "Domain": "att.com", ...} |

Privacy

| Privacy method | Return type | Example |
| --- | --- | --- |
| hosting | Boolean | true |
| proxy | Boolean | false |
| tor | Boolean | true |
| vpn | Boolean | false |
| relay | Boolean | true |
| service | String | "NordVPN" |
