GCP Audit to S3 via Fluentd

Consider using Fluent Bit instead of Fluentd to forward logs to Panther. Fluent Bit is easier to set up and less resource intensive than Fluentd.


The objective of this recipe is to stream Google Cloud Platform (GCP) audit logs into Panther. Many businesses use several cloud providers, and these steps allow teams to gather the API calls that occur within a GCP project into Panther.
We’ll implement this using a combination of primitives across GCP and AWS.

Solution Brief

At a high level, we’ll be implementing the following flow:
  1. Audit logs are generated in GCP and routed to Pub/Sub
  2. Fluentd polls Pub/Sub and forwards the logs to an S3 bucket
  3. The S3 bucket is onboarded into Panther for normalization, detection, and long-term storage


Step 1: Create a New Pub/Sub

The image shows the Pub/Sub console from Google Cloud Platform
  1. In GCP, open the Pub/Sub console
  2. Create a new Topic, call it Panther-Audit
    • Uncheck ‘Add a default subscription’
    • Select CREATE TOPIC
  3. Click Subscriptions > Create subscription
    • Input Panther-Audit as the Subscription ID
    • Select the Panther-Audit topic
    • Leave all other options as-is, or tune the expiration/retention as needed (per your intended spend/budget)
    • Click CREATE
Write down the Topic name (projects/<project-name>/topics/Panther-Audit) and Topic subscription (Panther-Audit); we’ll use them later!
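If you prefer the command line, the topic and subscription above can also be created with gcloud (a sketch; it assumes the gcloud CLI is installed and authenticated against your project):

```shell
# Create the topic (gcloud does not add a default subscription)
gcloud pubsub topics create Panther-Audit

# Create the pull subscription that Fluentd will poll
gcloud pubsub subscriptions create Panther-Audit --topic=Panther-Audit
```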

Step 2: Create a Logs Router

The image shows the Logs Explorer page from Google Cloud Platform
Note: You can optionally create aggregated organization log sinks instead of project sinks. To learn more about creating aggregated sinks, please see Google's documentation.
  1. Open the Logging console
  2. Click Logs Router
  3. Click Create Sink
  4. Set the name to Panther-Audit
  5. Set the Sink Destination
    • Select Cloud Pub/Sub topic as the sink service
    • Select the Panther-Audit Topic
  6. Click Create Sink
You can validate this pipeline is working by going to Pub/Sub, clicking the Topic ID of Panther-Audit, and viewing the ACTIVITY to see Audit events.
The image shows Google Cloud Platform's Pub/Sub console. In the left sidebar, "Topics" is highlighted. The "Panther-audit" topic is selected.
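The same sink can be created from the CLI; a sketch using gcloud (the destination URI format is pubsub.googleapis.com/<topic-path>):

```shell
# Route this project's logs to the Panther-Audit topic
gcloud logging sinks create Panther-Audit \
  pubsub.googleapis.com/projects/<project-name>/topics/Panther-Audit
```

gcloud prints the sink’s writer identity (a service account) on creation; grant it the Pub/Sub Publisher role on the topic so the sink can deliver logs.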

Step 3: Create a Service Account

  1. Open IAM & Admin
  2. Click Service Accounts
  3. Click Create Service Account
    • Set the Service account name to Panther-Audit. Add a description if you like.
    • Click Create and Continue
    • Under Grant this service account access to project, grant the service account the Pub/Sub Viewer and Pub/Sub Subscriber roles
    • Click Continue
    • Click Done
  4. Under Service accounts -> Actions, click Manage keys, ADD KEY, Create new key, select JSON, and hit CREATE to download your credentials.
  5. Keep this credential file safe! We’ll use it soon.
The image shows the IAM & Admin console in Google Cloud Platform. In the left sidebar, "Service Accounts" is highlighted. The center of the page says "Service accounts for project 'My First Project'". In the list, the panther-audit project is selected. A 3-dots icon on the right is expanded to an open dropdown menu, and the option "Manage keys" is highlighted.
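The service account can likewise be created with gcloud (a sketch; note that service account IDs must be lowercase, so panther-audit rather than Panther-Audit):

```shell
# Create the service account
gcloud iam service-accounts create panther-audit --display-name="Panther-Audit"

# Grant the Pub/Sub Viewer and Pub/Sub Subscriber roles on the project
for role in roles/pubsub.viewer roles/pubsub.subscriber; do
  gcloud projects add-iam-policy-binding <project-name> \
    --member="serviceAccount:panther-audit@<project-name>.iam.gserviceaccount.com" \
    --role="$role"
done

# Create and download a JSON key for the account
gcloud iam service-accounts keys create panther-audit-key.json \
  --iam-account="panther-audit@<project-name>.iam.gserviceaccount.com"
```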

Step 4: Configure AWS Infrastructure

On the Getting Started with Fluentd page, review and deploy the Firehose & S3 stack. Copy the InstanceProfileName and Firehose delivery stream name from the stack outputs; we’ll use them in the next steps.
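After the stack deploys, its outputs can be read back with the AWS CLI rather than from the console; a sketch, assuming a hypothetical stack name of panther-fluentd:

```shell
# Print the stack outputs, e.g. the instance profile and Firehose names
aws cloudformation describe-stacks \
  --stack-name panther-fluentd \
  --query "Stacks[0].Outputs" \
  --output table
```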

Step 5: Launch Your Instance in AWS

  1. Open the AWS EC2 Console (in the same region where you launched the stack above)
  2. Click Launch Instance
    • Select Ubuntu Server 20.04 LTS
    • Select t2.medium (or a more powerful instance type, if you’d like)
    • In the IAM Role section, select the value of the InstanceProfileName copied in Step 4, with the format “<stack-name>-FirehoseInstanceProfile-<random-string>”
    • Click Add Storage, and add a 64GiB capacity drive
    • Set your Security Group, Key Pair, and other preferences as you’d like
  3. Click Launch
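The same launch can be scripted with the AWS CLI; a minimal sketch, where the AMI ID, key pair, and security group are placeholders for your environment:

```shell
# Launch an Ubuntu 20.04 instance with the Firehose instance profile attached
aws ec2 run-instances \
  --image-id <ubuntu-20.04-ami-id> \
  --instance-type t2.medium \
  --key-name <your-key-pair> \
  --security-group-ids <your-security-group-id> \
  --iam-instance-profile Name=<stack-name>-FirehoseInstanceProfile-<random-string> \
  --block-device-mappings 'DeviceName=/dev/sda1,Ebs={VolumeSize=64}' \
  --count 1
```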

Step 6: Install and Configure Fluentd

  1. Add your keypair to your ssh agent
    • ssh-add <path-to-keypair>
  2. SCP the GCP credentials file downloaded in Step 3 to the instance
    • scp <path-to-gcp-cred-file> ubuntu@<public-ip>:/home/ubuntu/
  3. SSH to your newly launched EC2 instance
    • ssh ubuntu@<public-ip>
  4. Install td-agent (Fluentd)
    • Follow the instructions for Ubuntu Focal
  5. Install the official AWS Kinesis plugin
    • sudo td-agent-gem install fluent-plugin-kinesis
  6. Install the GCP plugin
    • sudo td-agent-gem install fluent-plugin-gcloud-pubsub-custom
  7. Overwrite the default fluentd config in /etc/td-agent/td-agent.conf. Replace the <...> placeholders with your GCP project ID, the credential file path from step 2, your AWS region, and the Firehose delivery stream name and role ARN from the stack deployed in Step 4:
    <system>
      log_level debug
    </system>

    <source>
      @type gcloud_pubsub
      tag gcp.audit
      project <YOUR-GCP-PROJECT-ID>
      topic Panther-Audit
      subscription Panther-Audit
      key /home/ubuntu/<YOUR-GCP-CRED-FILE>.json
      max_messages 1000
      return_immediately true
      pull_interval 1
      pull_threads 2
      parse_error_action exception
      <parse>
        @type json
      </parse>
    </source>

    <match gcp.**>
      @type kinesis_firehose
      region <YOUR-AWS-REGION>
      delivery_stream_name <YOUR-FIREHOSE-NAME>
      <assume_role_credentials>
        role_arn <YOUR-FIREHOSE-ROLE-ARN>
        duration_seconds 3600
        role_session_name "#{Socket.gethostname}-panther-audit"
      </assume_role_credentials>
    </match>
  8. Restart td-agent
    • sudo systemctl restart td-agent
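To confirm the pipeline end to end, publish a test message and check that objects land in the Firehose S3 bucket (a sketch; the bucket name is a placeholder, and Firehose buffering means delivery can take a few minutes):

```shell
# Confirm td-agent is running
sudo systemctl status td-agent --no-pager

# From a GCP-authenticated shell, publish a test message to the topic
gcloud pubsub topics publish Panther-Audit --message='{"hello": "panther"}'

# After the Firehose buffer flushes (~5 minutes), list the delivered objects
aws s3 ls s3://<your-firehose-bucket>/ --recursive | tail
```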

Step 7: Onboard data into Panther

Since GCP audit logs are supported out of the box, you can configure the S3 bucket as a Data Transport to start ingesting the logs into Panther.


Troubleshooting

  • Logs may take ~5 minutes to show up in the S3 bucket because of the IntervalInSeconds and SizeInMBs parameters within the CloudFormation template.
  • Monitor the td-agent logs for any errors
    • sudo tail -f /var/log/td-agent/td-agent.log
  • If you need more verbose logging, run:
    • td-agent -vv
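td-agent prefixes each log line with a severity tag, so warnings and errors can be pulled out of the log directly (a sketch against the same log file referenced above):

```shell
# Show the 20 most recent warnings and errors from td-agent
sudo grep -Ei "\[(warn|error)\]" /var/log/td-agent/td-agent.log | tail -n 20
```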