GCP Audit to S3 via Fluentd
Consider using Fluent Bit instead of Fluentd to forward logs to Panther. Fluent Bit is easier to set up and less resource intensive than Fluentd.
The objective of this recipe is to stream Google Cloud Platform (GCP) audit logs into Panther. Many businesses use several cloud providers, and these steps allow teams to gather the API calls made within a GCP project into Panther.
We’ll implement this using a combination of primitives across GCP and AWS.
At a high level, we’ll be implementing the following flow:
Audit logs are generated in GCP and routed to PubSub
Fluentd polls PubSub and forwards to an S3 Bucket
The S3 bucket is onboarded into Panther for normalization, detection, and long-term storage
In GCP, open the Pub/Sub console
Create a new Topic and name it Panther-Audit
Uncheck ‘Add a default subscription’
Select CREATE TOPIC
Click Subscriptions > Create subscription
Input Panther-Audit as the Subscription ID
Select the Panther-Audit topic
Leave the remaining options at their defaults, or tune the expiration/retention settings to match your intended spend
Click CREATE
Write down the Topic name (projects/<project-name>/topics/Panther-Audit) and the Topic subscription (Panther-Audit); we’ll use them later.
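If you prefer the command line, the same topic and subscription can be created with gcloud (a sketch; run it against your own project):
gcloud pubsub topics create Panther-Audit
gcloud pubsub subscriptions create Panther-Audit --topic=Panther-Audit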
Note: You can optionally create aggregated organization log sinks instead of project sinks. To learn more about creating aggregated sinks, please see Google's documentation.
Open the Logging console
Click Logs Router
Click CREATE SINK
Set the name to Panther-Audit
Set the Sink Destination
Choose Cloud Pub/Sub topic as the sink service
Select the Panther-Audit Topic
Click CREATE SINK
You can validate this pipeline is working by going to Pub/Sub, clicking the Topic ID of Panther-Audit, and viewing the ACTIVITY to see Audit events.
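The sink can also be created from the command line (a sketch; replace <project-name> with your project). Unlike the console flow, a sink created with gcloud is not automatically allowed to publish to the topic, so its writer identity must be granted the Pub/Sub Publisher role:
gcloud logging sinks create Panther-Audit \
  pubsub.googleapis.com/projects/<project-name>/topics/Panther-Audit
# Look up the sink's writer identity, then allow it to publish to the topic
gcloud logging sinks describe Panther-Audit --format='value(writerIdentity)'
gcloud pubsub topics add-iam-policy-binding Panther-Audit \
  --member='<writer-identity-from-previous-command>' \
  --role='roles/pubsub.publisher'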
Open IAM & Admin
Click Service Accounts
Click CREATE SERVICE ACCOUNT
Set the Service account name to Panther-Audit. Add a description if you like.
Click Create and Continue
Under Grant this service account access to project, grant the service account the Pub/Sub Viewer and Pub/Sub Subscriber roles
Click Continue
Click Done
Under Service accounts > Actions, click Manage keys, ADD KEY, Create new key, select JSON, and hit CREATE to download your credentials.
Keep this credential file safe! We’ll use it soon.
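The same account, roles, and key can be created with gcloud (a sketch; service account IDs must be lowercase, and <project-name> is a placeholder for your project):
gcloud iam service-accounts create panther-audit --display-name=Panther-Audit
gcloud projects add-iam-policy-binding <project-name> \
  --member='serviceAccount:panther-audit@<project-name>.iam.gserviceaccount.com' \
  --role='roles/pubsub.viewer'
gcloud projects add-iam-policy-binding <project-name> \
  --member='serviceAccount:panther-audit@<project-name>.iam.gserviceaccount.com' \
  --role='roles/pubsub.subscriber'
gcloud iam service-accounts keys create panther-audit-key.json \
  --iam-account=panther-audit@<project-name>.iam.gserviceaccount.com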
In the Fluentd Onboarding Guide, review and deploy the Firehose & S3 stack.
Open the AWS EC2 Console (in the same region where you launched the stack above) and launch an Ubuntu Instance.
Click Launch Instance
Select Ubuntu Server 20.04 LTS
Select t2.medium (or a more powerful instance type, if you’d like)
In the IAM Role section, select the value of the InstanceProfileName copied in Step 4, with the format “<stack-name>-FirehoseInstanceProfile-<random-string>”
Click Add Storage and add a 64 GiB volume
Set your Security Group, Key Pair, and other preferences as you’d like
Click Launch
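For reference, an equivalent launch from the AWS CLI looks roughly like this (a sketch; the AMI ID, key pair, and security group are placeholders you must supply for your region):
aws ec2 run-instances \
  --image-id <ubuntu-20.04-ami-id> \
  --instance-type t2.medium \
  --key-name <your-key-pair> \
  --security-group-ids <security-group-id> \
  --iam-instance-profile Name=<stack-name>-FirehoseInstanceProfile-<random-string> \
  --block-device-mappings 'DeviceName=/dev/sda1,Ebs={VolumeSize=64}' \
  --count 1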
Add your keypair to your ssh agent
ssh-add <path-to-keypair>
SCP your GCP credentials downloaded in Step 3 to the instance
scp <path-to-gcp-cred-file> ubuntu@<public-ip>:/home/ubuntu/
SSH to your newly launched EC2 instance
ssh ubuntu@<public-ip>
Install td-agent by following the Fluentd installation instructions for Ubuntu Focal
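At the time of writing, the Treasure Data install script for td-agent 4 on Focal is a one-liner like the following (check the Fluentd download page for the current command):
curl -fsSL https://toolbelt.treasuredata.com/sh/install-ubuntu-focal-td-agent4.sh | sh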
Install the official AWS Kinesis plugin
sudo td-agent-gem install fluent-plugin-kinesis
Install the GCP plugin
sudo td-agent-gem install fluent-plugin-gcloud-pubsub-custom
Overwrite the default Fluentd config in /etc/td-agent/td-agent.conf:
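Here is a minimal sketch of that config, assuming the GCP credential file was copied to /home/ubuntu/ and the delivery stream name comes from the Firehose & S3 stack deployed earlier (adjust the project, file names, region, and stream name to match your environment):
# Pull audit logs from the Pub/Sub subscription created earlier
<source>
  @type gcloud_pubsub
  tag gcp.audit
  project <project-name>
  key /home/ubuntu/<gcp-credentials-file>.json
  topic Panther-Audit
  subscription Panther-Audit
  max_messages 1000
  return_immediately true
  pull_interval 0.5
  <parse>
    @type json
  </parse>
</source>

# Forward everything tagged gcp.audit to Kinesis Firehose;
# AWS credentials come from the instance profile attached at launch
<match gcp.audit>
  @type kinesis_firehose
  region <aws-region>
  delivery_stream_name <firehose-delivery-stream-name>
</match>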
Restart td-agent
sudo systemctl restart td-agent
Since GCP audit logs are supported out of the box, you can configure the S3 bucket as a Data Transport to start ingesting the logs into Panther.
Note: logs may take ~5 minutes to show up in the S3 bucket because of the IntervalInSeconds and SizeInMBs parameters within the CloudFormation template.
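To confirm data is landing, you can list the bucket after a few minutes (a sketch; substitute your bucket name):
aws s3 ls s3://<bucket-name>/ --recursive | head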
Monitor the td-agent logs for any errors
sudo tail -f /var/log/td-agent/td-agent.log
If you need more verbose logging, run:
td-agent -vv