Windows Event Logs to S3 via Fluentd (Legacy)
Overview
Prerequisites
Set Up Fluentd
Step 1. Install Fluentd
Step 2. Edit Fluentd Configuration
C:\opt\td-agent\etc\td-agent\td-agent.conf
<source>
@type windows_eventlog2
@id windows_eventlog2
channels application,system,security
tag system
render_as_xml true
<storage>
persistent false
</storage>
parse_description false
read_existing_events false
</source>
<match system.**>
@type s3
s3_bucket <BUCKET-NAME>
s3_region <BUCKET-REGION>
path winevent/%Y/%m/%d/
store_as gzip
## Two authentication methods are shown below.
## If this is running on EC2, use the assume role credentials instead of a static access key.
## Static Access Key Authentication
#aws_key_id <ACCESS-KEY-ID>
#aws_sec_key <SECRET-KEY>
## Assume Role Authentication
<assume_role_credentials>
duration_seconds 3600
role_arn <ROLE-ARN>
role_session_name "#{Socket.gethostname}-panther-audit"
</assume_role_credentials>
<buffer tag,time>
@type file
path C:\opt\td-agent\buffer\s3
timekey 300 # 5 min partition
timekey_wait 2m
timekey_use_utc true # use utc
chunk_limit_size 256m
</buffer>
<format>
@type json
</format>
</match>
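The `<buffer>` section above controls how objects are cut: `timekey 300` floors each event's UTC time to a 5-minute window, and that window start is what expands the `%Y/%m/%d` placeholders in `path`. A rough sketch of the resulting S3 prefix (Python; the function names are illustrative, not part of Fluentd):

```python
from datetime import datetime, timezone

TIMEKEY = 300  # seconds, matches `timekey 300` (5-minute partitions)

def chunk_window(event_ts: float) -> datetime:
    # Fluentd floors the event time to the timekey boundary to pick a chunk.
    return datetime.fromtimestamp(event_ts - event_ts % TIMEKEY, tz=timezone.utc)

def s3_prefix(event_ts: float) -> str:
    # `path winevent/%Y/%m/%d/` is expanded from the chunk's UTC window start
    # (timekey_use_utc true), so all events in one window share one prefix.
    return chunk_window(event_ts).strftime("winevent/%Y/%m/%d/")
```

With `timekey_wait 2m`, a chunk is flushed roughly two minutes after its window closes, so objects for a given day appear under that day's `winevent/` prefix shortly after the fact.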
#<match system.**>
# @type kinesis_firehose
# region <STREAM-REGION>
# delivery_stream_name <FIREHOSE-STREAM-NAME>
#
# <assume_role_credentials>
# duration_seconds 3600
# role_arn <ROLE-ARN>
# role_session_name "#{Socket.gethostname}-panther-audit"
# </assume_role_credentials>
# <format>
# @type json
# </format>
#</match>
Step 3. Start Fluentd
Step 4. Verify Logging
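One way to verify end to end is to download an object from the `winevent/` prefix and confirm it is what the `store_as gzip` and `@type json` settings produce: gzip-compressed, newline-delimited JSON. A minimal local check (the function name is illustrative):

```python
import gzip
import json

def count_events(path: str) -> int:
    # Objects written with `store_as gzip` + `format json` are gzip-compressed
    # newline-delimited JSON; count the parseable event records.
    n = 0
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            if line.strip():
                json.loads(line)  # raises if the record is not valid JSON
                n += 1
    return n
```

If the count is nonzero and no parse error is raised, Fluentd is shipping well-formed events and the bucket is ready to onboard.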
Panther Console
Step 1. Create a Custom Schema
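Panther custom schemas are YAML field definitions. A sketch of what one might look like for this pipeline is below; the field names and time format are assumptions, not a tested schema — derive the real keys from a sample event in the bucket, since `windows_eventlog2` output depends on the plugin version and the `render_as_xml`/`parse_description` settings:

```yaml
# Hypothetical sketch only: confirm every field name and the timestamp
# format against a real event pulled from the S3 bucket.
version: 0
fields:
  - name: channel
    type: string
    required: true
  - name: record_number
    type: bigint
  - name: time_generated
    type: timestamp
    timeFormat: rfc3339   # adjust to the format Fluentd actually emits
    isEventTime: true
  - name: description
    type: string
```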
Step 2. Onboard the S3 bucket