
CloudWatch Logs Stream

Sync Type: Streaming

Overview

Receives CloudWatch Logs in real-time via Amazon Data Firehose HTTP endpoint delivery. Instead of polling the CloudWatch Logs API, this input receives log events as they are pushed from AWS through a Firehose delivery stream configured with an HTTP endpoint destination.

Requirements

  • An AWS account with CloudWatch Logs
  • A Monad organization API key with the pipeline:data:write permission
  • AWS permissions to create:
      • An Amazon Data Firehose delivery stream
      • A CloudWatch Logs subscription filter

Setup Instructions

Step 1: Create the Monad Pipeline

  1. In the Monad platform, create a new pipeline with the AWS CloudWatch Logs Stream input
  2. Note the Pipeline ID from the pipeline settings - you'll need this for the Firehose configuration

Step 2: Create a Monad API Key

  1. In the Monad platform, navigate to Settings > API Keys
  2. Click Create new API key
  3. Give it a descriptive name (e.g., "AWS Firehose CloudWatch Logs")
  4. Ensure the API key has the pipeline:data:write permission (included in Contributor and System Administrator roles)
  5. Copy and securely store the API key value - it won't be visible again

Step 3: Create an Amazon Data Firehose Delivery Stream

  1. Navigate to the Amazon Data Firehose console in AWS
  2. Click Create Firehose stream
  3. Configure the stream:
      • Source: Select Direct PUT
      • Destination: Select HTTP Endpoint
  4. Configure the HTTP endpoint destination:
      • HTTP endpoint URL: https://app.monad.com/api/v2/http/send/{PIPELINE_ID} (replace {PIPELINE_ID} with your Monad pipeline ID)
      • Access key: Enter your Monad API key (this is sent as the X-Amz-Firehose-Access-Key header)
      • Content encoding: Select GZIP (recommended to reduce bandwidth)
  5. Configure buffering hints:
      • Buffer size: 1-5 MiB (recommended: 1 MiB for lower latency)
      • Buffer interval: 60-300 seconds (recommended: 60 seconds for near real-time delivery)
  6. Configure backup settings as needed
  7. Create the delivery stream and note the Firehose ARN
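The console steps above can also be done programmatically. Below is a minimal boto3-shaped sketch of the same configuration; the stream name, backup bucket, and role ARN are hypothetical placeholders, and the actual `create_delivery_stream` call is left commented out so the snippet runs without AWS credentials:

```python
PIPELINE_ID = "YOUR_PIPELINE_ID"        # from Step 1
MONAD_API_KEY = "YOUR_MONAD_API_KEY"    # from Step 2

# Hypothetical placeholders; Firehose requires an S3 backup destination
# for failed deliveries, with a role that can write to that bucket.
BACKUP_BUCKET_ARN = "arn:aws:s3:::my-firehose-backup"
FIREHOSE_ROLE_ARN = "arn:aws:iam::123456789012:role/firehose-backup-role"

# HTTP endpoint destination mirroring the console settings in Step 3.
http_destination = {
    "EndpointConfiguration": {
        "Url": f"https://app.monad.com/api/v2/http/send/{PIPELINE_ID}",
        "Name": "monad-cloudwatch-logs",
        "AccessKey": MONAD_API_KEY,  # sent as X-Amz-Firehose-Access-Key
    },
    "RequestConfiguration": {"ContentEncoding": "GZIP"},
    "BufferingHints": {"SizeInMBs": 1, "IntervalInSeconds": 60},
    "S3BackupMode": "FailedDataOnly",
    "S3Configuration": {
        "RoleArn": FIREHOSE_ROLE_ARN,
        "BucketARN": BACKUP_BUCKET_ARN,
    },
}

# With boto3 installed and AWS credentials configured:
# import boto3
# firehose = boto3.client("firehose")
# firehose.create_delivery_stream(
#     DeliveryStreamName="monad-cloudwatch-logs",
#     DeliveryStreamType="DirectPut",
#     HttpEndpointDestinationConfiguration=http_destination,
# )
```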

Step 4: Create an IAM Role for CloudWatch Logs

CloudWatch Logs needs permission to write to your Firehose stream. Create an IAM role with the following configuration:

Trust policy (allows CloudWatch Logs to assume the role):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "logs.amazonaws.com"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringLike": {
          "aws:SourceArn": "arn:aws:logs:*:YOUR_ACCOUNT_ID:*"
        }
      }
    }
  ]
}

Replace YOUR_ACCOUNT_ID with your AWS account ID.

Permissions policy (allows writing to Firehose):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "firehose:PutRecord",
        "firehose:PutRecordBatch"
      ],
      "Resource": "arn:aws:firehose:*:YOUR_ACCOUNT_ID:deliverystream/*"
    }
  ]
}

Replace YOUR_ACCOUNT_ID with your AWS account ID. You can also scope this down to a specific delivery stream ARN for tighter security.
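The same role can be created with boto3. The sketch below builds both policies (with the permissions policy scoped to a single, hypothetical delivery stream name) and leaves the IAM API calls commented out so it runs without AWS credentials:

```python
import json

ACCOUNT_ID = "123456789012"             # replace with your AWS account ID
STREAM_NAME = "monad-cloudwatch-logs"   # hypothetical delivery stream name

# Trust policy: lets CloudWatch Logs assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "logs.amazonaws.com"},
        "Action": "sts:AssumeRole",
        "Condition": {
            "StringLike": {"aws:SourceArn": f"arn:aws:logs:*:{ACCOUNT_ID}:*"}
        },
    }],
}

# Permissions policy, scoped down to one delivery stream for tighter security.
permissions_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["firehose:PutRecord", "firehose:PutRecordBatch"],
        "Resource": f"arn:aws:firehose:*:{ACCOUNT_ID}:deliverystream/{STREAM_NAME}",
    }],
}

# With boto3 installed and AWS credentials configured:
# import boto3
# iam = boto3.client("iam")
# iam.create_role(RoleName="cwlogs-to-firehose",
#                 AssumeRolePolicyDocument=json.dumps(trust_policy))
# iam.put_role_policy(RoleName="cwlogs-to-firehose",
#                     PolicyName="write-to-firehose",
#                     PolicyDocument=json.dumps(permissions_policy))
```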

Step 5: Create a CloudWatch Logs Subscription Filter

  1. Navigate to the CloudWatch Logs console
  2. Select the log group you want to stream
  3. Click the Subscription filters tab
  4. Click Create Kinesis Data Firehose subscription filter
  5. Configure the subscription filter:
      • Firehose stream: Select the delivery stream created in Step 3
      • IAM role: Select the role created in Step 4
      • Filter pattern: Enter a filter pattern (leave empty to send all logs)
      • Filter name: Give it a descriptive name
  6. Create the subscription filter
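The subscription filter can also be created via the CloudWatch Logs API. A minimal sketch, with hypothetical log group, stream, and role ARNs, and the `put_subscription_filter` call commented out so the snippet runs without AWS credentials:

```python
# Parameters for logs.put_subscription_filter; ARNs and names are
# hypothetical placeholders matching the earlier steps.
params = {
    "logGroupName": "/aws/lambda/my-function",
    "filterName": "monad-subscription",
    "filterPattern": "",  # empty pattern forwards all log events
    "destinationArn": (
        "arn:aws:firehose:us-east-1:123456789012:"
        "deliverystream/monad-cloudwatch-logs"
    ),
    "roleArn": "arn:aws:iam::123456789012:role/cwlogs-to-firehose",
}

# With boto3 installed and AWS credentials configured:
# import boto3
# logs = boto3.client("logs")
# logs.put_subscription_filter(**params)
```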

Configuration

This input requires no configuration in Monad beyond creating the pipeline. All configuration is done on the AWS side.

Settings

| Setting | Type | Required | Description |
|---------|------|----------|-------------|
| None | - | - | No settings required |

Secrets

| Secret | Type | Required | Description |
|--------|------|----------|-------------|
| None | - | - | No secrets required |

Authentication

Authentication is handled via the Monad API key that you configure in the Firehose HTTP endpoint destination. The API key is sent by Firehose in the X-Amz-Firehose-Access-Key header.

API Key Requirements:

  • Must be an organization API key (not a personal API key)
  • Must have the pipeline:data:write permission
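For reference, each Firehose delivery arrives as an HTTP POST shaped roughly like the sketch below (header and envelope field names follow Firehose HTTP endpoint delivery; the key and record values are placeholders):

```python
import json

MONAD_API_KEY = "YOUR_MONAD_API_KEY"  # placeholder

# Headers Firehose attaches to each HTTP endpoint delivery request.
headers = {
    "Content-Type": "application/json",
    "Content-Encoding": "gzip",                  # present when GZIP is enabled
    "X-Amz-Firehose-Access-Key": MONAD_API_KEY,  # the Monad API key
}

# Body envelope; each record carries base64-encoded data.
body = {
    "requestId": "firehose-req-xyz789",
    "timestamp": 1706540799000,
    "records": [{"data": "<base64-encoded payload>"}],
}
payload = json.dumps(body)
```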

Data Format

Each log event is enriched with metadata from both CloudWatch and Firehose before being delivered to the pipeline:

| Field | Type | Description |
|-------|------|-------------|
| id | string | Unique identifier for the log event |
| timestamp | integer | Log event timestamp in Unix milliseconds |
| message | object/string | The log message (preserved as JSON if valid, otherwise as a string) |
| owner | string | AWS account ID that owns the log group |
| logGroup | string | Name of the CloudWatch log group |
| logStream | string | Name of the CloudWatch log stream |
| subscriptionFilters | array | Names of subscription filters that matched |
| firehoseRequestId | string | Firehose request ID for tracing |
| firehoseTimestamp | integer | Firehose delivery timestamp in Unix milliseconds |
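To make the enrichment concrete: CloudWatch Logs delivers gzipped JSON inside each base64-encoded Firehose record. The sketch below (not Monad's actual implementation) decodes one such record and produces events with the fields listed above:

```python
import base64
import gzip
import json

# A sample CloudWatch Logs payload as it travels through Firehose:
# gzipped JSON, base64-encoded inside the Firehose record's "data" field.
cw_payload = {
    "messageType": "DATA_MESSAGE",
    "owner": "123456789012",
    "logGroup": "/aws/lambda/my-function",
    "logStream": "2024/01/29/[$LATEST]abc123def456",
    "subscriptionFilters": ["monad-subscription"],
    "logEvents": [
        {"id": "38504832190534723847234234234234234234",
         "timestamp": 1706540800000,
         "message": "{\"level\": \"INFO\", \"action\": \"Request processed\"}"},
    ],
}
record_data = base64.b64encode(gzip.compress(json.dumps(cw_payload).encode()))

def enrich(record_data, firehose_request_id, firehose_timestamp):
    """Decode one Firehose record and yield enriched log events (a sketch)."""
    data = json.loads(gzip.decompress(base64.b64decode(record_data)))
    for event in data["logEvents"]:
        try:
            message = json.loads(event["message"])  # keep structure if valid JSON
        except ValueError:
            message = event["message"]              # otherwise keep the raw string
        yield {
            "id": event["id"],
            "timestamp": event["timestamp"],
            "message": message,
            "owner": data["owner"],
            "logGroup": data["logGroup"],
            "logStream": data["logStream"],
            "subscriptionFilters": data["subscriptionFilters"],
            "firehoseRequestId": firehose_request_id,
            "firehoseTimestamp": firehose_timestamp,
        }

events = list(enrich(record_data, "firehose-req-xyz789", 1706540799000))
```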

Troubleshooting

Common Issues

Issue: Logs are not appearing in Monad

Possible Causes:

  • Firehose delivery stream is not active
  • API key is invalid or lacks permissions
  • Subscription filter pattern is too restrictive

Solution:

  • Check the Firehose stream monitoring metrics in AWS console
  • Verify the API key has pipeline:data:write permission
  • Review the subscription filter pattern

Issue: Firehose shows delivery failures

Possible Causes:

  • Incorrect HTTP endpoint URL
  • Invalid API key in the access key field
  • Network connectivity issues

Solution:

  • Verify the endpoint URL matches https://app.monad.com/api/v2/http/send/{PIPELINE_ID}
  • Regenerate and update the API key
  • Check AWS VPC and security group configurations

Sample Record

{
  "id": "38504832190534723847234234234234234234",
  "timestamp": 1706540800000,
  "message": {
    "level": "INFO",
    "action": "Request processed",
    "requestId": "abc123-def456-ghi789",
    "duration_ms": 127
  },
  "owner": "123456789012",
  "logGroup": "/aws/lambda/my-function",
  "logStream": "2024/01/29/[$LATEST]abc123def456",
  "subscriptionFilters": ["monad-subscription"],
  "firehoseRequestId": "firehose-req-xyz789",
  "firehoseTimestamp": 1706540799000
}