Datadog
Sends processed data to Datadog Logs. Supports streaming of security events and logs for monitoring, analysis, and alerting.
Requirements
To use the Datadog output connector, you need:
- API Key: A Datadog API key for authentication
- Domain URL: The base domain of your Datadog instance (e.g., datadoghq.com, us5.datadoghq.com)
Obtaining Datadog Credentials
Finding Your Domain URL
Your Datadog domain depends on your account region:
- datadoghq.com - US1 (default)
- us3.datadoghq.com - US3
- us5.datadoghq.com - US5
- datadoghq.eu - EU1
- ap1.datadoghq.com - AP1
- ap2.datadoghq.com - AP2
- ddog-gov.com - US1-FED
You can verify your region by checking the URL when logged in to Datadog (e.g., https://app.datadoghq.com or https://us5.datadoghq.com).
Generating an API Key
- Log in to your Datadog account
- Navigate to Organization Settings > API Keys
- Click New Key to create a new API key
- Give it a descriptive name (e.g., "Monad Integration")
- Copy and securely store the API key
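Before wiring the key into the connector, you can confirm it works against Datadog's key-validation endpoint (`/api/v1/validate`). A minimal sketch using only the Python standard library; the helper names are illustrative, not part of the connector:

```python
import urllib.error
import urllib.request

def validate_url(domain: str = "datadoghq.com") -> str:
    """Build the key-validation endpoint for a given Datadog domain."""
    return f"https://api.{domain}/api/v1/validate"

def validate_api_key(api_key: str, domain: str = "datadoghq.com") -> bool:
    """Return True if Datadog accepts the API key, False if it is rejected."""
    req = urllib.request.Request(validate_url(domain),
                                 headers={"DD-API-KEY": api_key})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False
```

Use the same domain here as in the Domain URL setting below; a key created in us5.datadoghq.com will not validate against datadoghq.com.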
Details
The Datadog output connector continuously sends processed data to Datadog Logs via the HTTP intake API. The functionality includes the following key features:
- HTTP Logs Intake: Data is sent to Datadog's HTTP logs intake endpoint at https://http-intake.logs.<DOMAIN_URL>/api/v2/logs, ensuring reliable delivery of log data to your Datadog account.
- Batch Processing: Logs are processed and sent to Datadog in batches, which handles large volumes of data efficiently and optimizes API usage.
- Metadata Enrichment: The connector supports adding metadata to your logs, including:
  - Source: Identifies the technology from which the log originated. When it matches a Datadog integration name, corresponding parsers and facets are automatically installed.
  - Service: Links logs to APM services for unified observability.
  - Hostname: Identifies the originating host.
  - Tags: Custom tags for filtering and analysis.
- Error Handling: The connector handles errors gracefully, logging issues and retrying failed operations to maintain data integrity.
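The batching, metadata enrichment, and retry behavior described above can be sketched in a few lines of Python. This is an illustration under stated assumptions, not the connector's actual implementation: `build_batch` and `send_batch` are hypothetical names, while the intake URL and the `DD-API-KEY` header follow Datadog's HTTP API:

```python
import json
import time
import urllib.error
import urllib.request

def build_batch(records, source=None, tags=None, hostname=None, service=None):
    """Wrap raw records with the metadata fields described above."""
    entries = []
    for record in records:
        entry = {"message": json.dumps(record)}
        if source:
            entry["ddsource"] = source
        if tags:
            entry["ddtags"] = ",".join(tags)  # Datadog expects comma-separated tags
        if hostname:
            entry["hostname"] = hostname
        if service:
            entry["service"] = service
        entries.append(entry)
    return entries

def send_batch(api_key, batch, domain="datadoghq.com", retries=3):
    """POST one batch to the HTTP intake endpoint, retrying on failure."""
    url = f"https://http-intake.logs.{domain}/api/v2/logs"
    body = json.dumps(batch).encode("utf-8")
    for attempt in range(retries):
        req = urllib.request.Request(
            url,
            data=body,
            headers={"DD-API-KEY": api_key,
                     "Content-Type": "application/json"},
            method="POST",
        )
        try:
            with urllib.request.urlopen(req) as resp:
                return resp.status  # Datadog answers 202 Accepted on success
        except urllib.error.URLError:
            time.sleep(2 ** attempt)  # simple exponential backoff between retries
    raise RuntimeError("batch not delivered after retries")
```

Sending records as a single JSON array per request, rather than one request per record, is what keeps API usage low at high log volumes.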
Example data format:
Logs are sent to Datadog in JSON format. Each record is structured as a JSON object with your log data plus any configured metadata:
```json
{
  "ddsource": "nginx",
  "ddtags": "env:production,team:security",
  "hostname": "web-server-01",
  "service": "web-api",
  "message": "<raw json log message>"
}
```
Configuration
The following settings define the output parameters. Each field's type, requirement, and description are detailed below.
Settings
| Setting | Type | Required | Description |
|---|---|---|---|
| Domain URL | string | Yes | The base domain of the Datadog API (e.g., us5.datadoghq.com). Logs are sent to https://http-intake.logs.<DOMAIN_URL>/api/v2/logs. Valid options: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, ddog-gov.com. Default: datadoghq.com. |
| Source | string | No | The integration name associated with your log: the technology from which the log originated. When it matches an integration name, Datadog automatically installs the corresponding parsers and facets. Examples: nginx, postgresql, aws, kubernetes. |
| Tags | array | No | Tags associated with your logs. Use key:value format for structured tagging (e.g., env:production, team:security). Tags enable filtering and grouping in Datadog. |
| Hostname | string | No | The name of the originating host of the log. This helps identify which server or container generated the log. |
| Service | string | No | The name of the application or service generating the log events. Used to correlate logs with APM data, so use the same value in both Logs and APM products for unified observability. |
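Because the Domain URL setting only accepts the fixed list of regional domains above, resolving it to the full intake endpoint can be sketched as a small validation step. A hypothetical helper, assuming Python:

```python
# Mirrors the valid options of the Domain URL setting above.
VALID_DOMAINS = {
    "datadoghq.com", "us3.datadoghq.com", "us5.datadoghq.com",
    "datadoghq.eu", "ap1.datadoghq.com", "ap2.datadoghq.com",
    "ddog-gov.com",
}

def intake_url(domain: str = "datadoghq.com") -> str:
    """Resolve the Domain URL setting to the HTTP logs intake endpoint."""
    if domain not in VALID_DOMAINS:
        raise ValueError(f"unsupported Datadog domain: {domain}")
    return f"https://http-intake.logs.{domain}/api/v2/logs"
```

Rejecting unknown domains up front surfaces a misconfigured region as a clear error instead of silently failing requests against a non-existent host.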
Secrets
| Secret | Type | Required | Description |
|---|---|---|---|
| API Key | string | Yes | The API key for authenticating with the Datadog API. Generate this from your Datadog Organization Settings > API Keys. |