
Datadog

Sends processed data to Datadog Logs. Supports streaming of security events and logs for monitoring, analysis, and alerting.

Requirements

To use the Datadog output connector, you need:

  • API Key: A Datadog API key for authentication
  • Domain URL: The base domain of your Datadog instance (e.g., datadoghq.com, us5.datadoghq.com)

Obtaining Datadog Credentials

  1. Finding Your Domain URL

    Your Datadog domain depends on your account region:

    • datadoghq.com - US1 (default)
    • us3.datadoghq.com - US3
    • us5.datadoghq.com - US5
    • datadoghq.eu - EU1
    • ap1.datadoghq.com - AP1
    • ap2.datadoghq.com - AP2
    • ddog-gov.com - US1-FED

    You can verify your region by checking the URL when logged into Datadog (e.g., https://app.datadoghq.com or https://us5.datadoghq.com).
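The region check above can be sketched in code. This is an illustrative helper, not part of the connector: it infers the domain from the URL shown in your browser when logged in, stripping a leading `app.` subdomain where present.

```python
from urllib.parse import urlparse

def domain_from_app_url(app_url: str) -> str:
    """Infer the Datadog domain from the logged-in app URL.

    Hypothetical helper for illustration: https://app.datadoghq.com
    maps to datadoghq.com, while regional URLs such as
    https://us5.datadoghq.com are already the domain itself.
    """
    host = urlparse(app_url).netloc
    return host[4:] if host.startswith("app.") else host
```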

  2. Generating an API Key

    • Log in to your Datadog account
  • Navigate to Organization Settings > API Keys
    • Click New Key to create a new API key
    • Give it a descriptive name (e.g., "Monad Integration")
    • Copy and securely store the API key

Details

The Datadog output connector continuously sends processed data to Datadog Logs via the HTTP intake API. The functionality includes the following key features:

  1. HTTP Logs Intake: Data is sent to Datadog's HTTP logs intake endpoint at https://http-intake.logs.<DOMAIN_URL>/api/v2/logs. This ensures reliable delivery of log data to your Datadog account.

  2. Batch Processing: Logs are processed and sent to Datadog in batches to ensure efficient handling of large volumes of data and optimize API usage.

  3. Metadata Enrichment: The connector supports adding metadata to your logs including:

    • Source: Identifies the technology from which the log originated. When it matches a Datadog integration name, corresponding parsers and facets are automatically installed
    • Service: Links logs to APM services for unified observability
    • Hostname: Identifies the originating host
    • Tags: Custom tags for filtering and analysis

  4. Error Handling: The connector includes mechanisms to handle errors gracefully, such as logging issues and retrying failed operations to maintain data integrity.
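The four features above can be sketched together. This is a minimal illustration, not the connector's actual implementation; the function names are invented, but the endpoint pattern, the `DD-API-KEY` header, and the `ddsource`/`ddtags` field names follow Datadog's HTTP logs intake API as described in this document.

```python
import json
import time
import urllib.error
import urllib.request

def build_batch(messages, source=None, tags=None, hostname=None, service=None):
    """Wrap raw log messages as Datadog log records, adding optional metadata
    (feature 3: metadata enrichment)."""
    batch = []
    for msg in messages:
        record = {"message": msg}
        if source:
            record["ddsource"] = source
        if tags:
            record["ddtags"] = ",".join(tags)
        if hostname:
            record["hostname"] = hostname
        if service:
            record["service"] = service
        batch.append(record)
    return batch

def send_batch(batch, api_key, domain="datadoghq.com", retries=3):
    """POST a batch of records to the HTTP logs intake endpoint
    (features 1, 2, and 4: intake endpoint, batching, retry on failure)."""
    url = f"https://http-intake.logs.{domain}/api/v2/logs"
    req = urllib.request.Request(
        url,
        data=json.dumps(batch).encode("utf-8"),
        headers={"Content-Type": "application/json", "DD-API-KEY": api_key},
    )
    for attempt in range(retries):
        try:
            with urllib.request.urlopen(req) as resp:
                return resp.status
        except urllib.error.URLError:
            if attempt == retries - 1:
                raise
            time.sleep(2 ** attempt)  # simple exponential backoff between retries
```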

Example data format:

Logs are sent to Datadog in JSON format. Each record is structured as a JSON object with your log data plus any configured metadata:

{
  "ddsource": "nginx",
  "ddtags": "env:production,team:security",
  "hostname": "web-server-01",
  "service": "web-api",
  "message": "<raw json log message>"
}
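A record in this shape can be built and serialized as follows. This is a sketch: the `message` value is a placeholder for your raw log, and wrapping records in a JSON array reflects the batch format sent to the intake endpoint.

```python
import json

record = {
    "ddsource": "nginx",
    "ddtags": "env:production,team:security",
    "hostname": "web-server-01",
    "service": "web-api",
    "message": '{"status": 200, "path": "/health"}',  # placeholder raw JSON log
}

# Records are sent in batches, so the request body is a JSON array.
payload = json.dumps([record])
```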

Configuration

The following settings define the output parameters. Each field's type, whether it is required, and its description are listed below.

Settings

  • Domain URL (string, required): The base domain of the Datadog API (e.g., us5.datadoghq.com). Logs are sent to https://http-intake.logs.<DOMAIN_URL>/api/v2/logs. Valid options: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, ddog-gov.com. Default: datadoghq.com.
  • Source (string, optional): The integration name associated with your log: the technology from which the log originated. When it matches an integration name, Datadog automatically installs the corresponding parsers and facets. Examples: nginx, postgresql, aws, kubernetes.
  • Tags (array, optional): Tags associated with your logs. Use key:value format for structured tagging (e.g., env:production, team:security). Tags enable filtering and grouping in Datadog.
  • Hostname (string, optional): The name of the originating host of the log. This helps identify which server or container generated the log.
  • Service (string, optional): The name of the application or service generating the log events. Used to correlate logs with APM data, so use the same value in both Logs and APM products for unified observability.

Secrets

  • API Key (string, required): The API key for authenticating with the Datadog API. Generate this from your Datadog Organization Settings > API Keys.
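Since the API key is a secret, it should not be hard-coded. A common pattern is to load it from the environment; the variable name `DATADOG_API_KEY` below is an assumption for illustration, not one mandated by the connector.

```python
import os

def load_api_key(env=os.environ):
    """Read the Datadog API key from the environment.

    DATADOG_API_KEY is an assumed variable name for this sketch;
    failing fast here avoids sending unauthenticated requests later.
    """
    key = env.get("DATADOG_API_KEY")
    if not key:
        raise RuntimeError("Set DATADOG_API_KEY before starting the connector")
    return key
```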