PostgreSQL

This output supports efficient batch loading of data into PostgreSQL tables.

Prerequisites

The PostgreSQL Output requires:

  • An existing PostgreSQL database
  • A database user with permissions to write to tables
  • A target table that already exists with an appropriate schema
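As an illustration of these prerequisites, the SQL below creates a target table and grants a user write access. The table name (`events`), its columns, and the user name (`loader`) are hypothetical examples, not names the output requires.

```python
# Illustrative DDL only; adapt the table name, columns, and user to your data.
CREATE_TABLE = """
CREATE TABLE events (
    id         BIGINT,
    message    TEXT,
    created_at TIMESTAMPTZ
);
"""

# The output only inserts rows, so INSERT privilege on the table is enough.
GRANT_WRITE = "GRANT INSERT ON events TO loader;"
```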

Configuration

The PostgreSQL Output can be configured using either individual connection parameters or a connection string.

Settings

| Setting | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| host | string | No* | - | The host of the PostgreSQL database |
| port | integer | No* | - | The port of the PostgreSQL database |
| user | string | No* | - | The user to connect to the PostgreSQL database |
| database | string | No* | - | The database name to connect to |
| table | string | Yes | - | The table name to write data to |
| column_names | array[string] | No | - | The column names to write data to; must match the root fields of the data. If not provided, all root fields are used |

*Required if connection_string is not provided

Secrets

| Setting | Type | Required | Description |
| --- | --- | --- | --- |
| connection_string | text | No* | The connection string to connect to the PostgreSQL database. If provided, it takes precedence over the individual connection settings |
| password | text | No* | The password for the PostgreSQL user |

*Either connection_string or individual connection parameters (including password) must be provided
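A minimal sketch of this precedence rule, assuming a libpq-style keyword/value DSN; the function name, error handling, and DSN format are illustrative assumptions, not the output's actual implementation:

```python
def resolve_connection(secrets: dict, settings: dict) -> str:
    """Return the DSN to use, following the precedence rules above."""
    # connection_string, when provided, wins over the individual fields.
    if secrets.get("connection_string"):
        return secrets["connection_string"]
    fields = {
        "host": settings.get("host"),
        "port": settings.get("port"),
        "user": settings.get("user"),
        "dbname": settings.get("database"),  # libpq calls this field "dbname"
        "password": secrets.get("password"),
    }
    missing = [k for k, v in fields.items() if v is None]
    if missing:
        raise ValueError(f"missing connection fields: {missing}")
    # libpq-style keyword/value DSN.
    return " ".join(f"{k}={v}" for k, v in fields.items())
```

For example, `resolve_connection({"password": "s3cr3t"}, {"host": "db.example.com", "port": 5432, "user": "loader", "database": "analytics"})` assembles a DSN from the individual settings, while supplying a `connection_string` secret would cause that string to be returned unchanged.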

Data Loading

The PostgreSQL Output uses efficient batch loading with the following characteristics:

  1. Batch Processing

    • Records are automatically batched for efficient loading
    • Default batch size: 100 records
    • Maximum batch data size: 1 MiB
    • Batch processing interval: 5 seconds
  2. Column Handling

    • Automatically maps JSON fields to table columns
    • Supports explicit column mapping via column_names setting
    • Handles missing fields by inserting NULL values
    • Uses the first record's schema if no column names are specified
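The batch-flush conditions above (100 records, 1 MiB of data, or 5 seconds elapsed, whichever comes first) can be sketched as follows. The class name and the JSON-based size accounting are illustrative assumptions, not the output's internal implementation:

```python
import json
import time

BATCH_MAX_RECORDS = 100             # default batch size
BATCH_MAX_BYTES = 1 * 1024 * 1024   # 1 MiB maximum batch data size
BATCH_INTERVAL_SECS = 5.0           # batch processing interval

class Batcher:
    """Accumulates records and reports when a flush is due."""

    def __init__(self):
        self.records = []
        self.size_bytes = 0
        self.started = time.monotonic()

    def add(self, record: dict) -> None:
        self.records.append(record)
        # Approximate the record's contribution to the batch data size.
        self.size_bytes += len(json.dumps(record).encode())

    def should_flush(self) -> bool:
        # Flush when any one of the three thresholds is reached.
        return (
            len(self.records) >= BATCH_MAX_RECORDS
            or self.size_bytes >= BATCH_MAX_BYTES
            or time.monotonic() - self.started >= BATCH_INTERVAL_SECS
        )

batch = Batcher()
for i in range(100):
    batch.add({"id": i, "message": "hello"})
# 100 records reaches the default batch size, so a flush is now due.
```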
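The column-handling rules above can be sketched like this; the function name is a hypothetical stand-in for whatever the output does internally:

```python
def map_records(records, column_names=None):
    """Map JSON records to (columns, rows); missing fields become None (NULL)."""
    if not records:
        return [], []
    # Fall back to the first record's root fields when no explicit
    # column_names are configured.
    columns = list(column_names) if column_names else list(records[0].keys())
    rows = [tuple(rec.get(col) for col in columns) for rec in records]
    return columns, rows

cols, rows = map_records(
    [{"id": 1, "message": "hello"}, {"id": 2}],  # second record lacks "message"
)
# cols == ["id", "message"]; rows == [(1, "hello"), (2, None)]
```

Note that because the first record's schema defines the columns, fields that appear only in later records are dropped unless they are listed explicitly in `column_names`.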

Best Practices

Data Types

  • Ensure PostgreSQL column types match your data
  • Consider using JSONB for complex nested structures
  • Use appropriate numeric types for precision requirements
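For the JSONB suggestion, a nested structure is typically sent to PostgreSQL as its JSON text form and cast to `jsonb` on insert. A minimal sketch, with a hypothetical `events` table and `user_data` column:

```python
import json

# A nested structure destined for a JSONB column.
record = {"user": {"id": 7, "tags": ["a", "b"]}}
jsonb_value = json.dumps(record["user"])

# Illustrative parameterized insert with a driver such as psycopg2:
#   cur.execute("INSERT INTO events (user_data) VALUES (%s::jsonb)", (jsonb_value,))
```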