AWS Kinesis¶
Stream events to Amazon Kinesis Data Streams.
Configuration¶
With Explicit Credentials¶
```yaml
sink:
  type: kinesis
  stream_name: my-stream
  region: us-east-1
  access_key_id: ${AWS_ACCESS_KEY_ID}
  secret_access_key: ${AWS_SECRET_ACCESS_KEY}
```
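The `${AWS_ACCESS_KEY_ID}`-style placeholders are resolved from the environment at load time. As an illustration of how that interpolation typically works (`expand_env` is a hypothetical helper, not part of the sink):

```python
import os
import re


def expand_env(value: str) -> str:
    # Replace ${VAR} placeholders with environment values;
    # unset variables expand to an empty string.
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), value)
```

This keeps secrets out of the config file itself; only the variable names are committed.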
For Local Testing¶
```yaml
sink:
  type: kinesis
  stream_name: my-stream
  region: us-east-1
  endpoint_url: http://localhost:4566
  access_key_id: local
  secret_access_key: local
```
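Before pointing the sink at LocalStack, the stream has to exist. It can be created with the AWS CLI against the custom endpoint (the stream name and shard count here are examples):

```shell
aws kinesis create-stream \
  --endpoint-url http://localhost:4566 \
  --region us-east-1 \
  --stream-name my-stream \
  --shard-count 1
```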
Options¶
| Option | Type | Required | Default | Metadata Override | Description |
|---|---|---|---|---|---|
| `stream_name` | string | No | - | Yes | Kinesis stream name (can be overridden per-event) |
| `region` | string | Yes | - | No | AWS region |
| `endpoint_url` | string | No | - | No | Custom endpoint for LocalStack |
| `access_key_id` | string | No | - | No | AWS access key (uses default credential chain if not set) |
| `secret_access_key` | string | No | - | No | AWS secret key (uses default credential chain if not set) |
Dynamic Routing¶
Route events to different streams using metadata:
```sql
-- Route by region
metadata_extensions = '[
  {"json_path": "stream", "expression": "''events-'' || new.region"}
]'
```
The sink reads the `stream` key from each event's metadata; when present, it overrides the configured `stream_name`.
Partition Key¶
The event ID is used as the partition key for shard distribution.
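Kinesis routes each record by taking the MD5 hash of its partition key as a 128-bit integer and assigning it to the shard whose hash key range contains that value. A simplified model of that mapping, assuming shards split the hash space evenly (`shard_for` is illustrative, not the service's implementation):

```python
import hashlib


def kinesis_hash_key(partition_key: str) -> int:
    # Kinesis hashes the partition key with MD5 to a 128-bit integer.
    return int(hashlib.md5(partition_key.encode("utf-8")).hexdigest(), 16)


def shard_for(partition_key: str, num_shards: int) -> int:
    # Map the 128-bit hash onto [0, num_shards) as if shards evenly
    # partitioned the hash key space.
    return kinesis_hash_key(partition_key) * num_shards >> 128
```

Because event IDs are effectively unique, hashing them spreads records evenly across shards, at the cost of no per-entity ordering guarantees.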
Batching¶
Records are sent using `PutRecords` with up to 500 records per request (the Kinesis limit). Multiple batches are sent concurrently.
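Splitting a flush into `PutRecords`-sized requests is a simple chunking step; a sketch (`batch_records` is a hypothetical helper):

```python
from typing import Iterator

MAX_RECORDS_PER_REQUEST = 500  # PutRecords hard limit


def batch_records(records: list, size: int = MAX_RECORDS_PER_REQUEST) -> Iterator[list]:
    # Yield consecutive slices of at most `size` records; each slice
    # becomes one PutRecords request, and requests can run concurrently.
    for start in range(0, len(records), size):
        yield records[start:start + size]
```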
Example¶
Complete configuration:
```yaml
stream:
  id: 1
  pg_connection:
    host: localhost
    port: 5432
    name: mydb
    username: postgres
    password: postgres
    tls:
      enabled: false
  batch:
    max_size: 1000
    max_fill_secs: 5
  sink:
    type: kinesis
    stream_name: postgres-events
    region: us-east-1
```
Records are published with:

- Partition Key: Event ID
- Data: JSON-serialized payload
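Put together, each record maps onto one `PutRecords` request entry. A sketch of that shape (the `Data`/`PartitionKey` field names come from the Kinesis `PutRecords` API; `to_put_records_entry` is a hypothetical helper):

```python
import json


def to_put_records_entry(event_id: str, payload: dict) -> dict:
    # One entry in a PutRecords request: the JSON-serialized payload
    # as Data, the event ID as PartitionKey.
    return {
        "Data": json.dumps(payload).encode("utf-8"),
        "PartitionKey": event_id,
    }
```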