DynamoDB Streams Trigger (Restricted Beta)

Learn more about the DynamoDB Streams Trigger and how to use it in the Digibee Integration Platform.

This feature is currently in the Restricted Beta phase and is only available to specific customers.

DynamoDB Streams is a feature of the DynamoDB database that publishes near real-time events for every record modification in a DynamoDB table that has DynamoDB Streams enabled. The DynamoDB Streams Trigger captures these events and sends them to pipelines running in Digibee's infrastructure. This allows the Change Data Capture (CDC) pattern to be easily implemented with Digibee pipelines.

Parameters

Take a look at the configuration parameters of the trigger. Parameters supported by Double Braces expressions are marked with (DB).

| Parameter | Description | Default value | Data type |
| --- | --- | --- | --- |
| DynamoDB Client Account | AWS account used to access the DynamoDB Streams API. | N/A | String |
| Table Name | Name of the DynamoDB table from which to fetch the stream of events. | N/A | String |
| AWS Region | AWS region where the DynamoDB table is located. | us-east-1 | String |
| Expiration | Time the event spends in the queue (in milliseconds). If the value is 0 or greater than 6 hours (21600000 ms), the expiration becomes 1/4 of the configured Maximum Timeout value. | 600000 | Integer |
| Maximum Timeout | Maximum time (in milliseconds) for the pipeline to process information before returning a response. Limit: 900000. | 30000 | Integer |
| Allow Redelivery of Messages | If activated, messages can be delivered again if the Pipeline Engine fails. | False | Boolean |
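The Expiration fallback rule can be sketched as a small helper. This is a hypothetical illustration of the rule as described above; the actual computation happens inside the Digibee platform.

```python
SIX_HOURS_MS = 21_600_000  # 6 hours in milliseconds

def effective_expiration(expiration_ms: int, maximum_timeout_ms: int) -> int:
    """Return the expiration actually applied to a queued event.

    If Expiration is 0 or exceeds 6 hours, 1/4 of Maximum Timeout is used
    instead of the configured value.
    """
    if expiration_ms == 0 or expiration_ms > SIX_HOURS_MS:
        return maximum_timeout_ms // 4
    return expiration_ms

# With the defaults, the configured value stands:
# effective_expiration(600000, 30000) -> 600000
# With Expiration set to 0, 1/4 of Maximum Timeout is used:
# effective_expiration(0, 30000) -> 7500
```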

Additional information

The DynamoDB Streams Trigger does not automatically activate the Streams feature on an existing DynamoDB table. Instead, the trigger assumes that the table exists with the feature pre-configured and sends the events to the pipeline as they are, without any transformation.
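Because the trigger expects Streams to be enabled beforehand, you can enable them yourself, for example with boto3's `update_table` call. This is a hedged sketch: the table name is hypothetical, and running it requires valid AWS credentials.

```python
# Sketch: enabling DynamoDB Streams on an existing table with boto3.
# The table name passed below is hypothetical; running this requires
# AWS credentials with permission to call dynamodb:UpdateTable.
def stream_spec(view_type: str = "NEW_AND_OLD_IMAGES") -> dict:
    """StreamSpecification payload for update_table. NEW_AND_OLD_IMAGES
    matches the StreamViewType shown in the event example."""
    return {"StreamEnabled": True, "StreamViewType": view_type}

def enable_streams(table_name: str, region: str = "us-east-1") -> None:
    import boto3  # third-party AWS SDK
    client = boto3.client("dynamodb", region_name=region)
    client.update_table(TableName=table_name, StreamSpecification=stream_spec())

# enable_streams("Orders")  # uncomment to run against your AWS account
```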

Below is an example of a DynamoDB Streams event:

{
   "eventID":"1",
   "eventName":"INSERT",
   "eventVersion":"1.0",
   "eventSource":"aws:dynamodb",
   "awsRegion":"us-east-1",
   "dynamodb":{
      "Keys":{
         "Id":{
            "N":"101"
         }
      },
      "NewImage":{
         "Message":{
            "S":"New item!"
         },
         "Id":{
            "N":"101"
         }
      },
      "SequenceNumber":"111",
      "SizeBytes":26,
      "StreamViewType":"NEW_AND_OLD_IMAGES"
   }
}
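Since events arrive without transformation, pipeline logic must handle DynamoDB's typed attribute format ({"N": ...}, {"S": ...}, and so on). The sketch below unwraps the NewImage of an event like the one above; the type coverage is illustrative, not exhaustive.

```python
import json

def unwrap(attr: dict):
    """Convert a DynamoDB typed attribute value into a plain Python value.

    Covers only the types used in the example event; real events may carry
    others (B, BOOL, L, NULL, SS, ...)."""
    ((type_tag, value),) = attr.items()
    if type_tag == "N":
        # DynamoDB serializes numbers as strings
        return int(value) if value.lstrip("-").isdigit() else float(value)
    if type_tag == "S":
        return value
    if type_tag == "M":
        return {k: unwrap(v) for k, v in value.items()}
    raise ValueError(f"unhandled attribute type: {type_tag}")

# A trimmed version of the event shown above
event = json.loads("""{
   "eventName": "INSERT",
   "dynamodb": {
      "NewImage": {
         "Message": {"S": "New item!"},
         "Id": {"N": "101"}
      }
   }
}""")

new_image = {k: unwrap(v) for k, v in event["dynamodb"]["NewImage"].items()}
# new_image -> {"Message": "New item!", "Id": 101}
```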

Note that, once deployed, the trigger always starts consuming from the latest event in the stream. This means the pipeline will not receive events published before it comes online. This behavior prevents the execution environment from being flooded with events, which could lead to out-of-memory errors and delivery delays.
