Amazon DynamoDB

Amazon DynamoDB is a fully managed, multi-master, multi-region non-relational database that offers built-in in-memory caching to deliver reliable performance at any scale.

Hevo uses DynamoDB’s data streams to support change data capture (CDC). Data streams are time-ordered sequences of item-level changes in DynamoDB tables. All data in DynamoDB streams is subject to a 24-hour lifetime and is automatically removed after this time. We suggest that you set the ingestion frequency accordingly.

To facilitate incremental data loads to a Destination, Hevo needs to keep track of the data that has been read so far from the data stream. Hevo supports two methods of managing this ingestion information: Kinesis Data Streams and DynamoDB Streams.

Refer to the comparison below to know the differences between the two methods.

Kinesis Data Streams:

  • Recommended method.

  • User permissions needed on the DynamoDB Source: Read-only and dynamodb:CreateTable.

  • Uses the Kinesis Client Library (KCL) to ingest the changed data from the database.

  • Guarantees real-time data ingestion.

  • The Kinesis driver maintains the context. KCL creates an additional table (with the prefix hevo_kcl) per table in the Source system, to store the last processed state for that table.

DynamoDB Streams:

  • Default method, used when the DynamoDB user does not have the dynamodb:CreateTable permission.

  • User permissions needed on the DynamoDB Source: Read-only.

  • Uses the DynamoDB library.

  • Data might be ingested with a delay.

  • Hevo keeps the entire context of data replication as metadata, including positions to indicate the last record ingested.
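For reference, a change record read from a DynamoDB stream configured with New and old images carries both the pre- and post-change item. The sketch below shows the general record shape and how the post-change item could be extracted; the table, key, and attribute values are illustrative, and the helper function is our own, not part of Hevo:

```python
# Illustrative shape of a DynamoDB Streams record when the stream's
# view type is NEW_AND_OLD_IMAGES (values here are made up).
record = {
    "eventName": "MODIFY",
    "dynamodb": {
        "Keys": {"customer_id": {"S": "C-1001"}},
        "OldImage": {"customer_id": {"S": "C-1001"}, "plan": {"S": "free"}},
        "NewImage": {"customer_id": {"S": "C-1001"}, "plan": {"S": "pro"}},
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
}

def new_image(rec):
    """Return the post-change item, or None for REMOVE events."""
    return rec["dynamodb"].get("NewImage")

print(new_image(record)["plan"]["S"])  # pro
```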


Perform the following steps to configure your Amazon DynamoDB Source:

Enable Streams

You need to enable Streams on all DynamoDB tables you want to sync through Hevo. To do this:

  1. Sign in to the AWS Management Console and select the DynamoDB service.

  2. In the left navigation bar of the DynamoDB console, under Dashboard, select Tables, and then, select the table for which you want to enable streams. For example, customer in the image below.

    DynamoDB table

  3. In the Exports and streams tab, scroll down to the DynamoDB stream details section and click Enable.

    Export Streams tab

  4. In the Enable DynamoDB stream page, select New and old images and click Enable stream.

    New and old images

  5. Repeat Steps 2-4 for all the tables you want to synchronize.
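The same setting can also be applied through the AWS API. A minimal sketch of the parameters an UpdateTable call would need, mirroring the console steps above (the boto3 call is shown commented out because it requires live AWS credentials, and the table name is illustrative):

```python
def enable_stream_params(table_name):
    """Build UpdateTable parameters that turn on a stream with
    both new and old images, matching the console steps above."""
    return {
        "TableName": table_name,
        "StreamSpecification": {
            "StreamEnabled": True,
            # NEW_AND_OLD_IMAGES captures the item both before and
            # after each change, which is the view type Hevo expects.
            "StreamViewType": "NEW_AND_OLD_IMAGES",
        },
    }

params = enable_stream_params("customer")
# import boto3
# boto3.client("dynamodb").update_table(**params)  # needs AWS credentials
print(params["StreamSpecification"]["StreamViewType"])
```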

Create an IAM Policy

Note: An IAM policy is needed for KCL (Kinesis Data Streams) only.

A policy is an AWS object that, when associated with an identity or resource, defines its permissions. When Hevo makes a request to access the data in your DynamoDB account, AWS evaluates the permissions in the policy to determine whether the request is allowed or denied. Most policies are stored in AWS as JSON documents.

Perform the following steps to create the IAM policy:

  1. Log in to the Amazon IAM Console.

  2. Click Policies in the left navigation bar.

    Policies link

  3. Click Create policy in the right pane.

    Create Policy link

  4. Click the JSON tab and paste the following policy into the editor. The JSON statements list the permissions the policy would assign to Hevo.

         {
             "Version": "2012-10-17",
             "Statement": [
                 {
                     "Effect": "Allow",
                     "Action": [
                         ...
                     ],
                     "Resource": [
                         ...
                     ]
                 }
             ]
         }

    Note: Hevo does not modify any data in the Source tables. The permissions are used solely to store the last processed state for a table by the KCL.

  5. Click Review policy.

  6. In the Review policy page, provide a Name for the policy. For example, Hevo-access.

  7. (Optional) Provide a Description.

    Name for the policy

  8. Click Create policy. You can view the new policy in the list.

    Create the policy
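For orientation, an illustrative policy of the kind described above: one statement grants the stream-read actions, and the other grants the actions the KCL needs on its hevo_kcl lease tables. Treat this as a sketch only; the authoritative JSON is the one you paste in Step 4, and the wildcard Resource ARNs below are placeholders you would scope to your own account and tables.

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "dynamodb:DescribeStream",
                "dynamodb:GetRecords",
                "dynamodb:GetShardIterator",
                "dynamodb:ListStreams"
            ],
            "Resource": "arn:aws:dynamodb:*:*:table/*/stream/*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "dynamodb:CreateTable",
                "dynamodb:DescribeTable",
                "dynamodb:GetItem",
                "dynamodb:PutItem",
                "dynamodb:UpdateItem",
                "dynamodb:Scan"
            ],
            "Resource": "arn:aws:dynamodb:*:*:table/hevo_kcl*"
        }
    ]
}
```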

Create the AWS Access Key and the AWS Secret Key

The AWS Access Key and the AWS Secret Key allow Hevo to establish authentication and replicate your Amazon DynamoDB data into your desired Destination system. You need to specify these while configuring Amazon DynamoDB as a Source in Hevo.

To retrieve these:

  1. Log in to the Amazon IAM Console.

  2. In the left navigation pane, click Users, and then, click the User name for which you want to create an access key.

    Click the user name

  3. In the Summary page, click the Security credentials tab, and then, click Create access key.

    Create access key

  4. In the Create access key dialog box:

    1. Locate the AWS Access Key under Access key ID.

    2. Click Show under Secret access key to view the AWS Secret Key.

      Note: Once you exit this dialog box, you cannot view the same AWS Access Key and AWS Secret Key again. However, you can create a new key and secret by repeating these steps.

    3. Optionally, click Download .csv file to download and save the AWS Access Key and the AWS Secret key in your local machine.

      Download access key
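If you downloaded the .csv file, the credentials can be read back with a few lines of standard-library Python. The column headers below match what the console typically writes; verify them against your actual download, and note that the key values here are fake:

```python
import csv
import io

# Contents mimicking the downloaded key file (values are fake).
csv_text = (
    "Access key ID,Secret access key\n"
    "AKIAEXAMPLEKEY123456,wJalrXUtnFEMI/EXAMPLESECRETKEY\n"
)

def read_access_keys(fileobj):
    """Return (access_key, secret_key) from a console-style key CSV."""
    row = next(csv.DictReader(fileobj))
    return row["Access key ID"], row["Secret access key"]

access_key, secret_key = read_access_keys(io.StringIO(csv_text))
print(access_key)  # AKIAEXAMPLEKEY123456
```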

Retrieve the AWS Region

To configure Amazon DynamoDB as a Source in Hevo, you need to provide the AWS region where your DynamoDB instance is running.

To know your AWS region:

  1. Log in to the Amazon DynamoDB Console.

  2. On the top-right, locate the AWS region.

    Locate region

Configure Amazon DynamoDB Connection Settings

Perform the following steps to configure DynamoDB as a Source in Hevo:

  1. Click PIPELINES in the Asset Palette.

  2. Click + CREATE in the Pipelines List View.

  3. In the Select Source Type page, select DynamoDB.

  4. In the Configure your DynamoDB Source page, specify the following:

    DynamoDB settings

    • Pipeline Name: A unique name for the Pipeline.

    • AWS Access Key, AWS Secret Key, and AWS Region.

    • Advanced Settings:

      • Load Historical Data: If this option is enabled, the entire table data is fetched during the first run of the Pipeline. If disabled, Hevo loads only the data that was written to your database after the Pipeline was created.

      • Include New Tables in the Pipeline: Applicable for all Pipeline modes except Custom SQL.

        If enabled, Hevo automatically ingests data from tables created in the Source after the Pipeline has been built. These may include completely new tables or previously deleted tables that have been re-created in the Source.

      You can change this setting later.

  5. Click TEST & CONTINUE to set up the job settings.

  6. A list of the tables available for replication is displayed. Note that Hevo can ingest data only from tables for which DynamoDB Streams is enabled. Deselect the tables you do not want to replicate, and then click Continue to configure the Destination.

  7. Select the Destination where you want to replicate DynamoDB tables or click on ADD DESTINATION to create a new Destination. Read Destinations for more information.

If your DynamoDB source does not contain any new Events to be ingested, Hevo defers the data ingestion for a pre-determined time. Hevo re-attempts to fetch the data only after the deferment period elapses. Read Deferred Data Ingestion.

Schema and Type Mapping

Hevo replicates the schema of the tables from the Source DynamoDB as-is to your Destination database or data warehouse. In rare cases, we skip some columns with an unsupported Source data type while transforming and mapping.

The following table shows how your DynamoDB data types get transformed to a warehouse type.

DynamoDB Data Type    Warehouse Data Type
Binary                Bytes
Number                Decimal / Long
Boolean               Boolean
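The mapping above can be illustrated with a small decoding sketch. DynamoDB's wire format tags each attribute value with its type (for example, {"N": "42"} for a Number); the function below, which is our own illustration and not part of Hevo, converts the three types in the table into their Python analogues of the warehouse types:

```python
from decimal import Decimal

def decode_attribute(attr):
    """Decode a DynamoDB-typed attribute value into the Python
    analogue of the warehouse type shown in the table above."""
    (tag, value), = attr.items()
    if tag == "B":        # Binary  -> Bytes
        return bytes(value)
    if tag == "N":        # Number  -> Decimal (integral values fit Long)
        return Decimal(value)
    if tag == "BOOL":     # Boolean -> Boolean
        return bool(value)
    raise ValueError(f"unsupported type tag: {tag}")

print(decode_attribute({"N": "42"}))     # 42
print(decode_attribute({"BOOL": True}))  # True
```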



Revision History

Refer to the following table for the list of key updates made to this page:

Date         Release  Description of Change
Oct-04-2021  1.73     - Updated the section, Prerequisites, to inform users about setting the value of the StreamViewType parameter to NEW_AND_OLD_IMAGES.
                      - Updated the section, Enable Streams, to reflect the latest changes in the DynamoDB console.
Aug-08-2021  NA       Added a note in the Source Considerations section about Hevo deferring data ingestion in Pipelines created with this Source.
Jul-12-2021  1.67     Added the field Include New Tables in the Pipeline under Source configuration settings.
Feb-22-2021  1.57     Added the sections, Create the AWS Access Key and the AWS Secret Key, and Retrieve the AWS Region.
Last updated on 26 Aug 2022
