Destination FAQs

How do I enable or disable the deduplication of records in my Destination tables?

Use the Append Rows on Update option within a Destination table to indicate whether the ingested Events must be appended directly as new rows or checked against existing rows for duplicates. You can specify this setting for each table.

Note: This feature is available only for Amazon Redshift, Google BigQuery, and Snowflake data warehouse Destinations. For RDBMS Destinations such as Aurora MySQL, MySQL, Postgres, and SQL Server, deduplication is always performed.
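To illustrate the behavior conceptually, here is a minimal Python sketch (not Hevo code). It assumes each Event is a dict whose id field is the primary key of the Destination table, and shows how the same batch of Events lands in the table with the setting enabled versus disabled:

```python
# Conceptual sketch only; not Hevo's implementation. Assumes each Event is a
# dict whose "id" field is the primary key in the Destination table.

events = [
    {"id": 1, "status": "created"},
    {"id": 1, "status": "shipped"},   # updated version of the same record
    {"id": 2, "status": "created"},
]

# Append rows on update: ENABLED - every Event becomes a new row.
appended_rows = list(events)          # 3 rows; both versions of id=1 are kept

# Append rows on update: DISABLED - Events are de-duplicated on the primary
# key, so only the latest version of each record is retained.
deduplicated = {}
for event in events:
    deduplicated[event["id"]] = event  # later versions overwrite earlier ones
deduplicated_rows = list(deduplicated.values())  # 2 rows: id=1 (shipped), id=2

print(len(appended_rows), len(deduplicated_rows))  # 3 2
```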

In the Destination Detailed View page:

  1. Click the More icon next to the table name in the Destination Tables List.
  2. Update the Append rows on update option, as required:

    Option setting    Description
    Enabled           Events are appended as new rows without deduplication.
    Disabled          Events are de-duplicated.

    (Image: Modify the Append rows on update setting for a table)

  3. Click OK, GOT IT in the confirmation dialog to apply this setting.

Note: If you disable this feature after having previously enabled it, uniqueness is ensured only for records loaded subsequently in the case of Google BigQuery and Snowflake; both old and new versions of the same record may therefore exist. In the case of Amazon Redshift, however, uniqueness is achieved for the entire data once the feature is disabled.

How do I resolve duplicate records in the Destination table?

Duplicate records may occur in the Destination table in two scenarios:

  • When there is no primary key in the Destination table to carry out the deduplication of records.
  • When the Append Rows on Update setting is enabled for the table.


You can either set up the primary key and re-run the ingestion for that object from the Pipeline Overview tab of the Pipeline Detailed View, or disable the Append Rows on Update setting for the table.

Note: The changes are applied only to the data loaded subsequently.
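If you want to check whether duplicates remain in the data that was already loaded, a quick way is to count rows per primary key. The snippet below is a minimal sketch, assuming the table rows are available as Python dicts with an id primary key; in practice you would run an equivalent GROUP BY query in your warehouse.

```python
# Minimal sketch for spotting leftover duplicates; assumes rows are dicts and
# "id" is the primary key. In a real warehouse you would express the same
# check as: SELECT id, COUNT(*) FROM <table> GROUP BY id HAVING COUNT(*) > 1
from collections import Counter

rows = [
    {"id": 1, "status": "created"},
    {"id": 1, "status": "shipped"},
    {"id": 2, "status": "created"},
]

counts = Counter(row["id"] for row in rows)
duplicate_keys = [key for key, n in counts.items() if n > 1]
print(duplicate_keys)  # [1] -> primary key 1 has more than one row
```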

Can I have the same Source and Destination in the Pipeline?

Yes, you can create a Pipeline with the same Source and Destination instance. However, you cannot use the same database as both the Source and the Destination in the Pipeline.



Revision History

Refer to the following table for the list of key updates made to this page:

Date           Release    Description of Change
Mar-09-2021    NA         Added the FAQ, How do I resolve duplicate records in the Destination table?
Last updated on 01 Jun 2021