Attempt to Load Duplicate Records

Last updated on Aug 06, 2025
Applies To: Incremental Models with primary keys for PostgreSQL and SQL Server Destinations

Error Message Text(s):

For PostgreSQL: ERROR: ON CONFLICT DO UPDATE command cannot affect row a second time. Hint: Ensure that no rows proposed for insertion within the same command have duplicate constrained values.

For SQL Server: The MERGE statement attempted to UPDATE or DELETE the same row more than once. This happens when a target row matches more than one source row. A MERGE statement cannot UPDATE/DELETE the same row of the target table multiple times. Refine the ON clause to ensure a target row matches at most one source row, or use the GROUP BY clause to group the source rows.

Error Summary

This error occurs when an incremental Model attempts to insert duplicate records that share the same primary key into the output table.

Potential Cause

The Model’s SQL query generates duplicate records with the same primary key value and attempts to load them into the output table.
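A common way this happens is a one-to-many join that fans out the result set, so the primary key is no longer unique. The following is an illustrative sketch only; the table and column names (orders, order_shipments, order_id, order_amount, shipment_date) are hypothetical placeholders for your own schema:

    -- orders has one row per order_id, but order_shipments can have several
    -- rows for the same order_id. The join multiplies each order row, so
    -- order_id appears more than once and the incremental load fails.
    SELECT
        o.order_id,        -- primary key of the Model's output table
        o.order_amount,
        s.shipment_date
    FROM
        orders o
    JOIN
        order_shipments s
        ON s.order_id = o.order_id;   -- one-to-many join duplicates order_id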

Suggested Action(s)

Modify your Model’s SQL query to ensure that it does not generate duplicate values. To identify duplicate records, run the following query:

    WITH base_query AS (
        -- Paste your Model query here
    )
    SELECT
        <primary_key1>,
        <primary_key2>,
        ...
        COUNT(*) AS duplicate_count
    FROM
        base_query
    GROUP BY
        <primary_key1>,
        <primary_key2>,
        ...
    HAVING
        COUNT(*) > 1
    ORDER BY
        COUNT(*) DESC;

After identifying duplicates, adjust your query logic to ensure only unique rows are inserted into the output table.
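One common approach is to rank the rows within each primary key and keep only the first one. The sketch below is illustrative, not the only valid fix; the ranking columns (order_id, updated_at) are hypothetical placeholders, and ROW_NUMBER() is supported by both PostgreSQL and SQL Server:

    WITH base_query AS (
        -- Paste your Model query here
    ),
    ranked AS (
        SELECT
            *,
            ROW_NUMBER() OVER (
                PARTITION BY order_id       -- replace with your primary key column(s)
                ORDER BY updated_at DESC    -- replace with a column that picks the row to keep
            ) AS row_num
        FROM
            base_query
    )
    SELECT
        *                                   -- or list the output columns explicitly, excluding row_num
    FROM
        ranked
    WHERE
        row_num = 1;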


Revision History

Refer to the following table for the list of key updates made to this page:

Date          Release    Description of Change
Aug-06-2025   NA         New document.