Log-based Pipelines

Last updated on Mar 03, 2024

A log-based Pipeline moves data from a Source to a Destination by reading the Source database logs, which consist of Events describing the changes made to the database. Data is ingested from these logs at a fixed interval.

These logs are generally maintained by the database for data replication or recovery.
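
To make the idea concrete, the following minimal Python sketch tails a MongoDB change stream, one of the log types listed below, and hands each change Event to a placeholder handler. The connection string, database, collection, and `apply_event()` function are assumptions for illustration only and are not part of Hevo's implementation.

```python
# Minimal sketch of log-based ingestion from a MongoDB change stream.
# Host, database, collection, and apply_event() are illustrative placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://source-host:27017/?replicaSet=rs0")
collection = client["shop"]["orders"]

def apply_event(event):
    # A real Pipeline maps each change Event onto the Destination schema;
    # here we only print the operation type and the affected document key.
    print(event["operationType"], event.get("documentKey"))

# watch() tails the oplog-backed change stream and yields one Event per
# insert, update, replace, or delete applied to the collection.
with collection.watch() as stream:
    for change in stream:
        apply_event(change)
```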

To create a log-based Pipeline, you must select the appropriate Pipeline mode for your Source type:

  • A MySQL Source configured using the BinLog ingestion mode.

  • A PostgreSQL Source configured using the Logical Replication ingestion mode (a consumption sketch follows this list).

  • An Amazon DynamoDB Source using Streams.

  • A MongoDB Source configured using either the Change Stream or OpLog ingestion mode.

  • An SQL Server Source with individual objects configured to use Change Tracking as the query mode, the ingestion mode, or both.

  • An Oracle Source configured using the Redo Logs ingestion mode.
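
As referenced in the PostgreSQL item above, the sketch below shows what consuming a logical-replication log looks like using psycopg2. It is an illustration of the log that the Logical Replication ingestion mode reads, not Hevo's implementation; the DSN, slot name, and output plugin are example values only.

```python
# Illustrative sketch of reading a PostgreSQL logical-replication stream,
# the log that the Logical Replication ingestion mode is based on.
# The DSN, slot name, and output plugin below are example values only.
import psycopg2
from psycopg2.extras import LogicalReplicationConnection

conn = psycopg2.connect(
    "dbname=shop host=source-host user=replicator",
    connection_factory=LogicalReplicationConnection,
)
cur = conn.cursor()

# Create a replication slot (one-time setup) and start streaming decoded changes.
cur.create_replication_slot("example_slot", output_plugin="test_decoding")
cur.start_replication(slot_name="example_slot", decode=True)

def consume(msg):
    # Each message describes committed changes; acknowledging the LSN lets
    # the Source database discard the WAL segments that were already read.
    print(msg.payload)
    msg.cursor.send_feedback(flush_lsn=msg.data_start)

cur.consume_stream(consume)
```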

From Release 2.21 onwards, new and existing teams can create a streaming Pipeline for all variants of the Source types listed above, except SQL Server (all variants) and Amazon DynamoDB. To access this feature, contact the Hevo Sales team.



