Managing Objects in Pipelines

The Pipeline Detailed View lists the objects you have selected for data ingestion. The information displayed depends on the Pipeline mode and the Source configuration you select. You can control data ingestion for each object individually using the options provided for it.

NOTE: Objects are not listed for Sources that depend on webhooks to receive data.

The following information is typically provided for each object:

[Image: Object actions for SaaS Sources]

  • Name of the object.

  • Number of Events ingested.

  • A trend-line graph showing the activity in the last 24 hours.

  • Object ingestion state.

  • Time when the Events were last ingested.

SaaS Source Objects

Data for SaaS Sources is ingested through the APIs exposed by the respective Source.

[Image: SaaS Source objects]

Available Actions

The following standard actions are available for each object:

  • Sample Events: View a sample of the ingested Events.

  • Run Now: Queue the jobs related to the Source Object. This action is available only when both the object and the Pipeline are in the Active state.

  • Pause/Resume: Pause or resume the data ingestion for the object.

  • Restart: Restart the incremental data ingestion for the object. The data is ingested again from the beginning. Some Sources may restrict how far back Hevo can go to re-ingest the data. At the end of each ingestion run, Hevo stores the position up to which data was ingested; every subsequent run ingests only the data beyond that stored position.

    NOTE: Not all Sources support the Restart action.

  • Skip/Include Object: Skip and Include are complementary actions. You can skip or include active or paused Source objects.
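The position-based behavior behind Run Now and Restart can be sketched as follows. This is an illustrative model only, not Hevo's actual implementation; the class and field names are assumptions made for the example.

```python
# Illustrative sketch (not Hevo's implementation): each run stores the
# position it reached, the next run ingests only data beyond that stored
# position, and Restart resets the position to the beginning.
class ObjectIngestion:
    def __init__(self, records):
        self.records = records        # source data, ordered by position
        self.position = 0             # stored at the end of each run

    def run(self):
        """Ingest only the records beyond the previously stored position."""
        new = self.records[self.position:]
        self.position = len(self.records)   # persist position for next run
        return new

    def restart(self):
        """Re-ingest from the beginning on the next run."""
        self.position = 0


obj = ObjectIngestion(records=["e1", "e2", "e3"])
obj.run()                                      # first run ingests e1..e3
obj.records.append("e4")                       # new data arrives at the Source
assert obj.run() == ["e4"]                     # next run picks up only new records
obj.restart()
assert obj.run() == ["e1", "e2", "e3", "e4"]   # full re-ingestion after Restart
```

The key property is that ingestion is idempotent with respect to already-seen positions: repeated runs without new data ingest nothing until the position is reset.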

RDBMS Source Objects

Pipelines for RDBMS Sources display information similar to that for SaaS Sources. In addition, the log-based ingestion status is displayed if the Pipeline is log-based, and the historical load status is displayed if the Load Historical Data option is selected, as shown in the image below.

[Image: Historical data progress]

In the above image, the BinLog information indicates the record position up to which the data has been ingested. This information is also displayed at the object level, where you can control it: click the More icon and then Change Position to start ingesting from a different position. In log-based Pipelines, the Change Position option applies to the historical load, if one exists.
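Conceptually, a BinLog position is an offset into the database's change log from which ingestion resumes, and Change Position overwrites that stored offset. A minimal sketch, with illustrative names and data (this is not Hevo's implementation):

```python
# Illustrative sketch (not Hevo's implementation): a log-based position is an
# offset into the change log; Change Position overwrites the stored offset so
# the next run resumes from there.
log = [
    {"pos": 100, "op": "INSERT"},
    {"pos": 200, "op": "UPDATE"},
    {"pos": 300, "op": "DELETE"},
]

def next_batch(log, position):
    """Return the log entries beyond the stored position."""
    return [e for e in log if e["pos"] > position]

position = 300                       # everything up to here is already ingested
assert next_batch(log, position) == []

position = 100                       # "Change Position": resume from an earlier point
assert [e["op"] for e in next_batch(log, position)] == ["UPDATE", "DELETE"]
```

This also shows why editing the position initiates ingestion: moving the offset back makes previously seen log entries eligible again on the next run.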

Available Actions

The following standard actions are available for each object:

[Image: Object actions for RDBMS Sources]

  • Edit Config: This option is provided if data ingestion for the object requires configuration. For example, loading the historical data in a log-based Pipeline requires you to configure the incremental columns.

  • Change Position: This option is provided for all SaaS-based Pipelines and Pipelines with mode as Table. All log-based Pipelines also support changing the position, except those using the WAL mode in PostgreSQL and Streams in Amazon DynamoDB. Editing the position also initiates the data ingestion activity.

  • Sample Events: View a sample of the ingested Events.

  • Restart Historical Load: Restarting means ingesting the data again from the beginning (what constitutes the beginning is defined per Source type).

    • Log-based ingestion cannot be restarted. For such Pipelines, the historical load must be restarted instead.

    • This action is available only when both the object and the Pipeline are in the Active state.

  • Skip/Include Object: Skip and Include are complementary actions.

    • You can skip and include active or paused Source objects.

    • For log-based Pipelines, skipping/including the Source object means dropping/including the related Events from the log-based ingestion.
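In log-based Pipelines, the change log interleaves Events from all objects, so skipping an object amounts to filtering its Events out of the stream. An illustrative sketch, with hypothetical object names (not Hevo's implementation):

```python
# Illustrative sketch (not Hevo's implementation): the change log interleaves
# Events from all objects; skipping an object drops its Events from the
# log-based ingestion, and including it lets them through again.
log_events = [
    {"object": "orders", "id": 1},
    {"object": "users",  "id": 2},
    {"object": "orders", "id": 3},
]

skipped = {"users"}                  # objects currently skipped

ingested = [e for e in log_events if e["object"] not in skipped]
assert [e["id"] for e in ingested] == [1, 3]   # users Events are dropped

skipped.discard("users")             # "Include" the object again
ingested = [e for e in log_events if e["object"] not in skipped]
assert [e["id"] for e in ingested] == [1, 2, 3]
```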

Last updated on 17 Dec 2020