Audit Tables

Last updated on Jul 25, 2024

This feature is currently available for Early Access. Please contact your Hevo account executive or the Support team to enable it. Alternatively, request early access to try out one or more such features.

Audit Tables in Hevo provide improved visibility into Pipeline activities such as data ingestion and loading. These are tables that you can enable Hevo to create in your Destination to maintain logs of all the data replication jobs performed by your Pipelines. Hevo maintains these logs in an internal database, ingests them, and loads them to the Audit Tables on the schedule that you define.

Hevo creates the Audit Tables at the job and object levels. This means that details are available for all jobs executed across the Pipelines in your account and for each object in those Pipelines. Each row in an Audit Table represents a job performed by a Pipeline.

Users who signed up after October 31, 2022, can use the same Audit Tables across all the workspaces in their account. However, they must configure these table settings within each workspace.

Audit Tables help you do the following:

  • Track and monitor Event usage based on the number of Events ingested, loaded, and failed, as illustrated in the sketch after this list.

  • Monitor the stages of data replication on the basis of:

    • The start and end timestamps of a data replication stage.

    • The mode of data replication: Historical or Incremental.

    • The type of data being replicated: Refresher or Incremental.

    • The total time taken by Hevo to complete the execution of a replication stage.
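
For example, here is a minimal sketch of an Event-usage query run from Python. It assumes a PostgreSQL Destination reachable with the psycopg2 driver; the connection details are placeholders, and the table is assumed to be named object_details in your Destination (see its schema below):

```python
import psycopg2

# Placeholder credentials; replace with your own Destination's details.
conn = psycopg2.connect(
    host="your-destination-host",
    dbname="your_database",
    user="your_user",
    password="your_password",
)

# Aggregate ingested, loaded, failed, and billable Events per Source object,
# using the columns documented in the Object_details schema below.
with conn.cursor() as cur:
    cur.execute("""
        SELECT object,
               SUM(input_rows)    AS events_ingested,
               SUM(output_rows)   AS events_loaded,
               SUM(error_rows)    AS events_failed,
               SUM(billable_rows) AS events_billed
        FROM object_details
        GROUP BY object
        ORDER BY events_billed DESC;
    """)
    for row in cur.fetchall():
        print(row)

conn.close()
```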

Hevo creates the Audit Tables in the Destination with the following names and schemas:

- Job_details table for job-level logs

| Field | Field Type | Description |
|---|---|---|
| job_id | varchar | The UUID of the job performed by the Pipeline. |
| pipeline_id | long | The sequential serial number of the Pipeline. |
| source | varchar | The Source associated with the Pipeline ID. |
| destination | varchar | The Destination associated with the Pipeline ID. |
| start_timestamp | long | The start timestamp of the job. |
| end_timestamp | long | The end timestamp of the job. |
| status | varchar | The status of the Pipeline related to the job. For example, Scheduled or In Progress. |
| mode | varchar | The type of data being replicated. For example, Refresher or Incremental. |
| duration | long | The time taken to complete the job. |
| custom_stage | bool | Indicates whether a Transformation was applied in the Pipeline. This field is TRUE if a Transformation was applied and FALSE otherwise. |
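
To illustrate how the job-level log might be used, here is a hedged sketch that lists the longest-running jobs. It assumes the same placeholder PostgreSQL connection as the earlier example and a table named job_details:

```python
import psycopg2

# Placeholder credentials; replace with your own Destination's details.
conn = psycopg2.connect(host="your-destination-host", dbname="your_database",
                        user="your_user", password="your_password")

# List the five longest-running jobs, using the Job_details columns above.
with conn.cursor() as cur:
    cur.execute("""
        SELECT job_id, pipeline_id, source, destination, status, mode, duration
        FROM job_details
        ORDER BY duration DESC
        LIMIT 5;
    """)
    for row in cur.fetchall():
        print(row)

conn.close()
```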

- Object_details table for object-level logs

| Field | Field Type | Description |
|---|---|---|
| job_id | varchar | The UUID of the job performed by a Pipeline. |
| object | varchar | The name of the Source object associated with the job ID. |
| event_name | varchar | The Source Event name associated with an object. For example, if the object city contains address and zipcode as its associated fields, the Event names are city.address and city.zipcode. |
| stage | varchar | The stage of data replication related to the job. For example, Ingestion or Load. |
| start_timestamp | long | The start timestamp of the job. |
| end_timestamp | long | The end timestamp of the job. |
| status | varchar | The status of data replication. For example, In Progress or Completed. Note: This field may not display the accurate status if Transformations are involved. |
| mode | varchar | The mode of data replication. For example, Historical or Incremental. |
| input_rows | long | The number of Events ingested for the object associated with the job. |
| output_rows | long | The number of Events loaded for the object associated with the job. Note: This may not match the input rows if errors occur or Transformations are applied. |
| error_rows | long | The number of Events that were not replicated due to an error. |
| billable_rows | long | The number of loaded Events that count towards your Events usage. |
| duration | long | The time taken to complete the job. |
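
Building on the notes above, here is a sketch that flags jobs whose loaded and failed Events do not add up to the ingested Events; such gaps can point to errors or Transformations. The connection details are again placeholders:

```python
import psycopg2

# Placeholder credentials; replace with your own Destination's details.
conn = psycopg2.connect(host="your-destination-host", dbname="your_database",
                        user="your_user", password="your_password")

# Flag object-level jobs where loaded + failed Events do not account for all
# ingested Events; per the notes above, Transformations can cause this too.
with conn.cursor() as cur:
    cur.execute("""
        SELECT job_id, object, input_rows, output_rows, error_rows
        FROM object_details
        WHERE input_rows <> output_rows + error_rows;
    """)
    for row in cur.fetchall():
        print(row)

conn.close()
```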

Logs are added in the sequence in which jobs are completed in your Pipelines. You can run suitable queries to view the logs as per your requirements. The following is a sample of how the logs appear when you query the Audit Tables from the Destination workbench:

Sample View of Tables
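
If you prefer to query programmatically instead of from the Destination workbench, the following sketch joins the two tables on job_id to view the object-level details of the most recent jobs. As before, the PostgreSQL connection and exact table names are assumptions to adapt to your setup:

```python
import psycopg2

# Placeholder credentials; replace with your own Destination's details.
conn = psycopg2.connect(host="your-destination-host", dbname="your_database",
                        user="your_user", password="your_password")

# Join job-level and object-level logs for the ten most recently started jobs.
with conn.cursor() as cur:
    cur.execute("""
        SELECT j.job_id, j.source, j.destination, j.mode, j.status,
               o.object, o.stage, o.input_rows, o.output_rows
        FROM job_details AS j
        JOIN object_details AS o ON o.job_id = j.job_id
        ORDER BY j.start_timestamp DESC
        LIMIT 10;
    """)
    for row in cur.fetchall():
        print(row)

conn.close()
```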

Hevo does not charge you for the logs loaded to the Destination or for queries run directly from the Hevo UI. However, the Destination may charge you for data storage and querying. Refer to your Destination's documentation for details about such costs.

Revision History

Refer to the following table for the list of key updates made to this page:

| Date | Release | Description of Change |
|---|---|---|
| Jan-10-2024 | 2.19 | New document. |
