Factors Affecting Event Usage
Read the information in this section to understand how you can manage and reduce your Events usage.
Some settings and choices that affect the number of Events you consume include:
- The number of Source objects you ingest data from. The more objects you ingest, the higher the Events consumption while loading the data.
- Skipped objects and Event Types. While creating the Pipeline, you can select the objects that Hevo must ingest data from, to avoid loading Events you do not need. Similarly, if multiple Event Types are created from a Source object due to Transformations you apply, you can choose to load only the ones you need and skip the others via the Schema Mapper, to reduce your Events usage.
- The structure of the data. The amount of data in the selected tables and their structure affect your usage. Destinations such as Redshift and PostgreSQL break down nested Events and count each sub-record as a row. This can result in a higher number of Events being loaded to the Destination than the ingested count. Refer to Parsing Nested JSON Fields in Events for more information.
- Conversion Window (Ad-based Sources).
- Object restarts and offset changes. When you restart ingestion for an object, any historical data is always loaded for free. For Sources configured with a moving historical sync duration, for example, the last 30 days, the historical data is re-ingested for the 30 days preceding the restart action, and this too is free. In Table mode-based Pipelines, you are charged for Events dated after Pipeline creation. In log-based Pipelines, Events reloaded from the log are chargeable; in PostgreSQL, this is the log created at the time of Pipeline creation.
- Transformations that add or drop fields, or that filter Events.
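The nested-Event breakdown described above can be sketched in plain Python. This is an illustrative example, not Hevo's actual parsing logic; the event shape and the `explode_nested` helper are assumptions made for the sketch. It shows how one ingested Event containing a nested array can become multiple rows, and therefore multiple billable Events, in a Destination that breaks down nested records:

```python
def explode_nested(event, array_field):
    """Split one event containing a nested array into one row per sub-record.

    Illustrative only: mimics how a Destination such as Redshift or
    PostgreSQL may count each sub-record of a nested Event as a row.
    """
    base = {k: v for k, v in event.items() if k != array_field}
    return [{**base, **sub} for sub in event.get(array_field, [])]


# One ingested Event with two nested line items...
order = {
    "order_id": 101,
    "customer": "acme",
    "items": [
        {"sku": "A1", "qty": 2},
        {"sku": "B2", "qty": 1},
    ],
}

rows = explode_nested(order, "items")
# ...becomes two loaded rows, so the loaded Event count exceeds the ingested count.
```
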
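To illustrate the last point, here is a minimal sketch of a transformation that filters Events and drops a field. It is a generic Python illustration, not Hevo's Transformation interface; the field names (`status`, `debug_info`) are assumptions. Filtered-out Events are never loaded, and dropped fields shrink each Event, both of which reduce usage:

```python
def transform(events):
    """Filter out unwanted events and drop a field (illustrative only)."""
    out = []
    for event in events:
        # Filter: events matching this condition are skipped, not loaded.
        if event.get("status") == "test":
            continue
        # Drop a field the Destination does not need.
        event.pop("debug_info", None)
        out.append(event)
    return out


batch = [
    {"id": 1, "status": "live", "debug_info": "trace-a"},
    {"id": 2, "status": "test", "debug_info": "trace-b"},
]
loaded = transform(batch)
# Only the "live" event remains, and it no longer carries "debug_info".
```
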
Refer to the following table for the list of key updates made to this page:
| Date | Release | Description of Change |
|------|---------|-----------------------|