Release Version 2.16

Last updated on Aug 29, 2023

The content on this site may have changed or moved since you last viewed it, so some of your bookmarks may be outdated. We recommend accessing the latest content on the Hevo Docs website.

To see the features and integrations we are working on next, read our Upcoming Features page!

In this Release

Upcoming Breaking Changes


  • Removing Support for Calendar Events in HubSpot  Breaking change

    • Effective August 31, 2023, HubSpot will stop supporting the Calendar Events object. As a result, Hevo will stop replicating data for this object.

New and Changed Features


  • Improved Support for the JDBC Driver in Snowflake

    • Updated the JDBC driver to its latest version for Snowflake accounts hosted on Google Cloud. This change does not impact any new or existing Pipelines.


  • Support for Partition Keys-Based Data Loading in Google BigQuery Destinations

    • Enhanced the data loading process to use partition keys, if defined, while writing data to new and existing partitioned tables in BigQuery. With this change, Hevo writes directly to the affected partitions instead of scanning the entire table to identify the impacted rows. As a result, the SQL queries that Hevo runs to load data into your Google BigQuery Destination read fewer bytes, thereby reducing costs and improving performance.

      Read Partitioning in BigQuery and Recreating Partitioned Tables from BigQuery Console.
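The saving from partition-aware writes can be illustrated with a small, hypothetical sketch (not Hevo's actual implementation): when a batch touches only one date partition, only that partition's rows need to be read, instead of every row in the table.

```python
def rows_scanned_full(table):
    # Older behavior: scan the whole table to locate the impacted rows.
    return sum(len(rows) for rows in table.values())

def rows_scanned_pruned(table, batch_dates):
    # New behavior: read only the partitions the incoming batch touches.
    return sum(len(table[d]) for d in batch_dates if d in table)

# Toy model of a date-partitioned table: partition key -> rows.
table = {
    "2023-08-01": [1, 2, 3],
    "2023-08-02": [4, 5],
    "2023-08-03": [6],
}

print(rows_scanned_full(table))                      # 6 rows read
print(rows_scanned_pruned(table, {"2023-08-03"}))    # 1 row read
```

In BigQuery, where query cost is billed by bytes read, restricting a load query to the affected partitions translates directly into lower cost.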

Fixes and Improvements


  • Support for ECDSA-Generated Public Key for SSH Connections

    • Added support for public keys generated using the Elliptic Curve Digital Signature Algorithm (ECDSA). With this addition, you can use our ECDSA public key while connecting to a Destination or Source through SSH. ECDSA keys are smaller than RSA keys of comparable strength, improving the performance of your SSH connection without compromising its security.

      Read Connecting Through SSH.


  • Additional Validations while Configuring MySQL Sources

    • Enhanced the connectivity checker to validate additional prerequisites and permissions required to successfully set up a MySQL Source with the BinLog ingestion mode. This prevents errors that may arise due to insufficient permissions after the Pipeline is created.

      This change is applicable to MySQL Pipelines (all variants) created after Release 2.16 and having BinLog as the ingestion mode.
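The spirit of such a pre-flight check can be sketched as follows. This is a hypothetical illustration, not Hevo's connectivity checker; the grant names shown (SELECT, REPLICATION CLIENT, REPLICATION SLAVE) are the standard MySQL privileges BinLog-based replication typically requires.

```python
# Privileges typically needed for BinLog-based replication (illustrative set).
REQUIRED_GRANTS = {"SELECT", "REPLICATION CLIENT", "REPLICATION SLAVE"}

def missing_grants(granted):
    """Return the required grants the connecting user does not yet hold."""
    return sorted(REQUIRED_GRANTS - set(granted))

print(missing_grants({"SELECT", "REPLICATION CLIENT"}))  # ['REPLICATION SLAVE']
print(missing_grants(REQUIRED_GRANTS))                   # []
```

Surfacing the missing grants at configuration time, rather than after Pipeline creation, is what prevents the downstream permission errors described above.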

  • Handling of Change Streams Response Documents Larger than 16 MB in MongoDB

    • Enhanced the MongoDB integration to handle files larger than 16 MB and read the entire Change Streams response document during ingestion.

      This change is applicable to MongoDB Pipelines (all variants) created after Release 2.16.
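MongoDB caps individual BSON documents at 16 MB, so a Change Streams response larger than that must be handled in fragments and reassembled before the event can be ingested whole. The sketch below is a simplified, hypothetical illustration of that reassembly; it is not Hevo's implementation.

```python
BSON_LIMIT = 16 * 1024 * 1024  # MongoDB's 16 MB document size limit

def split_event(payload, limit=BSON_LIMIT):
    """Split an oversized change event payload into limit-sized fragments."""
    return [payload[i:i + limit] for i in range(0, len(payload), limit)]

def reassemble(fragments):
    """Join every fragment so the entire response document is ingested."""
    return b"".join(fragments)

event = b"x" * (BSON_LIMIT + 1024)   # a payload just over 16 MB
fragments = split_event(event)
print(len(fragments))                # 2
print(reassemble(fragments) == event)  # True
```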

  • Optimizing Query Execution in MongoDB

    • Fixed an issue whereby the query executed by Hevo caused irregularities in capturing the offset, leading to a data mismatch between the Source and the Destination.

      This change does not impact any new or existing Pipelines.

  • Ingestion of Geometry Data Types from MySQL Sources

    • Fixed an issue whereby some Events in Geometry formats (such as Point, LineString, Polygon, MultiPoint, MultiLineString, MultiPolygon, and GeometryCollection) were not ingested from MySQL Sources, leading to Pipeline failure.

      However, if you are loading any data in the Point format to a BigQuery Destination, the values must lie between -90 and +90, as that is the range BigQuery supports. If your data lies outside this range, contact Hevo Support.
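A minimal pre-load check for the range constraint above could look like the following sketch. It is hypothetical, not part of Hevo; it simply flags Point coordinate pairs with a value outside the [-90, +90] range stated in the note.

```python
def needs_support_review(point):
    """Flag a Point whose coordinates fall outside BigQuery's [-90, +90] range.

    `point` is assumed to be an (x, y) coordinate pair taken from a MySQL
    Geometry column; this helper name is illustrative only.
    """
    return any(not (-90.0 <= coord <= 90.0) for coord in point)

print(needs_support_review((12.97, 77.59)))   # False: within range
print(needs_support_review((100.0, 45.0)))    # True: 100.0 is out of range
```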

  • Move to Version 16.0 of the Facebook Ads Marketing API

    • Effective September 20, 2023, Facebook Ads will deprecate v15.0 of the Marketing API. As a result, starting Release 2.15.3, Hevo replicates data from your Facebook Ads account using Marketing API v16.0.

      This change does not impact any new or existing Pipelines.

  • Support for Additional Data Types as Primary Keys in MySQL

    • Enhanced the integration to support tables having decimal, date, datetime, or timestamp data types as primary keys in MySQL Pipelines. Previously, such tables had to be skipped for ingestion to succeed.

      This change is applicable to MySQL Pipelines (all variants) created after Release 2.16 and having BinLog as the ingestion mode.

Documentation Updates

The following pages have been created, enhanced, or removed in Release 2.16:

Account Management

Data Ingestion


Getting Started



