Azure MySQL

Last updated on Jun 30, 2025

Azure MySQL Database is a fully managed database service from Microsoft that is easy to set up, operate, and scale. It automates database management and maintenance, including routine updates, backups, and security, enabling you to focus on working with your data.

You can ingest data from your Azure MySQL database using Hevo Pipelines and replicate it to a Destination of your choice.


Source Considerations

  • MySQL does not generate log entries for cascading deletes, so Hevo cannot capture these deletes in log-based Pipelines.

  • If your Pipeline uses BinLog ingestion mode, MySQL replicates timestamp fields such as created_at and updated_at in Coordinated Universal Time (UTC). As BinLogs do not include time zone metadata, Hevo and the Destination interpret these values in UTC. Due to this, you may observe a time difference if the Source database uses a different timezone. For example, if the Source timezone is US Eastern Time (UTC-4) and the timestamp for created_at in MySQL is 2024-05-01 10:00:00, it appears as 2024-05-01 14:00:00 in BigQuery. This behavior applies only to incremental loads via BinLogs. Any data replicated using historical load retains the original timestamp values from the Source.

    As a workaround, you can adjust the UTC timestamps to your local timezone using Python code-based Transformations that add or subtract the appropriate offset to or from the timestamp fields. For example, to convert UTC to UTC+7, add a 7-hour offset to the relevant fields before the data is loaded into the Destination.
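    The following is a minimal sketch of such a Transformation. It assumes Hevo's Python code-based Transformation interface, in which a transform(event) method receives each Event and exposes its fields through event.getProperties(); the field names created_at and updated_at, the UTC+7 offset, and the timestamp string format are illustrative, so adapt them to your schema and refer to the Transformations documentation for the exact interface.

    from datetime import datetime, timedelta

    # Illustrative: shift UTC timestamps to UTC+7 before they reach the Destination.
    OFFSET = timedelta(hours=7)
    TIMESTAMP_FIELDS = ['created_at', 'updated_at']  # adjust to your schema
    TIMESTAMP_FORMAT = '%Y-%m-%d %H:%M:%S'

    def transform(event):
        properties = event.getProperties()
        for field in TIMESTAMP_FIELDS:
            value = properties.get(field)
            if value:
                # Assumes the BinLog timestamp arrives as a string such as '2024-05-01 10:00:00' (UTC).
                parsed = datetime.strptime(value, TIMESTAMP_FORMAT)
                properties[field] = (parsed + OFFSET).strftime(TIMESTAMP_FORMAT)
        return event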

Limitations

  • In MySQL versions 8.0.4 and higher, the default_authentication_plugin system variable is set to caching_sha2_password by default, so connecting database users are authenticated with that plugin. Hevo does not currently support this plugin, and you may see the Public Key Retrieval is not allowed error. In that case, change the authentication plugin for the user to mysql_native_password.

    To do this, connect to your MySQL server as a root user and run the following command:

    ALTER USER '<database_username>'@'%' IDENTIFIED WITH mysql_native_password BY '<password>';
    

    Note: Replace the placeholder values in the command above with your own. For example, <database_username> with hevouser.

  • Currently, Hevo does not support transaction-based replication using Global Transaction Identifiers (GTIDs).

  • Logging in using SSL is not supported. SSL is enabled by default on your Azure MySQL server. You can disable it as follows:

    1. In the left navigation pane, under Settings, click Server Parameters.

    2. Under the Top tab, update the value of the require_secure_transport server parameter to OFF, and then click Save.

      Disable SSL

  • Hevo only fetches tables from the MySQL database. It does not fetch other entities such as functions, stored procedures, views, and triggers.

    To fetch views, you can create individual Pipelines in Custom SQL mode. However, some limitations may arise based on the type of data synchronization, the query mode, or the number of Events. Contact Hevo Support for more details.

  • During the historical load, Hevo reads table definitions directly from the MySQL database schema, whereas for incremental updates, Hevo reads from the BinLog. As a result, certain fields, such as nested JSON, are parsed differently during historical and incremental loads. In the Destination tables, nested JSON fields are parsed as a struct or JSON during historical loads, but as a string during incremental loads. This leads to a data type mismatch between the Source and Destination data, causing Events to be sidelined.

    To ensure JSON fields are parsed consistently across historical and incremental loads, you can apply Transformations to every table containing nested JSON fields, as sketched below. Contact Hevo Support for more details.
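    The following is a minimal sketch of one such Transformation. It serializes nested JSON fields to strings so that historically loaded rows match the string representation produced by BinLog-based incremental loads; the field names are illustrative, and the transform(event) interface is assumed to be Hevo's Python code-based Transformation method.

    import json

    # Illustrative: nested JSON fields that should be stored as strings in the Destination.
    JSON_FIELDS = ['shipping_address', 'line_items']

    def transform(event):
        properties = event.getProperties()
        for field in JSON_FIELDS:
            value = properties.get(field)
            # Historical loads may surface the field as a parsed dict or list;
            # serialize it so it matches the string form used by incremental loads.
            if isinstance(value, (dict, list)):
                properties[field] = json.dumps(value)
        return event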

  • Hevo Pipelines may fail to process transactions in the BinLog if the size of the transaction exceeds 4 GB. This problem is due to a MySQL bug that affects the library Hevo uses to stream Events, resulting in ingestion failures. In such cases, Hevo attempts to restart the ingestion process from the beginning of the transaction, skipping already processed Events. If the problem persists and BinLog processing remains stuck, contact Hevo Support for assistance.

  • Hevo does not load an Event into the Destination table if its size exceeds 128 MB, which may lead to discrepancies between your Source and Destination data. To avoid such a scenario, ensure that each row in your Source objects contains less than 100 MB of data.
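    To check whether any rows approach this limit, you can approximate per-row sizes directly in MySQL. The following is a minimal sketch that uses the mysql-connector-python package; the connection details, table name (orders), and column names (id, payload, details) are illustrative, so replace them with your own.

    import mysql.connector

    # Illustrative connection details; replace the placeholders with your own values.
    conn = mysql.connector.connect(
        host='<server_name>.mysql.database.azure.com',
        user='<database_username>',
        password='<password>',
        database='<database_name>',
    )
    cursor = conn.cursor()

    # Approximate each row's size by summing the lengths of its large columns.
    cursor.execute("""
        SELECT id,
               COALESCE(LENGTH(payload), 0) + COALESCE(LENGTH(details), 0) AS approx_row_bytes
        FROM orders
        ORDER BY approx_row_bytes DESC
        LIMIT 10
    """)
    for row_id, approx_bytes in cursor.fetchall():
        print(row_id, approx_bytes)

    cursor.close()
    conn.close()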



Revision History

Refer to the following table for the list of key updates made to this page:

Date Release Description of Change
Jun-30-2025 NA Updated section, Source Considerations to add a point about UTC replication of timestamp fields in BinLog mode.
Jun-23-2025 NA - Updated the Create a database user section to segregate the commands based on database version.
- Added a limitation about Hevo not supporting the caching_sha2_password authentication plugin.
May-19-2025 NA Updated section, Limitations to add note about GTID based replication.
Mar-13-2025 NA Updated section, Prerequisites to mention the supported MySQL versions for each ingestion mode.
Jan-20-2025 NA Added a note for Load All Databases in the Pipeline Advanced Settings in the Specify Azure MySQL Connection Settings section.
Jan-07-2025 NA Updated the Limitations section to add information on Event size.
Dec-18-2024 NA Updated section, Limitations to add information about Hevo handling transaction failures in the BinLog due to a MySQL bug affecting transactions exceeding 4GB.
Nov-18-2024 NA Updated sections, Create a Read Replica (Optional), Whitelist Hevo’s IP Addresses, and Limitations as per the latest Azure MySQL UI.
Jul-31-2024 NA Updated section, Limitations to add information about Hevo reading table definitions differently during historical and incremental loads.
Apr-29-2024 NA Updated section, Specify Azure MySQL Connection Settings to include more detailed steps.
Mar-18-2024 2.21.2 Updated section, Specify Azure MySQL Connection Settings to add information about the Load all CA certificates option.
Mar-05-2024 2.21 Added the Data Replication section.
Nov-03-2023 NA Renamed section, Object Settings to Object and Query Mode Settings.
Oct-27-2023 NA Updated section, Create a Database User and Grant Privileges with the latest steps.
Jun-26-2023 NA Added section, Source Considerations.
Apr-21-2023 NA Updated section, Specify Azure MySQL Connection Settings to add a note to inform users that all loaded Events are billable for Custom SQL mode-based Pipelines.
Mar-09-2023 2.09 Updated section, Specify Azure MySQL Connection Settings to mention about SEE MORE in the Select an Ingestion Mode section.
Dec-19-2022 2.04 Updated section, Specify Azure MySQL Connection Settings to add information that you must specify all fields to create a Pipeline.
Dec-07-2022 2.03 Updated section, Specify Azure MySQL Connection Settings to mention about including skipped objects post-Pipeline creation.
Dec-07-2022 2.03 Updated section, Specify Azure MySQL Connection Settings to mention about the connectivity checker.
Oct-13-2022 1.99 Updated section, Specify Azure MySQL Connection Settings to reflect the latest UI changes.
Apr-21-2022 1.86 Updated section, Specify Azure MySQL Connection Settings.
Aug-09-2021 NA Added a note in the Grant privileges to a user step.
Jul-26-2021 1.68 Added a note for the Database Host field.
Jul-12-2021 NA Added section, Specify Azure MySQL Connection Settings.
Feb-22-2021 1.57 Updated the Create a Read Replica section to provide UI-based steps.
