Apache Kafka

Last updated on Jul 03, 2025

Apache Kafka is an open-source, community-developed distributed event streaming platform.

You can use Hevo Pipelines to replicate data from your Apache Kafka Source to your Destination system.


Limitations

  • Hevo supports only SSL/TLS-encrypted and plaintext connections to Apache Kafka.

  • Hevo supports only the JSON data format in Apache Kafka.

  • Hevo does not load a column's data into the Destination table if the column exceeds 16 MB, and skips the Event entirely if it exceeds 40 MB. If an Event contains a column larger than 16 MB, Hevo attempts to load the Event after dropping that column's data; if the Event still exceeds 40 MB, the Event is dropped as well. As a result, you may see discrepancies between your Source and Destination data. To avoid this, ensure that each Event contains less than 40 MB of data and that no single column exceeds 16 MB (see the sketch after this list for a producer-side check).
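The sketch below (Python, using the kafka-python client) illustrates how these limitations interact on the producer side: it connects over SSL, serializes Events as JSON, and checks the 16 MB column and 40 MB Event limits before sending. The broker address, topic name, certificate path, and the send_event helper are illustrative placeholders and not part of Hevo's configuration; exactly how Hevo measures Event size may differ, so treat the byte checks as an approximation.

```python
import json
from kafka import KafkaProducer  # kafka-python client; any JSON-capable producer works

# Limits described in the bullet above (assumption: measured on the serialized JSON payload).
MAX_FIELD_BYTES = 16 * 1024 * 1024   # 16 MB per column/field
MAX_EVENT_BYTES = 40 * 1024 * 1024   # 40 MB per Event

# Placeholder broker, certificate path, and topic -- replace with your own values.
producer = KafkaProducer(
    bootstrap_servers="kafka.example.com:9093",
    security_protocol="SSL",                      # Hevo supports SSL/TLS or plaintext
    ssl_cafile="/path/to/ca.pem",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def send_event(topic: str, event: dict) -> None:
    """Send a JSON Event, flagging payloads that would hit Hevo's size limits."""
    payload = json.dumps(event).encode("utf-8")
    if len(payload) > MAX_EVENT_BYTES:
        raise ValueError("Event exceeds 40 MB and would be skipped by Hevo")
    for key, value in event.items():
        if len(json.dumps(value).encode("utf-8")) > MAX_FIELD_BYTES:
            print(f"Warning: field '{key}' exceeds 16 MB; Hevo would drop its data")
    producer.send(topic, event)

send_event("orders", {"order_id": 1, "status": "shipped"})
producer.flush()
```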


Revision History

Refer to the following table for the list of key updates made to this page:

Date | Release | Description of Change
Jul-07-2025 | NA | Updated the Limitations section to inform about the maximum record and column size in an Event.
Jan-07-2025 | NA | Updated the Limitations section to add information on Event size.
Mar-18-2024 | 2.21.2 | Updated section, Configure Apache Kafka Connection Settings to add information about the Load all CA certificates option.
Mar-05-2024 | 2.21 | Updated the ingestion frequency table in the Data Replication section.
Dec-07-2022 | NA | Updated section, Data Replication to reorganize the content for better understanding and coherence.
Sep-19-2022 | NA | Added section, Data Replication.
