Kafka Confluent Cloud

Last updated on Jul 03, 2025

Kafka Confluent Cloud is a fully managed event streaming platform powered by Apache Kafka.

You can use Hevo Pipelines to replicate data from your Kafka Confluent Cloud account to the Destination system.


Limitations

  • Hevo supports only the SASL_SSL security protocol for connecting to Kafka Confluent Cloud.

  • Hevo supports only the JSON data format in Kafka Confluent Cloud.

  • Hevo does not load a column’s data into the Destination table if that column exceeds 16 MB, and it skips any Event larger than 40 MB. If an Event contains a column larger than 16 MB, Hevo drops that column’s data and attempts to load the rest of the Event; if the Event still exceeds 40 MB, the entire Event is dropped. As a result, you may see discrepancies between your Source and Destination data. To avoid this, ensure that each column is smaller than 16 MB and each Event is smaller than 40 MB.
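Because Hevo accepts only SASL_SSL connections and JSON data, a client reading from the same Confluent Cloud cluster would typically be configured as sketched below. This is a minimal illustration using the standard librdkafka-style configuration keys accepted by the confluent-kafka Python client; the bootstrap server, API key, API secret, group ID, and topic name are all placeholders, not values from this page.

```python
# Minimal consumer-configuration sketch for Confluent Cloud.
# The keys are standard librdkafka/confluent-kafka settings;
# every value below is a placeholder assumption, not from this page.
consumer_config = {
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",  # placeholder
    "security.protocol": "SASL_SSL",   # the only protocol Hevo supports
    "sasl.mechanisms": "PLAIN",        # Confluent Cloud API keys use PLAIN
    "sasl.username": "<API_KEY>",      # placeholder credential
    "sasl.password": "<API_SECRET>",   # placeholder credential
    "group.id": "hevo-demo-group",     # hypothetical consumer group
    "auto.offset.reset": "earliest",
}

# With the confluent-kafka package installed, the config would be used as:
# from confluent_kafka import Consumer
# consumer = Consumer(consumer_config)
# consumer.subscribe(["my_topic"])     # topic name is a placeholder
```

The same `security.protocol` and SASL settings apply regardless of client library, since they are enforced by the Confluent Cloud cluster itself.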
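The size-limit behavior described above can be sketched as a small preprocessing step: columns over 16 MB are dropped first, and the Event is skipped entirely if it still exceeds 40 MB. This is an illustrative model of the documented behavior, not Hevo's actual implementation; the function name and the use of JSON-serialized length as the size measure are assumptions.

```python
import json
from typing import Optional

# Thresholds taken from the Limitations section above.
MAX_COLUMN_BYTES = 16 * 1024 * 1024  # 16 MB per column
MAX_EVENT_BYTES = 40 * 1024 * 1024   # 40 MB per Event

def preprocess_event(event: dict) -> Optional[dict]:
    """Illustrative sketch (not Hevo's code): drop any column whose
    serialized value exceeds 16 MB, then drop the whole Event if it
    still exceeds 40 MB. Returns the trimmed Event, or None if skipped."""
    trimmed = {
        column: value
        for column, value in event.items()
        if len(json.dumps(value).encode("utf-8")) <= MAX_COLUMN_BYTES
    }
    if len(json.dumps(trimmed).encode("utf-8")) > MAX_EVENT_BYTES:
        return None  # Event exceeds 40 MB even after trimming; it is skipped
    return trimmed
```

For example, an Event carrying a 17 MB text column would arrive at the Destination with that column's data missing, which is the Source/Destination discrepancy the limitation warns about.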


Revision History

Refer to the following table for the list of key updates made to this page:

Date         Release  Description of Change
Jul-07-2025  NA       Updated the Limitations section to inform about the maximum record and column sizes in an Event.
Jan-07-2025  NA       Updated the Limitations section to add information on Event size.
Mar-05-2024  2.21     Updated the ingestion frequency table in the Data Replication section.
Oct-03-2023  NA       Updated the page as per the latest Confluent Cloud user interface (UI).
Dec-07-2022  NA       Updated the Data Replication section to reorganize the content for better understanding and coherence.
Sep-19-2022  NA       Added the Data Replication section.
