Kafka Confluent Cloud

Last updated on Jan 07, 2025

Kafka Confluent Cloud is a fully managed event streaming platform powered by Apache Kafka.

You can use Hevo Pipelines to replicate data from your Kafka Confluent Cloud account to the Destination system.


Limitations

  • Hevo supports only SASL_SSL-encrypted data in Kafka Confluent Cloud.

  • Hevo supports only the JSON data format in Kafka Confluent Cloud. A sample producer configuration that illustrates this and the SASL_SSL requirement is shown after this list.

  • Hevo does not load an Event into the Destination table if its size exceeds 128 MB, which may lead to discrepancies between your Source and Destination data. To avoid such a scenario, ensure that each row in your Source objects contains less than 100 MB of data.
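
For reference, a producer that satisfies the first two limitations, SASL_SSL transport and JSON-encoded Event values, might look like the following sketch. It uses the confluent-kafka Python client; the cluster endpoint, API key and secret, and topic name are placeholder values and not settings prescribed by Hevo.

    # Sketch: write a SASL_SSL-encrypted, JSON-formatted Event to Confluent Cloud.
    # All credentials and names below are placeholders.
    import json

    from confluent_kafka import Producer

    producer = Producer({
        "bootstrap.servers": "<your-cluster>.confluent.cloud:9092",  # placeholder endpoint
        "security.protocol": "SASL_SSL",          # Hevo reads only SASL_SSL-encrypted data
        "sasl.mechanisms": "PLAIN",               # Confluent Cloud API keys use the PLAIN mechanism
        "sasl.username": "<CLUSTER_API_KEY>",     # placeholder API key
        "sasl.password": "<CLUSTER_API_SECRET>",  # placeholder API secret
    })

    # Serialize the Event as JSON, the only format Hevo ingests from this Source.
    event = {"order_id": 1042, "status": "shipped"}
    producer.produce("orders", value=json.dumps(event).encode("utf-8"))  # "orders" is a placeholder topic
    producer.flush()

Keeping each serialized Event well within the 100 MB guideline above ensures it is not skipped by the 128 MB limit during loading.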


Revision History

Refer to the following table for the list of key updates made to this page:

Date          Release  Description of Change
Jan-07-2025   NA       Updated the Limitations section to add information on Event size.
Mar-05-2024   2.21     Updated the ingestion frequency table in the Data Replication section.
Oct-03-2023   NA       Updated the page as per the latest Confluent Cloud user interface (UI).
Dec-07-2022   NA       Updated the Data Replication section to reorganize the content for better understanding and coherence.
Sep-19-2022   NA       Added the Data Replication section.
