Kafka Confluent Cloud
Kafka Confluent Cloud is a fully managed event streaming platform powered by Apache Kafka.
You can use Hevo Pipelines to replicate data from your Kafka Confluent Cloud account to the Destination system.
Prerequisites
- An active Kafka Confluent Cloud account from which data is to be ingested.
- One or more bootstrap servers available in Kafka Confluent Cloud.
- The API key and API secret available in Kafka Confluent Cloud.
Perform the following steps to configure your Kafka Confluent Cloud Source:
Locate the Bootstrap Server Information
To locate the bootstrap server information:
- Log in to your Kafka Confluent Cloud account.
- In the Clusters tab, select your cluster.
- In the left navigation pane, click Cluster settings.
- In the General tab, under Identification, locate the bootstrap server value.
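The bootstrap server value you copy here is a comma-separated list of host:port entries. As a quick illustration (the hostname below is hypothetical, not a real cluster), a small Python sketch that splits such a value into host and port pairs:

```python
# Split a Confluent Cloud bootstrap server string into (host, port) pairs.
# The server address below is a placeholder; substitute the value from
# your cluster's settings page.
def parse_bootstrap_servers(servers: str) -> list:
    """Parse a comma-separated 'host:port' list into (host, port) tuples."""
    pairs = []
    for entry in servers.split(","):
        host, _, port = entry.strip().rpartition(":")
        pairs.append((host, int(port)))
    return pairs

parsed = parse_bootstrap_servers("pkc-12345.us-east-1.aws.confluent.cloud:9092")
print(parsed)  # → [('pkc-12345.us-east-1.aws.confluent.cloud', 9092)]
```
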
Create the API Key and the API Secret
To create the API key and the API secret:
- Log in to your Kafka Confluent Cloud account.
- In the Clusters tab, click your cluster name.
- In the left navigation pane, click API access, and then click + Add key.
- In the Access control page, select Granular access, and then click Next.
- In the Service account page, click the Create a new one tab, and then specify the following to create a service account. This is the account for which limited permissions (ACLs) can be defined.
  - New service account name: A unique name for your service account.
  - Description: A brief description of your service account.
- Click Save and Next.
- In the Add ACLs to service account page, specify the following Access Control Lists (ACLs). ACLs are required to assign permissions for each category of API key. Note: You need to click + Add ACLs after specifying each ACL.
  - Cluster ACLs

    | ACL Category | Operation | Permission |
    | --- | --- | --- |
    | Cluster | DESCRIBE | ALLOW |

  - Consumer Group ACLs

    | ACL Category | Consumer Group ID | Pattern Type | Operation | Permission |
    | --- | --- | --- | --- | --- |
    | Consumer Group | hevo-integration | PREFIXED | DESCRIBE | ALLOW |
    | Consumer Group | hevo-integration | PREFIXED | READ | ALLOW |

  - Topic ACLs

    | ACL Category | Topic Name | Pattern Type | Operation | Permission |
    | --- | --- | --- | --- | --- |
    | Topic | * | LITERAL or PREFIXED. If the Topic Name specified is *, select LITERAL. | READ | ALLOW |
    | Topic | * | LITERAL or PREFIXED. If the Topic Name specified is *, select LITERAL. | DESCRIBE_CONFIGS | ALLOW |

    Here, the Topic Name is the topic for which you grant permission to Hevo. By default (*), all topics can be accessed.
- Click Save and Next.
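For reference, the ACL set from the tables above can be mirrored as plain data. The sketch below is illustrative only (it is not a Confluent Cloud API call); it simply restates the grants so they can be checked programmatically:

```python
# Illustrative, data-only restatement of the ACLs granted above.
# This is not a Confluent Cloud API; it mirrors the tables for reference.
acls = [
    {"category": "Cluster", "operation": "DESCRIBE", "permission": "ALLOW"},
    {"category": "Consumer Group", "id": "hevo-integration",
     "pattern": "PREFIXED", "operation": "DESCRIBE", "permission": "ALLOW"},
    {"category": "Consumer Group", "id": "hevo-integration",
     "pattern": "PREFIXED", "operation": "READ", "permission": "ALLOW"},
    {"category": "Topic", "name": "*", "pattern": "LITERAL",
     "operation": "READ", "permission": "ALLOW"},
    {"category": "Topic", "name": "*", "pattern": "LITERAL",
     "operation": "DESCRIBE_CONFIGS", "permission": "ALLOW"},
]

def operations_for(category: str) -> set:
    """Collect the operations granted for a given ACL category."""
    return {a["operation"] for a in acls if a["category"] == category}

print(operations_for("Topic"))  # → {'READ', 'DESCRIBE_CONFIGS'}
```
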
- In the Get your API key page, copy the API key and the API secret, and save them in a secure location.
  Note: Once you exit this screen, you cannot view the API key and the API secret again.
- Select the checkbox, and then click Save.
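With the bootstrap server, API key, and API secret in hand, any standard Kafka client can reach the same cluster that Hevo will connect to. A minimal sketch of a librdkafka-style configuration, with placeholder values (the `sasl.mechanisms` spelling follows librdkafka; the Java client uses `sasl.mechanism`):

```python
# Sketch of a librdkafka-style client configuration for Confluent Cloud.
# All values are placeholders; the keys pair the API key/secret with the
# SASL_SSL security protocol that Confluent Cloud requires.
conf = {
    "bootstrap.servers": "pkc-12345.us-east-1.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",   # encrypted, authenticated transport
    "sasl.mechanisms": "PLAIN",        # API key/secret use the PLAIN mechanism
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
}
print(conf["security.protocol"])  # → SASL_SSL
```
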
Configure Kafka Confluent Cloud Connection Settings
Perform the following steps to configure Kafka Confluent Cloud as a Source in your Pipeline:
- Click PIPELINES in the Asset Palette.
- Click + CREATE in the Pipelines List View.
- In the Select Source Type page, select Kafka Confluent Cloud.
- In the Configure your Kafka Confluent Cloud Source page, specify the following:
  - Pipeline Name: A unique name for your Pipeline, not exceeding 255 characters.
  - Bootstrap Server(s): The bootstrap server(s) that you located in Kafka Confluent Cloud.
  - API Key: The API key for your Kafka Confluent Cloud account.
  - API Secret: The API secret for your Kafka Confluent Cloud account.
  - Ingest Data From: One of the following:
    - All Topics: Select this option to ingest data from all topics. Any new topics that are created are automatically included.
    - Specific Topics: Select this option to manually specify a comma-separated list of topics. With this option, new topics are not automatically added.
    - Topics Matching a Pattern (Regex): Select this option to specify a regular expression (regex) to match and select topic names. This option also dynamically fetches data for new topics that match the pattern.
  - Additional SSL Settings (Optional): Enable this option if you are using a custom Certificate Authority (CA).
    - CA File: The file containing the CA certificate of the SSL server.
- Click TEST & CONTINUE.
- Proceed to configuring the data ingestion and setting up the Destination.
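For the Topics Matching a Pattern (Regex) option, the expression is matched against topic names, and newly created topics that match are picked up as well. A small Python sketch of how such matching behaves (the topic names and pattern are hypothetical):

```python
import re

# Hypothetical topic names; the pattern selects every topic whose name
# starts with "orders-". Topics created later that match would also qualify.
topics = ["orders-us", "orders-eu", "payments", "orders-archive"]
pattern = re.compile(r"orders-.*")

matched = [t for t in topics if pattern.fullmatch(t)]
print(matched)  # → ['orders-us', 'orders-eu', 'orders-archive']
```
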
Limitations
- Hevo supports only SASL_SSL-encrypted connections to Kafka Confluent Cloud.
- Hevo supports only the JSON data format for data in Kafka Confluent Cloud.
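Because only JSON is supported, message values should be serialized to JSON before being produced to a topic that Hevo ingests. A minimal sketch (the record and field names are illustrative):

```python
import json

# Illustrative record. Since Hevo ingests only JSON-formatted message
# values, serialize the record to JSON bytes before producing it.
record = {"order_id": 42, "status": "shipped"}
payload = json.dumps(record).encode("utf-8")

# A Kafka producer would send `payload` as the message value.
# Round-trip check that the bytes decode back to the same record:
assert json.loads(payload) == record
print(payload)  # → b'{"order_id": 42, "status": "shipped"}'
```
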