Kafka Confluent Cloud

Kafka Confluent Cloud is a fully managed event streaming platform powered by Apache Kafka.

You can use Hevo Pipelines to replicate data from your Kafka Confluent Cloud account to the Destination system.


Prerequisites

Perform the following steps to obtain the details required to configure your Kafka Confluent Cloud Source:

Locate the Bootstrap Server Information

To locate the bootstrap server information:

  1. Log in to your Kafka Confluent Cloud account.

  2. In the Clusters tab, select your cluster.

  3. In the left navigation pane, click Cluster settings.

  4. Locate the bootstrap server value in the Identification section under the General tab.

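The bootstrap server value is a host:port endpoint for your cluster. The example below is a hypothetical placeholder showing the typical format; use the actual value displayed on the Cluster settings page.

```python
# Hypothetical example of a Confluent Cloud bootstrap server value (host:port).
# Replace it with the value shown under Cluster settings > General > Identification.
BOOTSTRAP_SERVERS = "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092"
```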

Create the API Key and the API Secret

To create the API key and the API secret:

  1. Log in to your Kafka Confluent Cloud account.

  2. In the Clusters tab, click on your cluster name.

  3. In the left navigation pane, click API access, and then, click + Add key.

  4. In the Access control page, select Granular access, and then, click Next.

  5. In the Service account page, click the Create a new one tab, and then, specify the following to create a service account. This is the account for which limited permissions (ACLs) are defined.

    • New service account name: A unique name for your service account.

    • Description: A brief description of your service account.

  6. Click Save and Next.

  7. ACLs are required to assign permissions to the API key for each resource category.
    In the Add ACLs to service account page, specify the following Access Control Lists (ACLs):

    Note: You need to click + Add ACLs after specifying each ACL.

    • Cluster ACLs

      ACL Category     Operation          Permission
      Cluster          DESCRIBE           ALLOW

    • Consumer Group ACLs

      ACL Category     Consumer Group ID   Pattern Type   Operation   Permission
      Consumer Group   hevo-integration    PREFIXED       DESCRIBE    ALLOW
      Consumer Group   hevo-integration    PREFIXED       READ        ALLOW

    • Topic ACLs

      ACL Category     Topic Name   Pattern Type   Operation          Permission
      Topic            *            LITERAL        READ               ALLOW
      Topic            *            LITERAL        DESCRIBE_CONFIGS   ALLOW

      Topic Name: The topic for which you grant permission to Hevo. The default value, *, grants access to all topics.
      Pattern Type: LITERAL or PREFIXED. If the Topic Name specified is *, select LITERAL.
  8. Click Save and Next.

  9. In the Get your API key page, copy the API key and the API secret and save them in your preferred safe place.

    Note: Once you exit this screen, you cannot view this API key and API secret again.

  10. Select the checkbox, and then, click Save.
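
Before configuring the Pipeline, you can optionally verify that the API key, API secret, and ACLs work by connecting a test consumer over SASL_SSL. The following is a minimal sketch using the confluent-kafka Python client; the bootstrap server, topic name, and group ID are hypothetical placeholders, and this check is not part of Hevo's own procedure.

```python
# Minimal sketch: verify Confluent Cloud credentials and ACLs with a test consumer.
# Requires: pip install confluent-kafka
from confluent_kafka import Consumer

conf = {
    # Hypothetical endpoint; use the value from Cluster settings > General > Identification.
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",   # Hevo supports only SASL_SSL (see Limitations)
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
    # Matches the PREFIXED Consumer Group ACL (hevo-integration) created above.
    "group.id": "hevo-integration-test",
    "auto.offset.reset": "earliest",
}

consumer = Consumer(conf)
consumer.subscribe(["<YOUR_TOPIC>"])   # A topic covered by the READ ACL

msg = consumer.poll(10.0)              # Returns None if no message arrives within the timeout
if msg is None:
    print("Connected, but no messages were available within the timeout.")
elif msg.error():
    print(f"Consumer error: {msg.error()}")
else:
    print(f"Received: {msg.value().decode('utf-8')}")

consumer.close()
```

If the DESCRIBE and READ ACLs are missing, the poll typically surfaces an authorization error, which indicates that the ACLs from step 7 were not applied to the service account.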


Configure Kafka Confluent Cloud Connection Settings

Perform the following steps to configure Kafka Confluent Cloud as a Source in your Pipeline:

  1. Click PIPELINES in the Asset Palette.

  2. Click + CREATE in the Pipelines List View.

  3. In the Select Source Type page, select Kafka Confluent Cloud.

  4. In the Configure your Kafka Confluent Cloud Source page, specify the following:

    • Pipeline Name: A unique name for your Pipeline, not exceeding 255 characters.

    • Bootstrap Server(s): The bootstrap server(s) extracted from Kafka Confluent Cloud.

    • API Key: The API key for your Kafka Confluent Cloud account.

    • API Secret: The API secret for your Kafka Confluent Cloud account.

    • Ingest Data From

      • All Topics: Select this option to ingest data from all topics. Any new topics that are created are automatically included.

      • Specific Topics: Select this option to manually specify a comma-separated list of topics. With this option, new topics are not automatically included.

      • Topics Matching a Pattern (Regex): Select this option to specify a regular expression (regex) that matches the topic names to ingest. Data from new topics that match the pattern is also fetched dynamically. Test your regex pattern to confirm it selects the intended topics (see the example after these steps).

    • Additional SSL Settings: (Optional) Enable this option if you are using a custom Certificate Authority (CA).

      • CA File: The file containing the CA certificate of the SSL server.
  5. Click TEST & CONTINUE.

  6. Proceed to configuring the data ingestion and setting up the Destination.
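
If you select Topics Matching a Pattern (Regex), the pattern is matched against topic names to decide which topics to ingest. The exact regex flavor Hevo applies is not specified in this document; the sketch below uses Python's re module purely to illustrate how a hypothetical pattern selects hypothetical topic names.

```python
# Illustration only: which hypothetical topic names a pattern such as "orders_.*" selects.
import re

pattern = re.compile(r"orders_.*")   # Example regex entered in the Source configuration
topics = ["orders_us", "orders_eu", "payments", "orders_archive_2020"]

selected = [t for t in topics if pattern.fullmatch(t)]
print(selected)  # ['orders_us', 'orders_eu', 'orders_archive_2020']
```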


Limitations

  • Hevo supports only SASL_SSL-encrypted connections for Kafka Confluent Cloud.

  • Hevo supports only the JSON data format for Kafka Confluent Cloud.
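
Because Hevo ingests only JSON, producers writing to the ingested topics should serialize message values as JSON strings. The following is a minimal sketch with the confluent-kafka Python client; the endpoint, credentials, topic name, and payload are hypothetical.

```python
# Minimal sketch: produce a JSON-encoded message so Hevo can parse it.
# Requires: pip install confluent-kafka
import json
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",  # hypothetical
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
})

record = {"order_id": 1001, "status": "shipped"}            # Example payload
producer.produce("<YOUR_TOPIC>", value=json.dumps(record))  # JSON string as the message value
producer.flush()                                            # Block until delivery completes
```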

Last updated on 11 Feb 2021