Intercom App

Intercom enables you to offer near real-time engagement and support to your current and prospective customers through a messaging-like platform. Using Hevo's Intercom connector, you can replicate all your Intercom data to your desired Destination database or data warehouse.

Note: For Pipelines created with this Source, Hevo provides you with a fully managed BigQuery data warehouse Destination if you do not already have one set up. You are charged only the cost that Hevo incurs for your project in Google BigQuery. The invoice is generated at the end of each month, and payment is collected through the payment instrument you have set up. You can create your Pipeline and directly start analyzing your Source data. Read Hevo Managed Google BigQuery.

Prerequisites

  • An active Intercom account.

Configuring Intercom as a Source in your Hevo Pipeline

  1. Click PIPELINES in the Asset Palette, and then, in the Pipelines Overview page on the right, click + CREATE PIPELINE.

  2. In the Select Source Type page, select Intercom as the Source.

  3. In the Configure your Intercom Account page:

    • Click ADD INTERCOM ACCOUNT.

      Add Intercom account

    • In the Intercom Welcome page, sign in to your Intercom account.

    • In the screen that appears, click the drop-down icon and select the workspace that you want Hevo to access. The list of workspace objects for which Hevo captures data is displayed. You can click the expand icon next to each object to see the permissions that Hevo has requested.

      Authorize Hevo

    • Click Authorize access to allow Hevo to access the objects.

  4. In the Configure your Source page, specify a suitable Pipeline Name. The Intercom account you are logging in with is displayed.

  5. Click TEST & CONTINUE.

  6. In the Select Destination page, select the database or data warehouse where the data must be replicated.

  7. Specify the Destination settings and click CONTINUE to create the Pipeline.

Data Replication Jobs

  • Historical Load: The first run of the Pipeline replicates all data that exists for the selected objects to the Destination.

  • Incremental Load: Each run of the Pipeline after the initial one loads only the records changed in the past hour. We recommend using the hourly sync frequency to avoid any loss of data.
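
The incremental logic can be sketched as follows. This is a minimal illustration, not Hevo's actual implementation: it assumes each record carries an `updated_at` UNIX timestamp (the field Intercom exposes on its objects) and filters for records changed within the past hour.

```python
from datetime import datetime, timedelta, timezone

def incremental_records(records, now=None):
    """Return only the records changed in the past hour.

    `records` is a list of dicts, each with an `updated_at` UNIX
    timestamp. Records older than the one-hour cutoff are skipped,
    mirroring the hourly incremental-load window described above.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = (now - timedelta(hours=1)).timestamp()
    return [r for r in records if r["updated_at"] > cutoff]

# Example: two records, only one updated within the last hour.
now = datetime.now(timezone.utc)
records = [
    {"id": "contact_1", "updated_at": (now - timedelta(minutes=30)).timestamp()},
    {"id": "contact_2", "updated_at": (now - timedelta(hours=3)).timestamp()},
]
print([r["id"] for r in incremental_records(records, now)])  # -> ['contact_1']
```

This also shows why the hourly sync matters: a record updated more than an hour before a run falls outside the window, so skipping a sync can miss changes.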

Schema and Primary Key Information

Hevo replicates data for the following objects from your Intercom account. The primary key of the Destination table is the same as that of the ingested Source object.
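
As an illustration of what primary key-based replication implies, the following sketch (assuming a hypothetical `id` primary key, not any specific Intercom object) shows how merging on the key updates existing rows instead of appending duplicates:

```python
def upsert(destination, records, key="id"):
    """Merge records into `destination`, keyed on the primary key.

    Rows with an existing key are overwritten (updated); new keys
    are inserted -- so re-ingesting an object never duplicates it.
    """
    for record in records:
        destination[record[key]] = record
    return destination

table = {}
upsert(table, [{"id": 1, "email": "a@example.com"}])
upsert(table, [{"id": 1, "email": "a@new.com"},
               {"id": 2, "email": "b@example.com"}])
print(len(table))          # -> 2
print(table[1]["email"])   # -> a@new.com
```

Because the Destination table shares the Source object's primary key, repeated loads of the same record converge on a single up-to-date row.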

Data Model

Intercom ER Diagram

Last updated on 20 Oct 2020