Salesforce

Salesforce is a cloud computing Software as a Service (SaaS) company that allows you to use cloud technology to connect more effectively with customers, partners, and potential customers.

For creating Pipelines using this Source, Hevo provides you with a fully managed Google BigQuery data warehouse as a possible Destination. This option remains available until you set up your first BigQuery Destination, irrespective of any other Destinations that you may have. With the managed warehouse, you are charged only the cost that Hevo incurs for your project in Google BigQuery. The invoice is generated at the end of each month, and payment is recovered through the payment instrument you have set up. You can create your Pipeline and directly start analyzing your Source data. Read Hevo Managed Google BigQuery.

Hevo uses Salesforce’s Bulk API to replicate the data from your Salesforce applications to the Destination database or data warehouse. To enable this, you need to authorize Hevo to access data from the relevant Salesforce environment.
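As an illustration of what this access enables, the following is a minimal sketch of a Bulk API query using the open-source simple_salesforce Python library. The credentials and the query are placeholders, and Hevo's own implementation is not public; this only demonstrates the underlying Salesforce API, not Hevo's code.

```python
from simple_salesforce import Salesforce

# Placeholder credentials; replace with your own. A security token is
# required when connecting from outside your org's trusted IP ranges.
sf = Salesforce(
    username="user@example.com",
    password="password",
    security_token="token",
)

# Bulk API query: records are fetched in batches rather than row by row,
# which is what makes large historical loads practical.
records = sf.bulk.Account.query("SELECT Id, Name, CreatedDate FROM Account")
for record in records:
    print(record["Id"], record["Name"])
```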

Salesforce Environments

Salesforce allows businesses to create accounts in multiple environments, such as:

  • Production: This is the environment that holds live customer data and is used to actively run your business. A production org is identified by URLs starting with https://login.salesforce.com.

  • Sandbox: This is a copy of your production organization. You can create multiple sandbox environments for different purposes, such as development and testing. Working in the sandbox eliminates any risk of compromising your production data and applications. A sandbox is identified by URLs starting with https://test.salesforce.com.
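When connecting programmatically, the login domain is the practical difference between the two environments. As a hedged sketch using the simple_salesforce library from the example above (the domain parameter belongs to that library, not to Hevo):

```python
from simple_salesforce import Salesforce

# Production org: authenticates against https://login.salesforce.com (the default).
prod = Salesforce(
    username="user@example.com",
    password="password",
    security_token="token",
)

# Sandbox org: domain="test" authenticates against https://test.salesforce.com.
sandbox = Salesforce(
    username="user@example.com.sandboxname",
    password="password",
    security_token="token",
    domain="test",
)
```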


Prerequisites

  • An active Salesforce production account or sandbox account.

  • History tracking is enabled in Salesforce for the objects whose history you want to ingest.


Configuring Salesforce as a Source

Perform the following steps to configure Salesforce as a Source in your Pipeline:

  1. Click PIPELINES in the Asset Palette.

  2. Click + CREATE in the Pipelines List View.

  3. In the Select Source Type page, select Salesforce.

  4. In the Configure Your Salesforce Account page, click + Add Salesforce Account or + Add Another Account (if an account already exists).

    Add account

  5. Select the environment from which Hevo must ingest the data, and then, click Continue.

    Select environment

  6. Log in to your Salesforce account.

  7. Click Allow to authorize Hevo to access your Salesforce Account.

    Allow access

    You are then redirected to the Configure Your Salesforce Account page.

  8. Specify the Pipeline Name, and then, click CONTINUE.

    Specify Pipeline name

  9. Proceed to configuring the data ingestion and setting up the Destination.


Data Replication

Default Pipeline Frequency   Minimum Pipeline Frequency   Maximum Pipeline Frequency   Custom Frequency Range (Hrs)
3 Hrs                        15 Mins                      24 Hrs                       1-24

Note: The custom frequency must be set in hours, as an integer value. For example, 1, 2, 3 but not 1.5 or 1.75.
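As a minimal illustration of this rule (the helper below is hypothetical, not part of Hevo), a custom frequency is valid only if it is a whole number of hours between 1 and 24:

```python
def is_valid_custom_frequency(hours: float) -> bool:
    """Return True if hours is an integer value in the 1-24 range."""
    return float(hours).is_integer() and 1 <= hours <= 24

assert is_valid_custom_frequency(3)        # allowed
assert not is_valid_custom_frequency(1.5)  # rejected: not an integer
assert not is_valid_custom_frequency(36)   # rejected: outside 1-24
```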

Note: If your Salesforce account does not contain any new Events to be ingested, Hevo defers the data ingestion for a pre-determined time. Hevo re-attempts to fetch the data only after the deferment period elapses. Read Deferred Data Ingestion.

Hevo loads all the objects linked with your Salesforce account.

  • Historical Data: Once you create the Pipeline, all data associated with the Salesforce account is ingested by Hevo and loaded into your Destination.
  • Incremental Data: Once the historical load is completed, all new and updated records are synchronized with your Destination.
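A common way to express such incremental reads against Salesforce, shown here purely as a hedged sketch, is to filter on the standard SystemModstamp audit field, which Salesforce updates whenever a record is created or modified. The cursor handling below is hypothetical and is not a description of Hevo's internal logic:

```python
from simple_salesforce import Salesforce

sf = Salesforce(
    username="user@example.com",
    password="password",
    security_token="token",
)

# Timestamp of the last successful sync, stored by the caller between runs.
last_sync = "2021-10-01T00:00:00Z"

# SystemModstamp is a standard audit field updated on every create/update,
# so records changed after the previous run are picked up on the next one.
# Note: SOQL datetime literals are written without quotes.
soql = (
    "SELECT Id, Name, SystemModstamp FROM Account "
    f"WHERE SystemModstamp > {last_sync} ORDER BY SystemModstamp"
)
for record in sf.query_all_iter(soql):
    print(record["Id"], record["SystemModstamp"])
```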

Source Considerations

  • When an Event is deleted in Salesforce, its ISDELETED column is marked TRUE. However, the deleted Events do not appear in the Salesforce dashboard.
    When Hevo replicates data from your Salesforce applications to the selected Destination, these deleted Events are also replicated. As a result, you might see more Events in your Destination than in the Source.

  • Your organization is allocated a quota of 15000 batches per 24 hours where one batch can contain a maximum of 10000 Events.


Event consumption is calculated across all the Pipelines over a 24-hour period as follows:

  • Number of batches created per Object (X) = Number of Events for the Object/10000.

    Note: This value (X) is rounded up to the next integer.

  • Total number of batches created across all Objects in the Pipeline (Y) = Sum of the number of batches created for each Object (ΣX).

    This (Y) is the number of batches that are submitted in one run of the Pipeline.

    Note: The number of batches submitted may vary in each run of the Pipeline.

  • Number of Pipeline runs in a day (Z) = 24/Ingestion frequency (in hours).

    Therefore, Maximum number of batches that can be submitted in one run of the Pipeline = 15000/Z.

Now, suppose you have two Objects containing 55800 and 25000 Events respectively, and the ingestion frequency = 12 hours.

Number of batches created for Object 1 (X1) = 55800/10000 = 5.58.

Therefore, six batches are created; five with 10000 Events each and the sixth with 5800 Events.

Number of batches created for Object 2 (X2) = 25000/10000 = 2.5.

Therefore, three batches are created; two with 10000 Events each and the third with 5000 Events.

Total number of batches created across all Objects in the Pipeline (Y) = 6+3 = 9.

These 9 batches are submitted in one run of the Pipeline.

As the Pipeline frequency is 12 hours,

Total number of Pipeline runs in 24 hours (Z) = 24/12 = 2.

Therefore, Maximum number of batches that can be submitted in one run of the Pipeline = 15000/2 = 7500.

Here, against the available limit of 7500 batches per Pipeline run, only 9 batches are being submitted.

Similarly, as long as Z x Y <= 15000, you are within the prescribed quota.
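The arithmetic above is straightforward to express in code. The following is a minimal sketch (the helper name is hypothetical, not a Hevo tool) that reproduces the worked example and checks the batch quota:

```python
import math

BATCH_SIZE = 10000         # maximum Events per batch
DAILY_BATCH_QUOTA = 15000  # batches allowed per 24 hours

def batches_per_day(event_counts, ingestion_frequency_hours):
    """Return (Y, Z, daily total) for the given Objects and frequency."""
    # X per Object: Events/10000, rounded up to the next integer.
    y = sum(math.ceil(events / BATCH_SIZE) for events in event_counts)
    # Z: number of Pipeline runs in a day.
    z = 24 / ingestion_frequency_hours
    return y, z, y * z

# Worked example: Objects with 55800 and 25000 Events, 12-hour frequency.
y, z, daily = batches_per_day([55800, 25000], 12)
print(y, z, daily)                 # 9 batches/run, 2 runs, 18 batches/day
assert daily <= DAILY_BATCH_QUOTA  # within the prescribed quota (Z x Y <= 15000)
```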


Schema and Primary Keys

Hevo uses the following schema to upload the records in the Destination:


Data Model

Hevo uses the following data model to ingest data from your Salesforce account:

Object                              Description
Account                             Represents information about the company or business user.
Campaign                            Represents campaigns and tracks their efficiency with cost, revenue, and converted leads analysis.
Contact                             Represents a company or a person associated with an account that can become a potential customer.
Event                               Represents an event in the calendar.
Lead                                Tracks valuable prospects apart from contacts, and converts them into opportunities.
Opportunity                         Tracks and stores your deals in progress.
Product                             Represents a product your company sells.
Custom Objects                      Represent custom objects, entities that support custom objects, and their standard fields, named with a suffix __c.
StandardObjectNameShare             Represents a model for all share objects associated with standard objects.
StandardObjectNameHistory           Represents a model for all the history objects associated with standard objects.
StandardObjectNameOwnerSharingRule  Represents a model for all owner sharing rule objects associated with standard objects.
StandardObjectNameFeed              Represents a model for all the feed objects associated with standard objects.
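To see which of these objects actually exist in a given org, Salesforce's global describe metadata can be listed programmatically. A minimal sketch with the simple_salesforce library used in the earlier examples (the credentials are placeholders, and the filtering is illustrative rather than how Hevo selects objects):

```python
from simple_salesforce import Salesforce

sf = Salesforce(
    username="user@example.com",
    password="password",
    security_token="token",
)

# Global describe returns metadata for every object visible to the user.
sobjects = sf.describe()["sobjects"]

standard = [o["name"] for o in sobjects if not o["custom"]]
custom = [o["name"] for o in sobjects if o["custom"]]  # names end in __c

print(len(standard), "standard objects, e.g.", standard[:5])
print(len(custom), "custom objects, e.g.", custom[:5])
```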

Limitations



See Also


Revision History

Refer to the following table for the list of key updates made to this page:

Date         Release  Description of Change
Oct-25-2021  NA       Added the Pipeline frequency information in the Data Replication section.
Oct-04-2021  1.73     Updated the Source Considerations section with an example of calculating quota usage.
Sep-09-2021  NA       Updated the Limitations section to remove the limitations around ingestion of the attachment object. Also removed the limitation around ingestion of REST API objects, as this is now supported by Hevo.
Aug-08-2021  NA       Added a note in the Source Considerations section about Hevo deferring data ingestion in Pipelines created with this Source.
Jul-26-2021  NA       Added a note in the Overview section about Hevo providing a fully-managed Google BigQuery Destination for Pipelines created with this Source.
Feb-22-2021  1.57     Included the setup guide on the Salesforce Source configuration UI.
Last updated on 22 Oct 2021