Amazon Ads
From Release 2.27.3 onwards, Hevo uses the Exports API for data ingestion related to the core objects of Sponsored Products. Amazon Ads will deprecate the Snapshots API, which Hevo previously used, on October 15, 2024. To ensure continued support for existing functionality, Hevo will automatically migrate your existing Pipelines to the new API.
The API update impacts data ingestion for the following objects:
Objects | Changes |
---|---|
Ad Groups | The Source object schema has changed; the object is replaced by the new Ad_Groups_V2 object. |
Campaigns | The Source object schema has changed; the object is replaced by the new Campaigns_V2 object. |
Campaigns Negative Keywords | The Source object is deprecated, and its data has moved to the Product_Targeting_V2 object. You can filter this data type within the object using the targetType field. |
Keywords | The Source object is deprecated, and its data has moved to the Product_Targeting_V2 object. You can filter this data type within the object using the targetType field. |
Negative Keywords | The Source object is deprecated, and its data has moved to the Product_Targeting_V2 object. You can filter this data type within the object using the targetType field. |
Negative Product Targeting | The Source object is deprecated, and its data has moved to the Product_Targeting_V2 object. You can filter this data type within the object using the targetType field. The child objects negative_product_targeting_expression and negative_product_targeting_resolved_expression are removed. |
Product Ads | The Source object schema has changed; the object is replaced by the new Product_Ads_V2 object. |
Product Targeting | The Source object schema has changed; the object is replaced by the new Product_Targeting_V2 object. Product_Targeting_V2 contains data for the Negative Product Targeting, Negative Keywords, Keywords, and Campaigns Negative Keywords data types, which you can filter within the object using the targetType field. The child objects product_targeting_expression and product_targeting_resolved_expression are removed. |
In your existing Pipelines, the older objects you had selected for ingestion will be marked as completed, and the corresponding new objects will be added.
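Because several deprecated objects are consolidated into the single Product_Targeting_V2 object, you can recover the old per-object views downstream by splitting rows on the targetType discriminator field. The sketch below illustrates the idea; the targetType values shown are illustrative assumptions, not values confirmed by Amazon Ads:

```python
# Sketch: split consolidated Product_Targeting_V2 rows back into the
# deprecated data types using the targetType discriminator field.
# The targetType values below are illustrative assumptions.

def split_by_target_type(rows):
    """Group Product_Targeting_V2 rows by their targetType value."""
    buckets = {}
    for row in rows:
        buckets.setdefault(row.get("targetType"), []).append(row)
    return buckets

# Example rows as they might appear after loading to the Destination.
rows = [
    {"targetId": 1, "targetType": "KEYWORD", "bid": 0.75},
    {"targetId": 2, "targetType": "NEGATIVE_KEYWORD", "bid": None},
    {"targetId": 3, "targetType": "KEYWORD", "bid": 1.10},
]

buckets = split_by_target_type(rows)
print(sorted(buckets))          # discriminator values present
print(len(buckets["KEYWORD"]))  # rows for the old Keywords object
```

The same split can, of course, be expressed as a WHERE clause on targetType in the Destination warehouse.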
Amazon Ads (formerly known as AMS or Amazon Marketing Services) is an advertising platform that allows you to create product ads. It also provides various advertising performance measurement tools and analytics solutions to help you optimize your marketing strategy.
Hevo ingests data for the Sponsored Products, Sponsored Display, and Sponsored Brands ads from Amazon.
Hevo uses the Amazon Ads API to ingest the data from your Amazon Ads account and replicate it into the desired Destination database or data warehouse for scalable analysis. You must authorize Hevo to access data from your Amazon Ads account.
Prerequisites
- An active Amazon Ads account from which data is to be ingested exists.
- You are assigned the Team Administrator, Team Collaborator, or Pipeline Administrator role in Hevo to create the Pipeline.
Configuring Amazon Ads as a Source
Perform the following steps to configure Amazon Ads as the Source in your Pipeline:
1. Click PIPELINES in the Navigation Bar.
2. Click + CREATE PIPELINE in the Pipelines List View.
3. On the Select Source Type page, select Amazon Ads.
4. On the Configure your Amazon Ads account page, do one of the following:
   - Select a previously configured account and click CONTINUE.
   - Click Add Amazon Ads Account and perform the following steps to configure an account:
     1. Log in to your Amazon Ads account.
     2. Click Allow to authorize Hevo to access your Amazon Ads data.
5. On the Configure your Amazon Ads Source page, specify the following:
   - Pipeline Name: A unique name for your Pipeline, not exceeding 255 characters.
   - Authorized Account (Non-editable): This field is pre-filled with the email address that you selected earlier when connecting to your Amazon Ads account.
   - Ad Accounts: The Amazon Ads account(s) from which you want to replicate the data.
   - Select Report Type: Select one of the following report types to ingest data from your Amazon Ads reports:
     - Standard Reports: Reports that Amazon generates based on a valid combination of base metrics and group by values. These let you quickly and efficiently replicate report data to your desired Destination. Refer to the Standard Reports section for configuration steps.
     - Custom Reports: Reports that you create by selecting the required base metrics and group by values. You can then replicate the data from these reports to your desired Destination. Refer to the Custom Reports section for configuration steps.
6. Click CONTINUE.
7. Proceed to configuring the data ingestion and setting up the Destination.
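Behind the account-authorization step, the connection follows a standard Login with Amazon OAuth flow: after you click Allow, an authorization code is exchanged for access and refresh tokens. Hevo performs this exchange for you; the sketch below only illustrates the shape of the token request, with all credential values as placeholders:

```python
import urllib.parse

# Sketch of the Login with Amazon authorization-code exchange that
# backs the "Allow" step. All credential values are placeholders;
# Hevo handles this exchange on your behalf.
TOKEN_URL = "https://api.amazon.com/auth/o2/token"

def build_token_request(auth_code, client_id, client_secret, redirect_uri):
    """Return the form-encoded body for the authorization-code exchange."""
    params = {
        "grant_type": "authorization_code",
        "code": auth_code,
        "client_id": client_id,
        "client_secret": client_secret,
        "redirect_uri": redirect_uri,
    }
    return urllib.parse.urlencode(params)

body = build_token_request(
    "auth-code",
    "amzn1.application-oa2-client.example",  # placeholder client ID
    "client-secret",
    "https://example.com/callback",
)
print(body.split("&")[0])
```

The refresh token obtained here is what allows ingestion to continue without re-authorization; see also Reauthorizing an OAuth Account in the Hevo documentation.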
Standard Reports
Hevo provides a few standard reports, generated by Amazon, for ease of configuration; you can use them according to your requirements.
To configure standard reports, specify the following:
- Report Names: The report(s) whose data you want to replicate to your Destination.
- Core Objects: The components that you want to ingest for detailed information about the reports.
- Historical Sync Duration: The duration for which you want to ingest the existing data from the Source. Default duration: 3 Months.
Custom Reports
Hevo allows you to create your own reports by choosing a combination of parameters. You select them manually in Hevo and replicate the data to your desired Destination.
To configure custom reports, specify the following:
- Report Name: The report whose data you want to replicate to your Destination.
- Base Metrics: The parameter(s) that measure the data retrieved from the selected report.
- Group By: The granularity at which you want to organize the data in your Destination.
- Additional Metrics: Any additional parameter(s) that you want to include in the data loaded to your Destination.
- Historical Sync Duration: The duration for which you want to ingest the existing data from the Source. Default duration: 3 Months.
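Conceptually, the Report Name, Base Metrics, Group By, and Additional Metrics selections map onto a report-request configuration like the one the Amazon Ads reporting API accepts. The sketch below follows the v3 reporting API's key names, but treat the exact payload as an assumption rather than Hevo's actual request:

```python
import json

# Sketch: how custom-report selections could translate into an Amazon Ads
# v3 reporting request. Key names mirror the public reporting API but are
# an illustrative assumption, not Hevo's actual payload.
def build_report_config(report_type, group_by, metrics, start, end):
    return {
        "startDate": start,
        "endDate": end,
        "configuration": {
            "adProduct": "SPONSORED_PRODUCTS",
            "reportTypeId": report_type,
            "groupBy": group_by,           # your Group By selection
            "columns": group_by + metrics, # grouped fields plus base metrics
            "timeUnit": "DAILY",
            "format": "GZIP_JSON",
        },
    }

config = build_report_config(
    "spCampaigns", ["campaign"],
    ["impressions", "clicks", "cost"],
    "2024-09-01", "2024-09-07",
)
print(json.dumps(config["configuration"]["columns"]))
```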
Data Replication
For Teams Created | Default Ingestion Frequency | Minimum Ingestion Frequency | Maximum Ingestion Frequency | Custom Frequency Range (in Hrs) |
---|---|---|---|---|
Before Release 2.21 | 12 Hrs | 3 Hrs | 24 Hrs | 3-24 |
After Release 2.21 | 12 Hrs | 30 Mins | 24 Hrs | 1-24 |
- Historical Data: In the first run of the Pipeline, Hevo fetches the data of all the objects and reports available in your account using the Recent Data First approach. The data is ingested based on the historical sync duration selected at the time of creating the Pipeline and loaded to the Destination. Default duration: 3 Months.
- Incremental Data: Once the historical load is complete, data is ingested as per the ingestion frequency in Full Load or Incremental mode, as applicable.
- Refresher Data: Hevo refreshes the data for the last 60 days twice every 24 hours.
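The ingestion modes above can be reasoned about as date windows relative to the current day. A small sketch computing the historical window (3 months approximated as 90 days for illustration) and the 60-day refresher window:

```python
from datetime import date, timedelta

# Sketch: the date windows implied by the replication behavior above.
# The 3-month default is approximated as 90 days for illustration.
def ingestion_windows(today, historical_days=90, refresher_days=60):
    return {
        "historical_start": today - timedelta(days=historical_days),
        "refresher_start": today - timedelta(days=refresher_days),
    }

windows = ingestion_windows(date(2024, 10, 1))
print(windows["historical_start"])  # 2024-07-03
print(windows["refresher_start"])   # 2024-08-02
```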
Schema and Primary Keys
Hevo loads the records into the Destination database according to the Amazon Ads schema.
Data Model
You can replicate both objects and report data from your Amazon Ads account.
Objects
The Amazon Ads objects are classified into three categories based on the type of data being ingested:
- Common Resources: These objects contain your Amazon Ads account information, such as your profile IDs and associated marketplaces.
- Sponsored Products: These objects contain data about your sponsored product ads, which appear in the related shopping results and product pages.
- Sponsored Display: These objects contain data associated with your sponsored display ads, which are displayed to audiences based on their shopping behavior on the Amazon home page and across third-party websites.
The following is the list of tables, arranged by object type, that are created at the Destination when you run the Pipeline:
Object | Description |
---|---|
Common Resources | |
Profiles | Contains details of all your advertiser accounts and their associated marketplaces in Amazon Ads. |
Portfolio | Contains details of all the campaigns associated with your Amazon Ads account. |
Sponsored Products | |
Product_Ads_V2 | Contains details of all the ads created to promote products in your account. |
Product_Targeting_V2 | Contains details of the clauses that you have created to specify the brands and products you want to target with your product ads. |
Campaigns_V2 | Contains details of the product ad campaigns running in your account. For example, its ID, name, state, budget, and bidding strategy. An ad campaign consists of one or more ad groups that are used to advertise the products. |
Ad_Groups_V2 | Contains details of the groups created to manage and organize the product ads within a campaign in your Amazon Ads account. |
Ad Group Suggested Keywords | Contains details about the texts and expressions suggested by Amazon for an ad group, to match against a search query on Amazon. |
Theme-based Bid Recommendations | Contains details about the suggested bids for your ads based on Amazon’s bid recommendation themes, along with their historical impact metrics. For example, a bid recommendation of two dollars for a bid associated with the PRIME_DAY theme. |
Sponsored Display | |
Campaigns | Contains details of the display ad campaigns running in your account. |
Ad Groups | Contains details of the groups created to manage and organize display ads within a campaign in your Amazon Ads account. |
Product Ads | Contains details of the ads created to promote your product across Amazon and third-party websites. |
Targeting | Contains details of the clauses that you have created to specify the brands and products that you want to target with your product ads. |
Reports
Hevo ingests the reports generated in Amazon Ads, which describe the performance of your ads over a duration that you specify.
Report | Description |
---|---|
Sponsored Products | |
Campaign Report | Contains details of the performance data for all the sponsored product campaigns, such as impressions, clicks, and costPerClick, grouped at the campaign level for your specified dates. |
Target Report | Contains details of the performance data for sponsored product ads, including metrics such as clicks, views, and conversions. The data is grouped by the targeting expressions and keywords used in the ads, allowing you to see which targeting strategies are performing well and which may need to be refined. This information can help you optimize your ad campaigns and improve their performance. |
Search Term | Contains details of the performance data for your sponsored product ads. The data is grouped by the words and expressions you use to match against Amazon search queries to display your ads. |
Advertised Products Report | Contains details of the performance of the products advertised in your campaigns. |
Purchased Products Report | Contains details of the performance of products purchased that were not promoted as part of any campaign. |
Sponsored Brands | |
Products Report | Contains details of the performance of products purchased as a result of your advertising campaign. |
Additional Information
Read the detailed Hevo documentation for the following related topics:
Source Considerations
- Amazon Ads retains data for only the last 95 days for all reports, to ensure that the data is up-to-date and relevant for advertisers. The Sponsored Brands Purchased Products report is an exception: it retains data for the last 731 days, allowing advertisers to see the long-term performance of their ads and the impact of their campaigns on their sales.
- The impact metrics for the CONVERSION_OPPORTUNITIES bidding theme are the weekly clicks and orders received for similar products, so advertisers can measure their ads' performance on a weekly basis. For the other, event-based themes, the impact metrics are the clicks and orders received for similar products during the event days, so advertisers can measure their ads' performance during a specific event.
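Given the retention limits above, a requested historical sync start date that falls outside a report's retention window cannot be honored by the API. The sketch below clamps a requested start date accordingly; the 95- and 731-day figures come from this section, while the report-name key used to select the limit is a hypothetical label:

```python
from datetime import date, timedelta

# Sketch: clamp a requested historical start date to the Amazon Ads
# report retention limits described above. The "sb_purchased_products"
# key is a hypothetical label for the Sponsored Brands Purchased
# Products report.
RETENTION_DAYS = {"default": 95, "sb_purchased_products": 731}

def clamp_start_date(requested_start, today, report="default"):
    """Return the later of the requested start and the earliest retained day."""
    earliest = today - timedelta(days=RETENTION_DAYS.get(report, 95))
    return max(requested_start, earliest)

today = date(2024, 10, 1)
print(clamp_start_date(date(2024, 1, 1), today))  # clamped to the 95-day limit
print(clamp_start_date(date(2024, 1, 1), today, "sb_purchased_products"))
```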
Revision History
Refer to the following table for the list of key updates made to this page:
Date | Release | Description of Change |
---|---|---|
Sep-30-2024 | 2.28.1 | Updated section, Data Replication to change the default ingestion frequency to 12 Hrs. |
Sep-16-2024 | 2.27.3 | Updated sections, Data Model and Schema and Primary Keys to add information about changes to the Amazon Ads ERD with the Export API update. |
Mar-05-2024 | 2.21 | Updated the ingestion frequency table in the Data Replication section. |
Jul-19-2023 | NA | New document. |