Amazon Ads
Amazon Ads (formerly known as AMS or Amazon Marketing Services) is an advertising platform that allows you to create product ads. It also provides various advertising performance measurement tools and analytics solutions to help you optimize your marketing strategy.
Hevo ingests data for the Sponsored Products, Sponsored Display, and Sponsored Brands ads from Amazon.
Hevo uses the Amazon Ads API to ingest the data from your Amazon Ads account and replicate it into the desired Destination database or data warehouse for scalable analysis. You must authorize Hevo to access data from your Amazon Ads account.
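Hevo handles this authorization exchange internally, but it can help to see what an authorized call looks like. The sketch below assembles the headers that clients of the Amazon Ads API present on every request; the header names follow the public Amazon Ads API conventions, while the function name and all values are illustrative placeholders, not Hevo's actual implementation.

```python
# Sketch of the request headers an Amazon Ads API client sends on each call.
# The client ID, access token, and profile ID values are placeholders.
def build_ads_api_headers(client_id, access_token, profile_id=None):
    """Assemble the standard Amazon Ads API request headers."""
    headers = {
        "Amazon-Advertising-API-ClientId": client_id,  # Login with Amazon client ID
        "Authorization": "Bearer " + access_token,     # OAuth 2.0 access token
        "Content-Type": "application/json",
    }
    if profile_id is not None:
        # Most endpoints are scoped to a single advertiser profile.
        headers["Amazon-Advertising-API-Scope"] = profile_id
    return headers

headers = build_ads_api_headers("my-client-id", "my-access-token", "123456789")
```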
Prerequisites
- An active Amazon Ads account from which data is to be ingested exists.
- You are assigned the Team Administrator, Team Collaborator, or Pipeline Administrator role in Hevo to create the Pipeline.
Configuring Amazon Ads as a Source
Perform the following steps to configure Amazon Ads as the Source in your Pipeline:
- Click PIPELINES in the Navigation Bar.
- Click + CREATE in the Pipelines List View.
- In the Select Source Type page, select Amazon Ads.
- In the Configure your Amazon Ads page, click Add Amazon Ads Account. This redirects you to the Amazon login page.
- Log in to your Amazon Ads account.
- Click Allow to authorize Hevo to access your Amazon Ads data.
- In the Configure your Amazon Ads Source page, specify the following:
  - Pipeline Name: A unique name for your Pipeline, not exceeding 255 characters.
  - Authorized Account (Non-editable): This field is pre-filled with the email address that you selected earlier when connecting to your Amazon Ads account.
  - Ad Accounts: The Amazon Ads account(s) from which you want to replicate the data.
  - Select Report Type: Select one of the following report types to ingest data from your Amazon Ads reports:
    - Standard Reports: Reports that Amazon generates based on a valid combination of base metrics and group by values. These let you quickly and efficiently replicate data from the report to your desired Destination. Refer to the section, Standard Reports, for the steps to configure them.
    - Custom Reports: Reports that you create by selecting the required base metrics and group by values. You can then replicate the data from these reports to your desired Destination. Refer to the section, Custom Reports, for the steps to configure them.
- Click CONTINUE.
- Proceed to configuring the data ingestion and setting up the Destination.
Standard Reports
Hevo provides a few standard reports generated by Amazon for ease of configuration, which you can use according to your requirements.
To configure standard reports, specify the following:
- Report Names: The report whose data you want to replicate to your Destination.
- Core Objects: The components that you want to ingest for detailed information about the reports.
- Historical Sync Duration: The duration for which you want to ingest the existing data from the Source. Default duration: 3 Months.
Custom Reports
Hevo allows you to create your reports by choosing a combination of parameters. You can manually select them in Hevo and replicate the data to your desired Destination.
To configure custom reports, specify the following:
- Report Name: The report whose data you want to replicate to your Destination.
- Base Metrics: The parameter(s) that enable you to measure the data retrieved from the report selected above.
- Group By: The granularity at which you want to organize the data in your Destination.
- Additional Metrics: Any additional parameter(s) that you want to ingest for the selected report, along with the base metrics.
- Historical Sync Duration: The duration for which you want to ingest the existing data from the Source. Default duration: 3 Months.
Data Replication
Default Ingestion Frequency | Minimum Ingestion Frequency | Maximum Ingestion Frequency | Custom Frequency Range (Hrs) |
---|---|---|---|
3 Hrs | 3 Hrs | 24 Hrs | 3–24 |
Note: You must set the custom frequency as an integer number of hours, for example, 3, 4, or 5, and not 1.5 or 1.75.
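The note above can be expressed as a minimal validation check: the custom frequency must be a whole number of hours within the supported 3–24 range. The function name is illustrative, not part of Hevo's API.

```python
def is_valid_custom_frequency(hours):
    """Return True only for integer hour values in the supported 3-24 range."""
    # bool is a subclass of int in Python, so exclude it explicitly.
    return isinstance(hours, int) and not isinstance(hours, bool) and 3 <= hours <= 24

valid = [h for h in (1.5, 1.75, 2, 3, 12, 24, 25) if is_valid_custom_frequency(h)]
# valid == [3, 12, 24]
```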
- Historical Data: In the first run of the Pipeline, Hevo fetches the data of all the objects and reports available in your account using the Recent Data First approach. The data is ingested on the basis of the historical sync duration selected at the time of creating the Pipeline and loaded to the Destination. Default duration: 3 Months.
- Incremental Data: Once the historical data ingestion is complete, every subsequent run of the Pipeline fetches new and updated data for the objects and reports.
- Refresher Data: Hevo refreshes the data for the last 60 days twice every 24 hours.
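The refresher behavior above can be sketched as a date-window computation, assuming the stated 60-day lookback; the function and its interface are illustrative, not Hevo's internals.

```python
from datetime import date, timedelta

REFRESH_LOOKBACK_DAYS = 60  # refresher window described above

def refresher_window(run_date):
    """Return the (start, end) date range a refresher run re-ingests."""
    return run_date - timedelta(days=REFRESH_LOOKBACK_DAYS), run_date

start, end = refresher_window(date(2023, 7, 19))
# start == date(2023, 5, 20), end == date(2023, 7, 19)
```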
Schema and Primary Keys
Hevo uses the following schema to load the records into the Destination database:
Data Model
You can replicate both objects and report data from your Amazon Ads account.
Objects
The Amazon Ads objects are classified into three categories based on the type of data being ingested:
- Common Resources: These objects contain your Amazon Ads account information, such as your profile IDs and associated marketplaces.
- Sponsored Products: These objects contain data about your sponsored product ads, which appear in the related shopping results and product pages.
- Sponsored Display: These objects contain data associated with your sponsored display ads, which are displayed to audiences based on their shopping behavior on the Amazon home page and across third-party websites.
The following is the list of tables, arranged by object type, that are created at the Destination when you run the Pipeline:
Object | Description |
---|---|
Common Resources | |
Profiles | Contains details of all your advertiser accounts and their associated marketplaces in Amazon Ads. |
Portfolio | Contains details of all the campaigns associated with your Amazon Ads account. |
Sponsored Products | |
Product Ads | Contains details of all the ads created to promote products in your account. |
Product Targeting | Contains details of the clauses that you have created to specify the brands and products that you want to target your product ads to. |
Negative Product Targeting | Contains details of the clauses that you have created to specify the brands and products that you do not want to target with your product ads. |
Negative Keywords | Contains details about the texts and expressions in Amazon search queries that you do not want associated with your product ads. |
Keywords | Contains details about the words and expressions in Amazon search queries that you want to match your ads to. |
Campaigns | Contains details of the product ad campaigns running in your account, such as the campaign ID, name, state, budget, and bidding strategy. An ad campaign consists of one or more ad groups that are used to advertise the products. |
Campaigns Negative Keywords | Contains details about the texts and expressions that you do not want to be associated with your product ads. |
Ad Groups | Contains details of the groups created to manage and organize the product ads within a campaign in your Amazon Ads account. |
Ad Group Suggested Keywords | Contains details about the texts and expressions suggested by Amazon for an ad group, to match against a search query on Amazon. |
Theme-based Bid Recommendations | Contains details about the suggested bids for your ads based on Amazon’s bid recommendation themes, along with their historical impact metrics. For example, a bid recommendation of two dollars for a bid associated with the PRIME_DAY theme. |
Sponsored Display | |
Campaigns | Contains details of the display ad campaigns running in your account. |
Ad Groups | Contains details of the groups created to manage and organize display ads within a campaign in your Amazon Ads account. |
Product Ads | Contains details of the ads created to promote your product across Amazon and third-party websites. |
Targeting | Contains details of the clauses that you have created to specify the brands and products that you want to target with your product ads. |
Reports
Hevo ingests the reports generated in Amazon Ads to describe the performance of your ads for a specified duration defined by you.
Report | Description |
---|---|
Sponsored Products | |
Campaign Report | Contains details of the performance data for all the sponsored product campaigns, such as impressions, clicks, and costPerClick, grouped at the campaign level for your specified dates. |
Target Report | Contains details of the performance data for sponsored product ads, including metrics such as clicks, views, and conversions. The data is grouped by the targeting expressions and keywords used in the ads, helping you identify which targeting strategies perform well and which may need refinement so that you can optimize your ad campaigns. |
Search Term | Contains details of the performance data for your sponsored product ads. The data is grouped by the words and expressions you use to match against Amazon search queries to display your ads. |
Advertised Products Report | Contains details of the performance of the products advertised in your campaigns. |
Purchased Products Report | Contains details of the performance of purchased products that were not promoted as part of any campaign. |
Sponsored Brands | |
Purchased Products Report | Contains details of the performance of products purchased as a result of your advertising campaign. |
Additional Information
Read the detailed Hevo documentation for the following related topics:
Source Considerations
- Amazon Ads retains data for only the last 95 days for all reports, to ensure that the data is up-to-date and relevant for advertisers. The Sponsored Brands Purchased Products report is an exception: it retains data for the last 731 days, allowing advertisers to see the long-term performance of their ads and the impact of their campaigns on sales.
- The impact metrics for the CONVERSION_OPPORTUNITIES bidding theme are the weekly clicks and orders received for similar products, allowing advertisers to measure the performance of their ads on a weekly basis. For the other event-based themes, the impact metrics are the clicks and orders received for similar products during the event days, allowing advertisers to measure the performance of their ads during a specific event.
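A practical consequence of the retention limits above is that a historical sync cannot reach further back than the API retains. The following is a minimal sketch of that clamping, assuming the stated 95-day and 731-day windows; the report key and function name are hypothetical.

```python
from datetime import date, timedelta

DEFAULT_RETENTION_DAYS = 95                           # most reports
RETENTION_OVERRIDES = {"sb_purchased_products": 731}  # Sponsored Brands exception

def clamp_sync_start(requested_start, today, report="default"):
    """Clamp a requested historical sync start date to what the API retains."""
    retention = RETENTION_OVERRIDES.get(report, DEFAULT_RETENTION_DAYS)
    earliest = today - timedelta(days=retention)
    return max(requested_start, earliest)

today = date(2023, 7, 19)
start = clamp_sync_start(date(2023, 1, 1), today)
# start == date(2023, 4, 15): clamped to the 95-day window
sb_start = clamp_sync_start(date(2023, 1, 1), today, "sb_purchased_products")
# sb_start == date(2023, 1, 1): within the 731-day window, unchanged
```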
Limitations
None.
Revision History
Refer to the following table for the list of key updates made to this page:
Date | Release | Description of Change |
---|---|---|
Jul-19-2023 | NA | New document. |