Twitter Ads
Twitter Ads is an online advertising platform that allows businesses to reach their desired target audience on Twitter. It can be used to promote content, products, services, events, website traffic, and app downloads. Twitter displays these ads in the form of promoted tweets, accounts, and trend takeovers.
Hevo uses the Twitter Ads API to ingest the data from your Twitter Ads account and replicate it into the desired Destination database or data warehouse for scalable analysis. You must obtain the API keys and tokens to allow Hevo to access data from your Twitter Ads account.
Prerequisites
- An active Twitter Ads account from which data is to be ingested exists.
- An active Twitter Developer account exists. If not, read Step one: Signup for a developer account.
- An active Twitter application from which keys and tokens are to be obtained exists. If not, refer to the section, Create a Twitter Application, for the steps to create one.
- The keys and tokens are available to authenticate Hevo on your Twitter Ads account. You must be logged in as an Account Administrator, Ad Manager, or Creative Manager to obtain these credentials. Otherwise, you can obtain them from your account administrator.
- You have access to the Twitter Ads API. Read Step three: Apply for access to the Ads API to know how to access the Twitter Ads API.
- You are assigned the Team Administrator, Team Collaborator, or Pipeline Administrator role in Hevo to create the Pipeline.
Create a Twitter Application (Optional)
You require a Twitter application to generate the keys and tokens needed for configuring Twitter Ads as a Source in Hevo. If you already have an application that you can use, skip to the Obtaining the Keys and Tokens section.
To create a Twitter application:
1. Log in to your Twitter Developer account.
2. In the Dashboard page, under Projects, click +Add App.
3. In the Name your App page, specify the app name and click Next.
4. In the Keys & Tokens page that appears, click Copy corresponding to the API Key and API Key Secret to copy them, and save them securely like any other password. Alternatively, obtain them later from the App settings. You can use these credentials while configuring your Hevo Pipeline.
Obtaining the Keys and Tokens
You require keys and tokens to authenticate Hevo on your Twitter Ads account. These credentials (keys and tokens) do not expire and can be reused for all your Pipelines.
Note: You must be logged in as an Account Administrator, Ad Manager, or Creative Manager to perform these steps.
To obtain the credentials:
1. Log in to your Twitter Developer account.
2. In the Dashboard page, under Projects, PROJECT APP, click the Settings icon corresponding to the app for which you want to generate the credentials.
3. In the <Your_App_Name> page, click Keys and tokens.
4. In the page that appears, do the following:
   1. In the Consumer Keys section, click Regenerate corresponding to API Key and Secret.
   2. In the confirmation dialog, click Yes, regenerate.
   3. In the page that appears, click Copy corresponding to the API Key and API Key Secret to copy them, and save them securely like any other password.
   4. Click Yes, I saved them.
5. Repeat Step 4 above to obtain the Access Token and Secret from the Authentication Tokens section.
You can use these credentials while configuring your Hevo Pipeline.
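These four credentials are used for OAuth 1.0a request signing, which the Twitter Ads API requires on every call. The sketch below illustrates how such a signature is built with only the Python standard library; it is for understanding only, not Hevo's internal implementation, and the endpoint path and API version in the example are assumptions.

```python
import base64
import hashlib
import hmac
import secrets
import time
from urllib.parse import quote

def oauth1_header(method, url, params, consumer_key, consumer_secret,
                  access_token, access_token_secret):
    """Build an OAuth 1.0a Authorization header value (HMAC-SHA1)."""
    oauth = {
        "oauth_consumer_key": consumer_key,
        "oauth_nonce": secrets.token_hex(16),
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_timestamp": str(int(time.time())),
        "oauth_token": access_token,
        "oauth_version": "1.0",
    }
    # Signature base string: method, URL, and the sorted, percent-encoded params.
    all_params = {**params, **oauth}
    param_str = "&".join(
        f"{quote(k, safe='')}={quote(v, safe='')}"
        for k, v in sorted(all_params.items())
    )
    base = "&".join(quote(s, safe="") for s in (method.upper(), url, param_str))
    # The signing key combines the consumer secret and the access token secret.
    signing_key = f"{quote(consumer_secret, safe='')}&{quote(access_token_secret, safe='')}"
    digest = hmac.new(signing_key.encode(), base.encode(), hashlib.sha1).digest()
    oauth["oauth_signature"] = base64.b64encode(digest).decode()
    return "OAuth " + ", ".join(
        f'{quote(k, safe="")}="{quote(v, safe="")}"' for k, v in sorted(oauth.items())
    )

# Hypothetical call; the URL version segment is illustrative.
header = oauth1_header(
    "GET", "https://ads-api.twitter.com/12/accounts", {},
    "CONSUMER_KEY", "CONSUMER_SECRET", "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET",
)
```

Because the tokens do not expire, the same credentials can sign requests indefinitely, which is why they can be reused across all your Pipelines.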
Configuring Twitter Ads as a Source
Perform the following steps to configure Twitter Ads as the Source in your Pipeline:
1. Click PIPELINES in the Navigation Bar.
2. Click + CREATE PIPELINE in the Pipelines List View.
3. In the Select Source Type page, select Twitter Ads.
4. In the Configure your Twitter Ads Source page, specify the following:
   - Pipeline Name: A unique name for your Pipeline, not exceeding 255 characters.
   - Consumer API Key: The API key that you obtained from your Twitter Ads account.
   - Consumer API Secret: The API secret that you obtained from your Twitter Ads account.
   - Access Token: The access token that you obtained from your Twitter Ads account.
   - Access Token Secret: The access token secret that you obtained from your Twitter Ads account.
   - Ads Accounts: The Twitter Ads account(s) from which you want to replicate the data.
   - Load Reports from specific countries: If enabled, Hevo ingests data only from the countries you select. If disabled, Hevo ingests all the data available in your Twitter Ads account.
5. Click TEST & CONTINUE.
6. Proceed to configuring the data ingestion and setting up the Destination.
Data Replication
For Teams Created | Default Ingestion Frequency | Minimum Ingestion Frequency | Maximum Ingestion Frequency | Custom Frequency Range (in Hrs) |
---|---|---|---|---|
Before Release 2.21 | 12 Hrs | 15 Mins | 24 Hrs | 1-24 |
After Release 2.21 | 12 Hrs | 30 Mins | 24 Hrs | 1-24 |
Objects
- Historical data: In the first run of the Pipeline, Hevo ingests all available data for the selected objects from your Twitter Ads account using the Recent Data First approach.
- Incremental data: Once the historical load is complete, data is ingested as per the ingestion frequency in Full Load or Incremental mode, as applicable. Default duration: 12 Hours.
Reports
- Historical data: In the first run of the Pipeline, Hevo ingests the data of the past 90 days for all the reports in your Twitter Ads account using the Recent Data First approach.
- Incremental data: Once the historical data ingestion is complete, every subsequent run of the Pipeline fetches new and updated data for the reports as per the ingestion frequency. Default duration: 12 Hours.
Note: From Release 1.79 onwards, Hevo ingests your historical data using the Recent Data First approach, whereby the data is ingested in reverse order, starting from the latest and moving back to the earliest. This gives you quicker access to the most recent data. This change applies to all new and existing Pipelines.
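The Recent Data First approach can be pictured as splitting the historical range into windows and fetching them newest-first. The following is a simplified sketch of that ordering; the window size and function name are illustrative, not Hevo's actual internals:

```python
from datetime import date, timedelta

def recent_first_windows(start, end, days=7):
    """Yield (window_start, window_end) date pairs, latest window first."""
    upper = end
    while upper > start:
        lower = max(start, upper - timedelta(days=days))
        yield (lower, upper)
        upper = lower  # step backwards towards the earliest data

# For a 90-day historical load, the first window fetched covers the
# most recent week, so fresh data reaches the Destination soonest.
windows = list(recent_first_windows(date(2024, 1, 1), date(2024, 3, 1)))
```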
Schema and Primary Keys
Hevo uses the following primary keys to load the records in the Destination database:
Primary keys for objects
Object | Primary Keys |
---|---|
ACCOUNTS | ACCOUNT_ID, ID |
FUNDING_INSTRUMENTS | ACCOUNT_ID, ID |
CAMPAIGNS | ACCOUNT_ID, ID |
LINE_ITEMS | ACCOUNT_ID, ID |
PROMOTED_TWEETS | ACCOUNT_ID, ID |
MEDIA_CREATIVE | ACCOUNT_ID, ID |
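Because each object is keyed on the composite (ACCOUNT_ID, ID), a re-ingested record overwrites the earlier version instead of creating a duplicate. The sketch below shows this upsert behavior using SQLite purely for illustration; an actual warehouse Destination would use its own merge mechanism:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE campaigns (
        account_id TEXT,
        id TEXT,
        name TEXT,
        PRIMARY KEY (account_id, id)  -- composite key, as in the table above
    )
""")

def load(rows):
    # ON CONFLICT on the composite key updates the existing row in place.
    conn.executemany(
        """INSERT INTO campaigns (account_id, id, name) VALUES (?, ?, ?)
           ON CONFLICT(account_id, id) DO UPDATE SET name = excluded.name""",
        rows,
    )

load([("acct_1", "c1", "Spring Sale"), ("acct_1", "c2", "Launch")])
load([("acct_1", "c1", "Spring Sale v2")])  # same key: updated, not duplicated

count = conn.execute("SELECT COUNT(*) FROM campaigns").fetchone()[0]
```

After both loads, the table still holds two rows, with the "c1" campaign carrying its latest name.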
Primary keys for reports
Unsegmented reports
Report | Primary Keys |
---|---|
ACCOUNT_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, PLACEMENT |
CAMPAIGN_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, PLACEMENT |
FUNDING_INSTRUMENT_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, PLACEMENT |
LINE_ITEM_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, PLACEMENT |
PROMOTED_TWEET_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, PLACEMENT |
MEDIA_CREATIVE_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, PLACEMENT |
Segmented reports
Report | Primary Key |
---|---|
ACCOUNT_AGE_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
ACCOUNT_APP_STORE_CATEGORY_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
ACCOUNT_DEVICES_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, PLATFORM_TARGET_VALUE |
ACCOUNT_EVENTS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
ACCOUNT_GENDER_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
ACCOUNT_INTERESTS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
ACCOUNT_LOCATIONS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
ACCOUNT_PLATFORMS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
ACCOUNT_PLATFORM_VERSIONS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, PLATFORM_TARGET_VALUE |
ACCOUNT_POSTAL_CODES_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, COUNTRY_TARGET_VALUE |
ACCOUNT_REGIONS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, COUNTRY_TARGET_VALUE |
CAMPAIGN_AGE_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
CAMPAIGN_APP_STORE_CATEGORY_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
CAMPAIGN_CONVERSION_TAGS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
CAMPAIGN_DEVICES_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, PLATFORM_TARGET_VALUE |
CAMPAIGN_EVENTS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
CAMPAIGN_GENDER_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
CAMPAIGN_INTERESTS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
CAMPAIGN_KEYWORDS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
CAMPAIGN_LANGUAGES_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
CAMPAIGN_LOCATIONS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
CAMPAIGN_PLATFORMS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
CAMPAIGN_PLATFORM_VERSIONS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, PLATFORM_TARGET_VALUE |
CAMPAIGN_POSTAL_CODES_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, COUNTRY_TARGET_VALUE |
CAMPAIGN_REGIONS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, COUNTRY_TARGET_VALUE |
CAMPAIGN_SIMILAR_TO_FOLLOWERS_OF_USER_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
CAMPAIGN_TV_SHOWS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
FUNDING_INSTRUMENT_AGE_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
FUNDING_INSTRUMENT_APP_STORE_CATEGORY_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
FUNDING_INSTRUMENT_CONVERSION_TAGS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
FUNDING_INSTRUMENT_DEVICES_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, PLATFORM_TARGET_VALUE |
FUNDING_INSTRUMENT_EVENTS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
FUNDING_INSTRUMENT_GENDER_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
FUNDING_INSTRUMENT_INTERESTS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
FUNDING_INSTRUMENT_LOCATIONS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
FUNDING_INSTRUMENT_PLATFORMS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
FUNDING_INSTRUMENT_PLATFORM_VERSIONS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, PLATFORM_TARGET_VALUE |
FUNDING_INSTRUMENT_REGIONS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, COUNTRY_TARGET_VALUE |
LINE_ITEM_AGE_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
LINE_ITEM_APP_STORE_CATEGORY_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
LINE_ITEM_CONVERSION_TAGS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
LINE_ITEM_DEVICES_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, PLATFORM_TARGET_VALUE |
LINE_ITEM_EVENTS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
LINE_ITEM_GENDER_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
LINE_ITEM_INTERESTS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
LINE_ITEM_KEYWORDS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
LINE_ITEM_LANGUAGES_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
LINE_ITEM_LOCATIONS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
LINE_ITEM_PLATFORMS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
LINE_ITEM_PLATFORM_VERSIONS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, PLATFORM_TARGET_VALUE |
LINE_ITEM_POSTAL_CODES_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, COUNTRY_TARGET_VALUE |
LINE_ITEM_REGIONS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, COUNTRY_TARGET_VALUE |
LINE_ITEM_SIMILAR_TO_FOLLOWERS_OF_USER_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
LINE_ITEM_TV_SHOWS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
PROMOTED_TWEET_AGE_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
PROMOTED_TWEET_APP_STORE_CATEGORY_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
PROMOTED_TWEET_CONVERSION_TAGS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
PROMOTED_TWEET_DEVICES_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, PLATFORM_TARGET_VALUE |
PROMOTED_TWEET_EVENTS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
PROMOTED_TWEET_GENDER_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
PROMOTED_TWEET_INTERESTS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
PROMOTED_TWEET_KEYWORDS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
PROMOTED_TWEET_LANGUAGES_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
PROMOTED_TWEET_LOCATIONS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
PROMOTED_TWEET_PLATFORMS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
PROMOTED_TWEET_PLATFORM_VERSIONS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, PLATFORM_TARGET_VALUE |
PROMOTED_TWEET_POSTAL_CODES_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, COUNTRY_TARGET_VALUE |
PROMOTED_TWEET_REGIONS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT, COUNTRY_TARGET_VALUE |
PROMOTED_TWEET_SIMILAR_TO_FOLLOWERS_USER_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
PROMOTED_TWEET_TV_SHOWS_REPORT | ACCOUNT_ID, ENTITY_ID, DATE, SEGMENT_NAME, PLACEMENT |
Data Model
The following is the list of tables that are created in the Destination for your objects and reports when you run the Pipeline:
Objects
These objects enable you to create, manage, and optimize your Twitter Ads campaigns.
- Accounts
- Funding Instruments
- Campaigns
- Line Items
- Promoted Tweets
- Media Creative
Reports
Hevo supports data ingestion for the following types of reports:
- Unsegmented
- Segmented
Unsegmented reports
These reports provide you with an overview of your Twitter Ads campaign performance.
- Account Report
- Campaign Report
- Funding Instrument Report
- Line Item Report
- Promoted Tweet Report
- Media Creative Report
Segmented reports
These reports provide you with a detailed view of your Twitter Ads campaign performance, segregated as per a specific criterion. For example, to know the success of a campaign based on age, you can use the CAMPAIGN_AGE_REPORT.
Read Twitter Ads API Analytics and Metrics and Segmentation to know more about segmented reports.
- Account Age Report
- Account App Store Category Report
- Account Devices Report
- Account Events Report
- Account Gender Report
- Account Interests Report
- Account Locations Report
- Account Platforms Report
- Account Platform Versions Report
- Account Postal Codes Report
- Account Regions Report
- Campaign Age Report
- Campaign App Store Category Report
- Campaign Conversion Tags Report
- Campaign Devices Report
- Campaign Events Report
- Campaign Gender Report
- Campaign Interests Report
- Campaign Keywords Report
- Campaign Languages Report
- Campaign Locations Report
- Campaign Platforms Report
- Campaign Platform Versions Report
- Campaign Postal Codes Report
- Campaign Regions Report
- Campaign Similar To Followers Of User Report
- Campaign TV Shows Report
- Funding Instrument Age Report
- Funding Instrument App Store Category Report
- Funding Instrument Conversion Tags Report
- Funding Instrument Devices Report
- Funding Instrument Events Report
- Funding Instrument Gender Report
- Funding Instrument Interests Report
- Funding Instrument Locations Report
- Funding Instrument Platforms Report
- Funding Instrument Platform Versions Report
- Funding Instrument Regions Report
- Line Item Age Report
- Line Item App Store Category Report
- Line Item Conversion Tags Report
- Line Item Devices Report
- Line Item Events Report
- Line Item Gender Report
- Line Item Interests Report
- Line Item Keywords Report
- Line Item Languages Report
- Line Item Locations Report
- Line Item Platforms Report
- Line Item Platform Versions Report
- Line Item Postal Codes Report
- Line Item Regions Report
- Line Item Similar To Followers Of User Report
- Line Item TV Shows Report
- Promoted Tweet Age Report
- Promoted Tweet App Store Category Report
- Promoted Tweet Conversion Tags Report
- Promoted Tweet Devices Report
- Promoted Tweet Events Report
- Promoted Tweet Gender Report
- Promoted Tweet Interests Report
- Promoted Tweet Keywords Report
- Promoted Tweet Languages Report
- Promoted Tweet Locations Report
- Promoted Tweet Platforms Report
- Promoted Tweet Platform Versions Report
- Promoted Tweet Postal Codes Report
- Promoted Tweet Regions Report
- Promoted Tweet Similar To Followers User Report
- Promoted Tweet TV Shows Report
Additional Information
Read the detailed Hevo documentation for the following related topics:
Source Considerations
- From version 10 of the Twitter Ads API, only advertising-enabled accounts are visible in the Ads Accounts drop-down while configuring Twitter Ads as a Source in Hevo. Any existing Pipelines created with accounts not enabled for advertising are not affected. However, if you want to continue using the non-advertiser accounts, you must enable them for advertising. Read Ads account creation for the steps to do this.
- Twitter Ads has made the following changes to the Line Items object:
  - Renamed the `bid_type` field to `bid_strategy`.
  - Deprecated the `automatically_select_bid` and `tracking_tags` fields.

  As a result, from Release 1.81 onwards, Hevo:
  - automatically maps the `bid_strategy` field to the Destination table, if you had enabled Auto Mapping while creating the Pipeline.
  - ingests null values for the `automatically_select_bid` and `tracking_tags` fields.
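If downstream queries still expect the legacy field name, you can normalize it in a Transformation. The sketch below shows the idea on a plain Python dict; it is not Hevo's actual Event class API (which is documented under Python Code-Based Transformations), and the function name is illustrative:

```python
def normalize_line_item(properties):
    """Copy the renamed field back under its legacy name and drop deprecated fields."""
    # Twitter Ads renamed bid_type to bid_strategy; mirror it for legacy consumers.
    if "bid_strategy" in properties and "bid_type" not in properties:
        properties["bid_type"] = properties["bid_strategy"]
    # These fields are deprecated upstream, and Hevo now ingests null values for them.
    for deprecated in ("automatically_select_bid", "tracking_tags"):
        properties.pop(deprecated, None)
    return properties

event = {"id": "li_1", "bid_strategy": "AUTO", "automatically_select_bid": None}
normalize_line_item(event)
# event now carries bid_type alongside bid_strategy, without the deprecated fields
```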
Limitations
- None.
See Also
Revision History
Refer to the following table for the list of key updates made to this page:
Date | Release | Description of Change |
---|---|---|
Sep-30-2024 | 2.28.1 | Updated section, Data Replication to change the default ingestion frequency to 12 Hrs. |
Mar-05-2024 | 2.21 | Updated the ingestion frequency table in the Data Replication section. |
Oct-30-2023 | NA | Updated the page as per the latest Twitter Ads UI. |
Dec-07-2022 | NA | Updated section, Data Replication to reorganize the content for better understanding and coherence. |
Feb-07-2022 | 1.81 | - Reorganized the content in the Data Replication section. - Segregated the reports content based on their type in sections, Schema and Primary Keys and Data Model. - Added section, Source Considerations. |
Jan-03-2022 | 1.79 | Added information about reverse historical load in the Data Replication section. |
Oct-25-2021 | NA | Added the Pipeline frequency information in the Data Replication section. |