Resizing String Columns in the Destination
When the length of an incoming string value in a Source Event Type field exceeds the size of the mapped column in the Destination, Hevo marks the Event as Failed. For Hevo to load the Event into the Destination table, the mapped Destination column must be resized.
You can do this with a single click from the Schema Mapper tab.
In the Schema Mapper tab, under the options for the Event Type, click Resize String Fields.
Once you do this, Hevo resizes the required columns in the Destination table. The operation may take some time, depending on the number of rows in the affected column(s). Success and failure notifications for the operation are sent to Slack channels (if enabled) and the Activity Log.
How It Works
Hevo compares the length of the longest value encountered for a particular field with the length of the mapped Destination table column. If the column is shorter, Hevo resizes it. When choosing the new column size, Hevo uses the next power of 2 that is larger than the maximum observed length. For example, if the maximum length seen so far is 150, the Destination column is resized to 256.
The existing data is kept intact while resizing the column.
Once the resizing operation is complete, the change in the Destination table column size is reflected in the Schema Mapper, and the failed Events are replayed.
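The sizing rule described above can be sketched in a few lines of Python. This is only an illustration of the documented behaviour, not Hevo's implementation; the function names are hypothetical.

```python
def proposed_column_size(max_observed_length: int) -> int:
    """Round the observed maximum string length up to a power of 2,
    e.g. a maximum observed length of 150 gives a column size of 256."""
    size = 1
    while size < max_observed_length:
        size *= 2
    return size


def needs_resize(current_column_size: int, max_observed_length: int) -> bool:
    """A column is resized only when incoming values no longer fit in it."""
    return max_observed_length > current_column_size


# Example from the section above: the longest value seen so far is
# 150 characters, so a varchar(128) column would be resized to varchar(256).
assert needs_resize(current_column_size=128, max_observed_length=150)
assert proposed_column_size(150) == 256
```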
Notes
- This option is currently available only for Redshift and Snowflake Destinations.
- This option is available only for varchar columns.
- If one of the columns to be resized is a key, the resize operation fails.
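Taken together, these notes amount to a small set of preconditions that must hold before a resize can succeed. A minimal sketch of these checks, assuming hypothetical field names that are not part of Hevo's API:

```python
# Per the notes above: Redshift and Snowflake only, varchar columns only,
# and key columns cannot be resized.
SUPPORTED_DESTINATIONS = {"REDSHIFT", "SNOWFLAKE"}


def can_resize(destination_type: str, column_type: str, is_key: bool) -> bool:
    """Return True only when the documented preconditions for a resize hold."""
    return (
        destination_type.upper() in SUPPORTED_DESTINATIONS
        and column_type.lower() == "varchar"
        and not is_key
    )


# A varchar column in a Snowflake Destination that is not a key can be resized.
assert can_resize("Snowflake", "varchar", is_key=False)
# A key column cannot be resized, even if it is a varchar.
assert not can_resize("Redshift", "varchar", is_key=True)
```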