FTP / SFTP
Hevo lets you load data from files in an FTP location into your data warehouse.
Note: For Pipelines created with this Source, Hevo provides a fully-managed BigQuery data warehouse Destination if you do not already have one set up. You are charged only the cost that Hevo incurs for your project in Google BigQuery. The invoice is generated at the end of each month, and payment is collected through the payment instrument you have set up. You can create your Pipeline and directly start analyzing your Source data. Read Hevo Managed Google BigQuery.
Configuring FTP/SFTP as a Source
To configure FTP/SFTP as a Source in Hevo:
Click PIPELINES in the Asset Palette.
Click + CREATE in the Pipeline List View.
In the Select Source Type page, select FTP/SFTP.
In the Configure your FTP/SFTP Source page, specify the following:
- Pipeline Name - A unique name for the Pipeline.
- Type - Select FTP or SFTP.
- Host - The IP address or DNS name of your FTP/SFTP server.
- Port - The port at which Hevo can connect to your FTP/SFTP server. The default port is 21 for FTP and 22 for SFTP.
- User - The user ID for logging in to the FTP/SFTP server.
- Password - The password of the user logging in to the FTP/SFTP server. The password is optional for SFTP connections. In that case, however, you must add the public key displayed in the UI to the `.ssh/authorized_keys` file on your SFTP server.
- Path Prefix - The path prefix of the data directory. By default, files are listed from the root of the directory.
- File Format - Choose a file format. Hevo currently supports the CSV, JSON, and XML formats. Contact Hevo Support if your Source data is in a different format.
Based on the format you select, you must specify some additional settings:
For CSV files:
- Specify the Field Delimiter. This is the character that separates fields on each line. For example, `\t` or `,`.
- Disable the Treat First Row As Column Headers option if the Source data file does not contain column headers; Hevo then creates these automatically during ingestion. Default setting: Enabled. See Example below.
For XML files:
- Enable the Create Events from child nodes option to load each node under the root node of the XML file as a separate Event.
- Create Event Types from folders - Enable this option if the prefix path has subdirectories containing files in different formats. Hevo reads each subdirectory as a separate Event Type. Note: Files located at the prefix path itself (and not in a subdirectory) are ignored.
- Connect through SSH - Enable this option to connect to Hevo using an SSH tunnel, instead of directly connecting your FTP/SFTP host to Hevo. Read Connecting Through SSH.
If this option is disabled, you must whitelist Hevo's IP addresses to allow Hevo to connect to your FTP/SFTP host.
Click TEST & CONTINUE.
Proceed to configuring the data ingestion and setting up the Destination.
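For XML Sources, the Create Events from child nodes option described above turns each node under the root into its own Event. The following is a minimal Python sketch of that idea, not Hevo's actual implementation; the sample XML and field names are illustrative:

```python
import xml.etree.ElementTree as ET

def events_from_child_nodes(xml_text: str) -> list[dict]:
    """Return one event (a dict of tag -> text) per child of the root node."""
    root = ET.fromstring(xml_text)
    return [{field.tag: field.text for field in child} for child in root]

# Hypothetical sample payload: two <order> nodes under the <orders> root.
sample = """
<orders>
  <order><id>1</id><county>CLAY COUNTY</county></order>
  <order><id>2</id><county>NASSAU COUNTY</county></order>
</orders>
"""

events = events_from_child_nodes(sample)
print(events)
# → [{'id': '1', 'county': 'CLAY COUNTY'}, {'id': '2', 'county': 'NASSAU COUNTY'}]
```

With the option disabled, the whole file would instead map to a single Event; enabling it is useful when the root node is just a container for many records.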
Things to Note
- Gzipped files are automatically unzipped by Hevo on ingestion.
- Files are re-ingested on update.
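The gzip handling noted above can be sketched as follows. This is an illustrative stand-in for what an ingestion step might do, assuming the raw bytes of a file have already been fetched from the FTP location; it is not Hevo's actual code:

```python
import gzip

GZIP_MAGIC = b"\x1f\x8b"  # the first two bytes of any gzip stream

def maybe_gunzip(raw: bytes) -> bytes:
    """Transparently decompress gzipped payloads; pass plain files through."""
    if raw[:2] == GZIP_MAGIC:
        return gzip.decompress(raw)
    return raw

payload = gzip.compress(b"CLAY COUNTY,32003,11973623\n")
print(maybe_gunzip(payload))          # → b'CLAY COUNTY,32003,11973623\n'
print(maybe_gunzip(b"plain,data\n"))  # → b'plain,data\n' (unchanged)
```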
Example: Automatic Column Header Creation for CSV Tables
Consider the following data in CSV format, which has no column headers.
CLAY COUNTY,32003,11973623
CLAY COUNTY,32003,46448094
CLAY COUNTY,32003,55206893
CLAY COUNTY,32003,15333743
SUWANNEE COUNTY,32060,85751490
SUWANNEE COUNTY,32062,50972562
ST JOHNS COUNTY,32033,846636
NASSAU COUNTY,32025,88310177
NASSAU COUNTY,32041,34865452
If you disable the Treat First Row As Column Headers option, Hevo auto-generates the column headers, which you can see in the Schema Mapper.
The record in the Destination then appears with these auto-generated column names.
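The auto-naming behavior can be sketched as below. The positional `column_N` names used here are illustrative; the exact names Hevo generates may differ:

```python
import csv
import io

def read_headerless_csv(text: str, delimiter: str = ","):
    """Parse a headerless CSV and attach auto-generated positional column names."""
    rows = list(csv.reader(io.StringIO(text), delimiter=delimiter))
    headers = [f"column_{i + 1}" for i in range(len(rows[0]))]
    return [dict(zip(headers, row)) for row in rows]

data = "CLAY COUNTY,32003,11973623\nNASSAU COUNTY,32025,88310177\n"
for record in read_headerless_csv(data):
    print(record)
# → {'column_1': 'CLAY COUNTY', 'column_2': '32003', 'column_3': '11973623'}
# → {'column_1': 'NASSAU COUNTY', 'column_2': '32025', 'column_3': '88310177'}
```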
Limitations
- Hevo does not support the UTF-16 encoding format for CSV files. As a workaround, you can convert the files to UTF-8 encoding before they are ingested by the Pipeline.
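The workaround above can be scripted. A minimal sketch that re-encodes a UTF-16 CSV file as UTF-8 before it is placed in the FTP location (the file paths are illustrative):

```python
import os
import tempfile

def convert_utf16_to_utf8(src_path: str, dest_path: str) -> None:
    """Re-encode a UTF-16 CSV file as UTF-8 so the Pipeline can ingest it."""
    with open(src_path, encoding="utf-16") as src:
        text = src.read()
    with open(dest_path, "w", encoding="utf-8") as dest:
        dest.write(text)

# Demo with a hypothetical file in a temporary directory.
with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "data_utf16.csv")
    dst = os.path.join(tmp, "data_utf8.csv")
    with open(src, "w", encoding="utf-16") as f:
        f.write("CLAY COUNTY,32003,11973623\n")
    convert_utf16_to_utf8(src, dst)
    with open(dst, "rb") as f:
        print(f.read())  # → b'CLAY COUNTY,32003,11973623\n'
```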
Refer to the following table for the list of key updates made to the page:
| Date | Release | Description of Change |
| --- | --- | --- |
| 22-Feb-2021 | NA | Added the limitation about Hevo not supporting the UTF-16 encoding format for CSV data. Read Limitations. |