|Note: The Amazon S3 Destination is not generally available. Please reach out to Hevo Support to enable this for your team.|
Hevo can load data from any of your Pipelines into an S3 location. In this document, we will look at the steps to add S3 as a Destination.
Since S3 is a file-based Destination, Hevo gives you the option to partition your data. To understand how data partitioning works for S3, refer to this document.
Note: The partition keys you choose for the S3-based Destination are not written as a column in the S3 file.
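To illustrate the note above, the sketch below assumes a hive-style `key=value` prefix layout; the exact object-key layout Hevo produces may differ. The point is that the partition value appears in the object key (the path), not as a column inside the file:

```python
import json

# Illustrative only: prefix, file name, and key layout are assumptions.
# Suppose the chosen partition key is "created_date".
event = {"id": 42, "status": "shipped", "created_date": "2023-05-01"}
partition_key = "created_date"

# The partition value goes into the object key (path)...
object_key = f"my-prefix/{partition_key}={event[partition_key]}/part-0000.json"

# ...and is not written as a column in the file itself.
record = {k: v for k, v in event.items() if k != partition_key}
line = json.dumps(record)

print(object_key)  # my-prefix/created_date=2023-05-01/part-0000.json
print(line)        # {"id": 42, "status": "shipped"}
```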
The AWS access key you provide below must have the PutObject and ListObjects privileges on both the S3 bucket and the location prefix that you specify.
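A minimal IAM policy granting these privileges might look like the following sketch. The bucket name `my-bucket` and prefix `my-prefix` are placeholders; note that in IAM, listing objects is granted through the `s3:ListBucket` action on the bucket itself, scoped to the prefix via a condition:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::my-bucket",
      "Condition": { "StringLike": { "s3:prefix": ["my-prefix/*"] } }
    },
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-bucket/my-prefix/*"
    }
  ]
}
```

Your security team may require a tighter policy; treat this as a starting point, not a definitive configuration.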
1. Create Destination
You can create a Destination either while creating a Pipeline or by clicking DESTINATIONS in the Asset Palette and then + CREATE in the Destinations Overview page.
2. Select the Destination Type
In the Add Destination page, select S3.
3. Fill in connection details
In the Configure Your Destination page, specify the following:
- Destination Name: A unique name for this Destination.
- Access Key ID: The AWS access key ID.
- Secret Access Key: The secret access key corresponding to the access key ID.
- Bucket Name: The name of the S3 bucket you want to load your data into.
- Prefix: The location prefix within the bucket where you want your data to be written.
- Bucket Region: The AWS region where your bucket is located.
- File Format: The format in which you want to write your data.
- JSON: Hevo writes each Event as a JSON object per line. For more information, refer to JSON Lines.
- Should files be GZipped?: If this option is enabled, Hevo GZips the files before writing them to S3.
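The two file options above can be sketched with Python's standard library. The sample Events and serialization details are assumptions for illustration:

```python
import gzip
import json

# JSON Lines format: one JSON object per line.
events = [
    {"id": 1, "status": "created"},
    {"id": 2, "status": "shipped"},
]
jsonl = "\n".join(json.dumps(e) for e in events) + "\n"

# With "Should files be GZipped?" enabled, the file content is
# compressed before it is written to S3.
compressed = gzip.compress(jsonl.encode("utf-8"))

# The original content round-trips through decompression.
assert gzip.decompress(compressed).decode("utf-8") == jsonl
```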
4. Test connection
After filling in the details, click TEST CONNECTION to verify connectivity and permissions for your S3 Destination.
5. Save Destination
Once the test is successful, click SAVE DESTINATION.