A Destination is any database or data warehouse to which you want to replicate your data. You can configure multiple Pipelines with the same Destination. You can also replicate data from multiple Sources into the same Destination. The data is loaded from all the Pipelines as per the replication schedule you configure for the Destination.
You can view the current status of Events being loaded to a Destination by using:

- The Load Status page in the Pipelines Detailed View: This page displays the number of Events being loaded to the Destination by the specific Pipeline.
- The Destination Overview page in the Destinations Detailed View: This page displays the status of Events being loaded to the Destination from all Pipelines using that Destination.
Note: Ensure that your Destination is configured correctly. If you try to replicate your data to an inactive or deleted database or data warehouse, the data replication fails.
Refer to the following table for the list of key updates made to this page:
| Date | Release | Description of Change |
| --- | --- | --- |
| Sep-07-2022 | 1.97 | Added a note about Destination failure in case of an invalid database or data warehouse. |
| Mar-21-2022 | NA | Added information about the Load Status and Destination Overview pages. |
- Articles in this section
- Familiarizing with the Destinations UI
- Amazon Aurora MySQL
- Microsoft SQL Server
- Connecting to a Local Database
- Data Warehouses
- Amazon Redshift
- Structure of Data in the Amazon Redshift Data Warehouse
- Loading Data to an Amazon Redshift Data Warehouse
- Troubleshooting Amazon Redshift Destination
- Amazon Redshift FAQs
- Hevo Managed Google BigQuery
- Google BigQuery
- Clustering in BigQuery
- Partitioning in BigQuery
- Loading Data to a Google BigQuery Data Warehouse
- Near Real-time Data Loading using Streaming
- Troubleshooting Google BigQuery
- Google BigQuery FAQs
- Snowflake
- Structure of Data in the Snowflake Data Warehouse
- Loading Data to a Snowflake Data Warehouse
- Troubleshooting Snowflake
- Snowflake FAQs
- Destination FAQs
- Can I create a Destination through API?
- Can I use Hevo to migrate data from one SaaS application to another SaaS application?
- Can I change the primary key in my Destination table?
- How do I change the data type of a Destination table column?
- Can I change the Destination table name after creating the Pipeline?
- How can I change or delete the Destination table prefix after creating a Pipeline?
- How do I resolve duplicate records in the Destination table?
- How do I enable or disable the deduplication of records in my Destination tables?
- Why does my Destination have records that are deleted in the Source?
- How do I filter deleted Events from the Destination?
- If I remove Hevo metadata columns from the Destination table, are these regenerated in the next load?
- Can I load data to a specific Destination table?
- How do I filter out specific fields before loading data to the Destination?
- How do I sort the data that I have loaded to my Destination?