Load data to Azure Data Lake Store with high throughput

In general, the suggested approach is to leverage Azure Data Factory to copy data between Azure Data Lake Store and many other data sources, such as Azure Blob storage, SQL Server, and Azure SQL Data Warehouse. It is code-free, and it handles performance optimization, resilience, and scheduling for you. You can even scale the copy out across multiple VMs for very high throughput. Refer to this article for more details: https://azure.microsoft.com/en-us/documentation/articles/data-factory-azure-datalake-connector/.
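To make the performance knobs concrete, below is a rough sketch of the copy pipeline such a setup would use. The JSON definition is built as a Python dict purely for illustration; the dataset names ("AzureBlobInput", "AzureDataLakeStoreOutput"), the dates, and the tuning values are placeholders, and the full dataset and linked-service definitions are covered in the linked article.

    import json

    # Sketch of a Data Factory (v1) copy pipeline that moves data from
    # Azure Blob storage into Azure Data Lake Store. "parallelCopies" and
    # "cloudDataMovementUnits" are the settings that spread the copy across
    # multiple threads and compute units (values here are placeholders).
    pipeline = {
        "name": "BlobToDataLakeCopyPipeline",
        "properties": {
            "activities": [{
                "name": "CopyFromBlobToADLS",
                "type": "Copy",
                "inputs": [{"name": "AzureBlobInput"}],             # placeholder dataset
                "outputs": [{"name": "AzureDataLakeStoreOutput"}],  # placeholder dataset
                "typeProperties": {
                    "source": {"type": "BlobSource"},
                    "sink": {"type": "AzureDataLakeStoreSink"},
                    "parallelCopies": 8,          # concurrent copy threads
                    "cloudDataMovementUnits": 8   # scale-out compute units
                },
                "policy": {"concurrency": 1, "retry": 3}
            }],
            "start": "2016-06-01T00:00:00Z",
            "end": "2016-06-02T00:00:00Z"
        }
    }

    print(json.dumps(pipeline, indent=2))

Raising parallelCopies and cloudDataMovementUnits is what lets the service fan the copy out over multiple workers, which is where the high throughput comes from.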


On the other hand, if you want to develop a custom application to load data into Azure Data Lake Store, you can leverage the Data Lake Store .NET SDK. Refer to this article for more details: https://azure.microsoft.com/en-us/documentation/articles/data-lake-store-get-started-net-sdk/.
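As an illustration of the same idea, here is a minimal sketch using the azure-datalake-store Python package (a sibling of the .NET SDK), which exposes a multi-threaded uploader. The tenant, client, account name, and paths below are placeholders you would replace with your own values.

    # pip install azure-datalake-store
    from azure.datalake.store import core, lib, multithread

    # Authenticate with an Azure AD service principal (placeholder values).
    token = lib.auth(tenant_id='<tenant-id>',
                     client_id='<client-id>',
                     client_secret='<client-secret>')

    # Connect to the Data Lake Store account (placeholder account name).
    adls = core.AzureDLFileSystem(token, store_name='<adls-account-name>')

    # Upload a local folder with many threads for high throughput; the
    # buffer/block sizes control how each file is split across connections.
    multithread.ADLUploader(adls,
                            lpath='C:\\data\\input',
                            rpath='/raw/input',
                            nthreads=64,
                            overwrite=True,
                            buffersize=4 * 1024 * 1024,
                            blocksize=4 * 1024 * 1024)

Splitting each file into blocks and pushing them over many concurrent connections is what lets a single client approach the store's ingress bandwidth; the same pattern applies if you build your uploader on the .NET SDK instead.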