(SQL) Tip of the Day: Large row support

Today’s Tip…

Support for large rows has been a frequent request from our users. This update adds support for rows larger than 32K and for data types over 8K: varchar(max), nvarchar(max) and varbinary(max). These data types make it even easier to migrate to the cloud by letting you keep more of the data types from your existing table definitions.
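As a sketch of what this enables, here is a hypothetical table definition using the new max types (the table and column names are illustrative, not from this release's documentation):

```sql
-- Illustrative only: a table whose rows can exceed 32K,
-- with individual column values larger than 8K.
CREATE TABLE dbo.Documents
(
    DocumentId  int            NOT NULL,
    Title       nvarchar(200)  NOT NULL,
    Body        nvarchar(max)  NULL,   -- large Unicode text
    RawPayload  varbinary(max) NULL    -- large binary content
)
WITH (HEAP);  -- large data types currently require Heap or Clustered Index tables
```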

In this first iteration of large row support, a few limits are in place that will be lifted in future updates. Loads of large rows are currently only supported through Azure Data Factory (with BCP), Azure Stream Analytics, SSIS, BCP, or the .NET SqlBulkCopy class. PolyBase support for large rows will be added in a future release. As in Azure SQL Database, support for the new large data types, such as varchar(max), is limited to Heap and Clustered Index tables. Large rows themselves are supported on Clustered Columnstore tables; it is only the large data types on Clustered Columnstore that will come later.
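To make the Clustered Columnstore limitation concrete, a hypothetical definition like the following (names are illustrative) would be expected to fail in the current release, since the max data types are not yet supported on Clustered Columnstore tables:

```sql
-- Illustrative only: expected to be rejected in this release,
-- because large data types are not yet supported on
-- Clustered Columnstore tables.
CREATE TABLE dbo.DocumentsCci
(
    DocumentId int           NOT NULL,
    Body       nvarchar(max) NULL   -- large data type not yet allowed here
)
WITH (CLUSTERED COLUMNSTORE INDEX);
```

Dropping the nvarchar(max) column, or switching to WITH (HEAP), would make the definition valid today.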

You can find more information and features here!