This is a quick demonstration of how we can migrate data from Azure SQL Database (or any other source that can be converted to CSV) into Azure Table Storage.

For a one-time migration, the easiest approach is to create a CSV however you like and import it through Azure Storage Explorer. For a recurring migration, Azure Data Factory is the better fit, since it can import the data directly without an intermediate CSV. We haven't needed anything recurring yet, so we're using the first option here.

Create the CSV

Create the CSV by querying the SQL server and exporting the results with the standard Azure PowerShell modules. Note that in this example we convert a date to a format that contains none of the characters disallowed in a Table Storage key (/, \, #, and ?).

# Acquire an Azure AD token for Azure SQL Database
$access_token = (Get-AzAccessToken -ResourceUrl "https://database.windows.net/").Token

# Date style 126 (ISO 8601) yields a value with no characters that are
# disallowed in a Table Storage RowKey. The source table name below is a
# placeholder inferred from the output file name.
$SQLDataQuery = "
	SELECT
		EntityId as 'PartitionKey'
		,convert(char(30), CreatedDate, 126) as 'RowKey'
	FROM dbo.Table
"

Invoke-Sqlcmd -ServerInstance $SQLServer -Database $SQLDatabase -AccessToken $access_token -Query $SQLDataQuery | Export-Csv -Path "F:\junk\SqlOut\dboTable.csv" -NoTypeInformation
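The key detail above is the RowKey format: Table Storage keys may not contain /, \, #, or ?, and SQL date style 126 (ISO 8601) avoids all of them. A quick sanity check of that idea in Python (a sketch; the column name and sample date are illustrative):

```python
from datetime import datetime

# Characters disallowed in an Azure Table Storage PartitionKey/RowKey.
DISALLOWED = set("/\\#?")

def to_row_key(dt: datetime) -> str:
    """Format a datetime the way SQL convert(char(30), ..., 126) does:
    ISO 8601 with millisecond precision."""
    return dt.strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3]  # trim to milliseconds

key = to_row_key(datetime(2023, 5, 17, 14, 30, 45, 123000))
print(key)
assert not DISALLOWED & set(key)  # no disallowed characters in the key
```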

Import the Data

Once you’ve created your CSV, the import is only a few clicks in Azure Storage Explorer.

Navigate to your table, click Import, select the file, and set the data type for each interpreted column. Verify that the PartitionKey and RowKey columns are present; these become the table's respective keys.
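If you later need to script this step instead of clicking through Storage Explorer, the same CSV can be pushed with the azure-data-tables Python package. A minimal sketch, assuming the connection string, table name, and CSV path are placeholders; the entity-building part runs locally:

```python
import csv
import io

def csv_to_entities(csv_text: str) -> list[dict]:
    """Turn CSV rows into Table Storage entity dicts.

    Each row must carry PartitionKey and RowKey columns, exactly as the
    Storage Explorer import expects.
    """
    entities = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        if "PartitionKey" not in row or "RowKey" not in row:
            raise ValueError("CSV must contain PartitionKey and RowKey columns")
        entities.append(dict(row))
    return entities

sample = '"PartitionKey","RowKey"\n"42","2023-05-17T14:30:45.123"\n'
print(csv_to_entities(sample))

# Uploading would then look roughly like this (requires azure-data-tables;
# the connection string and table name are placeholders):
# from azure.data.tables import TableClient
# client = TableClient.from_connection_string(conn_str, table_name="MyTable")
# for entity in csv_to_entities(open("dboTable.csv").read()):
#     client.upsert_entity(entity)
```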