
Eugen Podaru
In the past few weeks, we have been migrating our platform from one Azure account to another, and I have been using this as an opportunity to update various parts of our deployment pipelines. In the process, I have become much more familiar with the Azure CLI, and soon enough I became a fan (not a literal fan, just a fan of the Azure CLI).
Previously, we were using ARM templates to deploy our resource groups and the resources in them. This worked fine, but I did not particularly like them. One of the reasons is that they are way too verbose: even if you want a resource with all the settings left at their defaults, you still need to define a boatload of JSON. Of course, you don't have to write it all from scratch. You can start from a quickstart template and change only what you need, but it still takes too much effort to mentally parse and understand it all. Deployments are so much nicer, shorter, and easier to read and write with the Azure CLI.
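To give you an idea, here is what creating a resource group and a storage account with default settings looks like with the Azure CLI (the names below are just placeholders):

```bash
# Create a resource group and a storage account with default settings.
# The equivalent ARM template would run to dozens of lines of JSON.
az group create \
    --name my-resource-group \
    --location westeurope

az storage account create \
    --name mystorageaccount \
    --resource-group my-resource-group \
    --sku Standard_LRS
```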
Migrating the platform also means migrating the data, not just the resources. As it happens, we have data across a wide range of services: Storage Accounts (in Blobs, Table Storage, File Shares and Data Lake Gen 2), Data Lake Gen 1 and Azure SQL.
Many of the data migration tasks can be accomplished using Azure Storage Explorer, such as copying entire blob containers, tables, and file shares between accounts, as well as copying entire file systems between data lake accounts (both Gen 1 and Gen 2). It uses AzCopy underneath, which in turn uses server-to-server APIs to copy data directly between accounts, so it is rather efficient.
However, Azure Storage Explorer cannot automate any of the above tasks. Also, when copying blob containers, tables, and file shares, you need to do it one by one, which is not ideal if you have many in a particular storage account.
As you have probably guessed from the title, you can use the Azure CLI to copy all blob containers from one account to the other. There isn't much to it; it's just a simple command, az storage copy (currently in preview), that I found while looking for solutions:
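The invocation looks roughly like this, with the two connection strings stored in shell variables (the flag names come from the preview docs and may change between releases, so check az storage copy --help on your version):

```bash
# Copy every blob container (and everything in it) from the source
# account to the destination account, server to server.
# NOTE: the two connection-string flags are an assumption based on the
# preview docs; verify them with `az storage copy --help`.
az storage copy \
    --source-connection-string "$sourceConnectionString" \
    --connection-string "$destinationConnectionString" \
    --recursive
```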
Keep in mind that the command is currently in preview (as of Azure CLI 2.15.1), so it might change in future releases. Like AzCopy, it uses server-to-server APIs to perform the copy. You can also use SAS tokens instead of the connection strings:
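For example, by appending the SAS tokens to the account URLs (the account names below are placeholders):

```bash
# The SAS tokens are appended to the blob endpoint URLs;
# --recursive again copies all containers and their contents.
az storage copy \
    --source "https://mysourceaccount.blob.core.windows.net/?$sourceSasToken" \
    --destination "https://mydestinationaccount.blob.core.windows.net/?$destinationSasToken" \
    --recursive
```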
The account connection string and the SAS token can themselves be obtained with the Azure CLI:
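A sketch with placeholder names; the SAS here is scoped to the blob service, with read, write, list, and create permissions:

```bash
# Get the connection string of a storage account.
az storage account show-connection-string \
    --name mystorageaccount \
    --resource-group my-resource-group \
    --query connectionString \
    --output tsv

# Generate an account-level SAS token for the blob service (--services b),
# covering service, container and object resources (--resource-types sco),
# with (r)ead, (w)rite, (l)ist and (c)reate permissions, valid until expiry.
# If --account-key is omitted, the CLI will try to look it up using your
# logged-in credentials.
az storage account generate-sas \
    --account-name mystorageaccount \
    --services b \
    --resource-types sco \
    --permissions rwlc \
    --expiry 2021-12-31T23:59Z \
    --output tsv
```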
The nice thing about the Azure CLI is that you can include such scripts in your Azure Pipelines and run them alongside your other DevOps tasks, either manually or on a trigger. Below is an example of a pipeline that runs every day to copy all blob containers between two storage accounts. You could use it, for example, to keep the production and acceptance environments in sync:
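Here is a sketch of such a pipeline; the schedule and the variable names are just examples, so adjust them to your setup:

```yaml
# Run every night at 03:00 UTC, even when there are no new commits.
schedules:
  - cron: "0 3 * * *"
    displayName: Nightly container sync
    branches:
      include:
        - main
    always: true

# Don't run on pushes.
trigger: none

pool:
  vmImage: ubuntu-latest

steps:
  - task: AzureCLI@2
    displayName: Copy all blob containers
    inputs:
      # $(Subscription) holds the name of the Azure service connection.
      azureSubscription: $(Subscription)
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        # Same caveat as before: verify the connection-string flags
        # with `az storage copy --help`.
        az storage copy \
          --source-connection-string "$(SourceConnectionString)" \
          --connection-string "$(DestinationConnectionString)" \
          --recursive
```

The always: true setting is what makes the schedule fire even when nothing has changed on the branch.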
The pipeline assumes that you have pipeline variables defining the source and destination connection strings (and, depending on your script, the account names). The Subscription variable refers to the Azure service connection to use for the script.
Conveniently, you could use the Azure CLI to create both the service connection and the pipeline that runs the Azure CLI task. Mind blown 🤯! You would do that using the Azure DevOps CLI extension. Below are the commands; I leave it up to you to figure out how to use them:
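All the names below (organization, project, repository, service connection) are placeholders:

```bash
# Add the Azure DevOps extension to the Azure CLI.
az extension add --name azure-devops

# Set defaults so we don't have to repeat them on every command.
az devops configure --defaults \
    organization=https://dev.azure.com/myorganization \
    project=MyProject

# Create the Azure service connection. The service principal secret is
# read from the AZURE_DEVOPS_EXT_AZURE_RM_SERVICE_PRINCIPAL_KEY
# environment variable.
az devops service-endpoint azurerm create \
    --name MyServiceConnection \
    --azure-rm-service-principal-id "<service-principal-id>" \
    --azure-rm-subscription-id "<subscription-id>" \
    --azure-rm-subscription-name "<subscription-name>" \
    --azure-rm-tenant-id "<tenant-id>"

# Create the pipeline from a YAML definition in the repository.
az pipelines create \
    --name copy-blob-containers \
    --repository MyRepository \
    --repository-type tfsgit \
    --branch main \
    --yml-path pipelines/copy-blob-containers.yml
```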
When talking about Azure Storage Explorer, I mentioned that it uses AzCopy underneath. Well, AzCopy can itself be used to copy all blob containers between accounts, as well as for other interesting scenarios. But that is a topic for a future post!