In the past few weeks, we have been migrating our platform from one Azure account to another, and I have been using this as an opportunity to update different parts of our deployment pipelines. While doing this I have become more familiar with Azure CLI and soon enough became a fan (not a literal fan, just a fan of Azure CLI).
Previously we were using ARM templates to deploy our resource groups and the resources in them. This worked fine, but I did not particularly like them. One reason is that they are way too verbose. Even if you want a resource with all the settings at their defaults, you still need to define a boatload of JSON. Of course, you do not have to write it all from scratch. You can start from a quickstart template and change only what you need, but it still takes too much effort to mentally parse it all and understand it. Anyway, it is so much nicer, shorter, and easier to read and write using Azure CLI.
Migrating the platform also means migrating the data, not just the resources. As it happens, we have data across a wide range of services: Storage Accounts (in Blobs, Table Storage, File Shares and Data Lake Gen 2), Data Lake Gen 1 and Azure SQL.
A lot of the data migration tasks can be accomplished using Azure Storage Explorer, such as copying whole blob containers, tables and file shares between accounts, and copying entire file systems between data lake accounts (both Gen 1 and Gen 2). It uses AzCopy underneath, which uses server-to-server APIs to copy data directly between accounts, so it is rather efficient.
What cannot be done using Azure Storage Explorer is automating any of the above tasks. Also, blob containers, tables and file shares have to be copied one by one, which is not great if you have a lot of them in a particular storage account.
As you have probably guessed from the title, you can use Azure CLI to copy all blob containers from one account to another. There is not much to it; it is just a simple command, currently in preview, that I found while looking for ways to do this:
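A minimal sketch of that command, using the preview `az storage copy` command with placeholder account names (`mysourceaccount` and `mydestaccount` are assumptions; the command needs an authenticated `az` session with access to both accounts):

```shell
# Copy every blob container, and all blobs in them, from the source
# account to the destination account (account names are placeholders).
az storage copy \
    --source-account-name mysourceaccount \
    --destination-account-name mydestaccount \
    --recursive
```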
Keep in mind that it is currently in preview (Azure CLI v2.15.1), so it might change in future releases. It also uses server-to-server APIs to perform the operation. You can use SAS tokens instead of the connection strings:
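A sketch of the SAS variant, passing the source and destination as full URLs with SAS tokens appended (the account names and `<...>` tokens are placeholders you would generate yourself):

```shell
# Same copy as before, but authenticating each side with a SAS token
# embedded in the account URL instead of the signed-in identity.
az storage copy \
    --source "https://mysourceaccount.blob.core.windows.net?<source-sas-token>" \
    --destination "https://mydestaccount.blob.core.windows.net?<dest-sas-token>" \
    --recursive
```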
You can also get the account connection string or generate a SAS token using Azure CLI:
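For example, something along these lines (account, resource group and expiry values are placeholders; adjust the services, resource types and permissions to what the copy actually needs):

```shell
# Read the connection string for a storage account.
az storage account show-connection-string \
    --name mysourceaccount \
    --resource-group my-resource-group \
    --output tsv

# Generate an account-level SAS token for the blob service, valid until
# the given expiry, with read/write/list/create permissions.
az storage account generate-sas \
    --account-name mysourceaccount \
    --services b \
    --resource-types sco \
    --permissions rwlc \
    --expiry 2021-01-01T00:00Z \
    --output tsv
```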
The nice thing about Azure CLI is that you can include the scripts in your Azure Pipelines and run them along with other DevOps tasks, manually or on some trigger. Below is an example of a pipeline that runs every day to copy all blob containers between two storage accounts. You could use it, for example, to keep the production and acceptance environments in sync:
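A sketch of such a pipeline, using a cron schedule and the `AzureCLI@2` task (the variable names, branch and schedule are assumptions, not the original pipeline):

```yaml
# Run every day at 03:00 UTC, even if nothing changed in the repository.
schedules:
  - cron: "0 3 * * *"
    displayName: Daily container sync
    branches:
      include:
        - main
    always: true

trigger: none

pool:
  vmImage: ubuntu-latest

steps:
  - task: AzureCLI@2
    displayName: Copy all blob containers
    inputs:
      azureSubscription: $(Subscription)
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        az storage copy \
          --source-account-name $(SourceAccountName) \
          --destination-account-name $(DestinationAccountName) \
          --recursive
```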
The pipeline assumes that you have some pipeline variables that define the account names and connection strings. The Subscription variable refers to the Azure service connection to use for this script.
Conveniently, you could use Azure CLI to create both the service connection and the pipeline that runs the Azure CLI task. Mind blown 🤯! You would do that using the Azure DevOps CLI extension. Below are the commands; I leave it up to you to figure out how to use them:
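A sketch of those commands from the `azure-devops` extension (organization, project, repository and all the `<...>` IDs are placeholders; the service connection is backed by a service principal you create separately):

```shell
# Install the Azure DevOps extension and set default organization/project.
az extension add --name azure-devops
az devops configure --defaults \
    organization=https://dev.azure.com/myorg project=MyProject

# Create an Azure Resource Manager service connection backed by a
# service principal (all IDs shown are placeholders).
az devops service-endpoint azurerm create \
    --name MySubscriptionConnection \
    --azure-rm-service-principal-id <app-id> \
    --azure-rm-subscription-id <subscription-id> \
    --azure-rm-subscription-name "My Subscription" \
    --azure-rm-tenant-id <tenant-id>

# Create a pipeline from a YAML file already committed to the repository.
az pipelines create \
    --name copy-blob-containers \
    --repository MyRepo \
    --repository-type tfsgit \
    --branch main \
    --yml-path pipelines/copy-blob-containers.yml
```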
When talking about Azure Storage Explorer, I mentioned that it uses AzCopy underneath. Well, AzCopy can also be used to copy all the blob containers between accounts, as well as for other interesting scenarios. But that is a matter for a future post!