Business Central Musings

For the things we have to learn before we can do them, we learn by doing them

Copy files from Azure Blob Storage to File System (using Azure CLI)


Preamble

In the previous blog post I described how to use Power Automate to copy files from Azure Blob Storage to a local folder. Today, we'll learn how to use the Azure CLI to copy Azure Blob Storage containers to the file system.

The Azure Command-Line Interface (CLI) is a cross-platform command-line tool that can be installed locally on Windows, macOS, and Linux. You can get this tool from here.
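On Windows, one way to install it (a minimal sketch, assuming the winget package manager is available on your machine) is from a terminal:

winget install -e --id Microsoft.AzureCLI

az --version

The second command simply confirms the installation by printing the CLI version.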

About Azure CLI

Almost anything you can do in the Azure Portal GUI you can also achieve with the Azure CLI.

Visit this list to see what Azure CLI can manage.

In my previous blog post I showed that, as soon as a new file lands in the Azure Storage container, it is pushed to a local folder via a Power Automate flow.

Use Azure CLI to copy Azure Blob Storage to File System

Today, I’ll describe the steps I took to build a script that runs under Task Scheduler and pulls the daily files (blobs) from an Azure Storage blob container to a local folder at a scheduled time.

In VS Code (or your favorite scripting environment) open a terminal window.

  • First, from the console, log in to your Azure account:
az login

Output:

                                                                                                                                            
The default web browser has been opened at https://login.microsoftonline.com/common/oauth2/authorize. Please continue the login in the web browser. If no web browser is available or if the web browser fails to open, use device code flow with `az login --use-device-code`.
You have logged in. Now let us find all the subscriptions to which you have access...
[
  {
    "cloudName": "AzureCloud",
    "homeTenantId": "80e51828-6b27-4102-9478-a14375194b20",
    "id": "48b7fb28-08c7-444f-8e6b-7615735db9b2",
    "isDefault": true,
    "managedByTenants": [],
    "name": "Azure subscription 1",
    "state": "Enabled",
    "tenantId": "80e51828-6b27-4102-9478-a14375194b20",    
    "user": {
      "name": "admin@CRMbc691816.onmicrosoft.com",
      "type": "user"
    }
  }
]
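If your account has access to more than one subscription, you can check which one is active and switch if necessary (the subscription name below is the one from the output above):

az account show --output table

az account set --subscription "Azure subscription 1"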
  • Then, in the console, create a service principal with the role Owner:
az ad sp create-for-rbac -n "SVservicePrincipal" --role Owner --create-cert

In plain English this means: create in Azure a service principal for role-based access control (RBAC), assign it the Owner role, and create a certificate for this service principal. Owner is broader than this task strictly needs; a narrower-scoped variant is sketched after the output below.

The result:

Creating 'Owner' role assignment under scope '/subscriptions/48b7fb28-08c7-444f-8e6b-7615735db9b2'
The output includes credentials that you must protect. Be sure that you do not include these credentials in your code or check the credentials into your source control. For more information, see https://aka.ms/azadsp-cli
'name' property in the output is deprecated and will be removed in the future. Use 'appId' instead.
Please copy C:\Users\svir\tmpa41iqdm1.pem to a safe place. When you run 'az login', provide the file path in the --password argument  
  "appId": "a448d6fc-f8b7-4847-9bf7-93f56bc7451f",
  "displayName": "SVservicePrincipal",
  "fileWithCertAndPrivateKey": "C:\\Users\\svir\\tmpnow6fl5e.pem",
  "name": "a448d6fc-f8b7-4847-9bf7-93f56bc7451f",
  "password": null,
  "tenant": "80e51828-6b27-4102-9478-a14375194b20"
  • With the service principal created, let’s log out of Azure:
az logout
  • Let’s log back in using the service principal we just created and copy some blobs locally from an Azure Storage blob container:

The login part:

az login --service-principal --username "a448d6fc-f8b7-4847-9bf7-93f56bc7451f" --password 'C:\\Users\\svir\\tmpnow6fl5e.pem' --tenant "80e51828-6b27-4102-9478-a14375194b20"

The output is:


{
  "appId": "a74da28b-5dfd-49eb-b2a3-7dfdc8279998",
  "displayName": "SVservicePrincipal1",
  "fileWithCertAndPrivateKey": "C:\\Users\\svir\\tmpa41iqdm1.pem",
  "name": "a74da28b-5dfd-49eb-b2a3-7dfdc8279998",
  "password": null,
  "tenant": "80e51828-6b27-4102-9478-a14375194b20"
}

And the copy part:

az storage azcopy blob download -c vendorlist --account-name svflorida -s * -d "c:\\temp\\localsvflorida" --recursive

Outcome:


INFO: Scanning...
INFO: Any empty folders will not be processed, because source and/or destination doesn't have full folder support
INFO: azcopy.exe: A newer version 10.11.0 is available to download


Job 4b2742c1-1e57-a44a-6a24-a40e6b60f01f has started
Log file is located at: C:\Users\svir\.azcopy\4b2742c1-1e57-a44a-6a24-a40e6b60f01f.log

0.0 %, 0 Done, 0 Failed, 3 Pending, 0 Skipped, 3 Total, 


Job 4b2742c1-1e57-a44a-6a24-a40e6b60f01f summary
Elapsed Time (Minutes): 0.0334
Number of File Transfers: 3
Number of Folder Property Transfers: 0
Total Number of Transfers: 3
Number of Transfers Completed: 3
Number of Transfers Failed: 0
Number of Transfers Skipped: 0
TotalBytesTransferred: 884
Final Job Status: Completed

The blobs located in the vendorlist Azure Storage container are now downloaded locally into my folder C:\Temp\LocalSVFlorida.
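If you prefer not to depend on the azcopy integration, the built-in batch download command gives a similar result (a sketch, reusing the same container, storage account, and destination folder as above):

az storage blob download-batch --source vendorlist --destination "c:\temp\localsvflorida" --account-name svflorida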

  • The last step is automation. I will be using Task Scheduler in Windows.
    • I will create a basic task that executes the login part and the copy part wrapped in a PowerShell script.
    • First, open Task Scheduler and create a basic task:
      • give it a name
      • schedule it at the desired time
      • when triggered, the action is to execute a PowerShell script:

And below is the script:

az login --service-principal --username "a448d6fc-f8b7-4847-9bf7-93f56bc7451f" --password 'C:\\Users\\svir\\tmpnow6fl5e.pem' --tenant "80e51828-6b27-4102-9478-a14375194b20"

az storage azcopy blob download -c vendorlist --account-name svflorida -s * -d "c:\\temp\\localsvflorida" --recursive
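If you prefer to create the scheduled task from PowerShell instead of the Task Scheduler GUI, here is a minimal sketch (assuming the two commands above are saved as C:\Scripts\CopyBlobs.ps1, a path chosen purely for illustration):

# Run the script with PowerShell when the task fires
$action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\CopyBlobs.ps1"

# Trigger the task every day at 6:00 AM
$trigger = New-ScheduledTaskTrigger -Daily -At "6:00AM"

# Register the task under the current user
Register-ScheduledTask -TaskName "Copy Azure blobs locally" -Action $action -Trigger $trigger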

To test, upload a new file to your Azure blob container and run (or wait for) the scheduled task.
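If you want to create that test blob from the command line as well, one option (a sketch; test.csv is just an example file in the current folder) is:

az storage blob upload --account-name svflorida --container-name vendorlist --name test.csv --file .\test.csv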

Conclusion:

We saw two ways to copy blobs (files) from an Azure Storage container to a local folder.

The first method used a Microsoft template flow in Power Automate, while the second copied Azure Blob Storage containers to the file system through a PowerShell script scheduled to run on a regular basis.

In the next blog post we will see how we can generate extracts in Business Central and store them in Azure Storage blob containers.

Hope this helps!

You can find the script here.
