Copy files from Azure Blob Storage to File System (using Power Automate)

I found an older post on community.dynamics.com in which someone was asking for ways to automatically drop data extracts originating in BC SaaS into a local folder.

First, in SaaS, we can’t generate the files automatically and store them locally.

We need to store them in the cloud.

Once they are in the cloud, how can we automatically download them to a local machine or a network folder?

I bing-ed the phrase “copy files from azure blob storage to file system” and the first search result was this link to a Power Automate template flow:

There are a multitude of cloud providers, but Microsoft continuously does a great job of connecting everything between BC SaaS, the Azure platform, Power Automate and Power Apps, so it’s just convenient to use its tools.

To test it, I went through the following exercise:

  • In the Azure portal I created a storage account and, in it, a blob container.
    • “A container organizes a set of blobs, similar to a directory in a file system. A storage account can include an unlimited number of containers, and a container can store an unlimited number of blobs.”
  • I created a local folder that the new flow will keep synchronized with the container in Azure.
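If you prefer scripting the setup over clicking through the portal, the same two Azure resources can be created with the Azure CLI. A minimal sketch, using my storage account name “svflorida”; the resource group, location and container name are placeholders I made up for illustration:

az group create --name rg-blobdemo --location eastus
az storage account create --name svflorida --resource-group rg-blobdemo --location eastus --sku Standard_LRS
az storage container create --name extracts --account-name svflorida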

In Power Automate, I started with the Template provided by Microsoft and set up the flow:

The flow requires two connectors to be set up:

  • one to the Azure storage container
  • one to the local or network folder

Editing the Azure Blob Storage connector, we see that we need the name of the Azure storage account, in my case “svflorida”, and the storage access key:

The storage access key is located in the Azure portal under Access Keys:
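The key can also be pulled with the Azure CLI; a quick sketch, reusing the resource group name I made up earlier:

az storage account keys list --resource-group rg-blobdemo --account-name svflorida --query "[0].value" --output tsv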

Editing the File System Connector:

The most time-consuming part, about half an hour, was setting up and troubleshooting the gateway.

The flow cannot just drop files from Azure on your machine. It needs a gateway.

To create a new gateway, click on the drop-down and choose “+ New on-premises data gateway”.

That will prompt you to download an msi to install a gateway: GatewayInstall.msi.

Once the gateway was installed, the only change I made was to switch from HTTPS to TCP:

In a live environment I would investigate and maybe set up an Azure Service Bus, but for the purpose of this exercise I went with TCP.

Once that is done, the flow will be triggered when new files are uploaded to, or deleted from, the Azure container.

I noticed that with my free trial license the recurrence of the flow was set to 3 minutes.

The flow seems to pick up changes as expected; just be patient and wait for the next run 🙂

In the Azure portal, upload a new file into your container:
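Or, from the command line, something along these lines should do it (the container name and file are again my placeholders):

az storage blob upload --account-name svflorida --container-name extracts --name test.txt --file test.txt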

The file will appear after a few minutes in your local folder:

And the flow shows a successful run:

That’s it! In the next blog I will look into how I can generate BC SaaS extracts into an Azure storage container so the flow doesn’t feel useless 🙂

I hope this helps someone. Anyway, it’s late here, so I’m calling it a night!

Generate Azure Business Central containers using deployment template and parameter files

As soon as I started working with containers, more specifically with Azure Containers, around mid-December 2018, I quickly ran into a few questions: how can I automate the container creation? How can I update a container (scale up or down, override settings)? How can I scale out my configuration? For some of my questions I identified answers; for others the research is ongoing.

As we established, I am not exactly an expert, but if you’re still here: the process of generating your first Azure container loaded with Business Central is a fairly easy one. Check my previous blog, where I described the process step by step.

I like to mess around, and I did mess around with the tables where the extensions are managed (system tables 2000000150 NAV App*), ending up with a corrupt container, or rather with a corrupt Business Central. Because I did not have any important data, I could just delete the container and run through the steps of manually creating it again. But what if I wanted to automate the process? What if I needed to build 5 distinct containers? How can I speed up the process and make it scalable?

Instead of going through the last blog’s exercise to delete the corrupt container and re-create it, I decided to investigate the Microsoft documentation around deployment templates and deployment parameter files.

This is what I learnt:

In the portal, go to the container created in the previous blog, click on “Automated script” and download:

[Screenshot: the deployment template]

Download the automated script into a new Visual Studio Code folder. I chose to save it as azuredeploy.json.

[Screenshot: the automated script saved in Visual Studio Code]

Above is the deployment template I’m going to work with to automate the creation of new containers loaded with a Business Central image. The current image in the template code, Microsoft/bcsandbox:latest, won’t have data. If you want sample data in your new container(s), use this image: Microsoft/bcsandbox:base. If you need more info about loading your Business Central with data, read Waldo’s and Roberto’s blogs.
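For orientation, the image is set on the container resource inside azuredeploy.json. Stripped down to just the relevant part (ports, CPU and memory settings omitted by me), that section of the template looks roughly like this:

"containers": [
  {
    "name": "d365bc-container-fromtemplate",
    "properties": {
      "image": "microsoft/bcsandbox:base"
    }
  }
]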

Additionally, create a new file (the script); I named it templatedeploy.ps1:

[Screenshot: templatedeploy.ps1]
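That screenshot is gone, so here is a minimal sketch of what templatedeploy.ps1 could contain at this stage; the resource group name comes from the cleanup command later in the post, and the location is my assumption:

# templatedeploy.ps1 – create the resource group, then deploy the template into it
az group create --name rg-template --location eastus
az group deployment create --resource-group rg-template --template-file azuredeploy.json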

Before we run this script, we have to take a closer look at the deployment template downloaded from the portal.

[Screenshot: the template’s original parameters section]

I replaced the highlighted section above with this one below:

[Screenshot: my new parameters section]
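The post doesn’t spell out which 3 parameters were added, so the declarations below are my guesses (a container name, a DNS label and the image), shown only to illustrate the shape of a parameters section:

"parameters": {
  "containerName": {
    "type": "string",
    "defaultValue": "d365bc-container-fromtemplate"
  },
  "dnsNameLabel": {
    "type": "string"
  },
  "image": {
    "type": "string",
    "defaultValue": "microsoft/bcsandbox:base"
  }
}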

I’m adding 3 new parameters, but you could parametrize almost any setting and create placeholders for it in the deployment template:

[Screenshot: placeholders in the deployment template]
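A placeholder is just a parameters() reference that replaces the hard-coded value. Continuing with my guessed parameter names, the relevant lines in the template would turn into something like:

"name": "[parameters('containerName')]",
"image": "[parameters('image')]",
"dnsNameLabel": "[parameters('dnsNameLabel')]"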

Moreover, I needed to create a new file in our project, parameters.json:

[Screenshot: parameters.json]
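Again a sketch matching my guessed parameter names; the envelope ($schema and contentVersion) is the standard deployment parameter file format:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "containerName": { "value": "d365bc-container-fromtemplate" },
    "dnsNameLabel": { "value": "d365bc-fromtemplate" },
    "image": { "value": "microsoft/bcsandbox:base" }
  }
}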

Before running the script, the “az group deployment create” command looks like this:

[Screenshot: the deployment command]
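Presumably something like this, now passing the parameter file in (the quotes around @parameters.json keep PowerShell from treating it as a splatting operator):

az group deployment create --resource-group rg-template --template-file azuredeploy.json --parameters '@parameters.json'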

Now I’m ready to run the PowerShell script:

[Screenshot: the result of running the script]

To be able to log in to Business Central, we need the admin credentials, which can be obtained with the command:

az container logs -g rg-template -n d365bc-container-fromtemplate

To perform some cleanup (remove the resource group and its content) run:

az group delete -n rg-template --yes

Let’s now scale out our deployment to 2 containers:

[Screenshot: the scaled-out template]
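The screenshot is missing here as well; one common way to get 2 container groups out of a single template is ARM’s copy element on the resource, sketched below with my guessed parameter name (the rest of the resource stays as before):

"copy": {
  "name": "containerLoop",
  "count": 2
},
"name": "[concat(parameters('containerName'), '-', copyIndex())]"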

And after running “templatedeploy.ps1” we go to the Azure portal, where we can see 2 containers under our single deployment:

[Screenshot: 2 containers in the Azure portal]

Check the logs, identify the admin password, and you’re ready to log in to your container!

That’s what I learnt. What would you add?