I found an older post on community.dynamics.com in which someone was asking for ways to automatically drop data extracts originating in BC SaaS into a local folder. Thus, the overarching theme of this blog post is using Power Automate to sync Azure Blob Storage to a File System.
If you haven't done so yet, you can learn how to export Business Central entities to Azure Blob Storage in this article.
Additionally, if you are interested in learning about Azure CLI, I explained the same use case, copying files from Azure Blob Storage to File System using Azure CLI in this blog post.
First, in SaaS, we can’t generate the files automatically and store them locally. We need to store them in the cloud.
Once in the cloud, how can we automatically download them locally on a machine or a network folder?
I Bing-ed the phrase “copy files from azure blob storage to file system” and the first search result was this link to a Power Automate template flow:
There are a multitude of cloud providers, but Microsoft continuously does a great job of connecting everything between BC SaaS, the Azure platform, Power Automate, and Power Apps, so it's just convenient to use its tools.
To test it, I went through the following exercise:
- In the Azure portal I created a storage account, and in it a blob container. As Microsoft's documentation puts it: “A container organizes a set of blobs, similar to a directory in a file system. A storage account can include an unlimited number of containers, and a container can store an unlimited number of blobs.”
- I created a local folder that the new flow will keep synchronized with the container in Azure.
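If you prefer to script the container setup instead of clicking through the portal, here is a minimal sketch using the `azure-storage-blob` Python SDK. The connection string is a placeholder, and `bc-extracts` is a hypothetical container name I chose for illustration; the post itself doesn't name the container. The name validator reflects Azure's published container naming rules (3-63 characters, lowercase letters, digits, and hyphens, starting and ending with a letter or digit, no consecutive hyphens):

```python
import re


def is_valid_container_name(name: str) -> bool:
    """Check Azure's blob container naming rules before creating one:
    3-63 chars, lowercase letters/digits/hyphens, must start and end
    with a letter or digit, and no consecutive hyphens."""
    if not 3 <= len(name) <= 63:
        return False
    if "--" in name:
        return False
    return re.fullmatch(r"[a-z0-9](?:[a-z0-9-]*[a-z0-9])?", name) is not None


def create_container_and_upload(connection_string: str,
                                container_name: str,
                                file_path: str) -> None:
    """Create a container and upload one file into it.
    Requires `pip install azure-storage-blob` and a real connection
    string from the portal; not executed in this sketch."""
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(connection_string)
    container = service.create_container(container_name)
    with open(file_path, "rb") as data:
        container.upload_blob(name=file_path, data=data)


# Validate the (hypothetical) container name before trying to create it:
print(is_valid_container_name("bc-extracts"))   # True
print(is_valid_container_name("Bad_Name"))      # False
```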
In Power Automate, I started with the Template provided by Microsoft and set up the flow:
The flow requires two connectors to be set up:
- one to the Azure storage container
- one to the local or network folder
While editing the Azure Blob Storage connector, we see that we need the name of the storage account, in my case “svflorida”, and the storage access key:
The storage access key is located in the Azure portal under Access Keys:
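The account name and access key are typically combined into a standard storage connection string, which is what many tools and SDKs expect. A small sketch of how that string is assembled, using the “svflorida” account from above and a placeholder key:

```python
def build_connection_string(account_name: str, account_key: str) -> str:
    """Assemble the standard Azure Storage connection string from the
    account name and one of the two access keys shown in the portal."""
    return (
        "DefaultEndpointsProtocol=https;"
        f"AccountName={account_name};"
        f"AccountKey={account_key};"
        "EndpointSuffix=core.windows.net"
    )


# Placeholder key -- never commit a real access key to source control.
print(build_connection_string("svflorida", "<access-key>"))
```

Treat the access key like a password: anyone holding it has full access to the storage account, so prefer a secrets store over hard-coding it.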
Editing the File System Connector:
The most time-consuming part, about half an hour, was setting up and troubleshooting the gateway.
The flow cannot just drop files from Azure on your machine. It needs a gateway.
To create a new gateway, click on the drop-down and choose “+ New on-premises data gateway”.
That will prompt you to download an MSI installer for the gateway: GatewayInstall.msi.
Once the gateway was installed, the only change I made was to switch from HTTPS to TCP:
In a live environment I would investigate and maybe set up an Azure Service Bus, but for the purpose of this exercise I went with TCP.
Once that is done, the flow will be triggered when files are uploaded to or deleted from the Azure container.
I noticed that with my free trial license the recurrence of the flow was set to 3 minutes.
The flow picks up changes as expected; just be patient and wait for the next run 🙂
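This waiting makes sense because the trigger is polling-based: on each recurrence the connector lists the container and compares the result against the previous run. A rough local simulation of that diff logic (my own sketch, not the connector's actual implementation):

```python
def diff_snapshots(previous: set, current: set) -> tuple:
    """Return (added, removed) blob names between two listing snapshots,
    mimicking how a polling trigger detects changes between runs."""
    return current - previous, previous - current


# One polling cycle: a file was uploaded and another was deleted
# between the previous run and this one.
before = {"sales.csv", "items.csv"}
after = {"sales.csv", "customers.csv"}
added, removed = diff_snapshots(before, after)
print(added)    # {'customers.csv'}
print(removed)  # {'items.csv'}
```

With a 3-minute recurrence, a change can therefore sit in the container for up to one full interval before the flow notices it.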
In the Azure portal, upload a new file into your container:
And the flow shows a successful run:
That’s it! I shared here how to leverage Power Automate to sync Azure Blob Storage to File System.
In the next article I will build on this post by looking into how to generate BC SaaS extracts into an Azure storage container, so the flow is used in a real-world example.
I hope this helps someone!