I found an older post on community.dynamics.com in which someone was asking for ways to automatically drop data extracts originating in Business Central SaaS into a local folder.
First, in SaaS, we can’t generate the files automatically and store them locally.
We need to store them in the cloud.
Once in the cloud, how can we automatically download them locally on a machine or a network folder?
I Bing-ed the phrase “copy files from azure blob storage to file system” and the first search result was this link to a Power Automate template flow:
There are plenty of cloud providers, but Microsoft does a consistently great job of connecting everything between BC SaaS, the Azure platform, Power Automate, and Power Apps, so it’s just convenient to use its tools.
To test it, I went through the following exercise:
- In Azure Platform I created a storage account and in it I created a Blob Container.
- “A container organizes a set of blobs, similar to a directory in a file system. A storage account can include an unlimited number of containers, and a container can store an unlimited number of blobs.”
- I created a local folder that the new flow will keep synchronized with the container in Azure.
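To seed the container with a test file without clicking through the portal, you can also upload a blob over the REST API. This is a minimal sketch assuming you have already generated a SAS URL for the container (the container name, blob name, and SAS token below are placeholders; in practice the azure-storage-blob SDK or the az CLI would do this for you):

```python
import urllib.request

def build_put_blob_request(container_sas_url: str, blob_name: str,
                           data: bytes) -> urllib.request.Request:
    """Build a Put Blob request against a container-level SAS URL."""
    # Split "https://<account>.blob.core.windows.net/<container>?<sas-token>"
    base, _, sas_token = container_sas_url.partition("?")
    blob_url = f"{base}/{blob_name}?{sas_token}"
    request = urllib.request.Request(blob_url, data=data, method="PUT")
    # Put Blob requires the blob type header; BlockBlob is the common case.
    request.add_header("x-ms-blob-type", "BlockBlob")
    return request

# Hypothetical SAS URL for an "extracts" container on the "svflorida" account:
sas_url = "https://svflorida.blob.core.windows.net/extracts?sv=2022-11-02&sig=..."
req = build_put_blob_request(sas_url, "customers.csv", b"No;Name\n10000;Adatum")
# urllib.request.urlopen(req) would perform the actual upload (needs network).
```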
In Power Automate, I started with the Template provided by Microsoft and set up the flow:
The flow requires two connectors to be set up:
- one to the Azure storage container
- one to the local or network folder
Editing the Azure Blob Storage connector, we see that we need the name of the Azure storage account, in my case “svflorida”, and the storage access key:
The storage access key is located in the Azure portal under Access Keys:
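The access key is the secret that signs every request the connector sends to the storage account, which is why it should be treated like a password. As an illustration of the mechanism only (the real Shared Key string-to-sign includes the HTTP verb, many headers, and the canonical resource), a signature is just an HMAC-SHA256 over a canonical request string, keyed with the base64-decoded account key:

```python
import base64
import hashlib
import hmac

def sign_request(account_key_b64: str, string_to_sign: str) -> str:
    """HMAC-SHA256 the canonical string with the decoded account key,
    then base64-encode the digest - the core of Shared Key auth."""
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

# Placeholder key and a simplified string-to-sign, for illustration only:
demo_key = base64.b64encode(b"not-a-real-storage-key").decode()
signature = sign_request(demo_key, "GET\n/svflorida/extracts")
```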
Editing the File System Connector:
The most time-consuming part, about half an hour, was setting up and troubleshooting the gateway.
The flow cannot just drop files from Azure on your machine. It needs a gateway.
To create a new gateway, click on the drop down and choose “+ New on-premises data gateway”.
That will prompt you to download an msi to install a gateway: GatewayInstall.msi.
Once the gateway was installed, the only change I made was to switch from HTTPS to TCP:
In a live environment I would investigate and maybe set up an Azure Service Bus, but for the purpose of this exercise I went with TCP.
Once that is done, the flow will be triggered when new files are uploaded to or deleted from the Azure container.
I noticed that with my free trial license the recurrence of the flow was set to 3 minutes.
The flow seems to pick up changes as expected, just be patient and wait for the next run 🙂
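Conceptually, each recurrence of the flow does something like the following: compare the listing on the cloud side against the local folder and copy over whatever is missing. Here is a stdlib-only sketch of that sync step, where a plain source directory stands in for the container so the logic can run offline (in the real flow, the listing comes from the Azure Blob Storage connector and the copy is the File System connector writing through the gateway):

```python
import shutil
from pathlib import Path

def sync_new_files(source: Path, target: Path) -> list[str]:
    """Copy files that exist in source but not yet in target,
    mirroring what the flow does on each recurrence."""
    target.mkdir(parents=True, exist_ok=True)
    copied = []
    for entry in sorted(source.iterdir()):
        if entry.is_file() and not (target / entry.name).exists():
            shutil.copy2(entry, target / entry.name)
            copied.append(entry.name)
    return copied
```

This sketch only handles new files; mirroring deletions as well would need a second pass removing local files that no longer appear in the source listing.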
In the Azure portal, upload a new file into your container:
The file will appear after a few minutes in your local folder:
And the flow shows a successful run:
That’s it! In the next blog I will look into how I can generate BC SaaS extracts into an Azure storage container so the flow doesn’t feel useless 🙂
I hope this helps someone. Anyway, it’s late here, so I’ll call it a night!