In the last 6 months I’ve been involved with a number of GP to BC migration projects.
A recurring question that reaches our team is: how do I see GP data in BC?
One avenue for moving your business to BC is to import master data, open transactions, and tested setup tables with RapidStart packages. If the underlying table of the desired GP entity does not exist in BC, a Business Central developer would need to create the table in BC, and with the Edit in Excel functionality you can then get the GP data into BC.
There is also the Cloud Migration Tool in BC. More about it here.
Using this tool ensures that the most important entities (master data and open transactions) will make it into BC. But what if a GP end-user wants additional GP data in BC?
Microsoft's recommendation is to bring as little as possible into the cloud from an on-premises database.
Moreover, as your database capacity grows, your cost can increase. See more here.
If the decision is, though, to have some GP data in Business Central, there are tools to make that possible.
We can extend the cloud migration tool so that, when the migration starts, besides the core migrated data (master data and open transactions), the process also brings the data from the GP table into a new space (an extension table), as mapped on the "Manage Custom Tables" page.
What’s needed to achieve this:
Create a Business Central extension. In it, create an AL table to store your data from a GP table
Add the custom table in Manage Custom Tables
Run migration tool
Check custom table content after migration
Let's try bringing table GL00100 from GP into BC.
Note: this table was chosen only for demonstration. GL00100 is brought into the BC table "G/L Account" by default by the cloud migration tool.
Create extension with GP table
I created an extension that includes a table for this GP entity:
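For illustration, here is a minimal sketch of what such a table could look like. The object ID, data types and lengths are arbitrary, the fields are only an assumed subset of the GL00100 (Account Master) columns, and I'm assuming the field names should mirror the GP column names so the migration mapping can match them:

table 50340 "GP GL00100 Buffer"
{
    // Sketch only: IDs, lengths and the field subset are illustrative assumptions.
    DataClassification = CustomerContent;
    ReplicateData = true; // default; the cloud migration replicates this table's data

    fields
    {
        field(1; ACTINDX; Integer) { }       // GP account index
        field(2; ACTNUMBR_1; Code[20]) { }   // first account segment
        field(3; ACTDESCR; Text[100]) { }    // account description
    }
    keys
    {
        key(PK; ACTINDX) { Clustered = true; }
    }
}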
Map migration for new table in “Cloud Migration Management”
In Business Central, search for “Cloud Migration Management”.
Under Actions, trigger the "Manage Custom Tables" action:
On the "Migration Table Mapping" page, map the new table in your extension to the GP table:
On "Cloud Migration Management", trigger the "Run Migration Now" action.
You can check the results in the cue in the Migration Information area:
To check the content migrated:
change the company to the migrated company
run the new table by adding "&Table=50340" to the Business Central URL:
We can now see the result of migrating the GP data to the custom BC table:
To answer the question in the title, you don't lose GP data. There are multiple ways of accessing your GP data after going live on BC:
retaining access to your old system
migrating your Dynamics GP installation to Azure (SQL Server and application)
migrating your GP data warehouse to Azure Data Lake
or, as shown above, keeping your GP data in Business Central with minimal coding
Engage with your partner and decide what GP data you really need today, so that your cloud ERP stays performant long term.
As most of you probably know, it is not possible to access the file system from the Business Central cloud environment.
For example, in Dynamics NAV, we could have a job queue entry that, when run, creates a file and copies it to a network folder. We can still do that in an on-premises environment, but not with cloud BC.
You could create the file and use DownloadFromStream, but that only prompts the user to download it locally; it does not copy it to a local or network folder.
If you try to use File.Create() you would get the warning: “The type or method ‘Create’ cannot be used for ‘Extension’ development”.
If your customer is happy to grab the file manually from the Downloads folder every time, then this should suffice:
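Below is a minimal sketch of that approach, assuming a hypothetical export procedure: it builds the file in a Temp Blob and hands it to the browser with DownloadFromStream, so the user still picks it up from the Downloads folder.

procedure DownloadExtract()
var
    TempBlob: Codeunit "Temp Blob";
    OutStr: OutStream;
    InStr: InStream;
    FileName: Text;
begin
    FileName := 'VendorExtract.csv'; // illustrative file name
    TempBlob.CreateOutStream(OutStr);
    OutStr.WriteText('No.;Name');    // illustrative content only
    TempBlob.CreateInStream(InStr);
    // Prompts the browser to download the file; it cannot place it in a specific folder
    DownloadFromStream(InStr, '', '', '', FileName);
end;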
But, if we want to automate this process and run the extract on a regular basis, we need to find a cloud solution for storing the files.
Currently, there are 4 types of storage in the Azure platform:
In my previous blog I dove into Azure Table storage and tackled its API.
This blog is about interacting with the Azure storage blob containers:
manually, via Azure Portal
simulation, via VS Code extension “Rest Client”
Business Central extension
view blob container with Excel
get Azure Blobs locally
I found on Michael Megel’s blog a nice solution for exactly what I need. Awesome job on Blob Containers API, Michael! Thank you for sharing!
What I need:
Set up a blob container to store Business Central exported files
Set up Storage Access Key
In VS Code, write requests with “Rest Client” extension, targeting Azure blob container API
A setup table in Business Central for Azure access stuff
Write an export interface that allows users to run an action ("Write File in Azure") that sends the extract to the Azure container; the same code could be executed by a job queue (a sketch follows below)
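Here is a minimal sketch of the "Write File in Azure" part, assuming a hypothetical "Azure Blob Setup" table that stores the storage account name, container name and SAS token; it uploads the extract as a block blob through the Azure Blob REST API:

procedure WriteFileInAzure(FileName: Text; FileContent: Text)
var
    AzureBlobSetup: Record "Azure Blob Setup"; // hypothetical setup table
    Client: HttpClient;
    Content: HttpContent;
    Headers: HttpHeaders;
    Response: HttpResponseMessage;
    Url: Text;
begin
    AzureBlobSetup.Get();
    // PUT https://<account>.blob.core.windows.net/<container>/<blob>?<SAS token>
    Url := StrSubstNo('https://%1.blob.core.windows.net/%2/%3?%4',
        AzureBlobSetup."Storage Account Name",
        AzureBlobSetup."Container Name",
        FileName,
        AzureBlobSetup."SAS Token"); // assumed to be stored without the leading '?'

    Content.WriteFrom(FileContent);
    Content.GetHeaders(Headers);
    if Headers.Contains('Content-Type') then
        Headers.Remove('Content-Type');
    Headers.Add('Content-Type', 'text/csv');
    Headers.Add('x-ms-blob-type', 'BlockBlob'); // required by the Put Blob operation

    if not Client.Put(Url, Content, Response) then
        Error('Could not reach the blob endpoint.');
    if not Response.IsSuccessStatusCode() then
        Error('Upload failed: %1 %2', Response.HttpStatusCode(), Response.ReasonPhrase());
end;

The same procedure can be called from a page action ("Write File in Azure") or from a job queue entry codeunit.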
Blob Container Setup
To set up a container, following Michael's notes on the above blog was enough for me.
For blob container access I went down the path of a shared access signature ("SAS token").
Once created, you can start playing with the storage account container API.
I created the storage manually:
Drilling down into the storage account, I created a new container:
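As a quick smoke test of the container API (the same operation the "Rest Client" requests simulate), here is a hedged AL sketch that lists the blobs in the new container using the List Blobs operation and the SAS token:

procedure ListBlobs(StorageAccount: Text; Container: Text; SasToken: Text): Text
var
    Client: HttpClient;
    Response: HttpResponseMessage;
    Url: Text;
    ResponseText: Text;
begin
    // List Blobs operation: GET <container>?restype=container&comp=list, authorized by the SAS token
    Url := StrSubstNo('https://%1.blob.core.windows.net/%2?restype=container&comp=list&%3',
        StorageAccount, Container, SasToken); // SAS token assumed to be passed without a leading '?'
    if not Client.Get(Url, Response) or not Response.IsSuccessStatusCode() then
        Error('Could not list blobs: %1', Response.HttpStatusCode());
    Response.Content().ReadAs(ResponseText); // XML listing of the blob names
    exit(ResponseText);
end;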
In my previous blog I showed that, as soon as a new file makes it in the Azure Storage container, the file is pushed locally via a Power Automate flow.
Today, I'll describe the steps I took to generate a script that runs under the Task Scheduler shell and pulls daily files (blobs) locally from an Azure Storage blob container at a scheduled time.
In VS Code (or your favorite scripting environment), open a terminal window.
First, from the console, log in to your Azure portal:
The default web browser has been opened at https://login.microsoftonline.com/common/oauth2/authorize. Please continue the login in the web browser. If no web browser is available or if the web browser fails to open, use device code flow with `az login --use-device-code`.
You have logged in. Now let us find all the subscriptions to which you have access...
"name": "Azure subscription 1",
Then, in the console, create a service principal with the role Owner:
az ad sp create-for-rbac -n "SVservicePrincipal" --role Owner --create-cert
In plain English this means: create a service principal in Azure using role-based access control, with the Owner role assigned. Also, create a certificate for this service principal.
Creating 'Owner' role assignment under scope '/subscriptions/48b7fb28-08c7-444f-8e6b-7615735db9b2'
The output includes credentials that you must protect. Be sure that you do not include these credentials in your code or check the credentials into your source control. For more information, see https://aka.ms/azadsp-cli
'name' property in the output is deprecated and will be removed in the future. Use 'appId' instead.
Please copy C:\Users\svir\tmpa41iqdm1.pem to a safe place. When you run 'az login', provide the file path in the --password argument
With the service principal created, let's log out of Azure:
Then let's log back in using the newly created service principal and attempt to copy some blobs locally from an Azure Storage blob container:
Log in part:
az login --service-principal --username "a448d6fc-f8b7-4847-9bf7-93f56bc7451f" --password 'C:\\Users\\svir\\tmpnow6fl5e.pem' --tenant "80e51828-6b27-4102-9478-a14375194b20"
INFO: Any empty folders will not be processed, because source and/or destination doesn't have full folder support
INFO: azcopy.exe: A newer version 10.11.0 is available to download
Job 4b2742c1-1e57-a44a-6a24-a40e6b60f01f has started
Log file is located at: C:\Users\svir\.azcopy\4b2742c1-1e57-a44a-6a24-a40e6b60f01f.log
0.0 %, 0 Done, 0 Failed, 3 Pending, 0 Skipped, 3 Total,
Job 4b2742c1-1e57-a44a-6a24-a40e6b60f01f summary
Elapsed Time (Minutes): 0.0334
Number of File Transfers: 3
Number of Folder Property Transfers: 0
Total Number of Transfers: 3
Number of Transfers Completed: 3
Number of Transfers Failed: 0
Number of Transfers Skipped: 0
Final Job Status: Completed
We can see below that the blobs located in the vendorlist Azure storage container are now downloaded locally to my folder C:\Temp\LocalSVFlorida.
The last step is automation. I will be using Task Scheduler in Windows.
I will create a basic task that will execute the login part and the copy part, wrapped in a PowerShell script.
First open Task Scheduler and create a basic task:
give it a name
schedule it at the desired time
when triggered, the action executes a PowerShell script:
I found an older post on community.dynamics.com in which someone was asking for ways to automatically drop data extracts originating in BC SaaS into a local folder.
First, in SaaS, we can’t generate the files automatically and store them locally.
We need to store them in the cloud.
Once in the cloud, how can we automatically download them locally on a machine or a network folder?
I bing-ed the phrase “copy files from azure blob storage to file system” and the first search result was this link to a Power Automate template flow:
There is a multitude of cloud providers, but Microsoft continuously does a great job of connecting everything between BC SaaS, the Azure platform, Power Automate and Power Apps, so it's just convenient to use its tools.
To test it, I went through the following exercise:
In Azure Platform I created a storage account and in it I created a Blob Container.
“A container organizes a set of blobs, similar to a directory in a file system. A storage account can include an unlimited number of containers, and a container can store an unlimited number of blobs.”
I created a local folder that will be synchronized with the Azure container by the new flow.
In Power Automate, I started with the Template provided by Microsoft and set up the flow:
The flow requires two connectors to be set up:
one to the Azure storage container
one to the local or network folder
Editing the Azure Blob Storage connector, we see that we need the name of the Azure storage account, in my case "svflorida", and the storage access key:
The storage access key is located in the Azure portal under Access Keys:
Editing the File System Connector:
The most time-consuming part, about half an hour, was setting up and troubleshooting the gateway.
The flow cannot just drop files from Azure on your machine. It needs a gateway.
To create a new gateway, click on the drop down and choose “+ New on-premises data gateway”.
That will prompt you to download an msi to install a gateway: GatewayInstall.msi.
Once the gateway was installed, the only change I made was to switch from HTTPS to TCP:
In a live environment I would investigate and maybe set up an Azure Service Bus, but for the purpose of this exercise I went with TCP.
Once that is done, the flow will be triggered when new files are uploaded to or deleted from the Azure container.
I noticed that with my free trial license the recurrence of the flow was set to 3 minutes.
The flow seems to pick up changes as expected; just be patient and wait for the next run 🙂
In the Azure portal, upload a new file into your container:
The file will appear after a few minutes in your local folder:
And the flow shows a successful run:
That’s it! In the next blog I will look into how I can generate BC SaaS extracts into an Azure storage container so the flow doesn’t feel useless 🙂
I hope this helps someone. Anyway, it's late here, so I'll call it a night!
The repo consists of this page and two codeunits that handle the page's internal mechanics.
If customers find themselves lost among all the setup and settings pages, or if the "Manual Setup" page is too large, we could gather the most frequently used pages on a custom page, "ABC Advanced Settings", like I did below:
Why would we do that? Isn’t the browser page automatically refreshed?
Not always. Not when a 3rd-party app updates the records.
Let’s do some tests in a SaaS environment.
I created a custom entity (testTable) with a list page and an API page. I'll start by pushing 10 records to the table via a batch request in Postman:
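Before looking at the results, here is a minimal sketch of what the API page over testTable could look like (the object names, IDs and the two fields are illustrative assumptions); the Postman batches target this endpoint:

page 50341 "testTable API"
{
    PageType = API;
    APIPublisher = 'demo';
    APIGroup = 'refresh';
    APIVersion = 'v1.0';
    EntityName = 'testEntry';
    EntitySetName = 'testEntries';
    SourceTable = testTable;
    DelayedInsert = true;
    ODataKeyFields = SystemId;

    layout
    {
        area(Content)
        {
            repeater(records)
            {
                field(id; Rec.SystemId) { Editable = false; }
                field(entryCode; Rec."Code") { }        // assumed field on testTable
                field(description; Rec.Description) { } // assumed field on testTable
            }
        }
    }
}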
This is the result when executing “refresh” action:
And now let's send another batch with 4 Delete requests:
Next, I'm going to send another batch of 10 records to BC.
Using a new action ("refresh-SelectLatestVersion") that does not contain a SelectLatestVersion() call gives us the following:
It appears that SelectLatestVersion() does not make any difference in SaaS, and that modifying records through a native BC API does not require SelectLatestVersion().
Let’s try something similar in an On-prem installation.
When records are updated by other apps, not through Business Central means (which, by the way, is not a great idea), the page is not notified of changes in the underlying data and is therefore in a stale state.
How can we enforce the data to update?
Using SelectLatestVersion() we clear the client cache for the underlying table and initiate a new read transaction for the affected table, which affects performance.
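A sketch of what such a manual refresh action could look like on the list page (my approximation, not necessarily the exact code behind the tests above):

actions
{
    area(Processing)
    {
        action(Refresh)
        {
            Caption = 'Refresh';
            ApplicationArea = All;
            Image = Refresh;

            trigger OnAction()
            begin
                SelectLatestVersion();   // drop the cached data for the table
                CurrPage.Update(false);  // re-render the page without saving the current record
                Message('Refreshed at %1', CurrentDateTime());
            end;
        }
    }
}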
Let's see how long it actually takes the server to grab the latest data.
I inserted 1,000,000 records via T-SQL:
and this is what I’ve got when I refreshed the page:
Then I removed all records:
As you can see above, even though my CurrPage.Update call comes before the Message, the page still shows the records. I am guessing the Message gets displayed before the page is re-rendered.
After clicking OK, the page gets rendered again and shows 0 records.
It took 69 milliseconds, but the table had only 2 fields. With more fields the refresh might take longer.
Sometimes customers will ask for an auto-refresh page. While there are technical means to satisfy the request, we need to recognize that this comes at a price: it hurts performance. And when applying auto-refresh to multiple pages, the price multiplies accordingly.
Things to consider:
avoid, when possible, the use of SelectLatestVersion() on-premises
in SaaS there is no need for SelectLatestVersion(); refreshing the page via an action or the browser's F5 displays the latest data
avoid auto-refreshing; rather go with a manual refresh (a page action that calls SelectLatestVersion() and refreshes the page) than an auto-refresh (a timer controladdin)
to decrease the number of SelectLatestVersion() and CurrPage.Update calls, log your refresh triggers (count and refresh datetime), then compare the current count against the last refresh count, or get the maximum SystemModifiedAt among your records and compare it against your last logged datetime (see the sketch below)
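As a rough sketch of that last idea (the helper functions and the log storage are hypothetical), a procedure on the page could skip the expensive calls when nothing has changed since the last refresh:

procedure RefreshIfChanged()
var
    TestRec: Record testTable;
    LastRefreshedAt: DateTime;
begin
    LastRefreshedAt := GetLastRefreshDateTime(); // hypothetical helper reading a small log record
    TestRec.SetCurrentKey(SystemModifiedAt);
    if TestRec.FindLast() and (TestRec.SystemModifiedAt <= LastRefreshedAt) then
        exit; // nothing changed since the last refresh, so skip SelectLatestVersion/Update

    SelectLatestVersion();
    CurrPage.Update(false);
    SetLastRefreshDateTime(CurrentDateTime()); // hypothetical helper writing the log record
end;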
Recently I have been involved in projects involving migrating Dynamics GP customers to BC (SaaS).
All these GP customers had integrations developed over the years, integrations that now need to be re-targeted to BC.
Now, if you are new to Business Central API integration, you should know that some authoritative bloggers have covered OAuth 2.0 in the last 6 months with very useful how-tos. Have a look at Stefano's or Roberto's blogs. The most comprehensive writings in this specific niche, though, I find to be A.J. Kauffmann's.
A.J.'s blogs are meticulous. Moreover, they seem to arrive almost perfectly aligned with my work requirements, so … yeah … I found them very useful.
As you have probably heard or seen, Basic Authentication is no longer supported for BC online.
The only option now (at least for BC online) is using OAuth2 authorization.
How do we start with setting up OAuth2?
Well, I won't go into that, because A.J.'s blog was immaculate; I didn't have anything to add, so I won't add anything.
To conserve the flow of this blog, all I have to say is that you need to:
Register the external application in Azure Active Directory
Create the external application account in Business Central
Once these steps are completed, we can move on to Postman.
Get a Token
In Postman, add the following request to generate a security token: