The repo consists of this page and two codeunits that handle the internal mechanics of this page.
If customers find themselves lost among all the setup and settings pages, or if the “Manual Setup” page is too large, we could gather the most used pages on a custom “ABC Advanced Settings” page, like I did below:
Recently I have been involved in projects involving migrating Dynamics GP customers to BC (SaaS).
All these GP customers had integrations developed over the years, integrations that now need to be re-targeted to BC.
Now, if you are new to Business Central API integration, you should know that some authority bloggers have covered OAuth 2.0 in the last 6 months with very useful how-tos. Have a look at Stefano‘s or Roberto‘s blogs. The most comprehensive writing in this specific niche, though, I find to be A.J. Kauffmann‘s.
A.J.’s blogs are meticulous. Moreover, they seem to land almost exactly in line with my work requirements, so … yeah … I found them very useful.
As you have probably heard or seen, Basic Authentication is no longer supported for BC online.
The only option now (at least for BC online) is OAuth2 authorization.
How do we start with setting up OAuth2?
Well, I won’t go into that because A.J.’s blog was immaculate and I didn’t have anything to add, so I won’t add anything.
To preserve the flow of this blog, all I will say is that you need to:
Register the external application in Azure Active Directory
Create the external application account in Business Central
Grant consent for the application in Business Central
Once these 3 steps are completed we can move to Postman.
Get a Token
In Postman, add the following request to generate a security token:
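Postman hides the raw HTTP call it makes; for reference, a minimal sketch of the same token request (the OAuth2 client credentials flow against the Microsoft identity platform v2.0 endpoint) looks like this in Python. The tenant ID, client ID, and secret below are placeholder values, not real ones:

```python
from urllib.parse import urlencode

# Placeholder values -- substitute your own tenant and app registration details.
tenant_id = "11111111-2222-3333-4444-555555555555"
client_id = "my-app-client-id"
client_secret = "my-app-client-secret"

# Token endpoint for the client credentials flow (v2.0).
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"

# Form-encoded body of the POST request Postman sends under the hood.
body = urlencode({
    "grant_type": "client_credentials",
    "client_id": client_id,
    "client_secret": client_secret,
    "scope": "https://api.businesscentral.dynamics.com/.default",
})

print(token_url)
print(body)
# POSTing this body to token_url returns a JSON payload whose
# "access_token" field goes into the Authorization: Bearer header.
```

The response's `access_token` is what you attach as a Bearer token to subsequent Business Central API calls.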
As a Business Central developer, I don’t get to set up standard Business Central processes every day; I mostly design and set up the processes in my customization work. Standard Business Central setup is covered by functional consultants.
The exam MB-800 has been active for about a year (beta version started in October 2020) and given my developer experience with setting up the system I thought I’d give it a try. This blog contains materials I found, read, and tested; hopefully it will provide a good starting point for study for others.
This exam tests your skills in setting up Business Central SaaS. It is easy to get a trial of BC SaaS, which can be used to train for this exam. And if you need more time, you can extend your trial by another 30 days: just navigate to https://businesscentral.dynamics.com/?page=1828 and extend it. If 60 days is not enough, you can start a new trial.
“Business Central customers can use up to 80 GB of database storage capacity across all of their environments (production and sandbox), meaning that the sum of database capacity usage of all of their environments must not exceed 80 GB” – Microsoft Docs
One way to keep your SaaS customers’ database size in check is by “Migrating BLOB data types to Media or MediaSet – Data in Media or Media set data types aren’t counted in the database limit. As an extension developer, consider migrating data from blobs to the Media or MediaSet datatypes for your own extensions” – per Microsoft documentation.
Said and done.
Let’s create a new table that contains a MediaSet field.
In a list page for the above table, I displayed the MediaSet field in a factbox with 3 actions:
The code in each action was based on the Customer Image factbox, but adapted to use the “Tenant Media” table from Microsoft’s System app.
RunRequestPage allows developers to record the request page settings of a Dynamics NAV/Business Central report without actually running the report. The output of this command is an xml string.
XmlParameters := REPORT.RUNREQUESTPAGE(50000);
What if we want to process the report differently under certain conditions explicitly defined by the report options? In that case we need to be able to parse the output of RunRequestPage.
Simple enough. One way is to use XMLDocument.LoadXml to load the string into a DotNet variable and then use DotNet functions to get the values of the nodes.
If you want to avoid DotNet, you can use the “XML Buffer Writer” codeunit (1235) and the “XML Buffer” table (1235) in a codeunit called from an action.
XMLBuffer, XMLSpecialInterestNode : Record 1235;
XMLBufferWriter : Codeunit 1235;
First, we run the request page for report 50000. This opens the request page, allowing the user to set all options/filters. Once finished, click OK.
All the options/filters for the report will be recorded in the string XmlParameters.
Secondly, we load the xml string into an xml structure inside NAV, using table and codeunit 1235. This is done via function InitializeXMLBufferFromText from codeunit 1235.
We can then filter the entries and locate the option we are interested in.
In my case I had a report option “Run Later” … if this option is true, I do a different type of processing instead of just running the report. Think in terms of what you could do with a report besides running it: keep track of run time, email the output …
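The parameter string that RunRequestPage returns is plain XML, with the option values under an Options node and the filters under DataItems. Outside NAV, the lookup that the XML Buffer performs can be sketched in a few lines of Python. The sample string and the “Run Later” field name below are illustrative, not the exact output of report 50000:

```python
import xml.etree.ElementTree as ET

# Illustrative sample of the request-page parameter XML; the real string
# comes from XmlParameters := REPORT.RUNREQUESTPAGE(50000).
xml_parameters = """
<ReportParameters name="My Report" id="50000">
  <Options>
    <Field name="Run Later">true</Field>
  </Options>
  <DataItems>
    <DataItem name="Customer">SORTING(Field1) WHERE(Field1=1(10000..20000))</DataItem>
  </DataItems>
</ReportParameters>
"""

root = ET.fromstring(xml_parameters)

# Locate the option node we are interested in, analogous to
# filtering the XML Buffer entries on the node name.
run_later = root.find("./Options/Field[@name='Run Later']").text

print(run_later)
```

If the recorded value is true, the caller can branch into the alternative processing instead of running the report directly.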
As soon as I started working with containers, more specifically Azure Containers, around mid-December 2018, I quickly ran into a few questions: how can I automate container creation? How can I update a container (scale up or down, override settings)? How can I scale out my configuration? For some of my questions I identified answers; for others the research is ongoing.
As we established, I am not exactly an expert, but if you’re still here: the process of generating your first Azure container loaded with Business Central is a fairly easy one. Check my previous blog, where I described the process step by step.
I like to mess around, and I did mess around with the tables where extensions are managed (system tables 2000000150 NAV App*), ending up with a corrupt container, or rather with a corrupt Business Central. Because I did not have any important data, I could just delete the container and run through the steps of manually creating it again. But what if I wanted to automate the process? What if I needed to build 5 distinct containers? How can I speed up the process and make it scalable?
In the portal, go to the container created in the previous blog, click on “Automated script” and download:
Download the automated script into a new Visual Studio Code folder. I chose to save it as azuredeploy.json.
Above is the deployment template I’m going to work with to automate the creation of new containers loaded with a Business Central image. The current image in the template code, Microsoft/bcsandbox:latest, won’t have data. If you want sample data in your new container(s), use this image: Microsoft/bcsandbox:base. If you need more info about loading your Business Central with data, read Waldo’s and Roberto’s blogs.
Additionally, create a new file (the script) – I named it templatedeploy.ps1:
Before we run this script we have to take a closer look at the deployment template downloaded from the portal.
I replaced the highlighted section above with this one below:
I’m adding 3 new parameters, but you could parametrize almost any setting and create a placeholder for it in the deployment template:
Moreover, I needed to create a new file in our project, parameters.json:
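An ARM parameters file follows a fixed schema: a `parameters` object with one `value` entry per template parameter. A sketch of what mine could look like follows; the three parameter names here (container name, DNS label, image) are hypothetical stand-ins for whichever settings you chose to parametrize:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "containername": { "value": "d365bc-container-fromtemplate" },
    "dnsnamelabel": { "value": "d365bc-fromtemplate" },
    "image": { "value": "microsoft/bcsandbox:latest" }
  }
}
```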
Before running the script “az group deployment create” looks like this:
Now I’m ready to run the PowerShell script:
To be able to log in to Business Central we need the admin credentials, which can be obtained with the command:
az container logs -g rg-template -n d365bc-container-fromtemplate
To perform some cleanup (remove resource group and its content) run:
az group delete -n rg-template --yes
Let’s now scale out our deployment to 2 containers:
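In an ARM template, scaling out to several identical resources is done with the `copy` property on the resource, with `copyIndex()` making each name unique. A sketch of the relevant part of the container resource (the `containername` parameter name is a hypothetical stand-in, and the other resource properties are omitted):

```json
{
  "type": "Microsoft.ContainerInstance/containerGroups",
  "apiVersion": "2018-10-01",
  "name": "[concat(parameters('containername'), '-', copyIndex(1))]",
  "location": "[resourceGroup().location]",
  "copy": {
    "name": "containercopy",
    "count": 2
  }
}
```

With `count` set to 2, one deployment produces two container groups, named with suffixes 1 and 2.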
And after running “templatedeploy.ps1” we go to the Azure Portal, where we can see 2 containers under our unique deployment:
Check the logs, identify the admin password, and you’re ready to log in to your container!
To start writing extensions for Business Central we have a few choices: installing one of the release candidates locally (they come in the same format as any other Dynamics NAV DVD package), creating a locally hosted Docker sandbox, or running a container instance in Azure.
As the process of getting a container takes just a few minutes, I prefer to do my extension development and testing in an Azure container.