Azure Storage Table and its API

One way to make Business Central data available to different parties and users is to leverage Azure Storage.

Customers will need an Azure subscription.

An Azure storage account offers four types of storage: Blob Containers, File Shares, Queues, and Tables.

Today’s blog is about Azure Storage Tables and their API.

More on its API here.

In this blog I cover:

  • creating a storage account table using the Azure platform;
  • inserting data into the storage account table via the VS Code extension “REST Client”;
  • an extension that sends Business Central vendor data to the Azure Storage table;
  • using Excel to share Azure Storage table data with 3rd party users.

Create storage account and storage account table

Through the Azure portal, I created one storage account and a few tables inside that storage account.

The process is simple and you can find details about storage accounts here and an overview on storage account tables here.

You can manually insert data into these tables using Azure Storage Explorer.

Let’s see how we can interact with them, first via REST Client (an extension for VS Code), and then via Business Central extensions.

Using Rest Client to interact with Azure Storage Table

1. We can query the Azure tables via REST Client in VS Code:

[in the picture insert your SAS token right after “?”]

2. Query Vendor table:

3. Insert a new record in the Vendor table:
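Since the screenshots are not reproduced here, below is a minimal sketch of how the insert request is put together. The storage account name, SAS token, and vendor values are all placeholders, not the real ones from my setup:

```python
import json

def build_insert_request(account, table, sas_token, entity):
    # The Table service inserts an entity via a POST to the table resource;
    # the SAS token travels as the URL query string.
    url = f"https://{account}.table.core.windows.net/{table}?{sas_token}"
    headers = {
        "Content-Type": "application/json",
        "Accept": "application/json;odata=nometadata",
    }
    # Every entity must carry PartitionKey and RowKey.
    body = json.dumps(entity)
    return url, headers, body

url, headers, body = build_insert_request(
    "mystorageacct", "Vendor", "sv=2020-08-04&sig=placeholder",
    {"PartitionKey": "", "RowKey": "10000", "No": "10000", "Name": "London Postmaster"},
)
```

In REST Client the same thing is a `POST` line with the URL, the two headers, and the JSON body underneath.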

Let’s now verify the last action in the Azure portal:

Use Azure Storage Table API from Business Central

With the API tested, let’s now move to Business Central and AL and try to insert records in the Azure tables.

The sample code I worked on scans all vendors and sends Vendor.”No.” and Vendor.Name to the Vendor table in Azure Storage.

When you create a new table in Azure Table storage, it comes by default with two fields:

  • PartitionKey
  • RowKey

In my example, PartitionKey will be empty, but you could populate it with the company name.

RowKey will be populated with Vendor.SystemId.
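As a language-neutral sketch of that mapping (the vendor record and SystemId value below are made up; the actual AL source is linked later in this post):

```python
def vendor_to_entity(vendor, company_name=""):
    # PartitionKey is left empty in this example, but could hold the company name;
    # RowKey gets the vendor's SystemId, which is unique and stable.
    return {
        "PartitionKey": company_name,
        "RowKey": vendor["SystemId"],
        "No": vendor["No"],
        "Name": vendor["Name"],
    }

entities = [vendor_to_entity(v) for v in [
    {"SystemId": "f7a1c2d4-0000-0000-0000-000000000001",
     "No": "10000", "Name": "London Postmaster"},
]]
```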

“Azure Storage Setup” is an extension table that keeps all the Azure connection details:

To log the work done by the POST requests, I created a table and a page, and I insert a record in this table with each successful POST request.

Use the Azure Storage Explorer to view records in the Azure Portal:

What about the 3rd party users?

How do we give them access to the data in the storage table?

The good old Excel is here to contribute.

Use of Excel to share Azure Storage Data with 3rd party users

  • open Excel
  • go to the Data menu
  • Get Data -> From Azure Storage Table
  • when prompted, enter the storage URL for “Account Name or URL”:
  • for the account key, enter the access key from the storage account

And this is what we get:

Click on Load, and then double click on Vendor Connection:

Power Query opens and we can enable the other fields (click on the Content column’s expand icon and select the missing columns):

You can find the AL sample code here.

To use or not to use SelectLatestVersion()

When using web services or API-exposed entities, you might find it useful to request that the service fetch the latest version of the underlying data.

The documentation at Database.SelectLatestVersion Method – Business Central | Microsoft Docs states that by using the SelectLatestVersion() function you make sure that “the data displayed is the most current data in the database”.

Why would we do that? Isn’t the browser page automatically refreshed?

Not always. Not when a 3rd party app updates the records.

Let’s do some tests in a SaaS environment.

I created a custom entity (testTable) with a list page and an API page. I’ll start by pushing 10 records to the table via a batch request in Postman:

This is the result when executing “refresh” action:

And now let’s send another batch with 4 DELETE requests:

Next, I’m going to send another 10 records batch to BC.

Using a new action, “refresh-SelectLatestVersion”, which does not contain SelectLatestVersion(), gives us the following:

It appears that SelectLatestVersion() does not make any difference in SaaS, and that modifying records through a native BC API does not require SelectLatestVersion().

Let’s try something similar in an On-prem installation.

When records are updated by other apps, not through Business Central means (by the way, not a great idea), the page is not notified of changes in the underlying data and therefore is in a stale state.

How can we enforce the data to update?

By using SelectLatestVersion() we clear the cache for the underlying table and initiate a new read transaction against it, which affects performance.

Let’s see how long it actually takes the server to grab the latest data.

I inserted via T-SQL 1,000,000 records:

and this is what I’ve got when I refreshed the page:

Then I removed all records:

As you can see above, while my CurrPage.Update is before Message, the page still shows the records. I am guessing that the Message gets displayed before the page is re-rendered.

After clicking OK, the page gets rendered again and shows 0 records.

It took 69 milliseconds, but the table had only 2 fields. With more fields it might take longer.

Sometimes customers will ask for an auto-refresh page. While there are technical means to satisfy the request, we need to recognize that this comes with a price, hurting performance. And when applying an auto-refresh to multiple pages the price consequently multiplies.

Things to consider:

  1. Avoid, when possible, the use of SelectLatestVersion() on-premises.
  2. In SaaS there is no need for SelectLatestVersion(); refreshing the page via an action or browser F5 displays the latest data.
  3. Avoid auto-refreshing. Rather go with a manual refresh (an action on the page that refreshes and calls SelectLatestVersion()) than an auto-refresh (a timer control add-in).
  4. To decrease the number of SelectLatestVersion() and CurrPage.Update() calls, log your refresh triggers (count and refresh datetime), compare the current count against the last refresh count, or get the maximum SystemModifiedAt among your records and compare it against your last logged datetime …
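Point 4 above can be sketched like this. This is a hypothetical helper, not code from the extension; all names are made up:

```python
from datetime import datetime

class RefreshLog:
    """Decide whether an expensive SelectLatestVersion()-style refresh is needed."""

    def __init__(self):
        self.last_count = 0
        self.last_refresh_at = datetime.min

    def needs_refresh(self, current_count, max_modified_at):
        # Refresh only if the row count changed, or something was modified
        # after the refresh we last logged.
        return current_count != self.last_count or max_modified_at > self.last_refresh_at

    def log_refresh(self, current_count, at):
        self.last_count = current_count
        self.last_refresh_at = at

log = RefreshLog()
first = log.needs_refresh(10, datetime(2021, 6, 30))   # first run: counts differ
log.log_refresh(10, datetime(2021, 7, 1))
second = log.needs_refresh(10, datetime(2021, 6, 30))  # nothing new: skip the refresh
```

The same count/datetime comparison can live in an AL codeunit behind the page’s refresh action.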

Extension code for SaaS is located here.

Business Central: Word Layout date field formatting

One of my co-workers asked how we can format a date field in an MS Word layout document so that it shows only the date, not the time.

To get started with adding fields to a Word Report Layout start with this:

How to Add Fields to a Word Report Layout – Business Central | Microsoft Docs

Let’s say you added a date field: CustomDate to a BC report.

When you run the report the date field renders on the report as DateTime: 30-06-2021 12:00:00.

How do we force the report to display only the date: 30-06-2021?

Let’s look at a simple example: you want to add to the report 206, Sales Invoice, in a custom Word Layout the DueDate_SalesInvHeader field from the dataset.

If you followed the link above you might have now a control on your document that looks like this:

If this field is coming from AL with Date and Time components, how do we format it so it only shows the Date?

In RDLC you would go to the properties of the text box that is hosting your datetime field and change the Format property to “d”.

How about Word Layouts?

It’s quite simple, but not easily available in the documentation.

You would change the date field in the layout to show like this:
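The layout screenshot is not reproduced here, but the usual approach is Word’s date-time format switch on the merge field (assuming the DueDate_SalesInvHeader field from the example above; press Alt+F9 in Word to toggle field codes):

```
{ MERGEFIELD DueDate_SalesInvHeader \@ "dd/MM/yyyy" }
```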

You can change the formatting to match your region. In the US: MM/dd/yyyy.

There is always the option of writing an extension and sending the layout an already formatted date field as text.

Business Central On-Premise installation: hardware and software requirements and recommendations

A list of system requirements for Business Central On-Premise is readily available from Microsoft Docs here.

The only issue is, these are bare minimum requirements.

How do we know what level of hardware will be enough to guarantee good performance not only at deployment, but months and years down the line?

Once the decision between Business Central On-Premise and Business Central in the Microsoft cloud is behind them, end-users and VARs alike face decisions regarding the server type: should it be a dedicated server on the end-user’s premises, or virtual servers?

If going down the virtual server path, end-users can choose between a self-managed, self-hosted virtualization system (using Hyper-V or similar solutions) or a cloud provider (one of them being Microsoft’s Azure platform).

The minimum requirements allow for an installation of the standard product. But if the company is using a different application layer (standard + add-ons), or if the installation needs to accommodate various extensions, the minimum requirements won’t be enough.

For example, in my recent experience with 2 VARs with products for the food vertical, the Microsoft Application extension has been replaced with code that includes both the Microsoft Application and the food application code. There is a high chance that the minimum requirements won’t be satisfied by the modified base layer of the application. And with additional extensions installed on top of the base layer, our needs get further and further away from the bare minimum requirements.

When we research Business Central requirements, we need to understand the architecture of the product. Business Central functions on a three-tier architecture, as seen on this Microsoft Docs page:

Each component in this model comes with its own hardware/software recommendations.

More often than not, I have seen the Web Server and the Business Central server side by side on the same server.

Quite often, especially for installations with less than 25 users, I’ve seen installations of SQL Server, NAV Server and Web Server all on 1 machine or virtual machine.

For installations of 25 users or fewer, the topology I’ve seen work quite well is this:

  • Business Central Server + Web Server on one machine
  • SQL Server on a different machine

Business Central Server and Web Server

Specification | Minimum | Recommended
Memory | 16 GB | >= 32 GB
Processor | 1 Quad Core | 2 x 16 CPU cores
Disks | SAS or SSD drives, configured with RAID1

For a list of Operating Systems required by Business Central Server visit Microsoft Docs recommendations for Operating System.

SQL Server

Specification | Minimum | Recommended
Memory | 32 GB | >= 64 GB
Processor | 1 Quad Core | 2 x 16 CPU cores
Disks | SAS or SSD drives:
– OS: RAID1
– Data drive: RAID1
– Log drive: RAID1/RAID10
– Master/TempDB: RAID1
– Backup drive: RAID1

Most SQL Server installations use SQL Server Standard.

While I’ve seen installations of Microsoft Dynamics NAV and Business Central on SQL Server Express, SQL Server Express should only be deployed for non-production use, such as test or development environments. It is not fit to serve as the production server for a live environment.

For a list of SQL Server OS recommendations visit Microsoft Docs page.

The three-tier architecture also includes:

  • Web Clients
  • Business Central mobile app
  • MS Office Applications (needed for integration of BC with Office products)
  • AL Development workstations.

Please visit Microsoft Docs for each of these additional components’ operating system and other required software recommendations.

MB-800 Business Central Functional Consultant exam : study materials

As a Business Central developer I don’t get to set up Business Central standard processes every day; I mostly design and set up processes in my customization work. Standard Business Central setup is covered by functional consultants.

The exam MB-800 has been active for about a year (beta version started in October 2020) and given my developer experience with setting up the system I thought I’d give it a try. This blog contains materials I found, read, and tested; hopefully it will provide a good starting point for study for others.

This exam tests your skills in setting up Business Central SaaS. It is easy to get a trial of BC SaaS, which can be used to train for this exam. And if you need more time, you can extend your trial by another 30 days: just navigate to https://businesscentral.dynamics.com/?page=1828 and extend your trial. If 60 days is not enough, you can start a new trial.

You should start with Exam MB-800: Microsoft Dynamics 365 Business Central Functional Consultant – Learn | Microsoft Docs.

That page offers:

For example, one learning path is Set up financial management in Microsoft Dynamics 365 Business Central.

This path contains the following modules and lessons:

For all lessons targeting the Functional Consultant role, go to Browse all – Learn | Microsoft Docs.

If you want to take the Business Central Microsoft Docs with you when you are offline, download the Business Central PDF from:

Welcome to Microsoft Dynamics 365 Business Central – Business Central | Microsoft Docs

I sent the PDF to my Kindle. Unfortunately, the file is too big and the reading experience on Kindle is not that great.

It would be great if Microsoft could split the big PDF into separate PDFs for each chapter.

Additional materials (with Danish roots, just like Business Central 😉):

Go ahead and schedule the exam! Good luck!

Parsing RunRequestPage output using XML Buffer

RunRequestPage allows developers to record the request page settings of a Dynamics NAV/Business Central report without actually running the report. The output of this command is an xml string.

E.g.:

// XmlParameters: Text;

XmlParameters := REPORT.RUNREQUESTPAGE(50000);

What if we want to process the report in certain conditions explicitly defined by the report options? We need to be able in this case to parse the output of RunRequestPage.

Simple enough. One way is to load the string into a DotNet XMLDocument variable via LoadXml and use DotNet functions to get the values of the nodes.

If you want to avoid using DotNet you could use “XML Buffer Writer” codeunit (1235) and “XML Buffer” table (1235) in a codeunit called from an action.

XMLBuffer, XMLSpecialInterestNode : Record 1235;

XMLBufferWriter : Codeunit 1235;

First, we run the request page for report 50000. This opens the request page, allowing the user to set all options/filters. Once finished, click OK.

All the options/filters for the report will be recorded in the string XmlParameters.

Secondly, we load the xml string into an xml structure inside NAV, using table and codeunit 1235. This is done via function InitializeXMLBufferFromText from codeunit 1235.

We can then filter the entries and locate the option we are interested in.

In my case I had a report option “Run Later” … if this option is true, I do a different type of processing than just running the report. Think in terms of what you could do to a report besides running it: keep track of run time, email the output …
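If you prefer to see the parsing idea outside AL, here is a Python sketch of the same logic. The XML below is an approximation of the shape RunRequestPage returns; the option name matches my “Run Later” example:

```python
import xml.etree.ElementTree as ET

# Approximate shape of REPORT.RUNREQUESTPAGE output for a report with one option.
xml_parameters = """<?xml version="1.0" standalone="yes"?>
<ReportParameters name="My Report" id="50000">
  <Options>
    <Field name="RunLater">true</Field>
  </Options>
</ReportParameters>"""

def get_option(xml_text, option_name):
    root = ET.fromstring(xml_text)
    # Locate the <Field> whose name attribute matches the option we care about.
    for field in root.iter("Field"):
        if field.get("name") == option_name:
            return field.text
    return None

run_later = get_option(xml_parameters, "RunLater") == "true"
```

The XML Buffer approach in AL does essentially the same walk: load the text, then filter the buffer entries down to the node of interest.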

 

New-BcContainerWizard generates scripts to build NAV or Business Central containers

Over the last few months I found myself learning more and more about Docker and the Microsoft PowerShell libraries that get you a running instance of NAV or Business Central in a local container.

I am going to investigate 3 ways that get you a running NAV/BC container.

1. Using the pure docker command

You can start with Docker here.

Example of using docker to create first BC container:

docker run -e accept_eula=y mcr.microsoft.com/businesscentral/sandbox

This command will set an environment variable local to the new container, pull the specified image (mcr.microsoft.com/businesscentral/sandbox) if it is not already pulled, and run it.

You could also run the container based on a Business Central image for Windows Server 2019. This is a lighter image than the previous one which was for Windows 10 OS.

docker run -e accept_eula=y mcr.microsoft.com/businesscentral/sandbox:ltsc2019

To check the size of the images downloaded run from the command prompt:

docker image list

If you want to delete some of your images run for each the following:

docker image rm 0d -f

You can specify the whole image identifier or just its first few characters (here, 0d).

With the container running, you can open VS Code and install the AL Language extension using the ALLanguage.vsix from the location displayed in the log:

http://e8d9bbb19805:8080/ALLanguage.vsix

If you have trouble using the DNS name, something must have gone wrong with writing the hosts file, but you can always use the IP of the container.

Now you should be able to connect to your container and start writing extensions.

2. Using Microsoft’s NavContainerHelper module, more specifically the “New-NavContainer” command:

New-NavContainer -accept_eula -containerName “firstcontainer” -auth Windows -imageName mcr.microsoft.com/businesscentral/sandbox:ltsc2019

While the previous command could have been launched from the command prompt (with Docker running), you launch the above command from a PowerShell ISE prompt (run as admin). This will pull the latest Business Central image for Windows Server 2019. If you run “docker image ls” you can see this is a lighter image.

You can connect to that instance to write extensions by running VS Code and installing the vsix file that comes with the container.

3. Using Microsoft’s module BcContainerHelper.

Microsoft’s latest module, BcContainerHelper, has a New-BcContainerWizard command. This command generates a PowerShell script that, when run, creates a NAV/BC container.

To gain access to the wizard, first install the new BcContainerHelper module. When running “Install-Module -Name BcContainerHelper” I got an error:

Then I added the “-AllowClobber” parameter and the module was successfully installed.

Install-Module -Name BcContainerHelper -AllowClobber

Once BcContainerHelper was installed, I had access to New-BcContainerWizard:

Let’s launch it and create a container loaded with a NAV 2017 CU5 image:

  1. Accept Eula:

 

Choose Y.

2. Next we need to choose between a locally stored container or a container stored in Azure:

Default is Local and that’s what I chose.

3. For the authentication step you have a few options: either username + password or Windows. I chose Windows:

4. Name container:

5. Version: latest BC (onprem or Sandbox) or a specific version.

We are going to install a container with an OnPrem image of NAV 2017 CU5. For a list of sandboxes and onprem images consult these links:

https://mcr.microsoft.com/v2/businesscentral/sandbox/tags/list

https://mcr.microsoft.com/v2/businesscentral/onprem/tags/list


6. Country


I chose NA.

7. Test toolkit ?

10. License (No, or local file or https url for downloading the license file)

11. Database: you can leave the new container loaded with the Cronus database in a SQL Server Express instance local to the container, or load a database with the correct version into that instance, or use a SQL Server other than the container’s. I chose the default, but I plan to connect to the SQL Server Express instance and restore a backup later, once the container is created and running.

12. DNS settings:

13. Memory Limit:


I left default.

14. CAL Development:


As it is a NAV 2017 image I chose Yes.


Name and save your powershell script:


The window will close and PowerShell ISE opens with the new script:


Now you can distribute the script with your team. If you want to generate the new container press F5.

In the script, please note the new artifactUrl parameter. More on this new parameter here.

After 835 seconds the container was installed. However, during the last steps (exporting shortcuts to the host desktop) the script encountered an error:


In the version of BcContainerHelper I was working with (1.0.1), Export-BcContainerObjects was not included. In fact, the command should have been Export-NAVContainerObjects.

I created an issue on GitHub, and the next day I found an email from Freddy confirming this was a bug and that the fix would ship in BcContainerHelper 1.0.2-preview153.

As the new release was available, I re-installed the module and I was able to finish running the script.

I needed a current version of the database instead of the installed Cronus database, so I connected via SSMS to the container’s SQL instance and restored the database. Now development can take place in the container.

More information on Freddy‘s and Steve‘s blogs.

Fastest way to get entities into Dynamics NAV or Business Central (and not only)

It’s 4 pm. To my surprise, a Skype call from one of the customers I usually talk to maybe once a month. She cut the niceties quite abruptly: “Look, I have a list of 100 customers and I need it in production asap. I have 15 fields with data for each new customer. Can you do it today before 5?”

This is the context of this blog post: how do we inject new entities into NAV (and not only) in the fastest way (under one hour)?

A few weeks ago I engaged a few of my peer developers, not just NAV developers, on what they usually do in this type of scenario.

Some of their answers were really good and could be applied in Dynamics NAV or Business Central.

One of the answers was to ask the customer to enter it manually:)


That is indeed one way, but I’m not sure my customer was willing to do it, and under one hour was out of the question.

Another answer was to “quickly” write an integration tool to take the data from the original system and into the target system.


Some of the answers I recall: “That’s crazy!” or “You have a list!” or “Under one hour, please!”…

Another idea was to manipulate the list, residing in an Excel file, in such a way that we generate in Excel the code required to insert the records, in the language of your choice (C/AL, AL, C#), and once generated copy it from the Excel worksheet straight into the Run trigger of a codeunit (or any other method) and execute it. For Business Central, create a new extension extending only the Customer List page with one action, “Import Customers”, and drop the code generated in Excel into that action’s OnAction trigger. Install the extension, run the action, uninstall the extension. I personally used this method at least a dozen times in my career in different environments, including NAV. It’s fast, dirty, and does the job 🙂
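The same trick works from any scripting language: each row of the list becomes one line of code, ready to paste into the action’s OnAction trigger. A Python sketch — the InsertCustomer helper and the customer rows are invented for illustration:

```python
# Hypothetical customer rows, as they would sit in the Excel worksheet.
rows = [
    ("C00001", "Adatum Corporation", "Atlanta"),
    ("C00002", "Trey Research", "Chicago"),
]

# Emit one AL statement per row; in Excel the same thing is a CONCATENATE
# formula dragged down the list.
lines = [
    f"InsertCustomer('{no}', '{name}', '{city}');"
    for no, name, city in rows
]
```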

A similar answer was to generate the “INSERT INTO” T-SQL statements in Excel, copy the batch into a query window, and execute it. We know this is not what we call best practice when working with Dynamics NAV, not to mention Business Central. But it might work very well for other environments, especially when you don’t have to trigger any business logic.
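The T-SQL variant is the same exercise, with single quotes doubled so a name like O'Brien doesn’t break the statement (the table and column names here are hypothetical):

```python
def sql_quote(value):
    # T-SQL escapes a single quote inside a string literal by doubling it.
    return "'" + value.replace("'", "''") + "'"

def insert_statement(table, columns, row):
    cols = ", ".join(columns)
    vals = ", ".join(sql_quote(v) for v in row)
    return f"INSERT INTO {table} ({cols}) VALUES ({vals});"

stmt = insert_statement("Customer", ["No_", "Name"], ["C00003", "O'Brien Ltd"])
```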

Another answer was to write, in the language one prefers, a subroutine to manipulate the Excel file programmatically. While this method works most of the time when you have enough time, I don’t think it is doable in under one hour unless you already have the bulk of the code and just need to quickly transform and polish it for the fields the customer is including this time. I used this method a few times in Dynamics NAV, where one can take advantage of the structure Microsoft put in place since NAV 2013 via table 370 Excel Buffer.

One last answer, discussed among the NAV guys, was to use RapidStart services. We, the NAV people, are quite lucky to have the mothership design this service for us. We both agreed this would be one quick way to get the data in, most likely under one hour.

This is what I gathered for this type of time-sensitive requests. What would you do if you encounter this type of task?

How to PowerApps with Business Central

PowerApps – Intro

What is PowerApps? PowerApps is a service for generating cross-platform (iOS, Android, Windows Store) applications. It allows connectivity to different systems and comes with a cloud IDE and a cloud admin interface that lets users publish apps targeting whatever platform you need. The IDE is called PowerApps Studio; it can be downloaded locally from the Windows Store or used as a web application. I designed the app detailed below using the web application.

Most importantly, just like the other power tools, PowerBI and MS Flow, PowerApps is accessible not only to professional developers, but also to business analysts, junior developers, or expert users in any company. I wrote this app without any code inside PowerApps Studio, just a few Excel-like functions invoked sporadically.

The quick PowerApps app I built required:

  • a PowerApps license. I got my free license here.
  • a NAV container hosted on Azure. You could build yours easily, some help here.

Application scope

The app gets the list of items from the Azure Business Central container via the Item List page exposed as a web service, and presents the Item No. and Description for all items on the first screen. The app user can then advance into a details screen for each item. Here, if the quantity is low, the user can advance to a third screen and generate a purchase invoice for the desired quantity of the selected item and vendor. The result is that the app generates, via a second web service, a purchase invoice in Business Central for the selected item and vendor and the entered quantity.

Application design

There are two main parts to create your app:

  1. Create app connector
  2. Design app

1. Create app connectors

To create a Business Central connector, go to the File menu in PowerApps Studio and choose Connections:

 

The connector to the Azure BC Container instance looks like this:

 


Once the connector is set we can access all web services exposed in Business Central Azure Container.

2. Design PowerApps app

The PowerApps Studio comes with 3 main regions:

  1. Left Side is where you work with the screens. In this simple app I have 4 screens: Master, Detail, Order and Confirm.
  2. The center belongs to the canvas where you drop your controls
  3. Right side is occupied by the Data Source (if any) and the properties of the current control you selected on the canvas

 


MasterScreen consists of a Gallery control (GalleriaItems) which contains the list of items retrieved via the Items web service data source. You will see later that this web service is page 31 exposed as a web service in the Business Central Azure container.

The OnSelect event for the Forward button calls the Navigate(screen, effect) function to advance to a certain screen in your app.


The second screen, DetailScreen, displays a few more fields from the Items web service.


If the inventory is low, the app user can decide to order more by clicking on “Order more” button:


Once the user enters the desired quantity to be included on a Business Central purchase invoice, the app creates a POST request to a new OData web service data source (OrderItemVendorWS) and ultimately generates a purchase invoice with one purchase line.
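Under the hood that POST is a plain OData call. Here is a Python sketch of the request the app effectively makes; the server URL, company name, and field names are placeholders, not the real ones from my container:

```python
import json

def build_order_request(base_url, service, item_no, vendor_no, quantity):
    # BC publishes page web services under /ODataV4/Company('...')/<service>;
    # a POST with a JSON body inserts a record, which fires OnInsert in AL.
    url = f"{base_url}/ODataV4/Company('CRONUS')/{service}"
    body = json.dumps({
        "Item_No": item_no,
        "Vendor_No": vendor_no,
        "Quantity": quantity,
    })
    return url, body

url, body = build_order_request(
    "https://myazurecontainer:7048/BC", "OrderItemVendorWS", "1896-S", "10000", 5)
```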

Let’s see the app:


 

And, in Business Central, the new purchase invoice:


This is what was needed on the PowerApps side; additionally, I needed to plug a few new things into Business Central.

Business Central Container changes

First, create a new AL project, and point Visual Studio Code to the azure container:

Launch.json:


Web services:

  • Items service will support MasterScreen and DetailScreen
  • OrderItemVendorWS will support OrderScreen.

Page 50100 “PurchaseItemList” is a new page based on a new table, 50100 PurchaseItem:

Table 50100 PurchaseItem:


Page 50100 PurchaseItemList:


The Purchase Invoice is generated during OnInsert trigger on the new table:


Conclusions

Creating an app with PowerApps assumes 3 tasks:

  1. the app backend, the connectors to your apps
  2. the app design, done in PowerApps Studio
  3. Publishing and Management tasks

PowerApps comes with versioning and management capabilities across a few environments (e.g. Dev, QA, Prod). Once your app has been tested by PowerApps users, you can export it from QA, import it into Prod, and distribute it from there. Select Office and Dynamics 365 plans allow you to generate and manage these environments.

More specifically, if you go to web.powerapps.com and click on Solutions, you will be able to follow (with the right license) the Create a new environment link.

Some useful links: