BC20 in SaaS (preview)

Hello readers,

This morning I noticed we can create a sandbox with Business Central version 20 in preview.

In your BC admin center, click on +New action to create a new environment and choose Version 20:

And BC 20 sandbox is created:

Ready to explore?

Cheers!

Getting started with Snapshot Debugging in Business Central

Happy new year, readers! Time for another Feynman technique exercise.

Today I tried for the first time (Duh Silviu …. it’s been out there for at least a year!!!) to debug a snapshot.

I started my exercise on Microsoft Docs here. Also, very useful was Stefano’s blog.

But first, what is a snapshot? A snapshot is a recording of executed code in Business Central.

The idea is that when you want to investigate an error in one of your environments (I’ll be showing screenshots of SaaS), you start the recorder from VS Code, perform the action you want to investigate, stop the recorder, and then replay the recording. Simple!

Well, I ran into a few issues, so, for my own future debugging sessions and for the interested readers, I’ll recap what I did to be able to replay a snapshot for debugging.

User debugging settings

First, the user that will connect to the SaaS environment for debugging purposes should be a member of the D365 SNAPSHOT DEBUG user group:

Point Snapshot to the right environment

Most SaaS development environments I come across have a single configuration in launch.json.

For snapshot debugging though, you need an additional configuration:

I added the first configuration for this exercise.

The key element is the sessionId.
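
For reference, a snapshot configuration in launch.json looks roughly like this (the environment name, session id, and the exact set of properties may vary with your AL extension version, so double-check against the Microsoft Docs page linked above):

{
    "name": "snapshotInitialize: my sandbox",
    "type": "al",
    "request": "snapshotInitialize",
    "environmentType": "Sandbox",
    "environmentName": "MySandbox",
    "sessionId": 42
}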

To find the sessionId you need to go to the admin console:

a) Navigate to https://businesscentral.dynamics.com/[TenantId]/admin and

b) Click on Environments

c) Launch action Sessions

d) Refresh

e) Take note of your session

Note: this step can be a bit tricky, because you might have plenty of BC sessions and end up recording the wrong one. I usually cancel all sessions under my name, close all BC windows, make sure there is no active session under my name, log in to BC, check the session id, update the configuration, and start recording.

Start Recording

In VS Code, start recording by pressing F7 or, from the Command Palette, launch AL: Initialize Snapshot Debugging.

Play use case

In BC, play your use case.

For example, in my environment I had a Sales Order without External Document No.

Open the Sales Order and attempt to Post.

Get the error referring to the missing “External Document No.”.

Then move to VS Code to stop the recording.

Stop Recording

In VS Code, stop recording. Use ALT + F7 or, from the Command Palette, launch AL: Finish Snapshot debugging on the server. In the Output pane of VS Code you should see something like this:

Replay recording

In VS Code, on the left side of the toolbar, there is a small button showing all snapshots.

Click on it and choose the desired snapshot from the list; in my case the last snapshot is the one at the top:

After choosing the snapshot, the system will automatically play it, stopping at each breakpoint and ending up at the line of code responsible for the error encountered in the web client.

You can see on the left side all the goodies needed for debugging: the Call Stack, the local and global variables…

And if you are interested, you can unzip the snapshot and have a look at what is in it: a set of MDC files, AL files and a version file.

Hope this helps!

Business Central 2021 wave 2: an overview of Data Management

Reasons why data size matters

  • Data size, or rather tenant capacity, is reflected in the price the end user pays
  • Data size influences performance: smaller databases allow for more efficient processing, faster backups, reports, sorting of records, faster page display …

To see the current data size, for the entire database and per environment, click on the Capacity blade in the admin center:

To increase the capacity of their environments or the number of environments, customers can purchase additional capacity add-ons through their partners (check this).

Under Customer subscriptions, their Partner can purchase additional capacity:

To stay within their license capacity, customers and/or their partners need to manage their data size and compression.

Handle and Compress Data

Users can start from the Table Information page and use the Data Administration action, or launch the Data Administration page directly from Search.

This page/view contains two lists: one displays data per table and the second list summarizes data per company:

Through the Data Administration page one can note or perform the following:

  • Refresh action: updates the tables’ size and can be scheduled via a job queue
  • Data Cleanup: various reports can be run to delete change logs, document archives, invoiced documents, and so on.
  • Data Compression
    • various reports are used to delete and archive ledger entries and registers:
  • A link to the Companies page allows for deleting companies or copying an existing company under a different name:
  • A Retention Policies action brings up a list of records from the Retention Policy Setup table.
  • And drilling deeper, we can inspect the Retention Policy Setup record via the Retention Policy Setup Card:
  • If you drill down on the Table ID you can see there is a limited number of tables that have defined retention policies:
  • To add your own table to the list above, please read Stefano’s blog (a minimal sketch of the pattern follows this list).
  • There is also a Data Administration Guide action. This wizard takes users through the retention policies list, the manage companies page, and data compression of different tables.
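
Here is the minimal sketch mentioned above for registering your own table with retention policies, following the pattern described in Stefano’s blog and Microsoft’s docs; the codeunit and the “My Log Entry” table are hypothetical:

codeunit 50130 "Reten. Pol. Install Sketch"
{
    Subtype = Install;

    trigger OnInstallAppPerCompany()
    var
        RetenPolAllowedTables: Codeunit "Reten. Pol. Allowed Tables";
    begin
        // Register the (hypothetical) "My Log Entry" table so it becomes
        // available in the Retention Policy Setup list.
        RetenPolAllowedTables.AddAllowedTable(Database::"My Log Entry");
    end;
}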

A few notes about archiving

  • the archiving mechanism is not a backup/restore system
  • it is integrated into the Data Compression processes: each data compression report has a report option for archiving
  • archives can be exported to Excel or CSV files: look in the Data Archive list page
  • you can archive any data you are about to delete: start logging, delete the data, stop logging.
  • Example: let’s archive some test vendors:
  • A quick test of deleting a test vendor reveals the Data Archive record and the Excel (or CSV) file created:
  • archives are stored in media fields in the Data Archive table (Table Fields (JSON) and Table Data), which do not count toward the database size

There is more to check in Microsoft’s Tomas Navarro and Bardur Knudsen’s video “Data Management in the Business Central Application”.

Business Central 2021 wave 2: documents have now a default line type. See how to set it up!

In Business Central, sales and purchase documents have lines and lines can be of different types:

  • comment line: ” “
  • “G/L Account”
  • “Item”
  • “Resource”
  • “Fixed Asset”
  • “Charge (Item)”

When editing a sales document line, a user would have to pick one of these values.

With BC 2021 wave 2, customers can default the line type to the value that is used most often.

E.g. if an end user has sales documents in which the Resource line type is very frequent, they could set the “Default Line Type” to “Resource” on the “Sales & Receivables Setup” page:

Let’s see it in action!

From the get-go, as soon as we open a new Sales Quote, we can see that the lines have “Resource” as line type:

If you manually change the Type to a different value, e.g. “G/L Account”, the subsequent lines take the default from the line above:

Similar to sales documents, purchase documents have a default line type.

End users can set up their purchase line default type on the “Purchases & Payables Setup” page:

It is also possible to use a different line type, for example “Tutor”.

An AL developer would need to expand the “Sales Line Type” enum to include the new value:

enumextension 50100 SalesLineTypeWithTutorExt extends "Sales Line Type"
{
    value(50100; Tutor)
    {
        Caption = 'Tutor';
    }
}

Deploying the extended enum makes it possible to choose the new enum value:

And the sales documents would use the new line type as default:

Of course, there is more custom code to write to make the new enum usable in the documents, but about that in a future blog.

Have you seen the new “Chart of Accounts Overview” in BC 2021 Wave 2?

With Business Central 2021 Wave 2 there is a new page to inspect the chart of accounts.

Search for “Chart of Accounts Overview”.

This new page displays the chart of accounts in a tree structure.

To create a list page with a tree structure, a developer needs to set the “ShowAsTree” property, found on the repeater control, to true:
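
A minimal sketch of such a page, assuming a custom read-only list over G/L Account (the object id, page name and chosen fields are mine; the Overview page itself ships with the application):

page 50121 "G/L Accounts Tree Sketch"
{
    PageType = List;
    SourceTable = "G/L Account";
    Editable = false;

    layout
    {
        area(Content)
        {
            repeater(Accounts)
            {
                ShowAsTree = true;                    // render the rows as an expandable/collapsible tree
                IndentationColumn = Rec.Indentation;  // tree depth driven by the account's Indentation field

                field("No."; Rec."No.") { ApplicationArea = All; }
                field(Name; Rec.Name) { ApplicationArea = All; }
            }
        }
    }
}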

The columns in the Overview page are similar to the ones in the original Chart Of Accounts list page.

The new page is more compact:

  • fewer fields: just the Balance, Net Change, “Income/Balance”, “Account Subcategory”, “Account Type”, and Totaling are available
  • fewer lines: “End-Total” lines are out

Let’s have a look at how the “Begin-Total” line looks for Assets in the classic “Chart of Accounts”:

And how the Assets “Begin Total” line looks in the Overview page:

We can see that Net Change, Balance, and Totaling have been moved from the “End-Total” row into the “Begin-Total” row, and the “End-Total” row is no longer in the list.

The classic “Chart of Accounts” list page:

The new “Chart of Accounts Overview”.

Collapsed:

And expanded:

The Overview page does not allow opening the G/L Account card page.

The Overview page does not allow editing, inserting, or deleting G/L accounts.

But if you want a compact page, with fewer fields and the option to quickly expand and collapse entire groups of accounts, then the new “Chart of Accounts Overview” is a useful alternative.

There is a new Posting Preview Type in Business Central. See how that works!

With BC v19 one of the Application changes affects the Posting Preview functionality.

Read more here.

The new Posting Preview feature can be enabled in the General Ledger Setup:

The way the posting preview worked until now is covered by Posting Preview Type = Standard.

So, if you don’t like the new Posting Preview (Extended) you can always use the previous one.

But let’s recall how the original Posting Preview looks.

First, Search for General Ledger Setup and set the Posting Preview Type to Standard:

Open a sales order and choose Preview Posting; the image below shows only one group of ledgers, the Related Entries group:

Let’s now head to the General Ledger Setup and set the Posting Preview Type to Extended:

Then re-open the Sales Order and click on Post -> Preview Posting:

Notes:

  • we can now see 3 groups:
    • G/L Entries -> this is where you will find the G/L entries
    • VAT Entries -> records from the VAT Entry table
    • Related Entries -> all the rest of the ledgers, including extension or custom entries
  • Show Hierarchical View toggles whether the G/L entries and VAT entries in the posting preview are grouped by Account No. (if Hierarchical View is on) or shown as a flat list (if Hierarchical View is off).

And if we want to view the details, we can use the toggle in the upper right corner of each group to expand or collapse it:

Of course, the new Posting Preview on journals looks and feels similar to the documents’ Posting Preview.

Hope this helps!

“Field Selection” codeunit – how I select and record the ID of a field in Business Central

Hello readers!

Recently I have been working on a customization for a customer with the goal of changing the out-of-the-box Positive Pay export for a Bank Account record.

While preparing the mapping for the positive pay details, I noticed how Microsoft implemented picking up a field ID. They created a new codeunit: “Field Selection”.

Let’s see how to get to that piece of code:

  • in BC, search for “Data Exchange Definitions”
  • Click on any Exchange Definition Code, then in the Line Definitions, click on Manage and “Field Mapping”
  • In the “Field Mapping”, click on “Field ID” lookup “…”
  • the list of fields in table 330 is displayed:

This lookup page is triggered by the OnLookup trigger on page 1217 “Data Exch Field Mapping Part”:

field("Field ID"; "Field ID")
{
    ApplicationArea = Basic, Suite;
    ShowMandatory = true;
    ToolTip = 'Specifies the number of the field in the external file that is mapped to the field in the Target Table ID field, when you are using an intermediate table for data import.';

    trigger OnLookup(var Text: Text): Boolean
    var
        "Field": Record "Field";
        TableFilter: Record "Table Filter";
        FieldSelection: Codeunit "Field Selection";
    begin
        Field.SetRange(TableNo, "Table ID");
        if FieldSelection.Open(Field) then begin
            if Field."No." = "Field ID" then
                exit;
            TableFilter.CheckDuplicateField(Field);
            FillSourceRecord(Field);
            FieldCaptionText := GetFieldCaption;
        end;
    end;

    trigger OnValidate()
    begin
        FieldCaptionText := GetFieldCaption;
    end;
}

The OnLookup trigger uses codeunit 9806 “Field Selection”. This codeunit, as well as its associated page, page 9806 “Fields Lookup”, can be found in Microsoft’s System Application app.

You can go through the source code for this codeunit here.

As you can see below, the main function Open is fairly simple:
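
Roughly, the flow is the following (a paraphrased sketch, not the exact System Application source): Open runs the “Fields Lookup” page in lookup mode over the filtered Field record and, if the user confirms a selection, hands it back to the caller.

    procedure Open(var SelectedField: Record "Field"): Boolean
    var
        FieldsLookup: Page "Fields Lookup";
    begin
        FieldsLookup.SetTableView(SelectedField);  // respect the caller's filters
        FieldsLookup.LookupMode(true);
        if FieldsLookup.RunModal() = Action::LookupOK then begin
            FieldsLookup.GetRecord(SelectedField); // return the chosen field to the caller
            exit(true);
        end;
        exit(false);
    end;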

Let’s put these objects into a simple practice exercise.

I am going to create a new table and, in it, a new field in which I plan to record the ID of the Customer.“No.” field.

Here is the field definition in the table:

        field(2; "Customer No Field ID"; Integer)
        {
            DataClassification = CustomerContent;
        }

And here is the field definition on the list page:

                field(MyField2; Rec."Customer No Field ID")
                {
                    ApplicationArea = All;

                    trigger OnLookup(var Text: Text): Boolean
                    var
                        RecField: Record "Field";
                        FieldSelection: Codeunit "Field Selection";
                    begin
                        RecField.SetRange(TableNo, Database::Customer);
                        if RecField.Get(Database::Customer, Rec."Customer No Field ID") then;

                        if FieldSelection.Open(RecField) then
                            Rec.Validate("Customer No Field ID", RecField."No.");
                    end;
                }

First, we’re filtering the Field record (RecField) to the Customer table, and then we execute the “Field Selection” Open method, which in turn displays the “Fields Lookup” page. Lastly, I validate my new field “Customer No Field ID” against the result of the lookup:

In the Lookup page users can pick the field and the field ID needed:

The goal of this exercise was to remind those who knew, and make aware those who didn’t, of the Microsoft System Application objects, so that we use them in daily tasks instead of re-inventing them each time.

Moreover, these objects are part of the platform (designed and tested by Microsoft), therefore, we have every reason to use them.

Hope this helps!

So I chose Business Central … But am I losing my Dynamics GP data?

In the last 6 months I’ve been involved with a number of GP to BC migration projects.

A recurring question that reaches our team is: how do I see my GP data in BC?

One avenue to move your business to BC is to import open transactions, master data, and tested setup tables with RapidStart packages. If the underlying table of the desired GP entity does not exist in BC, a Business Central developer would need to create the table in BC, and with the Edit in Excel functionality you can then get the GP data into BC.

There is also the Cloud Migration Tool in BC. More about it here.

Using this tool ensures the most important entities, master data and open transactions, will make it into BC. But what if a GP end-user wants additional GP data in BC?

Microsoft’s recommendation is to bring as little as possible into the cloud from an on-premises database.

Moreover, as your database capacity increases, your cost can increase. See more here.

Rather than bringing GP tables into BC one by one, use the cloud migration tool to move data from GP to Azure Data Lake.

If the decision is, though, to have some GP data in Business Central, there are tools to make that possible.

We can extend the cloud migration tool so that, when the migration starts, besides the core migrated data (master data and open transactions), the process also brings the data from the GP table, as mapped in the “Manage Custom Tables” page, into a new space (an extension table).

What’s needed to achieve this:

  • Create a Business Central extension. In it, create an AL table to store your data from a GP table
  • Add the custom table in Manage Custom Tables
  • Run migration tool
  • Check custom table content after migration

Let’s try bringing table GL00100 from GP into BC.

Note: this table was chosen only for demonstration. GL00100 is brought into the BC table “G/L Account” by default by the cloud migration tool.

Create extension with GP table

I created an extension that includes a table for this GP entity:
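
A minimal sketch of such a table, keeping only a few GL00100 columns; the object id matches the one used later in the URL, but the field names, lengths and selection of columns are illustrative rather than a full GL00100 definition:

table 50340 "GP GL00100"
{
    DataClassification = CustomerContent;

    fields
    {
        // Mirrors a handful of columns from the GP account master table (GL00100)
        field(1; ACTINDX; Integer) { }
        field(2; ACTNUMBR_1; Text[10]) { }
        field(3; ACTDESCR; Text[50]) { }
    }

    keys
    {
        key(PK; ACTINDX)
        {
            Clustered = true;
        }
    }
}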

Map migration for new table in “Cloud Migration Management”

In Business Central, search for “Cloud Migration Management”.

Under the actions, trigger the “Manage Custom Tables” action:

On the “Migration Table Mapping” page, map the new table in your extension to the GP table:

On “Cloud Migration Management”, trigger the “Run Migration Now” action.

You can check the results in the cue on the Migration Information area:

To check the content migrated:

  • change the company to the migrated company
  • run the new table by adding “&table=50340” to the Business Central URL:
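
For example, the URL would look roughly like this (tenant, environment and company name are placeholders):

https://businesscentral.dynamics.com/[TenantId]/[Environment]?company=[Migrated Company]&table=50340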

We can now see the result of migrating the GP data to the custom BC table:

Conclusion

To answer the question in the title: you don’t lose your GP data. There are multiple ways of accessing your GP data after going live on BC, including:

  • retaining access to your old system
  • migrating your Dynamics GP installation to Azure (SQL Server and application)
  • migrating your GP data warehouse to Azure Data Lake
  • or, as shown above, with minimal coding, keeping your GP data in Business Central

Engage with your partner and decide what GP data you really need today, so that your cloud ERP stays performant long term.

Export Business Central online entities to Azure storage blob container

As most of you probably know, it is not possible to access the file system from a Business Central cloud environment.

For example, in Dynamics NAV, we could have a job queue entry that, when run, creates a file and copies it to a network folder. We can still do that in an on-premises environment, but not with cloud BC.

You could create the file and use DownloadFromStream, but that would only prompt you to download it locally; it would not copy it to a local or network folder.

If you try to use File.Create() you would get the warning: “The type or method ‘Create’ cannot be used for ‘Extension’ development”.

If your customer is happy to grab the file manually from the Downloads folder every time, then this should suffice:
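
For example, a minimal sketch along these lines, building a vendor CSV in a Temp Blob and offering it for download (the procedure name, file name and CSV layout are mine):

    procedure DownloadVendorsCsv()
    var
        Vendor: Record Vendor;
        TempBlob: Codeunit "Temp Blob";
        Lines: TextBuilder;
        OutStr: OutStream;
        InStr: InStream;
        FileName: Text;
    begin
        // Build a simple CSV with one line per vendor
        if Vendor.FindSet() then
            repeat
                Lines.AppendLine(StrSubstNo('%1;%2', Vendor."No.", Vendor.Name));
            until Vendor.Next() = 0;

        TempBlob.CreateOutStream(OutStr);
        OutStr.WriteText(Lines.ToText());
        TempBlob.CreateInStream(InStr);

        // Prompts the browser to save the file locally; the cloud service itself
        // cannot write to a local or network folder.
        FileName := 'Vendors.csv';
        DownloadFromStream(InStr, '', '', '', FileName);
    end;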

But, if we want to automate this process and run the extract on a regular basis, we need to find a cloud solution for storing the files.

Currently, there are four types of storage in the Azure Storage platform:

  • Containers/Blobs
  • File Shares
  • Queues
  • Tables

In my previous blog I dived into the Azure Storage of type Tables and tackled its API.

This blog is about interacting with the Azure storage blob containers:

  • manually, via Azure Portal
  • simulation, via VS Code extension “Rest Client”
  • Business Central extension
  • view blob container with Excel
  • get Azure Blobs locally

I found on Michael Megel’s blog a nice solution for exactly what I need. Awesome job on Blob Containers API, Michael! Thank you for sharing!

What I need:

  • Azure:
    • Set up a blob container to store Business Central exported files
    • Set up Storage Access Key
  • Simulation:
    • In VS Code, write requests with “Rest Client” extension, targeting Azure blob container API
  • Business Central:
    • A setup table in Business Central for Azure access stuff
    • An export interface that allows users to run an action (“Write File in Azure”) that sends the extract to the Azure container. The same code could be executed by a job queue.

Blob Container Setup

To set up a container, following Michael’s notes on the above blog was enough for me.

For blob container accessibility, I went down the path of a shared access signature (SAS token).

Once created, you can start playing with the storage account container API.

I created the storage manually:

Drilling down into the storage account, I created a new container:

Simulation:

In VS Code, using Rest Client,

  1. I sent a request to get the list of containers:

Request:

GET https://svflorida.blob.core.windows.net/?comp=list&[here you insert your SAS token key]

content-type: application/json

Response:

HTTP/1.1 201 Created

Content-Length: 0

Content-MD5: 1B2M2Y8AsgTpgAmY7PhCfg==

Last-Modified: Wed, 18 Aug 2021 19:05:13 GMT

ETag: "0x8D9627B1BD88A0F"

Server: Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0

x-ms-request-id: 3f97555d-801e-006d-5263-94f646000000

x-ms-version: 2020-08-04

x-ms-content-crc64: AAAAAAAAAAA=

x-ms-request-server-encrypted: true

Date: Wed, 18 Aug 2021 19:05:13 GMT

Connection: close

2. I sent a PUT request to insert an empty file:

Request:

PUT https://svflorida.blob.core.windows.net/vendorlist/vl1111?[your SAS token here]

x-ms-blob-type: BlockBlob

Content-Length: 0

Response:

HTTP/1.1 201 Created

Content-Length: 0

Content-MD5: 1B2M2Y8AsgTpgAmY7PhCfg==

Last-Modified: Wed, 18 Aug 2021 19:23:46 GMT

ETag: "0x8D9627DB340E9DD"

Server: Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0

x-ms-request-id: b77cbfb2-b01e-003b-2566-9407a9000000

x-ms-version: 2020-08-04

x-ms-content-crc64: AAAAAAAAAAA=

x-ms-request-server-encrypted: true

Date: Wed, 18 Aug 2021 19:23:46 GMT

Connection: close

And this is the file in Azure portal:

Business Central extension:

This is how the new setup table “Azure Storage Setup” looks in BC:
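
As a rough sketch, the underlying setup table could look something like this (the object id, field names and lengths are mine and may differ from the actual extension):

table 50250 "Azure Storage Setup"
{
    DataClassification = CustomerContent;

    fields
    {
        field(1; "Primary Key"; Code[10]) { }
        field(2; "Account Name"; Text[100]) { }
        field(3; "Container Name"; Text[100]) { }
        field(4; "SAS Token"; Text[250]) { } // consider isolated storage for secrets in production
    }

    keys
    {
        key(PK; "Primary Key")
        {
            Clustered = true;
        }
    }
}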

This is how the new BC interface “Vendors Export Log” looks:

The “Write File In Azure” action on page 50251 “Vendor Export Log” does the following (a minimal code sketch follows the list):

  • exports all BC vendors to a blob
  • the blob is then written to a PUT request content
  • the PUT request is sent to Azure Blob Storage API
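
Here is the minimal sketch of the PUT-to-blob part of that action, assuming the vendor export already sits in a Temp Blob (built like the CSV sketch earlier) and that the “Azure Storage Setup” record sketched above holds the account name, container name and SAS token. All names here are illustrative, not the actual page 50251 code:

    procedure PutBlobToAzure(var TempBlob: Codeunit "Temp Blob"; BlobName: Text)
    var
        AzureStorageSetup: Record "Azure Storage Setup"; // hypothetical setup table from the sketch above
        Client: HttpClient;
        Content: HttpContent;
        ContentHeaders: HttpHeaders;
        Response: HttpResponseMessage;
        InStr: InStream;
        Url: Text;
    begin
        AzureStorageSetup.Get();

        // Write the blob into the PUT request content
        TempBlob.CreateInStream(InStr);
        Content.WriteFrom(InStr);
        Content.GetHeaders(ContentHeaders);
        ContentHeaders.Remove('Content-Type');
        ContentHeaders.Add('Content-Type', 'text/csv');

        // Send the PUT request to the Azure Blob Storage API (the SAS token authorizes the call)
        Url := StrSubstNo('https://%1.blob.core.windows.net/%2/%3?%4',
            AzureStorageSetup."Account Name", AzureStorageSetup."Container Name",
            BlobName, AzureStorageSetup."SAS Token");
        Client.DefaultRequestHeaders.Add('x-ms-blob-type', 'BlockBlob');

        if not Client.Put(Url, Content, Response) or not Response.IsSuccessStatusCode() then
            Error('Upload to Azure failed: %1 %2', Response.HttpStatusCode(), Response.ReasonPhrase());
    end;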

Consult Blobs with Excel:

BC users can click on the URL link above and download the file locally, or they, and other third-party users, can access the files via Excel, as I explained in my previous blog.

This time though, when creating the connection, choose Data -> Get Data -> From Azure -> From Azure Blob Storage.

And finally, the data is displayed in the Excel workbook:

Get Azure Blobs locally

To help with getting the files locally, I wrote two blogs:

  • one about getting the files locally using Power Automate
  • one about using the Azure CLI to copy the files from Azure Blob Storage locally

For more about storage accounts in Azure check this out.

You can find sample code repository here.

Copy files from Azure Blob Storage to File System (using Power Automate)

I found an older post on community.dynamics.com in which someone was asking for ways to automatically drop data extracts originating in BC SaaS into a local folder.

First, in SaaS, we can’t generate the files automatically and store them locally.

We need to store them in the cloud.

Once in the cloud, how can we automatically download them locally on a machine or a network folder?

I bing-ed the phrase “copy files from azure blob storage to file system” and the first search result was this link to a Power Automate template flow:

There are a multitude of cloud providers, but Microsoft continuously does a great job at connecting everything between BC SaaS, the Azure platform, Power Automate, and Power Apps, so it’s just convenient to use its tools.

To test it, I went through the following exercise:

  • In Azure Platform I created a storage account and in it I created a Blob Container.
    • “A container organizes a set of blobs, similar to a directory in a file system. A storage account can include an unlimited number of containers, and a container can store an unlimited number of blobs.”
  • I created a local folder that will be synchronized by the new flow with the container in Azure

In Power Automate, I started with the Template provided by Microsoft and set up the flow:

The flow requires two connectors to be set up:

  • one to the azure storage container
  • one to the local or network folder

Editing the Azure Blob Storage connector, we see that we need the name of the Azure storage account, in my case “svflorida”, and a storage access key:

The storage access key is located in the Azure portal under Access Keys:

Editing the File System Connector:

The most time-consuming part, about half an hour, was setting up and troubleshooting the gateway.

The flow cannot just drop files from Azure onto your machine. It needs a gateway.

To create a new gateway, click on the drop down and choose “+ New on-premises data gateway”.

That will prompt you to download an msi to install a gateway: GatewayInstall.msi.

Once the gateway was installed, the only change I made was to switch from HTTPS to TCP:

In a live environment I would investigate and maybe set up an Azure Service Bus, but for the purpose of this exercise I went with TCP.

Once that is done, the flow will be triggered when new files are uploaded to or deleted from the Azure container.

I noticed that with my free trial license the recurrence of the flow was set to 3 minutes.

The flow seems to pick up changes as expected, just be patient and wait for the next run 🙂

In the Azure portal, upload a new file into your container:

The file will appear after a few minutes in your local folder:

And the flow shows a successful run:

That’s it! In the next blog I will look into how I can generate BC SaaS extracts into an Azure storage container so the flow doesn’t feel useless 🙂

I hope this helps someone. Anyway, it’s late here, so I’ll call it a night!