Getting started with Snapshot Debugging in Business Central

Happy new year, readers! Time for another Feynman technique exercise.

Today I tried for the first time (Duh Silviu …. it’s been out there for at least a year!!!) to debug a snapshot.

I started my exercise on Microsoft Docs here. Also, very useful was Stefano’s blog.

But first, what is a snapshot? A snapshot is a recording of executed code in Business Central.

The idea is that when you want to investigate an error in one of your environments (I’ll be showing screenshots of SaaS), you would start the recorder (from VS Code), perform the action you want to investigate, stop the recorder. Then re-play the recording. Simple!

Well, I ran into a few issues, so for the sake of my future debugging sessions and for interested readers, I'll recap what I did to be able to replay a snapshot for debugging.

User debugging settings

First, the user that will connect to the SaaS environment for debugging purposes needs the D365 SNAPSHOT DEBUG permission set:

Point Snapshot to the right environment

Most SaaS development environments I come across have a single configuration in launch.json.

For snapshot debugging though, you need an additional configuration:

I added the first configuration for this exercise.

The key element is the sessionId.
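As a sketch, a snapshot configuration in launch.json looks roughly like the one below; the environment name and session ID are placeholders for your own values, and you should verify the exact attribute names against the AL extension's IntelliSense in your version:

```json
{
    "name": "snapshotInitialize: MyProd",
    "type": "al",
    "request": "snapshotInitialize",
    "environmentType": "Production",
    "environmentName": "MyProd",
    "breakOnNext": "WebClient",
    "sessionId": 12345,
    "executionContext": "DebugAndProfile"
}
```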

To find the sessionId you need to go to the admin console:

a) Navigate to the admin center

b) Click on Environments

c) Launch action Sessions

d) Refresh

e) Take note of your session

Note: this step can be a bit tricky, because you might have plenty of BC sessions open and record the wrong one. I usually cancel all sessions under my name, close all BC windows, make sure there is no active session under my name, log in to BC, check the session ID, update the configuration, and start recording.

Start Recording

In VS Code, start recording by pressing F7, or launch AL: Initialize Snapshot Debugging from the Command Palette.

Play use case

In BC, play your use case.

For example, in my environment I had a Sales Order without External Document No.

Open the Sales Order and attempt to Post.

Get the error referring to the missing “External Document No.”.

Move then to VS Code to stop the recording.

Stop Recording

In VS Code, stop recording. Use ALT + F7, or launch AL: Finish Snapshot Debugging on the server from the Command Palette. In the VS Code Output pane you should see something like this:

Replay recording

In VS Code, on the left side of the toolbar, there is a small button showing all snapshots.

Click on it, and choose the desired snapshot from the list; in my case the latest snapshot is the one on top:

After choosing the snapshot, the system automatically replays it, stopping at each breakpoint and ending up at the line of code responsible for the error encountered in the web client.

You can see on the left side all the goodies needed for debugging: the Call Stack, the local and global variables…

And if you are interested, you can unzip the snapshot and have a look at what is in it: a set of MDC files, AL files and a version file.

Hope this helps!

Cloud migration changes with BC19 wave 2

With BC 19 wave 2 Microsoft improved the Cloud Migration functionality by re-designing the migration flow, cleaning up the errors reported by the partners over the summer, allowing migration of a larger set of GP tables with one additional extension installation, and removing the version 18 size limitations.

Improved migration flow

  • search “cloud migration management”
  • go through “Cloud Migration Setup”
    • before version 19, the replication and data upgrade were done in one step
  • now we have 2 separate steps, triggered from actions on the Cloud Migration Management page:
  1. “Run Migration Now” – for copy/replication; at the end the last “Cloud Migration Management” entry shows “Upgrade Pending”
  2. “Run Data Upgrade Now” – once data is copied it has to be validated/upgraded. Status will move through “Upgrade in Progress” and Completed.

The single "Run Migration Now" process was split into two steps to better handle cases where the data upgrade fails.

If that happens, you can restore the environment in the admin center to a point in time at the end of the replication step and repeat the data upgrade.

Cloud migration resiliency

  • use of upgrade tags: Upgrade Tags let you avoid running the same data upgrade multiple times, or skip an upgrade entirely. More here.
  • less locking when running cloud migration (more companies per run). There is still a limitation if the schema of the migrated tables exceeds the JSON object limit of 4 MB. To avoid locking, migrate companies in smaller chunks, especially if you get an error like this:
    • The size of the lookup activity result exceeds the limitation …
  • data repair after cloud migration: in previous versions some data could end up missing even though the migration was reported as successful; version 19 adds repair logic for this
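The upgrade-tag pattern mentioned above can be sketched as follows (object IDs and the tag value are illustrative): an upgrade codeunit checks a tag before doing the work and sets it afterwards, so the same upgrade never runs twice.

```al
codeunit 50100 "My Data Upgrade"
{
    Subtype = Upgrade;

    trigger OnUpgradePerCompany()
    var
        UpgradeTag: Codeunit "Upgrade Tag";
    begin
        if UpgradeTag.HasUpgradeTag(GetMyFixTag()) then
            exit; // already processed in a previous attempt

        // ... perform the actual data upgrade here ...

        UpgradeTag.SetUpgradeTag(GetMyFixTag());
    end;

    // Registering the tag makes fresh installs skip the upgrade entirely
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Upgrade Tag", 'OnGetPerCompanyUpgradeTags', '', false, false)]
    local procedure RegisterTags(var PerCompanyUpgradeTags: List of [Code[250]])
    begin
        PerCompanyUpgradeTags.Add(GetMyFixTag());
    end;

    local procedure GetMyFixTag(): Code[250]
    begin
        exit('MYAPP-001-SomeDataFix-20211101');
    end;
}
```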

Dynamics GP migration changes

  • Enabled migration of large GP tables (> 500 MB or > 100k records)
  • Support mapping of any custom table
  • added events for the GP-to-BC migration:
    • new sample Microsoft extension to create BC counterpart tables for GP tables (source code here). This allows for a much larger set of tables to be migrated to SaaS.
    • mapping between GP to BC tables
    • there is also a PowerShell script that generates an AL table object based on a SQL table

Upload limits

The 80 GB limitation is lifted. Any size is supported.

Clean up as much as possible before cloud migration

Some functionality may be disabled after cloud migration if the tenant is too large. For example, you might not be able to create a sandbox from Production if your capacity is 80 GB and Production has already reached 80 GB. Alternatively, you can upgrade your capacity.

Business Central 2021 wave 2: an overview of Data Management

Reasons why data size matters

  • Data size or rather tenant capacity is reflected in the price the end user pays
  • Data size influences performance: smaller databases allow for more efficient processing, faster backups, reports, record sorting, faster page display …

To see the current data size, for the entire database and per environment, in the admin center, click on the Capacity blade:

To increase the capacity of their environments or the number of environments, customers can purchase additional capacity add-ons through their partners (check this).

Under Customer subscriptions, their Partner can purchase additional capacity:

To stay within their license capacity, customers and/or their partners need to manage their data size and compression.

Handle and Compress Data

Users can start with the Table Information page and use the Data Administration action, or launch the Data Administration page directly from Search.

This page contains two lists: one displays data per table and the second summarizes data per company:

Through the Data Administration page, one can note or perform the following:

  • Refresh action: updates table sizes; can be scheduled via a job queue
  • Data Cleanup: various reports can be run to delete Change Logs, Document Archives, Invoiced Documents and so on.
  • Data Compression
    • various reports are used to delete and archive ledger entries and registers:
  • A link to the Companies page allows for deleting companies or copying an existing company under a different name:
  • A Retention Policies action brings up the list of records from the Retention Policy Setup table.
  • And drilling deeper, we can inspect the Retention Policy Setup record via Retention Policy Setup Card:
  • If you drill down on the Table ID you can see there are a limited number of tables that have defined retention policies:
  • To add your own table in the list above please read Stefano’s blog.
  • There is also a Data Administration Guide action. This wizard takes users through the retention policies list, the manage companies page, and data compression of different tables.
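The short version of what Stefano's blog covers: a custom table has to be registered as an allowed table for retention policies, typically from an install codeunit. A minimal sketch, with an illustrative table name and object ID:

```al
codeunit 50110 "Register Reten. Pol. Tables"
{
    Subtype = Install;

    trigger OnInstallAppPerCompany()
    var
        RetenPolAllowedTables: Codeunit "Reten. Pol. Allowed Tables";
    begin
        // makes "My Log Entry" selectable in Retention Policy Setup
        RetenPolAllowedTables.AddAllowedTable(Database::"My Log Entry");
    end;
}
```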

A few notes about archiving

  • the archiving mechanism is not a backup restore system
  • archiving is integrated into the Data Compression processes: each data compression report has an option to archive the deleted entries
  • archives can be exported to Excel or CSV files: look in the Data Archive list page
  • ability to archive any data to be deleted: start logging, delete data, stop logging.
  • Example: let's archive some test vendors:
  • A quick test of deleting a test vendor reveals the Data archive record and the excel (or csv) created:
  • archives are stored in media fields in the Data Archive table (Table Fields (json) and Table Data), which do not count toward the database size

For more, check the video "Data Management in the Business Central Application" by Microsoft's Tomas Navarro and Bardur Knudsen.

5 new features Business Central admins need to know

Microsoft keeps adding new features to all facets of Business Central, including the admin center.

The community has access to the BC Ideas portal to signal its wishes to Microsoft, and Microsoft delivers.

If you want to contribute to the future of Business Central, add your ideas to the BC Ideas portal.

1. How to access Admin URL

Well, this is not something new, but still admins need to know how to access the BC admin center.


2. Copy environments

A) Sandbox to Production

After testing your sandbox you can turn it into a production environment.

Click on a Sandbox environment

Click on Copy action:

A new dialog appears:

Enter the New environment name and the type of the environment, in this case Production, and then click on Copy.

B) Production to Sandbox

Navigate to environment home page and click on a Production environment.

In the next screen, pick a name for the new sandbox and click on Copy:

Confirm operation:

Copy is scheduled:

and later is Copying:

Note 1: you can also perform these 2 actions programmatically via APIs:

Note 2: Clean up or prepare data via 2 new events:
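A sketch of subscribing to such a cleanup event is below; the codeunit and event names are assumptions based on the System Application's "Environment Cleanup" codeunit, so verify them against your BC version:

```al
codeunit 50120 "My Copy Cleanup"
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Environment Cleanup", 'OnClearCompanyConfig', '', false, false)]
    local procedure ClearCompanyConfig(CompanyName: Text; SourceEnv: Enum "Environment Type"; DestinationEnv: Enum "Environment Type")
    begin
        // e.g. blank integration endpoints or disable job queues
        // in the copied company before users sign in
    end;
}
```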

3. Restrict access to a specific environment to only certain users (part of a security group)

A) Create security group in Azure AD or Microsoft 365 admin center

  • Open Admin Center
  • Navigate to Active teams & groups
  • Click Security
  • Select Add a group action
  • add owner(s) and member(s)

B) Assign security group to BC environment

Admin users will be allowed in all these environments.

To restrict access to an environment, create and assign a security group with zero members. In this case only admins have access.

4. Environment operations are now recorded in the Operations page:

Navigate to Operations:

5. Restart Environments:

  • Open the environment;
  • Click on Sessions;
  • Click on Restart Environment

6. Update apps in each environment

If you have apps installed in your environments and these apps have updates on AppSource, starting with BC 2021 wave 2 you can manage the apps and their upgrades from the admin center.

  • Click on one environment link
  • Choose Apps action

If the Available Update Action column shows "Action Required", click on it and go through the upgrade flow.

Business Central 2021 wave 2: documents have now a default line type. See how to set it up!

In Business Central, sales and purchase documents have lines and lines can be of different types:

  • comment line: ” “
  • “G/L Account”
  • “Item”
  • “Resource”
  • “Fixed Asset”
  • “Charge (Item)”

When editing a sales document line, a user would have to pick one of these values.

With BC 2021 wave 2, customers can default the value of line type to a value that is used most often.

E.g. if an end user has sales documents in which the Resource line type is very frequent, they could set the "Default Line Type" to "Resource" in the "Sales & Receivables Setup" page:

Let’s see it in action!

From the get-go, as soon as we open a new Sales Quote, we can see that the lines have “Resource” as line type:

If you manually change the Type to a different value, e.g. "G/L Account", the subsequent lines take their default from the line above:

Similar to sales documents, purchase documents have a default line type.

End users can set up their purchase line default type in the "Purchases & Payables Setup" page:

It is also possible to use a different line type, for example “Tutor”.

An AL developer would need to expand the “Sales Line Type” enum to include the new value:

enumextension 50100 SalesLineTypeWithTutorExt extends "Sales Line Type"
{
    value(50100; Tutor)
    {
        Caption = 'Tutor';
    }
}

Deploying the extended enum makes it possible to choose the new enum value:

And the sales documents would use the new line type as default:

Of course, there is more custom code to write to make the new enum usable in the documents, but about that in a future blog.

Have you seen the new “Chart of Accounts Overview” in BC 2021 Wave 2?

With Business Central 2021 Wave 2 there is a new page to inspect the chart of accounts.

Search for “Chart of Accounts Overview”.

This new page displays the chart of accounts in a tree structure.

To create a list page with a tree structure, a developer needs to set the "ShowAsTree" property, found under the Repeater control, to true:
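A minimal sketch of such a page (the object ID and names are illustrative):

```al
page 50130 "G/L Account Tree"
{
    PageType = List;
    SourceTable = "G/L Account";
    Editable = false;

    layout
    {
        area(Content)
        {
            repeater(Group)
            {
                ShowAsTree = true;                   // renders rows as an expandable tree
                IndentationColumn = Rec.Indentation; // depth comes from the Indentation field

                field("No."; Rec."No.") { ApplicationArea = All; }
                field(Name; Rec.Name) { ApplicationArea = All; }
            }
        }
    }
}
```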

The columns in the Overview page are similar to the ones in the original Chart Of Accounts list page.

The new page is more compact:

  • fewer fields: just the Balance, Net Change, "Income/Balance", "Account Subcategory", "Account Type" and Totaling are available
  • fewer lines: "End-Total" lines are out

Let's have a look at what the "Begin-Total" line looks like for Assets in the classic "Chart of Accounts":

And how the Assets “Begin Total” line looks in the Overview page:

We can see that the Net Change, Balance and Totaling values have been moved from the "End-Total" row into the "Begin-Total" row, and the "End-Total" row is no longer in the list.

The classic “Chart of Accounts” list page:

The new “Chart of accounts Overview”.


And expanded:

The Overview page does not allow for opening of G/L Account card page.

The Overview page does not allow for editing, inserting or deleting G/L accounts.

But if you want a compact page, with fewer fields and the option to quickly expand and collapse entire groups of accounts, then the new "Chart of Accounts Overview" is a useful alternative.

There is a new Posting Preview Type in Business Central. See how that works!

With BC v19 one of the Application changes affects the Posting Preview functionality.

Read more here.

The new Posting Preview feature can be enabled in the General Ledger Setup:

The way the posting preview worked until now is covered by Posting Preview Type = Standard.

So, if you don’t like the new Posting Preview (Extended) you can always use the previous one.

But first, let's recall what the original Posting Preview looks like.

First, Search for General Ledger Setup and set the Posting Preview Type to Standard:

Open a sales order and choose Preview Posting; the image below shows only one group of ledgers, the Related Entries group:

Let’s now head to the General Ledger Setup and set the Posting Preview Type to Extended:

Then re-open the Sales Order and click on Post – > Preview Posting:


  • we can now see 3 groups:
    • G/L Entries -> this is where we will find the G/L Entries
    • VAT Entries -> records from the VAT Entry table
    • Related Entries -> all the rest of the ledgers, including extension or custom entries
  • Show Hierarchical View is a toggle that controls whether the G/L entries and VAT entries in the posting preview are grouped by Account No. (toggle on) or shown as a flat list (toggle off).

And if we want to view the details we can use the toggle in the upper right corner of the group to expand or collapse the groups:

Of course, the new Posting Preview on journals looks and feels similar to the documents’ Posting Preview.

Hope this helps!

How checking financial journal in background works

With BC 2020 wave 2 a new feature was introduced that allows for background checks on journal lines.

See more about this new feature here.

“On the General Journal Batch page, you can choose Background Error Check to have Business Central validate financial journals, such as general or payment journals, while you’re working on them. When the validation is enabled, the Journal Check FactBox displays next to the journal lines and will show issues in the current line and the whole batch. Validation happens when you load a financial journal batch, and when you choose another journal line”.

Let’s see how that works.

From the General Journal page, lookup into Gen. Journal Batches:

Enable “Background Error Check”.

In the Default general journal batch we can now see a new factbox: "Journal Check":

We can observe that while we edit the journal the background check takes place.

If we click on the 3rd cue, “Issues Total”, we can see the errors:

We see that the Amount on the first line is 0.

Let’s update it to “12”:

We can now see that the error “Amount must not be empty” is gone, but we still have the error: “Document No. … is out of balance”.

Let’s update one of the lines so that the sum of all lines is 0.

After we update the Amount on the first line with -11 the errors are gone:

How does this background checking work?

The BC 2019 wave 2 release introduced a way for AL developers to start programming with multithreading/asynchronous concepts.

Developers can now calculate expensive operations in a page without blocking the UI, and then update the UI once the calculation is complete.

Read Microsoft document on background tasks here.
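In general terms, the page background task pattern looks like the sketch below; the names and object IDs are illustrative, not Microsoft's actual implementation:

```al
page 50140 "My Async Page"
{
    PageType = Card;
    SourceTable = Customer;

    layout { area(Content) { } }

    trigger OnAfterGetCurrRecord()
    var
        Args: Dictionary of [Text, Text];
    begin
        // enqueue the expensive work; the UI stays responsive meanwhile
        Args.Add('CustomerNo', Rec."No.");
        CurrPage.EnqueueBackgroundTask(HeavyTaskId, Codeunit::"My Heavy Calc", Args);
    end;

    trigger OnPageBackgroundTaskCompleted(TaskId: Integer; Results: Dictionary of [Text, Text])
    begin
        // called back on the page session once the task finishes
        if TaskId = HeavyTaskId then
            Message('Total: %1', Results.Get('Total'));
    end;

    var
        HeavyTaskId: Integer;
}

codeunit 50141 "My Heavy Calc"
{
    trigger OnRun()
    var
        Results: Dictionary of [Text, Text];
    begin
        // read input with Page.GetBackgroundParameters(), compute, then:
        Results.Add('Total', Format(42));
        Page.SetBackgroundTaskResult(Results);
    end;
}
```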

If we look at page 1921 "Journal Errors Factbox", the check is kicked off in OnAfterGetCurrRecord():

The method CheckErrorsInBackground() contains a line that enqueues the codeunit responsible for checking the general journal lines:

The check is done in the codeunit 9081 “Check Gen. Jnl. Line. Backgr.” in the OnRun() method:

Digging deeper, ultimately the standard codeunit 11 “Gen. Jnl.-Check Line” is run for each line.

For each journal line, errors are collected and made available as counts to the factbox:

For example, for the second cue, "Lines with Issues", the system uses the factbox method GetNumberOfLinesWithErrors():

What about custom validations?

How can we catch those?

In a table extension I added a text field:

Exposed it on the page via a page extension:

At last, in a codeunit, I subscribed to an event from codeunit 11 “Gen. Jnl.-Check Line”
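The subscriber looks roughly like this; the field name is the one from my table extension, and the event I picked, OnAfterCheckGenJnlLine, should be verified against the publishers available in codeunit 11 in your version:

```al
codeunit 50150 "My Jnl. Line Checks"
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Gen. Jnl.-Check Line", 'OnAfterCheckGenJnlLine', '', false, false)]
    local procedure CheckMyTestField(var GenJournalLine: Record "Gen. Journal Line")
    begin
        // TestField raises an error that the background check collects
        GenJournalLine.TestField("My Test Field");
    end;
}
```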

And, if we remove the value of “My test field” in one or more of the lines we can see the TestField error captured by the background task:

For more details, including the implementation of a completely new page background task, check Tobias Fenster's article.

Leveraging “Filter Tokens” codeunit to expand Business Central users’ filtering experience

Hello Readers!

A few weeks back, I watched Erik Hougaard's YouTube video "Make your own Date Filters in AL and Business Central" and thought of trying it and adding my own bit to it.

First, what was the intention with custom filter tokens?

The standard application comes already with some tokens.

Think of dates: when you press "t" in a date field you get today's date, when you press "q" in a date filter field you get the current quarter, and so on.

But what if we want to build our own tokens?

Custom Date Filters

For example, let’s assume that if I type “sv1” in a date filter I want the system to process my token into Jan 1st – Jan 31st. If I type “sv2” in a date filter I want the system to translate “sv2” into Feb 1st to Feb 29th or 28th depending on the current year, leap or not, and so on.

How can we do that? Extend the event OnResolveDateFilterToken from System Application codeunit “Filter Tokens” like in my sample code below:

[EventSubscriber(ObjectType::Codeunit, Codeunit::"Filter Tokens", 'OnResolveDateFilterToken', '', false, false)]
    local procedure CustomDateFilter(DateToken: Text; var FromDate: Date; var ToDate: Date; var Handled: Boolean)
    begin
        DateToken := UpperCase(DateToken);
        case DateToken of
            'SV1':
                begin
                    FromDate := GetFromDate(Today(), 1);
                    ToDate := GetToDate(Today(), 1);
                    Handled := true;
                end;
            'SV2':
                begin
                    FromDate := GetFromDate(Today(), 2);
                    ToDate := GetToDate(Today(), 2);
                    Handled := true;
                end;
            'SV3':
                begin
                    FromDate := GetFromDate(Today(), 3);
                    ToDate := GetToDate(Today(), 3);
                    Handled := true;
                end;
            'SV4':
                begin
                    FromDate := GetFromDate(Today(), 4);
                    ToDate := GetToDate(Today(), 4);
                    Handled := true;
                end;
            'SV5':
                begin
                    FromDate := GetFromDate(Today(), 5);
                    ToDate := GetToDate(Today(), 5);
                    Handled := true;
                end;
            'SV6':
                begin
                    FromDate := GetFromDate(Today(), 6);
                    ToDate := GetToDate(Today(), 6);
                    Handled := true;
                end;
            'SV7':
                begin
                    FromDate := GetFromDate(Today(), 7);
                    ToDate := GetToDate(Today(), 7);
                    Handled := true;
                end;
            'SV8':
                begin
                    FromDate := GetFromDate(Today(), 8);
                    ToDate := GetToDate(Today(), 8);
                    Handled := true;
                end;
            'SV9':
                begin
                    FromDate := GetFromDate(Today(), 9);
                    ToDate := GetToDate(Today(), 9);
                    Handled := true;
                end;
            'SV10':
                begin
                    FromDate := GetFromDate(Today(), 10);
                    ToDate := GetToDate(Today(), 10);
                    Handled := true;
                end;
            'SV11':
                begin
                    FromDate := GetFromDate(Today(), 11);
                    ToDate := GetToDate(Today(), 11);
                    Handled := true;
                end;
            'SV12':
                begin
                    FromDate := GetFromDate(Today(), 12);
                    ToDate := GetToDate(Today(), 12);
                    Handled := true;
                end;
        end;
    end;

    local procedure GetFromDate(Dt: Date; mo: Integer): Date
    begin
        exit(DMY2Date(1, mo, Date2DMY(Dt, 3)));
    end;

    local procedure GetToDate(Dt: Date; mo: Integer): Date
    begin
        case mo of
            1, 3, 5, 7, 8, 10, 12:
                exit(DMY2Date(31, mo, Date2DMY(Dt, 3)));
            4, 6, 9, 11:
                exit(DMY2Date(30, mo, Date2DMY(Dt, 3)));
            else
                // February: simple leap-year test (every 4th year; ignores the century rules)
                if Date2DMY(Dt, 3) mod 4 = 0 then
                    exit(DMY2Date(29, mo, Date2DMY(Dt, 3)))
                else
                    exit(DMY2Date(28, mo, Date2DMY(Dt, 3)));
        end;
    end;

The code could be refactored into a single function that parses a token of the form "svX" and calls GetFromDate and GetToDate once instead of 12 times, but that's not the goal of this blog.
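For the curious, such a refactoring could look like the sketch below; the case statement would then collapse to a single call:

```al
local procedure TryResolveMonthToken(DateToken: Text; var FromDate: Date; var ToDate: Date): Boolean
var
    MonthNo: Integer;
begin
    // expects an upper-cased token such as 'SV1'..'SV12'
    if not DateToken.StartsWith('SV') then
        exit(false);
    if not Evaluate(MonthNo, CopyStr(DateToken, 3)) then
        exit(false);
    if (MonthNo < 1) or (MonthNo > 12) then
        exit(false);
    FromDate := GetFromDate(Today(), MonthNo);
    ToDate := GetToDate(Today(), MonthNo);
    exit(true);
end;
```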

Let’s test it.

Open Chart of Accounts page and use the flow filters in the “Filter Totals By” section of the page as below:

What about text filters? Can we customize them?

Custom Text Filters

This is the use case: each user has access to their own list of customers (the My Customer page):

Users can edit their own list of customers, adding/removing customers.

We also want, when we are in the Customers list, to be able to quickly filter the list of customers to the list in My Customers.

We can create a custom text filter token and by subscribing to event OnResolveTextFilterToken in codeunit “Filter Tokens” we get the functionality desired, like below:

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Filter Tokens", 'OnResolveTextFilterToken', '', true, true)]
    local procedure CustomTextFilter(TextToken: Text; var TextFilter: Text; var Handled: Boolean)
    var
        _mc: Record "My Customer";
        _maxloops: Integer;
    begin
        _maxloops := 10;
        TextToken := UpperCase(TextToken);
        case TextToken of
            'SV':
                begin
                    _mc.SetRange("User ID", UserId());
                    if _mc.FindSet() then begin
                        _maxloops -= 1;
                        TextFilter := _mc."Customer No.";
                        if _mc.Next() <> 0 then
                            repeat
                                _maxloops -= 1;
                                TextFilter += '|' + _mc."Customer No.";
                            until (_mc.Next() = 0) or (_maxloops <= 0);
                    end;
                    Handled := true;
                end;
        end;
    end;

In the Customers List we can now use the new token:

When users filter the “No.” field to “%sv” the system finds all Customer “No.” in My Customer list and populates the filter for “No.” field.

My Customer list consists of customers 20000, 30000, and 50000, therefore when using the custom text filter "%sv" I get the list of my customers.

You could similarly create a custom token to filter Chart of Accounts to G/L accounts in “My Accounts”.

Things to consider

The token above “sv” would be triggered and parsed in any page.

For example, if we are in the Vendors list, the same list (20000, 30000 and 50000) will be the result of parsing the "sv" token. And that might not be what we need.

A possible solution is to specialize the custom filters to customers or to vendors, like having 2 tokens: “csv” for customers and “vsv” for vendors.

For more considerations when using custom tokens read here.

Go on, try them!

“Field Selection” codeunit – how I select and record the ID of a field in Business Central

Hello readers!

Recently I have been working on a customization for a customer with the goal of changing the out-of-the-box Positive Pay export for a Bank Account record.

While preparing the mapping for the positive pay details, I noticed how Microsoft implemented picking a field ID: they created a new codeunit, "Field Selection".

Let’s see how to get to that piece of code:

  • in BC, search for “Data Exchange Definitions”
  • Click on any Exchange Definition Code, then in the Line Definitions, click on Manage and “Field Mapping”
  • In the “Field Mapping”, click on “Field ID” lookup “…”
  • the list of fields in table 330 is displayed:

This lookup page is triggered by the OnLookup trigger on page 1217 “Data Exch Field Mapping Part”:

field("Field ID"; "Field ID")
{
    ApplicationArea = Basic, Suite;
    ShowMandatory = true;
    ToolTip = 'Specifies the number of the field in the external file that is mapped to the field in the Target Table ID field, when you are using an intermediate table for data import.';

    trigger OnLookup(var Text: Text): Boolean
    var
        "Field": Record "Field";
        TableFilter: Record "Table Filter";
        FieldSelection: Codeunit "Field Selection";
    begin
        Field.SetRange(TableNo, "Table ID");
        if FieldSelection.Open(Field) then begin
            if Field."No." = "Field ID" then
                exit;
            Validate("Field ID", Field."No.");
            FieldCaptionText := GetFieldCaption;
        end;
    end;

    trigger OnValidate()
    begin
        FieldCaptionText := GetFieldCaption;
    end;
}

The OnLookup trigger is using codeunit 9806 “Field Selection”. This codeunit, as well as its associated page, Page 9806 “Fields Lookup” can be located in Microsoft System Application app.

You can go through the source code for this codeunit here.

As you can see below, the main function Open is fairly simple:
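Roughly, Open runs the "Fields Lookup" page in lookup mode over the filtered Field record and, on OK, returns the selected field. The sketch below is a paraphrase from memory, not the exact System Application source; see the link above for the real code:

```al
procedure Open(var SelectedField: Record "Field"): Boolean
var
    FieldsLookup: Page "Fields Lookup";
begin
    // the caller's filters (e.g. TableNo) carry over to the lookup page
    FieldsLookup.SetTableView(SelectedField);
    FieldsLookup.LookupMode(true);
    if FieldsLookup.RunModal() = Action::LookupOK then begin
        FieldsLookup.GetSelectedFields(SelectedField);
        exit(true);
    end;
    exit(false);
end;
```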

Let’s put these objects into a simple practice exercise.

I am going to create a new table and, in it, a new field in which I plan to record the ID of the Customer "No." field.

Here is field definition in the table:

        field(2; "Customer No Field ID"; Integer)
        {
            DataClassification = CustomerContent;
        }

And here is the field definition on the list page:

                field(MyField2; Rec."Customer No Field ID")
                {
                    ApplicationArea = All;

                    trigger OnLookup(var Text: Text): Boolean
                    var
                        RecField: Record "Field";
                        FieldSelection: Codeunit "Field Selection";
                    begin
                        RecField.SetRange(TableNo, Database::Customer);
                        if RecField.Get(Database::Customer, Rec."Customer No Field ID") then;

                        if FieldSelection.Open(RecField) then
                            Rec.Validate("Customer No Field ID", RecField."No.");
                    end;
                }

First, we filter the Field record (RecField) to the Customer table, then we execute the "Field Selection" Open method, which in turn displays the "Fields Lookup" page. Lastly, I validate my new field "Customer No Field ID" against the result of the lookup:

In the Lookup page users can pick the field and the field ID needed:

The goal of this exercise was to remind those who knew, and to make aware those who didn't, that these Microsoft System Application objects exist and can be used in daily tasks instead of being re-invented each time.

Moreover, these objects are part of the platform (designed and tested by Microsoft), therefore, we have every reason to use them.

Hope this helps!