Business Central and Twilio: let’s get the message out!

Most of my readers are familiar with “as a Service” acronyms like IaaS, PaaS, SaaS, and Microsoft’s Azure platform supports each of these concepts. Below are a few examples:

I was introduced recently to the concept of CPaaS – Communication Platform as a Service and its multifaceted applications in the modern world.

Microsoft Azure has its own implementation of CPaaS via Azure Communication Services. See more about it on the Microsoft Learn platform.

I would like to share my experience with a Twilio product in this blog post.

The company focuses on building platforms for communication channels and applications:

Their SMS and WhatsApp messaging platform caught my attention.

This is the exercise I worked through: a sales order in Business Central gets posted. This triggers a POST request to an Azure Function which, in turn, leverages Twilio’s messaging API to send a WhatsApp message via the Twilio platform.

Business Central – Azure Function POST request

In Business Central, subscribe to an event raised during posting of a sales order. The SMS/WhatsApp message could also be sent at various other points in a document’s lifecycle.

codeunit 50100 "Sales Mgt.SVIR"
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnAfterFinalizePostingOnBeforeCommit', '', false, false)]
    local procedure CallTwilioAzureFunc(var EverythingInvoiced: Boolean; WhseReceive: Boolean; WhseShip: Boolean;
            CommitIsSuppressed: Boolean;
            PreviewMode: Boolean;
            var GenJnlPostLine: Codeunit "Gen. Jnl.-Post Line";
            var ReturnReceiptHeader: Record "Return Receipt Header";
            var SalesCrMemoHeader: Record "Sales Cr.Memo Header";
            var SalesHeader: Record "Sales Header";
            var SalesInvoiceHeader: Record "Sales Invoice Header";
            var SalesShipmentHeader: Record "Sales Shipment Header")
    var
        _cust: Record Customer;
        _phone: Text;
        _message: Text;
        _jsonBody: Text;
        _httpContent: HttpContent;
        _httpResponse: HttpResponseMessage;
        _httpHeader: HttpHeaders;
        _httpClient: HttpClient;
        _url: Label ''; // paste your Azure Function URL here
    begin
        if PreviewMode then
            exit;
        if SalesHeader.IsTemporary then
            exit;
        if _cust.Get(SalesHeader."Sell-to Customer No.") then
            if _cust."Phone No." <> '' then begin
                // build json for POST request with customer cell phone number and the message to send
                _phone := '1' + _cust."Phone No.";
                _message := SalesHeader."No." + ' was just processed.';
                _jsonBody := '{"PhoneTo":"' + _phone + '","message":"' + _message + '"}';

                // build POST http request
                _httpContent.WriteFrom(_jsonBody);
                _httpContent.GetHeaders(_httpHeader);
                _httpHeader.Remove('Content-Type');
                _httpHeader.Add('Content-Type', 'application/json');
                _httpClient.Post(_url, _httpContent, _httpResponse);
                if not _httpResponse.IsSuccessStatusCode then
                    Error('Error - Received %1', Format(_httpResponse.HttpStatusCode));
            end;
    end;
}
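For reference, the JSON body that the AL subscriber builds through string concatenation can be sketched and validated in Python (the field names PhoneTo and message come from the AL snippet; the phone and order numbers here are placeholders):

```python
import json

# Same shape as the body built in the AL subscriber above;
# the phone number and order number are placeholder values.
payload = {"PhoneTo": "15551234567", "message": "S-ORD-1001 was just processed."}
body = json.dumps(payload)
print(body)
```

Building the body with a JSON library instead of concatenation also takes care of escaping, should a message ever contain quotes.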

Set up Twilio sandbox

Before creating the Azure Function, set up your free-trial Twilio account.

There is a link on the company’s website for signing up for a free trial.

If you decide to promote the project to production you can upgrade and the monthly bill will consist of charges for:

  • number of phone numbers rented from Twilio and
  • number of messages sent and received.

After the trial, pricing is $1 per month per phone number and $0.007 per message.
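As a back-of-the-envelope check at those list prices (the volumes below are hypothetical):

```python
# One rented number at $1.00/month, 5,000 messages at $0.007 each
phone_numbers = 1
messages = 5000
monthly_cost = phone_numbers * 1.00 + messages * 0.007
print(f"${monthly_cost:.2f}")
```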

You would need to set up:

  • a Messaging Service for SMS
  • a Twilio phone number for WhatsApp messages

This is what the Twilio UI looks like for setting up a Messaging Service:

And this is the screen where you get assigned a Twilio number for WhatsApp messages:

From the Twilio platform, let’s attempt to send an SMS:

You need to choose:

  • the destination for your test SMS
  • the Twilio Messaging Service
  • the text of the test SMS

On the right side, the Twilio platform displays sample source code in several programming languages.

Below is the C# source code for sending an SMS via the Twilio platform.

This sample code can go into an Azure function:

using System;
using System.Collections.Generic;
using Twilio;
using Twilio.Rest.Api.V2010.Account;
using Twilio.Types;

class Example
{
    static void Main(string[] args)
    {
        var accountSid = "ACe00a276056a4a696ff14691a3bb40f3b";
        var authToken = "[AuthToken]";
        TwilioClient.Init(accountSid, authToken);

        var messageOptions = new CreateMessageOptions(
            new PhoneNumber("+16199307581"));
        messageOptions.MessagingServiceSid = "MGb06079125586edf64f3c13f51843d07e";
        messageOptions.Body = "Testing for blog";

        var message = MessageResource.Create(messageOptions);
    }
}

Similarly, to get the source code for sending a WhatsApp message, go to the “Send a WhatsApp message” blade.

And after connecting to Twilio assigned sandbox you get this on your phone:

Click on “Send One Way” message:

There are 3 supported use cases: Appointment Reminders, Order Notification, Verification Codes. For my example I will choose Order Notification:

And below is the C# sample code for sending the WhatsApp message:

using System;
using System.Collections.Generic;
using Twilio;
using Twilio.Rest.Api.V2010.Account;
using Twilio.Types;

class Example
{
    static void Main(string[] args)
    {
        var accountSid = "ACe00a276056a4a696ff14691a3bb40f3b";
        var authToken = "[AuthToken]";
        TwilioClient.Init(accountSid, authToken);

        var messageOptions = new CreateMessageOptions(
            new PhoneNumber("whatsapp:+16199307581"));
        messageOptions.From = new PhoneNumber("whatsapp:+14155238886");
        messageOptions.Body = "Your Yummy Cupcakes Company order of 1 dozen frosted cupcakes has shipped and should be delivered on July 10, 2019. Details:";

        var message = MessageResource.Create(messageOptions);
    }
}

A handshake has to take place between a target phone number and the Twilio sandbox.

For the receiving phone to join the Twilio sandbox, send the message “join fall-wood” from the receiving phone to the Twilio sandbox number.

This could also be triggered with an Azure Function POST request from Business Central that sends the WhatsApp message to the Twilio number.

Writing the Azure function

When it comes to creating the Azure Function, one could:

  • develop inside the Azure portal:
    • log in to the Azure portal
    • create a new function app and a new trigger function
    • there is Test/Debug functionality as well as a Monitor
  • develop in VS Code
  • my choice of development environment was Visual Studio:
    • create a new Azure Function project
    • create a new trigger function
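Whatever the tooling, the core of the function, forwarding the parsed JSON payload to Twilio, boils down to one authenticated, form-encoded POST to Twilio’s Messages REST endpoint. A Python sketch of that request (the endpoint shape and the From/To/Body parameters follow Twilio’s public REST API; the helper name and the phone numbers are illustrative):

```python
import base64
import urllib.parse
import urllib.request

def build_twilio_request(account_sid: str, auth_token: str,
                         from_number: str, to_number: str, body: str):
    # Twilio's Messages endpoint takes a form-encoded POST with HTTP Basic auth
    url = f"https://api.twilio.com/2010-04-01/Accounts/{account_sid}/Messages.json"
    data = urllib.parse.urlencode(
        {"From": from_number, "To": to_number, "Body": body}).encode()
    request = urllib.request.Request(url, data=data, method="POST")
    credentials = base64.b64encode(f"{account_sid}:{auth_token}".encode()).decode()
    request.add_header("Authorization", f"Basic {credentials}")
    return request

# For WhatsApp, both numbers carry the "whatsapp:" prefix, as in the C# samples.
# A real function would send this with urllib.request.urlopen(request).
req = build_twilio_request("AC...", "[AuthToken]", "whatsapp:+14155238886",
                           "whatsapp:+15551234567", "Your order has shipped.")
```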

The last step is to publish the function to Azure:

Once deployed, click on “Get Function URL” and paste it into your AL code:

Publish extension:

Test Scenario

Test it by posting a sales order whose customer has a phone number that joined the Twilio sandbox. The message appears in WhatsApp:

With the transition to the cloud, Business Central can now integrate with new and emerging communication platforms like Twilio.

Happy development!

An on-demand subscriber with AL

Use Case

An end-user wants the company address on all company reports, but, when printing checks, the address should be a different one.


I had this use case come up in a few instances and meant to write a few lines about it, but kept postponing, until now. You can add custom address fields on the Company Information page (and table) or any other custom table/page pair. The idea is that sales invoices (and all other system reports) should display one address (the company address from the Company Information page), whereas some other report, like the check report, should print an alternate address. Why? Well, say Accounts Payable is in Los Angeles while the warehouse and the head office are in Denver.


How can we accommodate this, while fully taking advantage of the framework Microsoft has made available via Format Address codeunit 365?

I decided to call it on-demand subscribing. When I run the check report I want to subscribe to the OnBeforeCompany event and execute some custom code, whereas when the sales invoice (or any other system report) runs, I want to bypass the code in my subscriber.

This is the base Company function code in “Format Address” codeunit:

    procedure Company(var AddrArray: array[8] of Text[100]; var CompanyInfo: Record "Company Information")
    var
        IsHandled: Boolean;
    begin
        IsHandled := false;
        OnBeforeCompany(AddrArray, CompanyInfo, IsHandled);
        if IsHandled then
            exit;

        with CompanyInfo do
            FormatAddr(
                AddrArray, Name, "Name 2", '', Address, "Address 2",
                City, "Post Code", County, '');
    end;

And this would be my subscriber to the OnBeforeCompany event:

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Format Address", 'OnBeforeCompany', '', false, false)]
    local procedure CompanyAlternateAddress(var AddrArray: array[8] of Text[100]; var CompanyInfo: Record "Company Information"; var IsHandled: Boolean)
    var
        FormatAddr: Codeunit "Format Address";
    begin
        FormatAddr.FormatAddr(AddrArray, CompanyInfo."Alt. Name.SVI",
                                            CompanyInfo."Alt. Name 2.SVI",
                                            CompanyInfo."Alternate Address.SVI",
                                            CompanyInfo."Alternate Address2.SVI",
                                            CompanyInfo."Alternate City.SVI",
                                            CompanyInfo."Alternate Zip Code.SVI",
                                            CompanyInfo."Alternate Country.SVI");
        IsHandled := true;
    end;

How would I execute the subscriber only when coming from the check report and not on all other reports utilizing FormatAddr.Company function?

There might be other ways, but the way I always go with is using BindSubscription and UnBindSubscription platform functions.

Then, in the check report, wrap the FormatAddr.Company call with BindSubscription and UnbindSubscription like below:

    var
        Codeunit50010: Codeunit "Check Address Manual Events";
    ...
        // code for check report
        BindSubscription(Codeunit50010);
        FormatAddr.Company(CompanyAddr, CompanyInfo);
        UnbindSubscription(Codeunit50010);

Do not forget to set the property EventSubscriberInstance, of the bound codeunit, to Manual.

codeunit 50010 "Check Address Manual Events"
{
    EventSubscriberInstance = Manual;

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Format Address", 'OnBeforeCompany', '', false, false)]
    local procedure CompanyAlternateAddress(var AddrArray: array[8] of Text[100]; var CompanyInfo: Record "Company Information"; var IsHandled: Boolean)
    var
        FormatAddr: Codeunit "Format Address";
    begin
        FormatAddr.FormatAddr(AddrArray, CompanyInfo."Alt. Name.SVI",
                                            CompanyInfo."Alt. Name 2.SVI",
                                            CompanyInfo."Alternate Address.SVI",
                                            CompanyInfo."Alternate Address2.SVI",
                                            CompanyInfo."Alternate City.SVI",
                                            CompanyInfo."Alternate Zip Code.SVI",
                                            CompanyInfo."Alternate Country.SVI");
        IsHandled := true;
    end;
}


  • on-demand subscribers need to live in a codeunit whose EventSubscriberInstance = Manual;
  • on-demand subscribers are brought into memory with BindSubscription
  • Codeunits having EventSubscriberInstance = Manual should contain event subscribers serving one specific use case only.

Hope this helps!

How to get disk usage per folder

Recently I received a question coming to the support ticketing system on how to get a listing of all folders inside c:\Users folder and their sizes.

The sys admin who asked the question had hundreds of folders in C:\Users and did not want to do “right click” + Properties on each one.

I started digging a bit and almost wanted to write my own little dot net app, but …

Fortunately, I found this gem: Sysinternals Disk Usage (du):

I downloaded it and gave it a try:

What I needed was a breakdown of all folders inside the C:\Users folder so that the admin could see which user takes the most space on an RDS.

I used this command: “du -q -l 1 c:\Users”
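The same idea works on Linux/macOS with the built-in du; here it is run against a throwaway folder so the output is predictable (-d 1 limits the listing to one level deep, like -l 1 above):

```shell
# build a tiny demo tree, then report per-folder usage one level deep
mkdir -p /tmp/du-demo/alice /tmp/du-demo/bob
dd if=/dev/zero of=/tmp/du-demo/alice/data.bin bs=1024 count=16 2>/dev/null
du -d 1 /tmp/du-demo
```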

Hope this helps you!

Say Hello to the new “Performance Profiler”

While going through what is new in BC 2022 wave 1, I found a cool tool.

Here comes the new Performance Profiler. While many have implemented telemetry and gained some insight into their processes, this is a step forward from Microsoft, bringing telemetry tools into the BC user interface.

Now, the concept of a profiler is not futuristic functionality; it has been implemented on many platforms, like the Java profilers, SQL Server Profiler, or even the MS Edge profiler.

A profiler is a code monitoring tool, a tool that helps trace, recreate, and troubleshoot various issues in code.

Nevertheless, seeing a profiler in Business Central is so refreshing.

Let’s do a quick test to see how that works at high level.

I added an action on the Customer List page that does nothing.

Well, it does do something: it waits for 5 seconds before ending.

pageextension 50100 CustomerListExt extends "Customer List"
{
    actions
    {
        addlast(processing)
        {
            action(TestProfiler)
            {
                ApplicationArea = All;

                trigger OnAction()
                var
                    _DateTime: DateTime;
                begin
                    _DateTime := CurrentDateTime();
                    _DateTime := CreateDateTime(DT2Date(_DateTime), DT2Time(_DateTime) + 5000);
                    while (CurrentDateTime() < _DateTime) do;
                end;
            }
        }
    }

    trigger OnOpenPage();
    begin
        Message('App published: Hello world');
    end;
}


I published the extension; let’s see if the action is captured by the Performance Profiler.

As the documentation recommended, I opened the Performance Profiler in its own page.

On the Business Central main browser window go to the Customer List and locate the new action:

On the Performance Profiler, click on the Start action.

Then launch the TestProfiler action.

When control returns to the main BC browser page, head over to the Performance Profiler page and click the Stop action. And here are the results:

We can see 2 regions:

  • one that shows what application extensions were detected between Start and Stop.
  • a second one, a technical part-page, with 2 sub-lists:
    • one showing the time spent by application object
    • second showing a call tree with timing for each branch

A very cool feature, and one I wish I had had in all previous BC versions.

How was this designed?

The main page Performance Profiler is based on Page 24 “Performance Profiler” in the Microsoft System Application.

Lots of other objects are involved in bringing this new Performance Profiler feature to life:

Take a look below to compare the Microsoft System Application in 19.5 versus 20.

With BC 2022 wave 1, the system application comes with a set of new objects to support Profiling.

All in all, Performance Profiler is a great addition to the Business Central platform.

This will help consultants locate faulty or slow code, record it, and download and send the recording to the authors of the poorly performing code.

Knowing your Business Central data using the Page Inspection

With the promotion of Dynamics NAV’s successor, Business Central, as the only Microsoft SMB ERP in the cloud, the community gained a significant number of new users coming from other products.

The introduction of the Cloud Migration feature (or Data Migration) in Business Central has allowed delivery teams in the partner space to bring into BC SaaS many end-users from Dynamics GP or SL, as well as from non-Microsoft ERPs like QuickBooks.

Those who consider themselves new to Business Central might find it difficult to navigate and find what they need in their daily work with the new ERP.

This is when knowing about the Page Inspection page comes in handy.

To enable the Page Inspection pane, you can either:

  • press CTRL + ALT + F1, or
  • navigate to “?” in the top right corner -> click on Help and Support -> look for the “Inspect Pages and Data” link:

Once enabled, Page Inspection appears as a vertical frame in the browser window and allows users to see the components of each page.

Click on various page components and notice how the Page Inspection updates. Look for:

  • the fields included in that component
  • the extensions that touched the current page/component
  • existing filters for that component/page


Enabling Page Inspection on the Business Manager Role Center and selecting the frame in the middle uncovers valuable information:

The frame is actually a Card Page based on a BC table “Activities Cue” and the card page itself is Page 1310 “O365 Activities”.

Some users might ask:

  1. What are the records stored in this table?

To see the records, there are a few options:

In the page inspection pane, click on the “View Table” link:

This will open up the default page for table 1313.

  • Another way of displaying all records in a table is to go to the Table Information page, search for table 1313, and click on the flow field “No. of Records”.
  • Ultimately, users can run the table from the URL. Copy the link up to and including the environment name (in my example below, include everything up to “Production”), then add “/?table=1313”.
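That last option amounts to simple string surgery on the web client link (the base URL below is a placeholder for your own tenant’s link, trimmed after the environment name):

```python
# everything up to and including the environment name, per the note above;
# the tenant id is a placeholder
base_url = "https://businesscentral.dynamics.com/<tenant-id>/Production"
table_url = base_url + "/?table=1313"
print(table_url)
```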

2. What data is available for each record?

There are three Tabs at the top:

  • Table Fields
    • use the magnifying glass (under Page Filters) to filter the fields by name or value
    • not all fields available in the table are displayed on the page. You can try to personalize your page or ask your partner to help you. To enable a field for all users within one profile, you might want to ask your admin to customize the page.
  • Extensions
    • Custom code affects tables and pages. You can easily see what extensions have touched the current table/page

Note: you can easily detect which fields belong to which extension as new extensions need to include a suffix or prefix for each field/action/page/table.

  • Page Filter: filters are often used to display the data in Business Central. This tab gives good clues into how the data has been filtered:

If you do not see the details you’re expecting to see in the Page Inspection page, you probably do not have the right permission. Talk to your admin and ask them to give you D365 Troubleshoot permission set or user group.

For some Microsoft official documentation on the Page Inspection read this.

Getting started with Snapshot Debugging in Business Central

Happy new year, readers! Time for another Feynman technique exercise.

Today I tried for the first time (duh, Silviu… it’s been out there for at least a year!) to debug a snapshot.

I started my exercise on Microsoft Docs here. Also, very useful was Stefano’s blog.

But first, what is a snapshot? A snapshot is a recording of executed code in Business Central.

The idea is that when you want to investigate an error in one of your environments (I’ll be showing screenshots of SaaS), you would start the recorder (from VS Code), perform the action you want to investigate, stop the recorder. Then re-play the recording. Simple!

Well, I ran into a few issues, therefore, for my sake in the future debugging sessions and for the interested readers, I’ll recap what I did to be able to replay a snapshot for debugging.

User debugging settings

First, the user that will connect to the SaaS for debugging purposes should be part of a group called D365 SNAPSHOT DEBUG:

Point Snapshot to the right environment

Most SaaS development environments I come across have one configuration in launch.json.

For snapshot debugging though, you need an additional configuration:

I added the first configuration for this exercise.

The key element is the sessionId.
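A snapshot configuration in launch.json looks roughly like this (property names follow the AL extension’s snapshot schema; the environment name and sessionId value are placeholders you replace with your own):

```json
{
    "name": "snapshotInitialize: Production",
    "type": "al",
    "request": "snapshotInitialize",
    "environmentType": "Production",
    "environmentName": "Production",
    "breakOnNext": "WebClient",
    "sessionId": 123456
}
```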

To find the sessionId you need to go to the admin console:

a) Navigate to the admin center and

b) Click on Environments

c) Launch action Sessions

d) Refresh

e) Take note of your session

Note: this step can be a bit tricky, because you might have plenty of BC sessions open and end up recording the wrong one. I usually cancel all sessions under my name, close all BC windows, make sure there is no active session under my name, log in to BC, check the session id, update the configuration, and start recording.

Start Recording

In VS Code, start recording by pressing F7 or, from the Command Palette, launch AL: Initialize Snapshot Debugging.

Play use case

In BC, play your use case.

For example, in my environment I had a Sales Order without External Document No.

Open the Sales Order and attempt to Post.

Get the error referring to the missing “External Document No.”.

Move then to VS Code to stop the recording.

Stop Recording

In VS Code, stop recording. Use ALT + F7 or, from the Command Palette, launch AL: Finish snapshot debugging on the server. In the Output pane of VS Code you should see something like this:

Replay recording

In VS Code, on the left side of the toolbar, there is a small button showing all snapshots.

Click on it, and from the top choose the desired snapshot, in my case last snapshot is the one on top:

After choosing the snapshot, the system will automatically play it, stopping through each breakpoint and ending up at the line of code responsible for the error encountered in the web client.

You can see on the left side all the goodies needed for debugging: the Call Stack, the local and global variables…

And if you are interested, you can unzip the snapshot and have a look at what is in it: a set of MDC files, AL files and a version file.

Hope this helps!

Cloud migration changes with BC19 wave 2

With BC 19 wave 2 Microsoft improved the Cloud Migration functionality by re-designing the migration flow, cleaning up the errors reported by the partners over the summer, allowing migration of a larger set of GP tables with one additional extension installation, and removing the version 18 size limitations.

Improved migration flow

  • search for “cloud migration management”
  • go through “Cloud Migration Setup”
    • before version 19, the replication and the data upgrade were done in one step
    • now there are 2 separate steps, triggered from the Cloud Migration Management page actions:
  1. “Run Migration Now” – for the copy/replication; at the end, the last “Cloud Migration Management” entry shows “Upgrade Pending”
  2. “Run Data Upgrade Now” – once data is copied, it has to be validated/upgraded. The status moves through “Upgrade in Progress” to “Completed”.

The split of the single “Run Migration Now” process into two steps was introduced to better manage cases where the data upgrade fails.

If that happens, in the admin center you can restore the environment to its state at the end of the replication step and repeat the data upgrade.

Cloud migration resiliency

  • use of upgrade tags: use Upgrade Tags to avoid processing a data upgrade multiple times, or to skip the upgrade completely. More here.
  • less locking when running cloud migration (more companies per run). There is still a limitation if the schema for migrated tables exceeds a JSON object limit of 4 MB. To avoid locking, migrate companies in smaller chunks, especially if you get an error like this:
    • The size of the lookup activity result exceeds the limitation …
  • data repair after cloud migration: in the previous version, some data could be missing while the migration was reported as successful

Dynamics GP migration changes

  • Enabled large GP tables (> 500 MB or > 100k records)
  • Support for mapping any custom table
  • Added events for the GP to BC migration:
    • a new sample Microsoft extension creates BC counterpart tables for GP tables (source code here). This allows a much larger set of tables to be migrated to SaaS.
    • mapping between GP and BC tables
    • there is also a PowerShell script that generates an AL table based on a SQL table

Upload limits

The 80 GB limitation is lifted. Any size is supported.

Clean up as much as possible before cloud migration

Some functionality may be disabled after cloud migration if the tenant is too large. For example, you might not be able to create a sandbox from Production if your capacity is 80 GB and Production has already reached 80 GB. Alternatively, you can upgrade capacity.

Business Central 2021 wave 2: an overview of Data Management

Reasons why data size matters

  • Data size or rather tenant capacity is reflected in the price the end user pays
  • Data size influences performance: smaller databases allow for efficient processing, faster backups, reports, sorting records, faster page display …

To see the current data size, for the entire database and per environment, in the admin center, click on the Capacity blade:

To increase the capacity of their environments or the number of environments, customers can purchase additional capacity add-ons through their partners (check this).

Under Customer subscriptions, their Partner can purchase additional capacity:

To stay within their license capacity, customers and/or their partners need to handle their data size and compression.

Handle and Compress Data

Users can start with the Table Information page and use the Data Administration action, or launch the Data Administration page directly from Search.

This page/view contains two lists: one displays data per table and the second list summarizes data per company:

Through Data Administration page one can note or perform the following:

  • Refresh action: updates tables’ size and can be scheduled via a job queue
  • Data Cleanup: various reports can be run to delete Change Logs, Document Archives, Invoiced Documents and so on.
  • Data Compression
    • various reports are used to delete and archive ledger entries and registers:
  • A link to the Companies page allows for deleting companies or copying an existing company under a different name:
  • A Retention Policies action brings up a list of records from the Retention Policy Setup table.
  • Drilling deeper, we can inspect a Retention Policy Setup record via the Retention Policy Setup card:
  • If you drill down on the Table ID, you can see there is a limited number of tables with defined retention policies:
  • To add your own table to the list above, read Stefano’s blog.
  • There is also a Data Administration Guide action. This wizard takes users through the retention policies list, the manage companies page, and data compression of different tables.

A few notes about archiving

  • the archiving mechanism is not a backup/restore system
  • archiving is integrated into the Data Compression processes: each data compression report has an archiving option
  • archives can be exported to Excel or CSV files: look in the Data Archive list page
  • ability to archive any data about to be deleted: start logging, delete the data, stop logging
  • Example: let’s archive some test vendors:
  • A quick test of deleting a test vendor reveals the Data Archive record and the Excel (or CSV) file created:
  • archives are stored in media fields in the Data Archive table: Table Fields (json) and Table Data, which do not count toward the database size

There is more to check in Microsoft’s Tomas Navarro and Bardur Knudsen’s video, “Data Management in the Business Central Application”.

5 new features Business Central admins need to know

Microsoft keeps adding new features to all facets of Business Central, including the admin center.

The community has access to the BC Ideas portal to signal Microsoft its wishes and Microsoft delivers.

If you want to contribute to the future of Business Central, add your ideas to the portal.

1. How to access Admin URL

Well, this is not something new, but still admins need to know how to access the BC admin center.


2. Copy environments

A) Sandbox to Production

After testing your sandbox you can turn it into a production environment.

Click on a Sandbox environment

Click on Copy action:

A new dialog appears:

Enter the New environment name and the type of the environment, in this case Production, and then click on Copy.

B) Production to Sandbox

Navigate to environment home page and click on a Production environment.

In the next screen, pick a name for the new sandbox and click on Copy:

Confirm operation:

Copy is scheduled:

and later is Copying:

Note 1: you can also perform these 2 actions programmatically via APIs:

Note 2: Clean up or prepare data via 2 new events:

3) Restrict access to a specific environment to only certain users (part of a security group)

A) Create security group in Azure AD or Microsoft 365 admin center

  • Open Admin Center
  • Navigate to Active teams & groups
  • Click Security
  • Select Add a group action
  • add owner(s) and member(s)

B) Assign security group to BC environment

Admin users will be allowed in all these environments.

To restrict access to an environment, create a security group with 0 members. In this case only admins have access.

4) Environment operations are now captured on the Environment Operations page:

Navigate to Operations:

5) Restart Environments:

  • Open the environment;
  • Click on Sessions;
  • Click on Restart Environment

6) Update apps in each environment

If you have apps installed in your environments and these apps have updates on AppSource, then starting with BC 2021 wave 2 you can manage the apps and their upgrades from the admin center.

  • Click on one environment link
  • Choose Apps action

If the Available Update Action shows “Action Required”, click on it and go through the upgrade flow.