Leveraging “Filter Tokens” codeunit to expand Business Central users’ filtering experience

Hello Readers!

A few weeks back, I watched Erik Hougaard's YouTube video "Make your own Date Filters in AL and Business Central" and thought of trying it out and adding my own bit to it.

First, what was the intention with custom filter tokens?

The standard application already comes with some tokens.

Think of dates: when you type "t" in a date field you get today's date, or when you type "q" in a date filter field you get the current quarter, and so on.

But what if we want to build our own tokens?

Custom Date Filters

For example, let's assume that if I type "sv1" in a date filter I want the system to process my token into Jan 1st – Jan 31st. If I type "sv2" in a date filter I want the system to translate "sv2" into Feb 1st to Feb 28th or 29th, depending on whether the current year is a leap year, and so on.

How can we do that? Subscribe to the event OnResolveDateFilterToken from the System Application codeunit "Filter Tokens", as in my sample code below:

[EventSubscriber(ObjectType::Codeunit, Codeunit::"Filter Tokens", 'OnResolveDateFilterToken', '', false, false)]
    local procedure CustomDateFilter(DateToken: Text; var FromDate: Date; var Handled: Boolean; var ToDate: Date)
    begin
        DateToken := UpperCase(DateToken);
        case DateToken of
            'SV1':
                begin
                    FromDate := GetFromDate(Today(), 1);
                    ToDate := GetToDate(Today(), 1);
                    Handled := true;
                end;
            'SV2':
                begin
                    FromDate := GetFromDate(Today(), 2);
                    ToDate := GetToDate(Today(), 2);
                    Handled := true;
                end;
            'SV3':
                begin
                    FromDate := GetFromDate(Today(), 3);
                    ToDate := GetToDate(Today(), 3);
                    Handled := true;
                end;
            'SV4':
                begin
                    FromDate := GetFromDate(Today(), 4);
                    ToDate := GetToDate(Today(), 4);
                    Handled := true;
                end;
            'SV5':
                begin
                    FromDate := GetFromDate(Today(), 5);
                    ToDate := GetToDate(Today(), 5);
                    Handled := true;
                end;
            'SV6':
                begin
                    FromDate := GetFromDate(Today(), 6);
                    ToDate := GetToDate(Today(), 6);
                    Handled := true;
                end;
            'SV7':
                begin
                    FromDate := GetFromDate(Today(), 7);
                    ToDate := GetToDate(Today(), 7);
                    Handled := true;
                end;
            'SV8':
                begin
                    FromDate := GetFromDate(Today(), 8);
                    ToDate := GetToDate(Today(), 8);
                    Handled := true;
                end;

            'SV9':
                begin
                    FromDate := GetFromDate(Today(), 9);
                    ToDate := GetToDate(Today(), 9);
                    Handled := true;
                end;
            'SV10':
                begin
                    FromDate := GetFromDate(Today(), 10);
                    ToDate := GetToDate(Today(), 10);
                    Handled := true;
                end;
            'SV11':
                begin
                    FromDate := GetFromDate(Today(), 11);
                    ToDate := GetToDate(Today(), 11);
                    Handled := true;
                end;
            'SV12':
                begin
                    FromDate := GetFromDate(Today(), 12);
                    ToDate := GetToDate(Today(), 12);
                    Handled := true;
                end;
        end;
    end;

    local procedure GetFromDate(Dt: Date; mo: Integer): Date
    begin
        Exit(DMY2Date(1, mo, Date2DMY(Dt, 3)));
    end;

    local procedure GetToDate(Dt: Date; mo: Integer): Date
    begin
        case mo of
            1, 3, 5, 7, 8, 10, 12:
                Exit(DMY2Date(31, mo, Date2DMY(Dt, 3)));
            4, 6, 9, 11:
                Exit(DMY2Date(30, mo, Date2DMY(Dt, 3)));
            2:
                // simplified leap-year test: divisible by 4 (century years are not handled)
                if Date2DMY(Dt, 3) mod 4 = 0 then
                    Exit(DMY2Date(29, mo, Date2DMY(Dt, 3)))
                else
                    Exit(DMY2Date(28, mo, Date2DMY(Dt, 3)));
        end;
    end;

The code could be refactored into a function that parses a token of the form "svX" (where X is the month number 1–12) and calls GetFromDate and GetToDate once each instead of twelve near-identical branches, but that's not the goal of this blog.
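
Still, for the curious, such a compact subscriber could look like the sketch below (same event, same helper functions as above):

[EventSubscriber(ObjectType::Codeunit, Codeunit::"Filter Tokens", 'OnResolveDateFilterToken', '', false, false)]
local procedure CustomDateFilterCompact(DateToken: Text; var FromDate: Date; var Handled: Boolean; var ToDate: Date)
var
    MonthNo: Integer;
begin
    // expect a token of the form SV1..SV12
    DateToken := UpperCase(DateToken);
    if not DateToken.StartsWith('SV') then
        exit;
    if not Evaluate(MonthNo, CopyStr(DateToken, 3)) then
        exit;
    if (MonthNo < 1) or (MonthNo > 12) then
        exit;
    FromDate := GetFromDate(Today(), MonthNo);
    ToDate := GetToDate(Today(), MonthNo);
    Handled := true;
end;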

Let’s test it.

Open the Chart of Accounts page and use the flow filters in the "Filter Totals By" section of the page as below:

What about text filters? Can we customize them?

Custom Text Filters

This is the use case: each user has access to their own list of customers (the My Customer page):

Users can edit their own list of customers, adding/removing customers.

We also want, when we are in the Customers list, to be able to quickly filter the list of customers to the list in My Customers.

We can create a custom text filter token, and by subscribing to the event OnResolveTextFilterToken in codeunit "Filter Tokens" we get the desired functionality, like below:

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Filter Tokens", 'OnResolveTextFilterToken', '', true, true)]
    local procedure CustomTextFilter(TextToken: Text; var TextFilter: Text; var Handled: Boolean)
    var
        _mc: Record "My Customer";
        _maxloops: integer;
    begin
        _maxloops := 10;
        TextToken := UpperCase(TextToken);
        Handled := true;
        case TextToken of
            'SV':
                begin
                    _mc.SetRange("User ID", UserId());
                    if _mc.FindSet() then begin
                        _maxloops -= 1;
                        _maxloops -= 1;
                        TextFilter := _mc."Customer No.";
                        if _mc.Next() <> 0 then
                            repeat
                                _maxloops -= 1;
                                TextFilter += '|' + _mc."Customer No.";
                            until (_mc.Next() = 0) or (_maxloops <= 0);
                    end
                end;
        end;
    end;

In the Customers List we can now use the new token:

When users filter the "No." field to "%sv", the system finds all the Customer "No." values in the My Customer list and populates the filter for the "No." field.

My Customer list consists of customers 20000, 30000, and 50000; therefore, when using the custom text filter "sv", I get the list of my customers.

You could similarly create a custom token to filter Chart of Accounts to G/L accounts in “My Accounts”.

Things to consider

The "sv" token above would be triggered and parsed on any page.

For example, if we are in the Vendors list, the same list (20000, 30000, and 50000) will be the result of parsing the "sv" token, and that might not be what we need.

A possible solution is to specialize the custom filters to customers or to vendors, like having 2 tokens: “csv” for customers and “vsv” for vendors.

For more considerations when using custom tokens read here.

Go on, try them!

“Field Selection” codeunit – how I select and record the ID of a field in Business Central

Hello readers!

Recently I have been working on a customization for a customer with the goal of changing the out-of-the-box Positive Pay export for a Bank Account record.

While preparing the mapping for the positive pay details, I noticed the way Microsoft implemented picking up a field ID. They created a new codeunit: "Field Selection".

Let’s see how to get to that piece of code:

  • In BC, search for "Data Exchange Definitions"
  • Click on any Exchange Definition Code, then in the Line Definitions, click on Manage and "Field Mapping"
  • In the "Field Mapping", click on the "Field ID" lookup "…"
  • The list of fields in table 330 is displayed:

This lookup page is triggered by the OnLookup trigger on page 1217 “Data Exch Field Mapping Part”:

field("Field ID"; "Field ID")
                {
                    ApplicationArea = Basic, Suite;
                    ShowMandatory = true;
                    ToolTip = 'Specifies the number of the field in the external file that is mapped to the field in the Target Table ID field, when you are using an intermediate table for data import.';

                    trigger OnLookup(var Text: Text): Boolean
                    var
                        "Field": Record "Field";
                        TableFilter: Record "Table Filter";
                        FieldSelection: Codeunit "Field Selection";
                    begin
                        Field.SetRange(TableNo, "Table ID");
                        if FieldSelection.Open(Field) then begin
                            if Field."No." = "Field ID" then
                                exit;
                            TableFilter.CheckDuplicateField(Field);
                            FillSourceRecord(Field);
                            FieldCaptionText := GetFieldCaption;
                        end;
                    end;

                    trigger OnValidate()
                    begin
                        FieldCaptionText := GetFieldCaption;
                    end;
                }

The OnLookup trigger uses codeunit 9806 "Field Selection". This codeunit, as well as its associated page, Page 9806 "Fields Lookup", can be found in the Microsoft System Application app.

You can go through the source code for this codeunit here.

As you can see below, the main function Open is fairly simple:
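
In essence (a paraphrase, not the verbatim System Application source), Open runs the "Fields Lookup" page in lookup mode over the filtered Field record and hands the selection back to the caller:

procedure Open(var "Field": Record "Field"): Boolean
var
    FieldsLookup: Page "Fields Lookup";
begin
    FieldsLookup.SetTableView("Field");
    FieldsLookup.LookupMode(true);
    if FieldsLookup.RunModal() = Action::LookupOK then begin
        FieldsLookup.GetSelectedFields("Field");
        exit(true);
    end;
    exit(false);
end;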

Let’s put these objects into a simple practice exercise.

I am going to create a new table containing a new field in which I plan to record the ID of the Customer."No." field.

Here is the field definition in the table:

        field(2; "Customer No Field ID"; Integer)
        {
            DataClassification = CustomerContent;
        }

And here is the field definition on the list page:

                field(MyField2; Rec."Customer No Field ID")
                {
                    ApplicationArea = All;

                    trigger OnLookup(var Text: Text): Boolean
                    var
                        RecField: Record "Field";
                        FieldSelection: Codeunit "Field Selection";
                    begin
                        RecField.SetRange(TableNo, Database::Customer);
                        if RecField.Get(Database::Customer, Rec."Customer No Field ID") then;

                        if FieldSelection.Open(RecField) then
                            Rec.Validate("Customer No Field ID", RecField."No.");
                    end;
                }

First, we filter the Field record (RecField) to the Customer table, then we execute the "Field Selection".Open method, which in turn displays the "Fields Lookup" page. Lastly, I validate my new field "Customer No Field ID" against the result of the lookup:

In the Lookup page users can pick the field and the field ID needed:

The goal of this exercise is to remind those who knew, and to make aware those who didn't, that these Microsoft System Application objects exist and can be used in daily tasks instead of being re-invented each time.

Moreover, these objects are part of the platform (designed and tested by Microsoft), therefore we have every reason to use them.

Hope this helps!

Azure Storage Table and its API

One way to make data from Business Central available to different parties and users is to leverage Azure Storage.

Customers will need an Azure subscription.

There are 4 types of storage in the Azure portal: Blob Containers, File Shares, Queues, and Tables.

Today’s blog is about Azure Storage Tables and its API.

More on its API here.

In this blog I covered:

  • creating a storage account table using the Azure portal;
  • inserting data in the storage account table via the VS Code extension "Rest Client";
  • writing an extension to send Business Central vendor data to the Azure Storage table;
  • using Excel to share Azure Storage table data with 3rd-party users.

Create storage account and storage account table

Through the Azure portal I created one storage account and a few tables inside that storage account.

The process is simple and you can find details about storage accounts here and an overview on storage account tables here.

You can insert data manually into these tables using Azure Storage Explorer.

Let's see how we can interact with them, first via Rest Client (an extension for VS Code), and then via Business Central extensions.

Using Rest Client to interact with Azure Storage Table

1. We can query the Azure tables via Rest Client in VS Code:

[in the picture insert your SAS token right after “?”]

2. Query Vendor table:

3. Insert new record in Vendor Table:
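
For reference, the two requests could look roughly like this in Rest Client (the storage account name, the table name, and the SAS token are placeholders):

### query the Vendor table
GET https://<storageaccount>.table.core.windows.net/Vendor()?<SAS-token>
Accept: application/json;odata=nometadata

### insert a new entity into the Vendor table
POST https://<storageaccount>.table.core.windows.net/Vendor?<SAS-token>
Accept: application/json;odata=nometadata
Content-Type: application/json

{
    "PartitionKey": "",
    "RowKey": "a-unique-row-key",
    "No": "V00010",
    "Name": "Sample Vendor"
}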

Let's now verify the last action in the Azure portal:

Use Azure Storage Table API from Business Central

With the API tested, let’s now move to Business Central and AL and try to insert records in the Azure tables.

The sample code I worked on will scan all vendors and send Vendor.”No.” and Vendor.Name to the Vendor table in Azure Storage.

When creating a new table in Azure Storage, each table comes by default with 2 fields:

  • PartitionKey
  • RowKey

In my example, PartitionKey will be empty, but you could populate it with the company name.

RowKey will be populated with Vendor.SystemId.
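
A minimal sketch of the sending code (GetAzureTableUrl and LogSuccessfulPost are hypothetical helpers reading the setup table below and writing the log table mentioned later; the endpoint has the form https://<account>.table.core.windows.net/Vendor?<SAS token>):

local procedure SendVendorsToAzureTable()
var
    Vendor: Record Vendor;
    Client: HttpClient;
    Content: HttpContent;
    Headers: HttpHeaders;
    Response: HttpResponseMessage;
    Body: JsonObject;
    BodyText: Text;
    Url: Text;
begin
    Url := GetAzureTableUrl(); // hypothetical helper: endpoint + SAS token from setup
    if Vendor.FindSet() then
        repeat
            Clear(Body);
            Body.Add('PartitionKey', ''); // could hold the company name
            Body.Add('RowKey', Format(Vendor.SystemId));
            Body.Add('No', Vendor."No.");
            Body.Add('Name', Vendor.Name);
            Body.WriteTo(BodyText);
            Content.WriteFrom(BodyText);
            Content.GetHeaders(Headers);
            if Headers.Contains('Content-Type') then
                Headers.Remove('Content-Type');
            Headers.Add('Content-Type', 'application/json');
            if Client.Post(Url, Content, Response) and Response.IsSuccessStatusCode then
                LogSuccessfulPost(Vendor); // hypothetical helper: insert a log record
        until Vendor.Next() = 0;
end;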

"Azure Storage Setup" is an extension table to keep all that great Azure stuff:

To log the work done by the Post requests, I created a table and a page, and I insert a record in this table with each successful Post request.

Use the Azure Storage Explorer to view records in the Azure Portal:

What about the 3rd party users?

How do we give them access to the data in the storage table?

The good old Excel is here to contribute.

Use of Excel to share Azure Storage Data with 3rd party users

  • Open Excel
  • go to the Data menu
  • Get Data -> From Azure Storage Table
  • when prompted, for the "Account Name or URL" enter the storage URL
  • for the access key, enter the access key from the storage account

And this is what we get:

Click on Load, and then double-click on Vendor Connection:

Power Query opens and we can enable the other fields (by clicking on the Content column and selecting the missing columns):

You can find the AL sample code here.

Give customers their own Advanced Settings page in Business Central

In the latest versions of Business Central one can find an Advanced Settings page.

Click on the magnifying glass icon in the navigation bar, and in the search box type "Advanced Settings".

This brings up a NavigatePage, page 9202 "Advanced Settings", located in the Microsoft System Application app.

To see the source code of this page, browse to Microsoft's GitHub.

The repo consists of this page and 2 codeunits involved with the internal mechanics of this page.

If customers find themselves lost among all the setup and settings pages, or if the "Manual Setup" page is too large, we could gather the most used pages on a custom page, "ABC Advanced Settings", like I did below:
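
A rough sketch of the structure (the page number and the pages gathered are just examples; the full object is in the repository linked below):

page 50110 "ABC Advanced Settings"
{
    PageType = NavigatePage;
    Caption = 'ABC Advanced Settings';
    ApplicationArea = All;
    UsageCategory = Administration;

    layout
    {
        area(Content)
        {
            group(MostUsedSetupPages)
            {
                Caption = 'Most used setup pages';

                field(GLSetup; 'General Ledger Setup')
                {
                    ApplicationArea = All;
                    ShowCaption = false;
                    Editable = false;

                    trigger OnDrillDown()
                    begin
                        Page.Run(Page::"General Ledger Setup");
                    end;
                }
                field(SalesSetup; 'Sales & Receivables Setup')
                {
                    ApplicationArea = All;
                    ShowCaption = false;
                    Editable = false;

                    trigger OnDrillDown()
                    begin
                        Page.Run(Page::"Sales & Receivables Setup");
                    end;
                }
            }
        }
    }
}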

Repository here.

To use or not to use SelectLatestVersion()

When using web services or API-exposed entities, you might find it useful to request that the application service grab the latest version of the underlying data.

The definition at Database.SelectLatestVersion Method – Business Central | Microsoft Docs states that by using the SelectLatestVersion() function you make sure that "the data displayed is the most current data in the database".

Why would we do that? Isn’t the browser page automatically refreshed?

Not always. Not when a 3rd-party app updates the records.

Let’s do some tests in a SaaS environment.

I created a custom entity (testTable), with a list page and an API page. I will start by pushing 10 records to the table via a batch using Postman:

This is the result when executing the "refresh" action:

And now let's send another batch with 4 Delete requests:

Next, I'm going to send another 10-record batch to BC.

Using a new action, "refresh-SelectLatestVersion", that does not contain SelectLatestVersion(), gives us the following:

It appears that SelectLatestVersion() does not make any difference in SaaS, and that affecting records through a BC native API does not require SelectLatestVersion().

Let’s try something similar in an On-prem installation.

When records are updated by other apps, not through Business Central means (by the way, not a great idea), the page is not notified of changes in the underlying data and is therefore in a stale state.

How can we enforce the data to update?

Using SelectLatestVersion() we clear the client cache for the underlying table, initiating a new read transaction for the affected table and thus affecting performance.

Let's see how long it actually takes the server to grab the latest data.

I inserted 1,000,000 records via T-SQL:

and this is what I’ve got when I refreshed the page:

Then I removed all records:

As you can see above, while my CurrPage.Update is before Message, the page still shows the records. I am guessing that the Message gets displayed before the page is re-rendered.

After clicking OK, the page gets rendered again and shows 0 records.

It took 69 milliseconds, but the table had only 2 fields. With more fields the operation might take longer.

Sometimes customers will ask for an auto-refreshing page. While there are technical means to satisfy the request, we need to recognize that this comes with a price, hurting performance. And when applying auto-refresh to multiple pages, the price multiplies accordingly.

Things to consider:

  1. Avoid, when possible, the use of SelectLatestVersion() on-prem.
  2. In SaaS there is no need for SelectLatestVersion(); refreshing the page via an action or the browser's F5 displays the latest data.
  3. Avoid auto-refreshing. Rather go with a manual refresh (an action on the page that calls SelectLatestVersion(), as in the sketch after this list) than an auto-refresh (a timer controladdin).
  4. To decrease the number of SelectLatestVersion() and CurrPage.Update() calls, log your refresh triggers (count and refresh datetime), compare the current count against the last refresh count, get the maximum System Modified At among your records and compare it against your last logged datetime …
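
Here is what such a manual refresh action could look like (a sketch; whether it lives on a page or a page extension is up to you):

action(Refresh)
{
    ApplicationArea = All;
    Caption = 'Refresh';
    Image = Refresh;
    ToolTip = 'Clear the cache for the underlying table and re-read the data.';

    trigger OnAction()
    begin
        SelectLatestVersion(); // drop the cached version of the data
        CurrPage.Update(false); // re-render without writing the current record
    end;
}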

Extension code for SaaS is located here.

How to PowerApps with Business Central

PowerApps – Intro

What is PowerApps? PowerApps is a service for building cross-platform (iOS, Android, Windows Store) applications. It allows connectivity to different systems and comes with a cloud IDE and a cloud admin interface that let users publish apps targeting whatever platform they need. The IDE is called PowerApps Studio and can be downloaded locally from the Windows Store or used as a web application. I designed the app detailed below using the web application.

Most importantly, just like the other power tools, PowerBI and MS Flow, PowerApps is accessible not only to professional developers, but also to business analysts, junior developers, or expert users in any company. I wrote this app without any code inside PowerApps Studio, just a few Excel-style functions invoked sporadically.

The quick PowerApps app I built required:

  • a PowerApps license. I got my free license here.
  • a NAV container hosted on Azure. You could build yours easily, some help here.

Application scope

The app will get the list of items from the Azure Business Central container via the Item List page exposed as a web service, and will present the Item No. and Description for all items on the first screen. The app user can then advance into a details screen for each item. Here, if the Quantity is low, the user can advance to a third screen to generate a purchase invoice for the desired quantity for the selected item and vendor. The result is that, via a second web service, the app generates in Business Central a purchase invoice for the selected item, the selected vendor, and the quantity entered.

Application design

There are two main parts to creating your app:

  1. Create app connector
  2. Design app

1. Create app connectors

To create a Business Central connector go to the File menu in PowerApps Studio and choose Connections:

The connector to the Azure BC Container instance looks like this:

Once the connector is set, we can access all the web services exposed in the Business Central Azure Container.

2. Design PowerApps app

The PowerApps Studio comes with 3 main regions:

  1. The left side is where you work with the screens. In this simple app I have 4 screens: Master, Detail, Order and Confirm.
  2. The center belongs to the canvas where you drop your controls.
  3. The right side is occupied by the Data Source (if any) and the properties of the control currently selected on the canvas.

MasterScreen consists of a Gallery control (GalleriaItems) which contains the list of items retrieved via the Items web service Data Source. You will see later that this web service is Page 31 exposed as a web service in the Business Central Azure Container.

The OnSelect event for the Forward button uses the Navigate(screen, effect) function to advance to a certain screen in your app.


The second screen, DetailScreen, displays a few more fields from the Items web service.


If the inventory is low, the app user can decide to order more by clicking on the "Order more" button:


Once the user enters the desired quantity to be included on a Business Central purchase invoice, the app creates a POST request to a new ODATA web service data source (OrderItemVendorWS) and ultimately generates the purchase invoice with one purchase line.

Let’s see the app:


And, in Business Central, the new purchase invoice:


This is what was needed on the PowerApps side; additionally, I needed to plug a few new things into Business Central.

Business Central Container changes

First, create a new AL project and point Visual Studio Code to the Azure container:

Launch.json:


Web services:

  • Items service will support MasterScreen and DetailScreen
  • OrderItemVendorWS will support OrderScreen.

Page 50100 "PurchaseItemList" is a new page based on a new table, Table 50100 PurchaseItem. The purchase invoice is generated in the OnInsert trigger of the new table.
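
A rough sketch of that table could look like this (field and object names are illustrative, the original code was shown as screenshots):

table 50100 PurchaseItem
{
    DataClassification = CustomerContent;

    fields
    {
        field(1; ItemNo; Code[20]) { }
        field(2; VendorNo; Code[20]) { }
        field(3; Quantity; Decimal) { }
    }

    keys
    {
        key(PK; ItemNo, VendorNo) { Clustered = true; }
    }

    trigger OnInsert()
    var
        PurchaseHeader: Record "Purchase Header";
        PurchaseLine: Record "Purchase Line";
    begin
        // create the invoice header for the selected vendor
        PurchaseHeader.Init();
        PurchaseHeader."Document Type" := PurchaseHeader."Document Type"::Invoice;
        PurchaseHeader.Insert(true);
        PurchaseHeader.Validate("Buy-from Vendor No.", Rec.VendorNo);
        PurchaseHeader.Modify(true);

        // add one line for the selected item and quantity
        PurchaseLine.Init();
        PurchaseLine."Document Type" := PurchaseHeader."Document Type";
        PurchaseLine."Document No." := PurchaseHeader."No.";
        PurchaseLine."Line No." := 10000;
        PurchaseLine.Validate(Type, PurchaseLine.Type::Item);
        PurchaseLine.Validate("No.", Rec.ItemNo);
        PurchaseLine.Validate(Quantity, Rec.Quantity);
        PurchaseLine.Insert(true);
    end;
}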

Conclusions

Creating an app with PowerApps involves 3 tasks:

  1. the app backend, the connectors to your apps
  2. the app design, done in PowerApps Studio
  3. Publishing and Management tasks

PowerApps comes with versioning and management capabilities across a few environments (e.g. Dev, QA, Prod). Once your app has been tested by PowerApps app users, you can export it from QA, import it into Prod, and distribute it from there. Certain Office and Dynamics 365 plans will allow you to generate and manage these environments.

More specifically, if you go to web.powerapps.com and click on Solutions you will be able to follow (with the right license) Create a new environment link.


Invoking Azure Functions to replace DOT NET calls in C/AL or AL

Recently Microsoft announced that dotnet can still be used with on-premise installations of Dynamics 365 Business Central.

However, if our extension is to make it into the cloud, the code leveraging dotnet needs to be replaced with HTTP API calls.

In this example I will show how legacy C/AL code using dotnet can be replaced with a call to an Azure Function that achieves the original goal of validating a post code.

Premise

  • Either Table 18 was modified and additional code was added to the "Post Code" Validate trigger, using Regex class entities to validate post codes.
  • Or the additional validation runs when the standard Post Code validation is finished: a subscriber to Post Code Validate exists in our extension and is triggered, but it still contains dotnet code (RegularExpressions class entities), as we're only dealing with on-premise (target=internal in app.json).

Objective

I want the additional validation to be executed when the standard validation is finished, and the additional validation must not contain dotnet calls.

Design

  1. In a new AL project add a new codeunit:


2. The codeunit itself contains an event subscriber to Table18.Validate.PostCode.

(Use the "teventsub" snippet to get quick scaffolding for the event subscriber.)


When the subscriber is triggered, we execute an Azure Function call: azfnvalidate-us-ca-zipcode. We retrieve a json object whose content is {"PCValid" : true} or {"PCValid" : false}.
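
A sketch of what that subscriber could look like (the function host name and query parameter are placeholders, and in a real extension the URL and key should come from setup rather than being hardcoded):

[EventSubscriber(ObjectType::Table, Database::Customer, 'OnAfterValidateEvent', 'Post Code', false, false)]
local procedure ValidatePostCodeWithAzureFunction(var Rec: Record Customer)
var
    Client: HttpClient;
    Response: HttpResponseMessage;
    ResponseText: Text;
    Result: JsonObject;
    ValidToken: JsonToken;
begin
    if Rec."Post Code" = '' then
        exit;
    // call the worker and read back {"PCValid" : true/false}
    if not Client.Get('https://azfnvalidate-us-ca-zipcode.azurewebsites.net/api/ValidatePostCode?postcode=' + Rec."Post Code", Response) then
        exit;
    Response.Content.ReadAs(ResponseText);
    if not Result.ReadFrom(ResponseText) then
        exit;
    if Result.Get('PCValid', ValidToken) then
        if not ValidToken.AsValue().AsBoolean() then
            Error('Post code %1 is not a valid US/CA post code.', Rec."Post Code");
end;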

3. Write the Azure Function with Visual Studio Code

Pre-requisites:

  • Azure subscription
  • install C# extension
  • Azure Function Core Tools extension
  • install .net core (use npm or chocolatey)
  • Azure CLI Tools


A good guide to get you started with Azure Functions is here.

Once you have the default "Hello World" Azure Function created, replace the body of its Run method with the post code validation: a Regex check of the incoming post code that returns the PCValid json described above.


Publishing the function in Azure should generate a record in your chosen storage.

Testing

1. Once published, we can quickly spin up a test for the new Azure Function in a web browser window:


2. Removing the "W" in the previous test triggers the Azure Function to return the json above.


3. Let's now test the validation in Business Central:


Therefore, to replace a set of dotnet calls we need a worker hosted somewhere other than AL or C/AL, and a caller of that worker placed in the extension. In my example the caller is a codeunit in the extension range with an event subscriber that calls an Azure Function (the worker).

What other methods are you employing to achieve similar results?

If you liked this article bookmark my blog or follow me for more stuff about NAV and Business Central.

Dynamics 365 Business Central : Al Code Analyzers #TurnOnCops #Extensions #Permissions

When you turn on PerTenantExtensionCop in Visual Studio Code and you forget to create a Permissions.xml file, you get a compile error in your extension.

In Visual Studio Code -> User Settings, add an entry for the al.codeAnalyzers token like below:

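For reference, the settings.json entry could look like this (which analyzers you enable is up to you):

{
    "al.codeAnalyzers": [
        "${CodeCop}",
        "${UICop}",
        "${PerTenantExtensionCop}"
    ]
}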

Without this analyzer turned on, your extension compiles even without a Permissions.xml.

If you set PerTenantExtensionCop and you're missing Permissions.xml, you're going to get a compile error:


Adding a Permissions.xml file to the project root solves the error:


More reading on code analyzers here.

Original post here.

Dynamics 365 Business Central : Extending Role Center headline with web service data, lists and dictionaries

So much to read, so little time … the speed at which Microsoft adds new Business Central and AL features is overwhelming 🙂

In this blog I'll demonstrate how I was able to display the weather temperature for 3 cities in the Business Central role center headline.


First, there are 9 headline role center pages in Business Central, with IDs from 1440 to 1448.


I will extend the headline role center page 1440, "Headline RC Business Manager", by adding three fields, one for each city and its temperature.

To record the three cities and their temperatures I am using a List and a Dictionary data structure.


This is followed by querying a weather web service and recording the 3 cities and their temperatures in a dictionary:


The response from the web service is included as a comment.

I need data stored in the following tokens:

  • $main.temp
  • $sys.country
  • name

For more info on how to parse a web service response, take a look at Mr. Kauffman's blog.


I use openweathermap, a free weather web service. You need to create an account, and you will get a free APPID when you complete the registration. You can only query the web service once every 10 minutes for the same location.

Finally, to load the cities and their temperatures in your headline, use the code below:

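A rough sketch of that code (my reconstruction; the cities are examples and <YOUR APPID> is a placeholder):

local procedure GetCityTemperatures(var CityTemp: Dictionary of [Text, Decimal])
var
    Cities: List of [Text];
    City: Text;
    Client: HttpClient;
    Response: HttpResponseMessage;
    ResponseText: Text;
    Weather: JsonObject;
    TempToken: JsonToken;
begin
    Cities.Add('Toronto');
    Cities.Add('Boston');
    Cities.Add('Bucharest');
    foreach City in Cities do
        if Client.Get('https://api.openweathermap.org/data/2.5/weather?q=' + City + '&units=metric&APPID=<YOUR APPID>', Response) then begin
            Response.Content.ReadAs(ResponseText);
            // pull $.main.temp out of the json response
            if Weather.ReadFrom(ResponseText) then
                if Weather.SelectToken('$.main.temp', TempToken) then
                    CityTemp.Add(City, TempToken.AsValue().AsDecimal());
        end;
end;

The dictionary can then be iterated to format each headline field, e.g. 'Toronto: 21.5 °C'.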

The complete pageextension object is included here.

That’s it … thanks for reading!

Original post here.