Leveraging “Filter Tokens” codeunit to expand Business Central users’ filtering experience

Hello Readers!

A few weeks back, I watched Erik Hougaard’s YouTube video “Make your own Date Filters in AL and Business Central” and thought of trying it out and adding my own bit to it.

First, what was the intention with custom filter tokens?

The standard application already comes with some tokens.

Think of dates: when you press “t” in a date field you get today’s date, or when you press “q” in a date filter field you get the current quarter, and so on.

But what if we want to build our own tokens?

Custom Date Filters

For example, let’s assume that if I type “sv1” in a date filter I want the system to resolve my token into Jan 1st – Jan 31st of the current year. If I type “sv2” I want the system to translate “sv2” into Feb 1st – Feb 28th (or 29th, depending on whether the current year is a leap year), and so on.

How can we do that? Subscribe to the event OnResolveDateFilterToken from the System Application codeunit “Filter Tokens”, like in my sample code below:

[EventSubscriber(ObjectType::Codeunit, Codeunit::"Filter Tokens", 'OnResolveDateFilterToken', '', false, false)]
    local procedure CustomDateFilter(DateToken: Text; var FromDate: Date; var Handled: Boolean; var ToDate: Date)
    begin
        DateToken := UpperCase(DateToken);
        case DateToken of
            'SV1':
                begin
                    FromDate := GetFromDate(Today(), 1);
                    ToDate := GetToDate(Today(), 1);
                    Handled := true;
                end;
            'SV2':
                begin
                    FromDate := GetFromDate(Today(), 2);
                    ToDate := GetToDate(Today(), 2);
                    Handled := true;
                end;
            'SV3':
                begin
                    FromDate := GetFromDate(Today(), 3);
                    ToDate := GetToDate(Today(), 3);
                    Handled := true;
                end;
            'SV4':
                begin
                    FromDate := GetFromDate(Today(), 4);
                    ToDate := GetToDate(Today(), 4);
                    Handled := true;
                end;
            'SV5':
                begin
                    FromDate := GetFromDate(Today(), 5);
                    ToDate := GetToDate(Today(), 5);
                    Handled := true;
                end;
            'SV6':
                begin
                    FromDate := GetFromDate(Today(), 6);
                    ToDate := GetToDate(Today(), 6);
                    Handled := true;
                end;
            'SV7':
                begin
                    FromDate := GetFromDate(Today(), 7);
                    ToDate := GetToDate(Today(), 7);
                    Handled := true;
                end;
            'SV8':
                begin
                    FromDate := GetFromDate(Today(), 8);
                    ToDate := GetToDate(Today(), 8);
                    Handled := true;
                end;

            'SV9':
                begin
                    FromDate := GetFromDate(Today(), 9);
                    ToDate := GetToDate(Today(), 9);
                    Handled := true;
                end;
            'SV10':
                begin
                    FromDate := GetFromDate(Today(), 10);
                    ToDate := GetToDate(Today(), 10);
                    Handled := true;
                end;
            'SV11':
                begin
                    FromDate := GetFromDate(Today(), 11);
                    ToDate := GetToDate(Today(), 11);
                    Handled := true;
                end;
            'SV12':
                begin
                    FromDate := GetFromDate(Today(), 12);
                    ToDate := GetToDate(Today(), 12);
                    Handled := true;
                end;
        end;
    end;

    local procedure GetFromDate(Dt: Date; mo: Integer): Date
    begin
        exit(DMY2Date(1, mo, Date2DMY(Dt, 3)));
    end;

    local procedure GetToDate(Dt: Date; mo: Integer): Date
    begin
        case mo of
            1, 3, 5, 7, 8, 10, 12:
                exit(DMY2Date(31, mo, Date2DMY(Dt, 3)));
            4, 6, 9, 11:
                exit(DMY2Date(30, mo, Date2DMY(Dt, 3)));
            2:
                // leap year: divisible by 4, except century years not divisible by 400
                if (Date2DMY(Dt, 3) mod 4 = 0) and ((Date2DMY(Dt, 3) mod 100 <> 0) or (Date2DMY(Dt, 3) mod 400 = 0)) then
                    exit(DMY2Date(29, mo, Date2DMY(Dt, 3)))
                else
                    exit(DMY2Date(28, mo, Date2DMY(Dt, 3)));
        end;
    end;

The code could be refactored into a single function that parses a token of the form “svNN” and calls GetFromDate and GetToDate once each instead of twelve times, but that’s not the goal of this blog.
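For completeness, here is a minimal sketch of that refactoring; it would replace the twelve-branch subscriber above and reuses the same GetFromDate/GetToDate helpers:

[EventSubscriber(ObjectType::Codeunit, Codeunit::"Filter Tokens", 'OnResolveDateFilterToken', '', false, false)]
    local procedure CustomDateFilterParsed(DateToken: Text; var FromDate: Date; var Handled: Boolean; var ToDate: Date)
    var
        MonthNo: Integer;
    begin
        // sketch only: resolve 'SV1'..'SV12' by parsing the month number out of the token
        DateToken := UpperCase(DateToken);
        if not DateToken.StartsWith('SV') then
            exit;
        if not Evaluate(MonthNo, CopyStr(DateToken, 3)) then
            exit;
        if (MonthNo < 1) or (MonthNo > 12) then
            exit;
        FromDate := GetFromDate(Today(), MonthNo);
        ToDate := GetToDate(Today(), MonthNo);
        Handled := true;
    end;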

Let’s test it.

Open the Chart of Accounts page and use the flow filters in the “Filter Totals By” section of the page, as below:

What about text filters? Can we customize them?

Custom Text Filters

This is the use case: each user has access to their own list of customers (the My Customer page):

Users can edit their own list of customers, adding/removing customers.

We also want, when we are in the Customers list, to be able to quickly filter it down to the customers in My Customer.

We can create a custom text filter token: by subscribing to the event OnResolveTextFilterToken in codeunit “Filter Tokens” we get the desired functionality, like below:

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Filter Tokens", 'OnResolveTextFilterToken', '', true, true)]
    local procedure CustomTextFilter(TextToken: Text; var TextFilter: Text; var Handled: Boolean)
    var
        _mc: Record "My Customer";
        _maxloops: Integer;
    begin
        _maxloops := 10; // safety cap on the number of customers added to the filter
        TextToken := UpperCase(TextToken);
        case TextToken of
            'SV':
                begin
                    _mc.SetRange("User ID", UserId());
                    if _mc.FindSet() then begin
                        _maxloops -= 1;
                        TextFilter := _mc."Customer No.";
                        if _mc.Next() <> 0 then
                            repeat
                                _maxloops -= 1;
                                TextFilter += '|' + _mc."Customer No.";
                            until (_mc.Next() = 0) or (_maxloops <= 0);
                    end;
                    // only mark the token as handled when it is actually ours
                    Handled := true;
                end;
        end;
    end;

In the Customers List we can now use the new token:

When users filter the “No.” field with “%sv”, the system finds all the customer numbers in the My Customer list and populates the filter on the “No.” field.

My Customer list consists of customers 20000, 30000, and 50000, and therefore, when using the custom text filter “sv”, I get the list of my customers.

You could similarly create a custom token to filter Chart of Accounts to G/L accounts in “My Accounts”.

Things to consider

The “sv” token above would be triggered and parsed on any page.

For example, if we are in the Vendors list, the same list (20000, 30000, and 50000) will be the result of parsing the “sv” token. And that might not be what we need.

A possible solution is to specialize the custom filters to customers or to vendors, for example by having two tokens: “csv” for customers and “vsv” for vendors.
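A minimal sketch of such specialized tokens, assuming the standard “My Customer” and “My Vendor” tables (and leaving out the loop cap for brevity):

[EventSubscriber(ObjectType::Codeunit, Codeunit::"Filter Tokens", 'OnResolveTextFilterToken', '', true, true)]
    local procedure SpecializedTextFilters(TextToken: Text; var TextFilter: Text; var Handled: Boolean)
    var
        MyCustomer: Record "My Customer";
        MyVendor: Record "My Vendor";
    begin
        case UpperCase(TextToken) of
            'CSV': // customers only
                begin
                    MyCustomer.SetRange("User ID", UserId());
                    if MyCustomer.FindSet() then
                        repeat
                            if TextFilter = '' then
                                TextFilter := MyCustomer."Customer No."
                            else
                                TextFilter += '|' + MyCustomer."Customer No.";
                        until MyCustomer.Next() = 0;
                    Handled := TextFilter <> '';
                end;
            'VSV': // vendors only
                begin
                    MyVendor.SetRange("User ID", UserId());
                    if MyVendor.FindSet() then
                        repeat
                            if TextFilter = '' then
                                TextFilter := MyVendor."Vendor No."
                            else
                                TextFilter += '|' + MyVendor."Vendor No.";
                        until MyVendor.Next() = 0;
                    Handled := TextFilter <> '';
                end;
        end;
    end;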

For more considerations when using custom tokens read here.

Go on, try them!

“Field Selection” codeunit – how I select and record the ID of a field in Business Central

Hello readers!

Recently I have been working on a customization for a customer with the goal of changing the out-of-the-box Positive Pay export for a Bank Account record.

While preparing the mapping for the positive pay details, I noticed how Microsoft implemented picking a field ID. They created a new codeunit: “Field Selection”.

Let’s see how to get to that piece of code:

  • In BC, search for “Data Exchange Definitions”
  • Click on any Exchange Definition Code, then, in the Line Definitions, click on Manage and “Field Mapping”
  • In the “Field Mapping” page, click on the “Field ID” lookup “…”
  • The list of fields in table 330 is displayed:

This lookup page is triggered by the OnLookup trigger on page 1217 “Data Exch Field Mapping Part”:

field("Field ID"; "Field ID")
                {
                    ApplicationArea = Basic, Suite;
                    ShowMandatory = true;
                    ToolTip = 'Specifies the number of the field in the external file that is mapped to the field in the Target Table ID field, when you are using an intermediate table for data import.';

                    trigger OnLookup(var Text: Text): Boolean
                    var
                        "Field": Record "Field";
                        TableFilter: Record "Table Filter";
                        FieldSelection: Codeunit "Field Selection";
                    begin
                        Field.SetRange(TableNo, "Table ID");
                        if FieldSelection.Open(Field) then begin
                            if Field."No." = "Field ID" then
                                exit;
                            TableFilter.CheckDuplicateField(Field);
                            FillSourceRecord(Field);
                            FieldCaptionText := GetFieldCaption;
                        end;
                    end;

                    trigger OnValidate()
                    begin
                        FieldCaptionText := GetFieldCaption;
                    end;
                }

The OnLookup trigger is using codeunit 9806 “Field Selection”. This codeunit, as well as its associated page, page 9806 “Fields Lookup”, can be found in the Microsoft System Application app.

You can go through the source code for this codeunit here.

As you can see below, the main function Open is fairly simple:
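Paraphrased (this is not the exact Microsoft source; check the linked repository for the real implementation), Open essentially runs the “Fields Lookup” page in lookup mode over the filtered Field record and hands back the user’s selection:

    // paraphrase only - see the System Application source for the actual code
    procedure Open(var SelectedField: Record "Field"): Boolean
    var
        FieldsLookup: Page "Fields Lookup";
    begin
        FieldsLookup.SetTableView(SelectedField); // keep the caller's filters (e.g. TableNo)
        FieldsLookup.LookupMode(true);
        if FieldsLookup.RunModal() = Action::LookupOK then begin
            FieldsLookup.GetRecord(SelectedField); // return the field the user picked
            exit(true);
        end;
        exit(false);
    end;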

Let’s put these objects into a simple practice exercise.

I am going to create a new table with a field in which I plan to record the ID of the Customer."No." field.

Here is the field definition in the table:

        field(2; "Customer No Field ID"; Integer)
        {
            DataClassification = CustomerContent;
        }

And here is the field definition on the list page:

                field(MyField2; Rec."Customer No Field ID")
                {
                    ApplicationArea = All;

                    trigger OnLookup(var Text: Text): Boolean
                    var
                        RecField: Record "Field";
                        FieldSelection: Codeunit "Field Selection";
                    begin
                        RecField.SetRange(TableNo, Database::Customer);
                        if RecField.Get(Database::Customer, Rec."Customer No Field ID") then;

                        if FieldSelection.Open(RecField) then
                            Rec.Validate("Customer No Field ID", RecField."No.");
                    end;
                }

First, we filter the Field record (RecField) to the Customer table, and then we execute the “Field Selection” Open method, which in turn displays the “Fields Lookup” page. Lastly, we validate the new field “Customer No Field ID” against the result of the lookup:

In the Lookup page users can pick the field and the field ID needed:

The goal of this exercise is to remind those who knew, and make aware those who didn’t, of the new Microsoft System Application objects, so that we use them in our daily tasks instead of re-inventing them each time.

Moreover, these objects are part of the platform (designed and tested by Microsoft); therefore, we have every reason to use them.

Hope this helps!

Handy Date-Time Dialog in Business Central

Did you ever need a DateTime field in your Business Central extension?

I recently added, in one of my customer extensions, a DateTime field, and wanted to allow users to not only record a date, but also a time.

The solution is not difficult, even if we need to write our own code.

But why not use existing code in Base Application?

Look at this sample code:

table 50110 "Sample DateTime"
{
    DataClassification = CustomerContent;

    fields
    {
        field(1; MyDateTime; DateTime)
        {
            DataClassification = CustomerContent;

            trigger OnLookup()
            var
                DateTimeDialog: Page "Date-Time Dialog";
            begin
                DateTimeDialog.SetDateTime(RoundDateTime(MyDateTime, 1000));
                if DateTimeDialog.RunModal() = Action::OK then
                    MyDateTime := DateTimeDialog.GetDateTime();
            end;
        }
    }
}

When you run the page, click on the “…” to see the Date-Time Dialog page:

The code above is using Page 684 “Date-Time Dialog”.

If you want to see how Microsoft designed the page, check this.

And if you want to see how Microsoft implemented it in Base Application check table 472 “Job Queue Entry”, fields “Expiration Date/Time” and “Earliest Start Date/Time”.

Look for all pickable DateTime fields in your solutions and refactor them to use the “Date-Time Dialog” page.

Hope this helps!

So I chose Business Central … But am I losing my Dynamics GP data?

In the last 6 months I’ve been involved with a number of GP to BC migration projects.

A recurring question that reaches our team is: how do I see my GP data in BC?

One avenue to move your business to BC is to import open transactions, master data, and tested setup tables with RapidStart packages. If the underlying table for the desired GP entity does not exist in BC, a Business Central developer needs to create the table in BC; then, with the Edit in Excel functionality, you can get the GP data into BC.

There is also the Cloud Migration Tool in BC. More about it here.

Using this tool ensures that the most important entities, master data and open transactions, make it into BC. But what if a GP end user wants additional GP data in BC?

Microsoft’s recommendation is to bring as little as possible into the cloud from an on-premises database.

Moreover, as your database capacity increases, your cost can increase. See more here.

Rather than bringing GP tables one by one into BC, use the cloud migration tool to move data from GP to Azure Data Lake.

If the decision is, though, to have some GP data in Business Central, there are tools to make that possible.

We can extend the cloud migration tool so that, when the migration runs, besides the core migrated data (master data and open transactions), the process also brings the data from the GP table into a new space (an extension table), as mapped in the “Manage Custom Tables” page.

What’s needed to achieve this:

  • Create a Business Central extension. In it, create an AL table to store your data from a GP table
  • Add the custom table in Manage Custom Tables
  • Run migration tool
  • Check custom table content after migration

Let’s try bringing table GL00100 from GP into BC.

Note: this table was chosen only for demonstration. GL00100 is brought by default by the cloud migration tool into BC table “G/L Account”.

Create extension with GP table

I created an extension that includes a table for this GP entity:
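A minimal sketch of such a table is below. The field names mirror a handful of GL00100 (the GP account master) columns and are assumptions, so align them with the actual GP schema and with what the mapping expects; table ID 50340 matches the URL used later in this post.

table 50340 GL00100
{
    DataClassification = CustomerContent;

    fields
    {
        field(1; ACTINDX; Integer) { }      // GP account index
        field(2; ACTNUMBR_1; Text[10]) { }  // first account segment
        field(3; ACTNUMBR_2; Text[10]) { }  // second account segment
        field(4; ACTDESCR; Text[60]) { }    // account description
        field(5; ACTIVE; Boolean) { }       // active flag
    }

    keys
    {
        key(PK; ACTINDX) { Clustered = true; }
    }
}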

Map migration for new table in “Cloud Migration Management”

In Business Central, search for “Cloud Migration Management”.

Under Actions, run the “Manage Custom Tables” action:

On the “Migration Table Mapping” page, map the new table in your extension to the GP table:

On “Cloud Migration Management”, run the “Run Migration Now” action.

You can check the results in the cues in the Migration Information area:

To check the content migrated:

  • change the company to the migrated company
  • run the new table by adding “&table=50340” to the Business Central URL:

We can now see the result of migrating the GP data to the custom BC table:

Conclusion

To answer the question in the title: you don’t lose your GP data. There are multiple ways of accessing your GP data after going live on BC, including:

  • retaining access to your old system
  • migrating your Dynamics GP installation to Azure (SQL Server and application)
  • migrating your GP data warehouse to Azure Data Lake
  • or, as shown above, with minimal coding, keeping your GP data in Business Central

Engage with your partner and decide what GP data you really need today, so that, long term, your cloud ERP stays performant.

Export Business Central online entities to Azure storage blob container

As most of you probably know, it is not possible to access the file system from the Business Central cloud environment.

For example, in Dynamics NAV, we could have a job queue entry that, when run, creates a file and copies it to a network folder. We can still do that in an on-premises environment, but not with cloud BC.

You could create the file and use DownloadFromStream, but that only prompts you to download it locally; it does not copy it to a local or network folder.

If you try to use File.Create() you would get the warning: “The type or method ‘Create’ cannot be used for ‘Extension’ development”.

If your customer is happy to manually grab the file from the Downloads folder every time, then something like this should suffice:
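A minimal sketch of such a manual export (the procedure and file names are illustrative):

    procedure DownloadVendorList()
    var
        Vendor: Record Vendor;
        TempBlob: Codeunit "Temp Blob";
        Csv: TextBuilder;
        OutStr: OutStream;
        InStr: InStream;
        FileName: Text;
    begin
        // build a simple CSV of vendor number and name
        Csv.AppendLine('No.,Name');
        if Vendor.FindSet() then
            repeat
                Csv.AppendLine(StrSubstNo('%1,%2', Vendor."No.", Vendor.Name));
            until Vendor.Next() = 0;

        TempBlob.CreateOutStream(OutStr, TextEncoding::UTF8);
        OutStr.WriteText(Csv.ToText());
        TempBlob.CreateInStream(InStr, TextEncoding::UTF8);

        // DownloadFromStream only prompts the user to save the file locally;
        // it cannot drop the file on a local or network folder by itself.
        FileName := 'VendorList.csv';
        DownloadFromStream(InStr, '', '', '', FileName);
    end;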

But if we want to automate this process and run the extract on a regular basis, we need a cloud solution for storing the files.

Currently, there are four types of storage in the Azure platform:

  • Containers/Blobs
  • File Shares
  • Queues
  • Tables

In my previous blog I dived into Azure Table Storage and tackled its API.

This blog is about interacting with the Azure storage blob containers:

  • manually, via Azure Portal
  • simulation, via VS Code extension “Rest Client”
  • Business Central extension
  • view blob container with Excel
  • get Azure Blobs locally

On Michael Megel’s blog I found a nice solution for exactly what I need. Awesome job on the blob containers API, Michael! Thank you for sharing!

What I need:

  • Azure:
    • Set up a blob container to store Business Central exported files
    • Set up Storage Access Key
  • Simulation:
    • In VS Code, write requests with “Rest Client” extension, targeting Azure blob container API
  • Business Central:
    • A setup table in Business Central for Azure access stuff
    • An export interface that allows users to run an action (“Write File in Azure”) that sends the extract to the Azure container. The same code could be executed by a job queue.

Blob Container Setup

To set up a container, following Michael’s notes in the blog above was enough for me.

For blob container accessibility I went down the path of a shared access signature (SAS token).

Once created, you can start playing with the storage account container API.

I created the storage manually:

Drilling down into the storage account, I created a new container:

Simulation:

In VS Code, using Rest Client,

  1. I sent a request to get the list of containers:

Request:

GET https://svflorida.blob.core.windows.net/?comp=list&[here you insert your SAS token key]

content-type: application/json

Response:

HTTP/1.1 201 Created

Content-Length: 0

Content-MD5: 1B2M2Y8AsgTpgAmY7PhCfg==

Last-Modified: Wed, 18 Aug 2021 19:05:13 GMT

ETag: "0x8D9627B1BD88A0F"

Server: Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0

x-ms-request-id: 3f97555d-801e-006d-5263-94f646000000

x-ms-version: 2020-08-04

x-ms-content-crc64: AAAAAAAAAAA=

x-ms-request-server-encrypted: true

Date: Wed, 18 Aug 2021 19:05:13 GMT

Connection: close

2. I sent a PUT request to insert an empty file:

Request:

PUT https://svflorida.blob.core.windows.net/vendorlist/vl1111?[your SAS token here]

x-ms-blob-type: BlockBlob

Content-Length: 0

Response:

HTTP/1.1 201 Created

Content-Length: 0

Content-MD5: 1B2M2Y8AsgTpgAmY7PhCfg==

Last-Modified: Wed, 18 Aug 2021 19:23:46 GMT

ETag: "0x8D9627DB340E9DD"

Server: Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0

x-ms-request-id: b77cbfb2-b01e-003b-2566-9407a9000000

x-ms-version: 2020-08-04

x-ms-content-crc64: AAAAAAAAAAA=

x-ms-request-server-encrypted: true

Date: Wed, 18 Aug 2021 19:23:46 GMT

Connection: close

And this is the file in Azure portal:

Business Central extension:

This is how the new setup table “Azure Storage Setup” looks in BC:

This is how the new BC interface “Vendors Export Log” looks:

The “Write File In Azure” action on page 50251 “Vendors Export Log” does the following:

  • exports all BC vendors to a blob
  • writes the blob into the content of a PUT request
  • sends the PUT request to the Azure Blob Storage API
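In AL, that PUT could look roughly like the sketch below; the “Azure Storage Setup” field names are assumptions, and the real extension follows Michael’s blob container API approach:

    procedure WriteFileInAzure(FileName: Text; FileContent: Text)
    var
        AzureStorageSetup: Record "Azure Storage Setup";
        Client: HttpClient;
        Content: HttpContent;
        Request: HttpRequestMessage;
        Response: HttpResponseMessage;
        Headers: HttpHeaders;
    begin
        AzureStorageSetup.Get();

        // PUT https://{account}.blob.core.windows.net/{container}/{blob}?{sas-token}
        Request.Method := 'PUT';
        Request.SetRequestUri(StrSubstNo('https://%1.blob.core.windows.net/%2/%3?%4',
            AzureStorageSetup."Account Name", AzureStorageSetup."Container Name", FileName, AzureStorageSetup."SAS Token"));

        Content.WriteFrom(FileContent);
        Content.GetHeaders(Headers);
        Headers.Remove('Content-Type');
        Headers.Add('Content-Type', 'text/csv');
        Request.Content := Content;

        Request.GetHeaders(Headers);
        Headers.Add('x-ms-blob-type', 'BlockBlob'); // required by the Put Blob operation

        if not Client.Send(Request, Response) then
            Error('Could not reach Azure Blob Storage.');
        if not Response.IsSuccessStatusCode() then
            Error('Azure Blob Storage returned status %1.', Response.HttpStatusCode());
    end;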

View Blobs with Excel:

BC users can click on the URL link above and download the file locally, or they (and other 3rd party users) can access the files via Excel, as I explained in my previous blog.

This time though, when creating the connection, choose Data -> Get Data -> From Azure -> From Azure Blob Storage.

And finally, the data is displayed in the Excel workbook:

Get Azure Blobs locally

To help with getting the files locally, I wrote 2 blogs:

  • one about getting the files locally using Power Automate
  • one about using the Azure CLI to copy the files from Azure Blob Storage locally

For more about storage accounts in Azure check this out.

You can find sample code repository here.

Copy files from Azure Blob Storage to File System (using Power Automate)

I found an older post on community.dynamics.com in which someone was asking for ways to automatically drop data extracts originating in BC SaaS into a local folder.

First, in SaaS, we can’t generate the files automatically and store them locally.

We need to store them in the cloud.

Once in the cloud, how can we automatically download them locally on a machine or a network folder?

I bing-ed the phrase “copy files from azure blob storage to file system” and the first search result was this link to a Power Automate template flow:

There is a multitude of cloud providers, but Microsoft continuously does a great job of connecting everything between BC SaaS, the Azure platform, Power Automate, and Power Apps, so it’s just convenient to use its tools.

To test it, I went through the following exercise:

  • In Azure Platform I created a storage account and in it I created a Blob Container.
    • “A container organizes a set of blobs, similar to a directory in a file system. A storage account can include an unlimited number of containers, and a container can store an unlimited number of blobs.”
  • I created a local folder that will be synchronized by the new flow with the container in Azure

In Power Automate, I started with the Template provided by Microsoft and set up the flow:

The flow requires two connectors to be set up:

  • one to the Azure storage container
  • one to the local or network folder

Editing the Azure Blob Storage connection, we see that we need the name of the Azure storage account, in my case “svflorida”, and the storage access key:

The storage access key is located in the Azure portal under Access Keys:

Editing the File System Connector:

The most time-consuming part, about half an hour, was setting up and troubleshooting the gateway.

The flow cannot just drop files from Azure on your machine. It needs a gateway.

To create a new gateway, click on the drop down and choose “+ New on-premises data gateway”.

That will prompt you to download an MSI to install the gateway: GatewayInstall.msi.

Once the gateway was installed, the only change I made was to switch from HTTPS to TCP:

In a live environment I would investigate and maybe set up an Azure Service Bus, but for the purpose of this exercise I went with TCP.

Once that is done, the flow is triggered when new files are uploaded to or deleted from the Azure container.

I noticed that with my free trial license the recurrence of the flow was set to 3 minutes.

The flow seems to pick up changes as expected; just be patient and wait for the next run 🙂

In the Azure portal, upload a new file into your container:

The file will appear after a few minutes in your local folder:

And the flow shows a successful run:

That’s it! In the next blog I will look into how I can generate BC SaaS extracts into an Azure storage container so the flow doesn’t feel useless 🙂

I hope this helps someone. Anyway, it’s late here, so I’ll call it a night!

Azure Storage Table and its API

One way to make Business Central data available to different parties and users is to leverage Azure Storage.

Customers will need an Azure subscription.

There are four types of storage in the Azure portal: Blob Containers, File Shares, Queues, and Tables.

Today’s blog is about Azure Storage Tables and their API.

More on its API here.

In this blog I covered:

  • creating a storage account table using the Azure portal;
  • inserting data into the storage account table via the VS Code extension “Rest Client”;
  • writing an extension to send Business Central vendor data to the Azure Storage table;
  • using Excel to share Azure Storage table data with 3rd party users.

Create storage account and storage account table

Through the Azure portal, I created one storage account and a few tables inside it.

The process is simple and you can find details about storage accounts here and an overview on storage account tables here.

You can manually insert data into these tables using Azure Storage Explorer.

Let’s see how we can interact with them first via Rest Client (extension for VS Code), and then via Business Central extensions.

Using Rest Client to interact with Azure Storage Table

1. We can query the Azure tables via Rest Client in VS Code:

[in the picture insert your SAS token right after “?”]

2. Query Vendor table:

3. Insert new record in Vendor Table:

Let’s now verify in Azure portal the last action:

Use Azure Storage Table API from Business Central

With the API tested, let’s now move to Business Central and AL and try to insert records in the Azure tables.

The sample code I worked on scans all vendors and sends Vendor."No." and Vendor.Name to the Vendor table in Azure Storage.

When creating a new table in Azure Table Storage, each table comes by default with two fields:

  • PartitionKey
  • RowKey

In my example, PartitionKey will be empty, but you could populate it with the company name.

RowKey will be populated with Vendor.SystemId.

“Azure Storage Setup” is an extension table that keeps all that Azure access information:
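With that setup in place, a minimal sketch of the POST that inserts one vendor entity (the setup field names are assumptions) might look like this:

    procedure SendVendorToAzureTable(Vendor: Record Vendor)
    var
        AzureStorageSetup: Record "Azure Storage Setup";
        Client: HttpClient;
        Content: HttpContent;
        Request: HttpRequestMessage;
        Response: HttpResponseMessage;
        Headers: HttpHeaders;
        Body: JsonObject;
        BodyText: Text;
    begin
        AzureStorageSetup.Get();

        // PartitionKey is left empty here; RowKey carries the vendor's SystemId
        Body.Add('PartitionKey', '');
        Body.Add('RowKey', Format(Vendor.SystemId));
        Body.Add('No', Vendor."No.");
        Body.Add('Name', Vendor.Name);
        Body.WriteTo(BodyText);

        Content.WriteFrom(BodyText);
        Content.GetHeaders(Headers);
        Headers.Remove('Content-Type');
        Headers.Add('Content-Type', 'application/json');
        Request.Content := Content;

        // POST https://{account}.table.core.windows.net/{table}?{sas-token}
        Request.Method := 'POST';
        Request.SetRequestUri(StrSubstNo('https://%1.table.core.windows.net/%2?%3',
            AzureStorageSetup."Account Name", AzureStorageSetup."Table Name", AzureStorageSetup."SAS Token"));
        Request.GetHeaders(Headers);
        Headers.Add('Accept', 'application/json;odata=nometadata');

        if not Client.Send(Request, Response) then
            Error('Could not reach Azure Table Storage.');
        if not Response.IsSuccessStatusCode() then
            Error('Azure Table Storage returned status %1.', Response.HttpStatusCode());
        // on success, the extension also inserts a record in the log table described below
    end;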

To log the work done by the POST requests, I created a table and a page, and I insert a record into this table with each successful POST request.

Use the Azure Storage Explorer to view records in the Azure Portal:

What about the 3rd party users?

How do we give them access to the data in the storage table?

Good old Excel is here to help.

Use of Excel to share Azure Storage Data with 3rd party users

  • Open Excel
  • go to the Data menu
  • Get Data -> From Azure Storage Table
  • when prompted, for “Account Name or URL” enter the storage URL:
  • for the access key, enter the access key from the storage account

And this is what we get:

Click on Load, and then double click on Vendor Connection:

Power Query opens and we can enable the other fields (by clicking on the Content column and selecting the missing columns):

You can find the AL sample code here.

Give customers their own Advanced Settings page in Business Central

In the latest versions of Business Central you can find an Advanced Settings page.

In the navigation bar, click on the magnifying glass icon; in the search box, type “Advanced Settings”.

This brings up a NavigatePage, page 9202 “Advanced Settings”, located in the Microsoft System Application app.

To see the source code of this page, browse to Microsoft’s GitHub.

The repo folder consists of this page and two codeunits involved in its internal mechanics.

If customers find themselves lost among all the setup and settings pages, or if the “Manual Setup” page is too large, we could gather the most used pages on a custom “ABC Advanced Settings” page, like I did below:
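A minimal sketch of such a page (the object ID, caption, and the listed setup pages are just examples):

page 50300 "ABC Advanced Settings"
{
    PageType = NavigatePage;
    Caption = 'ABC Advanced Settings';
    UsageCategory = Administration;
    ApplicationArea = All;

    layout
    {
        area(Content)
        {
            label(Intro)
            {
                ApplicationArea = All;
                Caption = 'Shortcuts to the setup pages our users need most often.';
            }
        }
    }

    actions
    {
        area(Processing)
        {
            action(GeneralLedgerSetup)
            {
                ApplicationArea = All;
                Caption = 'General Ledger Setup';
                Image = Setup;
                InFooterBar = true;
                RunObject = page "General Ledger Setup";
            }
            action(SalesReceivablesSetup)
            {
                ApplicationArea = All;
                Caption = 'Sales & Receivables Setup';
                Image = Setup;
                InFooterBar = true;
                RunObject = page "Sales & Receivables Setup";
            }
        }
    }
}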

Repository here.

To use or not to use SelectLatestVersion()

When using web services or API-exposed entities, you might find it useful to ask the application service to grab the latest version of the underlying data.

The definition at Database.SelectLatestVersion Method – Business Central | Microsoft Docs states that by using the SelectLatestVersion() function you make sure that “the data displayed is the most current data in the database”.

Why would we do that? Isn’t the browser page automatically refreshed?

Not always. Not when a 3rd party app updates the records.

Let’s do some tests in a SaaS environment.

I created a custom entity (testTable) with a list page and an API page. I will start by pushing 10 records to the table via a batch request using Postman:

This is the result when executing the “refresh” action:

And now let’s send another batch with 4 Delete requests:

Next, I’m going to send another 10-record batch to BC.

Using a new action, “refresh-SelectLatestVersion”, which does not contain SelectLatestVersion(), gives us the following:

It appears that SelectLatestVersion() does not make any difference in SaaS: affecting records through a native BC API does not require SelectLatestVersion().

Let’s try something similar in an On-prem installation.

When records are updated by other apps, not through Business Central means (by the way, not a great idea), the page is not notified of changes in the underlying data and therefore is in a stale state.

How can we enforce the data to update?

Using SelectLatestVersion() we clear the cache for the underlying table and initiate a new read against the database, which affects performance.

Let’s see how long it actually takes the server to grab the latest data.
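The refresh action used for this test is roughly the sketch below, placed in the actions area of the custom list page over testTable (names are illustrative):

            action(Refresh)
            {
                ApplicationArea = All;
                Caption = 'Refresh';
                Image = Refresh;

                trigger OnAction()
                var
                    StartTime: DateTime;
                begin
                    StartTime := CurrentDateTime();
                    SelectLatestVersion();   // invalidate the cached data version for the table
                    CurrPage.Update(false);  // force a re-read; false = do not save the current record
                    Message('Refreshed in %1.', CurrentDateTime() - StartTime);
                end;
            }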

Via T-SQL, I inserted 1,000,000 records:

and this is what I’ve got when I refreshed the page:

Then I removed all records:

As you can see above, even though my CurrPage.Update() comes before the Message, the page still shows the records. I am guessing that the message gets displayed before the page is re-rendered.

After clicking OK, the page gets rendered again and shows 0 records.

It took 69 milliseconds, but the table had only 2 fields. With more fields, the refresh might take longer.

Sometimes customers will ask for an auto-refresh page. While there are technical means to satisfy the request, we need to recognize that this comes with a price, hurting performance. And when applying an auto-refresh to multiple pages the price consequently multiplies.

Things to consider:

  1. avoid, when possible, the use of SelectLatestVersion() on-premises.
  2. In SaaS there is no need for SelectLatestVersion(); refreshing the page via an action or browser F5 displays the latest data.
  3. avoid auto-refreshing; rather go with a manual refresh (an action on the page that calls SelectLatestVersion() and CurrPage.Update()) than with an auto-refresh (a timer control add-in).
  4. To decrease the number of SelectLatestVersion() and CurrPage.Update() calls, log your refresh triggers (count and refresh datetime), compare the current count against the last refresh count, get the maximum SystemModifiedAt among your records and compare it against your last logged datetime …

Extension code for SaaS is located here.