Handy Date-Time Dialog in Business Central

Have you ever needed a DateTime field in your Business Central extension?

I recently added a DateTime field in one of my customer extensions and wanted to allow users to record not only a date, but also a time.

The solution is not difficult, even if we need to write our own code.

But why not use existing code from the Base Application?

Look at this sample code:

table 50110 "Sample DateTime"
{
    DataClassification = CustomerContent;

    fields
    {
        field(1; MyDateTime; DateTime)
        {
            DataClassification = CustomerContent;

            trigger OnLookup()
            var
                DateTimeDialog: Page "Date-Time Dialog";
            begin
                DateTimeDialog.SetDateTime(RoundDateTime(MyDateTime, 1000));
                if DateTimeDialog.RunModal() = Action::OK then
                    MyDateTime := DateTimeDialog.GetDateTime();
            end;
        }
    }
}

When you run the page, click on the “…” button and the Date-Time Dialog page opens:

The code above uses page 684 “Date-Time Dialog”.

If you want to see how Microsoft designed the page, check this.

And if you want to see how Microsoft implemented it in Base Application check table 472 “Job Queue Entry”, fields “Expiration Date/Time” and “Earliest Start Date/Time”.

Look for all pickable DateTime fields in your solutions and refactor them to use the “Date-Time Dialog” page.
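
For example, here is a minimal sketch of the same pattern applied to a page field via the OnAssistEdit trigger (the page object, its ID, and the field layout are made up for illustration):

page 50111 "Sample DateTime Card"
{
    PageType = Card;
    SourceTable = "Sample DateTime";

    layout
    {
        area(Content)
        {
            field(MyDateTime; Rec.MyDateTime)
            {
                ApplicationArea = All;

                // Same "Date-Time Dialog" pattern as in the table trigger above
                trigger OnAssistEdit()
                var
                    DateTimeDialog: Page "Date-Time Dialog";
                begin
                    DateTimeDialog.SetDateTime(RoundDateTime(Rec.MyDateTime, 1000));
                    if DateTimeDialog.RunModal() = Action::OK then
                        Rec.MyDateTime := DateTimeDialog.GetDateTime();
                end;
            }
        }
    }
}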

Hope this helps!

So I chose Business Central … But am I losing my Dynamics GP data?

In the last 6 months I’ve been involved with a number of GP to BC migration projects.

A recurring question that reaches our team is: how do I see my GP data in BC?

One avenue to move your business to BC is to import open transactions, master data, and tested setup tables with RapidStart packages. If the underlying table for the desired GP entity does not exist in BC, a Business Central developer would need to create the table in BC first; then, with the Edit in Excel functionality, you can get GP data into BC.

There is also the Cloud Migration Tool in BC. More about it here.

Using this tool ensures that the most important entities (master data and open transactions) make it into BC. But what if a GP end user wants additional GP data in BC?

Microsoft's recommendation is to bring as little as possible into the cloud from an on-premises database.

Moreover, as your database grows and consumes more capacity, your cost can increase. See more here.

Rather than bringing GP tables one by one into BC, use the cloud migration tool to move data from GP to Azure Data Lake.

If the decision is, though, to have some GP data in Business Central, there are tools to make that possible.

We can extend the cloud migration tool so that, when the migration runs, besides the core migrated data (master data and open transactions), the process also brings the data from a GP table into a new space (an extension table), as mapped on the “Manage Custom Tables” page.

What’s needed to achieve this:

  • Create a Business Central extension. In it, create an AL table to store the data from a GP table
  • Add the custom table in “Manage Custom Tables”
  • Run the migration tool
  • Check the custom table content after migration

Let’s try bringing table GL00100 from GP into BC.

Note: this table was chosen only for demonstration. GL00100 is brought by default by the cloud migration tool into BC table “G/L Account”.

Create extension with GP table

I created an extension that includes a table for this GP entity:
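
Here is a minimal sketch of such a table (the field subset and data types are my own assumptions; the field names mirror GP's GL00100 columns, and the object ID matches the one used later in the URL):

table 50340 "GP GL00100"
{
    DataClassification = CustomerContent;
    ReplicateData = true; // allow the cloud migration tool to replicate this table

    fields
    {
        field(1; ACTINDX; Integer) { DataClassification = CustomerContent; }
        field(2; ACTNUMBR_1; Code[20]) { DataClassification = CustomerContent; }
        field(3; ACTDESCR; Text[100]) { DataClassification = CustomerContent; }
        field(4; ACCTTYPE; Integer) { DataClassification = CustomerContent; }
    }

    keys
    {
        key(PK; ACTINDX) { Clustered = true; }
    }
}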

Map the migration for the new table in “Cloud Migration Management”

In Business Central, search for “Cloud Migration Management”.

Under Actions, trigger the “Manage Custom Tables” action:

On the “Migration Table Mapping” page, map the new table in your extension to the GP table:

On the “Cloud Migration Management” page, trigger the “Run Migration Now” action.

You can check the results in the cues in the Migration Information area:

To check the content migrated:

  • change the company to the migrated company
  • run the new table by adding “&table=50340” to the Business Central URL, as shown below:
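
For example (the tenant, environment, and company names are placeholders):

https://businesscentral.dynamics.com/<tenant>/<environment>?company=<migrated company>&table=50340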

We can now see the result of migrating the GP data to the custom BC table:

Conclusion

To answer the question in the title: you don’t lose GP data. There are multiple ways of accessing your GP data after go-live to BC, including:

  • retaining access to your old system
  • migrating your Dynamics GP installation to Azure (SQL Server and application)
  • migrating your GP data warehouse to Azure Data Lake
  • or, as shown above, keeping some GP data in Business Central with minimal coding

Engage with your partner and decide what GP data you really need today, so that your cloud ERP stays performant in the long term.

Export Business Central online entities to Azure storage blob container

As most of you probably know, it is not possible to access the file system from a Business Central cloud environment.

For example, in Dynamics NAV we could have a job queue entry that, when run, creates a file and copies it to a network folder. We can still do that in an on-premises environment, but not with cloud BC.

You could create the file and use DownloadFromStream, but that would only prompt you to download it locally; it would not copy it to a specific local or network folder.

If you try to use File.Create() you would get the warning: “The type or method ‘Create’ cannot be used for ‘Extension’ development”.

If your customer is happy to grab the file manually every time from the downloads folder, then this should suffice:
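
Here is a minimal sketch of such an export (the codeunit ID and the CSV layout are my own assumptions): build the vendor list in an in-memory blob and prompt the user to download it.

codeunit 50250 "Export Vendors To File"
{
    procedure DownloadVendorList()
    var
        Vendor: Record Vendor;
        TempBlob: Codeunit "Temp Blob";
        InStr: InStream;
        OutStr: OutStream;
        FileName: Text;
    begin
        // Build a simple CSV extract in memory
        TempBlob.CreateOutStream(OutStr);
        if Vendor.FindSet() then
            repeat
                OutStr.WriteText(Vendor."No." + ',' + Vendor.Name);
                OutStr.WriteText(); // line break
            until Vendor.Next() = 0;

        // Prompt the user to download the file locally
        TempBlob.CreateInStream(InStr);
        FileName := 'VendorList.csv';
        DownloadFromStream(InStr, '', '', '', FileName);
    end;
}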

But if we want to automate this process and run the extract on a regular basis, we need a cloud solution for storing the files.

Currently, there are four types of storage in the Azure Storage platform:

  • Containers/Blobs
  • File Shares
  • Queues
  • Tables

In my previous blog I dived into Azure Table Storage and tackled its API.

This blog is about interacting with the Azure storage blob containers:

  • manually, via the Azure Portal
  • as a simulation, via the VS Code extension “Rest Client”
  • from a Business Central extension
  • viewing the blob container with Excel
  • getting Azure blobs locally

I found on Michael Megel’s blog a nice solution for exactly what I need. Awesome job on Blob Containers API, Michael! Thank you for sharing!

What I need:

  • Azure:
    • Set up a blob container to store Business Central exported files
    • Set up Storage Access Key
  • Simulation:
    • In VS Code, write requests with the “Rest Client” extension, targeting the Azure blob container API
  • Business Central:
    • A setup table in Business Central for the Azure access details
    • An export interface that allows users to run an action (“Write File in Azure”) that sends the extract to the Azure container. The same code could be executed by a job queue.

Blob Container Setup

To set up a container, following Michael’s notes on the blog above was enough for me.

For blob container accessibility I went down the path of a shared access signature (“SAS Token”).

Once created, you can start playing with the storage account container API.

I created the storage manually:

Drilling down into the storage account, I created a new container:

Simulation:

In VS Code, using Rest Client,

  1. I sent a request to get the list of containers:

Request:

GET https://svflorida.blob.core.windows.net/?comp=list&[here you insert your SAS token key]
content-type: application/json

Response:

HTTP/1.1 201 Created
Content-Length: 0
Content-MD5: 1B2M2Y8AsgTpgAmY7PhCfg==
Last-Modified: Wed, 18 Aug 2021 19:05:13 GMT
ETag: "0x8D9627B1BD88A0F"
Server: Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0
x-ms-request-id: 3f97555d-801e-006d-5263-94f646000000
x-ms-version: 2020-08-04
x-ms-content-crc64: AAAAAAAAAAA=
x-ms-request-server-encrypted: true
Date: Wed, 18 Aug 2021 19:05:13 GMT
Connection: close

  2. I sent a PUT request to insert an empty file:

Request:

PUT https://svflorida.blob.core.windows.net/vendorlist/vl1111?[your SAS token here]
x-ms-blob-type: BlockBlob
Content-Length: 0

Response:

HTTP/1.1 201 Created
Content-Length: 0
Content-MD5: 1B2M2Y8AsgTpgAmY7PhCfg==
Last-Modified: Wed, 18 Aug 2021 19:23:46 GMT
ETag: "0x8D9627DB340E9DD"
Server: Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0
x-ms-request-id: b77cbfb2-b01e-003b-2566-9407a9000000
x-ms-version: 2020-08-04
x-ms-content-crc64: AAAAAAAAAAA=
x-ms-request-server-encrypted: true
Date: Wed, 18 Aug 2021 19:23:46 GMT
Connection: close

And this is the file in Azure portal:

Business Central extension:

This is how the new setup table “Azure Storage Setup” looks in BC:
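
A minimal sketch of what such a setup table could hold (the object ID, field names, and lengths are my own assumptions):

table 50252 "Azure Storage Setup"
{
    DataClassification = CustomerContent;

    fields
    {
        field(1; "Primary Key"; Code[10]) { DataClassification = CustomerContent; }
        field(2; "Storage Account Name"; Text[100]) { DataClassification = CustomerContent; }
        field(3; "Container Name"; Text[100]) { DataClassification = CustomerContent; }
        field(4; "SAS Token"; Text[250]) { DataClassification = CustomerContent; }
    }

    keys
    {
        key(PK; "Primary Key") { Clustered = true; }
    }
}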

This is how the new BC interface “Vendor Export Log” looks:

The “Write File In Azure” action on page 50251 “Vendor Export Log” does the following (see the sketch after this list):

  • exports all BC vendors to a blob
  • writes the blob to the PUT request content
  • sends the PUT request to the Azure Blob Storage API
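
Here is a minimal sketch of that action, assuming the hypothetical “Azure Storage Setup” fields sketched above (the blob name, URL construction, and CSV layout are my own assumptions, not the exact implementation):

procedure WriteFileInAzure()
var
    AzureStorageSetup: Record "Azure Storage Setup";
    Vendor: Record Vendor;
    TempBlob: Codeunit "Temp Blob";
    Client: HttpClient;
    Content: HttpContent;
    Headers: HttpHeaders;
    RequestMsg: HttpRequestMessage;
    Response: HttpResponseMessage;
    InStr: InStream;
    OutStr: OutStream;
    Url: Text;
begin
    AzureStorageSetup.Get();

    // 1. Export all BC vendors to a blob (simple CSV extract)
    TempBlob.CreateOutStream(OutStr);
    if Vendor.FindSet() then
        repeat
            OutStr.WriteText(Vendor."No." + ',' + Vendor.Name);
            OutStr.WriteText(); // line break
        until Vendor.Next() = 0;
    TempBlob.CreateInStream(InStr);

    // 2. Write the blob to the PUT request content
    Content.WriteFrom(InStr);

    // 3. Send the PUT request to the Azure Blob Storage API
    // https://<account>.blob.core.windows.net/<container>/<blob name>?<SAS token>
    // (assumes the SAS token is stored without the leading '?')
    Url := 'https://' + AzureStorageSetup."Storage Account Name" + '.blob.core.windows.net/' +
        AzureStorageSetup."Container Name" + '/VendorList.csv?' + AzureStorageSetup."SAS Token";

    RequestMsg.Method := 'PUT';
    RequestMsg.SetRequestUri(Url);
    RequestMsg.Content := Content;
    RequestMsg.GetHeaders(Headers);
    Headers.Add('x-ms-blob-type', 'BlockBlob');

    if not Client.Send(RequestMsg, Response) or not Response.IsSuccessStatusCode() then
        Error('Upload to Azure Blob Storage failed: %1', Response.ReasonPhrase());
end;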

View Blobs with Excel:

BC users can click the URL link above and download the file locally, or they, and other third-party users, can access the files via Excel, as I explained in my previous blog.

This time though, when creating the connection, choose Data -> Get Data -> From Azure -> From Azure Blob Storage.

And finally, the data is displayed in the Excel workbook:

Get Azure Blobs locally

To help with getting the files locally, I wrote 2 blogs:

  • one about getting the files locally using Power Automate
  • one about using the Azure CLI to copy the files from Azure Blob Storage locally

For more about storage accounts in Azure check this out.

You can find the sample code repository here.