Handy Date-Time Dialog in Business Central

Did you ever need a DateTime field in your Business Central extension?

I recently added a DateTime field to one of my customer extensions and wanted to allow users to record not only a date but also a time.

The solution is not difficult even if we have to write our own code.

But why not use existing code from the Base Application?

Look at this sample code:

table 50110 "Sample DateTime"
{
    DataClassification = CustomerContent;

    fields
    {
        field(1; MyDateTime; DateTime)
        {
            DataClassification = CustomerContent;

            trigger OnLookup()
            var
                DateTimeDialog: Page "Date-Time Dialog";
            begin
                // Round to whole seconds (1000 ms) before handing the value to the dialog
                DateTimeDialog.SetDateTime(RoundDateTime(MyDateTime, 1000));
                if DateTimeDialog.RunModal() = Action::OK then
                    MyDateTime := DateTimeDialog.GetDateTime();
            end;
        }
    }
}

When you run a page exposing this field, click on the "…" next to the field to open the Date-Time Dialog page.

The code above is using Page 684 “Date-Time Dialog”.

If you want to see how Microsoft designed the page, check this.

And if you want to see how Microsoft implemented it in Base Application check table 472 “Job Queue Entry”, fields “Expiration Date/Time” and “Earliest Start Date/Time”.

Look for all pickable DateTime fields in your solutions and refactor them to use the "Date-Time Dialog" page.

Hope this helps!

Using Postman to test OAuth 2.0 authorization to Business Central RESTful APIs

Recently I have been involved in several projects migrating Dynamics GP customers to BC (SaaS).

All these GP customers had integrations developed over the years, integrations that now need to be re-targeted to BC.

Now, if you are new to Business Central API integration, you should know that several authoritative bloggers have covered OAuth 2.0 in the last six months with very useful how-tos. Have a look at Stefano's or Roberto's blogs. The most comprehensive writing in this specific niche, though, I find to be A.J. Kauffmann's.

A.J.'s posts are meticulous. Moreover, they seem to land almost in step with my own work requirements, so, yes, I have found them very useful.

As you have probably heard or seen, Basic Authentication is no longer supported for BC online.

The only option now (at least for BC online) is OAuth2 authorization.

How do we start with setting up OAuth2?

Well, I won't go into that: A.J.'s post covers it so well that I have nothing to add.

To preserve the flow of this post, all I will say is that you need to:

  • Register the external application in Azure Active Directory
  • Create the external application account in Business Central
  • Grant consent

Once these three steps are completed, we can move to Postman.

Get a Token

In Postman, add the following request to generate a security token:

POST https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token

Replace {tenant_id} with your target tenant.

If you don't know your tenant_id, navigate to Partner Center and launch the Azure portal.

Then, in the Overview blade, look for Tenant ID.

In Postman, create the POST request with the URL above.

For client_id, use the Application (client) ID from your app registration's Overview page.

For client_secret, use the secret's Value (not the Secret ID) from the Certificates & secrets page.

Under the Body tab, include the following elements:
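For the client credentials flow, the body (x-www-form-urlencoded) typically carries these key-value pairs; the scope below is the standard Business Central scope:

grant_type: client_credentials
client_id: {client_id}
client_secret: {client_secret}
scope: https://api.businesscentral.dynamics.com/.default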

Save the request and Send.

The system will generate a new token.

Now, with the token, let's execute a few API requests.

Under Authorization for a new request (or better, create a folder and use "Inherit from parent" for all requests under the new folder), set the authorization type to OAuth 2.0.

In the Access Token field, under Available Tokens, paste the value of the "access_token" element from the previous response.

Test BC APIs

1. The request for all RESTful API entities could look like this:
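For example (a sketch, with {tenant_id} and {environment} as placeholders for your own values; the same placeholders are used in the examples that follow):

GET https://api.businesscentral.dynamics.com/v2.0/{tenant_id}/{environment}/api/v2.0/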

2. The request for all companies looks like this:
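GET https://api.businesscentral.dynamics.com/v2.0/{tenant_id}/{environment}/api/v2.0/companies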

3. For all customers in a specific BC company:
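GET https://api.businesscentral.dynamics.com/v2.0/{tenant_id}/{environment}/api/v2.0/companies({company_id})/customers

({company_id} is the id value returned by the companies request.)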

4. For inserting a new customer in a specific company:
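POST https://api.businesscentral.dynamics.com/v2.0/{tenant_id}/{environment}/api/v2.0/companies({company_id})/customers

with a JSON body carrying at least a display name, for example:

{
    "displayName": "Adventure Works",
    "email": "info@adventure-works.com"
}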

5. To update a specific customer:
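PATCH https://api.businesscentral.dynamics.com/v2.0/{tenant_id}/{environment}/api/v2.0/companies({company_id})/customers({customer_id})

where {customer_id} is the customer's id from a previous GET, and the request carries an If-Match header.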

Note the If-Match header. It must match the most recent state of the object you are updating.

In your code: GET the customer, note the @odata.etag value in the response, then pass that value in an If-Match header when you PATCH (update) the customer.

In the body, include the fields that you want to update:
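For example, to change just the phone number:

{
    "phoneNumber": "425-555-0100"
}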

6. You could also delete a customer:
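DELETE https://api.businesscentral.dynamics.com/v2.0/{tenant_id}/{environment}/api/v2.0/companies({company_id})/customers({customer_id})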

With the APIs tested in Postman, you can now translate your BC requests to any programming platform.

Business Central : Word Layout date fields formatting

One of my co-workers asked how we can format a date field in an MS Word layout document so that it shows only the date and not the time.

To get started with adding fields to a Word Report Layout start with this:

How to Add Fields to a Word Report Layout – Business Central | Microsoft Docs

Let's say you added a date field, CustomDate, to a BC report.

When you run the report, the field renders as a DateTime: 30-06-2021 12:00:00.

How do we force the report to display only the date: 30-06-2021?

Let's look at a simple example: you want to add the DueDate_SalesInvHeader field from the dataset to a custom Word layout for report 206, Sales Invoice.

If you followed the link above, you might now have a control on your document that looks like this:

If this field comes from AL with both date and time components, how do we format it so that it shows only the date?

In RDLC you would go to the properties of the text box hosting your DateTime field and change the Format property to "d".

How about Word Layouts?

It's quite simple, but not easy to find in the documentation.

You would change the date field in the layout to show like this:

You can change the formatting to match your region; in the US: MM/dd/yyyy.

There is always the option of writing an extension and sending the layout an already formatted date field as text.
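A sketch of that approach (the object and column names are mine, and I am assuming the header dataitem in report 206 is named "Sales Invoice Header"):

reportextension 50130 SalesInvoiceDates extends "Sales - Invoice"
{
    dataset
    {
        add("Sales Invoice Header")
        {
            // A pre-formatted text column; the Word layout renders it as-is
            column(DueDateText; Format("Due Date", 0, '<Day,2>/<Month,2>/<Year4>'))
            {
            }
        }
    }
}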

Move your Blobs out of the database and into Media and Media Set data type fields

“Business Central customers can use up to 80 GB of database storage capacity across all of their environments (production and sandbox), meaning that the sum of database capacity usage of all of their environments must not exceed 80 GB” – Microsoft Docs

One way to keep your SaaS customers database size in check is by “Migrating BLOB data types to Media or MediaSet – Data in Media or Media set data types aren’t counted in the database limit. As an extension developer, consider migrating data from blobs to the Media or MediaSet datatypes for your own extensions” – per Microsoft Documentation.

Said and done.

Let’s create a new table that contains a MediaSet field.
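A minimal sketch (the table number and names are mine):

table 50140 "Media Repository"
{
    DataClassification = CustomerContent;

    fields
    {
        field(1; "Entry No."; Integer)
        {
            DataClassification = CustomerContent;
        }
        field(2; Pictures; MediaSet)
        {
            // Content lands in Tenant Media and does not count against the database capacity limit
            DataClassification = CustomerContent;
        }
    }

    keys
    {
        key(PK; "Entry No.")
        {
            Clustered = true;
        }
    }
}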

In a list page for the table above, I displayed the MediaSet field in a factbox with three actions:

  • Export
  • Import
  • Delete

The code in each action was based on the Customer Image factbox, but adapted to use Microsoft's "Tenant Media" system table.
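As a sketch of the Import action (names are mine; UploadIntoStream and MediaSet.ImportStream do the heavy lifting):

    action(ImportPicture)
    {
        Caption = 'Import';

        trigger OnAction()
        var
            FileName: Text;
            InStr: InStream;
        begin
            // Upload a file from the client into a stream, then import it into the MediaSet
            if UploadIntoStream('Select a picture', '', 'All Files (*.*)|*.*', FileName, InStr) then begin
                Rec.Pictures.ImportStream(InStr, FileName);
                Rec.Modify(true);
            end;
        end;
    }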

Get the code from here.


Views and Queries in Business Central

While reviewing options for altering existing queries, I was struck by the fact that queries no longer need a new page to display results; they can be connected to existing list pages.

Below, highlighted, are all the queries connected to the Customer List page.

For example, the "Blocked Customers" query is located in the _Exclude_Microsoft Dynamics 365 SmartList extension.

Its source code can be found in Microsoft's open-source GitHub repository for BC.

We connect a query with a list page through the QueryCategory property, as described in more detail here and here.

Moving on to views: at run time, we can create a view by activating the filter pane and saving the filters.

But, and this is the cool part, you can also create a view in code.

For example, if you develop a solution and want your extension to ship with predefined views, here is what you would do:

  1. Create a new profile:

In VS Code, create a new al file and include this code:

profile CUSTOM_VIEWS_QUERIES
{
    Caption = 'CUSTOM VIEWS and QUERIES';
    RoleCenter = "Business Manager Role Center";
    Customizations = TestView;
}

2. Create a new page customization object:

pagecustomization TestView customizes "Customer List"
{

    views
    {
        addfirst
        {
            view(View1)
            {
                Caption = 'Test View 1';
                Filters = where("Balance" = filter(..100));
            }
        }
    }
}

Compile the project containing the new AL file, and publish the extension.

In BC, switch to the new profile, CUSTOM_VIEWS_QUERIES, and under Customers we can now see two views:

  • Bobs -> created at run time
  • Test View 1 -> created via code

For creating views at design time, read more here.

Lastly, let's add a custom query to the Customer List.

In your AL project, create a new AL file and add a new query:

query 50100 Test_Query
{
    QueryType = Normal;
    QueryCategory = 'Customer List';
    Caption = '__Test Query';
    TopNumberOfRows = 10;
    elements
    {
...

For the query to make sense on the Customer List page, the chosen dataitem should contain the primary key of the table behind the list, meaning Customer."No.", as in the sketch below.
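A minimal sketch of the elements section under that rule (the columns are my choice):

    elements
    {
        dataitem(Customer; Customer)
        {
            column(No; "No.") { }
            column(Name; Name) { }
            column(Balance_LCY; "Balance (LCY)") { }
        }
    }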

If you don't want to create a new query from scratch, just copy an existing query attached to the Customer List, like Blocked Customers, and replace the top part with the code above.

Build and publish. The new query is now displayed in the Customers drop-down list of queries:

In the Dynamics NAV days, we would have exposed a query by adding an action to execute a page supporting the query, or by adding that page to the MenuSuite.

Now, with one line of code (the QueryCategory property), we allow a query to be executed from all the list pages named in QueryCategory.

As we saw, views can now be coded, which makes your extension versatile and easy to use.

Awesome job, Microsoft.

New-BcContainerWizard generates scripts to build NAV or Business Central containers

Over the last few months I have been learning more and more about Docker and the Microsoft PowerShell libraries that get you a running instance of NAV or Business Central in a local container.

I am going to look at three ways to get a running NAV/BC container.

1. Using a pure docker command

You can start with Docker here.

An example of using docker to create your first BC container:

docker run -e accept_eula=y mcr.microsoft.com/businesscentral/sandbox

This command sets an environment variable local to the new container, pulls the specified image (mcr.microsoft.com/businesscentral/sandbox) if it has not already been pulled, and runs it.

You could also run a container based on the Business Central image for Windows Server 2019. This is a lighter image than the previous one, which was for Windows 10.

docker run -e accept_eula=y mcr.microsoft.com/businesscentral/sandbox:ltsc2019

To check the size of the downloaded images, run from the command prompt:

docker image list

If you want to delete some of your images, run the following for each of them:

docker image rm 0d -f

You can specify the whole image identifier or just its first few characters (here, 0d), as long as the prefix is unambiguous.

With the container running, you can open VS Code and install the AL Language extension using the ALLanguage.vsix from the location displayed in the log:

http://e8d9bbb19805:8080/ALLanguage.vsix

If you have trouble using the DNS name, something probably went wrong when the hosts file was written, but you can always use the container's IP instead.

Now you should be able to connect to your container and start writing extensions.

2. Using Microsoft's NavContainerHelper module, more specifically the New-NavContainer command:

New-NavContainer -accept_eula -containerName "firstcontainer" -auth Windows -imageName mcr.microsoft.com/businesscentral/sandbox:ltsc2019

While the previous command could be launched from the command prompt (with Docker running), you can launch the command above from PowerShell ISE (run as admin). It will pull the latest Business Central image for Windows Server 2019. If you run "docker image ls", you will notice this is a lighter image.

You can connect to that instance to write extensions by running VS Code and installing the .vsix file that comes with the container.

3. Using Microsoft’s module BcContainerHelper.

Microsoft's latest module, BcContainerHelper, has a New-BcContainerWizard command. This command generates a PowerShell script that, when run, creates a NAV/BC container.

To gain access to the wizard, first install the new BcContainerHelper module. When running "Install-Module -Name BcContainerHelper" I got an error.

Adding the "-AllowClobber" parameter let the module install successfully:

Install-Module -Name BcContainerHelper -AllowClobber

Once BcContainerHelper was installed, I had access to New-BcContainerWizard.

Let’s launch it and install a container loaded with a NAV 2017 CU5 image:

1. Accept Eula:

Choose Y.

2. Next, we need to choose between a locally hosted container and a container hosted in Azure.

Default is Local and that’s what I chose.

3. For the authentication step you have a few options: username + password, or Windows. I chose Windows.

4. Name the container.

5. Version: latest BC (onprem or Sandbox) or a specific version.

We are going to install a container with an onprem image of NAV 2017 CU5. For a list of sandbox and onprem images, consult these links:

https://mcr.microsoft.com/v2/businesscentral/sandbox/tags/list

https://mcr.microsoft.com/v2/businesscentral/onprem/tags/list


6. Country: I chose NA.

7. Test toolkit?

10. License: No, a local file, or an https URL for downloading the license file.

11. Database: you can leave the new container loaded with the Cronus database in a SQL Server Express instance local to the container, load a database of the correct version into that SQL Server Express instance, or use a SQL Server outside the container. I chose the default, planning to connect to the SQL Server Express instance and restore a backup later, once the container is created and running.

12. DNS settings:

13. Memory limit: I left the default.

14. CAL development: as it is a NAV 2017 image, I chose Yes.


Name and save your PowerShell script.

The window will close and PowerShell ISE opens with the new script.

Now you can distribute the script to your team. If you want to generate the new container, press F5.

In the script, note the new artifactUrl parameter. More on this new parameter here.
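For reference, the generated script for this scenario looks roughly like this (a sketch; the values reflect my wizard answers):

$containerName = 'nav2017cu5'
$auth = 'Windows'
$artifactUrl = Get-NavArtifactUrl -nav '2017' -cu 'cu5' -country 'na'
New-NavContainer -accept_eula `
    -containerName $containerName `
    -auth $auth `
    -artifactUrl $artifactUrl `
    -includeCSide `
    -updateHosts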

After 835 seconds the container was installed. However, during the last steps (exporting shortcuts to the host desktop), the script encountered an error.

In the version of BcContainerHelper I was working with (1.0.1), Export-BcContainerObjects was not included. In fact, the command should have been Export-NAVContainerObjects.

I created an issue on GitHub, and the next day I had an email from Freddy confirming it was a bug whose fix would ship in BcContainerHelper 1.0.2-preview153.

Once the new release was available, I reinstalled the module and was able to finish running the script.

I needed a current version of the database instead of the installed Cronus database, so I connected to the container's SQL instance via SSMS and restored the database. Now development can take place in the container.

More information on Freddy‘s and Steve‘s blogs.

Report design: show row only on the first page

One of the things I bumped into recently was showing a portion of an order confirmation (a portion of the header) only on the first page.

I tried a few things, the closest being to put all the controls that should appear on the first page inside a rectangle, then control the rectangle via its Hidden property and the Globals!PageNumber property.
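For reference, the Hidden expression on the rectangle in that attempt looked something like:

=IIF(Globals!PageNumber > 1, True, False)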

This worked; however, the real estate taken by the rectangle was still consumed on later pages, so it was not useful.

The solution was to move all the controls into the first row of the tablix located in the body.

Then, after opening the Advanced Mode of the grouping:

  • for the row that you want to display only on the first page, set the following properties:
    • KeepTogether = False
    • KeepWithGroup = None
    • RepeatOnNewPage = True
  • for all the other rows:
    • KeepTogether = True
    • KeepWithGroup = After
    • RepeatOnNewPage = True

Hope it helps!

How to generate Azure Containers Instances loaded with Business Central in minutes

To start writing extensions for Business Central we have a few choices: install locally one of the release candidates, which come in the same format as any other Dynamics NAV DVD package; create a locally hosted Docker sandbox; or create a container instance in Azure.

Because the process of getting a container takes just a few minutes, I prefer to do my extension testing and development in an Azure container.

To generate my Azure container with Business Central, I started by installing Azure CLI for Windows. You can also use Chocolatey to install Azure CLI on your local machine.

In Visual Studio Code, open a Terminal and, in a PowerShell session, start your Azure work by logging in to your Azure account with:

az login


If you are already logged in and want to check the logged-in account info:

az account show

Next, we need to create a resource group, which is a logical container in Azure, something like an organizational unit in Active Directory or a folder for Windows files.

The command is “az group create” and takes two parameters: group name and location:

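For example (the location is my choice; the group name is reused below):

az group create --name svrg --location eastus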

Once the resource group is created, we can create the Azure container instance loaded with the latest Business Central using the following Azure command:

az container create
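Pieced together from the options described below, the full command looks something like this (the CPU/memory sizing and the EULA environment variable are my assumptions):

az container create -g svrg -n d365bc-az-cont-us-cont --image microsoft/bcsandbox:latest --os-type Windows --cpu 2 --memory 4 --ip-address public --ports 80 7046 7048 7049 8080 -e accept_eula=Y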

In this command:

  • the group in which the container will be created follows the "-g" (group) option: "svrg"
  • the name of the container follows the "-n" (name) option: "d365bc-az-cont-us-cont"
  • the image loaded into this container is "microsoft/bcsandbox:latest"
  • the OS is Windows
  • we can enter at most 5 ports: 80, 7046, 7048, 7049, 8080

For a complete list of parameters for “az container create”, check this.

To check the logs and find the login credentials recorded by Azure for the previous command, run "az container logs" like below:

az container logs -g svrg -n d365bc-az-cont-us-cont

As you can see above, the admin credentials are displayed and the new Azure Business Central instance appears ready for connections. Let's check by browsing to the web client link.

Ctrl+Click on the web client link in the log opens the Business Central web client.

To see the new container's page in Azure, navigate to the resource group and then to your container.

After entering the credentials from the logs, we are in.

Good! We've got a Business Central instance running in an Azure container, and we're ready to code and test extensions!

To point Visual Studio Code at this container, generate a new AL project with the AL: Go! command and, in launch.json, change the value of the "server" token to the container DNS name created above.
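The relevant launch.json entries would look something like this (the DNS name follows the container-name.region.azurecontainer.io pattern and is an assumption here, as is the instance name):

{
    "type": "al",
    "request": "launch",
    "name": "Azure container",
    "server": "http://d365bc-az-cont-us-cont.eastus.azurecontainer.io",
    "serverInstance": "NAV",
    "authentication": "UserPassword"
}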

In the next blog post I'll go through the steps of deploying an Azure container loaded with a Business Central image using deployment templates with parameters.

If you liked this article, bookmark my blog or follow me for more about NAV and Business Central.

Dynamics 365 Business Central: AL Code Analyzers #TurnOnCops #Extensions #Permissions

When you turn on PerTenantExtensionCop in Visual Studio Code and you have forgotten to create a Permissions.xml file, you get a compile error in your extension.

In Visual Studio Code -> User Settings, add an entry for the al.codeAnalyzers token like below:

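In settings.json that entry looks something like this (enable whichever cops you need; PerTenantExtensionCop is the one that enforces Permissions.xml):

"al.codeAnalyzers": [
    "${CodeCop}",
    "${PerTenantExtensionCop}"
]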

Without this flag turned on, your extension compiles.

If you do set PerTenantExtensionCop and you are missing Permissions.xml, you are going to get a compile error.

Adding a Permissions.xml file to the project root resolves the error.
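A minimal Permissions.xml has this shape (the role and object IDs are illustrative; ObjectType 0 means TableData):

<?xml version="1.0" encoding="utf-8"?>
<PermissionSets>
  <PermissionSet RoleID="MYEXTENSION" RoleName="My Extension">
    <Permission>
      <ObjectType>0</ObjectType>
      <ObjectID>50100</ObjectID>
      <ReadPermission>1</ReadPermission>
      <InsertPermission>1</InsertPermission>
      <ModifyPermission>1</ModifyPermission>
      <DeletePermission>1</DeletePermission>
      <ExecutePermission>0</ExecutePermission>
      <SecurityFilter />
    </Permission>
  </PermissionSet>
</PermissionSets>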

More reading on code analyzers here.

Original post here.

Day-to-day NAV: Upgrade codeunits

Starting with NAV 2015, there is an easier way to migrate data between old and new structures of NAV tables.

Per the MSDN excerpt, the upgrade codeunit's role is to provide "the logic for migrating existing data in the business data table from the old format to the new format after schema synchronization".

When we change the structure of a table, the system detects discrepancies between what we just saved through the development environment and what currently exists in SQL Server.

When we save the object, we can choose between three options for "Synchronize Schema". By default the value is "Now - with validation". If we choose "Later", the object is saved but no schema synchronization is performed. If we choose "Force", the object is saved and the new structure is applied at the SQL Server level, with potential data loss.

If we choose "Now - with validation", the system attempts to synchronize the schema and, because there are structural changes, checks for the existence of an upgrade codeunit that can perform the changes without data loss. If such an upgrade codeunit does not exist, we get a schema synchronization error.

To see the upgrade codeunit in action, I will:

  1. Create a table
  2. Populate it
  3. Change its structure:
    • change the ID# of a field
    • change the data type of a field
    • insert new field in the freed ID#
  4. Confirm schema sync error
  5. Create upgrade codeunit
  6. Attempt to save the object with Schema Synchronization “Now – with validation”
  7. Start the upgrade process
  8. Check upgrade output

Let’s create a table, populate it with some data and attempt to change its structure.

1. Create a table similar to the one below:

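A table along these lines works for the exercise (C/AL; the primary key and option values are my choice, only fields 4 and 6 matter later):

OBJECT Table 50000 Upgrade Sample
{
  FIELDS
  {
    { 1 ;  ;Entry No.           ;Integer }
    { 4 ;  ;Field_3             ;Integer }
    { 6 ;  ;Status              ;Option  ;OptionString=Open,Released }
  }
  KEYS
  {
    {    ;Entry No.             ;Clustered=Yes }
  }
}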

2. Create a codeunit to populate the newly created table:

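A sketch of the populate logic (C/AL, in the codeunit's OnRun trigger, with i : Integer and UpgradeSample : Record 50000):

FOR i := 1 TO 100 DO BEGIN
  UpgradeSample.INIT;
  UpgradeSample."Entry No." := i;
  UpgradeSample.Field_3 := i;
  UpgradeSample.Status := UpgradeSample.Status::Open;
  UpgradeSample.INSERT;
END;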

3. Let's change the structure of the table as follows:

  • ID 6 Status (Option) -> ID 20 Status (Option) [Update1]
  • ID 4 Field_3 (Integer) -> ID 4 Field_3(Code10) [Update2]
  • ID 6 Description (Text30) (new field) [Update3]

Check the current status of the table pre-upgrade in Management Studio.

4. If we try to save the object with schema synchronization "Now - with validation" after [Update1] and [Update2], we get a synchronization error.

If we try to save the object after [Update1], [Update2] and [Update3], we get another error.

Notice how the system reports only the last update for field 6; I assume this is because [Update1] was not committed.

5. Let’s now create the upgrade codeunit:

Start by setting the Subtype property of the codeunit to Upgrade.

The codeunit consists of two functions:

  • A function that backs up the data from our table into a temporary table. [My temporary table is pretty much a copy of the original table, but if you are not changing many fields, I suggest keeping in your temp table just the primary key fields and the fields that are affected.] Set the FunctionType property of this function to TableSyncSetup.
  • A function that populates the new structure with the data from the temporary backup table, something like a "forced" schema synchronization with a data restore from backup. Set the FunctionType property of this function to UpgradePerCompany.
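A sketch of the two functions (C/AL; "Upgrade Sample UPG" is my name for the temporary copy of the table, created with the old field IDs and types):

[TableSyncSetup]
PROCEDURE SyncTables(VAR TableSynchSetup : Record "Table Synch. Setup");
VAR
  DataUpgradeMgt : Codeunit "Data Upgrade Mgt.";
BEGIN
  // Back up the data: Mode::Move moves the rows into the upgrade table before the schema change
  DataUpgradeMgt.SetTableSyncSetup(DATABASE::"Upgrade Sample", DATABASE::"Upgrade Sample UPG", TableSynchSetup.Mode::Move);
END;

[UpgradePerCompany]
PROCEDURE RestoreData();
VAR
  UpgradeSample : Record "Upgrade Sample";
  UpgradeSampleUPG : Record "Upgrade Sample UPG";
BEGIN
  // Restore the backed-up rows into the new structure
  IF UpgradeSampleUPG.FINDSET THEN
    REPEAT
      UpgradeSample.INIT;
      UpgradeSample."Entry No." := UpgradeSampleUPG."Entry No.";
      UpgradeSample.Field_3 := FORMAT(UpgradeSampleUPG.Field_3); // Integer -> Code10
      UpgradeSample.Status := UpgradeSampleUPG.Status;           // old field 6 -> new field 20
      UpgradeSample.INSERT;
    UNTIL UpgradeSampleUPG.NEXT = 0;
  UpgradeSampleUPG.DELETEALL;
END;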

6. Save the object (with schema synchronization "Now - with validation"). No errors should be encountered at this point if the upgrade codeunit was created.

7. Start the upgrade from the development environment (Tools -> Data Upgrade -> Start…) or, if you prefer PowerShell, run in the Administration Shell: Start-NAVDataUpgrade [InstanceName]

8. Check the content of the table.

I have used upgrade codeunits in day-to-day tasks (recently, changing the data type of a field from Integer to Code) and in upgrade projects where existing fields had to be relocated to different IDs.

When the data upgrade process is suspended, you can identify potential errors in your upgrade codeunit by running a PowerShell command: Get-NAVDataUpgrade [InstanceName] -ErrorOnly

But it’s not possible to step through the codeunit with the debugger … or, to be more precise, I couldn’t find how.

Sample code here.