How to get disk usage per folder

Recently I received a question through the support ticketing system on how to get a listing of all folders inside the c:\Users folder and their sizes.

The sys admin who asked the question had hundreds of folders in C:\Users and did not want to do “right click” + Properties on each one.

I started digging a bit and almost ended up writing my own little .NET app, but…

Fortunately, I found this gem: Sysinternals Disk Usage (du):

https://docs.microsoft.com/en-us/sysinternals/downloads/du

I downloaded it and gave it a try.

What I needed was a breakdown of all folders inside the C:\Users folder so that the admin could see which user takes up the most space on an RDS server.

I used this command: “du -q -l 1 c:\Users”
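If downloading the Sysinternals tool is not an option, a plain PowerShell sketch (a different approach, not du.exe) can produce a similar per-folder breakdown; the path is just the one from this scenario:

Get-ChildItem -Path 'C:\Users' -Directory | ForEach-Object {
    # Sum the size of all files under each user folder (access errors are skipped).
    $bytes = (Get-ChildItem -Path $_.FullName -Recurse -File -ErrorAction SilentlyContinue |
              Measure-Object -Property Length -Sum).Sum
    [pscustomobject]@{ Folder = $_.FullName; SizeMB = [math]::Round($bytes / 1MB, 1) }
} | Sort-Object SizeMB -Descending

It is slower than du on large profiles, but it requires nothing to be installed.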

Hope this helps you!

Say Hello to the new “Performance Profiler”

While going through what is new in BC 2022 wave 1, I found a cool tool.

Here comes the new Performance Profiler. While many have implemented telemetry and gained some insight into their processes, this is a step forward from Microsoft in bringing telemetry tools into the BC user interface.

Now, the concept of a profiler is nothing futuristic; it has been implemented on many platforms, like Java profilers, SQL Server Profiler, or even the MS Edge profiler.

A profiler is a code monitoring tool, a tool that helps trace, recreate, and troubleshoot various issues in code.

Nevertheless, seeing a profiler in Business Central is so refreshing.

Let’s do a quick test to see how it works at a high level.

I added an action on the Customer List page that does nothing.

Well, it does do something: it waits for 5 seconds before ending.

pageextension 50100 CustomerListExt extends "Customer List"
{
    actions
    {
        addlast(Reports)
        {
            action(TestProfiler)
            {
                ApplicationArea = All;

                trigger OnAction()
                var
                    _DateTime: DateTime;
                begin
                    _DateTime := CurrentDateTime();
                    _DateTime := CreateDateTime(DT2Date(_DateTime), DT2Time(_DateTime) + 5000);
                    while (CurrentDateTime() < _DateTime) do
                        Sleep(1000);
                end;
            }
        }
    }
    trigger OnOpenPage();
    begin
        Message('App published: Hello world');
    end;

}

I published it; now let’s see if it is captured by the Performance Profiler.

As the documentation recommended, I opened the Performance Profiler in its own page.

In the main Business Central browser window, go to the Customer List and locate the new action:

On the Performance Profiler, click on the Start action.

Then launch the TestProfiler action.

When control returns to the main BC browser page, head over to the Performance Profiler page and click the Stop action. And here are the results:

We can see 2 regions:

  • one that shows what application extensions were detected between Start and Stop.
  • a second one, a technical part-page, with 2 sub-lists:
    • one showing the time spent by application object
    • second showing a call tree with timing for each branch

A very cool feature, and one I wish I had had in all previous BC versions.

How was this designed?

The main Performance Profiler page is based on page 24 “Performance Profiler” in the Microsoft System Application.

Lots of other objects are involved in bringing this new feature to life:

Take a look below to compare the Microsoft System application 19.5 versus 20.

With BC 2022 wave 1, the system application comes with a set of new objects to support Profiling.

All in all, Performance Profiler is a great addition to the Business Central platform.

This will help consultants locate faulty or slow code, record it, and download and send the recording to the authors of the poorly performing code.

Knowing your Business Central data using the Page Inspection

With the promotion of Microsoft’s ERP Dynamics NAV, now Business Central, as the only Microsoft SMB ERP in the cloud, the community gained a significant number of new users coming from other products.

The introduction of the Cloud Migration and Data Migration features in Business Central allowed delivery teams in the partner space to bring into BC SaaS many end users from Dynamics GP or SL, as well as from other, non-Microsoft ERPs like QuickBooks.

Those who consider themselves new to Business Central might find it difficult to navigate and find what they need in their daily work with the new ERP.

This is when knowing about the Page Inspection page comes in handy.

To enable the Page Inspection pane, you can either:

  • press CTRL + ALT + F1, or
  • navigate to “?” in the top right corner -> click on Help and Support -> look for the “Inspect Pages and Data” link:

Once enabled, Page Inspection appears as a vertical frame in the browser window and allows users to see the components of each page.

Click on various page components and notice how the Page Inspection updates. Look for:

  • the fields included in that component
  • the extensions that touched the current page/component
  • existing filters for that component/page

Example:

Enabling Page Inspection on the Business Manager Role Center and selecting the frame in the middle uncovers valuable information:

The frame is actually a card page based on the BC table “Activities Cue”, and the card page itself is page 1310 “O365 Activities”.

Some users might ask:

  1. What are the records stored in this table?

To see the records there are a few options:

  • In the Page Inspection pane, click on the “View Table” link. This will open the default page for table 1313.
  • Another way of displaying all records in a table is to go to the Table Information page, search for table 1313, and click on the “No. Of Records” flow field.
  • Ultimately, users can run the table straight from the URL: copy the link up to and including the environment name (in my example below, everything up to “Production”), then add “/?table=1313”.
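Assembled, that URL looks roughly like this (the tenant GUID and environment name are placeholders):

https://businesscentral.dynamics.com/[tenant-guid]/Production/?table=1313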

2. What data is available for each record?

There are three Tabs at the top:

  • Table Fields
    • use the magnifying glass (under Page Filters) to filter the fields by name or value
    • not all fields available in the table are displayed on the page. You can try to personalize your page or ask your partner to help you. To enable a field for all users within one profile, you might want to ask your admin to customize the page.
  • Extensions
    • custom code affects tables and pages; here you can easily see which extensions have touched the current table/page

Note: you can easily detect which fields belong to which extension, as new extensions are required to include a prefix or suffix for each field/action/page/table.

  • Page Filter: filters are often used to display the data in Business Central. This tab gives good clues about how the data has been filtered:

If you do not see the details you’re expecting in the Page Inspection pane, you probably do not have the right permissions. Talk to your admin and ask them to give you the D365 Troubleshoot permission set or user group.

For Microsoft’s official documentation on Page Inspection, read this.

5 new features Business Central admins need to know

Microsoft keeps adding new features to all facets of Business Central, including the admin center.

The community has access to the BC Ideas portal to signal its wishes to Microsoft, and Microsoft delivers.

If you want to contribute to the future of Business Central, add your ideas to http://aka.ms/bcideas

1. How to access Admin URL

Well, this is not something new, but admins still need to know how to access the BC admin center.

https://businesscentral.dynamics.com/[tenant-guid]/admin

Example:

2. Copy environments

A) Sandbox to Production

After testing your sandbox, you can copy it into a production environment.

Click on a Sandbox environment.

Click on the Copy action:

A new dialog appears:

Enter the new environment name and the type of the environment, in this case Production, and then click on Copy.

B) Production to Sandbox

Navigate to the environment home page and click on a Production environment.

In the next screen, pick a name for the new sandbox and click on Copy:

Confirm the operation:

The copy is scheduled:

and later shows as Copying:

Note 1: you can also perform these 2 actions programmatically via APIs.

Note 2: you can clean up or prepare data via 2 new events; see the sketch below:
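As a rough sketch only, subscribing to one of those environment-copy cleanup events could look like the code below; the publisher codeunit and event name (“Environment Cleanup”, OnClearCompanyConfig) are assumptions from memory and must be verified against your System Application version:

codeunit 50115 "Env Copy Cleanup Demo"
{
    // Assumed publisher: codeunit "Environment Cleanup" in the System Application.
    // Verify the exact codeunit name, event name, and signature before relying on this.
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Environment Cleanup", 'OnClearCompanyConfig', '', false, false)]
    local procedure ClearCompanyConfigOnCopy(CompanyName: Text)
    begin
        // Blank outbound endpoints, disable job queue entries, scrub API keys, etc.
        // for the given company after the environment copy completes.
    end;
}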

3) Restrict access to a specific environment to only certain users (part of a security group)

A) Create security group in Azure AD or Microsoft 365 admin center

  • Open the Microsoft 365 admin center
  • Navigate to Active teams & groups
  • Click Security
  • Select the Add a group action
  • Add owner(s) and member(s)

B) Assign security group to BC environment

Admin users will still be allowed into all these environments.

To restrict an environment to admins only, assign a security group with 0 members; in that case only admins have access.

4) Environment operations are now captured under Environment Operations:

Navigate to Operations:

5) Restart environments:

  • Open the environment
  • Click on Sessions
  • Click on Restart Environment

6) Update apps in each environment

If you have apps installed in your environments and these apps have updates in AppSource, then starting with BC 2021 wave 2 you can manage the apps and their upgrades from the admin center.

  • Click on one environment link
  • Choose Apps action

If the Available Update Action column shows “Action required”, click on it and go through the upgrade flow.

Handy Date-Time Dialog in Business Central

Did you ever need a DateTime field in your Business Central extension?

I recently added a DateTime field in one of my customer extensions and wanted to allow users to record not only a date, but also a time.

The solution is not difficult, even if we need to write our own code.

But why not use existing code in Base Application?

Look at this sample code:

table 50110 "Sample DateTime"
{
    DataClassification = CustomerContent;

    fields
    {
        field(1; MyDateTime; DateTime)
        {
            DataClassification = CustomerContent;

            trigger OnLookup()
            var
                DateTimeDialog: Page "Date-Time Dialog";
            begin
                DateTimeDialog.SetDateTime(RoundDateTime(MyDateTime, 1000));
                if DateTimeDialog.RunModal() = Action::OK then
                    MyDateTime := DateTimeDialog.GetDateTime();
            end;
        }
    }
}

When you run a page exposing this field, click on the “…” (lookup) button to open the Date-Time Dialog page:

The code above is using Page 684 “Date-Time Dialog”.

If you want to see how Microsoft designed the page, check this.

And if you want to see how Microsoft implemented it in Base Application check table 472 “Job Queue Entry”, fields “Expiration Date/Time” and “Earliest Start Date/Time”.

Look for all pickable DateTime fields in your solutions and refactor them to use the “Date-Time Dialog” page.

Hope this helps!

Using Postman to test OAuth 2.0 authorization to Business Central restful API

Recently I have been involved in projects migrating Dynamics GP customers to BC (SaaS).

All these GP customers had integrations developed over the years, integrations that now need to be re-targeted to BC.

Now, if you are new to Business Central API integration, you need to know that some authoritative bloggers have touched on OAuth 2.0 in the last 6 months with very useful how-tos. Have a look at Stefano‘s or Roberto‘s blogs. The most comprehensive writings in this specific niche, though, I find to be A.J. Kauffmann‘s.

A.J.’s blogs are meticulous. Moreover, they seem to arrive almost in step with my work requirements, so… yeah… I found them very useful.

As you have probably heard or seen, Basic Authentication is no longer supported for BC online.

The only option now (at least for BC online) is OAuth2 authorization.

How do we start with setting up OAuth2?

Well, I won’t go into that, because A.J.’s blog is immaculate; I didn’t have anything to add, so I won’t add anything.

To preserve the flow of this blog, all I have to say is that you need to:

  • Register the external application in Azure Active Directory
  • Create the external application account in Business Central
  • Grant consent

Once these 3 steps are completed we can move to Postman.

Get a Token

In Postman, add the following request to generate a security token:

POST https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token

Replace {tenant_id} with your target tenant.

If you don’t know your tenant_id, navigate to your Partner Center and launch the Azure portal:

Then, in the Overview blade, look for Tenant ID:

In Postman, your POST request should look like this:

For client_id use the Application Client ID in the picture below:

For client_secret use the client secret’s Value from the screen below (note that it is the secret’s Value, not its Secret ID, that goes into the request):

Under the Body tab, add the following elements:
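For the client credentials flow, the x-www-form-urlencoded body typically carries these keys (the values are placeholders, except the scope, which is the standard Business Central API scope):

grant_type      client_credentials
client_id       {application (client) ID from the Azure AD app registration}
client_secret   {client secret value}
scope           https://api.businesscentral.dynamics.com/.default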

Save the request and Send.

The system will generate a new token:

Now, with the token, let’s execute a few API requests.

Under Authorization for a new request (or better, create a folder and use “Inherit from parent” for all requests under the new folder) add the following:

In the field under Available Tokens, copy and paste the value of the “access_token” element from the previous response.

Test BC APIs

  1. The request for all restful API entities could look like this (a consolidated sketch of requests 1-6 follows after this list):

2. The request for all companies looks like this:

3. For all customers in a specific BC company:

4. For inserting a new customer in a specific company:

5. To update a specific customer:

Note the If-Match header. This should reference the most recent state of the object you are updating.

In your code, first GET the object (e.g. the customer), make note of the odata-etag value, then use that value in the If-Match header to PATCH (update) the customer.

In the body, include the fields that you want to update:

6. You could also delete a customer:
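A rough sketch of what requests 1-6 above typically look like against the standard BC API v2.0 endpoints (tenant ID, environment name, company ID, and customer ID are placeholders):

1) GET    https://api.businesscentral.dynamics.com/v2.0/{tenant_id}/{environment}/api/v2.0/
2) GET    https://api.businesscentral.dynamics.com/v2.0/{tenant_id}/{environment}/api/v2.0/companies
3) GET    https://api.businesscentral.dynamics.com/v2.0/{tenant_id}/{environment}/api/v2.0/companies({company_id})/customers
4) POST   https://api.businesscentral.dynamics.com/v2.0/{tenant_id}/{environment}/api/v2.0/companies({company_id})/customers
          (JSON body with the new customer fields)
5) PATCH  https://api.businesscentral.dynamics.com/v2.0/{tenant_id}/{environment}/api/v2.0/companies({company_id})/customers({customer_id})
          If-Match: {odata-etag value from a prior GET}
          (JSON body with the fields to update)
6) DELETE https://api.businesscentral.dynamics.com/v2.0/{tenant_id}/{environment}/api/v2.0/companies({company_id})/customers({customer_id})

Every request also needs the Authorization: Bearer {access_token} header (or the authorization inherited from the folder) and, where a body is sent, Content-Type: application/json.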

With the APIs tested in Postman, you can now translate your BC requests to any programming platform.

Business Central: Word Layout date fields formatting

One of my co-workers asked how we can format a date field in an MS Word layout document so that it shows only the date and not the time.

To get started with adding fields to a Word Report Layout start with this:

How to Add Fields to a Word Report Layout – Business Central | Microsoft Docs

Let’s say you added a date field, CustomDate, to a BC report.

When you run the report, the date field renders on the report as a DateTime: 30-06-2021 12:00:00.

How do we force the report to display only the date: 30-06-2021?

Let’s look at a simple example: you want to add the DueDate_SalesInvHeader field from the dataset to report 206, Sales Invoice, in a custom Word layout.

If you followed the link above, you might now have a control on your document that looks like this:

If this field is coming from AL with Date and Time components, how do we format it so it only shows the Date?

In RDLC you would go to the properties of the text box that is hosting your datetime field and change the Format property to “d”.

How about Word Layouts?

It’s quite simple, but not easy to find in the documentation.

You would change the date field in the layout to look like this:

You can change the formatting to match your region; in the US: MM/dd/yyyy.

There is always the option of writing an extension and sending the layout an already formatted date field as text, as sketched below.
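As a sketch of that alternative, a report extension could add an already formatted text column to the dataset; the dataitem name (“Sales Invoice Header”) and the assumption that report 206 is extensible in your version should be verified:

reportextension 50120 "Sales Invoice Due Date Ext" extends "Sales - Invoice"
{
    dataset
    {
        add("Sales Invoice Header")
        {
            // Already formatted as text, so the Word layout just prints it as-is.
            column(DueDateFormatted; Format("Due Date", 0, '<Day,2>/<Month,2>/<Year4>'))
            {
            }
        }
    }
}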

Move your Blobs out of the database and into Media and Media Set data type fields

“Business Central customers can use up to 80 GB of database storage capacity across all of their environments (production and sandbox), meaning that the sum of database capacity usage of all of their environments must not exceed 80 GB” – Microsoft Docs

One way to keep your SaaS customers database size in check is by “Migrating BLOB data types to Media or MediaSet – Data in Media or Media set data types aren’t counted in the database limit. As an extension developer, consider migrating data from blobs to the Media or MediaSet datatypes for your own extensions” – per Microsoft Documentation.

Said and done.

Let’s create a new table that contains a MediaSet field.
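A minimal sketch of such a table could look like this (object names and numbers are just examples):

table 50111 "Attachment Demo"
{
    DataClassification = CustomerContent;

    fields
    {
        field(1; "No."; Code[20])
        {
            DataClassification = CustomerContent;
        }
        field(2; Pictures; MediaSet)
        {
            // MediaSet content is stored in the tenant media system tables,
            // so it does not count against the database storage limit.
        }
    }

    keys
    {
        key(PK; "No.")
        {
            Clustered = true;
        }
    }
}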

In a list page for the above table, I displayed the MediaSet field in a factbox, together with 3 actions:

  • Export
  • Import
  • Delete

The code in each action was based on the Customer Image factbox, but adapted to use the “Tenant Media” table from the Microsoft System application.
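As a rough idea only (not the exact code behind the link below), a list page over the sketched table with just the Import action could look like this; it relies on UploadIntoStream and the MediaSet ImportStream method:

page 50112 "Attachment Demo List"
{
    PageType = List;
    SourceTable = "Attachment Demo";
    ApplicationArea = All;
    UsageCategory = Lists;

    layout
    {
        area(Content)
        {
            repeater(General)
            {
                field("No."; Rec."No.")
                {
                    ApplicationArea = All;
                }
            }
        }
    }

    actions
    {
        area(Processing)
        {
            action(ImportPicture)
            {
                ApplicationArea = All;
                Caption = 'Import';
                Image = Import;

                trigger OnAction()
                var
                    InStr: InStream;
                    FileName: Text;
                begin
                    // UploadIntoStream shows a file picker in the browser and streams the chosen file.
                    if UploadIntoStream('Select a file to import', '', '', FileName, InStr) then begin
                        // ImportStream stores the content as tenant media (outside the database quota).
                        Rec.Pictures.ImportStream(InStr, FileName);
                        Rec.Modify(true);
                    end;
                end;
            }
        }
    }
}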

Get the code from here.

Documentation:

Views and Queries in Business Central

While reviewing options to alter existing queries, I stumbled upon the fact that queries no longer need a new page to display results; they can be connected to existing list pages.

Below, highlighted, are all queries connected to the Customer List page.

For example, the “Blocked Customers” query is located in the _Exclude_Microsoft Dynamics 365 SmartList extension.

The source code can be found in Microsoft’s open-source GitHub repository for BC:

The way we connect a query with a list page is through the QueryCategory property, as described in more detail here and here.

Moving on to views now: we can create a view at run time by activating the filter pane and saving the filters:

But what’s cool is that you can also create a view in code.

For example, if you develop a solution and you want your extension to come with already defined views, then here is what you would do:

  1. Create a new profile:

In VS Code, create a new al file and include this code:

profile CUSTOM_VIEWS_QUERIES
{
    Caption = 'CUSTOM VIEWS and QUERIES';
    RoleCenter = "Business Manager Role Center";
    Customizations = TestView;
}

2. Create a new page customization object:

pagecustomization TestView customizes "Customer List"
{

    views
    {
        addfirst
        {
            view(View1)
            {
                Caption = 'Test View 1';
                Filters = WHERE("Balance" = filter(.. 100));
            }
        }
    }
}

Compile the project containing the new AL file, and publish the extension.

In BC, switch to the new profile, CUSTOM_VIEWS_QUERIES, and we can now see 2 views under Customers:

  • Bobs -> created at run time
  • Test View 1 -> created via code

For creating views at design time, read more here.

Lastly, let’s add a custom query to the Customers List.

In your AL project, create a new AL file and create a new query.

query 50100 Test_Query
{
    QueryType = Normal;
    QueryCategory = 'Customer List';
    Caption = '__Test Query';
    TopNumberOfRows = 10;
    elements
    {
...

To make sense on the Customer List page, the chosen dataitem should contain the primary key of the table backing the list, meaning Customer.“No.”.

If you don’t want to go through creating a new query, just copy an existing query on the Customer List, like Blocked Customers, and replace the top part with the code above; a complete minimal example follows below.
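For reference, a complete minimal query in that spirit could look like the sketch below (the chosen fields and the OrderBy are just an example):

query 50101 "Top Balance Customers"
{
    QueryType = Normal;
    // QueryCategory is what surfaces the query in the Customer List drop-down.
    QueryCategory = 'Customer List';
    Caption = 'Top Balance Customers';
    TopNumberOfRows = 10;
    OrderBy = descending(Balance_LCY);

    elements
    {
        dataitem(Customer; Customer)
        {
            // Include the primary key so the results relate cleanly to the Customer List.
            column(Customer_No; "No.") { }
            column(Name; Name) { }
            column(Balance_LCY; "Balance (LCY)") { }
        }
    }
}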

Build and publish. The new query is now displayed in the Customers drop-down list of queries:

In the Dynamics NAV days, we would have exposed queries by adding an action that ran the page backing the query, or by adding that page to the MenuSuite.

Now, with one line of code (the QueryCategory property), we allow a query to be executed from all the list pages named in its QueryCategory.

As we saw, views can now be coded, which makes your extension more versatile and easier to use.

Awesome job, Microsoft.

New-BcContainerWizard generates scripts to build NAV or Business Central containers

Over the last few months I have found myself learning more and more about Docker and the Microsoft PowerShell libraries that get you a running instance of NAV or Business Central in a local container.

I am going to investigate 3 ways to get a running NAV/BC container.

1. Using the pure docker command

You can start with Docker here.

Example of using docker to create first BC container:

docker run -e accept_eula=y mcr.microsoft.com/businesscentral/sandbox

This command sets an environment variable local to the new container, pulls the specified image (mcr.microsoft.com/businesscentral/sandbox) if it is not already present, and runs it.

You could also run the container based on a Business Central image for Windows Server 2019. This is a lighter image than the previous one, which targets the Windows 10 OS.

docker run -e accept_eula=y mcr.microsoft.com/businesscentral/sandbox:ltsc2019

To check the size of the images downloaded run from the command prompt:

docker image list

If you want to delete some of your images run for each the following:

docker image rm 0d -f

You can specify the full image ID or just its first few characters, as long as they are unambiguous (here, 0d).

With the container running, you can open VS Code and install the AL Language extension using the ALLanguage.vsix file from the location displayed in the container log:

http://e8d9bbb19805:8080/ALLanguage.vsix

If you have trouble using the DNS name, something probably went wrong when the hosts file was written, but you can always use the container’s IP address instead.

Now you should be able to connect to your container and start writing extensions.

2. Using Microsoft’s NavContainerHelper module, more specifically the “New-NavContainer” command:

New-NavContainer -accept_eula -containerName "firstcontainer" -auth Windows -imageName mcr.microsoft.com/businesscentral/sandbox:ltsc2019

While the previous command could be launched from the command prompt (with Docker running), you launch the above command from a PowerShell ISE prompt (run as admin). This will pull the latest Business Central image for Windows Server 2019. If you run “docker image ls” you will notice this is a lighter image.

You can connect to that instance and write extensions by running VS Code and installing the vsix file that comes with the container.

3. Using Microsoft’s BcContainerHelper module.

Microsoft’s latest module, BcContainerHelper, has a New-BcContainerWizard command. This command generates a PowerShell script that, when run, creates a NAV/BC container.

To gain access to the wizard, first install the new BcContainerHelper module. When running “Install-Module -Name BcContainerHelper” I got an error:

Then I added the “-AllowClobber” parameter and the module installed successfully.

Install-Module -Name BcContainerHelper -AllowClobber

Once BcContainerHelper was installed, I had access to New-BcContainerWizard:

Let’s launch it and install a container loaded with a NAV 2017 CU5 image:

  1. Accept Eula:

Choose Y.

2. Next we need to choose between a locally stored container or a container stored in Azure:

The default is Local and that’s what I chose.

3. For the authentication step you have a few options: either username + password, or Windows. I chose Windows:

4. Name the container:

5. Version: latest BC (OnPrem or Sandbox) or a specific version.

We are going to install a container with an OnPrem image of NAV 2017 CU5. For a list of sandbox and onprem images, consult these links:

https://mcr.microsoft.com/v2/businesscentral/sandbox/tags/list

https://mcr.microsoft.com/v2/businesscentral/onprem/tags/list


6. Country


I chose NA.

7. Test toolkit?

10. License (No, a local file, or an https URL for downloading the license file)

11. Database: you can leave the new container loaded with the Cronus database in a SQL Server Express instance local to the container, load a database with the correct version into that SQL Server Express instance, or use a SQL Server other than the container’s. I chose the default, but planned to connect to the SQL Server Express instance and restore a backup later, once the container was created and running.

12. DNS settings:

13. Memory Limit:


I left the default.

14. CAL Development:


As it is a NAV 2017 image I chose Yes.


Name and save your PowerShell script:


The window closes and PowerShell ISE opens with the new script:


Now you can distribute the script to your team. If you want to generate the new container, press F5.

In the script, please note the new artifactUrl parameter. More on this new parameter here.
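The generated script differs per version and per the choices made above, but its core typically boils down to resolving an artifact URL and handing it to New-BcContainer. The values below mirror this walkthrough, and the exact -cu notation for Get-NavArtifactUrl should be checked against the BcContainerHelper documentation:

# Resolve the artifact URL for NAV 2017 CU5, NA localization (check the -cu value format in the docs).
$artifactUrl = Get-NavArtifactUrl -nav 2017 -cu cu5 -country na

# Create the container from that artifact, roughly as the wizard-generated script does.
New-BcContainer `
    -accept_eula `
    -containerName "nav2017cu5" `
    -auth Windows `
    -artifactUrl $artifactUrl `
    -includeCSide `
    -updateHosts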

After 835 seconds the container was installed. However, during the last steps (exporting shortcuts to the host desktop), the script encountered an error:


In the version of BcContainerHelper I was working with (1.0.1), Export-BcContainerObjects was not included. In fact, the command should have been Export-NAVContainerObjects.

I created an issue on GitHub, and the next day I got an email from Freddy saying this was a bug and the fix would ship in BcContainerHelper 1.0.2-preview153.

Once the new release was available, I re-installed the module and was able to finish running the script.

I needed a current version of the database instead of the installed Cronus database, so I connected to the container’s SQL instance via SSMS and restored the database. Now development can take place in the container.

More information can be found on Freddy‘s and Steve‘s blogs.