Business Central On-Premise installation: hardware and software requirements and recommendations

Standard

A list of system requirements for Business Central On-Premise is readily available from Microsoft Docs here.

The only issue is, these are bare minimum requirements.

How do we know what level of hardware will be enough to guarantee good performance not only at deployment, but months and years down the line?

Once the decision between Business Central On-Premise and Business Central in Microsoft's cloud is behind them, end-users and VARs alike face decisions regarding the server type: should it be a dedicated server on the end-user's premises, or virtual servers?

If going down the virtual server path, end-users can choose between a self-managed, self-hosted virtualization system (using Hyper-V or similar solutions) or a cloud provider (one of them being Microsoft's Azure platform).

The minimum requirements allow for an installation of the standard product. But if the company is using a different application layer (standard + add-ons), or if the installation needs to accommodate various extensions, the minimum requirements won't be enough.

For example, in my recent experience with two VARs with products for the food vertical, the Microsoft application extension has been replaced with code that includes both the Microsoft application and the food application code. There is a high chance that the minimum requirements won't be satisfied by the modified base layer of the application. And with additional extensions installed on top of the base layer, our needs move further and further away from the bare minimum requirements.

To research Business Central requirements, we first need to understand the architecture of the product. Business Central runs on a three-tier architecture, as seen on this Microsoft Docs page:

Each component in this model comes with its own hardware/software recommendations.

More often than not, I have seen the Web Server and the Business Central Server side by side on the same server.

Quite often, especially for installations with fewer than 25 users, I've seen SQL Server, NAV Server, and Web Server all installed on one machine or virtual machine.

For installations of 25 users or fewer, the topology I've seen working quite well is this:

  • Business Central Server + Web Server on one machine
  • SQL Server on a different machine

Business Central Server and Web Server

Specification | Minimum | Recommended
Memory | 16 GB | >= 32 GB
Processor | 1 quad-core CPU | 2 x 16 CPU cores
Disks | SAS or SSD drives, configured with RAID1

For a list of Operating Systems required by Business Central Server visit Microsoft Docs recommendations for Operating System.

SQL Server

Specification | Minimum | Recommended
Memory | 32 GB | >= 64 GB
Processor | 1 quad-core CPU | 2 x 16 CPU cores
Disks | SAS or SSD drives:
– OS: RAID1
– Data drive: RAID1
– Log drive: RAID1/RAID10
– Master/TempDB: RAID1
– Backup drive: RAID1

Most SQL Server installations use SQL Server Standard.

While I've seen installations of Microsoft Dynamics NAV and Business Central on SQL Server Express, SQL Server Express should only be deployed for non-production use such as test or development environments. It is not fit as a live environment's production server.

For a list of SQL Server OS recommendations visit Microsoft Docs page.

The client tier of the three-tier architecture includes:

  • Web Clients
  • Business Central mobile app
  • MS Office Applications (needed for integration of BC with Office products)
  • AL Development workstations.

Please visit Microsoft Docs for each of these additional components' operating system and other required software recommendations.

MB-800 Business Central Functional Consultant exam: study materials

Standard

As a Business Central developer I don't get to set up Business Central standard processes every day; I mostly design and set up the processes in my customization work. Standard Business Central setup is covered by functional consultants.

The exam MB-800 has been active for about a year (beta version started in October 2020) and given my developer experience with setting up the system I thought I’d give it a try. This blog contains materials I found, read, and tested; hopefully it will provide a good starting point for study for others.

This exam tests your skills on setting up Business Central SaaS. It is easy to get a trial of BC SaaS, which can be used as training for this exam. And if you need more time, you can extend your trial by another 30 days. Just navigate to https://businesscentral.dynamics.com/?page=1828 and extend your trial. If 60 days is not enough, you can start a new trial.

You should start with Exam MB-800: Microsoft Dynamics 365 Business Central Functional Consultant – Learn | Microsoft Docs.

That page offers:

For example, one learning path is Set up financial management in Microsoft Dynamics 365 Business Central.

This path contains the following modules and lessons:

For all lessons targeting the Functional Consultant role, go to Browse all – Learn | Microsoft Docs.

If you want to take the Business Central Microsoft Docs with you when you are offline, download the Microsoft Docs BC PDF from:

Welcome to Microsoft Dynamics 365 Business Central – Business Central | Microsoft Docs

I sent the PDF to my Kindle. Unfortunately, the file is too big and the reading experience on the Kindle is not that great.

It would be great if Microsoft could split the big PDF into separate PDFs for each chapter.

Additional materials (with Danish roots, just like Business Central 😉):

Go ahead and schedule the exam! Good luck!

Move your Blobs out of the database and into Media and Media Set data type fields

Standard

“Business Central customers can use up to 80 GB of database storage capacity across all of their environments (production and sandbox), meaning that the sum of database capacity usage of all of their environments must not exceed 80 GB” – Microsoft Docs

One way to keep your SaaS customers' database size in check is by "Migrating BLOB data types to Media or MediaSet – Data in Media or MediaSet data types aren't counted in the database limit. As an extension developer, consider migrating data from blobs to the Media or MediaSet datatypes for your own extensions" – per Microsoft documentation.

Said and done.

Let’s create a new table that contains a MediaSet field.
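As a minimal sketch, something like this (the object ID, table name, and fields are hypothetical):

```al
table 50100 "Item Attachment"
{
    DataClassification = CustomerContent;

    fields
    {
        field(1; "No."; Code[20]) { }
        field(2; Pictures; MediaSet)
        {
            // Content stored in Media/MediaSet fields lives in the
            // Tenant Media system tables and is not counted against
            // the 80 GB database capacity limit
        }
    }

    keys
    {
        key(PK; "No.")
        {
            Clustered = true;
        }
    }
}
```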

In a list page for the above table, I displayed the MediaSet field in a factbox with 3 actions:

  • Export
  • Import
  • Delete

The code in each action was based on the Customer Image factbox, but adapted to use the "Tenant Media" table from Microsoft's System app.
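For illustration, a rough sketch of what the Import action's trigger might look like (variable names and the Pictures field are hypothetical; the standard factbox code follows the same pattern):

```al
action(ImportPicture)
{
    Caption = 'Import';
    Image = Import;
    ApplicationArea = All;

    trigger OnAction()
    var
        PictureInStream: InStream;
        FileName: Text;
    begin
        // Upload a file from the client and store it in the MediaSet field;
        // the content ends up in the Tenant Media system table
        if UploadIntoStream('Import', '', 'All Files (*.*)|*.*', FileName, PictureInStream) then begin
            Rec.Pictures.ImportStream(PictureInStream, FileName);
            Rec.Modify(true);
        end;
    end;
}
```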

Get the code from here.

Documentation:

Views and Queries in Business Central

Standard

While reviewing options to alter existing queries, I stumbled upon the fact that queries no longer need a new page to display results; they can instead be connected to existing list pages.

Below, highlighted, are all queries connected to the Customer List page.

For example, the "Blocked Customers" query is located in the _Exclude_Microsoft Dynamics 365 SmartList extension.

The source code can be found in Microsoft's open-source GitHub repository for BC:

The way we connect a query with a list page is through the QueryCategory property, as described in more detail here and here.

Moving on to views: we can create a view at run time by activating the filter pane and saving the filters:

But what's cool is that you can also create a view in code.

For example, if you develop a solution and you want your extension to come with already defined views, then here is what you would do:

  1. Create a new profile:

In VS Code, create a new al file and include this code:

profile CUSTOM_VIEWS_QUERIES
{
    Caption = 'CUSTOM VIEWS and QUERIES';
    RoleCenter = "Business Manager Role Center";
    Customizations = TestView;
}

2. Create a new page customization object:

pagecustomization TestView customizes "Customer List"
{

    views
    {
        addfirst
        {
            view(View1)
            {
                Caption = 'Test View 1';
                Filters = WHERE("Balance" = filter(.. 100));
            }
        }
    }
}

Compile the project containing the new AL file, and publish the extension.

In BC, switch to the new profile, CUSTOM_VIEWS_QUERIES, and we can now see 2 views under Customers:

  • Bobs -> created at run time
  • Test View 1 -> created via code

For creating views at design time, read more here.

Lastly, let’s add a custom query to the Customers List.

In your AL project, create a new AL file and create a new query.

query 50100 Test_Query
{
    QueryType = Normal;
    QueryCategory = 'Customer List';
    Caption = '__Test Query';
    TopNumberOfRows = 10;
    elements
    {
        // a minimal elements block; include the Customer primary key
        // so the results make sense on the Customer List page
        dataitem(Customer; Customer)
        {
            column(No; "No.") { }
            column(Name; Name) { }
            column(Balance_LCY; "Balance (LCY)") { }
        }
    }
}

For the query to make sense on the Customer List page, the chosen dataitem should contain the primary key of the table behind the list, meaning Customer."No.".

If you don't want to go through creating a new query, just copy an existing query on the Customer List, like Blocked Customers, and replace the top part with the code above.

Build and publish. The new query is now displayed in the Customers drop-down list of queries:

In Dynamics NAV times, we would have added queries by adding an action that executes the page supporting the query, or we would have added that page to the MenuSuite.

Now, with one line of code (the QueryCategory property), we allow a query to be executed from all the list pages defined in QueryCategory.

As we saw, views can now be coded, which makes your extension versatile and easy to use.

Awesome job, Microsoft.

Parsing RunRequestPage output using XML Buffer

Standard

RunRequestPage allows developers to record the request page settings of a Dynamics NAV/Business Central report without actually running the report. The output of this command is an XML string.

E.g.:

//XMLParameters: Text;

XmlParameters := REPORT.RUNREQUESTPAGE(50000);

What if we want to process the report differently under certain conditions explicitly defined through the report options? In that case we need to be able to parse the output of RunRequestPage.

Simple enough. One way is to load the string into a DotNet XMLDocument variable via LoadXml and use DotNet functions to get the values of the nodes.

If you want to avoid DotNet, you can use the "XML Buffer Writer" codeunit (1235) and the "XML Buffer" table (1235) in a codeunit called from an action.

XMLBuffer, XMLSpecialInterestNode : Record 1235;

XMLBufferWriter : Codeunit 1235;

First, we run the request page for report 50000. This opens the request page, allowing the user to set all options/filters. Once finished, click OK.

All the options/filters for the report are recorded in the string XmlParameters.

Secondly, we load the XML string into an XML structure inside NAV, using table 1235 and codeunit 1235. This is done via the function InitializeXMLBufferFromText from codeunit 1235.

We can then filter the entries and locate the option we are interested in.

In my case I had a report option "Run Later": if this option is true, I do a different type of processing than just running the report. Think in terms of what you could do with a report besides running it: keep track of run time, email the output…
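Putting the pieces together, here is a C/AL-style sketch using the variables declared above. Treat the 'RunLater' node name as a placeholder: the exact element and attribute names recorded in the XML depend on the report's request page, so inspect the XmlParameters string first and adjust the filters accordingly:

```al
XmlParameters := REPORT.RUNREQUESTPAGE(50000);

// Load the XML string into the XML Buffer structure
XMLBufferWriter.InitializeXMLBufferFromText(XMLBuffer, XmlParameters);

// Each element/attribute is now a record in XMLBuffer; filter for the option
XMLBuffer.SETRANGE(Type, XMLBuffer.Type::Element);
XMLBuffer.SETRANGE(Name, 'RunLater');  // placeholder node name
IF XMLBuffer.FINDFIRST AND (XMLBuffer.Value = 'true') THEN
  // alternative processing: schedule the run, email the output, log run time...
  ProcessReportLater(XmlParameters)    // hypothetical local function
ELSE
  REPORT.EXECUTE(50000, XmlParameters);
```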

 

New-BcContainerWizard generates scripts to build NAV or Business Central containers

Standard

Over the last few months I have found myself learning more and more about Docker and the Microsoft PowerShell libraries that get you a running instance of NAV or Business Central in a local container.

I am going to investigate 3 ways to get a running NAV/BC container.

1. Using pure docker commands

You can start with Docker here.

Example of using docker to create first BC container:

docker run -e accept_eula=y mcr.microsoft.com/businesscentral/sandbox

This command sets an environment variable local to the new container, pulls the specified image (mcr.microsoft.com/businesscentral/sandbox) if it is not already pulled, and runs it.

You could also run the container based on a Business Central image for Windows Server 2019. This is a lighter image than the previous one, which was for Windows 10.

docker run -e accept_eula=y mcr.microsoft.com/businesscentral/sandbox:ltsc2019

To check the size of the images downloaded run from the command prompt:

docker image list

If you want to delete some of your images run for each the following:

docker image rm 0d -f

You can specify the whole image ID or just enough of its leading characters to be unique (here, 0d).

With the container running, you can open VS Code and install the AL Language extension using the ALLanguage.vsix from the location displayed in the log:

http://e8d9bbb19805:8080/ALLanguage.vsix

If you have trouble using the DNS name, something must have gone wrong when writing the hosts file, but you can always use the IP of the container.

Now you should be able to connect to your container and start writing extensions.

2. Using Microsoft's module NavContainerHelper, more specifically the "New-NavContainer" command:

New-NavContainer -accept_eula -containerName "firstcontainer" -auth Windows -imageName mcr.microsoft.com/businesscentral/sandbox:ltsc2019

While the previous command could have been launched from the command prompt (with Docker running), you launch the above command from a PowerShell ISE prompt (run as admin). This pulls the latest Business Central image for Windows Server 2019. If you run "docker image ls" you will notice this is a lighter image.

You can connect to that instance to write extensions by running VS Code and installing the vsix file that comes with the container.

3. Using Microsoft’s module BcContainerHelper.

Microsoft's latest module, BcContainerHelper, has a command New-BcContainerWizard. This command generates a PowerShell script that, when run, creates a NAV/BC container.

To gain access to the wizard, first install the new module BcContainerHelper. When running "Install-Module -Name BcContainerHelper" I got an error:

Then I added the "-AllowClobber" parameter and the module installed successfully.

Install-Module -Name BcContainerHelper -AllowClobber

Once BcContainerHelper was installed, I had access to New-BcContainerWizard:

Let’s launch it and install a container loaded with a NAV 2017 CU5 image:

  1. Accept EULA:

 

Choose Y.

2. Next, we need to choose between a locally stored container and a container stored in Azure:

The default is Local and that's what I chose.

3. For the authentication step you have a few options: either username + password, or Windows. I chose Windows:

4. Name the container:

5. Version: latest BC (OnPrem or Sandbox), or a specific version.

We are going to install a container with an OnPrem image of NAV 2017 CU5. For a list of sandbox and onprem images, consult these links:

https://mcr.microsoft.com/v2/businesscentral/sandbox/tags/list

https://mcr.microsoft.com/v2/businesscentral/onprem/tags/list


6. Country


I chose NA.

7. Test toolkit?

10. License (No, a local file, or an https URL for downloading the license file)

11. Database: you can leave the new container loaded with the Cronus database in a SQL Express instance local to the container, load a database of the correct version into that SQL Express instance, or use a SQL Server other than the container's SQL Server. I chose the default, planning to connect to the SQL Express instance and restore a backup later, once the container is created and running.

12. DNS settings:

13. Memory Limit:


I left the default.

14. CAL Development:


As it is a NAV 2017 image, I chose Yes.


Name and save your PowerShell script:


The window closes and PowerShell ISE opens with the new script:


Now you can distribute the script to your team. If you want to generate the new container, press F5.

In the script, please note the new parameter artifactUrl. More on this new parameter here.

After 835 seconds the container was installed. However, during the last steps (exporting shortcuts to the host desktop) the script encountered an error:


In the version of BcContainerHelper I was working with (1.0.1), Export-BcContainerObjects was not included. In fact, the command should have been Export-NavContainerObjects.

I created an issue on GitHub, and the next day I found an email from Freddy confirming this was a bug and that the fix would ship in BcContainerHelper 1.0.2-preview153.

As the new release was available, I re-installed the module and was able to finish running the script.

I needed a current version of the database instead of the installed Cronus database, so I connected via SSMS to the container's SQL instance and restored the database. Now development can take place in the container.

More information on Freddy‘s and Steve‘s blogs.

Report design: show row only on the first page

Standard

One of the things I bumped into recently was showing a portion of an order confirmation (a portion of the header) only on the first page.

I tried a few things, the closest being to put all the controls that should appear on the first page in a rectangle, then control the rectangle via the Hidden property and the Globals!PageNumber property.

This worked. However, the real estate taken by the rectangle was still consumed on subsequent pages, so it wasn't useful.

The solution was to move all controls on the first row of the tablix located in the body.

Then, after opening the Advanced Mode of the grouping:

  • for the row that you want to display only on first page set the following properties:
    • keeptogether = false
    • keepwithgroup = none
    • repeatonnewpage = true
  • for all the other rows:
    • keeptogether=true
    • keepwithgroup=after
    • repeatonnewpage=true

Hope it helps!

How to get all Windows services when you know the name of the executable

Standard

Long time no blog 🙂

A few days ago I tried to stop a NAS on a NAV 2009 server.

The problem was that I did not know which service was running the NAS.

If you are in a similar situation, know that there is a PowerShell cmdlet that can be used to query Windows services. The cmdlet name is Get-WmiObject.

In NAV 2009 R2, the NAS executable was nassql.exe.

If I want to get all NAS services, I would run the following command in a Windows PowerShell session:

Get-WmiObject Win32_Service | Where-Object {$_.PathName -like "*nassql.exe*"} | Format-List -Property Name,Status,PathName

To list all NAV services, replace nassql with Dynamics.NAV.Server.exe.

Have fun troubleshooting 🙂

 

Fastest way to get entities into Dynamics NAV or Business Central (and not only)

Standard

It's 4 pm. To my surprise, a Skype call from one of the customers I usually talk to maybe once a month. She cut the niceties quite abruptly: "Look, I have a list of 100 customers and I need it in production asap. I have 15 fields with data for each new customer. Can you do it today before 5?"

This is the context of this blog post: how do we inject new entities into NAV (and not only) in the fastest way (under one hour)?

A few weeks ago I engaged a few of my peer developers, not just NAV developers, on what they usually do in this type of scenario.

Some of their answers were really good and could be applied in Dynamics NAV or Business Central.

One of the answers was to ask the customer to enter it manually 🙂


That is indeed one way, but I'm not sure my customer would have been willing to do it, and under one hour was out of the question.

Another answer was to “quickly” write an integration tool to take the data from the original system and into the target system.


Some of the answers I recall: “That’s crazy!” or “You have a list!” or “Under one hour, please!”…

Another idea was to manipulate the list, residing in an Excel file, in such a way that Excel generates the code required to insert the records, in the language of your choice (C/AL, AL, C#). Once generated, copy it from the Excel worksheet straight into the Run trigger of a codeunit (or any other method) and execute that method. For Business Central, create a new extension extending only the Customer List page with one action, "Import Customers", and drop the code generated in Excel into that action's OnAction trigger. Install the extension, run the action, uninstall the extension. I have personally used this method at least a dozen times in my career, in different environments including NAV. It's fast, it's dirty, and it does the job 🙂
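As an illustration, assuming the Excel sheet generates AL, the pasted result might look like this (the procedure name, customer numbers, and field values are all hypothetical):

```al
// Hypothetical code generated row by row in Excel, e.g. with a formula like
// ="Customer.Validate(Name, '"&B2&"');" dragged down the list
local procedure ImportCustomers()
var
    Customer: Record Customer;
begin
    Customer.Init();
    Customer.Validate("No.", 'C00001');
    Customer.Validate(Name, 'First Customer Inc.');
    Customer.Validate("Phone No.", '555-0100');
    Customer.Insert(true);

    Customer.Init();
    Customer.Validate("No.", 'C00002');
    Customer.Validate(Name, 'Second Customer Ltd.');
    Customer.Insert(true);
    // ...one block per row, 100 times
end;
```

Using Validate and Insert(true) keeps the standard business logic firing for each field, which is the main advantage over raw SQL inserts.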

A similar answer was to generate "INSERT INTO" T-SQL statements in Excel, copy the batch into a query window, and execute it. We know this is not best practice when working with Dynamics NAV, not to mention Business Central. But it might work very well for other environments, especially when you don't have to trigger any business logic.

Another answer was to write, in the language one prefers, a subroutine that manipulates the Excel file programmatically. While this is a method that works most of the time when you have enough time, I don't think it is doable in under one hour unless you already have the bulk of the code and just need to quickly transform and polish it for the fields the customer included this time. I used this method a few times in Dynamics NAV, where one can take advantage of the structure Microsoft has provided since NAV 2013 via table 370 Excel Buffer.

One last answer discussed between the NAV guys was to use RapidStart services. We, the NAV people, are quite lucky to have the mothership design this service for us. We both agreed that this would be one quick way to get the data in and most likely under one hour.

This is what I gathered for this type of time-sensitive request. What would you do if you encountered this type of task?