How to get all Windows services when you know the name of the executable


Long time no blog 🙂

A few days ago I tried to stop a NAS on a NAV 2009 server.

The problem was that I did not know which service was running the NAS.

If you are in a similar situation, know that there is a PowerShell cmdlet that can be used to query Windows services: Get-WmiObject.

In NAV 2009 R2 the NAS executable was nassql.exe.

To get all NAS services, I would run the following command in a Windows PowerShell session:

Get-WmiObject Win32_Service | Where-Object {$_.PathName -like "*nassql.exe*"} | Format-List -Property Name,Status,PathName

To list all NAV services, replace nassql with Dynamics.NAV.Server.exe.
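Spelled out, the NAV service tier variant would look like this (a sketch; note that on recent versions the full file name is Microsoft.Dynamics.Nav.Server.exe, which the case-insensitive wildcard below also matches):

Get-WmiObject Win32_Service | Where-Object {$_.PathName -like "*Dynamics.NAV.Server.exe*"} | Format-List -Property Name,Status,PathName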

WMI screen

Have fun troubleshooting 🙂

Fastest way to get entities into Dynamics NAV or Business Central (and not only)


It's 4 pm. To my surprise, a Skype call comes in from one of the customers I usually talk to maybe once a month. She cut the niceties quite abruptly: "Look, I have a list of 100 customers and I need it in production asap. I have 15 fields with data for each new customer. Can you do it today before 5?"

This is the context of this blog post: how do we inject new entities into NAV (and not only NAV) in the fastest way, under one hour?

A few weeks ago I asked a few of my peer developers, not just NAV developers, what they usually do in this type of scenario.

Some of their answers were really good and could be applied in Dynamics NAV or Business Central.

One of the answers was to ask the customer to enter it manually :)


That is indeed one way, but I'm not sure my customer was willing to do it, and under one hour was out of the question.

Another answer was to "quickly" write an integration tool to move the data from the original system into the target system.


Some of the answers I recall: “That’s crazy!” or “You have a list!” or “Under one hour, please!”…

Another idea was to manipulate the list, residing in an Excel file, in such a way that Excel generates the code required to insert the records, in the language of your choice (C/AL, AL, C#); once generated, copy it from the Excel worksheet straight into the OnRun trigger of a codeunit, or any other method, and execute that method. For Business Central, create a new extension extending only the Customer List page with one action, "Import Customers", and drop the code generated in Excel into that action's OnAction trigger. Install the extension, run the action, uninstall the extension. I personally used this method at least a dozen times in my career, in different environments including NAV. It's fast, dirty, and does the job 🙂
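The same generation trick can also be scripted instead of done with Excel formulas; here is a minimal PowerShell sketch that turns a CSV export of the list into AL-style insert code (the file name and the three columns are assumptions; extend to your 15 fields):

# Sketch: generate record-insertion code from customers.csv (columns No, Name, City are assumptions).
# Values containing single quotes would need escaping for AL string literals.
Import-Csv .\customers.csv | ForEach-Object {
    "Customer.Init();"
    "Customer.Validate(""No."", '$($_.No)');"
    "Customer.Validate(Name, '$($_.Name)');"
    "Customer.Validate(City, '$($_.City)');"
    "Customer.Insert(true);"
} | Set-Content .\ImportCustomers.txt
# Paste the contents of ImportCustomers.txt into the OnRun (or OnAction) trigger and run it.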

A similar answer was to generate the "INSERT INTO" T-SQL statements in Excel, copy the batch into a query window, and execute it. We know this is not what we call best practice when working with Dynamics NAV, not to mention Business Central. But it might work very well for other environments, especially when you don't have to trigger any business logic.

Another answer was to write, in the language one prefers, a subroutine to manipulate the Excel file programmatically. While this method works most of the time when you have enough time, I don't think it is doable in under one hour unless you already have the bulk of the code and just need to quickly adapt and polish it for the fields the customer is including this time. I used this method a few times in Dynamics NAV, where one can take advantage of the structure Microsoft has had in place since NAV 2013 via table 370 Excel Buffer.

One last answer, discussed between the NAV guys, was to use RapidStart services. We, the NAV people, are quite lucky to have the mothership design this service for us. We all agreed that this would be one quick way to get the data in, most likely under one hour.

This is what I gathered for this type of time-sensitive request. What would you do when you encounter this type of task?

How to generate Azure Container Instances loaded with Business Central in minutes


To start writing extensions for Business Central we have a few choices: installing one of the release candidates locally (they come in the same format as any other Dynamics NAV DVD package), creating a locally hosted Docker sandbox, or creating a container instance in Azure.

As the process of getting a container takes just a few minutes, I prefer to do my extension testing and development in an Azure container.

To generate my Azure container with Business Central, I started by installing the Azure CLI for Windows. You can also use Chocolatey to install the Azure CLI on your local machine.
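If you go the Chocolatey route, the install is a one-liner from an elevated PowerShell prompt (assuming Chocolatey itself is already installed):

choco install azure-cli -y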

In Visual Studio Code, click on Terminal and, in a PowerShell session, start your Azure work by logging in to your Azure account with:

az login

1.Azure Login

If you are already logged in and want to check the logged-in account info:

az account show

Next, we need to create a resource group, which is a logical container in Azure, something like an organizational unit in Active Directory or a folder for Windows files.

The command is "az group create" and takes two parameters: group name and location:

create group
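Spelled out, the command from the screenshot looks like this (the location is an assumption; pick a region close to you):

az group create --name svrg --location eastus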

Once the resource group is created, we can create the Azure container instance loaded with the latest Business Central using the following Azure command:

az container create

container

In the image above,

  • the group in which the container will be created follows the "-g" (group) option: "svrg"
  • the name of the container follows the "-n" (name) option: "d365bc-az-cont-us-cont"
  • the image loaded onto this container is "microsoft/bcsandbox:latest"
  • the OS is Windows
  • we can only open 5 ports: 80, 7046, 7048, 7049, 8080
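Put together, the whole command would look roughly like this (the DNS name label and the EULA environment variable are assumptions, based on how the public bcsandbox image is typically run):

az container create -g svrg -n d365bc-az-cont-us-cont `
  --image microsoft/bcsandbox:latest `
  --os-type Windows `
  --ports 80 7046 7048 7049 8080 `
  --dns-name-label d365bc-az-cont-us `
  --environment-variables accept_eula=Y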

For a complete list of parameters for “az container create”, check this.

To check the logs and find the login credentials recorded by Azure for the previous command, run "az container logs" like below:

logs
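In a PowerShell session that is simply (group and container names from the create step above):

az container logs -g svrg -n d365bc-az-cont-us-cont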

As you have seen above, the admin credentials are displayed and the new Azure Business Central instance appears ready for connections. Let's check by browsing to the web client link:

Ctrl + Click on the web client link in the picture above opens the Business Central web client:

webclient

To see the new container's page in Azure, navigate to the resource group and then to your container:

az_container_page

After entering the credentials from the logs we are in:

inbc

Good! We’ve got a Business Central instance in Azure running in a container and we’re ready to code and test extensions!

To point Visual Studio Code at this container, generate a new AL project with the AL: Go! command and, in launch.json, change the value of the "server" setting to the container DNS name created above:

vscode to azure

In the next blog I’ll go through the steps of deploying an Azure container loaded with a Business Central image using deployment templates with parameters.

If you liked this article bookmark my blog or follow me for more stuff about NAV and Business Central.

Invoking Azure Functions to replace .NET calls in C/AL or AL


Recently Microsoft announced that .NET can still be used with on-premises installations of Dynamics 365 Business Central.

However, if our extension is to make it into the cloud, the code leveraging .NET needs to be replaced with HTTP API calls.

In this example I will show how legacy C/AL code using .NET can be replaced with a call to an Azure Function that achieves the original goal of validating a post code.

Premise

  • Either Table 18 was modified and additional code was added to the "Post Code" Validate trigger, using Regex class entities to perform validation on post codes.
  • Or the additional validation is executed when the standard Post Code Validate is finished: a subscriber to Post Code Validate exists in our extension and is triggered, but it still contains .NET code (RegularExpressions class entities), as we're only dealing with on-premises (target=internal in app.json).

Objective

I want the additional validation to be executed when the standard validation is finished, and the additional validation to contain no .NET calls.

Design

  1. In a new AL project add a new codeunit:

add_al_codeunit

2. The codeunit itself contains an event subscriber to Table18.Validate.PostCode.

(Use the "teventsub" snippet to get quick scaffolding for the event subscriber.)

codeunit_content

When the subscriber is triggered, we execute an Azure Function call, azfnvalidate-us-ca-zipcode, and retrieve a JSON object whose content is {"PCValid": true} or {"PCValid": false}.

3. Write the Azure Function with Visual Studio Code

Pre-requisites:

  • an Azure subscription
  • the C# extension
  • the Azure Functions Core Tools extension
  • .NET Core (use npm or Chocolatey to install it)
  • Azure CLI Tools

VSCodeExtensions

A good guide to get you started with Azure Functions is here.

Once you have the default "Hello World" Azure Function created, replace your Run function with:

azFn

Publishing the function in Azure should generate a record in your chosen storage:

AzureFninPortal

Testing

  1. Once published, we can quickly spin up a test for the new Azure Function in a web browser window:

web_browser_test
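The same smoke test can be run from PowerShell instead of the browser; the function URL, route, and query parameter below are assumptions matching my function name, so adjust them to yours:

# Hypothetical endpoint and parameter; substitute your own function URL and key.
Invoke-RestMethod -Uri "https://azfnvalidate-us-ca-zipcode.azurewebsites.net/api/azfnvalidate-us-ca-zipcode?postcode=98052"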

2. Removing the "W" in the previous test triggers the Azure Function to return the JSON above, this time with a negative result:

web_browser_invalid_postcode

3. Let's now test the validation in Business Central:

ezgif-3-34f9ae149c11

Therefore, to replace a set of .NET calls we need a worker placed somewhere other than AL or C/AL, and a caller of that worker's services placed in the extension. In my example I use a codeunit (the caller) in the extension range, with an event subscriber defined, that calls an Azure Function (the worker).

What other methods are you employing to achieve similar results?

If you liked this article bookmark my blog or follow me for more stuff about NAV and Business Central.

Microsoft Flow, Twitter and Dynamics NAV


As C/AL is my number one language to code in, I have wanted since last summer to give Microsoft Flow a try. And as Twitter is one of the top 3 applications I use on my phone, I wanted to see if I could get an MS Flow to bring tweets into my favorite environment, Dynamics NAV.

After a few trials and tweaks, my flow brings tweets into NAV:

Tweet Table

If you want to try it out, this is what you need:

  • a Dynamics NAV instance with a public IP. I used an Azure machine loaded with Dynamics NAV 2017
    • web services for the entities you want to create or update via MS Flow
  • a Microsoft or work account to connect to flow.microsoft.com
  • a Twitter account

To allow MS Flow to talk to both Twitter and MS NAV we need to set up appropriate Connections in MS Flow:

Conn

The connection to NAV looks like this:

NAVConn

For username and password, create a NAV user and set a password. On the instance you want to connect to, set the credential type to NavUserPassword.
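Both steps can also be scripted on the service tier machine; a minimal sketch using the NAV administration cmdlets (the instance name, user name, and password are placeholders):

# Run in the Dynamics NAV 2017 Administration Shell; DynamicsNAV100 is a placeholder instance name.
Set-NAVServerConfiguration -ServerInstance DynamicsNAV100 -KeyName ClientServicesCredentialType -KeyValue NavUserPassword
Set-NAVServerInstance -ServerInstance DynamicsNAV100 -Restart
New-NAVServerUser -ServerInstance DynamicsNAV100 -UserName FLOW -Password (ConvertTo-SecureString "P@ssw0rd!" -AsPlainText -Force)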

For the Twitter connection, use your Twitter user ID.

To support the Flow, I needed 2 tables and 2 pages:

  • a table Tweet and a page Tweets (exposed as a web service; see the sketch below) to hold the tweets
  • a table Last Tweet to record the ID of the last tweet brought into NAV, and a page to update this last tweet ID, so that the flow does not collect the same tweets again, only new tweets published after the last one brought into NAV

NAV tables
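For Flow to see the Tweets page, it has to be published as a web service; you can do that from the Web Services page in NAV, or with a cmdlet like the one below (the page ID 50100 is a made-up example):

New-NAVWebService -ServerInstance DynamicsNAV100 -ServiceName Tweets -ObjectType Page -ObjectId 50100 -Published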

And this is what the flow is doing:

  1. Start with a recurrence action.
  2. Get the last tweet ID by using the NAV connection.
  3. Set up the search-for-tweets action.
  4. Insert a new record into the NAV Tweet table by using a NAV connection and mapping Twitter fields to NAV fields.
  5. Update the "Last Tweet" table with the last action in the flow.

And this is what the whole flow looks like:

WholeFlow

The C/AL code is included here.

Thanks for reading. Enjoy Flowing!

3 Easy Methods to move company data between NAV databases


If there is something that changes in NAV every few releases, it is how we back up and restore data (be it a full backup, application data, or company data) between NAV databases.

Recently I was asked how this is done in NAV 2016.

The answer is simple and is applicable to both NAV 2015 and NAV 2016:

  • with two PowerShell cmdlets, Export-NAVData and Import-NAVData
  • via Pages 9901 (for Export) and Page 9900 (for Import)
  • using C/AL functions EXPORTDATA/IMPORTDATA.

Let's assume the following case. A bug was reported by a customer. You want access to their production environment, but many times that's not possible. One way is to take a SQL backup of the customer's database; that assumes you have at least backup rights on their database server, and many times that's not possible either. But you probably have access to NAV.

Let's try to export company data with PowerShell cmdlets:

  • Launch the Dynamics NAV 2016 Administration Shell
  • Run the Export-NAVData cmdlet (see the sketch after this list):
    • DatabaseName: look in the NAV Administration console for the database name associated with your instance
    • FilePath: a file with the "navdata" extension is expected
  • Copy the navdata file onto your local machine and run Import-NAVData to restore the company data in your other environment (be that development, QA, ...):
    • Similarly to the previous step, DatabaseName can be extracted from the NAV Administration console, and FilePath is a local file with a "navdata" extension
    • The cmdlet asks us to confirm that we want to overwrite the company "…" on the local server ".\ServerName"
    • After confirming the overwrite, the import starts, but we get an error. The cmdlet reports that there is a difference in a table definition. In the example above, on the destination database one of the tables had a few new fields, simply because the destination environment was a development copy of the customer's database.
    • To get around this error you need to have matching objects on the destination and the source. So I saved the latest changes on the target database (exported them to a fob), calling it LatestChanges.fob. Then I synchronized the objects from customer production with the target database. I re-ran Import-NAVData and this time it ran without issues.
    • ImportSuccesful
    • After a successful import of company data you can restore your code from LatestChanges.fob
  • The easiest way, though, to move company data is to use Page 9901 (for Export) on the source and Page 9900 (for Import) on the target.
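For reference, here is a sketch of the two cmdlets described above (server, database, and company names are placeholders):

# Run in the Dynamics NAV 2016 Administration Shell, on the source environment:
Export-NAVData -DatabaseServer SQLPROD -DatabaseName "NAV2016PROD" -CompanyName "CRONUS" -FilePath "C:\Temp\Cronus.navdata"
# Copy the navdata file to the target machine, then:
Import-NAVData -DatabaseServer localhost -DatabaseName "NAV2016DEV" -CompanyName "CRONUS" -FilePath "C:\Temp\Cronus.navdata"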

To export/import data via C/AL code check EXPORTDATA and IMPORTDATA functions:

Msdn EXPORTDATA and Msdn IMPORTDATA

For a sample of how to use these 2 functions, check the code in pages 9900, 9901 in W1.

Original post here.

Dimensions in NAV 2016 – design overview


As stated here, beyond the setup and usage of dimensions, developing with NAV requires a better grasp of dimensions' internal design.

I'll be diving shortly into the dimension-related concepts: dimension set entries (Table 480 Dimension Set Entry), search trees (Table 481 Dimension Set Tree Node), and dimension management (Codeunit 408 DimensionManagement).

The following use case will create a sales invoice and add dimensions. We will inspect the main tables and the code in Codeunit 408 DimensionManagement to understand what happens. But first, let's set up some dimensions and their values:

  1. Open Page 536 "Dimension" and create a new dimension (DEPARTMENT); this will create a record in Table 348 "Dimension".
  2. On the same page, in the Navigate FastTab, launch "Dimension Values". This will open Page 537 "Dimension Values", where you can create a few dimension values for this new dimension; these new values will be stored in Table 349 "Dimension Value".
  3. Repeat steps 1 and 2 for two more dimensions, AREA and SALESPERSON.
  4. Assign the Global Dimensions in General Ledger Setup (open the General Ledger Setup page and launch the "Change Global Dimensions …" action) to DEPARTMENT and AREA. This will fill in the General Ledger Setup fields "Global Dimension 1 Code" and "Global Dimension 2 Code". NAV will assign "Shortcut Dimension 1 Code" and "Shortcut Dimension 2 Code" with these two global dimensions.

GLS

Note: If you've updated the Global Dimensions you'll be warned that the system will run an update on all posted entries to update the dimensions. After the update, restart the RTC.

At the end of the 4 steps, this is what the affected tables look like:

Table 348 “Dimension”:

T348

Table 349 “Dimension Value”:

DimVal

Let's now create a Sales Invoice and note how the dimensions are handled.

On the Lines FastTab of the Sales Invoice, add the columns for the newly created dimensions: DEPARTMENT Code and AREA Code.

Assign the AREA Code field the value "EAST":

SL_1

Validating the shortcut dimension code involves updating the "Dimension Set ID" for the sales line. The core of this operation happens in Codeunit 408 DimensionManagement:

CU480_1

Looking into tables 480 and 481, this is the current situation in our exercise:

480_481_1

NAV has created an entry in the table 480 Dimension Set Entry for the Dimension/Dimension Value pair AREA=EAST and assigned it an ID = 3254.

Additionally, NAV created a search tree in which, at this point, we have only one node (the root) for the newly started dimension combination with one dimension (AREA=EAST).

Let's now assign a new dimension, setting the Department code to SALES:

SalesLine_Department

Let's check what happened in tables 480 and 481:

480_481_2

In Table 480 "Dimension Set Entry" NAV inserted two more lines (the 2nd and 3rd) and assigned both Dimension Set ID = 3255. This is to identify the 2-tuple AREA=EAST and DEPARTMENT=SALES uniquely with ID = 3255. I expect that, if I add a new dimension, the new set will be assigned ID = 3256 and we will have 3 new records with that ID.

On the left side we have the trees.

The middle one (0,5039;3254) was generated in the previous step, when we assigned the first dimension (AREA=EAST). As part of assigning a second dimension (DEPARTMENT=SALES), the system generated two more trees:

  • the first one is a tree with a single node for DEPARTMENT=SALES
  • the second tree has two nodes: as root we have the node with Dimension Set ID = 1 (DEPARTMENT=SALES), and as a branch from this root we have a node with Dimension Value = 5839 and Dimension Set ID = 3255, which, as we can see in Table 480, is the node for AREA=EAST.

Tree_2_Dims

Let's fill in a third dimension and check whether our expectations were right:

SL_3

By inspecting the two tables (480 and 481) we notice:

480_481_3

  • 3 new records in Table 480 to identify the 3-tuple (AREA=EAST, DEPARTMENT=SALES, SALESPERSON=SAM). All three records have been assigned Dimension Set ID = 3256.
  • In Table 481, NAV created a 4th search tree, besides the 3 existing ones:

4th_TREE
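If you want to look at the raw rows yourself, here is a quick sketch of the same inspection done from PowerShell (the database and company names are assumptions for a demo database, and the SqlServer module with Invoke-Sqlcmd must be installed):

# List the three entries that make up dimension set 3256.
$q = @'
SELECT [Dimension Set ID],[Dimension Code],[Dimension Value Code]
FROM [CRONUS International Ltd_$Dimension Set Entry]
WHERE [Dimension Set ID] = 3256
'@
Invoke-Sqlcmd -ServerInstance localhost -Database "Demo Database NAV (9-0)" -Query $q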

In code, if we want to update the "Dimension Set ID", we would do something similar to the code in the ValidateShortcutDimValues function in Codeunit 408 "DimensionManagement":

UpdateDimSetID

If you need to get a combined Dimension Set ID based on two existing Dimension Set IDs, you can use the function GetCombinedDimensionSetID to do it in one shot:

MultipleDimSets

To understand the benefits of the new way of managing dimensions versus the old (pre-2013) way, check the ArcherPoint article.

Thanks for reading, sharing, commenting… Much appreciated!

Original article here.