Invoking Azure Functions to replace .NET calls in C/AL or AL

Standard

Recently Microsoft announced that .NET can still be used with on-premises installations of Dynamics 365 Business Central.

However, if our extension is to make it into the cloud, any code leveraging .NET needs to be replaced with HTTP API calls.

In this example I will show how legacy C/AL code using .NET can be replaced with a call to an Azure Function that achieves the original goal of validating a post code.

Premise

  • Either Table 18 was modified and additional code was added in the “Post Code” Validate trigger, with Regex class entities performing validation on post codes.
  • Or, the additional validation is executed when the standard Post Code validation has finished: a subscriber to Post Code validation exists in our extension and is triggered, but it still contains .NET code (RegularExpressions class entities), as we’re only dealing with on-premises deployments (target=internal in app.json).

Objective

I want the additional validation to be executed when the standard validation has finished, and the additional validation must not contain .NET calls.

Design

  1. In a new AL project, add a new codeunit:

[screenshot: adding the AL codeunit]

2. The codeunit itself contains an event subscriber to the “Post Code” OnAfterValidate event on Table 18 (Customer).

(Use the “teventsub” snippet to quickly scaffold the event subscriber.)

[screenshot: codeunit content]

When the subscriber is triggered we execute an Azure Function call: azfnvalidate-us-ca-zipcode. We retrieve a JSON object whose content is {“PCValid” : true} or {“PCValid” : false}.
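In AL the subscriber issues the request with the HttpClient type and reads the flag back with JsonObject. The response-handling step boils down to extracting one boolean; a minimal sketch of that parsing (Python used here for brevity, the function name is my own):

```python
import json

def parse_post_code_response(body: str) -> bool:
    """Parse the JSON returned by the Azure Function, e.g. {"PCValid": true}.

    A missing key is treated as an invalid post code.
    """
    return bool(json.loads(body).get("PCValid", False))

print(parse_post_code_response('{"PCValid": true}'))   # True
print(parse_post_code_response('{"PCValid": false}'))  # False
```

The AL version does the same thing with `JsonObject.ReadFrom` and `JsonToken` lookups after the HttpClient call returns.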

3. Write the Azure Function with Visual Studio Code

Prerequisites:

  • an Azure subscription
  • the C# extension
  • the Azure Functions Core Tools extension
  • .NET Core (install via npm or Chocolatey)
  • the Azure CLI Tools


A good guide to get you started with Azure Functions is here.

Once you have the default “Hello World” Azure Function created, replace your Run function with:

[screenshot: the Azure Function Run method]
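The heart of the function is regex matching against US and Canadian post code formats. As a sketch of that logic (Python for brevity; the patterns below are common US/CA formats and may differ from the exact ones in my C# function):

```python
import re

# Accept a US ZIP (12345 or 12345-6789) or a Canadian postal code (A1A 1A1)
US_ZIP = re.compile(r"^\d{5}(-\d{4})?$")
CA_POSTCODE = re.compile(r"^[A-Za-z]\d[A-Za-z] ?\d[A-Za-z]\d$")

def is_valid_post_code(post_code: str) -> bool:
    post_code = post_code.strip()
    return bool(US_ZIP.match(post_code) or CA_POSTCODE.match(post_code))

print(is_valid_post_code("90210"))    # True  (US ZIP)
print(is_valid_post_code("K1A 0B1"))  # True  (Canadian)
```

The function wraps the result of this check into the {“PCValid” : …} JSON response described above.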

Publishing the function to Azure should generate a record in your chosen storage: [screenshot: the Azure Function in the portal]

Testing

  1. Once published, we can quickly spin up a test of the new Azure Function in a web browser window:

[screenshot: browser test with a valid post code]

2. Removing the “W” in the previous test makes the post code invalid, and the Azure Function returns {“PCValid” : false}.

[screenshot: browser test with an invalid post code]

3. Let’s now test the validation in Business Central:


Therefore, to replace a set of .NET calls we need a worker hosted somewhere other than AL or C/AL, and a caller of that worker service placed in the extension. In my example I used a codeunit (the caller) in the extension range, with an event subscriber that calls an Azure Function (the worker).

What other methods are you employing to achieve similar results?

If you liked this article bookmark my blog or follow me for more stuff about NAV and Business Central.

Dynamics 365 Business Central : AL Code Analyzers #TurnOnCops #Extensions #Permissions


When you turn on PerTenantExtensionCop in Visual Studio Code and you have forgotten to create a Permissions.xml file, you get a compile error in your extension.

In Visual Studio Code -> User Settings, add an entry for the al.codeAnalyzers token like below:

[screenshot: al.codeAnalyzers setting]
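For reference, the al.codeAnalyzers entry in settings.json looks something like this (a sketch of the fragment; enable whichever cops you need):

```json
{
    "al.codeAnalyzers": [
        "${CodeCop}",
        "${PerTenantExtensionCop}"
    ]
}
```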

Without this analyzer turned on, your extension compiles.

If you do set PerTenantExtensionCop and you’re missing Permissions.xml, you’re going to get a compile error:

[screenshot: compile error]

Adding a Permissions.xml file in the project root solves the error:

[screenshot: successful compile]

More reading on code analyzers here.

Original post here.

Dynamics 365 Business Central : Extending Role Center headline with web service data, lists and dictionaries


So much to read, so little time … the speed at which Microsoft adds new Business Central and AL features is overwhelming 🙂

In this blog I’ll demonstrate how I was able to display the weather temperature for three cities in the Business Central role center headline.


First, there are nine headline role center pages in Business Central, with IDs from 1440 to 1448.


I will extend the headline role center page 1440, “Headline RC Business Manager”, by adding three fields, one for each city and its temperature.

To record the three cities and their temperatures I am using a list and a dictionary data structure.

[screenshot: field and data structure declarations]

Next I query a weather web service and record the three cities and their temperatures in a dictionary:

[screenshot: querying the web service]

The response from the web service is included as a comment.

I need data stored in the following tokens:

  • $main.temp
  • $sys.country
  • name
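As a sketch of the parsing step, here is how those three tokens map onto an openweathermap-shaped response (Python for brevity; my AL code uses the JsonObject type, and the sample values below are invented):

```python
import json

# Sample shaped like an openweathermap current-weather response (values invented)
body = '{"name": "Toronto", "sys": {"country": "CA"}, "main": {"temp": 21.5}}'

def city_temperature(body: str):
    data = json.loads(body)
    # the three tokens used in the headline: name, $sys.country, $main.temp
    return data["name"], data["sys"]["country"], data["main"]["temp"]

print(city_temperature(body))  # ('Toronto', 'CA', 21.5)
```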

For more info on how to parse a web service response, take a look at Mr. Kauffman’s blog.


I use openweathermap, a free weather web service. You need to create an account, and you will get a free APPID when you complete the registration. Note that you can only query the web service once every 10 minutes for the same location.
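Because of that rate limit, it is worth keeping the last reading per city and only re-querying after ten minutes. A small sketch of that guard (the fetch argument stands in for the real web service call):

```python
import time

CACHE_SECONDS = 600  # free tier: one query per location every 10 minutes
_cache = {}          # city -> (timestamp, temperature)

def get_temperature(city, fetch, now=time.time):
    cached = _cache.get(city)
    if cached and now() - cached[0] < CACHE_SECONDS:
        return cached[1]        # still fresh: no web service call
    temp = fetch(city)          # stand-in for the real HTTP query
    _cache[city] = (now(), temp)
    return temp

calls = []
fake_fetch = lambda city: calls.append(city) or 20.0
print(get_temperature("Toronto", fake_fetch))  # 20.0, one real fetch
print(get_temperature("Toronto", fake_fetch))  # 20.0, served from the cache
```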

Finally, to load cities and their temperatures in your headline use the code below:

[screenshot: loading cities and temperatures]

The complete pageextension object is included here.

That’s it … thanks for reading!

Original post here.

Microsoft Flow, Twitter and Dynamics NAV


As C/AL is my number one coding language, I have wanted since last summer to give Microsoft Flow a try. And as Twitter is one of the top three applications I use on my phone, I wanted to see if I could get an MS Flow to bring tweets into my favorite environment, Dynamics NAV.

After a few trials and tweaks, my flow brings tweets into NAV:

[screenshot: the Tweet table]

If you want to try it out, this is what you need:

  • a Dynamics NAV instance with a public IP. I used an Azure machine loaded with Dynamics NAV 2017
    • web services for the entities you want to create or update via MS Flow
  • a Microsoft or work account to connect to flow.microsoft.com
  • a Twitter account

To allow MS Flow to talk to both Twitter and MS NAV we need to set up appropriate Connections in MS Flow:

[screenshot: connections]

The connection to NAV looks like this:

[screenshot: the NAV connection]

For username and password, create a NAV user and set a password. On the instance you want to connect to, set the credential type to NavUserPassword.

For the Twitter connection, use your Twitter user ID.

To support the flow, I needed two tables and two pages:

  • a table Tweet and a page Tweets (exposed as a web service) to support the tweets
  • a table Last Tweet to record the id of the last tweet brought into NAV, and a page to update this last tweet id, so that the flow does not collect the same tweets again, only new tweets published after the last one brought into NAV
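The “Last Tweet” mechanism is a high-water mark: keep the highest tweet id already brought into NAV, and only insert tweets above it. Sketched outside of Flow (tweet shapes and ids are invented):

```python
def new_tweets(tweets, last_tweet_id):
    # Keep only tweets published after the last one brought into NAV,
    # and return the new high-water mark alongside them.
    fresh = [t for t in tweets if t["id"] > last_tweet_id]
    new_last = max((t["id"] for t in fresh), default=last_tweet_id)
    return fresh, new_last

tweets = [{"id": 101, "text": "old"}, {"id": 205, "text": "new"}]
fresh, last_id = new_tweets(tweets, 150)
print(fresh, last_id)  # [{'id': 205, 'text': 'new'}] 205
```

In the flow, step 5 below plays the role of writing `new_last` back to the “Last Tweet” table.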

[screenshot: the NAV tables]

And this is what the flow is doing:

  1. Start with a recurrence action.
  2. Get the last tweet id by using the NAV connection.
  3. Set up the search-for-tweets action.
  4. Insert a new record in the NAV Tweet table by using a NAV connection and mapping Twitter fields to NAV fields.
  5. Update the “Last Tweet” table with the last action in the flow.

And this is what the flow looks like:

[screenshot: the whole flow]

The C/AL code is included here.

Thanks for reading. Enjoy Flowing!


3 Easy Methods to move company data between NAV databases


If there is one thing that changes in NAV every few releases, it is how we back up and restore data (be that a full backup, application data, or company data) between NAV databases.

Recently I was asked how this is done in NAV 2016.

The answer is simple and applies to both NAV 2015 and NAV 2016:

  • with two PowerShell cmdlets
  • via Page 9901 (for export) and Page 9900 (for import)
  • using the C/AL functions EXPORTDATA/IMPORTDATA.

Let’s assume the following case. A bug was reported by a customer. You want access to their production environment, but many times that’s not possible. One way is to take a SQL backup of the customer’s database, but that assumes you have at least backup rights on their database server, which often you don’t. You probably do have access to NAV, though.

Let’s try to export company data with the PowerShell cmdlets:

  • Launch the Dynamics NAV 2016 Administration Shell
  • Run the Export-NAVData cmdlet:
    • DatabaseName: look in the NAV Administration console for the database name associated with your instance
    • FilePath: a file with the extension “navdata” is expected
  • Copy the navdata file onto your local machine and run the Import-NAVData cmdlet to restore the company data in your other environment (be that development, or QA …):
    • Similarly to the previous step, DatabaseName can be extracted from the NAV Administration console, and FilePath is a local file with a “navdata” extension
    • The cmdlet asks you to confirm that you want to overwrite the company “…” on the local server “.\ServerName”
    • After confirming the overwrite, the import starts, but we get an error: the cmdlet reports a difference in a table definition. In my example, one of the tables on the destination database had a few new fields, because the destination environment was a development copy of the customer’s database.
    • To get around this error you need similar objects on the destination and the source. So I saved the latest changes on the target database by exporting them to a fob, LatestChanges.fob. Then I synchronized the objects from the customer’s production database with the target database. I re-ran Import-NAVData and this time it ran without issues.
    • [screenshot: successful import]
    • After a successful import of the company data you can restore your code from LatestChanges.fob
  • The easiest way, though, to restore company data is to use Page 9901 (for export) on the source and Page 9900 (for import) on the target.
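For reference, the two cmdlet invocations look roughly like this when run from the NAV Administration Shell; the server, database, company, and file names below are placeholders for your own:

```powershell
# Export one company's data from the source database into a .navdata file
Export-NAVData -DatabaseServer SourceSqlServer -DatabaseName 'Demo Database NAV (9-0)' `
    -CompanyName 'CRONUS International Ltd.' -FilePath C:\Temp\Company.navdata

# Import the company into the target database (you will be asked to
# confirm overwriting an existing company with the same name)
Import-NAVData -DatabaseServer localhost -DatabaseName 'Dev Database' `
    -CompanyName 'CRONUS International Ltd.' -FilePath C:\Temp\Company.navdata
```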

To export/import data via C/AL code, check the EXPORTDATA and IMPORTDATA functions:

MSDN: EXPORTDATA and IMPORTDATA

For a sample of how to use these 2 functions, check the code in pages 9900, 9901 in W1.

Original post here.

Dimensions in NAV 2016 – design overview


As stated here, beyond the setup and usage of dimensions, developing with NAV requires a better grasp of dimensions’ internal design.

I’ll shortly be diving into the dimension-related concepts: dimension set entries (Table 480 Dimension Set Entry), search trees (Table 481 Dimension Set Tree Node), and dimension management (Codeunit 408 DimensionManagement).

The following use case will create a sales invoice and add dimensions. We will inspect the main tables and the code in Codeunit 408 DimensionManagement to understand what happens. But first, let’s set up some dimensions and their values:

  1. Open Page 536 “Dimension” and create a new dimension (DEPARTMENT); this will create a record in Table 348 “Dimension”.
  2. On the same page, in the Navigate FastTab, launch “Dimension Values”. This will open Page 537 “Dimension Values”, where you can create a few dimension values for this new dimension; these new values will be stored in Table 349 “Dimension Value”.
  3. Repeat steps 1 and 2 for two more dimensions, AREA and SALESPERSON.
  4. Assign the Global Dimensions in General Ledger Setup (open the General Ledger Setup page and launch the “Change Global Dimensions …” action) to Department and Area. This will fill the General Ledger Setup fields “Global Dimension 1 Code” and “Global Dimension 2 Code”. NAV will assign “Shortcut Dimension 1 Code” and “Shortcut Dimension 2 Code” to these two global dimensions.

[screenshot: General Ledger Setup]

Note: if you’ve updated the Global Dimensions you’ll be warned that the system will run an update on all posted entries to update the dimensions. After the update, restart the RTC client.

At the end of the four steps, this is what the affected tables look like:

Table 348 “Dimension”:

[screenshot: Table 348 records]

Table 349 “Dimension Value”:

[screenshot: Table 349 records]


Let’s now create a Sales Invoice and note how the dimensions are handled.

On the Lines FastTab of the Sales Invoice, add the columns for the newly created dimensions: DEPARTMENT Code and AREA Code.

Assign the AREA Code field the value “EAST”:


[screenshot: sales line with AREA=EAST]

Validating the “Department Code” involves updating the “Dimension Set ID” for the sales line. The core of this operation happens in Codeunit 408 DimensionManagement:

[screenshot: Codeunit 408 code]

Looking into Tables 480 and 481, this is the current situation in our exercise:

[screenshot: Tables 480 and 481]

NAV has created an entry in Table 480 Dimension Set Entry for the dimension/dimension value pair AREA=EAST and assigned it ID = 3254.

Additionally, NAV created a search tree in which, at this point, we only have a single node (the root) for the just-created dimension combination with one dimension (AREA=EAST).

Let’s now assign a new dimension, setting the Department code to SALES:

[screenshot: sales line with DEPARTMENT=SALES]

Let’s check what happened in Tables 480 and 481:

[screenshot: Tables 480 and 481 after the second dimension]


In Table 480 “Dimension Set Entry” NAV inserted two more lines (the 2nd and 3rd) and assigned both Dimension Set ID = 3255. This identifies the 2-tuple AREA=EAST, DEPARTMENT=SALES uniquely with ID = 3255. I expect that if I add a new dimension, the combination will be assigned ID = 3256 and we will have three new records with that ID.
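The behavior above is essentially set interning: each distinct set of dimension/value pairs gets exactly one ID, and asking for the same set again returns the existing ID. A toy model of Table 480's role (IDs are taken from the example above; this is a sketch only, since the real system also maintains the Table 481 search tree so lookups avoid scanning all entries):

```python
class DimensionSetStore:
    """Toy model of GetDimensionSetID / GetCombinedDimensionSetID semantics."""

    def __init__(self, first_id=3254):
        self._ids = {}     # frozenset of (dimension, value) -> set ID
        self._sets = {}    # set ID -> dict of dimension -> value
        self._next_id = first_id

    def get_set_id(self, dims):
        key = frozenset(dims.items())
        if key not in self._ids:           # a new combination gets a new ID
            self._ids[key] = self._next_id
            self._sets[self._next_id] = dict(dims)
            self._next_id += 1
        return self._ids[key]

    def combined_set_id(self, id1, id2):
        # Analogous to GetCombinedDimensionSetID: merge two existing sets
        # (the second wins on conflicting dimensions) and intern the result.
        return self.get_set_id({**self._sets[id1], **self._sets[id2]})

store = DimensionSetStore()
a = store.get_set_id({"AREA": "EAST"})                         # first combination
b = store.get_set_id({"AREA": "EAST", "DEPARTMENT": "SALES"})  # superset: new ID
print(a, b)  # 3254 3255
```

Requesting `{"AREA": "EAST"}` again returns 3254, which is why documents sharing the same dimension combination share one Dimension Set ID.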

On the left side we have the trees.

The middle one (0,5039;3254) was generated in the previous step, when we assigned the first dimension (AREA=EAST). As part of assigning a second dimension (DEPARTMENT=SALES), the system generated two more trees:

  • the first is a tree with a single node for DEPARTMENT=SALES
  • the second tree has two nodes: the root is the node with Dimension Set ID = 1 (DEPARTMENT=SALES), and branching from it is a node with Dimension Value = 5839 and Dimension Set ID = 3255, which, as we can see in Table 480, is the node for AREA=EAST.

[screenshot: trees after two dimensions]

Let’s fill in a third dimension and check whether our expectations were right:

[screenshot: sales line with the third dimension]

By inspecting the two tables (480 and 481) we notice:

[screenshot: Tables 480 and 481 after the third dimension]

  • 3 new records in Table 480 to identify the 3-tuple (AREA=EAST, DEPARTMENT=SALES, SALESPERSON=SAM). All three records have been assigned Dimension Set ID = 3256.
  • In Table 481, NAV created a 4th search tree, besides the 3 existing ones

[screenshot: the 4th search tree]

In code, if we want to update the “Dimension Set ID” we would do something similar to the code in the ValidateShortcutDimValues function in Codeunit 408 “DimensionManagement”:

[screenshot: updating the Dimension Set ID]

If you need to get a combined Dimension Set ID based on two existing Dimension Set IDs you can use the function GetCombinedDimensionSetID to do it in one shot:

[screenshot: combining dimension sets]

To understand the benefits of the new way of managing dimensions over the old way (pre-2013), check the ArcherPoint article.

Thanks for reading, sharing, commenting … Much appreciated!

Original article here.

Setup and How-Tos re: Dimensions in NAV 2016


Recently I was asked to explain the concept of dimensions in NAV. While I managed to come up with an answer quickly, I wanted to spend some time researching the topic so that next time I need to offer a similar explanation my answer will sound more academic, plus I’ll have a link to refer to 🙂

According to msdn, “A dimension is data that you can add to an entry as a kind of marker so that the program can group entries with similar characteristics and easily retrieve these groups for analysis purposes.”.

A few characteristics of dimensions:

  • can be applied to documents and journals
  • dimensions have dimension values, for example we can set a dimension AREA with 3 dimension values: Europe, America, Africa.
  • used to create extracts of financial statements for statistics and analysis. In creating these statistics we can use more than one dimension, for example we can look at sales by the following dimensions AREA and DEPARTMENT.

There are two types of dimensions I want to cover here: global dimensions and shortcut dimensions. Additionally, there are budget dimensions that can be defined at the moment when you generate a new budget.

Both global dimensions and shortcut dimensions can be set up for the company in the General Ledger Setup. There are at most two global dimensions and at most eight shortcut dimensions. NAV defines the first two shortcut dimensions to be the same as the global dimensions.

Global dimensions are available for input across the system in tables like 15 “G/L Account”, 18 Customer, and 23 Vendor, and in ledger entry tables like 17 “G/L Entries”, 21 “Cust. Ledger Entries” and 25 “Vend. Ledger Entries” (fields 23 and 24). Because global dimensions are part of the core system, they can be used for filtering ledger entries, filtering in reports, account schedules, or batch jobs.

Shortcut dimensions (set up initially in the General Ledger Setup) are available at the document level or at the journal line level. On each document line (sales or purchase) or journal line, add the column with the name (code) designated in General Ledger Setup as a shortcut dimension, and assign it one of the available dimension values. E.g., if I have defined Global Dimension 1 as DEPARTMENT, I expect the General Ledger Setup field “Shortcut Dimension 1 Code” to be DEPARTMENT, and when I create a Purchase Invoice, to have a “Department Code” field available on the lines.

To create dimensions, use the Dimensions page to enter or modify dimensions.

To create values for each dimension, on the same page go to the Navigate tab and launch the action “Dimension Values”. To assign default values for dimensions, on the same page (Dimensions) use the action “Account Type Default Dim.”. If you don’t want to specify a default dimension value code, but you do want a value on each ledger entry, set the “Value Posting” field to “Code Mandatory”. This ensures that no general journal line or document is posted without a value for the specific dimension.

In a future blog I will dive into the technical design of dimensions, covering notions such as dimension sets, dimension set entries, and search trees, with a few notes on performance improvements compared to the older dimension design.

Thank you for reading, sharing, commenting … Much appreciated!

Original blog here.