It’s 4 pm. To my surprise, I get a Skype call from a customer I talk to maybe once a month. She cuts the niceties quite abruptly: “Look, I have a list of 100 customers and I need it in production ASAP. I have 15 fields of data for each new customer. Can you do it today before 5?”
That call is the context of this blog post. How do we inject new entities into NAV (and not only NAV) in the fastest way possible, under one hour?
A few weeks ago I asked a few of my peers, not just NAV developers, what they usually do in this type of scenario. Some of their answers were really good and could be applied in Dynamics NAV or Business Central.
One of the answers was to ask the customer to enter it manually 🙂 That is indeed one way, but I’m not sure my customer was willing to do it, and under one hour was out of the question.
Another answer was to “quickly” write an integration tool that takes the data from the original system into the target system. Some of the reactions I recall: “That’s crazy!”, “You have a list!”, “Under one hour, please!”…
Another idea was to manipulate the list, which resides in an Excel file, so that Excel itself generates the code required to insert the records, in the language of your choice (C/AL, AL, C#). Once generated, copy the code from the worksheet straight into the OnRun trigger of a codeunit (or any other method) and execute it. For Business Central, create a new extension that extends only the Customer List page with one action, “Import Customers”, and drop the code generated in Excel into that action’s OnAction trigger. Install the extension, run the action, uninstall the extension. I have personally used this method at least a dozen times in my career, in different environments including NAV. It’s fast, dirty, and does the job 🙂
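To make the trick concrete, here is a minimal sketch of the same idea in Python instead of Excel formulas: concatenate each row of the list into an insert statement in the target language (AL-flavored here; the field names and sample rows are made up for illustration, not taken from the customer's actual file).

```python
# Sketch: generate AL insert statements from a list of rows,
# mimicking what the Excel CONCATENATE approach produces column by column.
rows = [
    ("C00001", "Adatum Corporation", "Atlanta"),
    ("C00002", "Contoso Ltd.", "Seattle"),
]

al_lines = []
for no, name, city in rows:
    al_lines.append(
        f"Customer.Init(); "
        f"Customer.\"No.\" := '{no}'; "
        f"Customer.Name := '{name}'; "
        f"Customer.City := '{city}'; "
        f"Customer.Insert(true);"
    )

generated = "\n".join(al_lines)
print(generated)
```

The output is what you would paste into the OnAction trigger of the throwaway “Import Customers” action.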
A similar answer was to generate the “INSERT INTO” T-SQL statements in Excel, copy the batch into a query window, and execute it. We know this is not what we call best practice when working with Dynamics NAV, not to mention Business Central, but it can work very well in other environments, especially when you don’t need to trigger any business logic.
Another answer was to write, in the language of your choice, a subroutine that manipulates the Excel file programmatically. While this method works most of the time when you have enough time, I don’t think it is doable in under one hour unless you already have the bulk of the code and just need to quickly transform and polish it for the fields the customer is including this time. I have used this method a few times in Dynamics NAV, where you can take advantage of the structure Microsoft has provided since NAV 2013 via table 370 Excel Buffer.
One last answer, discussed among the NAV guys, was to use RapidStart Services. We, the NAV people, are quite lucky to have the mothership design this service for us. We agreed that this would be one quick way to get the data in, most likely under one hour.
This is what I gathered for this type of time-sensitive request. What would you do when you encounter this type of task?
What is PowerApps? PowerApps is a service for generating cross-platform (iOS, Android, Windows Store) applications. It allows connectivity to different systems and comes with a cloud IDE and a cloud admin interface that lets users publish apps targeting whatever platform you need. The IDE is called PowerApps Studio; it can be downloaded locally from the Windows Store or used as a web application. I designed the app detailed below using the web application.
Most importantly, just like the other power tools, Power BI and Microsoft Flow, PowerApps is accessible not only to professional developers, but also to business analysts, junior developers, and power users in any company. I wrote this app without any code inside PowerApps Studio, just a few Excel-style functions invoked here and there.
Here is what the quick PowerApps app I built does:
The app gets the list of items from the Azure Business Central container via the Item List page exposed as a web service and presents the Item No. and Description of all items on the first screen. The app user can then advance to a details screen for each item. There, if the quantity is low, the user can move to a third screen and generate a purchase invoice for the desired quantity of the selected item and vendor. The result is that, via a second web service, the app generates a purchase invoice in Business Central for the selected item, the selected vendor, and the quantity entered.
There are two main parts to create your app:
1. Create app connectors
To create a Business Central connector go to the File menu in the PowerApps Studio and choose Connections:
The connector to the Azure BC Container instance looks like this:
Once the connector is set, we can access all web services exposed in the Business Central Azure container.
2. Design PowerApps app
The PowerApps Studio comes with 3 main regions:
MasterScreen consists of a gallery control (GalleriaItems) that contains the list of items retrieved via the Items web service data source. You will see later that this web service is page 31 exposed as a web service in the Business Central Azure container.
The OnSelect event of the Forward button uses the Navigate(screen, effect) function to advance to a given screen in the app.
The second screen, DetailScreen, displays a few more fields from the Items web service.
If the inventory is low, the app user can decide to order more by clicking the “Order more” button:
Once the user enters the desired quantity to be included on a Business Central purchase invoice, the app sends a POST request to a new OData web service data source (OrderItemVendorWS) and ultimately generates the purchase invoice with one purchase line.
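To illustrate what that POST amounts to, here is a hedged sketch of the request the app issues against the custom OData service. The endpoint path, company name, and field names (Item_No, Vendor_No, Quantity) are assumptions mirroring the page 50100 “PurchaseItemList” web service described below, and the DNS placeholder is not a real host.

```python
import json

# Build the URL and JSON body for the POST to the custom web service.
# "<container-dns>" is a placeholder for your container's DNS name.
base_url = "https://<container-dns>:7048/NAV/ODataV4"
endpoint = f"{base_url}/Company('CRONUS')/OrderItemVendorWS"

# Field names assumed from the custom PurchaseItem table.
payload = {"Item_No": "1996-S", "Vendor_No": "10000", "Quantity": 5}
body = json.dumps(payload)

print(endpoint)
print(body)
```

PowerApps generates an equivalent request for you when you submit the form bound to the OrderItemVendorWS data source.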
Let’s see the app:
And, in Business Central, the new purchase invoice:
This is what was needed on the PowerApps side; additionally, I needed to plug a few new things into Business Central.
First, create a new AL project and point Visual Studio Code to the Azure container:
Launch.json:
Web services:
Page 50100 “PurchaseItemList” is a new page based on a new Table 50100 PurchaseItem:
Table 50100 PurchaseItem:
Page 50100 PurchaseItemList:
The Purchase Invoice is generated during OnInsert trigger on the new table:
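The trigger code itself isn't reproduced here, but a minimal AL sketch of the idea might look like the following. The field names "Vendor No.", "Item No.", and Quantity on the custom table are assumptions for illustration, not the exact code from the original extension.

```al
trigger OnInsert()
var
    PurchHeader: Record "Purchase Header";
    PurchLine: Record "Purchase Line";
begin
    // Create the invoice header for the selected vendor.
    PurchHeader.Init();
    PurchHeader."Document Type" := PurchHeader."Document Type"::Invoice;
    PurchHeader.Insert(true);
    PurchHeader.Validate("Buy-from Vendor No.", "Vendor No.");
    PurchHeader.Modify(true);

    // Add one line for the selected item and quantity.
    PurchLine.Init();
    PurchLine."Document Type" := PurchHeader."Document Type";
    PurchLine."Document No." := PurchHeader."No.";
    PurchLine."Line No." := 10000;
    PurchLine.Type := PurchLine.Type::Item;
    PurchLine.Validate("No.", "Item No.");
    PurchLine.Validate(Quantity, Quantity);
    PurchLine.Insert(true);
end;
```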
Creating an app with PowerApps involves 3 tasks:
PowerApps comes with versioning and management capabilities across environments (e.g. Dev, QA, Prod). Once your app has been tested by PowerApps app users, you can export it from QA, import it into Prod, and distribute it from there. Select Office and Dynamics 365 plans allow you to create and manage these environments.
More specifically, if you go to web.powerapps.com and click on Solutions, you will be able (with the right license) to follow the Create a new environment link.
A bit of context here … I was trying to create an environment in which to write an extension around workflows and notifications. I needed two users: one to generate the document and one to receive the notification and approve it.
I decided to create a local Business Central sandbox and point my extension to this environment.
So I ran the New-NavContainer cmdlet as below:
When I tried to create the user as a local Windows user on the host, the username in the Business Central sandbox got translated into the SID of that Windows account:
And after I logged in as the new local user, launching the BC sandbox client failed:
Then I saw someone recommending running New-NavContainerNewUser… so I gave it a try:
This is the issue, and I have not been able to find a solution yet. Here is the workaround I used instead.
Create the sandbox with “-Auth NavUserPassword” like below:
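A sketch of what that call might look like, using the navcontainerhelper PowerShell module (the container name and image tag here are just examples, not the exact ones from my setup):

```powershell
# Assumes the navcontainerhelper module is installed (Install-Module navcontainerhelper).
New-NavContainer -accept_eula `
    -containerName "bcsandbox" `
    -imageName "microsoft/bcsandbox:latest" `
    -auth NavUserPassword `
    -updateHosts
```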
You will be prompted for the credentials of the first user (I named mine svi) when you run the script.
Among many other things, the New-NavContainer cmdlet will create a SQL login with the sysadmin server role.
Once the script finishes, launch the BC sandbox client, sign in with the credentials entered at installation, and create a second user.
Now I have 2 users and can continue my tests on Workflows/Approvals:
If you have a solution for a local BC sandbox with the credential type (the -Auth parameter of New-NavContainer) set to Windows and the issue described in the title, please leave a comment with your solution.
Otherwise, I hope this workaround can help anyone with the same issue.
As soon as I started working with containers, more specifically with Azure containers, around mid-December 2018, I quickly ran into a few questions: how can I automate container creation? How can I update a container (scale up or down, override settings)? How can I scale out my configuration? For some of these questions I identified answers; for others, the research is ongoing.
As we established, I am not exactly an expert, but if you’re still here: the process of generating your first Azure container loaded with Business Central is a fairly easy one. Check my previous blog, where I described the process step by step.
I like to mess around, and I did mess around with the tables where extensions are managed (system tables 2000000150 NAV App*), ending up with a corrupt container, or rather a corrupt Business Central. Because I did not have any important data, I could simply delete the container and run through the steps of creating it again manually. But what if I wanted to automate the process? What if I needed to build 5 distinct containers? How can I speed up the process and make it scalable?
Instead of repeating the last blog’s exercise of deleting the corrupt container and re-creating it, I decided to investigate Microsoft’s documentation on deployment templates and deployment parameter files.
This is what I learnt:
In the portal, go to the container created in the previous blog, click on “Automation script”, and download:
Save the downloaded script into a new Visual Studio Code folder. I chose to name it azuredeploy.json.
Above is the deployment template I’m going to work with to automate the creation of new containers loaded with a Business Central image. The current image in the template code, microsoft/bcsandbox:latest, won’t have data. If you want sample data in your new container(s), use this image instead: microsoft/bcsandbox:base. If you need more info about loading your Business Central with data, read Waldo’s and Roberto’s blogs.
Additionally, create a new file for the script itself; I named it templatedeploy.ps1:
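The script can be as short as a couple of Azure CLI calls. A hedged sketch, with the resource group name taken from later in the post and the location chosen as an example:

```powershell
# templatedeploy.ps1 - sketch; assumes you are already logged in with az login.
# Create the resource group (location "eastus" is an example).
az group create --name rg-template --location eastus

# Deploy the template with the values from parameters.json.
az group deployment create `
    --resource-group rg-template `
    --template-file azuredeploy.json `
    --parameters '@parameters.json'
```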
Before we run this script we have to take a closer look at the deployment template downloaded from the portal.
I replaced the highlighted section above with this one below:
I’m adding 3 new parameters, but you could parametrize almost any setting in your deployment template by creating placeholders for it:
Moreover, I needed to create a new file in our project, parameters.json:
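Not the exact file from the post, but a parameters.json for three such parameters could look like this. The parameter names are assumptions and must match the placeholders you added to azuredeploy.json:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "containerName": { "value": "d365bc-container-fromtemplate" },
    "dnsNameLabel": { "value": "d365bc-fromtemplate" },
    "location": { "value": "eastus" }
  }
}
```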
Before running the script “az group deployment create” looks like this:
Now I’m ready to run the PowerShell script:
To be able to log in to Business Central, we need the admin credentials, which can be obtained with the command:
az container logs -g rg-template -n d365bc-container-fromtemplate
To perform some cleanup (remove the resource group and its content) run:
az group delete -n rg-template --yes
Let’s now scale out our deployment to 2 containers:
And after running templatedeploy.ps1 we go to the Azure portal, where we can see 2 containers under our single deployment:
Check the logs, find the admin password, and you’re ready to log in to your container!
That’s what I learnt. What would you add?
To start writing extensions for Business Central we have a few choices: install locally one of the release candidates, which come in the same format as any other Dynamics NAV DVD package; create a locally hosted Docker sandbox; or create an Azure container instance.
As the process of getting a container takes just a few minutes, I prefer to do my extension development and testing in an Azure container.
To generate my Azure container with Business Central, I started by installing the Azure CLI for Windows. You can also use Chocolatey to install the Azure CLI on your local machine.
In Visual Studio Code, open a terminal and, in a PowerShell session, start your Azure work by logging in to your Azure account with
“az login“
If you are already logged in and want to check the logged-in account info:
Next, we need to create a resource group, which is a logical container in Azure, something like an organizational unit in Active Directory or a folder for Windows files.
The command is “az group create” and it takes two parameters: the group name and the location:
Once the resource group is created we can create the azure container instance loaded with the latest Business Central using the following Azure command:
“az container create”
The image above shows the command together with the parameters I used.
For a complete list of parameters for “az container create”, check this.
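For reference, a hedged sketch of such a command; the resource group, names, sizes, and DNS label below are examples, not the exact values from my screenshot:

```powershell
# Create an Azure container instance running the BC sandbox image.
az container create -g rg-d365bc -n d365bc-container `
    --image microsoft/bcsandbox:latest `
    --os-type Windows --cpu 2 --memory 4 `
    --ip-address public --dns-name-label d365bc-container `
    --ports 80 7049 8080 443 `
    -e accept_eula=Y usessl=N
```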
To check the logs and find the login credentials recorded by Azure for the previous command, run “az container logs” like below:
As you have seen above, the admin credentials are displayed and the new Azure Business Central instance appears ready for connections. Let’s check by browsing to the web client link:
Ctrl + Click on the web client link in the picture above opens the Business Central web client:
To see the newly created container in Azure, navigate to the resource group and then to your container:
After entering the credentials from the logs we are in:
Good! We’ve got a Business Central instance in Azure running in a container and we’re ready to code and test extensions!
To point Visual Studio Code at this container, generate a new AL project with the AL: Go! command and change, in launch.json, the value of the server token to the container DNS name created above:
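A sketch of what that launch.json entry might look like; the DNS name is a placeholder, and serverInstance “NAV” matches the sandbox images of that era:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "al",
      "request": "launch",
      "name": "Azure container",
      "server": "http://<your-container-dns-label>.eastus.azurecontainer.io",
      "serverInstance": "NAV",
      "authentication": "UserPassword"
    }
  ]
}
```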
In the next blog I’ll go through the steps of deploying an Azure container loaded with a Business Central image using deployment templates with parameters.
If you liked this article, bookmark my blog or follow me for more about NAV and Business Central.