Category Archives: Dynamics NAV

Extending Role Centers

Today I wanted to extend a Role Center with a Page Extension and I noticed Microsoft has updated terminology.

HomeItems = Embedding

To add a list to the Home Items you must use Embedding.

Other changes:

  • Related Information = Navigation
  • New = Creation
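Under the new terminology, a Role Center page extension might look like this minimal AL sketch (the object ID, names and the target Role Center are illustrative, not from a real project):

```al
// Sketch only: object ID, action names and target Role Center are illustrative.
pageextension 50100 "My Role Center Ext" extends "Order Processor Role Center"
{
    actions
    {
        // Formerly "HomeItems": the lists shown in the Home area
        addlast(Embedding)
        {
            action(MyCustomers)
            {
                Caption = 'Customers';
                RunObject = page "Customer List";
                ApplicationArea = All;
            }
        }
        // Formerly "New": document creation actions
        addlast(Creation)
        {
            action(NewSalesOrder)
            {
                Caption = 'Sales Order';
                RunObject = page "Sales Order";
                RunPageMode = Create;
                ApplicationArea = All;
            }
        }
    }
}
```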


In the Windows Client the result is the normal behavior: the Cues are grouped with the Activities part, which in this case was also created using a Table Extension and a Page Extension.


Great Job guys!


The FOB is planning its retirement

In software we’ve invented all kinds of terminology to make it sound as if removing a feature is great. Terms like sunsetting and retirement give you a warm feeling. Who does not love to watch the sun go down with a beer and a loved one?

In reality it means that software that once was is no longer and we’ve had plenty of that in our beloved Navision product, now referred to as Business Central.

FOB is an abbreviation of Financials Object, just like FLF is short for Financials License File.

We use FOB files to move objects from one database to another using C/Side.

In the future of Business Central there is no room for C/Side. Microsoft is currently working very hard to make the Visual Studio Code experience and the ALC.exe compiler mature enough to replace C/Side. On top of that, C/Side also manages the system tables and a lot more, so “something else” has to take over those tasks.

But this post is not about C/Side. C/Side will be there in October when NAV 2019 is released just like the Windows Client and the Fob option. But that does not mean it is smart to continue using them.

NAV Architecture, Design & Code

Loyal readers of my blog will not be surprised to read that I am not a fan of how NAV is architected, designed and coded.

I’m not talking about the base design principles and patterns. They are fine and should be used, but also by Microsoft.

NAV is one monolithic application in which the functional modules cannot be recognised in the code structure.

This is not a problem in C/Side, because there we can easily filter on object names and object numbers, and most of us somehow leverage the Version List to filter on objects that have been modified for a project or customer.

Making a Fob

When we make a Fob we apply a filter in C/Side and export the files. This export does not necessarily need to contain all changes. We can cherry pick.

I did this for one of my customers this week. We run a hybrid NAV 2018 system with close to 2,000 new objects added to NAV. We wanted to ship some of the modifications in DEV to PROD, but not all of them.

These days this is easy: just mark the objects, export them, put them into the Acceptance system, do one final test and go live.

What if we don’t have Fobs anymore…

Let’s say that I move all of this customer’s modifications to an extension. I have actually tried this a few times.

When NAV 2018 was just released I ran ExportToNewSyntax and Txt2Al and quickly gave up. I had hundreds, if not thousands, of compile errors, caused by many things: bugs in the export, bugs in the converter, and the fact that NAV 2018 extensions do not support DotNet, which breaks a big percentage of our code.

I did the same thing a few weeks ago and to my surprise I was down to 414 errors. Many things had been fixed, and DotNet variables no longer error out, although they don’t work yet.

When NAV 2019 ships in October, these 414 errors will probably be down to a number that is manageable and fixable. DotNet will work and I can migrate everything to an extension.

Why is this a bad idea?

If I migrate all my changes to one super extension, my life will become impossible. It will be very hard to work in Visual Studio Code because finding objects is hard, and the compiler will be very slow because it has to evaluate 2,000 objects with each keystroke.

Working on the same database with more than one developer will be hard, even if we use Git and GitLens. Sure, Git will merge, but it will make mistakes and I will lose time.

Moving the parts of my changes that are finished and tested, while leaving out ongoing modifications, will be extremely hard.

The Solution

I need to break down my solution into smaller projects with dependencies. If I break down my 2,000 objects into groups that belong together and compile together, I can ship different versions when they are tested. I can write automated tests for these components and have different developers working on different parts of the application without them running into Git merge issues.
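In AL, this breakdown maps onto the dependencies section of each project’s app.json. A hedged sketch (all IDs, names and versions are made up for illustration):

```json
{
  "id": "11111111-1111-1111-1111-111111111111",
  "name": "Sales Components",
  "publisher": "MyCompany",
  "version": "1.0.0.0",
  "dependencies": [
    {
      "appId": "22222222-2222-2222-2222-222222222222",
      "name": "Core",
      "publisher": "MyCompany",
      "version": "1.0.0.0"
    }
  ]
}
```

The compiler then enforces that "Sales Components" only references objects in its own project or in "Core", which is exactly the component boundary described above.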

I can have junior developers working on simple extensions and have senior developers work on the core objects that are difficult to maintain.

How To Get There?

I don’t know. Because DotNET is not supported yet I’ve not yet had a chance to move stuff around easily.

In theory it should be as easy as moving .al files around into different projects and have the compiler test if everything still works.

You’ll probably run into situations where you have to move code from OnValidate triggers to events because the code belongs in different components.
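A minimal sketch of that decoupling in AL (object IDs, names and the field are invented for illustration): the core component publishes an integration event, and the dependent component subscribes to it instead of living inside the trigger.

```al
// In the core component: publish an event instead of calling the other module
codeunit 50100 "Core Events"
{
    [IntegrationEvent(false, false)]
    procedure OnAfterValidateMyField(var Item: Record Item)
    begin
    end;
}

// In the dependent component: subscribe to the event
codeunit 50110 "Component Subscriber"
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Core Events", 'OnAfterValidateMyField', '', false, false)]
    local procedure HandleMyField(var Item: Record Item)
    begin
        // component-specific logic that used to sit in the OnValidate trigger
    end;
}
```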

Dude, is this still simplicity?

Right. This is exactly why I am debating this for my customer. For the last 20 years or so we’ve been using NAV as a development environment. The solution has very little dependency on NAV.

With Extensions and Visual Studio Code Microsoft is moving into a more Object Oriented approach with dependencies and a high level of abstraction.

For this reason I am also comparing NAV to .Net Core. That would allow us to use the full C# stack, Entity Framework, and any HTML front-end framework like Angular, React, or whatever the framework of the day will be in five years.

Extensions as Micro Services

Whatever I can do today as an Extension I do as an Extension. We have 8 extensions already and none of them have more than 20 objects.

With a little help from the ForNAV Object Explorer this is very easy to manage. Every extension gets 10 object numbers assigned from our license. When an extension needs more than 10 numbers, it is getting too complex and has to be broken up into more components.
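Those 10 numbers per extension can be pinned down in the project’s app.json so the compiler rejects objects outside the range. A sketch with invented numbers (depending on your AL version the key is the single "idRange" object or the "idRanges" array shown here):

```json
{
  "idRanges": [
    {
      "from": 50140,
      "to": 50149
    }
  ]
}
```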

How Many Extensions?

We will probably end up with about 30 extensions, maybe less if we move some of our components to .Net Core.

In .Net Core we are going to take the same approach: small projects on a shared core. As we speak, we have four projects that have grown organically over time. This summer the plan is to synchronize them and generate a NuGet package that allows communication with the Business Central API.

Who knows, this NuGet package might become available for you as well.



Move Bespoke Symbols to Production

Today I had a quick chat with my brother about generating custom symbols from C/Side and how to manage that.

Here is how I did it.

At the company I work for we have a DTAP environment and we (normally) only code in Development. Fobs and Extensions are moved from Development to Test, Acceptance and Production.

The question is how to control your symbols and move them along with each iteration.

Surely you can generate symbols in your production database, but that might not be super smart. Alternatively, you can move the .app file that holds your Application symbols together with the rest.

You need two PowerShell commands.

Unpublish-NAVApp -ServerInstance NAV2018PROD -Name "Application"

Publish-NAVApp -ServerInstance NAV2018PROD -Path "\\Symbols\" -PackageType SymbolsOnly -SkipVerification

You need to make sure that the development flags in the Server Config are set to false.

Dev Endpoint

Breaking Symbols

Sometimes Symbols can break if someone in a Development database changes an object in C/Side without saving the changes.

Another developer working on an extension can be hit by that if they decide to refresh the symbols at the same time.

The simplest and official answer is to introduce distributed development, where each developer has their own Development system and a build server creates the Test environment if the build passes and the automated tests execute.

But in our ecosystem many partners seem to prefer centralised development. In that case you could consider also disabling the loading of symbols on object changes and introducing a symbol database (SDTAP).


Another alternative might be to introduce versions of the Application symbols, but this is something I haven’t tried myself.

Not sure if that would work, but I thought it was worth sharing.

Question: How do you manage your custom Application Symbols?

Performance Measuring of Large Reports

In the ForNAV standard report pack we have a few reports that are traditionally slow to run. One of my design goals when developing these reports was to see if I could increase performance.

The names of the challenged reports will sound familiar to those who have been in our channel for a long time.

  • Aged Accounts Receivables & Payables
  • Inventory to G/L Reconcile

The latter only exists in the North American localization, but whoever spends a lot of time on Mibuso has seen the questions about the performance of these reports.

Why are they slow?

Both reports are slow because they loop through the entry tables one by one, which means they get slower as the tables grow over time. Both reports were created a long time ago; in the case of the Aged Accounts Receivables & Payables report, even before we had detailed ledger entries.
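To illustrate the pattern (this is a generic AL sketch with invented variable names, not how the ForNAV reports are actually implemented), compare a record-by-record loop with letting SQL aggregate through a SIFT-backed CalcSums:

```al
// Slow: touches every ledger entry, so runtime grows with database age
DtldCustLedgEntry.SetRange("Customer No.", Customer."No.");
DtldCustLedgEntry.SetFilter("Posting Date", '..%1', AgingDate);
if DtldCustLedgEntry.FindSet() then
    repeat
        Total += DtldCustLedgEntry.Amount;
    until DtldCustLedgEntry.Next() = 0;

// Faster: the same filters, but the sum is computed server-side in one call
DtldCustLedgEntry.CalcSums(Amount);
Total := DtldCustLedgEntry.Amount;
```

The per-entry loop issues a read for every record, while CalcSums resolves to a single aggregated query, which is why the gap widens as the entry tables age.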

Exactly how slow?

So, this is the question everybody asks, and the only true answer is “it depends”. It depends not just on the size of your database, but even more on the ratio between Master Data and Entries.

Also, you need a reasonable amount of data to test this, not just a CRONUS database with Microsoft Demo data.

Long live the upgrade business

When I started my freelance career 12 years ago I decided to step into upgrades. Not alone, but with the help of my good friend Tom Wickstrom. Tom has probably done thousands of upgrades over the last decades.

Tom picked two databases for me that I’ve used to test with. One database is about 60 GB and the other about 50 GB. These are a good representation of a professional bespoke NAV system.

The ratios in these databases are different, especially at the Item level.

System A (50 GB, 10 years of posting data):
  • 2,741 customers with 71,583 Cust. Ledger Entries (ratio 26.1) and 160,287 Detailed Entries
  • 380 items with 1,948,702 Item Ledger Entries (ratio 5,128.2) and 8,198,945 Value Entries (ratio 4.2)

System B (60 GB, 11 years of posting data):
  • 9,463 customers with 269,694 Cust. Ledger Entries (ratio 28.5) and 552,562 Detailed Entries
  • 134,114 items with 1,146,037 Item Ledger Entries (ratio 8.5) and 2,015,607 Value Entries (ratio 1.8)

On average each customer has made between 25 and 30 purchases in 10 years. The biggest differences are the number of sales per item and the number of value entries per item ledger entry.

How do we Measure

The databases are installed on the same SQL Server and the servers are warmed up. We run each report once before measuring, then take the average of three consecutive runs. We run using the Windows client: no Azure, no Docker, no VMware or Hyper-V. Pure bare metal. Each drive is an individual 500 GB SSD.

  • SQL Version: 2012
  • NAV Version: 2017
  • ForNAV Version:
  • Memory: 32 GB
  • CPU: Intel Core i7-4770, 3.40 GHz
  • Disks: C drive: SQL Server and NAV server executables; E drive: MDF database file; F drive: NDF database file; G drive: LDF database file


Microsoft’s Performance

  • Inventory to G/L Reconciliation: System A 12:20, System B 7:09 (min:sec)
  • Aged Accounts Receivables: System A 0:17, System B 1:07 (min:sec)


ForNAV Performance

  • Inventory to G/L Reconciliation: System A 1:25, System B 4:00 (min:sec)
  • Aged Accounts Receivables: System A 0:04, System B 0:08 (min:sec)



The ForNAV reports are up to 8 or 9 times faster than the Microsoft RDLC reports. The difference gets smaller as the ratio between Master Data and Entries gets lower which makes perfect sense.

How did we do this?

Well, although it is not a secret, I am not going to tell you. We wrote this blog post to trigger you to look at our product.

There are a lot of goodies in our report pack if you are a modern programmer. Where feasible we use the MVC pattern, Dependency Inversion and Polymorphism. This means that the Aged Receivables and Payables reports share the same code where possible, which is then reused in the Statement report.


JavaScript Objects

We use JavaScript objects to show grand totals. In ForNAV you can code in JavaScript, which includes creating objects that help you write clean and fast front-end (report-side) code.

Prevent C/Side from using ID’s used by Extensions

Last week the inevitable happened: I created a page in C/Side with an ID that had already been used by an extension.

Microsoft is aware of this issue but does not want to prevent it from happening.

The problem is that at first everything seems to work; your new C/Side page will run just fine. I only noticed the conflict after a restart of the Service Tier, because the restart actually performs a check, but you have to dive into the Windows Event Log to find the result.

The Fix

Extension objects are stored in the NAV App Object Metadata table. You can write a SQL trigger that checks whether a record with the same ID and Type exists in that table. This should show a message like this.

Error Extension

The Trigger can look something like this:

USE [NAV] -- change DB name here
GO

IF EXISTS (SELECT * FROM sys.triggers WHERE object_id = OBJECT_ID(N'[dbo].[CheckExtensionObject]'))
    DROP TRIGGER [dbo].[CheckExtensionObject]
GO

CREATE TRIGGER [CheckExtensionObject] ON [dbo].[Object]
AFTER INSERT
AS
BEGIN
    DECLARE @ins_count int
    SELECT @ins_count = COUNT(*) FROM inserted
    IF (@ins_count <> 0)
    BEGIN
        -- Block the insert when an extension already owns this object type and ID
        IF ((SELECT COUNT(*) FROM inserted
             INNER JOIN [dbo].[NAV App Object Metadata] obj
                 ON obj.[Object Type] = inserted.[Type] AND obj.[Object ID] = inserted.[ID]) <> 0)
            RAISERROR('Object already exists as an extension object', 18, -1, '')
    END
END
With thanks to Jörg Stryk.

Translate Stuff the right (hard) way

Looking for feedback as always.

As you might know (or not) we ship a set of standard reports with ForNAV which are optimized for the product.

With these reports you’ll also get a set of tables, pages, codeunits and in the next version even a query.

Out of the box, ForNAV is translated to all the NAV 2018 languages plus Portuguese.

For sure we also like to translate the standard reports and we want to do it right.

The LCS Translation Service

I’ve taken a look at the LCS Translation Service provided by Microsoft. This service runs on XLIFF files which is an optional format for Extensions.

Since our product still runs best in C/Side (you can run it as an extension if you so desire), I figured I could still use LCS and convert the XLIFF back to C/Side, as long as the translation is OK.

If you google this subject you land on Gunnar’s blog, and he has an example result on his GitHub.


The screenshot shows the result of the translation by LCS to Dutch, and this is not something I would enjoy shipping.

The Alternative

Now I am not sure if I will make a complete fool out of myself but let me share how I get my translations.

As you know, standard NAV has a lot of standard tables, pages and codeunits that are translated by Microsoft. Microsoft inherited most of these translations from Navision, as the base application has never changed much since 2002.

The Navision translations were always very consistent and could be trusted.

You can export the translations from C/Side, and that is what I did. I downloaded NAV 2018 in the languages I needed and exported the translations into files.

The next thing I did was import these files into some tables I’ve created in NAV. Now I have my own database with translations.



There is one caveat: C/Side is not Unicode. To make sure I get the correct characters in my strings, I have a virtual machine that runs with the Danish codepage. My Danish improves with every report I translate.

Discipline & Common Sense

The next thing I did was go through my objects one by one and search for translations. Some of our reports come from the North American version of NAV, such as the Inventory to G/L Reconciliation report, so some of the captions don’t exist in the standard translations.

Inventory Value is an example of that, and so is Received Not Invoiced.

However, Inventory Value (Calculated) exists. How hard is it to remove one word from a string and use that translation? It feels a lot safer than using LCS (which I did not try).

Receive also exists, as does Shipped Not Invoiced. Common sense helps me create a decent translation.


Translation sucks, but it is a requirement for the success of your ISV solution. It has to be handled with care, because there is only one first impression of your application and a weird translation can destroy it.

Clean code also helps. If you normalize reusable code into libraries, you can reuse the translations once you have them.

Please share your ideas in the comments.