Tip #62 | “Where-Used” in Visual Studio Code

The “Where-Used” feature in Visual Studio Code is nice, but not always productive. Today I wanted to see where a field was used and came up with an alternative I want to share.

ObsoleteState

If you mark a field ObsoleteState::Pending the compiler will throw warnings everywhere the field is used. This allows you to quickly use the Problems window to jump through the code and check whatever you want to check.

When finished you set the obsolete state back.
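
A minimal sketch of what this looks like, using a hypothetical table and field of your own:

table 50100 "My Table"
{
    fields
    {
        field(1; "My Field"; Code[20])
        {
            // Temporarily mark the field as obsolete; every place the field is
            // used now produces a compiler warning you can step through.
            // Remove these two lines (or reset ObsoleteState) when you are done.
            ObsoleteState = Pending;
            ObsoleteReason = 'Temporary marker to find all usages';
        }
    }
}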

What if…

Sometimes I can be a bit emotional when it comes to changes in the software product I work with on a daily basis. An example of that was my previous blog post, which I took offline in order to do some editing, making it less about emotion and more about facts.

The emotion is probably justified for a few reasons, the most important being that Navision, NAV, Business Central (I stopped caring about the name) provides a living for me and my family, but it’s more than that.

Around our product there is a community that stands out from almost any other community I’ve seen. We have a large number of events that are not organised by Microsoft but by partners, customers or in one case even by one single guy. Not because they make money doing it, but because they think it’s necessary.

What I’ve seen is that even if the product changes and evolves in complexity, the people around it don’t stop loving it. Sometimes Microsoft makes a decision around the product that could have been done differently, but then we always have the community steering them back. It has happened so many times that I’ve lost count.

When I read the comments on my blog it looks as if a majority agrees that working in C/Side is faster than working in Visual Studio Code and that we lost a great deal of simplicity. I know that there are also many who disagree with this statement.

One community that I’ve experienced to be even more passionate about their product is the Great Plains community. When I stated that in my opinion Business Central is replacing the GP product, I did not get love and hugs.

The fact is though that I’m a bit jealous of the Great Plains people, and sometimes I wish that, instead of NAV, GP had been the platform of choice to move to the cloud.

Why, you might ask. Well, because Microsoft is still maintaining GP, adding new features and keeping it compatible with modern versions of Windows. But the software is not being overhauled like NAV.

Imagine that for the next decade or more we could work with C/Side and sell new licenses, including the ability to keep using the Windows client. Many would love that.

The fact that Business Central is based on NAV makes it easy for me to join the new community, but it has cost us a great deal of productivity.

It’s going to be very interesting to see what Microsoft will do after Wave II. I cannot wait to go to Vienna and see the roadmap.

Personally I think we can all use some slowing down after all the changes in the last few years. If the Microsoft slide for Wave III said “stabilise the product” I would stand up and cheer.

From a business value perspective, integration scenarios are the most important area for the future. While Visual Studio Code and AL get all the attention, I would spend my time learning the API, and if I had a vertical solution I would redevelop it on another platform than AL.

The future of Business Central is international. BC is the only flexible SMB solution with localizations and translations all over the globe.

How Do I – Prevent an epic clusterfuck…

Now that the NDA on Business Central Wave II has been lifted and the DVD preview has been released, partners have had time to look at the code Microsoft has refactored.

The reactions vary from marketing-correct to more realistic.

I have a strong opinion about what Microsoft did, and especially how they did it.

First of all, I agree that it’s a great idea to split NAV up into modules and I also agree that the architecture has to be modernised in more than one way.

But that does not mean it had to come with breaking changes, and most of all, it did not have to happen in Visual Studio Code with extensions.

Microsoft is years, maybe a decade too late in starting this project. To write decoupled code you don’t need extensions and you don’t need a fancy code editor. You need discipline and consistency. Especially the latter seems to be where Microsoft is totally off these days, moving away from patterns in a horrible way. (But that’s a different blog.)

As I suggested in many presentations, Microsoft should have added Table Extensions and Page Extensions to C/Side. They should also have added a column to the Object table called “Module”. The compiler should have been enhanced to check whether modules compile on their own.

With these simple changes modularity would have been possible a long time ago and the ecosystem would have been used to it.

Let’s not look back, but let’s look forward; let’s see how it is possible to prevent programmers who take a dependency from being forced to refactor their code.

The problem Microsoft has now manoeuvred itself into is that extensions on AppSource cannot be compatible with both Wave I and Wave II. This means tenants cannot be upgraded until partners are ready with the refactoring, which is a lot of work.

It gets more difficult with per-tenant extensions. To upgrade the code a partner has to compile against Docker or the installed DVD, but how does the customer test against their own data?

Does the customer get to upgrade a sandbox? And if yes, how many times?

The things that are broken are primarily renamed codeunits and functions whose signatures changed.

A simple example is the function to read the contents of a zip file, which changed from returning a temporary table to a list of text.

To prevent this from breaking, Microsoft’s AL team introduced overloading. This allows you to create a new and improved version of a procedure while keeping the old one around and marking it as obsolete for the future.
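
As a minimal sketch of the idea, with hypothetical object and procedure names (not the actual Base Application code): the old overload stays in place, is marked obsolete, and forwards to the new one.

codeunit 50101 "Zip Helper"
{
    // Old overload: kept so that dependent extensions still compile.
    [Obsolete('Replaced by the List of [Text] overload.')]
    procedure GetEntryNames(var TempNameValueBuffer: Record "Name/Value Buffer" temporary)
    var
        EntryNames: List of [Text];
        EntryName: Text;
    begin
        GetEntryNames(EntryNames);
        foreach EntryName in EntryNames do begin
            TempNameValueBuffer.Init();
            TempNameValueBuffer.ID += 1;
            TempNameValueBuffer.Name := CopyStr(EntryName, 1, MaxStrLen(TempNameValueBuffer.Name));
            TempNameValueBuffer.Insert();
        end;
    end;

    // New overload: the improved signature.
    procedure GetEntryNames(var EntryNames: List of [Text])
    begin
        // ... read the zip archive and fill EntryNames ...
    end;
}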

The same can be done with new codeunits. Just leave the old ones there. Point them to the new code if you want to.

BUT MAKE SURE EXTENSIONS CAN EASILY COMPILE AGAINST AT LEAST TWO ADJACENT VERSIONS!

This way of moving API-related code has been normal in all frameworks for decades. Why can a huge company like Microsoft not do this with Business Central? I just cannot get my head around it.

I know it’s cool to be an MVP. I’ve been an MVP for 11 years, traveled the world and it gave me opportunities I could have never dreamed about. That does not mean you cannot have your own opinion and it does not mean you always have to agree with what Microsoft does.

It’s going to be interesting to see what happens in the future. I am in favor of continuing to break the functional app into pieces with contracts. I will explain how I would try to do this.

My favorite example is Fixed Assets. Did you ever try and see what happens if you remove the 56xx objects from C/Side?

Large parts of the application will no longer compile: codeunits like 12, 80 and 90, and tables like 37, 39 and 81.

To prevent this you’d have to implement event publishers and introduce enumerations. This would allow moving code that has dependencies into its own module.
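
A minimal sketch of the event publisher part, again with hypothetical object names rather than actual Base Application code: the core app raises an event where it used to call the Fixed Assets objects directly, and the FA module subscribes to it from its own app.

codeunit 50102 "Sales Post Events"
{
    // Published by the core app at the point where it used to call the FA objects directly.
    [IntegrationEvent(false, false)]
    procedure OnAfterPostSalesLine(var SalesLine: Record "Sales Line")
    begin
    end;
}

codeunit 50103 "FA Posting Subscriber"
{
    // Lives in the Fixed Assets module, so the core app no longer needs
    // a hard reference to the 56xx objects.
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales Post Events", 'OnAfterPostSalesLine', '', false, false)]
    local procedure HandlePostedSalesLine(var SalesLine: Record "Sales Line")
    begin
        // ... Fixed Assets specific posting logic ...
    end;
}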

This needs to be done without changing any of the functionality and then taken into production. Only after a successful launch without functional changes can one consider changing things.

But, the changes should then be done to a new app while leaving the old one intact.

This is probably not something you would want to do with Fixed Assets, but with Production, Warehouse Management or Inventory it makes more sense. Especially Warehousing is in a horrible state because it’s hard to extend. It was never designed for extensibility.

It does not have to stay that way if the old module can be replaced with a new module.

Maybe I am just dreaming or oversimplifying things, but I think it’s realistic to say that with the introduction of the system app Microsoft could have been more careful, shown more patience and allowed a more phased approach.

After all we are talking about a business solution that is critical to the companies using it. Microsoft made a strong promise about upgradability that can and should be kept.

Partners have the responsibility to be more critical of their software vendor. Don’t just take things for granted – ask questions, be critical, be very critical, and come up with constructive examples and ideas for where things need to be improved.

Just my $0.02.

Working with Azure Blob and NAV

UPDATE 2: The code is (finally) published here: https://github.com/markbrummel/azure-blob-storage-public

UPDATE!! The code will be published. Please allow me some time to do so. Making the code available will be part of the next iteration of the ForNAV report pack module.

This is something that’s long overdue; I wanted to write this before my summer vacation.

My reason for holding back is that I want to share all the code for this project and this needs cleaning up. This is still not done and if you want the code you’ll have to contact me.

Why am I still writing this? I am actually writing this from the “International” airport of Cork, Ireland, where I spent the day with my friend Tim Grant.

Tim and I go way back to when we worked together on the Design Patterns project with Microsoft, and the reason for my visit was to help him with his go-to-cloud strategy.

Last spring we moved all EDI and E-Invoicing at my customer Vos Transport from On-Prem to Azure Blob Storage and Logic Apps.

In total we moved 4.5 million files to the cloud and migrated a few dozen EDI processes to use Azure Blob storage as queues.

The cost of storing and running this is less than 100 euros per month and it is insanely stable. So stable that I had to use Statical Prism today to find some of the code and explain it to Tim.

I’ll let this post sit here for a while and see what happens. If I get spammed to share the code I’ll spend the time cleaning it up. If nothing happens then no time is wasted.

Enjoy…

Tip #61 | .gitignore for AL projects

The Business Central community loves Git, but GitHub does not seem to even know we exist. It recognises our projects as Perl projects and there is no suggested .gitignore template for AL.

Why .gitignore?

It’s generally considered best practice to use Git for managing uncompiled code, but not to store the result of a project, nor its dependencies.

Also, settings that may vary from developer to developer are best not stored, since doing so would continuously lead to conflicts when pushing and merging.

For AL projects this means we need to exclude our .app file (the result), the .alpackages folder (our dependencies) and the .vscode settings folder.

Or, if you want to copy and paste:

.alpackages/
.vscode/
*.app

NOTE: You should create your .gitignore before the initial commit. Removing files later is a tedious process.

Tip #60 | Suppress Warnings in Visual Studio Code

One of the most annoying things about writing AL code in Visual Studio Code is getting warnings that you cannot fix. Simply impossible.

My “favorite” warning is the one about sorting on FlowFields.

For almost a decade it’s been possible to sort on FlowFields from code and in reports, and in most cases it works fine. On larger datasets it might require a covering index for performance.

This warning is a joke because it suggests adding a FlowField to the keys. Even if it were a normal field, in an extension you cannot influence keys in the Base Application anyway.
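
For context, a minimal sketch of the kind of code that can trigger this warning, sorting the Customer table on the “Balance (LCY)” FlowField:

codeunit 50104 "Customer Sorting Sample"
{
    procedure TopCustomersByBalance()
    var
        Customer: Record Customer;
    begin
        // Sorting on a FlowField that is not part of any key - this is the kind
        // of statement the compiler flags, even though it works at runtime.
        Customer.SetCurrentKey("Balance (LCY)");
        Customer.SetAscending("Balance (LCY)", false);
        if Customer.FindSet() then
            repeat
                // ... process customers from highest to lowest balance ...
            until Customer.Next() = 0;
    end;
}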

It’s recommended to fix this per project. To do this you need to add a settings.json file to your .vscode folder that points to a custom ruleset file, for example with "al.ruleSetPath": "./custom.ruleset.json".

The content of that ruleset file should be something like this:

{
    "name": "ForNAV",
    "description": "ForNAV Rules",
    "generalAction": "Hidden",
    "rules": [
        {
            "id": "AL0432",
            "action": "Hidden",
            "justification": "Marked for removal, be careful with this rule…"
        },
        {
            "id": "AL0254",
            "action": "Hidden",
            "justification": "Not possible to solve"
        }
    ]
}

Tip #59 | Multiple Start Configurations in Visual Studio Code

When developing extensions for Business Central you have a wide array of publishing options to choose from.

My most used options when working on the ForNAV Customizable Report Pack are our Sandbox and Docker.

Testing is best on the Sandbox for two reasons. First, because all the Azure Active Directory stuff actually returns something, which is useful for licensing scenarios. Second, because you can easily share the result with the team since everyone is on the same Sandbox.

Docker is useful when you don’t want to test on the current version but on an older or vNext instance.

Lastly, it’s also possible to install Business Central on your own infrastructure, although this is a dying breed.

In your Visual Studio Code project you can specify how you want to publish in the launch.json file, but did you also know you can set up multiple configurations and then choose one at the time of publishing?

This is how it could look:

{
    "version": "0.2.0",
    "configurations": [
        {
            "type": "al",
            "request": "launch",
            "name": "Docker",
            "authentication": "UserPassword",
            "startupObjectType": "Page",
            "breakOnError": true,
            "launchBrowser": true,
            "server": "http://bcsandbox",
            "serverInstance": "NAV",
            "enableLongRunningSqlStatements": true,
            "enableSqlInformationDebugger": true,
            "schemaUpdateMode": "Synchronize"
        },
        {
            "type": "al",
            "request": "launch",
            "name": "Microsoft cloud sandbox",
            "startupObjectId": 6188475,
            "startupObjectType": "Page",
            "breakOnError": true,
            "launchBrowser": true,
            "enableLongRunningSqlStatements": true,
            "enableSqlInformationDebugger": true,
            "schemaUpdateMode": "Synchronize"
        }
    ]
}

Now, when you publish your code, Visual Studio Code will ask which configuration to use.

NOTE: Your credentials cache is shared across these configurations. You will need to clear the credentials cache if you switch.

TIP: You can also use this to create a separate config for Synchronize and Recreate.