12 November 2019

How to change the version of the AL Language compiler

Microsoft released a new update to the AL Language extension for VS Code. With this update, the compiler runs a few pre-checks before compiling the objects and packaging them into a .app file.

Most probably you will have to make minor changes to your extension codebase to work with the new compiler. Otherwise, you will end up with errors like the one below:


Waldo has already posted a few tweets about this new update.

If you are not ready to deal with these new warning or error messages and simply want to compile using the old version, this blog post will help you.

If you want to go back to a previous version of the AL Language compiler, you can use the following method:
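The exact build numbers on offer will depend on your installation, but the general approach (a sketch based on the standard VS Code extension UI) is:
  1. Open the Extensions pane in VS Code (Ctrl+Shift+X).
  2. Find the AL Language extension, click the gear icon, and choose "Install Another Version...".
  3. Pick the previous version from the list and reload VS Code when prompted.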

But make sure to go back to the latest version and resolve the errors as soon as possible, since this is just a workaround.

Please provide your feedback with a comment. 
Thank you and Regards,
Tharanga Chandrasekara

24 October 2019

Platform property is still required in app.json

With the Wave 2 release, Microsoft converted all the C/AL objects into AL. Since they want to lead by example, they wanted their own code to be delivered as extensions as well, and they wanted to split the objects based on functionality. After converting to AL, Microsoft put the objects into the two extensions below:
  1. System Application
  2. Base Application
If you can remember, in the app.json file we had to specify the "platform" and "application" versions to download symbols, just like below.
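A minimal sketch of that older app.json shape (the names and version numbers here are illustrative, not prescriptive):

{
    "name": "My Extension",
    "publisher": "My Company",
    "version": "1.0.0.0",
    "platform": "15.0.0.0",
    "application": "15.0.0.0"
}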

With the Wave 2 changes, our expectation was that we could simply remove the "platform" and "application" values from app.json and set dependencies on the "System Application" and "Base Application" extensions. This was based entirely on the assumption that Microsoft had completed splitting the objects.

As soon as we removed the "platform", we got errors on a few tables such as "Date" and "Integer". It is clear that Microsoft didn't finish splitting the objects into the main two extensions, and there are objects still sitting on the platform. Therefore we had to put the "platform" back into app.json.
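So the working shape, as a sketch, keeps "platform" alongside the new dependencies (the app IDs below are the commonly published Microsoft ones; verify them against your own symbols):

{
    "platform": "15.0.0.0",
    "dependencies": [
        {
            "appId": "63ca2fa4-4f03-4f2b-a480-172fef340d3f",
            "name": "System Application",
            "publisher": "Microsoft",
            "version": "15.0.0.0"
        },
        {
            "appId": "437dbf0e-84ff-417a-965d-ed2bb9650972",
            "name": "Base Application",
            "publisher": "Microsoft",
            "version": "15.0.0.0"
        }
    ]
}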


Once you download the symbols, you will see 3 AL packages in your .alpackages folder.

We will have to keep this until Microsoft completely moves the tables and functionality into the main two apps.

Please provide your feedback with a comment. 
Thank you and Regards,
Tharanga Chandrasekara

How to handle breaking changes?

Wagner and I are on our way to Vienna to present at Directions EMEA, and we are now on our second flight after flying over 15,000 km from Auckland, New Zealand, to Dubai. We have another 6 hours on this flight before we set foot in Vienna. I went through the in-flight entertainment system, but almost all the movies are either not my type or ones I have watched already. What better way to spend 6 more hours than writing some blog posts on the Wave 2 release? Here we go.

What are these breaking changes?
If a schema change may result in data loss, it can be categorized as a breaking change. Changing the primary key of a table, changing a field's data type, or reducing a field's length can all be categorized as breaking changes. For example:
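As a minimal sketch (the object number and names here are hypothetical), each of the commented changes below would break the schema of a published extension:

table 50100 "My Setup"
{
    fields
    {
        // Breaking: changing this field's type or length, e.g. to Code[10],
        // or altering the primary key defined below.
        field(1; "Primary Code"; Code[20]) { }
        // Breaking: reducing the length (Text[50] -> Text[20]) or changing
        // the data type (Text -> Integer) once customer data may exist.
        field(2; "External Id"; Text[50]) { }
    }
    keys
    {
        key(PK; "Primary Code") { Clustered = true; }
    }
}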



What is the big deal about it, as we used to do this all (most of) the time?
Exactly! If you are used to C/AL, you must be wondering whether this is even something to blog about. Oh yes! Because with the Wave 2 release, Microsoft officially stopped shipping C/AL code with the product DVD. Even if you want to run the Microsoft Dynamics 365 Business Central on-premises version just because you want to make C/AL changes, you can no longer do that with the latest version. This means the fob is no longer available, and with it, the force-sync option is gone too.

That is why we need to handle breaking changes Microsoft's way. At the start, this whole process will feel like it takes too much time, but when you think about it, it is all about giving a better (seamless) experience to the customer.

How do we handle breaking changes?
If your extension is only in a sandbox, then you are lucky, because you have a couple of easy options.
  1. Drop the sandbox and create a new one.
  2. In the launch.json file, change the "schemaUpdateMode" parameter to Recreate or ForceSync (see the sketch after the option descriptions below).

Here is what each value does:

ForceSync: It forces the schema changes through regardless of data loss. It does not guarantee any data-loss prevention; it simply applies the changes.

Recreate: It will drop all the tables and table extensions created by the extension and recreate the schema from scratch. This action will lead to data loss.
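As a sketch, the relevant part of a launch.json configuration looks like this (the name and environment values are placeholders):

{
    "name": "My Sandbox",
    "type": "al",
    "request": "launch",
    "environmentType": "Sandbox",
    "schemaUpdateMode": "Recreate"
}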

These options are acceptable since it's a sandbox and you may be comfortable losing data. However, the above-mentioned options are not available in the production environment.

Then how to handle breaking changes in production?
It all depends on the change you want to make. If the change is related to a field, then your option is to obsolete the field and create a new field on the table, then write an upgrade codeunit to transfer the data from the old field to the new field during the upgrade.

Dimitri wrote a very nice blog post on how to write an upgrade codeunit to transfer data. I might create a similar post on that, but I suggest you read it and get a good understanding of how things work in SaaS.

What if the change is related to the Primary Key of the table?
Unfortunately, you will have to obsolete the entire table and create a new table, then write an upgrade codeunit to do the data transfer. There are a few things to keep in mind when obsoleting a table or a field. Make sure all the table relations are commented out in the code.

As an example, say you create a table and one of its fields has a table relation to the dimension code. Then you go ahead and obsolete your table. It all looks good and works fine as long as the user does not delete or rename the dimension code.

As soon as a user renames or deletes the dimension code, the system will try to modify all the related fields, and it will throw an error to the user. This seems to be something Microsoft should fix, but for the time being, just make sure that when you obsolete any field or table, you remove the table relation, to be on the safe side.

How to make a field or table obsolete?
If you want to obsolete the entire table, then you can add the obsolete state and reason just like below.
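A minimal sketch (the object number and names are hypothetical):

table 50100 "My Old Table"
{
    // Marks the whole table as obsolete. Pending keeps the schema in
    // place so an upgrade codeunit can still read the data out of it.
    ObsoleteState = Pending;
    ObsoleteReason = 'Replaced by table "My New Table".';

    fields
    {
        field(1; "Primary Code"; Code[20]) { }
    }
}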

If you want to obsolete just a field, then you need to add the state and reason at the field level, as below.
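Again as a sketch; note the table relation is commented out, per the dimension example above:

field(2; "Dimension Code"; Code[20])
{
    ObsoleteState = Pending;
    ObsoleteReason = 'Replaced by field "Shortcut Dimension Code".';
    // TableRelation = Dimension; -- removed so renaming or deleting the
    // related record no longer tries to modify this obsolete field.
}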

Once you do that, just make sure to handle the data upgrade as well. You need to write a small codeunit to move the data from your old table/field to the new table/field.
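A bare-bones upgrade codeunit might look like this (all object names are hypothetical, and real code should also guard with upgrade tags so it runs only once):

codeunit 50101 "My Upgrade"
{
    Subtype = Upgrade;

    trigger OnUpgradePerCompany()
    var
        OldRec: Record "My Old Table";
        NewRec: Record "My New Table";
    begin
        // Copy every row from the obsolete table into its replacement.
        if OldRec.FindSet() then
            repeat
                NewRec.Init();
                NewRec."Primary Code" := OldRec."Primary Code";
                NewRec.Insert();
            until OldRec.Next() = 0;
    end;
}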

What annoys me the most?
A few things annoy me about this process:
  1. All the obsolete tables and fields are still in the schema, which means they stay in the extension code.
  2. Finding a new name: as a developer, what I always struggle with is coming up with a short but good name for a field or a table. Then having to drop that field or table? That is just sad.
Is there any solution for this?
As of this moment we don't; we will have to come up with new names and maintain all the obsolete fields and tables in our codebase. But the good news is that Microsoft knows this and is working on a solution: they will release functionality to clean up the schema. 😊

For now, just be careful and think twice before releasing any schema changes to the production environment.

Please provide your feedback with a comment.
Thank you and Regards,
Tharanga Chandrasekara

09 October 2019

Directions EMEA 2019 : We safely landed in Vienna

We safely landed in Vienna after flying more than 18,000 km. This is just normal for most of us who live on the edge of the world. As our prime minister used to say, even the shortest flight to our neighbor is more than 4 hours. So flying for more than 24 hours is quite common for most of us.
Sometimes it feels like New Zealand is far away from the rest of the world, but that is the same feeling that holds us tight to the beautiful island nation. This is my first time in Vienna and also at Directions EMEA. Wagner and I presented at the last two Directions ASIA conferences, but I'm sure the vibe is totally different at Directions EMEA.

We will be conducting 3 sessions, and all of them are focused on integrating Microsoft Dynamics 365 Business Central with the outside world with the help of Azure Integration Services. We are planning to take the audience through some very interesting Azure technologies that are built to make integration life easy.

If you are planning to work, or already working, with integration scenarios, you might want to book your slot with us.

Sessions :
"Logic Apps x Microsoft Flow, which one should I choose?" | Date: Wednesday, October 9, 2019 | From: 15:30 to 16:15 | Room: HALLG1

"Unlocking new integration potential for Dynamics 365 BC with Azure Event Grid and Azure Integration" | Date: Thursday, October 10, 2019 | From: 11:00 to 11:45 | Room: HALLG2

"Exploring Azure Integration Services - Extending D365 BC with the power of integration" | Date: Thursday, October 10, 2019 | From: 16:00 to 16:45 | Room: HALLG1

Theta: 
This year we have 4 people attending Directions EMEA representing THETA (NZ). Joerg Rau (Head of Theta ERP) is a frequent attendee of Directions EMEA, and together with him, Carl Head, Wagner Silveira, and I are planning to attend as many sessions as we can, to grasp the latest technologies related to Microsoft Dynamics 365 Business Central.

The knowledge we gather during Directions EMEA helps us add more value and flavor to the solutions we deliver to our customers back home. Even though we are far away from the rest of the world, it keeps us toe to toe with all the other partners in Europe and the US.

If you see us during the conference (I'm sure you will), don't be a stranger, just say hi. I would love to get to know you all, and who knows what possibilities might open with just a small "hi".

See you tomorrow!

14 July 2019

Too many requests reached

What is the maximum number of API requests Microsoft Dynamics 365 Business Central can handle within one minute?

We ran into this question a couple of months ago, specifically soon after the April release, when most of our Azure Logic Apps integrations with Microsoft Dynamics 365 Business Central started to fail due to API endpoint changes.

The Logic Apps kept failing every time they tried to push data into MSDYN365BC. Initially, the error was related to the URI.


{
    "error": {
        "code": "BadRequest_NotFound",
        "message": "The request URI is not valid. Since the segment 'customers' refers to a collection, this must be the last segment in the request URI or it must be followed by an function or action that can be bound to it otherwise all intermediate segments must refer to a single resource."
    }
}

After a while, the error message changed to "Too many requests reached".

{
  "error": {
    "code": "Application_TooManyRequests",
    "message": "Too many requests reached.  Actual (101).  Maximum (100)."
  }
}

So there is a limit! What is the limit, and why is there a limit? To get answers to all these questions, we requested support from Microsoft. While waiting for Microsoft's answer, we found it ourselves in Microsoft Docs: Business Central (Preview).

The document relates to the Business Central connector, and it specifically mentions that Microsoft Dynamics 365 Business Central can ONLY handle 100 API calls per minute.




The limit is enforced mainly to avoid denial-of-service (DoS) attacks and to make sure Microsoft Business Central runs smoothly. Too many API requests could bring the product to its knees, causing a probable outage.

I suggest you read the "API Limits" article by Microsoft if you are working on D365 integrations. Even though the article does not directly talk about BC, all the rules it discusses apply to BC as well.
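If you hit the limit from Logic Apps, one mitigation is to let the HTTP action retry with a back-off instead of failing outright. A minimal sketch of a retry policy on an action's inputs in the workflow definition (the count and interval are illustrative, not recommendations):

"retryPolicy": {
    "type": "exponential",
    "count": 5,
    "interval": "PT15S"
}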

Please provide your feedback with a comment. 
Thank you and Regards,
Tharanga Chandrasekara

PowerShell: Run Script Error: Exception setting "CursorPosition"

Last week I was working on a PowerShell script to upload files to an FTP server, and this script was supposed to run through a Windows service. It ran perfectly well when run manually in the PowerShell ISE.

However, when I scheduled it to run through the service, the service logged the error below in the error log.
Run Script Error: Exception setting "CursorPosition": "A command that prompts the user failed because the host program or the command type does not support user interaction. Try a host program that supports user interaction, such as the Windows PowerShell Console or Windows PowerShell ISE, and remove prompt-related commands from command types that do not support user interaction, such as Windows PowerShell workflows."
While creating the script, I had added CLS to clear the screen so I could see the execution steps clearly in the output window. This simple command was causing the error when the script ran through the service. The only thing I had to do was remove the CLS and restart the service.
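If you want to keep the screen-clearing behavior for interactive runs, a sketch of an alternative is to guard it on the host name so it is skipped under a service:

# CLS is an alias for Clear-Host, which needs an interactive host.
# Only clear the screen when running in the console or the ISE.
if ($Host.Name -in @('ConsoleHost', 'Windows PowerShell ISE Host')) {
    Clear-Host
}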

Please provide your feedback with a comment. 
Thank you and Regards,
Tharanga Chandrasekara