Monday, March 25, 2019

Repost: Pointing Build Definitions to Specific VMs (agents)

Since the AXDEVALM blog has been removed from MSDN, I will repost the agent computer name post here AS-IS, until we can get better official documentation.
Original post: October 20, 2017

----

We've recently collaborated with some customers who are upgrading from previous releases of Dynamics 365 to the recent July 2017 application. These customers typically have to support their existing live environment on the older application, but also produce builds on the newer application (with the newer platform).

Currently the build agent is not aware of the application version available on the VM. As a result, Visual Studio Team Services (VSTS) will seemingly randomly pick one or the other VM (agent) to run the build on. Obviously this presents a challenge if VSTS compiles your code on the wrong VM - and thus against the wrong version of the application and platform. We are reviewing what would be the best way to support version selection, but in the meantime there is an easy way to tie a build definition to a specific VM.

First, in LCS go to your build environment and on the environment details page, find the VM Name of the build machine. In this particular example below, the VM Name is "DevStuffBld-1".


Next, go to VSTS and find the build definition you wish to change. Note that if you are building for more than one version, you will want more than one build definition, each pointing to its respective VM. To make sure a build definition points to a specific VM, edit the build definition and open the Options tab. Under Options you will find a section of parameters called Demands. A demand matches either a value explicitly set up on the agent in VSTS (you can do this in the Agent Queue settings) or any of the environment variables the agent picks up from the VM it runs on. You will notice that all build definitions already check for a variable called DynamicsSDK to be present, which ensures the build runs only on agents where we have set this "flag", if you will. Since each VM already has an environment variable called COMPUTERNAME, we can add a demand for COMPUTERNAME to equal the name of our build VM. So for the example of the build VM from above, we can edit our build definition to add the following demand by clicking +Add:
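For the "DevStuffBld-1" example above, the Demands list ends up looking roughly like this (a sketch of the grid contents, showing the existing DynamicsSDK demand plus the new one; the exact layout depends on your VSTS version):

Name            Condition    Value
DynamicsSDK     exists
COMPUTERNAME    equals       DevStuffBld-1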


Save your build definition and from now on your build will always run on the right VM/agent.

Tuesday, February 19, 2019

Repost: Enabling X++ Code Coverage in Visual Studio and Automated Build

Since the AXDEVALM blog has been removed from MSDN, I will repost the code coverage blog post here AS-IS (apart from fixing some incorrect capitalization in the XML code), until we can get better official documentation. Note that after this was published, I received a mixed response from developers. For many it worked; for others it did not work at all, no matter what they tried. I have not been able to spend more time investigating why it doesn't work for some people.
Original post: March 28, 2018

----

To enable code coverage for X++ code in your test automation, a few things have to be set up. Typically, more tweaking is needed since you will likely be using some platform/foundation/appsuite objects and code, and you don't want code coverage to show up for those. Additionally, the X++ compiler generates some extra IL to support certain features, which can be ignored. Unfortunately, there is one feature that may throw off your results; we'll talk about that further down.

One important note: Code Coverage is a feature of Visual Studio Enterprise and is not available in lower SKUs. See this comparison chart under Testing Tools | Code Coverage.

To get started, you can download the sample RunSettings file here: CodeCoverage. You will need to update this file to include your own packages (= "modules" in IL terminology). At the top of the file, you will find the following XML:

<ModulePaths>
    <Include>
        <ModulePath>.*MyPackageName.*</ModulePath>
    </Include>
    <Exclude>
        <ModulePath>.*MyPackageNameTest.*</ModulePath>
    </Exclude>
</ModulePaths>

You will need to replace "MyPackageName" with the name of your package. You can add multiple lines here and use wildcards, of course. You could add Dynamics.AX.* but that would then include any and all packages under test (including Application Suite, for example). This example also shows how to exclude a package explicitly, in this case the test package itself. If you have multiple packages to include and exclude, you would enter them this way:

<ModulePaths>
    <Include>
        <ModulePath>.*MyPackage1.*</ModulePath>
        <ModulePath>.*MyPackage2.*</ModulePath>
    </Include>
    <Exclude>
        <ModulePath>.*MyPackage1Test.*</ModulePath>
        <ModulePath>.*MyPackage2Test.*</ModulePath>
    </Exclude>
</ModulePaths>
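For context, the ModulePaths element is only a fragment: it sits inside the standard Visual Studio code coverage data collector section of the .runsettings file. A minimal sketch of that overall structure is shown below (this assumes the built-in "Code Coverage" collector; check the sample file or the Visual Studio documentation for the exact attributes your version expects):

<?xml version="1.0" encoding="utf-8"?>
<RunSettings>
    <DataCollectionRunSettings>
        <DataCollectors>
            <!-- Built-in Visual Studio code coverage collector -->
            <DataCollector friendlyName="Code Coverage" uri="datacollector://Microsoft/CodeCoverage/2.0">
                <Configuration>
                    <CodeCoverage>
                        <ModulePaths>
                            <Include>
                                <ModulePath>.*MyPackage1.*</ModulePath>
                            </Include>
                            <Exclude>
                                <ModulePath>.*MyPackage1Test.*</ModulePath>
                            </Exclude>
                        </ModulePaths>
                    </CodeCoverage>
                </Configuration>
            </DataCollector>
        </DataCollectors>
    </DataCollectionRunSettings>
</RunSettings>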

To enable code coverage in Visual Studio, open the Test menu and choose Test Settings > Select Test Settings File, then pick your settings file. You can then run code coverage from Test > Analyze Code Coverage and select All Tests or Selected Tests (your selection in the Test Explorer window). You can open the code coverage results and double-click any of the lines, which will open the code and highlight the coverage.

To enable code coverage in the automated build, edit your build definition. Click on the Execute Tests task and find the Run Settings File parameter. If you have a generic run settings file, you can place it in the C:\DynamicsSDK folder on the build VM and point to it here (full path). Alternatively, if you have a settings file specific to certain packages or build definitions, you can be more flexible here. For example, if the run settings file is in source control in the Metadata folder, you can point this argument to "$(Build.SourcesDirectory)\Metadata\MySettings.runsettings".

The biggest issue with this is the extra IL that our compiler generates, namely the pre- and post-handler code. This code is placed inside every method and is therefore evaluated by code coverage, even though your X++ source doesn't contain it; as a result, most methods will never reach 100% coverage. If a method has the [Hookable(false)] attribute (which tells the X++ compiler not to add the extra IL), or if the method actually has pre/post handlers, the coverage will be accurate. Note that the Chain-of-Command logic the compiler generates is nicely filtered out.
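As a small illustration (a hypothetical X++ sketch, not taken from the original post), this is what marking a method as not hookable looks like:

[Hookable(false)]
public real calcLineTotal(real _quantity, real _price)
{
    // Because of the Hookable(false) attribute, the compiler does not emit the
    // pre/post-handler IL for this method, so its coverage reflects only the X++ source.
    return _quantity * _price;
}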

Friday, January 18, 2019

Azure DevOps Release Pipeline

Welcome to 2019, the year of the X++ developer!

Today marks a great day with the release of the first Azure DevOps task for D365 FinOps users. Since documentation is still underway, I wanted to supplement the official blog post with some additional info to help guide you through the setup.
The extension can be installed from here: https://marketplace.visualstudio.com/items?itemName=Dyn365FinOps.dynamics365-finops-tools

The LCS Connection
- If your LCS project is hosted in the EU, you will need to change the "Lifecycle Services API Endpoint". By default it points to https://lcsapi.lcs.dynamics.com, but if you log into LCS and the URL for your project shows "https://eu.lcs.dynamics.com", you will need to change this API URL to also include EU, like so: https://lcsapi.eu.lcs.dynamics.com
- App registration: I encourage you to use the preview setup experience ("App registrations (Preview)"). Add a "new registration" for a native application; I selected "Accounts in this organizational directory only (MYAAD)". For the redirect URI you can put anything for a native application, typically http://localhost, and in the preview experience select "Public client (mobile & desktop)" to indicate this is a native application.

Thanks to Marco Scotoni for pointing out that to find the API to grant permissions to, you can just go to the "APIs my organization uses" tab.

The Task
- Create the new connection using the app registration as described above
- LCS Project Id is the "number" of your project. You can see this in the URL when you go to your project on the LCS website, for example https://lcs.dynamics.com/V2/ProjectDashboard/1234567. I'm hoping this can eventually be made into a dropdown selection.
- File to upload... The build currently produces a ZIP file with a name that contains the actual build number, and that is not configurable there (you'd have to edit the PowerShell for that). Until that changes, there is an easy workaround: since your release pipeline has the build pipeline's output as an artifact, you can grab the build's build number from a variable. Use the BROWSE button to select the build drop artifact, then replace the build number in the resulting path with the $(Build.BuildNumber) variable. For example, on my test project this resulted in the following file path (the example values are recapped after this list): $(System.DefaultWorkingDirectory)/BuildDev/Packages/AXDeployableRuntime_7.0.4641.16233_$(Build.BuildNumber).zip
If your AX build is not your primary artifact, you can use the artifact alias, like $(Build.MyAlias.BuildNumber). You can find this info in the release pipeline variables documentation.
- LCS Asset Name and Description are optional, but I would recommend setting at least the name. For example, I set the following:
LCS Asset Name: $(Release.ReleaseName)
LCS Asset Description: Uploaded from Azure DevOps from build $(Build.BuildNumber)
- If using a hosted agent, make sure to use the latest hosted pool ("Hosted VS2017").
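Putting the example values from this post together, the task configuration on my test project ended up roughly like this (your service connection, project id, and package file name will of course differ):

LCS Connection:        the service connection created from the app registration above
LCS Project Id:        1234567
File to upload:        $(System.DefaultWorkingDirectory)/BuildDev/Packages/AXDeployableRuntime_7.0.4641.16233_$(Build.BuildNumber).zip
LCS Asset Name:        $(Release.ReleaseName)
LCS Asset Description: Uploaded from Azure DevOps from build $(Build.BuildNumber)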

Happy uploading!!