Dynamics 365 Commerce: Azure Data Explorer dashboard to investigate Commerce Scale Unit performance

Overview

Operations Insights for Commerce is now generally available and provides a wealth of telemetry. After enabling and configuring it, every RetailServer API call and SQL query is logged to the customer's Azure Application Insights resource, whether it comes from out-of-box or extension code. The extent of the information is comparable to what is available to Microsoft's product and support teams, including the execution time of these operations in milliseconds. This is a treasure trove of information for customers and implementation partners who want to better understand what is happening inside the Commerce Scale Unit (CSU). Azure provides several tools to "look" into this data, all based on the Kusto query language ((TODO: add link)), and these are well documented.
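For example, a query along the following lines surfaces the slowest RetailServer APIs over the last day. The table and column names (requests, name, duration) follow the standard Application Insights schema and are an assumption on my part; adjust them to wherever your Operations Insights events actually land:

requests
| where timestamp > ago(24h)
| summarize count(), avg(duration), percentiles(duration, 75, 95) by name
| order by percentile_duration_95 desc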

While querying the data yourself is useful in many cases, it lacks a high-level view. While working on a Commerce implementation project recently, I built an Azure Data Explorer dashboard to provide this high-level view. After receiving great feedback from my team, and since it took a few iterations to get the Kusto queries right, I was able to "re-create" the dashboard for another Commerce implementation in under 15 minutes. I am providing a step-by-step guide on how to do that so others can use it in their implementations as well.

See a screen shot of the dashboard below:
(TODO: add link)

If you have any feedback, corrections or additional implementation ideas please let me know.

Note: Only CSU telemetry is currently supported, but POS, HardwareStation and even the Dynamics 365 Finance and SCM traces are planned for the future.

Creation

First, enable Operations Insights in F&SCM and verify that the telemetry data is flowing (TODO: add link).

Add the cluster

Go to https://dataexplorer.azure.com/clusters/ and select Add Cluster. Then configure the dialog as below:

The connection URI should have the following format:

https://ade.applicationinsights.io/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/microsoft.insights/components/<application-insights-name>

Note that the above URI needs to be edited with the correct subscription ID, resource group name, and Application Insights resource name.
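For illustration only, a filled-in URI could look like this (all values here are made-up placeholders):

https://ade.applicationinsights.io/subscriptions/00000000-0000-0000-0000-000000000000/resourcegroups/contoso-commerce-rg/providers/microsoft.insights/components/contoso-csu-appinsights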

Add the dashboard and data source

  • Create a new Azure Data Explorer dashboard at https://dataexplorer.azure.com/dashboards
  • On the new dashboard, click the down-caret at the top right, pick Data sources, and click Create new data source
  • Enter the same URI that you used for the cluster above, and give it a descriptive name. Note: Make sure the URI is correct; copying and pasting into this dialog is a little fiddly
  • Click Connect and select the data source. If the data source is not found, the URI is incorrect.
  • Save the dashboard before continuing.

Create the [@] Parameters

  • Click the [@] Parameters link
  • A “Time range” parameter will already be pre-configured, and we will keep it as is
  • Create a new parameter “ScaleUnit”. See details below
  • Create a new parameter “HttpStatus”. See details below
  • Create a new parameter “RS Api” as below
  • Create a new “Sql” parameter as below
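For reference, the parameter source queries can be along the following lines. The requests and dependencies tables follow the standard Application Insights schema, and the customDimensions field name is an assumption; check both against the telemetry that actually arrives in your resource:

// ScaleUnit parameter source ("ScaleUnitName" is an assumed dimension name)
requests
| extend ScaleUnit = tostring(customDimensions["ScaleUnitName"])
| distinct ScaleUnit

// HttpStatus parameter source
requests
| distinct resultCode

// RS Api parameter source
requests
| distinct name

// Sql parameter source
dependencies
| where type == "SQL"
| distinct name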

At this point, we are done with the parameters. The dashboard should show these new query parameters:

Create the first tile
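As an illustration, a first tile that charts RetailServer API response times could be backed by a query like the one below. It assumes the standard Application Insights requests schema; _startTime and _endTime are the variables exposed by the default Time range parameter, and the variable names of the custom parameters depend on how you defined them above:

requests
| where timestamp between (_startTime .. _endTime)
| summarize avg(duration), percentiles(duration, 75, 95) by bin(timestamp, 15m)
| render timechart

Add a new tile on the dashboard, paste the query, pick the data source created earlier, and choose a time chart visual.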

Use Azure Data Explorer to query LCS Raw information logs

Note: This is now also part of the official documentation on docs.microsoft.com: Use Azure Data Explorer to query raw information logs – Finance & Operations | Dynamics 365 | Microsoft Docs. I will continue to maintain the samples at the end of this blog.

There are occasions when a customer, partner, consultant, or support engineer needs to look at the low-level Dynamics 365 Finance & Operations telemetry data. These use cases include troubleshooting errors, performance-related investigations, or just gaining some additional understanding of how the platform works. Telemetry data can be accessed by authorized users via the Environment monitoring part of the LCS portal, can be filtered in a few different ways, and is displayed inside the LCS portal's raw logs section. A data grid can be used to inspect the log entries. LCS does not allow for more sophisticated pivoting, so users turn to Excel for that; to support this, the telemetry data can also be downloaded in CSV format.

However, Excel is not the optimal tool for advanced querying of this data. The perfect tool for this purpose is Azure Data Explorer. It provides Kusto, an innovative query language that is optimized for high-performance data analytics. Answering questions like "how often has a certain process taken place, how long has it taken in 90% of cases, how often per hour has a certain action taken place over the course of a day" becomes a lot easier and can be backed up with powerful graphics as well.

Here are examples of how the graphics could look:

A lesser-known feature of Azure Data Explorer is that it supports ingestion of CSV files. We can use it to upload and stage our CSV data files so they can be queried with the Kusto language. If you have not set up an Azure Data Explorer cluster yet, follow these steps.

Steps to upload to Azure Data Explorer

  • Run your query on the LCS raw logs page
  • Important: adjust the time interval or filter to get to the right data (the row limit for the export in the next step is 5,000)
  • Export the grid to Excel
  • Open the file in Excel and save it without making any changes (this seems to fix a formatting issue)
  • In your Azure Data Explorer, right-click the cluster in the tree view, select "Ingest new data", and then on the next page "Ingest data from a local file"
  • Pick your cluster, name a new table for the data to be imported into, select up to 10 CSV files to import, and select the CSV format. Hit Next a few times until your data is imported.
  • Use the Query tile to write a Kusto query against your data.

To learn more about the Kusto query language, go here.

Sample queries

Modern POS

All errors

Sometimes it's nice to get an idea of what all the errors are. The text fields that describe errors are not used consistently, so it's better to use the EventIds and map them to the correct errors. They can be looked up in these two Commerce SDK files (but using the code snippet below may give you all of them already):

  • Pos.RetailLogger.js
  • Microsoft.Dynamics.Retail.Diagnostics.Sinks.man
// Note: Filter EventSeverity == Error before exporting/ingesting 
MPosErrors24h
| extend EventName = case(
EventId == 53231, "runtimeInterceptorFailed",
EventId == 48434, "posPaymentTerminalAuthorizeRefundActivityFailed",
EventId == 48347, "posAuthorizeOrRefundPaymentFailed",
EventId == 48104, "posTenderPaymentOperationFailed",
EventId == 46802, "extensibilityFrameworkExecuteRuntimeRequestFailed",
EventId == 44322, "viewModelCartProcessTextFailed",
EventId == 42112, "viewModelCartProcessTextFailed",
EventId == 42106, "RetailProxyExtensionAdapterManagerMethodNotFound",
EventId == 40450, "posInitiatedHardwareStationRequestFailed",
EventId == 40409, "peripheralsBarcodeScannerEnableFailed",
EventId == 40325, "operationSetQuantityOptionsValidationFailed",
EventId == 40255, "RetailOperationFailed",
EventId == 40209, "coreRetailOperationOnExecutingFailed",
EventId == 40191, "modelManagersChainedRequestExecutionFailed",
EventId == 40170, "modelManagersErrorParserHardwareStationError",
EventId == 40150, "modelManagersChainedRequestFactoryExecuteOfflineLogonRequestUnAvailable",
EventId == 40104, "ModelManagersCommerceRuntimeRequestError",
EventId == 40101, "ModelManagersRetailServerRequestError",
EventId == 7505 , "PaymentConnectorLogErrors",
EventId == 7503 , "PaymentConnectorLogException",
EventId == 7014 , "HardwareStationActionFailure",
EventId == 1001 , "CrtExecuteRequestErrorFailure",
strcat("*************** Unknown EventId: ", EventId))
| where EventSeverity == "Error"
| summarize count() by EventId, EventName
| order by count_ desc

Now, it's easier to decide where to look next, i.e. what more detailed queries to issue for specific errors…

Errors returned by RetailServer

// Note: Filter EventSeverity == Error before exporting/ingesting 
MPosErrors24h
| extend EventName = case(
EventId == 53231, "runtimeInterceptorFailed",
EventId == 48434, "posPaymentTerminalAuthorizeRefundActivityFailed",
EventId == 48347, "posAuthorizeOrRefundPaymentFailed",
EventId == 48104, "posTenderPaymentOperationFailed",
EventId == 46802, "extensibilityFrameworkExecuteRuntimeRequestFailed",
EventId == 44322, "viewModelCartProcessTextFailed",
EventId == 42112, "viewModelCartProcessTextFailed",
EventId == 42106, "RetailProxyExtensionAdapterManagerMethodNotFound",
EventId == 40450, "posInitiatedHardwareStationRequestFailed",
EventId == 40409, "peripheralsBarcodeScannerEnableFailed",
EventId == 40325, "operationSetQuantityOptionsValidationFailed",
EventId == 40255, "RetailOperationFailed",
EventId == 40209, "coreRetailOperationOnExecutingFailed",
EventId == 40191, "modelManagersChainedRequestExecutionFailed",
EventId == 40170, "modelManagersErrorParserHardwareStationError",
EventId == 40150, "modelManagersChainedRequestFactoryExecuteOfflineLogonRequestUnAvailable",
EventId == 40104, "ModelManagersCommerceRuntimeRequestError",
EventId == 40101, "ModelManagersRetailServerRequestError",
EventId == 7505 , "PaymentConnectorLogErrors",
EventId == 7503 , "PaymentConnectorLogException",
EventId == 7014 , "HardwareStationActionFailure",
EventId == 1001 , "CrtExecuteRequestErrorFailure",
strcat("*************** Unknown EventId: ", EventId))
| where EventSeverity == "Error"
| where EventId == 40101
| summarize count() by requestAction, error
| order by count_ desc

CSU

All errors

Similar to the MPOS errors, we can get them from the Commerce SDK (Microsoft.Dynamics.Retail.Diagnostics.Sinks.man):

// Note: Filter EventSeverity == Error before exporting/ingesting 
CSUErrors24h
| extend EventName = case(
EventId == 1017, "CrtSlowHandlerExecution",
EventId == 1005, "CrtExecuteRequestWarningFailure",
EventId == 5055, "RetailServerRequestWarningFailure",
EventId == 60208, "SyncLibraryMergeDataIntoTableWarning",
EventId == 60125, "ProcessDeleteRequestStart",
EventId == 2617, "CrtWorkflowUserAuthenticationRequestHandlerFailure",
EventId == 2486, "CrtServicesEmployeePasswordDoesNotMatch",
EventId == 5106, "RetailServerSecretRetrievalWarning",
EventId == 2503, "CurrencyServiceGetSupportedChannelCurrenciesNotFound",
EventId == 6943, "RtsClientLibraryApiExecutionWarning",
EventId == 3007, "CrtTransactionServiceClientRtsExecutionWarning",
EventId == 2413, "CrtServicesSalesOrderTransactionServiceMarkReturnedItemsFailure",
EventId == 2483, "CrtServicesLocalLogonFailedDueToIncorrectStaffId",
strcat("*************** Unknown EventId: ", EventId))
| summarize count() by EventId, EventName
| order by count_ desc

A next step would be to go through each of these error types and look at them more closely. A few of them could indicate quality issues that could affect the user experience.

In many cases, these errors can be fixed by cleaning up the extension code, adding proper SQL indexes, or investigating better approaches to the problem.

In some cases, these errors could indicate problems with the out-of-box experience (OOBE) or deployment. Please open a support request to get these fixed by Microsoft.

Slow CRT handlers by request type

// Note: Filter EventSeverity == Error before exporting/ingesting 
CSUErrors24h
| extend EventName = case(
EventId == 1017, "CrtSlowHandlerExecution",
EventId == 1005, "CrtExecuteRequestWarningFailure",
EventId == 5055, "RetailServerRequestWarningFailure",
EventId == 60208, "SyncLibraryMergeDataIntoTableWarning",
EventId == 60125, "ProcessDeleteRequestStart",
EventId == 2617, "CrtWorkflowUserAuthenticationRequestHandlerFailure",
EventId == 2486, "CrtServicesEmployeePasswordDoesNotMatch",
EventId == 5106, "RetailServerSecretRetrievalWarning",
EventId == 2503, "CurrencyServiceGetSupportedChannelCurrenciesNotFound",
EventId == 6943, "RtsClientLibraryApiExecutionWarning",
EventId == 3007, "CrtTransactionServiceClientRtsExecutionWarning",
EventId == 2413, "CrtServicesSalesOrderTransactionServiceMarkReturnedItemsFailure",
EventId == 2483, "CrtServicesLocalLogonFailedDueToIncorrectStaffId",
strcat("*************** Unknown EventId: ", EventId))
| where EventId == 1017
| summarize count(), sum(inclusiveExecutionTime), percentiles(inclusiveExecutionTime, 75, 90) by handlerAssemblyName, requestType
| order by sum_inclusiveExecutionTime desc
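If you care more about which handlers are slow per call rather than by total time spent, the same data can be ordered by the 90th percentile instead:

CSUErrors24h
| where EventId == 1017
| summarize count(), percentiles(inclusiveExecutionTime, 75, 90) by handlerAssemblyName, requestType
| order by percentile_inclusiveExecutionTime_90 desc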

F&O

All errors

// Note: use selection "All error events" to only get the errors
FNOErrors24h
| summarize count() by formName, targetName, errorLabel
| order by count_ desc

formName, targetName, and errorLabel are all good candidates to look up in Azure DevOps to find the code location. That gives more insight into the areas where these errors occur.

Example: @SYS18885 is thrown from the ReqPlanData class. Now I can focus on figuring out why that is the case (is master planning not configured correctly?).
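To drill into a single label like this one, filter on it and pivot by form and target, using the same columns as above:

// use selection "All error events" to only get the errors
FNOErrors24h
| where errorLabel == "@SYS18885"
| summarize count() by formName, targetName
| order by count_ desc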

Slow SELECT queries

// use selection "slow queries" in LCS diagnostics
SlowQueries
| extend MainTableName = extract("FROM\\s([A-Z0-9]+)\\s", 1, statement)
| where statement contains "SELECT"

Slow inserts

// use selection "slow queries" in LCS diagnostics
SlowQueries
| extend MainTableName = extract("INSERT INTO ([A-Z0-9]+)\\s", 1, statement)
| where statement contains "INSERT"

Slow deletes

// use selection "slow queries" in LCS diagnostics
SlowQueries
| extend MainTableName = extract("DELETE FROM ([A-Z0-9]+)\\s", 1, statement)
| where statement contains "DELETE FROM"
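Beyond listing individual statements, it can also help to aggregate by the extracted main table name to see which tables dominate; a small variation of the SELECT query above:

// use selection "slow queries" in LCS diagnostics
SlowQueries
| extend MainTableName = extract("FROM\\s([A-Z0-9]+)\\s", 1, statement)
| where statement contains "SELECT"
| summarize count() by MainTableName
| order by count_ desc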

Batch jobs performance

// use selection "All logs" and add query "where TaskName Equals BatchFinishedOneTask" in LCS diagnostics
BatchFinishedOneTask
| summarize count(), sum(durationInMilliSeconds), percentiles(durationInMilliSeconds, 75, 90) by className
| order by sum_durationInMilliSeconds desc
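If the ingested data also contains a timestamp column (the column name may differ in your export), the same table can be charted over time to spot hours in which batch classes degrade:

BatchFinishedOneTask
| summarize avg(durationInMilliSeconds) by bin(timestamp, 1h), className
| render timechart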

Failing Dynamics 365 F&O deployments? Last resort workarounds

This post originally appeared on https://dynamicsnotes.com/failing-dynamics-365-fo-deployments-last-resort-workarounds/.

There are times when a deployment fails and even a retry does not help. In these cases, a support ticket should be opened with the Microsoft engineers.

However, there are cases when this is not feasible or helpful. For example:

    • it's a Tier 1 development environment and you caused the issue, or
    • you cannot wait and need to get it done very fast, or
    • you moved the database but did not run the Retail re-provisioning tool and the Retail deployment now fails (and you do not care because it's not a Retail project).

In these and other cases, it may be OK to just step over the failing step and let the deployment finish (in non-production environments).

The following steps can be used to work around the issue. Again, this is almost "hack" territory, but sometimes it is needed…

  1. Find the step number that failed. LCS should tell you. Say, for a moment, it's step 43.
  2. Wait till the deployment is in the "Failed" state.
  3. Log in to the VM where this error occurred. This can also be found on the LCS portal.
  4. Find the current runbook XML file. It's under C:\RunbookOutput and is going to be the latest changed file.
  5. Open it in your favorite XML editor (e.g. Notepad++) and find the step with that number (search for ">43<").
  6. Mark the step "Completed" (changing it from "Failed").
  7. Save the file and resume the deployment from LCS.

Note, sometimes I have found that marking this step is not enough. If that is the case, you can also edit the PowerShell file that the step calls into and essentially comment out all the code. The PowerShell file will be in the service directory under AOSService\DeployablePackages.

Again, this is a hack, but sometimes desperate times call for desperate measures.

Implementing Retail Store Credit Card discounts


This post originally appeared on Implementing Retail Store Credit Card discounts.

An often-requested Dynamics Retail channel feature is support for store credit cards. The retailer wants to encourage the use of a store credit card, so if the customer uses it, they should get a percentage discount.

Ideally, we should be able to configure the discount percentage, the products it should or should not apply to, and certain other ordering rules for the discount (apply on top of other discounts or replace other discounts if better, etc.). The retailer should be able to configure these in Headquarters (AX) without changes in the channel.

Unlike other discounts, this discount applies only during or right before payment; it cannot be applied during simple add-to-cart operations.

Modelling the discount as an affiliation

The pricing or discount engine is a complex piece of code. Even though it supports extensibility features, I would rather not go there. Is there an easier way to do that? Yes, there is. Read on…

We do not want to apply the associated discounts automatically but rather only when a special store credit card is used. We can use affiliations and affiliation price groups to achieve that. The only thing we need to "code" is the application of the affiliation to the cart at the right moment (right when the customer checks out with a store credit card).

A nice benefit of not needing pricing extensions is that we can use the Price simulator in HQ to verify that our discount rules are correct.

In order to try this out, we need to create a new discount for the products we want, with the right percentage, associate it with a price group, and add it to an affiliation.

As soon as we save, we can try it out in the Price simulator. Below, you can see that the store credit card discount is applied to all products, but other discounts are honored (because the StoreCreditCardDiscountAffiliation was set). As soon as the affiliation is removed from the General section, the discounts would disappear.

Applying the store credit card discount in POS

As mentioned above, we only need to apply the correct affiliation to the cart before we check out, and we are done. That functionality already exists in POS today. So, to simply try this out, I did not need to make any code changes (POS transaction screen -> Discounts -> Add affiliation):

Ideally, this affiliation should only be added right before the store credit card is being used. Therefore, I opted to add a new button next to the "Pay Card" button that does all the work, so the cashier does not need to go through the "Add Affiliation" operation. See the steps below for details.

It would be even better to "detect" that the credit card is a store credit card and only then apply the affiliation. However, it turned out that this may not be easily doable today, as the payment view in POS does not support it. Hence, I opted for the slightly more manual option where the cashier has to choose the button.

Here are the steps to add the button to POS:

    • In HQ, add a new operation (e.g. StoreCreditCardDiscountAffiliation, 5000)
    • In the layout designer, edit the button grid, add the button, and associate it with the new operation

    • Run these jobs: 1050 for affiliation price groups, 1020 for discounts, 1090 for the button grid changes
    • In the POS.Extensions project, create a new project StoreCardDiscountExtension with a new operation mapped to the same Id created earlier, and in there just add the affiliation to the cart:

// Note: the generic type arguments below were lost in the original formatting and are restored here as assumed;
// adjust them to the actual request/response types in your Pos.Api version.
public executeAsync(request: StoreCardDiscountOperationRequest<StoreCardDiscountOperationResponse>): Promise<ClientEntities.ICancelableDataResult<StoreCardDiscountOperationResponse>> {

    let addAffiliationRequest: CartOperations.AddAffiliationOperationRequest<CartOperations.AddAffiliationOperationResponse> =
        new CartOperations.AddAffiliationOperationRequest(["StoreCreditCardDiscountAffiliation"], [], request.correlationId);

    return this.context.runtime.executeAsync(addAffiliationRequest)
        .then((
            result: ClientEntities.ICancelableDataResult<CartOperations.AddAffiliationOperationResponse>
        ): ClientEntities.ICancelableDataResult<StoreCardDiscountOperationResponse> => {

            return <ClientEntities.ICancelableDataResult<StoreCardDiscountOperationResponse>>{
                canceled: result.canceled,
                data: new StoreCardDiscountOperationResponse()
            };
        });
}

The POS extension code is attached below in zip format.
StoreCardDiscountPosExtensions

Applying the store credit card discount in online channel

The same discount logic as above is executed automatically, as we use built-in concepts around discounts (as long as RetailServer is being called). However, just as in POS, we need to add the affiliation to the cart at the right moment. In eCommerce, we can accomplish this by updating the “AffiliationLines” property of the Cart object.

Since in eCommerce the payment process is a wizard with multiple steps, we do not need to add a new UI action to apply the discount affiliation. We can handle this better by detecting the use of a store credit card by number and then applying the affiliation automatically (from the UI or the 3rd-party e-commerce wrapper). The customer should see the updated cart while moving to the next payment step.

Retail Channel performance investigations

This post originally appeared on https://dynamicsnotes.com/retail-channel-performance-investigations/.

There may be a time during a Retail project when you will likely hear someone say: "Doing X in POS is slow. What's going on?". If this is the case, my hope is that it happens during the implementation phase and not in a live production environment.

What do we do to find out what is wrong? Is it a customization to the CRT business logic, latency to an external call, generic latency to the server from POS, or a Microsoft bug?

The following steps will hopefully help to give an answer to these questions.

Latency between POS and RetailServer

RetailServer API timings as seen from POS

Simulating network “issues”

“Looking inside” RetailServer/CRT/SQL with Commerce Runtime Analyzer

Profiling Channel database SQL queries

Some things to remember:

RetailServer can be either Microsoft-hosted or RSSU-hosted. The investigation is agnostic to this, but you need to make sure you are testing against the right RetailServer (especially if you use both Microsoft-hosted and RSSU-hosted instances at the same time).

A Microsoft-hosted RetailServer in production cannot be accessed, so some of the steps below cannot be carried out on a production environment. It is advised to carry these out on a lower-tier environment.

An RSSU-hosted RetailServer, even in production, is under the management of the customer, so the steps below can be carried out there, but with care.

Sometimes the slowness occurs only when multiple POS clients are used at the same time. If that is the case, you can still use the steps below, but you would have to reproduce the issue by running multiple POS clients.

Simulating network “issues”

This post originally appeared on https://dynamicsnotes.com/simulating-network-issues/.

In some cases it is very useful to see how a website would perform under network conditions that are worse than what you currently have. You can use this tool to simulate them: https://jagt.github.io/clumsy/. As an example, if you want to simulate what a network latency of 200 ms looks like while using POS, run the tool on the same computer as POS and configure it with 200 ms outbound.

Even if you are on a good network, with this tool you can evaluate what it would look like for someone with higher latency (or other network conditions).

RetailServer API timings as seen from POS

This post originally appeared on https://dynamicsnotes.com/retailserver-api-timings-as-seen-from-pos/.

There may be some APIs that "appear" slower than others. To find out, just hit F12 in the browser or in ModernPOS to launch the developer tools (ModernPOS may require developer mode to be enabled in Windows).

Running the CloudPOS logon sequence on the same computer from which I measured the latency above, I get these timings:

As you can see, the quickest API calls are all around 60 ms, which makes sense as our measurements above showed an ever-present latency of 41 ms.

Depending on which scenario is slow, focus on it and see which exact RetailServer call(s) are the culprits. Then move on to the next step to find out why.

By the way, I am not too worried about the time a login takes, as it is usually not a very frequent operation. This may be different for some customers that configure a logoff after each transaction or after a certain time. In general, I would focus on the top 5 customer scenarios.

Latency between POS and RetailServer

This post originally appeared on https://dynamicsnotes.com/latency-between-pos-and-retailserver/.

There is always going to be a physical separation between the server (RetailServer) and the client (ModernPOS or a CloudPOS browser instance). If this physical distance is large, every RetailServer call will incur a larger additional latency. There is no code change that can fix this; a redeployment of cloud or client components is needed to lower the value.

Here is how to measure the latency.

  1. Find out what Azure data center the RetailServer is hosted in. Use the LCS environment page for the specific environment or ask your IT administrator. For example, my RetailServer is hosted in the Azure primary region "West Central US".
  2. On the same computer as your store terminal, open an internet browser window and browse to http://www.azurespeed.com/. Check the correct Azure region, or a few in the same area. The web page will tell you the latency from your computer to the data center the RetailServer is hosted in. It is very important to use the same computer as the store terminal, as sometimes customers use a dedicated network for the store terminals and a computer in the back office may be on a different network.

In my example, you can see I get a latency of about 41 ms (for every RetailServer call). That is a very good value (for this particular single store, it may have been better to use "West US 2" to get an even better response). Of course, the choice should be balanced between all the different stores that use this RetailServer instance.

If you want to further investigate where a latency issue occurs, you may use other networking utilities, for example tracert (http://www.mediacollege.com/internet/troubleshooter/traceroute.html). It will list all the routers it passes through until it reaches its destination, or fails to and is discarded. In addition, it will tell you how long each 'hop' from router to router takes.

Note that Azure does not seem to allow ICMP packets through, but tracert and ping can still be used to get a sense of what the issues are outside of the data center.

Beefing up ModernPOS security with a real app package signing certificate and AAD authentication

Overview

A real certificate should be used to code sign the ModernPOS packages. That will allow for more security and simpler deployments (as no self-signed certs will need to be installed before installing MPOS).

Since the real certificate is usually not accessible by all developers in the team (and should not be), non-official builds will keep using a simple self-signed cert, i.e. the Contoso certificate shipped with the Retail Sdk.

Additionally, recent releases of MPOS, RetailServer, and Dynamics AX use AAD app IDs for proper authorization. A one-time setup in AAD and AX is needed. With the proper setup, we configure AX to only allow access from ModernPOS clients that claim the correct app ID combination.

Note that there is a direct relationship between the signing certificate, the reply address of the final built ModernPOS, and the AAD app ID.

With these recent changes, non-production environments can be configured to allow both real and self-signed ModernPOS clients to access them. However, the actual production environment should only allow access by MPOS clients with real certs; self-signed certs should not be allowed.

An automatic benefit of this is that a developer cannot accidentally connect to production during the development phase.

Note that all the steps below have to be carried out once per tenant, not per environment.

Detailed steps

Ideally, the following steps should be carried out after a dev environment is fully working with a self-signed cert. That ensures that AX is correctly set up (workers, stores, registers, etc.).

1. Build server setup

The first step is to install a real certificate on the build server so that the Retail Sdk build can use it.

Note that the steps below require administrative access to the build server. Therefore, a cloud-hosted build server is required (a Microsoft-hosted build server cannot be used, as it does not provide administrative privileges). This is unfortunate, and Microsoft is working on a solution.

  • 2. Initial Retail Sdk setup

    We want to use the certificate in Release builds (official) but keep using the Contoso signing cert for dev purposes in Debug builds. Other strategies can be used, but using the build flavor has worked well for some. The RetailSdk allows us to provide either a path to a certificate or a certificate thumbprint. We will use the first approach for the self-signed test certificate and the second approach for the real certificate. We can use simple MSBuild logic to switch between the two. Below is an example of how to do that. This change would have to be made to the RetailSdk's customization.settings or, better, to a global.props file sitting next to it with all customer values. Use the thumbprint from the step above.

    <!-- Release builds pick up the secure cert from the build server with named thumbprint, Debug builds use the test cert part of the Retail Sdk -->
    <ModernPOSPackageCertificateKeyFile Condition="'$(ModernPOSPackageCertificateKeyFile)' == '' and '$(Configuration)' != 'Release'">$(MSBuildThisFileDirectory)\ModernPOSAppxSigningCert-Contoso.pfx</ModernPOSPackageCertificateKeyFile>
    <ModernPOSPackageCertificateThumbprint Condition="'$(ModernPOSPackageCertificateThumbprint)' == '' and '$(Configuration)' == 'Release'">22aa3bdca99b70d4ca44d0c51d23a07e06fcfc61</ModernPOSPackageCertificateThumbprint>

  • 3. AAD setup

    • Carry out an official build in order to get a properly signed ModernPOS appx/installer
    • Install the fully signed ModernPOS on a client machine and point it to a RetailServer/AX environment of your choice (a dev environment, for example)
    • Follow the steps in this blog to create 2 app IDs: https://community.dynamics.com/ax/b/axforretail/archive/2017/11/05/mpos-aad-changes-in-monthly-update-4
    • Activate ModernPOS

    4. Final Retail Sdk setup

    The final changes to the dllhost.exe.config that you had to make in step 3 should be added to the RetailSdk\Assets\dllhost.exe.config file. That way, any new and officially built ModernPOS will automatically have the right app ID information configured.
    Note: For development purposes, the older app ID information must be used. This can easily be done by leaving the original settings in the config file, but commented out. Switching between the production and dev ModernPOS is then as easy as un-commenting the developer values before activating MPOS.

Inject business or validation logic during AddToCart (AddCartLines)





Note: Implemented with Dynamics 365 version 7.2.11792.62628, or App Update 4 (should work with many other versions). Sample code can be downloaded at the end of this blog.

Imagine there is some additional business logic that should be executed during "AddToCart" in order to cancel the operation and show a dialog in POS (both ModernPOS and CloudPOS). It could be that an external system has some additional information about item availability, or it could be that the customer on the transaction failed a credit check. Whatever the actual business logic may be, our code extension shall meet these goals:

  • If a certain condition is true, do not persist the new cart item,
  • show a well-formed, localized error message in POS about the issue, and
  • keep the existing POS view open, with unchanged data, after the dialog is closed.

This can be accomplished with a very small extension in CRT and without any changes in POS. The CRT request in charge of saving the cart data is “SaveCartVersionedDataRequest”. All we need to do is to augment the CRT request with a pre-trigger that will give us the opportunity to “cancel” before saving the cart. Steps:

    1. Create a simple new CRT trigger and implement the interface methods
    class MyAddToCartLineValidationTrigger : IRequestTrigger
    {
        public IEnumerable<Type> SupportedRequestTypes
        {
            get
            {
                return new[] {
                    typeof(SaveCartVersionedDataRequest),
                };
            }
        }

        public void OnExecuted(Request request, Response response)
        {
        }

        public void OnExecuting(Request request)
        {
        }
    }
    2. In OnExecuting, add business logic to validate any data on the cart or transaction
    3. If you need to invalidate/cancel the "AddToCart" because the business rules call for it, throw a CommerceException with a localized message
                if (invalidCart)
                {
                    throw new CommerceException("Microsoft_Dynamics_Commerce_30104", "Custom error")
                    {
                        LocalizedMessage = string.Format("The item with Id {0} is not allowed to be added at this time.", disallowedItemId),
                        LocalizedMessageParameters = new object[] { disallowedItemId }
                    };
                }
    4. Add your changes to the Retail Sdk, edit the commerceruntime.exe.config, global.props, or Customization.settings files

Extensions.MyAddToCartLineValidationTriggerForBlog

Overriding CommerceRuntime tax calculation with a 3rd party implementation





Note: Implemented with Dynamics 365 version 7.2.* (but likely working fine with lower or higher versions). Sample code can be downloaded at the end of this blog.

This information explains how to override tax calculation in the CRT in Dynamics 365 for Operations (and similar products). It is a simple example of how to integrate calls to external tax providers and how to embed the returned values back into the CRT. The code is of non-production quality; its main purpose is to show the concepts only.

High-level overview

In this example, I demonstrate how to short-circuit any CRT tax calculations (by implementing my own IRequestHandler) and forward the individual line items on the cart in one large payload to a call to an external system. The external system will calculate a 20% tax rate on each item (a simplification). Our new implementation of the tax service will then take the return values and save the information on the transaction as it is expected by the CRT and CDX frameworks.

The identifier of the product in the external system is modelled to be different from the ItemId or ProductId in Dynamics. Here, I showcase the use of the product attribute framework to add an attribute called "ExternalTaxId". As part of the newly implemented code, we must fetch the value of this attribute and forward it to the external provider. As an optimization, the fetched attribute value is cached. This should help keep the number of calls to a minimum (per IIS process). There are other approaches that could be taken, but this seemed to be the simplest.

Finally, imagine that we have “additional” data coming from the external tax provider to be saved per sales tax line and to be brought back to Dynamics HQ. This shows the use of a table extension in X++, an extension table in the channel database and the inclusion of this data in the transaction upload job (P-job).
When all work is finished, we will see the 20% tax rate in the POS clients:

After running the P-0001 job, we can also see the data in AX:

Details

There are 3 CRT request handlers that can be overridden for tax-related purposes.

  • CalculateTaxServiceRequest: Re-calculates taxes for individual items and the whole transaction
  • AssignTaxCodesServiceRequest: Fetches and assigns sales tax codes. Since we do not calculate the taxes locally, this is not really needed. However, there is no negative impact in leaving that code untouched.
  • PopulateTaxRatesRequest: Since we will override CalculateTaxServiceRequest, there is no need for this handler. We will just provide our own empty implementation.

In the overridden CalculateTaxServiceRequest, we carry out these steps:

  • Use the passed-in SalesTransaction, loop through all active sales line items, and get the external product ID (from the product attributes). For the steps on how to set up product attributes, see this blog post
  • Cache the external product ID for later calls (see the blog post for a more detailed discussion)
  • Pass the sales line information and external product ID to the external tax service
  • The dummy tax service returns a single tax code per sales line item with 20% tax and some other string value that must be stored on the tax line
  • The CRT tax service builds the right data structure with tax lines and calculates the final total taxes
  • The CRT tax service adds the additional string value as an extension property on each tax line
  • The CRT tax service returns the SalesTransaction

In addition, we need to hook into the transaction save process. Therefore, we implement the SaveSalesTransactionDataRequest post trigger. More details in this blog post.

The full source code can be found in Extensions.MyTaxServiceBlog.

Extending a transactional Retail Channel table with additional data (HQ X++ table extension, CDX, extension table, CRT data service)





Note: Implemented with Dynamics 365 version 7.2.11792.62628, or App Update 4 (may not work with lower versions). Sample code can be downloaded at the end of this blog.

A frequent customization need is to store more transactional data in the channel database in order to synchronize it back to HQ. There is a set of tables that hold this data, i.e. the RetailTransactionTable, RetailTransactionSalesTrans, and RetailTransactionTaxTrans tables, and a few others.

In this example, I am going to show how to extend the RetailTransactionTaxTrans table, how to write to it from a CRT data service, how to create a view to sync it back to AX, and how to configure CDX. The single string that the tax line is extended with is passed in by means of a CRT extension property. This sample is part of a larger sample to override the CRT's tax calculation.

Let's work on the pieces from the bottom up.

HQ table extension (X++)

Build an X++ table extension for the table RetailTransactionTaxTrans using the AOT and Visual Studio. Before you do that, create a new Dynamics 365 package and model. In the screenshot below you can see I created a package called "Extensibility" and a model called "Extensibility", and added a new string column called ADDLSTRING1.

Make sure to build and sync the database, and when all looks good, add the new package to VSTS.

Channel database extension (SQL)

Since we want to get the data from the channel to the HQ, we must store it initially in the channel (or even in the ModernPOSOffline db). In order to do that, we must develop some SQL for the channel database.

The new way of doing this is to create any new SQL in the "[ext]" database schema and NOT in any preexisting schemas, i.e. [ax], [dbo], [crt]. At this point you can reference Microsoft's schemas, but be careful not to take too many dependencies. SQL objects cannot be considered a real interface, so the more assumptions your SQL makes about Microsoft's SQL, the higher the probability you may need to fix something someday when taking an update. In the SQL shown here, you will see that all the new SQL objects live in the [ext] schema.

New table

We need to store the new column in a new table. Since in this sample we want to store one more string value, we must at least have this new column in the table, plus all the other columns that allow us to reference the corresponding record in the original table (base table). The correct approach is to add all PK columns from the original table to the new table as well. In addition, it is good practice (but not required) to add columns for creation and modification time.
Also, make sure that the PK on the new table is the same as on the original one. This is needed so CDX can find the data correctly.
Lastly, we need to grant CRUD privileges on the table to the DataSyncUsersRole (CDX) and insert privileges to the UsersRole (CRT process).

    CREATE TABLE [ext].[RETAILTRANSACTIONTAXTRANS](
		[CHANNEL] [bigint] NOT NULL,
		[SALELINENUM] [numeric](32, 16) NOT NULL,
		[STOREID] [nvarchar](10) NOT NULL,
		[TAXCODE] [nvarchar](10) NOT NULL,
		[TERMINALID] [nvarchar](10) NOT NULL,
		[TRANSACTIONID] [nvarchar](44) NOT NULL,
		[DATAAREAID] [nvarchar](4) NOT NULL,
		[CREATEDDATETIME] [datetime] NOT NULL,
		[MODIFIEDDATETIME] [datetime] NOT NULL,
		[ROWVERSION] [timestamp] NOT NULL,
		[ADDLSTRING1] [nvarchar](200) NOT NULL DEFAULT (''),
		 CONSTRAINT [I_EXT_RETAILTRANSACTIONTAXTRANS_PK] PRIMARY KEY NONCLUSTERED 
		(
			[CHANNEL] ASC,
			[TERMINALID] ASC,
			[STOREID] ASC,
			[TRANSACTIONID] ASC,
			[SALELINENUM] ASC,
			[TAXCODE] ASC,
			[DATAAREAID] ASC
		)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
		) ON [PRIMARY]
GO

GRANT SELECT,INSERT,UPDATE,DELETE ON [ext].[RETAILTRANSACTIONTAXTRANS] TO [DataSyncUsersRole]
GRANT INSERT ON [ext].[RETAILTRANSACTIONTAXTRANS] TO [UsersRole]
GO

New view

Consider that the view should expose exactly the same columns as the original table, plus the new column. As explained above, we have the same PK on both tables, so we will use it to LEFT JOIN the two.
Grant the SELECT privilege to the DataSyncUsersRole (CDX).

CREATE VIEW [ext].[RETAILTRANSACTIONTAXTRANSEXTVIEW] AS
(
	SELECT
	tt.[AMOUNT],
	tt.[CHANNEL],
	tt.[ISINCLUDEDINPRICE],
	tt.[REPLICATIONCOUNTERFROMORIGIN],
	tt.[SALELINENUM],
	tt.[STOREID],
	tt.[TAXCODE],
	tt.[TERMINALID],
	tt.[TRANSACTIONID],
	tt.[DATAAREAID],
	tt.[ROWVERSION],
	exttt.[ADDLSTRING1],
	exttt.MODIFIEDDATETIME,
	exttt.CREATEDDATETIME
	 from [ax].RetailTransactionTaxTrans tt
	LEFT JOIN [ext].[RETAILTRANSACTIONTAXTRANS] exttt
	ON 
		tt.[CHANNEL] = exttt.[CHANNEL] AND
		tt.[TERMINALID] = exttt.[TERMINALID] AND
		tt.[STOREID] = exttt.[STOREID] AND
		tt.[TRANSACTIONID] = exttt.[TRANSACTIONID] AND
		tt.[SALELINENUM] = exttt.[SALELINENUM] AND
		tt.[TAXCODE] = exttt.[TAXCODE] AND
		tt.[DATAAREAID] = exttt.[DATAAREAID]
)
GO
GRANT SELECT ON [ext].[RETAILTRANSACTIONTAXTRANSEXTVIEW] TO [DataSyncUsersRole]
GO

New table type (TVP)

A table-valued type is a convenient way to "pass around a data table". Eventually, we want to call a stored procedure and pass in a whole data table in order to be more efficient. An alternative would be to just pass in the individual parameters for each column and call the stored procedure multiple times. In this sample, since we want to create tax lines, we will usually create a few rows in one call (5 sales items with 2 different tax codes each would produce 10 rows). I created the TVP by copying the original one and expanding it with the new column.

CREATE TYPE [crt].[EXTRETAILTRANSACTIONTAXTRANSTABLETYPE] AS TABLE(
	[DATAAREAID] [nvarchar](4) NULL,
	[SALELINENUM] [numeric](32, 16) NULL,
	[STOREID] [nvarchar](10) NULL,
	[TAXCODE] [nvarchar](10) NULL,
	[TERMINALID] [nvarchar](10) NULL,
	[TRANSACTIONID] [nvarchar](44) NULL,
	[ADDLSTRING1] [nvarchar](200) NOT NULL)
GO
GRANT EXECUTE ON TYPE::[crt].[EXTRETAILTRANSACTIONTAXTRANSTABLETYPE] TO [UsersRole];
GO

New stored procedure (sproc)

The sproc will be supplied with the TVP and the channel ID. It will be called by our CRT data service below.

CREATE PROCEDURE [ext].[INSERTRETAILTRANSACTIONTAXTRANS]
			   (@bi_ChannelId		bigint,
			   @tvp_ExtRetailTransactionTaxTrans		[crt].[EXTRETAILTRANSACTIONTAXTRANSTABLETYPE] READONLY)
AS
BEGIN
...

CRT data service to write to the new table

Each CRT service (for data or other purposes) requires a Request class. In this example, we want to store additional tax line data from the transaction, so we can pass the SalesTransaction instance to our service via the Request.

public sealed class InsertTaxTransExtensionDataRequest : DataRequest
{
    public InsertTaxTransExtensionDataRequest(SalesTransaction transaction)
    {
        this.Transaction = transaction;
    }

    [DataMember]
    [Required]
    public SalesTransaction Transaction { get; private set; }
}

The data service must find the data we want to store and build a DataTable with multiple DataRows that correspond to the TVP in SQL. Make sure the columns are in the same order. Then we use the DatabaseContext class to make the call to the sproc, passing in the expected parameters.

using (DataTable taxTable = new DataTable("EXTRETAILTRANSACTIONTAXTRANSTABLETYPE"))
{
    taxTable.Columns.Add(DataAreaIdColumn, typeof(string)).DefaultValue = string.Empty;
    taxTable.Columns.Add(SaleLineNumColumn, typeof(decimal)).DefaultValue = 0m;
    taxTable.Columns.Add(StoreIdColumn, typeof(string)).DefaultValue = string.Empty;
    taxTable.Columns.Add(TaxCodeColumn, typeof(string)).DefaultValue = string.Empty;
    taxTable.Columns.Add(TerminalIdColumn, typeof(string)).DefaultValue = string.Empty;
    taxTable.Columns.Add(TransactionIdColumn, typeof(string)).DefaultValue = string.Empty;
    taxTable.Columns.Add(ADDLSTRING1Column, typeof(string)).DefaultValue = string.Empty;

    foreach (var line in tx.ActiveSalesLines)
    {
        foreach (var taxItem in line.TaxLines)
        {
            DataRow row = taxTable.NewRow();
            row[DataAreaIdColumn] = request.RequestContext.GetChannelConfiguration().InventLocationDataAreaId;
            row[SaleLineNumColumn] = line.LineNumber;
            row[StoreIdColumn] = tx.StoreId;
            row[TaxCodeColumn] = taxItem.TaxCode;
            row[TerminalIdColumn] = tx.TerminalId;
            row[TransactionIdColumn] = tx.Id;
            object oAddlString = taxItem.GetProperty("AdditionalString1");
            string addlString = string.Empty;
            if (oAddlString != null)
            {
                addlString = (string)oAddlString;
            }

            row[ADDLSTRING1Column] = addlString;
            taxTable.Rows.Add(row);
        }
    }

    ParameterSet parameters = new ParameterSet();
    parameters[ChannelIdVariableName] = request.RequestContext.GetPrincipal().ChannelId;
    parameters[TVPVariableName] = taxTable;

    int errorCode;
    using (var databaseContext = new SqlServerDatabaseContext(request))
    {
        errorCode = databaseContext.ExecuteStoredProcedureNonQuery("[ext].INSERTRETAILTRANSACTIONTAXTRANS", parameters);
    }
}

Finally, register this CRT data service with the CRT, in the commerceruntime.ext.config (RS) and the commerceruntime.ext.offline.config (MPOS offline).

Implement trigger in the appropriate place to call CRT data service

For our use case, it is perfect to piggy-back onto the SaveSalesTransactionDataRequest CRT request and provide an implementation of a post-trigger to call our new data service. In that trigger, we initialize the request to our new data service and let the CRT take care of routing the call.

class SaveSalesTransactionDataRequestTrigger : IRequestTrigger
{
    public IEnumerable<Type> SupportedRequestTypes
    {
        get
        {
            return new[] { typeof(SaveSalesTransactionDataRequest) };
        }
    }

    public void OnExecuted(Request request, Response response)
    {
        ThrowIf.Null(request, "request");
        ThrowIf.Null(response, "response");

        var newRequest = new InsertTaxTransExtensionDataRequest(((SaveSalesTransactionDataRequest)request).SalesTransaction);
        request.RequestContext.Execute(newRequest);
    }

    public void OnExecuting(Request request)
    {
    }
}

Do not forget to register the trigger in the commerceruntime.ext.config (RS) and the commerceruntime.ext.offline.config (MPOS offline), similarly as shown above.

Configure CDX

New AX7 schema table

Create a new AX7 schema table called “ext.RetailTransactionTaxTransEXTView” in the “Retail Channel schema” page for AX7 schema. Make sure the same columns are used as in the original table called “ax.RETAILTRANSACTIONTAXTRANS”, plus the new ADDLSTRING1 column.

Switch scheduler subjob to use view instead of original table

In "Scheduler subjobs", find the subjob called "RetailTransactionTaxTrans", change its table to the newly created one, and add the one new column.

Finally, use "Generate queries" in the "Retail channel schema" page to initialize CDX.

Resource file to automatically configure CDX after deployment

Brand new feature, to be continued at a future time…

Testing

Run through your scenario in your POS client to save a transaction. Then look in the database before and after to see that the extension table gets populated and synced with the P-0001 job.


Extensions.MyTaxTransExtensionDataServiceBlog

Product attribute usage in the channel




Note: Implemented with Dynamics 365 version 7.2.* (but likely working fine with lower or higher versions). Sample code can be downloaded at the end of this blog.

This information explains how to set up product attributes in Dynamics 365 for Operations (and similar products) and use them in code in the channel (in a CRT service).

Before we can see any new values in POS or write code against them, some setup steps are needed.

Setup

Adding a new product attribute

Steps for adding a new product attribute (e.g. ExternalTaxId) to a category of products (e.g. Gloves and Scarves in the fashion hierarchy) for the Houston channel:

  1. Create a new type under "Attribute type" (e.g. ExternalTaxId)
  2. Create a new attribute (e.g. ExternalTaxId) under "Attributes" with the attribute type created in #1. In this same form, make sure to open the Filter settings and save it once, even if you are not making any changes
  3. Add the created attribute to a new attribute group under "Attribute groups" (e.g. ExternalTaxId)
  4. Go to "Channel navigation categories" and select the navigation hierarchy (fashion navigation hierarchy) for the specific category (Gloves and Scarves). Under the "Product attribute groups" tab, add the new attribute GROUP.
  5. Go to "Channel categories and product attributes". Set attribute metadata for channel = Houston. Find the attribute (e.g. ExternalTaxId), set "Include attribute" to Yes, and save.
  6. Publish channel updates on the same form.

Saving values for product attributes

Browse to a product that is in the same category for which we added the new attributes (Products by category) and use “Setup/Product attributes” to save values.

Update the channel with changes

Go to the distribution schedule and run the distribution jobs 1040, 1070, and 1150.

Usage

You could now use MPOS for the respective channel to view the new value:

If you want to use the attribute in CRT code, you can simply query for it:

var channelId = context.Runtime.CurrentPrincipal.ChannelId;
int catalogId = 0;
var settings = QueryResultSettings.AllRecords;
var request = new GetProductAttributeValuesServiceRequest(channelId, catalogId, productId, settings);
var response = context.Runtime.Execute(request, context);
var externalProductTaxIdAttributeValue = response.AttributeValues.FirstOrDefault(av => av.Name == "ExternalTaxId");

if (externalProductTaxIdAttributeValue != null)
{
    externalTaxId = externalProductTaxIdAttributeValue.TextValue;
}

Inquiring version information of a Dynamics 365 for Finance and Operations deployment

There are two ways to find out the version information. Either use LCS or look at individual files on the box (in case the VM is not hosted on LCS).

In the LCS case, browse to the environment in question and follow the “View detailed version information” link. The following information will be available:

    • Platform update version (blue)
    • Binary hotfixes (red)
    • Application/X++ hotfixes (black)

    • Microsoft module versions (incl. Retail)

    • Custom module versions

Dynamics 365 for Finance and Operations hotfix and deployment cheat sheet (including Retail)


Overview

There are a few wikis at https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/index?toc=dynamics365/unified-operations/fin-and-ops/toc.json which help with specific dev ALM and deployment topics. I found, from my own experience and from questions of others, that it is difficult to pull all this information together into a single process. This write-up hopefully helps somebody who wants to update and follow the process without errors. It will certainly help me next time I need to take a hotfix, since I can just follow a simple cheat sheet. So let's get started.

There are 3 different update types. 1) A platform update is fully backwards compatible with the application; it is of a binary nature and can simply be deployed. 2) An X++ application hotfix ships fixes in X++ source code that can be integrated into one's own code (code merged), sort of like a customization. And finally, 3) a binary hotfix is an application update for other tools, binaries, and Retail source code. The latter is cumulative, so you will always get the latest. For that reason, use a good naming convention for the packages you download (and upload to the Asset library), as this helps later when you need to bring multiple environments to the same hotfix level.

Recommended (best) practices:

    • Download to and upload from the cloud development box. That way these files get transferred much faster and your intranet is not used.
    • If you have multiple development boxes, the steps below should be taken by one. Once all looks good, the other development boxes should get the changes via syncing VSTS (not via deployable package).
    • Some binary hotfixes depend on X++ hotfixes. Deploy these first (either by deployable package or VSTS), then tackle the binary update.
    • The LCS Asset library has a feature that allows for package merging. If you use it, you can decrease the overall deployment time. Unfortunately, merging does not work if the Retail combined package is involved; merging an X++ deployable package with a binary hotfix works.
    • Installed platform version, X++ hotfix (KB number) and binary hotfix (build version) can be inquired on the LCS detailed version info. See here for details.
    • There is no easy way to infer the KB number of a binary hotfix from the build version. One way to deal with this is to include the build version and download date in the package name. Then you can tell with high likelihood that a KB that is older than the binary package date is included. A KB with a newer date is likely not included in the package.

     

    See the picture below for an overall flow of the process. Refer to it while reviewing the following sections.

    Note that there are 4 different deployment packages that could be deployed (purple boxes). See the details below on how to get them.

    Note: For the following steps, the integration of the Retail Sdk's build with the Dynamics 365 for Operations build is assumed. If you do not have this set up, follow Retail Sdk and Dynamics 365 build definition.

    Platform updates

    Platform updates are the easiest. There is no code merge required and no VSTS needed; you just move the package to the Asset library and deploy it. See more information at FAQ monthly platform updates.

    X++ application hotfixes

    X++ hotfixes are not cumulative, so there may be some code merging needed if you pick multiple. Visual Studio and the Dynamics Operations VS addin will help with that.

    Download the hotfix

    Ideally, you pick the hotfix you need and are done with it. However, I have found it to be better in the long run to take all hotfixes. It incurs slightly more testing up-front, but less testing when updating to the next major version. Additionally, issues you just have not encountered yet will get fixed before you ever see them.

    • Log on to LCS on the development box, choose the right environment, and hit the X++ hotfix tile
    • Select All and hit the Add button
    • Download the package
    • Select all and download
    • Name it with meaningful data so you can identify the package later. I use the environment name, the date of download, the fact that this is not a binary update but an X++ update, etc. (e.g. NewX++HotfixesForSpringEnv170903)
    • Unblock and unzip the package

    Back up PackagesLocalDirectory

    I got burned by a failed hotfix application after my dev box became corrupted. I could not tell for sure which files had been touched… Since then, I do a simple robocopy backup to be able to roll back if something happens.

    • Open a cmd.exe window with elevated privileges (runas admin)
    • Change directory into the parent of your PackagesLocalDirectory folder (here K:\AosService, might be in J: also)
    • robocopy PackagesLocalDirectory PLD_BeforeHFWork_170803 /E /NFL /NDL
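
    If the hotfix application does go wrong, the same backup can be used to roll back. A minimal sketch, assuming the folder names from the example above (stop the AOS and related services first; /MIR makes the target an exact copy of the backup):

    rem Roll back PackagesLocalDirectory to the state captured before the hotfix work
    robocopy PLD_BeforeHFWork_170803 PackagesLocalDirectory /MIR /NFL /NDL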

    Prepare VSTS for the X++ hotfixes

    I have had issues with the VS addin in the past, so I always use the command line version. Additionally, it is important to understand the -prepare step. Use it! Otherwise, code merges you may need to do later will be hard.

    Update:
    Below is a batch script that I add to each Metadata folder (and to VSTS). Just update the environment variables, then un-comment and run the lines one at a time: first -prepare, then -install. See below for details. Here are the contents of the script:

    setlocal
    
    set HotfixPackageBundlePath=C:\Temp\Downloads\AllX++HotfixesTill06192018\HotfixPackageBundle.axscdppkg
    set PLD=k:\AosService\PackagesLocalDirectory
    set TFSUri=https://xxxx.visualstudio.com/defaultcollection
    
    rem bin\SCDPBundleInstall.exe -prepare -packagepath=%HotfixPackageBundlePath% -metadatastorepath=%PLD% -tfsworkspacepath=%PLD% -tfsprojecturi=%TFSUri%
    rem bin\SCDPBundleInstall.exe -install -packagepath=%HotfixPackageBundlePath% -metadatastorepath=%PLD% -tfsworkspacepath=%PLD% -tfsprojecturi=%TFSUri%
    
    endlocal

    Save it with a name like UpdateAppHotfixes.cmd. Run it from an elevated cmd console while the current directory is PackagesLocalDirectory.

        • Open Visual Studio and make sure you are logged in with the same account that is going to be used to access VSTS. If you are not sure, logout and log back in. All we want to get is a new valid authentication token so the steps below will succeed.
        • Close all VS instances
        • Open a cmd.exe window with elevated privileges (runas admin)
        • Change directory into PackagesLocalDirectory\bin folder (here K:\AosService\PackagesLocalDirectory\Bin, might be in J: also)
        • SCDPBundleInstall.exe -prepare -packagepath=C:\Temp\Downloads\NewX++HotfixesForSpringEnv170903\HotfixPackageBundle.axscdppkg -metadatastorepath=k:\AosService\PackagesLocalDirectory -tfsworkspacepath=k:\AosService\PackagesLocalDirectory -tfsprojecturi=https://XXXXX.visualstudio.com/defaultcollection
        • Once the command has finished, open Visual Studio and submit the newly added files with a meaningful changelist name

      Apply the hotfixes

      This step will apply the actual changes to the files that were prepared in the previous step.

        • Close all VS instances and keep using the same cmd.exe instance from above
        • SCDPBundleInstall.exe -install -packagepath=C:\Temp\Downloads\NewX++HotfixesForSpringEnv170903\HotfixPackageBundle.axscdppkg -metadatastorepath=k:\AosService\PackagesLocalDirectory -tfsworkspacepath=k:\AosService\PackagesLocalDirectory -tfsprojecturi=https://XXXXX.visualstudio.com/defaultcollection
        • Once the command has finished, check for conflicts: open Visual Studio and select Dynamics 365/Add-ins/Create project from conflicts.
          If there are conflicts, you need to resolve them
        • Do a full build: Dynamics 365/Build models, select all packages, under Options use the defaults plus select “Synchronize database”, then hit the Build button
        • When the build has succeeded without errors, submit the changed files with a meaningful changelist name

      Binary hotfixes

      Binary hotfixes are cumulative: whichever one you pick, you will get the latest. If the Retail channel components are not customized, there is no code merge needed.

      Download the binary hotfix

      • Log on to LCS on the development box and choose the right environment
      • Click the download binaries button
      • Name it with meaningful data so you can identify the package later. I use the date of download (e.g. AllBinary72UpdatesLatestPlatform170903)
      • Unblock the zip file and then unzip it
      • Upload the zipped package to LCS’s Asset library

      Apply the binary hotfix

      Use the LCS environment’s Maintain menu to deploy this package.

      Only in case of Retail channel customizations: Update the Retail Sdk mirror branch

      In order to do code merges effectively, it is suggested to use 2 branches. For more details, check the Retail Sdk Overview (at the end of the wiki page).

      Ideally, the Retail Sdk branch would be hosted in the same VSTS project, in parallel to the Trunk folder.

      In order to update it:

        • Make sure the mirror branch/folder is fully synced to latest version.
        • Close all but one Visual Studio instances
        • In a first Windows Explorer window, find the new Retail Sdk which we will use to update the mirror. On a brand new environment, find it in the service drive (K:\ or J:\) under “Retail Sdk”. If this is a binary hotfix, unzip the hotfix package as you downloaded it, and find the SDK in the RetailSDK\Code folder.

      • In a second Windows Explorer window open the location of the outdated mirror Retail Sdk branch/folder (where it is mapped from VSTS to local folder)
      • Delete all files in the outdated mirror Retail Sdk branch/folder (open in the second Windows Explorer Window)
      • Copy and paste all files from the new Retail Sdk into the folder you just cleaned (copy from second to first Windows Explorer window)
      • (Optional) If you have any doubt whether the shipped Retail Sdk has a build error, carry out these steps to verify:
      • (Optional) Make a temporary copy of the new Retail Sdk (from the hotfix) to any other place of your choice
      • (Optional) Open a Visual Studio 2015 msbuild command prompt and change directory to the temporary location
      • (Optional) Type “msbuild” and hit Enter (if this shows any build errors, please open a support request or bug as the shipped Retail Sdk should build without errors)
      • Deleting all files first and then adding the new Retail Sdk back (rather than copying over the top) ensures that files removed in the new Sdk are also properly removed from source control.
      • In “Source Control Explorer”, right click the mirror branch, “Add items to Folder…”, Add all folders from the same source location back. Make sure there are no “excluded items”, and hit Finish.
      • Make sure there are no files from the mirror branch listed under “Team Explorer”, “Pending Changes”, “Excluded Changes” and “Detected”. If there are, promote them to the “Included Changes”
      • Check In the changes.

      Only in case of Retail channel customizations: Code merge the Retail Sdk customization branch

      • Make sure you do not have any changed files in the customization branch before you start. If this is difficult to accomplish, create a new client mapping, get the customization branch into a different folder or machine and do the merge there. Do not start merging if you have open files.
      • In Source Control Explorer, right click the mirror branch and select “Branching and Merging”, “Merge…”
      • Make sure that the source is the mirror branch and the destination is your customization branch
      • Hit Next and Finish
      • Resolve any merge conflicts
      • Watch closely that all “Included files” are the correct files. These should only be the merged or updated files from the mirror
      • Watch closely that all “Excluded files” only include generated files. Do not promote them

      Only in case of Retail channel customizations: Test local Retail Sdk customization build and submit to VSTS

      Before checking in these changes, let’s make sure that everything builds fine. Open a Visual Studio MSBuild developer command window and type “msbuild” at the root of the Retail Sdk customization branch. Once everything builds fine, submit the changes with a meaningful changeset name.

    Run build on build machine

    Inspect the submitted changes in the VSTS code branch. In the example below, I see 2 checkins for the X++ hotfixes, one other code change, one to update the Retail Sdk mirror branch and one to code merge the Retail Sdk customization branch.

    Upload the AX and Retail deployable packages to LCS

    Find the packages in VSTS and upload them to LCS.

    Deploy AX and Retail deployable packages

    Deploy the packages from the LCS asset library (in the image below the 3rd and 4th).  The RetailDeployablePackage is only needed in case of Retail channel customizations.

    Once the deployment succeeded, you should see the tile count go down.

    Retail only: Update channel components

    Follow the wiki about how to deploy the store components (Modern POS, Modern POS Offline, Hardware station, Retail Store Scale Unit)

CDX extensibility – Synchronizing data from channel to HQ via P-job



By default, there are very few tables that get synchronized from the store to HQ. These are mostly tables related to the transactions being generated. However, there are scenarios where customers may want to create other data in their own tables in the channel and bring it back to HQ. In many cases, that does not need to happen in real time, and the task can be accomplished by using the pull job architecture. Below are the steps to set this up.

Dynamics 365 for Operations table creation

Since the records will trickle in from multiple channels, stores and terminals, the table’s index must include the data that differentiates the records. A good example to look at is the RetailTransactionTable.
Create the table in the AOT and follow these guidelines:

  • A column called “REPLICATIONCOUNTERFROMORIGIN” of type RetailReplicationCounter needs to be added.
  • Add the other data columns that you would like to synchronize (in this example, FieldStr1 and FieldInt1)
  • Set “Created Date Time” and “Modified Date Time” to Yes.
  • Create an index that includes the fields that are needed to differentiate the records well. For retail stores, that could be Channel, Store, Staff, Terminal and the id of the record, called ChannelOriginatedDataId in this example.


Note: How to put this file into VSTS so the change is reproducible and deployable into other environments is omitted here.

Create the table schema for the Channel database

Create a table with the same name but schema [ax] in SSMS. Here are some specific requirements to follow:

  • Include columns called [REPLICATIONCOUNTERFROMORIGIN], [ROWVERSION], [CREATEDDATETIME] and [MODIFIEDDATETIME] (details below)
  • If the table in HQ is per company (SaveDataPerCompany = yes), include a column [DataAreaId]
  • Create a clustered unique index for column [REPLICATIONCOUNTERFROMORIGIN]
  • Create a nonclustered primary key index for the columns that uniquely differentiate the records, including [DATAAREAID] if data is saved per company
IF (SELECT OBJECT_ID('ax.CHANNELORIGINATEDDATATABLE')) IS NULL 
BEGIN
	CREATE TABLE [ax].[CHANNELORIGINATEDDATATABLE](
		[CHANNELORIGINATEDDATAID] [nvarchar](44) NOT NULL,
		[FIELDSTR1] [nvarchar](200) NOT NULL DEFAULT (''),
		[FIELDINT1] [int] NOT NULL DEFAULT (0),
		[CHANNEL] [bigint] NOT NULL,
		[STORE] [nvarchar](10) NOT NULL,
		[TERMINAL] [nvarchar](10) NOT NULL,
		[STAFF] [nvarchar](25) NOT NULL,
		[CREATEDDATETIME] [datetime] NOT NULL,
		[MODIFIEDDATETIME] [datetime] NOT NULL,
		[REPLICATIONCOUNTERFROMORIGIN] [int] IDENTITY(1,1) NOT NULL,
		[DATAAREAID] [nvarchar](4) NOT NULL,
		[ROWVERSION] [timestamp] NOT NULL,
	 CONSTRAINT [I_AX_CHANNELORIGINATEDDATATABLE_PK] PRIMARY KEY NONCLUSTERED 
	(
		[CHANNEL] ASC,
		[STORE] ASC,
		[TERMINAL] ASC,
		[CHANNELORIGINATEDDATAID] ASC,
		[DATAAREAID] ASC
	)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY],
	 CONSTRAINT [I_AX_CHANNELORIGINATEDDATATABLE_REPLICATIONCOUNTERFROMORIGIN] UNIQUE CLUSTERED 
	(
		[REPLICATIONCOUNTERFROMORIGIN] ASC
	)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
	) ON [PRIMARY]
END
GO

GRANT SELECT,INSERT,UPDATE,DELETE ON [ax].CHANNELORIGINATEDDATATABLE TO [DataSyncUsersRole]
GRANT INSERT ON [ax].[CHANNELORIGINATEDDATATABLE] TO [UsersRole]

Note: How to put this file into VSTS and register it in the Retail Sdk’s Customization.settings file so the change is reproducible and deployable into other environments is omitted here.

Add the new table to the list of Channel tables

CDX needs to be aware of the schema of the channel table in order to map and sync it with the HQ table. In Retail and commerce > Headquarters setup > Retail Scheduler > Retail channel schema, click Channel tables and add a new table with the new fields:

Add a new scheduler subjob

In Retail and commerce > Headquarters setup > Retail Scheduler > Scheduler subjobs create a new subjob that maps both HQ and channel table. Make sure that:

  • the special column [REPLICATIONCOUNTERFROMORIGIN] is both mapped in the field mapping and used in the replication counter field, and
  • the slider for “Pull Data” is set to Yes.

Add the subjob to the job

In Retail and commerce > Headquarters setup > Retail Scheduler > Scheduler jobs add the subjob to a job. You could reuse the existing P-0001 job as in this example:

Finally, in Retail and commerce > Headquarters setup > Retail Scheduler you must regenerate the sync helper classes with “Generate queries”.
Note: The previous steps in the Dynamics 365 for Operations user interface to configure CDX are shown here for better illustration. Ideally, these should be coded in X++ to get reliable and automatic deployments into other environments.

Verification

Add some data into the table in the channel side ([ax] schema) and run the pull job. Then inspect the data in the HQ side:
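
For example, a test row can be inserted directly in the channel database and then checked after the pull job has run. This is just an illustrative sketch; the channel RecId, store, terminal, staff and company values are made up and must match your environment:

-- Insert a test record into the channel table (all values are examples only)
INSERT INTO [ax].[CHANNELORIGINATEDDATATABLE]
    ([CHANNELORIGINATEDDATAID], [FIELDSTR1], [FIELDINT1], [CHANNEL], [STORE], [TERMINAL], [STAFF],
     [CREATEDDATETIME], [MODIFIEDDATETIME], [DATAAREAID])
VALUES
    ('TEST-0001', 'Hello from the channel', 42, 5637144576, 'HOUSTON', 'HOUSTON-1', '000160',
     GETUTCDATE(), GETUTCDATE(), 'usrt');

-- REPLICATIONCOUNTERFROMORIGIN (identity) and ROWVERSION (timestamp) are filled in automatically
SELECT * FROM [ax].[CHANNELORIGINATEDDATATABLE] ORDER BY [REPLICATIONCOUNTERFROMORIGIN] DESC;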

Data access in the CommerceRuntime



Below are some code samples for reading and writing data in the channel database. This is just for the channel database; it does not necessarily mean that the data needs to go to D365 HQ. It may, either via a CDX pull job or via a Real-time Service call. This information is only for simple reads and writes in the channel db.

Prerequisites:
Dynamics 365 for Operations (1611)
KB3214687
KB3214679

First, create the SQL table in ax schema and a view in crt schema:

    CREATE TABLE [ax].[MY_EXTENSION](
		[DATAAREAID] [nvarchar](4) NOT NULL,
		[RECID] [bigint] NOT NULL,
		[COL1] [int] NOT NULL,
		[COL2] [nvarchar](100) NOT NULL,
		[COL3] [bit] NOT NULL,
		[ACCOUNTNUM] [nvarchar](20) NOT NULL)
    . . .
    END
    GO

    CREATE VIEW [crt].[MY_EXTENSIONVIEW] AS
    (
        SELECT ACCOUNTNUM, DATAAREAID, COL1, COL2, COL3 FROM [ax].[MY_EXTENSION]
    )
    GO

    GRANT SELECT ON [crt].[MY_EXTENSIONVIEW] TO [UsersRole];
    GO

The sample shown adds 3 columns that are populated from 3 extension properties.

Grant the right permissions to the table and view. This is the list of supported SQL roles:

  • DataSyncUsersRole: used by the CDX process account
  • PublishersRole: used by the publishing process account (eCommerce)
  • ReportUsersRole: used by the reporting user account
  • UsersRole: used by the runtime user (RetailServer)
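
As an illustration, these are typical grants for the sample table and view from this walkthrough; adjust the object names and roles to what your extension actually needs:

-- RetailServer (CommerceRuntime) needs to read the view and write the table
GRANT SELECT ON [crt].[MY_EXTENSIONVIEW] TO [UsersRole];
GRANT SELECT, INSERT, UPDATE, DELETE ON [ax].[MY_EXTENSION] TO [UsersRole];
GO

-- Only needed if the table is also synchronized by CDX
GRANT SELECT, INSERT, UPDATE, DELETE ON [ax].[MY_EXTENSION] TO [DataSyncUsersRole];
GO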

Next, create a sproc for updating:

CREATE PROCEDURE [crt].[MY_UPDATEEXTENSIONPROPERTIES]
    @TVP_EXTENSIONPROPERTIESTABLETYPE         [crt].EXTENSIONPROPERTIESTABLETYPE READONLY
AS
BEGIN
	DECLARE @nvc_DataAreaId NVARCHAR(4);
	DECLARE @recId bigint;
	DECLARE @accountNum [nvarchar](20);

	DECLARE @Col1Value int = 0;
	DECLARE @Col2Value nvarchar(100) = '';
	DECLARE @Col3Value bit = 0;
	
       SELECT DISTINCT TOP 1 @recId = tp.PARENTRECID, @nvc_DataAreaId = ct.DATAAREAID, @accountNum = ct.ACCOUNTNUM
       FROM @TVP_EXTENSIONPROPERTIESTABLETYPE tp
       JOIN [ax].CUSTTABLE ct on ct.RECID = tp.PARENTRECID
       WHERE tp.PARENTRECID <> 0
			
	SELECT @Col1Value = COALESCE(tp.PROPERTYVALUE, 0)
	FROM @TVP_EXTENSIONPROPERTIESTABLETYPE tp 
	where tp.PARENTRECID <> 0 and tp.PROPERTYNAME = 'COL1'
	
	SELECT @Col2Value = COALESCE(tp.PROPERTYVALUE, '')
	FROM @TVP_EXTENSIONPROPERTIESTABLETYPE tp 
	where tp.PARENTRECID <> 0 and tp.PROPERTYNAME = 'COL2'
	
	SELECT @Col3Value = CAST(CASE WHEN tp.PROPERTYVALUE = 'True' THEN 1 ELSE 0 END AS BIT)
	FROM @TVP_EXTENSIONPROPERTIESTABLETYPE tp 
	where tp.PARENTRECID <> 0 and tp.PROPERTYNAME = 'COL3'

	MERGE INTO [ax].[MY_EXTENSION] dest
	USING (SELECT @accountNum as ACCOUNTNUM) as source on  dest.ACCOUNTNUM = source.ACCOUNTNUM
	WHEN matched then
		update set dest.COL1 = @Col1Value, dest.COL2 = @Col2Value, dest.COL3 = @Col3Value
	when not matched then
		INSERT (RECID, ACCOUNTNUM, DATAAREAID, COL1, COL2, COL3)
		VALUES (@recId, @accountNum, @nvc_DataAreaId, @Col1Value, @Col2Value, @Col3Value);
END
GO

GRANT EXECUTE ON [crt].[MY_UPDATEEXTENSIONPROPERTIES] TO [UsersRole];
GO

Note the MERGE INTO statement. It allows the procedure to either create or update a record.

For reading in the CRT, especially for extension properties, use SqlPagedQuery, SqlServerDatabaseContext and ExtensionsEntity:

var query = new SqlPagedQuery(QueryResultSettings.SingleRecord)
{
    Select = new ColumnSet(new string[] { "COL1", "COL2", "COL3" }),
    From = "MY_EXTENSIONVIEW",
    Where = "ACCOUNTNUM = @accountNum AND DATAAREAID = @dataAreaId",
};

query.Parameters["@accountNum"] = customer.AccountNumber;
query.Parameters["@dataAreaId"] = request.RequestContext.GetChannelConfiguration().InventLocationDataAreaId;
using (var databaseContext = new SqlServerDatabaseContext(request))
{
    ExtensionsEntity extensions = databaseContext.ReadEntity<ExtensionsEntity>(query).FirstOrDefault();
    if (extensions != null)
    {
        var col1 = extensions.GetProperty("COL1");
        if (col1 != null)
        {
            customer.SetProperty("COL1", col1);
        }
        else
        {
            customer.SetProperty("COL1", 0);
        }
 	 
        . . .
    }
    else
    {
        customer.SetProperty("COL1", 0);
    }
}

In order to write the data to the db, use this code to call the sproc shown above:

using (var databaseContext = new SqlServerDatabaseContext(r))
using (var transactionScope = new TransactionScope())
{
    if (!r.Customer.ExtensionProperties.IsNullOrEmpty())
    {
        ParameterSet parameters = new ParameterSet();
        parameters["@TVP_EXTENSIONPROPERTIESTABLETYPE"] = new ExtensionPropertiesTableType(r.Customer.RecordId, r.Customer.ExtensionProperties).DataTable;
        databaseContext.ExecuteStoredProcedureNonQuery("MY_UPDATEEXTENSIONPROPERTIES", parameters);
    }

    transactionScope.Complete();
}

If you want to write other data to the database (data that is not extension properties of an entity), build the SqlParameters one by one and match them in your sproc. For reading, it may be easier to just treat the data as extension properties and then convert it to whatever you want. Or you could create your own entity and query for that.
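
A minimal sketch of that approach, assuming a hypothetical sproc [crt].[MY_UPDATEEXTENSION] that takes the individual columns as named parameters (the sproc name, parameter names and values are examples only; request and customer come from the handler as in the samples above):

using (var databaseContext = new SqlServerDatabaseContext(request))
using (var transactionScope = new TransactionScope())
{
    // Build the parameters one by one; they must match the parameters declared by your sproc
    ParameterSet parameters = new ParameterSet();
    parameters["@nvc_AccountNum"] = customer.AccountNumber;
    parameters["@nvc_DataAreaId"] = request.RequestContext.GetChannelConfiguration().InventLocationDataAreaId;
    parameters["@i_Col1"] = 42;               // example value
    parameters["@nvc_Col2"] = "some text";    // example value
    parameters["@b_Col3"] = true;             // example value

    databaseContext.ExecuteStoredProcedureNonQuery("MY_UPDATEEXTENSION", parameters);

    transactionScope.Complete();
}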

Hosting a downloaded dev VHD of Dynamics 365 for Operations

In order to save cost or time, it may be practical for partners/customers to host their own version of the VHD that is based on the official VHD downloadable from Microsoft. One scenario could be that the Contoso demo data is good enough for development, but some additional data setup, hotfix application or code customizations may be needed. These steps could be carried out once by one person, and the resulting VHD could then be re-used. Some partners may be on a monthly cadence to “rev” their dev environments. There are two options to do this:

  1. Reuse the VHD and host it locally in HyperV or similar virtualization technologies
  2. Reuse the VHD and host it in Azure

Option 1) is straightforward and many will opt for it. There are some cases where it is useful to host in Azure though, mostly for simpler sharing of a VM or because an appropriate host for the VM is not available (e.g. the laptop is not powerful enough). Here is a step-by-step guide that worked for me to bring the VHD up to Azure so I can spin up a new instance in a relatively short time:

  1. Download the VHD from https://connect.microsoft.com/ and unpack it
  2. (optional) prepare the VHD with data, binary fixes or customizations
  3. Upload the VHD to your Azure subscription. If you have not done so already, install the Windows Azure Sdk. If you have not done so already, create a management certificate for Azure on the local machine and upload it to Azure (basically, this grants access to the Azure subscription). Then follow this: https://docs.microsoft.com/en-us/azure/virtual-machines/virtual-machines-windows-classic-createupload-vhd. I did not have to run sysprep on the VHD downloaded from Microsoft; I think this step is needed if you carry out step 2).
    When the steps are done, you should see the new VHD ready to be used as a template for creating new VMs.
  4. Create a new VM based on that VHD: https://docs.microsoft.com/en-us/azure/virtual-machines/virtual-machines-windows-classic-createportal?toc=%2fazure%2fvirtual-machines%2fwindows%2fclassic%2ftoc.json
     (Optional but suggested) Update the administrator password to something else, so nobody else can hijack your VM (if they somehow had the RDP connection)

Update 3/30/2017:
I have added a PowerShell script that is ready to be executed after changing some of the variables:
UploadD365VHDUpdate5.ps1
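
For reference, here is a minimal sketch of the kind of commands such a script needs, using the classic (Service Management) Azure PowerShell cmdlets that the linked article is based on; the subscription name, storage account and paths are placeholders:

# Sign in and pick the subscription (classic/ASM cmdlets)
Add-AzureAccount
Select-AzureSubscription -SubscriptionName "My Subscription"

# Upload the local VHD to a storage account as a page blob
$localVhd = "C:\VHDs\FinAndOps-Dev.vhd"
$destVhd  = "https://mystorageaccount.blob.core.windows.net/vhds/FinAndOps-Dev.vhd"
Add-AzureVhd -LocalFilePath $localVhd -Destination $destVhd

# Register the uploaded VHD as an OS image so new VMs can be created from it
Add-AzureVMImage -ImageName "D365-Dev-Image" -MediaLocation $destVhd -OS Windows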

Move POS screen layouts from one environment to another

Using the POS screen layout designer is tedious work, and very likely you would not want to do this in every environment over and over again (production, sandbox, dev, etc.). The solution to this (and similar tasks) is to use the DIXF framework to import and export entities. For this particular task of moving POS screen layouts, here are the steps to follow:

  1. Configure the data source
    1. Enter the “Data management” workspace
    2. Click on “Configure data source”
    3. Select “CSV-Unicode” and edit
    4. Set the “text qualifier” to ~ (tilde)
    5. Save the data source
  2. Export
    1. configure the data source as above
    2. Start a new export project with target data source set to “CSV-Unicode”
    3. Add the following entities:
      1. RetailImagesEntity (POS layout images)
      2. RetailButtonGridEntity (POS button grid)
      3. RetailButtonGridButtonsEntity (POS button grid buttons)
        1. Click “View map” and “Mapping details”
        2. Enable text qualifier for DisplayText and Save
      4. RetailTillLayoutEntity (POS screen layouts)
        1. Follow the same steps to edit “Mapping details”, but enable the text qualifier for all fields
      5. RetailTillLayoutZoneEntity (POS screen layout zones)
      6. RetailTillLayoutButtonGridZoneEntity (POS screen layout button grid zones)
      7. RetailTillLayoutImageZoneEntity (POS screen layout images zones)
      8. RetailTillLayoutReportZoneEntity (POS screen layout report zones)
    4. Download the package
  3. Import
    1. configure the data source as above
    2. Start a new import project and pick source data format “Package”
    3. Upload the package and import it

Looking for AX/Retail errors in the EventLog but do not know where to look?

One solution is to look at all EventLog entries for anything “Dynamics”. Here is how to do it:

  • Open the Event Viewer.
  • Custom Views / Create new custom view.
  • Select the Event levels you want to see.
  • Select the Event logs you want to see. Here, make sure you select Applications and Services Logs/Microsoft/Dynamics.
  • Hit OK and call it “Dynamics”.
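
If you prefer PowerShell over the Event Viewer UI, a similar filter can be sketched like this (the exact log names vary by the components installed on the machine):

# List all Dynamics-related event logs and how many records they contain
Get-WinEvent -ListLog "*Dynamics*" | Format-Table LogName, RecordCount

# Read the most recent entries from those logs (errors, warnings and information)
Get-WinEvent -FilterHashtable @{ LogName = "*Dynamics*"; Level = 1,2,3,4 } -MaxEvents 100 |
    Format-Table TimeCreated, LevelDisplayName, ProviderName, Message -Wrap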
