Use Azure Data Explorer to query LCS Raw information logs

Note: This is now also part of the official documentation on docs.microsoft.com: Use Azure Data Explorer to query raw information logs – Finance & Operations | Dynamics 365 | Microsoft Docs. I will continue to maintain the samples at the end of this blog.

There are occasions when a customer, partner, consultant, or support engineer needs to look at the low-level Dynamics 365 Finance & Operations telemetry data. These use cases include troubleshooting errors, investigating performance, or simply gaining some additional understanding of how the platform works. Authorized users can access the telemetry data via the Environment monitoring part of the LCS portal, filter it in a few different ways, and display it in the portal's raw logs section, where a data grid can be used to inspect the log entries. LCS does not allow for more sophisticated pivoting, but the telemetry data can also be downloaded in CSV format so that other tools, such as Excel, can be used.

However, Excel is not the optimal tool for advanced querying of this data. A much better fit, designed exactly for this purpose, is the Azure Data Explorer. It provides Kusto, a query language optimized for high-performance data analytics. Answering questions like "how often has a certain process taken place", "how long has it taken in 90% of the cases", or "how often per hour has a certain action taken place over the course of a day" becomes a lot easier, and the answers can be backed up with powerful graphics as well.
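For example, a query of the following shape answers the "how often per hour" question and renders it as a chart. This is a sketch; the table and column names depend on what you ingest later in this post:

// Events per hour over the ingested window; "TimeStamp" is a placeholder
// for whatever the datetime column in your export is called.
MPosErrors24h
| summarize EventsPerHour = count() by bin(todatetime(TimeStamp), 1h)
| render timechart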

A lesser-known feature of the Azure Data Explorer is that it supports ingestion of CSV files. We can use it to upload and stage our CSV data files so they can be queried with the Kusto language. If you have not set up an Azure Data Explorer cluster yet, follow these steps.

Steps to upload to Azure Data Explorer

  • Run your query on the LCS raw logs page.
  • Important: Adjust the time interval or filter to get to the right data (the row limit for the export in the next step is 5000).
  • Export the grid to Excel.
  • Open the file in Excel and save it without making any changes (this seems to fix a formatting issue).
  • In your Azure Data Explorer, right-click the cluster in the tree view, select "ingest new data", and then on the next page "ingest data from a local file".
  • Pick your cluster, name a new table for the data to be imported into, select up to 10 CSV files to import, and select the CSV format. Hit next a few times until your data is imported.
  • Use the Query tile to write a Kusto query against your data, starting with the sanity check below.
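A quick first query confirms that the ingestion worked (MPosErrors24h stands for whatever table name you chose during ingestion):

// Peek at a few ingested rows; swap "take 10" for "count" to verify the row total.
MPosErrors24h
| take 10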

To learn more about the Kusto query language, go here.

Sample queries

Modern POS

All errors

Sometimes it's nice to get an idea of what all the errors are. The text fields that describe errors are not used consistently, so it's better to use the EventIds and map them to the correct error names. They can be looked up in these two Commerce SDK files (though the code snippet below may give you all of them already):

  • Pos.RetailLogger.js
  • Microsoft.Dynamics.Retail.Diagnostics.Sinks.man
// Note: Filter EventSeverity == Error before exporting/ingesting 
MPosErrors24h
| extend EventName = case(
EventId == 53231, "runtimeInterceptorFailed",
EventId == 48434, "posPaymentTerminalAuthorizeRefundActivityFailed",
EventId == 48347, "posAuthorizeOrRefundPaymentFailed",
EventId == 48104, "posTenderPaymentOperationFailed",
EventId == 46802, "extensibilityFrameworkExecuteRuntimeRequestFailed",
EventId == 44322, "viewModelCartProcessTextFailed",
EventId == 42112, "viewModelCartProcessTextFailed",
EventId == 42106, "RetailProxyExtensionAdapterManagerMethodNotFound",
EventId == 40450, "posInitiatedHardwareStationRequestFailed",
EventId == 40409, "peripheralsBarcodeScannerEnableFailed",
EventId == 40325, "operationSetQuantityOptionsValidationFailed",
EventId == 40255, "RetailOperationFailed",
EventId == 40209, "coreRetailOperationOnExecutingFailed",
EventId == 40191, "modelManagersChainedRequestExecutionFailed",
EventId == 40170, "modelManagersErrorParserHardwareStationError",
EventId == 40150, "modelManagersChainedRequestFactoryExecuteOfflineLogonRequestUnAvailable",
EventId == 40104, "ModelManagersCommerceRuntimeRequestError",
EventId == 40101, "ModelManagersRetailServerRequestError",
EventId == 7505 , "PaymentConnectorLogErrors",
EventId == 7503 , "PaymentConnectorLogException",
EventId == 7014 , "HardwareStationActionFailure",
EventId == 1001 , "CrtExecuteRequestErrorFailure",
strcat("*************** Unknown EventId: ", EventId))
| where EventSeverity == "Error"
| summarize count() by EventId, EventName
| order by count_ desc

Now it's easier to decide where to look next, i.e. what more detailed queries to issue for specific errors…

Errors returned by RetailServer

// Note: Filter EventSeverity == Error before exporting/ingesting 
MPosErrors24h
| extend EventName = case(
EventId == 53231, "runtimeInterceptorFailed",
EventId == 48434, "posPaymentTerminalAuthorizeRefundActivityFailed",
EventId == 48347, "posAuthorizeOrRefundPaymentFailed",
EventId == 48104, "posTenderPaymentOperationFailed",
EventId == 46802, "extensibilityFrameworkExecuteRuntimeRequestFailed",
EventId == 44322, "viewModelCartProcessTextFailed",
EventId == 42112, "viewModelCartProcessTextFailed",
EventId == 42106, "RetailProxyExtensionAdapterManagerMethodNotFound",
EventId == 40450, "posInitiatedHardwareStationRequestFailed",
EventId == 40409, "peripheralsBarcodeScannerEnableFailed",
EventId == 40325, "operationSetQuantityOptionsValidationFailed",
EventId == 40255, "RetailOperationFailed",
EventId == 40209, "coreRetailOperationOnExecutingFailed",
EventId == 40191, "modelManagersChainedRequestExecutionFailed",
EventId == 40170, "modelManagersErrorParserHardwareStationError",
EventId == 40150, "modelManagersChainedRequestFactoryExecuteOfflineLogonRequestUnAvailable",
EventId == 40104, "ModelManagersCommerceRuntimeRequestError",
EventId == 40101, "ModelManagersRetailServerRequestError",
EventId == 7505 , "PaymentConnectorLogErrors",
EventId == 7503 , "PaymentConnectorLogException",
EventId == 7014 , "HardwareStationActionFailure",
EventId == 1001 , "CrtExecuteRequestErrorFailure",
strcat("*************** Unknown EventId: ", EventId))
| where EventSeverity == "Error"
| where EventId == 40101
| summarize count() by requestAction, error
| order by count_ desc

CSU

All errors

Similar to the MPOS errors, we can get these from the Commerce SDK (Microsoft.Dynamics.Retail.Diagnostics.Sinks.man):

// Note: Filter EventSeverity == Error before exporting/ingesting 
CSUErrors24h
| extend EventName = case(
EventId == 1017, "CrtSlowHandlerExecution",
EventId == 1005, "CrtExecuteRequestWarningFailure",
EventId == 5055, "RetailServerRequestWarningFailure",
EventId == 60208, "SyncLibraryMergeDataIntoTableWarning",
EventId == 60125, "ProcessDeleteRequestStart",
EventId == 2617, "CrtWorkflowUserAuthenticationRequestHandlerFailure",
EventId == 2486, "CrtServicesEmployeePasswordDoesNotMatch",
EventId == 5106, "RetailServerSecretRetrievalWarning",
EventId == 2503, "CurrencyServiceGetSupportedChannelCurrenciesNotFound",
EventId == 6943, "RtsClientLibraryApiExecutionWarning",
EventId == 3007, "CrtTransactionServiceClientRtsExecutionWarning",
EventId == 2413, "CrtServicesSalesOrderTransactionServiceMarkReturnedItemsFailure",
EventId == 2483, "CrtServicesLocalLogonFailedDueToIncorrectStaffId",
strcat("*************** Unknown EventId: ", EventId))
| summarize count() by EventId, EventName
| order by count_ desc

A next step would be to go through each of these error types and look at them more closely. A few of them could indicate quality issues that affect the user experience.

In many cases, these errors can be fixed by cleaning up the extension code, adding proper SQL indexes, or investigating better approaches to the problem.

In some cases, these errors could indicate problems with out-of-box experience (OOBE) or deployment. Please open a support request to get these fixed by Microsoft.

Slow CRT handlers by request type

// Note: Filter EventSeverity == Error before exporting/ingesting 
CSUErrors24h
| extend EventName = case(
EventId == 1017, "CrtSlowHandlerExecution",
EventId == 1005, "CrtExecuteRequestWarningFailure",
EventId == 5055, "RetailServerRequestWarningFailure",
EventId == 60208, "SyncLibraryMergeDataIntoTableWarning",
EventId == 60125, "ProcessDeleteRequestStart",
EventId == 2617, "CrtWorkflowUserAuthenticationRequestHandlerFailure",
EventId == 2486, "CrtServicesEmployeePasswordDoesNotMatch",
EventId == 5106, "RetailServerSecretRetrievalWarning",
EventId == 2503, "CurrencyServiceGetSupportedChannelCurrenciesNotFound",
EventId == 6943, "RtsClientLibraryApiExecutionWarning",
EventId == 3007, "CrtTransactionServiceClientRtsExecutionWarning",
EventId == 2413, "CrtServicesSalesOrderTransactionServiceMarkReturnedItemsFailure",
EventId == 2483, "CrtServicesLocalLogonFailedDueToIncorrectStaffId",
strcat("*************** Unknown EventId: ", EventId))
| where EventId == 1017
| summarize count(), sum(inclusiveExecutionTime), percentiles(inclusiveExecutionTime, 75, 90) by handlerAssemblyName, requestType
| order by sum_inclusiveExecutionTime desc

F&O

All errors

// Note: use selection "All error events" to only get the errors
FNOErrors24h
| summarize count() by formName, targetName, errorLabel
| order by count_ desc

formName, targetName, and errorLabel are all good candidates to look up in Azure DevOps to find the code location. That gives more insight into the areas where these errors occur.

Example: The @SYS18885 error is thrown from the ReqPlanData class. Now I can focus on figuring out why that is the case (is master planning not configured correctly?).

Slow SELECT queries

// use selection "slow queries" in LCS diagnostics
SlowQueries
| extend MainTableName = extract("FROM\\s([A-Z0-9]+)\\s", 1, statement)
| where statement contains "SELECT"

Slow inserts

// use selection "slow queries" in LCS diagnostics
SlowQueries
| extend MainTableName = extract("INSERT INTO ([A-Z0-9]+)\\s", 1, statement)
| where statement contains "INSERT"

Slow deletes

// use selection "slow queries" in LCS diagnostics
SlowQueries
| extend MainTableName = extract("DELETE FROM ([A-Z0-9]+)\\s", 1, statement)
| where statement contains "DELETE FROM"
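The extracted MainTableName makes it easy to see which tables dominate the slow statements. A sketch for the SELECT case (the same summarize works for the insert and delete variants):

// Aggregate slow SELECT statements by the table they read from.
SlowQueries
| where statement contains "SELECT"
| extend MainTableName = extract("FROM\\s([A-Z0-9]+)\\s", 1, statement)
| summarize count() by MainTableName
| order by count_ desc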

Batch jobs performance

// use selection "All logs" and add query "where TaskName Equals BatchFinishedOneTask" in LCS diagnostics
BatchFinishedOneTask
| summarize count(), sum(durationInMilliSeconds), percentiles(durationInMilliSeconds, 75, 90) by className
| order by sum_durationInMilliSeconds desc

Failing Dynamics 365 F&O deployments? Last resort workarounds

This post originally appeared on https://dynamicsnotes.com/failing-dynamics-365-fo-deployments-last-resort-workarounds/.

There are times when a deployment fails and even a retry does not help. In these cases, a service ticket should be opened with the Microsoft engineers.

However, there are cases when this is not feasible or helpful.  For example:

    • it's a Tier 1 development environment and you caused the issue, or
    • you cannot wait and need to get it done very fast, or
    • you moved the database but did not run the Retail re-provisioning tool and the Retail deployment fails now (and you do not care because it's not a Retail project).

In these and other cases, it may be OK to just step over the failing step and let the deployment finish (in non-production environments).

The following steps can be used as a workaround. Again, this is almost "hack" territory, but sometimes it is needed…

  1. Find the step number that failed. LCS should tell you. Say, for a moment, it's step 43.
  2. Wait till the deployment is in the "Failed" state.
  3. Log in to the VM where this error occurred. This can also be found on the LCS portal.
  4. Find the current runbook.xml. It's under C:\RunbookOutput and is going to be the latest changed file.
  5. Open it in your favorite XML editor (e.g. Notepad++) and find the step with that number (search for ">43<").
  6. Mark the step "Completed" (from "Failed") — see the sketch after this list.
  7. Save the file and resume the deployment from LCS.
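For illustration, the edit in step 6 might look roughly like this. This is a hypothetical excerpt; the exact runbook schema differs between platform versions, but the step number and its state text are what you are looking for:

<!-- Hypothetical runbook excerpt; element names may differ by version. -->
<Step>
  <ID>43</ID>
  <!-- Changed from "Failed" so the runbook steps over this on resume: -->
  <StepState>Completed</StepState>
</Step>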

Note, sometimes I have found that marking this step is not enough. If that is the case, you can also edit the PowerShell file that the step calls into and essentially comment out all the code. The PowerShell file will be in the service directory under AOSService\DeployablePackages.

Again, this is a hack, but sometimes desperate times call for desperate measures.

Simulating network “issues”

This post originally appeared on https://dynamicsnotes.com/simulating-network-issues/.

In some cases it is very useful to see how a web site would perform under network conditions that are worse than what you currently have. You can use this tool to simulate them: https://jagt.github.io/clumsy/. For example, if you want to simulate what a network latency of 200ms feels like while using POS, run the tool on the same computer as POS and configure it with 200ms lag on outbound packets.

Even if you are on a good network, with this tool you can evaluate what it would look like for someone with higher latency (or other network conditions).

Overriding CommerceRuntime tax calculation with a 3rd party implementation

Note: Implemented with Dynamics 365 version 7.2.* (but likely working fine with lower or higher versions). Sample code can be downloaded at the end of this blog.

This post explains how to override the tax calculation in the CRT in Dynamics 365 for Operations (and similar products). It is a simple example of how to integrate calls to external tax providers and how to embed the returned values back into the CRT. The code is of non-production quality; its main purpose is to show the concepts only.

High-level overview

In this example, I demonstrate how to short-circuit any CRT tax calculations (by implementing my own IRequestHandler) and forward the individual line items on the cart in one large payload to an external system. The external system will calculate a 20% tax rate on each item (a simplification). Our new implementation of the tax service will then take the returned values and save the information on the transaction as expected by the CRT and CDX frameworks.

The identifier of the product in the external system is modelled to be different from the ItemId or ProductId in Dynamics. Here, I showcase the use of the product attribute framework to add an attribute called "ExternalTaxId". As part of the newly implemented code, we must fetch the value of this attribute and forward it to the external provider. As an optimization, the fetched attribute value is cached, which keeps the number of attribute lookups to a minimum (per IIS process). There are other approaches that could be taken, but this seemed to be the simplest; a sketch follows.
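A minimal sketch of such a per-process cache; the class, method, and delegate names are illustrative and not part of the CRT API:

using System;
using System.Collections.Concurrent;

internal static class ExternalTaxIdCache
{
    // One entry per product id, kept for the lifetime of the IIS process.
    private static readonly ConcurrentDictionary<long, string> cache =
        new ConcurrentDictionary<long, string>();

    // fetchFromAttributes performs the expensive product attribute lookup
    // and only runs when the product id is not cached yet.
    internal static string GetExternalTaxId(long productId, Func<long, string> fetchFromAttributes)
    {
        return cache.GetOrAdd(productId, fetchFromAttributes);
    }
}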

Finally, imagine that we have “additional” data coming from the external tax provider to be saved per sales tax line and to be brought back to Dynamics HQ. This shows the use of a table extension in X++, an extension table in the channel database and the inclusion of this data in the transaction upload job (P-job).
When all work is finished, we will see the 20% tax rate in the POS clients.

After running the P-0001 job, we can also see the data in AX.

Details

There are three CRT request handlers that can be overridden for tax-related purposes (a minimal handler sketch follows the list):

  • CalculateTaxServiceRequest: Re-calculates taxes for individual items and the whole transaction.
  • AssignTaxCodesServiceRequest: Fetches and assigns sales tax codes. Since we do not calculate the taxes locally, this is not really needed. However, there is no negative impact in leaving that code untouched.
  • PopulateTaxRatesRequest: Since we will override CalculateTaxServiceRequest, there is no need for this handler. We will just provide our own empty implementation.
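A minimal sketch of the override, assuming the usual CRT extension pattern (the class name is illustrative; the request and response types are the ones named above):

using System;
using System.Collections.Generic;
using Microsoft.Dynamics.Commerce.Runtime;
using Microsoft.Dynamics.Commerce.Runtime.Messages;
using Microsoft.Dynamics.Commerce.Runtime.Services.Messages;

public sealed class MyCalculateTaxService : IRequestHandler
{
    // Registering this type for CalculateTaxServiceRequest short-circuits
    // the built-in tax calculation.
    public IEnumerable<Type> SupportedRequestTypes
    {
        get { return new[] { typeof(CalculateTaxServiceRequest) }; }
    }

    public Response Execute(Request request)
    {
        var calculateTaxRequest = (CalculateTaxServiceRequest)request;
        var transaction = calculateTaxRequest.Transaction;

        // Call the external provider and write its results back onto the
        // transaction here (see the step-by-step list below).

        return new CalculateTaxServiceResponse(transaction);
    }
}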

In the overridden CalculateTaxServiceRequest, we carry out these steps:

  • Use the passed-in SalesTransaction, loop through all active sales line items, and get the external product id (from the product attributes). For the steps to set up product attributes, see this blog post.
  • Cache the external product id for later calls (see the blog post for a more detailed discussion).
  • Pass the sales line information and external product id to the external tax service.
  • The dummy tax service returns a single tax code per sales line item with 20% tax and some other string value that must be stored on the tax line.
  • The CRT tax service builds the right data structure with tax lines and calculates the final total taxes (see the sketch after this list).
  • The CRT tax service adds the additional string value as an extension property on each tax line.
  • The CRT tax service returns the SalesTransaction.

In addition, we need to hook into the transaction save process. Therefore, we implement the SaveSalesTransactionDataRequest post trigger. More details in this blog post.

The full source code can be found in Extensions.MyTaxServiceBlog.

Extending a transactional Retail Channel table with additional data (HQ X++ table extension, CDX, extension table, CRT data service)

Note: Implemented with Dynamics 365 version 7.2.11792.62628, or App Update 4 (may not work with lower versions). Sample code can be downloaded at the end of this blog.

A frequent customization need is to store more transactional data in the channel database in order to synchronize it back to HQ. There is a list of tables that hold this data, i.e. the RetailTransactionTable, RetailTransactionSalesTrans, and RetailTransactionTaxTrans tables, and a few others.

In this example, I am going to show how to extend the RetailTransactionTaxTrans table, how to write to it from a CRT data service, how to create a view to sync it back to AX, and how to configure CDX. The single string that the tax line is extended with is passed in by means of a CRT extension property. This sample is part of a larger sample to override the CRT's tax calculation.

Let's work on the pieces from the bottom up.

HQ table extension (X++)

Build an X++ table extension for the table RetailTransactionTaxTrans by use of the AOT and Visual Studio. Before you do that, create a new Dynamics 365 package and model. For this example, I created a package called "Extensibility" and a model called "Extensibility", and added a new string column called ADDLSTRING1.

Make sure to build and database-sync, and when all looks good, add the new package to VSTS.

Channel database extension (SQL)

Since we want to get the data from the channel to the HQ, we must store it initially in the channel (or even in the ModernPOSOffline db). In order to do that, we must develop some SQL for the channel database.

The new way of doing this is to create any new SQL objects in the "[ext]" database schema and NOT in any preexisting schemas, i.e. [ax], [dbo], [crt]. You can reference Microsoft's schemas, but be careful not to take too many dependencies. SQL objects cannot be considered a real interface, so the more assumptions your SQL makes about Microsoft's SQL, the higher the probability that you may need to fix something someday when taking an update. In the SQL shown here, you will see that all the new SQL objects live in the [ext] schema.

New table

We need to store the new column in a new table. Since in this sample we want to store one more string value, we must at least have this new column in the table, plus all the columns that allow us to reference the corresponding record in the original (base) table. The correct approach is to add all PK columns from the original table to the new table as well. In addition, it is good practice (but not required) to add columns for the creation and modification times.
Also, make sure that the PK on the new table is the same as on the original one. This is needed so CDX can find the data correctly.
Lastly, we need to grant CRUD privileges on the table to the DataSyncUsersRole (CDX) and insert privileges to the UsersRole (CRT process).

IF (SELECT OBJECT_ID('ext.RETAILTRANSACTIONTAXTRANS')) IS NULL
BEGIN
	CREATE TABLE [ext].[RETAILTRANSACTIONTAXTRANS](
		[CHANNEL] [bigint] NOT NULL,
		[SALELINENUM] [numeric](32, 16) NOT NULL,
		[STOREID] [nvarchar](10) NOT NULL,
		[TAXCODE] [nvarchar](10) NOT NULL,
		[TERMINALID] [nvarchar](10) NOT NULL,
		[TRANSACTIONID] [nvarchar](44) NOT NULL,
		[DATAAREAID] [nvarchar](4) NOT NULL,
		[CREATEDDATETIME] [datetime] NOT NULL,
		[MODIFIEDDATETIME] [datetime] NOT NULL,
		[ROWVERSION] [timestamp] NOT NULL,
		[ADDLSTRING1] [nvarchar](200) NOT NULL DEFAULT (''),
		 CONSTRAINT [I_EXT_RETAILTRANSACTIONTAXTRANS_PK] PRIMARY KEY NONCLUSTERED 
		(
			[CHANNEL] ASC,
			[TERMINALID] ASC,
			[STOREID] ASC,
			[TRANSACTIONID] ASC,
			[SALELINENUM] ASC,
			[TAXCODE] ASC,
			[DATAAREAID] ASC
		)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
		) ON [PRIMARY]

END
GO

GRANT SELECT,INSERT,UPDATE,DELETE ON [ext].[RETAILTRANSACTIONTAXTRANS] TO [DataSyncUsersRole]
GRANT INSERT ON [ext].[RETAILTRANSACTIONTAXTRANS] TO [UsersRole]
GO

New view

Consider that the view should expose the exact same columns as the original table, plus the new column. As explained above, both tables share the same PK, so we will use it to LEFT JOIN the two.
Grant the SELECT privilege to the DataSyncUsersRole (CDX).

CREATE VIEW [ext].[RETAILTRANSACTIONTAXTRANSEXTVIEW] AS
(
	SELECT
	tt.[AMOUNT],
	tt.[CHANNEL],
	tt.[ISINCLUDEDINPRICE],
	tt.[REPLICATIONCOUNTERFROMORIGIN],
	tt.[SALELINENUM],
	tt.[STOREID],
	tt.[TAXCODE],
	tt.[TERMINALID],
	tt.[TRANSACTIONID],
	tt.[DATAAREAID],
	tt.[ROWVERSION],
	exttt.[ADDLSTRING1],
	exttt.MODIFIEDDATETIME,
	exttt.CREATEDDATETIME
	 from [ax].RetailTransactionTaxTrans tt
	LEFT JOIN [ext].[RETAILTRANSACTIONTAXTRANS] exttt
	ON 
		tt.[CHANNEL] = exttt.[CHANNEL] AND
		tt.[TERMINALID] = exttt.[TERMINALID] AND
		tt.[STOREID] = exttt.[STOREID] AND
		tt.[TRANSACTIONID] = exttt.[TRANSACTIONID] AND
		tt.[SALELINENUM] = exttt.[SALELINENUM] AND
		tt.[TAXCODE] = exttt.[TAXCODE] AND
		tt.[DATAAREAID] = exttt.[DATAAREAID]
)
GO
GRANT SELECT ON [ext].[RETAILTRANSACTIONTAXTRANSEXTVIEW] TO [DataSyncUsersRole]
GO

New table type (TVP)

A table type is a convenient way to "pass around a data table" and is used as a table-valued parameter (TVP). Eventually, we want to call a stored procedure and pass in a whole data table in order to be more efficient. An alternative would be to pass in individual parameters for each column and call the stored procedure multiple times. In this sample, since we want to create tax lines, we will usually create a few rows in one call (5 sales items with 2 different tax codes each would produce 10 rows). I created the TVP by copying the original one and expanding it with the new column.

CREATE TYPE [crt].[EXTRETAILTRANSACTIONTAXTRANSTABLETYPE] AS TABLE(
	[DATAAREAID] [nvarchar](4) NULL,
	[SALELINENUM] [numeric](32, 16) NULL,
	[STOREID] [nvarchar](10) NULL,
	[TAXCODE] [nvarchar](10) NULL,
	[TERMINALID] [nvarchar](10) NULL,
	[TRANSACTIONID] [nvarchar](44) NULL,
	[ADDLSTRING1] [nvarchar](200) NOT NULL)
GO
GRANT EXECUTE ON TYPE::[crt].[EXTRETAILTRANSACTIONTAXTRANSTABLETYPE] TO [UsersRole];
GO

New stored procedure (sproc)

The sproc will be supplied with the TVP and the channel id. It will be called by our CRT data service below.

CREATE PROCEDURE [ext].[INSERTRETAILTRANSACTIONTAXTRANS]
			   (@bi_ChannelId		bigint,
			   @tvp_ExtRetailTransactionTaxTrans		[crt].[EXTRETAILTRANSACTIONTAXTRANSTABLETYPE] READONLY)
AS
BEGIN
...

CRT data service to write to the new table

Each CRT service (for data or other purposes) requires a Request class. In this example, we want to store additional tax line data from the transaction, so we pass the SalesTransaction instance to our service via the Request.

public sealed class InsertTaxTransExtensionDataRequest : DataRequest
{
    public InsertTaxTransExtensionDataRequest(SalesTransaction transaction)
    {
        this.Transaction = transaction;
    }

    [DataMember]
    [Required]
    public SalesTransaction Transaction { get; private set; }
}

The data service must find the data we want to store and build a DataTable with multiple DataRows that correspond to the TVP in SQL. Make sure the columns are in the same order. Then we use the DatabaseContext class to call the sproc, passing in the expected parameters.

using (DataTable taxTable = new DataTable("EXTRETAILTRANSACTIONTAXTRANSTABLETYPE"))
{
    taxTable.Columns.Add(DataAreaIdColumn, typeof(string)).DefaultValue = string.Empty;
    taxTable.Columns.Add(SaleLineNumColumn, typeof(decimal)).DefaultValue = 0m;
    taxTable.Columns.Add(StoreIdColumn, typeof(string)).DefaultValue = string.Empty;
    taxTable.Columns.Add(TaxCodeColumn, typeof(string)).DefaultValue = string.Empty;
    taxTable.Columns.Add(TerminalIdColumn, typeof(string)).DefaultValue = string.Empty;
    taxTable.Columns.Add(TransactionIdColumn, typeof(string)).DefaultValue = string.Empty;
    taxTable.Columns.Add(ADDLSTRING1Column, typeof(string)).DefaultValue = string.Empty;

    foreach (var line in tx.ActiveSalesLines)
    {
        foreach (var taxItem in line.TaxLines)
        {
            DataRow row = taxTable.NewRow();
            row[DataAreaIdColumn] = request.RequestContext.GetChannelConfiguration().InventLocationDataAreaId;
            row[SaleLineNumColumn] = line.LineNumber;
            row[StoreIdColumn] = tx.StoreId;
            row[TaxCodeColumn] = taxItem.TaxCode;
            row[TerminalIdColumn] = tx.TerminalId;
            row[TransactionIdColumn] = tx.Id;
            object oAddlString = taxItem.GetProperty("AdditionalString1");
            string addlString = string.Empty;
            if (oAddlString != null)
            {
                addlString = (string)oAddlString;
            }

            row[ADDLSTRING1Column] = addlString;
            taxTable.Rows.Add(row);
        }
    }

    ParameterSet parameters = new ParameterSet();
    parameters[ChannelIdVariableName] = request.RequestContext.GetPrincipal().ChannelId;
    parameters[TVPVariableName] = taxTable;

    int errorCode;
    using (var databaseContext = new SqlServerDatabaseContext(request))
    {
        errorCode = databaseContext.ExecuteStoredProcedureNonQuery("[ext].INSERTRETAILTRANSACTIONTAXTRANS", parameters);
    }
}

Finally, register this CRT data service with the CRT, in the commerceruntime.ext.config (RS) and the commerceruntime.ext.offline.config (MPOS offline).
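The registration follows the usual CRT extension config pattern; a sketch (the assembly name is illustrative):

<commerceRuntimeExtensions>
  <composition>
    <!-- Registers all request handlers and triggers found in this assembly. -->
    <add source="assembly" value="Contoso.Commerce.Runtime.TaxExtension" />
  </composition>
</commerceRuntimeExtensions>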

Implement trigger in the appropriate place to call CRT data service

For our use case, it is perfect to piggyback on the SaveSalesTransactionDataRequest CRT request and provide an implementation of a post-trigger to call our new data service. In that trigger, we initialize the request to our new data service and let the CRT take care of routing the call.

class SaveSalesTransactionDataRequestTrigger : IRequestTrigger
{
    public IEnumerable<Type> SupportedRequestTypes
    {
        get
        {
            return new[] { typeof(SaveSalesTransactionDataRequest) };
        }
    }

    public void OnExecuted(Request request, Response response)
    {
        ThrowIf.Null(request, "request");
        ThrowIf.Null(response, "response");

        var newRequest = new InsertTaxTransExtensionDataRequest(((SaveSalesTransactionDataRequest)request).SalesTransaction);
        request.RequestContext.Execute<NullResponse>(newRequest);
    }

    public void OnExecuting(Request request)
    {
    }
}

Do not forget to register the trigger in the commerceruntime.ext.config (RS) and the commerceruntime.ext.offline.config (MPOS offline), similarly as shown above.

Configure CDX

New AX7 schema table

Create a new AX7 schema table called "ext.RetailTransactionTaxTransEXTView" on the "Retail channel schema" page. Make sure the same columns are used as in the original table "ax.RETAILTRANSACTIONTAXTRANS", plus the new ADDLSTRING1 column.

Switch scheduler subjob to use view instead of original table

In "Scheduler subjobs", find the subjob called "RetailTransactionTaxTrans", change its table to the newly created view, and add the one new column to the field mapping.

Finally, use "Generate queries" on the "Retail channel schema" page to initialize CDX.

Resource file to automatically configure CDX after deployment

Brand new feature, to be continued at a future time…

Testing

Run through your scenario in your POS client to save a transaction. Then look in the database before and after to see that the extension table gets populated and synced with the P-0001 job.
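A quick way to inspect the channel side is a query like this (against the channel database):

-- Most recent extension rows written by the new CRT data service.
SELECT TOP 10 *
FROM [ext].[RETAILTRANSACTIONTAXTRANS]
ORDER BY CREATEDDATETIME DESC;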


The full source code can be found in Extensions.MyTaxTransExtensionDataServiceBlog.

CDX extensibility – Synchronizing data from channel to HQ via P-job



By default, there are very few tables that get synchronized from the store to HQ; these are mostly tables related to the transactions being generated. However, there are scenarios where customers may want to create other data in their own tables in the channel and bring it back to HQ. In many cases, that does not need to happen in real time, and the task can be accomplished by using the pull job architecture. Below are the steps to set this up.

Dynamics 365 for Operations table creation

Since the records will trickle in from multiple channels, stores, and terminals, the table's index must include the columns that differentiate the records. The RetailTransactionTable is a good example of how this is done.
Create the table with the AOT and follow these guidelines:

  • A column called "REPLICATIONCOUNTERFROMORIGIN" of type RetailReplicationCounter needs to be added.
  • Add the other data columns that you would like to synchronize (in this example, FieldStr1 and FieldInt1).
  • Set "Created Date Time" and "Modified Date Time" to Yes.
  • Create an index that includes the fields that differentiate the records well. For retail stores, that could be channel, store, staff, terminal, and the id of the record, called ChannelOriginatedDataId in this example.


Note: How to put this file into VSTS so the change is reproducible and deployable into other environments is omitted here.

Create the table schema for the Channel database

Create a table with the same name, but in the [ax] schema, in SSMS. Here are some specific requirements to follow:

  • Include columns called [REPLICATIONCOUNTERFROMORIGIN], [ROWVERSION], [CREATEDDATETIME] and [MODIFIEDDATETIME] (details below).
  • If the table in HQ is per company (SaveDataPerCompany = Yes), include a column [DATAAREAID].
  • Create a clustered unique index on column [REPLICATIONCOUNTERFROMORIGIN].
  • Create a nonclustered primary key index on the columns that uniquely differentiate the records, including [DATAAREAID] if data is saved per company.

IF (SELECT OBJECT_ID('ax.CHANNELORIGINATEDDATATABLE')) IS NULL 
BEGIN
	CREATE TABLE [ax].[CHANNELORIGINATEDDATATABLE](
		[CHANNELORIGINATEDDATAID] [nvarchar](44) NOT NULL,
		[FIELDSTR1] [nvarchar](200) NOT NULL DEFAULT (''),
		[FIELDINT1] [int] NOT NULL DEFAULT (0),
		[CHANNEL] [bigint] NOT NULL,
		[STORE] [nvarchar](10) NOT NULL,
		[TERMINAL] [nvarchar](10) NOT NULL,
		[STAFF] [nvarchar](25) NOT NULL,
		[CREATEDDATETIME] [datetime] NOT NULL,
		[MODIFIEDDATETIME] [datetime] NOT NULL,
		[REPLICATIONCOUNTERFROMORIGIN] [int] IDENTITY(1,1) NOT NULL,
		[DATAAREAID] [nvarchar](4) NOT NULL,
		[ROWVERSION] [timestamp] NOT NULL,
	 CONSTRAINT [I_AX_CHANNELORIGINATEDDATATABLE_PK] PRIMARY KEY NONCLUSTERED 
	(
		[CHANNEL] ASC,
		[STORE] ASC,
		[TERMINAL] ASC,
		[CHANNELORIGINATEDDATAID] ASC,
		[DATAAREAID] ASC
	)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY],
	 CONSTRAINT [I_AX_CHANNELORIGINATEDDATATABLE_REPLICATIONCOUNTERFROMORIGIN] UNIQUE CLUSTERED 
	(
		[REPLICATIONCOUNTERFROMORIGIN] ASC
	)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
	) ON [PRIMARY]
END
GO

GRANT SELECT,INSERT,UPDATE,DELETE ON [ax].CHANNELORIGINATEDDATATABLE TO [DataSyncUsersRole]
GRANT INSERT ON [ax].[CHANNELORIGINATEDDATATABLE] TO [UsersRole]

Note: How to put this file into VSTS and registered in the Retail Sdk’s Customization.settings file so the change is reproducible and deployable into other environments is omitted here.

Add the new table to the list of Channel tables

CDX needs to be aware of the schema of the channel table in order to map and sync it with the HQ table. In Retail and commerce > Headquarters setup > Retail Scheduler > Retail channel schema, click Channel tables and add a new table with the new fields.

Add a new scheduler subjob

In Retail and commerce > Headquarters setup > Retail Scheduler > Scheduler subjobs create a new subjob that maps both HQ and channel table. Make sure that:

  • the special column [REPLICATIONCOUNTERFROMORIGIN] is both mapped in the field mapping and used in the replication counter field, and
  • the slider for “Pull Data” is set to Yes.

Add the subjob to the job

In Retail and commerce > Headquarters setup > Retail Scheduler > Scheduler jobs, add the subjob to a job. You could reuse the existing P-0001 job, as in this example.

Finally, in Retail and commerce > Headquarters setup > Retail Scheduler you must regenerate the sync helper classes with “Generate queries”.
Note: The previous steps in the Dynamics 365 for Operations user interface to configure CDX are shown here for better illustration. Ideally, these should be coded in X++ to get reliable and automatic deployments into other environments.

Verification

Add some data into the table on the channel side ([ax] schema) and run the pull job. Then inspect the data on the HQ side.
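For a quick test, you can seed a row directly in the channel table and then run the job; the values below are illustrative demo data:

-- REPLICATIONCOUNTERFROMORIGIN (identity) and ROWVERSION fill in automatically.
INSERT INTO [ax].[CHANNELORIGINATEDDATATABLE]
    (CHANNELORIGINATEDDATAID, FIELDSTR1, FIELDINT1, CHANNEL, STORE, TERMINAL, STAFF,
     CREATEDDATETIME, MODIFIEDDATETIME, DATAAREAID)
VALUES
    ('TEST-0001', 'hello from the channel', 42, 5637144592, 'HOUSTON', 'HOUSTON-1', '000160',
     GETUTCDATE(), GETUTCDATE(), 'usrt');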