Data access in the CommerceRuntime



Below are some code samples for reading and writing data in the channel database. This covers the channel database only; it does not necessarily mean the data needs to go to D365 HQ as well. It may get there, either via a CDX pull job or via a Real-time Service call, but the samples here are only for simple reads and writes in the channel database.

Prerequisites:

  • Dynamics 365 for Operations (1611)
  • KB3214687
  • KB3214679

First, create the SQL table in the [ax] schema and a view in the [crt] schema:

    IF (OBJECT_ID('[ax].[MY_EXTENSION]') IS NULL)
    BEGIN
        CREATE TABLE [ax].[MY_EXTENSION](
            [DATAAREAID] [nvarchar](4) NOT NULL,
            [RECID] [bigint] NOT NULL,
            [COL1] [int] NOT NULL,
            [COL2] [nvarchar](100) NOT NULL,
            [COL3] [bit] NOT NULL,
            [ACCOUNTNUM] [nvarchar](20) NOT NULL)
        . . .
    END
    GO

    CREATE VIEW [crt].[MY_EXTENSIONVIEW] AS
    (
        SELECT ACCOUNTNUM, DATAAREAID, COL1, COL2, COL3 FROM [ax].[MY_EXTENSION]
    )
    GO

    GRANT SELECT ON [crt].[MY_EXTENSIONVIEW] TO [UsersRole];
    GO

The sample above adds three columns that will be populated from three extension properties.

Grant the right permissions to the table and view. This is the list of supported SQL roles:

  • DataSyncUsersRole: used by the CDX process account
  • PublishersRole: used by the publishing process account (eCommerce)
  • ReportUsersRole: used by the reporting user account
  • UsersRole: used by the runtime user (RetailServer)

Next, create a sproc for updating:

CREATE PROCEDURE [crt].[MY_UPDATEEXTENSIONPROPERTIES]
    @TVP_EXTENSIONPROPERTIESTABLETYPE         [crt].EXTENSIONPROPERTIESTABLETYPE READONLY
AS
BEGIN
	DECLARE @nvc_DataAreaId NVARCHAR(4);
	DECLARE @recId bigint;
	DECLARE @accountNum [nvarchar](20);

	DECLARE @Col1Value int = 0;
	DECLARE @Col2Value nvarchar(100) = '';
	DECLARE @Col3Value bit = 0;
	
	SELECT DISTINCT TOP 1 @recId = tp.PARENTRECID, @nvc_DataAreaId = ct.DATAAREAID, @accountNum = ct.ACCOUNTNUM
	FROM @TVP_EXTENSIONPROPERTIESTABLETYPE tp
	JOIN [ax].CUSTTABLE ct ON ct.RECID = tp.PARENTRECID
	WHERE tp.PARENTRECID <> 0

	SELECT @Col1Value = COALESCE(tp.PROPERTYVALUE, 0)
	FROM @TVP_EXTENSIONPROPERTIESTABLETYPE tp
	WHERE tp.PARENTRECID <> 0 AND tp.PROPERTYNAME = 'COL1'

	SELECT @Col2Value = COALESCE(tp.PROPERTYVALUE, '')
	FROM @TVP_EXTENSIONPROPERTIESTABLETYPE tp
	WHERE tp.PARENTRECID <> 0 AND tp.PROPERTYNAME = 'COL2'

	SELECT @Col3Value = CAST(CASE WHEN tp.PROPERTYVALUE = 'True' THEN 1 ELSE 0 END AS BIT)
	FROM @TVP_EXTENSIONPROPERTIESTABLETYPE tp
	WHERE tp.PARENTRECID <> 0 AND tp.PROPERTYNAME = 'COL3'

	-- Insert a new row or update the existing one for this customer ("upsert").
	MERGE INTO [ax].[MY_EXTENSION] dest
	USING (SELECT @accountNum AS ACCOUNTNUM) AS source ON dest.ACCOUNTNUM = source.ACCOUNTNUM
	WHEN MATCHED THEN
		UPDATE SET dest.COL1 = @Col1Value, dest.COL2 = @Col2Value, dest.COL3 = @Col3Value
	WHEN NOT MATCHED THEN
		INSERT (RECID, ACCOUNTNUM, DATAAREAID, COL1, COL2, COL3)
		VALUES (@recId, @accountNum, @nvc_DataAreaId, @Col1Value, @Col2Value, @Col3Value);
END
GO

GRANT EXECUTE ON [crt].[MY_UPDATEEXTENSIONPROPERTIES] TO [UsersRole];
GO

Note the MERGE INTO statement: it lets the procedure either insert a new record or update an existing one.

For reading in the CRT, especially for extension properties, use SqlPagedQuery, SqlServerDatabaseContext and ExtensionsEntity:

var query = new SqlPagedQuery(QueryResultSettings.SingleRecord)
{
    Select = new ColumnSet(new string[] { "COL1", "COL2", "COL3" }),
    From = "MY_EXTENSIONVIEW",
    Where = "ACCOUNTNUM = @accountNum AND DATAAREAID = @dataAreaId",
};

query.Parameters["@accountNum"] = customer.AccountNumber;
query.Parameters["@dataAreaId"] = request.RequestContext.GetChannelConfiguration().InventLocationDataAreaId;
using (var databaseContext = new SqlServerDatabaseContext(request))
{
    ExtensionsEntity extensions = databaseContext.ReadEntity<ExtensionsEntity>(query).FirstOrDefault();
    if (extensions != null)
    {
        var col1 = extensions.GetProperty("COL1");
        if (col1 != null)
        {
            customer.SetProperty("COL1", col1);
        }
        else
        {
            customer.SetProperty("COL1", 0);
        }
 	 
        // . . . repeat for COL2 and COL3
    }
    else
    {
        customer.SetProperty("COL1", 0);
    }
}

To write the data to the database, use code like this to call the sproc shown above:

using (var databaseContext = new SqlServerDatabaseContext(r))
using (var transactionScope = new TransactionScope())
{
    if (!r.Customer.ExtensionProperties.IsNullOrEmpty())
    {
        ParameterSet parameters = new ParameterSet();
        parameters["@TVP_EXTENSIONPROPERTIESTABLETYPE"] = new ExtensionPropertiesTableType(r.Customer.RecordId, r.Customer.ExtensionProperties).DataTable;
        databaseContext.ExecuteStoredProcedureNonQuery("MY_UPDATEEXTENSIONPROPERTIES", parameters);
    }

    transactionScope.Complete();
}

If you want to write other data to the database (not extension properties on entities), build the SqlParameters one by one and match them in your sproc, as sketched below. For reading, it may be easier to just treat the data as extension properties and then convert it to whatever you need, or you could create your own entity and query for that.
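As an illustration, here is a minimal sketch of such a call. It reuses the same request/customer context as the samples above, but the sproc name ([crt].[MY_UPDATEOTHERDATA]) and its parameter names are made up and have to match whatever sproc you create:

using (var databaseContext = new SqlServerDatabaseContext(request))
using (var transactionScope = new TransactionScope())
{
    // Hypothetical sproc with one named parameter per column, instead of the extension properties TVP.
    ParameterSet parameters = new ParameterSet();
    parameters["@nvc_AccountNum"] = customer.AccountNumber;
    parameters["@i_Col1"] = 42;
    parameters["@nvc_Col2"] = "some value";

    databaseContext.ExecuteStoredProcedureNonQuery("MY_UPDATEOTHERDATA", parameters);

    transactionScope.Complete();
}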

Hosting a downloaded dev VHD of Dynamics 365 for Operations

In order to save cost or time, it may be practical for partners/customers to host their own version of the VHD that is based on the official downloadable VHD by Microsoft. One scenario could be that the Contoso demo data is good enough for development, but some additional data setup, hotfix application or code customizations may be needed. These steps could be carried out once by one person, and then that VHD could be re-used. Some partners may be on a monthly cadence to “rev” their dev environments. There are two options to do this:

  1. Reuse the VHD and host it locally in HyperV or similar virtualization technologies
  2. Reuse the VHD and host it in Azure

Option 1 is straightforward, and many will opt for it. There are some cases where it is useful to host in Azure, though, mostly for simpler sharing of a VM or because an appropriate host for the VM is not available (e.g., the laptop is not powerful enough). Here is a step-by-step guide that worked for me to bring the VHD up in Azure so I can spin up a new instance in a relatively short time:

  1. Download the VHD from https://connect.microsoft.com/ and unpack it
  2. (optional) prepare the VHD with data, binary fixes or customizations
  3. Upload the VHD to your Azure subscription. If you have not done so already, install the Windows Azure SDK, then create a management certificate for Azure on the local machine and upload it to Azure (this is what grants access to the Azure subscription). Then follow this: https://docs.microsoft.com/en-us/azure/virtual-machines/virtual-machines-windows-classic-createupload-vhd. I did not have to run sysprep on the VHD downloaded from Microsoft; I think that step is only needed if you carry out step 2.
    When the steps are done, you should see the new VHD ready to be used as a template for creating new VMs.
  4. Create a new VM based on that VHD: https://docs.microsoft.com/en-us/azure/virtual-machines/virtual-machines-windows-classic-createportal?toc=%2fazure%2fvirtual-machines%2fwindows%2fclassic%2ftoc.json
  5. (optional but suggested) Update the administrator password to something else, so nobody else can hijack your VM (if they somehow got hold of the RDP connection).

Update 3/30/2017:
I have added a PowerShell script that is ready to be executed after changing some of the variables:
UploadD365VHDUpdate5.ps1

Move POS screen layouts from one environment to another

Using the POS screen layout designer is tedious work, and you very likely do not want to repeat it in every environment (production, sandbox, dev, etc.). The solution to this (and similar tasks) is to use the DIXF framework to import and export entities. For this particular task of moving POS screen layouts, here are the steps to follow:

  1. Configure the data source
    1. Enter the “Data management” workspace
    2. Click on “Configure data source”
    3. Select “CSV-Unicode” and edit
    4. Set the “text qualifier” to ~ (tilde)
    5. Save the data source
  2. Export
    1. configure the data source as above
    2. Start a new export project with target data source set to “CSV-Unicode”
    3. Add the following entities:
      1. RetailImagesEntity (POS layout images)
      2. RetailButtonGridEntity (POS button grid)
      3. RetailButtonGridButtonsEntity (POS button grid buttons)
        1. Click “View map” and “Mapping details”
        2. Enable text qualifier for DisplayText and Save
      4. RetailTillLayoutEntity (POS screen layouts)
        1. Follow the same steps to edit “Mapping details”, but enable the text qualifier for all fields
      5. RetailTillLayoutZoneEntity (POS screen layout zones)
      6. RetailTillLayoutButtonGridZoneEntity (POS screen layout button grid zones)
      7. RetailTillLayoutImageZoneEntity (POS screen layout images zones)
      8. RetailTillLayoutReportZoneEntity (POS screen layout report zones)
    4. Download the package
  3. Import
    1. configure the data source as above
    2. Start a new import project and pick source data format “Package”
    3. Upload the package and import it

Auto-Installing most needed dev tools in 5 mins with Chocolatey

I deploy new development boxes on a fairly regular basis. And every time I do this, the same procedure follows. I install the same software so I can be more productive.

This can be automated, and below is a Chocolatey script (Chocolatey is a package manager for Windows) that installs most of the tools I need. Adjust it for your needs. Currently, it installs these (which I need for AX7, Retail dev and troubleshooting work):

  • Notepad++
  • Adobe Reader
  • Firefox
  • Fiddler
  • WindirStat
  • ProcessExplorer
  • ProcessMonitor
  • Putty
  • WinMerge
  • a change to the Windows prompt to include date and time

Download, rename to *.cmd, open a console as administrator and run it. That’s it!
WindowsChocoInstaller.cmd

RetailTransactionService (Real-time service) customization sample – Secure app settings

This video demonstrates how to store application settings securely and manage them in AX. These settings are needed by AX business logic and CommerceRuntime business logic. The data is fetched via a RetailTransactionServiceEx call; the CommerceRuntime service takes care of calling the RTS and of caching the result for a configurable period. The video also shows how to test this by exposing it via RetailServer and using the RetailServer TestClient.
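The CommerceRuntime side of such a call can be sketched roughly as follows. This is not the exact code from the video: the extension method name, setting key and cache duration are placeholders, and I am assuming the standard InvokeExtensionMethodRealtimeRequest message is used to call the Real-time Service extension method:

// Placeholder names: "GetAppSettings" must exist as an extension method on RetailTransactionServiceEx in AX.
var rtsRequest = new InvokeExtensionMethodRealtimeRequest("GetAppSettings", "MySettingKey");
InvokeExtensionMethodRealtimeResponse rtsResponse = request.RequestContext.Execute<InvokeExtensionMethodRealtimeResponse>(rtsRequest);

// The extension method returns a container of values; here the first element is assumed to be the setting value.
string settingValue = (string)rtsResponse.Result.FirstOrDefault();

// Cache the value for a configurable period (System.Runtime.Caching) so that not every request calls the RTS.
MemoryCache.Default.Set("MySettingKey", settingValue, DateTimeOffset.UtcNow.AddMinutes(30));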

Retail Sdk customization branch update

This video shows the steps involved in code-merging a new Retail Sdk into your current customization branch. In this case, I am updating my AX 7 RTW Retail Sdk (with customizations) to the AX 7 Update 1 Sdk. The same steps apply if you snap to a hotfix or to another update. It is a good practice to use a mirror branch, as it makes code merges much easier. Updating the mirror branch is a prerequisite for this step and was shown in a separate video.

Retail Sdk mirror branch update

This video shows the steps involved in updating your VSTS-based Retail Sdk mirror branch with a newer build. In this case, I am updating my AX 7 RTW Retail Sdk to the AX 7 Update 1 Sdk. The same steps apply if you snap to a hotfix or to another update. It is a good practice to use a mirror branch, as it makes code merges much easier. The code merge (the next step) is shown in a separate video.

Final note: If you take the Sdk from a new development VM, it may be installed on the C:\ or J:\ drive as shown in the video. If you take a hotfix, however, the Retail Sdk can be found in the ‘Code’ subfolder:

RetailSdk location in hotfix package

DIXF recurring file imports fail for data packages. What now?



Applicable to all AX7 releases.

If you are following https://ax.help.dynamics.com/en/wiki/recurring-integrations/ to set up your recurring imports, and you are trying to use a data package with multiple entities, you may notice that the server shows a “preprocessing failure”. This is a bug. Until it is fixed, a workaround is to over-layer the ApplicationFoundation:

Create a new class:

using Microsoft.Dynamics.AX.Framework.Tools.DataManagement.Serialization;

class DMFSchedulerStatusUpdate
{
    [DataEventHandler(tableStr(Batch), DataEventType::Updated)]
    public static void Batch_onUpdated(Common sender, DataEventArgs e)
    {
        Batch task = sender;

        // Only react when the DMF import scheduler task has finished, successfully or with an error.
        if ((task.Status == BatchStatus::Finished || task.Status == BatchStatus::Error)
            && task.ClassNumber == className2Id(identifierStr(DMFImportTaskScheduler)))
        {
            container params = task.Parameters;
            DMFExecutionId executionId = conPeek(params, 3);

            DMFIntegrationBridge::updateStatus_New(executionId);
        }
    }
}

Customize DMFIntegrationBridge:

public static void updateStatus_New(DMFExecutionId executionId)
{
    IntegrationActivityMessageTable messageStatusTable;
    IntegrationActivityRuntimeExecutionTable runtimeExecutionTable;

    select firstonly messageStatusTable
        join runtimeExecutionTable
        where runtimeExecutionTable.ExecutionId == executionId
            && runtimeExecutionTable.MessageIdentifier == messageStatusTable.MessageId;

    DMFIntegrationBridge bridge = new DMFIntegrationBridge();
    ActivityMessageContext activityMessageContext = new ActivityMessageContext();
    activityMessageContext.MessageId = guid2Str(messageStatusTable.MessageId);
    activityMessageContext.ExecutionCorrelationId = str2Guid(messageStatusTable.ExternalCorrelationId);

    // Get status for bridge result
    IntegrationBridgeResults bridgeResult = DMFIntegrationBridgeHelper::GetStateOfExecution(executionId, false);

    // Set post processing state
    bridge.postProcessMessage(activityMessageContext, executionId, bridgeResult);
}

 

DIXF recurring file downloads must be forced to be HTTPS



Applicable to all AX7 releases.

If you are following https://ax.help.dynamics.com/en/wiki/recurring-integrations/ to set up your recurring exports, you may notice that the download URL returned by the dequeue call is wrong (it is not HTTPS). This may be caused by the load balancer in a production environment. In any case, the client code that fetches the data can fix this up with a few lines. In C#, it would look like this:

var newDownloadLocation = new UriBuilder(dataMessage.DownloadLocation)
{
    Scheme = Uri.UriSchemeHttps,
    Port = -1,
};

Once the download URL is “fixed up” like this, download the file with a normal GET request, for example as sketched below.
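Here is a minimal sketch of that GET (inside an async method). The bearer token handling and the target path are placeholders that depend on your integration client; I am assuming the same AAD token used for the dequeue call is also presented for the download:

// Requires System.Net.Http, System.Net.Http.Headers and System.IO.
using (var client = new HttpClient())
{
    // Placeholder: reuse the AAD bearer token your integration client already acquired.
    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

    HttpResponseMessage response = await client.GetAsync(newDownloadLocation.Uri);
    response.EnsureSuccessStatusCode();

    using (Stream payload = await response.Content.ReadAsStreamAsync())
    using (FileStream file = File.Create(@"C:\Temp\exportpackage.zip"))   // placeholder target path
    {
        await payload.CopyToAsync(file);
    }
}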

Update your Retail Sdk to add payment packaging and deployment support (AX 7 RTW and Update 1)

Update 8/8/2016: A hotfix for Update 1 is now available: “KB 3183058: Retail Sdk update for packaging and deployment of payment files”. If you can update, that is recommended; if you cannot, this article still applies.

The steps below allow you to update some files in your Retail Sdk so that payment packages can be built as well.  The same code changes as in this zip package will be released as part of Update 2 and possibly even as a hotfix to Update 1.

Benefits

The update allows you to generate all packages with all payment files (payment device assemblies, payment connector assemblies, and payment web files) embedded. All you need to do is get the payment files from the 3rd party, place them into the 3 locations dedicated for them, and build. Whether built manually on the command line or via a VSTS automated build, the packages will include the files correctly.
Furthermore, the deployment of these packages will place the files in the correct locations. The packages that include these files are RetailServer, ModernPOS, ModernPOSOffline, CloudPOS, HardwareStation and AOS. The developer does not need to know where the files have to be deployed to.

One-time setup steps

  1. Make sure you do not have any open files in your current VSTS client. If you do, look at RetailSdk-PaymentIncremental.zip and make sure at least these files are not in edit mode.
  2. Take the RetailSdk-PaymentIncremental.zip (link is below) and unpack it into the Retail Sdk customization branch (Note: at a future date, these changes will come through an update; at that point a simple code merge will have to be done). Make sure that the top folders of the zip file are copied into the root of the Retail Sdk ($RetailSdk).
    SdkPaymentPackagingZipContents
  3. In Visual Studio, look at the changes and make sure no customizations (to the same files) were overwritten. If they were, merge the files rather than just copying them in. However, chances are high that none of these files were customized.
  4. Add the payment files from the 3rd party vendor to the Sdk, under the PaymentExternals folder. You should have gotten 3 different folders, and these should match the new Sdk folders.
  5. Do a local verification build. Open a Visual Studio 2015 Developer command window and issue a “msbuild /t:Rebuild” from the top of the Retail Sdk.
  6. If it built successfully, this part is done.

Verification

  1. After doing the build, verify that $RetailSdk\Packages\AosPaymentsPackage\content.folder\AOSService\Code\WebArtifacts has the files from the $RetailSdk\PaymentExternals\PaymentWebFiles folder.
  2. Verify, that $RetailSdk\Packages\AosPaymentsPackage\content.folder\AOSService\Code\Assemblies has the files from $RetailSdk\PaymentExternals\IPaymentProcessorAssemblies
  3. Verify that $RetailSdk\Packages\ModernPOS\bin\Debug\content.folder\CustomizedFiles\ClientBroker has the files from $RetailSdk\PaymentExternals\IPaymentDeviceAssemblies.
  4. Deploy the packages via runbook and verify functionality (on development or sandbox environment).
  5. If there are any issues with the payment assemblies themselves (not with packaging or deployment), contact the party that released the files.
  6. Once everything worked (one-time steps and verification with payment files and deployment), submit the changes from the zip file plus the payment files to your VSTS.

Deployment

There is no difference in how the existing packages are deployed. One thing to note is that the new AosPaymentPackage must be installed by itself and last. The deployment for production environments is exactly the same (via DSE if Microsoft-hosted); there, too, the AosPaymentPackage must be installed last.

 

RetailSdk-PaymentIncremental-RTWandUpdate1