Electronic Reporting (ER) Cookbook 4: References in a model

The “Item reference” in Dynamics 365 Electronic Reporting models had been an elusive concept for me for quite a long time. A reference creates a kind of cross-link between parts of an ER model, but to what end? Now I think I have figured out two important use cases:

  • Re-use structures and record list definitions within the model, especially for UNIONs
  • Map the same model in different ways and export similarly structured files from different sources

References to facilitate LISTJOINs

The LISTJOIN ER function is an analogue of the SQL UNION. It combines heterogeneous record lists into one list of a common type. For example, LISTJOIN(model.PurchaseLines, model.SalesLines) creates a single typed record list with their common fields: Product no, Quantity, Delivery date, Warehouse etc. However, for LISTJOIN to recognize the common fields, the fields must be aligned, i.e. follow in exactly the same order and bear exactly the same names.

Here is where references come into play. One can organize a “Dictionary” root entry in the model, define the most elementary composite entries there, then click the Switch item reference button elsewhere and include this small “brick” in a larger record. From the small bricks one may build larger bricks, and so on. Such cross-references are indicated by an asterisk (*):

In the above example, an Order record includes CategoryComposite, Coverage order and Item order as nested 1:1 records, and uses fields of the Category type and Line status enumerations. The resulting common Order record type is then assigned to both the Sales and the Purchase record lists, which makes their structure in the model identical. LISTJOIN() then produces a list of records with the following fields:

@.Amount
@.'Qty avail'
@.Category.Group
@.Category.'Category ID'

and so on.
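
With the structures aligned, a single Calculated field of the Record list type merges the two lists; a minimal sketch, using the record list names from the example above:
LISTJOIN(model.Sales, model.Purchase)
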
By the way, the root node Item dictionary here is a Container. A Container simply puts together diverse artefacts and makes the model definition more readable.

References to apply different mappings to the same model

Imagine you need to export sales order lines and purchase order lines into 2 different files, similar in structure. You need a model to abstractly describe a generic Order line list, then a Sales line format and a Purch line format, both preferably derived from a common Order line format, and 2 mappings, Sales line to model and Purch line to model, to populate the Order line list model from 2 different data sources in Dynamics 365.

Try to run the Sales line format. It is going to complain about the 2 model mappings present: “More than one model mapping exists. Set one of the configurations as default.” You set Default for model mapping = Yes on the Sales line to model mapping, and the export starts working.
Now try to run the Purch line format. It is going to export sales order lines in the purchase order line format, because the link between a format and the outbound mapping goes through the model node they both use:

Format → Model [node] → [default] Model mapping

To fix this, the 2 formats should be mapped to different nodes in the model!
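
With 2 separate nodes (the node names below are hypothetical), each format resolves to its own mapping:

Sales line format → Model [Sales lines node] → Sales line to model mapping
Purch line format → Model [Purch lines node] → Purch line to model mapping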

I ran into a similar issue when trying to use the same model both for the export and for the import of data. There was a Counting journal model and 2 formats: Export to Kardex and Import from Kardex. There was a Counting model mapping for the export and a Counting model to destination mapping for the import.

I tried to run the Export to Kardex format, and it asked me to “Set one of the configurations as default”, which I did.
Then I tried to run the Counting model to destination mapping, but it did not prompt for any file to upload, because the link between the inbound mapping and the import format is established through the model, and it was pointing to the export format:

Model mapping to destination → Model [node] → Format

The trick is to use different nodes! The import format should write the data from the inbound file into one Record list in the model, while the export format should read data from another Record list in the model. The mappings then bind the same table to the one node and to the other, respectively. To reuse the data types and the structure, one node references the other (Switch item reference).
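
The two link chains then no longer collide; a sketch with hypothetical node names (the asterisk marks the item reference between the two nodes):

Export to Kardex format → Model [Export lines node] → Counting model mapping
Counting model to destination mapping → Model [Import lines node*] → Import from Kardex format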

Electronic reporting for data migration

Intro

There are still dozens of important tables in Dynamics 365 for Finance and SCM not exposed by any entity. Sometimes simple solutions like my Copy-paste with a keyboard script from Excel don’t work due to the volume or complexity of the data, and sometimes there is no UI at all.

One of the notorious examples is the table EcoResCatalogControl which is associated with product attributes bound to procurement categories, see the Searchable flag in my blog Searchable product attributes. An import of these “Product category attributes” through the entity EcoResProductCategoryAttributeEntity leaves the attributes dysfunctional: the internal table EcoResCategoryAttributeLookup is left out of sync and the new attributes remain invisible in the product master.
Procurement category attributes
The only form where EcoResCatalogControl is visible and editable is the Procurement category form, but an attempt to update the flag fails with a “Missing reference” error: the system expects an EcoResCatalogControl record to exist already. Dead end.

Electronic reporting: the last resort

Since the table browser is not an option anymore in a sandbox or production environment, the number of tools available to the consultant is reduced to just one: Electronic reporting. Despite the name, the Electronic reporting module is able not only to read, but also to freely insert, update or delete records in arbitrary tables in D365FO.
In essence, the idea is to make a CSV text file and import it into EcoResCatalogControl with the help of Electronic reporting (ER).

This requires a so-called Mapping to destination, where the “destination” is one or many tables or entities in D365FO. The concept is nicely outlined in the blog of Mr. Ties Philippi. In total, 4 “components” are required in at least 2 ER configurations (the Mapping to destination may in theory be detached into a separate, third configuration):

  1. Model
  2. Model mapping “To destination”
  3. [CSV file] Format
  4. Model mapping [from format] “To model”

ER configuration and component diagram
The execution flows in the order (3) → (4) → (1) → (2).
The 2 configurations I used can be downloaded here.

First, create a model; for simplicity, its structure may follow the table exactly.

Next, create a mapping with Direction = To destination. Add the target table on the right with the Destination button, then bind the model on the left to the table on the right side of the Model mapping designer. Change the status of the configuration to Completed.

Next, create a format configuration based on the model. It should describe the CSV file structure, namely a set of records (lines) separated by CRLF, with 2 fields in every line separated by the semicolon “;” (in the German version of Excel; in an English version the comma “,” is more appropriate). The encoding of the CSV format should preferably be UTF-8 (Excel is able to produce UTF-8 encoded files).
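
For illustration, such a file could consist of lines like these (the attribute/category semantics of the 2 fields is my assumption, in line with the Attribute and Category columns reused later in this post; the values are made up):
Hardness;Metals
Density;Metals
Colour;Paints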

Do not bind the format to the model, but use the button Map format to model to create a mapping of the same name. Bind the format to the model as shown on the screenshot below. The Run button can be used to test the mapping ad hoc: it takes a CSV file, interprets it according to the format definition, translates it into the internal model container and exports this container to an XML file.
ER components and mappings

Once tested, update the format status to Completed. Note that the import is started from artefact (2), the Model mapping to destination (with the Run button). The system looks for a format based on the model the mapping belongs to, and inside that format it looks for a “To model” mapping.
Run Mapping to destination
Having found an appropriate format, the system asks for a CSV file and unleashes the import; there is no way back.

Update route

Added on 31.01.2020
Electronic reporting destination mappings can also be used to launch X++ code. If you have ever worked with production routes, you may have noticed the Update route button. The class behind it is called RouteUpdate, and the system executes it automatically from the UI on any change in the order of operations. On plain production routes, it updates the Next operation numbers and multiplies the variable scrap percentages to cache and store accumulated scrap factors. The problem is that this class cannot be run over all routes in batch mode, and in the Route headers entity they forgot to call it. Opening every single route out of thousands imported is obviously not an option.

I took the same CSV file and created a new destination mapping which updates the RouteTable. One column in the CSV file (Attribute) contains the key RouteId, the other (Category) bears the route name. I am updating the name by the key, but in the middle I inject the following code
RouteSearch.newRouteId(@.Attribute).routeId
like this:
LEFT(RouteSearch.newRouteId(@.Attribute).routeId,0)&@.Category
It instantiates and executes the RouteSearch class, which calls RouteUpdate in its constructor as a by-product, then gives back a routeId() value which I discard. Thousands of routes are updated blazingly fast, and the Next operation and the Accumulated scrap are set everywhere.
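
Broken down, the expression evaluates as follows (a sketch of the evaluation, not extra code):
RouteSearch.newRouteId(@.Attribute).routeId → the route id; RouteUpdate fires in the constructor
LEFT(<route id>, 0) → "" (the id is discarded)
"" & @.Category → @.Category, the value actually written to the name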

The ER configuration can be downloaded here: RouteUpdateMapping.zip.

Open project milestones

Added on 16.04.2020
Finally, I have designed a data model and mappings for the fixed-fee project On-account transactions, aka Billing milestones. The configuration may take a project TransId from the CSV file, or it may create its own from the standard number sequence in the Project module parameters. Both related tables, ProjOnAccTrans + ProjOnAccTransSale, are generated.
Project on-account transactions
The ER configuration can be downloaded here: ProjOnAccTransModel.zip
Fun fact: a DMF entity for this existed in AX 2012, but not in D365.

Epilogue

The above crude trick works, but it is dangerous. Fixing a broken import might be a challenge, because to delete the data and start all over you need to change the destination Record action to Delete and then hope to find the right record by the primary key from the CSV file.

I would love to use standard entities instead, and to make Microsoft add them to the application. Please vote for my 2 “favourites”:

Electronic Reporting (ER) Cookbook 3: Working with dates

Following parts 1 and 2 of my hints and tips collection for Electronic reporting, let me shed some light on working with dates and date formatting.

Cut-off date

If a classic Excel transaction report must be implemented in the Electronic Reporting module, then a cut-off date is a common requirement: show all transactions up to and including a certain date, or show all of them if the user hasn’t selected any date. Classically, a parameter in a dialog is presented to the user; this requires a User parameter of the DATE type in the format or, better, in the model mapping (see also Electronic reporting in depth).

This parameter must be used in a Calculated field of the Record list type. Below is a snippet from one of my very first reports, dating back to 2016; it extracts ledger transactions of the current company for the German GDPdU dump file:
WHERE(GLaccountEntry,
AND(
NOT(GLaccountEntry.AccountingCurrencyAmount=0),
GLaccountEntry.'>Relations'.GeneralJournalEntry.Ledger = Ledger.'current()',
GLaccountEntry.'>Relations'.GeneralJournalEntry.'>Relations'.FiscalCalendarPeriod.Type = FiscalPeriodType.Operating,
GLaccountEntry.'>Relations'.GeneralJournalEntry.AccountingDate >= FromDate,
OR(GLaccountEntry.'>Relations'.GeneralJournalEntry.AccountingDate <= ToDate, ToDate=NULLDATE())))

In hindsight, this was a terrible implementation. The WHERE operator works on in-memory lists: in essence, it loads all transactions from all companies, periods and years into an internal XML container, and only THEN starts filtering by company and date. Not surprisingly, performance degrades rapidly, because the execution time grows with the total number of transactions in the database rather than with the size of the result.

The FILTER is a better function as it compiles to a direct SQL statement and returns a reduced dataset; however, it has limited capabilities and does not support cross joins as above.

A mature solution is to perform the filtering in 2 steps: one Calculated field fetches a maximally pre-filtered record list from the SQL database, and subsequent Calculated fields apply additional dynamic filters to this subset.
The expression below returns a list of all cost project transactions up to a certain date:
FILTER(ProjTransPosting,
AND(ProjTransPosting.PostingType=Enums.LedgerPostingType.ProjCost,
OR(ProjTransPosting.LedgerTransDate <= $ToDate, $ToDate=NULLDATE())))

where $ToDate is the user parameter.
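
A subsequent in-memory step can then apply further dynamic filters to this subset; a sketch, assuming the FILTER result above is stored in a Calculated field $PostedCost and $ProjFilter is another user parameter (both names are hypothetical):
WHERE($PostedCost, OR($PostedCost.ProjId = $ProjFilter, $ProjFilter = ""))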

To simplify and reuse the classic pattern WHERE LedgerTransDate <= $ToDate OR $ToDate = NULLDATE(), an interim variable (Calculated field) $ToDateOrInfinity may be declared (I could not find any other way to pass a date literal into an ER expression, hence the crude DATEVALUE("31-12-2154","dd-MM-yyyy")):
IF('$ToDate'>NULLDATE(), '$ToDate', DATEVALUE("31-12-2154","dd-MM-yyyy"))
and the above query reduces to just
FILTER(ProjTransPosting, AND(ProjTransPosting.PostingType=Enums.LedgerPostingType.ProjCost,
ProjTransPosting.LedgerTransDate <= $ToDateOrInfinity))

ToDate user parameter
This led to an issue in the UI, however. User parameters from the [default] Mapping designer propagate to the format, are merged with the user parameters defined in the Format designer, and are displayed together in the common dialog window. According to Maxim B. from the Microsoft R&D Center in Moscow, a user parameter is shown at run time if 2 conditions are met:

  • the visibility is not turned off;
  • in the mapping, the user parameter is bound to the model either directly or indirectly.

Seemingly, the above indirect usage obscures the parameter from the ER format and it disappears from the dialog. The solution was to bind $ToDate directly to an unused field in the model.

Date formatting: DATETIME to DATE

I spent quite some time trying to find a built-in function to convert a DATETIME value to a DATE and bind it to a DATE cell in Excel. Obviously, you can declare the format node a STRING and apply the DATETIMEFORMAT() function, but this circumvents the locale settings.

The solution was dumb, but it worked:
DATEVALUE(DATETIMEFORMAT(@.'Created date', "dd-MM-yyyy"), "dd-MM-yyyy")

First and last date of the month

The formulas below work in model mappings. Assume parmPeriodDate is a user parameter. The first formula is trivial:
DATEVALUE("01-"&DATEFORMAT(parmPeriodDate, "MM-yyyy"), "dd-MM-yyyy")

The second formula is crazy. It requires a declaration of the system class DateTimeUtil nearby, in the data source section of the model mapping.
DATEVALUE(DATETIMEFORMAT(ADDDAYS(DATETODATETIME(DATEVALUE("01-"&
DATETIMEFORMAT(DateTimeUtil.addMonths(DATETODATETIME(parmPeriodDate),1),"MM-yyyy"), "dd-MM-yyyy")),-1),"dd-MM-yyyy"),"dd-MM-yyyy")

It takes the parameter (e.g. 15.02.2023), adds a month (15.03.2023), takes the 1st of that month (01.03.2023), and subtracts 1 day (28.02.2023).

Date formatting: Show an empty date in Excel as blank

An empty date in the format data source is printed in Excel as 02.01.1900. This is not good and very distracting. Again, an obvious solution is DATEFORMAT() in a STRING-typed cell, but it just doesn’t feel right.
An elegant approach is a generic Transformation in the Format designer: a Formula transformation NoNullDate =
IF(parameter<=NULLDATE()+1, "", DATEFORMAT(parameter, "dd-MM-yyyy"))
This can be re-used and quickly applied to every node: