Electronic reporting for data migration

Intro

There are still dozens of important tables in Dynamics 365 for Finance and SCM not exposed through any data entity. Sometimes simple solutions such as my Copy-paste with a keyboard script from Excel don’t work because of the volume or complexity of the data, and sometimes there is no UI at all.

One notorious example is the table EcoResCatalogControl, which is associated with product attributes bound to procurement categories; see the Searchable flag in my blog Searchable product attributes. An import of these “Product category attributes” through the entity EcoResProductCategoryAttributeEntity leaves the attributes dysfunctional: the internal table EcoResCategoryAttributeLookup is left out of sync, and the new attributes remain invisible in the product master.
Procurement category attributes
The only form where the EcoResCatalogControl is visible and editable is the Procurement category form, but an attempt to update the flag fails with a “Missing reference” error: the system expects an EcoResCatalogControl record to exist already. Dead end.

Electronic reporting: the last resort

Since the table browser is no longer an option in a sandbox or production environment, the number of tools available to the consultant reduces to just one: Electronic reporting. Despite the name, the Electronic reporting module is able not only to read, but also to freely insert, update or delete records in arbitrary tables in D365FO.
In essence, the idea is to make a CSV text file and import it into the EcoResCatalogControl table with the help of Electronic reporting (ER).

This requires a so-called Mapping to destination, where the “destination” is one or more tables or entities in D365FO. The concept is nicely outlined in the blog of Mr. Ties Philippi. In total, 4 “components” are required in at least 2 ER configurations (the Mapping to destination may in theory be detached into a separate – third – configuration):

  1. Model
  2. Model mapping “To destination”
  3. [CSV file] Format
  4. Model mapping [from format] ”To model”

ER configuration and component diagram
The execution flows in the order (3) -> (4) -> (1) -> (2).
The 2 configurations I used can be downloaded here.

First, create a model; for simplicity, its structure may follow the table exactly.
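A minimal model for this exercise might look as follows (the node names Attribute and Category are simply the ones I use for the two CSV columns; any structure that mirrors the table will do):

Model root
  Lines        Record list (one record per CSV line)
    Attribute  String
    Category   String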

Next, create a mapping with the Direction = To destination. Add the target table on the right with the button Destination, then bind the model on the left to the table on the right side of the Model mapping designer. Change the status of the configuration to Completed.

Next, create a format configuration based on the model. It should describe the CSV file structure, namely a set of records (Lines) separated by CRLF, with 2 fields in every line separated by the semicolon “;” (in a German version of Excel; in an English version the comma “,” is more appropriate). The CSV format should preferably use the UTF-8 encoding (Excel is able to produce UTF-8 encoded files).
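For illustration, such a CSV file might look like this (the attribute and category names are made up):

Farbe;Büromöbel
Material;Büromöbel
Farbe;Bürobedarf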

Do not bind the format to the model directly; instead, use the button Map format to model to create a mapping of the same name. In this mapping, bind the format to the model as shown on the screenshot below. The button Run can be used to test the mapping ad hoc: it takes a CSV file, interprets it according to the format definition, translates it into the internal model container and exports this container to an XML file.
ER components and mappings

Once tested, update the format status to Completed. Note that the import is started from the artefact (2), the Model mapping to destination (with the button Run). The system looks for a format based on the model the mapping belongs to, and inside that format it looks for a “To model” mapping.
Run Mapping to destination
Having found an appropriate format, the system asks for a CSV file and unleashes the import; there is no way back.

Update route

Added on 31.01.2020
Electronic reporting destination mappings can also be used to launch X++ code. If you have ever worked with production routes, you may have noticed the Update route button. The class behind it is called RouteUpdate, and it is executed by the system automatically from the UI on any change in the order of operations. On plain production routes, it updates the Next operation numbers and multiplies the variable scrap percentages to calculate and store the accumulated scrap factors. The problem is, this class cannot be run over all routes in batch mode, and in the Route headers entity they forgot to call it. Opening every single route out of thousands imported is obviously not an option.

I took the same CSV file and created a new destination mapping which updates the RouteTable. One column in the CSV file (Attribute) contains the key RouteId, the other (Category) bears the route name. I am updating the name by the key, but in the middle I am injecting the following code
RouteSearch.newRouteId(@.Attribute).routeId
like this
LEFT(RouteSearch.newRouteId(@.Attribute).routeId,0)&@.Category
It instantiates and executes the RouteSearch class which, as a by-product, calls RouteUpdate in its constructor method, then returns a routeId() value which I discard. Thousands of routes are updated blazingly fast, and the Next operation and the Accumulated scrap are set everywhere.
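To spell the trick out (the annotations are mine, they are not part of the expression):

  • RouteSearch.newRouteId(@.Attribute) instantiates RouteSearch for the RouteId taken from the CSV column Attribute; its constructor calls RouteUpdate as a side effect.
  • .routeId returns the route number, which is not needed here.
  • LEFT(..., 0) truncates this value to an empty string.
  • &@.Category appends the route name, so the value actually written to the record is just the Category column.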

The ER configuration can be downloaded here: RouteUpdateMapping.zip.

Open project milestones

Added on 16.04.2020
Finally, I have designed a data model and mappings for the fixed-fee project On-account transactions, aka Billing milestones. The configuration may take a project TransId from the CSV file, or it may create its own from the standard number sequence in the Project module parameters. Both related tables ProjOnAccTrans + ProjOnAccTransSale are generated.
Project on-account transactions
The ER configuration can be downloaded here: ProjOnAccTransModel.zip
Fun fact: a DMF entity for this existed in AX 2012, but not in D365.

Epilogue

The above crude trick works, but it is dangerous. Fixing a broken import might be a challenge, because to delete the data and start all over you need to change the destination Record action to Delete and then hope to find the right record by the primary key from the CSV file.

I would love to use standard entities, and make Microsoft add them to the application. Please vote for my 2 “favourites”:

Electronic Reporting (ER) Cookbook 3: Working with dates

Following parts 1 and 2 of my hints and tips collection for Electronic reporting, let me shed some light on working with dates and date formatting.

Cut-off date

If a classic Excel transaction report must be implemented in the Electronic reporting module, then a cut-off date is a common requirement: show all transactions up to and including a certain date, or show all transactions if the user hasn’t selected any date. Classically, a parameter in a dialog is presented to the user; this requires a User parameter of the DATE type in the format or – better – in the model mapping (see also Electronic reporting in depth).

This parameter must be used in a Calculated field of the Record list type. Below is a snippet from one of my very first reports dating back to 2016; it extracts ledger transactions of the current company for the German GDPdU dump file:
WHERE(GLaccountEntry,
AND(
NOT(GLaccountEntry.AccountingCurrencyAmount=0),
GLaccountEntry.'>Relations'.GeneralJournalEntry.Ledger = Ledger.'current()',
GLaccountEntry.'>Relations'.GeneralJournalEntry.'>Relations'.FiscalCalendarPeriod.Type = FiscalPeriodType.Operating,
GLaccountEntry.'>Relations'.GeneralJournalEntry.AccountingDate >= FromDate,
OR(GLaccountEntry.'>Relations'.GeneralJournalEntry.AccountingDate <= ToDate, ToDate=NULLDATE())))

In hindsight, this was a terrible implementation. The WHERE operator works with in-memory lists. In essence, it loads all transactions from all companies, periods and years into an internal XML container, and only THEN starts filtering it by company and date. Not surprisingly, the performance degrades rapidly: the execution time grows with the total number of transactions in the database, not with the selected subset.

FILTER is a better function, as it compiles to a direct SQL statement and returns a reduced dataset; however, it has limited capabilities and does not support cross joins like the ones above.

A mature solution is to perform the filtering in 2 steps: one Calculated list fetches a maximally pre-filtered list from the SQL database, and subsequent Calculated field variables apply additional dynamic filters to this subset in memory.
The below expression returns a list of all project cost transactions up to a certain date:
FILTER(ProjTransPosting,
AND(ProjTransPosting.PostingType=Enums.LedgerPostingType.ProjCost,
OR(ProjTransPosting.LedgerTransDate <= $ToDate, $ToDate=NULLDATE())))

where $ToDate is the user parameter.
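For the second step, assume the FILTER expression above is stored in a Calculated field named $CostTransToDate (this name, $ProjFilter and the use of a ProjId field below are only illustrative). An additional, purely dynamic condition can then be applied in memory to the already reduced list:

WHERE($CostTransToDate, $CostTransToDate.ProjId = $ProjFilter)

Because FILTER has already cut the list down on the SQL side, the in-memory WHERE only has to scan the remaining records.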

To simplify and reuse the classic pattern WHERE LedgerTransDate <= $ToDate OR $ToDate=NULLDATE(), an interim variable (Calculated field) $ToDateOrInfinity may be declared (I could not find any other way to pass a date literal into an ER expression, that’s why the crude DATEVALUE("31-12-2154","dd-MM-yyyy")):
IF('$ToDate'>NULLDATE(), '$ToDate', DATEVALUE("31-12-2154","dd-MM-yyyy"))
and the above query reduces to just
FILTER(ProjTransPosting, AND(ProjTransPosting.PostingType=Enums.LedgerPostingType.ProjCost,
ProjTransPosting.LedgerTransDate <= $ToDateOrInfinity))

This led to an issue on the UI, however. User parameters from the [default] Mapping designer propagate to the format, are merged with the user parameters defined in the Format designer, and are displayed together in the common dialog window. According to Maxim B. from the Microsoft R&D Center in Moscow, a user parameter is shown at run time if 2 conditions are met:

  • the visibility is not turned off;
  • in the mapping, the user parameter is bound to the model either directly or indirectly.

Seemingly, the above indirect usage obscures the parameter from the ER format and it disappears from the dialog. The solution was to bind $ToDate directly to an unused field in the model.

Date formatting: DATETIME to DATE

I spent quite some time trying to find a built-in function to convert a DATETIME value and bind it to a DATE cell in Excel. Obviously, you can declare the format node as a STRING and apply the DATETIMEFORMAT() function, but this circumvents the locale settings.

The solution was dumb, but it worked:
DATEVALUE(DATETIMEFORMAT(@.'Created date', "dd-MM-yyyy"), "dd-MM-yyyy")

First and last date of the month

The below formulas work in model mappings. Assume parmPeriodDate is a user parameter. The first formula, returning the first day of the month, is easy and trivial:
DATEVALUE("01-"&DATEFORMAT(parmPeriodDate, "MM-yyyy"), "dd-MM-yyyy")

The second formula, returning the last day of the month, is crazy. It needs the system class DateTimeUtil to be declared nearby in the model mapping as a data source.
DATEVALUE(
  DATETIMEFORMAT(
    ADDDAYS(
      DATETODATETIME(
        DATEVALUE("01-"&DATETIMEFORMAT(DateTimeUtil.addMonths(DATETODATETIME(parmPeriodDate),1),"MM-yyyy"), "dd-MM-yyyy")),
      -1),
    "dd-MM-yyyy"),
  "dd-MM-yyyy")

It takes the parameter (e.g. 15.02.2023), adds a month (15.03.2023), makes the 1st of that month (01.03.2023), and subtracts 1 day (28.02.2023).

Date formatting: Show an empty date in Excel as blank

An empty date in the format data source is printed in Excel as 02.01.1900. This is not good and very distracting. Again, an obvious solution is DATEFORMAT() in a string type cell, but it just doesn’t feel right.
An elegant approach is a generic Transformation in the Format designer: Formula transformation NoNullDate =
IF(parameter<=NULLDATE()+1, "", DATEFORMAT(parameter, "dd-MM-yyyy"))
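The condition parameter<=NULLDATE()+1 catches both the empty date itself and the 02.01.1900 value Excel would otherwise display, and returns an empty string in these cases; any real date is printed normally, e.g. as “28-02-2023”.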
This transformation can be re-used and quickly applied to every date node.

Italian eInvoicing FatturaPA: Part 2

You thought this was over? Not at all: the implementation of the FatturaPA format in D365FO harbors many pearls : )

DatiBollo, ImportoBollo

The “Bollo” or stamp duty seems to be a self-assessed Italian charge on certain deliveries. The user should be able to apply stamp duties to a subset of outbound invoices at will. The total stamp duty load is deducted by the Italian authorities periodically by harvesting the electronic invoices. If the stamp duty is applicable, the DatiBollo tag should say “si” and ImportoBollo should give the [positive] amount: 2.00.

The standard D365FO electronic format assumes the stamp duty is represented by a miscellaneous charge on the sales order. Such a charge triggers the DatiBollo XML element; any other miscellaneous charge is exported as a regular invoice line.
The respective charge code can be configured with a transit debit and a transit credit account and added to any sales line manually or automatically as an auto-charge:

The system relies on the Spanish parameter Stamp duty to identify the dedicated miscellaneous charge code representing this duty. Unfortunately, this Spanish parameter is unavailable in Italy by definition; check the button Electronic document properties in the Legal entities form:

Moreover, in the mapping between the database and the format the charge code was missing, and the duty only took sales order header-level charges into account. To fix this, a custom mapping needs to be derived from the Customer invoice model – Model mapping Customer invoice. This mapping must be declared Default for model mapping = Yes.
In the format, the parametrized query $StampDutyCharges can be extended by a hardcoded check whether the charge code begins with “Bollo”; then both the header-level and line-level charges can be merged into a single list:
IF(ISEMPTY('Stamp duty'),
LISTJOIN(WHERE(@.InvoiceBase.MarkupTransaction,
LEFT(@.InvoiceBase.MarkupTransaction.Code,5)="Bollo"),
WHERE(@.LineItem.MarkupTransaction,
LEFT(@.LineItem.MarkupTransaction.Code,5)="Bollo")),

WHERE(@.InvoiceBase.MarkupTransaction,
@.InvoiceBase.MarkupTransaction.Code='Stamp duty'.Value))

RiferimentoNormativo

The RiferimentoNormativo element delivers the legal justification for tax-exempt invoices and refers to the Italian tax code. This tag was completely missing in the standard format. The tax code description is a good place to store the justification; the Customer invoice model has a node to store this data, but in the standard mapping the tax code description was not bound to the model. The derived mapping needs to be adjusted at the node CustInvoiceJour/>Relations/TaxTrans:
@.'>Relations'.TaxTable.TaxName

A new conditional XML element shall be added to the format configuration:
XMLHeader/p:FatturaElettronica/FatturaElettronicaBody/DatiBeniServizi/DatiRiepilogo/RiferimentoNormativo:
@.lines.TaxDescription

The Enabled condition for the tag should trigger the XML element only if the delivery is tax exempt, and this correlates with the “Natura” tag:
@.lines.'$IsVisibleNatura'

Credit notes in FatturaPA and ESTEROMETRO

The worst comes at the end. It turned out that credit notes (= TD04) must have the positive sign throughout the XML file. This requires massive changes in the DettaglioLinee section:
ABS(@.LineBase.Quantity)
ABS(IF(@.LineBase.Quantity<0, -(1)*@.LineBase.'$FinalUnitPrice', @.LineBase.'$FinalUnitPrice'))
in the DatiRiepilogo section:
ABS(@.aggregated.Amount_Sum)
and in the DatiPagamento section:
ABS(@.DueAmount)
Similar amendments, but in just one place, should be made in the ESTEROMETRO electronic formats (a kind of EU Sales list report for Italy), such as “Report customer invoices IT” and “Report vendor invoices IT”.