Activities

Each activity is listed below with its activity area and a description.

Add data set or data entity to data model

Data modeling

In the Data modeling studio, a data model defines the data to be exported. For each data model, you define the applicable:

  • Data sets: You can add the data sets as defined in the Data modeling studio. An added data set gets the mode 'Unmanaged'.
  • Data entities: You can add standard or custom data entities as available for D365 FO. An added data entity gets the mode 'Managed'.

Add fields to data set

Data modeling

In the Data modeling studio, a data set defines the data to be exported. For each data set, define the relevant fields.

Analyze data exchange log

Data modeling

You can analyze a data export action in the data exchange log. On analyze:

  1. Validation is done: The number of records in the source database (D365 FO/AxDB) is compared to the number of records in the target database (BYOD). If the number of records is equal, no further analysis is done.
  2. If the number of records differs, the records in both databases are compared.
  3. The result of the comparison is stored and can be viewed on the Analysis details page.
  4. A message is shown to indicate the difference.

The result of the comparison, as shown on the Analysis details page, can be:

  • Identical: The record exists in both the source database and the target database.
  • Missing: The record exists in the source database, but not in the target database.
  • Extra: The record exists in the target database but not in the source database.

Note:  You can use the Analysis details data for troubleshooting purposes. This data is not used for synchronization.
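
The analysis logic is comparable to comparing record counts and record IDs between the source and target tables. The following T-SQL sketch only illustrates that idea; the actual analysis is run by Data modeling studio. The table CUSTTABLE, the staging table dbo.CUSTTABLESTAGING, and the [BYOD] database reference are hypothetical, and the sketch assumes both databases can be queried from one connection.

    -- Step 1: compare the record counts in the source (AxDB) and target (BYOD) tables.
    SELECT
        (SELECT COUNT(*) FROM dbo.CUSTTABLE)               AS SourceCount,
        (SELECT COUNT(*) FROM [BYOD].dbo.CUSTTABLESTAGING) AS TargetCount;

    -- Step 2: if the counts differ, compare the records by RecId.
    -- Missing: the record exists in the source database, but not in the target database.
    SELECT s.RECID
    FROM dbo.CUSTTABLE s
    LEFT JOIN [BYOD].dbo.CUSTTABLESTAGING t ON t.RECID = s.RECID
    WHERE t.RECID IS NULL;

    -- Extra: the record exists in the target database, but not in the source database.
    SELECT t.RECID
    FROM [BYOD].dbo.CUSTTABLESTAGING t
    LEFT JOIN dbo.CUSTTABLE s ON s.RECID = t.RECID
    WHERE s.RECID IS NULL;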

Apply cross filtering

Data modeling

You can apply filters to each data set or data entity that is added to a data model, to limit the data that is exported.

You can apply these filters:

  • Legal entity: By default, data is exported from all legal entities in the current environment. If you want to only export data for one or several legal entities, you can filter on these legal entities.
  • Created date/time: By default, on data export, all records are considered for export, regardless of the created date/time of these records. You can apply created date/time filters to only export records that are created:
    • After a specific date/time: Only define the Start date/time.
    • Before a specific date/time: Only define the End date/time.
    • In a specific period: Define both the Start date/time and the End date/time.
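
Conceptually, these filters narrow the exported result set in the same way a WHERE clause does. A minimal T-SQL sketch of the equivalent selection, assuming the standard D365 FO fields DATAAREAID (legal entity) and CREATEDDATETIME, and a hypothetical table CUSTTABLE with hypothetical filter values:

    -- Legal entity filter: only export data for the selected legal entities.
    -- Created date/time filter: only export records created in the defined period.
    SELECT *
    FROM dbo.CUSTTABLE
    WHERE DATAAREAID IN ('USMF', 'DEMF')              -- hypothetical legal entities
      AND CREATEDDATETIME >= '2024-01-01T00:00:00'    -- Start date/time
      AND CREATEDDATETIME <  '2025-01-01T00:00:00';   -- End date/time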

Apply data set template

Data modeling

You can use a data set template to create data sets based on the template or to add fields to existing data sets.

If a data set in the template:

  • Does not exist, the data set is created with the settings and fields as defined in the template.
  • Already exists, and the template has fields that the existing data set does not have, these fields are added to the existing data set. The existing data set fields that are not in the template are not removed from the data set. So, you can only use a template to add fields to a data set and not to remove fields from a data set.

Apply template to data model

Data modeling

You can use a template to easily add data sets and data entities to a data model.

You can use templates that are created from the:

  • Data models page. Before you apply a template that is created from the Data models page, make sure that the data sets and data entities, as saved in the template, already exist in your D365 FO environment.
  • Data sets page. If a data set in the template:
    • Does not exist, the data set is created with the settings and fields as defined in the template. Also, the data set is added to the data model.
    • Already exists, and the template has fields that the existing data set does not have, these fields are added to the existing data set. The existing data set fields that are not in the template are not removed from the data set. So, you can only use a template to add fields to a data set and not to remove fields from a data set.

Change data set or data entity status for data model

Data modeling

On export, only the enabled data sets and data entities are exported. So, to (temporarily):

  • Exclude a data set or data entity from export, disable it.
  • Include a data set or data entity in export, enable it.

Create data model

Data modeling

In the Data modeling studio, a data model defines the data to be exported, how to export the data, where to export the data to, and when to export the data.

For each data model, define these settings:

  • Identification: The identification of the data model.
  • Model: How the data is exported and how the target database is configured.
  • Destination: The schema that is used to create the table in the target database on deploy.
  • Metadata: Which metadata is added to the target database on deploy.
  • Scheduling: How the data export is scheduled for the data model.
  • Update information: The contact info of the data model owner.
  • Connection: The connection to the target database.

Create data set

Data modeling

In Data modeling studio, a data set defines the data to be exported. Use a data set to define the data to be exported by a data model.

On creation of a data set, at least, define:

  • The relevant table.
  • The data selection type.
  • Whether system fields or date/time fields are added automatically to the data set.

Create export groups for data model

Data modeling

You can enable a data model export to be scheduled by export group. To export by export group, for each data set or data entity, as added to the data model, define the applicable export group.
On export, the data sets and data entities with the same export group defined, are combined in one batch job.
If you schedule export by export group, you can, for one data model, schedule the export of several data subsets with different recurrences.

Export groups are defined by data model. To be able to use export groups, first create the export groups for the data model.

Define financial dimensions

Data modeling

For a data model, you can define which financial dimension combinations must be exported to the target database. You can use these financial dimension combinations for reporting purposes.
On setup of financial dimensions for a data model, a batch job searches all applicable financial dimension tables in D365 FO for existing financial dimension combinations that fit the data model setup. All combinations found are transformed and added to the CDPDIMENSIONS table in D365 FO. When new financial dimension combinations that fit the data model setup are created in D365 FO, these combinations are automatically transformed and added to the CDPDIMENSIONS table as well.
If financial dimensions are set up for the data model, on:

  • Deploy of the data model, the financial dimensions table is created in the target database. The name of this table is: [schema name].CDPDIMENSIONSSTAGING; where [schema name] is the schema as defined for the data model.
    All financial dimension combinations that exist in the CDPDIMENSIONS table in D365 FO are exported to the [schema name].CDPDIMENSIONSSTAGING table in the target database.
  • Export of the data model, the applicable financial dimension combinations are exported from the CDPDIMENSIONS table in D365 FO to the [schema name].CDPDIMENSIONSSTAGING table in the target database.
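
For reporting, the exported financial dimension combinations can be joined to exported transaction data in the target database. A minimal T-SQL sketch, assuming the schema name dms, a hypothetical exported data set table GENERALJOURNALACCOUNTENTRYSTAGING, and a hypothetical LEDGERDIMENSION column that references the dimension combination by its record ID:

    -- Join exported transactions to the exported financial dimension combinations.
    SELECT e.*, d.*
    FROM dms.GENERALJOURNALACCOUNTENTRYSTAGING e   -- hypothetical exported data set
    JOIN dms.CDPDIMENSIONSSTAGING d                -- dimensions table created on deploy
      ON d.RECID = e.LEDGERDIMENSION;              -- hypothetical join field and key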

Define primary index of data set

Data modeling

The standard table index for D365 FO tables, as used by Data modeling studio, consists of these fields: RecId and RecVersion. You can select additional primary indexes, if available for the data set table. These additional primary indexes are then used as well by Data modeling studio.

Note: To use additional indexes, the table system fields must be added as well. If you select an additional index and the 'Add system fields' field was set to 'No', this field is automatically set to 'Yes'.
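
The following T-SQL sketch only illustrates what such indexes could look like on a target staging table; the indexes that Data modeling studio actually creates can differ. The table dms.CUSTTABLESTAGING and the field ACCOUNTNUM are hypothetical examples.

    -- Standard index fields used by Data modeling studio to identify records.
    CREATE INDEX IX_CUSTTABLESTAGING_STANDARD
        ON dms.CUSTTABLESTAGING (RECID, RECVERSION);

    -- An additional primary index, for example on ACCOUNTNUM, can also be used
    -- to identify records (requires the table system fields to be added).
    CREATE INDEX IX_CUSTTABLESTAGING_ACCOUNTNUM
        ON dms.CUSTTABLESTAGING (ACCOUNTNUM);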

Deploy data model

Data modeling

If a data model is set up, and applicable transformations are defined, deploy the data model. You deploy a data model to prepare the target database for data exports.

Deploy data set

Data modeling

In the Data modeling studio, a data set defines the data to be exported. You can deploy a data set. As a result, in the target database, the related table is deleted and recreated based on the deployed data set.

Note:

  • You can only deploy a data set if it is linked to a data model.
  • When you deploy a data set, the related table in the target database is deleted. Also the existing data is removed. The next time you export the related data model, the data is recreated in the target database.

Design data set based on form

Data modeling

In Data modeling studio, a data set defines the data to be exported. Use a data set to define the data to be exported by a data model.

You can create data sets based on a form. A form can have fields from several tables. You can select any form field. If the selected form fields are related to different tables, for each of these tables, a separate data set is created.

If a data set already exists for a table, no new data set is created. If a selected field:

  • Already exists in the existing data set, nothing is done.
  • Does not exist in the existing data set, the field is added to the existing data set.

Download data templates and transformations

Data modeling

To exchange a data template or transformation with another D365 FO environment, download the template or transformation.

You can download several templates and transformations at once. The downloaded templates and transformations are bundled in a compressed (zipped) folder. This folder is saved to your downloads location.

In the compressed folder, for each category type, a separate folder is created with the templates and transformations of that category.

The type defines the file format of the template or transformation file, for example, XML or SQL.

Edit data set

Data modeling

If you have created a data set based on a form, check the data set header settings and field settings, and edit these as desired.

 

Export data

Data modeling

If a data model is set up and deployed, you can export the data as defined in the data model.

Refresh data exchange history

Data modeling

The deploy and export history is logged. Over time, the number of logged records grows. To limit the number of logged records in the database, you can refresh the history. If you refresh the history, the retention settings of these Data modeling studio parameters are applied:

  • Log record retention (days): On refresh, only the log records are kept that are within the number of defined retention days.
  • Processing record retention (days): On refresh, only the processing records are kept that are within the number of defined retention days.

Example: The number of retention days is 7. Today, you refresh the history. As a result, all logged records that are older than a week are deleted.
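
The effect of the retention settings is comparable to deleting records that are older than the retention window. A minimal T-SQL sketch of that idea, with a hypothetical log table DMSEXCHANGELOG; the actual cleanup is done by the Refresh action in D365 FO.

    -- Keep only the log records created within the last 7 days (retention = 7).
    DELETE FROM dbo.DMSEXCHANGELOG                        -- hypothetical log table
    WHERE CREATEDDATETIME < DATEADD(DAY, -7, GETUTCDATE());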

Save data model data sets and data entities as template

Data modeling

On a data model, you can create a template based on one or several of the added data sets and data entities. You can use this template to easily add these data sets and data entities to another data model.

If you, on a data model, save one or several data sets and data entities as a template, the template is created with the category 'Data' and the type 'XML'. In the template, references to the selected data sets and data entities are stored in plain XML. The XML also contains the value of the 'Enabled' field.

On saving data set and data entity configurations to a template, you can:

  • Create a new template: Enter the desired template name in the Template field.
  • Overwrite an existing template: Set the Overwrite template field to 'Yes', and, in the list, select the desired template.

Save data set as template

Data modeling

You can save one or several data set configurations as a template. You can use this template to exchange the data set configuration with another D365 FO environment or to easily add it to a data model.

If you save data set configurations as a template, a template is created with the category 'Data' and the type 'XML'. In the template, the data set configuration is stored in plain XML. The XML contains both the data set general settings and the data set fields.

On saving data set configurations to a template, you can:

  • Create a new template: Enter the desired template name in the Template field.
  • Overwrite an existing template: Set the Overwrite template field to 'Yes', and, in the list, select the desired template.

Select data set fields to be exported for data model

Data modeling

If you add a data set to a data model, by default all data set fields are included in the data export. For a data model, you can select a subset of the data set fields to be included in the data export.

Note: If a data set is used in a data model, you can still add fields to the data set. These newly added data set fields are not added automatically to the selected fields for the data model. So, if fields are added to a data set, and these must be included in the data export, manually select these fields for the data model.

Select deploy transformations

Data modeling

For each data model you can define transformations to be executed automatically after the data model is deployed.

You can define transformations of these categories:

  • Modeling: You can use modeling transformations to create views in the target database. These views are created based on the business entities that are created in the target database on deploy.
  • Instrumentation: You can use instrumentation transformations to create objects, stored procedures, and schemas in the target database. You can also use these transformations to do calculations on the metadata that is exported on deploy.

A transformation contains one or several SQL statements which define the transformation actions to be done.

Select pre- and post-export transformations

Data modeling

For each data model you can define transformations to be executed automatically before and after the data model is exported.

On data export for a data model, the processing transformations are used to do calculations in the target database. A processing transformation contains one or several SQL statements which define the transformation actions to be done.

You can have processing transformations executed before or after exporting data:

  • Pre-export transformations: Used to do calculations on the metadata in the target database. For example, on enumerations or labels.
  • Post-export transformations: Used to do calculations based on the data that is just exported to the target database.
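
As an illustration of a pre-export transformation, the following hedged T-SQL sketch builds an enumeration lookup from exported metadata before the data export runs. The tables dms.ENUMSSTAGING and dms.SALESSTATUSLOOKUP and their columns are hypothetical names, not the actual Data modeling studio metadata structure.

    -- Pre-export transformation (illustrative): prepare an enumeration lookup
    -- in the target database before the data is exported.
    IF OBJECT_ID('dms.SALESSTATUSLOOKUP') IS NOT NULL
        DROP TABLE dms.SALESSTATUSLOOKUP;

    SELECT ENUMITEMVALUE, ENUMITEMLABEL
    INTO dms.SALESSTATUSLOOKUP
    FROM dms.ENUMSSTAGING                  -- hypothetical exported metadata table
    WHERE ENUMNAME = 'SalesStatus';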

Set up data export parameters

Data modeling

Set up the parameters that are applied:

  • To the scheduling of an export batch job.
  • During the export of data, like timeout settings.
  • To the creation of data models, like the schema.

Set up entity creation parameters

Data modeling

Set up the parameters that are applied:

  • To the creation of Data sets in D365 FO.
  • On deploy, to the creation of entities in the target database.

Set up instrumentation transformation

Data modeling

On deploy of a data model, you can use instrumentation transformations to create objects, stored procedures, and schemas in the target database. You can also use these transformations to do calculations on the metadata that is exported on deploy. An instrumentation transformation contains one or several SQL statements which define the transformation actions to be done.

Usually, a transformation is created in SQL, and then the file is uploaded to the Data templates and transformations page. On upload, a new transformation is created or an existing transformation is overwritten. You can also manually create a transformation in Data modeling studio and enter or copy the SQL statements in the Definition field.
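
A minimal sketch of what the SQL statements of an instrumentation transformation could look like, assuming a hypothetical reporting schema rpt and a hypothetical audit table; it is only an example of the kind of objects such a transformation can create.

    -- Instrumentation transformation (illustrative): create a schema and an object
    -- in the target database on deploy.
    IF NOT EXISTS (SELECT 1 FROM sys.schemas WHERE name = 'rpt')
        EXEC ('CREATE SCHEMA rpt');

    IF OBJECT_ID('rpt.EXPORTAUDIT') IS NULL              -- hypothetical audit table
        CREATE TABLE rpt.EXPORTAUDIT
        (
            TABLENAME NVARCHAR(128) NOT NULL,
            ROWCNT    BIGINT        NOT NULL,
            LOGGEDAT  DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME()
        );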

Set up maintenance parameters

Data modeling

Several upgrade scripts are available to be used on upgrading your Data modeling studio installation to a newer version. Most of these upgrade scripts probably aren't applicable anymore to most Data modeling studio installations.

The only one that can possibly be applicable is the Upgrade post processing script (updateStagingEntitySuffixScript). You can use this upgrade script if you upgrade your Data modeling studio installation from a version that's older than 10.0.25.... This script solves possible issues with the entity suffix.

Run the script after you have upgraded your Data modeling studio installation to the newest version.

You can also define some diagnostics-related settings.

Set up metadata synchronization parameters

Data modeling

Set up the metadata-related parameters.

Set up modeling transformation

Data modeling

On deploy of a data model, you can use modeling transformations to create views in the target database. These views are created based on the business entities that are created in the target database on deploy. A modeling transformation contains one or several SQL statements which define the transformation actions to be done.

Usually, a transformation is created in SQL, and then the file is uploaded to the Data templates and transformations page. On upload, a new transformation is created or an existing transformation is overwritten. You can also manually create a transformation in Data modeling studio and enter or copy the SQL statements in the Definition field.
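
A minimal sketch of a modeling transformation that creates a view on top of a business entity in the target database. The schema dms, the staging table CUSTTABLESTAGING, the view name, and the selected fields are hypothetical examples.

    -- Modeling transformation (illustrative): create a reporting view based on
    -- a business entity that is created in the target database on deploy.
    CREATE OR ALTER VIEW dms.Customers AS
    SELECT ACCOUNTNUM, DATAAREAID, RECID    -- hypothetical field selection
    FROM dms.CUSTTABLESTAGING;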

Set up processing transformation

Data modeling

On data export for a data model, you can use processing transformations to do calculations in the target database. A processing transformation contains one or several SQL statements which define the transformation actions to be done.

You can have processing transformations executed before or after exporting data:

  • Pre-export transformations: Used to do calculations on the metadata in the target database. For example, on enumerations or labels.
  • Post-export transformations: Used to do calculations based on the data that has just been exported to the target database.

Usually, a transformation is created in SQL, and then the file is uploaded to the Data templates and transformations page. On upload, a new transformation is created or an existing transformation is overwritten. You can also manually create a transformation on the Data templates and transformations page, and enter or copy the SQL statements in the Definition field.
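
As an illustration of a post-export transformation, the following hedged T-SQL sketch aggregates data that has just been exported into a reporting table. All table and field names are hypothetical examples.

    -- Post-export transformation (illustrative): summarize exported sales lines
    -- per legal entity after the export completes.
    IF OBJECT_ID('dms.SALESBYLEGALENTITY') IS NOT NULL
        DROP TABLE dms.SALESBYLEGALENTITY;

    SELECT DATAAREAID, SUM(LINEAMOUNT) AS TOTALAMOUNT   -- hypothetical fields
    INTO dms.SALESBYLEGALENTITY
    FROM dms.SALESLINESTAGING                           -- hypothetical exported data set
    GROUP BY DATAAREAID;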

Synchronize data exchange log

Data modeling

You can analyze a data export action in the data exchange log. On analyze:

  1. Validation is done: The number of records in the source database (D365 FO/AxDB) is compared to the number of records in the target database (BYOD). If the number of records is equal, no further analysis or synchronization is done.
  2. Analysis is done: If the number of records differs, the records in both databases are compared.
  3. The differences are synchronized.
  4. A message is shown to indicate the synchronization result.

On synchronize, in the target database, records that:

  • Exist in the source database, but do not exist in the target database, are inserted.
  • Do not exist in the source database, but exist in the target database, are deleted.
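
The synchronization result is comparable to inserting the missing records into, and deleting the extra records from, the target table. The following T-SQL sketch only illustrates that idea; the actual synchronization is performed by Data modeling studio, and the table, column, and [BYOD] database names are hypothetical.

    -- Insert records that exist in the source database, but not in the target database.
    INSERT INTO [BYOD].dbo.CUSTTABLESTAGING (RECID, RECVERSION, ACCOUNTNUM)
    SELECT s.RECID, s.RECVERSION, s.ACCOUNTNUM
    FROM dbo.CUSTTABLE s
    WHERE NOT EXISTS (SELECT 1 FROM [BYOD].dbo.CUSTTABLESTAGING t WHERE t.RECID = s.RECID);

    -- Delete records that exist in the target database, but not in the source database.
    DELETE t
    FROM [BYOD].dbo.CUSTTABLESTAGING t
    WHERE NOT EXISTS (SELECT 1 FROM dbo.CUSTTABLE s WHERE s.RECID = t.RECID);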

Update metadata

Data modeling

In Data modeling studio, several metadata tables exist, which are used:

  • On deploy.
  • On deploy or export.
  • In data modeling.

The data for the Data modeling studio metadata tables is collected in D365 FO and stored in these metadata tables in D365 FO.
For some of these metadata tables, the data is used on deploy or export. For other metadata tables, the data is used in data modeling.

It is important to keep the data in the metadata tables up-to-date. Make sure to update the metadata tables after each creation or change of a data model.

Upload data templates and transformations

Data modeling

To use data templates and transformations from another D365 FO environment, upload the data templates or transformations.

You can upload a:

  • Compressed (zipped) folder that is created by a download from the Data templates and transformations page. The files in the compressed folder are extracted automatically and listed on the Upload templates dialog.
  • Single file that is extracted from a compressed (zipped) folder that is downloaded from the Data templates and transformations page.

 

Validate and fix data set

Data modeling

To maintain a data set, you can validate and fix a data set. Validation checks if the data set fields exist as fields in the applicable D365 FO table. Validation also checks the field settings.

Besides the validation, the data set fields are fixed: the data set fields and field settings are brought in line with the corresponding D365 FO table fields and field settings. For example, if a data set field no longer exists in the D365 FO table, it is removed from the data set.

Validate data exchange log

Data modeling

You can validate a data export action in the data exchange log for troubleshooting purposes. On validate, the number of records in the D365 FO database table is compared to the number of records in the target database table.

As a result, a message is shown for one of these possible scenarios:

  • The number of records is equal.
  • The source database (D365 FO/AxDB) has fewer records than the target database (BYOD).
  • The source database (D365 FO/AxDB) has more records than the target database (BYOD).

Note: The validation results are not stored.

Validate data model

Data modeling

If you have deployed a data model, you can run the validation.

The validation is done on the target database. Validation checks if:

  • The target database exists.
  • The schema that is defined for the data model exists in the target database.
  • Change tracking is enabled for the target database.
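
These checks correspond roughly to the following T-SQL queries against the target database; the schema name dms is a hypothetical example, and the existence of the target database itself is implied by being able to connect to it.

    -- Does the schema that is defined for the data model exist?
    SELECT name FROM sys.schemas WHERE name = 'dms';

    -- Is change tracking enabled for the target database?
    SELECT database_id
    FROM sys.change_tracking_databases
    WHERE database_id = DB_ID();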

Validate data set

Data modeling

To maintain a data set, you can validate a data set. Validation checks if the data set fields exist as fields in the applicable D365 FO table. Validation also checks the field settings.

View data exchange history

Data modeling

In Data modeling studio, for monitoring purposes, executed actions are logged.

The main activities for which logging is done are:

  • Deploy: All actions that are executed when you deploy a data model are logged. Each time you deploy a data model, an execution ID is created to which all executed deploy actions are related.
  • Data export: All actions that are executed when you export data are logged.
  • Metadata updates: Metadata updates are only logged if on the Data modeling studio parameters, Log metadata updates is 'Yes'.

Note: On deploy or data export, executed SQL statements are only logged if on the Data modeling studio parameters Log executed SQL statements is 'Yes'.

Logging is done on three levels:

  • Summary: Each deploy or data export results in a record in the summary. Each record is identified by the context (data model) and the execution ID.
  • Log: The actions that are executed by the deploy, data export, or metadata update.
  • Processing: The export statistics for each exported business entity (data set or data entity) as defined for the deployed or exported data model.
