In the Data modeling studio, a data model defines which data is exported, how and where it is exported, and when the export runs.

For each data model, define these settings:

  • Identification: The name and description of the data model.
  • Model: How the data is exported and how the target database is configured.
  • Destination: The schema that is used to create the tables in the target database on deploy.
  • Metadata: Which metadata is added to the target database on deploy.
  • Scheduling: How the data export is scheduled for the data model.
  • Update information: The contact info of the data model owner.
  • Connection: The connection to the target database.


Standard procedure

1. Go to Data modeling studio > Data modeling studio > Data models.
2. Click New.
3. Sub-task: Define identification settings.
  3.1 In the Name field, type a value.
  3.2 In the Description field, type a value.
4. Sub-task: Define model settings.
  4.1 To export data with the Data modeling studio, always use the 'Unmanaged' mode. This is the lowest level of compliance: it imposes the fewest restrictions and provides the highest level of flexibility. To export data from D365 FO, data entities are usually required. In 'Unmanaged' mode, however, you can export data based on configuration only, instead of using data entities. You configure the data to be exported in data sets.
  In the Mode field, select 'Unmanaged'.
  4.2 Define the model type that is used for the data model. The model type indicates how the data model is used and how the target database is configured. The way in which data is consumed from the target database determines the applicable model type.
Select one of these model types:
- Staging: Configures the target database and related data export to support a downstream ETL (extract, transform, load) process. To do so, for each table in the target database, change tracking and a unique index are enabled. Usually, the 'Staging' model type is used.
- Self-service: Configures the target database and related data export to support end-user consumption by queries from the target database. To do so, the target database is configured to be used as a data mart with, for example, user-specified indexes.
  In the Model type field, select an option.
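For the 'Staging' model type, the deploy enables change tracking and a unique index per table, as described above. The sketch below shows, in Python, the kind of T-SQL statements such a deploy could emit; the table name, key column, and 'RAW' schema are hypothetical examples, not the actual statements generated by the Data modeling studio.

```python
# Sketch of the per-table T-SQL a 'Staging' deploy could emit.
# Table, column, and schema names below are made-up examples.

def staging_deploy_sql(table: str, key_column: str) -> list[str]:
    """Return statements that add a unique index and enable
    change tracking for one table in the target database."""
    return [
        f"CREATE UNIQUE INDEX IX_{table}_{key_column} "
        f"ON RAW.{table} ({key_column});",
        f"ALTER TABLE RAW.{table} ENABLE CHANGE_TRACKING;",
    ]

statements = staging_deploy_sql("CustTable", "RecId")
print(statements[1])
```

Change tracking plus a unique index is what lets a downstream ETL process pick up only the rows that changed since its last run.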
5. Sub-task: Define connection string.
  5.1 Enter the connection to the target database.
The recommended structure for the connection string is: Data Source=...;Initial Catalog=...;Integrated Security=...;User ID=...;Password=...
  In the Connection string field, type a value.
 

Note:
The account, as defined in the connection string, must have owner privileges in the target database, for example, to:
- Export data to the target database.
- Create tables in the target database.
- Create objects in the target database using automation scripts (transformations).
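The recommended connection string is a list of semicolon-separated key/value pairs. As a minimal sketch, the Python snippet below splits such a string into pairs and checks that the expected keys are present; the server, database, and credential values are placeholders, not real settings.

```python
# Minimal sketch: split a SQL Server-style connection string into
# key/value pairs and verify the expected keys are present.
# All values below are placeholders.

REQUIRED_KEYS = {"Data Source", "Initial Catalog", "User ID", "Password"}

def parse_connection_string(conn_str: str) -> dict[str, str]:
    pairs = (part.split("=", 1) for part in conn_str.split(";") if part)
    return {key.strip(): value.strip() for key, value in pairs}

conn = parse_connection_string(
    "Data Source=myserver;Initial Catalog=StagingDb;"
    "Integrated Security=False;User ID=cdp_owner;Password=secret"
)
missing = REQUIRED_KEYS - conn.keys()
assert not missing, f"Connection string is missing: {missing}"
```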

  5.2 By default, the connection string is hidden. You can show the connection string for reviewing purposes.
  Click Show.
  5.3 When you have entered the connection string, make sure it connects successfully to the target database.
  Click Test.
6. Sub-task: Define destination settings.
  6.1 Enter or select the schema that is applied to create the tables in the target database.
To select a schema, the connection to the target database must be defined and tested successfully. You can select a schema as defined for the target database.
You can also manually enter a schema. If the schema does not exist in the target database, it is created automatically when the data model is deployed.
  In the Schema field, enter or select a value.
 

Note:
- The default schema is defined in the Data modeling studio parameters, on the Data export tab, in the Schema field.
- In an ETL (Extract, Transform, Load) process, the best practice schema for unprocessed data is 'RAW'.
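When the entered schema does not exist, the deploy creates it automatically. A sketch of the kind of idempotent T-SQL this could use, assuming a SQL Server target ('RAW' follows the ETL best practice mentioned in the note; this is an illustration, not the studio's actual deploy code):

```python
# Sketch: generate idempotent T-SQL that creates a schema only if it
# does not yet exist. CREATE SCHEMA must be the first statement in a
# batch, hence the EXEC wrapper.

def create_schema_sql(schema: str) -> str:
    return (
        f"IF SCHEMA_ID(N'{schema}') IS NULL "
        f"EXEC(N'CREATE SCHEMA [{schema}]');"
    )

print(create_schema_sql("RAW"))
```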

7. Sub-task: Define metadata settings.
  7.1 You can add application metadata tables to the target database.
If you select 'Yes', on deploy, these tables are created and filled in the target database:
- CDPLabelStaging: This table contains the labels from D365 FO. So, in the target database, the UI labels are shown instead of the technical names.
- CDPEnumerationsStaging: This table contains the enum value names from D365 FO. So, in the target database, the enum value names are shown instead of the enum value numbers.
- CDPDictionaryStaging: This table contains all D365 FO tables.
  Select the desired option in the Application metadata field.
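To illustrate how the application metadata tables are used downstream, the sketch below simulates a label lookup as a consumer of CDPLabelStaging might do it: replace a technical field name with its UI label, falling back to the technical name. The label mappings are made-up examples, not actual D365 FO labels.

```python
# Illustration only: label substitution as a CDPLabelStaging consumer
# might do it. The mappings below are invented examples.

labels = {  # technical name -> UI label
    "CUSTACCOUNT": "Customer account",
    "INVOICEID": "Invoice",
}

def display_name(technical_name: str) -> str:
    # Fall back to the technical name when no label exists.
    return labels.get(technical_name.upper(), technical_name)

print(display_name("CustAccount"))  # -> Customer account
```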
  7.2 You can add processing metadata tables to the target database.
If you select 'Yes', on deploy, these tables are created and filled in the target database:
- CDPLogStaging: This table is used to log the deploy and export history. It shows the export history using messages.
- CDPProcessingStaging: This table is used to log the deploy and export processing history. It shows the technical details of the export history.
On data export, the data in these tables is updated.
  Select the desired option in the Processing metadata field.
 

Note: You can view the log and processing history on the Data exchange monitoring page.

  7.3 You can add data model metadata tables to the target database.
If you select 'Yes', on deploy, these tables are created and filled in the target database:
- CDPFieldsStaging: This table contains the fields of the data sets and data entities tables that are added to the data model.
- CDPInputsStaging: This table contains the tables of the data sets and data entities that are added to the data model.
  Select the desired option in the Data model metadata field.
8. Sub-task: Define schedule settings.
  8.1 You can use export groups to schedule data export for several data subsets with different recurrences. Export groups are defined per data model. To export data by export group, also define export groups for the desired data sets and data entities of the data model.
Usually, you use an export group batch job to do incremental exports, for example, export of transactions.
Each export group, as defined for the data sets and data entities, is shown on the Export dialog. For each export group, you can create a separate batch job to export the data of the data sets and data entities with that export group defined.
  Select the desired option in the Schedule by export group field.
  8.2 Usually, you use the 'Default' batch job or an export group batch job to do incremental exports. However, you may want to run the 'Full load' export batch job regularly, for example, each week. If the data selection is set to 'Timestamp', only new or updated records are exported by the 'Default' export or an export group export. However, in D365 FO, records can also be deleted. To remove these records from the target database as well, you can schedule a weekly 'Full load' export.
The 'Full load' export group is shown on the Export dialog. You can use this export group to create one batch job to export all data for all enabled data sets and data entities of the data model.
  Select the desired option in the Schedule full load field.
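The steps above contrast two export styles: an incremental run exports only rows changed since the last timestamp and therefore misses deletions, while a full load replaces the target data and so removes deleted rows too. A small sketch of that difference, with invented record IDs and dates:

```python
# Sketch: why incremental ('Timestamp') exports miss deletions,
# and a full load does not. Data and field names are illustrative.

source = {1: "2024-05-01", 2: "2024-05-03"}  # RecId -> modified date
target = {1: "2024-05-01", 3: "2024-01-15"}  # RecId 3 was deleted in D365 FO

# Incremental: only rows newer than the last run are exported,
# so the stale row 3 stays behind in the target database.
last_run = "2024-05-02"
incremental = {rid: ts for rid, ts in source.items() if ts > last_run}
assert incremental == {2: "2024-05-03"}

# Full load: the target is replaced, so row 3 disappears as well.
target = dict(source)
assert 3 not in target
```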
9. Sub-task: Define update information settings.
  9.1 You can define the email address of the owner of the data model. So, for example, if the export fails for the data model, you know whom to contact.
  In the Contact field, type a value.