Activity | Area | Description |
---|---|---|
(Re)build 'Connectivity' business events |
Design | You can rebuild the business events of category 'Connectivity'. You can do so, for example, if:
As a result:
Note:
|
Activate business event |
Design | For several business events, you activate the business event:
For more information, refer to:
|
Add an item cross reference |
EDI | You can get validation errors or warnings on the received item number, name, or bar code in the EDI sales order. This can be a different number, name, or bar code from the one you use for the related item. For example, a customer sends an external bar code to order the item. In this case, you can add an item cross reference. An item cross reference indicates which item to deliver for an external item number, name, or bar code. For example, the customer has ordered paracetamol with a bar code of a brand that you don't have, but you have the same paracetamol in another brand. With the item cross reference between the bar code of the ordered brand and your item, the sales line can be approved on validation. The search sequence for the applicable item cross reference is from specific to general:
|
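A minimal sketch of such a specific-to-general lookup is shown below (Python, for illustration only; the exact search sequence and the candidate keys used here are assumptions, not the EDI studio setup):

```python
# Hypothetical cross-reference store: an external identifier mapped to an internal item.
# Keys are tried from most specific (customer + bar code) to most general (bar code only).
cross_references = {
    ("CUST-001", "barcode", "8711234567890"): "PARACETAMOL-500",
    (None, "barcode", "8711234567890"): "PARACETAMOL-500",
}

def resolve_item(customer, id_type, external_value):
    """Return the internal item for an external item number, name, or bar code."""
    candidates = [
        (customer, id_type, external_value),  # customer-specific cross reference
        (None, id_type, external_value),      # general cross reference
    ]
    for key in candidates:
        if key in cross_references:
            return cross_references[key]
    return None  # no cross reference found: the validation error remains

print(resolve_item("CUST-001", "barcode", "8711234567890"))  # PARACETAMOL-500
```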
Add batch class to task |
Design | For each task, you can define several (custom) batch classes. If you run a task, all defined batch classes are run. |
Add child record to outbound queue |
Operation | When the data synchronization log is processed, the Redirect event setting of the message or web service action is considered. Each table event that occurs for a record is logged in the data synchronization log for that record and related table. On processing the data synchronization log, by default, log entries for a child record of the source document are added to the outbound queue for the root record of the source document. However, if Redirect event is set to 'No', only the logged event on the child record is added to the outbound queue. When the outbound queue is processed, the child record is exported. Based on the applicable document setup, the applicable parent records and the root record are also exported. Example: The source document has a root record 'Sales order' and a child record 'Sales line'. An event is logged for a sales line. If Redirect event is set to 'No', the sales line record is added to the outbound queue. When the outbound queue is processed, the sales line and the sales order are exported. Note: Before a record is added to the outbound queue, a check is done whether a record already exists for the unique combination of: - Record - Message or web service action - Event type - State If, in the outbound queue, a record already exists for this unique combination, the record is not added to the outbound queue. |
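A minimal sketch of the duplicate check on the outbound queue (Python, for illustration; the real check runs against the outbound queue table in D365 FO, and the key values below are assumptions):

```python
# The outbound queue modelled as a set of unique keys:
# (record, message or web service action, event type, state).
outbound_queue = set()

def add_to_outbound_queue(record, action, event_type, state):
    """Add the record only if the unique combination is not in the queue yet."""
    key = (record, action, event_type, state)
    if key in outbound_queue:
        return False   # a record already exists for this combination: skip
    outbound_queue.add(key)
    return True

print(add_to_outbound_queue("SalesLine 000123-1", "Sales order export", "Update", "Pending"))  # True
print(add_to_outbound_queue("SalesLine 000123-1", "Sales order export", "Update", "Pending"))  # False
```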
Add document record fields - JSON |
Design | To each document record, add the data fields whose values must be exchanged. For external file-based documents, set up the fields in line with the naming in the file. For JSON documents, fields are always of type string, to enable type conversions. This topic explains how to add record fields to a JSON document. If fields are already initialized for the document, selected for the record, or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6. |
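For illustration, a hedged sketch of how a string-typed JSON field can be read and then converted to the typed value by a type conversion on the field mapping (Python; the payload and field names are assumptions):

```python
import json
from datetime import date

# Hypothetical sales line from a JSON file: every value arrives as text.
payload = json.loads('{"ItemId": "A0001", "Qty": "2.5", "DeliveryDate": "2024-06-30"}')

# The document field is defined as string; a type conversion on the field mapping
# turns the text into the typed value expected by the target field.
qty = float(payload["Qty"])
delivery_date = date.fromisoformat(payload["DeliveryDate"])
print(qty, delivery_date)  # 2.5 2024-06-30
```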
Add document record fields - EDI |
Design | To each document record, add the data fields whose values must be exchanged. For external file-based documents, set up the fields in line with the naming in the file. For EDI documents, fields are always of type string, to enable type conversions. This topic explains how to add record fields to an EDI document. If fields are already initialized for, selected for, or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6. |
Add document record fields - Internal documents |
Design | To each document record, add the data fields whose values must be exchanged. For internal documents, set up the fields in line with the naming in D365 FO. For internal documents, make sure the fields have the same type as in D365 FO. This topic explains how to add record fields to documents of these types: D365 FO, Journal, or Staging. If fields are already selected for or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6. |
Add document record fields - Microsoft Excel |
Design | To each document record, add the data fields whose values must be exchanged. For external file-based documents, set up the fields in line with the naming in the file. For Microsoft Excel documents, fields can be of any type. This topic explains how to add record fields to a Microsoft Excel document. If fields are already initialized for, selected for, or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6. |
Add document record fields - Microsoft Word |
Design | To each document record, add the data fields whose values must be exchanged. For external file-based documents, set up the fields in line with the naming in the file. For Microsoft Word documents, fields are always of type string, to enable type conversions. This topic explains how to add record fields to a Microsoft Word document. If fields are already selected for or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6. |
Add document record fields - Text |
Design | To each document record, add the data fields whose values must be exchanged. For external file-based documents, set up the fields in line with the naming in the file. For Text documents, fields are always of type string, to enable type conversions. This topic explains how to add record fields to a Text document. If fields are already initialized for, selected for, or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6. |
Add document record fields - XML |
Design | To each document record, add the data fields whose values must be exchanged. For external file-based documents, set up the fields in line with the naming in the file. For XML documents, fields are always of type string, to enable type conversions. This topic explains how to add record fields to an XML document. If fields are already initialized for the document, selected for the record, or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6. |
Add document records - EDI |
Design | To each document, add the data records to be exchanged. For external file-based documents, set up the records in line with how the data is structured and named in the file. This topic explains how to add records to an EDI document. |
Add document records - Fixed text |
Design | To each document, add the data records to be exchanged. For external file-based documents, set up the records in line with how the data is structured and named in the file. This topic explains how to add records to a Fixed text document. |
Add document records - Internal document |
Design | To each document, add the data records to be exchanged. For internal documents, set up the records in line with how the data is structured and named in D365 FO. This topic explains how to add records to documents of these types: D365 FO, Journal, or Staging. |
Add document records - JSON |
Design | To each document, add the data records to be exchanged. For external file-based documents, set up the records in line with how the data is structured and named in the file. This topic explains how to add records to a JSON document. If records are already initialized for the document, you can review and complete the setup for these records. To do so, instead of adding a record, select the desired record. |
Add document records - Microsoft Excel |
Design | To each document, add the data records to be exchanged. For external file-based documents, set up the records in line with how the data is structured and named in the file. This topic explains how to add records to a Microsoft Excel document. |
Add document records - Microsoft Word |
Design | To each document, add the data records to be exchanged. For external file-based documents, set up the records in line with how the data is structured and named in the file. This topic explains how to add records to a Microsoft Word document. |
Add document records - ODBC |
Design | To each document, add the data records to be exchanged. For ODBC documents, set up the records in line with how the data is structured and named in the external database. This topic explains how to add records to an ODBC document. |
Add document records - Text |
Design | To each document, add the data records to be exchanged. For external file-based documents, set up the records in line with how the data is structured and named in the file. This topic explains how to add records to a Text document. |
Add document records - XML |
Design | To each document, add the data records to be exchanged. For external file-based documents, set up the records in line with how the data is structured and named in the file. This topic explains how to add records to an XML document. If records are already initialized for the document, you can review and complete the setup for these records. To do so, instead of adding a record, select the desired record. |
Add document record fields - Fixed text |
Design | To each document record, add the data fields whose values must be exchanged. For external file-based documents, set up the fields in line with the naming in the file. For Fixed text documents, fields are always of type string, to enable type conversions. This topic explains how to add record fields to a Fixed text document. If fields are already selected for or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6. |
Add document record fields - ODBC |
Design | To each document record, add the data fields whose values must be exchanged. For ODBC documents, set up the fields in line with the naming in the external database. For ODBC documents, make sure the fields have the same type as in the external database. This topic explains how to add record fields to an ODBC document. If fields are already initialized for, selected for, or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6. |
Add file to Working folder |
Design | If you want to test an import message with an external file-based source document, you need a file with test data to be imported. This file must be available in the Working folder that is defined for the source connector. You can add the test data file to the Working folder in these ways:
|
Add mapping fields |
Design | Set up the field mapping for each record mapping of the message. On the field mapping, you define which target document record fields are mapped to which source document record fields. The resulting field mapping is used to get the right data from the source and get it in the right format to the right place in the target. To set up the field mapping, you can:
You can only use the fields, as defined for the related record in the target document. |
Add master data entity to task |
Design | For each task, you can define several master data entities. If you run a task, all defined master data entities are run. For more information on master data entities, refer to Master data management studio. |
Add message to task |
Design | For each task, you can define several messages. If you run a task, all defined messages are run. |
Add outbound web service action to task |
Design | For each task, you can define several outbound web service actions. If you run a task, all defined outbound web service actions are run. |
Add project to version management |
Analysis | If version management is activated in the Connectivity studio parameters, you can add projects to version management. So, version management is only applicable to the projects that you added to version management. If you add a project to version management, it is automatically checked out to make changes. To add the current project setup to version management as the first project version, check in the project. As a result, a version file is created and added to the folder as defined in the Business integration parameters.
|
Add root record to outbound queue |
Operation | If an event is logged for the root record of the applicable source document, the Redirect event setting is not applicable. On processing the data synchronization log, the logged event on the root record of the source document is added to the outbound queue. When the outbound queue is processed, the root record is exported. If, in the applicable document, child records are defined for the root record, these child records are exported as well. Note: Before a record is added to the outbound queue, a check is done whether a record already exists for the unique combination of: - Record - Message or web service action - Event type - Status If, in the outbound queue, a record already exists for this unique combination, the record is not added to the outbound queue. |
Add root record to outbound queue (Redirect) |
Operation | When the data synchronization log is processed, the Redirect event setting of the message or web service action is considered. Each applicable table event that occurs for a record is logged in the data synchronization log for that record and related table. On processing the data synchronization log, by default, logged events for a child record of the source document are added to the outbound queue for the root record of the source document. When the outbound queue is processed, the root record and its child records are exported. Example: The source document has a root record 'Sales order' and a child record 'Sales line'. An event is logged for a sales line. If Redirect event is set to 'Yes', the logged event on the sales line is not added to the outbound queue itself. Instead, the event is added to the outbound queue as logged on the sales order. When the outbound queue is processed, the sales order and all its sales lines are exported by the message. Note: Before a record is added to the outbound queue, a check is done whether a record already exists for the unique combination of: - Record - Message or web service action - Event type - State If, in the outbound queue, a record already exists for this unique combination, the record is not added to the outbound queue. |
Analyze the comparison results |
Design | As a result of a content package comparison, all differences between the selected content packages are shown on the Compare page. Compare results: On the Compare page, the Compare results tree shows where differences between the compared content packages exist. The Compare results tree has these types of entries:
Possible differences: For the selected entry in the Compare results tree, the related differences are shown in the Results pane. In the Results pane, for each difference, one of these messages is shown to describe the difference:
Note: "Field does not exist in source" and "Field does not exist in target" messages do not imply that the original data tables do not have the specified field. These messages only imply that the applicable message mapping removed this field from the resulting XML file. Analysis: Use the Source value and Target value fields to analyze the difference between the field values. You can choose one of these options to indicate how to solve a difference:
You can use the View source record and View target record buttons to view the full selected record details in XML format. The shown data is retrieved from either the source content package or the target content package. |
Analyze tracer |
Design | When you run a message for testing purposes, you can use a tracer to register what the message does when it is run. When the message has run using the tracer, you can review the registered actions in the tracer. Usually, you use this to find issues in the message run. |
Apply custom expression (Field option) |
Design | You can use an expression to modify the value. In the expression, use source fields, target fields, or mapping variables as variables.
Examples: You can use an expression to:
Note:
For more information on expressions, refer to: Expression.
|
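As an illustration of the kind of calculation a custom expression performs, the sketch below combines source fields and a mapping variable into one target value (Python, for illustration only; this is not the Connectivity studio expression syntax, and the field names are assumptions):

```python
# Source fields and a mapping variable that are available to the expression.
source = {"FirstName": "Anna", "LastName": "Jensen"}
variables = {"CompanyPrefix": "EU-"}

# Combine source fields and a mapping variable into one target value,
# comparable to what a custom expression does on a field mapping.
target_value = variables["CompanyPrefix"] + source["LastName"] + ", " + source["FirstName"]
print(target_value)  # EU-Jensen, Anna
```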
Apply custom handler (Field option) |
Design | You can use a handler class to set the target field value. Several handler classes are available. For example, to get the current date, to add the time to a date, or to get the current company. |
Apply dimension set (Field option) |
Design | On D365 FO tables, a financial dimension is expressed as a RecId that refers to a financial dimensions table. So, it does not reflect the financial dimension name and value. You can use the Dimension set option to get or set the financial dimension value based on the RecId and the dimension name (or number) as defined for the field option. If the message is used to:
Note: Only use the Dimension set field option if one of these fields is part of the field mapping:
|
Apply display method (Field option) |
Design | You can use a display method to get the applicable value. For example, you can use a display method to calculate a value. |
Apply edit method (Field option) |
Design | You can use an edit method to set the target field value. On import, the edit method changes the value in the target field of the D365 FO table. |
Apply external code |
Design | In a field mapping, you can apply an external code as defined for the related entity. As a result, on:
You can only use the external code setup in messages that are run in EDI studio. To use the external code functionality, in EDI studio, additional external code setup is required for the EDI parties or EDI groups. For each EDI party or EDI group, you can define an external code definition as set up for a:
|
Apply external reference (Field option) |
Design | You can link an external ID to a record ID in D365 FO. Together with the external ID, you can also link an external revision number to a record ID.
Note:
|
Apply inventory dimension (Field option) |
Design | On several D365 FO tables, inventory dimensions exist. Each inventory dimension is expressed as a RecId that refers to the Inventory dimensions (InventDim) table. So, it does not reflect the inventory dimension name and value. You can use the Inventory dimension option to get or set the inventory dimension value based on the RecId and the dimension name as defined for the field option.
Note:
|
Apply ledger (Field option) |
Design | To exchange journal data, you can get or set the account number for ledger transactions as a main account (Default), or use another field (Ledger) to indicate whether the value is, for example, a project, customer, or main account. Note: If the account type is Main account, only use the Ledger field option if one of these fields is part of the field mapping: LedgerDimension or OffsetLedgerDimension. |
Apply lookup (Field option) |
Design | You can use a lookup to get a value from another table and use it as the output value of this option. For example, you can get a value from a table that is not in the source document. The other table must have the source field as the single key field. The current value of the 'Modify a value' process is the input of the lookup. As a result, the value that is returned by the lookup is the value of the Return value field. Before determining the output value of this option, you can apply a type conversion or a transformation. |
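A minimal sketch of this lookup behavior (Python, for illustration; the table, key, and return field are assumptions):

```python
# Another table, keyed on a single key field that matches the source field.
item_group_table = {
    "A0001": {"ItemGroupId": "AUDIO", "Name": "Speakers"},
    "B0002": {"ItemGroupId": "VIDEO", "Name": "Television"},
}

def lookup(current_value, return_field):
    """Use the current value as the key and return the configured return field."""
    row = item_group_table.get(current_value)
    return row[return_field] if row else None

# The current value of the 'Modify a value' process is the input of the lookup.
print(lookup("A0001", "ItemGroupId"))  # AUDIO
```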
Apply mapping variable |
Design | During a message run, you can use mapping variables to temporarily store values. You can write (calculated) values to a mapping variable and, later during the message run, read the value from the mapping variable. You can use mapping variables across records. |
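A minimal sketch of how a mapping variable carries a value across records during one message run (Python, for illustration; the record names are assumptions):

```python
# Mapping variables live only for the duration of one message run.
mapping_variables = {}

def map_header_record(order_id):
    # Write a (calculated) value while the header record is mapped.
    mapping_variables["CurrentOrder"] = order_id

def map_line_record(line_number):
    # Read the value later in the same run, while a line record is mapped.
    return "{}-{}".format(mapping_variables["CurrentOrder"], line_number)

map_header_record("SO-000123")
print(map_line_record(1))    # SO-000123-1
mapping_variables.clear()    # values are deleted when the message run is finished
```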
Apply number sequence (Field option) |
Design | You can use a number sequence to get the field value. So, instead of the source field value, the next available number in the number sequence is the applicable value. |
Approve EDI Delfor journal |
EDI | If all errors and warnings of an EDI Delfor journal are solved, accepted, or canceled, you must approve the EDI Delfor journal. The EDI Delfor journal is again validated according to the applicable journal validation setup. If the applicable validation rules are met, the journal status of the EDI Delfor journal is set to Approved. Approved EDI Delfor journals are processed by the 'Sales (Delfor) - EDI Delfor journal to Order' message batch. |
Approve EDI inventory order |
EDI | If all errors and warnings of an EDI inventory order are solved, accepted, or canceled, you must approve the EDI inventory order. The EDI inventory order is again validated according to the applicable journal validation setup. If the applicable validation rules are met, the EDI inventory order journal status is set to Approved. Approved EDI inventory orders are processed by the applicable custom message: 'EDI inventory order to picking list registration' or 'EDI inventory order to product receipt'. |
Approve EDI purchase order confirmations |
EDI | If all errors and warnings of an EDI purchase order confirmation are solved, accepted, or canceled, you must approve the EDI purchase order confirmation. The EDI purchase order confirmation is again validated according to the applicable journal validation setup. If the applicable validation rules are met, the EDI purchase order confirmation journal status is set to Approved. Approved EDI purchase order confirmations are processed by the 'Purchase - EDI confirmation to Order' message batch. |
Approve EDI sales order |
EDI | If all errors and warnings of an EDI sales order are solved, accepted, or canceled, you must approve the EDI sales order. The EDI sales order is again validated according to the applicable journal validation setup. If the applicable validation rules are met, the EDI sales order journal status is set to Approved. Approved EDI sales orders are processed by the 'Sales - EDI order to Order' message batch. |
Approve staging journal |
Design | If all errors and warnings of a staging journal are solved, accepted, or canceled, approve the staging journal. The staging journal is again validated according to the applicable journal validation setup. If the applicable validation rules are met, the staging journal status is set to Approved. Approved staging journals can be further processed by the message that imports the staging journal records into D365 FO. |
Auto fix errors |
Design | If errors are found by the automated error check, you can first try to have these errors fixed automatically. For example, an error that can be fixed automatically: The field length of an internal document record field does not match the field length in the related table. In this case, the document record field length is changed to the table field length. You can auto-fix errors for:
In this activity, as an example, the steps explain how to auto-fix a document. Where applicable, notes are added to explain how to auto-fix the other types. |
Change project or related components |
Analysis | If a project is checked out, you can make the required changes to the project or to the related components. |
Check if event complies with source document setup |
Operation | For each applicable message, a check is done whether the record, for which the event is logged, complies with the source document of the message. Checks are done to verify whether the record complies with, for example, the applicable document record ranges and join modes. |
Check in a project |
Analysis | If version management is active and a project is checked out to make changes, check in the project to make the changes generally available for other environments. As a result, the changes to the project details and related components are stored as a new project version. The project version is stored as a file in the file storage folder as defined in the Connectivity studio parameters.
|
Check out a project |
Analysis | If version management is active, you must check out a project to make changes to the project or its components. You check out a project in an environment. Each user in this environment can make changes to the checked-out project. If a project is checked out, it cannot be checked out in another environment. You can only check out the latest project version. If you check out a project and the active version is not the latest version, you can choose to first get the latest version. If you do not get the latest version, the project is not checked out. |
Check relations |
Design | For internal documents and ODBC documents, the relation between the tables of a parent record and a child record is important. The relation is required to be able to query from parent to child records in the record tree. Check or define the relation between the table for the current record and the table of the parent record. Note: If you define the parent record for an internal document and a table relation exists with the current record's table, the first-found relation is automatically added to the current record. If the relation fields are not yet added to the parent record or the current record, these are automatically added. For an ODBC document, you must set the relations manually. |
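A minimal sketch of how such a parent-child relation drives the query from parent to child records (Python, for illustration; the SalesTable/SalesLine relation on SalesId is used as a typical D365 FO example):

```python
# Relation between the parent record's table and the child record's table.
relation = {"parent_field": "SalesId", "child_field": "SalesId"}

parent_rows = [{"SalesId": "SO-000123", "CustAccount": "CUST-001"}]
child_rows = [
    {"SalesId": "SO-000123", "LineNum": 1, "ItemId": "A0001"},
    {"SalesId": "SO-000999", "LineNum": 1, "ItemId": "B0002"},
]

# Query from parent to child: only lines whose relation field matches the parent.
for parent in parent_rows:
    lines = [line for line in child_rows
             if line[relation["child_field"]] == parent[relation["parent_field"]]]
    print(parent["SalesId"], lines)
```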
Check setup |
Design | If you have finished the setup, you can run a test to check for errors in the setup. You can do so for:
If an error is found, in the message bar, a message is shown indicating the error.
If, for an entity, an error exists or the setup is incomplete, an error icon is shown. You can click the icon to show the related error in the message bar.
In this activity, as an example, the steps explain how to check a document. Where applicable, notes are added to explain how to check the other types.
|
Clean up data changes log |
Design | On a message header, in the Log changes field, you can indicate whether data changes are logged. If set to 'Yes', on import, the D365 FO data that is changed during the import is logged. Changes are logged by field. Only the latest change is stored for each field. The data changes are logged in the History change log (BisConHistoryChangeLog) table. To view the logged data changes for a message, on the Message page, on the Action Pane, on the Operations tab, click Show changes. You can clean up the logged data changes manually or in recurring mode. |
Clean up generic staging journal table |
Design | In Connectivity studio, you can use the staging concept to validate data in an intermediate area before it is further processed. One predefined generic staging journal table is available for Connectivity studio: 'BisStagingBufferOrderJournal'. This topic explains how to clean up the 'BisStagingBufferOrderJournal' table. You can clean up the staging journals manually or in recurring mode. For example, you want to keep staging journals for six months. Each week, you can do a cleanup, deleting staging journals older than six months. |
Clean up history tables |
Deployment | Each integration run results in history records. History records can be logged for:
For example, you want to keep history records for six months. Each week, you can do a cleanup, deleting history records older than six months.
These tables are cleaned up:
|
Clean up outbound queue |
Deployment | On processing the outbound queue, for each record, the related message or web service action is run to export the applicable data. A successfully processed outbound queue record gets the status Processed. You can clean up the outbound queue by deleting the records with the status Processed. |
Clean up unused fields |
Design | If the document is linked to a message, you can clean up the fields. All fields that are not used in the message mapping are shown on a separate page. You can decide which of the unused fields you want to delete from the document. All messages to which the document is linked are checked. Only the fields that are not used in any of the field mappings of the related messages are shown as unused. You can use this, for example, if you have initialized the record or fields and all found fields are added to the document records. |
Combine logged events of same record and type |
Operation | In the data synchronization log, several events can be logged for a unique record and event type combination. To prevent processing duplicate events, all logged events for a unique record and event type combination are further processed as one event. Finally, this results in only unique records in the outbound queue. Example: A sales order is changed several times. So, several events of type Update are logged for the sales order. When the data synchronization log is processed, these events are first grouped on one page. When the page is processed, these events are combined into one event for the sales order for further processing. Note: If you split logged events over pages, combining events for a unique record and event type combination is done separately for each page. |
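A minimal sketch of combining logged events per unique record and event type combination (Python, for illustration; the log layout is an assumption):

```python
from collections import OrderedDict

# Data synchronization log entries for one page: (record, event type).
logged_events = [
    ("SalesTable SO-000123", "Update"),
    ("SalesTable SO-000123", "Update"),   # the same order was changed again
    ("SalesTable SO-000124", "Insert"),
]

# Combine duplicates: one event per unique record and event type combination.
combined = list(OrderedDict.fromkeys(logged_events))
print(combined)
# [('SalesTable SO-000123', 'Update'), ('SalesTable SO-000124', 'Insert')]
```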
Compare projects |
Analysis | You can compare a project for reviewing purposes.
You can compare a:
On the Compare projects page, the:
A record that is shown in the comparison can have one of these statuses:
|
Configure business events endpoint |
Design | For several endpoint types, you configure the business event endpoints:
For more information, refer to Manage business event endpoints. The tutorials, as referred to in this topic, give more detailed information. |
Configure the App Service web app |
Design | Before you can use the web site, as installed on the App Service web app, to connect to D365 FO, configure the web app. |
Configure the IIS application |
Design | Before you can use the web site to connect to D365 FO, configure the IIS application. |
Connect environment to Azure file share |
Design | To exchange files with the environment to which you connect, you can give it access to the Azure file share. To be able to access the Azure file share from another environment, you can mount the Azure file share on that environment.
For more information, refer to Use an Azure file share with Windows. |
Copy fields |
Design | You can copy fields from a record of another document. You can use this, for example, to save setup time if you use a specific record in several documents. You can only copy fields:
|
Copy message |
Design | You can create a copy of a message. For example, to troubleshoot a failing message that is already in operation, you can copy the message. So, you do not disrupt the operational data. |
Create an Azure Logic App as inbound web service |
Design | Create an Azure Logic App that serves as inbound web service for D365 FO. For more information, refer to the Azure Logic Apps documentation. You can, for example, create a logic app that picks up files when they are created in OneDrive and triggers the applicable web service action in D365 FO. |
Create an Azure Logic App as outbound web service |
Design | Create an Azure Logic App that serves as outbound web service for D365 FO. For more information, refer to Azure Logic Apps documentation. You can, for example, create a logic app that posts files to OneDrive. |
Create Azure Service Bus namespace |
Design | Create an Azure Service Bus namespace. For more information, refer to Create a Service Bus namespace using the Azure portal. Copy the Primary connection string and the Primary key of the namespace to a temporary location for later use. For more information, refer to Get the connection string. |
Create Azure SQL database |
Design | You can use a connector of type Database to connect to an Azure SQL database. For more information on Azure SQL databases, refer to Azure SQL Database. |
Create Azure Storage Account |
Design | You can use an Azure Storage Account to:
You can:
For more information, refer to Create an Azure Storage Account.
|
Create Blob container |
Design | For the Azure Storage Account, create a Blob container where data files are uploaded to or downloaded from. In the Blob container, create the folders that are required to exchange files. You can create folders that relate to the paths in the Properties section, the Read section, and the Write section of a connector of type 'Azure Blob storage':
For more information, refer to Azure Blob storage. |
Create content package |
Design | In the Environment comparison studio, you can compare data using content packages. To start a data comparison, you need two content packages. Generate each of these content packages based on the XML files in the applicable Azure Storage account folder from the desired company/environment combination. The generation process:
You can compare the generated content package with another content package.
|
Create file share for Azure file storage connector |
Design | For the Azure Storage Account, create a file share where data files are uploaded to or downloaded from. For the file share, you can create the required folders. You can create folders that relate to the paths in the Properties section, the Read section, and the File actions section of the connector:
For more information, refer to Create file share. |
Create file share for version management and history reporting |
General | For the Azure Storage Account, create a file share for the version management files and history reporting files. For the file share, you can create the required folders.
Create folders that relate to the paths in these sections of the Connectivity studio parameters:
For more information, refer to Create file share.
|
Create mapping variables |
Design | During a message run, you can use mapping variables to store values. You can write (calculated) values to a mapping variable, and later during the message run, read the value from the mapping variable. You can use mapping variables across records. The mapping variable values are only stored during the message run. When the message run is finished, the mapping variable values are deleted. During a message run, to:
|
Create message business event |
Design | If you want to create a business event to be triggered by a message, first create and set up a message business event. A message business event is a business event definition that is used to create a business event in D365 FO. For each message, you can only create one message business event. The business event is triggered when the message run is done, taking into account the 'Send business event' setting. |
Create Microsoft Word document from project |
Analysis | To analyze a project, you can create a Microsoft Word document based on the project. A summary of the project setup and related components setup is added to the document. You can use the document to review the setup. When created, the Microsoft Word document is downloaded to your local downloads folder. To create the document, the Microsoft Word template is used that is defined in the Connectivity studio parameters. |
Create on-premises Data Source Name (DSN) |
Design | Create a Data Source Name (DSN) on the external on-premises server where you installed the BIS Azure Service Bus Windows service. Note: If you connect from D365 FO on-premises to an external on-premises server, no Azure Service Bus is required. In this case, the Data Source Name (DSN) must be created on the D365 FO server. |
Create project |
Analysis | Use a project as the basis for each integration. For each project, you can also define the subprojects. You can use the subproject functionality to run multiple projects at once. The subprojects are run after you run the project. You can, for example, use subprojects for data migration. In this case, you set up a separate project for each applicable D365 FO module, for example, Sales and marketing, Procurement and sourcing, and Production control. Also, set up an 'umbrella' project and link the module-specific projects to it as subprojects. To run the data migration, you run the 'umbrella' project and all subprojects are automatically run as well. |
Create related record |
Design | You can add a record based on an existing table relation in D365 FO. You can only create a related record for:
Example: The record CustTable has a field CustGroup. This CustGroup field is part of the table relation between the CustTable and CustGroup tables. Create a related record for the CustGroup field of the CustTable record. As a result, the CustGroup record is created and added as a child record to the CustTable record. To the CustGroup record, the mandatory fields of the CustGroup table are added. |
Create secret reference |
Design | You can create secret references to store secrets at a central place in Connectivity studio. Wherever you need a secret in Connectivity studio, you can use a secret reference. Benefits of using secret references are:
|
Create task |
Design | You can use tasks to set up the execution of:
|
Create test case for export message |
Design | You can use test cases to automate the testing of an integration or data migration setup. For each message, you can create the desired test cases. This topic explains how to create a test case for an export message; the message source document is an internal document. |
Create test case for import message |
Design | You can use test cases to automate the testing of an integration or data migration setup. For each message, you can create the desired test cases. This topic explains how to create a test case for an import message; the message target document is an internal document. |
Create test project |
Design | If you want to use automated testing, the best practice is to use separate projects for testing and for the actual integration or data migration. Create a test project in the same way as an integration or data migration project. For the test project, only set up test tasks. |
Data is written to target |
Operation | Based on the target document, the mapped data is written to the target of the message. |
Data mapping is done |
Operation | Based on the record mapping and field mapping setup of the message, the source data is mapped to the target records and fields. |
Data validations are done in the staging journal |
Operation | In Connectivity studio, you can use the staging concept to validate data in an intermediate area before it is further processed. This is usually used to import data into D365 FO from another system. In this way, you can validate the data before it is written into the D365 FO database. Usually, the data validations are done automatically. |
Define data migration setup |
General | When you have selected the AX2012 tables whose data you want to migrate to D365 FO, the related data migration setup records are created. Complete the created data migration records. |
Define field mapping sequence |
Design | On import, the business logic in D365 FO calls the ModifiedField method. This method can set or change other values. If the field mapping sequence is not right, it can reset values that were just imported.
Tip: Set field mappings in the sequence in which you fill in the fields in the related form. |
Define field sequence |
Design | The sequence in which you set up the fields for the record defines the sequence in which the related data is exchanged. If required, you can change the sequence of the fields. |
Define project applications |
Analysis | On the project, define the applications that are involved in the project. If you define an application for a connector, document, or type conversion, only the applications of the applicable project are available. The project applications are also used during export and import of the project. During project:
|
Define range |
Design | For each record, you can define the range of data that is queried for export or import. For example, you only want to export sales orders for a specific customer group. To do so, on the Range tab, add a record for the CustGroup field. For more information on how to define ranges in the Range field, refer to Advanced filtering and query syntax. |
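Some typical values for such a range on the CustGroup field, using the standard query syntax (illustrative values):

```
10        only customer group 10
10,30     customer group 10 or 30
10..20    customer groups 10 through 20
!40       all customer groups except 40
```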
Define record mapping query settings |
Design | When you run a message, a query is applied. For this query, you can define several specific settings:
|
Define record mapping sequence |
Design | You can change the sequence of the record mappings to make sure the related data is exchanged in the right order. |
Define record sequence and structure |
Design | You can organize the business document records in these ways:
|
Define sorting |
Design | For internal documents and ODBC documents, you can define the order in which the data in the record is queried and processed during export or import. For example, to export sales orders, you want the sales orders to be queried by customer and, for each customer, by delivery date. To do so, on the Sorting tab, add a record for both the CustAccount field and the DeliveryDate field, in this sequence. |
Define task dependencies |
Design | If a task depends on one or more other tasks, define the dependencies. So, the task is not done before the other tasks are done. The dependencies are only taken into account if you run a project. You can use task dependencies to schedule data import or export in batch. The main reasons to use task dependencies are:
For each data level, you can set up one or more tasks. To define the level of a task, add a dependency to a task of the applicable previous level. As a result, the task level is automatically assigned. For example, if you add a dependency to a level '2' task, automatically, level '3' is assigned to the current task. To each task, assign the messages that process the data for the task. You can group messages in tasks as desired. The next picture gives an example of a data migration project. The project is run using the defined task dependencies. As a result, the data is migrated in the required sequence and with a better performance. Example: A sales order can only be imported if the related customer and item are already available. So, the customer and item must be imported first. In the previous picture, the customer is imported in level 2 and the item in level 3. This is done before the sales order header is imported in level 4 and the sales order line in level 5. |
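A minimal sketch of how task levels follow from the defined dependencies and how tasks then run level by level (Python, for illustration; the task names and dependencies are assumptions):

```python
# Each task lists the tasks it depends on; its level is one higher than the
# highest level among its dependencies.
dependencies = {
    "Customer groups": [],
    "Item groups": [],
    "Customers": ["Customer groups"],
    "Items": ["Item groups"],
    "Sales order headers": ["Customers", "Items"],
    "Sales order lines": ["Sales order headers"],
}

def level(task):
    deps = dependencies[task]
    return 1 if not deps else 1 + max(level(d) for d in deps)

# Tasks run level by level: a task does not start before its dependencies are done.
for task in sorted(dependencies, key=level):
    print(level(task), task)
```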
Define transformation for mapping field |
Design | You can use a transformation to change a source value into another value. If a transformation is required for a field mapping, add the relevant transformation to the field mapping. |
Define type conversion for field mapping |
Design | You can use a type conversion to convert the data to match the format as required in the target. With a type conversion, you can convert values from any type to string or from string to any type. Usually, the string value is the external value. If a type conversion is required for a field mapping, add the relevant type conversion to the field mapping. |
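A minimal sketch of such a type conversion between an external string value and a typed value (Python, for illustration; the external date format is an assumption):

```python
from datetime import datetime, date

def string_to_date(value):
    """Convert the external string value to a typed date for the target field."""
    return datetime.strptime(value, "%d-%m-%Y").date()

def date_to_string(value):
    """Convert a typed date to the external string format on export."""
    return value.strftime("%d-%m-%Y")

print(string_to_date("31-01-2024"))       # 2024-01-31
print(date_to_string(date(2024, 1, 31)))  # 31-01-2024
```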
Delete Connectivity studio parameters |
General | If you have copied a database to the current environment and you want to manually reset the Connectivity studio parameters, first delete these parameters. As a result, all parameter settings are deleted, and a new environment ID is generated.
If the imported parameters are fine, you can also decide to only reset the Environment ID. As a result, only these fields are reset:
|
Delete document |
Deployment | On the Document page, you cannot delete a document if it is used on a message or if it has records and fields. However, if such a document is no longer desired, to clean up your environment, you can still delete it. On delete, only the document is deleted. Usually, you do not clean up your documents frequently. Beware: Cleaning up your documents results in a hard delete of the applicable documents. Note that the messages in which the document is used cannot run until another document is linked to the message. So, be very careful when you use the clean-up functionality. |
Delete message |
Deployment | On the Message page, you cannot delete a message if it is used or if it has field mappings. A message can be used on, for example, a task, a web service action, or an EDI document flow. However, if such a message is no longer desired, to clean up your environment, you can still delete it. On delete, only the message is deleted. Usually, you do not clean up your messages frequently. Beware: Cleaning up your messages results in a hard delete of the applicable messages. Note that deleting a message has a consequence for the entities where it was used. For example, a web service action cannot run properly until another message is linked. So, be very careful when you use the clean-up functionality. |
Delete project |
Deployment | On the Project page, you cannot delete a project if it is used as a subproject or if it has children, like tasks. However, if such a project is no longer desired, to clean up your environment, you can still delete it. On delete, everything that belongs to the project is deleted. Usually, you do not clean up your projects frequently. Beware: Cleaning up your projects results in a hard delete of the applicable projects. So, be very careful when you use the clean-up functionality. |
Delete web service action |
Deployment | On the Web service action page, you cannot delete a web service action if it is used or if it has attributes or arguments. A web service action can be used on, for example, a task or an EDI document flow. However, if such a web service action is no longer desired, to clean up your environment, you can still delete it. On delete, only the web service action is deleted. Usually, you do not clean up your web service actions frequently. Beware: Cleaning up your web service actions results in a hard delete of the applicable web service actions. Note that deleting a web service action has a consequence for the entities where it was used. For example, an EDI document flow cannot run properly until another web service action is linked. So, be very careful when you use the clean-up functionality. |
Develop external code to run a custom service |
Design | You can develop custom code outside D365 FO to directly run a custom service that is provided for Connectivity studio web services. To use the external custom code to run a:
|
Download BIS Documentation Template and move it to Shared path |
General | To analyze a project, you can create a Microsoft Word document based on the project. A summary of the project setup and related components setup is added to the document. You can use the document to review the setup. To create the document, you must define the applicable Microsoft Word template in the Connectivity studio parameters. To do so, first download the BIS Documentation Template.dotx template. Move the downloaded template to the folder as defined in the Connectivity studio parameters, in the Version management section, in the Shared path field. |
Download transactions |
Deployment | When a project is exported, the transaction and created XML file are stored in the Download transactions. Download the XML file to your local download folder. |
Events are logged in the Data synchronization log |
Operation | For messages and web service actions, you can use table events to track data changes. You can define, for each table, which table events are logged. The table events are logged in the Data synchronization log. |
Export Connectivity studio parameters |
General | If you want to copy a database to the current environment and you want to keep the Connectivity studio parameters, first export these parameters. As a result, an XML file with the parameter settings is created and downloaded to your local downloads folder.
|
Export files for comparison |
Design | To compare data, first export your data from the applicable D365 FO environment to XML files. You can use XML files to generate a content package. To generate an XML file for a content package, run the applicable message. As a result, an XML file with the content, as defined by the message, is generated and placed in a specified folder on your Azure Storage account. You can run several messages to export different sets of data to XML files. You can generate a content package based on several XML files. In this case, make sure all desired XML files for the content package are placed in the same Azure Storage account folder.
|
Export project to file |
Deployment | To exchange a configuration, you can export a project. All elements that are related to the project are exported. Transactions and history are not exported. For example, the related messages and web service actions are exported, but the message history and web service history are not exported. Which settings are included in the export is defined by the Export options parameter in the Connectivity studio parameters. If you export a project, and the export option is:
As a result of the export, an XML file is created with the project configuration. This topic explains how to export a project configuration to a file. |
Export released project as data entity with Data management framework |
Analysis | If you have released a Connectivity studio project, a project release record is created in the BisReleaseTable table that contains an XML file with the data of the project and all related components. You can export this project release as a data entity with the Data management framework. |
Export secret references |
Design | You can export secret references from a D365 FO environment to be imported in another D365 FO environment. If you export a project, the secret references are not exported. If an export of the secret references is required, you can export the secret references separately. Usually, you only export and import secret references to a D365 FO environment of the same type. For example, you only export secret references from a Development environment to import these in another Development environment. Reason: You don't want to mix up data. For example, you don't want to mix up Test data with Production data. |
Find request message of applicable web service action |
Operation | If a web service action is 'subscribed' to a processed event, find the request message that is defined for the web service action. This request message is used to check if the event complies with the source document of the message. |
Generate data migration message |
General | You can generate messages based on the data migration setup records. If you generate a message, this is generated:
Be aware that a generated message often needs some fine-tuning due to differences between the AX2012 table and the D365 FO table.
|
Generate tasks for data migration project |
General | When you have reviewed and completed the data migration setup records with the related messages and documents, you can generate tasks based on the records.
The tasks are created based on the areas and sublevels as assigned to the data migration setup records in this way:
To each created task, the message is added as defined for the related data migration setup record. Note: If records exist with the same area and area sublevel, only one task is created based on these records. And for all these records, the messages are added to this one task.
On task generation, when a message is added to a task, and the related data migration setup record is:
Example: For a data migration project, these data migration setup records are used:
Note the use of the status, areas, area sublevels, and record activation.
The generated task structure, as shown on the Project page, is:
The generated tasks with added messages, and the set message actions are:
|
Get latest project version |
Analysis | If, in an environment, the current project version is not the latest project version, you can get the latest project version. As a result, the current project version is replaced with the latest project version. If you have checked out a project and made changes in the current environment, you can undo these changes with Get latest. In this case, the checkout of the project is also undone. |
Import Connectivity studio parameters |
General | If you have copied a database to the current environment and you first exported the Connectivity studio parameters, you can import these parameters. As a result, the Connectivity studio parameters are automatically reset with the original environment parameters.
|
Import ECS result into D365 F&O |
Design | When you process comparison results, for each table, a separate XML file is created. By default, the generated XML files are stored in the Import folder. You can import the files into D365 FO to create the desired environment setup. The import message, with the ECS Azure file storage connector as source connector, gets the XML files from the Import folder. You can:
|
||||||||||||||||||||||||||||||||||
Import Logic App tutorial |
Design | If you want to run an inbound web service or outbound web service in the cloud using Azure Logic Apps, first download and extract the Project Tutorial - Logic App.zip file. This file contains a project with an inbound web service action and an outbound web service action. | ||||||||||||||||||||||||||||||||||
Import project data entity with Data management framework |
Analysis | If you have exported a project release as a data entity, you can import it in another environment with the Data management framework. As a result, a project release record is created in the BisReleaseTable table that contains an XML file with the data of the imported project and all related components. To view the created project release, go to Connectivity studio > Periodic > Version management > Release. |
||||||||||||||||||||||||||||||||||
Import project from file |
Deployment | To exchange a configuration, you can export a project. All elements that are related to the project are exported. Transactions and history are not exported. For example, the related messages and web service actions are exported, but the message history and web service history are not exported. As a result of the export, an XML file is created with the project configuration. You can use this XML file to import the configuration, for example, in another environment. This topic explains how to import a project configuration from a file. |
||||||||||||||||||||||||||||||||||
Import project from resource |
Deployment | Several project configurations are included as resources in the Connectivity studio release. If the Connectivity studio is deployed, you can import these project configurations from the resources. |
||||||||||||||||||||||||||||||||||
Import received Service Bus data |
Operation | You can import data from a Service Bus queue or topic subscription. First, you must receive the data from the Service Bus queue or topic subscription. On receiving data from the Service Bus, based on the Service Bus search definitions and settings on the received data, import messages are automatically assigned to the received data records.
When the data is received, you can import the received data from the 'Received data from queue' table into D365 FO. To import the received data, run the messages as assigned to the received data records. You can run the messages in several ways, as desired. You can run a message by running a:
If an import message run finishes:
Note: The assigned import messages must have the 'Service Bus queue' connector as source connector.
|
||||||||||||||||||||||||||||||||||
Import secret references |
Design | You can import secret references that are exported from another D365 FO environment. If you import a project, the secret references are not included. If the related secret references are required, you can import the secret references separately. Usually, you only export and import secret references to a D365 FO environment of the same type. For example, you only export secret references from a Development environment to import these in another Development environment. Reason: You don't want to mix up data. For example, you don't want to mix up Test data with Production data. |
||||||||||||||||||||||||||||||||||
Inbound web service action is triggered |
Operation | The inbound web service application triggers the applicable inbound web service action.
Based on these parameters in the HTTP request, the inbound web service calls the executeWebserviceOperation method that determines which web service action is triggered:
|
||||||||||||||||||||||||||||||||||
Inbound web service staging table records are processed in batch |
Operation | |||||||||||||||||||||||||||||||||||
Initialize document record fields |
Design | For documents of type Text or Microsoft Excel, you can initialize the fields for a record. To initialize record fields for:
When the initialization is finished, review and complete the properties of the initialized fields. Usually, during review, you do not add fields. However, you can remove fields that are not needed. |
||||||||||||||||||||||||||||||||||
Initialize document record fields - ODBC |
Design | For documents of type ODBC, you can initialize the fields for a record. To initialize record fields for an ODBC document, use a connector of type Database to connect to the applicable external database. Make sure the name in the Record table field exactly matches the name of the relevant table in the external database.
The fields are added to the record based on the fields of the external table.
When the initialization is finished, review and complete the properties of the initialized fields. Usually, during review, you do not add fields. However, you can remove fields that are not needed. |
||||||||||||||||||||||||||||||||||
Initialize document records and fields |
Design | For documents of type XML or JSON, you can initialize the records based on an input file. To initialize records and fields for:
On initializing records, the fields that are defined in the input file are also initialized. If you initialize based on:
When the initialization is finished, review and complete the properties of the initialized records and fields. Usually, during review, you do not add records or fields. However, you can remove records and fields that are not needed. |
||||||||||||||||||||||||||||||||||
Initialize form mapping from message mapping |
Design | You can initialize a form mapping from an existing message. As a result, the record mapping and field mapping of the message are translated to a form mapping. You can do this, for example, to bridge the gap between functional users and technical users. |
||||||||||||||||||||||||||||||||||
Initialize message mapping from form mapping |
Design | If you have finished the form mapping recording and reviewed it, you can use this to initialize the related message mapping. As a result of the initialization, based on the form mapping:
After initialization, you can edit and fine-tune the internal document and the message. |
||||||||||||||||||||||||||||||||||
Initialize project in environment |
Analysis | The main purpose of 'Get latest' on the Project version management page is to initialize a version-controlled project in the current environment. So, if a project is not available in the current environment, you can select it from the list and make it available. Besides this, you can also use 'Get latest' in these cases:
|
||||||||||||||||||||||||||||||||||
Install Azure Storage Explorer and connect to an Azure Storage Account |
Design | You can use the Azure Storage Explorer to connect to and manage your Azure Storage Accounts.
First download and install the Azure Storage Explorer and then connect to a storage account.
To connect, use a storage account name and key. You can find the name and key on the Azure Portal > Storage accounts > Access keys.
When connected, and you use the Azure Storage Explorer for:
For more information, refer to Get started with Storage Explorer.
|
||||||||||||||||||||||||||||||||||
Install deployment tool |
Design | Run the IIS application deployment tool to install the web site. | ||||||||||||||||||||||||||||||||||
Install on-premises Windows service - BIS Azure Service Bus |
Design | Install the BIS Azure Service Bus as Windows service on the external on-premises server. To install the BIS Azure Service Bus, first download the BIS Azure Service Bus.zip file. |
||||||||||||||||||||||||||||||||||
Install XPO on AX2012 |
Design | In D365 FO, enums can be extendable. For extendable enums, the enum value numbering is not fixed. Therefore, an enum value number in AX2012 can be different from the number of the related enum value in D365 FO. To prevent a mismatch of enum values when migrating data, migrate enums based on the enum value name instead of the enum value number. To do so, on the AX2012 environment:
As a result, you can migrate the enum values based on the names instead of the numbers. |
||||||||||||||||||||||||||||||||||
Manually fix errors |
Design | If you have re-run the automated error check, and still errors exist in the setup that are not fixed automatically, manually fix these errors. | ||||||||||||||||||||||||||||||||||
Mark message run as Solved |
Operation | If the errors of a message run are solved, and no re-run in History management is required, the message run status is not automatically set to Finished. Therefore, you can mark the message run as Solved. As a result, the message run status is set to Finished. | ||||||||||||||||||||||||||||||||||
Merge message |
Design | You can troubleshoot a failing message using a copy of that message. When you have made changes to the message copy to solve the issue, the same changes must be applied to the original message. To do so, you can merge the message copy with the original message. When you merge a message (source) with another message (target), the:
Note: Use this merge option carefully. You cannot undo the overwrite action. |
||||||||||||||||||||||||||||||||||
Monitor data received from Service Bus dead letter queue |
Operation | You can monitor the data that you receive from an Azure Service Bus dead letter queue. The possible statuses are:
|
||||||||||||||||||||||||||||||||||
Monitor data received from Service Bus queue or topic subscription |
Operation | You can monitor the data that you receive from an Azure Service Bus queue or topic subscription.
Use the Received status to decide if any troubleshooting action is required.
The possible statuses are:
|
||||||||||||||||||||||||||||||||||
Monitor data sent to Service Bus |
Operation | On export of data to an Azure Service Bus queue or topic, a message is added to the Service Bus queue. Each message that is added to the Service Bus queue is logged in Connectivity studio. You can monitor this data, for example, for troubleshooting purposes.
Note:
By default, only records with errors are shown. If an error occurs on receipt of the Service Bus message, the Service Bus moves the message to the related dead letter queue. If you get the dead letter queue data from the Service Bus, and the message exists in the 'Data sent to queue' table, its received status is set to 'Returned with error'.
|
||||||||||||||||||||||||||||||||||
Monitor data synchronization log |
Operation | For messages and web service actions, you can use table events to track data changes. You can define, for each table, which table events are logged. The table events are logged in the Data synchronization log. If table events are not logged in the Data synchronization log as expected, you can check the data synchronization setup for your message or outbound web service action.
|
||||||||||||||||||||||||||||||||||
Monitor inbound web service staging table |
Design | Depending on the asynchronous execution mode of the inbound web service action, the inbound web service process runs directly or asynchronously. You can monitor the inbound web service staging table. |
||||||||||||||||||||||||||||||||||
Monitor project history |
Operation | The project history shows the project runs of the selected project. For each project run, for example, its status is shown and whether the project tasks have run with errors. |
||||||||||||||||||||||||||||||||||
Monitor project versions |
Analysis | To monitor project versions, you can open a Project version management page that shows:
|
||||||||||||||||||||||||||||||||||
Monitor task history |
Operation | Usually, you run a task by running the related project. However, you can also run a single task, for example, for testing purposes. |
||||||||||||||||||||||||||||||||||
Monitor web service action history |
Operation | When a web service action is run, you can view the web service history. The web service history shows, for example, the:
Possible issues can occur, for example, in the:
|
||||||||||||||||||||||||||||||||||
Open related purchase order for EDI purchase order confirmation |
EDI | For each EDI purchase order confirmation with message status Processed, you can open the related purchase order from the EDI purchase order confirmation journal. |
||||||||||||||||||||||||||||||||||
Open related sales order for EDI Delfor journal |
EDI | For each EDI Delfor journal with message status Processed, you can open the related sales order from the EDI Delfor journal. |
||||||||||||||||||||||||||||||||||
Open related sales order for EDI sales order |
EDI | For each EDI sales order with message status Processed, you can open the related sales order from the EDI sales order journal. | ||||||||||||||||||||||||||||||||||
Outbound queue is processed |
Operation | For messages and web service actions, you can use table events to track data changes. You can define, for each table, which table events are logged. The table events are logged in the data synchronization log. On processing the data synchronization log, based on the logged events, records are added to the outbound queue. On processing the outbound queue, for each record, the related message or web service action is run to export the applicable data. |
||||||||||||||||||||||||||||||||||
Process data synchronization log |
Operation | For messages and web service actions, you can use table events to track data changes. You can define, for each table, which table events are logged. The table events are logged in the Data synchronization log. To fully process the logged events, process the:
This topic explains how to process the data synchronization log.
|
Number of pages | Page size | Limit records | Number of records to be processed | Result |
---|---|---|---|---|
5 | 10,000 | No | 40,000 | 4 pages of 10,000 records are processed |
5 | 10,000 | Yes | 40,000 | 4 pages of 10,000 records are processed |
5 | 10,000 | No | 80,000 | 5 pages of 16,000 records are processed |
5 | 10,000 | Yes | 80,000 | 5 pages of 10,000 records are processed |
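The interplay of these settings can be hard to read from the table alone. The following is a minimal Python sketch, derived only from the example rows above and not from the product code, of how the number of pages and the effective page size appear to be determined.

```python
import math

def plan_pages(number_of_pages: int, page_size: int, limit_records: bool, records_to_process: int):
    """Return (pages used, effective page size, records processed) for one run."""
    capacity = number_of_pages * page_size
    if records_to_process <= capacity:
        # Everything fits: only as many pages as needed, at the configured page size.
        pages = math.ceil(records_to_process / page_size)
        return pages, page_size, records_to_process
    if limit_records:
        # Capped: at most 'number of pages' pages of 'page size' records are processed.
        return number_of_pages, page_size, capacity
    # Not capped: keep the number of pages and grow the page size to cover all records.
    effective_size = math.ceil(records_to_process / number_of_pages)
    return number_of_pages, effective_size, records_to_process

# The four example rows from the table above:
print(plan_pages(5, 10_000, False, 40_000))  # (4, 10000, 40000)
print(plan_pages(5, 10_000, True, 40_000))   # (4, 10000, 40000)
print(plan_pages(5, 10_000, False, 80_000))  # (5, 16000, 80000)
print(plan_pages(5, 10_000, True, 80_000))   # (5, 10000, 50000)
```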
Process outbound queue
This topic explains how to process the outbound queue.
In the outbound queue, several entries can exist for a unique message or webservice action. How these entries are processed, depends on whether bundling is active. You can activate bundling on the data synchronization setup of an outbound:
If the bundling field is set to:
Process received dead letter data
If a receiver reads a message from a Service Bus queue or topic subscription, an error can occur. If so, the Service Bus moves the message to the applicable dead letter queue.
You can receive the Service Bus dead letter queue data. When received, you can further process this data into D365 FO. For example, you can set a Production order 'On hold' when a dead letter is received on the export of a production order.
To further process the received dead letter data, run the messages as assigned to the received dead letter data records. You can run the messages in several ways, as desired. You can run a message by running a:
Process using source
When you analyze data differences, you can edit a record value manually. To process your analysis results, you can choose to use the source value. As a result, in the created XML file, for the fields to be edited manually, the source value is used as field value. You can manually edit the value in the resulting XML file.
Process using target
When you analyze data differences, you can edit a record value manually. To process your analysis results, you can choose to use the target value. As a result, in the created XML file, for the fields to be edited manually, the target value is used as field value. You can manually edit the value in the resulting XML file.
Publish imported project release
Reassign message to data received from Service Bus dead letter queue
Reassign message to data received from Service Bus queue or topic subscription
Receive data from Service Bus dead letter queue
If a receiver reads a message from a Service Bus queue or topic subscription, an error can occur. If so, the Service Bus moves the message to the applicable dead letter queue.
You can receive the messages from the Service Bus dead letter queue.
On receiving data from the Service Bus dead letter queue, based on the Service Bus search definitions and settings on the received data, messages are automatically assigned to the received dead letter data records. The assigned messages are used to further process the dead letter data into D365 FO. For example, you can set a Production order 'On hold' when a dead letter is received on the export of a production order.
Receive data from Service Bus queue or topic subscription
Record form mapping
Record is created in the Inbound web service staging table
Register app in Microsoft Entra ID to connect to Azure Storage
If you want to connect to the Azure Blob storage with an authentication of type 'Password' or 'Client credentials', register an app in Microsoft Entra ID and configure the Azure Storage permissions.
Register application with Microsoft Entra ID
Register a native web application in Microsoft Entra ID to access D365 FO. For more information, refer to Register an application with the Microsoft identity platform.
Register Azure Logic App in Microsoft Entra ID
To be able to fill in the Client ID and the Secret in the HTTP action settings of the Azure Logic App, register the Azure Logic App in Microsoft Entra ID. For more information, refer to: Register an app.
Register Microsoft Entra ID application in D365 FO
To connect to D365 FO, you must register the Microsoft Entra ID application in D365 FO.
Release project
Repair header info in project version files
In unusual cases, the header information in the files on the file storage can become corrupted. You can repair the header information in these files based on the project version management table.
Repair project version check-in status on file storage
Re-run file actions
On a message, you can use a connector of type Azure file storage. For an Azure file storage connector, you can set up several file actions. These file actions are run in the defined sequence. If one of the file actions fails, the next file actions are not run.
Re-run message run
If you have solved the errors in a message run, you can run it again. The message is re-run in the company in which it was originally run. As a result:
Reset outbound queue record status to New
Reset status of data received from Service Bus queue or topic subscription
Reset status of data received from Service Bus dead letter queue
Restore project version
In an environment, you can restore another project version. As a result, the current project version is replaced with the selected project version.
You can restore a project version for these reasons:
Review and complete data migration setup
Review form mapping
Run inbound web service staging records
Run message
Run message for testing purposes
You can run a message for testing and troubleshooting purposes. You can find and analyze the results in the:
Run message from action menu item
Run message or web service action from dynamic button
You can add a dynamic button to a form to run a message or an outbound web service action from the form.
To add a dynamic button, no coding is required.
The dynamic button is added to the Business integration tab of the ActionPane of the form.
Run message test cases
To check if all test cases of a message work properly, you can run the message test cases. So, no full message run is done. Only the test cases, as defined for the message, are run.
Run project
Run task
Run test case
Run web service action
If you want to run an outbound web service action immediately, you can run it from the Web service action page.
Run web service action from action menu item
Search for applicable messages and web service actions
Select fields
You can add a selection of table fields to a record. You can select fields from the D365 FO table that is defined in the Record table field.
This is mainly applicable to internal documents. However, you can also use this to quickly set up fields for external file-based documents.
When the field selection is added to the record, review and complete the properties of the added fields.
Select fields - ODBC
You can add a selection of table fields to a record.
You can select fields from an external table via ODBC. To connect to the external environment, the default connector of type Database is used. You can set up this default connector for the applicable project.
To find the external table name, the name that is defined in the Record table field is used.
When the field selection is added to the record, review and complete the properties of the added fields.
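For reference, this is roughly what reading the field list of an external table over ODBC looks like outside Connectivity studio, using the Python pyodbc package. The connection string and the table name ('CUSTOMERS') are placeholders, and the snippet only illustrates the concept.

```python
import pyodbc

# Placeholder ODBC connection string for the external database.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=<server>;DATABASE=<db>;UID=<user>;PWD=<password>"
)
cursor = conn.cursor()

# The table name corresponds to the value in the Record table field; 'CUSTOMERS' is hypothetical.
for column in cursor.columns(table="CUSTOMERS"):
    print(column.column_name, column.type_name)
```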
Select tables to be mapped
For a data migration from AX2012 to D365 FO, you can generate messages based on data migration setup records. Before you can do so, select the AX2012 tables whose data you want to migrate to D365 FO. As a result, the related data migration setup records are created with the selected AX2012 tables as source table. If a table with the same name exists in D365 FO, it is automatically set as the target table.
Select the content packages for comparison
To compare data, select two content packages. If a desired content package does not exist, import the content package. As a result, a comparison is created.
Set Connectivity studio parameters
Before you start using Connectivity studio, set up the Connectivity studio parameters.
Define parameters for the:
Set Data quality studio integration options
You can apply data quality policy rules on import of data into D365 FO with a Connectivity studio message. On the message header, define which types of rules are applied on data import with the message.
For more information, refer to Apply data quality rules on data import with Connectivity studio.
Set up applications
Define the applications that are involved in integration or data migration projects. You can link an application to several projects.
Set up Azure file storage connector
Set up a connector of type Azure file storage. Use this type to exchange data files between your D365 FO environment (on-cloud or on-premises) and another environment, for example an on-premises environment.
With the Azure file storage type connector, you can exchange these external file-based documents: EDI, Fixed text, Microsoft Word, Microsoft Excel, Text, XML, JSON.
You can exchange data files using one of these file systems:
File system | Description
---|---
Azure File Storage | You can use an Azure Storage Account to exchange data files between your D365 FO environment (on-cloud or on-premises) and another environment, for example an on-premises environment.
Local folders | If you use Connectivity studio on a D365 FO (on-premises) environment, you can choose to use local Windows folders to exchange data files.
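For orientation, writing a data file to an Azure file share from outside Connectivity studio looks roughly like this with the Python azure-storage-file-share SDK. The connection string, share name, and folder name are placeholders; this is a sketch of the file exchange pattern, not part of the connector setup.

```python
from azure.core.exceptions import ResourceExistsError
from azure.storage.fileshare import ShareClient

conn_str = "<storage account connection string>"  # placeholder
share = ShareClient.from_connection_string(conn_str, share_name="<share name>")

# Hypothetical Export folder on the share; create it if it does not exist yet.
directory = share.get_directory_client("Export")
try:
    directory.create_directory()
except ResourceExistsError:
    pass

# Upload a data file into the folder.
directory.get_file_client("sales.xml").upload_file(b"<SalesOrders/>")
```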
Set up Azure Service Bus namespace - Queue
You can use a connector of type 'Service Bus queue' to exchange information via an Azure Service bus queue. A Service Bus queue provides First In, First Out (FIFO) message delivery to one or more competing consumers. That is, receivers typically receive and process messages in the order in which they were added to the queue. And only one message consumer receives and processes each message.
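For orientation, this is what the same queue exchange looks like from outside Connectivity studio, using the Python azure-servicebus SDK. The connection string and queue name are placeholders; the snippet is a sketch of the FIFO send/receive pattern, not part of the connector setup.

```python
from azure.servicebus import ServiceBusClient, ServiceBusMessage

conn_str = "<Service Bus connection string>"  # placeholder
queue_name = "<queue name>"                   # placeholder

with ServiceBusClient.from_connection_string(conn_str) as client:
    # Sender: messages are appended to the queue in order.
    with client.get_queue_sender(queue_name) as sender:
        sender.send_messages(ServiceBusMessage('{"salesId": "SO-001"}'))

    # Receiver: messages are delivered in FIFO order; completing a message removes it,
    # so only one consumer processes each message.
    with client.get_queue_receiver(queue_name, max_wait_time=5) as receiver:
        for msg in receiver:
            print(str(msg))
            receiver.complete_message(msg)
```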
Set up Azure Service Bus namespace - Topic and subscriptions
Set up Azure Web App
Set up the Azure App Service Web App that is used to manage the web service. For more information, refer to App Service documentation .
Set up Blob storage connector
Set up a connector of type Blob storage to exchange data files between your D365 FO environment and another environment, using Azure Blob storage. Azure Blob storage is Microsoft's object storage solution for the cloud. Blob storage is optimized for storing massive amounts of unstructured data.
With the Blob storage type connector, you can exchange these external file-based documents: EDI, Fixed text, Microsoft Word, Microsoft Excel, Text, XML, JSON. You can only use this connector in combination with a document for which the version 3 (V3) handler class is selected.
You can use one of these authentication methods to access the Blob storage:
Set up company-specific field mapping
Set up D365 FO connector
Set up a connector of type D365 FO. Use this type to directly connect to a D365 FO database.
Set up data migration areas
Set up data migration statuses
Set up data synchronization - Date range - Message
Set up data synchronization - Date range - Web service
For outbound web service actions, you can use the data synchronization setup to define which records are processed.
Web service data synchronization only applies to outbound web service actions for which data synchronization is set up. If no data synchronization is set up, the web service action must be run in another way.
This topic explains how to use a date range to export only the records that are changed or added since the latest web service action run.
Make sure the root record of the source document of the request message has a date/time field that indicates when the record was last changed, for example a 'modifiedDateTime' field. If you run the outbound web service action, all records that are found, based on the source document setup, are considered. For each found root record, the date/time is compared with the Latest run date/time of the outbound web service action. Only records with a date/time later than the latest run date/time of the outbound web service action are exported.
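As a rough illustration of this date range filter (a sketch with invented record data, not the actual implementation; the 'modifiedDateTime' field name is taken from the example above):

```python
from datetime import datetime

# Invented root records of the source document.
records = [
    {"SalesId": "SO-001", "modifiedDateTime": datetime(2024, 3, 1, 8, 0)},
    {"SalesId": "SO-002", "modifiedDateTime": datetime(2024, 3, 5, 14, 30)},
]

# Latest run date/time of the outbound web service action.
latest_run = datetime(2024, 3, 2, 0, 0)

# Only root records changed after the latest run date/time are exported.
to_export = [r for r in records if r["modifiedDateTime"] > latest_run]
print([r["SalesId"] for r in to_export])  # ['SO-002']
```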
Set up data synchronization - Table events - Message
Set up data synchronization - Table events - Outbound web service action
Set up Database connector
Set up a connector of type Database. Use this type to directly connect to an external database. This external database can be an on-premises database or a cloud database.
You can connect to an external database with an:
Set up default connector for project
Set up default connectors for data migration project
Set up default response text
{"response": "We have received the request."}
Set up document - D365 FO
Set up document - EDI
Set up document - Fixed text
Set up document - Inventory journal
Set up document - JSON
Set up document - Ledger journal
Set up document - Microsoft Excel
Use a Microsoft Excel document to read data from or write data to a Microsoft Excel file (XLSX).
Set up document - Microsoft Word
In Connectivity studio, use a Microsoft Word document to write data to a Microsoft Word document (DOCX) using a Microsoft Word template (DOTX).
With a Microsoft Word document, you can, for example, add data to text, include contract text, support multi-language output, or include product attributes or specifications. In this way, you can, for example, generate invoices or contracts with the style texts as defined in the template.
To add data to the Microsoft Word document, you can add markers to the Microsoft Word template. You can add markers for document records and for document record fields.
Note: You can use a Microsoft Word document only to write. So, no read options need to be set.
Set up document - ODBC
Use an ODBC document to directly read data from or write data to an external database. You can exchange data with an external database via ODBC or with an external Azure SQL database.
Set up document - Other journals
Set up document - Staging
Set up document - Text
Set up document - Trade agreement journal
Set up document - XML
Set up EDI type
Set up endpoint
Set up Environment Comparison Studio connector
Set up a connector of type ECS Azure file storage. Use this type to connect to your Azure file storage location. The connector creates the Environment comparison studio folders in the share that is defined in the Environment comparison studio parameters.
On the Connector page, in the ECS connector details section, you can view the generated folder paths.
The ECS Azure file storage connector creates:
The environment-specific parent folder. This folder is created in the Share that is defined in the Environment comparison studio parameters.
A folder for each legal entity in the environment. These folders are created in the environment-specific parent folder.
A Work (ECSWorkFolder) folder for each legal entity. This folder is created in the legal entity folder.
Archive, Error, Export, and Import folders. These folders are created in the Work (ECSWorkFolder) folder.
Use the ECS Azure file storage connector to manage the paths and to place Environment comparison studio files in the applicable folders.
Folder | Description
---|---
Archive | The Archive folder of the connector. The folder is used to store used and imported XML files.
Error | The Error folder of the connector. Files are moved to this folder if errors occurred during import.
Export | Folder where export messages, with the ECS Azure file storage connector as target connector, store the generated XML files. These files are used to create ECS content packages.
Import | Folder where import messages, with the ECS Azure file storage connector as source connector, get the XML files to import into D365 FO.
Set up Environment Comparison Studio message
Use messages as the carriers that transport data from a source to a target, based on the mapping as defined on the message.
In Environment comparison studio, use these document and connector types for a message:

Message purpose | Source document | Source connector | Target document | Target connector
---|---|---|---|---
Export data from D365 FO | D365 FO | D365 FO | XML | ECS Azure file storage
Import comparison result data into D365 FO | XML | ECS Azure file storage | D365 FO | D365 FO
Make sure the Key field is added to the source XML document. The key is a constant value that is used to compare records. The Key field is a manual addition and is not part of the table that is used in the message.
Set up Environment Comparison Studio parameters
Set the general Environment comparison studio parameters that define the storage location for the ECS Azure file storage connector generated folders.
Set up field mapping condition
Set up field mapping condition using an expression
Set up file action - Azure Blob storage
Set up file action - Copy
Set up file action - Delete
Set up file action - Email - D365FO email
Set up file action - Email - Exchange server
Set up file action - Email - IMAP
Set up file action - Email - SMTP server
Set up file action - FTP
Set up file action - Move
Set up file action - SFTP
Set up file action - Zip
Set up inbound web service website
Set up journal validations
Set up local Windows folders for Azure file storage connector
Set up local Windows folders for general files
If you use Connectivity studio on a D365 FO (on-premises) environment, you can choose to use local Windows folders to exchange data files.
Set up mapping condition using an expression
Set up mapping conditions
Set up message
Set up Microsoft Graph API
To connect to SharePoint, the Microsoft Graph API is used. Register an app in Microsoft Entra ID and configure the Microsoft Graph permissions.
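As an illustration of how the registered app's credentials are used, here is a sketch with placeholder values that acquires a Microsoft Graph token with the client credentials flow, using the Python msal and requests packages. It shows the Graph authentication pattern only, not the Connectivity studio configuration itself.

```python
import msal
import requests

# Placeholder values from the app registration in Microsoft Entra ID.
tenant_id = "<tenant id>"
client_id = "<client id>"
client_secret = "<client secret>"

app = msal.ConfidentialClientApplication(
    client_id,
    authority=f"https://login.microsoftonline.com/{tenant_id}",
    client_credential=client_secret,
)

# Client credentials flow: the granted Microsoft Graph application permissions apply.
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

response = requests.get(
    "https://graph.microsoft.com/v1.0/sites/root",
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
print(response.status_code)
```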
Set up qualifiers
Set up record mapping
Set up Service Bus queue connector
The related document defines which data is sent to or received from a queue or topic and in which format and structure. So, the document does not result in a file.
With the 'Service Bus queue' connector, you can use these external file-based documents: EDI, Fixed text, Text, XML, JSON. You can only use this connector in combination with a document for which the version 3 (V3) handler class is selected.
Set up Service Bus search definition
Set up SharePoint
Set up SharePoint connector
Set up a connector of type SharePoint to exchange data files between your D365 FO environment and another environment, using SharePoint. SharePoint is a solution to share and manage content, knowledge, and applications to empower teamwork, quickly find information, and seamlessly collaborate across the organization.
With the SharePoint type connector, you can exchange these external file-based documents: EDI, Fixed text, Microsoft Word, Microsoft Excel, Text, XML, JSON. You can only use this connector in combination with a document for which the version 3 (V3) handler class is selected.
Set up staging display options
Set up Staging journal connector
Set up a connector of type Staging journal. Use this type to validate and approve data before it is imported into D365 FO.
A staging journal scenario consists of:
Set up staging validations
For each document record, you can define data validations to be done when a data record is inserted in the staging journal.
Validation class | Description |
---|---|
BisValidationFieldisMantadory | This class checks if the field is filled. The arguments are: FieldValue and Type. The validation is met if the field has a valid value. For example, for date fields, the value 0 is not valid. |
BisValidationReferenceExists | This class checks if a record exists in the defined table. The validation is met if at least one record exists in the table. The arguments are KeyFieldName or KeyFieldValue (only define one of these arguments) and TableName. For example, you can check if a customer exists in the CustTable. If the customer does not exist, a validation error is reported. |
BisValidationMdmDifference |
This class checks if differences exist between the data that is sent from the source to the MDM staging journal and the current data in the target. This validation defines what happens with MDM staging journal lines with data differences. You can only use this class if you use:
For more information, refer to Monitor MDM staging journal. |
Set up template company in Connectivity studio parameters
You can use a template company as source company. If you run a message, the data is exported from the template company and imported in the target company.
In a template company, you can, for example, define generally applicable master data. You can use this to set up the master data for a new company.
To be able to use a template company, in the Connectivity studio parameters, define the template company for each target company. You can define several template companies. For example, with country-specific or industry-specific data.
Set up test tasks for test project
Set up transformation
Set up type conversion
If you need a type conversion and it does not already exist, you must set it up.
You can use a type conversion to convert the data to match the format as required in the target. With a type conversion, you can convert values from any type to string or from string to any type. Usually, the string value is the external value. Note: Type conversions from any type to any type are not supported. For example, a conversion of type integer to type date is not possible.
You can use these conversion types:
Conversion type | Description
---|---
Text | Define the format in which a text is imported or exported. You can, for example, replace or remove characters, or use one element of the text.
Amount | Define the format in which an amount is imported or exported. You can, for example, define separators and unit conversion.
Date | Define the format in which a date is imported or exported. You can, for example, define the sequence and separator.
Enum | Define the format in which an enum value is imported or exported. You can, for example, define that the enum value is imported or exported as text.
Time | Define the format in which times are imported or exported. You can define the format and the separators to be used.
UtcDateTime | Define in which format a date and time field value is imported or exported. This type combines the Date and Time types.
Date/time format | Define in a flexible way the format in which a date and time field value is imported or exported. You can also include a time zone. Note: The format is case sensitive. For example, the lowercase 'm' is the identifier for minute, and the uppercase 'M' is the identifier for month. Example: dd-MM-yyyy. For more information on how to set up the date/time format, refer to Custom date and time format settings.
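The case sensitivity of the Date/time format type is easy to trip over. Below is a small, purely illustrative Python sketch that maps a few .NET-style identifiers to show the difference between 'MM' (month) and 'mm' (minute); it is not how the product implements the conversion.

```python
from datetime import datetime

value = datetime(2024, 3, 5, 14, 7)

# A few .NET-style custom format identifiers (case sensitive) mapped to strftime codes.
identifiers = {"dd": "%d", "MM": "%m", "yyyy": "%Y", "HH": "%H", "mm": "%M"}

def format_value(dt: datetime, pattern: str) -> str:
    for net_code, strftime_code in identifiers.items():
        pattern = pattern.replace(net_code, strftime_code)
    return dt.strftime(pattern)

print(format_value(value, "dd-MM-yyyy"))        # 05-03-2024 -> 'MM' is the month
print(format_value(value, "dd-MM-yyyy HH:mm"))  # 05-03-2024 14:07 -> 'mm' is the minute
```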
Set up Upload and download connector
Set up a connector of type Upload and download. Use this type to:
With the Upload and download type connector, you can upload or download these external file-based documents: EDI, Fixed text, Microsoft Word, Microsoft Excel, Text, XML, JSON. You can only use this connector in combination with a document for which the version 3 (V3) handler class is selected.
Set up validation rules
Set up web service action - Inbound
Set up web service action - Outbound
You can use an outbound web service action to request data from an external application and to process the response in D365 FO, via an external web service.
Message | Description |
---|---|
Request message | The request message provides the web service with data from D365 FO. |
Response message | The response message processes the response from the web service in D365 FO. |
Error message | The error message processes the error from the web service in D365 FO. |
Attribute type | Description |
---|---|
Value | The attribute is a fixed value. Enter the fixed value in the Value or Custom field. |
Document field | The attribute value is derived from a field of the source document of the request message. Fill in the Document field field. |
Record field | The attribute value is derived from a field of a selected record. Usually, this type is used to get records. Only use this type if you start the web service with a menu item from a specific page. Fill in these fields: Record table and Record field. Example: You start the web service from the Sales orders page. You can use the attribute to get all sales orders for the customer of the selected sales order. In this case, you fill in the CustTable and ID. |
Custom | You can enter a static method that defines the range. The static method is applied to the source document of the request message. |
Secret | You can enter a secret reference to be used as attribute. The secret reference refers to a centrally stored secret which makes updating secrets easier. So, the secret value is not visible on the Web service action page. You only see the secret reference. Fill in the Secret reference field. |
The attribute styles define how the attribute is applied to the request. You can use these attribute styles:
Attribute style | Description |
---|---|
Header | Sends a custom header with an HTTP request. The attribute is added to the header of the HTTP request. |
Query | Most common attribute style. It applies to the whole request. It is added to the URL after the question mark (?) after the resource name. Example: https://myserver.com/resource?attr1=Your Value&attr2=Your Value |
Template | Parameterizes the resource path, adding a placeholder for a variable value. Example: https://myserver.com/resource/{attr3} |
Matrix | Applies to a specific resource path element. The attribute is added to the URL, between the resource or the template attribute and the QUERY attributes. The attribute is separated from the resource or the template attribute with a semicolon (;). Example: https://myserver.com/resource/{attr3};;attr4=Your Value?attr1=Your Value |
Plain | Excludes the attribute from the HTTP request. For example, for testing purposes. |
Body key pair | Usually, for an outbound web service, the body contains the content. In some cases, for example for Dataverse, the body contains more data than only the content. The data is split into a list of so-called key pairs. In this case, the content is stored in a key pair, instead of in the body. For each of the key pairs to be added to the request body, add an attribute to the outbound web service action. To use key pairs in your body, use these settings: |
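As a rough sketch of how these styles end up in the request URL (illustrative Python with the example attribute names from the table; real requests would also URL-encode the values):

```python
# Template attribute: placeholder in the resource path.
base = "https://myserver.com/resource/{attr3}"
url = base.format(attr3="42")                      # https://myserver.com/resource/42

# Matrix attribute: appended to the path element, separated by a semicolon.
url += ";attr4=Your Value"

# Query attributes: appended after the question mark.
query = {"attr1": "Your Value", "attr2": "Your Value"}
url += "?" + "&".join(f"{k}={v}" for k, v in query.items())

print(url)
# https://myserver.com/resource/42;attr4=Your Value?attr1=Your Value&attr2=Your Value

# Header attributes would instead be sent as HTTP headers, e.g. {"attr5": "Your Value"}.
```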
Set up web service application for project
Set up Web service connector
Set up a connector of type Web service. Use this type to exchange data via a web service using a stream.
If you:
The related document defines which data is added to or taken from a stream and in which format and structure. So, the document does not result in a file.
With the Web service type connector, you can use these external file-based documents: EDI, Fixed text, Text, XML, JSON. You can only use this connector in combination with a document for which the version 3 (V3) handler class is selected.
Set up web service user
Show Infolog for message run
Show where a secret reference is used
You can show the records where a secret reference is used. This can be helpful, for example, if you want to change a secret and you want to see which records are involved.
Solve connection issue for Azure file storage connector
For an Azure file storage connector, you can connect to an Azure file share. To access the Azure file share, you can mount the Azure file share.
Solve EDI Delfor journal processing errors
When an EDI Delfor journal is processed with the 'Sales (Delfor) - EDI Delfor journal to Order' message, errors can occur. If an error occurs:
You can solve processing errors and have the EDI Delfor journal processed again.
Solve EDI Delfor journal validation errors
The delivery forecast sales orders that you receive with the 'Sales (Delfor) - XML to EDI Delfor journal (830)' message are stored in the EDI Delfor journal.
These EDI Delfor journals are validated according to the applicable journal validation setup. If the applicable validation rules are not met, an error or warning is given. Before the EDI Delfor journals can be processed further, review the errors and warnings, and take appropriate actions.
If journal validation errors are given, you have these options:
You cannot accept headers, lines, or addresses with errors. If you do so, and approve the EDI Delfor journal, the header, line, or address with errors is again set to Rejected.
If the errors are solved or lines or addresses with errors are canceled, approve the EDI Delfor journal.
Solve EDI Delfor journal validation warnings
The delivery forecast sales orders that you receive with the 'Sales (Delfor) - XML to EDI Delfor journal (830)' message are stored in the EDI Delfor journal.
These EDI Delfor journals are validated according to the applicable journal validation setup. If the applicable validation rules are not met, an error or warning is given. Before the EDI Delfor journals can be processed further, review the errors and warnings, and take appropriate actions.
If journal validation warnings are given, you have several options:
If all warnings are solved, accepted, or canceled, approve the EDI Delfor journal.
Solve EDI inventory order processing errors
Solve EDI inventory order validation errors
If you use EDI inventory order staging in your EDI process, the received information is imported into the EDI inventory order journal by the relevant custom messages. You can, for example, use staging in your EDI process for picking list registrations or product receipts.
The EDI inventory orders are validated according to the applicable journal validation setup. If the applicable validation rules are not met, an error or warning is given. Before the EDI inventory order can be processed further, review the errors and warnings, and take appropriate actions.
If journal validation errors are given, you have these options:
You cannot accept headers, lines, or addresses with errors. If you do so, and approve the EDI inventory order, the header, line, or address with errors is again set to Rejected.
Solve EDI inventory order validation warnings
If you use EDI inventory order staging in your EDI process, the received information is imported into the EDI inventory order journal by the relevant custom messages. You can, for example, use staging in your EDI process for picking list registrations or product receipts.
The EDI inventory orders are validated according to the applicable journal validation setup. If the applicable validation rules are not met, an error or warning is given. Before the EDI inventory order can be processed further, review the errors and warnings, and take appropriate actions.
Solve EDI purchase order confirmation processing errors
Solve EDI purchase order confirmation validation errors
The purchase order confirmations that you receive with the 'Purchase - XML to EDI confirmation' message are stored in the EDI purchase order confirmations journal.
These EDI purchase order confirmations are validated according to the applicable journal validation setup. If the applicable validation rules are not met, an error or warning is given. Before an EDI purchase order confirmation can be processed further, review the errors and warnings, and take appropriate actions.
If journal validation errors are given, you have these options:
You cannot accept headers, lines, or addresses with errors. If you do so, and approve the EDI purchase order confirmation, the header, line, or address with errors is again set to Rejected.
Solve EDI purchase order confirmation validation warnings
The purchase order confirmations that you receive with the 'Purchase - XML to EDI confirmation' message are stored in the EDI purchase order confirmations journal.
Solve EDI sales order processing errors
Solve EDI sales order validation errors
Solve EDI sales order validation warnings
Solve errors - File actions
Solve the file action errors that occurred during the message run.
Solve errors - Message run
Solve the errors that have occurred in the message run.
Solve validation errors and warnings
Source connection is made
Source data is retrieved
Split EDI inventory order journal
Split logged events over pages
Take over project checkout
Target connection is made
Test data synchronization log processing
For messages and web service actions, you can use table events to track data changes. You can define, for each table, which table events are logged. The table events are logged in the Data synchronization log.
When the logged events are processed, for each applicable message or web service action, a record is added to the Outbound queue.
If logged events in the data synchronization log are not processed to the outbound queue, you can:
When tested, a logged event is processed to the outbound queue. If processing a logged event goes:
When the test is finished, you can view the processing steps and result in the message details. For each message or web service action that is subscribed to events of the same table, you can view the processing steps. In case of errors, verify the processing steps to see where the issue occurred.
This picture is an example of message details, with an explanation of the different parts:
Explanation:
Test document
If you run into an issue with a message, you can separately test the source and target documents.
Test inbound Azure Logic App
Test inbound web service action
You can test an inbound web service action without receiving an HTTP request from the external application. So, you only test the inbound web service action setup and not the full process with connection to the external application.
When testing, the inbound web service action does run the request message and response message. So, data can be impacted. Therefore, you are advised to only test an inbound web service action in a Development or Test environment.
For the result of an inbound web service action test, view the Result section on the test page. You can also view the message history of the related request message and response message.
Test outbound Azure Logic App
Test outbound web service action
You can test an outbound web service action without sending a request to the external web service. So, you only test the outbound web service action setup and not the full process with connection to the external web service.
When testing, the outbound web service action does run the request message and response message. So, data can be impacted. Therefore, you are advised to only test an outbound web service action in a Development or Test environment.
For the result of an outbound web service action test, view the message history of the related request message and response message.
Troubleshoot data synchronization log issues
You can use table events to log data changes. You can define, for each table, which table events are logged. The table events are logged in the Data synchronization log.
Troubleshoot staging issues
In Connectivity studio, you can use the staging concept to validate data in an intermediate area before it is further processed. Usually, the issues as shown in the staging journal are data related.
If it appears that issues are not data related, extend your issue investigation to the messages that import data to or export data from the staging journal.
Troubleshoot trigger issues
An integration can be triggered in several ways. If a trigger fails, no errors are shown in the Connectivity studio Integration operations history.
Update secret reference name
You can change a secret reference name. This automatically updates the secret reference in all places where it is used.
You can update a secret reference name, for example, after you upgraded from local secret storage to central secret storage. In this case, you can change the automatically created secret reference names.
Upgrade secrets to the secret reference tables
For each project, you can migrate from 'locally' stored secrets to centrally stored secrets. To do so, you can automatically collect the locally stored secrets and store these in the centrally stored secret references.
During upgrade:
Note: Usually, you only use this upgrade function once per project during migration from locally to centrally stored secrets.
Use file explorer
You can use the File explorer to quickly access and view the share as defined for the applicable Azure file storage connector. The File explorer offers limited functionality. You can only view, copy, move, or delete files.
Verify test results
View all EDI inventory lines
View EDI Delfor journal
Use the EDI Delfor journal to monitor the delivery forecast sales orders.
The delivery forecast sales orders that you receive with the 'Sales (Delfor) - XML to EDI Delfor journal (830)' message are stored in the EDI Delfor journal.
If you monitor EDI Delfor journals, the statuses are important. For more information on these statuses, refer to 'EDI staging journal statuses'.
View EDI Delfor sales information
For each customer, item, and customer order combination, for which you receive delivery forecasts, you can provide an overview of the related EDI Delfor sales information. You can use this data to apply several validation rules to EDI Delfor journals.
The EDI Delfor sales information is:
View EDI inventory orders
Use the EDI inventory order staging journal to monitor the EDI inventory orders. If you use EDI inventory order staging, custom messages are required.
You can, for example, use staging in your EDI process for picking list registrations or product receipts.
View EDI purchase order confirmations
View EDI sales orders
View error report
You can view a report in Microsoft Excel format that contains the errors that occurred during a message run. You can, for example, use this to inform the sender of data on the errors.
View file history
View message run history
You can review and analyze the history of message runs.
View message run record history
View messages on Service Bus queue or topic subscription
You can view the messages on a Service Bus queue or topic subscription for a specific connector.
View original message run history
View outbound queue
View project version log, status, or files
View staging journal
Use the staging journal to monitor the staged records.
If you use staging in your inbound process, you receive data with a message that stores the data in the staging journal.
View table relations
View where-used
Web service error message is run - Inbound
When, in an inbound web service process, the request message does not run successfully or an error occurs, the inbound web service action runs the error message, if defined. This is managed by the handler class as defined for the web service action.
The error message sends the error information from D365 FO to the inbound web service. For example, you can use an error message to change the status of a record in the external application.
Web service error message is run - Outbound
Web service request message is run - Inbound
When triggered, the inbound web service action first runs the request message, if defined. This is managed by the handler class as defined for the web service action.
The goal of the request message depends on the HTTP action of the web service action. In general, the request message provides D365 FO with data from the external application.
Web service request message is run - Outbound
When triggered, the outbound web service action first runs the request message, if defined. This is managed by the handler class as defined for the web service action.
Web service response message is run - Inbound
When, in an inbound web service process, the requested data is received from D365 FO, the inbound web service action runs the response message, if defined. This is managed by the handler class as defined for the web service action.
The goal of the response message depends on the HTTP action of the web service action. In general, the response message sends the response from D365 FO to the inbound web service.
Web service response message is run - Outbound
When, in an outbound web service process, the requested data is received from the web service, the outbound web service action runs the response message, if defined. This is managed by the handler class as defined for the web service action.
The goal of the response message depends on the HTTP action of the web service action. In general, the response message processes the response from the web service in D365 FO.