You can use test cases to automate the testing of an integration or data migration setup. For example, if D365 FO is updated to a newer version, you can use test cases to easily check if the integration or data migration setup still works properly.


Create test case for export message

You can use test cases to automate the testing of an integration or data migration setup. For each message, you can create the desired test cases.
This topic explains how to create a test case for an export message; the message source document is an internal document.

Procedure
1. Click Connectivity studio Integration Design.
2. On the Message tab, in the list, find and select the desired export message.
3. Click Edit.
4. On the Action Pane, click Design.
5. Click Test case.
6. Click New.
7. In the Name field, type a value.
8. Select Yes in the Skip file action field.
9. In the Note field, type a value.
10. Sub-task: Define the test data range.
11. Expand the Range section.
12. Click New.
13. In the Document field, enter or select a value.
14. In the Record field field, enter or select a value.
15. Define the range type that applies to the Range field. You can apply ranges of these types:
  • Value: Enter a fixed value or a range of values using the advanced query syntax.
  • Custom: Enter a static method that defines the range. For example, you can use the SysQueryRangeUtil class to apply advanced date queries. Other examples are: 'curExt()' (gets the current company), 'strFmt("%1..%2", prevMth(today()), today())' (gets the orders from the last month), and 'DateTimeUtil::utcNow()' (filters on the current date/time).
  In the Range type field, select an option.
  Note: The Parameter type is not applicable here.
16. In the Range field, enter or select a value.
17. Sub-task: Set up the test steps with the expected results.
18. Expand the Test steps section.
19. Click New.
  Note: You can also use the Create expected result function. The test steps are automatically added based on the message field mapping.
  The expected results are filled in based on the defined data range. For each record in the range, a separate set of test steps is added with the related expected results. These sets of test steps are distinguished by the target record number; each set gets its own number. If added, you can manually change the test steps as desired.
20. In the Test name field, type a value.
21. In the Message mapping field, enter or select a value.
22. In the Target field field, enter or select a value.
23. In the Target record number field, enter a number.
24. Define the range type that applies to the Expected result field. You can apply ranges of these types:
  • Value: Enter a fixed value or a range of values using the advanced query syntax.
  • Custom: Enter a static method that defines the range. For example, you can use the SysQueryRangeUtil class to apply advanced date queries. Other examples are: 'curExt()' (gets the current company), 'strFmt("%1..%2", prevMth(today()), today())' (gets the orders from the last month), and 'DateTimeUtil::utcNow()' (filters on the current date/time).
  In the Expected result type field, select an option.
  Note: The Parameter type is not applicable here.
25. In the Expected result field, type a value.
26. In the Execution field, select an option.
27. In the Expected status field, select an option.
28. Close the page.
29. Close the page.

Create test case for import message

You can use test cases to automate the testing of an integration or data migration setup. For each message, you can create the desired test cases.
This topic explains how to create a test case for an import message; the message target document is an internal document.

Procedure
1. Click Connectivity studio Integration Design.
2. On the Message tab, in the list, find and select the desired import message.
3. Click Edit.
4. On the Action Pane, click Design.
5. Click Test case.
6. Click New.
7. In the Name field, type a value.
8. For an import message, you can run a test case without actually processing the data. Select Yes in the Commit data field.
  Note: For more complex messages, to prevent errors, you are advised to commit the data.
9. Select Yes in the Skip file action field.
10. In the Note field, type a value.
11. Sub-task: Define the test data source.
12. Define the data source that is used for the test case: select either a message history record or a file. Expand the Source section.
  Note: If the source document of the message is an:
  • External file-based document, you can use a history record or a file as data source for the test case.
  • External ODBC document, you can only use a history record as data source for the test case.
13. In the History field, enter or select a value.
  Note: If you select a history record, the File name field is cleared.
14. In the File name field, enter or select a value.
15. Sub-task: Set up the test steps with the expected results.
16. Click New.
  Note: You can also use the Create expected result function. The test steps are automatically added based on the message field mapping. The expected results are filled in based on the defined data source. For each record in the data source, a separate set of test steps is added with the related expected results. These sets of test steps are distinguished by the target record number; each set gets its own number. If added, you can manually change the test steps as desired.
17. In the Test name field, type a value.
18. In the Message mapping field, enter or select a value.
19. In the Target field field, enter or select a value.
20. In the Target record number field, enter a number.
21. Define the range type that applies to the Expected result field. You can apply ranges of these types:
  • Value: Enter a fixed value or a range of values using the advanced query syntax.
  • Custom: Enter a static method that defines the range. For example, you can use the SysQueryRangeUtil class to apply advanced date queries.
  Other examples are: 'curExt()' (gets the current company), 'strFmt("%1..%2", prevMth(today()), today())' (gets the orders from the last month), and 'DateTimeUtil::utcNow()' (filters on the current date/time).
  In the Expected result type field, select an option.
  Note: The Parameter type is not applicable here.
22. In the Expected result field, type a value.
23. In the Execution field, select an option.
24. In the Expected status field, select an option.
25. Close the page.
26. Close the page.

How to run a test?

Run test case

To check if a test case works properly, you can run the test case.

Procedure
1. Click Connectivity studio Integration Design.
2. On the Message tab, in the list, find and select the desired message.
3. Click Edit.
4. Click Test case.
5. In the list, find and select the desired test case.
6. Click Run test.
  Note: When run, a message is shown in the message bar with the test result or any error.
7. Close the page.
8. Close the page.

Run message test cases

To check if all test cases of a message work properly, you can run the message test cases. No full message run is done; only the test cases, as defined for the message, are run.

Procedure
1. Click Connectivity studio Integration Design.
2. On the Message tab, in the list, find and select the desired message.
3. Click Run.
4. Select Yes in the Run test case field.
5. Click OK.

Create test project

If you want to use automated testing, the best practice is to use separate projects for testing and for the actual integration or data migration.
Create a test project in the same way as an integration or data migration project. For the test project, only set up test tasks.

Set up test tasks for test project

If you have created a test project, set up the tasks for this project.
Make sure you only set up tasks that are used for testing (Run test case is Yes).
To each test task, you can add the applicable messages from the integration or data migration project. Make sure you only add messages with test cases. This way, you use the actual integration or data migration messages for testing.

Run test project

You can use a project to exchange data. To be able to run a project, tasks must be set up for the project. If you run the project, all related tasks are run.
Depending on the tasks that are set up for the project, you can use a project to run:
  • An integration or data migration.
  • Outbound web services.
  • Batch classes.
  • Master data management.
  • Test cases.
Usually, you run a project in batch. Based on the defined sub-projects and task dependencies, tasks are run in parallel. If you do not run the project in batch, all sub-projects and tasks are run sequentially.

Procedure
1. Click Connectivity studio Integration Design.
2. Click Run project.
3. In the Project field, enter or select a value.
  Note: By default, the currently active project, as shown on the Connectivity studio Integration design workspace, is selected.
4. In the Company group field, enter or select a value.
5. Sometimes, you want to import only the delta: the changes since the last import. Example: During a migration project, often some time elapses between the migration of the data and the go-live moment. Therefore, just before the go-live moment, you want to import the latest open transactions. However, you do not want to run the full migration again. In Connectivity studio, you can do a delta run. To do a delta run:
  • For the project tasks, add the applicable messages with the Run for delta field set to Run.
  • Set the current Delta run field to Yes.
  As a result, if the project is run, for each project task, only the messages are run that have the Run for delta field set to Run. When a message is run in a delta run, only the record inserts are done.
  So, no update or delete of records is done. Select Yes in the Delta run field.
  Note: For outbound web service actions, you can have set up data synchronization of type 'Date range'. To apply this data synchronization setup when you run web service actions from a project, select Yes in the Delta run field.
6. Select Yes in the Process outbound queue field.
7. Sub-task: Set up batch processing.
8. Expand the Run in the background section.
9. In the Batch processing field, select an option. If Yes, also fill in the other fields.
10. Click Recurrence and fill in the fields as desired.
11. Click OK.
12. Click OK.
Notes: You can also run a project from the project form.

Verify test results

When a test case is run, the results are stored as a test case run. For each test case run, you can review the test results and, if applicable, do the manual tests.
The status is shown in these ways:
  • Test step status in the Result section. The test step status can be automatically set to:
    • Passed: The test step result matches the expected status.
    • Failed: The test step result does not match the expected status.
    • New: The mapping cannot be done because there is no value to be set.
  • Test case run status in the Test case run section. The test case run status field can be automatically set to:
    • Passed: For all test steps, for all tested records, the status is Passed.
    • Failed: For at least one of the steps, for at least one of the tested records, the status is Failed.
    • No data: The test case cannot be done because no data was available to be processed.
A test case can fail, for example, because of errors or because a test step must be tested manually.
You can review a failed test case and, if applicable, perform the manual test steps.
After reviewing and manual testing, you can manually change the test case run status in line with your findings.

Procedure
1. Go to Connectivity studio > Inquiries > Test case run.
2. In the list, find and select the desired test case run.
3. Click Edit.
4. In the Test status field, select an option.
5. Close the page.
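The Custom range expressions quoted in the procedures above, such as 'strFmt("%1..%2", prevMth(today()), today())', build a 'from..to' query range string. The following is an illustrative Python model of that behavior only, not product code; the names prev_mth and last_month_range are invented here:

```python
import calendar
from datetime import date

def prev_mth(d: date) -> date:
    """Rough model of X++ prevMth(): the same day one month earlier,
    clamped to the last day of that month (e.g. 31 Mar -> 29 Feb)."""
    year, month = (d.year - 1, 12) if d.month == 1 else (d.year, d.month - 1)
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def last_month_range(today: date) -> str:
    """Build a 'from..to' range string, like
    strFmt("%1..%2", prevMth(today()), today()) in X++."""
    return f"{prev_mth(today).isoformat()}..{today.isoformat()}"

print(last_month_range(date(2024, 3, 31)))  # 2024-02-29..2024-03-31
```

A method like this, registered as a query range utility, is what the Custom range type points at; the actual D365 FO implementation uses SysQueryRangeUtil and X++ date functions.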

Activities

Name Responsible Description

Create test case for export message

Application Consultant

You can use test cases to automate the testing of an integration or data migration setup. For each message, you can create the desired test cases.
This topic explains how to create a test case for an export message; the message source document is an internal document.
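A Value-type range or expected result is either a literal value or an 'A..B' span in the advanced query syntax. As a sketch only (a hypothetical helper, not the product's matching logic), such a value could be checked like this:

```python
def matches_value_range(value: str, expected: str) -> bool:
    """Match a value against a Value-type expression: either an exact
    literal or an inclusive 'A..B' range (advanced query syntax)."""
    if ".." in expected:
        lo, hi = expected.split("..", 1)
        return lo <= value <= hi
    return value == expected

print(matches_value_range("B", "A..C"))      # True: inside the range
print(matches_value_range("DE-01", "DE-01")) # True: exact match
```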

Create test case for import message

Application Consultant

You can use test cases to automate the testing of an integration or data migration setup. For each message, you can create the desired test cases.
This topic explains how to create a test case for an import message; the message target document is an internal document.
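For import test cases, the allowed test data source depends on the message's source document type: external file-based documents accept a history record or a file, while external ODBC documents accept only a history record. A minimal model of that rule (the type labels are illustrative, not product identifiers):

```python
def allowed_test_data_sources(source_document_type: str) -> set:
    """Return which test data sources an import test case may use,
    given the (illustratively named) source document type."""
    return {
        "external_file": {"history", "file"},  # file-based: either source
        "external_odbc": {"history"},          # ODBC: history record only
    }.get(source_document_type, set())

print(sorted(allowed_test_data_sources("external_odbc")))  # ['history']
```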

Run test case

Application Consultant

To check if a test case works properly, you can run the test case.

Run message test cases

Application Consultant

To check if all test cases of a message work properly, you can run the message test cases. No full message run is done; only the test cases, as defined for the message, are run.

Create test project

Application Consultant

If you want to use automated testing, the best practice is to use separate projects for testing and for the actual integration or data migration.
Create a test project in the same way as an integration or data migration project. For the test project, only set up test tasks.

Set up test tasks for test project

Application Consultant

If you have created a test project, set up the tasks for this project. Make sure you only set up tasks that are used for testing (Run test case is Yes).
To each test task, you can add the applicable messages from the integration or data migration project. Make sure you only add messages with test cases. This way, you use the actual integration or data migration messages for testing.

Run test project

Application Consultant

You can use a project to exchange data. To be able to run a project, tasks must be set up for the project. If you run the project, all related tasks are run.
Depending on the tasks that are set up for the project, you can use a project to run:
  • An integration or data migration.
  • Outbound web services.
  • Batch classes.
  • Master data management.
  • Test cases.
Usually, you run a project in batch. Based on the defined sub-projects and task dependencies, tasks are run in parallel. If you do not run the project in batch, all sub-projects and tasks are run sequentially.
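The batch behavior described above can be pictured as tasks running in dependency "waves": every task whose prerequisites are done runs in parallel with its peers. This is a simplified illustration only (the task names and the deps structure are invented, not how the product stores dependencies):

```python
def batch_waves(deps):
    """Group tasks into parallel 'waves'. deps maps each task to the
    set of tasks that must finish first."""
    done, waves = set(), []
    while len(done) < len(deps):
        # All not-yet-run tasks whose prerequisites are all done.
        wave = sorted(t for t, pre in deps.items()
                      if t not in done and pre <= done)
        if not wave:
            raise ValueError("circular dependency between tasks")
        waves.append(wave)
        done |= set(wave)
    return waves

deps = {"export": set(), "import": {"export"}, "verify": {"import"}}
print(batch_waves(deps))  # [['export'], ['import'], ['verify']]
```

Without batch, the same tasks would simply run one after another in sequence.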

Verify test results

Application Consultant

When a test case is run, the results are stored as a test case run. For each test case run, you can review the test results and, if applicable, do the manual tests.
The status is shown in these ways:
  • Test step status in the Result section:
    The test step status can be automatically set to:
    • Passed: If the test step result matches the expected status, the test step Status is set to Passed.
    • Failed: If the test step result does not match the expected status, the test step Status is set to Failed.
    • New: The mapping cannot be done because there is no value to be set.
  • Test case run status in the Test case run section:
    The test case run status field can automatically be set to:
    • Passed: If for all test steps, for all tested records, the status is Passed, the test case run status is set to Passed.
    • Failed: If for at least one of the steps, for at least one of the tested records, the status is Failed, the test case run status is set to Failed.
    • No data: If the test case cannot be done because no data was available to be processed, the test case run status is set to No data.
A test case can fail, for example, because of errors or because a test step must be tested manually. You can review a failed test case and, if applicable, perform the manual test steps.
After reviewing and manual testing, you can manually change the test case run status in line with your findings.
