...
Once the key/data parts (images) have been created for both the Before and After image, the corresponding rows are matched using the Key part.
There are cases when there is no Key part in the query output. In this situation, Validate uses the row position to match the corresponding rows between both images. When multiple rows share the same Key part, Validate picks the next corresponding row from the other image; if the corresponding row does not belong to the group of rows with the matching Key part, the row is highlighted in yellow to indicate the need for a manual check. To prevent SAP rounding differences from being reported, you can specify in the Validate settings the number of decimal places used for the query result comparison. Please refer to the Settings chapter to see how this setting can be changed. If a row from either image does not have a matching pair, that row is colorized red; the data fields are colorized along with the row number. Missing rows can occur in both the Before and After image.
Missing row in other image
All matched rows are also compared on the data part. If any differences are found in the data part, the affected data cells are colorized red to mark the difference. Only cells in the After image are colorized, as the Before image values are taken as the reference.
Incorrect data in After image
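The matching and comparison behaviour described above can be sketched in a few lines of Python. This is a hypothetical illustration only, not Validate's actual implementation; the names (`compare_images`, `DECIMAL_PLACES`) are invented for the example, and the positional fallback for key-less output is omitted for brevity.

```python
from decimal import Decimal, ROUND_HALF_UP

DECIMAL_PLACES = 2  # illustrative stand-in for the decimal-places setting


def rounded(value, places=DECIMAL_PLACES):
    """Round a numeric cell so SAP rounding differences are ignored."""
    quantum = Decimal(10) ** -places
    return Decimal(str(value)).quantize(quantum, rounding=ROUND_HALF_UP)


def compare_images(before, after):
    """Compare two images given as lists of (key, data_cells) rows.

    Returns (key, verdict) pairs, where the verdict is:
      'match'       - keys match and all data cells agree after rounding
      'diff'        - keys match but some data cells differ (red cells)
      'missing_row' - no counterpart row in the other image (red row)
    """
    results = []

    # Group After rows by key so duplicate keys are consumed in order.
    after_by_key = {}
    for key, data in after:
        after_by_key.setdefault(key, []).append(data)

    for key, data in before:
        candidates = after_by_key.get(key)
        if not candidates:
            results.append((key, 'missing_row'))
            continue
        other = candidates.pop(0)  # next row from the same key group
        diffs = [i for i, (b, a) in enumerate(zip(data, other))
                 if rounded(b) != rounded(a)]
        results.append((key, 'diff' if diffs else 'match'))

    # Rows left over in the After image have no Before counterpart.
    for key, leftover in after_by_key.items():
        results.extend((key, 'missing_row') for _ in leftover)
    return results
```

For example, with two decimal places the rows `('A', [1.004])` and `('A', [1.001])` compare as a match, while a key present in only one image is reported as a missing row.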
...
To prevent the accidental execution of a Test Run and the overwriting of current results, you can use the lock option. When a Test Run is locked, it is not possible to change the task state, execute or reset image creation or comparison, or change variants or the reporting status until it is unlocked again. Mass Execution and scheduling will not run on locked Test Runs, and deletion in a Test Plan stops at the first locked Test Run. A locked Test Run is indicated by the lock icon in the Test Plan tab of Test Management and by a message showing the lock description and user name.
You can change the lock state of a Test Run by selecting 'Lock Run' (or 'Unlock Run') from the context menu of the Test Case name in the Test Plan, or from the context menu of the root task in the Test Case.
Clicking the 'Lock Run' option opens a popup window where you can enter a description of the reason for locking the Test Run. Clicking 'Unlock Run' unlocks the Test Run directly.
...
Overview of Tasks for Test Run ID
6. In the next step, we will add new query variants for our run.
Double-click the first task, Select Query Variants, and a new screen appears with several options for adding a new variant. Here you can do the following:
...
If you choose this option, you can select from existing query variants that were created previously.
...
- Create Based on Web Template
Create a query variant for queries of the selected web template.
...
Create a query variant for queries of a web template bookmark.
In our example, we will choose to Create new Query Variant.
Creating a new Query Variant
7. When you click on Create new Query Variant, a new window appears. Here you need to add the technical name of the query you want to use and a description. The other fields are optional; please refer to the 'Create new Query Variant' chapter for more details.
Query variables
If the selected query requires input variables, you can set them up by clicking the 'Set query variables' button.
Set query variables
8. After you save the query variant, you can view it in the list of all query variants.
9. Once all your query variants are added, save the selection by pressing the 'Save' button (Ctrl + S). Then you can return to the Test Run ID tasks. In the next step, execute (F8 or double-click) the Generate tasks for before image task. Once generated, the first two status icons should be green.
Generate Tasks for before image
...
Note |
---|
Important information: You can reset the task state of any executed task by highlighting the task and clicking the Reset task state button (Shift + F8). |
Generate Tasks for before image
10. Once the tasks are generated, you can execute Create before image. A new pop-up appears where you can specify the number of background jobs and the name of the application server to be used.
If you want the number of background jobs to stay the same even when one or more jobs fail, you can check the 'Keep alive' option.
Create before Image
You can press F5 or F9 to refresh the task monitor while the task is running. A truck icon in the Status column means that the task is running. After the run finishes, the status icon turns green, yellow, or red depending on the result.
It is possible to execute specific sub-tasks instead of executing all tasks at once. To display and execute these sub-tasks, click on the blue icon in the Sub-task column.
Display Sub-Tasks
From here, you can execute the sub-tasks. In our example, we have a few sub-tasks; to execute one, double-click the chosen sub-task or press F8. We can observe the changing status and refresh the task monitor until the sub-task finishes and the status icon turns green.
Execute Sub-Task
Sub-Task Completed
...
12. Afterwards, the next step is to execute Generate tasks for after image.
Generate tasks for after image
13. Creating the after image task is similar to creating the before image. Execute the Create after image task and choose the number of background jobs for the task to run with.
Create after Image
14. After both images (before and after) are ready for comparison, you should execute the task Generate tasks for comparison, followed by the task Compare before and after image. Your Test Run ID should now look similar to this one:
Generate task and Compare before and after image
15. In the Display results screen, the section on the left displays a list of all the test variants. By selecting one of the test variants, you can compare the Before and After image outputs on the right-hand side of the screen. The runtime units are displayed in seconds.
Double click on your test case to compare the results of your before and after image.
Comparing Before and After results
16. As mentioned previously in the documentation, you can go back to any step in your Test Run by selecting Reset task state (Shift + F8).
Resetting steps in the Test Run ID