...


When the key/data parts (images) have been created for the Before and After image, the corresponding rows are matched using the Key part.
There are cases when there is no Key part in the query output. In this situation, Validate uses the row position to match the appropriate rows between the two images. When multiple rows share the same Key part, Validate picks the corresponding row from the other image. If the corresponding row does not belong to the group of rows with the matching Key part, the row is highlighted in yellow to indicate the need for a manual check. To prevent SAP rounding differences from being flagged, you can specify in the Validate settings the number of decimal places used for the query results comparison. Please refer to the Settings chapter to see how this setting can be changed. If a row from either image does not have a matching pair, the unmatched row is colorized in red; the data fields are colorized along with the row number. Missing rows can occur in both the Before and the After image.
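The matching and rounding logic described above can be sketched as follows. This is a minimal Python illustration, not Validate's actual (ABAP) implementation; the field names `MATNR` and `AMOUNT` are hypothetical examples.

```python
from decimal import Decimal, ROUND_HALF_UP

def round_value(value, decimals):
    """Round a numeric value to the configured number of decimal places."""
    quantum = Decimal(1).scaleb(-decimals)  # e.g. decimals=2 -> 0.01
    return Decimal(str(value)).quantize(quantum, rounding=ROUND_HALF_UP)

def rows_match(before_row, after_row, key_fields, decimals=2):
    """Rows pair up only on identical Key parts; the data part is
    compared after rounding to avoid flagging rounding differences."""
    if any(before_row[f] != after_row[f] for f in key_fields):
        return False
    data_fields = [f for f in before_row if f not in key_fields]
    return all(
        round_value(before_row[f], decimals) == round_value(after_row[f], decimals)
        for f in data_fields
    )

# At 2 decimal places, 10.004 and 10.001 both round to 10.00 and match.
print(rows_match({"MATNR": "100", "AMOUNT": "10.004"},
                 {"MATNR": "100", "AMOUNT": "10.001"},
                 ["MATNR"], decimals=2))  # True
```

Raising the configured precision to 3 decimal places would make the same pair of rows mismatch, which is why the setting matters for noisy key figures.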


Missing row in other image


All corresponding rows are also compared on the data part. If any differences are found in the data part, the affected data cells are colorized in red to mark the difference. Only cells in the After image are colorized, as the Before image values are taken as the correct ones.



Incorrect data in After image

...

  • Automated Root Cause Analysis cannot be used for table comparison.
  • Key figure columns cannot be ignored in table comparison.

Comparison of DTP load results

Transformation testing often requires checking huge volumes of lines to see if the transformation logic changed during the test scenario. When the Before and After image are compared, the rows to be mutually checked are selected by their row position. To speed up the comparison, only hashed values of these lines are compared. The hash of a line is calculated from all fields in the line. Lines are colorized in red if there is a difference in any cell of the compared lines.
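The position-based hash comparison can be sketched like this. This is an illustrative Python model only; the hash algorithm and field separator are assumptions, not Validate's actual implementation.

```python
import hashlib

def line_hash(fields):
    """Hash all fields of a line so whole lines can be compared cheaply."""
    # A unit separator between fields avoids ("ab","c") colliding with ("a","bc").
    joined = "\x1f".join(str(f) for f in fields)
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()

def diff_by_position(before_lines, after_lines):
    """Pair lines by row position and flag positions whose hashes differ."""
    mismatches = []
    for pos, (b, a) in enumerate(zip(before_lines, after_lines), start=1):
        if line_hash(b) != line_hash(a):
            mismatches.append(pos)  # this line would be colorized red
    return mismatches

before = [("A", 1, 10.0), ("B", 2, 20.0)]
after  = [("A", 1, 10.0), ("B", 2, 25.0)]
print(diff_by_position(before, after))  # [2]
```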

...

If lookup data testing is included in the comparison, the following logic is applied during the data comparison. Direct Lookup data includes the lookup source data coming into the Lookup and the result data returned from the Lookup.
The data provided by the DVD Lookup Translator comes in packages, because packages are processed in the transformation. Validate executes the comparison only when the DVD Lookup Translator provides the same number of packages for the Before and the After image.
Data from each package is then compared independently of the data from the other packages. For each package, the lookup source data is compared first. If the source data of the before/after package is not the same, the result data is not compared, as you can only test the lookup correctness based on the same input. If the source data is the same for the before/after image of the package, the result data is then compared.
Images are matched by the row number of the before/after source data; when saving the lookup source and result data, Validate sorts it by all fields to prevent sorting problems.
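The package-wise lookup comparison described above can be summarized in a short sketch. This is a simplified Python model of the stated logic, not the actual implementation; package contents are represented as plain lists of rows.

```python
def compare_lookup_packages(before_pkgs, after_pkgs):
    """Compare lookup data package by package.

    Each package is a (source_rows, result_rows) pair. Result data is only
    compared when the source data matches, and nothing is compared unless
    both images delivered the same number of packages.
    Returns a list of (source_ok, result_ok) per package, where result_ok
    is None when the result comparison was skipped.
    """
    if len(before_pkgs) != len(after_pkgs):
        return None  # the comparison is skipped entirely
    statuses = []
    for (b_src, b_res), (a_src, a_res) in zip(before_pkgs, after_pkgs):
        # Sorting by all fields prevents ordering problems before matching.
        src_ok = sorted(b_src) == sorted(a_src)
        res_ok = sorted(b_res) == sorted(a_res) if src_ok else None
        statuses.append((src_ok, res_ok))
    return statuses

pkgs_before = [([1], [10]), ([2], [20])]
pkgs_after  = [([1], [10]), ([3], [20])]
print(compare_lookup_packages(pkgs_before, pkgs_after))  # [(True, True), (False, None)]
```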

...

When displaying the results of Query, ListCube, SLO ListCube, Drill Down or Transformation testing in the 'Display results' step, a list of all executed Test Variants is shown on the left side of the screen. Basic information about each Test Variant (depending on the Test Variant type, i.e. Query/ListCube/DTP/Drill Down) is displayed together with the following information:

  • Before Runtime [s] (optional) – runtime of the Query/Table/ListCube/SLO ListCube/DTP load/Drill Down execution in the Before image.
  • After Runtime [s] (optional) – runtime of the Query/Table/ListCube/SLO ListCube/DTP load/Drill Down execution in the After image.
  • Before Runtime [hh:mm:ss] (optional) – runtime of the Query/Table/ListCube/SLO ListCube/DTP load execution in the Before image in time format.
  • After Runtime [hh:mm:ss] (optional) – runtime of the Query/Table/ListCube/SLO ListCube/DTP load execution in the After image in time format.
  • InfoProvider A Runtime [hh:mm:ss] (optional) – runtime of a Drill Down execution in InfoProvider A in time format.
  • InfoProvider B Runtime [hh:mm:ss] (optional) – runtime of a Drill Down execution in InfoProvider B in time format.
  • InfoProvider A Runtime [s] (optional) – runtime of a Drill Down execution in InfoProvider A in seconds.
  • InfoProvider B Runtime [s] (optional) – runtime of a Drill Down execution in InfoProvider B in seconds.
  • Difference [s] (optional) – difference in seconds between the After image runtime and the Before image runtime.
  • Difference [hh:mm:ss] (optional) – difference between the After image runtime and the Before image runtime in time format.
  • Result – result of the data comparison.
  • Reporting Result – reporting status set by Validate/user.
  • Before Image Creation Time (optional) – time of the before image creation.
  • Before Image Creation Date (optional) – date of the before image creation.
  • After Image Creation Time (optional) – time of the after image creation.
  • After Image Creation Date (optional) – date of the after image creation.
  • Before Image Rows (optional) – number of rows in the before image.
  • After Image Rows (optional) – number of rows in the after image.
  • Reporting Status Text (optional) – text supplied when Validate/user set the reporting status.
  • Variant ID (optional) – Validate technical ID of the variant.
  • Before Image Status (optional) – task execution status for the before image.
  • After Image Status (optional) – task execution status for the after image.
  • Before Image Overflow Handled (optional) – notification of an overflow occurrence.
  • After Image Overflow Handled (optional) – notification of an overflow occurrence.
  • Conversion Runtime [s] (optional) – conversion runtime of the Before image in the SLO ListCube/ListCube.
  • Conversion Runtime [hh:mm:ss] (optional) – conversion runtime of the Before image in the SLO ListCube/ListCube in time format.

All optional columns can be added to the results overview table by clicking on the 'Change Layout …' button. The ALV column structure for the user and scenario type is always saved on exit and reapplied when the user enters this step again.


Different Results of Before/After image comparison

...

  • Green semaphore - if the data returned by the Before and After image is the same.
  • Yellow semaphore - if neither image returned data.
  • Red semaphore - if inconsistencies/differences were found between the data returned in the Before and After image.
  • None - if the comparison of the images failed with an error (e.g. outputs with different key structures were supplied for comparison) or if the comparison has not been done yet.
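The semaphore rules above amount to a small decision function, sketched here in Python for illustration only (the string return values stand in for the semaphore icons):

```python
def comparison_semaphore(before_rows, after_rows, compare_failed=False):
    """Derive the result semaphore shown in the 'Display results' list."""
    if compare_failed:
        return None          # e.g. outputs with different key structures, or not yet compared
    if not before_rows and not after_rows:
        return "yellow"      # neither image returned data
    if before_rows == after_rows:
        return "green"       # identical data
    return "red"             # differences found

print(comparison_semaphore([("A", 1)], [("A", 1)]))  # green
print(comparison_semaphore([], []))                  # yellow
```

Note that the "both empty" case is checked before equality, so two empty outputs yield a yellow semaphore rather than a green one, matching the list above.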

Sometimes the 'After Runtime [s]' cells, along with the 'Difference' cells, are colorized in red. This can happen when the difference between the Before image runtime and the After image runtime reaches a threshold value defined in the Validate settings. You can specify these threshold values by clicking on the button in the toolbar.
These settings can influence the decimal precision of the comparisons and are also applied in the output.
You can display each variant in detail by selecting 'Display Variant' from the context menu of the appropriate variant row.


Display Variant Details


The Variant details screen differs based on the type of variant that was selected (e.g. Query Variant, Bookmark Variant, ListCube Variant, Drill Down Variant).


Query Variant Details Screen


You can use the button to filter out all correct records.
In the list of all Test Variants, only the variants that finished with erroneous comparisons are then displayed (i.e. Test Variants that finished without an error are filtered out of the list).


Query Testing scenario erroneous variants


For the actual data screens, if the 'Only Errors' mode is active, only the output rows with at least one cell colorized in red are displayed. In the Before image output, the correct rows that correspond to After image rows with wrong data are also displayed.


ListCube erroneous rows and appropriate before image rows


For Transformation testing, when the 'Only Errors' mode is active, the before image results display only the rows that correspond (in terms of row number) to the erroneous rows of the after image.


Erroneous display in Transformations testing scenario

...

  • No union view is available
  • There are three different formats in which you can review the report outputs: ALV table, Text Screen and HTML.

Three types of report output display


We recommend the HTML display type, which uses a monospaced font so the results are easily readable. The ALV and HTML display types can colorize errors found in the reports, unlike the simple text display.


Example of report HTML report output view

PDF Export


You can choose to export an ERP report result into a PDF file. First you have to select the variants from which you want to generate PDFs. You can select multiple variants, as one PDF is created for each variant. After choosing this option, you will be prompted to select a folder. The PDF files will be saved in this folder following the naming pattern: RUN ID_TESTCASE ID_DATE OF COMPARE. Each PDF contains the before image and the after image for one variant. Differences are highlighted in red in the after image.
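As an illustration of the naming pattern, a file name could be assembled like this. The exact date format and the sample IDs are assumptions; only the RUN ID_TESTCASE ID_DATE OF COMPARE ordering comes from the documentation.

```python
from datetime import date

def pdf_file_name(run_id, test_case_id, compare_date):
    """Build a PDF file name following the RUN ID_TESTCASE ID_DATE OF COMPARE
    pattern. The YYYYMMDD date format is an assumption for illustration."""
    return f"{run_id}_{test_case_id}_{compare_date:%Y%m%d}.pdf"

print(pdf_file_name("R001", "TC042", date(2023, 5, 17)))  # R001_TC042_20230517.pdf
```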


Navigation in outputs

For performance reasons, 5000 lines (this can be changed via settings) are displayed at once in the 'Display results' step for each variant output. When any of the variant images has more than 5000 lines in the output, the navigational buttons become active and you can page through the results. You can navigate through the Before and After image outputs independently by using the navigational buttons.
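The paging boundaries for one output page can be sketched as a small helper. This is an illustrative Python model of the paging behavior, assuming the default 5000-line page size from the settings:

```python
def page_bounds(total_rows, page, page_size=5000):
    """Return the (start, end) row indices for one page of variant output.

    page is 1-based; page_size mirrors the 5000-line default from settings.
    """
    start = (page - 1) * page_size
    end = min(start + page_size, total_rows)
    return start, end

# A 12,000-row output: page 3 holds the final 2,000 rows.
print(page_bounds(12000, 3))  # (10000, 12000)
```

Because the Before and After outputs are paged independently, each image would keep its own current page number.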


Output buttons and position

...

Sometimes it is required to display the Before/After image outputs in full screen. You can click on the 'Full Screen' button in the application toolbar. To activate full screen mode, the results must already be displayed in the standard view. It is possible to navigate through the output result pages (if any) directly in full screen mode.


Full screen mode of variant output

...

The outputs of the ListCube, Transformation and Drill Down test scenarios can be displayed together in union mode. This mode is accessible by clicking on the 'Union Screen Results Display' button in the toolbar and lets you display two different outputs on a single screen.


Union screen mode for results display


The first column contains information about the source of the data for a given row. For the Transformation and ListCube scenarios it contains either the value 'A' (After image) or 'B' (Before image), specifying which image the record belongs to. For the Drill Down scenario this column contains a concatenation of the InfoProvider technical name and its RFC Destination system ID (if any).

For all scenarios the rows are paired in a way that they can be compared together. This differs based on the scenario, i.e. for the Transformation testing scenario row numbers are used, while for the ListCube and Drill Down scenarios the appropriate row keys are matched.
*Important Note: In the current version of Validate, the Query testing scenario does not support Union display mode.

Check of key figure sums (ListCube and Table Test Scenario)

In the output of InfoProviders or database tables, when there are differences in the results of the Before and After images, it is sometimes necessary to also compare the overall sums of the Before/After image key figures. Instead of manually checking each column, you can click on the 'Check Sums' button to get a comparison of the summed key figure values. Fields with different values are highlighted in red; overflown sums are set to 0 and highlighted with a different color.
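The sum check, including the overflow-to-zero behavior, can be sketched as follows. This is an illustrative Python model; the overflow threshold `max_value` is a hypothetical stand-in, not the actual SAP data type limit.

```python
def check_sums(before_cols, after_cols, max_value=10**15):
    """Sum each key figure column and compare Before vs After.

    Overflown sums are reset to 0 (so they can be highlighted separately),
    mirroring the behavior described for the 'Check Sums' button.
    """
    result = {}
    for col in before_cols:
        b, a = sum(before_cols[col]), sum(after_cols[col])
        b_of, a_of = abs(b) >= max_value, abs(a) >= max_value
        if b_of:
            b = 0
        if a_of:
            a = 0
        result[col] = {"before": b, "after": a,
                       "differs": b != a, "overflow": b_of or a_of}
    return result

# AMOUNT sums differ (3 vs 4), so the field would be highlighted in red.
print(check_sums({"AMOUNT": [1, 2]}, {"AMOUNT": [1, 3]}))
```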


Compared table key figure sums with highlighted differences and overflown sum

...

For the Drill Down and ListCube Test Scenarios, a column named 'Missing' is added to the output of InfoProviders. If a line in one of the Before/After images is not correct because no record with the same key was found in the After image, an icon is added to this column. This helps you to differentiate between erroneous and missing records. You can also use this column for filtering the results. This column is visible in all three types of results display.
*Important Note: Using the standard ALV filtering functionality on the output tables only influences the currently displayed page of records and does not influence records on other pages.


Missing Column

...

It is possible to display the reporting status column in the table with the list of tested variants for all Backend scenarios by changing the layout of the result table. By default this status is always set to the same value as the comparison result in Validate. The Reporting status is used to define the statuses of individual Test Cases for reporting. You can set the reporting status and override the compare status set by Validate by right-clicking on the appropriate Variant result row and selecting the 'Set Reporting Status' option.


Set Reporting Status Option


When changing the reporting status, you can choose the reporting status you want to set for the variant and add a description.


Set Reporting Status Dialog


It is possible to set the same reporting status at once for multiple variants by selecting multiple rows and choosing the 'Set Reporting Status' option.

 

Setting reporting status for multiple variants


If you want to unify the reporting statuses used by test users to reflect acceptable errors (e.g. 'Out Of Scope'), it is possible to specify cross-Validate reporting statuses. You can define these reporting statuses in the Validate settings under the 'Backend' settings tab. All customized reporting statuses can then be selected in the Set Reporting Status dialog using F4 help.


Report Statuses Customizing

...

During the Transformation testing scenario, when Validate is provided with data from the DVD Lookup Translator, the 'Display Lookup Results' screen contains the data and the comparison results. The structure of this screen is very similar to the standard 'Display Results' screen; however, there are some differences:
In the 'Display Lookup Results' screen there are two comparison statuses for each line of the tested DTP variant. In some cases there can be multiple lines in a run for each variant; the number of lines depends on the number of data packages that were processed during the load for each variant. The first comparison status reflects the comparison of the lookup source data, while the second one reflects the comparison of the data returned from the lookup.


Lookup package comparison results


As displayed in Figure 237, when the source data comparison fails, no comparison is performed on the data returned by the lookup.
The display on the right side of the screen shows the before/after image data as it would normally appear in the standard 'Display Results' screen. When you double-click on the appropriate variant lookup package, the data is displayed. By default, when you display the actual data this way, the returned lookup data is displayed. To switch between the display of the source data and the result data, click on the 'Source Data' button (Shift + F7) or the 'Result Data' button (Shift + F8).

Test Run ID Locking

To prevent executing a Test Run by mistake and overwriting current results, you can use the lock option. When a Test Run is locked, it is not possible to change the task state, execute or reset image creation or comparison, or change variants or the reporting status until it is unlocked again. Mass Execution and scheduling will not run on locked Test Runs. Deletion in a Test Plan stops at the first locked Test Run. You are informed about a locked Test Run by the lock icon in the Test Plan tab of Test Management and by a message with a description and a user name.
You can change the lock state of a Test Run by selecting 'Lock Run' (or 'Unlock Run') from the context menu of the Test Case name in the Test Plan (or from the context menu of the root task in the Test Case).




Clicking on the 'Lock Run' option opens a popup window where you can set a description of the reason for locking the Test Run. Clicking on 'Unlock Run' unlocks the Test Run directly.




Lock functionality is available for these Test Case Types:

...

This example shows a step-by-step guide on how to create, execute and display results in Backend Testing.

...

2. In Validate Dashboard screen, choose Backend Testing (last icon on the function panel).

Backend Testing

3. In the Validate Backend Testing screen, choose Create new Test Run ID (F5).


Create New Run Test ID

4. A new window appears where you can add the type of Test Scenario (you can press F4 for a list of all possible entries). Currently, there are 4 Test Scenarios to choose from. These scenarios are described in the chapter Backend Testing.

Adding a Test Scenario


For this example, we will choose the test scenario for Query testing – EQS.
After choosing the Test Scenario, you can enter the name for Test Run ID and the description. 


Completing the creation of a Test Run ID

...

5. After creating a new Test Run ID, you should see an overview of all the tasks.

Overview of Tasks for Test Run ID 

...

If you choose this option, you can choose from the existing query variants that were created previously.

...

  • Create Based on Web Template

Create a query variant for the queries of a selected web template.

...

Create a query variant for the queries of a web template bookmark.
In our example, we will choose Create new Query Variant.


Creating a new Query Variant

7. When you click on Create new Query Variant, a new window should appear. Here you need to add the technical name of the query you want to use and a description. The other fields are optional; please refer to the 'Create new Query Variant' chapter for more details.

Query variables 


If the selected query requires input variables, you can set them up by clicking on the 'Set query variables' button.



Set query variables 

8. After you save the query variant, you can view it in the list of all query variants.


Set query variables

9. Once all your query variants are added, save the selection by pressing the 'Save' button (Ctrl + S). Then you can return to the Test Run ID tasks. In the next step you need to execute (F8 or double-click) Generate tasks for before image. Once generated, the first two status icons should be green.

Generate Tasks for before image 

...

Note

Important information: You can reset the task state of any executed task by highlighting the task and clicking on the Reset task state button (Shift + F8).


Generate Tasks for before image 

10. Once the tasks are generated, you can execute Create before image. A new pop-up appears where you can specify the number of background jobs and the name of the application server to be used.

If you want the number of background jobs to stay the same even when one or more jobs fail, you can check the 'Keep alive' option. 



Create before Image 


You can press F5 or F9 to refresh the task monitor while the task is running. A truck icon in the Status column means that the task is running. After the run completes, the status icon should turn green/yellow/red.
It is possible to execute specific sub-tasks instead of executing all tasks at once. To display and execute these sub-tasks, click on the blue icon in the Sub-task column.


Display Sub-Tasks 


From here, you can execute the sub-tasks. In our example, we have a few sub-tasks; to execute one, double-click on the chosen sub-task or press F8. We can observe the changing status and refresh the task monitor until the sub-task finishes and the status icon turns green.


Execute Sub-Task 


Sub-Task Completed

...

12. Afterwards, the next step is to execute Generate tasks for after image.

Generate tasks for after image 

13. Creating the after image task is similar to creating the before image. You execute the Create after image task and choose the number of background jobs for the task to run.

Create after Image 

14. After both images (before and after) are ready for comparison, you should execute the task Generate tasks for comparison, followed by the task Compare before and after image. Your Test Run ID should now look similar to this one:

Generate task and Compare before and after image 

15. In the Display results screen, the section on the left displays a list of all the test variants. By selecting one of the test variants you can compare the before and after image outputs on the right-hand side of the screen. The runtime units are displayed in seconds.

Double click on your test case to compare the results of your before and after image. 


Comparing Before and After results 

16. As mentioned previously in the documentation, you can go back to any step in your Test Run by selecting Reset task state (Shift + F8).

Resetting steps in the Test Run ID