(DV-2202) Drill Down Execution & Backend Testing
DRILL DOWN TEST CASE EXECUTION
When a Drill Down Test Case is added to the Test Plan, a new backend Test Run is automatically created and a Run ID is generated. The description holds the technical names of the Test Plan and the Test Case. All Drill Down Variants selected during the creation of the Drill Down Test Case are automatically added to the Drill Down Variant selection of the generated Test Run.
To execute a Drill Down Test Case from Test Management, click on 'Execute'. Afterward, the standard Drill Down Testing scenario is displayed. Please refer to the Test Scenario EQS_DRILL (Drill Down testing) section below for a more detailed description of each scenario step.
Drill Down testing
The SNP Validate Drill Down Test Scenario uses the ListCube functionality in a similar way to the ListCube test scenario and uses the RSA1 'Display Data' functionality to read the data from the InfoProviders to be tested.
Recommendations:
- For one Drill Down test run, a maximum of 1000 Drill Down Variants should be used.
- It is recommended to set the SNP Validate setting 'DrillDown maximum records' to a value below 100 in order to reduce the amount of data to be processed and the chance of errors.
Test Scenario EQS_DRILL (Drill Down testing)
Drill Down Testing Scenario
Drill Down testing contains the following steps:
- Selecting Drill Down Variants
- Generation of execution tasks
- Execution of Drill Down Tasks
- Display of results
Select Drill Down Variants
You can select which of the Drill Down Variants are to be used, or create new Drill Down Variants to be added to the test. When the Test Run is generated through the SNP Validate Test Management, the variants defined for the Drill Down Test Case are already preselected.
Once the variants are selected for the Test Run, you can save the selection by clicking on the 'Save' button (Ctrl + S).
Drill Down Variants can be added to the list in the following ways:
Create New Drill Down Variant - (SHIFT + F1). For detailed information about the creation of a Drill Down Variant, please refer to the Create new Drill Down Variant section.
Add Existing Drill Down Variant - (SHIFT + F4) displays all of the existing Drill Down Variants in the system; you can select one or more variants to be added to the current Test Run.
Add Drill Down Variants of Run - (SHIFT + F5) lets you select the Variants used in another run (distinct from the current one) and add them to the current run as well.
Generate Drill Down Variants - (SHIFT + F6) please refer to the Generate Drill Down Variants section for more information.
Copy Variants - Please refer to the Copy DrillDown Variants section for function details.
Generate Tasks for Drill Down
The tasks are generated for the following step, 'Execute Drill Down Tasks'. You can specify the key figures (InfoObjects) to be ignored during the comparison of the Drill Down Variants specified for this run. These ignored key figure columns are not visible in the 'Display Results'. The generation of the tasks is executed by clicking on the 'Create Tasks' (F8) button.
You can enable Automated Root Cause Analysis for comparison tasks. Please refer to the Automated Root Cause Analysis section for details.
Selection of ignored key figure columns
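As a simple illustration of how ignored key figure columns behave during the comparison, the following Python sketch may help. It is not SNP Validate code: the function name compare_rows, the column names, and the data shapes are hypothetical and only show that ignored key figures take no part in the comparison and therefore never appear as differences in 'Display Results'.

def compare_rows(row_a, row_b, ignored_key_figures):
    """Return the key figure columns whose values differ, skipping ignored ones."""
    differences = []
    for column, value_a in row_a.items():
        if column in ignored_key_figures:
            continue  # ignored columns are excluded from the comparison and the results
        if row_b.get(column) != value_a:
            differences.append(column)
    return differences

# Example: 'AMOUNT' differs but is ignored, so only 'QUANTITY' is reported.
print(compare_rows(
    {"AMOUNT": 100.0, "QUANTITY": 5},
    {"AMOUNT": 120.0, "QUANTITY": 7},
    ignored_key_figures={"AMOUNT"},
))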
Execute Drill Down Tasks
Double-click on this step to execute it. First, you can define the number of background jobs to be used for task execution; each task executes a Drill Down test for one Drill Down Variant. The Drill Down testing logic is explained below.
The Drill Down test variant compares the data from two InfoProviders; in most cases it is the same InfoProvider on two different systems. Based on the Drill Down characteristics selected in the variant definition, the execution starts by adding the first specified Drill Down characteristic to the ListCube output, which is then read on both InfoProviders. The ListCube outputs are immediately compared. In two out of three scenarios, the Drill Down test execution ends here:
- One InfoProvider did not return data while the other one did. In this case, one InfoProvider has no data, so the execution does not continue, as all further comparisons would fail as well.
- The data returned by both InfoProviders is the same. Since everything is correct, the Drill Down task stops the execution to free up resources for other tasks.
In the third scenario, some or all of the data returned by the InfoProviders is not the same. In this case, a new test cycle begins and the erroneous data is examined. Using the SNP Validate setting 'DrillDown maximum records', up to X distinct values that belong to the erroneous records are selected for the first added characteristic (the one added at the start of the first cycle). These values act as filter values for this characteristic in the next test cycle (drilling down into the erroneous records).
The next characteristic in the order of the specified Drill Down characteristics (if no more are available, the execution stops here) is added to the ListCube output for reading. The ListCube reads are repeated for both InfoProviders with the new settings, and the data is compared again. Afterward, either a new test cycle begins using the same logic just described, or the execution ends, depending on the result of the comparison.
If the SNP Validate setting 'DrillDown additional aggr.' is set to 'X', the data returned in each cycle is aggregated again after the InfoProviders are read. This functionality can be used for cases when ListCube returns multiple rows with the same key because of unsupported functions.
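The control flow of these test cycles can be summarized in a short sketch. The following Python pseudocode is not SNP Validate code: read_listcube, aggregate, the system names, and the constants mirroring the 'DrillDown maximum records' and 'DrillDown additional aggr.' settings are hypothetical stand-ins used only to illustrate the cycle logic described above.

# Illustrative sketch of the drill-down cycles (not SNP Validate code).
# 'read_listcube' stands in for the ListCube read on one system and is assumed
# to return a list of row dictionaries for the selected characteristics and filters.

MAX_RECORDS = 100        # mirrors the 'DrillDown maximum records' setting
ADDITIONAL_AGGR = False  # mirrors the 'DrillDown additional aggr.' setting

def run_drilldown(characteristics, read_listcube, aggregate):
    """Drill down into erroneous records until the data matches, one
    InfoProvider returns nothing, or no further characteristics remain."""
    selected = []   # characteristics currently included in the ListCube output
    filters = {}    # characteristic -> filter values narrowing the read to erroneous records

    for characteristic in characteristics:
        selected.append(characteristic)

        rows_a = read_listcube("SOURCE", selected, filters)
        rows_b = read_listcube("TARGET", selected, filters)

        if ADDITIONAL_AGGR:
            # Re-aggregate rows sharing the same key, e.g. when ListCube
            # returns duplicate keys because of unsupported functions.
            rows_a, rows_b = aggregate(rows_a), aggregate(rows_b)

        if not rows_a or not rows_b:
            # Scenario 1: one side has no data; further comparisons would fail.
            return "stopped: one InfoProvider returned no data"
        if rows_a == rows_b:
            # Scenario 2: data matches; stop to free resources for other tasks.
            return "finished: data is identical"

        # Scenario 3: differences found. Take up to MAX_RECORDS distinct values
        # of the characteristic added in this cycle from the erroneous rows and
        # use them as filter values for the next, deeper cycle (an assumed
        # generalization of the behaviour described for the first cycle).
        erroneous = [row for row in rows_a if row not in rows_b]
        values = {row[characteristic] for row in erroneous}
        filters[characteristic] = sorted(values)[:MAX_RECORDS]

    return "stopped: differences remain but no more characteristics to drill into"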
Display results
By double-clicking on this step, you can display the outputs of the Drill Down Variant executions and their comparison results. Please refer to the Results overview chapter for more details.