(DV-2311) MS Document Template

The new concept of MS Document Templates lets you define your own document layout. A template is a simple Microsoft Word .docx file that contains placeholders. The placeholders are replaced with real values when the Test Plan documentation is generated. All allowed placeholders reflect the real data structure provided by the Validate Test Plan.

Manage Test Plan default templates

Templates are stored in the SNP Validate Document Repository, which is accessible in the Test Management settings.

The default templates delivered in the SNP Validate transport are:

  1. Test Plan Default Template [ID: DVD_VAL_DOCX_TP_DEFAULT_TEMPLATE]: Contains all Test Plan-specific information, like the old template with the fixed layout.

  2. Test Plan w/o details Template [ID: DVD_VAL_DOCX_TP_WO_DETAILS_TEMPLATE]: Contains all Test Plan-specific information without variant details.

A default template can be downloaded to the local PC, modified, and uploaded back to the SNP Validate Document Repository under a new ID with the prefix ZVAL_DOCX_. Overwriting the default templates is not allowed!
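The naming rule can be sketched as a small check. The ZVAL_DOCX_ prefix and the two default IDs come from this document; the function itself is purely illustrative:

```python
# Illustrative sketch of the custom template ID naming rule.
# The ZVAL_DOCX_ prefix and the default IDs are taken from this
# document; the function name is hypothetical.
DEFAULT_TEMPLATE_IDS = {
    "DVD_VAL_DOCX_TP_DEFAULT_TEMPLATE",
    "DVD_VAL_DOCX_TP_WO_DETAILS_TEMPLATE",
}

def is_valid_custom_template_id(template_id: str) -> bool:
    """A custom template must use the ZVAL_DOCX_ prefix and must not
    overwrite a delivered default template."""
    return (
        template_id.startswith("ZVAL_DOCX_")
        and template_id not in DEFAULT_TEMPLATE_IDS
    )
```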

Create your own template

To create or change a template, you first have to enable the Developer tab in Microsoft Word (File > Options > Customize Ribbon).

Go to the Developer tab and turn on Design Mode.

Create your own text layout of the Test Plan documentation with the desired formatting. Write placeholder names (use only allowed placeholder names) in the places that will later be replaced with real values by the generator. The formatting applied to a placeholder name is also applied to the real value.


Save the template as a Microsoft Word .docx file and upload it to the SNP Validate Document Repository.

Create a simple placeholder

Convert the written placeholder names into real placeholders that the Test Plan documentation generator can recognize, as follows:

  1. Select the placeholder name.

  2. In the Developer tab, click the Rich Text Content Control icon and then click Properties.

  3. Enter the tag name. A tag name starting with f_* or d_* refers to a real field in the data structure of the test plan.

  4. Repeat this for all placeholders.
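The result of these steps can be inspected programmatically: a .docx file is a ZIP archive, and every Rich Text Content Control is stored as a w:sdt element in word/document.xml, with its tag name in w:sdtPr/w:tag/@w:val. The following sketch, using only the Python standard library (the function name is illustrative), lists all placeholder tags in a template:

```python
import zipfile
import xml.etree.ElementTree as ET

# WordprocessingML namespace used inside word/document.xml
W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def list_placeholder_tags(docx_file):
    """Return the tag names of all content controls in a .docx file.

    A .docx is a ZIP archive; each content control (w:sdt element)
    carries its tag name in a w:tag child element's w:val attribute.
    Accepts a file path or a file-like object.
    """
    with zipfile.ZipFile(docx_file) as zf:
        root = ET.fromstring(zf.read("word/document.xml"))
    return [tag.get(W + "val") for tag in root.iter(W + "tag")]
```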

Create placeholders for repeated content

To create repeated rows or text fragments, use the following procedure:

  1. Select the row or text fragment(s) and click the Repeating Section Content Control icon in the Developer tab.

  2. Place the cursor at the beginning or end of the control and click Properties.

  3. Set the tag name of the repeated part, starting with t_*, which refers to a real table name in the data structure of the test plan.

  4. Tag all placeholders in the document.


In the same way, you can repeat any content, from text fragments up to entire pages. Just select the content, click the Repeating Section Content Control icon, and set the proper tag.

The repeated part, including its placeholder names (especially the tags), has to reflect the real hierarchy of fields and tables in the data structure of the Test Plan.
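Conceptually, the generator expands a repeating section by duplicating the fragment inside the t_* control once per table row and filling its f_* placeholders from that row. A minimal sketch of this idea, with an invented text-based placeholder syntax (the real generator works on the .docx XML, not on plain text):

```python
def expand_repeating_section(fragment, rows):
    """Duplicate a template fragment once per row, replacing each
    {f_*} placeholder with the value from that row. Purely
    illustrative; the {field} syntax is invented for this sketch."""
    expanded = []
    for row in rows:
        text = fragment
        for field, value in row.items():
            text = text.replace("{" + field + "}", str(value))
        expanded.append(text)
    return "\n".join(expanded)

# Example: a t_history-like table with two status rows.
history = [
    {"f_status": "Active", "f_set_by": "TESTER1"},
    {"f_status": "Closed", "f_set_by": "TESTER2"},
]
```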

Create placeholders for sub-documents

A placeholder can also be used to insert the Test Case / Test Plan status document or the main document of a Test Case (e.g. a document for manual testing). The principle is the same as for a simple placeholder. A placeholder name that refers to a document starts with the prefix d_*.

The sub-document is inserted into the main document during generation and keeps all of its formatting. The sub-document has to be stored in the new Microsoft Word .docx format; otherwise, a message will appear stating that the sub-document will not be inserted during generation. The sub-document placeholder name has to refer to the real data structure of the Test Plan.
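The .docx-format requirement can be checked heuristically: a .docx file is a ZIP archive containing word/document.xml, while the legacy binary .doc format is not a ZIP archive at all. A small sketch (the function name is illustrative):

```python
import zipfile

def looks_like_docx(path_or_file):
    """Heuristically decide whether a file is in the new .docx format:
    it must be a ZIP archive that contains word/document.xml (legacy
    binary .doc files are not ZIP archives)."""
    if not zipfile.is_zipfile(path_or_file):
        return False
    with zipfile.ZipFile(path_or_file) as zf:
        return "word/document.xml" in zf.namelist()
```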

Placeholder list

The data structure with the placeholder (tag) names of each Test Plan is as follows:

<TEST_PLAN_DATA>
  f_gen_system: Name of the SAP system from which the document is generated
  f_gen_date: Date and time of generation in the local date-time format
  f_gen_user: Name of the user who generated the document
  f_tech_name: Technical name of the test plan
  f_description: Description of the test plan
  f_untested: Number of untested test cases
  f_inprocess: Number of test cases in process
  f_successful: Number of test cases executed successfully
  f_warning: Number of test cases finished with a warning
  f_failed: Number of failed test cases
  f_finished: Number of test cases where execution has finished
  d_status_document: Content of the status document of the test plan (the status document has to be stored in the new Microsoft Word .docx format)
  t_history: List of historical changes of the test plan status
    f_status: Test plan status (Active|Closed)
    f_description: Description/note of the status
    f_status_date: Date and time the status was set, in the local date-time format
    f_set_by: User name of the person who set the status
  t_test_case: List of the test cases in the test plan
    f_tech_name: Technical name of the test case
    f_description: Description of the test case
    f_test_type: Test type of the test case
    f_status: Last status of the test case
    f_status_date: Date and time the test case status was set, in the local date-time format
    f_set_by: User name of the person who set the test case status
    f_untested: Number of untested variants
    f_inprocess: Number of variants in process
    f_successful: Number of variants executed successfully
    f_warning: Number of variants finished with a warning
    f_failed: Number of failed variants
    f_finished: Number of variants where execution has finished
    d_document: Test case document (e.g. the testing step list document for a manual test)
    d_status_document: Content of the status document of the test case (the status document has to be stored in the new Microsoft Word .docx format)
    t_history: List of historical changes of the test case status
      f_status: Test case status (Untested|In Process|Failed|Successful|Re-test required|Depreciated|Ignored|…)
      f_description: Description/note of the status
      f_status_date: Date and time the status was set, in the local date-time format
      f_set_by: User name of the person who set the status
    t_variant: List of variants in the test case
      f_variant_id: Unique ID of the variant
      f_result: Reporting status result of the variant
      f_rfc_destination: RFC destination used in the variant definition
      f_description: Description of the variant
      f_after_image_variant_id: After-image variant ID (if it exists)
      f_before_image_status: Status of the before-image creation
      f_after_image_status: Status of the after-image creation (if it exists)
      f_compare_task_status: Status of the comparison task execution
      f_compare_result_status: Status from the comparison
      t_detail: List of additional variant definition values (test-type specific)
        f_name: Name of the specific variant value
        f_value: Corresponding variant value
  t_directory: List of directories in the test plan
    f_description: Directory description
    f_name: Directory technical name
    f_untested: Number of untested test cases in the directory
    f_in_process: Number of test cases in the directory in the processing state
    f_failed: Number of failed test cases in the directory
    f_successful: Number of successfully tested test cases in the directory
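The hierarchy above can be pictured as nested data: tables (t_*) are lists of rows, and each row holds fields (f_*), documents (d_*), and possibly nested tables. A heavily abbreviated sketch, with only a few representative keys ("..." marks omitted values; the Python representation itself is illustrative, not the generator's internal format):

```python
# Abbreviated shape of the Test Plan data structure. Every key is a
# real tag name from the placeholder list; "..." marks omitted values.
TEST_PLAN_DATA_SHAPE = {
    "f_tech_name": "Technical name of the test plan",
    "d_status_document": "...",
    "t_history": [
        {"f_status": "Active", "f_set_by": "..."},
    ],
    "t_test_case": [
        {
            "f_tech_name": "...",
            "t_history": [{"f_status": "Untested", "f_set_by": "..."}],
            "t_variant": [
                {
                    "f_variant_id": "...",
                    "t_detail": [{"f_name": "...", "f_value": "..."}],
                },
            ],
        },
    ],
    "t_directory": [
        {"f_name": "...", "f_description": "..."},
    ],
}
```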