CN117520137A - Tracking method of test progress, electronic device and computer readable medium - Google Patents


Info

Publication number
CN117520137A
CN117520137A (Application No. CN202210900624.3A)
Authority
CN
China
Prior art keywords: test, plan, environment, items, item
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210900624.3A
Other languages
Chinese (zh)
Inventor
郭向兵
韦征
张卫龙
权娇
徐化东
李伟山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZTE Corp
Original Assignee
ZTE Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZTE Corp filed Critical ZTE Corp
Priority to CN202210900624.3A
Priority to PCT/CN2023/098537 (WO2024021877A1)
Publication of CN117520137A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3664Environments for testing or debugging software

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The present disclosure provides a method for tracking test progress, including: establishing a test item pool, wherein the test item pool comprises a plurality of test items; acquiring a set of items to be tested from the test item pool according to a test scene in a test environment, and setting a test plan; scheduling test items in the test plan on the test environment, and updating test progress data; and pushing a tracking result of the test progress according to the test progress data. The present disclosure also provides an electronic device and a computer-readable medium.

Description

Tracking method of test progress, electronic device and computer readable medium
Technical Field
The disclosure relates to the field of testing technology, and in particular, to a method for tracking testing progress, an electronic device, and a computer readable medium.
Background
In the testing field, test progress is mainly tracked based on test time or test logs, and the progress of manual testing and that of automated testing usually need to be tracked separately; simultaneous tracking of the progress of manual and automated testing is rarely addressed. In addition, test plans are currently created manually by test experts, which requires a great deal of labor; moreover, the information obtained by tracking the test progress is not rich, and the user experience is poor.
Disclosure of Invention
The embodiment of the disclosure provides a test progress tracking method, electronic equipment and a computer readable medium.
In a first aspect, an embodiment of the present disclosure provides a method for tracking a test progress, including:
establishing a test item pool, wherein the test item pool comprises a plurality of test items;
acquiring a set of items to be tested from the test item pool according to a test scene in a test environment, and setting a test plan;
scheduling test items in the test plan on the test environment, and updating test progress data;
pushing a tracking result of the test progress according to the test progress data.
In some embodiments, scheduling test entries in the test plan on the test environment, updating test progress data, includes:
scheduling test items in an automated test plan on the test environment, and updating test progress data of the automated test plan;
scheduling test items in a manual test plan on the test environment, and updating test progress data of the manual test plan;
wherein automated test items in the set of items to be tested are categorized as the automated test plan, and manual test items in the set of items to be tested are categorized as the manual test plan.
In some embodiments, scheduling test entries in an automated test plan on the test environment, updating test progress data of the automated test plan, includes:
scheduling test items in the automatic test plan on the test environment according to a scheduling configuration table, wherein an execution strategy and test environment information are configured in the scheduling configuration table;
according to the automatic test link field of the test item in the automatic test plan stored in the test plan sub-database, retrieving and executing the automatic test script of the test item;
and collecting an environment label, a version label, a use case number, a use case title, a test result, execution time, CPU resource information, memory resource information and an execution report of the test environment, and updating the collected information to a use case analysis sub-database.
In some embodiments, scheduling test entries in an automated test plan on the test environment, updating test progress data of the automated test plan, further comprises:
when the test result of a test item in the automated test plan is a failure and the cause of the failure is not a fault, updating the final result field of the test item in the use case analysis sub-database to indicate that rescheduling is needed;
periodically searching the use case analysis sub-database, according to the environment label and the version label, for test items whose final result field indicates that rescheduling is needed;
and rescheduling, at the scheduled times, the test items whose final result field indicates that rescheduling is needed.
In some embodiments, scheduling test entries in a manual test plan on the test environment, updating test progress data of the manual test plan, includes:
acquiring test items in a manual test plan according to an environment label and a version label, so that a tester can perform manual test on the test environment according to the test steps of the test items;
and updating the test result, the execution time, the failure reason, the CPU resource information and the memory resource information into the use case analysis sub-database according to the manual test result.
In some embodiments, building a pool of test entries includes:
adding test items into the test item pool according to a predefined test item template;
acquiring an automatic test script of the test item;
updating the automation identification of the test item, and adding the path of the automation test script to the test item.
In some embodiments, the test entries in the test entry pool are stored in a test entry sub-database, the test entries stored in the test entry sub-database comprising: case number field, case header field, test step field, expected result field, test scenario field, automation identification field, automation test link field.
In some embodiments, according to a test scenario in a test environment, acquiring a set of items to be tested from the test item pool, and setting a test plan includes:
obtaining test items corresponding to the test scene from the test item pool to form the item set to be tested;
classifying the automatic test items in the item set to be tested as an automatic test plan, and classifying the manual test items in the item set to be tested as a manual test plan;
environmental and version tags are added to test items in the automated test plan and the manual test plan.
In some embodiments, the automated test plan and the manual test plan are stored in a test plan sub-database, the test plan sub-database storing test entries comprising: the environment label field, the version label field, the case number field, the case title field, the test step field, the test result field, the automation identification field, the automation test link field, the latest result field, the final result field, the failure reason field, the Central Processing Unit (CPU) resource field and the memory resource field.
In some embodiments, pushing the tracking result of the test progress according to the test progress data includes:
summarizing test progress data of each test environment and each test version of the test plan from a use case analysis sub-database according to the environment label and the version label;
pushing tracking results of the test progress of the test plans of each test environment and each test version according to the summarized results.
In some embodiments, pushing the tracking result of the test progress of each test environment, each test version of the test plan according to the summarized result includes:
and carrying out early warning on real-time test progress of the test plans of each test environment and each test version according to the summarized results.
In some embodiments, pushing the tracking result of the test progress of each test environment, each test version of the test plan according to the summarized result includes:
comparing, across different versions in the same test environment, the failure reasons of failed test items and/or the resource information of successful test items;
pushing the comparison result.
In a second aspect, embodiments of the present disclosure provide an electronic device, including:
one or more processors;
and a memory having one or more programs stored thereon, which when executed by the one or more processors, cause the one or more processors to implement a method of tracking test progress according to the first aspect of the embodiments of the present disclosure.
In a third aspect, embodiments of the present disclosure provide a computer readable medium having stored thereon a computer program which, when executed by a processor, implements the method of tracking test progress according to the first aspect of embodiments of the present disclosure.
According to the test progress tracking method, a test item pool is established, test items can be obtained from the test item pool according to the test scene in the test environment to set a test plan, and then, while the test items in the test plan are scheduled, the test progress data is updated and the tracking result is pushed. Automated testing and manual testing can thus be tracked on the same system, which helps users accurately perceive the test progress and control risks in real time, reduces labor investment, and improves user experience.
Drawings
FIG. 1 is a flow chart of a method of tracking test progress in an embodiment of the present disclosure;
FIG. 2 is a flow chart of some steps in another method of tracking test progress in an embodiment of the present disclosure;
FIG. 3 is a block diagram of one component of an electronic device in an embodiment of the present disclosure;
FIG. 4 is a block diagram of one component of a computer-readable medium in an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a test progress tracking system in an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of another test progress tracking system in an embodiment of the present disclosure.
Detailed Description
In order to better understand the technical solutions of the present disclosure, the following describes in detail a tracking method, an electronic device, and a computer readable medium for a test progress provided by the present disclosure with reference to the accompanying drawings.
Example embodiments will be described more fully hereinafter with reference to the accompanying drawings, but may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Embodiments of the disclosure and features of embodiments may be combined with each other without conflict.
As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and the present disclosure, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
In a first aspect, referring to fig. 1, an embodiment of the present disclosure provides a method for tracking a test progress, including:
S1, establishing a test item pool, wherein the test item pool comprises a plurality of test items;
S2, acquiring a set of items to be tested from the test item pool according to a test scene in a test environment, and setting a test plan;
S3, scheduling test items in the test plan on the test environment, and updating test progress data;
and S4, pushing a tracking result of the test progress according to the test progress data.
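As an illustrative, non-limiting sketch of how steps S1 to S4 might fit together, the following Python fragment strings the four steps into a single tracking routine; the function names, field names and sample data are assumptions introduced for illustration only and are not taken from the embodiments above.

```python
# Minimal sketch of the four-step flow (S1-S4); names and data are illustrative only.

def build_test_item_pool():
    # S1: establish a test item pool containing several test items.
    return [
        {"case_no": "TC-001", "scenario": "login", "automated": True},
        {"case_no": "TC-002", "scenario": "login", "automated": False},
        {"case_no": "TC-003", "scenario": "alarm", "automated": True},
    ]

def set_test_plan(pool, scenario):
    # S2: pick the items matching the test scene of the environment.
    return [item for item in pool if item["scenario"] == scenario]

def schedule_and_update(plan):
    # S3: schedule each item and record progress data (stubbed here).
    return [{"case_no": item["case_no"], "result": "untested"} for item in plan]

def push_tracking_result(progress):
    # S4: push a tracking result derived from the progress data.
    done = sum(1 for p in progress if p["result"] != "untested")
    print(f"progress: {done}/{len(progress)} items executed")

if __name__ == "__main__":
    pool = build_test_item_pool()
    plan = set_test_plan(pool, scenario="login")
    push_tracking_result(schedule_and_update(plan))
```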
In the disclosed embodiments, the test items in the test item pool may include automated test items, as well as manual test items. The embodiments of the present disclosure are not particularly limited thereto. In embodiments of the present disclosure, test items in a test item pool may correspond to multiple test scenarios in multiple test environments. The embodiments of the present disclosure are not particularly limited thereto.
In the embodiment of the disclosure, acquiring a set of items to be tested from a test item pool according to a test scene in a test environment refers to extracting test items corresponding to the test scene from the test item pool to form the set of items to be tested; according to different test scenes in different test environments, a set of items to be tested corresponding to different test scenes in different test environments can be obtained.
In some embodiments, the test plan is set as a subset of the test items in the set of items to be tested. The embodiments of the present disclosure are not particularly limited thereto.
In embodiments of the present disclosure, scheduling test items in a test plan on a test environment may include scheduling automatic test items on the test environment, or may include scheduling manual test items on the test environment. The embodiments of the present disclosure are not particularly limited thereto.
In some embodiments, the test environment is a test environment of a telecommunications network management product, and the test scenario is a test scenario of the telecommunications network management product. The embodiments of the present disclosure are not particularly limited thereto.
According to the test progress tracking method, a test item pool is established, test items can be obtained from the test item pool according to the test scene in the test environment to set a test plan, and then, while the test items in the test plan are scheduled, the test progress data is updated and the tracking result is pushed. Automated testing and manual testing can thus be tracked on the same system, which helps users accurately perceive the test progress and control risks in real time, reduces labor investment, and improves user experience.
The disclosed embodiments are not particularly limited as to how to schedule test entries in a test plan and update test progress data on a test environment.
In some embodiments, the set of items to be tested is divided into an automated test item subset and a manual test item subset according to whether each test item is automated, the automated test item subset and the manual test item subset are classified into an automated test plan and a manual test plan respectively, and the automated test plan and the manual test plan are scheduled independently when the test items in the test plan are scheduled.
Accordingly, in some embodiments, referring to fig. 2, scheduling test entries in the test plan on the test environment, updating test progress data, includes:
S31, scheduling test items in an automated test plan on the test environment, and updating test progress data of the automated test plan;
S32, scheduling test items in a manual test plan on the test environment, and updating test progress data of the manual test plan;
wherein automated test items in the set of items to be tested are categorized as the automated test plan, and manual test items in the set of items to be tested are categorized as the manual test plan.
It should be noted that an automated test item refers to a test item that can be automated and for which the development of an automated test script has been completed; a manual test item refers to a test item that cannot be automated or for which the development of an automated test script has not been completed.
In the embodiment of the disclosure, the item set to be tested is divided into the automatic test plan and the manual test plan which are independent of each other, and the automatic test plan and the manual test plan are respectively scheduled, so that the independence of test services of the automatic test and the manual test on the same test environment can be realized, the mutual interference of the automatic test and the manual test on the same test environment can be avoided, and the simultaneous tracking of the automatic test and the manual test is facilitated.
The disclosed embodiments do not make particular restrictions on how to schedule test items in an automated test plan on a test environment.
In some embodiments, scheduling test entries in an automated test plan on the test environment, updating test progress data of the automated test plan, includes: scheduling test items in the automatic test plan on the test environment according to a scheduling configuration table, wherein an execution strategy and test environment information are configured in the scheduling configuration table; according to the automatic test link field of the test item in the automatic test plan stored in the test plan sub-database, retrieving and executing the automatic test script of the test item; and collecting an environment label, a version label, a use case number, a use case title, a test result, execution time, Central Processing Unit (CPU) resource information, memory resource information and an execution report of the test environment, and updating the information into a use case analysis sub-database.
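As a rough, non-limiting sketch of this scheduling flow, the following Python fragment resolves an automated test script from the automated test link field, executes it, and appends a result record to an in-memory stand-in for the use case analysis sub-database; the configuration keys, record fields and the assumption that the link field stores a locally runnable script path are all illustrative.

```python
# Illustrative sketch only: schedule one automated test item from a scheduling
# configuration table and record its result; the table layout, field names and
# in-memory "database" are simplified stand-ins, not the actual schema.
import subprocess
import time

schedule_config = {
    "execution_policy": "manual",     # e.g. manual / real-time / timed
    "env_access_ip": "10.0.0.1",      # hypothetical test environment address
    "env_label": "env-A",
    "version_label": "V1.0",
    "test_scenario": "login",
}

case_analysis_db = []  # stands in for the use case analysis sub-database

def run_automated_item(item):
    """Resolve the script via the automated test link field and execute it."""
    start = time.time()
    proc = subprocess.run(
        ["python", item["auto_test_link"]],   # assumes the link is a script path
        capture_output=True,
    )
    case_analysis_db.append({
        "env_label": schedule_config["env_label"],
        "version_label": schedule_config["version_label"],
        "case_no": item["case_no"],
        "case_title": item["case_title"],
        "result": "success" if proc.returncode == 0 else "failure",
        "exec_time": round(time.time() - start, 2),
        "report": proc.stdout.decode(errors="replace"),
    })
```

Environment-level CPU and memory readings would typically be collected inside the script itself; a sketch of that script-side collection is given further below.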
The embodiment of the present disclosure does not particularly limit the execution policy in the scheduling configuration table. For example, the execution policy includes any of manual triggering, real-time triggering, and timed triggering.
The embodiment of the present disclosure also does not particularly limit the test environment information in the scheduling configuration table. For example, the test environment information includes at least one of an environment access Internet Protocol (IP) address, an environment label, and a test scenario.
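To make the execution policy examples above concrete, the following non-limiting sketch shows one way the three trigger types could drive a scheduling run; the policy names follow the examples above, and the scheduling callable and interval are placeholders.

```python
# Illustrative dispatcher: how manual, real-time and timed triggers might
# invoke a scheduling run; run_plan is a placeholder callable.
import sched
import time

def dispatch(policy, run_plan, interval_s=3600):
    if policy in ("manual", "real-time"):
        run_plan()  # manual: user-triggered; real-time: triggered when a version arrives
    elif policy == "timed":
        scheduler = sched.scheduler(time.time, time.sleep)
        def periodic():
            run_plan()
            scheduler.enter(interval_s, 1, periodic)
        scheduler.enter(interval_s, 1, periodic)
        scheduler.run()  # blocks; a real system would run this in a worker
```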
It should be noted that, in the embodiment of the present disclosure, for a test item capable of being automated, an automated test script is developed according to a test step of the test item, and the automated test script is stored in a test script sub-database; test entries for the automated test plan and the manual test plan are stored in a test plan sub-database. In the test plan sub-database, the automated test link field of the test item of the automated test plan is used to save the path of the automated test script of the test item, so that the automated test script of the test item can be retrieved and executed according to the automated test link field of the test item.
In some embodiments, automated test items in an automated test plan can be scheduled multiple times. For example, when the test result of an automated test item is a failure, the automated test item can be rescheduled.
Accordingly, in some embodiments, scheduling test items in an automated test plan on the test environment, updating test progress data of the automated test plan, further comprises: when the test result of a test item in the automated test plan is a failure and the cause of the failure is not a fault, updating the final result field of the test item in the use case analysis sub-database to indicate that rescheduling is needed; periodically searching the use case analysis sub-database, according to the environment label and the version label, for test items whose final result field indicates that rescheduling is needed; and rescheduling those test items at the scheduled times.
The cause of a test item failure is not particularly limited by the embodiments of the present disclosure. For example, failure causes include faults, problems with the test environment, use case design problems, business interface changes, and the like. It should be noted that, in the embodiments of the present disclosure, a non-fault failure cause means that the test item was not successfully executed due to a factor other than the test item itself. For example, a problem with the test environment results in a scheduling failure for the test item.
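The rescheduling rule described above can be sketched as follows; the cause strings, field names and in-memory list are illustrative assumptions rather than the actual values used by the embodiments.

```python
# Sketch of the "non-fault failure => reschedule" rule; values are illustrative.
NON_FAULT_CAUSES = {"environment problem", "broken link"}

def mark_for_reschedule(record):
    # Flag an item whose failure was caused by something other than a fault.
    if record["result"] == "failure" and record.get("failure_cause") in NON_FAULT_CAUSES:
        record["final_result"] = "needs rescheduling"

def items_to_reschedule(case_analysis_db, env_label, version_label):
    # Periodic lookup by environment label and version label.
    return [r for r in case_analysis_db
            if r["env_label"] == env_label
            and r["version_label"] == version_label
            and r.get("final_result") == "needs rescheduling"]
```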
The disclosed embodiments are not particularly limited as to how test items in a manual test plan are scheduled on a test environment.
In some embodiments, scheduling test entries in a manual test plan on the test environment, updating test progress data of the manual test plan, includes: acquiring test items in a manual test plan according to an environment label and a version label, so that a tester can perform manual test on the test environment according to the test steps of the test items; and updating the test result, the execution time, the failure reason, the CPU resource information and the memory resource information into the use case analysis sub-database according to the manual test result.
It should be noted that, in the embodiments of the present disclosure, for a manual test plan, a tester selects the corresponding test items through the environment label and the version label, and performs manual testing in the test environment according to the test steps of those test items.
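A minimal sketch of the manual-side update might look as follows; the function signature and field names are assumptions, and the list passed in stands in for the use case analysis sub-database.

```python
# Illustrative recording of a manually executed test item.
def record_manual_result(db, case_no, env_label, version_label, result,
                         exec_time_s, failure_cause=None,
                         cpu_percent=None, mem_percent=None):
    db.append({
        "case_no": case_no,
        "env_label": env_label,
        "version_label": version_label,
        "result": result,               # e.g. "success" or "failure"
        "exec_time": exec_time_s,
        "failure_cause": failure_cause, # filled in only for failures
        "cpu_percent": cpu_percent,
        "mem_percent": mem_percent,
    })

# Example: a tester records a failed run caused by an environment problem.
case_analysis_db = []
record_manual_result(case_analysis_db, "TC-002", "env-A", "V1.0",
                     "failure", 35.0, failure_cause="environment problem")
```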
The embodiments of the present disclosure are not particularly limited as to how the test entry pool is established.
In some embodiments, referring to FIG. 2, building a test entry pool includes:
S11, adding test items into the test item pool according to a predefined test item template;
S12, acquiring an automatic test script of the test item;
S13, updating the automation identification of the test item, and adding the path of the automation test script to the test item.
In the embodiment of the present disclosure, for the test item capable of automation, steps S12 to S13 are performed. Steps S11 to S13 are performed iteratively, each iteration adding a test entry to the test entry pool. After multiple iterations, a test entry pool is built.
In the disclosed embodiments, the test item templates are predefined. In some embodiments, the user adds a test item to the test item pool according to a test item template. In some embodiments, a user analyzes whether a test item in the test item pool can be automated, and for a test item that can be automated, develops an automated test script based on the test steps of the test item. In some embodiments, in addition to executing the test steps, the automated test script can, after completing the test service, automatically collect an environment label, a version label, a use case number, a use case title, a test result, execution time, CPU resource information, memory resource information, and an execution report of the test environment, and update the collected information to the use case analysis sub-database.
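As a non-limiting sketch of such a script-side wrapper, the following fragment runs the test steps and then gathers timing, CPU and memory readings and a short execution report; psutil is a third-party package chosen here for illustration, and every field name is an assumption rather than the actual schema.

```python
import time
import psutil  # third-party; used here only to illustrate resource collection

def run_with_metrics(test_steps, env_label, version_label, case_no, case_title):
    """Run the test steps, then collect labels, timing, resources and a report."""
    start = time.time()
    report_lines, result = [], "success"
    try:
        for step in test_steps:          # each step is a callable in this sketch
            step()
            report_lines.append(f"step {step.__name__}: ok")
    except Exception as exc:
        result = "failure"
        report_lines.append(f"failed: {exc}")
    return {
        "env_label": env_label,
        "version_label": version_label,
        "case_no": case_no,
        "case_title": case_title,
        "result": result,
        "exec_time": round(time.time() - start, 2),
        "cpu_percent": psutil.cpu_percent(interval=0.1),
        "mem_percent": psutil.virtual_memory().percent,
        "report": "\n".join(report_lines),
    }
```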
In the disclosed embodiment, the test entry templates are stored in a test entry template sub-database and the test entries added to the test entry pool are stored in a test entry sub-database. Since the test items are set according to the test item templates, the fields constituting the test items are identical to the fields constituting the test item templates.
The fields constituting the test item template and the fields constituting the test item are not particularly limited in the embodiments of the present disclosure.
In some embodiments, the test entries in the test entry pool are stored in a test entry sub-database, the test entries stored in the test entry sub-database comprising: case number field, case header field, test step field, expected result field, test scenario field, automation identification field, automation test link field.
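One possible shape for such a record, written as a Python dataclass whose attributes mirror the fields listed above, is sketched below; the types and defaults are assumptions.

```python
# One possible shape for a record in the test item sub-database; illustrative only.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TestItem:
    case_no: str                          # use case number field
    case_title: str                       # use case title field
    test_steps: List[str]                 # test step field
    expected_result: str                  # expected result field
    test_scenario: str                    # test scenario field
    automated: bool = False               # automation identification field
    auto_test_link: Optional[str] = None  # path of the automated test script

item = TestItem(
    case_no="TC-001",
    case_title="User login",
    test_steps=["open page", "enter credentials", "submit"],
    expected_result="login succeeds",
    test_scenario="login",
)
```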
The embodiments of the present disclosure are not particularly limited as to how the test plan is set.
In some embodiments, the set of items to be tested is divided into a subset of automatic test items and a subset of manual test items according to whether the test items are automatic test items, and the subset of automatic test items and the subset of manual test items are respectively classified into an automatic test plan and a manual test plan, so that the automatic test plan and the manual test plan can be independently scheduled when the test items in the test plan are scheduled.
Accordingly, in some embodiments, referring to fig. 2, according to a test scenario in a test environment, a set of items to be tested is obtained from the test item pool, and a test plan is set, including:
S21, acquiring test items corresponding to the test scene from the test item pool to form the item set to be tested;
S22, classifying the automatic test items in the item set to be tested as an automatic test plan, and classifying the manual test items in the item set to be tested as a manual test plan;
S23, adding environment labels and version labels for test items in the automatic test plan and the manual test plan.
In the disclosed embodiments, test entries for the automated test plan and the manual test plan are set to be stored in a test plan sub-database.
Accordingly, in some embodiments, the automated test plan and the manual test plan are stored in a test plan sub-database, the test plan sub-database storing test entries comprising: the environment label field, the version label field, the case number field, the case header field, the test step field, the test result field, the automation identification field, the automation test link field, the latest result field, the final result field, the failure reason field, the CPU resource field and the memory resource field.
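The classification into an automated test plan and a manual test plan, together with a plan record whose keys loosely mirror the fields listed above, can be sketched as follows; the automation rule anticipates the explanation in the next paragraph, and all names are illustrative.

```python
# Illustrative sketch: split the set of items to be tested into an automated
# test plan and a manual test plan, tagging each entry with environment and
# version labels; keys loosely mirror the test plan sub-database fields above.
def build_plans(items_to_test, env_label, version_label):
    automated_plan, manual_plan = [], []
    for item in items_to_test:
        # An item counts as automated only if it can be automated and its
        # automation script development is complete (see the next paragraph).
        automated = item.get("can_automate", False) and bool(item.get("auto_test_link"))
        entry = {
            "env_label": env_label,
            "version_label": version_label,
            "case_no": item["case_no"],
            "case_title": item["case_title"],
            "test_steps": item.get("test_steps", []),
            "test_result": None,
            "automated": automated,
            "auto_test_link": item.get("auto_test_link"),
            "latest_result": "untested",
            "final_result": "untested",
            "failure_cause": None,
            "cpu_resource": None,
            "mem_resource": None,
        }
        (automated_plan if automated else manual_plan).append(entry)
    return automated_plan, manual_plan
```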
It should be noted that the automation identification field is used to indicate whether the test item has been automated, that is, whether the test item can be automated and, if so, whether the development of its automation script has been completed. For example, if a test item cannot be automated, or if it can be automated but the development of its automation script has not been completed, the test item is not automated; if the test item can be automated and the development of its automation script has been completed, the test item is automated.
The embodiment of the disclosure does not particularly limit how to push the tracking result of the test progress according to the test progress data.
In some embodiments, referring to fig. 2, pushing the tracking result of the test progress according to the test progress data includes:
S41, summarizing test progress data of each test environment and each test version of a test plan from a use case analysis sub-database according to the environment label and the version label;
S42, pushing tracking results of the test progress of the test plans of each test environment and each test version according to the summarized results.
In the embodiments of the disclosure, the test progress tracking results of each test version in each test environment are automatically generated based on the environment label and the version label, so that the accuracy of the tracking results can be ensured, users can accurately perceive the test progress and version quality, and risks can be controlled in real time.
The tracking result is not particularly limited by the embodiments of the present disclosure.
In some embodiments, the tracking result is used to display the execution progress of the test services of each test environment and each test version, so that early-warning pushes of the test progress can be performed.
Accordingly, in some embodiments, pushing the tracking result of the test progress of the test plan of each test environment and each test version according to the summarized result includes: and carrying out early warning on real-time test progress of the test plans of each test environment and each test version according to the summarized results.
In some embodiments, the failure reasons of the failed use cases and/or the resource information of the successful use cases in different test versions in the same test environment can be compared, so that users can perceive the quality of different versions.
Accordingly, in some embodiments, pushing the tracking result of the test progress of the test plan of each test environment and each test version according to the summarized result includes: comparing, across different versions in the same test environment, the failure reasons of failed test items and/or the resource information of successful test items; and pushing the comparison result.
The resource information includes CPU resource information, memory resource information, and the like.
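A minimal sketch of the summarization and of the cross-version comparison might look as follows; the aggregation keys and field names are assumptions, and the list passed in stands in for the use case analysis sub-database.

```python
# Illustrative aggregation per (environment, version) and comparison of two
# versions in the same environment; field names are assumptions.
from collections import Counter, defaultdict

def summarize(case_analysis_db):
    summary = defaultdict(Counter)
    for rec in case_analysis_db:
        summary[(rec["env_label"], rec["version_label"])][rec["result"]] += 1
    return summary  # e.g. {("env-A", "V1.0"): Counter({"success": 40, "failure": 2})}

def compare_failure_causes(case_analysis_db, env_label, version_a, version_b):
    causes = {version_a: Counter(), version_b: Counter()}
    for rec in case_analysis_db:
        if (rec["env_label"] == env_label and rec["result"] == "failure"
                and rec["version_label"] in causes):
            causes[rec["version_label"]][rec.get("failure_cause", "unknown")] += 1
    return causes
```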
The embodiments of the present disclosure are not particularly limited as to how the tracking results are pushed.
In some embodiments, the tracking results are presented to the user's browser in real time through an external web service.
In some embodiments, the tracking results are pushed to the user's mailbox periodically through an external mail service.
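As an illustrative, non-limiting sketch of the timed mail push, the following fragment sends a plain-text summary over SMTP; the host, addresses and message format are placeholders, and authentication/TLS handling is omitted.

```python
# Sketch of a timed mail push of the summarized tracking result.
import smtplib
from email.message import EmailMessage

def push_summary_by_mail(summary_text, smtp_host="smtp.example.com",
                         sender="tracker@example.com", to="team@example.com"):
    msg = EmailMessage()
    msg["Subject"] = "Test progress tracking result"
    msg["From"] = sender
    msg["To"] = to
    msg.set_content(summary_text)
    with smtplib.SMTP(smtp_host) as server:   # credentials/TLS omitted in sketch
        server.send_message(msg)
```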
In a second aspect, referring to fig. 3, an embodiment of the present disclosure provides an electronic device, including:
one or more processors 101;
a memory 102 having one or more programs stored thereon, which when executed by one or more processors, cause the one or more processors to implement a method of tracking test progress as described in the first aspect of the embodiments of the present disclosure;
one or more I/O interfaces 103, coupled between the processor and the memory, are configured to enable information interaction of the processor with the memory.
Wherein the processor 101 is a device having data processing capabilities, including but not limited to a Central Processing Unit (CPU) or the like; the memory 102 is a device having data storage capabilities, including but not limited to random access memory (RAM, more specifically SDRAM, DDR, etc.), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), and flash memory (FLASH); and the I/O interface (read/write interface) 103 is connected between the processor 101 and the memory 102 to enable information interaction between them, and is implemented by, but not limited to, a data bus (Bus) or the like.
In some embodiments, processor 101, memory 102, and I/O interface 103 are connected to each other via bus 104, and thus to other components of the computing device.
In a third aspect, referring to fig. 4, an embodiment of the present disclosure provides a computer readable medium having a computer program stored thereon, which when executed by a processor implements the method for tracking test progress according to the first aspect of the embodiment of the present disclosure.
In order to enable those skilled in the art to more clearly understand the technical solutions provided by the embodiments of the present disclosure, the following details of the technical solutions provided by the embodiments of the present disclosure are described by specific embodiments:
example 1
This embodiment is mainly directed to the simultaneous tracking and real-time visibility of the progress of both the automated test and the manual test across multiple test scenes and multiple project versions of a telecom network management product.
The present embodiment defines a test item pool. Each test item in the pool has fields such as a use case number, a use case name, test steps, a test scenario, whether it is automated, and an automated test link; the user adds test items to the test item pool according to the defined test item template.
In this embodiment, the automated test plan and the manual test plan are adaptively created and adjusted. An automated test item subset (namely the automated test plan) and a manual test item subset (namely the manual test plan), each carrying environment labels and version labels, are adaptively extracted from the test item pool according to two dimensions: the test scene and version information of the test environment, and whether the test items are automated. All test items under the two test plans are periodically and automatically synchronized into the database, and the latest result and the final result are uniformly initialized to untested.
Automated test items in this embodiment may be scheduled multiple times and may be mixed with the manual test plan on the same test environment. For the automated test plan, a scheduling module schedules the test items under the plan on the environment under test according to the execution policy and test environment fields in the scheduling configuration table; each test item retrieves its corresponding automated test script according to its automated test link field; after completing the service test of a test item, the automated test script automatically collects the related information and updates it into the database; and scheduling management periodically retrieves from the database the automated test items that need to be rescheduled and reschedules them at the scheduled times. For the manual test plan, a tester selects the corresponding test items in the system through the environment label and the version label, performs manual testing in the test environment according to the displayed test steps, and updates the relevant information into the database.
This embodiment provides early warning of the test progress. Based on the environment label and the version label, the system automatically and quickly gathers the test progress data of each test environment in each version for early-warning pushes as required (such as real-time large-screen display and timed mail pushes). At the same time, comparisons of the failure reasons of failed test items and the resource information of successful test items across different versions in the same environment are made visible, so that users can accurately perceive the current test progress and version quality and control risks in real time.
Example 2
This embodiment applies to situations in which both the manual test and the automated test within an iteration cycle need to be tracked continuously, and the object under test has a variety of different test scenes and test versions. As shown in fig. 5, the test progress tracking system in this embodiment can communicate over the network with all test environments of the object under test; the test results are stored in the database of the system in real time; the system summarizes the test progress data in the database, then displays the summarized test progress to the user's browser through the external web service, and pushes the summarized test progress to the user's mailbox periodically through the external mail service.
The architecture of the test progress tracking system in this embodiment is shown in fig. 6, and includes:
test entry pool management: the user adds test items into a test item pool according to a defined test item template, wherein the test item template comprises fields of a use case number, a use case title, a test step, an expected result, a test scene, whether automation, an automation test link and the like, and all test items and templates in the test item pool are stored in a database system.
Test script library: an automated test script developer develops an automated test script according to the test steps of a certain test item in the test item pool, and for a test item whose script has been developed, the two field values of whether the item is automated and the automated test link are updated.
Test plan management: the system acquires the test scene from the test environment through a remote service adapter, extracts the corresponding set of items to be tested from the test item pool according to the test scene, and, tracking whether the test items are automated, classifies the set of items to be tested into two subsets: an automated test plan and a manual test plan. The two test plan subsets are marked with environment labels and version labels according to the version field and environment field of the test environment, and the two test plans are finally synchronized to the database system periodically. Each test item in the database comprises fields such as the environment label, version label, use case number, use case title, test steps, test result, whether the item is automated, and the automated test link, and the latest result and the final result of each test item in the database are initialized to untested.
Scheduling management: scheduling management schedules the test items under the test plan on the environment under test according to the execution policy (such as manual triggering, real-time triggering and timed triggering) and the test environment information (such as environment access IP, environment label, version label and test scene) fields in the scheduling configuration table; each test item retrieves its corresponding automated test script according to its automated test link field; and after completing the service test of a test item, each automated test script automatically collects the environment label, version label, use case number, use case title, test result, execution time, execution report and the like of the test environment and updates these fields to the database. Scheduling management also periodically retrieves from the database the automated test items that need to be rescheduled.
Use case analysis management: for the manual test plan, a tester selects the corresponding manual test items on the test progress tracking system through the environment label and the version label, then performs manual testing on the test environment according to the displayed test steps, and finally updates the final result to the database system. For automated test items, if the latest result is a failure, a test analyst analyzes the failure reason in combination with the execution report of the corresponding time point and updates the failure reason (such as a fault, an environment problem, a use case design problem, a service interface change, and the like) to the database; if the analyzed failure reason is a non-service fault (such as a problem of the environment itself, e.g., a broken link), the use case analyst corrects the final result to needs rescheduling, and scheduling management reschedules the test item periodically.
Summary and push management: based on the environment label and the version label, the system automatically gathers the test progress data of each test environment in each version and performs early-warning pushes as needed (such as real-time large-screen display and timed mail pushes). At the same time, comparisons of the failure reasons of failed test items and the resource information of successful test items across different versions in the same environment are made visible. Users can accurately perceive the current test progress and the quality of the product version as it evolves iteratively, and can control risks in real time, so that the labor spent on tedious data collation, summarization and reporting is greatly reduced.
Example 3
The tracking of the test progress in this embodiment includes:
and (3) establishing a test item pool. The first stage: the user adds test items to the test item pool according to the defined test item templates. The test entry templates contain fields for use case number, use case title, test steps, desired results, test scenario, whether automated, automated test links, etc. And a second stage: the user analyzes whether a certain test item in the test item pool can be automated, if yes, the development of an automatic test script is carried out according to the test steps of the test item, and the collection of environment information, version information and resource information (such as a CPU (Central processing unit), a memory and the like) is added to each automatic test script except the completion of the test steps. And a third stage: test items for which automated test script development has been completed are updated by the user if the automated field of the test item in the test item pool is yes, and the path of the automated test script is added to the automated test link field. The three stages iterate a plurality of times, and a test item pool is gradually built. The test entry pool is formalized into two sub-test entry pools from whether this dimension has been automated: an automated test entry pool, a manual test entry pool; from the test scenario, the test scenario is divided into a plurality of sub-test item pools: test scenario 1 test entry pool, test scenario 2 test entry pool, … …, test scenario N test entry pool. All test items and test item templates of the whole test item pool are stored in a test item sub-database and a test item template sub-database in the database, and all automation test scripts are stored in a test script sub-database in the database
Setting a test plan. According to the test scene of the test environment, the corresponding set of items to be tested is adaptively extracted from the test item pool; tracking whether the test items are automated, the set of items to be tested is classified into two subsets: an automated test plan and a manual test plan. The two test plan subsets are then marked with environment labels and version labels according to the version field and environment field of the test environment, and the two test plans are finally synchronized periodically to a test plan sub-database in the database. Each test item in the sub-database comprises fields such as the environment label, version label, use case number, use case title, test steps, test result, whether the item is automated, automated test link, latest result, final result, failure reason, CPU resource and memory resource, and the latest result and the final result of each test item in the sub-database are initialized to untested.
Updating progress data. Automated test items may be scheduled multiple times and may be executed mixed with the manual test plan on the same test environment. For the automated test plan, the scheduling module schedules the test items under the plan on the environment under test according to the execution policy (such as manual triggering, real-time triggering and timed triggering) and the test environment information (such as the environment access IP, environment label and test scene) in the scheduling configuration table; each test item retrieves its corresponding automated test script according to its automated test link field; and after completing the service test of a test item, each automated test script automatically collects information such as the environment label, version label, use case number, use case title, test result, execution time, CPU resource, memory resource and execution report of the test environment and updates the information to a use case analysis sub-database in the database. If the test result of the first scheduling is a failure, the test result is updated to the latest result field in the use case analysis sub-database, and a test analyst analyzes the failure reason in combination with the execution report of the corresponding time point and confirms the failure reason (such as a fault, an environment problem, a use case design problem, a service interface change, and the like); if the failure reason is a non-fault cause (such as a problem of the environment itself), the use case analyst updates the final result to needs rescheduling. Scheduling management periodically retrieves, from the use case analysis sub-database, all test items under the environment label and version label whose final result is needs rescheduling and reschedules them at the scheduled times. For the manual test plan, a tester selects the corresponding test items in the system through the environment label and the version label, performs manual testing in the test environment according to the displayed test steps, and updates the test result, execution time, failure reason, CPU resource and memory resource information to the use case analysis sub-database.
Test progress visibility and pushing. Based on the environment label and the version label, the system automatically gathers, in real time from the use case analysis sub-database, the test progress data of each test environment in each version and performs early-warning pushes as needed (such as real-time large-screen display and timed mail pushes). At the same time, comparisons of the failure reasons of failed test items and the resource information of successful test items across different versions in the same environment are made visible. Users can accurately perceive the current test progress and the quality of the product version as it evolves iteratively, and can control risks in real time, so that the labor spent on tedious data collation, summarization and reporting is greatly reduced.
Those of ordinary skill in the art will appreciate that all or some of the steps, systems, functional modules/units in the apparatus, and methods disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between the functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed cooperatively by several physical components. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, as known to those skilled in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. Furthermore, as is well known to those of ordinary skill in the art, communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
Example embodiments have been disclosed herein, and although specific terms are employed, they are used and should be interpreted in a generic and descriptive sense only and not for purpose of limitation. In some instances, it will be apparent to one skilled in the art that features, characteristics, and/or elements described in connection with a particular embodiment may be used alone or in combination with other embodiments unless explicitly stated otherwise. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the disclosure as set forth in the appended claims.

Claims (14)

1. A method of tracking test progress, comprising:
establishing a test item pool, wherein the test item pool comprises a plurality of test items;
acquiring a set of items to be tested from the test item pool according to a test scene in a test environment, and setting a test plan;
scheduling test items in the test plan on the test environment, and updating test progress data;
pushing a tracking result of the test progress according to the test progress data.
2. The method of claim 1, wherein scheduling test entries in the test plan on the test environment, updating test progress data, comprises:
Scheduling test items in an automated test plan on the test environment, and updating test progress data of the automated test plan;
scheduling test items in a manual test plan on the test environment, and updating test progress data of the manual test plan;
wherein automated test items in the set of items to be tested are categorized as the automated test plan, and manual test items in the set of items to be tested are categorized as the manual test plan.
3. The method of claim 2, wherein scheduling test entries in an automated test plan on the test environment, updating test progress data of the automated test plan, comprises:
scheduling test items in the automatic test plan on the test environment according to a scheduling configuration table, wherein an execution strategy and test environment information are configured in the scheduling configuration table;
according to the automatic test link field of the test item in the automatic test plan stored in the test plan sub-database, retrieving and executing the automatic test script of the test item;
and collecting an environment label, a version label, a use case number, a use case title, a test result, execution time, CPU resource information, memory resource information and an execution report of the test environment, and updating the environment label, the version label, the use case number, the use case title, the test result, the execution time, the CPU resource information, the memory resource information and the execution report to a use case analysis sub-database.
4. The method of claim 3, wherein scheduling test entries in an automated test plan on the test environment, updating test progress data of the automated test plan, further comprises:
when the test result of a test item in the automated test plan is a failure and the cause of the failure is not a fault, updating the final result field of the test item in the use case analysis sub-database to indicate that rescheduling is needed;
periodically searching the use case analysis sub-database, according to the environment label and the version label, for test items whose final result field indicates that rescheduling is needed;
and rescheduling, at the scheduled times, the test items whose final result field indicates that rescheduling is needed.
5. The method of claim 2, wherein scheduling test entries in a manual test plan on the test environment, updating test progress data of the manual test plan, comprises:
acquiring test items in a manual test plan according to an environment label and a version label, so that a tester can perform manual test on the test environment according to the test steps of the test items;
and updating the test result, the execution time, the failure reason, the CPU resource information and the memory resource information into the use case analysis sub-database according to the manual test result.
6. The method of any of claims 1 to 5, wherein creating a pool of test entries comprises:
adding test items into the test item pool according to a predefined test item template;
acquiring an automatic test script of the test item;
updating the automation identification of the test item, and adding the path of the automation test script to the test item.
7. The method of claim 6, wherein the test entries in the pool of test entries are stored in a test entry sub-database, the test entries stored in the test entry sub-database comprising: case number field, case header field, test step field, expected result field, test scenario field, automation identification field, automation test link field.
8. The method according to any one of claims 1 to 5, wherein obtaining a set of items to be tested from the test item pool according to a test scenario in a test environment, setting a test plan, comprises:
obtaining test items corresponding to the test scene from the test item pool to form the item set to be tested;
classifying the automatic test items in the item set to be tested as an automatic test plan, and classifying the manual test items in the item set to be tested as a manual test plan;
Environmental and version tags are added to test items in the automated test plan and the manual test plan.
9. The method of claim 8, wherein the automated test plan and the manual test plan are stored in a test plan sub-database, the test plan sub-database storing test entries comprising: the environment label field, the version label field, the case number field, the case title field, the test step field, the test result field, the automation identification field, the automation test link field, the latest result field, the final result field, the failure reason field, the Central Processing Unit (CPU) resource field and the memory resource field.
10. The method of any one of claims 1 to 5, wherein pushing the tracking result of the test progress according to the test progress data comprises:
summarizing test progress data of each test environment and each test version of the test plan from a use case analysis sub-database according to the environment label and the version label;
pushing tracking results of the test progress of the test plans of each test environment and each test version according to the summarized results.
11. The method of claim 10, wherein pushing the trace results of the test progress of the test plans of the respective test environments, the respective test versions, according to the aggregated results comprises:
and carrying out early warning on real-time test progress of the test plans of each test environment and each test version according to the summarized results.
12. The method of claim 10, wherein pushing the trace results of the test progress of the test plans of the respective test environments, the respective test versions, according to the aggregated results comprises:
comparing failure reasons of failed test items in the same test environment, different versions, and/or resource information of successful test items;
pushing the comparison result.
13. An electronic device, comprising:
one or more processors;
a memory having one or more programs stored thereon, which when executed by the one or more processors, cause the one or more processors to implement the method of tracking test progress according to any of claims 1 to 12.
14. A computer readable medium having stored thereon a computer program which when executed by a processor implements a method of tracking test progress according to any of claims 1 to 12.
CN202210900624.3A 2022-07-28 2022-07-28 Tracking method of test progress, electronic device and computer readable medium Pending CN117520137A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210900624.3A CN117520137A (en) 2022-07-28 2022-07-28 Tracking method of test progress, electronic device and computer readable medium
PCT/CN2023/098537 WO2024021877A1 (en) 2022-07-28 2023-06-06 Testing progress tracking method, electronic device, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210900624.3A CN117520137A (en) 2022-07-28 2022-07-28 Tracking method of test progress, electronic device and computer readable medium

Publications (1)

Publication Number Publication Date
CN117520137A

Family

ID=89705318

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210900624.3A Pending CN117520137A (en) 2022-07-28 2022-07-28 Tracking method of test progress, electronic device and computer readable medium

Country Status (2)

Country Link
CN (1) CN117520137A (en)
WO (1) WO2024021877A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103077109B (en) * 2009-12-28 2016-05-25 中兴通讯股份有限公司 A kind of test plan dispatching method and system
CN107273286B (en) * 2017-06-02 2020-10-27 携程计算机技术(上海)有限公司 Scene automatic test platform and method for task application
CN108984418B (en) * 2018-08-22 2023-04-11 中国平安人寿保险股份有限公司 Software test management method and device, electronic equipment and storage medium
CN114328226A (en) * 2021-12-28 2022-04-12 苏州浪潮智能科技有限公司 Test plan data generation method and related device

Also Published As

Publication number Publication date
WO2024021877A1 (en) 2024-02-01


Legal Events

Date Code Title Description
PB01 Publication