CN117608903A - Method, device, equipment and storage medium for automatically generating test report - Google Patents

Method, device, equipment and storage medium for automatically generating test report

Info

Publication number
CN117608903A
Authority
CN
China
Prior art keywords: test, task, bug, detail, state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311649514.5A
Other languages
Chinese (zh)
Inventor
王剑敏
王潇娴
张岩
吕承业
曹安奇
何轶群
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China United Network Communications Group Co Ltd
Original Assignee
China United Network Communications Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China United Network Communications Group Co Ltd filed Critical China United Network Communications Group Co Ltd
Priority to CN202311649514.5A priority Critical patent/CN117608903A/en
Publication of CN117608903A publication Critical patent/CN117608903A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/07Responding to the occurrence of a fault, e.g. fault tolerance
    • G06F11/0703Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation
    • G06F11/0766Error or fault reporting or storing
    • G06F11/0775Content or structure details of the error report, e.g. specific table structure, specific error fields
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3692Test management for test results analysis

Abstract

The application provides a method, a device, equipment and a storage medium for automatically generating a test report, applied to the technical field of the Internet. Test task basic data and BUG detail basic data are obtained from a research and development platform according to a preset period, the data is screened and integrated, analysis is carried out according to multiple analysis dimensions, and a test report corresponding to each analysis dimension is generated, the test reports comprising a test task detail sheet, a BUG detail sheet, a personal task statistics sheet and a test progress report sheet. Because the test reports are generated automatically, the efficiency of producing test reports is improved.

Description

Method, device, equipment and storage medium for automatically generating test report
Technical Field
The application belongs to the technical field of internet, and particularly relates to a method, a device, equipment and a storage medium for automatically generating a test report.
Background
A test report describes the progress of testing work during software development, covering activities such as test plan formulation, test case design and execution, and defect tracking and repair. Test reports are particularly important in the Internet field, because testing is one of the key means of ensuring software quality. By monitoring and tracking test progress, potential defects and problems can be found and repaired as early as possible, which improves software quality. Monitoring test progress also helps keep track of the testing work, ensures that test activities proceed as planned, allows delays or bottlenecks in test tasks to be detected in time, and enables corresponding measures to be taken to adjust resources and optimize the schedule.
Producing a test report requires quickly collecting and organizing information such as the test tasks, test cases and BUG details of each tester across multiple projects, so that potential defects are discovered in time and the test progress is known. In the existing approach, the test tasks, test cases, BUG details and other information of different testers are counted manually, and the corresponding test report is then produced from the data of each tester.
However, the reporting period of existing test reports is fixed, and each tester participates in multiple projects. To view the project tasks, test cases, test BUG details and other information that a tester is responsible for, one has to switch into each corresponding project, so the working status of testers can only be counted, and the test report produced, manually. The existing way of counting tester information therefore suffers from low statistical efficiency.
Disclosure of Invention
The application provides a method, a device, equipment and a storage medium for automatically generating a test report, which are used for improving the efficiency and accuracy of test report generation and overcoming the shortcomings of existing test report generation.
In a first aspect, the present application provides a method of automatically generating a test report, the method comprising:
Acquiring test task basic data and BUG detail basic data from a research and development platform according to a preset period, wherein the test task basic data and the BUG detail basic data are data obtained when a plurality of testers execute test tasks;
screening and integrating the basic data of the test tasks to obtain test task detail sheet pages, wherein the test task detail sheet pages are used for indicating task IDs, task names, task states, testers, iteration names, project names, work attributions, work groups, test ranges and task descriptions of each test task;
analyzing and processing the test task detail sheet page and BUG detail basic data according to various analysis dimensions to obtain an analysis result corresponding to each analysis dimension;
and generating a test report corresponding to each analysis dimension according to the analysis results corresponding to the multiple analysis dimensions.
Optionally, the analyzing the test task detail sheet page according to multiple analysis dimensions to obtain an analysis result corresponding to each analysis dimension includes:
analyzing and processing the test task detail sheet page according to the dimension of the testers to obtain test task information corresponding to each tester, wherein the test task information comprises test task data of a plurality of test tasks corresponding to the testers;
Generating a personal task statistics sheet page according to the test task information corresponding to the plurality of testers, wherein the personal task statistics sheet page is used for indicating the statistics data of the test tasks corresponding to each tester;
analyzing and processing the test task detail sheet page according to the task state dimension to obtain test task data corresponding to each task state;
and generating a test progress report sheet page according to the test task data corresponding to the task states, wherein the test progress report sheet page is used for indicating the statistical data of each task state.
Optionally, the task state includes a completed state, an in-test state and an untested state, and the generating a test progress report sheet page according to the test task data corresponding to the plurality of task states includes:
determining a first number of test tasks in a completed state, a second number of test tasks in an in-test state and a third number of test tasks in an untested state according to the test task data corresponding to each task state;
determining test task progress information according to the first number, the second number and the third number, wherein the test task progress information is used for indicating the proportion of test tasks in the completed state, the proportion of test tasks in the in-test state and the proportion of test tasks in the untested state;
Determining task state details corresponding to each task state according to test task data corresponding to a plurality of task states;
and generating the test progress report sheet page according to the task state details corresponding to each task state and the test task progress information.
Optionally, the BUG detail basic data includes a defect circulation state, a defect creation time, a defect completion time and a regression test time, and the analyzing the BUG detail basic data according to multiple analysis dimensions to obtain an analysis result corresponding to each analysis dimension includes:
determining BUG repair time of each BUG according to regression testing time and defect creation time corresponding to the BUGs;
determining regression testing duration of each BUG according to the defect completion time and the defect creation time corresponding to the BUGs;
and generating a BUG detail sheet page according to the defect circulation state, the BUG repair time length and the regression test time length corresponding to each BUG, wherein the BUG detail sheet page is used for indicating the statistical data of each BUG.
Optionally, after generating the test report corresponding to each analysis dimension according to the analysis results corresponding to the multiple analysis dimensions, the method further includes:
Setting a timing sending task, wherein the timing sending task includes: a sending time, identification information corresponding to the test report, and an application program identification;
and when the timing sending task is triggered, determining a target application program according to the application program identification, and sending the test report to the target application program.
Optionally, the obtaining test task basic data and BUG detail basic data from the research and development platform according to a preset period includes:
invoking a test task interface according to a preset period, and acquiring the test task basic data from the research and development platform;
and calling a BUG interface according to a preset period, and acquiring the BUG detail basic data from the research and development platform.
In a second aspect, the present application provides an apparatus for automatically generating a test report, the apparatus comprising:
the acquisition module is used for acquiring basic data of a test task and basic data of BUG detail from the research and development platform according to a preset period, wherein the basic data of the test task and the basic data of the BUG detail are data obtained when a plurality of testers execute the test task;
the processing module is used for screening and integrating the basic data of the test tasks to obtain test task detail sheet pages, wherein the test task detail sheet pages are used for indicating task IDs, task names, task states, testers, iteration names, project names, work attributions, work groups, test ranges and task descriptions of each test task;
The analysis module is used for analyzing and processing the test task detail sheet page and the BUG detail basic data according to various analysis dimensions to obtain an analysis result corresponding to each analysis dimension;
and the generating module is used for generating a test report corresponding to each analysis dimension according to the analysis results corresponding to the multiple analysis dimensions.
Optionally, the analysis module is configured to analyze the test task detail sheet page according to a dimension of a tester to obtain test task information corresponding to each tester, where the test task information includes test task data corresponding to a plurality of test tasks of the tester;
the generation module is used for generating a personal task statistics sheet page according to the test task information corresponding to the plurality of testers, wherein the personal task statistics sheet page is used for indicating the statistical data of the test tasks corresponding to each tester;
the analysis module is further used for analyzing and processing the test task detail sheet page according to task state dimensions to obtain test task data corresponding to each task state;
the generating module is further configured to generate a test progress report sheet page according to test task data corresponding to the plurality of task states, where the test progress report sheet page is used to indicate statistical data of each task state.
Optionally, the apparatus further comprises a determining module;
the determining module is used for determining a first number of test tasks in a completed state, a second number of test tasks in an in-test state and a third number of test tasks in an untested state according to the test task data corresponding to each task state;
the determining module is further configured to determine, according to the first number, the second number and the third number, test task progress information, where the test task progress information is used to indicate the proportion of test tasks in the completed state, the proportion of test tasks in the in-test state and the proportion of test tasks in the untested state;
the determining module is further used for determining task state details corresponding to each task state according to the test task data corresponding to the task states;
the generation module is used for generating the test progress report sheet page according to the task state details corresponding to each task state and the test task progress information.
Optionally, the determining module is configured to determine a BUG repair duration of each BUG according to regression testing moments and BUG creation moments corresponding to the multiple BUGs;
The determining module is further configured to determine a regression testing duration of each of the BUGs according to the defect completion time and the defect creation time corresponding to the multiple BUGs;
the generating module is configured to generate a BUG detail sheet page according to a defect circulation state, the BUG repair duration and the regression testing duration corresponding to each BUG, where the BUG detail sheet page is used to indicate statistical data of each BUG.
Optionally, the device further comprises a setting module and a sending module;
the setting module is configured to set a timing sending task, where the timing sending task includes: a sending time, identification information corresponding to the test report, and an application program identification;
and the sending module is used for determining a target application program according to the application program identifier when the timed sending task is triggered and sending the test report to the target application program.
Optionally, the acquiring module is configured to invoke a test task interface according to a preset period, and acquire the test task basic data from the development platform;
the acquisition module is also used for calling a BUG interface according to a preset period and acquiring the BUG detail basic data from the research and development platform.
In a third aspect, the present application provides an apparatus for automatically generating a test report, comprising:
a memory;
a processor;
wherein the memory stores computer-executable instructions;
the processor executes the computer-executable instructions stored in the memory to implement the method of automatically generating test reports as described in the first aspect and the various possible implementations of the first aspect.
In a fourth aspect, the present application provides a computer storage medium having stored thereon a computer program for execution by a processor to implement a method of automatically generating test reports as described in the first aspect and the various possible implementations of the first aspect.
The application provides a method, a device, equipment and a storage medium for automatically generating a test report. According to the method, test task basic data and BUG detail basic data are obtained from a research and development platform according to a preset period, the data is screened and integrated, analysis is carried out according to multiple analysis dimensions, and a test report corresponding to each analysis dimension is generated, the test reports comprising a test task detail sheet, a BUG detail sheet, a personal task statistics sheet and a test progress report sheet. Because the test reports are generated automatically, the efficiency of producing test reports is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a flow chart of a method for automatically generating test reports provided in the present application;
FIG. 2 is a second flow chart of the method for automatically generating test reports provided in the present application;
FIG. 3 is a schematic structural diagram of an apparatus for automatically generating test reports provided herein;
fig. 4 is a schematic structural diagram of an apparatus for automatically generating a test report provided in the present application.
Specific embodiments thereof have been shown by way of example in the drawings and will herein be described in more detail. These drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but to illustrate the concepts of the present application to those skilled in the art by reference to specific embodiments.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the present application more apparent, the technical solutions in the present application will be clearly and completely described below with reference to the drawings in the present application, and it is apparent that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
First, terms related to the present application will be explained:
BUG: a term commonly used in software development that refers to a defect, error or problem in a program or system, i.e. an error, flaw or fault in the software that causes the program to fail to run as intended or to produce incorrect results.
Test report: a description of the progress of testing work during software development, covering activities such as test plan formulation, test case design and execution, and defect tracking and repair. Test reports are particularly important in the Internet field, because testing is one of the key means of ensuring software quality. By monitoring and tracking test progress, potential defects and problems can be found and repaired as early as possible, which improves software quality. Monitoring test progress also helps keep track of the testing work, ensures that test activities proceed as planned, allows delays or bottlenecks in test tasks to be detected in time, and enables corresponding measures to be taken to adjust resources and optimize the schedule.
In the prior art, in order to generate a test report, a common approach is to manually switch into each project to check the test tasks, test cases, test BUGs and so on that each tester is responsible for, manually count the tester's task data, manually calculate the number of tasks handled by the tester and the BUG data requiring regression testing, and, after the checking is completed, manually summarize the data to produce the test report.
However, the existing method for generating test reports has the following drawbacks:
1) Low efficiency: the test data of each tester has to be checked manually, calculated manually and assembled into a test report manually, so generating the test report is inefficient.
2) Poor extensibility: automatic synchronization to other application programs is not supported, so during collaborative work each tester cannot learn the working progress of the other testers in time.
To address these problems, the application provides a method for automatically generating a test report, which automatically acquires the test task basic data and the BUG detail basic data, automatically screens, integrates and analyzes them, and obtains test reports for multiple analysis dimensions, thereby improving the efficiency of producing test reports and avoiding the long cycle of counting and producing test reports manually.
The following describes the technical solutions of the present application and how the technical solutions of the present application solve the above technical problems in detail with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a method for automatically generating a test report according to an embodiment of the application. As shown in fig. 1, the method for automatically generating a test report provided in this embodiment includes:
s101, acquiring basic data of a test task and basic data of BUG detail from a research and development platform according to a preset period.
The preset period is a period length, i.e. a time interval, set in advance according to actual requirements; it may be hourly, daily, weekly, and so on.
Acquiring test task basic data and BUG detail basic data from a research and development platform according to a preset period refers to calling a test task interface according to the preset period, and acquiring the test task basic data from the research and development platform; and calling a BUG interface according to a preset period, and acquiring the BUG detail basic data from the research and development platform.
Namely: invoking a test task interface according to a preset period, and acquiring test task basic data such as task ID, task name, task state, testers, iteration name, project name, work attribution, work team, test scope, task description and the like from the research and development platform; and calling a BUG interface according to a preset period, and acquiring BUG detail basic data such as a defect ID, a defect name, a defect description, a defect circulation state, a defect creation time, a defect completion time, a regression testing time grade, a tester, an iteration, a project name and the like from the research and development platform.
As an example, in one possible implementation, a test task API is called at nine o'clock every morning to acquire the test task basic data from the research and development platform, and a BUG API is called according to the preset period to acquire the BUG detail basic data from the research and development platform.
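As an illustration only, the following is a minimal Python sketch of such periodic acquisition. The platform base URL, endpoint paths and query parameters are assumptions made for the example and are not specified in this application; in practice the test task interface and BUG interface actually exposed by the research and development platform would be called.

```python
# Minimal sketch of step S101 (endpoints and parameters are assumptions, not the platform's real API).
import requests

PLATFORM = "https://rd-platform.example.com/api"  # assumed base URL of the research and development platform

def fetch_basic_data(since: str) -> tuple[list[dict], list[dict]]:
    """Call the test task interface and the BUG interface once per preset period."""
    tasks = requests.get(f"{PLATFORM}/test-tasks", params={"since": since}, timeout=30).json()
    bugs = requests.get(f"{PLATFORM}/bugs", params={"since": since}, timeout=30).json()
    return tasks, bugs

# A scheduler (cron, a timer thread, etc.) would invoke fetch_basic_data once per preset period,
# for example at nine o'clock every morning.
```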
And S102, screening and integrating the basic data of the test tasks to obtain a test task detail sheet page.
The test task basic data comprises: task ID, task name, task status, tester, iteration name, project name, job attribution, job team, test scope, task description, etc.
The test task basic data is screened as required to filter out the needed test task basic data, the screened test task basic data is integrated into a table, and the integrated data is then processed, i.e. arranged according to a certain format and rules, to obtain a clearer and more easily analyzed test task detail sheet page.
As an example, in one possible implementation, a table is created and the test task basic data acquired from the research and development platform is imported into it; the test task basic data is screened and unnecessary columns and rows are deleted, for example records created before a given date, closed tasks and the like; the screened test task basic data, such as the task ID, task name, task state and tester, is then integrated, and the integrated data is arranged according to a certain format and rules to obtain the test task detail sheet page.
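A possible sketch of this screening and integration step is shown below, using pandas. The column names (task_id, task_name, task_status, tester, create_date and so on) and the concrete filtering rules are assumptions for illustration; the actual fields come from the test task basic data.

```python
# Sketch of step S102: screen the raw task records and keep only the detail-sheet columns.
# Column names and filter rules are assumptions for illustration.
import pandas as pd

DETAIL_COLUMNS = ["task_id", "task_name", "task_status", "tester", "iteration",
                  "project", "owner", "team", "scope", "description"]

def build_task_detail_sheet(raw_tasks: list[dict], earliest_create_date: str) -> pd.DataFrame:
    df = pd.DataFrame(raw_tasks)
    df = df[df["create_date"] >= earliest_create_date]   # drop records created before the cutoff date
    df = df[df["task_status"] != "closed"]                # drop closed tasks
    # keep only the columns of the test task detail sheet, in a fixed order
    return df[DETAIL_COLUMNS].sort_values(["project", "task_id"]).reset_index(drop=True)
```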
And S103, analyzing and processing the test task detail sheet page and BUG detail basic data according to various analysis dimensions to obtain an analysis result corresponding to each analysis dimension.
The dimensions to be analyzed are determined according to the analysis requirements, for example the tester, task type, task state and task level dimensions.
The relevant data of the required dimensions is extracted from the test task detail sheet page and the BUG detail basic data, and the extracted data is analyzed, counted and calculated per dimension, for example: counting the number and progress of tasks of each tester, counting the number and completion rate of tasks of different types, and analyzing the number and repair duration of BUGs of different levels, so as to obtain an analysis result corresponding to each analysis dimension.
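As a sketch of this per-dimension analysis, the counts described above could be computed as follows. The example assumes the column names used in the previous sketch plus hypothetical level and bug_id columns in the BUG data; none of these names are prescribed by the application.

```python
# Sketch of step S103: statistics per analysis dimension (column names are assumptions).
import pandas as pd

def analyse_dimensions(detail: pd.DataFrame, bugs: pd.DataFrame) -> dict[str, pd.Series]:
    return {
        "tasks_per_tester": detail.groupby("tester")["task_id"].count(),
        "tasks_per_status": detail.groupby("task_status")["task_id"].count(),
        "bugs_per_level":   bugs.groupby("level")["bug_id"].count(),
    }
```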
S104, generating a test report corresponding to each analysis dimension according to analysis results corresponding to the multiple analysis dimensions.
The analysis result of each analysis dimension is presented in a corresponding table, and a test report corresponding to each analysis dimension is generated, which clearly presents the status of the test tasks and BUGs and provides a basis for subsequent decisions.
This embodiment provides a method for automatically generating a test report: test task basic data and BUG detail basic data are obtained from a research and development platform according to a preset period, the data is screened, integrated and analyzed to obtain analysis results for different dimensions, and a test report corresponding to each analysis dimension is generated from those results. The method automatically acquires and analyzes the basic data, reduces manual involvement and improves the efficiency of producing test reports.
Fig. 2 is a second flowchart of a method for automatically generating a test report according to an embodiment of the present application. This embodiment is a detailed description of a method of automatically generating a test report based on the embodiment of fig. 1. As shown in fig. 2, the method for automatically generating a test report provided in this embodiment includes:
s201, acquiring test task basic data and BUG detail basic data from a research and development platform according to a preset period, wherein the BUG detail basic data comprises: defect circulation state, defect creation time, defect completion time, and regression test time.
Step S201 is the same as step S101 described above, and will not be described again here.
And S202, screening and integrating the basic data of the test tasks to obtain a test task detail sheet page.
Step S202 is the same as step S102 described above, and will not be described again here.
And S203, analyzing and processing the test task detail sheet page according to the dimension of the testers to obtain the test task information corresponding to each tester.
S204, generating a personal task statistics sheet page according to the test task information corresponding to the plurality of testers.
In the test task detail sheet page, the data related to each tester is filtered out using a filter function or a formula, and the test task information of each tester is sorted from the filtered data. The test task information may include: the task ID, task description, tester, task priority, task state, and so on. For each tester, the number of test tasks the tester is responsible for is counted, and the state distribution of those tasks is analyzed, for example the number and proportion of tasks in each state, which can be displayed with a chart; the working progress of each tester is then evaluated from the number of tasks and their state distribution. The test task information corresponding to each tester is obtained from this analysis.
Table 1 is a schematic representation of the test task detail sheet page provided in this embodiment.
TABLE 1
| Task ID | Task description | Tester | Task priority | Task status |
|---------|------------------|--------|---------------|-------------|
| 001     | First test task  | A      | High          | Not started |
| 002     | Second test task | B      | High          | In progress |
| 003     | Third test task  | A      | Low           | In progress |
| 004     | Fourth test task | C      | Medium        | Completed   |
| 005     | Fifth test task  | A      | Medium        | In progress |
As shown in Table 1, the task IDs include 001, 002, 003, 004 and 005. The test task corresponding to each task ID is as follows:
the test task with task ID 001 is the first test task, the tester is A, the task priority is high, and the current task state is not started;
the test task with task ID 002 is the second test task, the tester is B, the task priority is high, and the current task state is in progress;
the test task with task ID 003 is the third test task, the tester is A, the task priority is low, and the current task state is in progress;
the test task with task ID 004 is the fourth test task, the tester is C, the task priority is medium, and the current task state is completed;
the test task with task ID 005 is the fifth test task, the tester is A, the task priority is medium, and the current task state is in progress.
Table 2 is a schematic illustration of a personal task statistics sheet page according to the tester dimension analysis provided in this embodiment.
TABLE 2
As shown in Table 2, tester A is responsible for three test tasks: 001, which is not started, and 003 and 005, which are in progress; tester B is responsible for one in-progress test task, 002; and tester C is responsible for one completed test task, 004.
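Using the data of Table 1, the tester-dimension statistics behind Table 2 can be reproduced with a simple cross-tabulation. The sketch below only illustrates steps S203-S204 with the example data above; it does not define the layout of the actual personal task statistics sheet page.

```python
# Sketch of steps S203-S204 using the example data of Table 1.
import pandas as pd

detail = pd.DataFrame({
    "task_id":     ["001", "002", "003", "004", "005"],
    "tester":      ["A", "B", "A", "C", "A"],
    "task_status": ["Not started", "In progress", "In progress", "Completed", "In progress"],
})

personal_stats = pd.crosstab(detail["tester"], detail["task_status"])
print(personal_stats)
# task_status  Completed  In progress  Not started
# tester
# A                    0            2            1
# B                    0            1            0
# C                    1            0            0
```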
S205, analyzing and processing the test task detail sheet page according to the task state dimension to obtain test task data corresponding to each task state.
In the test task detail sheet page, a filter function is used to select the task states to be analyzed, for example not started, in progress and completed. From the filtered data, the test task information corresponding to each task state is sorted, which may include fields such as the task number, task description, task priority and tester, and the test task data of each task state is counted, so as to obtain the test task data corresponding to each task state.
Continuing with the test task detail sheet page shown in Table 1 above, after the test task detail sheet page is analyzed, the test task data corresponding to the not-started state includes the data of test task 001; the test task data corresponding to the completed state includes the data of test task 004; and the test task data corresponding to the in-progress state includes the data of test tasks 002, 003 and 005.
S206, determining a first number of test tasks in a completed state, a second number of test tasks in an in-test state and a third number of test tasks in an untested state according to the test task data corresponding to each task state.
According to the test task data corresponding to each task state, which test tasks are completed, which are being tested and which have not been tested is determined by screening, and the numbers of test tasks in the completed, in-test and not-started states are then counted respectively, namely: a first number of test tasks in the completed state, a second number of test tasks in the in-test state, and a third number of test tasks in the untested state.
Continuing with the test task detail sheet page shown in Table 1 above, the first number is 1, the second number is 3, and the third number is 1.
S207, determining test task progress information according to the first quantity, the second quantity and the third quantity.
The test task progress information is used to indicate the proportion of test tasks in the completed state, the proportion of test tasks in the in-test state and the proportion of test tasks in the untested state. The overall test task progress information is determined from the first number of completed test tasks, the second number of in-test test tasks and the third number of untested test tasks.
Combining this with the content of the test task detail sheet page shown in Table 1, the proportion of test tasks in the completed state is 20%, the proportion of test tasks in the in-test state is 60%, and the proportion of test tasks in the untested state is 20%.
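The first, second and third numbers and the progress proportions of steps S206-S207 follow directly from the task states; the following minimal sketch recomputes them from the task states of Table 1.

```python
# Sketch of steps S206-S207 using the task states of Table 1.
task_states = ["Not started", "In progress", "In progress", "Completed", "In progress"]

first_number  = task_states.count("Completed")    # completed state -> 1
second_number = task_states.count("In progress")  # in-test state   -> 3
third_number  = task_states.count("Not started")  # untested state  -> 1
total = first_number + second_number + third_number

progress_info = {
    "completed": first_number / total,   # 0.20
    "in_test":   second_number / total,  # 0.60
    "untested":  third_number / total,   # 0.20
}
print(progress_info)
```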
S208, determining task state details corresponding to each task state according to test task data corresponding to the task states.
S209, generating the test progress report sheet page according to task state details and test task progress information corresponding to each task state.
The task state details corresponding to each task state are determined from the test task data corresponding to the plurality of task states, so that corresponding measures can be taken for each task state.
For each task state, the corresponding task state details are compiled. The task state details may include, for example, detailed information such as the task name, task description, tester, number of tasks and proportion, from which the task state details corresponding to each task state are determined.
Table 3 is a schematic table of a test progress report sheet page obtained by task state dimension analysis provided in this embodiment.
TABLE 3
S210, determining the BUG repair duration of each BUG according to the regression testing times and defect creation times corresponding to the plurality of BUGs.
It can be understood that the BUG detail basic data used in step S210 is obtained in step S201, and its processing has no direct dependency on steps S202-S209. There is therefore no fixed ordering between step S210 and step S203: step S203 may be performed before step S210, step S210 may be performed before step S203, or the two may be performed simultaneously, which is not limited in this application.
For each BUG, the defect creation time is subtracted from the regression testing time of the BUG to determine the repair duration of the BUG, and the result is recorded in the BUG repair duration column of the corresponding BUG detail data, so that the repair duration data can later be analyzed to understand the overall repair situation and the work of each tester.
Table 4 shows the BUG detail basic data provided in this example.
TABLE 4
| BUG ID | Defect circulation state | Regression testing time | Defect completion time | Defect creation time | Tester |
|--------|--------------------------|-------------------------|------------------------|----------------------|--------|
| 001    | Yes                      | U                       | L                      | X                    | A      |
| 002    | No                       | V                       | M                      | Y                    | A      |
| 003    | No                       | W                       | N                      | Z                    | B      |
As shown in Table 4, the BUG IDs include 001, 002 and 003. The BUG corresponding to each BUG ID is as follows:
for the BUG with BUG ID 001, the defect circulation state is yes, the regression testing time is U, the defect completion time is L, the defect creation time is X, and the tester is A;
for the BUG with BUG ID 002, the defect circulation state is no, the regression testing time is V, the defect completion time is M, the defect creation time is Y, and the tester is A;
for the BUG with BUG ID 003, the defect circulation state is no, the regression testing time is W, the defect completion time is N, the defect creation time is Z, and the tester is B.
The defect circulation state indicates whether the BUG has been reopened. A defect circulation state of yes means the BUG has been reopened, i.e. it was repaired once but reappeared later and the relevant tester needs to handle it again; a defect circulation state of no means the BUG has not been reopened, i.e. it did not reappear after being repaired.
As can be seen from the BUG detail basic data shown in Table 4 above, the repair duration of BUG001 is U-X, the repair duration of BUG002 is V-Y, and the repair duration of BUG003 is W-Z.
S211, determining the regression testing duration of each BUG according to the defect completion times and defect creation times corresponding to the plurality of BUGs.
For each BUG, the defect creation time is subtracted from the defect completion time to determine the regression testing duration of the BUG, and the result is recorded in the regression testing duration column of the corresponding BUG detail data, so that the regression testing duration data can later be analyzed to understand the overall regression testing situation and the work of each tester.
As can be seen from the BUG detail basic data shown in Table 4 above, the regression testing duration of BUG001 is L-X, the regression testing duration of BUG002 is M-Y, and the regression testing duration of BUG003 is N-Z.
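The duration calculations of steps S210-S211 amount to two timestamp subtractions per BUG. The sketch below assumes ISO-8601 timestamps and the field names regression_test_time, defect_complete_time and defect_create_time; both the field names and the example timestamps are illustrative assumptions, not values from this application.

```python
# Sketch of steps S210-S211 (field names, timestamp format and example values are assumptions).
from datetime import datetime

def _hours_between(later: str, earlier: str) -> float:
    return (datetime.fromisoformat(later) - datetime.fromisoformat(earlier)).total_seconds() / 3600

def add_bug_durations(bug: dict) -> dict:
    # BUG repair duration      = regression testing time - defect creation time  (U - X in the example)
    # regression test duration = defect completion time  - defect creation time  (L - X in the example)
    bug["repair_duration_h"]     = _hours_between(bug["regression_test_time"], bug["defect_create_time"])
    bug["regression_duration_h"] = _hours_between(bug["defect_complete_time"], bug["defect_create_time"])
    return bug

example = add_bug_durations({
    "bug_id": "001",
    "regression_test_time": "2023-12-02T10:00:00",
    "defect_complete_time": "2023-12-01T18:00:00",
    "defect_create_time":   "2023-12-01T09:00:00",
})
print(example["repair_duration_h"], example["regression_duration_h"])  # 25.0 9.0
```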
S212, generating a BUG detail sheet page according to the defect circulation state, the BUG repair duration and the regression testing duration corresponding to each BUG.
Table 5 is a BUG detail sheet page generated in this embodiment according to the defect circulation state, BUG repair duration and regression testing duration corresponding to each BUG.
TABLE 5
| BUG ID | Defect circulation state | Regression testing time | Defect completion time | Defect creation time | Tester | BUG repair duration | Regression testing duration |
|--------|--------------------------|-------------------------|------------------------|----------------------|--------|---------------------|-----------------------------|
| 001    | Reopened                 | U                       | L                      | X                    | A      | U-X                 | L-X                         |
| 002    | Not reopened             | V                       | M                      | Y                    | A      | V-Y                 | M-Y                         |
| 003    | Not reopened             | W                       | N                      | Z                    | B      | W-Z                 | N-Z                         |
As shown in Table 5, for BUG001 the defect circulation state is reopened, the regression testing time is U, the defect completion time is L, the defect creation time is X, the tester is A, the BUG repair duration is U-X and the regression testing duration is L-X; for BUG002 the defect circulation state is not reopened, the regression testing time is V, the defect completion time is M, the defect creation time is Y, the tester is A, the BUG repair duration is V-Y and the regression testing duration is M-Y; for BUG003 the defect circulation state is not reopened, the regression testing time is W, the defect completion time is N, the defect creation time is Z, the tester is B, the BUG repair duration is W-Z and the regression testing duration is N-Z.
S213, setting a timing sending task, wherein the timing sending task includes: a sending time, identification information corresponding to the test report, and an application program identification.
Setting a timing sending task means configuring a way of automatically sending the test report to the relevant personnel at a preset time. The task may be triggered automatically after a test task has been executed, or triggered automatically at a preset time point, which is not limited in this application.
Each test report has identification information used to distinguish which dimension the test report belongs to; when the timing sending task is set, the corresponding test report identification needs to be specified so that the correct test report can be sent.
When the test report is sent, it needs to be sent to a target application program, so the corresponding application program identification also needs to be specified when the timing sending task is set, so that the test report can be correctly delivered to the target application program.
And S214, when the timing sending task is triggered, determining a target application program according to the application program identification, and sending the test report to the target application program.
When the timing sending task is triggered, the target application program is determined according to the application program identification; after the target application program is determined, the latest test report corresponding to it is obtained and sent to the target application program.
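One way to realise such a timing sending task is a simple daily timer, sketched below with only the Python standard library. The send callback, which would resolve the target application program from its identification and push the latest report to it (for example over a webhook), is an assumption of this example and is not part of the application.

```python
# Sketch of steps S213-S214: fire a send callback once a day at a configured time.
# The callback and the identifiers are assumptions; they stand in for resolving the target
# application program from its identification and pushing the latest report to it.
import threading
import time
from datetime import datetime, timedelta

def schedule_daily_send(send_time: str, report_id: str, app_id: str, send_fn) -> None:
    """Run send_fn(report_id, app_id) every day at send_time ('HH:MM')."""
    def _loop():
        while True:
            now = datetime.now()
            hour, minute = int(send_time[:2]), int(send_time[3:5])
            target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
            if target <= now:
                target += timedelta(days=1)        # next occurrence is tomorrow
            time.sleep((target - now).total_seconds())
            send_fn(report_id, app_id)             # look up the target app and push the latest report
    threading.Thread(target=_loop, daemon=True).start()
```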
This embodiment provides a method for automatically generating a test report: test task basic data and BUG detail basic data are obtained from a research and development platform according to a preset period; the test task basic data is screened and integrated to generate a test task detail sheet page; the test task detail sheet page is analyzed along the tester dimension to generate a personal task statistics sheet page and along the task state dimension to generate a test progress report sheet page; the BUG detail basic data is analyzed to generate a BUG detail sheet page; and a timing sending task is set to send the test reports to a target application program. The method automatically generates multi-dimensional test reports and automatically sends them to the target application program, which improves the efficiency of producing test reports and facilitates collaborative work.
Fig. 3 is a schematic structural diagram of an apparatus for automatically generating a test report provided in the present application, and as shown in fig. 3, an apparatus 300 for automatically generating a test report provided in the present embodiment includes:
the acquiring module 301 is configured to acquire, according to a preset period, test task basic data and BUG detail basic data from the development platform, where the test task basic data and BUG detail basic data are data obtained when a plurality of testers execute a test task;
the processing module 302 is configured to perform screening integration processing on the test task basic data to obtain a test task detail sheet, where the test task detail sheet is used to indicate a task ID, a task name, a task state, a tester, an iteration name, a project name, a job attribution, a job team, a test scope, and a job description of each test task;
the analysis module 303 is configured to analyze the test task detail sheet page and the BUG detail basic data according to multiple analysis dimensions, so as to obtain an analysis result corresponding to each analysis dimension;
the generating module 304 is configured to generate a test report corresponding to each analysis dimension according to analysis results corresponding to the multiple analysis dimensions.
Optionally, the analysis module 303 is configured to analyze the test task detail sheet page according to a dimension of a tester to obtain test task information corresponding to each tester, where the test task information includes test task data corresponding to a plurality of test tasks of the tester;
the generating module 304 is configured to generate a personal task statistics sheet according to test task information corresponding to a plurality of testers, where the personal task statistics sheet is used to indicate statistical data of a test task corresponding to each tester;
the analysis module 303 is further configured to analyze the test task detail sheet page according to task state dimensions, to obtain test task data corresponding to each task state;
the generating module 304 is further configured to generate a test progress report sheet according to the test task data corresponding to the plurality of task states, where the test progress report sheet is used to indicate statistical data of each task state.
Optionally, the apparatus further comprises a determining module 305;
the determining module 305 is configured to determine, according to the test task data corresponding to each task state, a first number of test tasks in a completed state, a second number of test tasks in an in-test state and a third number of test tasks in an untested state;
The determining module 305 is further configured to determine, according to the first number, the second number and the third number, test task progress information, where the test task progress information is used to indicate the proportion of test tasks in the completed state, the proportion of test tasks in the in-test state and the proportion of test tasks in the untested state;
the determining module 305 is further configured to determine a task state detail corresponding to each task state according to test task data corresponding to the plurality of task states;
the generating module 304 is configured to generate the test progress report sheet page according to the task state details corresponding to each task state and the test task progress information.
Optionally, the determining module 305 is configured to determine a BUG repair duration of each BUG according to regression testing moments and BUG creation moments corresponding to the multiple BUGs;
the determining module 305 is further configured to determine a regression testing duration of each of the BUGs according to the defect completion time and the defect creation time corresponding to the multiple BUGs;
the generating module 304 is configured to generate a BUG detail sheet page according to the defect circulation state, the BUG repair duration, and the regression testing duration, where the BUG detail sheet page is used to indicate statistical data of each BUG.
Optionally, the device further comprises a setting module 306 and a sending module 307;
the setting module 306 is configured to set a timing sending task, where the timing sending task includes: a sending time, identification information corresponding to the test report, and an application program identification;
the sending module 307 is configured to determine a target application according to the application identifier when the timed sending task is triggered, and send the test report to the target application.
Optionally, the acquiring module 301 is configured to invoke a test task interface according to a preset period, and acquire the test task basic data from the development platform;
the obtaining module 301 is further configured to invoke a BUG interface according to a preset period, and obtain the BUG detail basic data from the development platform.
Fig. 4 is a schematic structural diagram of an apparatus for automatically generating a test report provided in the present application. As shown in fig. 4, the present application provides an apparatus for automatically generating a test report, the apparatus 400 for automatically generating a test report including: a receiver 401, a transmitter 402, a processor 403 and a memory 404.
A receiver 401 for receiving instructions and data;
a transmitter 402 for transmitting instructions and data;
Memory 404 for storing computer-executable instructions;
processor 403 is configured to execute computer-executable instructions stored in memory 404 to perform the steps performed by the method for automatically generating test reports in the above-described embodiments. Reference may be made in particular to the description of the embodiments of the method for automatically generating test reports described above.
Alternatively, the memory 404 may be separate or integrated with the processor 403.
When the memory 404 is provided separately, the electronic device further comprises a bus for connecting the memory 404 and the processor 403.
The application also provides a computer storage medium in which computer-executable instructions are stored, which when executed by a processor, implement a method of automatically generating a test report as performed by the above-described apparatus for automatically generating a test report.
Those of ordinary skill in the art will appreciate that all or some of the steps, systems, functional modules/units in the apparatus, and methods disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between the functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed cooperatively by several physical components. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, as known to those skilled in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. Furthermore, as is well known to those of ordinary skill in the art, communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.

Claims (10)

1. A method of automatically generating a test report, the method comprising:
acquiring test task basic data and BUG detail basic data from a research and development platform according to a preset period, wherein the test task basic data and the BUG detail basic data are data obtained when a plurality of testers execute test tasks;
screening and integrating the basic data of the test tasks to obtain test task detail sheet pages, wherein the test task detail sheet pages are used for indicating task IDs, task names, task states, testers and task descriptions of each test task;
analyzing and processing the test task detail sheet page and BUG detail basic data according to various analysis dimensions to obtain an analysis result corresponding to each analysis dimension;
And generating a test report corresponding to each analysis dimension according to the analysis results corresponding to the multiple analysis dimensions.
2. The method of claim 1, wherein the analyzing the test task detail sheet according to a plurality of analysis dimensions to obtain an analysis result corresponding to each analysis dimension comprises:
analyzing and processing the test task detail sheet page according to the dimension of the testers to obtain test task information corresponding to each tester, wherein the test task information comprises test task data of a plurality of test tasks corresponding to the testers;
generating a personal task statistics sheet page according to the test task information corresponding to the plurality of testers, wherein the personal task statistics sheet page is used for indicating the statistics data of the test tasks corresponding to each tester;
analyzing and processing the test task detail sheet page according to the task state dimension to obtain test task data corresponding to each task state;
and generating a test progress report sheet page according to the test task data corresponding to the task states, wherein the test progress report sheet page is used for indicating the statistical data of each task state.
3. The method of claim 2, wherein the task state includes a completed state, an in-test state and an untested state, and the generating a test progress report sheet page according to the test task data corresponding to the plurality of task states includes:
Determining a first number of test tasks in a completed state, a second number of test tasks in an in-test state and a third number of test tasks in an untested state according to the test task data corresponding to each task state;
determining test task progress information according to the first number, the second number and the third number, wherein the test task progress information is used for indicating the proportion of test tasks in the completed state, the proportion of test tasks in the in-test state and the proportion of test tasks in the untested state;
determining task state details corresponding to each task state according to test task data corresponding to a plurality of task states;
and generating the test progress report sheet page according to the task state details corresponding to each task state and the test task progress information.
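One way the three counts and their proportions in claim 3 could be derived; the state labels "completed", "testing" and "untested" are assumed names for the three states:

```python
from typing import Dict, List


def build_progress_report(by_state: Dict[str, List[dict]]) -> dict:
    """Count tasks per state and derive the proportions for the progress report."""
    completed = len(by_state.get("completed", []))  # first number
    testing = len(by_state.get("testing", []))      # second number
    untested = len(by_state.get("untested", []))    # third number
    total = completed + testing + untested

    def ratio(n: int) -> float:
        return n / total if total else 0.0

    return {
        "progress": {
            "completed_ratio": ratio(completed),
            "testing_ratio": ratio(testing),
            "untested_ratio": ratio(untested),
        },
        # per-state task details kept alongside the aggregate figures
        "state_details": by_state,
    }
```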
4. The method according to claim 1, wherein the BUG detail basic data comprises: a defect circulation state, a defect creation time, a defect completion time and a regression testing time corresponding to each of a plurality of BUGs; and the analyzing and processing the BUG detail basic data according to multiple analysis dimensions to obtain an analysis result corresponding to each analysis dimension comprises:
determining a BUG repair duration of each BUG according to the regression testing time and the defect creation time corresponding to the BUG;
determining a regression testing duration of each BUG according to the defect completion time and the defect creation time corresponding to the BUG;
and generating a BUG detail sheet page according to the defect circulation state, the BUG repair duration and the regression testing duration corresponding to each BUG, wherein the BUG detail sheet page is used for indicating statistical data of each BUG.
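A sketch of the duration calculations in claim 4, assuming the timestamps are strings in a fixed format and the field names shown here; the subtraction pairings follow the claim wording:

```python
from datetime import datetime
from typing import List

TIME_FMT = "%Y-%m-%d %H:%M:%S"  # assumed timestamp format in the platform's data


def build_bug_detail_sheet(bugs: List[dict]) -> List[dict]:
    """Derive per-BUG durations from the raw timestamps and keep the flow state."""
    sheet = []
    for bug in bugs:
        created = datetime.strptime(bug["defect_creation_time"], TIME_FMT)
        regression = datetime.strptime(bug["regression_testing_time"], TIME_FMT)
        completed = datetime.strptime(bug["defect_completion_time"], TIME_FMT)
        sheet.append({
            "bug_id": bug["bug_id"],
            "circulation_state": bug["defect_circulation_state"],
            # pairings as recited in claim 4: repair duration from the regression
            # testing time and the creation time, regression duration from the
            # completion time and the creation time
            "repair_duration": regression - created,
            "regression_duration": completed - created,
        })
    return sheet
```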
5. The method of claim 1, wherein after the generating a test report corresponding to each analysis dimension according to the analysis results corresponding to the multiple analysis dimensions, the method further comprises:
setting a timing sending task, wherein the timing sending task comprises: a sending time, identification information corresponding to the test report and an application program identifier;
and when the timing sending task is triggered, determining a target application program according to the application program identifier, and sending the test report to the target application program.
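A minimal sketch of the timed sending task of claim 5 using Python's standard sched module; the report identifier and application identifier values are hypothetical:

```python
import sched
import time


def send_report(app_id: str, report_id: str) -> None:
    """Placeholder delivery step; a real system would call the target
    application's own messaging or upload interface here."""
    print(f"sending report {report_id} to application {app_id}")


def schedule_report_delivery(send_at: float, report_id: str, app_id: str) -> None:
    """Register a timed sending task from a sending time, a report identifier
    and a target application identifier, then wait for it to fire."""
    scheduler = sched.scheduler(time.time, time.sleep)
    scheduler.enterabs(send_at, 1, send_report, argument=(app_id, report_id))
    scheduler.run()  # blocks until the scheduled moment


# Example: deliver today's report one hour from now (all values hypothetical).
schedule_report_delivery(time.time() + 3600, "test_report_2023-12-04.xlsx", "im-bot-01")
```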
6. The method according to claim 1, wherein the acquiring test task basic data and BUG detail basic data from the research and development platform according to the preset period comprises:
calling a test task interface according to the preset period, and acquiring the test task basic data from the research and development platform;
and calling a BUG interface according to the preset period, and acquiring the BUG detail basic data from the research and development platform.
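A sketch of the periodic interface calls of claim 6; the endpoint paths and the one-hour period are assumptions:

```python
import threading

import requests  # assumed HTTP client

PLATFORM_URL = "https://rd-platform.example.com"  # hypothetical platform address
PERIOD_SECONDS = 3600                             # hypothetical preset period


def poll_platform() -> None:
    """Call the test task interface and the BUG interface once, then re-arm."""
    task_data = requests.get(f"{PLATFORM_URL}/api/test-tasks", timeout=30).json()
    bug_data = requests.get(f"{PLATFORM_URL}/api/bugs", timeout=30).json()
    print(f"fetched {len(task_data)} tasks and {len(bug_data)} bugs")
    threading.Timer(PERIOD_SECONDS, poll_platform).start()  # repeat each period


poll_platform()
```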
7. An apparatus for automatically generating a test report, the apparatus comprising:
the acquisition module is used for acquiring test task basic data and BUG detail basic data from a research and development platform according to a preset period, wherein the test task basic data and the BUG detail basic data are data obtained when a plurality of testers execute test tasks;
the processing module is used for screening and integrating the test task basic data to obtain a test task detail sheet page, wherein the test task detail sheet page is used for indicating a task ID, a task name, a task state, a tester, an iteration name, a project name, a work attribution, a work group, a test range and a task description of each test task;
the analysis module is used for analyzing and processing the test task detail sheet page and the BUG detail basic data according to multiple analysis dimensions to obtain an analysis result corresponding to each analysis dimension;
and the generating module is used for generating a test report corresponding to each analysis dimension according to the analysis results corresponding to the multiple analysis dimensions.
8. The apparatus of claim 7, wherein:
the analysis module is used for analyzing and processing the test task detail sheet page according to a tester dimension to obtain test task information corresponding to each tester, wherein the test task information comprises test task data of a plurality of test tasks corresponding to the tester;
the generating module is used for generating a personal task statistics sheet page according to the test task information corresponding to the plurality of testers, wherein the personal task statistics sheet page is used for indicating statistical data of the test tasks corresponding to each tester;
the analysis module is further used for analyzing and processing the test task detail sheet page according to a task state dimension to obtain test task data corresponding to each task state;
and the generating module is further used for generating a test progress report sheet page according to the test task data corresponding to the plurality of task states, wherein the test progress report sheet page is used for indicating statistical data of each task state.
9. An apparatus for automatically generating a test report, comprising:
a memory;
a processor;
wherein the memory stores computer-executable instructions; and
the processor executes the computer-executable instructions stored in the memory to implement the method of automatically generating a test report according to any one of claims 1-6.
10. A computer storage medium having computer-executable instructions stored therein, which, when executed by a processor, are adapted to carry out the method of automatically generating a test report according to any one of claims 1-6.
Application CN202311649514.5A, priority date 2023-12-04, filing date 2023-12-04: Method, device, equipment and storage medium for automatically generating test report (status: Pending; publication: CN117608903A (en))

Priority Applications (1)

Application Number: CN202311649514.5A
Priority Date: 2023-12-04
Filing Date: 2023-12-04
Title: Method, device, equipment and storage medium for automatically generating test report

Applications Claiming Priority (1)

Application Number: CN202311649514.5A
Priority Date: 2023-12-04
Filing Date: 2023-12-04
Title: Method, device, equipment and storage medium for automatically generating test report

Publications (1)

Publication Number: CN117608903A (en)
Publication Date: 2024-02-27

Family

ID=89957755

Family Applications (1)

Application Number: CN202311649514.5A (publication CN117608903A, status Pending)
Priority Date: 2023-12-04
Filing Date: 2023-12-04
Title: Method, device, equipment and storage medium for automatically generating test report

Country Status (1)

Country: CN
Publication: CN117608903A (en)

Similar Documents

Publication Title
CN110275878B (en) Service data detection method and device, computer equipment and storage medium
CN107193730A (en) A kind of interface test method of automation
CN109189407A (en) Statistical method, system, device and the storage medium of a kind of pair of multi-chip burning
CN115422065A (en) Fault positioning method and device based on code coverage rate
CN116954624B (en) Compiling method based on software development kit, software development system and server
CN113535538B (en) Method, device, electronic equipment and storage medium for automatically testing application full link
CN106294109B (en) Method and device for acquiring defect code
CN111930611B (en) Statistical method and device for test data
CN111752833B (en) Software quality system approval method, device, server and storage medium
CN110377471B (en) Interface verification data generation method and device, storage medium and electronic equipment
CN117608903A (en) Method, device, equipment and storage medium for automatically generating test report
CN111078526A (en) Test case generation method and device and storage medium
CN113849404A (en) Management method, device and storage medium for interface test related information
CN102521134B (en) Test information detecting method and test information detecting device based on mainframe
CN107102938B (en) Test script updating method and device
CN113485906B (en) Method for testing statistical data in financial cloud platform
CN111813662A (en) User behavior driven sustainable integration test method, device and equipment
CN110990247A (en) Test system for validity of unattended issuing system
CN111240963B (en) Information display method and device for software defects, electronic equipment and storage medium
CN111258894B (en) Method and device for evaluating software risk, storage medium and electronic equipment
CN112486823B (en) Error code verification method and device, electronic equipment and readable storage medium
CN111143221B (en) Test method and device
CN105975924A (en) Regression testing method for precisely recognizing ad content based on video frame statistics
CN114281705A (en) Software defect positioning method and device, electronic equipment and medium
CN112148624A (en) Method and system for processing automatic test report of digital television radio frequency performance

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination