CN112749093A - Test case management method, device, equipment and storage medium - Google Patents
- Publication number
- CN112749093A CN112749093A CN202110045307.3A CN202110045307A CN112749093A CN 112749093 A CN112749093 A CN 112749093A CN 202110045307 A CN202110045307 A CN 202110045307A CN 112749093 A CN112749093 A CN 112749093A
- Authority
- CN
- China
- Prior art keywords
- bug
- determining
- test case
- time difference
- test
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3684—Test management for test design, e.g. generating new test cases
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
Abstract
The application discloses a test case management method, device, equipment and storage medium. The test case management method comprises the following steps: receiving input information relevant to a BUG; determining the priority of the BUG according to its name and type and a preset BUG priority list; and sending the relevant information of the BUG to the client of the corresponding maintainer according to the BUG's priority, so that the maintainer can repair it. According to the technical scheme, because BUG priorities are set, maintainers can resolve higher-priority BUGs first, which improves their overall efficiency.
Description
Technical Field
The application relates to the technical field of software testing, in particular to a method, a device, equipment and a storage medium for managing test cases.
Background
A tester runs software tests against test cases, reports each BUG found to a maintainer for repair, and retests once the repair is complete. Many BUGs may be generated during testing. For example, some BUGs prevent the system from logging in normally, while others are merely deformed pictures or wrongly written characters in the display. If all of them are reported to maintainers together, then, given heavy workloads, a large number of BUGs to maintain, and tight deadlines, a maintainer may end up fixing low-severity BUGs while missing the high-severity ones.
Disclosure of Invention
The application mainly aims to provide a test case management method, device, equipment and storage medium, so as to solve the prior-art problem that a poorly organized division of BUG repair work lowers overall efficiency.
In order to achieve the above object, according to an aspect of the present application, there is provided a test case management method, including:
receiving relevant information of the input BUG;
the relevant information includes: name, type and auxiliary information; the auxiliary information comprises one or more of the following: video, picture, text description;
determining the priority of the BUG according to the name of the BUG and a preset BUG priority list;
determining corresponding maintenance personnel according to the name and the type of the BUG;
and sending the relevant information of the BUG to the client of the corresponding maintainer according to the priority of the BUG so as to enable the corresponding maintainer to repair the BUG.
In one embodiment, after the BUG is repaired by a maintenance person, the method further comprises:
retesting the BUG by adopting a test case, and if the test passes, determining that the BUG is successfully repaired; and if the test is not passed, determining that the BUG is not repaired successfully.
In one embodiment, determining the corresponding maintenance person according to the name and the type of the BUG includes:
determining a database of corresponding maintenance personnel according to the name and the type of the BUG;
determining the maintainer with the highest priority from the database of maintainers; and/or determining a maintainer who currently has time available.
In one embodiment, after the retest is finished, recording the time point T2, sent by the maintainer's client, at which the BUG repair was completed;
calculating a time difference T between the T2 and a time T1 of the BUG generation;
and if the time difference T is larger than a preset time difference threshold value, determining a deduction value of the maintenance personnel according to the time difference T.
In one embodiment, the method further comprises:
recording the time point T3 when the retest is finished;
calculating a time difference T between the T3 and the T2;
and if the time difference T is greater than a predetermined second time difference threshold, determining a deduction value for the tester according to the time difference T.
In one embodiment, after the test is finished, the method further comprises:
determining the number M of BUGs generated in the BUG test process of a product;
counting the total number N of the test cases;
judging whether k×M of the BUGs are distributed within ζ×N of the test cases;
wherein k is a predetermined BUG proportionality coefficient and ζ is a predetermined test case proportionality coefficient;
if so, determining that the test case distribution of the whole product is normal.
In order to achieve the above object, according to a second aspect of the present application, there is provided a test case management apparatus; the device includes:
the receiving module is used for receiving relevant information of the input BUG;
the relevant information includes: name, type and auxiliary information; the auxiliary information comprises one or more of the following: video, picture, text description;
the determining module is used for determining the priority of the BUG according to the name of the BUG and a preset BUG priority list;
determining corresponding maintenance personnel according to the name and the type of the BUG;
and the BUG sending module is used for sending the relevant information of the BUG to the client of the corresponding maintainer according to the priority of the BUG so as to enable the corresponding maintainer to repair the BUG.
In an embodiment, the apparatus further includes a test module, configured to perform retesting on the BUG by using a test case after a maintenance worker repairs the BUG, and if the test passes, determine that the BUG is successfully repaired; and if the test is not passed, determining that the BUG is not repaired successfully.
In an embodiment, the determining module is further configured to determine a database of a corresponding maintenance person according to the name and the type of the BUG;
determining the maintainer with the highest priority from the database of maintainers; and/or determining a maintainer who currently has time available.
In an embodiment, the apparatus further comprises a calculating module, configured to record, after the retest is finished, the time point T2, sent by the maintainer's client, at which the BUG repair was completed;
calculating a time difference T between the T2 and a time T1 of the BUG generation;
and if the time difference T is larger than a preset time difference threshold value, determining a deduction value of the maintenance personnel according to the time difference T.
In one embodiment, the system further comprises a calculating module, configured to record a time point T3 when the retest is ended;
calculating a time difference T between the T3 and the T2;
and if the time difference T is greater than a predetermined second time difference threshold, determining a deduction value for the tester according to the time difference T.
In an embodiment, the system further comprises a calculation module, configured to determine, after the test is finished, the number M of the BUGs generated by the product in the BUG test process;
counting the total number N of the test cases;
judging whether k×M of the BUGs are distributed within ζ×N of the test cases;
wherein k is a predetermined BUG proportionality coefficient and ζ is a predetermined test case proportionality coefficient;
if so, determining that the test case distribution of the whole product is normal.
In order to achieve the above object, according to a third aspect of the present application, there is provided test case management equipment comprising at least one processor and at least one memory; the memory is configured to store one or more program instructions; and the processor is configured to execute the one or more program instructions to perform the following steps:
receiving relevant information of the input BUG;
the relevant information includes: name, type and auxiliary information; the auxiliary information comprises one or more of the following: video, picture, text description;
determining the priority of the BUG according to the name of the BUG and a preset BUG priority list;
determining corresponding maintenance personnel according to the name and the type of the BUG;
and sending the relevant information of the BUG to the client of the corresponding maintainer according to the priority of the BUG so as to enable the corresponding maintainer to repair the BUG.
In an embodiment, the processor is further configured to, after the BUG is repaired by a maintenance person, perform retest on the BUG by using a test case, and if the test passes, determine that the BUG is successfully repaired; and if the test is not passed, determining that the BUG is not repaired successfully.
In one embodiment, the processor is further configured to determine a database of corresponding maintenance personnel according to the name and the type of the BUG;
determining the maintainer with the highest priority from the database of maintainers; and/or determining a maintainer who currently has time available.
In an embodiment, the processor is further configured to record, after the retest is finished, the time point T2, sent by the maintainer's client, at which the BUG repair was completed;
calculating a time difference T between the T2 and a time T1 of the BUG generation;
and if the time difference T is larger than a preset time difference threshold value, determining a deduction value of the maintenance personnel according to the time difference T.
In one embodiment, the processor is further configured to record a point in time T3 when the retest ends;
calculating a time difference T between the T3 and the T2;
and if the time difference T is greater than a predetermined second time difference threshold, determining a deduction value for the tester according to the time difference T.
In one embodiment, the processor is further configured to, after the testing is finished, determine the number M of the BUGs generated by the product in the BUG testing process;
counting the total number N of the test cases;
judging whether k×M of the BUGs are distributed within ζ×N of the test cases;
wherein k is a predetermined BUG proportionality coefficient and ζ is a predetermined test case proportionality coefficient;
if so, determining that the test case distribution of the whole product is normal.
To achieve the above object, according to a fourth aspect of the present application, there is provided a computer-readable storage medium having one or more program instructions embodied therein, the one or more program instructions being for performing the steps of:
receiving relevant information of the input BUG;
the relevant information includes: name, type and auxiliary information; the auxiliary information comprises one or more of the following: video, picture, text description;
determining the priority of the BUG according to the name of the BUG and a preset BUG priority list;
determining corresponding maintenance personnel according to the name and the type of the BUG;
and sending the relevant information of the BUG to the client of the corresponding maintainer according to the priority of the BUG so as to enable the corresponding maintainer to repair the BUG.
In one embodiment, after a maintainer repairs the BUG, retesting the BUG by using a test case, and if the test passes, determining that the BUG is repaired successfully; and if the test is not passed, determining that the BUG is not repaired successfully.
In one embodiment, determining the corresponding maintenance person according to the name and the type of the BUG includes:
determining a database of corresponding maintenance personnel according to the name and the type of the BUG;
determining the maintainer with the highest priority from the database of maintainers; and/or determining a maintainer who currently has time available.
In one embodiment, after the retest is finished, recording the time point T2, sent by the maintainer's client, at which the BUG repair was completed;
calculating a time difference T between the T2 and a time T1 of the BUG generation;
and if the time difference T is larger than a preset time difference threshold value, determining a deduction value of the maintenance personnel according to the time difference T.
In one embodiment, the method further comprises:
recording the time point T3 when the retest is finished;
calculating a time difference T between the T3 and the T2;
and if the time difference T is greater than a predetermined second time difference threshold, determining a deduction value for the tester according to the time difference T.
In one embodiment, after the test is finished, the method further comprises:
determining the number M of BUGs generated in the BUG test process of a product;
counting the total number N of the test cases;
judging whether k×M of the BUGs are distributed within ζ×N of the test cases;
wherein k is a predetermined BUG proportionality coefficient and ζ is a predetermined test case proportionality coefficient;
if so, determining that the test case distribution of the whole product is normal.
According to the technical scheme, the relevant information of the BUG is sent to the client side of the corresponding maintainer according to the priority of the BUG, so that the corresponding maintainer can repair the BUG. Due to the fact that the BUG priority is set, maintenance personnel can preferentially solve the BUG with the higher priority, and overall efficiency of the maintenance personnel is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, serve to provide a further understanding of the application and to enable other features, objects, and advantages of the application to be more apparent. The drawings and their description illustrate the embodiments of the invention and do not limit it. In the drawings:
FIG. 1 is a flow chart of a test case management method according to an embodiment of the present application;
FIG. 2 is a flow diagram of another test case management method according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a test case management apparatus according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a test case management device according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances such that embodiments of the application described herein may be used. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
First, the technical terms used in the present application are introduced.
Use Case is a very important concept in UML and occupies a central position throughout the UML-based software development process. A use case is an abstract description of a set of action sequences that the system executes to produce a corresponding result.
The application provides a test case management method, which refers to a flow chart of the test case management method shown in the attached figure 1; the method comprises the following steps:
step S102, receiving relevant information of the input BUG;
the related information includes: name, type and collateral information; the auxiliary information comprises one or more of the following: video, picture, text description.
The video or picture is captured by the tester when the BUG occurs during case testing, which helps maintainers locate the BUG more quickly. The text description is a summary description of the BUG.
Specifically, a tester executes a test case; when a BUG is generated, the tester adds the BUG's name, type and auxiliary information on the case page, selects a repair state, and submits the BUG to a maintainer for maintenance.
Illustratively, the name of a BUG is generally associated with the project. For a login feature, for example, a typical BUG is that an incorrect user name and password still allow a normal login; such a BUG is named a login BUG. Specific examples include: the user name and password do not match; the user name does not reach the required number of characters; or the password does not enforce the required combination of digits, uppercase and lowercase letters, and special characters.
Another category is the display BUG; for example, the displayed page contains wrongly written characters, or a displayed picture is deformed or cannot be shown at all.
Types of BUG include: testing BUG, on-line BUG, smoke BUG, and pre-release BUG.
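The BUG record described above (name, type, and auxiliary information) can be sketched as a small data model. This is an illustrative reconstruction, not part of the claims; the enum string values and field names are assumptions:

```python
from dataclasses import dataclass, field
from enum import Enum

class BugType(Enum):
    # The four BUG types named in the description; string values are assumed.
    TEST = "test"                # found in the simulated pre-launch environment
    ONLINE = "online"            # found after the product goes live
    SMOKE = "smoke"              # found while executing smoke cases
    PRE_RELEASE = "pre-release"  # found during the pre-release process

@dataclass
class Bug:
    # The "relevant information" fields: name, type, auxiliary material.
    name: str
    type: BugType
    attachments: list = field(default_factory=list)  # videos, pictures, text
```

A tester submitting a login BUG would then construct something like `Bug("system crash", BugType.TEST)`, with any screenshots appended to `attachments`.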
A testing BUG is generated when a case is tested in a simulated environment before the product goes live; testing BUGs tend to be numerous, and the most serious BUGs generally appear at this stage.
An on-line BUG is generated after the product goes live; because the product has already been tested off-line, on-line BUGs are generally minor.
A smoke BUG is generated while the product executes smoke test cases.
A pre-release BUG is generated during the product's pre-release process.
Step S104, determining the priority of the BUG according to the name of the BUG and a preset BUG priority list;
specifically, the priority list may be a system class, a login class, a content class, and a display class;
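The priority lookup in step S104 can be sketched as a match against the preset list. The list contents and the keyword-matching rule are assumptions for illustration; the real system may key on exact BUG names instead:

```python
# Hypothetical preset BUG priority list: earlier category = higher priority.
PRIORITY_LIST = ["system", "login", "content", "display"]

def bug_priority(bug_name: str) -> int:
    """Rank a BUG by the first priority-list category its name mentions;
    names matching no category get the lowest rank."""
    for rank, category in enumerate(PRIORITY_LIST):
        if category in bug_name.lower():
            return rank
    return len(PRIORITY_LIST)
```

With this rule, a "System crash" BUG ranks above a "display deformation" BUG, matching the ordering of the four classes above.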
step S106, determining corresponding maintenance personnel according to the name and the type of the BUG;
Different maintainers are arranged to maintain the product at different stages; considering that personnel change dynamically, the correspondence between each person's work number and the BUGs they handle may be preset.
For example, see the correspondence table of the BUG type, the name and the work number of the maintenance person shown in table 1.
TABLE 1
As shown in table 1 above, taking the test BUG as an example only, this type of BUG includes the following BUG names:
For a system crash, the corresponding maintainer's work number is 001. A system crash is a serious BUG that prevents users from logging into the system at all, so employee 001 holds a higher technical rank within the maintenance team; that is, higher-ranked employees are assigned to handle higher-priority BUGs.
The interface displays wrongly written characters, and the work number of the corresponding maintenance staff is 002.
For a deformed picture on the interface, a minor BUG, the corresponding maintainer's work number is 003. This maintainer specializes in picture processing and is proficient in software such as Photoshop, which speeds up resolving such BUGs and, in turn, the launch of the whole product.
For products that can be purchased without payment, the corresponding maintainer's work number is 004. A payment BUG is also a serious problem, so maintainer 004 specializes in the payment flow, which improves processing efficiency; this employee's rank within the team should accordingly be higher.
It is worth emphasizing that the maintenance team may comprise a plurality of different groups, each containing employees of different ranks and specialties: some are responsible for picture BUGs, others for system BUGs. See Table 2 for the groups corresponding to each BUG type.
TABLE 2
And S108, sending the relevant information of the BUG to the client of the corresponding maintainer according to the priority of the BUG so as to enable the corresponding maintainer to repair the BUG.
Illustratively, in the BUG testing stage, a system crash BUG has the highest priority, and its relevant information is sent to the client of employee 001. Employee 001 can therefore process the BUG in time, improving the efficiency of BUG processing and allowing maintainers to divide the work among themselves, which raises the efficiency of software testing for the whole product.
According to the technical scheme, after a BUG is found, its relevant information is entered on the case page; after clicking submit, the relevant information of the BUG is sent to the maintainer's client.
Specifically, the relevant information of the BUG is sent to the client of the corresponding maintainer according to the BUG's priority, so that the maintainer can repair it; this avoids the loss of overall efficiency that occurs when maintainers handle many different BUGs without a detailed division of labor.
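The name-to-maintainer routing of step S106 can be sketched from the Table 1 examples given in the text. The dictionary below is a hypothetical reconstruction (the actual table is not reproduced in this document), and the fallback value "unassigned" is an assumption:

```python
# Hypothetical reconstruction of the Table 1 mapping described in the text.
MAINTAINER_FOR_BUG = {
    "system crash": "001",
    "interface typo": "002",
    "deformed picture": "003",
    "purchase without payment": "004",
}

def dispatch(bug_name: str) -> str:
    """Return the work number of the maintainer responsible for a BUG name;
    a real system would then push the BUG's information to that client."""
    return MAINTAINER_FOR_BUG.get(bug_name, "unassigned")
```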
In one embodiment, after a maintainer repairs the BUG, retesting the BUG by using a test case, and if the test passes, determining that the BUG is repaired successfully; and if the test is not passed, determining that the BUG is not repaired successfully.
In one embodiment, when determining the corresponding maintainer according to the name and the type of the BUG, a set of candidate maintainers is first determined according to the name and the type of the BUG; then the maintainer with the highest priority is determined from that set, and/or a maintainer who currently has time available is determined.
For example, see another correspondence table between the BUG and the maintenance person shown in table 3.
TABLE 3
As shown in table 3, at a certain time, the maintainer 001 has a high priority and is in an idle state;
the maintainer 002, low priority, busy status;
the maintainer 003, priority is low, status is busy;
the maintainer 004, with medium priority, is idle.
When a system crash BUG is determined, whether maintainer 001 is idle is judged; if idle, the relevant information of the BUG is sent to maintainer 001's client for maintenance. If maintainer 001 is busy, the information is instead sent to maintainer 004, whose priority is medium and who is idle.
When a deformed-picture BUG on the interface is determined, whether maintainer 003 is idle is judged; if idle, the relevant information of the BUG is sent to maintainer 003's client for maintenance.
When a BUG of wrongly written characters on the interface is determined, whether maintainer 002 is idle is judged; if idle, the relevant information of the BUG is sent to maintainer 002's client for maintenance.
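The idle/busy fallback logic just described can be sketched as follows. The staff snapshot is a hypothetical reconstruction of Table 3, and the tie-breaking rule (pick the highest-priority idle maintainer) follows the system-crash example:

```python
# Hypothetical snapshot of Table 3: work number -> (priority, status).
STAFF = {
    "001": ("high", "idle"),
    "002": ("low", "busy"),
    "003": ("low", "busy"),
    "004": ("medium", "idle"),
}
RANK = {"high": 0, "medium": 1, "low": 2}

def pick_maintainer(preferred: str) -> str:
    """Use the preferred maintainer if idle; otherwise fall back to the
    highest-priority idle maintainer, as in the system-crash example."""
    if STAFF.get(preferred, ("", "busy"))[1] == "idle":
        return preferred
    idle = [w for w, (_, s) in STAFF.items() if s == "idle"]
    return min(idle, key=lambda w: RANK[STAFF[w][0]]) if idle else preferred
```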
It is to be emphasized that the status of the priority is not fixed, but is adjusted periodically. In one embodiment, the method further comprises:
counting, for any maintainer, the number of BUGs resolved within a predetermined period, and the time taken to resolve each BUG;
wherein the predetermined period may be one day, or any other period set flexibly.
The priority is then determined according to the number of BUGs resolved and/or the time taken to resolve each BUG.
See the correspondence table of priorities and BUGs shown in Table 4.
TABLE 4
Preferably, the priority rating also takes tester feedback into account: if repairing a BUG generates other related BUGs, or the tester finds that the BUG is still present after the repair, the repair quality is low, and the priority rating is lowered accordingly.
Illustratively, score = 20 × (number of BUGs repaired in a day) − 10 × (number of negative feedbacks).
If maintainer 001 repairs 5 BUGs in one day with 1 negative feedback, the score is 90; 90 falls within the 80-100 range, so the priority is determined to be high.
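The scoring formula and its worked example can be sketched directly. The 80-100 band is given in the text; the lower band cut-offs are assumptions for illustration:

```python
def maintainer_score(bugs_repaired: int, negative_feedback: int) -> int:
    """score = 20 * BUGs repaired in a day - 10 * negative feedbacks."""
    return 20 * bugs_repaired - 10 * negative_feedback

def priority_band(score: int) -> str:
    """80-100 maps to 'high' per the worked example; the lower bands
    are assumed cut-offs."""
    if score >= 80:
        return "high"
    if score >= 50:
        return "medium"
    return "low"
```

With 5 repairs and 1 negative feedback, the score is 90 and the priority band is "high", matching the example above.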
Wherein negative or positive feedback is selected by the tester.
In order to motivate maintainers to solve problems actively, improve efficiency, and realize scoring of maintainers, in one embodiment, after the retest is finished, the time point T2, sent by the maintainer's client, at which the BUG repair was completed is recorded;
calculating a time difference T between the T2 and a time T1 of the BUG generation;
and if the time difference T is larger than a preset time difference threshold value, determining a deduction value of the maintenance personnel according to the time difference T.
Illustratively, if the time maintainer 001 takes to repair the BUG exceeds a predetermined threshold, such as 24 hours (the specific threshold can be set flexibly), points are deducted. The deduction value can also be set flexibly and may be proportional to the time consumed.
For example, 5 points are deducted once 24 hours are exceeded, and a further 2.5 points are deducted for each additional 12 hours. This increases maintainers' motivation.
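This deduction rule can be sketched as a small function. The stepwise "2.5 points per further 12 hours" reading is an interpretation of the example above, so treat it as an assumption:

```python
def repair_deduction(hours_taken: float, threshold: float = 24.0) -> float:
    """Deduct 5 points once repair time exceeds the 24 h threshold, plus a
    further 2.5 points for each additional 12 h, per the worked example."""
    if hours_taken <= threshold:
        return 0.0
    extra_half_days = (hours_taken - threshold) // 12
    return 5.0 + 2.5 * extra_half_days
```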
To achieve scoring for the tester, in one embodiment, the method further comprises the following steps:
recording the time point T3 when the retest is finished;
calculating a time difference T between the T3 and the T2;
and if the time difference T is greater than a predetermined second time difference threshold, determining a deduction value for the tester according to the time difference T.
For example, the second time difference threshold may be 2 hours and can be set flexibly. If the tester has not finished retesting after more than 2 hours, the tester's efficiency is low and points may be deducted; the deduction, for example 5 points, can be set flexibly and may be proportional to how far the second time difference threshold is exceeded. For instance, if the retest takes 3 hours, a further 2.5 points are deducted.
A Test Case refers to a description of the task of testing a particular software product. In software testing, case design plays a crucial role in ensuring that the test cases cover the requirements, i.e. that coverage of the system, from the whole function down to each single function, is as high as possible. According to the 80/20 rule of management, in a good, healthy product test case system, twenty percent of the cases should account for eighty percent of the BUGs. Therefore, in order to count the distribution of test cases and BUGs, in an embodiment, after the test is finished, the method further includes:
determining the number M of BUGs generated in the BUG test process of a product;
counting the total number N of the test cases;
judging whether k×M of the BUGs are distributed within ζ×N of the test cases;
wherein k is a predetermined BUG proportionality coefficient and ζ is a predetermined test case proportionality coefficient;
if so, determining that the test case distribution of the whole product is normal.
Wherein k is 0.8; ζ was 0.2.
It is emphasized that the above proportionality coefficients can also be adjusted: k should not be less than 0.5, and ζ should not be greater than 0.5. The general principle is that a small share of the cases bears most of the BUGs.
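The distribution check can be sketched as follows, assuming per-case BUG counts are available. With the default coefficients k = 0.8 and ζ = 0.2, this is the 80/20 check; the "top-cases by BUG count" selection is an assumption about how the distribution is measured:

```python
def distribution_ok(bugs_per_case: list, k: float = 0.8, zeta: float = 0.2) -> bool:
    """Check whether at least k*M of the M total BUGs fall within the top
    zeta*N of the N test cases (cases sorted by BUG count)."""
    m_bugs = sum(bugs_per_case)
    n_cases = len(bugs_per_case)
    top = sorted(bugs_per_case, reverse=True)[: max(1, round(zeta * n_cases))]
    return sum(top) >= k * m_bugs
```

For instance, five cases yielding [8, 1, 1, 0, 0] BUGs pass the check (one case holds 80% of the BUGs), while a uniform [2, 2, 2, 2, 2] does not.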
The present application also proposes another testing method, see the flow chart of another test case management method shown in fig. 2; the method comprises the following steps:
step S200, logging in a management system;
specifically, after the user inputs a user name and a password, the user logs in the management system.
The users include the following roles:
the project manager, who adds work tasks; the work tasks are test tasks assigned to the individual testers;
the tester, who performs software testing according to the received work tasks;
and the maintainer, i.e., a software developer, who reproduces each BUG found in the test process and repairs it.
The test case management system adds the test cases to the work tasks and manages them in a unified way. After the personnel of each role log in to the system, they can see the BUGs submitted by the testers, so that the BUGs can be resolved more conveniently and quickly, improving working efficiency.
Step S201, entering a current project;
step S202, creating a new import case;
specifically, after the project manager adds a work task, the tester enters the current work task and the specific project, adds test cases, and can also download a template and import test cases.
Step S203, case review is carried out to perfect the test cases;
specifically, after the cases are added, the project members organize a case review of the corresponding test cases; after the review is completed, related cases are added or deleted.
Each test case can be numbered; the number includes a version, and if the case is modified, the version is upgraded.
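A minimal sketch of the version-in-number scheme above, assuming a hypothetical `-v<n>` suffix convention (the application does not specify a concrete number format):

```python
def bump_version(case_number: str) -> str:
    """Upgrade the version suffix of a test case number after modification."""
    base, _, ver = case_number.rpartition("-v")   # split off the "-v<n>" suffix
    return f"{base}-v{int(ver) + 1}"

print(bump_version("TC-001-v1"))  # TC-001-v2
```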
Step S204, executing and marking a use case;
the tester executes the test case and creates a corresponding BUG according to the test case; videos or pictures can be attached so that the developer can conveniently locate the BUG.
Step S205, judging whether the test is passed; if yes, performing step S208; if no, performing step S206.
Step S206, adding a verification result and submitting the BUG.
Step S207, providing the BUG to a developer for repair.
Step S208, closing the BUG.
Step S209, counting the BUGs.
Specifically, counting the number of BUGs and their distribution.
For example, determining whether eighty percent of the BUGs are distributed among twenty percent of the use cases; if so, the case writing is qualified and the distribution is reasonable.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flowcharts, in some cases the steps shown or described may be performed in an order different from the one presented herein.
According to an embodiment of the present invention, a test case management device is further provided, referring to a schematic structural diagram of the test case management device shown in fig. 3; the device includes:
a receiving module 31, configured to receive relevant information of the input BUG;
the related information includes: name, type and auxiliary information; the auxiliary information comprises one or more of the following: video, picture, text description;
a determining module 32, configured to determine a priority of the BUG according to the name of the BUG and a predetermined BUG priority list;
determining corresponding maintenance personnel according to the name and the type of the BUG;
and the BUG sending module 33 is used for sending the relevant information of the BUG to the client of the corresponding maintainer according to the priority of the BUG so as to enable the corresponding maintainer to repair the BUG.
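The receive → prioritize → dispatch flow performed by the three modules above can be sketched as follows. The priority list, routing table, field names, and addresses are assumptions made for illustration, not part of the application:

```python
# Assumed lookup tables: a BUG priority list keyed by name, and a
# maintainer routing table keyed by (name, type). Both are placeholders.
PRIORITY_LIST = {"crash": 1, "data-loss": 1, "ui-glitch": 3}
MAINTAINERS = {("crash", "backend"): "dev-a@example.com"}

def dispatch_bug(bug: dict) -> dict:
    """Determine a BUG's priority and maintainer, and build the dispatch record."""
    priority = PRIORITY_LIST.get(bug["name"], 5)            # unknown -> lowest
    maintainer = MAINTAINERS.get((bug["name"], bug["type"]))
    return {"priority": priority, "maintainer": maintainer, "info": bug}

ticket = dispatch_bug({"name": "crash", "type": "backend", "attachments": []})
print(ticket["priority"], ticket["maintainer"])   # 1 dev-a@example.com
```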
In an embodiment, the apparatus further includes a test module, configured to perform retesting on the BUG by using a test case after a maintenance worker repairs the BUG, and if the test passes, determine that the BUG is successfully repaired; and if the test is not passed, determining that the BUG is not repaired successfully.
In an embodiment, the determining module is further configured to determine a database of a corresponding maintenance person according to the name and the type of the BUG;
determining the maintainer with the highest priority from the database of maintainers; and/or determining a maintainer who currently has available time.
In an embodiment, the system further comprises a calculating module, configured to record, after the retest is finished, a time point T2, sent by the client of the maintainer, at which the BUG repair is completed;
calculating a time difference T between the T2 and a time T1 of the BUG generation;
and if the time difference T is larger than a preset time difference threshold value, determining a deduction value of the maintenance personnel according to the time difference T.
In one embodiment, the system further comprises a calculating module, configured to record a time point T3 when the retest is ended;
calculating a time difference T between the T3 and the T2;
and if the time difference T is greater than a predetermined second time difference threshold, determining a deduction value for the tester according to the time difference T.
In an embodiment, the system further comprises a calculation module, configured to determine, after the test is finished, the number M of the BUGs generated by the product in the BUG test process;
counting the total number N of the test cases;
judging whether k×M of the BUGs are distributed among ζ×N of the test cases;
wherein k is a predetermined BUG proportionality coefficient; ζ is a predetermined test case proportionality coefficient;
if so, determining that the test case distribution of the whole product is normal.
According to a third aspect of the present application, there is provided a test case management apparatus; referring to fig. 4, it includes at least one processor 41 and at least one memory 42; the memory 42 is configured to store one or more program instructions; the processor 41 is configured to execute the one or more program instructions to perform the following steps:
receiving relevant information of the input BUG;
the related information includes: name, type and auxiliary information; the auxiliary information comprises one or more of the following: video, picture, text description;
determining the priority of the BUG according to the name of the BUG and a preset BUG priority list;
determining corresponding maintenance personnel according to the name and the type of the BUG;
and sending the relevant information of the BUG to the client of the corresponding maintainer according to the priority of the BUG so as to enable the corresponding maintainer to repair the BUG.
In an embodiment, the processor 41 is further configured to, after the BUG is repaired by a maintenance worker, perform retest on the BUG by using a test case, and if the test passes, determine that the BUG is successfully repaired; and if the test is not passed, determining that the BUG is not repaired successfully.
In an embodiment, the processor 41 is further configured to determine a database of corresponding maintenance personnel according to the name and the type of the BUG;
determining the maintainer with the highest priority from the database of maintainers; and/or determining a maintainer who currently has available time.
In an embodiment, the processor 41 is further configured to record, after the retest is finished, a time point T2, sent by the client of the maintainer, at which the BUG repair is completed;
calculating a time difference T between the T2 and a time T1 of the BUG generation;
and if the time difference T is larger than a preset time difference threshold value, determining a deduction value of the maintenance personnel according to the time difference T.
In one embodiment, the processor 41 is further configured to record a time point T3 when the retest ends;
calculating a time difference T between the T3 and the T2;
and if the time difference T is greater than a predetermined second time difference threshold, determining a deduction value for the tester according to the time difference T.
In one embodiment, the processor 41 is further configured to, after the testing is finished, determine the number M of the BUGs generated by the product in the BUG testing process;
counting the total number N of the test cases;
judging whether k×M of the BUGs are distributed among ζ×N of the test cases;
wherein k is a predetermined BUG proportionality coefficient; ζ is a predetermined test case proportionality coefficient;
if so, determining that the test case distribution of the whole product is normal.
According to a fourth aspect of the present application, there is provided a computer readable storage medium having one or more program instructions embodied therein, the one or more program instructions for performing the steps of:
receiving relevant information of the input BUG;
the related information includes: name, type and auxiliary information; the auxiliary information comprises one or more of the following: video, picture, text description;
determining the priority of the BUG according to the name of the BUG and a preset BUG priority list;
determining corresponding maintenance personnel according to the name and the type of the BUG;
and sending the relevant information of the BUG to the client of the corresponding maintainer according to the priority of the BUG so as to enable the corresponding maintainer to repair the BUG.
In one embodiment, after the BUG is repaired by a maintenance person, the method further comprises:
retesting the BUG by adopting a test case, and if the test passes, determining that the BUG is successfully repaired; and if the test is not passed, determining that the BUG is not repaired successfully.
In one embodiment, determining the corresponding maintenance person according to the name and the type of the BUG includes:
determining a database of corresponding maintenance personnel according to the name and the type of the BUG;
determining the maintainer with the highest priority from the database of maintainers; and/or determining a maintainer who currently has available time.
In one embodiment, after the retest is finished, the method further comprises: recording a time point T2, sent by the client of the maintainer, at which the BUG repair is completed;
calculating a time difference T between the T2 and a time T1 of the BUG generation;
and if the time difference T is larger than a preset time difference threshold value, determining a deduction value of the maintenance personnel according to the time difference T.
In one embodiment, the method further comprises: recording the time point T3 when the retest is finished;
calculating a time difference T between the T3 and the T2;
and if the time difference T is greater than a predetermined second time difference threshold, determining a deduction value for the tester according to the time difference T.
In one embodiment, after the test is finished, the method further comprises:
determining the number M of BUGs generated in the BUG test process of a product;
counting the total number N of the test cases;
judging whether k×M of the BUGs are distributed among ζ×N of the test cases;
wherein k is a predetermined BUG proportionality coefficient; ζ is a predetermined test case proportionality coefficient;
if so, determining that the test case distribution of the whole product is normal.
In an embodiment of the invention, the processor may be an integrated circuit chip having signal processing capability. The processor may be a general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The various methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed by such a processor. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be implemented directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in RAM, flash memory, ROM, PROM, EPROM, registers, or other storage media well known in the art. The processor reads the information in the storage medium and completes the steps of the method in combination with its hardware.
The storage medium may be a memory, for example, which may be volatile memory or nonvolatile memory, or which may include both volatile and nonvolatile memory.
The nonvolatile Memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash Memory.
The volatile memory may be a Random Access Memory (RAM), which serves as an external cache. By way of example and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DRRAM).
The storage media described in connection with the embodiments of the invention are intended to comprise, without being limited to, these and any other suitable types of memory.
Those skilled in the art will appreciate that the functionality described in the present invention may be implemented in a combination of hardware and software in one or more of the examples described above. When implemented in software, the corresponding functionality may be stored on, or transmitted as, one or more instructions or code on a computer-readable medium. Computer-readable media include both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
Claims (9)
1. A test case management method is characterized by comprising the following steps:
receiving relevant information of the input BUG;
the related information includes: name, type and auxiliary information; the auxiliary information comprises one or more of the following: video, picture, text description;
determining the priority of the BUG according to the name of the BUG and a preset BUG priority list;
determining corresponding maintenance personnel according to the name and the type of the BUG;
and sending the relevant information of the BUG to the client of the corresponding maintainer according to the priority of the BUG so as to enable the corresponding maintainer to repair the BUG.
2. The test case management method of claim 1, wherein after a maintenance person repairs the BUG, the method further comprises:
retesting the BUG by adopting a test case, and if the test passes, determining that the BUG is successfully repaired; and if the test is not passed, determining that the BUG is not repaired successfully.
3. The test case management method according to claim 1, wherein determining the corresponding maintenance person according to the name and the type of the BUG comprises:
determining a database of corresponding maintenance personnel according to the name and the type of the BUG;
determining the maintainer with the highest priority from the database of maintainers; and/or determining a maintainer who currently has available time.
4. The test case management method according to claim 2, wherein, after the retest is finished, a time point T2, sent by the client of the maintainer, at which the BUG repair is completed is recorded;
calculating a time difference T between the T2 and a time T1 of the BUG generation;
and if the time difference T is larger than a preset time difference threshold value, determining a deduction value of the maintenance personnel according to the time difference T.
5. The test case management method of claim 4, further comprising:
recording the time point T3 when the retest is finished;
calculating a time difference T between the T3 and the T2;
and if the time difference T is greater than a predetermined second time difference threshold, determining a deduction value for the tester according to the time difference T.
6. The test case management method according to claim 1, wherein after the test is completed, the method further comprises:
determining the number M of BUGs generated in the BUG test process of a product;
counting the total number N of the test cases;
judging whether k×M of the BUGs are distributed among ζ×N of the test cases;
wherein k is a predetermined BUG proportionality coefficient; ζ is a predetermined test case proportionality coefficient;
if so, determining that the test case distribution of the whole product is normal.
7. A test case management apparatus, comprising:
the receiving module is used for receiving relevant information of the input BUG;
the related information includes: name, type and collateral information; the auxiliary information comprises one or more of the following: video, picture, text description;
the determining module is used for determining the priority of the BUG according to the name of the BUG and a preset BUG priority list;
determining corresponding maintenance personnel according to the name and the type of the BUG;
and the BUG sending module is used for sending the relevant information of the BUG to the client of the corresponding maintainer according to the priority of the BUG so as to enable the corresponding maintainer to repair the BUG.
8. A test case management apparatus, comprising: at least one processor and at least one memory; the memory is to store one or more program instructions; the processor, configured to execute one or more program instructions to perform the method of any of claims 1-7.
9. A computer-readable storage medium having one or more program instructions embodied therein for performing the method of any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110045307.3A CN112749093A (en) | 2021-01-13 | 2021-01-13 | Test case management method, device, equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110045307.3A CN112749093A (en) | 2021-01-13 | 2021-01-13 | Test case management method, device, equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112749093A true CN112749093A (en) | 2021-05-04 |
Family
ID=75651807
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110045307.3A Pending CN112749093A (en) | 2021-01-13 | 2021-01-13 | Test case management method, device, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112749093A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114138660A (en) * | 2021-12-07 | 2022-03-04 | 中国建设银行股份有限公司 | Defect processing method, device, equipment, computer readable storage medium and product |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107885656B (en) | Automatic product algorithm testing method and application server | |
Dyckman et al. | Accounting research: Past, present, and future | |
Bath et al. | The software test engineer's handbook: a study guide for the ISTQB test analyst and technical test analyst advanced level certificates 2012 | |
US11461671B2 (en) | Data quality tool | |
Chopade et al. | Agile software development: Positive and negative user stories | |
CN109284974A (en) | A kind of checking method based on block chain, device, audit equipment and storage medium | |
CN111367982B (en) | Method, device, computer equipment and storage medium for importing TRRIGA basic data | |
CN112749093A (en) | Test case management method, device, equipment and storage medium | |
CN107153694B (en) | Method, device, equipment and storage medium for automatically modifying question errors | |
CN115358897A (en) | Student management method, system, terminal and storage medium based on electronic student identity card | |
CN103440460A (en) | Application system change validation method and system | |
US20180350252A1 (en) | Learning Program Provision System and Learning Program Provision Method | |
CN113822527A (en) | Compliance monitoring method, equipment, medium and product based on flexible employment platform | |
KR20190101555A (en) | System for automatically inspecting document | |
JP4572126B2 (en) | Audit processing program, apparatus and method | |
CN111639478B (en) | Automatic data auditing method and system based on EXCEL document | |
CN113902457A (en) | Method and device for evaluating reliability of house source information, electronic equipment and storage medium | |
CN107122272A (en) | A kind of automatic Verification method and device of CPU register informations | |
CN107885839B (en) | Method and device for reading information in Word file | |
US20140122419A1 (en) | Method and system to promote database cost savings | |
CN101996133A (en) | Interaction interface test system of Web application software | |
US20130091384A1 (en) | System and method for measuring the effect of interruptions on software application usability | |
CN113435696B (en) | Method and system for evaluating emergency disposal capability of rail transit vehicle dispatcher | |
CN112597246B (en) | Method, device and system for maintaining data consistency | |
CN115757355A (en) | Data collection processing method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||