CN117609058A - Test management method, device, equipment and storage medium - Google Patents

Test management method, device, equipment and storage medium

Info

Publication number: CN117609058A
Application number: CN202311631240.7A
Authority: CN (China)
Prior art keywords: test, sub, tested, item, items
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventor: 张东宇
Current Assignee: Agricultural Bank of China (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Agricultural Bank of China
Events: application filed by Agricultural Bank of China; priority to CN202311631240.7A; publication of CN117609058A; legal status: pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0631 Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06311 Scheduling, planning or task assignment for a person or group

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Economics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Strategic Management (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Game Theory and Decision Science (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application discloses a test management method, apparatus, device, and storage medium, which relate to the field of computer technology and can improve the overall test efficiency of a project. The method is applied to a server and includes the following steps: receiving a test request for a target project initiated by a client, where the test request carries project attribute parameters and test configuration parameters, the project attribute parameters including the sub-projects to be tested in the target project, the test difficulty level corresponding to each sub-project to be tested, and the test order of the sub-projects to be tested; determining, based on the project attribute parameters and the test configuration parameters, the test period corresponding to each sub-project to be tested, and determining, from the candidate test teams, the target test team corresponding to each sub-project to be tested; and creating test tasks for the sub-projects to be tested according to the test periods corresponding to the sub-projects to be tested and the target test teams corresponding to the sub-projects to be tested, and returning the test tasks to the client.

Description

Test management method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a test management method, apparatus, device, and storage medium.
Background
Currently, the following situation may occur during project testing of a software development project: when multiple test teams test different business function parts of the same project in the same test environment at the same time, the test operations of one test team (taking the case where the project to be tested is a bank's online transaction system as an example, such operations may include account migration, end-of-day cutover, and the like) may affect the test results of the business function parts being tested by the other test teams. For this reason, the test management approach of the prior art for a project is generally as follows: when a test team needs to test a project, it submits a test application to the background; if no other test team is testing the project at that time, the background approves the application and the test team can start testing the project; if another test team is testing the project at that time, the background rejects the application, and the test team has to wait until the other test team finishes its test before testing.
However, in the above conventional test management approach, the test time of each test team is uncertain, and a test team may need to wait a long time before it can start testing, resulting in low overall test efficiency for the project.
Disclosure of Invention
The application provides a test management method, a device, equipment and a storage medium, which can reduce the situation that a test team needs to wait for a long time to start a test, thereby improving the overall test efficiency of a project.
To achieve the above purpose, the present application adopts the following technical solutions:
In a first aspect, the present application provides a test management method applied to a server, including: receiving a test request for a target project initiated by a client, where the test request carries project attribute parameters and test configuration parameters, the project attribute parameters including the sub-projects to be tested in the target project, the test difficulty level corresponding to each sub-project to be tested, and the test order of the sub-projects to be tested, and the test configuration parameters including a total test duration and the associated candidate test teams; determining, based on the project attribute parameters and the test configuration parameters, the test period corresponding to each sub-project to be tested, and determining, from the candidate test teams, the target test team corresponding to each sub-project to be tested; and creating test tasks for the sub-projects to be tested according to the test periods corresponding to the sub-projects to be tested and the target test teams corresponding to the sub-projects to be tested, and returning the test tasks to the client.
In the technical solution provided by the application, when a project test needs to be performed on the target project, a project manager can input project information (such as the predicted total test duration) at the client, and the client can, in response to the project manager's input operation, determine the project attribute parameters and the test configuration parameters of the target project and initiate a test request carrying these parameters to the server. The project attribute parameters include the sub-projects to be tested in the target project, the test difficulty level corresponding to each sub-project to be tested, and the test order of the sub-projects to be tested, and the test configuration parameters include the total test duration and the associated candidate test teams. Because the test difficulty level of each sub-project to be tested reflects the test duration it requires (the higher the test difficulty, the longer the required test duration), the server can, after receiving the test request, reasonably allocate a test period to each sub-project to be tested according to its test difficulty level and the total test duration, evenly distribute the sub-projects to be tested among the candidate test teams, and then create test tasks for the sub-projects to be tested according to the determined information. The server may then return the test tasks to the client. In this way, by viewing the task information of the test tasks in the client (including the test period and the target test team corresponding to each sub-project to be tested), each candidate test team can reasonably plan and allocate its test time, which reduces the situation in which a test team needs to wait for a long time before starting its test.
It can be seen that, in the technical solution provided by the application, the server can automatically generate test tasks for the target project according to its project attribute parameters and test configuration parameters, that is, automatically allocate a test period and a test team to each sub-project to be tested of the target project. The test time of each test team is therefore determined in advance, and each test team can reasonably plan and allocate its test time according to the task information of the test tasks, which reduces the situation in which a test team needs to wait for a long time before starting its test and thereby improves the overall test efficiency of the project.
Optionally, determining a test period corresponding to each sub-item to be tested based on the item attribute parameter and the test configuration parameter, and determining a target test team corresponding to each sub-item to be tested from each candidate test team, including:
determining the sub-test duration corresponding to each sub-item to be tested based on the total test duration and the test difficulty level corresponding to each sub-item to be tested; determining the test period corresponding to each sub-item to be tested according to the test sequence and the sub-test durations respectively corresponding to the sub-items to be tested; and determining target test teams corresponding to the sub-items to be tested respectively from the candidate test teams based on the test sequence.
Optionally, determining, based on the test sequence, a target test team corresponding to each sub-item to be tested from each candidate test team, including:
determining a first sub-project to be tested from all sub-projects to be tested based on a test sequence, and determining a target test team corresponding to the first sub-project to be tested from all candidate test teams based on a preset matching rule; determining the next sub-project to be tested from all the sub-projects to be tested based on the test sequence, and determining a target test team corresponding to the next sub-project to be tested from all the candidate test teams based on a preset matching rule until the next sub-project to be tested is the last sub-project to be tested in all the sub-projects to be tested.
Optionally, after returning the test task to the client, the test management method provided by the application further includes:
receiving a task execution request for a test task initiated by a client; determining test deadlines corresponding to the sub-items to be tested respectively based on the test time periods corresponding to the sub-items to be tested respectively and the initial execution time carried in the task execution request; and returning the test deadlines respectively corresponding to the sub-items to be tested to the client so that the client presents the test deadlines respectively corresponding to the sub-items to be tested.
Optionally, after returning the test deadlines respectively corresponding to the sub-items to be tested to the client so that the client presents the test deadlines respectively corresponding to the sub-items to be tested, the test management method provided by the application further includes:
starting a timing monitoring task; the task content of the timing monitoring task is as follows: and for each sub-item to be tested, monitoring the time interval between the test deadline corresponding to the sub-item to be tested and the current moment, and returning prompt information to the client under the condition that the time interval meets the preset condition.
Optionally, after returning the test deadlines respectively corresponding to the sub-items to be tested to the client so that the client presents the test deadlines respectively corresponding to the sub-items to be tested, the test management method provided by the application further includes:
and stopping executing the timing monitoring task under the condition that a pause execution request for the test task initiated by the client is received.
Optionally, after determining the test deadlines corresponding to the sub-items to be tested, the test management method provided by the application further includes:
under the condition that a test completion instruction aiming at a target sub-item to be tested is received, determining each remaining sub-item to be tested in each sub-item to be tested; updating the test deadlines respectively corresponding to the remaining sub-items to be tested based on the current time and the test time periods respectively corresponding to the remaining sub-items to be tested; and returning the test deadlines respectively corresponding to the remaining sub-items to be tested to the client so that the client presents the test deadlines respectively corresponding to the remaining sub-items to be tested.
In a second aspect, the present application provides a test management apparatus, where the test management apparatus may be configured at a server, and includes: the device comprises a receiving module, a determining module and a creating module;
the receiving module is used for receiving a test request for a target item initiated by the client; the test request carries project attribute parameters and test configuration parameters, wherein the project attribute parameters comprise each sub-project to be tested in a target project, test difficulty levels corresponding to the sub-projects to be tested respectively, and test sequences aiming at the sub-projects to be tested, and the test configuration parameters comprise total test duration and associated candidate test teams; the determining module is used for determining the test time periods corresponding to the sub-projects to be tested respectively based on the project attribute parameters and the test configuration parameters, and determining target test teams corresponding to the sub-projects to be tested respectively from the candidate test teams; the creation module is used for creating test tasks aiming at all the sub-projects to be tested according to the test time periods corresponding to all the sub-projects to be tested respectively and the target test teams corresponding to all the sub-projects to be tested respectively, and returning the test tasks to the client.
Optionally, the determining module is specifically configured to:
Determining the sub-test duration corresponding to each sub-item to be tested based on the total test duration and the test difficulty level corresponding to each sub-item to be tested; determining the test time periods respectively corresponding to the sub-items to be tested according to the test sequence and the sub-test time periods respectively corresponding to the sub-items to be tested; and determining target test teams corresponding to the sub-projects to be tested respectively from the candidate test teams based on the test sequence.
Optionally, the determining module is specifically further configured to:
determining a first sub-project to be tested from all sub-projects to be tested based on a test sequence, and determining a target test team corresponding to the first sub-project to be tested from all candidate test teams based on a preset matching rule; determining the next sub-project to be tested from all the sub-projects to be tested based on the test sequence, and determining a target test team corresponding to the next sub-project to be tested from all the candidate test teams based on a preset matching rule until the next sub-project to be tested is the last sub-project to be tested in all the sub-projects to be tested.
Optionally, the test management apparatus provided in the present application may further include a sending module;
the receiving module is also used for receiving a task execution request for the test task initiated by the client after the test task is returned to the client; the determining module is further used for determining the test deadline corresponding to each sub-item to be tested based on the test time period corresponding to each sub-item to be tested and the initial execution time carried in the task execution request; and the sending module is used for returning the test deadlines respectively corresponding to the sub-items to be tested to the client so that the client can display the test deadlines respectively corresponding to the sub-items to be tested.
Optionally, the test management apparatus provided in the present application may further include a timing module;
the timing module is used for starting a timing monitoring task after the sending module returns the test deadlines respectively corresponding to the sub-items to be tested to the client so that the client presents the test deadlines respectively corresponding to the sub-items to be tested; the task content of the timing monitoring task is as follows: and for each sub-item to be tested, monitoring the time interval between the test deadline corresponding to the sub-item to be tested and the current moment, and returning prompt information to the client under the condition that the time interval meets the preset condition.
Optionally, the timing module is further configured to stop executing the timing monitoring task when receiving a request for suspending execution of the test task initiated by the client after the sending module returns the test deadlines respectively corresponding to the sub-items to be tested to the client so that the client presents the test deadlines respectively corresponding to the sub-items to be tested.
Optionally, the test management device provided by the application further includes an update module and a sending module;
the determining module is further used for determining each remaining sub-item to be tested in each sub-item to be tested under the condition that a test completion instruction aiming at the target sub-item to be tested is received after determining the test deadline corresponding to each sub-item to be tested respectively; the updating module is used for updating the test deadlines respectively corresponding to the remaining sub-items to be tested based on the current moment and the test time periods respectively corresponding to the remaining sub-items to be tested; and the sending module is used for returning the test deadlines respectively corresponding to the remaining sub-items to be tested to the client so that the client can display the test deadlines respectively corresponding to the remaining sub-items to be tested.
In a third aspect, the present application provides a test management apparatus comprising a memory, a processor, a bus, and a communication interface; the memory is used for storing computer execution instructions, and the processor is connected with the memory through a bus; when the test management apparatus is running, the processor executes computer-executable instructions stored in the memory to cause the test management apparatus to perform the test management method as provided in the first aspect described above.
In a fourth aspect, the present application provides a computer-readable storage medium having instructions stored therein that, when executed by a computer, cause the computer to perform the test management method as provided in the first aspect.
In a fifth aspect, the present application provides a computer program product comprising computer instructions which, when run on a computer, cause the computer to perform the test management method as provided in the first aspect.
It should be noted that the above-mentioned computer instructions may be stored in whole or in part on a computer-readable storage medium. The computer readable storage medium may be packaged together with the processor of the test management device, or may be packaged separately from the processor of the test management device, which is not limited in this application.
The description of the second, third, fourth and fifth aspects of the present application may refer to the detailed description of the first aspect; further, the advantageous effects described in the second aspect, the third aspect, the fourth aspect, and the fifth aspect may refer to the advantageous effect analysis of the first aspect, and are not described herein.
In the present application, the names of the above-mentioned devices or functional modules are not limited, and in actual implementation, these devices or functional modules may appear under other names. Insofar as the function of each device or function module is similar to the present application, it is within the scope of the present application and the equivalents thereof.
These and other aspects of the present application will be more readily apparent from the following description.
Drawings
Fig. 1 is a schematic flow chart of a test management method according to an embodiment of the present application;
Fig. 2 is a schematic diagram of matching a target test team for each sub-project to be tested according to an embodiment of the present application;
Fig. 3 is another schematic diagram of matching a target test team for each sub-project to be tested according to an embodiment of the present application;
Fig. 4 is a schematic flow chart of another test management method according to an embodiment of the present application;
Fig. 5 is a schematic structural diagram of a test management apparatus according to an embodiment of the present application;
Fig. 6 is a schematic structural diagram of a test management device according to an embodiment of the present application.
Detailed Description
The test management method, device, equipment and storage medium provided by the embodiment of the application are described in detail below with reference to the accompanying drawings.
The term "and/or" is herein merely an association relationship describing an associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone.
The terms "first" and "second" and the like in the description and in the drawings are used for distinguishing between different objects or for distinguishing between different processes of the same object and not for describing a particular sequential order of objects.
Furthermore, references to the terms "comprising" and "having" and any variations thereof in the description of the present application are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed but may optionally include other steps or elements not listed or inherent to such process, method, article, or apparatus.
It should be noted that, in the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the description of the present application, unless otherwise indicated, the meaning of "a plurality" means two or more.
In addition, the acquisition, storage, use, and processing of data in the technical solutions of the present application comply with the relevant provisions of national laws and regulations.
In the existing test management mode, the test time of each test team has uncertainty, and the situation that the test team needs to wait for a long time to start the test may occur, so that the overall test efficiency of the project is low.
Aiming at the problems in the prior art, the embodiment of the application provides a test management method, in which each test team can reasonably plan and distribute test time according to task information of a test task, so that the situation that the test team needs to wait for a long time to start testing can be reduced, and overall test efficiency of a project can be improved.
The test management method provided by the embodiment of the application can be applied to a test management system, wherein the test management system can comprise a client side and a server side, and the test management method can be applied to the server side. In addition, the test management method provided by the embodiment of the application may be executed by the test management apparatus provided by the embodiment of the application, and the test management apparatus may be implemented in a software and/or hardware manner and integrated in a test management device executing the method. The test management device may be a server or a server cluster corresponding to the server.
The test management method provided in the embodiment of the present application is described below with reference to the accompanying drawings.
Referring to fig. 1, the test management method provided in the embodiment of the present application includes S101-S103:
S101, receiving a test request for a target project initiated by a client.
The test request carries project attribute parameters and test configuration parameters, wherein the project attribute parameters comprise all sub-projects to be tested in a target project, test difficulty levels corresponding to all the sub-projects to be tested respectively, and test sequences aiming at all the sub-projects to be tested, and the test configuration parameters comprise total test duration and associated candidate test teams.
The sub-projects to be tested can be a plurality of sub-projects split from the target project, and each sub-project to be tested can correspond to one business function part. For example, taking the case where the target project is a bank's online transaction system, the sub-projects to be tested may include sub-projects respectively corresponding to business functions such as credit card services, deposit services, payment services, and loan services.
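The application does not prescribe any concrete message or field format for such a test request; purely as a non-authoritative illustration, the two parameter groups could be carried in a structure like the following sketch, in which every field name and value is hypothetical.

```python
# Hypothetical payload for a test request; field names and values are illustrative only,
# the application does not define a concrete wire format.
test_request = {
    "project_attributes": {
        "sub_projects": ["credit_card", "deposit", "payment", "loan"],
        "difficulty_levels": {"credit_card": 3, "deposit": 1, "payment": 1, "loan": 2},
        "test_order": ["deposit", "payment", "loan", "credit_card"],
    },
    "test_configuration": {
        "total_test_duration_days": 10,
        "candidate_test_teams": ["team_a", "team_b", "team_c"],
    },
}
```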
In one possible implementation, when the target project needs to be tested, the project manager may enter the project information in input boxes provided by an application interface of the client, and the client may, in response to the project manager's input operation, determine the project attribute parameters and the test configuration parameters according to the project information. The client may then initiate a test request for the target project to the server. Part of the project attribute parameters and test configuration parameters (for example, the sub-projects to be tested in the target project and the test difficulty level corresponding to each sub-project to be tested) can be determined by the client directly from the project information, while the other part needs to be obtained by further processing the project information. For example, the project information entered by the project manager may include a relationship graph of the sub-projects to be tested, and the client may process the relationship graph and determine the test order of the sub-projects to be tested according to the mutual influence relationships between them; for instance, if sub-project A to be tested needs to invoke the service corresponding to sub-project B to be tested during execution, the test order of sub-project B is before that of sub-project A. In another example, the project information entered by the project manager may include attribute parameters of each sub-project to be tested (such as the size of the file to be tested), and the client may determine the business complexity of the sub-project to be tested according to the attribute parameters of the file to be tested, and then determine its test difficulty level according to a predetermined mapping table (which represents the correspondence between business complexity and test difficulty level).
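The application leaves the concrete derivation open; the sketch below shows one plausible way a client could turn such inputs into the two parameters, assuming the relationship graph is given as "A invokes the service of B" edges (so a topological sort yields a valid test order, which is an assumption of this sketch, not something the application mandates) and assuming a simple file-size-to-difficulty mapping table.

```python
from graphlib import TopologicalSorter

def derive_test_order(invokes: dict[str, set[str]]) -> list[str]:
    """invokes[a] = {b, ...} means sub-project a calls services of sub-project b during
    execution, so b should be tested before a; a topological sort honors that."""
    return list(TopologicalSorter(invokes).static_order())

def derive_difficulty(file_size_kb: int, mapping: list[tuple[int, int]]) -> int:
    """mapping is a predetermined table of (max_file_size_kb, difficulty_level) rows,
    standing in for the business-complexity-to-difficulty correspondence."""
    for max_size, level in mapping:
        if file_size_kb <= max_size:
            return level
    return mapping[-1][1]

# Hypothetical inputs:
order = derive_test_order({"credit_card": {"deposit", "payment"},
                           "loan": {"deposit"}, "deposit": set(), "payment": set()})
level = derive_difficulty(800, [(500, 1), (2000, 2), (10_000, 3)])  # -> 2
```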
S102, determining test time periods corresponding to the sub-projects to be tested respectively based on project attribute parameters and test configuration parameters, and determining target test teams corresponding to the sub-projects to be tested respectively from candidate test teams.
Optionally, determining the test period corresponding to each sub-project to be tested based on the project attribute parameters and the test configuration parameters, and determining the target test team corresponding to each sub-project to be tested from the candidate test teams, includes: determining the sub-test duration corresponding to each sub-project to be tested based on the total test duration and the test difficulty level corresponding to each sub-project to be tested; determining the test period corresponding to each sub-project to be tested according to the test order and the sub-test durations corresponding to the sub-projects to be tested; and determining the target test team corresponding to each sub-project to be tested from the candidate test teams based on the test order.
In one possible implementation, a corresponding level weight can be predetermined for each test difficulty level, and the server can allocate a test duration to each sub-project to be tested according to the level weights of the test difficulty levels corresponding to the sub-projects to be tested, in combination with the total test duration. For example, if the total test duration is 10 days, the target project includes sub-project A, sub-project B, and sub-project C to be tested, and the level weights of the test difficulty levels corresponding to the three sub-projects are 3, 1, and 1 respectively, then the sub-test durations corresponding to the three sub-projects are 6 days, 2 days, and 2 days respectively. If the test order of the three sub-projects is sub-project B, sub-project C, and then sub-project A, the test period corresponding to sub-project B is the first day to the second day, the test period corresponding to sub-project C is the third day to the fourth day, and the test period corresponding to sub-project A is the fifth day to the tenth day.
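A minimal sketch of the allocation just described, assuming the level weights are plain numbers, the total duration is split in proportion to them, and the resulting sub-test durations are laid out back to back in test order (rounding to whole days is ignored).

```python
def allocate_periods(total_days: float, weights: dict[str, float], order: list[str]):
    """Split total_days across sub-projects in proportion to their level weights, then
    assign consecutive, non-overlapping test periods following the test order."""
    weight_sum = sum(weights.values())
    durations = {name: total_days * w / weight_sum for name, w in weights.items()}
    periods, start = {}, 0.0
    for name in order:
        periods[name] = (start, start + durations[name])  # offsets in days from the start
        start += durations[name]
    return durations, periods

# The worked example above: 10 days in total, weights A=3, B=1, C=1, test order B, C, A.
durations, periods = allocate_periods(10, {"A": 3, "B": 1, "C": 1}, ["B", "C", "A"])
# durations -> {"A": 6.0, "B": 2.0, "C": 2.0}
# periods   -> {"B": (0.0, 2.0), "C": (2.0, 4.0), "A": (4.0, 10.0)}
```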
In one possible implementation, the sub-items to be tested may be uniformly distributed to the candidate test teams according to the number of sub-items to be tested.
Referring to fig. 2, a schematic diagram of matching a target test team for each sub-project to be tested is provided in an embodiment of the present application. As shown in fig. 2, the two sub-projects A and B to be tested, which are at the front of the test order, may be assigned to test team a, the two sub-projects C and D to be tested, which are in the middle of the test order, may be assigned to test team b, and the two sub-projects E and F to be tested, which are at the rear of the test order, may be assigned to test team c.
In another example, suppose the sub-projects to be tested include 7 sub-projects A, B, C, D, E, F, and G, and the candidate test teams include 3 test teams a, b, and c. In one possible implementation, the sub-projects A, B, C, D, E, and F may be evenly distributed to the three test teams a, b, and c, and then the sub-project G may be assigned to test team a. Alternatively, the sub-projects A, B, and C may first be assigned to test team a, and then the sub-projects D, E, F, and G may be evenly distributed to test teams b and c.
Optionally, determining, based on the test sequence, a target test team corresponding to each sub-item to be tested from each candidate test team, including: determining a first sub-project to be tested from all sub-projects to be tested based on a test sequence, and determining a target test team corresponding to the first sub-project to be tested from all candidate test teams based on a preset matching rule; determining the next sub-project to be tested from all the sub-projects to be tested based on the test sequence, and determining a target test team corresponding to the next sub-project to be tested from all the candidate test teams based on a preset matching rule until the next sub-project to be tested is the last sub-project to be tested in all the sub-projects to be tested.
The preset matching rule may be a predetermined matching rule. For example, the preset matching rule may be to cycle through each candidate test team, and determine the target test team from each candidate test team in turn.
Referring to fig. 3, another schematic diagram of a target test team matching each sub-item to be tested according to an embodiment of the present application is provided. As shown in fig. 3, the sub-project a to be tested, the sub-project B to be tested, and the sub-project C to be tested may be sequentially allocated to the test team a, the test team B, and the test team C, and then the sub-project D to be tested, the sub-project E to be tested, and the sub-project F to be tested may be sequentially allocated to the test team a, the test team B, and the test team C. As can be seen from comparing fig. 2 and 3, in the allocation scheme provided in fig. 2, the maximum time that the test team needs to wait is a period of time t0 to t2, and in the allocation scheme provided in fig. 3, the maximum time that the test team needs to wait is a period of time t0 to t 1. It is apparent that the longest waiting time of the allocation provided in fig. 3 is shorter than that provided in fig. 2. Therefore, the embodiment of the application circularly matches each candidate test team to each sub-project to be tested based on the test sequence, so that the longest waiting time of the test team can be shortened.
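As a sketch only, the cyclic matching of FIG. 3 can be read as a plain round-robin over the candidate teams in test order; the block-wise allocation of FIG. 2 is included for comparison. Both functions are illustrative assumptions, not the application's prescribed rule.

```python
from itertools import cycle

def assign_round_robin(subs_in_test_order: list[str], teams: list[str]) -> dict[str, str]:
    """FIG. 3 style: hand the sub-projects, in test order, to the candidate teams cyclically,
    so every team receives its first sub-project as early as possible."""
    return dict(zip(subs_in_test_order, cycle(teams)))

def assign_block_wise(subs_in_test_order: list[str], teams: list[str]) -> dict[str, str]:
    """FIG. 2 style (for comparison): give each team a contiguous block of sub-projects."""
    block = -(-len(subs_in_test_order) // len(teams))  # ceiling division
    return {s: teams[i // block] for i, s in enumerate(subs_in_test_order)}

subs, teams = ["A", "B", "C", "D", "E", "F"], ["team_a", "team_b", "team_c"]
assign_round_robin(subs, teams)  # A/D -> team_a, B/E -> team_b, C/F -> team_c
assign_block_wise(subs, teams)   # A/B -> team_a, C/D -> team_b, E/F -> team_c (last team waits longest)
```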
S103, creating test tasks aiming at all the sub-items to be tested according to the test time periods corresponding to all the sub-items to be tested respectively and the target test team corresponding to all the sub-items to be tested respectively, and returning the test tasks to the client.
The server may return task information of the test task (including the test period corresponding to each sub-item to be tested and the target test team corresponding to each sub-item to be tested) to the client, and the client may present the received task information. Each candidate test team can reasonably plan and allocate test time by looking up the task information of the test task in the client.
Optionally, after returning the test task to the client, the test management method provided in the embodiment of the present application further includes: receiving a task execution request for a test task initiated by a client; determining test deadlines corresponding to the sub-items to be tested respectively based on the test time periods corresponding to the sub-items to be tested respectively and the initial execution time carried in the task execution request; and returning the test deadlines respectively corresponding to the sub-items to be tested to the client so that the client presents the test deadlines respectively corresponding to the sub-items to be tested.
In one possible implementation, when the project manager decides to start testing the target project, the project manager may trigger the client to initiate a task execution request to the server by selecting a start control provided in the application interface of the client. After receiving the task execution request, the server may take the receiving time of the task execution request as the start execution time and calculate the test deadline corresponding to each sub-project to be tested. Alternatively, in another possible implementation, the start execution time may be a time customized by the project manager. For example, if the sub-projects to be tested include sub-projects A, B, and C, the test periods corresponding to the three sub-projects are [t0, t0+6 days], [t0+6 days, t0+8 days], and [t0+8 days, t0+10 days], and the start execution time is 13:00 on November 10 of year XX, then the test deadlines corresponding to the three sub-projects are 13:00 on November 16, 13:00 on November 18, and 13:00 on November 20 of year XX, respectively.
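A short sketch of that calculation, assuming each test period is stored as an end offset in days relative to t0; the concrete year is elided in the text ("XX"), so the value used below is purely illustrative.

```python
from datetime import datetime, timedelta

def compute_deadlines(start_time: datetime,
                      period_end_offsets_days: dict[str, float]) -> dict[str, datetime]:
    """Deadline of a sub-project = start execution time + end offset of its test period."""
    return {name: start_time + timedelta(days=end)
            for name, end in period_end_offsets_days.items()}

start = datetime(2023, 11, 10, 13, 0)  # 13:00 on November 10; the year is illustrative
deadlines = compute_deadlines(start, {"A": 6, "B": 8, "C": 10})
# -> A: Nov 16 13:00, B: Nov 18 13:00, C: Nov 20 13:00
```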
Optionally, in order to further improve the overall test efficiency of the project, after the server returns the test deadlines corresponding to the sub-projects to be tested to the client so that the client presents them, the test management method provided by the embodiment of the application further includes: starting a timing monitoring task.
The task content of the timing monitoring task is as follows: and for each sub-item to be tested, monitoring the time interval between the test deadline corresponding to the sub-item to be tested and the current moment, and returning prompt information to the client under the condition that the time interval meets the preset condition.
The preset condition may be a predetermined condition, and the preset condition may be that the time interval is less than a predetermined preset time period. The prompt information may be information for prompting the test team that a time interval between the current time and the test deadline is less than a preset duration. For example, the prompt information may be: "1 day from test cutoff time".
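A minimal sketch of such a timing monitoring task, assuming a simple polling thread whose preset condition is "less than a preset duration remains before the deadline"; the notification hook is a placeholder, not an API of the application. Setting the returned event is one possible way to suspend the task, in the spirit of the pause behaviour described next.

```python
import threading
from datetime import datetime, timedelta

def send_prompt_to_client(sub_project: str, remaining: timedelta) -> None:
    # Placeholder notification hook; a real system would push this to the client.
    print(f"{sub_project}: {remaining} left until the test deadline")

def start_deadline_monitor(deadlines: dict[str, datetime],
                           preset: timedelta = timedelta(days=1),
                           poll_seconds: float = 3600.0) -> threading.Event:
    """Periodically compare each sub-project's test deadline with the current time and
    notify the client once the remaining interval falls below the preset duration."""
    stop = threading.Event()      # set() this event to pause the monitoring task
    notified: set[str] = set()

    def loop() -> None:
        while not stop.wait(poll_seconds):
            now = datetime.now()
            for name, deadline in deadlines.items():
                remaining = deadline - now
                if name not in notified and timedelta(0) <= remaining < preset:
                    notified.add(name)
                    send_prompt_to_client(name, remaining)

    threading.Thread(target=loop, daemon=True).start()
    return stop
```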
Optionally, after returning the test deadlines respectively corresponding to the sub-items to be tested to the client so that the client presents the test deadlines respectively corresponding to the sub-items to be tested, the test management method provided by the embodiment of the application further includes: and stopping executing the timing monitoring task under the condition that a pause execution request for the test task initiated by the client is received.
In practical applications, a project test may be suspended for various reasons. Therefore, in order to meet the project management needs of users and calculate the test deadlines more accurately, the embodiment of the application can provide users with a function for switching the timing monitoring task on and off. It should be noted that each time the timing monitoring task is restarted, the server needs to recalculate the test deadlines.
Optionally, since in practical applications the test process may end ahead of schedule, in order to calculate the test deadlines more accurately and thereby further improve test efficiency, the test management method provided by the embodiment of the application further includes: after determining the test deadlines corresponding to the sub-projects to be tested, in a case where a test completion instruction for a target sub-project to be tested is received, determining the remaining sub-projects to be tested among the sub-projects to be tested; updating the test deadlines corresponding to the remaining sub-projects to be tested based on the current time and the test periods corresponding to the remaining sub-projects to be tested; and returning the test deadlines corresponding to the remaining sub-projects to be tested to the client so that the client presents them.
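A sketch of the deadline refresh described above, under the assumption that when one sub-project finishes early the remaining sub-projects keep their original durations and order and are simply re-anchored at the current time; the application does not spell out this exact rule.

```python
from datetime import datetime, timedelta

def refresh_remaining_deadlines(completed: str, order: list[str],
                                durations_days: dict[str, float],
                                now: datetime) -> dict[str, datetime]:
    """Re-anchor the test periods of the sub-projects after `completed` at `now`,
    keeping their original durations and order, and return the updated deadlines."""
    remaining = order[order.index(completed) + 1:]
    deadlines, cursor = {}, now
    for name in remaining:
        cursor += timedelta(days=durations_days[name])
        deadlines[name] = cursor
    return deadlines

# Example: order B, C, A with durations 2, 2 and 6 days; B finishes early at 09:00 on Nov 11.
refresh_remaining_deadlines("B", ["B", "C", "A"], {"B": 2, "C": 2, "A": 6},
                            datetime(2023, 11, 11, 9, 0))
# -> C: Nov 13 09:00, A: Nov 19 09:00 (both earlier than originally planned)
```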
In view of the above, in the test management method provided in the embodiments of the present application, when a target project needs to be tested, a project manager can input project information (for example, the predicted total test duration) at the client, and the client can, in response to the project manager's input operation, determine the project attribute parameters and the test configuration parameters of the target project and initiate a test request carrying these parameters to the server. The project attribute parameters include the sub-projects to be tested in the target project, the test difficulty level corresponding to each sub-project to be tested, and the test order of the sub-projects to be tested, and the test configuration parameters include the total test duration and the associated candidate test teams. Because the test difficulty level of each sub-project to be tested reflects the test duration it requires (the higher the test difficulty, the longer the required test duration), the server can, after receiving the test request, reasonably allocate a test period to each sub-project to be tested according to its test difficulty level and the total test duration, evenly distribute the sub-projects to be tested among the candidate test teams, and then create test tasks for the sub-projects to be tested according to the determined information. The server may then return the test tasks to the client. In this way, by viewing the task information of the test tasks in the client (including the test period and the target test team corresponding to each sub-project to be tested), each candidate test team can reasonably plan and allocate its test time, which reduces the situation in which a test team needs to wait for a long time before starting its test. It can be seen that, in the embodiments of the present application, the server can automatically generate test tasks for the target project according to its project attribute parameters and test configuration parameters, that is, automatically allocate a test period and a test team to each sub-project to be tested of the target project. The test time of each test team is therefore determined in advance, and each test team can reasonably plan and allocate its test time according to the task information of the test tasks, which reduces the situation in which a test team needs to wait for a long time before starting its test and thereby improves the overall test efficiency of the project.
Optionally, as shown in fig. 4, the embodiment of the present application further provides a test management method, which may include S401-S408:
S401, the client, in response to an input operation by the project manager, determines the project attribute parameters and the test configuration parameters, and initiates a test request carrying the project attribute parameters and the test configuration parameters to the server.
S402, the server determines the sub-test duration corresponding to each sub-item to be tested based on the total test duration and the test difficulty level corresponding to each sub-item to be tested; then determining the test time periods respectively corresponding to the sub-items to be tested according to the test sequence and the sub-test time periods respectively corresponding to the sub-items to be tested; and then determining target test teams corresponding to the sub-projects to be tested respectively from the candidate test teams based on the test sequence.
S403, the server creates a test task for each sub-item to be tested according to the test period corresponding to each sub-item to be tested and the target test team corresponding to each sub-item to be tested, and returns the test task to the client.
S404, the client presents task information of the test task.
S405, the client initiates a task execution request for the test task to the server.
S406, the server determines the test deadline corresponding to each sub-item to be tested based on the test time period corresponding to each sub-item to be tested and the initial execution time carried in the task execution request.
S407, the server returns the test deadlines respectively corresponding to the sub-items to be tested to the client so that the client presents the test deadlines respectively corresponding to the sub-items to be tested.
S408, the server starts a timing monitoring task.
As shown in fig. 5, the embodiment of the present application further provides a test management apparatus, where the test management apparatus may be configured at a server, and includes: a receiving module 11, a determining module 21 and a creating module 31;
wherein the receiving module 11 executes S101 in the above-described method embodiment, the determining module 21 executes S102 in the above-described method embodiment, and the creating module 31 executes S103 in the above-described method embodiment.
Specifically, the receiving module 11 is configured to receive a test request for a target item initiated by a client; the test request carries project attribute parameters and test configuration parameters, wherein the project attribute parameters comprise each sub-project to be tested in a target project, test difficulty levels corresponding to the sub-projects to be tested respectively, and test sequences aiming at the sub-projects to be tested, and the test configuration parameters comprise total test duration and associated candidate test teams; the determining module 21 is configured to determine a test period corresponding to each sub-item to be tested based on the item attribute parameter and the test configuration parameter, and determine a target test team corresponding to each sub-item to be tested from each candidate test team; the creating module 31 is configured to create a test task for each sub-item to be tested according to the test period corresponding to each sub-item to be tested and the target test team corresponding to each sub-item to be tested, and return the test task to the client.
Optionally, the determining module 21 is specifically configured to:
determining the sub-test duration corresponding to each sub-item to be tested based on the total test duration and the test difficulty level corresponding to each sub-item to be tested; determining the test time periods respectively corresponding to the sub-items to be tested according to the test sequence and the sub-test time periods respectively corresponding to the sub-items to be tested; and determining target test teams corresponding to the sub-projects to be tested respectively from the candidate test teams based on the test sequence.
Optionally, the determining module 21 is specifically further configured to:
determining a first sub-project to be tested from all sub-projects to be tested based on a test sequence, and determining a target test team corresponding to the first sub-project to be tested from all candidate test teams based on a preset matching rule; determining the next sub-project to be tested from all the sub-projects to be tested based on the test sequence, and determining a target test team corresponding to the next sub-project to be tested from all the candidate test teams based on a preset matching rule until the next sub-project to be tested is the last sub-project to be tested in all the sub-projects to be tested.
Optionally, the test management apparatus provided in the present application may further include a sending module;
the receiving module 11 is further configured to receive a task execution request for the test task initiated by the client after returning the test task to the client; the determining module 21 is further configured to determine a test deadline corresponding to each sub-item to be tested based on a test period corresponding to each sub-item to be tested and an initial execution time carried in the task execution request; and the sending module is used for returning the test deadlines respectively corresponding to the sub-items to be tested to the client so that the client can display the test deadlines respectively corresponding to the sub-items to be tested.
Optionally, the test management apparatus provided in the present application may further include a timing module;
the timing module is used for starting a timing monitoring task after the sending module returns the test deadlines respectively corresponding to the sub-items to be tested to the client so that the client presents the test deadlines respectively corresponding to the sub-items to be tested; the task content of the timing monitoring task is as follows: and for each sub-item to be tested, monitoring the time interval between the test deadline corresponding to the sub-item to be tested and the current moment, and returning prompt information to the client under the condition that the time interval meets the preset condition.
Optionally, the timing module is further configured to stop executing the timing monitoring task when receiving a request for suspending execution of the test task initiated by the client after the sending module returns the test deadlines respectively corresponding to the sub-items to be tested to the client so that the client presents the test deadlines respectively corresponding to the sub-items to be tested.
Optionally, the test management device provided by the application further includes an update module and a sending module;
the determining module 21 is further configured to determine each remaining sub-item to be tested in each sub-item to be tested when receiving a test completion instruction for the target sub-item to be tested after determining the test deadline corresponding to each sub-item to be tested; the updating module is used for updating the test deadlines respectively corresponding to the remaining sub-items to be tested based on the current moment and the test time periods respectively corresponding to the remaining sub-items to be tested; and the sending module is used for returning the test deadlines respectively corresponding to the remaining sub-items to be tested to the client so that the client can display the test deadlines respectively corresponding to the remaining sub-items to be tested.
Optionally, the test management apparatus may further include a storage module for storing program codes of the test management apparatus and the like.
As shown in fig. 6, the embodiment of the present application further provides a test management device including a memory 41, a processor (such as 42-1 and 42-2 in fig. 6), a bus 43, and a communication interface 44; the memory 41 is used for storing computer-executed instructions, and the processor is connected with the memory 41 through the bus 43; when the test management apparatus is running, the processor executes computer-executable instructions stored in the memory 41 to cause the test management apparatus to execute the test management method as provided in the above-described embodiment.
In a specific implementation, as one embodiment, the processor may include one or more central processing units (central processing unit, CPU), such as CPU0 and CPU1 shown in fig. 6. And as one example, the test management device may include multiple processors, such as processor 42-1 and processor 42-2 shown in fig. 6. Each of these processors may be a single-core processor (single-CPU) or a multi-core processor (multi-CPU). A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
The memory 41 may be, but is not limited to, a read-only memory (read-only memory, ROM) or other type of static storage device that can store static information and instructions, a random access memory (random access memory, RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (electrically erasable programmable read-only memory, EEPROM), a compact disc read-only memory (compact disc read-only memory, CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory 41 may be stand-alone and connected to the processor via the bus 43. The memory 41 may also be integrated with the processor.
In a specific implementation, the memory 41 is used for storing data of the application and computer-executable instructions corresponding to the software program for executing the application. The processor may perform various functions of the test management device by running or executing the software program stored in the memory 41 and invoking the data stored in the memory 41.
Communication interface 44, using any transceiver-like device, is used to communicate with other devices or communication networks, such as a control system, a radio access network (radio access network, RAN), a wireless local area network (wireless local area networks, WLAN), etc. The communication interface 44 may include a receiving unit to implement a receiving function and a transmitting unit to implement a transmitting function.
Bus 43 may be an industry standard architecture (industry standard architecture, ISA) bus, an external device interconnect (peripheral component interconnect, PCI) bus, or an extended industry standard architecture (extended industry standard architecture, EISA) bus, among others. The bus 43 may be classified into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in fig. 6, but not only one bus or one type of bus.
As an example, in connection with fig. 5, the function implemented by the receiving module in the test management apparatus is the same as the function implemented by the receiving unit in fig. 6, and the function implemented by the creating module in the test management apparatus is the same as the function implemented by the processor in fig. 6. When the test management apparatus includes a memory module, the memory module performs the same function as the memory implementation in fig. 6.
The explanation of the related content in this embodiment may refer to the above method embodiment, and will not be repeated here.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of functional modules is illustrated, and in practical application, the above-described functional allocation may be implemented by different functional modules, that is, the internal structure of the device is divided into different functional modules, so as to implement all or part of the functions described above. The specific working processes of the above-described systems, devices and units may refer to the corresponding processes in the foregoing method embodiments, which are not described herein again.
The embodiment of the application also provides a computer readable storage medium, in which instructions are stored, which when executed by a computer, cause the computer to execute the test management method provided in the above embodiment.
The computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a RAM, a ROM, an erasable programmable read-only memory (erasable programmable read only memory, EPROM), a register, an optical fiber, a CD-ROM, an optical storage device, a magnetic storage device, any suitable combination of the foregoing, or any other form of computer-readable storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an application specific integrated circuit (application specific integrated circuit, ASIC). In the context of the present application, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto; any changes or substitutions within the technical scope disclosed herein shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A test management method, characterized by being applied to a server and comprising the following steps:
receiving a test request for a target item initiated by a client; the test request carries item attribute parameters and test configuration parameters, wherein the item attribute parameters comprise the sub-items to be tested in the target item, the test difficulty levels respectively corresponding to the sub-items to be tested and a test sequence for the sub-items to be tested, and the test configuration parameters comprise a total test duration and associated candidate test teams;
determining, based on the item attribute parameters and the test configuration parameters, test time periods respectively corresponding to the sub-items to be tested, and determining, from the candidate test teams, target test teams respectively corresponding to the sub-items to be tested;
creating, according to the test time periods respectively corresponding to the sub-items to be tested and the target test teams respectively corresponding to the sub-items to be tested, a test task for the sub-items to be tested, and returning the test task to the client.
2. The test management method according to claim 1, wherein the determining, based on the item attribute parameters and the test configuration parameters, test time periods respectively corresponding to the sub-items to be tested, and determining, from the candidate test teams, target test teams respectively corresponding to the sub-items to be tested comprises:
determining a sub-test duration corresponding to each sub-item to be tested based on the total test duration and the test difficulty level corresponding to each sub-item to be tested;
determining the test time period corresponding to each sub-item to be tested according to the test sequence and the sub-test duration corresponding to each sub-item to be tested;
and determining, based on the test sequence, target test teams respectively corresponding to the sub-items to be tested from the candidate test teams.
3. The test management method according to claim 2, wherein the determining, based on the test sequence, target test teams respectively corresponding to the sub-items to be tested from the candidate test teams comprises:
determining a first sub-item to be tested from the sub-items to be tested based on the test sequence, and determining a target test team corresponding to the first sub-item to be tested from the candidate test teams based on a preset matching rule;
determining a next sub-item to be tested from the sub-items to be tested based on the test sequence, and determining a target test team corresponding to the next sub-item to be tested from the candidate test teams based on the preset matching rule, until the next sub-item to be tested is the last sub-item to be tested among the sub-items to be tested.
4. The method of claim 1, wherein after returning the test task to the client, the method further comprises:
receiving a task execution request for the test task initiated by the client;
determining test deadlines corresponding to the sub-items to be tested respectively based on the test time periods corresponding to the sub-items to be tested respectively and the initial execution time carried in the task execution request;
and returning the test deadlines respectively corresponding to the sub-items to be tested to the client so that the client presents the test deadlines respectively corresponding to the sub-items to be tested.
5. The method of claim 4, wherein after returning the test deadlines respectively corresponding to the sub-items to be tested to the client so that the client presents the test deadlines respectively corresponding to the sub-items to be tested, the method further comprises:
starting a timing monitoring task; wherein the task content of the timing monitoring task is as follows: for each sub-item to be tested, monitoring the time interval between the test deadline corresponding to the current sub-item to be tested and the current moment, and returning prompt information to the client under the condition that the time interval meets a preset condition.
6. The method of claim 5, wherein after returning the test deadlines respectively corresponding to the sub-items to be tested to the client so that the client presents the test deadlines respectively corresponding to the sub-items to be tested, the method further comprises:
stopping execution of the timing monitoring task under the condition that a pause execution request for the test task initiated by the client is received.
7. The test management method according to claim 4, wherein after determining the test deadlines respectively corresponding to the sub-items to be tested, the method further comprises:
under the condition that a test completion instruction for a target sub-item to be tested is received, determining the remaining sub-items to be tested among the sub-items to be tested;
updating the test deadlines respectively corresponding to the remaining sub-items to be tested based on the current time and the test time periods respectively corresponding to the remaining sub-items to be tested;
and returning the test deadlines respectively corresponding to the remaining sub-items to be tested to the client so that the client presents the test deadlines respectively corresponding to the remaining sub-items to be tested.
8. A test management apparatus, comprising:
the receiving module is used for receiving a test request for a target item initiated by a client; the test request carries item attribute parameters and test configuration parameters, wherein the item attribute parameters comprise the sub-items to be tested in the target item, the test difficulty levels respectively corresponding to the sub-items to be tested and a test sequence for the sub-items to be tested, and the test configuration parameters comprise a total test duration and associated candidate test teams;
the determining module is used for determining, based on the item attribute parameters and the test configuration parameters, the test time periods respectively corresponding to the sub-items to be tested, and determining, from the candidate test teams, target test teams respectively corresponding to the sub-items to be tested;
the creating module is used for creating, according to the test time periods respectively corresponding to the sub-items to be tested and the target test teams respectively corresponding to the sub-items to be tested, a test task for each sub-item to be tested, and returning the test task to the client.
9. A test management device, comprising a memory, a processor, a bus, and a communication interface; the memory is used for storing computer-executable instructions, and the processor is connected with the memory through the bus;
when the test management device is running, the processor executes the computer-executable instructions stored in the memory to cause the test management device to perform the test management method of any one of claims 1-7.
10. A computer readable storage medium having instructions stored therein which, when executed by a computer, cause the computer to perform the test management method of any of claims 1-7.
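By way of illustration only, the following Python sketch outlines one possible reading of the scheduling steps recited in claims 1 to 4 and 7. It assumes, purely for the sake of the example, that sub-test durations are split in proportion to the test difficulty level and that deadlines are obtained by walking the test sequence forward from the initial execution time; neither assumption is prescribed by the claims, and all names in the sketch are hypothetical.

from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class SubItem:
    name: str
    difficulty: int          # test difficulty level; higher means harder


def allocate_durations(sub_items, total_hours):
    # split the total test duration across sub-items in proportion to difficulty
    weight_sum = sum(s.difficulty for s in sub_items)
    return {s.name: total_hours * s.difficulty / weight_sum for s in sub_items}


def build_schedule(sub_items, test_order, total_hours, start_time):
    # derive per-sub-item test periods and deadlines by walking the test order
    durations = allocate_durations(sub_items, total_hours)
    schedule, cursor = {}, start_time
    for name in test_order:
        deadline = cursor + timedelta(hours=durations[name])
        schedule[name] = {"start": cursor, "deadline": deadline}
        cursor = deadline
    return schedule


def assign_teams(test_order, candidate_teams):
    # stand-in for the "preset matching rule": round-robin over candidate teams
    return {name: candidate_teams[i % len(candidate_teams)]
            for i, name in enumerate(test_order)}


def update_remaining(schedule, test_order, completed, now):
    # once a sub-item finishes, re-anchor the remaining test periods at "now"
    remaining = [n for n in test_order if n not in completed]
    cursor = now
    for name in remaining:
        length = schedule[name]["deadline"] - schedule[name]["start"]
        schedule[name] = {"start": cursor, "deadline": cursor + length}
        cursor += length
    return schedule


if __name__ == "__main__":
    items = [SubItem("login", 1), SubItem("payment", 3), SubItem("report", 2)]
    order = ["login", "payment", "report"]
    plan = build_schedule(items, order, total_hours=60,
                          start_time=datetime(2024, 1, 1, 9, 0))
    teams = assign_teams(order, ["team A", "team B"])
    plan = update_remaining(plan, order, completed={"login"},
                            now=datetime(2024, 1, 1, 17, 0))
    print(teams)
    print(plan)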
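Similarly, a minimal sketch of the timing monitoring task of claims 5 and 6 is given below. It assumes a simple background thread that compares each test deadline with the current moment and notifies the client once the remaining interval drops below a threshold; the actual preset condition and notification channel are left open by the claims, and all names are hypothetical.

import threading
from datetime import datetime, timedelta


def start_deadline_monitor(schedule, notify, warn_before=timedelta(hours=24),
                           poll_seconds=60):
    # periodically compare each sub-item's test deadline with the current time
    # and return prompt information once the interval meets the preset condition
    stop_event = threading.Event()

    def run():
        warned = set()
        while not stop_event.is_set():
            now = datetime.now()
            for name, slot in schedule.items():
                remaining = slot["deadline"] - now
                if name not in warned and remaining <= warn_before:
                    notify(f"sub-item '{name}' is due in {remaining}")
                    warned.add(name)
            stop_event.wait(poll_seconds)

    threading.Thread(target=run, daemon=True).start()
    # calling the returned function stops the timing monitoring task, as when
    # a pause execution request for the test task is received
    return stop_event.set

In such a sketch the server would call start_deadline_monitor after returning the deadlines to the client, and invoke the returned stop function when a pause execution request arrives.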
CN202311631240.7A 2023-11-30 2023-11-30 Test management method, device, equipment and storage medium Pending CN117609058A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311631240.7A CN117609058A (en) 2023-11-30 2023-11-30 Test management method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311631240.7A CN117609058A (en) 2023-11-30 2023-11-30 Test management method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117609058A (en) 2024-02-27

Family

ID=89959449

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311631240.7A Pending CN117609058A (en) 2023-11-30 2023-11-30 Test management method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117609058A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination