CN115827469A - Project test management method and system - Google Patents

Project test management method and system

Info

Publication number
CN115827469A
CN115827469A (application CN202211604710.6A)
Authority
CN
China
Prior art keywords
test
task
node
project
review
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211604710.6A
Other languages
Chinese (zh)
Inventor
王丽
徐丹
Current Assignee
Suzhou Inspur Intelligent Technology Co Ltd
Original Assignee
Suzhou Inspur Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Inspur Intelligent Technology Co Ltd filed Critical Suzhou Inspur Intelligent Technology Co Ltd
Priority to CN202211604710.6A
Publication of CN115827469A
Legal status: Pending

Classifications

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application provides a project test management method and system, comprising: acquiring the to-be-handled review task and construction period information of a project; determining a node handler according to the current review node of the to-be-handled review task and a preset rule; reminding the node handler of the current review node to complete the current node task; predicting the latest completion time of the to-be-handled review task according to a prediction model; and judging whether to generate an overdue early warning according to the construction period information and the latest completion time of the to-be-handled review task. By estimating the completion time of review tasks, the method realizes overdue early warning and follow-up prompting of review tasks; by determining the best handler based on historical data, it improves the completion efficiency of review node tasks and saves manpower. Furthermore, the method and system lay a foundation for integrating test task management and review task management, manage data uniformly, and provide a queryable interface, so as to guarantee consistency between review tasks and test progress maintenance.

Description

Project test management method and system
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a project test management method and system, and an electronic device.
Background
As computer technology develops and server applications become more widespread, market demand for servers keeps growing. Each server configuration verification review must pass through different development stages, such as market front-end requirements, research and development, testing, factory production, and after-sales, and more than ten pieces of related office software may be involved; progress information of the testing stage, such as its progress and state, is opaque to the other development stages. Because the progress and state of the testing stage cannot be known intuitively, developers cannot follow up and prompt tasks in time, and progress can only be checked after testers backfill several systems once testing ends, which produces much repeated labor, consumes excessive manpower, and keeps work efficiency relatively low.
When market front-end configuration verification review demand is urgent and review tasks are numerous, the existing test lifecycle management mode produces even more repeated labor and consumes more time and effort; moreover, because data is managed separately, data may be inconsistent, information is transmitted slowly, and upload errors may make results inaccurate.
Therefore, a project test management method is needed that can improve work efficiency, ensure review tasks are completed on schedule, and provide the conditions for follow-up and prompting.
Disclosure of Invention
In order to solve the deficiencies of the prior art, a primary objective of the present invention is to provide a project test management method and system to solve the above technical problems of the prior art.
In order to achieve the above object, the present invention provides a project test management method in a first aspect, including:
acquiring the to-be-handled review task and construction period information of the project;
determining a node handler according to the current review node of the to-be-handled review task and a preset rule;
reminding the node handler of the current review node to complete the current node task;
predicting the latest completion time of the to-be-handled review task according to a prediction model;
and judging whether to generate an overdue early warning according to the construction period information and the latest completion time of the to-be-handled review task.
In some embodiments, the project test management method further comprises managing test tasks of the project:
inquiring corresponding project information according to the project identifier of the project, wherein the project information comprises a test configuration and a test scheme;
establishing a test task of each test stage of the project according to the test configuration and the test scheme;
and executing the test task to obtain a test result and a test log.
In some embodiments, the determining a node handler according to the current review node of the pending review task and a preset rule includes:
acquiring historical processing data of historical review tasks of the same type as the to-be-handled review task, wherein the historical processing data comprises the historical handler and historical processing time of each historical review node of the historical review tasks;
and screening, according to the historical processing data, the historical handler with the shortest historical processing time at the historical review node corresponding to the current review node of the to-be-handled review task as the node handler of the current review node.
In some embodiments, the predicting a latest completion time of the pending review task according to a prediction model comprises:
training the prediction model according to the historical review nodes and the historical processing time of the historical review tasks with the same type as the to-be-processed review task;
verifying whether the prediction model meets a preset condition or not based on a preset test data set;
generating a trained predictive model when the predictive model meets a preset training condition;
and predicting the longest processing time of each evaluation node in the to-be-processed evaluation task by using the trained prediction model so as to predict the latest completion time of the to-be-processed evaluation task.
In some embodiments, before the obtaining of the pending review task and the schedule information of the project, the method further comprises:
verifying the evaluation requirement;
and after the verification of the evaluation requirement is passed, prompting a research and development manager to judge whether to issue evaluation tasks and construction period information corresponding to the project.
In some embodiments, said executing said test task to obtain test results and a test log comprises:
determining a current test stage of the project according to the project information and the current time node, and determining a current test task of the project based on the current test stage;
determining a test execution scheme according to the attribute of the test case in the current test task;
and acquiring a test result and a test log of the current test stage of the project based on the test scheme.
In some embodiments, the test execution scheme comprises an automatic test scheme and a manual test scheme, and the obtaining the test result and the test log of the item based on the test scheme comprises:
if the test scheme is a manual test scheme, outputting the current test task and generating a first prompt to prompt a tester to perform manual test and input a test result and a test log;
if the test scheme is an automatic test scheme, selecting a machine to be tested and deploying the test environment of the machine to be tested according to the test case;
and calling a test script corresponding to the test case in the to-be-tested machine after the test environment is deployed so as to automatically generate a test result and a test log.
In some embodiments, the method further comprises:
displaying each test stage of the project, a corresponding test result and a corresponding test log through a visual interface;
and displaying the evaluation nodes of the to-be-handled evaluation tasks corresponding to the project, the completion conditions of the corresponding node tasks and the overdue early warning through a visual interface.
In a second aspect, the present application provides a project test management system, comprising:
the data acquisition module is used for acquiring the to-be-handled review task and the construction period information of the project;
the data analysis module is used for determining a node handler according to the current evaluation node of the to-be-processed evaluation task and a preset rule;
the data analysis module is also used for reminding a node processor of the current evaluation node of completing the task of the current node;
the data prediction module is used for predicting the latest completion time of the to-be-handled review task according to the prediction model;
and the data early warning module is used for judging whether to generate an overdue early warning according to the construction period information and the latest completion time of the to-be-handled review task.
The system further comprises:
the data maintenance module is used for inquiring corresponding project information according to the project identifier of the project, wherein the project information comprises a test configuration and a test scheme;
the data analysis module is also used for establishing a test task of each test stage of the project according to the test configuration and the test scheme;
and the data processing module is used for executing the test task to obtain a test result and a test log.
In a third aspect, the present application provides an electronic device, comprising:
one or more processors;
and memory associated with the one or more processors for storing program instructions that, when read and executed by the one or more processors, perform operations comprising:
acquiring to-be-handled review tasks and construction period information of the project;
determining a node handler according to the current evaluation node of the to-be-processed evaluation task and a preset rule;
reminding the node handler of the current review node to complete the current node task;
predicting the latest completion time of the to-be-handled review task according to a prediction model;
and judging whether to generate an overdue early warning according to the construction period information and the latest completion time of the to-be-handled review task.
In some embodiments, the program instructions, when read and executed by the one or more processors, further perform the following:
inquiring corresponding project information according to the project identifier of the project, wherein the project information comprises a test configuration and a test scheme;
establishing a test task of each test stage of the project according to the test configuration and the test scheme;
and executing the test task to obtain a test result and a test log.
The beneficial effect that this application realized does:
the application provides a project test management method which comprises the steps of obtaining to-be-handled review tasks and construction period information of a project; determining a node handler according to the current evaluation node of the to-be-processed evaluation task and a preset rule; reminding a node processor of the current evaluation node to complete the task of the current node; predicting the latest completion time of the to-be-processed evaluation task according to a prediction model; and judging whether generating an out-of-date early warning or not according to the construction period information and the latest completion time of the to-be-processed evaluation task. The method and the system can realize the same management on the test tasks, can estimate the completion time of the review tasks so as to realize the overdue early warning on the completion time of the review tasks, display the progress of the test tasks and the complete situation of the review nodes through the visual interface so as to realize the follow-up and urging on the projects, determine the best processor based on historical data so as to improve the completion efficiency of the review node tasks, and save manpower. Furthermore, the method and the device provide a basis for integrating the test task management and the review task management, manage data uniformly and provide a queryable interface so as to guarantee the consistency of the review task and the test progress maintenance.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, it is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art without inventive efforts, wherein:
FIG. 1 is a first schematic diagram of a project test management method according to an embodiment of the present disclosure;
FIG. 2 is a flowchart of a test task execution provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of a review node provided by an embodiment of the present application;
FIG. 4 is a second schematic diagram of a project test management method according to an embodiment of the present application;
FIG. 5 is a diagram illustrating a project test management system according to an embodiment of the present application;
fig. 6 is a block diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
In order to make the purpose, technical solutions and advantages of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
It should be understood that throughout the description and claims of this application, unless the context clearly requires otherwise, the words "comprise", "comprising", and the like, are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is, what is meant is "including, but not limited to".
It will be further understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. In addition, in the description of the present application, the meaning of "a plurality" is two or more unless otherwise specified.
It should be noted that the terms "S1", "S2", etc. are used for descriptive purposes only, are not intended to refer specifically to an order or sequential meaning, nor are they intended to limit the present application, but are merely used for convenience in describing the method of the present application and are not to be construed as indicating the order of the steps. In addition, technical solutions between the embodiments may be combined with each other, but must be based on the realization of the technical solutions by a person skilled in the art, and when the technical solutions are contradictory to each other or cannot be realized, such a combination should not be considered to exist, and is not within the protection scope claimed in the present application.
Example one
In order to realize the project test management method disclosed by the application, the embodiment of the application provides a project development management system based on a micro-service architecture, which comprises a test project management module, a test task management module, a review task management module and an analysis early warning module. Specifically, as shown in fig. 1, the process of managing the project by applying the lifecycle management system disclosed in this embodiment includes:
s1, managing a project under development.
Specifically, the test project management module obtains the project information uniformly managed in the project management system through a query interface according to a uniformly compiled project identifier (such as a financial code), wherein the project information at least comprises basic project information, the test stages and planned processing periods, the test configuration, and the test scheme. The test project management module also makes it convenient for a test manager to check the progress of the project's test stages and to maintain the project's test configuration and test scheme through a visual interface.
And S2, managing the test tasks of the project.
Specifically, as shown in the flowchart of fig. 2, the test task management module reads the project information of the project through a unified test task management entry, and establishes the test tasks of each test stage of the project according to the test configuration and test scheme maintained in the project information. When a test task needs to be executed at the current time point, the test stage at the current time point (i.e., the current test stage) can be determined from the current time point and the planned processing period of each test stage, and the corresponding current test task is then determined from the current test stage, wherein the current test task includes one or more test tasks.
And the test task management module determines a test execution scheme according to the attributes of the test cases in the current test task, wherein the test execution scheme specifically comprises an automatic test scheme and a manual test scheme. If the test case is set as the automation attribute, the test task management module can execute the corresponding automation test task through the automation test scheme; specifically, a tester selects a machine to be tested, the test task management module automatically detects the server environment of the machine to be tested before the automatic test task is executed, and if the server environment required in the test case is installed, the corresponding test script is directly called to execute the automatic test task; if the corresponding server environment is not installed, test environment deployment is automatically carried out on the machine to be tested according to the requirements of the test case, and after the test environment deployment is finished, the corresponding test script is called to execute an automatic test task, wherein the test script can be a test script written by programming languages such as python or Java and the like. The test task management module supports concurrent execution of automated test tasks. If the test case is set to be of a non-automatic attribute, the test task management module executes a corresponding manual test task through a manual test scheme, records the starting time, the pause time, the ending time and the like of the execution of the test task to calculate the test duration, and needs a tester to input a test result, a test log, test problems and the like. Whether the test scheme is a manual test scheme or an automatic test scheme, the test task management module stores the generated test result, test logs and other data into the data resource pool so as to realize data sharing with other systems.
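The dispatch logic of step S2 can be sketched as follows. This is a minimal illustration, not the patent's actual implementation: the `Machine` class and all field names (`automated`, `env`, `script`, `name`) are assumed stand-ins for the machine under test and the test-case attributes.

```python
class Machine:
    """Minimal stand-in for a machine under test (illustrative only)."""
    def __init__(self, installed=()):
        self.installed = set(installed)
        self.ran = []

    def has_env(self, env):
        return env in self.installed

    def deploy_env(self, env):
        # deploy the server environment required by the test case
        self.installed.add(env)

    def run_script(self, script):
        # placeholder for invoking the real python/Java test script
        self.ran.append(script)
        return {"script": script, "status": "pass"}


def run_test_task(case, machine):
    """Dispatch one test case: automated cases get their environment
    checked/deployed and their script called; non-automated cases are
    handed to a tester, who enters the result and log manually."""
    if not case["automated"]:
        return {"case": case["name"], "mode": "manual",
                "note": "prompt tester to test and enter result and log"}
    if not machine.has_env(case["env"]):
        machine.deploy_env(case["env"])
    result = machine.run_script(case["script"])
    return {"case": case["name"], "mode": "auto", "result": result}
```

As in the text, the environment check happens before each automated run, so a freshly selected machine is deployed on first use and reused afterwards.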
And S3, managing the evaluation task of the project.
Specifically, the project development management system provided by the application integrates a plurality of units, such as a Customer Relationship Management (CRM) unit, a Product Lifecycle Management (PLM) unit, a test management unit, and a material management unit, in the review task management module, and the integrated units work in organic coordination so as to achieve overall optimization.
Testers submit the project's configuration verification review requirements in the CRM unit; the review task management module automatically synchronizes the requirements to the PLM unit, prompts the development manager in the PLM unit to judge, from the uploaded review requirements, whether review tasks and construction period information need to be issued, and receives the review tasks and construction period information issued by the development manager through the test management unit. It is worth noting that the review task management module supports sending review tasks in batches and also provides a mail notification function to notify the test management unit to receive the sent review tasks. Typically, a project has multiple review tasks.
If the project has to-be-handled review tasks (namely incomplete review tasks), the review task management module determines a node handler according to the current review node of the to-be-handled review task and a preset rule. Specifically, historical processing data of historical review tasks of the same type as the to-be-handled review task are obtained, wherein the historical processing data include the historical handler and historical processing time of each historical review node of those historical review tasks; according to the historical processing data, the historical handler with the shortest historical processing time at the historical review node corresponding to the current review node is selected as the node handler of the current review node. The recommended node handler is then presented to the current operator through a visual interface, and the operator can adjust the recommendation according to actual requirements; for example, when the recommended node handler is already processing other review tasks, the remaining idle handlers are sorted by the length of their historical processing time, and the next handler in that order is selected as the node handler.
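The handler-recommendation rule can be sketched as below; this is a hedged illustration under the assumption that history is kept as per-node records, and the `NodeHistory` structure and a `busy` set for occupied handlers are illustrative, not from the patent.

```python
from dataclasses import dataclass


@dataclass
class NodeHistory:
    """One historical record: who handled which review node of which
    task type, and how long it took (hours). Illustrative structure."""
    task_type: str
    node: str
    handler: str
    hours: float


def pick_node_handler(history, task_type, node, busy=()):
    """Recommend the handler with the shortest average historical
    processing time at the matching review node, skipping handlers
    that are busy with other review tasks."""
    stats = {}
    for r in history:
        if r.task_type == task_type and r.node == node and r.handler not in busy:
            total, n = stats.get(r.handler, (0.0, 0))
            stats[r.handler] = (total + r.hours, n + 1)
    if not stats:
        return None  # no comparable history: leave the choice to the operator
    return min(stats, key=lambda h: stats[h][0] / stats[h][1])
```

Averaging over repeated records is one reasonable reading of "shortest historical processing time"; taking the minimum per handler would be an equally defensible variant.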
The node handler is automatically reminded of the node tasks to be completed at the review node where the to-be-handled review task currently resides. The test phase of the project at least comprises the following nodes: a test manager audit node, a test field TL (team leader) task dispatch node, a material borrowing node, a test execution node, and a test report audit node. If the to-be-handled review task is currently at the test manager audit node, the review task management module reminds the node handler to establish the test stages for the project and to maintain the test configuration and test scheme (this node task can be realized through the project management module); if it is currently at the test field TL task dispatch node, the node handler is reminded to create or distribute test tasks for the project in the test task management module; if it is currently at the material borrowing node, the node handler is reminded to check whether the materials are in place, and if materials need to be borrowed, the integrated material management unit can be jumped to with one click to initiate the borrowing; if it is currently at the test execution node, the node handler is reminded of the pending test tasks and of uploading a test report, wherein the test report includes the test results, test logs, test problems, and the like; and if it is currently at the test report audit node, the node handler is reminded of the test report to be audited. It should be noted that, as shown in fig. 3, the review nodes have a chronological order: each review node transfers to the next once its node task is completed; if the task of a review node is not completed, the task is rejected back to the preceding node and the node handler is prompted to handle it, or it can be rejected to another earlier node.
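The node flow of fig. 3 amounts to a small state machine, sketched below. The node names and ordering are paraphrased assumptions from the text, not the figure's exact labels.

```python
# Assumed order of the review nodes described for fig. 3
NODES = ["manager_audit", "tl_dispatch", "material_borrow",
         "test_execute", "report_audit"]


class ReviewTask:
    """Sketch of the review flow: complete() advances to the next
    review node; reject() sends the task back to the previous node,
    or to any earlier node, after which its handler is prompted."""

    def __init__(self):
        self.idx = 0

    @property
    def node(self):
        return NODES[self.idx]

    def complete(self):
        # current node task done: transfer to the next review node
        if self.idx < len(NODES) - 1:
            self.idx += 1

    def reject(self, to_node=None):
        # incomplete node task: fall back to the preceding node by
        # default, or to a named earlier node
        target = NODES.index(to_node) if to_node else max(self.idx - 1, 0)
        if target <= self.idx:  # rejection only moves backwards
            self.idx = target
```

A real system would attach the reminder logic described above to each state transition; here only the ordering and rejection rules are modeled.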
The review task management module is also used for predicting the latest completion time of the to-be-handled review task according to the prediction model. Specifically, the prediction model is trained on the historical review nodes and historical processing times of historical review tasks of the same type as the to-be-handled review task; whether the prediction model meets a preset condition is verified on a preset test data set, which is a held-out part of the acquired historical review nodes and historical processing times; when the prediction model meets the preset training condition (for example, the prediction accuracy reaches 95%), the trained prediction model is generated; the trained prediction model is used to predict the longest processing time of each review node in the to-be-handled review task; and the latest completion time of the to-be-handled review task is calculated from the longest processing time of each review node. The prediction model may be any machine learning model.
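Since the patent leaves the model family open ("any machine learning model"), the train/verify/predict loop can be sketched with a deliberately trivial stand-in predictor, the per-node historical mean; the record layout and tolerance criterion are assumptions for illustration.

```python
from statistics import mean


def train_predictor(train_records):
    """Trivial stand-in model: predict each review node's processing
    time as the mean of its historical processing times.
    train_records: iterable of (node, hours) pairs."""
    by_node = {}
    for node, hours in train_records:
        by_node.setdefault(node, []).append(hours)
    return {node: mean(hs) for node, hs in by_node.items()}


def accuracy(model, test_records, tol=0.2):
    """Fraction of held-out records predicted within `tol` relative
    error; compared against a threshold such as the 95% in the text."""
    hits = sum(1 for node, hours in test_records
               if node in model and abs(model[node] - hours) <= tol * hours)
    return hits / len(test_records)


def latest_completion(model, pending_nodes, start=0.0):
    """Latest completion time of the pending review task: start time
    plus the predicted longest processing time of each remaining node."""
    return start + sum(model.get(n, 0.0) for n in pending_nodes)
```

In practice the mean predictor would be replaced by whatever model passes the preset accuracy condition; the surrounding loop stays the same.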
And S4, analyzing and early warning the progress of the evaluation task.
Specifically, the analysis early warning module judges, from the predicted latest completion time of the to-be-handled review task and the issued construction period information, whether the completion time of the to-be-handled review task will be overdue, and gives a warning in advance. The analysis early warning module can also compare the predicted longest processing time of the to-be-handled review task at each review node with the processing time set for that node in the construction period information, judge whether the node's completion time exceeds its period, and generate an overdue early warning in time if it does, so as to guarantee the timeliness of subsequent review nodes. The analysis early warning module can store the generated overdue early warnings in the data resource pool, and display the warning information to testers, development managers, and related personnel through a query interface, a visual interface, or file download, so that tasks can be followed up, prompted, and monitored in time.
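The per-node comparison in step S4 reduces to a small check, sketched here with assumed dictionary shapes (node name to hours); the warning-record fields are illustrative.

```python
def overdue_warnings(node_deadlines, node_predictions):
    """Compare each review node's predicted longest finish time with
    the processing time set for it in the construction period
    information, and emit a warning record for each overrun."""
    warnings = []
    for node, deadline in node_deadlines.items():
        predicted = node_predictions.get(node)
        if predicted is not None and predicted > deadline:
            warnings.append({"node": node,
                             "deadline": deadline,
                             "predicted": predicted,
                             "overrun": predicted - deadline})
    return warnings
```

The returned records are what the module would push to the data resource pool and surface on the visual interface.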
The project development management system provided by this embodiment is implemented in programming languages such as Java, JavaScript, and Python, and builds a micro-service architecture with a MySQL database and frameworks such as Spring Boot, realizing low coupling and high cohesion, so that each module can be continuously integrated, the overall benefit is maximized after integration, and universal, efficient test tracking is realized.
Example two
Corresponding to the first embodiment, an embodiment of the present application further provides a project test management method, as shown in fig. 4, specifically including:
4100. Acquiring the to-be-handled review task and construction period information of the project;
preferably, before the to-be-handled review task and the construction period information of the project are acquired, the method further includes:
4110. Verifying the review requirement;
4120. After the verification of the review requirement passes, prompting a research and development manager to judge whether to issue the review task and construction period information corresponding to the project.
4200. Determining a node handler according to the current evaluation node of the to-be-processed evaluation task and a preset rule;
preferably, the determining a node handler according to the current review node of the to-be-handled review task and a preset rule includes:
4210. Acquiring historical processing data of historical review tasks of the same type as the to-be-handled review task, wherein the historical processing data comprises the historical handler and historical processing time of each historical review node of the historical review tasks;
4220. Screening, according to the historical processing data, the historical handler with the shortest historical processing time at the historical review node corresponding to the current review node of the to-be-handled review task as the node handler of the current review node.
4300. Reminding a node processor of the current evaluation node of completing the task of the current node;
4400. predicting the latest completion time of the to-be-processed evaluation task according to a prediction model;
preferably, the predicting the latest completion time of the to-be-processed review task according to the prediction model includes:
4410. Training the prediction model according to the historical review nodes and historical processing time of historical review tasks of the same type as the to-be-handled review task;
4420. Verifying whether the prediction model meets a preset condition based on a preset test data set;
4430. Generating a trained prediction model when the prediction model meets a preset training condition;
4440. Predicting the longest processing time of each review node in the to-be-handled review task by using the trained prediction model, so as to predict the latest completion time of the to-be-handled review task.
4500. Judging whether to generate an overdue early warning according to the construction period information and the latest completion time of the to-be-handled review task.
Preferably, the project test management method further includes managing the test tasks of the project:
4600. Querying the corresponding project information according to the project identifier of the project, where the project information includes a test configuration and a test scheme;
4700. Establishing a test task for each test stage of the project according to the test configuration and the test scheme;
4800. Executing the test tasks to obtain test results and test logs.
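Steps 4600-4700 can be sketched as a lookup followed by one task per stage. The registry and field names below are assumptions for the example, not part of the patent:

```python
# Illustrative sketch of steps 4600-4700: look up project info by identifier,
# then create one test task per test stage from the test scheme.
PROJECTS = {
    "PRJ-001": {
        "test_config": {"os": "linux"},
        "test_scheme": {"stages": ["unit", "integration", "system"]},
    }
}

def build_test_tasks(project_id):
    """Return one task record per test stage of the identified project."""
    info = PROJECTS[project_id]
    return [
        {"project": project_id, "stage": stage, "config": info["test_config"]}
        for stage in info["test_scheme"]["stages"]
    ]

tasks = build_test_tasks("PRJ-001")
print([t["stage"] for t in tasks])  # ['unit', 'integration', 'system']
```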
Preferably, executing the test task to obtain the test result and test log includes:
4810. Determining the current test stage of the project according to the project information and the current time node, and determining the project's current test task based on that stage;
4820. Determining a test execution scheme according to the attributes of the test cases in the current test task;
4830. Acquiring the test result and test log of the current test stage of the project based on the test execution scheme.
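A sketch of steps 4810-4820, assuming each stage carries a start date and each test case an "automated" flag (both illustrative, since the patent names neither attribute):

```python
# Hypothetical sketch: pick the current stage from the current time node
# (step 4810) and the execution scheme from a test-case attribute (step 4820).
from datetime import date

def current_stage(stages, today):
    """Return the latest stage whose start date is not after today, if any."""
    started = [s for s in stages if s["start"] <= today]
    return max(started, key=lambda s: s["start"])["name"] if started else None

def execution_scheme(case):
    """Dispatch on an assumed per-case attribute."""
    return "automatic" if case.get("automated") else "manual"

stages = [
    {"name": "unit", "start": date(2023, 1, 1)},
    {"name": "integration", "start": date(2023, 2, 1)},
]
print(current_stage(stages, date(2023, 2, 15)))            # integration
print(execution_scheme({"id": "TC-7", "automated": True}))  # automatic
```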
Preferably, the test execution scheme is either an automatic test scheme or a manual test scheme, and acquiring the test result and test log of the project based on the scheme includes:
4831. If the scheme is a manual test scheme, outputting the current test task and generating a first prompt asking a tester to run the test manually and enter the test result and test log;
4832. If the scheme is an automatic test scheme, selecting a machine under test and deploying its test environment according to the test case;
4833. Invoking the test script corresponding to the test case on the machine under test after the test environment is deployed, so that the test result and test log are generated automatically.
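The manual/automatic branch of steps 4831-4833 can be sketched as below. Every helper behavior here is a stub standing in for real prompting, machine selection, deployment, and script infrastructure:

```python
# Hedged sketch of steps 4831-4833: manual cases produce a first prompt;
# automatic cases pick a machine under test, deploy its environment, and
# invoke the case's script. Selection policy and outcomes are placeholders.
def run_test_task(case, machines):
    if not case["automated"]:
        return {"prompt": f"Please run {case['id']} manually and enter the result and log."}
    machine = machines[0]                          # placeholder selection policy
    log = [f"deploy {case['env']} on {machine}"]   # stand-in for real deployment
    log.append(f"run {case['script']} on {machine}")
    return {"result": "pass", "log": log}          # stand-in for the script's outcome

auto = {"id": "TC-3", "automated": True, "env": "ubuntu-22.04", "script": "tc3.py"}
manual = {"id": "TC-4", "automated": False}
print(run_test_task(auto, ["node-1"])["result"])   # pass
print("prompt" in run_test_task(manual, []))       # True
```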
Preferably, the method further comprises:
4840. Displaying each test stage of the project with its test results and test logs through a visual interface;
4850. Displaying, through the visual interface, the review nodes of the project's pending review tasks, the completion status of each node's task, and any overdue warnings.
Example three
As shown in fig. 5, and corresponding to the first and second embodiments, an embodiment of the present application provides a project test management system, including:
the data acquisition module 510, configured to acquire the pending review tasks and construction period information of a project;
the data analysis module 520, configured to determine a node handler according to the current review node of a pending review task and a preset rule;
the data analysis module 520, further configured to remind the node handler of the current review node to complete the current node's task;
the data prediction module 530, configured to predict the latest completion time of the pending review task according to a prediction model;
and the data early warning module 540, configured to judge whether to generate an overdue warning according to the construction period information and the latest completion time of the pending review task.
In some embodiments, the system further comprises:
the data maintenance module 550 is configured to query corresponding item information according to an item identifier of an item, where the item information includes a test configuration and a test scheme;
the data analysis module 520 is further configured to establish a test task of each test stage of the project according to the test configuration and the test scheme;
and the data processing module 560 is configured to execute the test task to obtain a test result and a test log.
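The module split above can be pictured as one class per named module behind a thin facade. This is a structural sketch only; every method body is a placeholder, and the wiring is an assumption about how the modules cooperate:

```python
# Structural sketch of the Example-three system: one stub class per module
# named in the text (510-540), composed by a facade. Bodies are placeholders.
class DataAcquisitionModule:          # 510
    def fetch(self, project_id):
        return {"review_task": f"{project_id}-review", "deadline": "2023-01-10"}

class DataAnalysisModule:             # 520
    def pick_handler(self, task):
        return "handler-A"            # placeholder for the historical-data rule

class DataPredictionModule:           # 530
    def latest_completion(self, task):
        return "2023-01-12"           # placeholder prediction

class DataEarlyWarningModule:         # 540
    def check(self, deadline, predicted):
        return predicted > deadline   # lexicographic compare works for ISO dates

class ProjectTestManagementSystem:
    def __init__(self):
        self.acquire, self.analyze = DataAcquisitionModule(), DataAnalysisModule()
        self.predict, self.warn = DataPredictionModule(), DataEarlyWarningModule()

    def run(self, project_id):
        data = self.acquire.fetch(project_id)
        handler = self.analyze.pick_handler(data["review_task"])
        predicted = self.predict.latest_completion(data["review_task"])
        return {"handler": handler,
                "overdue": self.warn.check(data["deadline"], predicted)}

print(ProjectTestManagementSystem().run("PRJ-001"))
```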
In some embodiments, the data analysis module 520 is further configured to acquire historical processing data of historical review tasks of the same type as the pending review task, where the historical processing data includes the historical handler and historical processing time of each historical review node of those tasks; the data analysis module 520 is further configured to screen out, from the historical processing data, the historical handler with the shortest processing time on the historical review node corresponding to the current review node, and take that handler as the node handler of the current review node.
In some embodiments, the data prediction module 530 is further configured to train the prediction model according to historical review nodes and historical processing time of historical review tasks of the same type as the pending review task; verifying whether the prediction model meets a preset condition or not based on a preset test data set; generating a trained predictive model when the predictive model meets a preset training condition; the data prediction module 530 is further configured to predict a maximum processing time of each review node within the pending review task using the trained prediction model to predict a latest completion time of the pending review task.
In some embodiments, the data acquisition module 510 is further configured to verify the review requirements; after the review requirements pass verification, the data acquisition module 510 is further configured to prompt the research and development manager to decide whether to issue the review tasks and construction period information corresponding to the project.
In some embodiments, the data processing module 560 is further configured to determine the current test stage of the project according to the project information and the current time node, and to determine the project's current test task based on that stage; the data processing module 560 is further configured to determine a test execution scheme according to the attributes of the test cases in the current test task; the data processing module 560 is further configured to acquire the test result and test log of the current test stage of the project based on the test execution scheme.
In some embodiments, if the test scheme is a manual test scheme, the data processing module 560 is further configured to output the current test task and generate a first prompt asking a tester to run the test manually and enter the test result and test log; if the test scheme is an automatic test scheme, the data processing module 560 is further configured to select a machine under test and deploy its test environment according to the test case; the data processing module 560 is further configured to invoke the test script corresponding to the test case on the machine under test after the test environment is deployed, so that the test result and test log are generated automatically.
In some embodiments, the data processing module 560 is further configured to display each test stage of the project with its test results and test logs through a visual interface; the data processing module 560 is further configured to display, through the visual interface, the review nodes of the pending review tasks corresponding to the project, the completion status of each node's task, and any overdue warnings.
Example four
Corresponding to all the above embodiments, an embodiment of the present application provides an electronic device, including:
one or more processors; and memory associated with the one or more processors for storing program instructions that, when read and executed by the one or more processors, perform the steps of:
acquiring the pending review tasks and construction period information of a project;
determining a node handler according to the current review node of a pending review task and a preset rule;
reminding the node handler of the current review node to complete the current node's task;
predicting the latest completion time of the pending review task according to a prediction model;
and judging whether to generate an overdue warning according to the construction period information and the latest completion time of the pending review task.
Fig. 6 illustrates an architecture of an electronic device, which may specifically include a processor 610, a video display adapter 611, a disk drive 612, an input/output interface 613, a network interface 614, and a memory 620. The processor 610, the video display adapter 611, the disk drive 612, the input/output interface 613, the network interface 614, and the memory 620 may be communicatively connected by a bus 630.
The processor 610 may be implemented as a general-purpose CPU (central processing unit), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits, and executes the relevant programs to implement the technical solution provided by the present application.
The memory 620 may be implemented in the form of a ROM (read-only memory), a RAM (random access memory), a static storage device, a dynamic storage device, or the like. The memory 620 may store an operating system 621 for controlling the execution of the electronic device 600 and a basic input output system (BIOS) 622 for controlling low-level operation of the electronic device 600. In addition, a web browser 623, a data storage management system 624, an icon font processing system 625, and the like may also be stored. The icon font processing system 625 may be the application program that implements the operations of the foregoing steps in this embodiment of the application. In summary, when the technical solution provided in the present application is implemented in software or firmware, the relevant program code is stored in the memory 620 and called for execution by the processor 610.
The input/output interface 613 is used to connect an input/output module to realize information input and output. The input/output module may be configured as a component within the device (not shown in the figure) or may be external to the device to provide the corresponding functions. Input devices may include a keyboard, a mouse, a touch screen, a microphone, and various sensors; output devices may include a display, a speaker, a vibrator, an indicator light, and the like.
The network interface 614 is used to connect a communication module (not shown in the figure) to implement communication between this device and other devices. The communication module may communicate in a wired mode (such as USB or a network cable) or in a wireless mode (such as a mobile network, Wi-Fi, or Bluetooth).
Bus 630 includes a path that transfers information between the various components of the device, such as processor 610, video display adapter 611, disk drive 612, input/output interface 613, network interface 614, and memory 620.
It should be noted that although the above device shows only the processor 610, the video display adapter 611, the disk drive 612, the input/output interface 613, the network interface 614, the memory 620, and the bus 630, a specific implementation may also include other components necessary for normal operation. Furthermore, those skilled in the art will understand that the device described above may include only the components necessary to implement the solution of the present application, rather than all of the components shown in the figures.
From the above description of the embodiments, it is clear to those skilled in the art that the present application can be implemented by software plus a necessary general hardware platform. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, or the like, and includes several instructions for enabling a computer device (which may be a personal computer, a cloud server, or a network device) to execute the method according to the embodiments or some parts of the embodiments of the present application.
The embodiments in this specification are described progressively; identical or similar parts of the embodiments may be referred to one another, and each embodiment focuses on its differences from the others. In particular, the system embodiments are substantially similar to the method embodiments and are therefore described relatively simply; for related points, refer to the descriptions of the method embodiments. The system embodiments described above are only illustrative: units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment, which one of ordinary skill in the art can understand and implement without inventive effort.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A project test management method, the method comprising:
acquiring the pending review tasks and construction period information of a project;
determining a node handler according to the current review node of a pending review task and a preset rule;
reminding the node handler of the current review node to complete the current node's task;
predicting the latest completion time of the pending review task according to a prediction model;
and judging whether to generate an overdue warning according to the construction period information and the latest completion time of the pending review task.
2. The method of claim 1, wherein the project test management method further comprises managing the test tasks of the project:
querying the corresponding project information according to the project identifier of the project, where the project information includes a test configuration and a test scheme;
establishing a test task for each test stage of the project according to the test configuration and the test scheme;
and executing the test tasks to obtain test results and test logs.
3. The method of claim 1, wherein determining a node handler according to the current review node of the pending review task and a preset rule comprises:
acquiring historical processing data of historical review tasks of the same type as the pending review task, where the historical processing data includes the historical handler and historical processing time of each historical review node of those tasks;
and screening out, from the historical processing data, the historical handler with the shortest processing time on the historical review node corresponding to the current review node, as the node handler of the current review node.
4. The method of any of claims 1-3, wherein predicting the latest completion time of the pending review task according to the prediction model comprises:
training the prediction model on the historical review nodes and historical processing times of historical review tasks of the same type as the pending review task;
verifying, against a preset test data set, whether the prediction model satisfies a preset condition;
generating the trained prediction model once the model satisfies the preset training condition;
and predicting the longest processing time of each review node in the pending review task with the trained prediction model, so as to predict the latest completion time of the pending review task.
5. The method of any of claims 1-3, wherein, before acquiring the pending review tasks and construction period information of the project, the method further comprises:
verifying the review requirements;
and after the review requirements pass verification, prompting the research and development manager to decide whether to issue the review tasks and construction period information corresponding to the project.
6. The method of claim 2, wherein executing the test task to obtain the test result and test log comprises:
determining the current test stage of the project according to the project information and the current time node, and determining the project's current test task based on that stage;
determining a test execution scheme according to the attributes of the test cases in the current test task;
and acquiring the test result and test log of the current test stage of the project based on the test execution scheme.
7. The method of claim 6, wherein the test execution scheme is either an automatic test scheme or a manual test scheme, and acquiring the test result and test log of the project based on the scheme comprises:
if the scheme is a manual test scheme, outputting the current test task and generating a first prompt asking a tester to run the test manually and enter the test result and test log;
if the scheme is an automatic test scheme, selecting a machine under test and deploying its test environment according to the test case;
and invoking the test script corresponding to the test case on the machine under test after the test environment is deployed, so that the test result and test log are generated automatically.
8. The method of claim 7, further comprising:
displaying each test stage of the project with its test results and test logs through a visual interface;
and displaying, through the visual interface, the review nodes of the pending review tasks corresponding to the project, the completion status of each node's task, and any overdue warnings.
9. A project test management system, the system comprising:
a data acquisition module, configured to acquire the pending review tasks and construction period information of a project;
a data analysis module, configured to determine a node handler according to the current review node of a pending review task and a preset rule;
the data analysis module, further configured to remind the node handler of the current review node to complete the current node's task;
a data prediction module, configured to predict the latest completion time of the pending review task according to a prediction model;
and a data early warning module, configured to judge whether to generate an overdue warning according to the construction period information and the latest completion time of the pending review task.
10. The system of claim 9, further comprising:
a data maintenance module, configured to query the corresponding project information according to the project identifier of the project, where the project information includes a test configuration and a test scheme;
the data analysis module, further configured to establish a test task for each test stage of the project according to the test configuration and the test scheme;
and a data processing module, configured to execute the test tasks to obtain test results and test logs.
CN202211604710.6A 2022-12-13 2022-12-13 Project test management method and system Pending CN115827469A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211604710.6A CN115827469A (en) 2022-12-13 2022-12-13 Project test management method and system


Publications (1)

Publication Number Publication Date
CN115827469A true CN115827469A (en) 2023-03-21

Family

ID=85547230




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination