CN111611172A - Project test defect analysis method, device, equipment and storage medium - Google Patents

Project test defect analysis method, device, equipment and storage medium

Info

Publication number
CN111611172A
Authority
CN
China
Prior art keywords
defect
defect analysis
project
analysis result
test
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010455310.8A
Other languages
Chinese (zh)
Inventor
范永骏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
OneConnect Smart Technology Co Ltd
OneConnect Financial Technology Co Ltd Shanghai
Original Assignee
OneConnect Financial Technology Co Ltd Shanghai
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OneConnect Financial Technology Co Ltd Shanghai filed Critical OneConnect Financial Technology Co Ltd Shanghai
Priority to CN202010455310.8A priority Critical patent/CN111611172A/en
Publication of CN111611172A publication Critical patent/CN111611172A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Stored Programmes (AREA)

Abstract

The invention discloses a project test defect analysis method, device, equipment and storage medium. The method comprises the following steps: receiving a project test request sent by a client, and parsing the project to be tested and the interface call parameters contained in the request; calling a preset interface configured in a preset defect tracking tool according to the interface call parameters, and acquiring, through the preset interface, all defect data generated while testing the project; inputting all the defect data into a preset defect analysis model, and receiving the defect analysis result output by the model; and storing the project to be tested and the defect analysis result in a database in association with each other, while sending the defect analysis result to the client. The invention saves labor cost, works at a finer defect-data granularity, and improves both data processing efficiency and the reliability of the defect analysis result.

Description

Project test defect analysis method, device, equipment and storage medium
Technical Field
The invention relates to the field of software testing, and in particular to a project test defect analysis method, device, equipment and storage medium.
Background
With the rapid development of computer science and technology, software has spread into ever more application fields, its scale and cost have grown, and the complexity of software design keeps rising, so the opportunities for errors or defects during development multiply; at the same time, the market's appreciation of the importance of software quality is steadily deepening. Software process management is key to improving software quality and development efficiency, and software defect management, an important aspect of process management and part of the software lifecycle, is an increasingly important means of ensuring quality during project implementation.
At present, the key indicators that measure software development quality are usually observed manually, and the defect data are computed by hand, which both consumes manpower and easily introduces errors into the indicators.
Disclosure of Invention
The embodiments of the invention provide a project test defect analysis method, device, equipment and storage medium, aiming to solve the prior-art problems that manually computing and analysing defect data consumes labor and yields inaccurate analysis results.
A project test defect analysis method, comprising:
receiving a project test request sent by a client, and analyzing a to-be-tested project and an interface calling parameter contained in the project test request;
calling a preset interface configured in a preset defect tracking tool according to the interface calling parameter, and acquiring all defect data of the item to be tested in the testing process through the preset interface;
inputting all the defect data into a preset defect analysis model, and receiving a defect analysis result output by the defect analysis model;
and storing the item to be tested and the defect analysis result in a database in a correlation manner, and simultaneously sending the defect analysis result to the client.
A project test defect analysis apparatus, comprising:
the receiving module is used for receiving a project test request sent by a client and analyzing a to-be-tested project and an interface calling parameter contained in the project test request;
the project testing module is used for calling a preset interface configured in a preset defect tracking tool according to the interface calling parameter and acquiring all defect data of the project to be tested in the testing process through the preset interface;
the data analysis module is used for inputting all the defect data into a preset defect analysis model and receiving a defect analysis result output by the defect analysis model;
and the sending module is used for storing the item to be tested and the defect analysis result in a database in a correlation manner and sending the defect analysis result to the client.
A computer apparatus comprising a memory, a processor and computer readable instructions stored in the memory and executable on the processor, the processor implementing the project test defect analysis method when executing the computer readable instructions.
A computer readable storage medium having computer readable instructions stored thereon which, when executed by a processor, implement the project test defect analysis method described above.
According to the project test defect analysis method, device, equipment and storage medium, after a project test request sent by a client is received, the interface call parameters parsed from the request are first used to obtain the preset interface configured in the defect tracking tool. The defect data produced while testing the project are then acquired through that interface, and statistical analysis is performed on the defect data to obtain a defect analysis result containing multiple defect indicators. Finally, the defect analysis result is stored in association with the project to be tested and sent to the client, so that the user can conveniently query and manage the result, accurately judge the quality of the project from it, and optimize the project accordingly.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained based on these drawings without inventive labor.
FIG. 1 is a schematic diagram of an application environment of a project test defect analysis method according to an embodiment of the invention;
FIG. 2 is a flow chart of a project test defect analysis method according to an embodiment of the present invention;
FIG. 3 is a flowchart of step S30 of the project test defect analysis method according to an embodiment of the present invention;
FIG. 4 is a flow chart of a project test defect analysis method according to another embodiment of the present invention;
FIG. 5 is a schematic block diagram of an apparatus for analyzing project test defects according to an embodiment of the present invention;
FIG. 6 is a schematic block diagram of a data analysis module of the project test defect analysis apparatus according to an embodiment of the present invention;
FIG. 7 is a schematic block diagram of an apparatus for analyzing defect in a project test according to another embodiment of the present invention;
FIG. 8 is a schematic diagram of a computer device in an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The project test defect analysis method provided by the invention can be applied to the application environment shown in figure 1, wherein a client communicates with a server through a network. The client includes, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers, cameras, and portable wearable devices. The server may be implemented as a stand-alone server or as a server cluster consisting of a plurality of servers.
In an embodiment, as shown in fig. 2, a project test defect analysis method is provided, which is described by taking the server in fig. 1 as an example, and includes the following steps:
and S10, receiving the project test request sent by the client, and analyzing the project to be tested and the interface call parameters contained in the project test request.
The item test request is sent after a user uploads an item to be tested to a server through a client and triggers a test button preset in the server.
The item to be tested is software which needs to be subjected to defect analysis, and the item to be tested can contain item information and version information, and the version information can be used for marking items released at different times; the interface call parameters include, but are not limited to, an interface call address and an interface call type.
Specifically, the server detects whether the test button associated with a project test request has been triggered. When it has, the project test request is received, and the project to be tested and the interface call parameters contained in it are parsed out. When it has not, the server may instead detect whether a project to be tested uploaded by the user through the client is already stored on the server; if so, test prompt information is sent to the client. The test prompt information may include the validity period of the project file, and if the stored project exceeds that validity period, it is cleared from the server.
Before receiving a project test request, if a file uploaded by the user through the client is detected, the server checks whether the project to be tested is a program file. When it is a program file, upload-success information is sent to the client; when it is not a program file, upload-failure information is sent to the client to prompt the user to re-upload a correct project to be tested.
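As an illustrative sketch only, the upload check described above might look as follows; the patent does not specify how a "program file" is recognized, so the extension whitelist and the message formats are assumptions:

```python
import os

# Hypothetical whitelist of program-file extensions; which file types count
# as a "program file" is an assumption, not specified by the patent.
PROGRAM_EXTENSIONS = {".py", ".java", ".jar", ".war", ".zip"}

def check_uploaded_project(filename: str) -> dict:
    """Return upload-success or upload-failure information for the client,
    depending on whether the uploaded project looks like a program file."""
    _, ext = os.path.splitext(filename)
    if ext.lower() in PROGRAM_EXTENSIONS:
        return {"status": "success", "message": "upload succeeded"}
    return {"status": "failure",
            "message": "please re-upload a correct project to be tested"}
```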
And S20, calling a preset interface configured in a preset defect tracking tool according to the interface calling parameter, and acquiring all defect data of the item to be tested in the testing process through the preset interface.
Specifically, after the project test request is received in step S10, a preset interface is called from the preset defect tracking tool according to the interface call parameters (including the interface call address and interface call type). A test script is then generated from the project test flow preset in the defect tracking tool and the project to be tested. The test script may include the project information and version information of the project to be tested, the defect data type, the version to be tested, the maximum number of defect data (i.e. the number of defect records that a single call of the preset interface can return), the fields of the defect data, the identifiers of the defect data, the start sequence of the defect data, and so on. Running the test script yields the defect data.
In the present embodiment, the defect tracking tool may be JIRA, a defect tracking management system developed by Atlassian. The preset interface may be the API that JIRA exposes externally, through which the various defect data generated while testing the project with JIRA can be obtained.
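JIRA's public REST API does expose a paginated issue search (`GET /rest/api/2/search` with `startAt` and `maxResults` parameters, returning `{"issues": [...], "total": N}`). The sketch below isolates the paging loop of step S20 behind an injected `search_page` callable so the acquisition logic can be shown without a live JIRA server; everything beyond the documented response shape is an assumption:

```python
def fetch_all_defects(search_page, page_size=100):
    """Collect all defect data by paging through a search endpoint.

    `search_page(start_at, max_results)` must return a dict shaped like
    JIRA's /rest/api/2/search response: {"issues": [...], "total": N}.
    In production it would wrap an HTTP GET such as
    GET <base>/rest/api/2/search?jql=...&startAt=...&maxResults=...
    """
    issues, start_at = [], 0
    while True:
        page = search_page(start_at, page_size)   # one preset-interface call
        issues.extend(page["issues"])             # accumulate this batch
        start_at += len(page["issues"])           # advance the start sequence
        if not page["issues"] or start_at >= page["total"]:
            break
    return issues
```

Here `page_size` corresponds to the "maximum number of defect data" obtainable in a single call of the preset interface mentioned in step S20.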
And S30, inputting all the defect data into a preset defect analysis model, and receiving a defect analysis result output by the defect analysis model.
The defect analysis model is used for receiving the defect data, performing statistical calculation on the defect data to obtain a plurality of defect indexes capable of reflecting the development quality of the project to be tested, and outputting the plurality of defect indexes as defect analysis results.
In this embodiment, the defect index includes a repair duration, a verification duration, a defect step length, a defect reopening number, and the like of the item defect.
The defect step length is an indicator of how hard a project defect is to find, measured as the number of operations needed after the project to be tested enters the core page under test: the shorter the step length, the easier the problem is to find; conversely, the longer the step length, the more deeply the problem is hidden. For example, as a project progresses, the proportion of short-step-length defects among newly found defects should tend to fall and the proportion of long-step-length defects should tend to rise; this pattern indicates the project is progressing normally. If, late in the project, the proportion of short-step-length defects among newly found defects rises instead, the early testing work was probably incomplete and many problems were missed, and the subsequent steps can prompt the user (e.g. a project developer) to adjust the development strategy in time to keep the project on schedule.
The reopen count reflects how often a defect fix had to be reworked, and the repair duration reflects repair efficiency; root-cause analysis can be performed in subsequent steps based on both, to further improve the development quality and repair efficiency of the project.
The verification duration is the time taken to verify a defect and is used to evaluate the stability of the test environment. A short verification duration indicates that the environment in which the defect was verified is healthy and testing is essentially unblocked. An overly long verification duration suggests that the test environment is blocking testing for some reason and slowing progress, and the subsequent steps can prompt the user (e.g. a project developer) to fix the test environment in time so that progress is not affected.
Specifically, after the defect data of the project to be tested are obtained through the API, statistical calculations are performed on them with the preset defect analysis model to obtain a defect analysis result containing multiple defect indicators. Note that the projects to be tested in this embodiment involve hundreds of subsystems, and testing a single function often calls dozens of them. Coarse-grained defect indicators alone, such as the priority of the defect data, the proportional weight of each priority, and the module distribution of the defects, cannot satisfy the defect analysis requirements of such projects. This embodiment therefore also counts and analyses finer-grained indicators such as the repair duration, verification duration, defect step length, and reopen count of the defect data.
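A minimal sketch of the four fine-grained indicators for a single defect record; the field names (`created`, `fix_deployed`, `verified`, `status_history`, `reproduction_steps`) are illustrative assumptions, since the patent does not fix a record schema:

```python
from datetime import datetime

def defect_indicators(defect: dict) -> dict:
    """Compute repair duration, verification duration, reopen count, and
    defect step length for one defect record (field names are assumed)."""
    created = datetime.fromisoformat(defect["created"])
    deployed = datetime.fromisoformat(defect["fix_deployed"])
    verified = datetime.fromisoformat(defect["verified"])
    return {
        # hours from defect creation until the fix was deployed
        "repair_hours": (deployed - created).total_seconds() / 3600,
        # hours from fix deployment until the fix was verified
        "verify_hours": (verified - deployed).total_seconds() / 3600,
        # how many times the defect was reopened after being fixed
        "reopen_count": defect["status_history"].count("Reopened"),
        # operations needed after entering the core page under test
        "step_length": len(defect["reproduction_steps"]),
    }
```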
And S40, storing the item to be tested and the defect analysis result in a database in a correlation mode, and sending the defect analysis result to the client.
That is, after the defect analysis result including the plurality of defect indicators is obtained in step S30, the project information, version information, and defect indicators of the project to be tested are packaged and stored together, so as to facilitate data tracing, and the defect analysis result is sent to the client.
To sum up, after receiving a project test request sent by a client, this embodiment first uses the interface call parameters parsed from the request to obtain the preset interface configured in the defect tracking tool, then acquires the defect data produced while testing the project through that interface, and performs statistical analysis on them to obtain a defect analysis result containing multiple defect indicators. Because the obtained defect data have a finer granularity and the indicator statistics are computed automatically, labor cost is saved and both data processing efficiency and indicator reliability improve. In addition, this embodiment stores the defect analysis result in association with the project to be tested and sends it to the client, so the user can conveniently query and manage the result, accurately judge the quality of the project from it, and optimize the project.
In one embodiment, as shown in fig. 3, the step S30 includes the following steps:
s301, after all the defect data are classified, at least one set identifier and a data set corresponding to the set identifier are obtained.
Specifically, after all the defect data are obtained in step S20, defect data of each type are matched against preset keywords (defect data of the same type are used to calculate one defect indicator), and the matched data are placed in the corresponding data set; that is, a data set stores only the defect data matched by its keywords.
For example, the defect fix-deployment completion time and the defect creation time may be placed in a first data set; the defect acceptance time and the defect fix-deployment completion time in a second data set; the records whose defect status is Reopened in a third data set; and the defect description field of the defect data together with its reproduction-related information (e.g. the reproduction steps) in a fourth data set.
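The classification of S301 can be sketched as a keyword match; the set identifiers and the keyword lists below are assumptions modelled on the four example sets above:

```python
# Assumed mapping from set identifier to the field names ("keywords")
# whose values belong in that data set.
SET_KEYWORDS = {
    "first_set": ("created", "fix_deployed"),             # -> repair duration
    "second_set": ("fix_deployed", "accepted"),           # -> verification duration
    "third_set": ("status",),                             # -> reopen count
    "fourth_set": ("description", "reproduction_steps"),  # -> defect step length
}

def classify_defect_data(records):
    """S301: place the matching fields of each defect record into every
    data set whose keywords mention them; return set_id -> matched data."""
    data_sets = {set_id: [] for set_id in SET_KEYWORDS}
    for record in records:
        for set_id, keywords in SET_KEYWORDS.items():
            subset = {k: v for k, v in record.items() if k in keywords}
            if subset:  # only keyword-matched data enter the set
                data_sets[set_id].append(subset)
    return data_sets
```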
S302, acquiring an index model related to the data set from the defect analysis model.
That is, the defect analysis model includes a plurality of index models, and each index model is associated with a data set.
And S303, receiving the defect indexes output by the index models, and outputting the defect analysis results generated according to all the defect indexes.
For example, the defect data contained in the first data set may be calculated with the first index model associated with it to obtain the defect repair duration; the defect data in the second data set calculated with the second index model associated with it to obtain the defect verification duration; the defect data in the third data set counted with the third index model associated with it to obtain the defect reopen count; and the defect data in the fourth data set processed with the fourth index model associated with it to obtain the defect step length.
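Steps S302 and S303 amount to a dispatch from set identifier to index model. The sketch below pairs each of the four example sets with a toy model; the pairings follow the text, but the concrete formulas and set identifiers are assumptions:

```python
from datetime import datetime

def mean_hours(pairs):
    """Index model for the duration sets: average gap in hours between
    the two ISO timestamps of each record."""
    gaps = [(datetime.fromisoformat(end) - datetime.fromisoformat(start))
            .total_seconds() / 3600
            for start, end in pairs]
    return sum(gaps) / len(gaps) if gaps else 0.0

# Set-to-model pairing mirrors the four examples in the text.
INDEX_MODELS = {
    "first_set": mean_hours,                                     # repair duration
    "second_set": mean_hours,                                    # verification duration
    "third_set": len,                                            # reopen count
    "fourth_set": lambda steps: max(map(len, steps), default=0), # defect step length
}

def run_analysis(data_sets):
    """S302/S303: pick the index model associated with each data set and
    merge the individual defect indicators into one analysis result."""
    return {set_id: INDEX_MODELS[set_id](data)
            for set_id, data in data_sets.items()}
```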
In another embodiment, the defect indicators further include the priority of the defect data (determined in advance from the tendency and severity of the defects), the proportional weight of each priority, the module distribution, the remaining defects, and so on, all of which can be obtained by conventional means and are not elaborated here.
In one embodiment, the step S40 includes the following steps:
firstly, the items to be tested and the defect analysis results are stored in a database in a correlation mode, and whether each defect index contained in the defect analysis results meets the corresponding preset alarm condition or not is detected.
And then, generating corresponding prompt information when each defect index contained in the defect analysis result meets the corresponding preset alarm condition. Preferably, when the defect index includes a repair duration, a verification duration, a defect step length and a defect reopening frequency of the project defect, and the preset alarm condition includes a repair duration threshold, a verification duration threshold, a defect step length threshold and a defect repair threshold, the method specifically includes the following steps: when the repair duration exceeds a repair duration threshold and/or the defect reopening frequency exceeds a defect repair threshold, generating abnormal repair prompt information to prompt a user to analyze the reason of the abnormal repair; when the verification duration exceeds a verification duration threshold, generating prompt information of abnormal test environment to prompt a user to repair the test environment of project development; and when the defect step length exceeds a defect step length threshold value, generating prompt information of abnormal test progress to prompt a user to adjust a project development strategy.
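The three alarm branches above can be sketched as a single check; the indicator names, units, and threshold values are illustrative assumptions:

```python
def check_alarms(indicators: dict, thresholds: dict) -> list:
    """Return a prompt message for every preset alarm condition that the
    defect analysis result satisfies (names and units are assumptions)."""
    prompts = []
    # repair duration threshold and/or defect repair (reopen) threshold
    if (indicators["repair_hours"] > thresholds["repair_hours"]
            or indicators["reopen_count"] > thresholds["reopen_count"]):
        prompts.append("abnormal repair: analyse the cause of the repair anomaly")
    # verification duration threshold
    if indicators["verify_hours"] > thresholds["verify_hours"]:
        prompts.append("abnormal test environment: repair the test environment")
    # defect step length threshold
    if indicators["step_length"] > thresholds["step_length"]:
        prompts.append("abnormal test progress: adjust the project development strategy")
    return prompts
```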
And finally, sending the defect analysis result and/or each prompt message to the client.
It can be understood that the embodiment facilitates the subsequent user to optimize the defect analysis of the project according to the defect analysis result and/or the prompt message.
In one embodiment, as shown in fig. 4, the step S30 is followed by the following steps:
And S50, acquiring the identification information of the defect indicators contained in the defect analysis result, wherein the identification information comprises one or more of the number, subject, and status of each defect indicator.
S60, obtaining creating information, wherein the creating information comprises one or more of creating time and creating person.
S70, generating a test file containing a unique file identifier according to the creating information, the item to be tested, the defect analysis result and the identification information of the defect index contained in the defect analysis result, and storing the test file in a database.
And S80, when receiving the query instruction containing the unique file identifier, acquiring the test file corresponding to the unique file identifier from the database, and sending the test file to the client.
Understandably, this embodiment packages and stores the creation information, project information, version information, defect indicators, and the identification information of the defect indicators together, and assigns a unique file identifier as required, so that the user can conveniently retrieve the file by its unique identifier, which facilitates data management.
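Steps S70 and S80 can be sketched with a UUID as the unique file identifier and a plain dict standing in for the database; all names below are illustrative assumptions:

```python
import json
import uuid

def build_test_file(creation_info, project, analysis_result):
    """S70: bundle the creation info, project/version info, and the defect
    analysis result into one record keyed by a unique file identifier."""
    file_id = uuid.uuid4().hex
    record = {"file_id": file_id, "creation": creation_info,
              "project": project, "analysis": analysis_result}
    return file_id, json.dumps(record)

def query_test_file(database, file_id):
    """S80: look up the stored test file by its unique file identifier."""
    return database.get(file_id)
```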
In one embodiment, as shown in fig. 5, a project test defect analysis apparatus is provided, which corresponds to the project test defect analysis method in the above embodiments one to one. The project test defect analysis device comprises the following modules, wherein each functional module is described in detail as follows:
the receiving module 110 is configured to receive a project test request sent by a client, and analyze a to-be-tested project and an interface call parameter included in the project test request.
The project testing module 120 is configured to call a preset interface configured in a preset defect tracking tool according to the interface call parameter, and acquire all defect data of the project to be tested in the testing process through the preset interface.
The data analysis module 130 is configured to input all the defect data into a preset defect analysis model, and receive a defect analysis result output by the defect analysis model.
The sending module 140 is configured to store the item to be tested and the defect analysis result in a database in an associated manner, and send the defect analysis result to the client.
In an embodiment, as shown in fig. 6, the data analysis module 130 includes the following sub-modules, and each of the sub-modules is described in detail as follows:
the data processing sub-module 131 is configured to obtain at least one set identifier and a data set corresponding to the set identifier after performing classification processing on all the defect data.
And the model matching sub-module 132 is configured to obtain an index model associated with the data set from the defect analysis model.
The output submodule 133 is configured to receive the defect indicators output by the indicator models, and output the defect analysis result generated according to all the defect indicators.
In an embodiment, as shown in fig. 7, the project test defect analysis apparatus further includes the following modules, and each functional module is described in detail as follows:
the first obtaining module 150 is configured to obtain identification information of a defect indicator included in the defect analysis result, where the indicator information includes one or more of a number, a theme, and a status of the defect indicator.
A second obtaining module 160, configured to obtain creation information, where the creation information includes one or more of creation time and creator.
The generating module 170 is configured to generate a test file including a unique file identifier according to the creation information, the item to be tested, the defect analysis result, and the identification information of the defect index included in the defect analysis result, and store the test file in a database.
And the query module 180 is configured to, when receiving a query instruction including the unique file identifier, obtain the test file corresponding to the unique file identifier from the database, and send the test file to the client.
In one embodiment, the sending module 140 includes the following sub-modules, and each of the functional sub-modules is described in detail as follows:
and the storage submodule is used for storing the items to be tested and the defect analysis result into a database in a correlation manner, and simultaneously detecting whether each defect index contained in the defect analysis result meets the corresponding preset alarm condition.
And the detection submodule is used for generating corresponding prompt information when each defect index contained in the defect analysis result meets the corresponding preset alarm condition.
And the sending submodule is used for sending the defect analysis result and/or each prompt message to the client.
In an embodiment, when the defect indicator includes a repair duration, a verification duration, a defect step length, and a defect reopening frequency of the item defect, and the preset alarm condition includes a repair duration threshold, a verification duration threshold, a defect step length threshold, and a defect repair threshold, the detection sub-module includes the following units, and each functional unit is described in detail as follows:
and the first detection unit is used for generating abnormal repair prompt information when the repair time length exceeds a repair time length threshold value and/or the defect reopening frequency exceeds a defect repair threshold value so as to prompt a user to analyze the reason of the abnormal repair.
And the second detection unit is used for generating prompt information of abnormal test environment when the verification duration exceeds the verification duration threshold so as to prompt a user to repair the test environment for project development.
And the third detection unit is used for generating prompt information of abnormal test progress when the defect step length exceeds a defect step length threshold value so as to prompt a user to adjust the project development strategy.
For the specific definition of the project test defect analysis apparatus, reference may be made to the above definition of the project test defect analysis method, which is not described herein again. The modules in the project test defect analysis device can be wholly or partially realized by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a server, and its internal structure diagram may be as shown in fig. 8. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer readable instructions, and a database. The internal memory provides an environment for the operating system and execution of computer-readable instructions in the non-volatile storage medium. The computer readable instructions, when executed by a processor, implement a project test defect analysis method.
In one embodiment, a computer device is provided, comprising a memory, a processor, and computer-readable instructions stored in the memory and executable on the processor; when executing the computer-readable instructions, the processor implements the following steps:
receiving a project test request sent by a client, and parsing out the project to be tested and the interface call parameters contained in the project test request;
calling, according to the interface call parameters, a preset interface configured in a preset defect tracking tool, and acquiring, through the preset interface, all defect data of the project to be tested generated during testing;
inputting all the defect data into a preset defect analysis model, and receiving the defect analysis result output by the defect analysis model;
and storing the project to be tested and the defect analysis result in a database in association, while sending the defect analysis result to the client.
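The four steps executed by the processor can be sketched as follows in Python; the request fields, the `fetch_defects` stub standing in for the defect tracking tool's preset interface, the toy analysis model, and the in-memory SQLite table are illustrative assumptions rather than details taken from the embodiment:

```python
import json
import sqlite3

def fetch_defects(params):
    # Stand-in for the preset interface of the defect tracking tool;
    # in practice this would be, e.g., an HTTP call driven by `params`.
    return [{"id": 1, "status": "closed", "reopened": 0},
            {"id": 2, "status": "open", "reopened": 2}]

def analyze_project(request_body, defect_model, db):
    # Step 1: parse the project under test and the interface call parameters
    request = json.loads(request_body)
    project = request["project"]
    params = request["interface_params"]
    # Step 2: acquire all defect data through the preset interface
    defects = fetch_defects(params)
    # Step 3: input the defect data into the defect analysis model
    result = defect_model(defects)
    # Step 4: store the project/result association, then return the result
    db.execute("INSERT INTO results (project, result) VALUES (?, ?)",
               (project, json.dumps(result)))
    return result

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE results (project TEXT, result TEXT)")
model = lambda defects: {"total": len(defects),
                         "reopened": sum(d["reopened"] for d in defects)}
print(analyze_project('{"project": "demo", "interface_params": {}}', model, db))
```

In a deployment the returned result would be serialized back to the client over the same channel that carried the project test request.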
In one embodiment, a computer-readable storage medium is provided, having computer-readable instructions stored thereon; when executed by a processor, the computer-readable instructions implement the following steps:
receiving a project test request sent by a client, and parsing out the project to be tested and the interface call parameters contained in the project test request;
calling, according to the interface call parameters, a preset interface configured in a preset defect tracking tool, and acquiring, through the preset interface, all defect data of the project to be tested generated during testing;
inputting all the defect data into a preset defect analysis model, and receiving the defect analysis result output by the defect analysis model;
and storing the project to be tested and the defect analysis result in a database in association, while sending the defect analysis result to the client.
It will be understood by those of ordinary skill in the art that all or part of the processes of the methods of the embodiments described above can be implemented by computer-readable instructions instructing relevant hardware; the instructions can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, databases, or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division into functional units or modules is illustrated; in practical applications, the above functions may be allocated to different functional units or modules as required, that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. A project test defect analysis method, characterized by comprising the following steps:
receiving a project test request sent by a client, and parsing out the project to be tested and the interface call parameters contained in the project test request;
calling, according to the interface call parameters, a preset interface configured in a preset defect tracking tool, and acquiring, through the preset interface, all defect data of the project to be tested generated during testing;
inputting all the defect data into a preset defect analysis model, and receiving the defect analysis result output by the defect analysis model;
and storing the project to be tested and the defect analysis result in a database in association, while sending the defect analysis result to the client.
2. The project test defect analysis method of claim 1, wherein the inputting all the defect data into a preset defect analysis model and receiving the defect analysis result output by the defect analysis model comprises:
classifying all the defect data to obtain at least one set identifier and the data set corresponding to each set identifier;
obtaining, from the defect analysis model, the index model associated with each data set;
and receiving the defect index output by each index model, and outputting the defect analysis result generated according to all the defect indexes.
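The classify-then-aggregate flow of claim 2 can be sketched as below; using a `module` field as the set identifier, and per-set index models that merely count defects, are illustrative assumptions, not details from the claim:

```python
from collections import defaultdict

def analyze(defects, index_models):
    # Classify all defect data: group records under a set identifier
    # (here, hypothetically, the defect's module name).
    sets = defaultdict(list)
    for d in defects:
        sets[d["module"]].append(d)
    # For each data set, look up the associated index model, collect the
    # defect index it outputs, and aggregate all indexes into one result.
    return {name: index_models[name](data) for name, data in sets.items()}

# Toy index models: one per set identifier, each emitting a defect index.
index_models = {
    "ui": lambda ds: {"count": len(ds)},
    "api": lambda ds: {"count": len(ds)},
}
defects = [{"module": "ui"}, {"module": "api"}, {"module": "ui"}]
print(analyze(defects, index_models))  # {'ui': {'count': 2}, 'api': {'count': 1}}
```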
3. The project test defect analysis method of claim 1, wherein after inputting all the defect data into a preset defect analysis model and receiving a defect analysis result output by the defect analysis model, the method further comprises:
acquiring identification information of the defect indexes contained in the defect analysis result, wherein the identification information comprises one or more of the serial number, the subject, and the state of a defect index;
acquiring creation information, wherein the creation information comprises one or more of the creation time and the creator;
generating a test file containing a unique file identifier according to the creation information, the project to be tested, the defect analysis result, and the identification information of the defect indexes contained in the defect analysis result, and storing the test file in a database;
and when a query instruction containing the unique file identifier is received, acquiring the test file corresponding to the unique file identifier from the database and sending the test file to the client.
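A minimal sketch of claim 3's test-file handling, assuming a UUID as the unique file identifier and a dictionary standing in for the database table:

```python
import json
import uuid

store = {}  # stand-in for the database table keyed by the unique file identifier

def save_test_file(creation_info, project, result, indicator_ids):
    # Generate a unique file identifier and persist the test file contents:
    # creation information, project under test, analysis result, and the
    # identification information of the defect indexes.
    file_id = uuid.uuid4().hex
    store[file_id] = json.dumps({
        "creation": creation_info,    # e.g. creation time and creator
        "project": project,
        "result": result,
        "indicators": indicator_ids,  # serial number / subject / state
    })
    return file_id

def query_test_file(file_id):
    # On a query instruction carrying the identifier, fetch and return the file.
    return json.loads(store[file_id])

fid = save_test_file({"creator": "tester"}, "demo", {"total": 3}, [1, 2])
print(query_test_file(fid)["project"])  # demo
```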
4. The project test defect analysis method of claim 1, wherein storing the project to be tested and the defect analysis result in a database in association, while sending the defect analysis result to the client, comprises:
storing the project to be tested and the defect analysis result in a database in association, while detecting whether each defect index contained in the defect analysis result meets the corresponding preset alarm condition;
generating the corresponding prompt information when a defect index contained in the defect analysis result meets its corresponding preset alarm condition;
and sending the defect analysis result and/or each prompt message to the client.
5. The project test defect analysis method of claim 4, wherein the defect indexes include the repair duration, the verification duration, the defect step length, and the reopening count of project defects; the preset alarm conditions comprise a repair duration threshold, a verification duration threshold, a defect step length threshold, and a defect reopening threshold; and generating the corresponding prompt information when each defect index contained in the defect analysis result meets the corresponding preset alarm condition comprises:
generating a repair-abnormal prompt message when the repair duration exceeds the repair duration threshold and/or the reopening count exceeds the defect reopening threshold, so as to prompt the user to analyze the cause of the abnormal repair;
generating a prompt message indicating an abnormal test environment when the verification duration exceeds the verification duration threshold, so as to prompt the user to repair the test environment used for project development;
and generating a prompt message indicating an abnormal test progress when the defect step length exceeds the defect step length threshold, so as to prompt the user to adjust the project development strategy.
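The alarm checks of claim 5 reduce to threshold comparisons; the indicator names, the hour units, and the threshold values in the sketch below are illustrative assumptions:

```python
def check_alarms(indicators, thresholds):
    # Compare each defect index with its preset alarm condition and collect
    # the corresponding prompt messages (wording is illustrative).
    prompts = []
    if (indicators["repair_hours"] > thresholds["repair_hours"]
            or indicators["reopen_count"] > thresholds["reopen_count"]):
        prompts.append("repair abnormal: analyse the cause of the failed fix")
    if indicators["verify_hours"] > thresholds["verify_hours"]:
        prompts.append("test environment abnormal: repair the environment")
    if indicators["defect_step"] > thresholds["defect_step"]:
        prompts.append("test progress abnormal: adjust the development strategy")
    return prompts

thresholds = {"repair_hours": 48, "reopen_count": 3,
              "verify_hours": 24, "defect_step": 10}
indicators = {"repair_hours": 72, "reopen_count": 1,
              "verify_hours": 8, "defect_step": 12}
print(check_alarms(indicators, thresholds))
```

Here the excessive repair duration and defect step length each trigger a prompt, while the verification duration and reopening count stay under their thresholds.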
6. An apparatus for project test defect analysis, comprising:
a receiving module, configured to receive a project test request sent by a client and to parse out the project to be tested and the interface call parameters contained in the project test request;
a project testing module, configured to call, according to the interface call parameters, a preset interface configured in a preset defect tracking tool, and to acquire, through the preset interface, all defect data of the project to be tested generated during testing;
a data analysis module, configured to input all the defect data into a preset defect analysis model and to receive the defect analysis result output by the defect analysis model;
and a sending module, configured to store the project to be tested and the defect analysis result in a database in association and to send the defect analysis result to the client.
7. The project test defect analysis apparatus of claim 6, wherein the data analysis module comprises:
a data processing submodule, configured to classify all the defect data to obtain at least one set identifier and the data set corresponding to each set identifier;
a model matching submodule, configured to obtain, from the defect analysis model, the index model associated with each data set;
and an output submodule, configured to receive the defect index output by each index model and to output the defect analysis result generated according to all the defect indexes.
8. The project test defect analysis apparatus of claim 6, further comprising:
a first acquisition module, configured to acquire identification information of the defect indexes contained in the defect analysis result, wherein the identification information comprises one or more of the serial number, the subject, and the state of a defect index;
a second acquisition module, configured to acquire creation information, the creation information comprising one or more of the creation time and the creator;
a generating module, configured to generate a test file containing a unique file identifier according to the creation information, the project to be tested, the defect analysis result, and the identification information of the defect indexes contained in the defect analysis result, and to store the test file in a database;
and a query module, configured to, when a query instruction containing the unique file identifier is received, acquire the test file corresponding to the unique file identifier from the database and send the test file to the client.
9. A computer device comprising a memory, a processor, and computer-readable instructions stored in the memory and executable on the processor, wherein the processor, when executing the computer-readable instructions, implements the project test defect analysis method of any one of claims 1 to 5.
10. A computer readable storage medium storing computer readable instructions, wherein the computer readable instructions, when executed by a processor, implement the project test defect analysis method of any one of claims 1 to 5.
CN202010455310.8A 2020-05-26 2020-05-26 Project test defect analysis method, device, equipment and storage medium Pending CN111611172A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010455310.8A CN111611172A (en) 2020-05-26 2020-05-26 Project test defect analysis method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN111611172A true CN111611172A (en) 2020-09-01

Family

ID=72204343

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010455310.8A Pending CN111611172A (en) 2020-05-26 2020-05-26 Project test defect analysis method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111611172A (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112069069A (en) * 2020-09-03 2020-12-11 平安信托有限责任公司 Defect automatic positioning analysis method, device and readable storage medium
CN112084110A (en) * 2020-09-08 2020-12-15 北京百度网讯科技有限公司 Data processing method, device, equipment and storage medium
CN112084110B (en) * 2020-09-08 2024-05-28 北京百度网讯科技有限公司 Data processing method, device, equipment and storage medium
CN112399466A (en) * 2020-11-12 2021-02-23 国网江苏省电力有限公司信息通信分公司 Method for analyzing communication rule defects based on domain rule base
CN112399466B (en) * 2020-11-12 2024-02-09 国网江苏省电力有限公司信息通信分公司 Communication rule defect analysis method based on domain rule base
JP7289334B2 (en) 2020-11-27 2023-06-09 ベイジン バイドゥ ネットコム サイエンス テクノロジー カンパニー リミテッド Methods and apparatus, electronic devices, storage media and computer programs for testing code
CN112486820A (en) * 2020-11-27 2021-03-12 北京百度网讯科技有限公司 Method, apparatus, device and storage medium for testing code
JP2022048088A (en) * 2020-11-27 2022-03-25 ベイジン バイドゥ ネットコム サイエンス テクノロジー カンパニー リミテッド Method and apparatus for testing code, electronic device, storage medium and computer program
CN112486820B (en) * 2020-11-27 2022-04-01 北京百度网讯科技有限公司 Method, apparatus, device and storage medium for testing code
CN112882934A (en) * 2021-02-24 2021-06-01 中国工商银行股份有限公司 Test analysis method and system based on defect growth
CN112882934B (en) * 2021-02-24 2024-02-13 中国工商银行股份有限公司 Test analysis method and system based on defect growth
CN113051180B (en) * 2021-04-30 2023-09-29 中国平安人寿保险股份有限公司 Method, device, equipment and storage medium for monitoring test task
CN113051180A (en) * 2021-04-30 2021-06-29 中国平安人寿保险股份有限公司 Test task monitoring method, device, equipment and storage medium
CN113011795A (en) * 2021-05-06 2021-06-22 北京京东振世信息技术有限公司 Quality analysis method and system for application program development project
CN114578210A (en) * 2022-02-25 2022-06-03 苏州浪潮智能科技有限公司 Mainboard test method, device, equipment and storage medium
CN114578210B (en) * 2022-02-25 2024-02-02 苏州浪潮智能科技有限公司 Mainboard testing method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN111611172A (en) Project test defect analysis method, device, equipment and storage medium
CN109062809B (en) Online test case generation method and device and electronic equipment
US9824002B2 (en) Tracking of code base and defect diagnostic coupling with automated triage
CN109542761B (en) Software quality evaluation method, device and storage medium
CN111522728A (en) Method for generating automatic test case, electronic device and readable storage medium
CN113127347B (en) Interface testing method, device, equipment and readable storage medium
CN109473093A (en) Audio recognition method, device, computer equipment and storage medium
CN113282513B (en) Interface test case generation method and device, computer equipment and storage medium
CN115048257A (en) System service function verification method and device, computer equipment and storage medium
CN114020432A (en) Task exception handling method and device and task exception handling system
CN112379913B (en) Software optimization method, device, equipment and storage medium based on risk identification
CN112153375A (en) Front-end performance testing method, device, equipment and medium based on video information
CN113282476B (en) Interface performance testing method and device and electronic equipment
CN113590498B (en) Method and system for testing application starting time of desktop operating system
CN111198798B (en) Service stability measuring method and device
CN115203025A (en) Test defect analysis method and device
CN111141981B (en) Line loss point inspection method and device, computer equipment and storage medium
CN109815083B (en) Application crash monitoring method and device, electronic equipment and medium
CN113282504A (en) Incremental code coverage rate detection method and service development method and device
CN114448848B (en) Switch testing method and device, electronic equipment and storage medium
CN114860549B (en) Buried data verification method, buried data verification device, buried data verification equipment and storage medium
CN114490361A (en) Test script quality obtaining method and device, computer equipment and storage medium
CN112612882B (en) Review report generation method, device, equipment and storage medium
CN114154838B (en) Product compatibility decision platform, method and storage medium
TWI655535B (en) Program changes management system and program changes management method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination