CN115952081A - Software testing method, device, storage medium and equipment - Google Patents

Software testing method, device, storage medium and equipment

Info

Publication number
CN115952081A
CN115952081A
Authority
CN
China
Prior art keywords
defect
interface
machine learning
learning model
target interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211550885.3A
Other languages
Chinese (zh)
Inventor
王闪闪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Bank Co Ltd
Original Assignee
Ping An Bank Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Bank Co Ltd filed Critical Ping An Bank Co Ltd
Priority to CN202211550885.3A priority Critical patent/CN115952081A/en
Publication of CN115952081A publication Critical patent/CN115952081A/en
Pending legal-status Critical Current

Abstract

The embodiment of the application provides a software testing method, device, storage medium and equipment. In the method, an execution result of an automated test case executed by an automated test platform for a target interface is obtained; the execution result is analyzed, and whether a defect exists is judged by combining the execution process log and the user behavior path; then, when a defect exists, the interface information of the target interface is processed by a trained machine learning model to obtain the defect type corresponding to the defect existing in the target interface. In this way, automatic identification of software defects and automatic analysis of defect types in interface testing are realized, and labor cost and time cost are effectively saved.

Description

Software testing method, device, storage medium and equipment
Technical Field
The application relates to the technical field of financial science and technology and software testing, in particular to a software testing method, a software testing device, a software testing storage medium and software testing equipment.
Background
Software testing is a process used to verify the correctness, integrity, security, and quality of software. The work of software testing centers on finding bugs (software defects). Currently, during the testing process, defects are generally located and their types analyzed manually; however, this process consumes a large amount of labor cost and time cost.
Disclosure of Invention
An embodiment of the application aims to provide a software testing method, device, storage medium and equipment, so as to solve the problem that software testing schemes in the related art consume a large amount of labor cost and time cost because defects are located and analyzed manually.
In a first aspect, a software testing method provided in an embodiment of the present application includes:
acquiring an execution result of an automated test case executed by an automated test platform, wherein the automated test case is used for testing a target interface;
analyzing the execution result, and judging whether defects exist or not based on the analyzed execution result, the execution process log and the user behavior path;
when the judgment result is yes, inputting the interface information of the target interface into a trained machine learning model, and determining the defect type corresponding to the defect of the target interface; the machine learning model is obtained by training based on historical defect data corresponding to multiple defect types as training samples.
In the implementation process, the execution result of the automated test case executed by the automated test platform for the target interface is obtained, the execution result is analyzed, whether a defect exists is judged by combining the execution process log and the user behavior path, and then, when a defect exists, the interface information of the target interface is processed by the trained machine learning model to obtain the defect type corresponding to the defect existing in the target interface. In this way, automatic identification of software defects and automatic analysis of defect types in interface testing are realized, and labor cost and time cost are effectively saved.
Further, in some embodiments, the interface information includes: the interface URI, assertions, and interface input parameters.
In the implementation process, the interface URI, the assertions and the interface input parameters are used as the interface information and input into the machine learning model, so that the machine learning model can identify the defect type more accurately.
Further, in some embodiments, the historical defect data is obtained from a defect pool; the method further comprises the following steps:
and if the result output by the machine learning model indicates that the defect exists, generating new defect data based on the interface information of the target interface and the corresponding defect type, and storing the new defect data into the defect pool.
In the implementation process, when the result output by the machine learning model aiming at the interface information of the target interface indicates that a defect exists, the interface information and the defect type output by the machine learning model are combined to form new defect data and gather the new defect data into a defect pool, so that closed-loop processing is realized, and data support is provided for the optimization of the subsequent machine learning model.
Further, in some embodiments, the method further comprises:
and if the result output by the machine learning model indicates that no defect exists, outputting information indicating that the judgment result fails.
In the implementation process, when the result output by the machine learning model is inconsistent with the original judgment result, information indicating that the judgment result failed is output to notify relevant testers, which improves the accuracy of subsequent test analysis.
Further, in some embodiments, the method further comprises:
if the result output by the machine learning model indicates that a defect exists, judging whether the defect is a historically retained defect through a defect tracking tool; if so, modifying the task state of the historically retained defect corresponding to the defect; if not, creating a task for the new defect.
In the implementation process, the defects are tracked, and the defects are reasonably managed.
Further, in some embodiments, the modifying the task state of the historically retained defect corresponding to the defect comprises:
if the task state of the historically retained defect is closed, resetting the state to reopened;
if the task state of the historically retained defect is open, keeping the task state unchanged, and adding a comment and an error information screenshot to the historically retained defect.
In the implementation process, a specific mode for modifying the task state of the historically retained defect is provided, so that relevant personnel can quickly and effectively repair the defect after checking the task state.
In a second aspect, an embodiment of the present application provides a software testing apparatus, including:
the system comprises an acquisition module, a test module and a test module, wherein the acquisition module is used for acquiring an execution result of an automatic test case executed by an automatic test platform, and the automatic test case is used for testing a target interface;
the judging module is used for analyzing the execution result and judging whether defects exist or not based on the analyzed execution result, the execution process log and the user behavior path;
the determining module is used for determining the defect type corresponding to the defect of the target interface by inputting the interface information of the target interface into the trained machine learning model when the judging result is yes; the machine learning model is obtained by training based on historical defect data corresponding to multiple defect types as training samples.
In a third aspect, an electronic device provided in an embodiment of the present application includes: memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method according to any of the first aspect when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium having instructions stored thereon, which, when executed on a computer, cause the computer to perform the method according to any one of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product, which when run on a computer causes the computer to perform the method according to any one of the first aspect.
Additional features and advantages of the disclosure will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the above-described techniques.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a flowchart of a software testing method according to an embodiment of the present application;
fig. 2 is a schematic diagram of a workflow of an automated test system based on machine learning according to an embodiment of the present application;
FIG. 3 is a block diagram of a software testing apparatus according to an embodiment of the present application;
fig. 4 is a block diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
As described in the background art, the software testing scheme in the related art has a problem that it depends on manual work to find the defect and analyze the defect type, which requires a lot of labor cost and time cost. Based on this, the embodiments of the present application provide a new software testing solution to solve the above problems.
Next, embodiments of the present application will be described:
as shown in fig. 1, fig. 1 is a flowchart of a software testing method provided in an embodiment of the present application, where the method may be applied to a terminal or a server, and the terminal may be various electronic devices, including but not limited to a smart phone, a tablet computer, a laptop portable computer, a desktop computer, and the like; the server may be a single server or a distributed server cluster consisting of a plurality of servers. It should be noted that the terminal/server may also be implemented as a plurality of software or software modules, or may also be implemented as a single software or software module, which is not limited in this application.
The method comprises the following steps:
in step 101, obtaining an execution result of an automated test case executed by an automated test platform, where the automated test case is used for testing a target interface;
the test case is a description of a test task performed on a specific software product, and embodies a test scheme, a method, a technology and a strategy. The test cases can be divided into two types, namely manual test cases and automatic test cases, wherein the manual test cases are manually executed and can judge and verify whether the function of the current step is correctly realized through artificial logic; the execution object of the automatic test case is a script, and the purpose of the script is to release a tester from a complicated and repeated test process.
In this embodiment, the automated test case is used to test a target interface, where the target interface may be a program interface, that is, an interface inside a program, or a protocol interface, that is, an interface outside the system. Therefore, the automated test case mentioned in this step is a test case designed for interface testing: interface testing determines whether an interface meets the corresponding functional and security requirements by testing input parameters, and the outputs corresponding to those input parameters, under different conditions. In practical applications, most middle-platform projects are covered by interface tests, and accordingly, the quality of such projects can be guaranteed by analyzing the results of the interface tests.
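For illustration of such an interface test case, the following Python sketch (assuming the third-party requests package) sends a request to a hypothetical interface and checks the status code, response header and response body; the URL, parameters and expected values are assumptions made for the example only and are not specified by this application.

    # Minimal interface test case sketch; the endpoint and fields are hypothetical.
    import requests

    def test_query_balance_interface():
        # Interface input parameters (hypothetical required and optional fields).
        payload = {"account_id": "1234567890", "currency": "CNY"}
        resp = requests.post("https://example.com/api/v1/balance/query",
                             json=payload, timeout=5)
        # Assertions: status-code assertion, response-header assertion,
        # response-content assertion.
        assert resp.status_code == 200
        assert resp.headers.get("Content-Type", "").startswith("application/json")
        body = resp.json()
        assert body.get("code") == "0000"            # expected business return code
        assert "balance" in body.get("data", {})     # expected response field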
The automated test case is executed on an automated test platform. The automated test platform may be a platform for managing automated tests, and has two main functions, one is to execute an automated test case, and the other is to output an execution result in the form of a test report. In some embodiments, the automated testing platform may be an AutoMan platform. The AutoMan platform is a complete page automation platform, so that test cases, test scripts and test reports can be maintained and checked on a unified online platform, testers can upload the written automation test cases to the AutoMan platform, and then the automation test cases are scheduled and executed on the platform to obtain the test reports, and therefore working time is saved. Of course, in other embodiments, the automated testing platform may also be other testing platforms, and the present application is not limited thereto.
In step 102, analyzing the execution result, and judging whether a defect exists or not based on the analyzed execution result, the execution process log and the user behavior path;
the method comprises the following steps: and analyzing an automatic execution result, and positioning a fault by combining an execution process log and a user behavior path. Analyzing the execution result may refer to analyzing information such as a return value, response time, throughput and the like of the target interface from the execution result through a regular expression; the execution process log records test information in the execution process of the automatic test case, wherein the test information comprises key information such as a specific execution interface, a request mode, a request parameter, a return value, a verification interface, request time, time consumption and the like; in the interface test, the test case can simulate the operation steps of the user calling the interface, and tests the function, performance and the like of the interface, so that the user behavior path mentioned in the step actually refers to various parameters and logics when the automatic test case simulates the user operating the target interface. By analyzing the information, it is possible to determine whether or not a defect exists. The step can be realized as an analyzer, and the defects can be detected quickly and automatically, so that the time of a tester is effectively saved.
Step 103, when the judgment result is yes, inputting the interface information of the target interface into the trained machine learning model, and determining the defect type corresponding to the defect of the target interface; the machine learning model is obtained by training based on historical defect data corresponding to multiple defect types as training samples.
This step includes: when the analyzer judges that a defect exists, further confirming the analyzer's result by means of a machine learning algorithm and determining the defect type. Specifically, the machine learning model is obtained by training with historical defect data corresponding to multiple defect types as training samples, so that in the training process the machine learning model learns to extract relevant features of the interface and can thereby classify and identify defect types. In some embodiments, the machine learning model may be a GRNN (General Regression Neural Network) model. GRNN is a radial basis function network based on mathematical statistics, and its theoretical basis is nonlinear regression analysis. Generally, the GRNN model is structurally composed of four layers, namely an input layer, a pattern layer, a summation layer and an output layer. The number of neurons in the input layer is equal to the dimension of the input vector in the learning samples, and each neuron is a simple distribution unit that directly transmits the input variable to the pattern layer; the number of neurons in the pattern layer is equal to the number of learning samples, and each neuron corresponds to a different sample; two types of neurons are used in the summation layer for summation; the number of neurons in the output layer is equal to the dimension of the output vector in the learning samples, and each neuron divides the output of the summation layer. The GRNN model has the characteristics of a simple structure, concise training, fast learning convergence and the like, so adopting the GRNN model as the machine learning model is easy to implement and can obtain a good classification effect. It should be noted that, in other embodiments, the machine learning model may also be a model based on other machine learning algorithms, which is not limited in this application; in addition, for the specific training process, reference may be made to the introduction of the training mode of the corresponding machine learning algorithm in the related art, which is not described in detail herein.
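For illustration only, a minimal GRNN-style classifier over numeric feature vectors can be sketched as follows; the smoothing factor and the shape of the training data are assumptions of this sketch rather than values given in this application.

    import numpy as np

    class SimpleGRNNClassifier:
        """Minimal GRNN-style classifier: the pattern layer applies a Gaussian
        kernel to the distance from each learning sample, the summation layer
        accumulates kernel weights per defect type, and the output layer picks
        the defect type with the largest accumulated score."""

        def __init__(self, sigma=0.5):
            self.sigma = sigma  # smoothing factor of the Gaussian kernel

        def fit(self, X, y):
            # X: (n_samples, n_features) feature vectors from historical defect data
            # y: (n_samples,) defect-type labels
            self.X = np.asarray(X, dtype=float)
            self.classes, self.y = np.unique(y, return_inverse=True)
            return self

        def predict(self, x):
            x = np.asarray(x, dtype=float)
            d2 = np.sum((self.X - x) ** 2, axis=1)              # pattern layer
            w = np.exp(-d2 / (2.0 * self.sigma ** 2))
            scores = np.bincount(self.y, weights=w,             # summation layer
                                 minlength=len(self.classes))
            return self.classes[int(np.argmax(scores))]         # output layer

In use, X would be numeric encodings of the interface URI, assertions and input parameters taken from the defect pool, and y the corresponding defect types.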
When the analyzer judges that a defect exists, the interface information of the target interface is processed by using the trained machine learning model. In some embodiments, the interface information may include the interface URI, assertions and interface input parameters. An interface URI (Uniform Resource Identifier) is one of the components of the interface and is an expression of the interface characteristics; an assertion is a first-order logic statement in a program whose purpose is to represent and verify the result expected by the software developer, and in interface testing assertions may include status-code assertions, response-content assertions, response-header assertions and the like, through which the actual return result of the interface can be compared with the expected return result; the interface input parameters are the variable parameters required by the interface request, including required parameters and optional parameters, and if an interface input parameter is wrong, the defect located by the analyzer most probably belongs to a front-end error. Using the interface URI, assertions and interface input parameters as the interface information and inputting them into the machine learning model enables the machine learning model to identify the defect type more accurately.
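One possible way to turn such interface information into a fixed-length numeric input for the model, used here purely as an assumption of the sketch, is a simple hashing encoding of the URI, assertion results and input-parameter names:

    import hashlib

    def encode_interface_info(uri, assertions, input_params, dim=32):
        """Hash the interface URI, assertion results and input-parameter names
        into a fixed-length feature vector."""
        vec = [0.0] * dim
        tokens = ["uri:" + uri]
        tokens += ["assert:%s:%s" % (name, passed) for name, passed in assertions.items()]
        tokens += ["param:" + name for name in input_params]
        for token in tokens:
            h = int(hashlib.md5(token.encode("utf-8")).hexdigest(), 16)
            vec[h % dim] += 1.0
        return vec

    # Example with hypothetical values:
    features = encode_interface_info(
        uri="/api/v1/balance/query",
        assertions={"status_code": False, "response_body": True},
        input_params=["account_id", "currency"])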
The aforementioned historical defect data may be obtained from a defect pool, and in some embodiments, if the result output by the machine learning model indicates that a defect exists, new defect data may be generated based on the interface information of the target interface and the corresponding defect type, and the new defect data may be stored in the defect pool. The defect pool may be a container of defect data, which is used for recording problems found in the test process, and in this embodiment, the defect pool provides a learning sample for the machine learning model, and when a result output by the machine learning model for interface information of a target interface indicates that a defect exists, the interface information and a defect type output by the machine learning model may be combined to form new defect data and aggregate the new defect data into the defect pool, so that a closed-loop process is implemented, and data support is provided for optimization of a subsequent machine learning model. It should be noted that, when the machine learning model is trained based on the training samples provided by the defect pool, the noise reduction processing may be performed on the training samples to improve the accuracy of model inference.
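A defect pool can be realized in many ways; as a purely illustrative sketch (the file location and record fields are assumptions), new defect data formed from the interface information and the predicted defect type might be appended as one JSON record per line:

    import json
    import time

    DEFECT_POOL_PATH = "defect_pool.jsonl"   # hypothetical location of the defect pool

    def store_new_defect(interface_info, defect_type):
        """Combine the interface information with the predicted defect type and
        append the new defect data to the defect pool."""
        record = {
            "uri": interface_info["uri"],
            "assertions": interface_info["assertions"],
            "input_params": interface_info["input_params"],
            "defect_type": defect_type,
            "created_at": time.strftime("%Y-%m-%d %H:%M:%S"),
        }
        with open(DEFECT_POOL_PATH, "a", encoding="utf-8") as f:
            f.write(json.dumps(record, ensure_ascii=False) + "\n")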
In some other embodiments, if the result output by the machine learning model indicates that no defect exists, information indicating that the determination result fails is output. That is, when the result output by the machine learning model is inconsistent with the judgment result of the analyzer, information indicating that the judgment result fails may be output, for example, the information is notified to relevant testers in a form of a short message, a mail, or the like, so that the testers can make processing decisions.
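The notification itself can take any form; as one hedged example (the mail server and addresses are placeholders, not values from this application), a mail notification could be sent with the standard library:

    import smtplib
    from email.message import EmailMessage

    def notify_testers(target_interface, detail,
                       mail_host="smtp.example.com",        # hypothetical mail server
                       sender="autotest@example.com",
                       receivers=("tester@example.com",)):
        """Notify testers that the analyzer's judgment was not confirmed."""
        msg = EmailMessage()
        msg["Subject"] = "[AutoTest] Judgment failed for interface " + target_interface
        msg["From"] = sender
        msg["To"] = ", ".join(receivers)
        msg.set_content(detail)
        with smtplib.SMTP(mail_host) as server:
            server.send_message(msg)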
Further, in some embodiments, the method may further include: if the result output by the machine learning model indicates that a defect exists, judging whether the defect is a historically retained defect through a defect tracking tool; if so, modifying the task state of the historically retained defect corresponding to the defect; if not, creating a task for the new defect. That is, when the machine learning model determines that a defect exists, the defect may be tracked. Specifically, the defect tracking tool determines whether the defect is a new defect or a historically retained defect; if it is determined to be a new defect, a task for the new defect is created; if it is determined to be a historically retained defect, the task state of the corresponding historically retained defect is modified. Optionally, if the task state of the historically retained defect is closed, the task state is reset to reopened; if the task state of the historically retained defect is open, the task state is kept unchanged, and a comment and an error information screenshot are added to the historically retained defect, so that relevant personnel can quickly and efficiently repair the defect after viewing the task state, thereby realizing reasonable management of defects. The defect tracking tool may be JIRA. JIRA is a project and issue tracking tool that can help software teams find, record and track defects in software. Of course, in other embodiments, other types of defect tracking tools may be selected according to the requirements of a particular scenario.
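As a sketch of this tracking step, assuming the third-party Python 'jira' package and hypothetical server, credentials, project key and transition name (JIRA transition names vary between instances), the logic described above might look like:

    from jira import JIRA

    def track_defect(summary, description, screenshot_path,
                     server="https://jira.example.com",    # hypothetical JIRA server
                     project_key="TEST"):                   # hypothetical project key
        client = JIRA(server=server, basic_auth=("bot_user", "api_token"))
        # Look for a historically retained defect with a matching summary.
        existing = client.search_issues(
            'project = %s AND summary ~ "%s"' % (project_key, summary), maxResults=1)
        if not existing:
            # New defect: create a task for it.
            client.create_issue(project=project_key, summary=summary,
                                description=description, issuetype={"name": "Bug"})
            return
        issue = existing[0]
        if issue.fields.status.name.lower() == "closed":
            # Closed historically retained defect: reset the state to reopened.
            client.transition_issue(issue, "Reopen")        # transition name is an assumption
        # Otherwise keep the state unchanged; in both cases add a comment
        # and attach the error information screenshot.
        client.add_comment(issue, description)
        client.add_attachment(issue=issue, attachment=screenshot_path)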
According to the method and the device, the execution result of the automatic test case executed by the automatic test platform for the target interface is obtained, the execution result is analyzed, whether the defect exists is judged by combining the execution process log and the user behavior path, and then when the defect exists, the interface information of the target interface is processed by using the trained machine learning model so as to obtain the defect type corresponding to the defect existing in the target interface. Therefore, automatic identification of software defects and automatic analysis of defect types in the interface test are realized, and labor cost and time cost are effectively saved.
To illustrate the solution of the present application in more detail, a specific embodiment is described below:
the embodiment relates to a software testing scene, in the daily testing process, the finding of the Bug is the key point of the testing work, in the related technology, the finding of the Bug and the classification of the Bug depend on manual processing, and more cost needs to be consumed. Based on this, the present embodiment provides an automated testing system based on machine learning to solve this problem.
The work flow of the system of the present embodiment is shown in fig. 2, and includes:
s201, an automatic test case is executed by combining an existing automatic test platform, wherein the platform is mainly used for automatic interface test;
s202, obtaining an execution result of the automatic test platform;
s203, analyzing the execution result through an analyzer, and positioning the defect through the analysis of the execution process log and the user behavior path by combining the information such as the execution report, the assertion, the log and the like;
s204, whether the analyzer locates the defect or not is judged, if yes, S206 is executed, and if not, S205 is executed;
s205, determining that no defect exists, and ending the process;
s206, inputting interface information including the interface URI, the assertion and the interface input parameters into a machine learning engine so as to judge the defect type through the machine learning engine; the machine learning engine comprises a model based on a machine learning algorithm, historical defect data are obtained from a defect pool and serve as training samples when the model is trained, unsupervised learning is carried out after noise reduction, and the algorithm is more accurate through adjustment and evaluation;
s207, judging whether defects exist by the machine learning engine, if so, executing S208, and if not, executing S209;
s208, tracking the defect through a defect tracking tool, and aggregating the interface information and the defect type output by the machine learning engine into the defect pool after forming defect data; the defect tracking tool adopted is JIRA, and whether the defect is a new defect or a historically retained defect is judged through JIRA: if it is judged to be a new defect, the new defect is created in JIRA; if it is judged to be a historically retained defect, the state of the historical Bug is modified, wherein if the state is closed, the state is reset to reopened, and if the state is open, the state is kept unchanged and a comment and an error information screenshot are added to the Bug;
s209, outputting the information of the failure judgment of the analyzer.
In addition, the system of this embodiment can provide a notification mechanism for Bug decisions, and can also provide a dashboard function so that defect changes can be viewed clearly and intuitively.
The system of this embodiment provides autonomous learning from automated test results and autonomous decision processing for defects, has a built-in function of providing notifications to all stakeholders, is highly extensible, and, being easy to integrate, does not need to be changed for any test automation framework or test type. Since all processes are automated and algorithms and programs control quality, many defects can be handled easily, saving the time of relevant personnel.
Corresponding to the embodiments of the foregoing method, the present application also provides embodiments of a software testing device and a terminal applied thereto:
as shown in fig. 3, fig. 3 is a block diagram of a software testing apparatus provided in an embodiment of the present application, where the apparatus includes:
the obtaining module 31 is configured to obtain an execution result of an automated test case executed by an automated test platform, where the automated test case is used to test a target interface;
a judging module 32, configured to analyze the execution result, and judge whether a defect exists based on the analyzed execution result, the execution process log, and the user behavior path;
the determining module 33 is configured to, when the determination result is yes, determine a defect type corresponding to a defect existing in the target interface by inputting the interface information of the target interface into the trained machine learning model; the machine learning model is obtained by training based on historical defect data corresponding to multiple defect types as training samples.
In some embodiments, the interface information includes: the interface URI, assertions, and interface input parameters.
In some embodiments, the historical defect data is obtained from a defect pool; the device further comprises: and the storage module is used for generating new defect data based on the interface information of the target interface and the corresponding defect type and storing the new defect data into the defect pool if the result output by the machine learning model indicates that the defect exists.
In some embodiments, the above apparatus further comprises: and the output module is used for outputting information indicating that the judgment result fails if the result output by the machine learning model indicates that no defect exists.
In some embodiments, the above apparatus further comprises: a tracking module, configured to judge, if the result output by the machine learning model indicates that a defect exists, whether the defect is a historically retained defect through a defect tracking tool, and, if so, modify the task state of the historically retained defect corresponding to the defect; if not, create a task for the new defect.
In some embodiments, the tracking module is specifically configured to: if the task state of the historically retained defect is closed, reset the state to reopened; if the task state of the historically retained defect is open, keep the task state unchanged and add a comment and an error information screenshot to the historically retained defect.
The implementation process of the functions and actions of each module in the above device is specifically described in the implementation process of the corresponding step in the above method, and is not described herein again.
Fig. 4 shows a block diagram of an electronic device according to an embodiment of the present disclosure, where fig. 4 is a block diagram of the electronic device. The electronic device may include a processor 410, a communication interface 420, a memory 430, and at least one communication bus 440. Wherein the communication bus 440 is used to enable direct connection communication of these components. In this embodiment, the communication interface 420 of the electronic device is used for performing signaling or data communication with other node devices. The processor 410 may be an integrated circuit chip having signal processing capabilities.
The Processor 410 may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general-purpose processor may be a microprocessor, or the processor 410 may be any conventional processor or the like.
The Memory 430 may be, but is not limited to, a Random Access Memory (RAM), a Read Only Memory (ROM), a Programmable Read Only Memory (PROM), an Erasable Programmable Read Only Memory (EPROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), and the like. The memory 430 stores computer-readable instructions that, when executed by the processor 410, enable the electronic device to perform the steps involved in the method embodiment of fig. 1 described above.
Optionally, the electronic device may further include a memory controller, an input output unit.
The memory 430, the memory controller, the processor 410, the peripheral interface, and the input/output unit are electrically connected to each other directly or indirectly, so as to implement data transmission or interaction. For example, these components may be electrically connected to each other via one or more communication buses 440. The processor 410 is used to execute executable modules stored in the memory 430, such as software functional modules or computer programs included in the electronic device.
The input/output unit is used to allow a user to create a task and to set an optional time period or a preset execution time for the created task, so as to realize interaction between the user and the server. The input/output unit may be, but is not limited to, a mouse, a keyboard, and the like.
It will be appreciated that the configuration shown in fig. 4 is merely illustrative and that the electronic device may include more or fewer components than shown in fig. 4 or may have a different configuration than shown in fig. 4. The components shown in fig. 4 may be implemented in hardware, software, or a combination thereof.
The embodiment of the present application further provides a storage medium storing instructions which, when run on a computer and executed by a processor, implement the method in the method embodiments; to avoid repetition, details are not repeated here.
The present application also provides a computer program product which, when run on a computer, causes the computer to perform the method of the method embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative and, for example, the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made to the present application by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of additional like elements in a process, method, article, or apparatus that comprises the element.

Claims (10)

1. A software testing method, comprising:
acquiring an execution result of an automated test case executed by an automated test platform, wherein the automated test case is used for testing a target interface;
analyzing the execution result, and judging whether defects exist or not based on the analyzed execution result, the execution process log and the user behavior path;
when the judgment result is yes, inputting the interface information of the target interface into a trained machine learning model, and determining the defect type corresponding to the defect of the target interface; the machine learning model is obtained by training based on historical defect data corresponding to multiple defect types as training samples.
2. The method of claim 1, wherein the interface information comprises: the interface URI, assertions, and interface input parameters.
3. The method of claim 1, wherein the historical defect data is obtained from a defect pool; the method further comprises the following steps:
and if the result output by the machine learning model indicates that defects exist, generating new defect data based on the interface information of the target interface and the corresponding defect type, and storing the new defect data into the defect pool.
4. The method of claim 3, further comprising:
and if the result output by the machine learning model indicates that no defect exists, outputting information indicating that the judgment result fails.
5. The method of claim 1, further comprising:
if the result output by the machine learning model indicates that a defect exists, judging whether the defect is a historically retained defect through a defect tracking tool, and if so, modifying the task state of the historically retained defect corresponding to the defect; if not, creating a task for the new defect.
6. The method of claim 5, wherein modifying the task state of the historically retained defect corresponding to the defect comprises:
if the task state of the historically retained defect is closed, resetting the state to reopened;
if the task state of the historically retained defect is open, keeping the task state unchanged, and adding a comment and an error information screenshot to the historically retained defect.
7. A software testing apparatus, comprising:
the system comprises an acquisition module, a test module and a test module, wherein the acquisition module is used for acquiring an execution result of an automatic test case executed by an automatic test platform, and the automatic test case is used for testing a target interface;
the judging module is used for analyzing the execution result and judging whether defects exist or not based on the analyzed execution result, the execution process log and the user behavior path;
the determining module is used for determining the defect type corresponding to the defect of the target interface by inputting the interface information of the target interface into the trained machine learning model when the judging result is yes; the machine learning model is obtained by training based on historical defect data corresponding to multiple defect types as training samples.
8. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 6.
9. An electronic device comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any of claims 1 to 6 when executing the computer program.
10. A computer program product, characterized in that it, when run on a computer, causes the computer to perform the method according to any one of claims 1 to 6.
CN202211550885.3A 2022-12-05 2022-12-05 Software testing method, device, storage medium and equipment Pending CN115952081A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211550885.3A CN115952081A (en) 2022-12-05 2022-12-05 Software testing method, device, storage medium and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211550885.3A CN115952081A (en) 2022-12-05 2022-12-05 Software testing method, device, storage medium and equipment

Publications (1)

Publication Number Publication Date
CN115952081A true CN115952081A (en) 2023-04-11

Family

ID=87289847

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211550885.3A Pending CN115952081A (en) 2022-12-05 2022-12-05 Software testing method, device, storage medium and equipment

Country Status (1)

Country Link
CN (1) CN115952081A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116932413A (en) * 2023-09-14 2023-10-24 深圳市智慧城市科技发展集团有限公司 Defect processing method, defect processing device and storage medium for test task
CN116932413B (en) * 2023-09-14 2023-12-19 深圳市智慧城市科技发展集团有限公司 Defect processing method, defect processing device and storage medium for test task
CN117149664A (en) * 2023-10-31 2023-12-01 深圳大数信科技术有限公司 Automatic test method based on BPMN and system platform thereof
CN117149664B (en) * 2023-10-31 2024-03-15 深圳大数信科技术有限公司 Automatic test method based on BPMN and system platform thereof

Similar Documents

Publication Publication Date Title
CN107291911B (en) Anomaly detection method and device
US9396094B2 (en) Software test automation systems and methods
CN115952081A (en) Software testing method, device, storage medium and equipment
CN111177714B (en) Abnormal behavior detection method and device, computer equipment and storage medium
CN113010389B (en) Training method, fault prediction method, related device and equipment
CN110088744B (en) Database maintenance method and system
Yang et al. Vuldigger: A just-in-time and cost-aware tool for digging vulnerability-contributing changes
CN114116496A (en) Automatic testing method, device, equipment and medium
JP2020102209A (en) Identification method of defect location on software program
Dhanalaxmi et al. A review on software fault detection and prevention mechanism in software development activities
CN105653455B (en) A kind of detection method and detection system of program bug
CN110765007A (en) Crash information online analysis method for android application
CN111752833B (en) Software quality system approval method, device, server and storage medium
CN113051180A (en) Test task monitoring method, device, equipment and storage medium
US11790249B1 (en) Automatically evaluating application architecture through architecture-as-code
CN110865939B (en) Application program quality monitoring method, device, computer equipment and storage medium
CN111444093B (en) Method and device for determining quality of project development process and computer equipment
CN114138537A (en) Crash information online analysis method for android application
CN113791980A (en) Test case conversion analysis method, device, equipment and storage medium
CN116991149B (en) Method and device for checking fee-controlled product, electronic equipment and storage medium
Lal et al. Intelligent Testing in Software Industry
CN116991746B (en) Method and device for evaluating general quality characteristics of software
CN114880637B (en) Account risk verification method and device, computer equipment and storage medium
JP2019194818A (en) Software trouble prediction device
TWI778634B (en) Method for classifying faults, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination