CN118295930A - Automatic test processing method, device, equipment, storage medium and product - Google Patents

Automatic test processing method, device, equipment, storage medium and product

Info

Publication number
CN118295930A
CN118295930A
Authority
CN
China
Prior art keywords
test
target
test case
information
case
Prior art date
Legal status
Pending
Application number
CN202410565267.9A
Other languages
Chinese (zh)
Inventor
李研
何非
李凯
闵爱佳
谢仁艿
廖芳芳
Current Assignee
China United Network Communications Group Co Ltd
Unicom Digital Technology Co Ltd
China Unicom Internet of Things Corp Ltd
Original Assignee
China United Network Communications Group Co Ltd
Unicom Digital Technology Co Ltd
China Unicom Internet of Things Corp Ltd
Application filed by China United Network Communications Group Co Ltd, Unicom Digital Technology Co Ltd, China Unicom Internet of Things Corp Ltd
Priority to CN202410565267.9A
Publication of CN118295930A

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application provides an automated test processing method, device, equipment, storage medium and product. The method includes: obtaining target product information of a test case to be authenticated; determining a target test case from a preset test case database according to the target product information; determining a target test tool and a target access point from a test tool database according to tool information of the target test case; performing test processing on the target test case according to the target test tool and the target access point to obtain a test case result; and generating and displaying an authentication conclusion of the test case to be authenticated according to the test case result. The method of the application achieves automatic configuration, automatic testing and automatic generation of test authentication results.

Description

Automatic test processing method, device, equipment, storage medium and product
Technical Field
The present application relates to the field of computer technologies, and in particular, to an automated test processing method, apparatus, device, storage medium, and product.
Background
5G modules and their terminal products transmit data over a 5G network and are widely used in various intelligent fields; accordingly, it is essential to test 5G modules and their terminal products.
At present, test scripts for 5G modules and their terminal products already exist, and different test results can be obtained by executing the corresponding scripts for different test contents.
However, the existing testing process for 5G modules and terminals generally relies on tedious manual configuration and on manually performed test certification operations.
Disclosure of Invention
The application provides an automated test processing method, device, equipment, storage medium and product, which are used to achieve automatic configuration, automatic testing and automatic generation of test authentication results.
In a first aspect, an embodiment of the present application provides an automated test processing method, including:
Obtaining target product information of a test case to be authenticated, wherein the target product information comprises at least one of target product name information, target product type information and target product module information;
Determining a target test case from a preset test case database according to the target product information, wherein the test case database comprises product information and test cases having a corresponding relation with the product information;
determining a target test tool and a target access point from a test tool database according to tool information of the target test case;
performing test processing on the target test case according to the target test tool and the target access point to obtain a test case result;
And generating and displaying an authentication conclusion of the test case to be authenticated according to the test case result.
In one possible implementation manner, after determining the target test case from the preset test case database according to the target product information and before performing test processing on the target test case according to the target test tool and the target access point to obtain the test case result, the method further includes:
Determining importance degree information of the target test case, wherein the importance degree information of the target test case is determined according to priority information and necessity information of the target test case;
and determining the test sequence of the target test case according to the importance degree information of the target test case.
In one possible implementation manner, determining the test sequence of the target test case according to the importance degree information of the target test case includes:
Determining global priority division information of a global target test case;
obtaining priority scores according to the global priority dividing information, the priority information and the priority weights of the priority information;
Determining a necessity score according to the necessity information and the necessity weight of the necessity information;
and obtaining the test sequence of the target test case according to the priority score and the necessity score.
In one possible implementation manner, determining the test sequence of the target test case according to the importance degree information of the target test case includes:
acquiring the predicted execution duration and the predicted shortest completion duration of the target test case;
Obtaining a duration evaluation score according to the predicted execution duration and the predicted shortest completion duration of the target test case and the duration evaluation weight;
and determining the test sequence of the target test case according to the time length evaluation score and the importance degree information of the target test case.
In one possible implementation manner, determining the test sequence of the target test case according to the duration evaluation score and the importance degree information of the target test case includes:
obtaining priority scores and necessity scores according to importance degree information of the target test cases;
Summing the time length evaluation score, the priority score and the necessity score to obtain a test case execution score, wherein the test case execution score meets the following conditions:
score=scoreA+scoreB+scoreC,
scoreA=Pweight*(Pmax*(i/n)),
scoreB=Tweight*(Toptimal-Min(etime,600)*(Toptimal/600)),
scoreC=Nweight*Nscore
Wherein score is the test case execution score, scoreA is the priority score, scoreB is the duration evaluation score, scoreC is the necessity score, Pweight is the priority weight, Pmax is the maximum priority score, i is the priority level value, n is the number of priority levels, Tweight is the duration evaluation weight, Toptimal is the predicted shortest completion duration, etime is the predicted execution duration, Nweight is the necessity weight, and Nscore is the necessity level score;
and determining the test sequence of the target test case according to the test case execution score.
In one possible implementation manner, generating and displaying the authentication conclusion of the test case to be authenticated according to the test case result includes:
Determining a must-test target test case among the target test cases and a must-test case result of the must-test target test case according to a preset target test case requirement;
if the must-test case result of the must-test target test case represents that the test passes, obtaining a passing authentication conclusion of the test case to be authenticated;
And displaying the passing authentication conclusion of the test case to be authenticated.
In one possible implementation manner, after determining the must-test target test case among the target test cases and the must-test case result of the must-test target test case according to the preset target test case requirement, the method further includes:
If the must-test case result of the must-test target test case represents that the test is not passed, obtaining a failed authentication conclusion of the test case to be authenticated;
and displaying the failed authentication conclusion of the test case to be authenticated.
In one possible implementation manner, after the must-test case result of the must-test target test case characterizes that the test is not passed and the failed authentication conclusion of the test case to be authenticated is obtained, the method further includes:
Determining the number of times a failed authentication conclusion has been obtained after test processing of the test case to be authenticated;
If the number of times of failing to pass the authentication conclusion is smaller than a preset number of times threshold, re-executing the test processing of the target test case according to the target test tool and the target access point to obtain a test case result;
If the number of times of failed authentication conclusion is greater than a preset number of times threshold, ending the test processing of the test case to be authenticated, and taking the failed authentication conclusion as a global authentication result of the test case to be authenticated.
In a second aspect, an embodiment of the present application provides an automated test processing apparatus, including:
The first acquisition module is used for acquiring target product information of the test case to be authenticated, wherein the target product information comprises at least one of target product name information, target product type information and target product module information;
The first determining module is used for determining a target test case from a preset test case database according to the target product information, wherein the test case database comprises the product information and the test cases with corresponding relations with the product information;
the second determining module is used for determining a target test tool and a target access point from the test tool database according to the tool information of the target test case;
The test processing module is used for carrying out test processing on the target test case according to the target test tool and the target access point to obtain a test case result;
and the authentication conclusion module is used for generating and displaying an authentication conclusion of the test case to be authenticated according to the test case result.
In one possible implementation manner, after determining the target test case from the preset test case database according to the target product information and before performing test processing on the target test case according to the target test tool and the target access point to obtain a test case result, the apparatus further includes:
Determining importance degree information of the target test case, wherein the importance degree information of the target test case is determined according to priority information and necessity information of the target test case;
and determining the test sequence of the target test case according to the importance degree information of the target test case.
In one possible implementation manner, determining the test sequence of the target test case according to the importance degree information of the target test case includes:
Determining global priority division information of a global target test case;
obtaining priority scores according to the global priority dividing information, the priority information and the priority weights of the priority information;
Determining a necessity score according to the necessity information and the necessity weight of the necessity information;
and obtaining the test sequence of the target test case according to the priority score and the necessity score.
In one possible implementation manner, determining the test sequence of the target test case according to the importance degree information of the target test case includes:
acquiring the predicted execution duration and the predicted shortest completion duration of the target test case;
Obtaining a duration evaluation score according to the predicted execution duration and the predicted shortest completion duration of the target test case and the duration evaluation weight;
and determining the test sequence of the target test case according to the time length evaluation score and the importance degree information of the target test case.
In one possible implementation manner, determining the test sequence of the target test case according to the duration evaluation score and the importance degree information of the target test case includes:
obtaining priority scores and necessity scores according to importance degree information of the target test cases;
Summing the time length evaluation score, the priority score and the necessity score to obtain a test case execution score, wherein the test case execution score meets the following conditions:
score=scoreA+scoreB+scoreC,
scoreA=Pweight*(Pmax*(i/n)),
scoreB=Tweight*(Toptimal-Min(etime,600)*(Toptimal/600)),
scoreC=Nweight*Nscore
Wherein score is the test case execution score, scoreA is the priority score, scoreB is the duration evaluation score, scoreC is the necessity score, Pweight is the priority weight, Pmax is the maximum priority score, i is the priority level value, n is the number of priority levels, Tweight is the duration evaluation weight, Toptimal is the predicted shortest completion duration, etime is the predicted execution duration, Nweight is the necessity weight, and Nscore is the necessity level score;
and determining the test sequence of the target test case according to the test case execution score.
In one possible implementation manner, generating and displaying the authentication conclusion of the test case to be authenticated according to the test case result includes:
Determining a must-test target test case among the target test cases and a must-test case result of the must-test target test case according to a preset target test case requirement;
if the must-test case result of the must-test target test case represents that the test passes, obtaining a passing authentication conclusion of the test case to be authenticated;
And displaying the passing authentication conclusion of the test case to be authenticated.
In one possible implementation manner, after determining the must-test target test case among the target test cases and the must-test case result of the must-test target test case according to the preset target test case requirement, the method further includes:
If the must-test case result of the must-test target test case represents that the test is not passed, obtaining a failed authentication conclusion of the test case to be authenticated;
and displaying the failed authentication conclusion of the test case to be authenticated.
In one possible implementation manner, after the must-test case result of the must-test target test case characterizes that the test is not passed, the method further includes:
Determining the number of times a failed authentication conclusion has been obtained after test processing of the test case to be authenticated;
If the number of times of failing to pass the authentication conclusion is smaller than a preset number of times threshold, re-executing the test processing of the target test case according to the target test tool and the target access point to obtain a test case result;
If the number of times of failed authentication conclusion is greater than a preset number of times threshold, ending the test processing of the test case to be authenticated, and taking the failed authentication conclusion as a global authentication result of the test case to be authenticated.
In a third aspect, an embodiment of the present application provides an automated test handling apparatus, including: a memory, a processor;
the memory stores computer-executable instructions;
the processor executes the computer-executable instructions stored in the memory, so that the processor performs the first aspect and/or the various possible implementations of the first aspect described above.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having computer-executable instructions stored therein, which, when executed by a processor, implement the first aspect and/or the various possible implementations of the first aspect described above.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when executed by a processor, implements the first aspect and/or the various possible implementations of the first aspect described above.
According to the automated test processing method, device, equipment, storage medium and product provided by the application, the target product information of the test case to be authenticated is acquired, and a target test case having a corresponding relation with the target product name information, the target product type information or the target product module information can be selected from the preset test case database according to that information. After the target test case is determined, the target test tool and the target access point corresponding to the tool information are determined from the test tool database according to the tool information of the target test case, so that the target test tool can be used automatically to perform test processing on the target test case and obtain a test case result, and the authentication conclusion of the test case to be authenticated is generated without manual participation. The effects of automatic configuration, automatic testing and automatic generation of test authentication results are thereby achieved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic flow chart of an automated test processing method provided by the application;
FIG. 2 is a flow chart of another automated test processing method provided by the present application;
FIG. 3 is a schematic diagram of an automated test handling apparatus according to the present application;
Fig. 4 is a schematic structural diagram of an automated test handling apparatus according to the present application.
Specific embodiments of the present application have been shown by way of the above drawings and will be described in more detail below. The drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but rather to illustrate the inventive concepts to those skilled in the art by reference to the specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are information and data authorized by the user or fully authorized by each party, and the collection, use and processing of the related data need to comply with related laws and regulations and standards, and provide corresponding operation entries for the user to select authorization or rejection.
In the prior art, the testing process of a 5G module and its terminal generally covers test contents such as transmission rate, signal quality and network delay; the environment required by these test contents is configured manually and the test authentication operations are performed manually, so the test work is complex and consumes labor cost.
In the automated test processing method provided by the application, test cases and the corresponding relations between the test cases and product information are stored in the test case database in advance, and the test tools of the test cases and the corresponding relations between the test tools and the tool information of the test cases are stored in the test tool database in advance. After receiving the target product information of the test case to be authenticated, the automated test processing system can therefore automatically acquire the target test case from the test case database according to the corresponding relation between the product information and the test cases, and automatically acquire the target test tool and the target access point from the test tool database, which realizes automatic configuration of the test case to be authenticated before detection; it then automatically completes the test processing of the target test case according to the target test tool and the target access point, and generates and displays the authentication conclusion of the test case to be authenticated according to the test case result. Based on this, the automated test processing method provided by the embodiment of the application solves the problems of complex test work and labor cost caused by manual participation in the test, and achieves the effects of automatic configuration, automatic testing and automatic generation of test authentication results.
The following describes the technical scheme of the present application and how the technical scheme of the present application solves the above technical problems in detail with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
The execution subject of the automated test processing method provided by the embodiment of the application may be a server, and the server may be a mobile phone, a tablet, a computer or another device. The implementation manner of the execution subject is not particularly limited in this embodiment, as long as the execution subject can obtain the target product information of the test case to be authenticated, where the target product information includes at least one of the target product name information, the target product type information and the target product module information; determine a target test case from a preset test case database according to the target product information, where the test case database includes product information and test cases having a corresponding relation with the product information; determine a target test tool and a target access point from a test tool database according to tool information of the target test case; perform test processing on the target test case according to the target test tool and the target access point to obtain a test case result; and generate and display an authentication conclusion of the test case to be authenticated according to the test case result.
First, the terms involved in the present application will be explained:
Automated testing refers to the use of software tools or scripts to execute test cases and verify the behavior of a software system in place of a manual testing process. Automated testing can improve test efficiency, accuracy and coverage, and reduce human error and test cost.
In the field of computer programming, a script generally refers to a series of instructions or a sequence of commands used to automate the execution of a particular task. Scripts are written in lightweight programming languages and are commonly used to simplify and automate repetitive tasks. Scripts are typically stored as text files and executed by an interpreter or an interpreted programming language.
Fig. 1 is a schematic flow chart of an automated test processing method provided in the present application, where an execution subject of the method may be an automated test processing system, as shown in fig. 1, and the method includes:
S101, obtaining target product information of a test case to be authenticated, wherein the target product information comprises at least one of target product name information, target product type information and target product module information.
A test case may refer to a case for testing the target product and generally includes a test case number, test contents and a test result. The test case to be authenticated may refer to the test case corresponding to the target product information, and whether it can be authenticated is judged according to the test conclusion of the test case.
The product information can indicate information for identifying the product and parameter information of the product, and in the application, the product information can comprise a product name, a product type, a product CPU model and a product module model, and the target product information can be the product information of the product to be tested.
The product name information may refer to a product name, and the target product name information may refer to the product name of the product to be tested. The product type information may refer to the type of the product; a worker may divide product types in advance according to the use scene of the product, the technology used by the product, and the position of the product in the market. The target product type information may refer to the product type information of the product to be tested, for example, a 4G module product or a 5G module product. The product module information may refer to the module type of the product, for example, the module type of a 5G module may be a 5G-a-b-c module type; the worker may set product module types in advance according to the planning date of the module and the technology used by the module. The target product module information may refer to the product module information of the product to be tested.
In the embodiment of the application, the automated test processing system can display an input interface to the user and obtain the test application information of the test case to be authenticated through user input or transmission. The automated test processing system can then obtain the target product information from the target position in the test application information by means of character recognition, label recognition or the like.
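As an illustration only, the following is a minimal Python sketch of this extraction step, assuming the test application information arrives as labelled text fields; the field labels and the parse_product_info helper are hypothetical and merely show how label recognition could map the application text to target product information.

    import re

    def parse_product_info(application_text: str) -> dict:
        """Extract target product information from labelled application text.

        The labels below (Product Name / Product Type / Product Module) are
        assumed for illustration; a real application form may use other labels
        or require character recognition on an uploaded document.
        """
        fields = {
            "target_product_name": r"Product Name\s*[:：]\s*(.+)",
            "target_product_type": r"Product Type\s*[:：]\s*(.+)",
            "target_product_module": r"Product Module\s*[:：]\s*(.+)",
        }
        info = {}
        for key, pattern in fields.items():
            match = re.search(pattern, application_text)
            if match:
                info[key] = match.group(1).strip()
        return info

    # Example usage with a hypothetical application form
    text = "Product Name: M100\nProduct Type: 5G module product\nProduct Module: 5G-a-b-c"
    print(parse_product_info(text))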
S102, determining a target test case from a preset test case database according to the target product information, wherein the test case database comprises the product information and the test cases with corresponding relations with the product information.
The test case database may refer to a database including test case contents. In the present application, the test case database may be a test case table, where each entry in the test case table is a test case preset by a worker according to the actual test scenario and test contents.
A test case may refer to a guide text for executing a test, and one test case to be authenticated may correspond to a plurality of test cases. A test case may include a test case number, a product type, a product CPU (central processing unit) model, a product module model and a test case state. The target test cases may refer to the test cases of the product to be tested: the test cases in the test case database that are consistent with the target product information are screened out as the target test cases.
For example, the corresponding relation between the product information and the test cases may be a relation between a product identification code and a case identification code recorded in the test case table in the test case database. After the target product information is obtained, the product identification code corresponding to the target product name information, the target product type information and the target product module information in the target product information can be determined; the test case table in the test case database is then searched according to the product identification code to determine the corresponding case identification code, and finally the corresponding target test case is determined from the test case database according to the case identification code.
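As an illustration only, the lookup described above can be sketched as a simple filter over a test case table; the in-memory table, field names and identification codes below are assumptions and do not correspond to any actual table of the application.

    # Hypothetical test case table: product identification code -> test cases
    TEST_CASE_TABLE = [
        {"product_id": "P-5G-001", "case_id": "C-101", "case": "band 8 rate test"},
        {"product_id": "P-5G-001", "case_id": "C-102", "case": "network delay test"},
        {"product_id": "P-4G-002", "case_id": "C-201", "case": "band 5 rate test"},
    ]

    def find_target_test_cases(product_id: str) -> list:
        """Return the test cases whose product identification code matches."""
        return [row for row in TEST_CASE_TABLE if row["product_id"] == product_id]

    print(find_target_test_cases("P-5G-001"))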
S103, determining a target test tool and a target access point from a test tool database according to tool information of the target test case.
The test tool database may refer to a database including test tool information. In the present application, the test tool database may be a test tool table, and the test tool table may include a test case number and tool information. The tool information may refer to information about the tool used to execute a test case; in the present application, the tool information may include a tool number, a tool name, a tool version, a tool path, a tool IP and a tool port. The target test tool may refer to the test tool corresponding to the test case number of the target test case.
An access point may refer to an interface for connecting an external component or implementing a specific function, and in the present application, an access point may refer to an access point where a target test tool port can be successfully connected to a network to perform a test, or may refer to an access point where a test case needs to be successfully connected to the network to perform a test.
For example, the corresponding relation between the test case number information and the tool information may be a relation between a test case identification code and a tool identification code recorded in the test tool table in the test tool database. After the test case number information of the target test case is obtained, the test case identification code corresponding to the test case number information can be determined; the test tool table in the test tool database is then searched according to the test case identification code to determine the corresponding tool identification code, and finally the target test tool and the target access point are determined from the test tool database according to the tool identification code.
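The tool lookup can be sketched in the same way; the test tool table layout, the tool fields and the access point value below are likewise assumptions used only for illustration.

    # Hypothetical test tool table keyed by test case identification code
    TEST_TOOL_TABLE = {
        "C-101": {"tool_id": "T-01", "tool_name": "rate_tester", "tool_version": "1.2",
                  "tool_path": "/opt/tools/rate_tester", "tool_ip": "10.0.0.5",
                  "tool_port": 8080, "access_point": "apn.test.example"},
    }

    def find_target_tool(case_id: str) -> dict:
        """Return the target test tool and target access point for a test case."""
        return TEST_TOOL_TABLE.get(case_id, {})

    tool = find_target_tool("C-101")
    print(tool.get("tool_name"), tool.get("access_point"))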
In the application, after determining the target test case from the preset test case database according to the target product information and before performing test processing on the target test case according to the target test tool and the target access point to obtain the test case result, the method further comprises:
Determining importance degree information of the target test case, wherein the importance degree information of the target test case is determined according to priority information and necessity information of the target test case;
and determining the test sequence of the target test case according to the importance degree information of the target test case.
The importance information may refer to priority information and necessity information, among others.
In the present application, the priority information includes a priority level for evaluating the global test cases.
In the present application, the priority information may refer to the execution priority of each test case in the test case to be authenticated, and the execution priority information may characterize the execution order of the test cases. For example, if test case B needs to be executed before test case A can be executed, the execution priority of test case B is higher than that of test case A. In the embodiment of the application, the execution priority of the test cases can be determined according to the execution order among the test cases. For example, suppose the test case to be authenticated contains 10 test cases: the 7 test cases that have an execution precedence relationship can be assigned priorities 7 to 1 in turn according to their execution order, and the remaining 3 test cases, which have no execution precedence relationship, can all be assigned priority 7.
The necessity information may refer to the execution necessity of each test case in the test case to be authenticated under a target dimension. For example, when the target dimension is the connection frequency band of the test case: if the test content of test case C is whether the rate of band 8 communication reaches the expected threshold during the test communication process, the test content of test case D is whether the rate of band 5 communication reaches the expected threshold, band 8 is currently the most mainstream frequency band and band 5 is the band with the least coverage, then the necessity of test case C is set greater than the necessity of test case D; the score of test case C may be set to 2, and the score of test case D may be set to 1. Therefore, when there are multiple frequency bands, the execution necessity of the test cases can be determined according to the mainstream degree of the different bands, and the necessity scores can be determined in turn according to the ordering of execution necessity, where the mainstream degrees of the different bands can be set according to the requirements of the user.
In some embodiments, when there are multiple target dimensions, the weight of each target dimension and the necessity score of each target dimension may be determined according to the setting of the user, then the product is obtained according to the weight and the necessity score of the target dimension, and then the product results are summed, so as to determine the comprehensive score of the test case, and finally the necessity information of each test case is determined according to the comprehensive score of the test case.
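A minimal sketch of this multi-dimension necessity calculation is given below; the dimension names reuse the frequency band example above, while the weights are assumed user settings.

    def composite_necessity_score(dimension_scores: dict, dimension_weights: dict) -> float:
        """Weighted sum of per-dimension necessity scores for one test case."""
        return sum(dimension_weights[d] * s for d, s in dimension_scores.items())

    # Assumed user-defined dimension weights and per-dimension necessity scores
    weights = {"frequency_band": 0.7, "use_scene": 0.3}
    case_c = {"frequency_band": 2, "use_scene": 1}   # the band 8 test above
    case_d = {"frequency_band": 1, "use_scene": 1}   # the band 5 test above
    print(round(composite_necessity_score(case_c, weights), 2))   # 1.7
    print(round(composite_necessity_score(case_d, weights), 2))   # 1.0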
The test sequence may refer to the order in which the test cases are executed during the test process. In the present application, the test sequence may be obtained by sorting the test cases in descending order of a score that is obtained according to the importance information of the test cases.
In the application, according to the importance degree information of the target test case, the test sequence of the target test case is determined, which comprises the following steps:
Determining global priority division information of a global target test case;
obtaining priority scores according to the global priority dividing information, the priority information and the priority weights of the priority information;
Determining a necessity score according to the necessity information and the necessity weight of the necessity information;
and obtaining the test sequence of the target test case according to the priority score and the necessity score.
The global priority classification information may refer to information for performing overall priority classification on the test case, and in the present application, the global priority classification information may refer to a priority classification range level, and a worker may classify the priority of the test case into 1 level to 10 levels when designing the test case.
The priority score may refer to a score calculated from a priority weight, priority level, global priority scoping level, where weight generally refers to a numerical value used in a certain system or model to represent the relative importance or influence of different factors or variables. The priority weight may refer to a duty cycle weight of the priority score in the test case execution score.
The necessity score may refer to a score calculated according to a necessity weight and a necessity level, wherein the necessity weight may refer to a duty weight of the necessity score in the test case execution score.
In the application, a worker can customize the priority weight and the necessity weight according to the actual situation. Considering the relative influence of priority and necessity on the test cases, and the fact that when a test case with a high necessity level fails the corresponding product performance needs to be improved more urgently, the necessity weight can be set larger than the priority weight; when the execution order of the test cases considers only necessity and priority, the ratio of the necessity weight to the priority weight can be 5:2.
For example, given that the priority weight is 0.2, the necessity weight is 0.5, the maximum priority score is 10, the priority level value is 2, the number of priority levels is 5, and the necessity level score is 8, the test case execution score is calculated as 0.2×10×(2/5)+0.5×8=4.8, so the test case execution score of this test case is 4.8. This value is then compared with the test case execution scores of the other target test cases; the greater the value, the earlier the test case is ranked and the earlier it is executed.
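The calculation in this example can be sketched as follows, using the stated weights and scores (a priority term plus a necessity term); the helper names are chosen for illustration.

    def priority_score(p_weight: float, p_max: float, i: int, n: int) -> float:
        """scoreA = Pweight * (Pmax * (i / n))"""
        return p_weight * (p_max * (i / n))

    def necessity_term(n_weight: float, n_level_score: float) -> float:
        """scoreC = Nweight * Nscore"""
        return n_weight * n_level_score

    score = priority_score(0.2, 10, 2, 5) + necessity_term(0.5, 8)
    print(round(score, 2))  # 4.8, matching the example above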
In the application, according to the importance degree information of the target test case, the test sequence of the target test case is determined, which comprises the following steps:
acquiring the predicted execution duration and the predicted shortest completion duration of the target test case;
Obtaining a duration evaluation score according to the predicted execution duration and the predicted shortest completion duration of the target test case and the duration evaluation weight;
and determining the test sequence of the target test case according to the time length evaluation score and the importance degree information of the target test case.
In the application, a worker can preset the expected execution time of the test case in designing the test case, the expected execution time can be in seconds, and when the time spent for configuring the test tool is not considered, the worker can preset the expected shortest completion time, the expected shortest completion time can be in seconds, and the expected shortest completion time is less than or equal to the expected execution time.
The duration evaluation score may refer to a score calculated according to the predicted execution duration and the predicted shortest completion duration, where the duration evaluation weight may refer to the proportion of the duration evaluation score in the execution score of the test case. In the present application, a worker may customize the duration evaluation weight according to the actual situation. Considering that, during test execution, the execution time of the previous test case affects the start time of the next test case, the longer the execution time of a test case, the longer the waiting time of the test cases still to be tested and the slower the progress of obtaining test results. When the execution order of the test cases considers the predicted execution duration, the necessity and the priority, the ratio of the duration evaluation weight, the necessity weight and the priority weight may be 3:5:2.
In the application, according to the time length evaluation score and the importance degree information of the target test case, the test sequence of the target test case is determined, which comprises the following steps:
obtaining priority scores and necessity scores according to importance degree information of the target test cases;
Summing the time length evaluation score, the priority score and the necessity score to obtain a test case execution score, wherein the test case execution score meets the following conditions:
score=scoreA+scoreB+scoreC,
scoreA=Pweight*(Pmax*(i/n)),
scoreB=Tweight*(Toptimal-Min(etime,600)*(Toptimal/600)),
scoreC=Nweight*Nscore
Wherein score is the test case execution score, scoreA is the priority score, scoreB is the duration evaluation score, scoreC is the necessity score, Pweight is the priority weight, Pmax is the maximum priority score (a value preset by the worker, for example 10 points), i is the priority level value, n is the number of priority levels, Tweight is the duration evaluation weight, Toptimal is the predicted shortest completion duration, etime is the predicted execution duration, Nweight is the necessity weight, and Nscore is the necessity level score;
and determining the test sequence of the target test case according to the test case execution score.
The test case execution score may refer to the score obtained by adding the duration evaluation score, the priority score and the necessity score. For example, given a priority weight of 0.2, a duration evaluation weight of 0.3, a necessity weight of 0.5, a maximum priority score of 10, a priority level value of 2, 5 priority levels, a predicted shortest completion duration of 10 seconds, a predicted execution duration of 60 seconds, and a necessity level score of 8, the test case execution score is calculated to be 6.
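The following sketch is a direct transcription of the three score terms and of the descending sort that yields the test sequence; the weight settings and the per-case values are example assumptions, not data of the application.

    def execution_score(p_weight, p_max, i, n,
                        t_weight, t_optimal, e_time,
                        n_weight, n_level_score):
        """score = scoreA + scoreB + scoreC, transcribed from the formulas above."""
        score_a = p_weight * (p_max * (i / n))
        score_b = t_weight * (t_optimal - min(e_time, 600) * (t_optimal / 600))
        score_c = n_weight * n_level_score
        return score_a + score_b + score_c

    # Hypothetical target test cases: (i, n, Toptimal, etime, Nscore)
    cases = {
        "C-101": (2, 5, 10, 60, 8),
        "C-102": (5, 5, 30, 120, 5),
    }
    P_WEIGHT, P_MAX, T_WEIGHT, N_WEIGHT = 0.2, 10, 0.3, 0.5
    order = sorted(
        cases,
        key=lambda c: execution_score(P_WEIGHT, P_MAX, cases[c][0], cases[c][1],
                                      T_WEIGHT, cases[c][2], cases[c][3],
                                      N_WEIGHT, cases[c][4]),
        reverse=True,
    )
    print(order)  # test sequence in descending execution score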
S104, testing the target test case according to the target testing tool and the target access point to obtain a test case result.
The test processing may refer to executing the test contents of the test cases by using the test tool; in the present application, a test case result may be obtained after each test case is executed, and the test case result may be pass or fail.
The test tool may be a script file written in Java or Python to implement file operations, data processing and network communication, and has a log output function for recording information during execution and obtaining the execution result.
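A minimal sketch of such a script-style test tool is given below, assuming Python; the case identifier, the simulated rate check and the log file path are illustrative assumptions and do not reproduce any particular tool of the application.

    import logging

    logging.basicConfig(filename="test_tool.log", level=logging.INFO,
                        format="%(asctime)s %(levelname)s %(message)s")

    def run_test_case(case_id: str, access_point: str) -> str:
        """Execute one test case and return 'pass' or 'fail', logging the process."""
        logging.info("start case %s via access point %s", case_id, access_point)
        # Placeholder for the real file operations, data processing and network communication
        measured_rate_mbps = 130      # assumed measurement result
        expected_threshold = 100      # assumed expected rate threshold
        result = "pass" if measured_rate_mbps >= expected_threshold else "fail"
        logging.info("case %s finished with result %s", case_id, result)
        return result

    print(run_test_case("C-101", "apn.test.example"))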
S105, generating and displaying an authentication conclusion of the test case to be authenticated according to the test case result.
The authentication conclusion may refer to an authentication conclusion for judging whether the product passes the test, and in the application, the authentication conclusion may be a pass or fail of the test, and the authentication conclusion of the test case to be authenticated may refer to an authentication conclusion of whether the product corresponding to the test case to be authenticated passes the test.
According to the test case result, the authentication conclusion of the test case to be authenticated is generated and displayed, and the authentication conclusion comprises the following steps:
Determining a must-test target test case among the target test cases and a must-test case result of the must-test target test case according to a preset target test case requirement;
if the must-test case result of the must-test target test case represents that the test passes, obtaining a passing authentication conclusion of the test case to be authenticated;
And displaying the passing authentication conclusion of the test case to be authenticated.
The test case requirement may refer to a test case requirement preset by the staff; the test case requirement includes must-test information, and the must-test information includes a must-test label. The staff designs the test cases according to the degree of influence of the test contents on the product and presets the test case requirement: the larger the influence of the test contents on the product, the more likely the test case is to be given a must-test label. The target test case requirement may refer to the test case requirement corresponding to the target test case.
The must-test target test case may refer to a test case with a must-test label among the target test cases, and the must-test case result of the must-test target test case may refer to the test case result corresponding to the must-test target test case.
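As an illustration, the passing conclusion can be derived from the must-test case results as sketched below; the 'pass'/'fail' strings and the case identifiers are assumptions carried over from the sketches above.

    def authentication_conclusion(case_results: dict, must_test_case_ids: set) -> str:
        """Pass only if every must-test target test case has a passing result."""
        must_results = [r for c, r in case_results.items() if c in must_test_case_ids]
        return "pass" if all(r == "pass" for r in must_results) else "fail"

    results = {"C-101": "pass", "C-102": "fail"}          # C-102 is assumed not must-test
    print(authentication_conclusion(results, {"C-101"}))  # pass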
In the application, after determining the must-test target test case among the target test cases and the must-test case result of the must-test target test case according to the preset target test case requirement, the method further comprises:
If the must-test case result of the must-test target test case represents that the test is not passed, obtaining a failed authentication conclusion of the test case to be authenticated;
and displaying the failed authentication conclusion of the test case to be authenticated.
In the application, after the must-test case result of the must-test target test case characterizes that the test is not passed and the failed authentication conclusion of the test case to be authenticated is obtained, the method further comprises:
Determining the number of times a failed authentication conclusion has been obtained after test processing of the test case to be authenticated;
If the number of times of failing to pass the authentication conclusion is smaller than a preset number of times threshold, re-executing the test processing of the target test case according to the target test tool and the target access point to obtain a test case result;
If the number of times of failed authentication conclusion is greater than a preset number of times threshold, ending the test processing of the test case to be authenticated, and taking the failed authentication conclusion as a global authentication result of the test case to be authenticated.
The number of times of the failed authentication conclusion may refer to the number of times a failed authentication conclusion has been obtained after all the test cases in the test case to be authenticated have been tested.
The preset times threshold may refer to a number of failed authentication conclusions preset by the staff. For example, if the preset times threshold is 2: the target test case is tested for the first time according to the target test tool and the target access point to obtain a test case result, and a failed authentication conclusion is obtained, so the number of failed authentication conclusions is 1; since this is smaller than the preset times threshold, the step of testing the target test case according to the target test tool and the target access point is re-executed to obtain a test case result; if a failed authentication conclusion is obtained again, the number of failed authentication conclusions is 2, and the test processing of the test case to be authenticated is ended at this time. Alternatively, still with a preset times threshold of 2: after the first failed authentication conclusion the number of failed authentication conclusions is 1, which is smaller than the preset times threshold, so the test processing is re-executed; when a failed authentication conclusion is obtained again the number of failed authentication conclusions is 2 and the test processing is re-executed once more; and when a failed authentication conclusion is obtained yet again, the number of failed authentication conclusions is 3 and the test processing of the test case to be authenticated is ended.
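One possible reading of this retry logic is sketched below: the test processing is re-executed while the count of failed conclusions stays below the preset times threshold, after which the failed conclusion is taken as the global authentication result; the run_all_tests callable and the default threshold are assumptions.

    def authenticate_with_retries(run_all_tests, times_threshold: int = 2) -> str:
        """Repeat the test processing until it passes or the failure count reaches the threshold."""
        failed_count = 0
        while True:
            conclusion = run_all_tests()      # returns "pass" or "fail"
            if conclusion == "pass":
                return "pass"
            failed_count += 1
            if failed_count < times_threshold:
                continue                      # re-execute the test processing
            return "fail"                     # taken as the global authentication result

    # Example with a stub that always fails
    print(authenticate_with_retries(lambda: "fail"))  # fail, after reaching the threshold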
According to the automated test processing method provided by the embodiment of the application, the target test case is determined by acquiring the target product information, the target test tool and the target access point are determined, the test case is processed, and the authentication conclusion is generated; the method further includes determining the importance information of the test cases and determining the test sequence. Finally, the authentication conclusion is determined according to the test case results, showing that the test passed or failed, and the test is re-executed or ended as appropriate. This reduces the time cost of manual testing and configuration, takes the importance and priority of the test content into account, improves the effectiveness of the test processing, and achieves the effect of automatically generating the test authentication result.
Fig. 2 is a schematic flow chart of another automated test processing method provided by the present application, as shown in fig. 2, the method includes:
S201, automatically testing application.
Acquiring the basic product information and the test requirements carried in the automated test application according to the automated test application submitted by the user.
S202, automatic test configuration.
Initializing automatic test configuration basic data, and updating test configuration so as to make the network environment connection of the test normal.
S203, automatic test processing.
Calculating the score of each test case according to its priority, predicted execution duration, necessity and the corresponding weights, sorting the test cases by score, and calling the test tool to execute the test cases to obtain the test results.
S204, automatic test authentication.
Statistically analyzing the test case results according to data such as the test requirements, the test case results and the test case configuration, determining the test conclusion, and generating the authentication certificate.
Fig. 3 is a schematic structural diagram of an automated test processing apparatus according to the present application, and as shown in fig. 3, an automated test processing apparatus 30 according to the present embodiment includes:
The first obtaining module 301 is configured to obtain target product information of a test case to be authenticated, where the target product information includes at least one of target product name information, target product type information, and target product module information;
The first determining module 302 is configured to determine, according to the target product information, a target test case from a preset test case database, where the test case database includes product information and test cases having a corresponding relationship with the product information;
a second determining module 303, configured to determine a target test tool and a target access point from the test tool database according to the tool information of the target test case;
The test processing module 304 is configured to perform test processing on a target test case according to the target test tool and the target access point, so as to obtain a test case result;
And the authentication conclusion module 305 is used for generating and displaying an authentication conclusion of the test case to be authenticated according to the test case result.
In one possible implementation manner, after determining the target test case from the preset test case database according to the target product information and before performing test processing on the target test case according to the target test tool to obtain a test case result, the apparatus further includes:
Determining importance degree information of the target test case, wherein the importance degree information of the target test case is determined according to priority information and necessity information of the target test case;
and determining the test sequence of the target test case according to the importance degree information of the target test case.
In one possible implementation manner, determining the test sequence of the target test case according to the importance degree information of the target test case includes:
Determining global priority division information of a global target test case;
obtaining priority scores according to the global priority dividing information, the priority information and the priority weights of the priority information;
Determining a necessity score according to the necessity information and the necessity weight of the necessity information;
and obtaining the test sequence of the target test case according to the priority score and the necessity score.
In one possible implementation manner, determining the test sequence of the target test case according to the importance degree information of the target test case includes:
acquiring the predicted execution duration and the predicted shortest completion duration of the target test case;
Obtaining a duration evaluation score according to the predicted execution duration and the predicted shortest completion duration of the target test case and the duration evaluation weight;
and determining the test sequence of the target test case according to the time length evaluation score and the importance degree information of the target test case.
In one possible implementation manner, determining the test sequence of the target test case according to the duration evaluation score and the importance degree information of the target test case includes:
obtaining priority scores and necessity scores according to importance degree information of the target test cases;
Summing the time length evaluation score, the priority score and the necessity score to obtain a test case execution score, wherein the test case execution score meets the following conditions:
score=scoreA+scoreB+scoreC,
scoreA=Pweight*(Pmax*(i/n)),
scoreB=Tweight*(Toptimal-Min(etime,600)*(Toptimal/600)),
scoreC=Nweight*Nscore
Wherein score is the test case execution score, scoreA is the priority score, scoreB is the duration evaluation score, scoreC is the necessity score, Pweight is the priority weight, Pmax is the maximum priority score, i is the priority level value, n is the number of priority levels, Tweight is the duration evaluation weight, Toptimal is the predicted shortest completion duration, etime is the predicted execution duration, Nweight is the necessity weight, and Nscore is the necessity level score;
and determining the test sequence of the target test case according to the test case execution score.
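The scoring rule above can be transcribed directly into code. The sketch below assumes the constant 600 is a cap on the execution duration (in the same time unit as e_time and T_optimal) and that cases with higher scores are tested first; neither assumption is stated explicitly in the text.

```python
# Direct transcription of score = scoreA + scoreB + scoreC; parameter values are assumptions.
def execution_score(i, n, e_time, p_weight, p_max, t_weight, t_optimal, n_weight, n_score):
    """Combined execution score used to order the target test cases."""
    score_a = p_weight * (p_max * (i / n))                                   # priority term
    score_b = t_weight * (t_optimal - min(e_time, 600) * (t_optimal / 600))  # duration term
    score_c = n_weight * n_score                                             # necessity term
    return score_a + score_b + score_c


def order_cases(cases):
    # each case is assumed to be a dict carrying exactly the parameters above;
    # higher score -> earlier in the test sequence
    return sorted(cases, key=lambda case: execution_score(**case), reverse=True)
```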
In one possible implementation manner, generating and displaying the authentication conclusion of the test case to be authenticated according to the test case result includes:
Determining a test-necessary target test case in the target test cases and a test-necessary case result of the test-necessary target test case according to a preset target test case requirement;
if the test-necessary case result of the test-necessary target test case characterizes that the test passes, obtaining a passing authentication conclusion of the test case to be authenticated;
And displaying the passing authentication conclusion of the test case to be authenticated.
In one possible implementation manner, after determining the test-necessary target test case in the target test cases and the test-necessary case result of the test-necessary target test case according to the preset target test case requirement, the method further includes:
If the test-necessary case result of the test-necessary target test case characterizes that the test does not pass, obtaining a failed authentication conclusion of the test case to be authenticated;
and displaying the failed authentication conclusion of the test case to be authenticated.
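A minimal sketch of this pass/fail aggregation over the test-necessary (must-test) cases follows; the result representation and field names are assumptions:

```python
# Sketch only: results maps case id -> True/False, required_ids is the set of
# test-necessary case ids; both representations are assumptions.
def authentication_conclusion(results: dict, required_ids: set) -> str:
    """Pass only if every test-necessary case passed."""
    required = {cid: passed for cid, passed in results.items() if cid in required_ids}
    # note: with no test-necessary cases this trivially passes, which a real
    # implementation would probably treat as a configuration error
    return "authentication passed" if all(required.values()) else "authentication failed"
```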
In one possible implementation manner, after obtaining the failed authentication conclusion of the test case to be authenticated when the test-necessary case result of the test-necessary target test case characterizes that the test does not pass, the method further includes:
Determining the number of times that a failed authentication conclusion has been obtained after test processing of the test case to be authenticated;
If the number of times of the failed authentication conclusion is smaller than a preset number of times threshold, re-executing the test processing on the target test case according to the target test tool and the target access point to obtain a test case result;
If the number of times of failed authentication conclusion is greater than a preset number of times threshold, ending the test processing of the test case to be authenticated, and taking the failed authentication conclusion as a global authentication result of the test case to be authenticated.
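The retry behavior described here could look roughly like the sketch below; the threshold value, the callables, and the handling of the exact boundary case (a failure count equal to the threshold) are assumptions:

```python
# Sketch only: run_tests and conclude stand in for the test processing and
# conclusion steps above; max_failures is an assumed threshold.
def test_with_retries(run_tests, conclude, max_failures=3) -> str:
    """Re-run the target test cases until they pass or the failure threshold
    is reached; the last failed conclusion becomes the global result."""
    failures = 0
    while True:
        results = run_tests()
        conclusion = conclude(results)
        if conclusion == "authentication passed":
            return conclusion
        failures += 1
        if failures >= max_failures:
            # threshold reached: end testing and keep the failed conclusion
            # as the global authentication result
            return "authentication failed"
```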
The automated test processing apparatus provided in this embodiment may perform the method provided in the foregoing method embodiment, and its implementation principle and technical effects are similar, which is not described herein in detail.
Fig. 4 is a schematic structural diagram of an electronic device according to the present application. As shown in Fig. 4, the electronic device 40 provided in this embodiment includes: at least one processor 401 and a memory 402. Optionally, the electronic device 40 further includes a communication component 403. The processor 401, the memory 402, and the communication component 403 are connected by a bus 404.
In a specific implementation, at least one processor 401 executes computer-executable instructions stored in a memory 402, so that the at least one processor 401 performs the above-described method.
The specific implementation process of the processor 401 may refer to the above-mentioned method embodiment, and its implementation principle and technical effects are similar, and this embodiment will not be described herein again.
In the above embodiment, it should be understood that the processor may be a central processing unit (English: Central Processing Unit, abbreviated as CPU), or may be another general-purpose processor, a digital signal processor (English: Digital Signal Processor, abbreviated as DSP), an application-specific integrated circuit (English: Application Specific Integrated Circuit, abbreviated as ASIC), or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the present invention may be directly executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
The memory may include a high-speed random access memory (Random Access Memory, RAM), and may further include a non-volatile memory (Non-Volatile Memory, NVM), such as at least one magnetic disk memory.
The bus may be an industry standard architecture (Industry Standard Architecture, ISA) bus, a peripheral component interconnect (Peripheral Component Interconnect, PCI) bus, an extended industry standard architecture (Extended Industry Standard Architecture, EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, the bus in the drawings of the present application is not limited to only one bus or one type of bus.
The application also provides a computer program product comprising a computer program which, when executed by a processor, implements the method described above.
The application also provides a computer readable storage medium, wherein computer execution instructions are stored in the computer readable storage medium, and when a processor executes the computer execution instructions, the method is realized.
The above-described readable storage medium may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk. A readable storage medium can be any available medium that can be accessed by a general purpose or special purpose computer.
An exemplary readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. In the alternative, the readable storage medium may be integral to the processor. The processor and the readable storage medium may reside in an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC). The processor and the readable storage medium may also reside as discrete components in a device.
The division of units is merely a logical function division, and there may be another division manner in actual implementation, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes: a USB flash disk, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk, an optical disk, or various other media capable of storing program code.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the method embodiments described above may be performed by hardware associated with program instructions. The foregoing program may be stored in a computer readable storage medium. The program, when executed, performs steps including the method embodiments described above; and the aforementioned storage medium includes: various media that can store program code, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention that follow the general principles thereof and include such departures from the present disclosure as come within known or customary practice in the art. The specification and examples are to be considered as exemplary only, with the true scope of the invention being indicated by the appended claims.

Claims (12)

1. An automated test processing method, comprising:
Obtaining target product information of a test case to be authenticated, wherein the target product information at least comprises one of target product name information, target product type information and target product module information;
determining a target test case from a preset test case database according to the target product information, wherein the test case database comprises product information and test cases with corresponding relations with the product information;
Determining a target test tool and a target access point from a test tool database according to the tool information of the target test case;
performing test processing on the target test case according to the target test tool and the target access point to obtain a test case result;
And generating and displaying the authentication conclusion of the test case to be authenticated according to the test case result.
2. The method of claim 1, wherein after determining the target test case from the preset test case database according to the target product information and before performing test processing on the target test case according to the target test tool and the target access point to obtain the test case result, the method further comprises:
Determining importance degree information of the target test case, wherein the importance degree information of the target test case is determined according to priority information and necessity information of the target test case;
And determining the test sequence of the target test case according to the importance degree information of the target test case.
3. The method according to claim 2, wherein determining the test sequence of the target test case according to the importance degree information of the target test case comprises:
Determining global priority division information of a global target test case;
Obtaining a priority score according to the global priority division information, the priority information and the priority weight of the priority information;
Determining a necessity score according to the necessity information and the necessity weight of the necessity information;
and obtaining the test sequence of the target test case according to the priority score and the necessity score.
4. The method according to claim 2, wherein determining the test sequence of the target test case according to the importance degree information of the target test case comprises:
Acquiring the predicted execution duration and the predicted shortest completion duration of the target test case;
Obtaining a duration evaluation score according to the predicted execution duration and the predicted shortest completion duration of the target test case, and the duration evaluation weight;
And determining the test sequence of the target test case according to the duration evaluation score and the importance degree information of the target test case.
5. The method of claim 4, wherein determining the test sequence of the target test case according to the duration evaluation score and the importance degree information of the target test case comprises:
Obtaining a priority score and a necessity score according to the importance degree information of the target test case;
Summing the duration evaluation score, the priority score, and the necessity score to obtain a test case execution score, wherein the test case execution score satisfies:
score = scoreA + scoreB + scoreC,
scoreA = P_weight*(P_max*(i/n)),
scoreB = T_weight*(T_optimal - Min(e_time, 600)*(T_optimal/600)),
scoreC = N_weight*N_score,
wherein score is the test case execution score, scoreA is the priority score, scoreB is the duration evaluation score, scoreC is the necessity score, P_weight is the priority weight, P_max is the priority maximum score, i is the priority rank value, n is the number of priority ranks, T_weight is the duration evaluation weight, T_optimal is the predicted shortest completion duration, e_time is the predicted execution duration, N_weight is the necessity weight, and N_score is the necessity rank score;
and determining the test sequence of the target test case according to the test case execution score.
6. The method of claim 1, wherein generating and displaying the authentication conclusion of the test case to be authenticated according to the test case result comprises:
determining a test-necessary target test case in the target test cases and test-necessary case results of the test-necessary target test case according to preset target test case requirements;
If the test-necessary case result of the test-necessary target test case characterizes that the test passes, obtaining a passing authentication conclusion of the test case to be authenticated;
and displaying the passing authentication conclusion of the test case to be authenticated.
7. The method of claim 6, wherein after determining the test-necessary target test case in the target test cases and the test-necessary case result of the test-necessary target test case according to the preset target test case requirement, the method further comprises:
If the test-necessary case result of the test-necessary target test case characterizes that the test does not pass, obtaining a failed authentication conclusion of the test case to be authenticated;
and displaying the failed authentication conclusion of the test case to be authenticated.
8. The method of claim 7, wherein after obtaining the failed authentication conclusion of the test case to be authenticated if the test-necessary case result of the test-necessary target test case characterizes that the test does not pass, the method further comprises:
determining the number of times that a failed authentication conclusion has been obtained after the test processing of the test case to be authenticated;
If the number of times of the failed authentication conclusion is smaller than a preset number of times threshold, re-executing the test processing of the target test case according to the target test tool and the target access point to obtain a test case result;
and if the times of the failed authentication conclusion is greater than a preset times threshold, ending the test processing of the test case to be authenticated, and taking the failed authentication conclusion as a global authentication result of the test case to be authenticated.
9. An automated test handling apparatus, comprising:
the first acquisition module is used for acquiring target product information of a test case to be authenticated, wherein the target product information at least comprises one of target product name information, target product type information and target product module information;
The first determining module is used for determining a target test case from a preset test case database according to the target product information, wherein the test case database comprises product information and test cases with corresponding relations with the product information;
The second determining module is used for determining a target test tool and a target access point from a test tool database according to the tool information of the target test case;
The test processing module is used for carrying out test processing on the target test case according to the target test tool and the target access point to obtain a test case result;
and the authentication conclusion module is used for generating and displaying an authentication conclusion of the test case to be authenticated according to the test case result.
10. An electronic device, comprising: a memory, a processor;
The memory stores computer-executable instructions;
The processor executes the computer-executable instructions stored in the memory, causing the processor to perform the method of any one of claims 1-8.
11. A computer readable storage medium having stored therein computer executable instructions which when executed by a processor are adapted to carry out the method of any one of claims 1-8.
12. A computer program product comprising a computer program which, when executed by a processor, implements the method of any of claims 1-8.
Priority Application

CN202410565267.9A, priority date 2024-05-08, filing date 2024-05-08: Automatic test processing method, device, equipment, storage medium and product (Pending)

Publication

CN118295930A, published 2024-07-05

Family

ID=91686361

Country Status

CN: CN118295930A (en)

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination