CN115237757A - Case testing method and device - Google Patents

Case testing method and device

Info

Publication number
CN115237757A
Authority
CN
China
Prior art keywords
case
test
priority
prediction model
execution information
Prior art date: 2022-06-29
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210761845.7A
Other languages
Chinese (zh)
Inventor
史培宁
董桂官
李婧欣
陈仁伟
曹策
成曦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Electronics Standardization Institute
Original Assignee
China Electronics Standardization Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 2022-06-29
Filing date: 2022-06-29
Publication date: 2022-10-25
Application filed by China Electronics Standardization Institute filed Critical China Electronics Standardization Institute
Priority to CN202210761845.7A
Publication of CN115237757A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3692: Test management for test results analysis
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46: Multiprogramming arrangements
    • G06F 9/50: Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F 9/5005: Allocation of resources to service a request
    • G06F 9/5027: Allocation of resources to service a request, the resource being a machine, e.g. CPUs, servers, terminals
    • G06F 9/5038: Allocation of resources considering the execution order of a plurality of tasks, e.g. taking priority or time dependency constraints into consideration
    • G06F 2209/00: Indexing scheme relating to G06F9/00
    • G06F 2209/50: Indexing scheme relating to G06F9/50
    • G06F 2209/5021: Priority

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Test And Diagnosis Of Digital Computers (AREA)

Abstract

The invention provides a case testing method and a case testing device, wherein the method comprises the following steps: acquiring the execution information of each test case of the current test and the preset priority of each test case; inputting the execution information of a test case into a pre-trained prediction model according to the preset priority, and adjusting the priority and the execution information of the subsequent test cases to be input according to the prediction result output by the prediction model; and, according to the adjusted priority, sequentially inputting the adjusted execution information of the subsequent test cases into the pre-trained prediction model, again adjusting the priority and the execution information of the subsequent test cases to be input according to the prediction result output by the prediction model, and so on, until all the test cases have been tested. The invention dynamically arranges the test priority during the test run, improves testing efficiency, and greatly helps with classifying and tracing the causes of test failures.

Description

Case testing method and device
Technical Field
The invention relates to the field of automatic testing, in particular to a case testing method and device.
Background
Automated testing of test cases typically requires an experienced test designer to define them: a sequence of test cases (a test case is also called a case) with preset priorities is generated and then run in that order. The priority and/or order of the test cases is typically fixed before the test run begins and remains unchanged throughout the run. When a certain test case fails, the many test cases related to it still have to be run in sequence, even though their results are no longer meaningful. In particular, when the number of test cases is large and individual key cases have problems, the test trend cannot be judged and the testing of many related cases is affected. In addition, if only the number of test failures is counted, the real causes are easily overlooked, or important problems are buried in a large number of failed cases.
Disclosure of Invention
The invention provides a case testing method and device.
In a first aspect, the present invention provides a case testing method, including: acquiring execution information of each test case of the test and a preset priority of each test case; inputting the execution information of the test case into a pre-trained prediction model according to a preset priority, and adjusting the priority and the execution information of the subsequent test case to be input according to a prediction result output by the prediction model; and according to the adjusted priority, sequentially inputting the execution information of the subsequent test cases to be input after adjustment into a pre-trained prediction model, and adjusting the priority and the execution information of the subsequent test cases to be input again according to the prediction result output by the prediction model, and so on until all the test cases are tested.
Further, the method further comprises: the prediction model outputs a prediction result for a test case and an evaluation corresponding to the prediction result, the evaluation including a confidence and a confidence ranking.
Further, the inputting the execution information of the test case into the pre-trained prediction model according to the preset priority includes: and inputting the execution information of the test case into at least one pre-trained prediction model according to a preset priority, wherein the at least one prediction model is arranged in at least one corresponding device, each device tests the test case according to a preset automatic test protocol, and the automatic test protocol is used for ensuring that the test rule of each device on the test case is consistent.
Further, the execution information comprises parameters to be input into the prediction model and association relations with other test cases; and the inputting the execution information of the test case into a pre-trained prediction model according to the preset priority, and adjusting the priority and the execution information of the subsequent test case to be input according to the prediction result output by the prediction model, comprises: inputting the parameters of the test case into a pre-trained prediction model according to the preset priority, and adjusting the priority and the association relation of the subsequent test case to be input according to the prediction result output by the prediction model and the association relation with other test cases.
Further, the inputting the parameters of the test case into a pre-trained prediction model according to a preset priority, and adjusting the priority and the association relation of the subsequent test case to be input according to the prediction result output by the prediction model and the association relation with other test cases, comprises: inputting the parameters of the test case into a pre-trained prediction model, adjusting the association relation of the subsequent test case to be input according to the prediction result output by the prediction model, and determining, according to the adjusted association relation, whether the subsequent test case to be input is to continue to be tested; if so, adjusting the priority of the subsequent test case to be input according to the prediction result.
Further, the association relation comprises a pre-case list and a post-case list, the pre/post-case list comprising the pre/post test cases of the test case; and the adjusting the association relation of the subsequent test cases to be input according to the prediction result output by the prediction model comprises: adjusting the weight of each test case in the pre/post-case list of the subsequent test case to be input according to the prediction result output by the prediction model, so as to complete the adjustment of the association relation according to the weights.
In a second aspect, the present invention further provides a case testing apparatus, including: the first processing module is used for acquiring the execution information of each test case of the test and the preset priority of each test case; the second processing module is used for inputting the execution information of the test case into a pre-trained prediction model according to the preset priority, and adjusting the priority and the execution information of the subsequent test case to be input according to the prediction result output by the prediction model; and according to the adjusted priority, sequentially inputting the execution information of the adjusted subsequent test cases to be input into the pre-trained prediction model, and adjusting the priority and the execution information of the subsequent test cases to be input again according to the prediction result output by the prediction model, and so on until all the test cases are tested.
In a third aspect, the present invention further provides an electronic device, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor executes the computer program to perform the steps of any of the case testing methods described above.
In a fourth aspect, the present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the case testing method as described in any of the above.
In a fifth aspect, the present invention also provides a computer program product comprising a computer program which, when executed by a processor, performs the steps of the case testing method as described in any one of the above.
According to the case testing method and device provided by the invention, the execution information of each testing case of the current test and the preset priority of each testing case are obtained; inputting the execution information of the test case into a pre-trained prediction model according to a preset priority, and adjusting the priority and the execution information of the subsequent test case to be input according to a prediction result output by the prediction model; and according to the adjusted priority, sequentially inputting the execution information of the subsequent test cases to be input after adjustment into a pre-trained prediction model, and adjusting the priority and the execution information of the subsequent test cases to be input again according to the prediction result output by the prediction model, and so on until all the test cases are tested. The test priority is dynamically arranged in the test running process, and the test efficiency is improved. The method provides great help for classifying and tracing the reasons of test failure.
Drawings
In order to more clearly illustrate the technical solutions of the present invention or the prior art, the drawings needed for the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
Fig. 1 is a schematic flow diagram of some embodiments of a case testing method provided in accordance with the present invention;
FIG. 2 is a schematic diagram of a base class of a test case;
FIG. 3 is a schematic diagram of the state after the pre-cases have been processed in the first traversal and the post-case lists have been sorted;
FIG. 4 is a schematic diagram of the post-cases after all traversals are completed, building on FIG. 3;
FIG. 5 is a schematic block diagram of some embodiments of a case testing apparatus provided in accordance with the present invention;
fig. 6 is a schematic structural diagram of an electronic device provided according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments of the invention may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present invention are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It should be noted that the modifiers "a," "an," and "the" in the present invention are intended to be illustrative rather than limiting, and those skilled in the art will understand them as meaning "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present invention are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
Referring to fig. 1, fig. 1 is a schematic flow chart of some embodiments of a case testing method according to the present invention. As shown in fig. 1, the method comprises the steps of:
step 101, obtaining the execution information of each test case of the test and the preset priority of each test case.
In the automated testing process of the test cases, each test case (a test case is also called a case) has corresponding execution information and a priority. As an example, in practical applications, each test case may inherit from a base class in which the execution information and priority are defined. In general, inheritance begins with defining a base class that holds all the attributes and methods common to the derived classes; a derived class inherits the required attributes and methods from the base class and adds new attributes and methods of its own. As shown in FIG. 2, the base class includes the properties, methods, and interfaces of a test case. The execution information and priority of each test case can therefore be read by accessing its base class.
The preset priority can be dynamically adjusted according to the relations between cases and the actual test situation, and is recorded in an attribute of the base class.
Each case of the automated test is implemented following the base class shown in FIG. 2. In addition, a corresponding test execution interface may be implemented for different cases, or a general test execution interface may be implemented, for example a RESTful request-based test. In some embodiments, an automation tool calls the test execution interface and then calls the result determination method.
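As an illustrative aid only, a minimal Python sketch of such a base class is given below. The class and member names (TestCaseBase, parameters, pre_cases, post_cases, execute, judge_result) are assumptions made for this sketch and are not taken from the original disclosure.

from abc import ABC, abstractmethod

class TestCaseBase(ABC):
    """Hypothetical base class: every concrete case inherits execution
    information, a priority, and pre/post case lists from it."""

    def __init__(self, case_id, priority, parameters=None,
                 pre_cases=None, post_cases=None):
        self.case_id = case_id              # unique case identifier
        self.priority = priority            # preset priority, adjustable at run time
        self.parameters = parameters or {}  # execution information (e.g. json parameters)
        self.pre_cases = pre_cases or []    # pre-case list (cases that must run before this one)
        self.post_cases = post_cases or []  # post-case list (cases that depend on this one)
        self.skipped = False                # set to True when marked "abnormal, not tested"

    @abstractmethod
    def execute(self, device):
        """Test execution interface, e.g. a RESTful request sent to the device."""

    @abstractmethod
    def judge_result(self, response):
        """Result determination method: return True for pass, False for failure."""

A concrete case would subclass this base class, implement the two abstract methods, and the automation tool would read its priority and parameters attributes as described above.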
Step 102, inputting the execution information of the test case into a pre-trained prediction model according to a preset priority, and adjusting the priority and the execution information of the subsequent test case to be input according to a prediction result output by the prediction model; and according to the adjusted priority, sequentially inputting the execution information of the subsequent test cases to be input after adjustment into a pre-trained prediction model, and adjusting the priority and the execution information of the subsequent test cases to be input again according to the prediction result output by the prediction model, and so on until all the test cases are tested.
In some embodiments, the execution information of test case A is input into the pre-trained prediction model according to the preset priority, and the priority and execution information of the test cases to be input subsequently are adjusted according to the prediction result output by the prediction model, where the execution information of a subsequent test case may be partially or completely the same as that of test case A. As an example, suppose there are 7 cases A, B, C, D, E, F, G in total, with priority order ABCDEFG, where case A has the same execution information as case B, and case B's execution information is partially the same as that of case G. Case A is input into the model according to the priority, and the prediction result is failure. Because the execution information of cases B and G is related to case A, case B is not tested at all on the basis of case A's result, case G is moved to the end of the queue, and the execution information of case G that overlaps is marked as not to be executed; the adjusted priority is: CDEFG. The test priority is thus dynamically arranged during the test run. Case C is then input into the model, and so on, until all the test cases have been tested. It can be seen that the cause of the test failure can be traced back to case A from the execution information; meanwhile, case B is not tested, which improves testing efficiency.
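The following Python sketch illustrates this dynamic scheduling loop under stated assumptions only: the model is assumed to expose a predict(parameters) method returning "pass" or "fail", and the helpers identical_info and partial_overlap are hypothetical stand-ins for the comparison of execution information; none of these names come from the original disclosure.

def identical_info(a, b):
    # hypothetical check: execution information is exactly the same
    return a.parameters == b.parameters

def partial_overlap(a, b):
    # hypothetical check: some, but not all, execution parameters are shared
    shared = set(a.parameters.items()) & set(b.parameters.items())
    return bool(shared) and not identical_info(a, b)

def dynamic_test_run(cases, model):
    """cases: list of case objects (e.g. TestCaseBase above) ordered by preset priority.
    model: pre-trained prediction model with an assumed predict(parameters) method."""
    queue = list(cases)
    results = {}
    while queue:
        case = queue.pop(0)
        result = model.predict(case.parameters)
        results[case.case_id] = result
        if result == "fail":
            # cases with identical execution information are not tested at all
            queue = [c for c in queue if not identical_info(c, case)]
            # partially overlapping cases are moved to the end of the queue,
            # and the overlapping execution information is marked as not to be executed
            demoted = [c for c in queue if partial_overlap(c, case)]
            for c in demoted:
                queue.remove(c)
                c.skip_keys = set(c.parameters) & set(case.parameters)
            queue.extend(demoted)
    return results

Running this sketch on the ABCDEFG example, a "fail" prediction for case A removes case B from the queue and moves the partially overlapping case G to the end, leaving the order CDEFG.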
The invention does not limit the network structure and the training mode of the prediction model.
According to the case testing method disclosed by some embodiments of the invention, the execution information of each testing case of the test and the preset priority of each testing case are obtained; inputting the execution information of the test case into a pre-trained prediction model according to a preset priority, and adjusting the priority and the execution information of the subsequent test case to be input according to a prediction result output by the prediction model; and according to the adjusted priority, sequentially inputting the execution information of the subsequent test cases to be input after adjustment into a pre-trained prediction model, and adjusting the priority and the execution information of the subsequent test cases to be input again according to the prediction result output by the prediction model, and so on until all the test cases are tested. The test priority is dynamically arranged in the test running process, and the test efficiency is improved. The method provides great help for classifying and tracing the reason of the test failure.
In some optional implementations, the method further comprises: the prediction model outputs a prediction result for the test case and an evaluation corresponding to the prediction result, the evaluation including a confidence level and a confidence level ranking.
By way of example, in the course of automated testing, the prediction results include: the response is evaluated as successful; the response succeeds after retransmission upon timeout; a returned error code is checked; and a result-value check expression is configured.
As an example, the IDs and test results for the test cases with the top 10 confidence ranks may be displayed as desired.
As an example, the prediction result includes an association with one or more test cases.
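As a small illustrative sketch of the confidence ranking mentioned above (the dictionary keys and the helper name below are assumptions, not part of the original description):

def top_confident_predictions(predictions, k=10):
    """predictions: list of dicts such as
    {"case_id": "B", "result": "fail", "confidence": 0.92}.
    Returns the case IDs and results of the k most confident predictions."""
    ranked = sorted(predictions, key=lambda p: p["confidence"], reverse=True)
    return [(p["case_id"], p["result"]) for p in ranked[:k]]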
In some optional implementations, inputting the execution information of the test case into the pre-trained prediction model according to a preset priority includes: and inputting the execution information of the test case into at least one pre-trained prediction model according to a preset priority, wherein the at least one prediction model is arranged in the corresponding at least one device, each device tests the test case according to a preset automatic test protocol, and the automatic test protocol is used for ensuring that the test rule of each device on the test case is consistent.
In order to ensure testing efficiency, a plurality of computers (i.e. devices) can be arranged to test the cases. The test cases are input into the multiple devices for testing according to their priorities, the test priorities of the remaining cases are dynamically adjusted according to the test results of each device, the overall result is continuously summarized and predicted, and an automated test protocol is used so that the operating rules of the processors of the computers are consistent.
When the cases are tested, multiple cases are tested in one run. Before each test starts, if a new case has been added or the original case relations have changed, the preprocessing needs to be executed once; if no new functions have been added and only a regression test is performed, reprocessing is generally not required.
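Purely as a sketch of how cases might be dispatched to several devices by priority (Device.run_case, model.collect, and the threading scheme below are assumptions, not the disclosed protocol):

import queue
import threading

def run_on_devices(cases, devices, model):
    """Distribute test cases over several devices; every device is assumed to
    apply the same test rules via a shared automated test protocol."""
    work = queue.PriorityQueue()
    for case in cases:
        # tuple ordering: priority first, case_id as a tie-breaker
        work.put((case.priority, case.case_id, case))

    def worker(device):
        while True:
            try:
                _, _, case = work.get_nowait()
            except queue.Empty:
                return
            outcome = device.run_case(case)        # hypothetical per-device execution call
            model.collect(case.case_id, outcome)   # results are summarized for overall prediction

    threads = [threading.Thread(target=worker, args=(d,)) for d in devices]
    for t in threads:
        t.start()
    for t in threads:
        t.join()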
In some alternative implementations, the execution information includes parameters of the input predictive model and associations with other test cases; inputting the execution information of the test case into a pre-trained prediction model according to a preset priority, and adjusting the priority and the execution information of the subsequent test case to be input according to a prediction result output by the prediction model, wherein the method comprises the following steps: and inputting parameters of the test cases into a pre-trained prediction model according to a preset priority, and adjusting the priority and the association relation of the subsequent test cases to be input according to the prediction result output by the prediction model and the association relation with other test cases.
As an example, the parameters of the input prediction model may be in json format, with case B having the corresponding json format parameters:
{"sn": "55041067211M6385", "model": "AH-001", "devType": "06F"}
where sn, model, and devType denote the attributes of the parameters, and "55041067211M6385", "AH-001", and "06F" denote the parameter values to be input.
Associations with other test cases may be recorded in the attributes of the test case.
As an example, suppose there are 7 cases A, B, C, D, E, F, G in total, with priority order ABCDEFG; after case A is tested, the priority is adjusted to BCDEFG. The parameters "55041067211M6385", "AH-001" and "06F" of case B are input into the model, the prediction result output by the prediction model is test failure, and the association between case B and the other test cases is: case B has the same execution information as case F, and case B's execution information is partially the same as that of case G. Because case B is predicted to fail and its execution information is the same as that of case F, case F is not tested, case G is moved to the last place, and the execution information of case G that overlaps with case B is marked as not to be executed (i.e. the association relation of the subsequent test cases to be input is adjusted); the adjusted priority is: CDEG.
In some optional implementation manners, the inputting the parameters of the test case into a pre-trained prediction model according to a preset priority, and adjusting the priority and the association relation of the subsequent test case to be input according to the prediction result output by the prediction model and the association relation with other test cases, includes: inputting the parameters of the test case into a pre-trained prediction model, adjusting the association relation of the subsequent test case to be input according to the prediction result output by the prediction model, and determining, according to the adjusted association relation, whether the subsequent test case to be input continues to be tested; if yes, adjusting the priority of the subsequent test case to be input according to the prediction result.
Still taking the above example, if case B's prediction result is failure and B's execution information is the same as that of case F, then case F is not tested, and cases C, D, E and G need their priorities and their associations with other cases adjusted.
In some optional implementations, the association relation includes a pre-case list and a post-case list, and the pre/post-case list includes the pre/post test cases of the test case; and adjusting the association relation of the subsequent test cases to be input according to the prediction result output by the prediction model includes: adjusting the weight of each test case in the pre/post-case list of the subsequent test case to be input according to the prediction result output by the prediction model, so as to complete the adjustment of the association relation according to the weights.
As shown in FIG. 2, after each case inherits the base class, the attributes in the base class include a pre-case list and a post-case list (also called the pre-list and post-list); before any case is tested, the pre-case list and post-case list need to be set in advance. The base class of every case provides the pre-case list and post-case list in list form, and from the configured pre/post case relations, a network of pre/post relations among the cases can be formed quickly through two rounds of sorting and traversal, following the procedure below. According to the relations between the test cases, through the preset base class of the cases and the execution-result determination method, the invention automatically adjusts the priorities of the test cases by program during the automated test and predicts and judges the overall result, so that regression can be carried out quickly and the analysis and localization of a large number of automated test case results can be assisted.
The processing procedure is as follows (a code sketch follows the list):
1. Initialize a list containing all test cases as the overall pre-case list;
2. Sort the relation list by pre-case. The relation list can be matched automatically from test inputs and results; in some application scenarios, because automatic matching has difficulty identifying the configured device and the tests on a single device instance, manual confirmation may be more accurate;
3. Take the corresponding pre-case from the sorted result. (The sorted result changes when a new case is added or when the case relations change; it is only used during processing to speed up the generation of the relation lists and does not affect subsequent work. Its significance is that the pre/post relations of the cases can be completed with a single traversal of all cases.)
4. Add all corresponding post-cases to the post-case list of the pre-case's class, and delete those post-cases from the overall pre-case list;
5. Process all relation lists (i.e. the pre-case lists and post-case lists) in turn; FIG. 3 shows the state after the pre-cases have been processed in the first traversal and the post-case lists have been sorted;
6. Similarly, initialize a list containing all test cases as the overall post-case list, as shown in FIG. 4;
7. Sort the relation list by post-case;
8. Likewise, take the corresponding post-case from the sorted result, delete the corresponding pre-cases from the overall post-case list, and put all the pre-cases into the pre-case list of that case's class, processing them in turn.
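A minimal Python sketch of this two-pass construction is shown below; the data layout (a dict of case objects and a list of (pre, post) pairs) is an assumption made for the sketch.

def build_relation_network(cases, relations):
    """cases: dict mapping case_id -> case object with pre_cases / post_cases lists.
    relations: list of (pre_id, post_id) pairs, e.g. [("B", "C"), ("B", "D")]."""
    # First traversal: sort by pre-case, fill each pre-case's post-case list
    overall_pre = set(cases)                    # overall pre-case list
    for pre_id, post_id in sorted(relations):
        cases[pre_id].post_cases.append(post_id)
        overall_pre.discard(post_id)            # post-cases leave the overall pre-case list

    # Second traversal: sort by post-case, fill each post-case's pre-case list
    overall_post = set(cases)                   # overall post-case list
    for pre_id, post_id in sorted(relations, key=lambda r: r[1]):
        cases[post_id].pre_cases.append(pre_id)
        overall_post.discard(pre_id)            # pre-cases leave the overall post-case list

    return overall_pre, overall_post

For example, with the relations BC, BD, BE, CE and CF (used in embodiment one below), this produces post-lists C, D, E for B and E, F for C, and the corresponding pre-lists, matching FIG. 3 and FIG. 4.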
Embodiment one:
In a real automated case testing scenario, there are often thousands or even tens of thousands of cases. For convenience of description, this example uses 7 cases in total, ABCDEFG, with 5 pre/post relations: BC, BD, BE, CE and CF. Sorting according to the above steps, CDEF need to be deleted from the overall pre-case list, the corresponding post-cases added to the post-list of B, and EF added to the post-list of C, as shown in FIG. 3. Likewise, sorting by post-case, the relations are BC, BD, BE, CE and CF: B and C are deleted from the overall post-case list, B is added to the pre-lists of C, D and E, and C is added to the pre-lists of E and F, as shown in FIG. 4. On the basis of FIG. 3, case B appears on the left, which means case B is a pre-case of case C, and cases E and F appear on the right, which means E and F are post-tasks of C. The pre/post list of each case is processed in turn.
Embodiment two:
Further, during automated case testing, when a pre-task fails or is abnormal, the affected post-cases need to be identified, or their priorities adjusted, and all the post-cases affected by the pre-task need to be calculated and evaluated. The basic evaluation model can work from the relevance of the case inputs and the number of all corresponding post-cases. The specific steps are as follows:
1) Start from the overall post-case list; since the cases in this list have no subsequent cases, their weights can all be set to 1;
2) Process the cases in turn from the pre-case lists corresponding to the configured cases. If a case has not yet been calculated and evaluated, check whether all the cases in its post-list have been processed; if they have, start the evaluation and calculation;
3) Evaluate and calculate according to the number of cases and the json similarity of each case's test input;
4) Repeat steps 2) and 3) until all cases have been processed.
Taking embodiment one as an example, when calculating the weight of each case, assume that the pre/post relations are strong associations; the weight calculation starts from the overall post-case list (a code sketch follows this example):
(1) At the start of the calculation, A, D, E, F and G have no post-tasks, so their weights are set to 1;
(2) Once the weights of C's post-tasks have been calculated, the weight of C is E + F + C, which evaluates to 3;
(3) After C has been calculated, the post-task weights C, D and E of B are available, so the weight of B is C + D + E + B = 3 + 1 + 1 + 1 = 6;
(4) Under this simplified weight model, case B should be tested first, then case C, and the other cases are tested according to their pre/post relations; the priority setting is thus completed.
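A short Python sketch of this simplified weight model under strong association (the recursive helper below is an assumption made for illustration):

def compute_weights(cases):
    """cases: dict case_id -> case object with a post_cases list.
    Weight of a case = 1 (itself) + sum of the weights of its post-cases."""
    weights = {}

    def weight(cid):
        if cid not in weights:
            weights[cid] = 1 + sum(weight(p) for p in cases[cid].post_cases)
        return weights[cid]

    for cid in cases:
        weight(cid)
    return weights

With the post-lists of embodiment one (B: C, D, E; C: E, F), this yields a weight of 3 for C, 6 for B and 1 for the remaining cases, matching the calculation above.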
In some embodiments, it is further required to perform weighting calculation according to the json (i.e. parameters in json format) input for each test case, for example, the json corresponding to B is:
(json parameters of case B, given as an image in the original document)
and the corresponding json for C is:
(json parameters of case C, given as an image in the original document)
json for case D is:
{"devId":"8ee7c194-7ce1-4968-812e-1aa1ab2d868c","errcode":0}
For example, in the above embodiment, when attributes such as ID, sn and mac are consistent, the association degree is set to 1, and other conditions are assigned successively lower values. In addition, the association degree can be set according to different scenarios; for example, the device ID and the status flag may be given the highest association degree. The values and thresholds are only a reference and may be changed according to the implementation scenario, or varied with different step thresholds. According to the association relations, only the association degree of cases with a pre/post relation is processed; unrelated cases do not need to be processed. (A sketch of such a scoring follows.)
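As an illustration only, one possible way to score the association degree from the json parameters (the scoring rule below is an assumption, not the disclosed evaluation model) is:

def association_degree(params_a, params_b):
    """Score two cases' json parameters: 1.0 when all shared key attributes
    (e.g. devId, sn, mac) match, lower values for partial matches, 0.0 when
    nothing is shared."""
    shared_keys = set(params_a) & set(params_b)
    if not shared_keys:
        return 0.0
    matched = sum(1 for k in shared_keys if params_a[k] == params_b[k])
    return matched / len(shared_keys)

For instance, a case sharing case D's devId but returning a different errcode would score 0.5 under this sketch.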
Before the automated test starts, the preprocessing of the pre/post relations and case relevance is finished, and the case relevance does not change during the whole test process. The pre/post relations are preset. When one type of test result is of interest, for example a test case failure, one advantage of these embodiments is that the prediction model searches the case association relations, uses the model rules to find the most suitable trend from the attributes of the test case results, predicts the cases that may be affected, and rearranges the priorities of the test cases.
Throughout the test process, the pre/post relations of the cases are not changed; only the priority is changed, or a case is directly marked as untested. For example, if a pre-case opens a device and the corresponding post-case closes it, then when the pre-case fails its test, the related post-case cannot be tested, i.e. it is marked as untested. This can be determined from the relevance of the input json.
Embodiment three:
During the test process, priority ranking is carried out according to the weights given by the prediction model. When a test case fails, the relevance is evaluated to decide whether the related cases are left untested or have their priorities adjusted, so as to carry out dynamic scheduling. Using the case relations of embodiment one: for example, if case B and case C are strongly associated, case B must be tested first according to the evaluated prediction result; if its test fails, case C is set to abnormal skipping, and cases E and F are further set to abnormal skipping as well. If case B is only weakly associated with case C, then after case B fails its test, case C has the highest priority according to the prediction and evaluation model and should be tested first.
Additionally, during an automated case test run, one or more test case results associated with the run may be received serially or in parallel.
According to the running results of the cases and the prediction model, the priorities of the untested cases are adjusted, or they are directly marked as abnormal and untested, where (a sketch of these rules follows the list):
1. Cases with many overall post-tasks and high relevance are given the highest priority.
2. When a pre-task fails or is abnormal, all highly related post-tasks are set to abnormal and are not tested.
3. In parallel multi-task automated testing, cases sharing the same pre-task, and the post-tasks of that same pre-task, are not run in parallel; that is, parallel testing is organized according to the overall pre-task list and the post-task lists of the task cases.
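A sketch of rule 2 and the strong/weak handling of embodiment three is given below, reusing the association_degree helper sketched earlier; the threshold value, the skipped flag and the priority convention (lower number = higher priority) are assumptions made for this illustration.

def reschedule_on_failure(failed_case, cases, threshold=0.8):
    """failed_case: the pre-task whose test failed.
    cases: dict case_id -> case object with parameters, priority, skipped and
    post_cases attributes (as in the base-class sketch above)."""
    for post_id in failed_case.post_cases:
        post = cases[post_id]
        degree = association_degree(failed_case.parameters, post.parameters)
        if degree >= threshold:
            # strong association: mark abnormal and skip, propagating to its post-tasks
            post.skipped = True
            for pid in post.post_cases:
                cases[pid].skipped = True
        else:
            # weak association: only re-prioritize; here it is promoted to the front
            post.priority = min(c.priority for c in cases.values()) - 1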
In summary, compared with the prior art, the beneficial technical effects of the invention are as follows: based on real-time automated test results and the prediction model, the method helps find faults faster by optimizing the ordering of the test cases, reduces the large number of similar cases that would otherwise be retested because of the same error, shortens the time a test engineer waits for feedback from a test run, collects test failure information as early as possible, and merges and summarizes the real test problems according to case relevance, so that developers can diagnose faults quickly.
Referring to fig. 5, fig. 5 is a schematic structural diagram of some embodiments of a case testing apparatus according to the present invention, as an implementation of the methods shown in the above figures, the present invention further provides some embodiments of a case testing apparatus, which correspond to the embodiments of the methods shown in fig. 1, and which can be applied to various electronic devices.
As shown in fig. 5, the case testing apparatus of some embodiments includes a first processing module 501, a second processing module 502: the first processing module 501 is configured to obtain execution information of each test case of the test and a preset priority of each test case; the second processing module 502 is configured to input execution information of the test case into a pre-trained prediction model according to a preset priority, and adjust the priority and the execution information of the subsequent test case to be input according to a prediction result output by the prediction model; and according to the adjusted priority, sequentially inputting the execution information of the adjusted subsequent test cases to be input into the pre-trained prediction model, and adjusting the priority and the execution information of the subsequent test cases to be input again according to the prediction result output by the prediction model, and so on until all the test cases are tested.
In an optional implementation manner of some embodiments, the apparatus further includes a third processing module, configured to: the prediction model outputs a prediction result for the test case and an evaluation of the corresponding prediction result, the evaluation including a confidence and a confidence ranking.
In an optional implementation manner of some embodiments, the second processing module 502 is further configured to: and inputting the execution information of the test case into at least one pre-trained prediction model according to a preset priority, wherein the at least one prediction model is arranged in at least one corresponding device, each device tests the test case according to a preset automatic test protocol, and the automatic test protocol is used for ensuring that the test rule of each device on the test case is consistent.
In an optional implementation of some embodiments, the execution information includes parameters of the input prediction model and associations with other test cases; and a second processing module 502, further configured to: and inputting parameters of the test cases into a pre-trained prediction model according to a preset priority, and adjusting the priority and the association relation of the subsequent test cases to be input according to the prediction result output by the prediction model and the association relation with other test cases.
In an optional implementation manner of some embodiments, the second processing module 502 is further configured to: input the parameters of the test case into a pre-trained prediction model, adjust the association relation of the subsequent test case to be input according to the prediction result output by the prediction model, and determine, according to the adjusted association relation, whether the subsequent test case to be input continues to be tested; if yes, adjust the priority of the subsequent test case to be input according to the prediction result.
In an optional implementation of some embodiments, the association relation includes a pre-case list and a post-case list, and the pre/post-case list includes the pre/post test cases of the test case; and the second processing module 502 is further configured to: adjust the weight of each test case in the pre/post-case list of the subsequent test case to be input according to the prediction result output by the prediction model, so as to complete the adjustment of the association relation according to the weights.
It will be appreciated that the modules described in the apparatus correspond to the steps in the method described with reference to figure 1. Therefore, the operations, features and advantageous effects described above for the method are also applicable to the apparatus and the modules and units included therein, and are not described herein again.
Fig. 6 illustrates a physical structure diagram of an electronic device, which may include, as shown in fig. 6: a processor (processor) 610, a communication Interface 620, a memory (memory) 630 and a communication bus 640, wherein the processor 610, the communication Interface 620 and the memory 630 complete communication with each other through the communication bus 640. Processor 610 may invoke logic instructions in memory 630 to perform a case testing method comprising: acquiring execution information of each test case of the test and a preset priority of each test case; inputting the execution information of the test case into a pre-trained prediction model according to a preset priority, and adjusting the priority and the execution information of the subsequent test case to be input according to a prediction result output by the prediction model; and according to the adjusted priority, sequentially inputting the execution information of the adjusted subsequent test cases to be input into the pre-trained prediction model, and adjusting the priority and the execution information of the subsequent test cases to be input again according to the prediction result output by the prediction model, and so on until all the test cases are tested.
In addition, the logic instructions in the memory 630 may be implemented in software functional units and stored in a computer readable storage medium when the logic instructions are sold or used as independent products. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the above method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk, and various media capable of storing program codes.
In another aspect, the present invention also provides a computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to perform the case testing method provided by the above methods, the method comprising: acquiring execution information of each test case of the test and a preset priority of each test case; inputting the execution information of the test case into a pre-trained prediction model according to a preset priority, and adjusting the priority and the execution information of the subsequent test case to be input according to a prediction result output by the prediction model; and according to the adjusted priority, sequentially inputting the execution information of the adjusted subsequent test cases to be input into the pre-trained prediction model, and adjusting the priority and the execution information of the subsequent test cases to be input again according to the prediction result output by the prediction model, and so on until all the test cases are tested.
In yet another aspect, the present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program that, when executed by a processor, is implemented to perform the case testing methods provided above, the method comprising: acquiring execution information of each test case of the test and a preset priority of each test case; inputting the execution information of the test case into a pre-trained prediction model according to a preset priority, and adjusting the priority and the execution information of the subsequent test case to be input according to a prediction result output by the prediction model; and according to the adjusted priority, sequentially inputting the execution information of the subsequent test cases to be input after adjustment into a pre-trained prediction model, and adjusting the priority and the execution information of the subsequent test cases to be input again according to the prediction result output by the prediction model, and so on until all the test cases are tested.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the various embodiments or some parts of the above-described methods of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, and not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A case testing method, comprising:
acquiring execution information of each test case of the test and a preset priority of each test case;
inputting the execution information of the test case into a pre-trained prediction model according to a preset priority, and adjusting the priority and the execution information of the subsequent test case to be input according to a prediction result output by the prediction model; and according to the adjusted priority, sequentially inputting the execution information of the adjusted subsequent test cases to be input into the pre-trained prediction model, and adjusting the priority and the execution information of the subsequent test cases to be input again according to the prediction result output by the prediction model, and so on until all the test cases are tested.
2. Case testing method according to claim 1, characterized in that the method further comprises:
the prediction model outputs a prediction result for a test case and an evaluation corresponding to the prediction result, the evaluation including a confidence and a confidence ranking.
3. The case testing method of claim 1, wherein the inputting the execution information of the test case into the pre-trained predictive model according to the preset priority comprises:
inputting execution information of the test case into at least one pre-trained prediction model according to a preset priority, wherein the at least one prediction model is arranged in at least one corresponding device, each device tests the test case according to a preset automatic test protocol, and the automatic test protocol is used for ensuring that the test rule of each device on the test case is consistent.
4. A case testing method according to claim 1, wherein said execution information comprises parameters of the input predictive model and associations with other test cases; and
the method for adjusting the priority and the execution information of the test case to be input subsequently according to the prediction result output by the prediction model comprises the following steps:
and inputting the parameters of the test case into a pre-trained prediction model according to a preset priority, and adjusting the priority and the incidence relation of the subsequent test case to be input according to the prediction result output by the prediction model and the incidence relation with other test cases.
5. The case testing method of claim 4, wherein the inputting parameters of the test case into a pre-trained prediction model according to a preset priority, and adjusting the priority and the association of the test case to be input subsequently according to the prediction result output by the prediction model and the association with other test cases comprises:
inputting the parameters of the test case into a pre-trained prediction model, adjusting the association relation of the subsequent test case to be input according to the prediction result output by the prediction model, and determining, according to the adjusted association relation, whether the subsequent test case to be input continues to be tested;
if so, adjusting the priority of the subsequent test case to be input according to the prediction result.
6. The case testing method of claim 5, wherein the association relation comprises a pre-case list and a post-case list, the pre/post-case list comprising the pre/post test cases of a test case; and
the adjusting the association relation of the subsequent test cases to be input according to the prediction result output by the prediction model comprises:
adjusting the weight of each test case in the pre/post-case list of the subsequent test case to be input according to the prediction result output by the prediction model, so as to complete the adjustment of the association relation according to the weights.
7. A case testing apparatus, comprising:
the first processing module is used for acquiring the execution information of each test case of the test and the preset priority of each test case;
the second processing module is used for inputting the execution information of the test case into a pre-trained prediction model according to the preset priority, and adjusting the priority and the execution information of the subsequent test case to be input according to the prediction result output by the prediction model; and according to the adjusted priority, sequentially inputting the execution information of the adjusted subsequent test cases to be input into the pre-trained prediction model, and adjusting the priority and the execution information of the subsequent test cases to be input again according to the prediction result output by the prediction model, and so on until all the test cases are tested.
8. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the case testing method according to any of claims 1 to 6 are implemented by the processor when executing the program.
9. A non-transitory computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the case testing method according to any one of claims 1 to 6.
10. A computer program product comprising a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the case testing method according to any one of claims 1 to 6.
CN202210761845.7A 2022-06-29 Case testing method and device (Pending), published as CN115237757A

Priority Application (1)
Application Number: CN202210761845.7A
Priority Date / Filing Date: 2022-06-29
Title: Case testing method and device

Publication (1)
Publication Number: CN115237757A
Publication Date: 2022-10-25

Family
Family ID: 83670985
Family Application (1): CN202210761845.7A, filed 2022-06-29, Case testing method and device
Country Status (1): CN, CN115237757A


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination