CN115994081A - Test case processing method and device, electronic equipment and storage medium

Info

Publication number: CN115994081A
Application number: CN202111215876.4A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: test case, test, case, code, image
Inventors: 白翠琴, 夏羿, 金矾
Assignee: China Mobile Communications Group Co Ltd; China Mobile Hangzhou Information Technology Co Ltd
Application filed by China Mobile Communications Group Co Ltd and China Mobile Hangzhou Information Technology Co Ltd
Legal status: Pending

Landscapes

  • Debugging And Monitoring (AREA)

Abstract

The application discloses a test case processing method and apparatus, an electronic device, and a storage medium. The method comprises: acquiring first service information; acquiring, from a test case library, at least one test case matching the first service information; testing using the at least one test case; when a first test case among the at least one test case fails to execute, determining at least one feature from log information generated by executing the first test case, to obtain a first set; and selecting, from a second set, at least one failure cause matching the features in the first set, the second set comprising at least one test case failure cause. With the scheme provided by the application, test cases matching the service information can be automatically selected from the test case library based on the service information, and cause analysis can be automatically performed on test cases that fail to execute, reducing manual processing time and improving testing efficiency.

Description

Test case processing method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of automated testing technologies, and in particular, to a test case processing method and apparatus, an electronic device, and a storage medium.
Background
With the development of artificial intelligence technology, more and more intelligent voice interaction products are available, such as smart speakers and customer-service robots. For intelligent voice interaction products, conventional testing methods need to construct a large-scale test case set to verify each function of the product, so as to ensure reliable test results when testing the product's degree of intelligence. In this case, when the product code is changed and regression verification is required, functional tests must be performed using the related test cases in the set; because the number of related test cases is large, test efficiency is low.
In the related art, the test cases for regression verification are selected by building a prediction model of test cases, so that the functions of a product can be accurately tested with a small number of test cases, improving the screening efficiency of test cases. However, for different test results, testers still need to spend considerable time analyzing the results before product functions can be perfected and optimized accordingly. The analysis efficiency of testing is therefore low, which makes the overall test inefficient.
Therefore, there is currently no effective solution for improving the overall efficiency of product testing.
Disclosure of Invention
To solve the above technical problems, embodiments of the present application provide a test case processing method and apparatus, an electronic device, and a storage medium.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides a test case processing method, which comprises the following steps:
acquiring first service information;
acquiring at least one test case matched with the first service information from a test case library;
testing using the at least one test case;
when the execution of a first test case in the at least one test case fails, determining at least one feature from log information generated by executing the first test case to obtain a first set;
selecting, from a second set, at least one failure cause matching a feature in the first set; the second set comprising at least one test case failure cause; and
profiling the matched at least one test case based on the first service information and the testing process.
In the above solution, the determining at least one feature from log information generated by executing the first test case includes:
retesting using the first test case; and
determining at least one feature from log information generated by the first execution and the re-execution of the first test case.
In the above solution, the selecting at least one failure cause matching the features in the first set from the second set includes:
at least one failure cause matching the features in the first set is selected from the second set based on a similarity algorithm.
In the above scheme, the method further comprises:
selecting a profile of at least one failed case from the case profile library; and
determining the second set based on the profile of the selected at least one failed case.
In the above scheme, the method further comprises:
selecting, from the test case library based on a similarity algorithm, at least one failed test case whose profile matches the profile of the first test case, to obtain a third set, and selecting, from the test case library based on the similarity algorithm, at least one successfully executed test case whose profile matches the profile of the first test case, to obtain a fourth set;
judging, based on the third set and the fourth set, whether the first test case has a code defect, to obtain a judgment result; and
updating the profile of the first test case based on the judgment result.
In the above scheme, the judging, based on the third set and the fourth set, whether the first test case has a code defect includes:
determining code path coverage blocks of the code of the third set and of the fourth set;
determining, based on the code path coverage blocks of the third set, a difference set between the code of the first test case and the code of the third set, to obtain a fifth set, and determining, based on the code path coverage blocks of the fourth set, an intersection between the code of the first test case and the code of the fourth set, to obtain a sixth set;
judging whether the fifth set and the sixth set involve code changes; and
when it is determined that the fifth set and the sixth set involve code changes, determining that the first test case has a code defect.
In the above scheme, the method further comprises:
after profiling is completed, updating the profile of the test case into the case profile library.
The embodiment of the application also provides a test case processing device, which comprises:
the first acquisition unit is used for acquiring first service information;
the second acquisition unit is used for acquiring at least one test case matched with the first service information from the test case library;
the test unit is used for testing with the at least one test case;
the determining unit is used for determining, when a first test case in the at least one test case fails to execute, at least one feature from log information generated by executing the first test case, to obtain a first set;
the selecting unit is used for selecting, from a second set, at least one failure cause matching the features in the first set; the second set comprises at least one test case failure cause; and
the profiling unit is used for profiling the matched at least one test case based on the first service information and the testing process.
The embodiment of the application also provides an electronic device, comprising: a processor and a memory for storing a computer program capable of running on the processor,
wherein the processor is configured to execute the steps of any of the above test case processing methods when running the computer program.
The embodiments of the present application also provide a storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of any of the test case processing methods.
According to the test case processing method and apparatus, electronic device, and storage medium provided by the embodiments of the application, first service information is acquired, and at least one test case matching the first service information is then acquired from a test case library; testing is performed using the at least one test case; when a first test case in the at least one test case fails to execute, at least one feature is determined from log information generated by executing the first test case, to obtain a first set; at least one failure cause matching a feature in the first set is selected from a second set, the second set comprising at least one test case failure cause; and the matched at least one test case is profiled based on the first service information and the testing process. With this technical solution, the test cases to be executed that match the service information can be accurately screened from the test case library based on the acquired service information, improving the screening efficiency of test cases; meanwhile, when a test case fails to execute, the failure cause can be analyzed automatically, reducing the time spent on manual analysis and investigation, improving the overall efficiency of testing, and also raising the level of test automation.
Drawings
FIG. 1 is a flow chart of a test case processing method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a test case profile according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a system architecture for test case profiling according to an embodiment of the present application;
FIG. 4 is a flow chart of a method for intelligently recommending test cases according to an embodiment of the present application;
FIG. 5 is a flow chart of a method for failure attribution of test cases according to an embodiment of the present application;
FIG. 6 is a flow chart of a method for defect localization of test case code according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a test case processing apparatus according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The present application is further described in detail below with reference to the accompanying drawings and specific examples.
Before the present application is described in further detail, the terms involved in the embodiments of the present application are explained; the following explanations apply throughout the embodiments of the present application.
(1) Regression testing: for a software product, after the code corresponding to a certain function is modified, test cases need to be used to re-test the functions of the product, to determine whether the modification introduces new defects or causes errors in the code corresponding to other functions, thereby affecting the original functions of the product.
(2) Edit distance: the number of editing operations, such as insertions, replacements, and deletions, required to convert one object into another. In general, the smaller the edit distance, the higher the similarity between the two objects; the larger the edit distance, the lower the similarity.
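By way of illustration (the following sketch is editorial and not part of the original disclosure; the function name is arbitrary), the edit distance defined above can be computed with standard dynamic programming:

```python
def edit_distance(a: str, b: str) -> int:
    """Minimum number of insert/replace/delete operations turning a into b."""
    prev = list(range(len(b) + 1))  # distances from a[:0] to every prefix of b
    for i, ca in enumerate(a, 1):
        cur = [i]  # distance from a[:i] to the empty prefix of b
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # delete ca
                           cur[j - 1] + 1,              # insert cb
                           prev[j - 1] + (ca != cb)))   # replace, or keep if equal
        prev = cur
    return prev[-1]
```

For example, converting "kitten" into "sitting" takes three operations; a distance of zero means the two objects are identical.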
(3) Term frequency-inverse document frequency (TF-IDF): a weighting technique for information retrieval and data mining, which can determine how important a term is to a text within a collection of texts.
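A minimal TF-IDF sketch over tokenized texts (editorial illustration using a smoothed IDF; not part of the original disclosure):

```python
import math
from collections import Counter

def tf_idf(docs):
    """For each document, weight each term by its in-document frequency
    multiplied by a smoothed inverse document frequency."""
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))  # document frequency
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({term: (count / len(doc))
                        * math.log((1 + n) / (1 + df[term]))
                        for term, count in tf.items()})
    return weights
```

A term occurring in every document gets weight 0, while a term concentrated in one document gets a positive weight, reflecting its importance to that text.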
When test cases are used to test the functions of a software product, the software product may undergo problem fixes or version updates according to defects in, or changed requirements on, the product functions. This process may involve modifications to the software code. In this case, regression testing of the software product using the already-executed test cases is required, to confirm that the modified content introduces no new errors and has no impact on other functions of the software product.
In this process, the conventional test method constructs a large-scale test data set, and when regression testing is performed, a large number of constructed test cases are used to test the product functions; these may include test cases that do not accurately target the regression, resulting in low test efficiency. Therefore, how to improve test efficiency when regression testing is required becomes a problem to be solved.
To address this problem, a training sample set can be constructed using historical test cases and the entity elements and defects contained in the software product, so as to build a prediction model of test cases and defects. The prediction result for the test cases is then obtained by inputting the entity elements to be tested, and the test cases to be executed are determined according to the prediction result, so that the software product can be tested accurately. In this scheme, the test cases to be executed are screened out according to the historical test cases and the defect model; compared with the conventional test method, a smaller number of test cases can be used to test the software product accurately, improving the screening efficiency of test cases.
However, in the process of testing a software product with test cases, when a test case fails to execute, testers still need to manually investigate and analyze the failure cause and locate the defective code. This consumes a great deal of time, resulting in low analysis efficiency and, in turn, low overall test efficiency.
Overall, existing test methods still suffer from low overall efficiency and an insufficient level of automation.
Based on this, in various embodiments of the present application, according to the service information, test cases matching the service information are acquired from the test case library for testing, and feature extraction is performed on test cases that fail to execute, so as to analyze and obtain the failure causes of those test cases. In this way, not only can the test cases to be executed be screened out automatically, improving screening efficiency, but failed test cases can also be attributed automatically, shortening manual processing time and improving the overall efficiency and automation level of testing.
An embodiment of the present application provides a test case processing method, as shown in fig. 1, applied to an electronic device, where the method includes:
step 101: acquiring first service information;
step 102: acquiring at least one test case matched with the first service information from a test case library;
step 103: testing using the at least one test case;
step 104: when the execution of a first test case in the at least one test case fails, determining at least one feature from log information generated by executing the first test case to obtain a first set;
step 105: selecting at least one failure cause from the second set that matches a feature in the first set; the second set comprises at least one test case failure cause;
and profiling the matched at least one test case based on the first service information and the testing process.
Here, in actual application, N test cases may be constructed in advance and stored in the test case library before step 101, where N is an integer greater than 1 and may be set as required, for example, 500.
Here, in actual application, the test cases may be configured automatically through regular expressions (RE), or configured by manual input, which is not limited in the embodiments of the present application.
In actual application, in the process of constructing the N test cases, corresponding service requirement description information and a label may be set for each test case. The service requirement description information may include the title, use case steps, service function description, use case verification points, expected results, and other information; the label indicates the category of the test case.
Then, for the test case library, a word segmentation algorithm may be used to analyze the service requirement description information of each test case and extract features, so as to obtain a feature pool corresponding to the test case library. Illustratively, after corpus analysis is performed on the service requirement description information of each test case using a forward maximum matching (FMM) algorithm, the features corresponding to the test case are extracted and stored in the feature pool; the features in the feature pool have a mapping relationship with the test cases.
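The forward maximum matching (FMM) segmentation mentioned above can be sketched as follows (an editorial toy version over a made-up vocabulary; a real implementation would segment the Chinese requirement text against a domain dictionary):

```python
def fmm_segment(text: str, vocab: set, max_len: int = 4) -> list:
    """Greedy forward maximum matching: at each position take the longest
    vocabulary entry; fall back to a single character when nothing matches."""
    tokens, i = [], 0
    while i < len(text):
        for length in range(min(max_len, len(text) - i), 0, -1):
            if text[i:i + length] in vocab:
                tokens.append(text[i:i + length])
                i += length
                break
        else:  # no vocabulary entry starts here
            tokens.append(text[i])
            i += 1
    return tokens
```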
When a regression test is required, the first service information may be obtained by receiving service information manually input by a tester; the first service information may include a service function name, a program interface name, and other information.
In actual application, in step 102, after the first service information is acquired, the target feature corresponding to the first service information may be extracted first. Then, the extracted target feature is compared with each feature in the feature pool using a similarity algorithm, to obtain the similarity between the target feature and each feature, and at least one feature is determined from the feature pool based on the obtained similarities. Then, based on the mapping relationship between features and test cases, at least one test case matching the first service information is determined, and the at least one test case matching the first service information is acquired from the test case library.
The similarity algorithm may comprise a naive Bayes algorithm.
Specifically, after the similarity between the target feature and each feature in the feature pool is obtained using the similarity algorithm, the features are arranged in descending order of similarity value, and the features with high similarity to the target feature are preferentially selected, so that the at least one feature is determined based on the similarity.
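The match-and-rank step can be illustrated with a simple set-overlap (Jaccard) score standing in for the similarity computation (editorial sketch; the embodiment itself mentions a naive Bayes algorithm, and all names here are arbitrary):

```python
def rank_features(target: set, feature_pool: dict, k: int = 3) -> list:
    """Rank feature-pool entries by Jaccard similarity with the target
    feature (as word sets), largest first, and keep the top k."""
    def jaccard(a: set, b: set) -> float:
        return len(a & b) / len(a | b) if a | b else 0.0
    ranked = sorted(feature_pool.items(),
                    key=lambda item: jaccard(target, item[1]), reverse=True)
    return [name for name, _ in ranked[:k]]
```

The top-ranked features then lead, through the feature-to-case mapping, to the matching test cases.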
Here, after the at least one test case matching the first service information is acquired from the test case library, the tester may further review the acquired test cases to determine the at least one test case finally used in step 103.
In actual application, after the at least one test case is acquired from the test case library, the corresponding test cases may be profiled based on the first service information. In addition, when testing with the at least one test case, the corresponding test cases may be profiled based on the testing process.
As shown in fig. 2, profiling each test case can be understood as follows:
On the one hand, based on the acquired first service information, the static features of the corresponding test case can be extracted, which may include basic case information, the service function description, test environment information, data configuration information, and the like. The basic case information characterizes the name, execution steps, and similar information of the test case; the service function description characterizes the service features associated with the test case; the test environment information characterizes the environment in which the test case runs, such as network conditions and third-party resources; the data configuration information characterizes the configuration used when testing with the test case.
On the other hand, based on the process of testing with the corresponding test case, the dynamic features of the test case (which can be understood as the features arising while the test case executes) can be extracted, which may include the execution result, the full-link log, the executed code coverage blocks, execution run information, the execution failure attribution, and other features. The execution result characterizes the final outcome of executing the test case; the full-link log records related information during the execution of the test case; the executed code coverage blocks characterize the code involved in the execution of the test case; the execution run information includes the time information, system conditions, and the like corresponding to the execution of the test case; the execution failure attribution characterizes the cause of the test case's execution failure.
For example, in the process of testing with a test case, the test case may be processed to generate a corresponding function script, and the generated function script is then used to perform the corresponding software testing process. Instrumentation information (which can be understood as probes) is arranged in the function script, so that information can be collected during the execution of the test case, thereby enabling extraction of the test case's dynamic features.
In summary, the profile of a test case is obtained by extracting its feature information over the whole life cycle of the test case.
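The static and dynamic features that make up a profile can be grouped in a single record; the sketch below is an editorial illustration (the field names are assumptions, not the patent's schema):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CaseProfile:
    """Full-lifecycle profile of one test case."""
    # static features, extracted from the service information
    name: str                      # basic case information
    steps: list                    # execution steps
    service_desc: str              # associated service function description
    env_info: dict                 # network conditions, third-party resources...
    data_config: dict              # configuration used for the run
    # dynamic features, collected while the case executes
    result: str = "not_run"        # final execution result
    full_link_log: list = field(default_factory=list)
    covered_blocks: set = field(default_factory=set)   # executed code coverage
    run_info: dict = field(default_factory=dict)       # time, system conditions
    failure_cause: Optional[str] = None                # execution failure attribution
```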
In actual application, after profiling is completed, the profile of the test case can be updated into the case profile library in time, so that in subsequent failure cause analysis, the cause of a test case's execution failure can be accurately judged based on the updated case profile library.
Based on this, in an embodiment, the method may further include:
after profiling is completed, updating the profile of the test case into the case profile library.
In actual application, when it is determined, according to the execution result of each of the at least one test case, that the first test case in the at least one test case fails to execute, the first test case can be used for retesting, to determine whether the failure cause involves environmental factors.
If the first test case executes successfully on the retest, the cause of the first execution failure is likely related to environmental factors, which may include the environment configuration, third-party resources, network factors, and the like. In this case, the tester can analyze the environmental factors and investigate the cause according to the profile of the first test case.
If the first test case fails again on the retest, key features can be determined from the log information generated by executing the first test case, to facilitate analysis of the failure cause.
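The retest step above amounts to a small triage routine (editorial sketch; `run_case` is a hypothetical callable returning True on success):

```python
def triage_failed_case(run_case, case) -> str:
    """Retest a failed case once: a pass on the retest points to an
    environmental cause (configuration, third-party resource, network);
    a second failure is handed over to log-based feature analysis."""
    return "environmental" if run_case(case) else "analyze_logs"
```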
In an embodiment, the determining at least one feature from log information generated by executing the first test case includes:
retesting using the first test case; and
determining at least one feature from log information generated by the first execution and the re-execution of the first test case.
Here, when determining the key features, the at least one feature may be determined from the log information of the first execution and the re-execution of the first test case, to obtain the first set.
Illustratively, using the TF-IDF technique, at least one word whose number of occurrences reaches a preset number may be extracted from the log information of the first execution and the re-execution of the first test case, as the determined at least one feature, thereby obtaining the first set; the preset number can be set as required, for example, 5.
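As a simplified stand-in for the TF-IDF extraction described above (editorial sketch; it keeps raw counts rather than full TF-IDF weights), frequent words can be pulled from the combined logs of both runs:

```python
import re
from collections import Counter

def log_features(log_lines, min_count: int = 5) -> set:
    """Return words whose total occurrence count across the given log
    lines reaches the preset threshold (5 here, as in the example)."""
    words = Counter(word.lower()
                    for line in log_lines
                    for word in re.findall(r"[A-Za-z_]+", line))
    return {word for word, count in words.items() if count >= min_count}
```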
In practical application, in order to determine the failure reason of the first test case, a second set needs to be determined.
Based on this, in an embodiment, the method may further include:
selecting a profile of at least one failed case from the case profile library; and
determining the second set based on the profile of the selected at least one failed case.
Here, the profiles of test cases are stored in the case profile library, and the profile of each test case contains the execution result of that test case (execution success or execution failure). Therefore, based on the execution results in the profiles, the profile of at least one failed case can be selected from the case profile library, thereby determining the second set.
The profile of each test case in the second set contains at least the execution failure attribution (i.e., the test case failure cause) and the full-link log.
Here, in actual application, in order to determine at least one test case failure cause matching the first test case, a similar case set and a derived case set may also be generated according to the service information of the test case when the test case is profiled; the similar case set characterizes a set of test cases for testing normal situations, while the derived case set characterizes a set of test cases for testing abnormal situations.
Thus, when determining the second set, the profile of at least one failed case may be selected based on the similar case set and the derived case set.
In actual application, after the first set and the second set are determined, the failure cause of the first test case can be analyzed automatically, shortening the time for manual analysis and investigation and improving the analysis efficiency of the test.
In an embodiment, the selecting at least one failure cause matching the feature in the first set from the second set may include:
at least one failure cause matching the features in the first set is selected from the second set based on a similarity algorithm.
The similarity algorithm may be a minimum edit distance algorithm, or may be another similarity algorithm, which is not limited in the embodiments of the present application.
Illustratively, the similarity (i.e., the edit distance) between the features in the first set and each profile in the second set can be obtained by comparing the features in the first set with the full-link log in each profile of the second set using the minimum edit distance algorithm. Based on the obtained similarities, the profile of at least one test case matching the features in the first set can be determined; then, based on the execution failure attribution in the determined at least one test case profile, at least one failure cause corresponding to the features in the first set can be determined.
Specifically, after the similarity between the features in the first set and each profile in the second set is obtained using the minimum edit distance algorithm, the profiles are arranged in descending order of similarity value (i.e., ascending order of edit distance), and the test case profiles with high similarity to the features in the first set are preferentially selected, so that the profile of at least one test case is determined based on the similarity.
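The cause-matching step can be sketched as follows (editorial illustration; profile entries are simplified to dictionaries, and the edit distance is recomputed inline):

```python
def edit_distance(a: str, b: str) -> int:
    """Standard dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                           prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def match_failure_causes(features, failed_profiles, k=2):
    """Rank failed-case profiles by the minimum edit distance between the
    feature text and each profile's full-link log; a smaller distance means
    higher similarity, so the causes of the closest profiles come first."""
    query = " ".join(sorted(features))
    scored = sorted(failed_profiles,
                    key=lambda p: edit_distance(query, p["full_link_log"]))
    return [p["failure_cause"] for p in scored[:k]]
```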
Here, after the at least one failure cause matching the features in the first set is determined, the determined at least one failure cause may also undergo manual feedback assessment; that is, the tester may verify the determined at least one failure cause, to ensure the accuracy of the failure cause of the first test case.
In actual application, after the at least one failure cause of the first test case is determined, the defective code can be further located automatically, reducing the time spent manually checking the code.
Based on this, in an embodiment, the method may further include:
selecting, from the test case library based on a similarity algorithm, at least one failed test case whose profile matches the profile of the first test case, to obtain a third set, and selecting, from the test case library based on the similarity algorithm, at least one successfully executed test case whose profile matches the profile of the first test case, to obtain a fourth set;
judging, based on the third set and the fourth set, whether the first test case has a code defect, to obtain a judgment result; and
updating the profile of the first test case based on the judgment result.
The similarity algorithm may be set as a minimum edit distance algorithm, or may be set as another type of similarity algorithm, which is not limited in this embodiment of the present application.
In actual application, when it is determined that the execution of the first test case fails, the images of all failed cases and the images of all successful cases may be selected according to the execution result of each test case in the case image library.
Illustratively, a minimum edit distance algorithm is utilized to obtain a similarity between the representation of the first test case and each of the failed case representations. Then, based on the obtained similarity, an image of at least one test case that fails to be executed and that matches the image of the first test case can be determined, and then, using the determined image of the at least one test case that fails to be executed, a corresponding at least one test case that fails to be executed can be determined from the test case library, thereby obtaining a third set.
When the method is actually applied, after the similarity between the image of the first test case and each failed case image in the failed case images is obtained, the images can be sequentially arranged from large to small based on the magnitude of the similarity value, so that when the image of at least one failed test case is determined, the image of the failed test case with the high image similarity value with the first test case is preferentially selected.
Illustratively, the minimum edit distance algorithm is likewise used to obtain the similarity between the image of the first test case and each of the successful case images. Then, based on the obtained similarities, the image of at least one successfully executed test case that matches the image of the first test case can be determined, and the determined image is used to find the corresponding at least one successful test case in the test case library, thereby obtaining the fourth set.
In practical application, after the similarity between the image of the first test case and each successful case image is obtained, the similarities are arranged from large to small, and the successful case image with the highest similarity to the image of the first test case is selected first, so as to determine the image of at least one successful test case.
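As an illustrative, non-limiting sketch of the similarity selection described above, the following assumes that each case image has been serialized into a feature string (the serialization scheme and the case identifiers are hypothetical); candidates are ranked by ascending edit distance, i.e., descending similarity:

```python
def edit_distance(a: str, b: str) -> int:
    """Minimum (Levenshtein) edit distance between two serialized images."""
    m, n = len(a), len(b)
    dp = list(range(n + 1))  # one-row dynamic programming table
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,                       # deletion
                        dp[j - 1] + 1,                   # insertion
                        prev + (a[i - 1] != b[j - 1]))   # substitution
            prev = cur
    return dp[n]

def top_k_similar(target: str, images: dict, k: int = 5):
    """Return the k case ids whose images are closest to the target image."""
    ranked = sorted(images.items(), key=lambda kv: edit_distance(target, kv[1]))
    return [case_id for case_id, _ in ranked[:k]]
```

The same helper can serve both selections: called once over the failed case images to build the third set, and once over the successful case images to build the fourth set.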
In actual application, after the third set and the fourth set are obtained, whether the first test case has a code defect can be judged according to the codes of the test cases in the third set and the fourth set.
Based on this, in an embodiment, the determining, based on the third set and the fourth set, whether the first test case has a code defect includes:
determining code path coverage blocks of codes of the third and fourth sets;
determining a code difference set of the codes of the first test case and the third set based on the code path coverage block of the codes of the third set to obtain a fifth set, and determining a code intersection set of the codes of the first test case and the fourth set based on the code path coverage block of the codes of the fourth set to obtain a sixth set;
judging whether the fifth set and the sixth set relate to code variation or not;
when it is determined that the fifth set and the sixth set relate to code variation, it is determined that the first test case has a code defect.
In practical application, the code path coverage blocks of the codes of the third set and the fourth set can be determined from the image of each test case in the case image library. Specifically, the code path coverage block of the code of the third set is determined using the execution code coverage blocks recorded in the images of the test cases in the third set, and the code path coverage block of the code of the fourth set is determined using the execution code coverage blocks recorded in the images of the test cases in the fourth set.
In practical application, the code path coverage block of the code of the first test case can likewise be determined from the image of the first test case in the case image library.
The code path coverage block of the code of the first test case is compared with the code path coverage block of the code of each test case in the third set, so as to obtain, as a code difference set, the part of the code of the first test case that differs from the code of each test case in the third set, thereby obtaining the fifth set.
Correspondingly, the code path coverage block of the code of the first test case is compared with the code path coverage block of the code of each test case in the fourth set, so as to obtain, as a code intersection, the part of the code that is identical to the code of each test case in the fourth set, thereby obtaining the sixth set.
Then, based on the fifth set and the sixth set, it is determined whether the code involves code changes (which can be understood as whether the code is modified). Specifically, for the modified code, the region to which the modified code corresponds may be indicated by adding annotation content in the code at the time of modification. Thus, when it is necessary to determine whether the code involves a code change, it is possible to determine whether the code involves a code change by looking up whether there is a corresponding annotation content in the code.
Here, when it is determined that the fifth set and the sixth set relate to code variation, it may be determined that the first test case has a code defect. In this case, by locating the area involving the code variation and indicating to the tester that a code defect may exist there, the time taken by the tester to check the code problem can be reduced, and the efficiency of the test can be improved.
After obtaining first service information, the test case processing method provided by the embodiment of the application obtains at least one test case matched with the first service information from a test case library; testing using the at least one test case; when the execution of a first test case in the at least one test case fails, determining at least one feature from log information generated by executing the first test case to obtain a first set; selecting at least one failure cause from the second set that matches a feature in the first set; the second set comprises at least one test case failure cause; and carrying out image drawing on at least one matched test case based on the first service information and the test. According to the technical scheme, based on the acquired service information, the test cases to be executed, which are matched with the service information, can be automatically screened from the test case library, so that the screening efficiency and accuracy of the test cases are improved, meanwhile, when the test cases fail to execute, the analysis of failure reasons can be automatically carried out, the time cost of manual analysis and investigation is reduced, the overall efficiency of testing is improved, and meanwhile, the automation level of testing is also improved.
The present application is described in further detail below in connection with examples of application.
In this application embodiment, an intelligent test system based on case portraits is provided, which can realize intelligent testing of software products. When a product needs regression testing, not only can the test cases meeting the service requirements be automatically screened out from the test case set, but also, when a test case fails to be executed, failure attribution and code defect positioning can be performed automatically, so that the cost of manual screening and positioning is reduced, and the efficiency of testing the software product is improved.
Specifically, as shown in fig. 3, the intelligent test system may include: the system comprises a case intelligent recommendation module, an execution module, a case portrait module and a result analysis module; wherein,
the case intelligent recommending module is used for constructing a test case library and recommending corresponding test cases according to service requirements;
the execution module is used for executing the test case and collecting information in the execution process of the test case;
the case portrait module is used for carrying out portrait on the test case and managing the portrait of the test case;
the result analysis module is used for automatically attributing the failure test case and positioning the defect code when the failure test case relates to the change code.
The process for processing the test case in the embodiment of the application comprises the following steps:
step 1: the method comprises the steps that an intelligent case recommendation module constructs a plurality of test cases, and sets corresponding service requirement description information and labels (which can be understood as categories) for each constructed test case to form a test case library;
the test cases can be constructed in a manual input mode, and also can be constructed automatically, for example, by adopting RE automatic construction.
The service requirement description information may include information such as a title, a use case step, a service function description, a use case verification point, or an expected result.
In practical application, after the case intelligent recommendation module forms the test case library, corpus analysis, feature extraction and vectorized representation can be performed on the service requirement description information corresponding to each test case in the test case library by utilizing jieba word segmentation and the FMM (forward maximum matching) algorithm, so as to generate a feature pool corresponding to the test case library.
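A minimal sketch of this feature-pool construction is shown below. It is illustrative only: a plain whitespace tokenizer stands in for jieba word segmentation and FMM, and the term-frequency Counter stands in for the vectorized representation; the case ids and descriptions are hypothetical.

```python
from collections import Counter

def extract_features(description: str) -> Counter:
    """Tokenize a requirement description and count term frequencies.
    (A whitespace split stands in for jieba/FMM segmentation here.)"""
    return Counter(description.lower().split())

def build_feature_pool(case_library: dict) -> dict:
    """Map each test case id to its feature vector (term-frequency Counter)."""
    return {cid: extract_features(desc) for cid, desc in case_library.items()}
```

The resulting feature pool is what the target case features are later matched against in step 403.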
Step 2: after receiving the service information (i.e. the first service information), the case intelligent recommendation module selects M test cases matched with the service information from the test case library.
Wherein M is an integer greater than 1, and the value of M can be set according to the requirement.
Specifically, as shown in fig. 4, the process of selecting M test cases matched with the service information by the case construction module may include the following steps:
step 401: the use case construction module acquires service information;
here, when regression verification is required, the case intelligent recommendation module obtains service information, such as title, service description, case verification point, service function name, program interface name, and the like, by receiving service information manually input by a tester.
Step 402: the use case construction module extracts the characteristics of the target use case;
in actual application, the use case construction module extracts corresponding target feature information based on the acquired service information.
Step 403: the use case construction module matches the target use case characteristics with the characteristic pool;
specifically, a naive Bayesian algorithm is adopted to compare the characteristics of the target case with the characteristics of each test case in the characteristic pool, so as to obtain a corresponding similarity result.
Wherein, the formula of the naive bayes algorithm can be expressed as:
P(A|B) = P(B|A) × P(A) / P(B)
here, A characterizes a class feature of the test cases in the feature pool; B characterizes a feature of the target case; P(A|B) characterizes the probability that B belongs to A; P(B|A) characterizes the probability of B given A (i.e., the probability that B appears within A); P(A) characterizes the probability of A appearing among all classes; P(B) characterizes the probability of B occurring among all features.
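The comparison can be sketched as below. This is a hedged illustration, not the patented implementation: it scores a target case's features against one class under the naive independence assumption, with a uniform class prior and Laplace smoothing added here to avoid zero probabilities (neither is stated in the text); since P(B) is the same for every class, it is omitted when only the ranking matters.

```python
def naive_bayes_score(target_feats, class_feats, total_classes):
    """Estimate a score proportional to P(A|B): P(A) * product of P(b|A)
    over target features b, with add-one (Laplace) smoothing."""
    p_a = 1.0 / total_classes                      # uniform prior (assumption)
    total = sum(class_feats.values())              # feature occurrences in class A
    vocab = len(class_feats) + len(target_feats)   # crude smoothing denominator
    score = p_a
    for f in target_feats:
        score *= (class_feats.get(f, 0) + 1) / (total + vocab)
    return score
```

Ranking the classes (i.e., the test cases in the feature pool) by this score yields the similarity results used in step 404.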
Step 404: the case construction module selects M test cases from the test case library;
in practical application, the case construction module can sequentially arrange the similarity results from large to small according to the values, and preferentially select M test cases with large corresponding similarity values, such as 3 test cases, according to the needs of testers.
Step 405: and determining the final test case.
Here, after the case intelligent recommendation module selects the M test cases matched from the test case library, the tester may perform a manual review to determine the final test case for executing step 3.
Meanwhile, the case intelligent recommendation module can send the received service information and the determined M test cases to the case portrait module so as to carry out portrait on the M test cases by using the case portrait module.
Step 3: the execution module respectively executes the M test cases determined by the case intelligent recommendation module, and collects information in the execution process of each test case.
In practical application, after the execution module acquires the M test cases determined by the case intelligent recommendation module, each of the M test cases is processed to obtain a corresponding function script. Then, the interface test engine of the execution module is called to execute the corresponding function script;
The execution module can collect dynamic information in the execution process of each test case by setting instrumentation points in the function script; the dynamic information at least comprises full-link log information, code coverage information, assertion result information and case result state.
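As a rough illustration of such collection, the decorator below stands in for script instrumentation: it records the result state and exception details of each case function it wraps. A real execution module would additionally gather coverage and full-link logs; the record format here is hypothetical.

```python
import functools

def instrument(record: list):
    """Decorator standing in for instrumentation points: appends a result
    entry (case name, state, error) to `record` for every execution."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            entry = {"case": fn.__name__, "state": "success", "error": None}
            try:
                return fn(*args, **kwargs)
            except Exception as e:
                entry["state"] = "failure"
                entry["error"] = repr(e)
                raise
            finally:
                record.append(entry)   # collected even when the case fails
        return inner
    return wrap
```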
Step 4: the case portrait module is used for carrying out portraits on each test case based on the service information and the information in the execution process of the test case;
specifically, the case portrait module may extract static features corresponding to the test case according to the service information, for example, case basic information, case service features, and the like. Meanwhile, the case portrait module can extract dynamic features corresponding to the test case according to information in the execution process of the test case, such as a case code coverage block, a case result state, failure attribution and the like. Based on the static and dynamic characteristics of the test case, namely the characteristics of the whole life cycle of the test case, the case portrait module can realize the portrait of the test case and store the portrait of the test case into a case portrait library after the portrait is completed.
Step 5: when the test case A (namely the first test case) in the M test cases fails to be executed, automatically analyzing the failure reason of the test case A by a result analysis module;
Specifically, as shown in fig. 5, the process of analyzing the failure cause of the test case a by the result analysis module may include the following steps:
step 501: the result analysis module sends execution information to the execution module so that the execution module executes the test case A again.
Here, in the process in which the execution module executes the test case A again, at least the full-link log information of the re-execution is collected, and the execution result and the code coverage block information are obtained.
The full-link log information may include an assertion result, exception information, a test environment detection result, a reference configuration detection result, and the like.
Step 502: according to the execution result of the test case A executed again, the result analysis module judges whether the test case A is executed successfully or not;
if the test case a is executed successfully, step 503 is executed; if test case A fails, step 504 is performed.
Step 503: the result analysis module performs environment attribution investigation;
specifically, the analysis of environmental reasons, such as network problems, environmental configuration information problems, or third party resource problems, may be performed based on the full link log information collected by the execution module.
Step 504: the result analysis module acquires all-link log information of the test case A which is executed for the first time and executed again;
Step 505: the result analysis module extracts log features from the full-link log information;
illustratively, the results analysis module may utilize a TF-IDF algorithm to extract log features, forming a failure attribution feature set F.
Step 506: the result analysis module utilizes a similarity algorithm to find out 5 failure test cases matched with the log features.
In actual application, the result analysis module can select images of all test cases with execution failure from the case image library to form a history failure case attribution set L.
Then, the minimum edit distance algorithm is adopted to calculate the edit distance (a smaller edit distance means a larger similarity) between the failure attribution feature set F and the image of each test case in the history failure case attribution set L. The calculated edit distances are arranged from small to large, and the images of the 5 test cases with the smallest edit distances (i.e., the largest similarities) are selected first, so that the failure cause of the test case can be determined and a corresponding failure attribution list is formed.
Step 507: the result analysis module feeds back the failure attribution list;
specifically, the results analysis module feeds back the list of failure attributions to the testers for accuracy assessment of the automated attributions.
Step 508: the result analysis module updates the representation of the test case A based on the result of the manual assessment.
After the manual evaluation is completed, the result analysis module sends the failure attribution list of the test case A to the case portrait module so as to update the portrait of the test case A.
Step 6: the result analysis module automatically locates the defect codes of the test case A;
specifically, as shown in fig. 6, the process of automatically performing defect code positioning on the test case a by the result analysis module may include the following steps:
step 601: the result analysis module determines a similar case set matched with the test case A from a case image library according to the image of the test case A;
in practical application, the result analysis module determines the images of all test cases successfully executed and the images of all test cases failed to be executed from the case image library. And then, calculating the similarity between the image of the test case A and the image of each successfully executed test case by using a minimum editing distance algorithm, and obtaining a corresponding similarity result. Meanwhile, the minimum edit distance algorithm is utilized to calculate the similarity between the image of the test case A and the image of each test case with failed execution, and a corresponding similarity result is obtained.
Here, the similarity results between the image of the test case a and the image of the test case that was successfully executed may be arranged from large to small in value, and 5 images of test cases that were successfully executed with large similarity values may be preferentially selected to form the execution success case set S. Meanwhile, the similarity results between the images of the test case A and the images of the test cases with the execution failure are arranged from large to small according to the numerical values, and the images of the 5 test cases with the large similarity values and the execution failure are preferentially selected to form an execution failure case set F.
Step 602: the result analysis module obtains the code path coverage block of the execution success case set S and the execution failure case set F.
Here, the result analysis module may acquire the code path coverage block of each test case in the execution success case set S, and may also acquire the code path coverage block of each test case in the execution failure case set F, using the case image library.
Then, the code path coverage block of the test case A is compared with the code path coverage block of each test case in the execution success case set S, so as to obtain the part of the code of the test case A that differs from the code of each test case in the execution success case set S, thereby obtaining a code difference set Ts.
Correspondingly, the code path coverage block of the test case A is compared with the code path coverage block of each test case in the execution failure case set F, so as to obtain the part of the code of the test case A that is identical to the code of each test case in the execution failure case set F, thereby obtaining a code intersection Tf.
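Treating each code path coverage block as a set of block identifiers, the construction of Ts and Tf reduces to plain set operations, sketched below (the numeric block ids are hypothetical):

```python
def locate_suspect_blocks(target_cov, success_covs, failure_covs):
    """Ts: blocks covered by the failing target case but by none of the
    similar successful cases; Tf: blocks the target shares with every
    similar failed case. Each coverage is a set of block ids."""
    ts = set(target_cov)
    for cov in success_covs:
        ts -= cov            # code difference set vs. the success set S
    tf = set(target_cov)
    for cov in failure_covs:
        tf &= cov            # code intersection vs. the failure set F
    return ts, tf
```

Blocks appearing in Ts or Tf are the candidates checked for change annotations in step 603.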
Step 603: the result analysis module determines whether the code difference set Ts and the code intersection Tf relate to a variation code.
Wherein, if it is determined that the change code is involved, step 604 is performed; if it is determined that the change code is not involved, step 606 is performed.
Here, in actual application, the result analysis module may determine whether the code difference set Ts and the code intersection Tf relate to a variable code, i.e., a modified code, according to the annotation content in the code of the test case. Specifically, for the modified code, the region to which the modified code corresponds may be indicated by adding annotation content in the code at the time of modification. Thus, when it is necessary to determine whether the code involves a code change, it is possible to determine whether the code involves a code change by looking up whether there is a corresponding annotation content in the code.
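The annotation lookup can be sketched as follows; the `@changed` marker string is purely hypothetical (the text does not specify the annotation format), and any real convention agreed with the developers would work the same way.

```python
CHANGE_MARK = "@changed"  # hypothetical annotation added when code is modified

def involves_change(code_lines):
    """Return the 1-based line numbers whose annotation content carries the
    change marker, i.e. the modified regions that may hide the defect."""
    return [i for i, line in enumerate(code_lines, 1) if CHANGE_MARK in line]
```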
Step 604: when the result analysis module determines that the code difference set Ts and the code intersection Tf relate to code variation, it is determined that a suspected defect code exists in the code of the test case A, and the code related to the variation is automatically located.
Step 605: the result analysis module sends the determined suspicious defect code information to the case portrait module so as to update the portrait of the test case A.
Step 606: ending the current flow.
In the application embodiment of the present application, an intelligent test method based on case portraits is provided. A case portrait model is established based on the feature information of the whole life cycle of each test case. When a code change requires regression verification, the test cases matching the service description information can be automatically screened out according to the case portrait model, so that a tester can perform testing with the screened test cases. In this way, accurate testing can be achieved with fewer test cases, the increased time cost of manual screening is avoided, and the screening efficiency of the test is improved.
In addition, when a test case fails to be executed, automatic attribution of the failed case can be realized based on the case portrait model, so that the analysis of failure causes is accelerated and the time of manual analysis is reduced. Meanwhile, in this scheme, when a test case fails to be executed, suspected code defects can be identified based on the similarity algorithm, realizing automatic positioning of defect codes, shortening the time for manually checking the program codes, and further improving the analysis efficiency of the test.
In order to implement the solution of the embodiment of the present application, the embodiment of the present application further provides a test case processing apparatus, as shown in fig. 7, where the apparatus includes:
a first acquiring unit 701, configured to acquire first service information;
a second obtaining unit 702, configured to obtain at least one test case matching the first service information from a test case library;
a test unit 703, configured to perform a test using the at least one test case;
a determining unit 704, configured to determine at least one feature from log information generated by executing a first test case, when execution of the first test case in the at least one test case fails, to obtain a first set;
a selection unit 705, configured to select at least one failure cause matching the feature in the first set from the second set; the second set comprises at least one test case failure cause; wherein,
and a portrait unit 706, configured to perform a portrait on the matched at least one test case based on the first service information and the test.
Here, it should be noted that the functions of the first acquiring unit 701 and the second acquiring unit 702 are equivalent to the functions of the case intelligent recommendation module in the application embodiment; the function of the test unit 703 corresponds to the function of the execution module in the application embodiment; the function of the portrayal unit 706 corresponds to the function of the use case portrayal module in the application embodiment; the functions of the determining unit 704 and the selecting unit 705 correspond to the functions of the result analysis module in the application embodiment.
Wherein, in an embodiment, the determining unit 704 is configured to:
re-testing by using the first test case;
determining at least one feature from log information generated by first executing and re-executing the first test case.
Wherein, in an embodiment, the selecting unit 705 is configured to:
at least one failure cause matching the features in the first set is selected from the second set based on a similarity algorithm.
In an embodiment, the determining unit 704 is further configured to:
selecting at least one image of a failed use case from the use case image library;
the second set is determined based on the representation of the selected at least one failed use case.
In an embodiment, the selecting unit 705 is further configured to:
selecting, from the test case library based on a similarity algorithm, at least one test case that matches the image of the first test case and fails to be executed, to obtain a third set, and selecting, from the test case library based on the similarity algorithm, at least one test case that matches the image of the first test case and is successfully executed, to obtain a fourth set;
judging, based on the third set and the fourth set, whether the first test case has a code defect, to obtain a judgment result;
and updating the image of the first test case based on the judgment result.
Wherein, in an embodiment, the selecting unit 705 is configured to:
determining code path coverage blocks of codes of the third and fourth sets;
determining a code difference set of the codes of the first test case and the third set based on the code path coverage block of the codes of the third set to obtain a fifth set, and determining a code intersection set of the codes of the first test case and the fourth set based on the code path coverage block of the codes of the fourth set to obtain a sixth set;
judging whether the fifth set and the sixth set relate to code variation or not;
when it is determined that the fifth set and the sixth set relate to code variation, it is determined that the first test case has a code defect.
In an embodiment, the portrait unit 706 is further configured to:
after the image is completed, the image of the test case is stored in the case image library.
In practical application, the first obtaining unit 701 may be implemented by a processor in the test case processing apparatus in combination with a communication interface; the second acquisition unit 702, the test unit 703, the determination unit 704, and the selection unit 705 may be implemented by a processor in a test case processing apparatus.
It should be noted that: in the test case processing device provided in the above embodiment, when performing test case processing, only the division of each program unit is used for illustration, in practical application, the processing allocation may be completed by different program units according to needs, that is, the internal structure of the device is divided into different program units, so as to complete all or part of the processing described above. In addition, the test case processing device and the test case processing method provided in the foregoing embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiments, which are not described herein again.
Based on the hardware implementation of the program modules, and in order to implement the method for processing the test case in the embodiment of the present application, the embodiment of the present application further provides an electronic device, as shown in fig. 8, the electronic device 800 includes:
a communication interface 801 capable of interacting with other devices;
a processor 802, connected to the communication interface 801, for interacting with other devices, and for executing the methods provided by one or more of the above technical solutions when running a computer program;
a memory 803, the computer program being stored on the memory 803.
Specifically, the processor 802 is configured to:
acquiring first service information through the communication interface 801;
acquiring at least one test case matched with the first service information from a test case library;
testing using the at least one test case;
when the execution of a first test case in the at least one test case fails, determining at least one feature from log information generated by executing the first test case to obtain a first set;
selecting at least one failure cause from the second set that matches a feature in the first set; the second set comprises at least one test case failure cause; wherein,
and carrying out image drawing on the matched at least one test case based on the first service information and the test.
Wherein, in one embodiment, the processor 802 is configured to:
re-testing by using the first test case;
at least one feature is determined from log information generated by first executing and re-executing the first test case.
Wherein in one embodiment, the processor 802 is configured to select at least one failure cause matching the features in the first set from the second set based on a similarity algorithm.
In an embodiment, the processor 802 is further configured to:
selecting at least one image of a failed use case from the use case image library;
the second set is determined based on the representation of the selected at least one failed use case.
In an embodiment, the processor 802 is further configured to:
selecting, from the test case library based on a similarity algorithm, at least one test case that matches the image of the first test case and fails to be executed, to obtain a third set, and selecting, from the test case library based on the similarity algorithm, at least one test case that matches the image of the first test case and is successfully executed, to obtain a fourth set;
judging, based on the third set and the fourth set, whether the first test case has a code defect, to obtain a judgment result;
and updating the image of the first test case based on the judgment result.
Wherein, in one embodiment, the processor 802 is configured to:
determining code path coverage blocks of codes of the third and fourth sets;
determining a code difference set of the codes of the first test case and the third set based on the code path coverage block of the codes of the third set to obtain a fifth set, and determining a code intersection set of the codes of the first test case and the fourth set based on the code path coverage block of the codes of the fourth set to obtain a sixth set;
Judging whether the fifth set and the sixth set relate to code variation or not;
when it is determined that the fifth set and the sixth set relate to code variation, it is determined that the first test case has a code defect.
In an embodiment, the processor 802 is further configured to:
after the image is completed, the image of the test case is stored in the case image library.
It should be noted that: the specific processing of the processor 802 may be understood with reference to the methods described above.
Of course, in actual practice, the various components in electronic device 800 are coupled together via bus system 804. It is to be appreciated that the bus system 804 is employed to enable connected communications between these components. The bus system 804 includes a power bus, a control bus, and a status signal bus in addition to a data bus. But for clarity of illustration the various buses are labeled as bus system 804 in fig. 8.
The memory 803 in the present embodiment is used to store various types of data to support the operation of the electronic device 800. Examples of such data include: any computer program for operating on the electronic device 800.
The method disclosed in the embodiments of the present application may be applied to the processor 802, or implemented by the processor 802. The processor 802 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 802 or by instructions in the form of software. The processor 802 described above may be a general purpose processor, a digital signal processor (DSP, digital Signal Processor), or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like. The processor 802 may implement or perform the methods, steps, and logic blocks disclosed in embodiments of the present application. The general purpose processor may be a microprocessor or any conventional processor or the like. The steps of the method disclosed in the embodiments of the present application may be directly embodied in a hardware decoding processor or implemented by a combination of hardware and software modules in the decoding processor. The software modules may be located in a storage medium in the memory 803 and the processor 802 reads the information in the memory 803, in combination with its hardware, to perform the steps of the method as described above.
In an exemplary embodiment, the electronic device 800 can be implemented by one or more application-specific integrated circuits (ASIC, Application Specific Integrated Circuit), DSPs, programmable logic devices (PLD, Programmable Logic Device), complex programmable logic devices (CPLD, Complex Programmable Logic Device), field-programmable gate arrays (FPGA, Field-Programmable Gate Array), general-purpose processors, controllers, microcontrollers (MCU, Micro Controller Unit), microprocessors (Microprocessor), or other electronic components for performing the aforementioned methods.
In an exemplary embodiment, the present application also provides a storage medium, i.e., a computer storage medium, specifically a computer-readable storage medium, for example, including the memory 803 storing a computer program executable by the processor 802 of the electronic device 800 to perform the aforementioned test case processing steps. The computer-readable storage medium may be a read-only memory (ROM, Read-Only Memory), a programmable read-only memory (PROM, Programmable Read-Only Memory), an erasable programmable read-only memory (EPROM, Erasable Programmable Read-Only Memory), an electrically erasable programmable read-only memory (EEPROM, Electrically Erasable Programmable Read-Only Memory), a ferroelectric random access memory (FRAM, Ferroelectric Random Access Memory), a flash memory (Flash Memory), a magnetic surface memory, an optical disk, or a compact disc read-only memory (CD-ROM, Compact Disc Read-Only Memory); the magnetic surface memory may be a disk memory or a tape memory.
It should be noted that: "first," "second," etc. are used to distinguish similar objects and not necessarily to describe a particular order or sequence.
In addition, the embodiments described in the present application may be combined arbitrarily, provided there is no conflict between them.
The foregoing description of the preferred embodiments of the present application is not intended to limit the scope of the present application, but is intended to cover any modifications, equivalents, and alternatives falling within the spirit and principles of the present application.

Claims (10)

1. A test case processing method, comprising:
acquiring first service information;
acquiring at least one test case matched with the first service information from a test case library;
testing using the at least one test case;
when the execution of a first test case in the at least one test case fails, determining at least one feature from log information generated by executing the first test case to obtain a first set;
selecting at least one failure cause from the second set that matches a feature in the first set; the second set comprises at least one test case failure cause; wherein,
generating a profile of the matched at least one test case based on the first service information and the test.
2. The method of claim 1, wherein the determining at least one feature from log information generated from executing the first test case comprises:
re-testing using the first test case;
determining at least one feature from log information generated by the first execution and the re-execution of the first test case.
3. The method of claim 1, wherein selecting at least one failure cause from the second set that matches a feature in the first set comprises:
at least one failure cause matching the features in the first set is selected from the second set based on a similarity algorithm.
4. The method according to claim 1, wherein the method further comprises:
selecting the profile of at least one failed use case from a use case profile library;
determining the second set based on the selected profile of the at least one failed use case.
5. The method according to claim 1, wherein the method further comprises:
selecting, from the test case library based on a similarity algorithm, at least one test case that matches the profile of the first test case and failed to execute, to obtain a third set, and selecting, from the test case library based on the similarity algorithm, at least one test case that matches the profile of the first test case and executed successfully, to obtain a fourth set;
judging, based on the third set and the fourth set, whether the first test case has a code defect, to obtain a judgment result;
updating the profile of the first test case based on the judgment result.
6. The method of claim 5, wherein the determining whether the first test case has a code defect based on the third set and the fourth set comprises:
determining code path coverage blocks of the code of the third set and of the fourth set;
determining, based on the code path coverage blocks of the code of the third set, a difference set between the code of the first test case and the code of the third set, to obtain a fifth set, and determining, based on the code path coverage blocks of the code of the fourth set, an intersection between the code of the first test case and the code of the fourth set, to obtain a sixth set;
judging whether the fifth set and the sixth set involve code variation;
when it is determined that the fifth set and the sixth set involve code variation, determining that the first test case has a code defect.
7. The method according to any one of claims 1 to 6, further comprising:
after profiling is completed, updating the profile of the test case to the use case profile library.
8. A test case processing apparatus, comprising:
a first acquisition unit, configured to acquire first service information;
a second acquisition unit, configured to acquire at least one test case matching the first service information from a test case library;
a test unit, configured to perform a test using the at least one test case;
a determination unit, configured to, when execution of a first test case in the at least one test case fails, determine at least one feature from log information generated by executing the first test case, to obtain a first set;
a selection unit, configured to select at least one failure cause matching a feature in the first set from a second set; the second set comprises at least one test case failure cause; wherein,
a profiling unit, configured to generate a profile of the matched at least one test case based on the first service information and the test.
9. An electronic device, comprising: a processor and a memory for storing a computer program capable of running on the processor,
wherein the processor is configured to execute the steps of the test case processing method of any one of claims 1 to 7 when the computer program is run.
10. A storage medium having stored thereon a computer program, which when executed by a processor, implements the steps of the test case processing method of any of claims 1 to 7.
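Claims 1 to 3 describe matching features extracted from failure logs (the first set) against a library of known failure causes (the second set) using a similarity algorithm, which the claims leave unspecified. The following Python sketch illustrates one possible reading of those steps; the keyword-based feature extraction, the use of `difflib.SequenceMatcher` as the similarity measure, and the 0.5 threshold are all illustrative assumptions, not details disclosed by the patent:

```python
from difflib import SequenceMatcher

def extract_features(log_text):
    """Collect error-bearing log lines as the 'first set' of features.

    The keyword heuristic is an assumption; the patent does not specify
    how features are determined from the log information."""
    keywords = ("error", "exception", "timeout", "assert")
    return {line.strip() for line in log_text.splitlines()
            if any(k in line.lower() for k in keywords)}

def similarity(a, b):
    """String similarity in [0, 1] (stand-in for the claimed similarity algorithm)."""
    return SequenceMatcher(None, a, b).ratio()

def match_failure_causes(first_set, second_set, threshold=0.5):
    """Select causes from the 'second set' whose description is sufficiently
    similar to at least one extracted feature, best match first."""
    matched = []
    for cause in second_set:
        score = max((similarity(feat, cause) for feat in first_set), default=0.0)
        if score >= threshold:
            matched.append((cause, score))
    return sorted(matched, key=lambda pair: pair[1], reverse=True)
```

Under these assumptions, a failed run whose log contains a timeout line is matched to a stored "connection timeout" cause, while unrelated causes fall below the threshold and are filtered out.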
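Claim 6 judges a code defect from set operations over code path coverage: a difference set (the fifth set) between the failed case and similar failed cases, an intersection (the sixth set) with similar passed cases, and a defect declared when both involve code variation. The sketch below is one assumed reading of that claim — representing coverage as sets of block identifiers and "code variation" as a set of changed blocks is illustrative, not disclosed in the patent:

```python
def judge_code_defect(first_cover, third_cover, fourth_cover, changed_blocks):
    """Sketch of the claim-6 check.

    first_cover    : code blocks covered by the failed first test case
    third_cover    : blocks covered by similar cases that also failed (third set)
    fourth_cover   : blocks covered by similar cases that passed (fourth set)
    changed_blocks : blocks touched by recent code changes (assumed input)
    """
    fifth = first_cover - third_cover   # difference set (fifth set)
    sixth = first_cover & fourth_cover  # intersection (sixth set)
    # Per claim 6, a defect is judged only when both sets involve code variation.
    has_defect = bool(fifth & changed_blocks) and bool(sixth & changed_blocks)
    return has_defect, fifth, sixth
```

The intuition this encodes: blocks the failing case covers that its failing peers do not, and blocks it shares with passing peers, both intersecting recently changed code, point at the change itself rather than at the test.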
CN202111215876.4A 2021-10-19 2021-10-19 Test case processing method and device, electronic equipment and storage medium Pending CN115994081A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111215876.4A CN115994081A (en) 2021-10-19 2021-10-19 Test case processing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115994081A 2023-04-21


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination