CN114138659A - Test case processing method and system - Google Patents


Publication number
CN114138659A
CN114138659A
Authority
CN
China
Prior art keywords
case
tested
assertion
type
expected value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111487801.1A
Other languages
Chinese (zh)
Inventor
万玉子
刘爱辉
宋育芳
李淑凤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Construction Bank Corp
Original Assignee
China Construction Bank Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Construction Bank Corp
Priority to CN202111487801.1A
Publication of CN114138659A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3684 Test management for test design, e.g. generating new test cases
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application discloses a test case processing method and system. The case type of a case to be tested is determined. If the case type is the performance test case type, a first assertion sequence expected value is obtained, which is used to detect the execution condition of the case to be tested corresponding to the performance test case type. If the case type is the function test case type, a second assertion sequence expected value is obtained, which is used to detect the execution condition of the case to be tested corresponding to the function test case type. With this scheme, assertion expected values do not need to be set manually, wrong assertion expected values caused by manual error are avoided, and the accuracy and quality of testing the test cases are improved. In addition, the assertion sequence expected values corresponding to different case types are obtained separately by processing cases to be tested of different case types, which improves the fine granularity of testing the test cases.

Description

Test case processing method and system
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a test case processing method and system.
Background
Whether each test case is tested accurately directly affects the stable operation of a financial institution's business. Setting assertions for test cases during testing is a common practice for detecting whether test cases execute successfully.
An assertion is a technique that compares whether the output value of a test case is consistent with an expected value; its use helps improve code quality. Whether the assertion expected value is set correctly directly affects the test result. Currently, the expected value of an assertion is typically set manually to a static value and bound to a test case.
The existing practice of manually setting static assertion expected values cannot meet the growing demand for automated testing. At the same time, manually setting assertion expected values produces wrong expected values, which reduces both the accuracy of testing the test cases and the test quality.
Disclosure of Invention
In view of this, the present application discloses a test case processing method and system, which aim to improve the accuracy, quality and fine granularity of testing test cases.
In order to achieve the purpose, the technical scheme is as follows:
the application discloses a test case processing method in a first aspect, and the method comprises the following steps:
determining the case type of a case to be tested; the case types comprise a performance test case type and a function test case type;
if the case type of the case to be tested is the performance test case type, calculating the case to be tested corresponding to the performance test case type through a pre-constructed performance test index assertion sequence expected value model to obtain a first assertion sequence expected value; the expected value of the first assertion sequence is used for detecting the execution condition of a to-be-tested case corresponding to the performance test case type;
if the case type of the case to be tested is the function test case type, matching the case to be tested corresponding to the function test case type through a pre-constructed function test case assertion sequence library, and obtaining a second assertion sequence expected value according to a matching result; and the expected value of the second assertion sequence is used for detecting the execution condition of the case to be tested corresponding to the type of the functional test case.
Preferably, if the case type of the case to be tested is the performance test case type, calculating the case to be tested corresponding to the performance test case type through a pre-established performance test index assertion sequence expected value model to obtain a first assertion sequence expected value, including:
acquiring system characteristics and case characteristics;
carrying out standardized processing and multiple collinearity elimination processing on the system characteristics and the case characteristics to obtain a processing result;
determining a model coefficient based on a pre-established linear regression model;
constructing a performance test index assertion sequence expected value model according to the model coefficient and the processing result;
and calculating the characteristics of the to-be-tested case corresponding to the type of the performance test case through the performance test index assertion sequence expected value model to obtain a first assertion sequence expected value.
Preferably, after the calculating the characteristics of the to-be-tested case corresponding to the type of the performance test case through the performance test index assertion sequence expected value model to obtain a first assertion sequence expected value, the method further includes:
establishing an assertion library of the performance test cases through the system characteristics, the case characteristics and preset historical performance test cases acquired in advance;
acquiring an actual value of the assertion sequence; the assertion sequence is predicted by the performance test index assertion sequence expected value model;
comparing whether the expected value of the first assertion sequence is consistent with the actual value of the assertion sequence;
if they are consistent, deleting the case to be tested corresponding to the performance test case type;
and if they are not consistent, storing the characteristics of the case to be tested corresponding to the performance test case type and the actual value of the assertion sequence into the assertion library of the performance test cases.
Preferably, if the case type of the case to be tested is the function test case type, matching the case to be tested corresponding to the function test case type through a pre-established function test case assertion sequence library, and obtaining a second assertion sequence expected value according to a matching result, includes:
decomposing the to-be-tested case corresponding to the function test case type to obtain the characteristics of the to-be-tested case corresponding to the function test case type; the characteristics of the case to be tested comprise a method name, an interface name, the number of input parameters, an input parameter list, and the cyclomatic complexity of the interface;
and matching the characteristics of the to-be-tested case corresponding to the type of the function test case with preset characteristics in a pre-constructed function test case assertion sequence library by a preset level matching method to obtain a matching result, and generating a second assertion sequence expected value of the to-be-tested case corresponding to the type of the function test case based on the matching result.
Preferably, the matching the features of the case to be tested with the preset features in the function test case assertion sequence library by a preset hierarchy matching method to obtain a matching result, and generating a second assertion sequence expected value of the case to be tested corresponding to the function test case type based on the matching result, includes:
sorting the method names, the interface names, the numbers of input parameters, the input parameter lists and the cyclomatic complexities of the interfaces in the function test case assertion sequence library through a preset sorting rule, to obtain the sorted method names, interface names, numbers of input parameters, input parameter lists and cyclomatic complexities of the interfaces;
matching the method name of the case to be tested with the sorted method names;
if the method name of the case to be tested is successfully matched with the sorted method names, matching the interface name of the case to be tested with the sorted interface names;
if the interface name of the case to be tested is successfully matched with the sorted interface names, matching the number of input parameters of the case to be tested with the sorted numbers of input parameters;
if the number of input parameters of the case to be tested is successfully matched with the sorted numbers of input parameters, matching the input parameter list of the case to be tested with the sorted input parameter lists;
if the input parameter list of the case to be tested is successfully matched with the sorted input parameter lists, matching the cyclomatic complexity of the interface of the case to be tested with the sorted cyclomatic complexities of the interfaces;
and if the cyclomatic complexity of the interface of the case to be tested is successfully matched with the sorted cyclomatic complexities of the interfaces, generating a second assertion sequence expected value of the case to be tested corresponding to the function test case type.
Preferably, the method further comprises the following steps:
and if the matching of the features of any group of cases to be tested with the features in the functional test case assertion sequence library fails, matching the features of the cases to be tested corresponding to the next functional test case type.
A second aspect of the present application discloses a test case processing system, the system comprising:
the determining unit is used for determining the case type of the case to be tested; the case types comprise a performance test case type and a function test case type;
the calculation unit is used for calculating the to-be-tested case corresponding to the performance test case type through a pre-constructed performance test index assertion sequence expected value model to obtain a first assertion sequence expected value if the case type of the to-be-tested case is the performance test case type; the expected value of the first assertion sequence is used for detecting the execution condition of a to-be-tested case corresponding to the performance test case type;
the matching unit is used for matching the case to be tested corresponding to the function test case type through a pre-constructed function test case assertion sequence library if the case type of the case to be tested is the function test case type, and obtaining a second assertion sequence expected value according to a matching result; and the expected value of the second assertion sequence is used for detecting the execution condition of the case to be tested corresponding to the type of the functional test case.
Preferably, the calculation unit includes:
the acquisition module is used for acquiring system characteristics and case characteristics;
the processing module is used for carrying out standardized processing and multiple collinearity elimination processing on the system characteristics and the case characteristics to obtain a processing result;
the determining module is used for determining a model coefficient based on a pre-established linear regression model;
the construction module is used for constructing a performance test index assertion sequence expected value model according to the model coefficient and the processing result;
and the calculation module is used for calculating the characteristics of the to-be-tested case corresponding to the type of the performance test case through the performance test index assertion sequence expected value model to obtain a first assertion sequence expected value.
Preferably, the method further comprises the following steps:
the establishing unit is used for establishing an assertion library of the performance test case through the system characteristics, the case characteristics and a preset historical performance test case acquired in advance;
an obtaining unit, configured to obtain an actual value of the assertion sequence; the assertion sequence is predicted by the performance test index assertion sequence expected value model;
a comparison unit for comparing whether the expected value of the first assertion sequence is consistent with the actual value of the assertion sequence;
the deleting unit is used for deleting the case to be tested corresponding to the performance test case type if they are consistent;
and the storage unit is used for storing the characteristics of the case to be tested corresponding to the performance test case type and the actual value of the assertion sequence into the assertion library of the performance test cases if they are not consistent.
Preferably, the matching unit includes:
the decomposition module is used for decomposing a pre-acquired case to be tested to obtain the characteristics of the case to be tested; the characteristics of the case to be tested comprise a method name, an interface name, the number of input parameters, an input parameter list, and the cyclomatic complexity of the interface;
and the matching module is used for matching the characteristics of the to-be-tested case corresponding to the type of the function test case with the preset characteristics in a pre-constructed function test case assertion sequence library by a preset level matching method to obtain a matching result, and generating a second assertion sequence expected value of the to-be-tested case corresponding to the type of the function test case based on the matching result.
According to the technical scheme, the application discloses a test case processing method and system. The case type of a case to be tested is determined; the case types include a performance test case type and a function test case type. If the case type of the case to be tested is the performance test case type, the case to be tested corresponding to the performance test case type is calculated through a pre-constructed performance test index assertion sequence expected value model to obtain a first assertion sequence expected value, which is used for detecting the execution condition of the case to be tested corresponding to the performance test case type. If the case type of the case to be tested is the function test case type, the case to be tested corresponding to the function test case type is matched through a pre-constructed function test case assertion sequence library, and a second assertion sequence expected value is obtained according to the matching result, which is used for detecting the execution condition of the case to be tested corresponding to the function test case type. With this scheme, assertion expected values do not need to be set manually, wrong assertion expected values caused by manual error are avoided, and the accuracy of testing the test cases is improved. In addition, the assertion sequence expected values corresponding to different case types are obtained separately by processing cases to be tested of different case types, which improves the fine granularity of testing the test cases.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, it is obvious that the drawings in the following description are only embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts.
Fig. 1 is a schematic flowchart of a test case processing method disclosed in an embodiment of the present application;
fig. 2 is a schematic diagram of an interface call relationship disclosed in an embodiment of the present application;
FIG. 3 is a schematic illustration of cyclomatic complexity disclosed in an embodiment of the present application;
FIG. 4 is a schematic diagram of a process for obtaining an expected value of a second assertion sequence disclosed in an embodiment of the present application;
FIG. 5 is a schematic diagram of a test case processing system according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an electronic device disclosed in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In this application, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
As known from the background art, the existing practice of manually setting static assertion expected values cannot meet the growing demand for automated testing; at the same time, manually setting assertion expected values produces wrong expected values, which reduces both the accuracy of testing the test cases and the test quality.
In order to solve the above problems, the embodiments of the present application disclose a test case processing method and system, which do not require manually setting assertion expected values, avoid generating wrong assertion expected values due to manual error, and improve the accuracy and quality of testing the test cases. In addition, the assertion sequence expected values corresponding to different case types are obtained separately by processing cases to be tested of different case types, which improves the fine granularity of testing the test cases. The specific implementation is illustrated by the following examples.
Referring to fig. 1, a schematic flow chart of a test case processing method disclosed in an embodiment of the present application is shown, where the test case processing method mainly includes the following steps:
s101: determining the case type of a case to be tested; the case types include a performance test case type and a function test case type, and if the case type of the case to be tested is the performance test case type, S102 is executed, and if the case type of the case to be tested is the function test case type, S103 is executed.
Wherein, the number of cases to be tested can be multiple.
The test corresponding to the performance test case type is a performance test, namely, each performance index of the system is tested by simulating conditions such as normal, peak value and abnormal load. Performance metrics such as response time, maximum concurrency number, etc.
The test corresponding to the function test case type is a function test, which means that each function of the application version package is verified, and whether each function meets the requirements of the user is checked.
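The type dispatch of S101 can be sketched as follows. This is a minimal Python illustration; the function and field names are hypothetical, and the two stand-in bodies merely represent where S102 and S103 would plug in (the 0.023 response-time coefficient is the one given later in formula (4)):

```python
# Hypothetical dispatch on case type (S101): performance cases are routed
# to the expected-value model (S102), functional cases to library matching (S103).

def predict_performance_assertions(case):
    # Stand-in for the S102 regression model; coefficient from formula (4).
    return [("response_time", 0.023 * case["cpu_cores"])]

def match_functional_assertions(case):
    # Stand-in for the S103 assertion-sequence library matching.
    return [("return_value", "OK")]

def get_expected_values(case):
    if case["type"] == "performance":
        return predict_performance_assertions(case)
    if case["type"] == "functional":
        return match_functional_assertions(case)
    raise ValueError("unknown case type: %s" % case["type"])

print(get_expected_values({"type": "performance", "cpu_cores": 2}))
```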
S102: calculating a to-be-tested case corresponding to the type of the performance test case through a pre-constructed performance test index assertion sequence expected value model to obtain a first assertion sequence expected value; the expected value of the first assertion sequence is used for detecting the execution condition of the case to be tested corresponding to the performance test case type.
In S102, one test case often involves several sub-functions, and a sub-function is usually implemented in one of two ways: internal or external.
In an object-oriented programming language, the internal way is a method; in a procedure-oriented programming language, it is a function. The external way is typically a call interface. An assertion can be set on the return of each method (function) or interface, and multiple assertions are combined into a sequence of the format {{method (function) or interface number, assertion value}, {method (function) or interface number, assertion value}, …}.
And detecting whether the execution of the to-be-tested case corresponding to the performance test case type is successful or not through the expected value of the first assertion sequence.
The first assertion sequence expected value is combined from the expected values of several assertions of a test case into a sequence of the format {{method (function) or interface number, assertion expected value}, {method (function) or interface number, assertion expected value}, …}.
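Such an assertion sequence can be represented as an ordered list of (number, expected value) pairs. A minimal sketch (the representation is an assumption, not prescribed by the patent) of checking a case's actual values against the sequence:

```python
# Hypothetical representation of the sequence format
# {{method/interface number, assertion expected value}, ...}
# as an ordered list of (number, value) pairs.

def check_assertion_sequence(expected_seq, actual_seq):
    """Return True only if every (number, value) pair matches in order."""
    if len(expected_seq) != len(actual_seq):
        return False
    return all(exp == act for exp, act in zip(expected_seq, actual_seq))

expected = [("method_1", 200), ("interface_2", "OK")]
actual = [("method_1", 200), ("interface_2", "OK")]
print(check_assertion_sequence(expected, actual))  # True
```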
Specifically, the first assertion sequence expected value is obtained through steps A1-A5:
a1: system features and case features are obtained.
The system characteristics refer to basic attributes of the system, such as hardware resource size, physical architecture, logical architecture, user number, transaction amount, and other information of the device.
The case characteristics refer to basic attributes of the case, such as the case number, which sub-function points are included, the number of sub-function points, method or interface names, the input parameter list, the number of input parameters, cyclomatic complexity, interface call complexity, and so on.
A2: and carrying out standardized processing and multiple collinearity elimination processing on the system characteristics and the case characteristics to obtain a processing result.
Specifically, the system characteristics and the case characteristics are subjected to standardization processing and multiple collinearity elimination processing, and a processing result is obtained through the following process:
(1) the system features and case features are normalized, classified according to character values, and the feature values are converted from character types to numerical types, as shown in table 1.
Feature name        Actual value (character type)    After standardization (numerical type)
Database version    11g                              1
Database version    12c                              2
TABLE 1
Wherein 11g and 12c are actual character values of the database version; the numerical value 1 represents the standardized value of one database version, and the numerical value 2 represents the standardized value of the other database version.
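The character-to-numeric standardization of Table 1 can be sketched as a simple categorical encoding. The code-assignment order below is an assumption; the patent only shows the resulting numeric values:

```python
def standardize(values):
    """Map each distinct character value of a feature to a numeric code."""
    codes = {}
    encoded = []
    for v in values:
        if v not in codes:
            codes[v] = len(codes) + 1  # e.g. "11g" -> 1, "12c" -> 2
        encoded.append(codes[v])
    return encoded, codes

encoded, mapping = standardize(["11g", "12c", "11g"])
print(encoded)   # [1, 2, 1]
print(mapping)   # {'11g': 1, '12c': 2}
```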
(2) Multiple collinearity analysis is performed on each feature by using the variance inflation factor (VIF); system features and case features with correlation are screened out, and multiple collinearity is eliminated.
A linear regression model of each feature against the remaining features is constructed. Taking feature y1 as an example (the features include system features and case features), feature y1 is expressed as formula (1):
y1 = c2·y2 + c3·y3 + … + cp·yp + c0 + ε (1)
wherein y1 is the assertion value sequence of response time; y2 is the assertion value sequence of concurrency; y3 is the assertion value sequence of processing capability; yp is the assertion value sequence of performance index p; c2 is the regression coefficient of feature y2 with feature y1; c3 is the regression coefficient of feature y3 with feature y1; cp is the regression coefficient of feature yp with feature y1; c0 is a constant term; ε is a random error term.
The coefficient of determination R1² is obtained from the linear regression model, and the variance inflation factor of feature y1 is calculated as:
VIF1 = 1 / (1 − R1²)
If VIF1 is greater than 10, the feature is discarded from the independent variable sequence of the regression model for performance test assertion expected values; otherwise it is placed in the feature set, e.g. {y1, y2, y5, …}.
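The VIF screening of step (2) can be sketched as follows: each feature is regressed on the remaining features, R² is computed, and VIF = 1/(1 − R²); a feature with VIF > 10 is discarded. The data below is illustrative only:

```python
import numpy as np

def vif(X, j):
    """Variance inflation factor of column j of feature matrix X."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([others, np.ones(len(y))])  # add an intercept term
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # regress y on the rest
    resid = y - A @ coef
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    r2 = 1.0 - ss_res / ss_tot
    return float("inf") if r2 >= 1.0 else 1.0 / (1.0 - r2)

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = 2.0 * x1 + rng.normal(scale=0.01, size=100)  # nearly collinear with x1
x3 = rng.normal(size=100)
X = np.column_stack([x1, x2, x3])
print(vif(X, 0) > 10, vif(X, 2) < 10)  # the collinear feature is flagged
```

Here the nearly collinear pair (x1, x2) yields a very large VIF, while the independent feature x3 stays near 1 and is kept.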
A3: based on a pre-established linear regression model, the model coefficients are determined.
Based on a pre-established linear regression model, the process of determining the model coefficients is as follows:
the system characteristics, the case characteristics and the like are independent variables (including but not limited to), the predicated expected value of the performance test index is a dependent variable y, a linear regression model is established, and model coefficients and confidence intervals of the system characteristics and the case characteristics are obtained. The variable y is calculated as shown in equation (2).
Figure BDA0003397273730000091
Wherein x is1The characteristic cpu core number in the response time class case set; x is the number of2The characteristic memory size in the response time class case set; x is the number ofpIs the size of the feature p in the set of response time class cases; a is0Is a constant term; a is1And b1Are all x1Regression coefficients with y; a is2And b2Are all x2Regression coefficients with y; a is2And bpAre all xpRegression coefficients with y; ε is a random error term.
The expression of the model coefficients of the system characteristics and the case characteristics is shown in formula (3).
[Formula (3): the matrix form relating the assertion value sequences y11 … ynm to the feature values x11 … xpm; the formula image is not reproduced here.]
Wherein y1 is the assertion value sequence of response time; y2 is the assertion value sequence of concurrency; yn is the assertion value sequence of performance index n, where n is an integer greater than or equal to 1.
y11 is the assertion value of response time class case 1, where case 1 denotes the 1st execution of the case; y12 is the assertion value of response time class case 2, where case 2 denotes the 2nd execution of the case; y1m is the assertion value of the mth response time class case, where m is an integer greater than or equal to 1.
x1 is the feature "number of CPU cores" in the response time class case set; x11 is the number of CPU cores of the device used in the test of response time class case 1; x12 is the number of CPU cores of the device used in the test of response time class case 2; x1m is the number of CPU cores of the device used in the test of the mth response time class case.
x2 is the feature "memory size" in the response time class case set; x21 is the memory size used in the test of response time class case 1; x22 is the memory size used in the test of response time class case 2; x2m is the memory size used in the test of the mth response time class case, where m is an integer greater than or equal to 1.
xp is feature p in the response time class case set; xp1 is the size of feature p used in the test of response time class case 1; xp2 is the size of feature p used in the test of response time class case 2; xpm is the size of feature p used in the test of the mth response time class case, where m is an integer greater than or equal to 1.
The model coefficients of the specific system features and case features are shown in table 2.
[Table 2: model coefficients of the system features and case features; the table image is not reproduced here.]
TABLE 2
In Table 2, y1 is the assertion value sequence of response time; y3 is the assertion value sequence of availability; y4 is the assertion value sequence of system transaction amount; y5 is the assertion value sequence of system processing capability; y7 is the assertion value sequence of transaction success rate; a1 and b1 are regression coefficients of x1 with y; a2 and b2 are regression coefficients of x2 with y; ap and bp are regression coefficients of xp with y.
A4: and constructing a performance test index assertion sequence expected value model through the model coefficients and the processing result.
According to the model coefficients, the performance index assertion sequence expected value model is obtained by analyzing the confidence intervals and whether each parameter influences the test result.
For example, the confidence interval values show that the confidence of the features b2, ap and bp is low. Analysis shows that the value of b2 influences the test result and needs to be retained, while the values of ap and bp do not influence the test result, so those feature items are eliminated, and so on.
The model for obtaining the expected value of the performance index assertion sequence is shown as formula (4), formula (5) and formula (6).
y1 = 0.023·x1 (4)
y3 = 0.011·x1 (5)
[Formula (6) is rendered as an image in the original publication.]
Wherein y1 is the response-time assertion value sequence; y3 is the availability assertion value sequence; y4 is the system-transaction-amount assertion value sequence; x1 is the CPU-core-count feature in the response time class case set; and x2 is the memory-size feature in the response time class case set.
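Formulas (4) and (5) can be evaluated directly; the sketch below uses the coefficients from the text, with illustrative function names (formula (6) is omitted because it survives only as an image).

```python
# Evaluating the fitted expected-value models of formulas (4) and (5).
# Function names are illustrative; the coefficients come from the text.

def expected_response_time(cpu_cores):
    """y1 = 0.023 * x1: expected response-time assertion value."""
    return 0.023 * cpu_cores

def expected_availability(cpu_cores):
    """y3 = 0.011 * x1: expected availability assertion value."""
    return 0.011 * cpu_cores

x1 = 4  # number of CPU cores of the case under test
print(expected_response_time(x1))
print(expected_availability(x1))
```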
A5: and calculating the characteristics of the to-be-tested case corresponding to the type of the performance test case through the performance test index assertion sequence expected value model to obtain a first assertion sequence expected value.
Optionally, after the first assertion sequence expected value is obtained by calculating the features of the to-be-tested case corresponding to the performance test case type through the performance test index assertion sequence expected value model, steps B1-B5 are executed.
B1: and establishing an assertion library of the performance test cases through the system characteristics, the case characteristics and the preset historical performance test cases acquired in advance.
The system characteristics include the number of CPU cores, the memory size, the number of cluster machines, and the like.
Case characteristics include interface call complexity, etc.
The preset historical performance test case is the performance test case before the current moment.
The assertion library for a particular performance test case is shown in table 3.
[Table 3 is rendered as an image in the original publication.]
TABLE 3
In Table 3, the assertion library of the performance test cases includes the performance test case number (N0001, N0002, N0003), the system characteristics (number of CPU cores, memory size, number of cluster machines), the case characteristics (interface call complexity), and the assertion value (response time).
For performance test case N0001, the number of CPU cores is 2c, the memory size is 8g, the number of cluster machines is 3, the interface call complexity is 11, and the response time is 33 ms.
For performance test case N0002, the number of CPU cores is 4c, the memory size is 16g, the number of cluster machines is 6, the interface call complexity is 23, and the response time is 17 ms.
For performance test case N0003, the number of CPU cores is 2c, the memory size is 6g, the number of cluster machines is 9, the interface call complexity is 19, and the response time is 29 ms.
The interface call complexity measures the complexity of calls to external interfaces in a test case, and covers the number of interface calls and the call relationships. The number of interface calls includes the interfaces of nested calls; the larger the number of interface calls, the longer the case's assertion expected-value sequence. Interface call relationships fall into three modes: single-point calls, dependent calls and loop calls. A schematic diagram of a specific interface call relationship is shown in fig. 2.
In fig. 2, an interface 1 is a single-point call mode, an interface 2, an interface 3, an interface 4, and an interface 5 are dependent call modes, and an interface 6 is a loop call mode.
The interface call complexity of test case n is denoted Cn, and the interface call complexity model is constructed as shown in equation (7).
[Equation (7) is rendered as an image in the original publication.]
In equation (7), the first summation term gives the number of interfaces using dependent calls in the case n to be tested, where cq and bq are the dependent-call interface complexity coefficients; the second summation term gives the number of interfaces using loop calls in test case n, where cp and bp are the loop-call interface complexity coefficients; mn is the number of interfaces using single-point calls in test case n; and cf is the single-point-call interface complexity coefficient.
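Because equation (7) survives only as an image, the sketch below shows merely the spirit of the model, a weighted combination of the three call modes; the weights and function name are illustrative assumptions, not the patent's coefficients.

```python
# Hedged sketch of an interface-call-complexity score in the spirit of
# equation (7): a weighted sum over the three call modes, with loop
# calls weighted highest.  Weights are illustrative assumptions.

def call_complexity(single_point, dependent, loop,
                    w_single=1.0, w_dep=2.0, w_loop=3.0):
    return w_single * single_point + w_dep * dependent + w_loop * loop

# The configuration of fig. 2: interface 1 single-point, interfaces 2-5
# dependent, interface 6 loop.
print(call_complexity(single_point=1, dependent=4, loop=1))  # 12.0
```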
The complexity of the code structure of a test case is expressed by its circle complexity (cyclomatic complexity). A specific illustration of the circle complexity is shown in FIG. 3.
In fig. 3, a circle complexity of 1 means the code has only one path; for code with one branch, the circle complexity is 2.
The circle complexity represents the maximum number of assertions of a method or interface in the case. The larger the circle complexity, the higher the possibility of an error in the case test.
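As a rough illustration of the rule just stated (straight-line code has circle, i.e. cyclomatic, complexity 1, and each branch adds 1), the following counter over Python AST branch nodes is an illustrative approximation, not the patent's metric.

```python
# Illustrative approximation of circle (cyclomatic) complexity for
# Python source: 1 for straight-line code, plus 1 per branch node.
import ast

def circle_complexity(source):
    tree = ast.parse(source)
    branches = sum(isinstance(node, (ast.If, ast.For, ast.While))
                   for node in ast.walk(tree))
    return 1 + branches

print(circle_complexity("x = 1"))             # 1: only one path
print(circle_complexity("if a:\n    x = 1"))  # 2: one branch
```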
B2: acquiring an actual value of an assertion sequence; the assertion sequence is obtained through prediction of a performance test index assertion sequence expected value model.
That is, the assertion sequence whose actual value is acquired is the sequence predicted by the performance test index assertion sequence expected value model.
The assertion behavior of a test case is measured by the assertion sequence method: the circle complexity measures the maximum number of assertions of an interface or method in the test case, and the number of interface calls measures the length of the assertion sequence in the test case.
B3: and comparing whether the expected value of the first assertion sequence is consistent with the actual value of the assertion sequence.
B4: and if the expected value is consistent with the actual value, deleting the to-be-tested case corresponding to the performance test case type.
B5: and if the actual values of the characteristics and the assertion sequence of the to-be-tested case corresponding to the type of the performance testing case are not consistent, storing the actual values of the characteristics and the assertion sequence of the to-be-tested case corresponding to the type of the performance testing case into an assertion library of the performance testing case.
The characteristics of the cases to be tested include case names, case numbers, system characteristics, case characteristics and the like.
The actual value of the assertion sequence is predicted through the performance test index assertion sequence expected value model, and the regression model of the performance test index assertion sequence expected value is trained, realizing self-learning optimization of the regression model.
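The compare-then-delete-or-store flow of B3-B5 can be sketched as follows; the function name and data shapes are illustrative assumptions.

```python
# Sketch of the B3-B5 flow: compare the first assertion-sequence
# expected value with the actual value; on a match the case to be
# tested is dropped, otherwise its features and actual values go into
# the assertion library for retraining.  Data shapes are illustrative.

def reconcile(case, expected, actual, assertion_library, pending_cases):
    if expected == actual:
        pending_cases.discard(case["number"])                 # B4: delete case
    else:
        assertion_library.append({**case, "actual": actual})  # B5: store

library, pending = [], {"N0004"}
case = {"number": "N0004", "cpu_cores": 2, "memory_gb": 8}
reconcile(case, expected=[33], actual=[45],
          assertion_library=library, pending_cases=pending)
print(len(library), "N0004" in pending)  # 1 True
```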
S103: matching the pre-constructed functional test case assertion sequence library with a to-be-tested case corresponding to the type of the functional test case, and obtaining a second assertion sequence expected value according to a matching result; the expected value of the second assertion sequence is used for detecting the execution condition of the case to be tested corresponding to the type of the functional test case.
And detecting whether the execution of the to-be-tested case corresponding to the type of the functional test case is successful or not through the expected value of the second assertion sequence.
The process of constructing the functional test case assertion sequence library is as follows:
the tested functional test cases are decomposed into sub-functions, and the sub-functions are decomposed into methods or interfaces; each method or interface has case features such as a number, a name, a parameter list, the number of input parameters, the circle complexity of the method or interface, and assertions. The circle complexity of a case is the maximum of the circle complexities of its interfaces or methods (functions). The length of a case's assertion sequence is the number of calls to the case's interfaces or methods (functions).
For better understanding, a functional test case for a transaction-amount-monitoring short-message alarm is described as an example. This test case can be divided into two sub-functions, monitoring and alarming. The monitoring sub-function is divided into an interface for acquiring the real-time transaction amount, an interface for acquiring the historical transaction amount, and a method for calculating the transaction-amount baseline; whether to alarm is decided by comparing the real-time transaction-amount data with the baseline data. If the real-time transaction amount is larger than the baseline data, the alarm interface is called, and whether the short-message alarm interface or the mail alarm interface is called is determined by a parameter. If the real-time transaction amount is less than the baseline data, no alarm is given.
An assertion is set at each interface or method (function), as shown in table 4.
[Table 4 is rendered as images in the original publication.]
TABLE 4
In Table 4, the assertion sequence is {{I0001, 0}, {M0001, "alarm" or "no alarm"}, {I0002, "short message alarm" or "mail alarm"} ...}; the length of the assertion sequence, i.e. the number of interface or method (function) calls, is 7; and the circle complexity of the case, i.e. the maximum circle complexity over its interfaces or methods (functions), is 2. The assertion sequences and case features of all historical tested functional cases are put into storage to form the functional test case assertion sequence library.
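One library entry for the short-message-alarm example above might be stored as follows; the allowed assertion values are modelled as sets, and the key names are illustrative, not from the patent.

```python
# Hypothetical library entry for the Table 4 case: assertion sequence,
# sequence length (number of interface/method calls) and case circle
# complexity (maximum over the interfaces/methods).

entry = {
    "assertions": [("I0001", {0}),
                   ("M0001", {"alarm", "no alarm"}),
                   ("I0002", {"short message alarm", "mail alarm"})],
    "sequence_length": 7,    # number of interface/method calls
    "circle_complexity": 2,  # max circle complexity of any interface/method
}
print(entry["sequence_length"], entry["circle_complexity"])  # 7 2
```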
Specifically, the process of matching the pre-established assertion sequence library of the functional test case with the to-be-tested case corresponding to the type of the functional test case and obtaining the expected value of the second assertion sequence according to the matching result is as follows:
firstly, the to-be-tested case corresponding to the functional test case type is decomposed to obtain the features of the to-be-tested case corresponding to the functional test case type; the features of the case to be tested include the method name, the interface name, the number of input parameters, the parameter list and the circle complexity of the interface.
The to-be-tested case corresponding to the functional test case type (for example, case number GA0001) is decomposed into sub-functions, and the sub-functions are decomposed into methods or interfaces; for each method or interface, case features such as the name, the parameter list, the number of input parameters and the circle complexity are analyzed and combined into a method/interface list L = {l1, l2, ..., ln}, where n is an integer greater than or equal to 1.
Wherein l1 is the name, parameter list, number of input parameters, circle complexity and other information of the 1st decomposed method or interface; l2 is that of the 2nd decomposed method or interface; and ln is that of the n-th decomposed method or interface, where n is an integer greater than or equal to 1.
Examples are as follows, where ln is the name, parameter list, number of input parameters, circle complexity and other information of the n-th decomposed method or interface:
{
l1 = {method or interface name M1, {input parameter P11, input parameter C12, input parameter C13, ...}, N1, Q1, ...},
l2 = {method or interface name M2, {input parameter P21, input parameter C22, input parameter C23, ...}, N2, Q2},
...}.
Wherein N1 is the number of input parameters of method or interface 1; Q1 is the circle complexity of method or interface 1; N2 is the number of input parameters of method or interface 2; and Q2 is the circle complexity of method or interface 2.
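The method/interface list L described above can be modelled as a simple record type; the class and field names are illustrative assumptions.

```python
# The decomposed method/interface list L: each element records the
# name, parameter list, parameter count N and circle complexity Q.
from dataclasses import dataclass

@dataclass
class MethodFeatures:
    name: str        # method or interface name (M1, M2, ...)
    params: list     # input-parameter list
    n_params: int    # N: number of input parameters
    complexity: int  # Q: circle complexity

L = [MethodFeatures("M1", ["P11", "C12", "C13"], 3, 2),
     MethodFeatures("M2", ["P21", "C22", "C23"], 3, 1)]
print(L[0].name, L[0].n_params)  # M1 3
```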
And then, matching the characteristics of the to-be-tested case corresponding to the type of the function test case with preset characteristics in a pre-constructed function test case assertion sequence library by a preset hierarchy matching method to obtain a matching result, and generating a second assertion sequence expected value of the to-be-tested case corresponding to the type of the function test case based on the matching result.
For example, l1, l2, ... are taken from the interface list L in turn, and the name, parameter list, number of input parameters and interface circle complexity of l1 are matched against the data in the assertion sequence library. Matching is hierarchical: the method or interface names and parameter names in the assertion sequence library are sorted alphabetically; when a method or interface name matches successfully, the number of input parameters is matched; when the numbers of input parameters are the same, the parameter lists are matched; when the parameter lists match successfully, the circle complexity is matched, and so on. If any level of matching fails, matching moves on to the next interface or method of the case to be tested. When all case features are identical, the assertion value of the corresponding method or interface in the assertion sequence library is taken as the assertion expected value of the method or interface of the case to be tested. The hierarchical matching method avoids inefficient full-table scanning and improves matching efficiency.
Specifically, the process of matching the features of the to-be-tested case corresponding to the type of the functional test case with the preset features in the pre-constructed assertion sequence library of the functional test case by the preset hierarchy matching method to obtain a matching result, and generating the expected value of the second assertion sequence of the to-be-tested case corresponding to the type of the functional test case based on the matching result is shown in fig. 4.
Referring to fig. 4, a process involved in S302, matching features of a to-be-tested case corresponding to a type of a functional test case with preset features in a pre-constructed assertion sequence library of the functional test case by using a preset hierarchy matching method to obtain a matching result, and generating an expected value of a second assertion sequence of the to-be-tested case corresponding to the type of the functional test case based on the matching result, mainly includes the following steps:
s401: and sequencing the method names, the interface names, the parameter numbers, the parameter list and the circle complexity of the interfaces in the function test case assertion sequence library through a preset sequencing rule to obtain the sequenced method names, the sequenced interface names, the sequenced parameter numbers, the sequenced parameter list and the circle complexity of the sequenced interfaces.
S402: and matching the method name of the case to be tested with the sorted method name, executing S403 if the method name of the case to be tested is successfully matched with the sorted method name, and executing S408 if the method name of the case to be tested is failed to be matched with the sorted method name.
If the method name of the case to be tested is consistent with the sorted method name, the matching is successful, and if the method name of the case to be tested is not consistent with the sorted method name, the matching is failed.
S403: and matching the interface name of the case to be tested with the ordered interface name, if the interface name of the case to be tested is successfully matched with the ordered interface name, executing S404, and if the interface name of the case to be tested is unsuccessfully matched with the ordered interface name, executing S408.
If the interface name of the case to be tested is consistent with the ordered interface name, the matching is successful, and if the interface name of the case to be tested is inconsistent with the ordered interface name, the matching is failed.
S404: and matching the number of the entries of the cases to be tested with the sorted number of the entries, if the number of the entries of the cases to be tested is successfully matched with the sorted number of the entries, executing S405, and if the number of the entries of the cases to be tested is unsuccessfully matched with the sorted number of the entries, executing S408.
If the number of the entries of the cases to be tested is consistent with the ordered number of the entries, the matching is successful, and if the number of the entries of the cases to be tested is inconsistent with the ordered number of the entries, the matching is failed.
S405: and matching the entry list of the case to be tested with the sorted entry list, if the entry list of the case to be tested is successfully matched with the sorted entry list, executing S406, and if the entry list of the case to be tested is unsuccessfully matched with the sorted entry list, executing S408.
If the entry list of the case to be tested is consistent with the sorted entry list, the matching is successful, and if the entry list of the case to be tested is inconsistent with the sorted entry list, the matching is failed.
S406: and matching the circle complexity of the interface of the case to be tested with the circle complexity of the sorted interface, executing S407 if the circle complexity of the interface of the case to be tested is successfully matched with the circle complexity of the sorted interface, and executing S408 if the circle complexity of the interface of the case to be tested is failed to be matched with the circle complexity of the sorted interface.
If the circle complexity of the interface of the case to be tested is consistent with the circle complexity of the sorted interface, the matching succeeds; if it is inconsistent, the matching fails.
S407: and generating a second assertion sequence expected value of the to-be-tested case corresponding to the type of the functional test case.
In S407, after all the methods or interfaces in the to-be-tested case corresponding to the functional test case type have been matched, the matching results are combined to generate the assertion sequence expected value of the case. For example, the assertion sequence expected value of case GA0001 is: GA0001 = {{l1: 0}, {l2: 'short message alarm' or 'mail alarm'} ...}.
The number of input parameters is matched when the method name or interface name matches successfully; the parameter list is matched when the numbers of input parameters are the same; and the circle complexity is matched when the parameter lists match successfully, until all case features in the matched feature set have been matched successfully. Inefficient full-table scanning is avoided, and the matching efficiency and the reuse of functional historical case data are improved.
S408: and executing the matching of the characteristics of the case to be tested corresponding to the next functional test case type.
And if the matching of the features of any group of cases to be tested with the features in the functional test case assertion sequence library fails, matching the features of the cases to be tested corresponding to the next functional test case type.
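Steps S401-S408 can be sketched as a level-by-level comparison that falls through on the first mismatch; all structures and names below are illustrative (a production version would binary-search the alphabetically sorted names rather than scan).

```python
# Sketch of the hierarchical matching of S401-S408: compare name,
# number of input parameters, parameter list and circle complexity in
# turn, skipping a library entry on the first mismatch.

def match(case_feat, library):
    """Return the library assertion for a fully matching entry, else None."""
    for entry in sorted(library, key=lambda e: e["name"]):  # S401: sorted
        if case_feat["name"] != entry["name"]:
            continue                                        # S402/S403 fail
        if case_feat["n_params"] != entry["n_params"]:
            continue                                        # S404 fail
        if case_feat["params"] != entry["params"]:
            continue                                        # S405 fail
        if case_feat["complexity"] != entry["complexity"]:
            continue                                        # S406 fail
        return entry["assertion"]                           # S407: all match
    return None                                             # S408: next case

lib = [{"name": "M0001", "n_params": 2, "params": ["amount", "baseline"],
        "complexity": 2, "assertion": "alarm"}]
feat = {"name": "M0001", "n_params": 2, "params": ["amount", "baseline"],
        "complexity": 2}
print(match(feat, lib))  # alarm
```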
The case characteristics used in the hierarchical matching method are dynamically changed in a self-learning manner, the assertion sequence of the case to be tested is manually established when the matching between the characteristics of any group of cases to be tested and the characteristics in the function test case assertion sequence library fails, and relevant information is added into the assertion sequence library. And simultaneously, analyzing reasons causing matching failure, extracting new case characteristic factors, and adding the case characteristic factors into case characteristics needing matching.
In an automated test process, the automatically generated assertion sequence values and the case execution steps are assembled automatically, which improves test efficiency.
According to the method and the device, the regression model of the expected value of the assertion sequence is continuously and dynamically optimized through historical case data, the multiplexing degree of the historical cases and the generation accuracy of the assertion sequence are improved, the testing quality is improved, and the stable operation capacity of the production environment service is enhanced.
By a full-process automation mode, the method has remarkable advantages under the condition of large-scale case tests such as micro-service and distributed case tests at the present stage, and meanwhile, the user usability and friendliness are high, and the method is convenient to popularize.
In the embodiment of the application, the assertion expected value does not need to be set manually, which avoids generating wrong assertion expected values for manual reasons and improves the accuracy and quality of test-case testing. In addition, the expected values of the assertion sequences corresponding to different case types are obtained separately by processing to-be-tested cases of different case types, which refines the granularity of test-case testing.
Based on the test case processing method disclosed in fig. 1 in the foregoing embodiment, an embodiment of the present application further discloses a test case processing system, and as shown in fig. 5, the test case processing system includes a determining unit 501, a calculating unit 502, and a matching unit 503.
A determining unit 501, configured to determine a case type of a case to be tested; the case types include a performance test case type and a functional test case type.
The calculating unit 502 is configured to calculate, if the case type of the to-be-tested case is the performance testing case type, the to-be-tested case corresponding to the performance testing case type through a pre-constructed performance testing index assertion sequence expected value model to obtain a first assertion sequence expected value; the expected value of the first assertion sequence is used for detecting the execution condition of the case to be tested corresponding to the performance test case type.
The matching unit 503 is configured to match the to-be-tested case corresponding to the type of the function test case with a pre-established assertion sequence library of the function test case if the type of the to-be-tested case is the type of the function test case, and obtain an expected value of a second assertion sequence according to a matching result; the expected value of the second assertion sequence is used for detecting the execution condition of the case to be tested corresponding to the type of the functional test case.
Further, the calculating unit 502 includes an obtaining module, a processing module, a determining module, a constructing module, and a calculating module.
And the acquisition module is used for acquiring the system characteristics and the case characteristics.
And the processing module is used for carrying out standardized processing and multiple collinearity elimination processing on the system characteristics and the case characteristics to obtain a processing result.
And the determining module is used for determining the model coefficient based on a pre-established linear regression model.
And the construction module is used for constructing a performance test index assertion sequence expected value model through the model coefficient and the processing result.
And the calculation module is used for calculating the characteristics of the to-be-tested case corresponding to the type of the performance test case through the performance test index assertion sequence expected value model to obtain a first assertion sequence expected value.
Furthermore, the test case processing system also comprises an establishing unit, an obtaining unit, a comparing unit, a deleting unit and a storing unit.
And the establishing unit is used for establishing an assertion library of the performance test case through the system characteristics, the case characteristics and the preset historical performance test case acquired in advance.
An obtaining unit, configured to obtain an actual value of the assertion sequence; the assertion sequence is obtained through prediction of a performance test index assertion sequence expected value model.
And the comparison unit is used for comparing whether the expected value of the first assertion sequence is consistent with the actual value of the assertion sequence.
And the deleting unit is used for deleting the to-be-tested case corresponding to the performance test case type if the expected value is consistent with the actual value.
And the storage unit is used for storing the characteristics of the to-be-tested case corresponding to the type of the performance test case and the actual values of the assertion sequence into the assertion library of the performance test case if the characteristics are inconsistent.
Further, the matching unit 503 includes a decomposition module and a matching module.
The decomposition module is used for decomposing the to-be-tested case corresponding to the function test case type to obtain the characteristics of the to-be-tested case corresponding to the function test case type; the characteristics of the case to be tested comprise a method name, an interface name, the number of the entries, an entry list and the circle complexity of the interface.
And the matching module is used for matching the characteristics of the to-be-tested case corresponding to the type of the function test case with the preset characteristics in a pre-constructed function test case assertion sequence library by a preset level matching method to obtain a matching result, and generating a second assertion sequence expected value of the to-be-tested case corresponding to the type of the function test case based on the matching result.
Further, the matching module comprises a sorting submodule, a first matching submodule, a second matching submodule, a third matching submodule, a fourth matching submodule, a fifth matching submodule and a generating submodule.
And the sequencing submodule is used for sequencing the method names, the interface names, the parameter numbers, the parameter list and the round complexity of the interfaces in the functional test case assertion sequence library through a preset sequencing rule to obtain the sequenced method names, the sequenced interface names, the sequenced parameter numbers, the sequenced parameter list and the round complexity of the sequenced interfaces.
And the first matching submodule is used for matching the method name of the case to be tested with the sequenced method name.
And the second matching submodule is used for matching the interface name of the case to be tested with the ordered interface name if the method name of the case to be tested is successfully matched with the ordered method name.
And the third matching submodule is used for matching the number of input parameters of the case to be tested with the sorted number of input parameters if the interface name of the case to be tested is successfully matched with the sorted interface name.
And the fourth matching submodule is used for matching the entry list of the case to be tested with the sorted entry list if the entry number of the case to be tested is successfully matched with the sorted entry number.
And the fifth matching submodule is used for matching the circle complexity of the interface of the case to be tested with the circle complexity of the sorted interface if the match between the entry list of the case to be tested and the sorted entry list is successful.
And the generation submodule is used for generating a second assertion sequence expected value of the to-be-tested case corresponding to the type of the function test case if the circle complexity of the interface of the to-be-tested case is successfully matched with the circle complexity of the sequenced interface.
Further, the test case processing system also includes an execution unit.
And the execution unit is used for matching the features of the to-be-tested case corresponding to the next functional test case type if the matching of the features of any group of to-be-tested cases and the features in the functional test case assertion sequence library fails.
In the embodiment of the application, the assertion expected value does not need to be set manually, which avoids generating wrong assertion expected values for manual reasons and improves the accuracy and quality of test-case testing. In addition, the expected values of the assertion sequences corresponding to different case types are obtained separately by processing to-be-tested cases of different case types, which refines the granularity of test-case testing.
The embodiment of the application also provides a storage medium, wherein the storage medium comprises stored instructions, and when the instructions are executed, the equipment where the storage medium is located is controlled to execute the test case processing method.
The embodiment of the present application further provides an electronic device, which is shown in fig. 6 and specifically includes a memory 601 and one or more instructions 602, where the one or more instructions 602 are stored in the memory 601 and configured to be executed by the one or more processors 603 to execute the one or more instructions 602 to perform the test case processing method.
While, for simplicity of explanation, the foregoing method embodiments have been described as a series of acts or a combination of acts, it will be appreciated by those skilled in the art that the present application is not limited by the order of the acts described, as some steps may occur in other orders or concurrently with other steps in accordance with the application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules involved are not necessarily required by this application.
It should be noted that the embodiments in this specification are described in a progressive manner: each embodiment focuses on its differences from the other embodiments, and the same or similar parts among the embodiments may be referred to one another. Since the system embodiments are basically similar to the method embodiments, their description is brief; for relevant details, reference may be made to the corresponding parts of the method embodiments.
The steps in the methods of the embodiments of the present application may be reordered, combined, and deleted according to actual needs.
Finally, it should also be noted that relational terms such as first and second are used herein solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing is only a preferred embodiment of the present application. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principles of the present application, and such improvements and modifications should also fall within the protection scope of the present application.

Claims (10)

1. A test case processing method, the method comprising:
determining the case type of a case to be tested, the case types comprising a performance test case type and a functional test case type;
if the case type of the case to be tested is the performance test case type, calculating the case to be tested corresponding to the performance test case type through a pre-constructed performance test index assertion sequence expected value model to obtain a first assertion sequence expected value, wherein the first assertion sequence expected value is used for checking the execution status of the case to be tested corresponding to the performance test case type; and
if the case type of the case to be tested is the functional test case type, matching the case to be tested corresponding to the functional test case type against a pre-constructed functional test case assertion sequence library, and obtaining a second assertion sequence expected value according to a matching result, wherein the second assertion sequence expected value is used for checking the execution status of the case to be tested corresponding to the functional test case type.
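The two-branch dispatch of claim 1 might be sketched as follows; this is a minimal illustration only, and every function and field name here is a hypothetical assumption, not part of the claimed system:

```python
# Sketch of the case-type dispatch in claim 1 (all identifiers hypothetical).

def process_case(case):
    """Route a case to the matching assertion-expected-value strategy."""
    if case["type"] == "performance":
        # Performance cases: predict the expected assertion sequence value
        # with a pre-built regression model (detailed in claim 2).
        return predict_expected_value(case)
    elif case["type"] == "functional":
        # Functional cases: look up the expected value by hierarchical
        # matching against an assertion-sequence library (claims 4-5).
        return match_expected_value(case)
    raise ValueError(f"unknown case type: {case['type']}")

def predict_expected_value(case):
    # Placeholder for the performance test index assertion model of claim 2.
    return {"expected": case.get("predicted", 0.0)}

def match_expected_value(case):
    # Placeholder for the library matching of claims 4-5.
    return {"expected": case.get("library_value")}
```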
2. The method according to claim 1, wherein if the case type of the case to be tested is the performance test case type, calculating the case to be tested corresponding to the performance test case type through the pre-constructed performance test index assertion sequence expected value model to obtain the first assertion sequence expected value comprises:
acquiring system characteristics and case characteristics;
performing standardization and multicollinearity elimination on the system characteristics and the case characteristics to obtain a processing result;
determining model coefficients based on a pre-established linear regression model;
constructing the performance test index assertion sequence expected value model according to the model coefficients and the processing result; and
calculating the characteristics of the case to be tested corresponding to the performance test case type through the performance test index assertion sequence expected value model to obtain the first assertion sequence expected value.
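The modeling steps of claim 2 can be sketched in Python. This is an illustrative reading only, assuming z-score standardization, pairwise-correlation screening as the multicollinearity elimination, and closed-form single-variable least squares; the threshold and all names are assumptions:

```python
# Illustrative sketch of claim 2: standardize features, drop near-collinear
# columns, then fit linear-regression coefficients (assumptions throughout).
from statistics import mean, stdev

def standardize(values):
    """Z-score standardization of one feature column."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def correlation(xs, ys):
    """Pearson correlation of two equal-length columns."""
    zx, zy = standardize(xs), standardize(ys)
    return sum(a * b for a, b in zip(zx, zy)) / (len(xs) - 1)

def drop_collinear(features, threshold=0.95):
    """Keep the first of any pair of near-collinear feature columns."""
    kept = []
    for name, col in features.items():
        if all(abs(correlation(col, features[k])) < threshold for k in kept):
            kept.append(name)
    return {k: features[k] for k in kept}

def fit_ols(xs, ys):
    """Closed-form simple linear regression y = a + b*x; returns (a, b)."""
    mx, my = mean(xs), mean(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b
```

A fitted `(a, b)` pair then plays the role of the model coefficients from which the expected value model predicts a first assertion sequence expected value for a new case's characteristics.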
3. The method according to claim 2, wherein after calculating the characteristics of the case to be tested corresponding to the performance test case type through the performance test index assertion sequence expected value model to obtain the first assertion sequence expected value, the method further comprises:
establishing an assertion library of performance test cases from the system characteristics, the case characteristics, and pre-acquired historical performance test cases;
acquiring an actual value of the assertion sequence, wherein the assertion sequence is predicted by the performance test index assertion sequence expected value model;
comparing whether the first assertion sequence expected value is consistent with the actual value of the assertion sequence;
if they are consistent, deleting the case to be tested corresponding to the performance test case type; and
if they are not consistent, storing the characteristics of the case to be tested corresponding to the performance test case type and the actual value of the assertion sequence into the assertion library of performance test cases.
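The compare-then-route step of claim 3 can be sketched as below; the numeric tolerance and the record layout are illustrative assumptions, not part of the claim:

```python
# Sketch of claim 3's post-prediction step: compare the predicted (expected)
# assertion sequence value with the actual value; a correctly predicted case
# is redundant and can be deleted, a mispredicted one is archived into the
# assertion library so the model can learn from it (layout is assumed).

def update_assertion_library(case, expected, actual, library, tol=1e-6):
    """Return True if the case can be deleted (prediction was consistent)."""
    if abs(expected - actual) <= tol:
        # Consistent: the model already covers this case.
        return True
    # Inconsistent: store the case characteristics and the observed value.
    library.append({"features": case["features"], "actual": actual})
    return False
```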
4. The method according to claim 1, wherein if the case type of the case to be tested is the functional test case type, matching the case to be tested corresponding to the functional test case type through the pre-constructed functional test case assertion sequence library and obtaining the second assertion sequence expected value according to the matching result comprises:
decomposing the case to be tested corresponding to the functional test case type to obtain characteristics of the case to be tested corresponding to the functional test case type, the characteristics of the case to be tested comprising a method name, an interface name, a number of input parameters, an input parameter list, and a cyclomatic complexity of the interface; and
matching the characteristics of the case to be tested corresponding to the functional test case type with preset characteristics in the pre-constructed functional test case assertion sequence library by a preset hierarchical matching method to obtain a matching result, and generating the second assertion sequence expected value of the case to be tested corresponding to the functional test case type based on the matching result.
5. The method according to claim 4, wherein matching the characteristics of the case to be tested with the preset characteristics in the functional test case assertion sequence library by the preset hierarchical matching method to obtain the matching result, and generating the second assertion sequence expected value of the case to be tested corresponding to the functional test case type based on the matching result, comprises:
sorting the method names, interface names, numbers of input parameters, input parameter lists, and cyclomatic complexities of interfaces in the functional test case assertion sequence library by a preset sorting rule to obtain sorted method names, sorted interface names, sorted numbers of input parameters, sorted input parameter lists, and sorted cyclomatic complexities of interfaces;
matching the method name of the case to be tested with the sorted method names;
if the method name of the case to be tested matches successfully, matching the interface name of the case to be tested with the sorted interface names;
if the interface name of the case to be tested matches successfully, matching the number of input parameters of the case to be tested with the sorted numbers of input parameters;
if the number of input parameters of the case to be tested matches successfully, matching the input parameter list of the case to be tested with the sorted input parameter lists;
if the input parameter list of the case to be tested matches successfully, matching the cyclomatic complexity of the interface of the case to be tested with the sorted cyclomatic complexities of interfaces; and
if the cyclomatic complexity of the interface of the case to be tested matches successfully, generating the second assertion sequence expected value of the case to be tested corresponding to the functional test case type.
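The level-by-level cascade of claim 5 can be sketched as follows; the field names, the sort key, and the library layout are all illustrative assumptions:

```python
# Sketch of claim 5's hierarchical matching: a library entry matches only if
# method name, interface name, number of input parameters, input parameter
# list, and cyclomatic complexity all agree, checked in that order.

FIELDS = ("method_name", "interface_name", "param_count",
          "param_list", "cyclomatic_complexity")

def match_case(case, library):
    """Return the expected assertion sequence value of the first full match,
    or None if every library entry fails at some matching level."""
    # Pre-sort the library entries by the characteristic fields (claim 5's
    # "preset sorting rule" is not specified; lexicographic order is assumed).
    for entry in sorted(library, key=lambda e: [str(e[f]) for f in FIELDS]):
        # all() short-circuits, so later levels are only checked once the
        # earlier levels have matched, mirroring the claimed cascade.
        if all(case[f] == entry[f] for f in FIELDS):
            return entry["expected_value"]
    return None
```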
6. The method according to claim 5, further comprising:
if any group of characteristics of the case to be tested fails to match the characteristics in the functional test case assertion sequence library, matching the characteristics of the case to be tested corresponding to the next functional test case type.
7. A test case processing system, the system comprising:
a determining unit, configured to determine the case type of a case to be tested, the case types comprising a performance test case type and a functional test case type;
a calculating unit, configured to, if the case type of the case to be tested is the performance test case type, calculate the case to be tested corresponding to the performance test case type through a pre-constructed performance test index assertion sequence expected value model to obtain a first assertion sequence expected value, wherein the first assertion sequence expected value is used for checking the execution status of the case to be tested corresponding to the performance test case type; and
a matching unit, configured to, if the case type of the case to be tested is the functional test case type, match the case to be tested corresponding to the functional test case type against a pre-constructed functional test case assertion sequence library and obtain a second assertion sequence expected value according to a matching result, wherein the second assertion sequence expected value is used for checking the execution status of the case to be tested corresponding to the functional test case type.
8. The system according to claim 7, wherein the calculating unit comprises:
an acquisition module, configured to acquire system characteristics and case characteristics;
a processing module, configured to perform standardization and multicollinearity elimination on the system characteristics and the case characteristics to obtain a processing result;
a determining module, configured to determine model coefficients based on a pre-established linear regression model;
a construction module, configured to construct a performance test index assertion sequence expected value model according to the model coefficients and the processing result; and
a calculation module, configured to calculate the characteristics of the case to be tested corresponding to the performance test case type through the performance test index assertion sequence expected value model to obtain the first assertion sequence expected value.
9. The system according to claim 8, further comprising:
an establishing unit, configured to establish an assertion library of performance test cases from the system characteristics, the case characteristics, and pre-acquired historical performance test cases;
an acquiring unit, configured to acquire an actual value of the assertion sequence, wherein the assertion sequence is predicted by the performance test index assertion sequence expected value model;
a comparing unit, configured to compare whether the first assertion sequence expected value is consistent with the actual value of the assertion sequence;
a deleting unit, configured to delete the case to be tested corresponding to the performance test case type if they are consistent; and
a storage unit, configured to store the characteristics of the case to be tested corresponding to the performance test case type and the actual value of the assertion sequence into the assertion library of performance test cases if they are not consistent.
10. The system according to claim 7, wherein the matching unit comprises:
a decomposition module, configured to decompose a pre-acquired case to be tested to obtain characteristics of the case to be tested, the characteristics of the case to be tested comprising a method name, an interface name, a number of input parameters, an input parameter list, and a cyclomatic complexity of the interface; and
a matching module, configured to match the characteristics of the case to be tested corresponding to the functional test case type with preset characteristics in a pre-constructed functional test case assertion sequence library by a preset hierarchical matching method to obtain a matching result, and to generate a second assertion sequence expected value of the case to be tested corresponding to the functional test case type based on the matching result.
CN202111487801.1A 2021-12-07 2021-12-07 Test case processing method and system Pending CN114138659A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111487801.1A CN114138659A (en) 2021-12-07 2021-12-07 Test case processing method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111487801.1A CN114138659A (en) 2021-12-07 2021-12-07 Test case processing method and system

Publications (1)

Publication Number Publication Date
CN114138659A true CN114138659A (en) 2022-03-04

Family

ID=80384583

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111487801.1A Pending CN114138659A (en) 2021-12-07 2021-12-07 Test case processing method and system

Country Status (1)

Country Link
CN (1) CN114138659A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114637692A (en) * 2022-05-17 2022-06-17 杭州优诗科技有限公司 Test data generation and test case management method
CN114637692B (en) * 2022-05-17 2022-08-19 杭州优诗科技有限公司 Test data generation and test case management method

Similar Documents

Publication Publication Date Title
US11099973B2 (en) Automated test case management systems and methods
US20210342313A1 (en) Autobuild log anomaly detection methods and systems
CN106294120B (en) Method, apparatus and computer program product for testing code
CN110287332B (en) Method and device for selecting simulation model in cloud environment
CN113268403B (en) Time series analysis and prediction method, device, equipment and storage medium
CN110647447A (en) Abnormal instance detection method, apparatus, device and medium for distributed system
CN114138659A (en) Test case processing method and system
CN110046086B (en) Expected data generation method and device for test and electronic equipment
CN111124791A (en) System testing method and device
CN113886373A (en) Data processing method and device and electronic equipment
CN109376285B (en) Data sorting verification method based on json format, electronic device and medium
CN107273293B (en) Big data system performance test method and device and electronic equipment
CN115564410A (en) State monitoring method and device for relay protection equipment
CN115455091A (en) Data generation method and device, electronic equipment and storage medium
CN115203556A (en) Score prediction model training method and device, electronic equipment and storage medium
CN114358910A (en) Abnormal financial data processing method, device, equipment and storage medium
CN114416852A (en) Data processing method, device, equipment and medium
CN110008098B (en) Method and device for evaluating operation condition of nodes in business process
CN114595216A (en) Data verification method and device, storage medium and electronic equipment
CN114461520A (en) Software testing method and device and electronic equipment
CN111124918B (en) Test data prediction method and device and processing equipment
CN113641823A (en) Text classification model training method, text classification device, text classification equipment and medium
CN109697141B (en) Method and device for visual testing
CN112783775A (en) Special character input testing method and device
CN111274112A (en) Application program pressure test method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination