CN115114146A - Interface test method, device, equipment and storage medium - Google Patents
- Publication number: CN115114146A
- Application number: CN202210622206.2A
- Authority: CN (China)
- Prior art keywords: parameter, test, field, data, file
- Prior art date
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G: Physics; G06: Computing, calculating or counting; G06F: Electric digital data processing
- G06F11/00: Error detection; Error correction; Monitoring
- G06F11/36: Preventing errors by testing or debugging software
- G06F11/3668: Software testing
- G06F11/3672: Test management
- G06F11/3684: Test management for test design, e.g. generating new test cases
- G06F11/3664: Environments for testing or debugging software
Abstract
The embodiments of this specification provide an interface test method, apparatus, device, and storage medium. The interface test method includes: acquiring a test strategy identifier and a plurality of pieces of input parameter data through an interface to be tested; filling the target sample template file corresponding to the test strategy identifier according to the plurality of pieces of input parameter data to obtain a filled target sample template file, where the filled target sample template file includes the test input parameter data and the predicted output parameter data of each of a plurality of test cases; converting the filled target sample template file into an execution file of each test case according to a preset file conversion rule; executing the execution files to obtain the actual output parameter data fed back by the interface; and generating a test result for the interface according to the predicted output parameter data and the actual output parameter data, thereby improving the efficiency of test case generation.
Description
Technical Field
The present application relates to the field of test technologies, and in particular, to an interface test method, apparatus, device, and storage medium.
Background
Interface testing tests the interfaces between components of a system, and is mainly used for testing interfaces between the system and other external systems and interfaces between sub-modules within the system. With the development of electronic technology, increasingly higher demands are placed on interface testing efficiency. To improve testing efficiency, save testing manpower, and reduce the errors that are unavoidable in manual testing, automated interface testing has become a necessary trend in the development of the electronics industry.
Disclosure of Invention
The embodiments of the present application provide an interface testing method, apparatus, device, and storage medium, which can improve the efficiency of test case generation.
In a first aspect, an embodiment of the present application provides an interface testing method, including:
acquiring a test strategy identifier and a plurality of pieces of input parameter data through an interface to be tested, wherein the plurality of pieces of input parameter data include a plurality of input parameter fields and the field value corresponding to each of the plurality of input parameter fields, and one piece of input parameter data includes one input parameter field and its corresponding field value;
filling the target sample template file corresponding to the test strategy identifier according to the plurality of pieces of input parameter data to obtain a filled target sample template file, wherein the filled target sample template file includes the test input parameter data and the predicted output parameter data of each test case in a plurality of test cases, and the target sample template file is configured with an output parameter field and the association relation between the output parameter field and the plurality of input parameter fields;
converting the filled target sample template file into an execution file of each test case according to a preset file conversion rule;
executing the execution files to obtain the actual output parameter data fed back by the interface;
and generating a test result for the interface according to the predicted output parameter data and the actual output parameter data.
In a second aspect, an embodiment of the present application provides an interface testing apparatus, including:
a data acquisition unit, configured to acquire a test strategy identifier and a plurality of pieces of input parameter data through an interface to be tested, wherein the plurality of pieces of input parameter data include a plurality of input parameter fields and the field value corresponding to each of the plurality of input parameter fields, and one piece of input parameter data includes one input parameter field and its corresponding field value;
a file filling unit, configured to fill the target sample template file corresponding to the test strategy identifier according to the plurality of pieces of input parameter data to obtain a filled target sample template file, wherein the filled target sample template file includes the test input parameter data and the predicted output parameter data of each test case in a plurality of test cases, and the target sample template file is configured with an output parameter field and the association relation between the output parameter field and the plurality of input parameter fields;
a file conversion unit, configured to convert the filled target sample template file into an execution file of each test case according to a preset file conversion rule;
a file execution unit, configured to execute the execution files and obtain the actual output parameter data fed back by the interface;
and a result generation unit, configured to generate a test result for the interface according to the predicted output parameter data and the actual output parameter data.
In a third aspect, an embodiment of the present application provides an interface testing device, including: a processor; and a memory configured to store computer executable instructions that, when executed, cause the processor to perform the interface testing method of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium for storing computer-executable instructions, which, when executed by a processor, implement the interface testing method according to the first aspect.
It can be seen that, in the embodiments of the present application, first, a test strategy identifier and a plurality of pieces of input parameter data are acquired through an interface to be tested, where the plurality of pieces of input parameter data include a plurality of input parameter fields and the field value corresponding to each input parameter field, and one piece of input parameter data includes one input parameter field and its corresponding field value; secondly, the target sample template file corresponding to the test strategy identifier is filled according to the plurality of pieces of input parameter data to obtain a filled target sample template file, where the filled target sample template file includes the test input parameter data and the predicted output parameter data of each of a plurality of test cases, and the target sample template file is configured with an output parameter field and the association relation between the output parameter field and the plurality of input parameter fields; then, the filled target sample template file is converted into an execution file of each test case according to a preset file conversion rule; next, the execution file of each test case is executed to obtain the actual output parameter data fed back by the interface; and finally, a test result for the interface is generated according to the predicted output parameter data and the actual output parameter data. Therefore, on the one hand, the target sample template file can be filled according to the input parameter data, so that the filled target sample template file includes the test input parameter data and the predicted output parameter data of each of the test cases, and the test cases can be flexibly assembled from the input parameter data; on the other hand, the filled target sample template file can be converted into the execution file of each test case according to the preset file conversion rule, so that programmers do not need to write dedicated code for the execution file of each test case, which reduces the dependence of the interface test flow on programmers and improves the efficiency of test case generation.
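For intuition only, the following minimal Python sketch strings the above steps together on toy data; the strategy contents, field names and the stand-in interface are assumptions made purely for illustration and do not represent the claimed implementation.

```python
from itertools import product

# Toy end-to-end illustration; all data, names and relations below are assumed.
strategy = {
    "id": "a01",
    "inputs": {"in1": [1, 2], "in2": [10]},                      # input parameter fields and field values
    "relations": {"out1": lambda row: row["in1"] + row["in2"]},  # association relation for the output field
}

# Assemble test cases: every input field value combination plus its predicted output.
cases = []
for values in product(*strategy["inputs"].values()):
    row = dict(zip(strategy["inputs"], values))
    row.update({out: rel(row) for out, rel in strategy["relations"].items()})
    cases.append(row)

def interface_under_test(in1, in2):
    """Stand-in for the interface to be tested."""
    return {"out1": in1 + in2}

# "Execute" each case and compare the actual output data with the predicted output data.
for case in cases:
    actual = interface_under_test(case["in1"], case["in2"])
    print(case, "PASS" if actual["out1"] == case["out1"] else "FAIL")
```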
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, it is obvious that the drawings in the following description are only some embodiments described in the present specification, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts;
fig. 1 is a processing flow chart of an interface testing method according to an embodiment of the present application;
fig. 2 is a schematic interface diagram of a target sample template file according to an embodiment of the present disclosure;
FIG. 3 is a schematic interface diagram of a yaml configuration file according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating another method for testing an interface according to an embodiment of the present disclosure;
fig. 5 is a processing flow chart of another interface test processing method according to an embodiment of the present application;
fig. 6 is a schematic diagram of an interface testing apparatus according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an interface test apparatus according to an embodiment of the present application.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the embodiments of the present application, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present specification, and not all embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments of the present application, shall fall within the protection scope of the present application.
In practical applications, interface testing usually requires code, and relies on test workers with coding skills to write and maintain an execution file for each test case used to test the interface.
A test case refers to the description of a test task performed on a specific software product, and embodies the test schemes, methods, techniques and strategies. It covers the test object, the test environment, the input data, the test steps, the expected results, the test scripts, and so on, and finally forms a document. Simply put, a test case is a set of test inputs, execution conditions, and expected results designed for a particular purpose, used to verify whether a particular software requirement is met. A test case may contain the following four parts: the case title, the preconditions, the test steps, and the expected result. The case title mainly describes the function being tested; the preconditions are the conditions that the case title requires to be satisfied; the test steps describe the operation steps of the case; and the expected result refers to meeting the expectations (development specifications, requirement documents, user requirements, etc.). A test case does not contain an actual result; the actual result can only be produced by executing, after the test case has been generated, the execution file corresponding to that test case.
The execution file may be a test script, which generally refers to a series of instructions for a particular test that can be executed by an automated test tool.
Writing the execution files of test cases is time-consuming and labor-intensive and depends on the expertise of the test worker. Having test workers manually write the execution file of each test case therefore results in low test case generation efficiency. To overcome this problem, an embodiment of the present application provides an interface testing method.
Fig. 1 is a processing flow chart of an interface testing method according to an embodiment of the present application. The interface testing method described in fig. 1 may be performed by an interface testing device, which may be a terminal device, such as a mobile phone, a notebook computer, an intelligent interactive device, and so on; alternatively, the interface test device may also be a server, such as a standalone physical server, a cluster of servers, or a cloud server capable of cloud computing. The interface testing method provided in this embodiment specifically includes steps S102 to S110.
Interface testing tests the interfaces among system components. It is mainly used for testing interfaces between a system and other external systems and interfaces among sub-modules within the system, with emphasis on checking data exchange, transmission, control and management processes, and the mutual logical dependencies between systems.
Step S102, acquiring a test strategy identifier and a plurality of pieces of input parameter data through an interface to be tested, wherein the plurality of pieces of input parameter data include a plurality of input parameter fields and the field value corresponding to each of the plurality of input parameter fields; one piece of input parameter data includes one input parameter field and its corresponding field value.
The interface to be tested can be an interface between one system and other external systems, and can also be an interface between various sub-modules in the system.
The interface to be tested may be pre-configured with one test strategy or with multiple test strategies. Each test strategy may be pre-configured with corresponding test strategy parameters. The test strategy parameters include: the test strategy identifier, the number and name of each preset input parameter field, the number and name of each preset output parameter field, the association relation between the preset input parameter fields and the preset output parameter fields, and the like. The test strategy parameters listed above are merely exemplary and do not constitute a specific limitation on the test strategy parameters.
The test strategy identifier is used to uniquely identify one test strategy among the plurality of test strategies pre-configured for the interface to be tested. The test strategy identifier may be the name, number, etc. of the test strategy.
The number of preset input parameter fields may be one or more. The number of preset output parameter fields may also be one or more.
A preset input parameter field is a test strategy parameter in a pre-configured test strategy, whereas an input parameter field is an input parameter acquired through the interface to be tested that can be used for the interface test; the two are not the same concept.
The plurality of pieces of input parameter data include a plurality of input parameter fields and the field values corresponding to each of the input parameter fields. For example, the plurality of pieces of input parameter data include: input parameter field A with its field values a1 and a2, input parameter field B with its field value b1, and input parameter field C with its field values c1, c2 and c3.
The number of field values corresponding to each input parameter field may be one or more.
One piece of input parameter data includes one input parameter field and its corresponding field value(s). The input parameter data may be data in the form of a character string: an input parameter field may correspond to a first substring, and a field value corresponding to the input parameter field may correspond to a second substring. Illustratively, an input parameter character string may include a first substring and a second substring in a one-to-one correspondence, or a first substring and a plurality of second substrings in a one-to-many correspondence.
The test strategies are described in more detail below.
In different test strategies, the number of preset input parameter fields may be the same or different, and the names of the preset input parameter fields may be completely the same, completely different, or partially the same; likewise, the number of preset output parameter fields may be the same or different, and the names of the preset output parameter fields may be completely the same, completely different, or partially the same.
In a test strategy, the name of a preset input parameter field may also be represented by a corresponding input parameter field identifier, which can be used to uniquely identify the preset input parameter field. The interface may store in advance the correspondence between input parameter field identifiers and the names of the preset input parameter fields. The same applies to the preset output parameter fields, which is not repeated here.
When the test strategy is configured with the number and field identifiers of the preset input parameter fields and the number and field identifiers of the preset output parameter fields, the number and names of the preset input parameter fields and the number and names of the preset output parameter fields can be determined according to the pre-stored correspondence between the field identifiers and the field names.
Each preset input parameter field may correspond to one or more field values. For example, a preset input parameter field "whether to perform operation A" may have the field value "yes" or "no"; a preset input parameter field "date" may have the field values "April 1", "April 2", "April 3", and so on. The same preset input parameter field may also include field values of different data types; for example, some field values of preset input parameter field 1 may be of Boolean type while other field values are of string type.
Each preset output parameter field may correspond to one or more field values, and the field value of each preset output parameter field may be determined according to the association relation between the preset output parameter field and the preset input parameter fields and the field value of each input parameter field.
In a specific implementation, a corresponding preset relational expression may be configured for each preset output parameter field, and the association relation between the preset output parameter field and the preset input parameter fields is determined according to the field value of each input parameter field and the preset relational expression corresponding to that output parameter field.
Illustratively, the preset input parameter fields are X1, X2 and X3, the preset output parameter fields are Y1, Y2 and Y3, and the association relations between the preset output parameter fields and the preset input parameter fields are: Y1 = X1, Y2 = X1 + X2, Y3 = X3/2. The field value of Y1 is determined from the field value of X1 and the preset relational expression "Y1 = X1"; the field value of Y2 is determined from the field values of X1 and X2 and the preset relational expression "Y2 = X1 + X2"; and the field value of Y3 is determined from the field value of X3 and the preset relational expression "Y3 = X3/2".
The three preset relational expressions above are merely simple examples enumerated for ease of understanding.
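As a minimal sketch of how such preset relational expressions can be evaluated for one set of input field values (the dictionary representation and the example values are assumptions for illustration):

```python
# Preset relational expressions for output fields Y1, Y2, Y3 (assumed representation).
relations = {
    "Y1": lambda x: x["X1"],
    "Y2": lambda x: x["X1"] + x["X2"],
    "Y3": lambda x: x["X3"] / 2,
}

inputs = {"X1": 4, "X2": 6, "X3": 10}            # example field values of the input fields
outputs = {name: expr(inputs) for name, expr in relations.items()}
print(outputs)                                    # {'Y1': 4, 'Y2': 10, 'Y3': 5.0}
```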
In another embodiment, the arguments of a preset relational expression may not include any preset input parameter field. For example, the preset output parameter field Y1 may be t, which indicates the point in time at which the interface to be tested acquires the test strategy identifier and the plurality of pieces of input parameter data.
A preset input parameter field may correspond to N field values, and a preset output parameter field may correspond to M field values. Each of the N field values of the preset input parameter field may correspond to one of the M field values of the preset output parameter field, and each of the M field values of the preset output parameter field may correspond to one or more of the N field values of the preset input parameter field. M and N are natural numbers greater than 0, and may be equal or different.
For example, field value a1 of preset input parameter field A corresponds to field value b1 of preset output parameter field B; field value a2 of preset input parameter field A also corresponds to field value b1 of preset output parameter field B; and field value a3 of preset input parameter field A corresponds to field value b2 of preset output parameter field B.
The association relation between a preset input parameter field and a preset output parameter field may also be that the field value of the preset input parameter field is filled into a preset condition template to generate a condition, and whether that condition is satisfied is the field value corresponding to the preset output parameter field.
For example, in test strategy 1, the preset input parameter fields may be configured as input parameter field 1, input parameter field 2 and input parameter field 3, and the preset output parameter fields may be configured as output parameter field 1 and output parameter field 2. The association relation between output parameter field 1 and input parameter field 1 may be configured as follows: if the field value of input parameter field 1 is a1, the field value of output parameter field 1 is x1; if the field value of input parameter field 1 is not a1, the field value of output parameter field 1 is x2. The association relation between output parameter field 2 and the aforementioned three preset input parameter fields (i.e., input parameter fields 1, 2 and 3) is: the field value of output parameter field 2 is the sum of the field values of input parameter fields 1, 2 and 3.
Step S104, filling the target sample template file corresponding to the test strategy identifier according to the plurality of pieces of input parameter data to obtain a filled target sample template file; the filled target sample template file includes the test input parameter data and the predicted output parameter data of each test case in a plurality of test cases; the target sample template file is configured with an output parameter field and the association relation between the output parameter field and the plurality of input parameter fields.
In a specific implementation, the interface may be pre-configured with a plurality of test sample template files, with each test strategy corresponding to one test sample template file; the test strategy parameters corresponding to each test strategy then have a correspondence with the configuration parameters of the target sample template file corresponding to that test strategy.
For example, if test strategy 1 is configured with preset output parameter fields 1, 2 and 3, the target sample template file corresponding to test strategy 1 is configured with output parameter fields 1, 2 and 3. Preset output parameter field 1 and output parameter field 1 are the same field: their names are the same, and the data type and value range of their field values are also the same; the same holds for preset output parameter field 2 and output parameter field 2, and for preset output parameter field 3 and output parameter field 3.
For another example, if test strategy 2 is configured with the association relations between the preset input parameter fields and the preset output parameter fields Y1 = X1 and Y2 = X1 + X2, then the target sample template file corresponding to test strategy 2 is configured with the association relations between the output parameter fields and the plurality of input parameter fields: Y1 = X1 and Y2 = X1 + X2.
The target sample template file may be the test sample template file corresponding to the test strategy identifier among a plurality of pre-configured test sample template files.
The test sample template file can be an Excel file, or another file type from which the corresponding output parameter data can be calculated from the filled-in input parameter data.
In a specific implementation, when a plurality of test sample template files are pre-configured for the interface, a plurality of sheets can be created in one Excel workbook file, each sheet being one test sample template file. The name of each sheet can be the test strategy identifier, so that the corresponding test sample template file can conveniently be looked up by the test strategy identifier.
A sheet is a table displayed in the workbook window of the Excel program. A sheet may consist of 65,536 rows and 256 columns: the rows are numbered from 1 to 65,536, and the columns are labeled in turn with the letters A, B, ..., IV. The row numbers are displayed on the left side of the workbook window, and the column labels are displayed at the top of the workbook window. By default Excel creates three sheets in one workbook, and users can add sheets as needed, but each workbook can contain at most 255 sheets. Each sheet has a name, which is displayed on the worksheet tab; the tab shows the three default sheet names: Sheet1, Sheet2, Sheet3. The sheet is the most important part of Excel data storage and processing, and consists of cells arranged in rows and columns. It is part of a workbook and is also known as a spreadsheet.
In an embodiment, N preset input parameter fields and M preset output parameter fields may be configured in the test strategy corresponding to each test sample template file. In the test sample template file, for example, column 1 is used to represent the serial number of the test case; in columns 2 to N+1, each column represents one preset input parameter field; and in columns N+2 to N+M+1, each column represents one preset output parameter field. In addition, each row in the test sample template file corresponds to one test case. The cell at row x, column y of the test sample template file is a region to be filled, where 1 ≤ x ≤ the maximum number of test cases and 2 ≤ y ≤ N+M+1.
If 2 ≤ y ≤ N+1, the cell at row x, column y of the test sample template file is used to represent the field value of the preset input parameter field corresponding to column y for the x-th test case; if N+2 ≤ y ≤ N+M+1, the cell at row x, column y of the test sample template file is used to represent the field value of the preset output parameter field corresponding to column y for the x-th test case.
The test input parameter data of the test cases are not completely identical; that is, there are no two test cases that have the same field value for every preset input parameter field.
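A minimal sketch of this column layout, assuming openpyxl as the Excel library; the sheet name and field names are illustrative only:

```python
from openpyxl import Workbook

in_fields = ["in1", "in2", "in3", "in4"]   # N = 4 preset input parameter fields
out_fields = ["out1", "out2", "out3"]      # M = 3 preset output parameter fields

wb = Workbook()
ws = wb.active
ws.title = "a01"                           # sheet name = test strategy identifier (assumed)

ws.cell(row=1, column=1, value="case no.")                         # column 1: test case serial number
for col, name in enumerate(in_fields, start=2):                    # columns 2 .. N+1: input fields
    ws.cell(row=1, column=col, value=name)
for col, name in enumerate(out_fields, start=2 + len(in_fields)):  # columns N+2 .. N+M+1: output fields
    ws.cell(row=1, column=col, value=name)

wb.save("test_sample_templates.xlsx")
```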
Optionally, after the corresponding target sample template file is queried among the plurality of test sample template files pre-configured for the interface according to the test strategy identifier, the interface testing method further includes: acquiring the plurality of preset input parameter fields pre-configured for the test strategy indicated by the test strategy identifier; judging whether the acquired plurality of pieces of input parameter data match the plurality of preset input parameter fields; and if not, generating alarm information indicating that the input parameter data are erroneous.
The acquired plurality of pieces of input parameter data match the plurality of preset input parameter fields corresponding to the test strategy identifier when the number of acquired input parameter fields is the same as the number of preset input parameter fields corresponding to the test strategy identifier and the names of the acquired input parameter fields correspond one-to-one with the names of the preset input parameter fields corresponding to the test strategy identifier.
For example, the acquired plurality of pieces of input parameter data include: input parameter field 1 with 1 field value, input parameter field 2 with 3 field values, and input parameter field 3 with 2 field values, i.e. 3 input parameter fields in total; and the plurality of preset input parameter fields corresponding to the test strategy identifier are preset input parameter fields 1, 2 and 3, so the number of acquired input parameter fields is the same as the number of preset input parameter fields corresponding to the test strategy identifier. Moreover, the field name of input parameter field 1 is the same as that of preset input parameter field 1, the field name of input parameter field 2 is the same as that of preset input parameter field 2, and the field name of input parameter field 3 is the same as that of preset input parameter field 3, so the acquired input parameter data match the preset input parameter fields corresponding to the test strategy identifier.
For another example, the test strategy identifier a01 of test strategy 1 and three character strings are acquired through the interface, where character string 1 includes input parameter field 1 and a plurality of its field values, character string 2 includes input parameter field 4 and a plurality of its field values, and character string 3 includes input parameter field 3 and a plurality of its field values. According to the test strategy identifier a01, the corresponding target sample template file can be found by query. Test strategy 1, corresponding to the test strategy identifier a01, is configured with a plurality of preset input parameter fields: input parameter fields 1, 2 and 3; it is also configured with a plurality of preset output parameter fields: output parameter fields 1 and 2; and it is further configured with the association relations between the preset input parameter fields and the preset output parameter fields. Comparing input parameter field 1 with preset input parameter field 1, input parameter field 4 with preset input parameter field 2, and input parameter field 3 with preset input parameter field 3, it is determined that the three acquired character strings do not match the three preset input parameter fields, and alarm information indicating that the input parameter data are erroneous is generated, so as to notify the test worker that the input parameter data are not completely consistent with the preset input parameter fields corresponding to the test strategy identifier. The alarm information may carry the input parameter field whose comparison with the preset input parameter field is inconsistent.
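A hedged sketch of this match check and alarm; the data layout and names below are assumptions for illustration, not the application's actual message format:

```python
def check_input_data(acquired, preset_fields):
    """acquired: dict mapping input field name -> list of field values; preset_fields: list of names."""
    acquired_names = set(acquired)
    expected_names = set(preset_fields)
    if len(acquired) != len(preset_fields) or acquired_names != expected_names:
        mismatched = sorted(acquired_names ^ expected_names)
        return f"ALARM: input parameter data do not match preset input fields: {mismatched}"
    return "OK"

preset = ["in1", "in2", "in3"]
acquired = {"in1": ["a1"], "in4": ["b1", "b2"], "in3": ["c1"]}   # "in4" does not match "in2"
print(check_input_data(acquired, preset))   # the alarm carries the mismatching field names
```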
Optionally, filling the target sample template file corresponding to the test strategy identifier according to the plurality of pieces of input parameter data to obtain a filled target sample template file includes: querying, according to the test strategy identifier, the target sample template file matching the test strategy identifier among the plurality of test sample template files pre-configured for the interface; and filling the target sample template file according to the association relation between the output parameter field and the plurality of input parameter fields and each piece of input parameter data, to obtain the filled target sample template file.
The interface may be pre-configured with a plurality of test sample template files, each corresponding to one test strategy of the interface. Accordingly, the target sample template file corresponding to the test strategy identifier can be queried among the plurality of test sample template files according to the test strategy identifier.
The predicted output parameter data may be the theoretical value of the output parameter data, calculated from the association relation between the output parameter field and the plurality of input parameter fields and from the input parameter data.
The number of output parameter fields may be one or more. Where there is one output parameter field, the predicted output parameter data may include the field value of that one output parameter field. Where there are multiple output parameter fields, the predicted output parameter data may include multiple field values; the number of field values included in the predicted output parameter data is the same as the number of output parameter fields, and the field values correspond one-to-one with the output parameter fields.
The target sample template file is filled with a plurality of test cases and the test input parameter data and predicted output parameter data of each test case, as shown in fig. 2. Fig. 2 is an interface schematic diagram of a target sample template file according to an embodiment of the present application.
As shown in fig. 2, column 1 is the serial number of the test case, column 2 represents input parameter field 1 (input 1 for short), column 3 represents input parameter field 2 (input 2 for short), ..., and column 8 represents output parameter field 3 (output 3 for short). Each row corresponds to one test case.
Illustratively, the input parameter data corresponding to test case 1 include: field value a1 of input 1, field value b1 of input 2, field value c1 of input 3, and field value d1 of input 4. After the field values of input 1, input 2, input 3 and input 4 corresponding to test case 1 are filled in, field value x1 of output 1, field value y1 of output 2 and field value z1 of output 3 are automatically generated. Field value x1 of output 1 is the predicted output parameter data of output 1 in test case 1, field value y1 of output 2 is the predicted output parameter data of output 2 in test case 1, and field value z1 of output 3 is the predicted output parameter data of output 3 in test case 1. The other test cases are similar to test case 1 and are not described again here.
Optionally, the number of field values of each input parameter field is at least one, and the association relation between the output parameter field and the plurality of input parameter fields is determined according to the field value of each input parameter field and the preset relational expression corresponding to the output parameter field. Filling the target sample template file according to the association relation between the output parameter field and the plurality of input parameter fields and each piece of input parameter data to obtain the filled target sample template file includes: generating a plurality of input parameter field value combinations according to each input parameter field and the at least one field value of each input parameter field, and determining each input parameter field value combination as the test input parameter data of one test case, where the number of field values included in each input parameter field value combination is the same as the number of the plurality of input parameter fields, and the field values included in each combination correspond one-to-one with the plurality of input parameter fields; performing a preliminary filling process on the target sample template file according to the test input parameter data of each test case; calculating the field value of the output parameter field corresponding to each test case according to the association relation between the output parameter field and the plurality of input parameter fields and the test input parameter data of each test case, to obtain the predicted output parameter data of each test case; and performing a secondary filling process on the preliminarily filled target sample template file according to the predicted output parameter data of each test case, to obtain the filled target sample template file.
The number of field values of each input parameter field may be one or more.
A plurality of input parameter field value combinations are generated according to each input parameter field and the at least one field value of each input parameter field, and each input parameter field value combination is determined as the test input parameter data of one test case. The number of field values included in each input parameter field value combination is the same as the number of the plurality of input parameter fields, and the field values included in each combination correspond one-to-one with the plurality of input parameter fields.
For example, 3 input parameter fields are acquired: input 1, input 2 and input 3. Input 1 has two field values a1 and a2; input 2 has 3 field values b1, b2 and b3; input 3 has 3 field values c1, c2 and c3. Freely combining the 2 field values of input 1, the 3 field values of input 2 and the 3 field values of input 3 generates 2 × 3 × 3 = 18 input parameter field value combinations: (a1, b1, c1), (a1, b1, c2), (a1, b1, c3), ..., (a2, b3, c3). Each input parameter field value combination includes one field value of input 1, one field value of input 2 and one field value of input 3.
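This free combination is a Cartesian product of the field values. A minimal sketch using Python's standard library, with the field names and values taken from the example above (the dictionary layout is an assumption):

```python
from itertools import product

field_values = {
    "in1": ["a1", "a2"],
    "in2": ["b1", "b2", "b3"],
    "in3": ["c1", "c2", "c3"],
}

combinations = list(product(*field_values.values()))   # 2 x 3 x 3 = 18 combinations
print(len(combinations))   # 18
print(combinations[0])     # ('a1', 'b1', 'c1')
print(combinations[-1])    # ('a2', 'b3', 'c3')
```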
In another embodiment, the number of acquired input parameter fields may be one or more. When the test input parameter data of the test cases are generated according to each input parameter field and the at least one field value of each input parameter field, the following cases exist:
(1) If only one input parameter field A is acquired through the interface and input parameter field A has only one field value a1, only one test case can be generated based on a1, and the test input parameter data of that test case is "a1".
(2) If only one input parameter field B is acquired through the interface and input parameter field B has several field values b1, b2 and b3, three test cases can be generated based on b1, b2 and b3: the test input parameter data of test case 1 is "b1", that of test case 2 is "b2", and that of test case 3 is "b3".
(3) If multiple input parameter fields are acquired through the interface, for example input parameter field A with only one field value a1 and input parameter field B with only one field value b1, only one test case can be generated based on a1 and b1, and the input parameter field value combination corresponding to that test case is "a1 b1".
(4) If multiple input parameter fields are acquired through the interface, for example input parameter field A with only one field value a1 and input parameter field B with several field values b1 and b2, multiple input parameter field value combinations can be generated: "a1 b1" and "a1 b2". The test input parameter data of test case 1 is "a1 b1" and that of test case 2 is "a1 b2"; the combination "a1 b1" includes field value a1 of input parameter field A and field value b1 of input parameter field B, and the combination "a1 b2" includes field value a1 of input parameter field A and field value b2 of input parameter field B.
In a specific implementation, the number of input parameter field value combinations can be taken as the number of test cases, with each test case corresponding to one combination. In addition, when at least one input parameter field value combination is discarded in advance, the number of test cases may be smaller than the number of combinations. For example, if the test worker determines from experience that (a1, b1, c3) is unlikely to occur and has no testing significance, the corresponding test case for (a1, b1, c3) does not need to be generated, and that input parameter field value combination can be pre-configured to be discarded.
The preliminary filling process is performed on the target sample template file according to the test input parameter data of each test case. In a specific implementation, the target sample template file may be an Excel file, and in the row corresponding to each test case, the region to be filled of each input parameter field can be preliminarily filled based on the field values included in the input parameter field value combination. This preliminary filling process is performed automatically by the interface test device that performs the interface test method, rather than manually.
For example, referring to fig. 2, test case 1 corresponds to the input parameter field value combination (a1, b1, c1, d1), and the interface test device automatically fills a1 into the cell where input 1 intersects test case 1, b1 into the cell where input 2 intersects test case 1, c1 into the cell where input 3 intersects test case 1, and d1 into the cell where input 4 intersects test case 1.
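A sketch of this automatic preliminary filling, assuming openpyxl and a template laid out as in fig. 2; the file name, sheet name and combinations are illustrative:

```python
from openpyxl import load_workbook

wb = load_workbook("test_sample_templates.xlsx")
ws = wb["a01"]

# Test input parameter data: one input field value combination per test case.
combinations = [("a1", "b1", "c1", "d1"), ("a1", "b1", "c2", "d1")]
for case_no, combo in enumerate(combinations, start=1):
    row = case_no + 1                                # row 1 holds the column headers
    ws.cell(row=row, column=1, value=case_no)        # column 1: test case serial number
    for col, value in enumerate(combo, start=2):     # columns 2 .. N+1: input field values
        ws.cell(row=row, column=col, value=value)

wb.save("test_sample_templates_filled.xlsx")
```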
The field value of the output parameter field corresponding to each test case is calculated according to the association relation between the output parameter field and the plurality of input parameter fields and the test input parameter data of each test case, to obtain the predicted output parameter data of each test case.
Illustratively, the input parameter data corresponding to test case 1 include: field value a1 of input 1, field value b1 of input 2, field value c1 of input 3, and field value d1 of input 4. In the table, the association relation between each output parameter field and the plurality of input parameter fields is configured in advance. Therefore, after the interface test device performs the preliminary filling process so that the field values of input 1, input 2, input 3 and input 4 corresponding to test case 1 are filled into the target sample template file, the interface test device automatically generates field value x1 of output 1 according to the association relation between output 1 and the plurality of input parameter fields, field value y1 of output 2 according to the association relation between output 2 and the plurality of input parameter fields, and field value z1 of output 3 according to the association relation between output 3 and the plurality of input parameter fields. Field value x1 of output 1 is the predicted output parameter data of output 1 in test case 1, field value y1 of output 2 is the predicted output parameter data of output 2 in test case 1, and field value z1 of output 3 is the predicted output parameter data of output 3 in test case 1. The other test cases are similar to test case 1 and are not described again here.
The secondary filling process is performed on the preliminarily filled target sample template file according to the predicted output parameter data of each test case, to obtain the filled target sample template file.
In a specific implementation, the target sample template file may be an Excel file. In the row corresponding to each test case, the region to be filled of each input parameter field is preliminarily filled based on the field values included in the input parameter field value combination, and after the preliminary filling, the region to be filled of each output parameter field is filled in the secondary filling process according to the predicted output parameter data of each test case. The secondary filling process is also performed automatically by the interface test device that performs the interface test method, not manually.
For example, referring to fig. 2, test case 1 corresponds to the input parameter field value combination (a1, b1, c1, d1). After the interface test device performs the preliminary filling process, the Excel file shows a1 in the cell where input 1 intersects test case 1, b1 in the cell where input 2 intersects test case 1, c1 in the cell where input 3 intersects test case 1, and d1 in the cell where input 4 intersects test case 1. The predicted output parameter data (x1, y1, z1) are then calculated, and the interface test device automatically fills x1 into the cell where output 1 intersects test case 1, y1 into the cell where output 2 intersects test case 1, and z1 into the cell where output 3 intersects test case 1.
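A sketch of computing the predicted output parameter data from the association relations and attaching them to each preliminarily filled row; the relations, field names and values are assumed for illustration, and writing the values back into the output columns of the Excel sheet would follow the same pattern as the preliminary filling above:

```python
# Assumed association relations between the output fields and the four input fields.
relations = {
    "out1": lambda r: r["in1"],                                    # out1 = in1
    "out2": lambda r: r["in1"] + r["in2"] + r["in3"] + r["in4"],   # out2 = sum of the inputs
}

test_inputs = [
    {"in1": 1, "in2": 2, "in3": 3, "in4": 4},
    {"in1": 5, "in2": 6, "in3": 7, "in4": 8},
]

filled_rows = []
for case in test_inputs:
    predicted = {name: rel(case) for name, rel in relations.items()}
    filled_rows.append({**case, **predicted})     # one fully filled template row per test case

for row in filled_rows:
    print(row)   # e.g. {'in1': 1, ..., 'out1': 1, 'out2': 10}
```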
Optionally, after the execution file of each test case is generated according to the configured preset configuration file, the interface testing method further includes: receiving a file editing instruction; in response to the file editing instruction, performing an editing operation on any one or more of a target field in the filled target sample template file and the field value corresponding to the target field, to obtain an edited target sample template file, where the editing operation includes adding, deleting or modifying; and modifying the configured configuration file according to the edited target sample template file, so as to generate the execution file of each test case.
The file editing instruction may be an instruction to add, delete or modify an input parameter field and/or an output parameter field, or an instruction to edit the association relation between an output parameter field and the plurality of input parameter fields. The file editing instruction may be sent by a test worker, or, in the case of a version update, a corresponding file editing instruction may be generated according to the test worker's version update operation and sent to the interface.
In response to the file editing instruction, an editing operation (adding, deleting or modifying) is performed on any one or more of a target field in the filled target sample template file and the field value corresponding to the target field, to obtain an edited target sample template file. For example, referring to fig. 2, a new column is added as input 5, and the output parameter field associated with input 5 is updated accordingly: the association relation of output 2 was originally that the field value of output 2 equals the sum of the field values of input 1 to input 4; after the update, the field value of output 2 equals the sum of the field values of input 1 to input 5.
The configured configuration file is modified according to the edited target sample template file to generate the execution file of each test case; that is, based on the edited target sample template file and according to the preset file conversion rule, the configured configuration file is modified so as to generate the execution file of each test case.
When a test worker wants to fine-tune the input parameters and/or output parameters, or wants to update their version, there is no need to resend the input parameter data or to reconfigure a brand-new test strategy: the target sample template file corresponding to the existing test strategy can simply be edited to generate a set of new test cases. The target sample template file can be an Excel file; the technical requirements for editing an Excel file are very low and do not depend on programming skills, each parameter in the table is highly readable, and the file is easy for inexperienced test workers to master. In addition, compared with resending the input parameter data and reconfiguring the test strategy, fine-tuning the target sample template file to generate new test cases is a simple and easily operated process, which significantly improves the maintainability of the test process and the efficiency of test case generation.
Step S106, converting the filled target sample template file into an execution file of each test case according to a preset file conversion rule.
Taking as an example a target sample template file that is a sheet of an Excel file, the preset file conversion rule may be as follows: according to the pre-configured number of input parameter fields and number of output parameter fields, determine which columns of the sheet are input parameter fields and which columns are output parameter fields; and, according to these numbers, determine the number of first sub-code templates corresponding to the input parameter fields and the number of second sub-code templates corresponding to the output parameter fields that are included in the preset code template of the execution file.
Each row in the sheet corresponds to one test case. For any test case, the field value of each input parameter field and the field value of each output parameter field in that row are read; the field value of each input parameter field is filled into the corresponding first sub-code template, and the field value of each output parameter field, together with the relational code representing the association relation between the output parameter field and the plurality of input parameter fields, is filled into the corresponding second sub-code template, so as to generate the execution file of each test case.
The execution file of a test case may be a JSON message execution file.
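A hedged sketch of such a conversion, turning one filled template row into one JSON execution file; the JSON layout, file name and the idea of carrying the relational code alongside the expected value are assumptions for illustration only:

```python
import json

def row_to_execution_file(case_no, row, in_fields, out_fields, relation_code):
    message = {
        "case": case_no,
        "request": {f: row[f] for f in in_fields},              # filled via the first sub-code templates
        "expected": {f: {"value": row[f],                       # filled via the second sub-code templates:
                         "relation": relation_code[f]}          # predicted value plus relational code
                     for f in out_fields},
    }
    path = f"case_{case_no}.json"
    with open(path, "w", encoding="utf-8") as fh:
        json.dump(message, fh, indent=2)
    return path

row = {"in1": 1, "in2": 2, "out1": 1, "out2": 3}
print(row_to_execution_file(1, row, ["in1", "in2"], ["out1", "out2"],
                            {"out1": "out1 = in1", "out2": "out2 = in1 + in2"}))
```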
Optionally, the preset file conversion rule includes a preset configuration rule. Converting the filled target sample template file into the execution file of each test case according to the preset file conversion rule includes: configuring a preset configuration file according to the preset configuration rule and the test input parameter data and predicted output parameter data of each test case; and generating the execution file of each test case according to the configured preset configuration file.
The preset configuration file may be a YAML (Yet Another Markup Language) configuration file, or a configuration file in another language. YAML is a language specifically designed for writing configuration files and is convenient for humans to read and write. Its characteristics include: it is case sensitive; it represents hierarchical relationships with indentation; the Tab key is not allowed for indentation, only spaces; and the number of indentation spaces does not matter, as long as elements at the same level are left-aligned.
Fig. 3 is a schematic interface diagram of a YAML configuration file according to an embodiment of the present application.
As shown in fig. 3, the YAML configuration file display interface shows: "test items:", "rule: precision_a_rule", "input: 10", "rule: precision_b_rule", "input: 20".
Here, "rule" indicates the test strategy identifier. Illustratively, "precision_a_rule" and "precision_b_rule" are two different test strategy identifiers. "input" indicates the number of preset input parameter fields.
As shown in fig. 3, for example, the test strategy identifier "precision_a_rule" corresponds to test strategy a, the number of preset input parameter fields of test strategy a is 10, and the target sample template file matching the test strategy identifier "precision_a_rule" is pre-configured with 10 preset input parameter fields. The test strategy identifier "precision_b_rule" corresponds to test strategy b, the number of preset input parameter fields of test strategy b is 20, and the target sample template file matching the test strategy identifier "precision_b_rule" is pre-configured with 20 preset input parameter fields.
In the interface testing method provided in the embodiment of the present application, the configuration information of each test policy may be configured by using a YAML configuration file, each test policy corresponds to one test policy identifier, and each test case may also be configured by using the YAML configuration file.
In specific implementation, the preset YAML configuration file can be configured according to the preset configuration rule and the test entry parameter data and the prediction exit parameter data of each test case.
The preset configuration rule is used for representing how to configure the YAML configuration file according to the filled target sample template file.
In an embodiment, the target sample template file after the filling processing is an Excel table after the filling processing, and the preset configuration rule may specify how to configure the YAML configuration file based on parameters of the Excel table, such as the number of rows, the number of columns, the parameter entry field, the parameter exit field, and each field value.
For example, the preset configuration rule may specify: determining, based on the number of rows in the Excel table, the number of test cases included in the Excel table, and further determining, based on that number, the number of YAML configuration files used for generating the execution files of the test cases. One test case corresponds to one YAML configuration file, which is used for generating the execution file of that test case.
For another example, the preset configuration rule may specify that a preset region of the YAML configuration file is filled based on the field values of the parameter fields.
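A minimal sketch of such a configuration rule, under the assumption that each Excel row is written to its own YAML configuration file whose "input" and "expected" regions hold the row's entry and exit field values; the file names and layout are illustrative.

```python
# One YAML configuration file per Excel row, i.e. per test case.
import yaml

def rows_to_case_configs(rows, entry_fields, exit_fields):
    paths = []
    for case_no, row in enumerate(rows, start=1):
        case = {
            "case_id": case_no,
            "input": dict(zip(entry_fields, row[:len(entry_fields)])),      # preset region: entry data
            "expected": dict(zip(exit_fields, row[len(entry_fields):])),    # predicted exit data
        }
        path = f"case_{case_no}.yaml"
        with open(path, "w", encoding="utf-8") as fp:
            yaml.safe_dump(case, fp, allow_unicode=True)
        paths.append(path)
    return paths   # number of YAML files == number of rows == number of test cases
```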
After the YAML configuration file after the configuration processing corresponding to each test case is obtained, the YAML configuration file can be called, and the execution file of each test case is generated based on the preset code template.
The YAML configuration file can be used for filling in a preset code template to obtain a code text, and the code text is determined as the code text of an execution file of a test case. Furthermore, running the code text can be regarded as executing the test case.
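For illustration, the following sketch stands in for this step, assuming Python's string.Template plays the role of the preset code template and that the generated code text sends the entry data to a hypothetical interface URL with the requests library.

```python
# Fill the preset code template from a per-case YAML configuration file,
# producing a runnable code text (the execution file of the test case).
from string import Template
import yaml

CODE_TEMPLATE = Template(
    "import requests\n"
    "resp = requests.post('$url', json=$payload)\n"
    "actual = resp.json()   # actual parameter data fed back by the interface\n"
)

def render_execution_file(case_yaml_path, url="http://interface-under-test/api"):
    with open(case_yaml_path, encoding="utf-8") as fp:
        case = yaml.safe_load(fp)
    code_text = CODE_TEMPLATE.substitute(url=url, payload=case["input"])
    out_path = case_yaml_path.replace(".yaml", "_exec.py")
    with open(out_path, "w", encoding="utf-8") as fp:
        fp.write(code_text)          # running this file executes the test case
    return out_path
```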
Through the preset configuration rule and the preset YAML configuration file, the code of the execution file is decoupled from the Excel table: the data in the filled Excel table is reflected in the converted execution file without a programmer manually writing or modifying code, which reduces the dependence of the interface testing process on the programming skills of the testing staff. Testing staff can automatically generate the execution file of each test case from the target sample template file without mastering programming skills.
In specific implementation, Jenkins scheduling tasks can be used for executing the execution files of each test case.
Step S108, executing the execution file to obtain the actual parameter data fed back by the interface.
The actual parameter data may be the data fed back by the interface, acquired after the execution file of each test case is executed through the interface; in other words, it is the actual exit parameter data obtained after the interface processes the entry parameter data under test. The actual parameter data may be the same as, or different from, the predicted parameter data.
In an embodiment, according to the preset file conversion rule, the filled target sample template file may be converted into the execution file of the first test case and that execution file is executed; then converted into the execution file of the second test case, which is executed; and so on, until the filled target sample template file is converted into the execution file of the K-th test case and that execution file is executed. K is the number of test cases corresponding to the target sample template file and may be a natural number greater than 0.
In another embodiment, the target sample template file after the padding processing may be converted into execution files of K test cases according to a preset file conversion rule, and then each execution file of the execution files of the K test cases may be executed respectively. Namely, the unified conversion is performed first, and after the file conversion of each test case is finished, the execution files of each test case are uniformly executed.
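Both orderings can be sketched as follows; convert() and execute() are placeholder stand-ins for the file conversion and file execution steps of this embodiment.

```python
def convert(template_file, case_index):
    return f"{template_file}.case{case_index}.json"     # stand-in for the file conversion step

def execute(exec_file):
    return {"file": exec_file, "actual": None}          # stand-in for executing the file and collecting feedback

def convert_then_execute_one_by_one(template_file, k):
    results = []
    for i in range(1, k + 1):
        exec_file = convert(template_file, i)           # convert the i-th test case
        results.append(execute(exec_file))              # and execute it immediately
    return results

def convert_all_then_execute(template_file, k):
    exec_files = [convert(template_file, i) for i in range(1, k + 1)]   # unified conversion first
    return [execute(f) for f in exec_files]                             # unified execution afterwards
```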
Step S110, generating a test result of the interface according to the predicted parameter data and the actual parameter data.
For any test case, if the predicted parameter data and the actual parameter data are completely the same, the interface test shows no error and there is no parameter that needs special attention or adjustment; if the predicted parameter data and the actual parameter data differ, at least one of the entry parameter data, the test scenario environment data and the configured test policy has a problem, and the erroneous parameter needs to be checked and corrected.
In specific implementation, an assertion rule can be configured for the decision scenario, and the actual parameter data is asserted according to the assertion rule to obtain an assertion result for the interface to be tested. The assertion result is then analyzed to generate a test report. The test report may take the form of a log and may carry the actual parameter data that is inconsistent with the predicted parameter data, together with related information of the corresponding test case.
Assertion processing can be understood as a "judgment" operation. For example, the tester expects the actual exit parameter data to be 200, but the value actually obtained is 404; the cumbersome way to find this out is to open the data fed back by the interface manually and check by eye whether it is 200. With an assertion, the program judges automatically whether the actual parameter data is 200, without manual checking. The convenience of assertions is especially obvious when a large number of requests need to be tested, and considerable time can be saved.
The pre-configured assertion rule may, for example, determine the age region in which the field value of the exit parameter field "age" in the actual exit parameter data falls, for example between 0 and 18, between 18 and 50, or greater than 50. If the acquired field value of "age" is between 0 and 18, the test result of the test case is returned as test result 1; if it is between 18 and 50, the test result is returned as test result 2; and if it is greater than 50, the test result is returned as test result 3. The test results of the test cases are then counted to obtain the test result of the interface.
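A sketch of such an assertion rule follows; the boundary handling at exactly 18 and 50 is an assumption, since it is not specified above.

```python
# Map the "age" value in the actual exit parameter data to an age region,
# return the corresponding test result, and tally results per interface.
from collections import Counter

def assert_age_rule(actual_exit_data):
    age = actual_exit_data["age"]
    if 0 <= age < 18:
        return "test result 1"
    if 18 <= age < 50:
        return "test result 2"
    return "test result 3"          # age of 50 or more

def interface_test_result(all_actual_exit_data):
    # count the per-case results to obtain the overall result of the interface
    return Counter(assert_age_rule(d) for d in all_actual_exit_data)
```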
Optionally, generating a test result of the interface according to the predicted parameter data and the actual parameter data, including: under the condition that the predicted parameter data and the actual parameter data are not completely consistent, determining a test case with the predicted parameter data not matched with the actual parameter data as a target test case, and determining a test case with the predicted parameter data matched with the actual parameter data as a non-target test case; performing assertion processing on the target test case according to a preset assertion rule to obtain an assertion result of the target test case; and counting the assertion result of the target test case and the test result of the non-target test case to obtain the test result of the interface.
And under the condition that the predicted parameter data is completely consistent with the actual parameter data, the assertion processing is not triggered.
In specific implementation, the assertion rule may be configured for the decision scenario. In the case where the predicted parameter data is not completely consistent with the actual parameter data, assertion processing is performed on the actual parameter data of the target test case according to the assertion rule to obtain an assertion result for the interface to be tested. The assertion result is then analyzed to generate a test report. The test report may take the form of a log and may carry the actual parameter data that is inconsistent with the predicted parameter data, together with related information of the corresponding test case.
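A minimal sketch of such a log-form test report, assuming Python's standard logging module and illustrative field names.

```python
import logging

logging.basicConfig(filename="interface_test_report.log", level=logging.INFO)

def write_report(case_id, predicted, actual):
    if predicted != actual:
        # carry the mismatching actual exit data and the related case information
        logging.warning("case %s: predicted %r but interface returned %r", case_id, predicted, actual)
    else:
        logging.info("case %s: predicted and actual exit data match", case_id)
```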
Assertion processing can be understood as a "judgment" operation. For example, the tester expects the actual parameter data to be 200 but actually obtains 404; the clumsy way to discover this is to open the data fed back by the interface manually and check by eye whether it is 200. With an assertion, the program judges automatically whether the actual parameter data is 200, without manual checking; this is especially convenient, and saves considerable time, when a large number of requests need to be tested.
The pre-configured assertion rule may, for example, determine the age region in which the field value of the exit parameter field "age" in the actual exit parameter data falls, for example between 0 and 18, between 18 and 50, or greater than 50. If the acquired field value of "age" is between 0 and 18, the test result of the test case is returned as test result 1; if it is between 18 and 50, the test result is returned as test result 2; and if it is greater than 50, the test result is returned as test result 3. In addition, the test result of a non-target test case may be determined in advance as test result 4. The test results of the test cases are then counted to obtain the test result of the interface.
Optionally, generating a test result of the interface according to the predicted parameter data and the actual parameter data, including: counting the number of test cases with the predicted parameter data matched with the actual parameter data as a first number; counting the number of the test cases with the predicted parameter data not matched with the actual parameter data as a second number; determining the prediction accuracy of the interface according to the first quantity and the second quantity; and in a plurality of pre-divided probability threshold partitions, determining the probability threshold partition where the prediction accuracy is located as a target partition, and determining a test result corresponding to the target partition as a test result of the interface.
The prediction accuracy may be a ratio determined from the first quantity and the second quantity, for example the first quantity (the number of matched test cases) divided by the sum of the first quantity and the second quantity.
For each parameter field, a plurality of probability threshold partitions may be pre-divided. For example, for parameter field 1, the corresponding probability threshold partitions may include [0, 0.8) and [0.8, 1]. If, for parameter field 1, the prediction accuracy of test case 1 falls within [0, 0.8), the test result 1 corresponding to [0, 0.8) is determined as the test result of parameter field 1 in test case 1; the test results of the test cases for parameter field 1 are counted as the test result of parameter field 1, and the test results of the parameter fields are counted to obtain the test result of the interface.
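For illustration, the following sketch assumes the prediction accuracy is the proportion of matched test cases and maps it onto the two partitions [0, 0.8) and [0.8, 1]; both the formula and the result names are assumptions beyond what is stated above.

```python
def prediction_accuracy(matched, mismatched):
    total = matched + mismatched
    return matched / total if total else 0.0

def partition_result(accuracy):
    # map the accuracy to the pre-divided probability threshold partition
    if 0.0 <= accuracy < 0.8:
        return "test result 1"      # partition [0, 0.8)
    return "test result 2"          # partition [0.8, 1]
```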
In the embodiment of the interface testing method shown in fig. 1, first, a test policy identifier and a plurality of pieces of entry parameter data are acquired through the interface to be tested, where the plurality of pieces of entry parameter data include a plurality of entry parameter fields and the field value corresponding to each of them; one piece of entry parameter data includes one entry parameter field and its corresponding field value. Second, the target sample template file corresponding to the test policy identifier is filled according to the plurality of pieces of entry parameter data to obtain a filled target sample template file; the filled target sample template file includes the test entry parameter data and predicted exit parameter data of each of a plurality of test cases, and the target sample template file is configured with an exit parameter field and the association relation between the exit parameter field and the plurality of entry parameter fields. Then, the filled target sample template file is converted into the execution file of each test case according to a preset file conversion rule; the execution file of each test case is executed to obtain the actual parameter data fed back by the interface; and finally, the test result of the interface is generated according to the predicted parameter data and the actual parameter data. In this way, on the one hand, the target sample template file can be filled according to the entry parameter data, so that the filled file includes the test entry parameter data and predicted exit parameter data of each of the plurality of test cases, and test cases can be flexibly assembled using the entry parameter data; on the other hand, the filled target sample template file can be converted into the execution file of each test case according to the preset file conversion rule, so that programmers do not need to write dedicated code for the execution file of each test case, which reduces the dependence of the interface testing process on programmers and improves the efficiency of test case generation.
The embodiment of the present application further provides another embodiment of an interface testing method based on the same technical concept as the foregoing embodiment of the method. Fig. 4 is a processing flow chart of another interface testing method according to an embodiment of the present application.
As shown in fig. 4, the interface testing method includes:
S402, test data are obtained.
Acquiring test data through an interface, wherein the test data carries a test strategy identifier and a plurality of parameter input data, and the plurality of parameter input data comprise a plurality of parameter input fields and field values corresponding to each parameter input field in the plurality of parameter input fields; an entry parameter data includes an entry parameter field and its corresponding field value.
S404, testing and assembling.
And filling the target sample template file corresponding to the test strategy identification according to the plurality of pieces of entry parameter data to obtain the filled target sample template file, so as to realize flexible assembly of the test data. The target sample template file after filling processing comprises test parameter input data and prediction parameter output data of each test case in a plurality of test cases. The target sample template file is configured with a parameter-out field and an incidence relation between the parameter-out field and a plurality of parameter-in fields.
S406, generating an execution file.
And converting the target sample template file after filling processing into an execution file of each test case according to a preset file conversion rule.
S408, testing and checking.
And executing the execution file of each test case to obtain the actual parameter data fed back by the interface. And generating a test result of the interface according to the predicted parameter data and the actual parameter data.
S410, generating a test report.
And generating a test report for reflecting the test result according to the test result.
Since the technical concept is the same, the description in this embodiment is simpler, and the related parts should refer to the corresponding description of the method embodiments provided above.
The embodiment of the present application further provides another embodiment of an interface testing method based on the same technical concept as the foregoing embodiment of the method. Fig. 5 is a processing flow chart of another interface testing method according to an embodiment of the present application.
Referring to fig. 5, the interface testing method specifically includes steps S502 to S514.
Step S502, collecting the entry parameter data through the interface.
Step S504, the document is configured.
In configuring the document, the interface to be tested may be pre-configured with one or more test policies. Further, a corresponding test sample template file may be configured for each test policy. Configuring the document may also include pre-configuring the assertion rules.
Step S506, generating a test case through tool permutation and combination.
Illustratively, the tool may be a test case generation tool, e.g., pytest.
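A sketch of such permutation-and-combination case generation with pytest follows; the two entry fields, their candidate values and the expected relation (here a simple sum) are illustrative assumptions, and the "actual" value is a stand-in for the interface's feedback.

```python
import itertools
import pytest

FIELD_VALUES = {"in_1": [1, 2], "in_2": [10, 20]}
CASES = list(itertools.product(*FIELD_VALUES.values()))   # Cartesian product of candidate field values

@pytest.mark.parametrize("in_1,in_2", CASES)
def test_interface_case(in_1, in_2):
    expected = in_1 + in_2        # predicted exit data from the preset relation
    actual = in_1 + in_2          # stand-in for the actual parameter data fed back by the interface
    assert actual == expected
```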
Step S508, reading and parsing the Excel file.
The Excel file in this step may be a filled-in target sample template file.
Step S510, executing the test case.
Each test case is executed, which may mean executing the execution file of each test case converted from the Excel file.
Step S512, check according to the expected parameters.
In the verification against the expected parameters, a test result can be generated according to the predicted parameter data and the actual parameter data.
Step S514, generating a test report.
Since the technical concept is the same, the description in this embodiment is simpler, and the related parts should be referred to the corresponding description of the method embodiment provided above.
In the above embodiments, an interface testing method is provided, and correspondingly, an interface testing apparatus is also provided, which is described below with reference to the accompanying drawings.
Fig. 6 is a schematic diagram of an interface testing apparatus according to an embodiment of the present application.
The present embodiment provides an interface testing apparatus 600, which includes:
a data obtaining unit 601, configured to obtain a test policy identifier and multiple pieces of parameter entry data through an interface to be tested, where the multiple pieces of parameter entry data include multiple parameter entry fields and field values corresponding to each of the multiple parameter entry fields; an input parameter data comprises an input parameter field and a corresponding field value;
a file filling unit 602, configured to perform filling processing on a target sample template file corresponding to the test policy identifier according to the multiple pieces of parameter data, so as to obtain a target sample template file after the filling processing; the target sample template file after filling processing comprises test parameter input data and prediction parameter output data of each test case in a plurality of test cases; configuring a parameter output field and an incidence relation between the parameter output field and a plurality of parameter input fields in a target sample template file;
the file conversion unit 603 is configured to convert the target sample template file after the filling processing into an execution file of each test case according to a preset file conversion rule;
a file executing unit 604, configured to execute the execution file to obtain actual parameter data fed back by the interface;
the result generating unit 605 is configured to generate a test result of the interface according to the predicted parameter data and the actual parameter data.
Optionally, the file filling unit 602, when performing filling processing on the target sample template file corresponding to the test policy identifier according to the multiple pieces of entry parameter data to obtain a filled target sample template file, executes the following steps: according to the test strategy identification, inquiring a target sample template file matched with the test strategy identification in a plurality of test sample template files pre-configured by the interface; and filling the target sample template file according to the incidence relation between the parameter output field and the parameter input fields and each parameter input data to obtain the filled target sample template file.
Optionally, the number of field values of each parameter field is at least one; the association relation between the parameter-out field and the parameter-in fields is determined according to the field value of each parameter-in field and a preset relation corresponding to the parameter-out field;
the file filling unit 602 performs the following steps when performing filling processing on the target sample template file according to the association relationship between the parameter output field and the parameter input fields and each parameter input data to obtain a filled target sample template file:
generating a plurality of entry parameter field value combinations according to each entry parameter field and the at least one field value of each entry parameter field, and determining each combination as the test entry parameter data of one test case, where the number of field values included in each combination is the same as the number of the plurality of entry parameter fields and the field values in each combination correspond to the plurality of entry parameter fields one to one; performing primary filling processing on the target sample template file according to the test entry parameter data of each test case; calculating the field value of the exit parameter field corresponding to each test case according to the association relation between the exit parameter field and the plurality of entry parameter fields and the test entry parameter data of each test case, to obtain the predicted exit parameter data of each test case; and performing secondary filling processing on the primarily filled target sample template file according to the predicted exit parameter data of each test case to obtain the filled target sample template file.
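A minimal sketch of this two-stage filling, assuming an in-memory list of rows stands in for the target sample template file and the preset relation for the exit parameter field is a simple sum of the entry field values.

```python
import itertools

def fill_template(entry_fields, field_values, relation=sum):
    combos = itertools.product(*(field_values[f] for f in entry_fields))
    template_rows = []
    for combo in combos:                                   # one combination == one test case
        row = {f: v for f, v in zip(entry_fields, combo)}  # primary filling: test entry parameter data
        row["out"] = relation(combo)                       # predicted exit data from the preset relation (secondary filling)
        template_rows.append(row)
    return template_rows

rows = fill_template(["in_1", "in_2"], {"in_1": [1, 2], "in_2": [10, 20]})
# 4 test cases; each row carries its entry field values and the predicted exit value
```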
Optionally, the preset file conversion rule includes a preset configuration rule; the file conversion unit 603, when converting the filled target sample template file into an execution file of each test case according to the preset file conversion rule, specifically performs the following steps: configuring a preset configuration file according to the preset configuration rule and the test entry parameter data and predicted exit parameter data of each test case; and generating the execution file of each test case according to the preset configuration file after the configuration processing.
Optionally, the interface testing apparatus 600 further includes: an instruction receiving unit 606 for receiving a file editing instruction; a file editing unit 607, configured to, in response to a file editing instruction, perform an editing operation on any one or more of a target field and a field value corresponding to the target field in the filled target sample template file to obtain an edited target sample template file; wherein the editing operation comprises adding, deleting or modifying; the file modifying unit 608 is configured to modify the configuration file after the configuration processing according to the edited target sample template file, so as to generate an execution file corresponding to the test case.
Optionally, when the result generation unit 605 generates the test result of the interface according to the predicted parameter data and the actual parameter data, the following steps are performed: under the condition that the predicted parameter data and the actual parameter data are not completely consistent, determining a test case with the predicted parameter data not matched with the actual parameter data as a target test case, and determining a test case with the predicted parameter data matched with the actual parameter data as a non-target test case; performing assertion processing on the target test case according to a preset assertion rule to obtain an assertion result of the target test case; and counting the assertion result of the target test case and the test result of the non-target test case to obtain the test result of the interface.
Optionally, when the result generation unit 605 generates the test result of the interface according to the predicted parameter data and the actual parameter data, the following steps are performed: counting the number of test cases with the predicted parameter data matched with the actual parameter data as a first number; counting the number of the test cases with the predicted parameter data not matched with the actual parameter data as a second number; determining the prediction accuracy of the interface according to the first quantity and the second quantity; and in a plurality of pre-divided probability threshold partitions, determining the probability threshold partition where the prediction accuracy is located as a target partition, and determining a test result corresponding to the target partition as a test result of the interface.
Optionally, the interface testing apparatus 600 further includes: a field obtaining unit 609, configured to obtain a plurality of preset entry fields configured in advance in the target sample template file; a field determining unit 610, configured to determine whether the obtained multiple input parameter data matches multiple preset input parameter fields; if not, the alarm generating unit 611 is operated, and the alarm generating unit 611 is used to generate alarm information that the input parameter has an error.
The interface testing apparatus provided in the embodiment of the present application includes a data acquisition unit, a file filling unit, a file conversion unit, a file execution unit and a result generation unit. The data acquisition unit is used for acquiring, through the interface to be tested, a test policy identifier and a plurality of pieces of entry parameter data, the plurality of pieces of entry parameter data including a plurality of entry parameter fields and the field value corresponding to each of them; one piece of entry parameter data includes one entry parameter field and its corresponding field value. The file filling unit is used for filling the target sample template file corresponding to the test policy identifier according to the plurality of pieces of entry parameter data to obtain a filled target sample template file; the filled target sample template file includes the test entry parameter data and predicted exit parameter data of each of a plurality of test cases, and the target sample template file is configured with an exit parameter field and the association relation between the exit parameter field and the plurality of entry parameter fields. The file conversion unit is used for converting the filled target sample template file into the execution file of each test case according to a preset file conversion rule. The file execution unit is used for executing the execution file to obtain the actual parameter data fed back by the interface. The result generation unit is used for generating the test result of the interface according to the predicted parameter data and the actual parameter data. In this way, on the one hand, the target sample template file can be filled according to the entry parameter data, so that the filled file includes the test entry parameter data and predicted exit parameter data of each of the plurality of test cases, and test cases can be flexibly assembled using the entry parameter data; on the other hand, the filled target sample template file can be converted into the execution file of each test case according to the preset file conversion rule, so that programmers do not need to write dedicated code for the execution file of each test case, which reduces the dependence of the interface testing process on programmers and improves the efficiency of test case generation.
Corresponding to the above-described interface testing method, based on the same technical concept, an embodiment of the present application further provides an interface testing device, where the interface testing device is configured to execute the above-described interface testing method, and fig. 7 is a schematic structural diagram of an interface testing device provided in the embodiment of the present application.
As shown in fig. 7, the interface test apparatus may have a relatively large difference due to different configurations or performances, and may include one or more processors 701 and a memory 702, where one or more stored applications or data may be stored in the memory 702. Memory 702 may be, among other things, transient storage or persistent storage. The application program stored in memory 702 may include one or more modules (not shown), each of which may include a series of computer-executable instructions in the interface test apparatus. Still further, the processor 701 may be configured to communicate with the memory 702 to execute a series of computer-executable instructions in the memory 702 on the interface test device. The interface test equipment may also include one or more power supplies 703, one or more wired or wireless network interfaces 704, one or more input/output interfaces 705, one or more keyboards 706, and the like.
In one particular embodiment, the interface test apparatus comprises a memory, and one or more programs, wherein the one or more programs are stored in the memory, and the one or more programs may comprise one or more modules, and each module may comprise a series of computer-executable instructions for the interface test apparatus, and the one or more programs configured to be executed by the one or more processors include computer-executable instructions for:
acquiring a test strategy identifier and a plurality of parameter input data through an interface to be tested, wherein the plurality of parameter input data comprise a plurality of parameter input fields and field values corresponding to each parameter input field in the plurality of parameter input fields; an input parameter data comprises an input parameter field and a corresponding field value;
filling the target sample template file corresponding to the test strategy identification according to the plurality of pieces of parameter entering data to obtain a filled target sample template file; the target sample template file after filling processing comprises test parameter input data and prediction parameter output data of each test case in a plurality of test cases; the target sample template file is configured with a parameter-output field and an incidence relation between the parameter-output field and a plurality of parameter-input fields;
converting the target sample template file after filling processing into an execution file of each test case according to a preset file conversion rule;
executing the execution file to obtain actual parameter data fed back by the interface;
and generating a test result of the interface according to the predicted parameter data and the actual parameter data.
Based on the same technical concept, the embodiment of the present application further provides a computer-readable storage medium corresponding to the above-described interface testing method.
The present embodiments provide a computer-readable storage medium for storing computer-executable instructions, which when executed by a processor implement the following process:
acquiring a test strategy identifier and a plurality of parameter input data through an interface to be tested, wherein the plurality of parameter input data comprise a plurality of parameter input fields and field values corresponding to each parameter input field in the plurality of parameter input fields; an input parameter data comprises an input parameter field and a corresponding field value;
filling the target sample template file corresponding to the test strategy identification according to the plurality of pieces of parameter entering data to obtain a filled target sample template file; the target sample template file after filling processing comprises test parameter input data and prediction parameter output data of each test case in a plurality of test cases; configuring a parameter output field and an incidence relation between the parameter output field and a plurality of parameter input fields in a target sample template file;
converting the target sample template file after filling processing into an execution file of each test case according to a preset file conversion rule;
executing the execution file to obtain actual parameter data fed back by the interface;
and generating a test result of the interface according to the predicted parameter data and the actual parameter data.
It should be noted that the embodiment related to the computer-readable storage medium in this specification and the embodiment related to the interface testing method in this specification are based on the same inventive concept, and therefore, specific implementation of this embodiment may refer to implementation of the foregoing corresponding method, and repeated details are not described again.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, embodiments of the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the description may take the form of a computer program product embodied on one or more computer-readable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The description has been presented with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the description. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable interface testing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable interface testing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable interface testing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable interface test apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both persistent and non-persistent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Disks (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. As defined herein, a computer readable medium does not include a transitory computer readable medium such as a modulated data signal and a carrier wave.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Embodiments of the application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. One or more embodiments of the specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only an example of this document and is not intended to limit this document. Various modifications and changes may occur to those skilled in the art from this document. Any modifications, equivalents, improvements, etc. which come within the spirit and principle of the disclosure are intended to be included within the scope of the claims of this document.
Claims (11)
1. An interface testing method, comprising:
acquiring a test strategy identifier and a plurality of parameter input data through an interface to be tested, wherein the parameter input data comprises a plurality of parameter input fields and field values corresponding to each parameter input field in the parameter input fields; an input parameter data comprises an input parameter field and a corresponding field value;
filling the target sample template file corresponding to the test strategy identification according to the plurality of pieces of parameter input data to obtain a filled target sample template file; the target sample template file after filling processing comprises test parameter input data and prediction parameter output data of each test case in a plurality of test cases; the target sample template file is configured with a parameter-out field and an incidence relation between the parameter-out field and the parameter-in fields;
converting the target sample template file after filling processing into an execution file of each test case according to a preset file conversion rule;
executing the execution file to obtain actual parameter data fed back by the interface;
and generating a test result of the interface according to the predicted parameter data and the actual parameter data.
2. The method according to claim 1, wherein the filling the target sample template file corresponding to the test policy identifier according to the plurality of pieces of entry data to obtain a filled target sample template file includes:
according to the test strategy identification, inquiring a target sample template file matched with the test strategy identification in a plurality of test sample template files pre-configured by the interface;
and filling the target sample template file according to the incidence relation between the parameter-output field and the parameter-input fields and each parameter-input data to obtain the filled target sample template file.
3. The method of claim 2, wherein the number of field values of each parameter-in field is at least one; the association relation between the parameter-out field and the parameter-in fields is as follows: the field value of the parameter-out field is determined according to the field value of each parameter-in field and a preset relation corresponding to the parameter-out field; and the filling the target sample template file according to the association relation between the parameter-out field and the parameter-in fields and each piece of parameter-in data to obtain a filled target sample template file includes:
generating a plurality of parameter-in field value combinations according to each parameter-in field and the at least one field value of each parameter-in field, and determining each parameter-in field value combination as the test entry parameter data of one test case; the number of field values included in each parameter-in field value combination is the same as the number of the plurality of parameter-in fields, and the field values included in each combination correspond to the plurality of parameter-in fields one to one;
performing primary filling processing on the target sample template file according to the test entry parameter data of each test case;
calculating the field value of the parameter-out field corresponding to each test case according to the association relation between the parameter-out field and the parameter-in fields and the test entry parameter data of each test case, to obtain the predicted exit parameter data of each test case;
and performing secondary filling processing on the target sample template file subjected to the primary filling processing according to the predicted parameter data of each test case to obtain the target sample template file subjected to the filling processing.
4. The method of claim 1, wherein the preset file conversion rules comprise preset configuration rules; converting the target sample template file after the filling processing into an execution file of each test case according to a preset file conversion rule, including:
configuring a preset configuration file according to the preset configuration rule and the test entry parameter data and predicted exit parameter data of each test case;
and generating an execution file of each test case according to the preset configuration file after the configuration processing.
5. The method according to claim 4, wherein after generating the execution file of each test case according to the preset configuration file after the configuration processing, the method further comprises:
receiving a file editing instruction;
in response to the file editing instruction, performing editing operation on any one or more of a target field in the filled target sample template file and a field value corresponding to the target field to obtain an edited target sample template file; wherein the editing operation comprises adding, deleting or modifying; the target field refers to any field in the target sample template file;
and modifying the preset configuration file after the configuration processing according to the edited target sample template file so as to generate an execution file of each test case.
6. The method of claim 1, wherein generating the test result of the interface according to the predicted parametric data and the actual parametric data comprises:
under the condition that the predicted parameter data and the actual parameter data are not completely consistent, determining a test case with unmatched predicted parameter data and actual parameter data as a target test case, and determining a test case with matched predicted parameter data and actual parameter data as a non-target test case;
performing assertion processing on the target test case according to a preset assertion rule to obtain an assertion result of the target test case;
and counting the assertion result of the target test case and the test result of the non-target test case to obtain the test result of the interface.
7. The method of claim 1, wherein generating the test result of the interface according to the predicted parametric data and the actual parametric data comprises:
counting the number of test cases matched with the predicted parameter data and the actual parameter data as a first number; counting the number of test cases with unmatched predicted parameter data and actual parameter data as a second number;
determining the prediction accuracy of the interface according to the first quantity and the second quantity;
and in a plurality of pre-divided probability threshold partitions, determining the probability threshold partition where the prediction accuracy is located as a target partition, and determining a test result corresponding to the target partition as a test result of the interface.
8. The method according to claim 2, wherein after querying a corresponding target sample template file from a plurality of test sample template files pre-configured by the interface according to the test policy identifier, the method further comprises:
acquiring a plurality of preset entry fields which are configured in advance in the test strategy identification;
judging whether the acquired multiple input parameter data are matched with the multiple preset input parameter fields;
if not, generating alarm information indicating that the entry parameters are erroneous.
9. An interface testing apparatus, the apparatus comprising:
the data acquisition unit is used for acquiring a test strategy identifier and a plurality of parameter input data through an interface to be tested, wherein the parameter input data comprise a plurality of parameter input fields and field values corresponding to each parameter input field in the parameter input fields; an input parameter data comprises an input parameter field and a field value corresponding to the input parameter field;
the file filling unit is used for filling the target sample template file corresponding to the test strategy identifier according to the plurality of pieces of parameter input data to obtain a filled target sample template file; the target sample template file after filling processing comprises test parameter input data and prediction parameter output data of each test case in a plurality of test cases; the target sample template file is configured with a parameter-out field and an incidence relation between the parameter-out field and the parameter-in fields;
the file conversion unit is used for converting the target sample template file after filling processing into an execution file of each test case according to a preset file conversion rule;
the file execution unit is used for executing the execution file and obtaining the actual parameter data fed back by the interface;
and the result generating unit is used for generating a test result of the interface according to the predicted parameter data and the actual parameter data.
10. An interface test apparatus, characterized in that the apparatus comprises:
a processor; and a memory configured to store computer-executable instructions that, when executed, cause the processor to perform the interface testing method of any one of claims 1-8.
11. A computer-readable storage medium for storing computer-executable instructions which, when executed by a processor, implement the interface testing method of any one of claims 1-8.