CN113342679A - Interface test method and test device


Info

Publication number
CN113342679A
CN113342679A
Authority
CN
China
Prior art keywords
interface
test
case
information
script
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110726729.7A
Other languages
Chinese (zh)
Inventor
周晔
穆海洁
吴启春
梁星元
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Pnr Co ltd
Original Assignee
China Pnr Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Pnr Co ltd
Priority to CN202110726729.7A
Publication of CN113342679A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3684 Test management for test design, e.g. generating new test cases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/50 Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F9/5083 Techniques for rebalancing the load in a distributed system

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention discloses an interface testing method comprising the following steps. An interface information acquisition step acquires and stores the interface information of the interface to be tested. A script generation step automatically generates a test script based on the interface information and the test requirements. A case generation step automatically generates a number of test cases based on the interface information, case rules and expected results. A test run step acquires the configured test conditions and, when the test conditions are met, calls and executes the test script and the test cases, tests the interface to be tested, and records the test results; the test run step is performed on a distributed architecture, and the test script and the test cases are distributed to target nodes on the distributed architecture by an algorithm based on load balancing. The invention also discloses an interface testing device comprising an interface information acquisition component, a script generation component, a case generation component and a test run component.

Description

Interface test method and test device
Technical Field
The invention relates to the technical field of software, and in particular to testing technology for frequently delivered software.
Background
With the rise of distributed architecture and microservice deployment, software faces frequent upgrades, deployments and version updates. Software that must be delivered, updated and deployed frequently demands better cooperation among developers, technical operations staff and quality assurance personnel. The software industry is increasingly aware that development and operations must work in close cooperation in order to deliver software products and services on time. Against this demand, DevOps has become a popular technology. DevOps (a combination of Development and Operations) is a collective term for a set of processes, methods and systems that facilitate communication, collaboration and integration between development (application/software engineering), technical operations and quality assurance (QA) departments. It is a mode of communication and cooperation between software developers (Dev) and IT operations technicians (Ops): by automating the processes of software delivery and architecture change, software can be built, tested and released more quickly, frequently and reliably. DevOps can thus be thought of as the intersection of development, technical operations and quality assurance.
Traditional software organizations set up development, IT operations and quality assurance as separate departments. How to adopt new development methods (e.g. agile software development) in such an environment is an important topic: under the old way of working, development and deployment did not require deep, cross-departmental support from IT operations or QA, whereas the new methods demand extremely tight multi-department collaboration.
For example, in a conventional interface test project, a test script and test cases must be written manually, and the test script is then run to execute the test cases and obtain a test result. This process involves several different kinds of work. Writing a test script means writing code: developers usually have this ability, but operations and support personnel generally do not. Writing test cases requires a deep understanding of the actual running environment: operations personnel can write them, but developers do not necessarily know the running environment well enough to do so. Finally, analyzing and evaluating test results requires the expertise of quality assurance personnel. In the prior art, an interface test project therefore needs the close cooperation of three different kinds of people; individuals with all three abilities are extremely rare because the threshold is too high, so multi-department collaboration must be arranged. Multi-department collaboration inevitably brings problems of communication and coordination, which increase project cost and lengthen the test cycle, failing the requirements of fast rhythm, high-speed iteration and high-quality, stable delivery.
Disclosure of Invention
The invention provides an interface test technology that can automatically generate scripts, automatically generate test cases, and automatically run according to configured conditions.
According to an embodiment of the present invention, an interface testing method is provided, which includes the following steps:
an interface information acquisition step, namely acquiring and storing interface information of an interface to be tested;
a script generation step, which is to automatically generate a test script based on the interface information and the test requirement;
a case generation step, namely automatically generating a plurality of test cases based on the interface information, case rules and expected results;
and a test run step, namely acquiring configured test conditions and, when the test conditions are met, calling and executing the test script and the test cases, testing the interface to be tested, and recording the test results, wherein the test run step is performed on a distributed architecture, and the test script and the test cases are distributed to target nodes on the distributed architecture by an algorithm based on load balancing.
In one embodiment, the interface information includes interface basic information and interface parameter information, and in the interface information obtaining step, an interface document is generated for the interface to be tested, and the interface basic information and the interface parameter information of the interface to be tested are stored in the corresponding interface document.
In one embodiment, the interface basic information includes: interface name, interface request type, interface address, retry count, timeout, whether the request is asynchronous, and return data; the interface basic information is recorded in the source code of the interface in the form of method-level annotations. The interface parameter information includes: parameter name, parameter type, whether the parameter is mandatory, parameter structure path, data type, and parameter special attribute; the interface parameter information is recorded in the source code of the interface in the form of parameter-level annotations.
In one embodiment, saving the interface basic information and the interface parameter information of the interface to be tested into the corresponding interface document includes homonym differentiation processing, which includes the following. For different program projects, a globally unique identification name is added for each project's package paths and for the interfaces within the same package path, so as to distinguish same-named interfaces across projects. For complex nested interface parameters, the parent-child relationships among parameters are stored, so that a specified interface parameter can be located more quickly and accurately when test cases are generated in the subsequent case generation step. A flag is set to distinguish interface request data from return data, so that identically named parameters occurring in both can be told apart.
In one embodiment, the test requirements considered in generating the test script include: random generation rules for input data; multi-environment configuration information for running scripts in different test or production environments; case database configuration information for querying the test cases stored in the case database; and custom data generation rules for input data, so that when a request is sent, the corresponding request data is generated according to the data generation rules to replace the placeholders in the script, and the return data is extracted with regular expressions and matched against the expected data.
In one embodiment, customized script features can be added at use time, including encryption and decryption of request messages or request data and of response data, priority running of the scripts that the current script depends on, data cleanup before and data restoration after a run, and adding redis data requests.
In one embodiment, in the case generation step, the corresponding case rule engine is called according to the case rules, the interface basic information and interface parameter information of the interface to be tested are obtained, and the case rule engine is run on that information to generate a number of test data items satisfying the expected results; these test data items are the test cases.
In one embodiment, each test case is given a globally unique test case number and stored in a case database.
In one embodiment, the configured test conditions include: script path, use case path, running environment, running cycle time, and feedback path. When the test conditions are met, the test script and the test cases are called, a suitable target node on the distributed architecture is found by the load-balancing algorithm to execute them, and a test report is generated from the test results and fed back along the feedback path.
In one embodiment, the test report includes: the number of project interfaces, the number of successful use cases, the number of failed use cases, the detailed results of use case execution, the numbers of successful and failed steps, and a display of the specific use case steps.
According to an embodiment of the present invention, an interface testing apparatus is provided, including:
the interface information acquisition component acquires and stores interface information of the interface to be tested;
the script generation component automatically generates a test script based on the interface information and the test requirement;
the case generation component automatically generates a plurality of test cases based on the interface information, case rules and expected results;
and a test run component, deployed on a distributed architecture, which acquires the configured test conditions, calls the test script and the test cases when the test conditions are met, distributes them to target nodes on the distributed architecture by an algorithm based on load balancing, executes them on the target nodes, and records the test results.
In one embodiment, the interface information includes interface basic information and interface parameter information, the interface information obtaining component includes an interface document generating module and an interface information storing module, the interface document generating module generates an interface document for the interface to be tested, and the interface information storing module stores the interface basic information and the interface parameter information of the interface to be tested into the corresponding interface document.
In one embodiment, the interface basic information includes: interface name, interface request type, interface address, retry count, timeout, whether the request is asynchronous, and return data; the interface basic information is recorded in the source code of the interface in the form of method-level annotations. The interface parameter information includes: parameter name, parameter type, whether the parameter is mandatory, parameter structure path, data type, and parameter special attribute; the interface parameter information is recorded in the source code of the interface in the form of parameter-level annotations.
In one embodiment, when the interface information saving module stores the interface basic information and interface parameter information of the interface to be tested in the corresponding interface document, the storing includes homonym differentiation processing, which includes the following. For different program projects, a globally unique identification name is added for each project's package paths and for the interfaces within the same package path, so as to distinguish same-named interfaces across projects. For complex nested interface parameters, the parent-child relationships among parameters are stored, so that a specified interface parameter can be located more quickly and accurately when test cases are generated in the subsequent case generation step. A flag is set to distinguish interface request data from return data, so that identically named parameters occurring in both can be told apart.
In one embodiment, the test requirements considered by the script generation component in generating the test script include: random generation rules for input data; multi-environment configuration information for running scripts in different test or production environments; case database configuration information for querying the test cases stored in the case database; and custom data generation rules for input data, so that when a request is sent, the corresponding request data is generated according to the data generation rules to replace the placeholders in the script, and the return data is extracted with regular expressions and matched against the expected data.
In one embodiment, the script generation component allows customized script features to be added at use time, including encryption and decryption of request messages or request data and of response data, priority running of the scripts that the current script depends on, data cleanup before and data restoration after a run, and adding redis data requests.
In one embodiment, the case generation component comprises a case rule engine library, a case generation engine and a case database. The case generation engine calls the corresponding case rule engine from the case rule engine library according to the case rules and obtains the interface basic information and interface parameter information of the interface to be tested; it then runs the called case rule engine on that information to generate a number of test data items satisfying the expected results. These test data items are the test cases, which are stored in the case database.
In one embodiment, each test case is assigned a globally unique test case number.
In one embodiment, the test run component includes a test condition configuration module, a scheduling module, a number of execution nodes deployed on a distributed architecture, and a feedback module. The test conditions configured by the test condition configuration module include: script path, use case path, running environment, running cycle time, and feedback path. When the test conditions are met, the scheduling module calls the test script and the test cases, finds a suitable execution node on the distributed architecture by the load-balancing algorithm to serve as the target node, and sends the test script and the test cases to the target node. The target node executes the test script and the test cases, and the test results are sent to the feedback module, which generates a test report and feeds it back along the feedback path.
In one embodiment, the test report includes: the number of project interfaces, the number of successful use cases, the number of failed use cases, the detailed results of use case execution, the numbers of successful and failed steps, and a display of the specific use case steps.
The interface test method and device can automatically generate the test script and test cases for the interface to be tested and run automatically according to the configured test conditions. Automatic generation of scripts and test cases helps lower the threshold for developers: people who are not skilled in designing test scripts and test cases can still carry out interface testing. Because the method and device are deployed on a distributed architecture and can run in a load-balanced manner, execution efficiency is higher. Test data show that a test device deployed on the distributed architecture executes more than 400% faster than a single-node deployment, and the overall time consumed by an interface test project can be reduced by more than 30%.
Drawings
Fig. 1 discloses a flow chart of an interface testing method according to an embodiment of the invention.
FIG. 2 is a block diagram of an interface testing apparatus according to an embodiment of the present invention.
Detailed Description
Referring to FIG. 1, which discloses a flow chart of an interface testing method according to an embodiment of the invention, the interface testing method comprises the following steps:
s101, acquiring interface information. In the interface information obtaining step S101, interface information of the interface to be tested is obtained and stored. In one embodiment, the interface information includes interface basic information and interface parameter information. In the interface information obtaining step S101, an interface document is first generated for the interface to be tested, and then the interface basic information and the interface parameter information of the interface to be tested are stored in the corresponding interface document. In one embodiment, the interface basic information includes: interface name, interface request type, interface address, number of retries, timeout time, optional asynchronous request, and return data. The interface basic information is recorded in the source code of the interface in the form of method-level annotations. The interface parameter information includes: parameter name, parameter type, whether filling is necessary, parameter structure path, data type and parameter special attribute. Interface parameter information is recorded in the source code of the interface in the form of parameter level annotations. In the source code of the interface to be tested, corresponding annotation is added to the interface to be tested, and the annotation has two modes including method level annotation and parameter level annotation. The method level annotations are primarily directed to interface basic information including, but not limited to, interface name, interface request type, interface address, number of retries, timeout time, optional asynchronous requests, and return data. The parameter level annotation is mainly directed to interface parameter information, including but not limited to parameter name, parameter type, whether padding is necessary, parameter structure path, data type, and parameter specific attribute. The special attribute of the parameter is a random value generation rule which needs to be replaced by a parameter placeholder when the interface parameter generation automation script is set. When the interface test method is executed, the plug-in is triggered to scan all the interfaces added with annotations when the codes are compiled, and the basic information and the parameter information of the interfaces are obtained.
First, an interface document is generated and stored for each interface; when the interface information changes, modification of the corresponding interface document is triggered. The interface documents of each interface are classified by application and by environment. The information recorded in an interface document is divided into two parts: the interface basic information description and the interface parameter information description. The interface basic information and interface parameter information of the interface to be tested are then stored in the corresponding interface document. In a real environment, same-named interfaces or same-named parameters may appear in different program projects, so homonym differentiation processing is required when the interface basic information and interface parameter information are stored. In one embodiment, the homonym differentiation processing includes the following. For different program projects, a globally unique identification name is added for each project's package paths and for the interfaces within the same package path, so as to distinguish same-named interfaces across projects. For complex nested interface parameters, the parent-child relationships among parameters are stored, so that a specified interface parameter can be located more quickly and accurately when test cases are generated in the subsequent case generation step. A flag is set to distinguish interface request data from return data, so that identically named parameters occurring in both can be told apart.
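A sketch of how this homonym differentiation could be realized, under the assumption that the globally unique name simply concatenates project, package path, and interface name, and that a direction flag separates request from return parameters (all names here are hypothetical):

```java
// Sketch: build a globally unique identification name for an interface, and key
// parameters by data direction so that identically named parameters in request
// and return data can be told apart.
enum DataDirection { REQUEST, RETURN }

final class InterfaceKey {
    // Project + package path + interface name distinguishes same-named
    // interfaces across projects and across packages within a project.
    static String uniqueName(String project, String packagePath, String interfaceName) {
        return project + ":" + packagePath + ":" + interfaceName;
    }

    // The parent-child relationship is preserved in the parameter path; the
    // direction flag separates request parameters from return parameters.
    static String paramKey(String parentPath, String paramName, DataDirection dir) {
        String path = parentPath.isEmpty() ? paramName : parentPath + "." + paramName;
        return dir + "/" + path;
    }
}
```

For example, uniqueName("pay-core", "com.example.card", "bindCard") and paramKey("body.card", "cardNo", DataDirection.REQUEST) yield keys that stay unique even when another project declares an interface or parameter of the same name.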
S102, a script generation step. In the script generation step S102, a test script is automatically generated based on the interface information and the test requirements. In one embodiment, the test requirements considered in generating the test script include: random generation rules for input data; multi-environment configuration information for running scripts in different test or production environments; case database configuration information for querying the test cases stored in the case database; and custom data generation rules for input data, so that when a request is sent, the corresponding request data is generated according to the data generation rules to replace the placeholders in the script, and the return data is extracted with regular expressions and matched against the expected data.
The following is an example of an automatically generated script:
Function name: bindpes & Add personal Bank card jmx
Scenario: test plan
Obtain the environment configuration information by reading the environment configuration file environment.
Obtain the database configuration information by reading the database connection address, user name and password in the database configuration file db.jmx;
obtain the cycle count of the loop controller by traversing the number totalNum of test cases to be executed in the database (5 cycles here), and share the request data and expected data of the test cases to be executed;
execute the custom input-data rule, assigning the JMeter request data variable ${I_requestSeqId} the value 'ST202101134' (a common step);
execute the interface request, replacing the variable names in the request with variable values; the request includes the request server IP address, port number, request interface path, method name, request data and other common steps;
if the return code R_respCode, extracted from the response data by regular expression, is '000' (success) and equals the expected data ${E_respCode}, the case has run successfully (a common step);
when totalNum is greater than 1, the last three steps are executed in a loop until all cases have been executed.
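The placeholder-replacement step above (filling ${I_requestSeqId} before the request is sent) might look like the following Java sketch; the regex-based replacer and the rule map are illustrative assumptions rather than the patent's actual implementation:

```java
import java.util.Map;
import java.util.function.Supplier;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

final class PlaceholderReplacer {
    // Matches JMeter-style placeholders such as ${I_requestSeqId}
    private static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{([A-Za-z0-9_]+)\\}");

    static String replace(String template, Map<String, Supplier<String>> rules) {
        Matcher m = PLACEHOLDER.matcher(template);
        StringBuilder out = new StringBuilder();
        while (m.find()) {
            Supplier<String> rule = rules.get(m.group(1));
            // Each placeholder is replaced by data produced from its generation
            // rule; unknown placeholders are left untouched.
            m.appendReplacement(out, Matcher.quoteReplacement(
                    rule != null ? rule.get() : m.group(0)));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        String requestBody = "{\"requestSeqId\":\"${I_requestSeqId}\"}";
        // Hypothetical rule: sequence IDs prefixed with "ST" plus a timestamp
        Map<String, Supplier<String>> rules =
                Map.of("I_requestSeqId", () -> "ST" + System.currentTimeMillis());
        System.out.println(replace(requestBody, rules));
    }
}
```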
In one embodiment, customized script features can be added at use time, including encryption and decryption of request messages or request data and of response data, priority running of the scripts that the current script depends on, data cleanup before and data restoration after a run, and adding redis data requests. Each interface request can select the customized components it needs, which goes a long way toward meeting the customization needs of different interfaces.
S103, a case generation step: a number of test cases are automatically generated based on the interface information, case rules and expected results. In one embodiment, in the case generation step S103, the corresponding case rule engine is called according to the case rules, the interface basic information and interface parameter information of the interface to be tested are obtained, and the case rule engine is run on that information to generate a number of test data items satisfying the expected results; these test data items are the test cases. In one embodiment, each test case is given a globally unique test case number and stored in a case database.
An example of generating a test case is as follows:
for example, for the parameter of the length of the requested data, a length check case is generated. The length check use case may be a fixed length use case or a variable length use case. For example, the length of the request data is 8, and three test cases are generated, the lengths of which are 7, 8 and 9 respectively. Of these, length 8 is a test case (success case) that satisfies the requirements, and length 7 and 9 are test cases (failure cases) that do not satisfy the requirements. Variable length use cases, for example, request data of length [3,12], generate six test cases of length 2, 3, 4, 11, 12, 13, respectively. Of these, the test cases (success cases) having lengths of 3, 4, 11, and 12 satisfy the requirements, and the test cases (failure cases) having lengths of 2 and 13 do not satisfy the requirements. For example, for the mandatory requirement parameters of the request data, the mandatory requirement check case of the request data is generated. Test cases include cases with fields null and non-null. For example, for a parameter type, a parameter enumeration value type test case is generated. Including data that is within the enumerated value range and use cases that are not within the enumerated value range. For example, a field special character check case, a test case such as forward check and reverse check in a special format, such as a time format, a number format, etc., is generated. In short, a sufficient number of test cases are generated according to the test requirement of a certain specified parameter, and some of the test cases are cases (successful cases) which meet the test requirement, and the other are cases (failed cases) which do not meet the test requirement.
As described above, test cases are generated by running a case rule engine on the interface basic information and interface parameter information, and the rule engine can be called from a rule engine library. A suitable rule engine is selected and called according to the test rules; in one embodiment, a user can add customized rule engines based on professional knowledge. Examples of rule engines are as follows. For a test of a fixed value such as 10, cases are generated whose input data equals 10 and whose input data does not. Random-value engines generate cases for random dates, times, random numbers, card numbers, mailbox addresses, identification-card numbers and mobile phone numbers; dependent cases take their values from other generated data. Zero or more of the three kinds of input data (fixed values, random values, and dependent input data) can be selected and combined. For data requiring encryption, an encryption rule engine can be called, such as the 2.0 encryptor, MD5, 3DES, SHA256, AES or FileBase64, to generate corresponding encrypted-data and non-encrypted-data cases. For tests where interfaces have calling relations and depend on data transfer, a dependent-script setting is provided: the dependent script runs first, and its returned parameter values are passed as the input data of the script. After the rule engines are called and configured, a number of test cases are generated from the permutations and combinations of the interface information, the number of input data items, the number of rule engines and the number of expected data results. Each test case is given a globally unique test case number and stored in a MySQL case database.
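The rule engine library might be organized as a registry of pluggable engines, each turning a parameter's information into test data; this is only a structural sketch, with every name assumed:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// One generated piece of test data, paired with whether it should pass (success
// case) or be rejected (failure case), mirroring the patent's success/failure cases.
record CaseData(String value, boolean success) {}

// A case rule engine turns one parameter's interface information into test data.
interface CaseRuleEngine {
    List<CaseData> generate(Map<String, String> paramInfo);
}

final class RuleEngineLibrary {
    private final Map<String, CaseRuleEngine> engines = new HashMap<>();

    // Register an engine under a rule name, e.g. "length", "mandatory", "enum",
    // "md5"; users could add customized engines the same way.
    void register(String ruleName, CaseRuleEngine engine) {
        engines.put(ruleName, engine);
    }

    // Called by the case generation engine to fetch the engine for a case rule.
    CaseRuleEngine lookup(String ruleName) {
        CaseRuleEngine e = engines.get(ruleName);
        if (e == null) {
            throw new IllegalArgumentException("no rule engine registered for: " + ruleName);
        }
        return e;
    }
}
```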
S104, a test run step: the configured test conditions are acquired; when the test conditions are met, the test script and the test cases are called and executed, the interface to be tested is tested, and the test results are recorded. The test run step is executed on a distributed architecture, and the test script and the test cases are distributed to target nodes on the distributed architecture by an algorithm based on load balancing. In one embodiment, the configured test conditions include: script path, use case path, running environment, running cycle time, and feedback path. When the test conditions are met, the test is executed automatically: the test script and the test cases are called, a suitable target node on the distributed architecture is found by the load-balancing algorithm to execute them, and a test report is generated from the test results and fed back along the feedback path. In one embodiment, the test report includes: the number of project interfaces, the number of successful use cases, the number of failed use cases, the detailed results of use case execution, the numbers of successful and failed steps, and a display of the specific use case steps.
Automatic running of the test can also be understood as timed running of the script: a project script path, use case path, running environment, running cycle time and feedback path (i.e., the responsible person to notify) are set, and once the project is configured, a timed task is issued. The test execution nodes are deployed on a distributed architecture. When the task runs, it reads the project script configuration and the test cases and sends them to a script scheduling machine, which calls the target execution machine to run the script via the load-balancing algorithm. The execution machine calls the script file, writes the script execution steps to MongoDB, and writes the execution results to the MySQL database. After the script finishes, the execution results are in the database, from which data can be extracted and sent to a front-end server so that the responsible persons configured for the run can examine them in time. The test report also includes the test details (the number of project interfaces, the number of successful use cases, the number of failed use cases, the detailed results of use case execution, the numbers of successful and failed steps, and a display of the specific use case steps), providing a clearer report to return to the user.
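Node selection by the scheduling machine could be as simple as picking the least-loaded execution machine, as in this sketch; the load metric and node type are assumptions, since the patent states only that a load-balancing algorithm is used:

```java
import java.util.Comparator;
import java.util.List;

// Hypothetical view of an execution machine: its host and current load.
record ExecutionNode(String host, int runningTasks) {}

final class Scheduler {
    // Pick the execution node with the fewest running tasks as the target node
    // that will receive the test script and test cases.
    static ExecutionNode selectTarget(List<ExecutionNode> nodes) {
        return nodes.stream()
                .min(Comparator.comparingInt(ExecutionNode::runningTasks))
                .orElseThrow(() -> new IllegalStateException("no execution nodes available"));
    }

    public static void main(String[] args) {
        List<ExecutionNode> nodes = List.of(
                new ExecutionNode("exec-01", 3),
                new ExecutionNode("exec-02", 1),
                new ExecutionNode("exec-03", 5));
        // exec-02, the least loaded, would receive the script and cases
        System.out.println(selectTarget(nodes).host());
    }
}
```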
FIG. 2 is a block diagram of an interface testing apparatus according to an embodiment of the present invention. The interface testing apparatus corresponds to the interface testing method described above; it is usually implemented in software, and running it performs the interface testing method. As shown in the figure, the interface testing apparatus includes: an interface information acquisition component 201, a script generation component 202, a case generation component 203, and a test run component 204.
The interface information acquisition component 201 acquires and stores the interface information of the interface to be tested. In one embodiment, the interface information includes interface basic information and interface parameter information. The interface basic information includes: interface name, interface request type, interface address, retry count, timeout, whether the request is asynchronous, and return data; it is recorded in the source code of the interface in the form of method-level annotations. The interface parameter information includes: parameter name, parameter type, whether the parameter is mandatory, parameter structure path, data type, and parameter special attribute; it is recorded in the source code of the interface in the form of parameter-level annotations. The interface information acquisition component 201 includes an interface document generation module 211 and an interface information saving module 212. The interface document generation module 211 generates an interface document for the interface to be tested, and the interface information saving module 212 stores the interface basic information and interface parameter information of the interface to be tested in the corresponding interface document. In one embodiment, when the interface information saving module 212 stores this information, the storing includes homonym differentiation processing: for different program projects, a globally unique identification name is added for each project's package paths and for the interfaces within the same package path, so as to distinguish same-named interfaces across projects; for complex nested interface parameters, the parent-child relationships among parameters are stored, so that a specified interface parameter can be located more quickly and accurately when test cases are generated in the subsequent case generation step; and a flag is set to distinguish interface request data from return data, so that identically named parameters occurring in both can be told apart.
The script generation component 202 automatically generates a test script based on the interface information and the test requirements. In one embodiment, the test requirements considered by the script generation component 202 include: random generation rules for input data; multi-environment configuration information for running scripts in different test or production environments; case database configuration information for querying the test cases stored in the case database; and custom data generation rules for input data, so that when a request is sent, the corresponding request data is generated according to the data generation rules to replace the placeholders in the script, and the return data is extracted with regular expressions and matched against the expected data. In one embodiment, the script generation component 202 also allows customized script features to be added at use time, including encryption and decryption of request messages or request data and of response data, priority running of the scripts that the current script depends on, data cleanup before and data restoration after a run, and adding redis data requests.
The case generation component 203 automatically generates a number of test cases based on the interface information, case rules and expected results. In one embodiment, the case generation component 203 includes a case rule engine library 231, a case generation engine 232, and a case database 233. The case generation engine 232 calls the corresponding case rule engine from the case rule engine library 231 according to the case rules and obtains the interface basic information and interface parameter information of the interface to be tested; it then runs the called case rule engine on that information to generate a number of test data items satisfying the expected results. These test data items are the test cases, which are stored in the case database 233. In one embodiment, each test case is assigned a globally unique test case number.
The test run component 204 is deployed on a distributed architecture. It acquires the configured test conditions, calls the test script and the test cases when the test conditions are met, distributes them to target nodes on the distributed architecture by an algorithm based on load balancing, has the target nodes execute them, and records the test results. In one embodiment, the test run component 204 includes a test condition configuration module 241, a scheduling module 242, a number of execution nodes 243 deployed on the distributed architecture, and a feedback module 244. The test conditions configured by the test condition configuration module 241 include: script path, use case path, running environment, running cycle time, and feedback path. When the test conditions are met, the scheduling module 242 calls the test script and the test cases, finds a suitable execution node 243 on the distributed architecture by the load-balancing algorithm to serve as the target node, and sends the test script and the test cases to that target node 243. The target node 243 executes the test script and the test cases, and the test results are sent to the feedback module 244, which generates a test report and feeds it back along the feedback path. In one embodiment, the test report generated by the feedback module 244 includes: the number of project interfaces, the number of successful use cases, the number of failed use cases, the detailed results of use case execution, the numbers of successful and failed steps, and a display of the specific use case steps.
The specific implementation details of the interface testing apparatus correspond to the interface testing method described above, so they are not repeated here.
The interface test method and the interface test device have the following advantages:
and synchronizing interface information when the target code base is deployed, and automatically generating a script. The code capability of a script is not required to be designed by a tester, and the cost of script design time is reduced. And intelligently generating a test case through a rule engine for interface information input and test professional knowledge conversion. The method helps testers to shorten the time of designing the case and avoid omission of test case points; the threshold of the developer for designing the test case is reduced, the problem that the developer does not understand the design of the test case is solved, and a foundation is laid for role exchange. The invention does not need the user to input interface information, request mode, parameter type and the like, and automatically generates a script with the information related to the synchronous interface of the interface description file of the target code library; the interface information change can synchronously update the script, and the correctness and the integrity of the script are ensured. The invention automatically generates parameter cases of the full test points based on interface information including conversion into a rule engine without limitation of using test method equivalence classes, boundary values, error speculation and the like.
The invention supports multiple encryption and decryption modes in both the parameter and message dimensions, such as 3DES, MD5, SHA256, CFCA, AES and FileBase64. It provides a database data assertion model supporting multi-table verification on a variety of databases, such as MySQL, Oracle, SQL Server, Redis and MongoDB. It supports continuous integration with version releases, calling the automated test scripts on every release and automatically generating a test report. It supports timed batch scheduling using distributed-architecture deployment, with multi-node execution machines running tasks in parallel, which shortens regression testing time and improves execution efficiency by more than 400% compared with single-node deployment.
It should also be noted that the embodiments described above are only specific embodiments of the invention. The invention is obviously not limited to them: similar changes or modifications that a person skilled in the art can readily derive from this disclosure fall within its scope of protection. The embodiments are provided to enable persons skilled in the art to make or use the invention; modifications or variations made without departing from the inventive concept likewise fall within its scope, which is not limited by the embodiments described above but is accorded the widest scope consistent with the innovative features set forth in the claims.

Claims (10)

1. An interface testing method, comprising:
an interface information acquisition step, namely acquiring and storing interface information of an interface to be tested;
a script generation step, which is to automatically generate a test script based on the interface information and the test requirement;
a case generation step, namely automatically generating a plurality of test cases based on the interface information, case rules and expected results;
and a test run step, namely acquiring configured test conditions and, when the test conditions are met, calling and executing the test script and the test cases, testing the interface to be tested, and recording the test results, wherein the test run step is performed on a distributed architecture, and the test script and the test cases are distributed to target nodes on the distributed architecture by an algorithm based on load balancing.
2. The interface testing method of claim 1, wherein the interface information includes interface basic information and interface parameter information, and in the interface information obtaining step, an interface document is generated for the interface to be tested, and the interface basic information and the interface parameter information of the interface to be tested are stored in the corresponding interface document.
3. The interface test method of claim 2,
the interface basic information includes: interface name, interface request type, interface address, retry count, timeout, whether the request is asynchronous, and return data, wherein the interface basic information is recorded in the source code of the interface in the form of method-level annotations;
the interface parameter information includes: parameter name, parameter type, whether the parameter is mandatory, parameter structure path, data type, and parameter special attribute, wherein the interface parameter information is recorded in the source code of the interface in the form of parameter-level annotations.
4. The interface testing method of claim 3, wherein in the case generation step, the corresponding case rule engine is called according to the case rules, the interface basic information and interface parameter information of the interface to be tested are obtained, and the case rule engine is run on the interface basic information and interface parameter information to generate a plurality of test data satisfying an expected result, the plurality of test data being the test cases.
5. The interface testing method of claim 1, wherein the configured test conditions comprise: script path, use case path, running environment, running cycle time and feedback path;
and when the test conditions are met, the test script and the test cases are called, a suitable target node on the distributed architecture is found by the load-balancing algorithm to execute them, and a test report is generated from the test results and fed back along the feedback path.
6. An interface testing apparatus, comprising:
the interface information acquisition component acquires and stores interface information of the interface to be tested;
the script generation component automatically generates a test script based on the interface information and the test requirement;
the case generation component automatically generates a plurality of test cases based on the interface information, case rules and expected results;
and a test run component, deployed on a distributed architecture, which acquires the configured test conditions, calls the test script and the test cases when the test conditions are met, distributes them to target nodes on the distributed architecture by an algorithm based on load balancing, executes them on the target nodes, and records the test results.
7. The interface testing apparatus of claim 6, wherein the interface information includes interface basic information and interface parameter information, the interface information obtaining component includes an interface document generating module and an interface information storing module, the interface document generating module generates an interface document for the interface to be tested, and the interface information storing module stores the interface basic information and the interface parameter information of the interface to be tested into the corresponding interface document.
8. The interface test apparatus of claim 7,
the interface basic information includes: interface name, interface request type, interface address, retry count, timeout, whether the request is asynchronous, and return data, wherein the interface basic information is recorded in the source code of the interface in the form of method-level annotations;
the interface parameter information includes: parameter name, parameter type, whether the parameter is mandatory, parameter structure path, data type, and parameter special attribute, wherein the interface parameter information is recorded in the source code of the interface in the form of parameter-level annotations.
9. The interface testing apparatus of claim 8, wherein the case generation component includes a case rule engine library, a case generation engine, and a case database, the case generation engine calls the corresponding case rule engine from the case rule engine library according to the case rules and obtains the interface basic information and interface parameter information of the interface to be tested, the case generation engine runs the called case rule engine on the interface basic information and interface parameter information to generate a plurality of test data satisfying an expected result, the plurality of test data being the test cases, and the test cases are stored in the case database.
10. The interface testing apparatus of claim 6, wherein the test run component comprises a test condition configuration module, a scheduling module, a number of execution nodes deployed in a distributed architecture, and a feedback module;
the test conditions configured by the test condition configuration module comprise: script path, use case path, running environment, running cycle time and feedback path;
when the test conditions are met, the scheduling module calls the test script and the test cases, finds a suitable execution node on the distributed architecture by the load-balancing algorithm to serve as the target node, and sends the test script and the test cases to the target node;
the target node executes the test script and the test cases, and the test results are sent to the feedback module;
and the feedback module generates a test report and feeds it back along the feedback path.
CN202110726729.7A 2021-06-29 2021-06-29 Interface test method and test device Pending CN113342679A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110726729.7A CN113342679A (en) 2021-06-29 2021-06-29 Interface test method and test device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110726729.7A CN113342679A (en) 2021-06-29 2021-06-29 Interface test method and test device

Publications (1)

Publication Number Publication Date
CN113342679A true CN113342679A (en) 2021-09-03

Family

ID=77481442

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110726729.7A Pending CN113342679A (en) 2021-06-29 2021-06-29 Interface test method and test device

Country Status (1)

Country Link
CN (1) CN113342679A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108897687A (en) * 2018-06-29 2018-11-27 泰华智慧产业集团股份有限公司 A kind of API automated testing method and system based on data-driven
WO2020082585A1 (en) * 2018-10-25 2020-04-30 深圳壹账通智能科技有限公司 Method and device for interface testing
CN112416743A (en) * 2020-01-21 2021-02-26 上海哔哩哔哩科技有限公司 Test control system, method and equipment
CN111221743A (en) * 2020-03-18 2020-06-02 时时同云科技(成都)有限责任公司 Automatic testing method and system
CN112363907A (en) * 2020-09-14 2021-02-12 杭州大搜车汽车服务有限公司 Test method and device for Dubbo interface, electronic device and storage medium
CN112597003A (en) * 2020-12-11 2021-04-02 平安普惠企业管理有限公司 Automatic testing method and device and computer equipment

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116340187A (en) * 2023-05-25 2023-06-27 建信金融科技有限责任公司 Rule engine migration test method and device, electronic equipment and storage medium
CN116340187B (en) * 2023-05-25 2023-08-15 建信金融科技有限责任公司 Rule engine migration test method and device, electronic equipment and storage medium
CN117033234A (en) * 2023-08-24 2023-11-10 广东保伦电子股份有限公司 Interface testing method, device, equipment and medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210903