CN111752844A - Interface testing method and device, computing equipment and storage medium - Google Patents


Info

Publication number
CN111752844A
Authority
CN
China
Prior art keywords
test
interface
test case
executing
plan
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010612367.4A
Other languages
Chinese (zh)
Inventor
胡一川
汪冠春
褚瑞
李玮
刘桐烔
季琛
徐丽婧
王建周
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Benying Network Technology Co Ltd
Beijing Laiye Network Technology Co Ltd
Original Assignee
Beijing Benying Network Technology Co Ltd
Beijing Laiye Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Benying Network Technology Co Ltd and Beijing Laiye Network Technology Co Ltd
Priority to CN202010612367.4A
Publication of CN111752844A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; error correction; monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites

Abstract

The specification discloses an interface testing method, an interface testing apparatus, a computing device and a storage medium, wherein the method comprises the following steps: S1, acquiring the interfaces contained in a test plan and the parameters corresponding to each interface; S2, determining the test cases corresponding to each interface, and carrying out permutation and combination processing on the test cases according to a preset permutation rule; and S3, executing the test plan, testing the interfaces by executing the test cases, and generating a test report. Combining test cases achieves comprehensive coverage of the various states of the interfaces under test; uniformly formatting the test cases allows interfaces of any language and any protocol to be tested and imposes no coding requirement on the personnel writing the test cases; and the timing module and the monitoring of the online version allow the test plan to be triggered in multiple ways. With this interface testing method, the test results are uniform and transparent, errors are clear at a glance, labor cost is reduced, and problems can be found in time.

Description

Interface testing method and device, computing equipment and storage medium
Technical Field
The invention relates to the technical field of software engineering, in particular to an interface testing method, an interface testing device, computing equipment and a storage medium.
Background
Before a software product is released, testers need to test each of its functions repeatedly under various environments, and the software is continuously improved through the joint efforts of testers and developers; interface testing is an important link in this process. Manual testing not only consumes considerable labor and material resources, it also takes a long time and is poorly repeatable. With the development of testing technology, automated testing has greatly reduced the burden of manual testing: it can carry out large numbers of repetitive operations, and automatically testing the interfaces in a system allows problems to be found in time. Compared with manual testing, automated testing improves testing performance to a great extent, saves resources such as manpower and time, yields accurate results and is highly efficient.
Therefore, researching an interface-based automated testing method to better implement automated software testing is a problem that urgently needs to be solved.
Disclosure of Invention
The present specification provides an interface testing method, apparatus, computing device and storage medium to overcome at least one technical problem in the prior art.
According to a first aspect of embodiments herein, there is provided an interface testing method, including:
S1, acquiring the interfaces contained in the test plan and the parameters corresponding to each interface;
S2, determining the test cases corresponding to each interface, and carrying out permutation and combination processing on the test cases according to a preset permutation rule;
and S3, executing the test plan, testing the interfaces by executing the test cases, and generating a test report.
Optionally, before the step of obtaining the interface and the parameters corresponding to the interface included in the test plan, the method further includes:
S01, analyzing the configured timing task, adding a timing tool, and executing the test plan at the set time.
Optionally, before the step of obtaining the interface and the parameters corresponding to the interface included in the test plan, the method further includes:
S02, automatically triggering execution of the test plan when monitoring discovers that a version iteration exists online.
Optionally, the step of determining the test case corresponding to each interface, and performing permutation and combination processing on the test cases according to a preset permutation rule includes:
S21, obtaining the test case corresponding to each interface, and uniformly processing the input and output of the test cases into json form;
S22, carrying out permutation and combination processing on the test cases according to a preset permutation rule to form test case flows;
and S23, carrying out permutation and combination processing on the test case flows according to the test plan.
Optionally, executing the test plan, testing the interfaces by executing the test cases, and generating the test result report includes:
S31, executing the test cases in each test case flow according to the sequence after permutation and combination, sending request data to the specified interface, and checking whether the returned data meets expectations; if the returned data meets expectations, executing S32; if the returned data does not meet expectations, comparing whether the returned data satisfies the fault tolerance mechanism: if so, executing S32, otherwise terminating execution and returning an error report;
S32, injecting the request data and return data of the previous test case into a user-defined variable, and taking the user-defined variable as an input parameter of the next test case in the test case flow, wherein the life cycle of the user-defined variable is the total running time of the test cases in the test case flow;
and S33, generating a test report according to the execution condition of the test cases in each test case flow.
According to a second aspect of the embodiments of the present specification, there is provided an interface testing apparatus, including an interface acquisition module, a use case processing module, and an execution report module, wherein:
the interface acquisition module is configured to acquire an interface contained in the test plan and parameters corresponding to the interface;
the case processing module is configured to determine a test case corresponding to each interface, and perform permutation and combination processing on the test cases according to a preset permutation rule;
the execution report module is configured to execute the test plan, test the interface by executing the test case, and generate a test report.
Optionally, the apparatus further comprises a timing module, wherein:
the timing module is configured to analyze the configured timing task, add a timing tool, and execute the test plan at a set time.
Optionally, the apparatus further comprises a triggering module, wherein:
the trigger module is configured to automatically trigger execution of the test plan when version iteration exists on the monitoring discovery line.
According to a third aspect of embodiments herein, there is provided a computing device comprising a storage device for storing a computer program and a processor for executing the computer program to cause the computing device to perform the steps of the interface testing method.
According to a fourth aspect of embodiments herein, there is provided a storage medium storing a computer program for use in the above-mentioned computing device, the computer program, when executed by a processor, implementing the steps of the interface testing method.
The beneficial effects of the embodiment of the specification are as follows:
The embodiments of the specification provide an interface testing method, an interface testing apparatus, a computing device and a storage medium. The interface testing method acquires the test interfaces and the test cases corresponding to each interface, combines the test cases according to a preset arrangement order to generate corresponding test case flows, arranges the flows in combination, and defines user-defined variables within a test case flow so as to associate the test cases within one flow. In addition, the test cases are uniformly processed into json form, so that interfaces of any language and any protocol can be tested and no coding requirement is imposed on the personnel writing the test cases. Meanwhile, the test results are generated into a test report whose link can be forwarded directly; the report is transparent and uniform, the errors that occurred are clear at a glance, back-and-forth interaction is reduced, and the communication cost between testers and developers is lowered. In addition, the timing function increases the degree of automation and is convenient for testers; when a project goes online or is rolled back, the iteration of the online version is monitored and a full regression test can be triggered automatically, reducing labor cost and allowing problems to be found in time.
The innovation points of the embodiment of the specification comprise:
1. In this embodiment, an interface testing method is provided which acquires the test interfaces and the test cases corresponding to each interface, combines the test cases according to a preset arrangement order, generates corresponding test case flows and arranges the flows in combination, defines user-defined variables within a test case flow, and associates the test cases within one flow; this is one of the innovation points of the embodiments of the present specification.
2. In this embodiment, the test cases are uniformly processed into json form, interfaces of any language and any protocol can be tested, and no coding requirement is imposed on the personnel writing the test cases; this is one of the innovation points of the embodiments of the present specification.
3. In this embodiment, when a project goes online or is rolled back, the iteration of the online version is monitored and a full regression test can be triggered automatically, reducing labor cost and allowing problems to be found at the first opportunity.
Drawings
In order to illustrate the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present disclosure; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of an application scenario of an interface testing method provided in an embodiment of the present specification;
fig. 2 is a schematic flowchart of an interface testing method provided in an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an interface testing apparatus according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a use case processing module of an interface testing apparatus according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an execution report module of an interface testing apparatus according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a computing device provided in an embodiment of the present specification.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without inventive effort based on the embodiments of the present invention, are within the scope of the present invention.
It should be noted that the terms "including" and "having" and any variations thereof in the embodiments of the present specification and the drawings are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
With the development of testing technology, automated testing has greatly reduced the burden of manual testing and can carry out large numbers of repetitive operations. By automatically testing an interface, its state can be obtained quickly and problems can be found in time. Compared with manual testing, automated testing improves testing performance to a great extent, saves resources such as manpower and time, yields accurate results and is highly efficient.
The embodiment of the specification discloses an interface testing method, an interface testing device, computing equipment and a storage medium, which are respectively described in detail below.
First, the testing environment of the interface test is explained. The testing environment can be customized by the user (usually four environments: development, test, grayscale and production), and the access address of each environment is configured. Information and algorithms used for verification can also be configured in the environment and are associated automatically according to the specified environment when the test case configuration is obtained. When a single test case or a test plan is executed, the environment configuration is read according to the associated environment name so as to find the corresponding environment address. The same test plan can be executed in one or more environments simultaneously, but because multiple interfaces are arranged within it, the same test plan can only be executed once in the same environment at the same time. For example, at 11:30 in the test environment there cannot be two executions of Plan 001, but Plan 001, Plan 002 and Plan 003 can execute simultaneously, and Plan 001 can also execute at the same time in the development, test, grayscale, production and other environments.
Next, the injection item and the check item will be explained.
Injection item: a variable injected from all or part of the request header, return header, request body or return body of a test case, or from a static-type case.
Check item: likewise, a check applied to all or part of the request header, return header, request body or return body.
For example, given a return body (shown in the original figure) that contains a field whose value is "1234567", the value is obtained with an expression of the form $responsebody.<field>, and "1234567" can then be both checked and injected.
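As a concrete illustration of the check and injection items, the following sketch resolves a dotted path against a parsed JSON return body and performs a check and an injection on the value found there. The path syntax, field names and function names are hypothetical, chosen only to mirror the $responsebody notation above; the patent does not publish this code.

```python
import json

def resolve_path(body: dict, path: str):
    """Resolve a dotted path such as 'data.id' against a parsed JSON body."""
    node = body
    for key in path.split("."):
        node = node[key]
    return node

def check(body: dict, path: str, expected) -> bool:
    """Check item: compare the value at `path` with the expected value."""
    return resolve_path(body, path) == expected

def inject(body: dict, path: str, variables: dict, name: str) -> None:
    """Injection item: store the value at `path` into a named custom variable."""
    variables[name] = resolve_path(body, path)

# Hypothetical return body containing the value "1234567"
return_body = json.loads('{"data": {"id": "1234567"}}')

assert check(return_body, "data.id", "1234567")  # check item passes

variables = {}
inject(return_body, "data.id", variables, "user_id")
print(variables["user_id"])  # prints 1234567
```

In this sketch the `variables` dictionary stands in for the flow-scoped user-defined variables described later in the embodiments.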
Example one
Fig. 1 is a schematic structural diagram of an application scenario of an interface testing method provided in an embodiment of this specification; it shows the architecture of the integrated test platform to which the interface testing method is applied. In the figure, the core part of the integrated test platform consists of four parts: method (interface, instance), case, flow and plan; the auxiliary part consists of two parts: the environment and the trigger execution mode. The method layer holds the configuration of the test interfaces; the case layer configures the test cases associated with each interface, and a case can be configured in one or more versions. The flow layer is a group of test cases permuted and combined in order according to the user's requirement: user-defined variables are set among the test cases of a single flow, and part or all of the request content or return data of an earlier case is injected into the input of a later case, so that the group of test cases is associated and placed into one flow for testing; according to fault tolerance, flows are divided into two types, execution that terminates when an error is encountered and execution that continues while ignoring errors. The plan layer is the test plan actually executed on the platform, generally an iteration of a whole functional module or of some services. A plan is composed of several flows, which execute in parallel with no execution dependency; in a plan, the version to be executed, the interface type, the running environments and the trigger mode under each environment can be configured, and the execution modes of a plan are divided into timed execution, release execution (triggered execution) and manual execution.
After the test plan is triggered in the specified mode, testing is carried out in the specified environment: all the specified test interfaces are obtained, and the relevant test cases are executed in the preset order so as to execute the test plan, with flows executing in parallel and the cases within a flow executing serially. Data is sent to the specified interface and the return value is checked against expectations, or add, delete, modify and query operations are performed on a specified database, and a test report is generated according to the execution condition of the test cases. The input and output of the test cases are uniformly processed into json form, without limitation of programming language or protocol. The system has a timed execution function, and when a project goes online or is rolled back a full regression test is triggered automatically, so labor cost is reduced, the results are transparent and uniform, errors are clear at a glance, and testing efficiency is high.
Example two
Fig. 2 is a schematic flowchart of an interface testing method provided in an embodiment of the present disclosure. As shown in fig. 2, an interface testing method includes:
110. Acquiring the interfaces contained in the test plan and the parameters corresponding to each interface.
The interfaces contained in the test plan and their corresponding parameters are acquired, the interfaces comprising at least any one of three types, namely the three interface forms grpc, http and graphql. A grpc-type interface automatically acquires the request mode, uri, method name, request information, response information and the like from its proto file, and the user only needs to modify the values of the important keys; an http-type interface is json-friendly by itself; and a graphql-type interface automatically obtains the formats of the graphql query and the graphql variables, and the user only needs to modify the values of the important variables. The interface configuration is obtained to determine the test interface.
Optionally, before the step of obtaining the interface and the parameters corresponding to the interface included in the test plan, the method further includes:
102. Analyzing the configured timing task, adding a timing tool, and executing the test plan at the set time.
When an add or modify request is submitted for a test plan, the configured timing task is parsed and added to the cron module, and the cron module carries out the timed execution.
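The timed-execution behaviour can be sketched minimally as computing the next firing time for a daily schedule; the real platform delegates this to its cron module, and the function name and the fixed daily granularity here are simplifying assumptions.

```python
from datetime import datetime, timedelta

def next_run(now: datetime, hour: int, minute: int) -> datetime:
    """Return the next datetime at hour:minute strictly after `now`."""
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if candidate <= now:
        candidate += timedelta(days=1)  # today's slot has passed: run tomorrow
    return candidate

# A plan configured to run daily at 03:00
print(next_run(datetime(2020, 6, 30, 11, 30), 3, 0))  # 2020-07-01 03:00:00
```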
Optionally, before the step of obtaining the interface and the parameters corresponding to the interface included in the test plan, the method further includes:
103. Automatically triggering execution of the test plan when monitoring discovers that a version iteration exists online.
For example, the "user add/delete/modify/query test plan" automatically triggers testing of the test environment at three o'clock each morning, and automatically triggers testing online when a version iteration is found online.
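The online-version trigger reduces to a change detection on the monitored version identifier. The sketch below assumes the version strings are polled from somewhere; the function name and the sample versions are hypothetical.

```python
def should_trigger(current_version: str, last_seen: str) -> bool:
    """Trigger a full regression when the monitored online version changes."""
    return current_version != last_seen

# Hypothetical sequence of polled version identifiers
samples = ["1.0.0", "1.0.0", "1.0.1"]
last = samples[0]
for version in samples[1:]:
    if should_trigger(version, last):
        print("version iteration found, executing test plan for", version)
    last = version
```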
120. Determining the test case corresponding to each interface, and carrying out permutation and combination processing on the test cases according to a preset permutation rule.
Cases in a flow are usually oriented to different interfaces; for example, "add/delete/modify/query" may involve four interfaces and is usually tested within one flow.
Optionally, the step of determining the test case corresponding to each interface, and performing permutation and combination processing on the test cases according to a preset permutation rule includes:
122. Acquiring the test case corresponding to each interface, and uniformly processing the input and output of the test cases into json form.
The test cases written by testers are obtained; test cases are divided into three types, namely interface type, instance type and static type. An interface-type case sends specified data to an interface and checks whether the returned data meets expectations. An instance-type case performs add, delete, modify and query operations on a database, for example inserting a piece of initialization data into mysql with an INSERT INTO … statement. A static-type case injects values for variables: for example, for a given user id (userid), different test users may exist in different environments, and static variables can inject different values for the same variable in different environments.
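A static-type case can be pictured as an environment-keyed JSON document whose values are injected into the same variable names in each environment. The environment names follow the four environments described earlier; the variable values and function name are hypothetical.

```python
# Hypothetical static-type case: one value per environment for the same variable
ENV_STATIC_VARS = {
    "development": {"userid": "dev_user_01"},
    "test":        {"userid": "test_user_01"},
    "grayscale":   {"userid": "gray_user_01"},
    "production":  {"userid": "prod_user_01"},
}

def inject_static(env: str, variables: dict) -> dict:
    """Inject the environment-specific values into the flow's variables."""
    variables.update(ENV_STATIC_VARS.get(env, {}))
    return variables

print(inject_static("test", {}))  # {'userid': 'test_user_01'}
```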
The input and output of interface-type cases are unified in format. Specifically, for a grpc-type interface, the json is parsed into a request that can be transmitted over grpc according to the proto file; an http-type interface is json-friendly and requires no additional processing; and a graphql-type interface generates the request content by splicing the configured query (registered with the interface module) with the variables (the case input). The input and output of instance-type cases are processed similarly to interface-type cases, and a static-type case is static json.
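The unified json processing can be sketched as a dispatch on the interface type. The grpc branch is only a placeholder, since building the request from the .proto file is outside the scope of this sketch, and all names here are assumptions rather than the platform's actual API.

```python
import json

def normalize_case_io(case_type: str, raw):
    """Normalize a test case's input/output into a JSON-compatible dict."""
    if case_type == "http":
        # http interfaces are json-friendly: parse if given as a string
        return raw if isinstance(raw, dict) else json.loads(raw)
    if case_type == "grpc":
        # placeholder: a real implementation parses the .proto file and
        # converts the json into a transmissible grpc request
        return dict(raw)
    if case_type == "graphql":
        # splice the registered query with the case's variables
        return {"query": raw["query"], "variables": raw.get("variables", {})}
    if case_type == "static":
        return raw  # static cases are already plain json
    raise ValueError(f"unknown case type: {case_type}")

print(normalize_case_io("http", '{"k": 1}'))  # {'k': 1}
```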
124. Carrying out permutation and combination processing on the test cases according to a preset permutation rule to form test case flows.
The test case flow is composed of a group of test cases, the test cases in the same flow are sequentially executed, and the input and output of the case executed firstly can be used as part or all of the input of the case executed later.
According to the level of its fault tolerance mechanism, a test case flow is one of two types: execution that terminates when an error is encountered, which has low fault tolerance, and execution that continues while ignoring errors, which has high fault tolerance.
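The two fault-tolerance modes of a flow can be sketched as a single loop with a stop-on-error switch; representing each case as a zero-argument callable returning True on success is an assumption of this sketch.

```python
def run_flow(cases, stop_on_error=True):
    """Run a flow's cases in order.

    stop_on_error=True:  low fault tolerance, terminate on the first failure.
    stop_on_error=False: high fault tolerance, record failures and continue.
    """
    results = []
    for case in cases:
        ok = case()
        results.append(ok)
        if not ok and stop_on_error:
            break  # terminate-on-error flow stops at the failing case
    return results

cases = [lambda: True, lambda: False, lambda: True]
print(run_flow(cases, stop_on_error=True))   # [True, False]
print(run_flow(cases, stop_on_error=False))  # [True, False, True]
```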
126. Carrying out permutation and combination processing on the test case flows according to the test plan.
The test plan comprises a plurality of test case flows, and the flows are executed in parallel without execution dependency.
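Because flows carry no execution dependency, a plan can run them with a simple thread pool; this sketch assumes each flow is a zero-argument callable returning its per-case results, which is an illustration rather than the platform's actual interface.

```python
from concurrent.futures import ThreadPoolExecutor

def run_plan(flows):
    """Execute a plan's flows in parallel and collect results in flow order."""
    with ThreadPoolExecutor(max_workers=max(1, len(flows))) as pool:
        futures = [pool.submit(flow) for flow in flows]
        return [f.result() for f in futures]

print(run_plan([lambda: [True], lambda: [True, False]]))  # [[True], [True, False]]
```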
130. Executing the test plan, testing the interfaces by executing the test cases, and generating a test report.
The interfaces in the test plan are tested by executing the test cases in the preset order so as to cover the various states of each interface. The test plan is an iteration of a whole functional module or of some services, and finally a test report for that functional module or those services is generated according to the execution condition of the cases.
Optionally, executing the test plan, testing the interfaces by executing the test cases, and generating the test result report includes:
132. Executing the test cases in each test case flow according to the sequence after permutation and combination, sending request data to the specified interface, and checking whether the returned data meets expectations; if the returned data meets expectations, executing step 134; if the returned data does not meet expectations, comparing whether the returned data satisfies the fault tolerance mechanism: if so, executing step 134, otherwise terminating execution and returning an error report.
Test cases can be run independently for debugging or temporary testing, or be scheduled uniformly within a test plan. Three operations, namely check, injection and sleep, run in series during the execution of a single test case; a single test case returns immediately upon encountering an error during execution and is not executed further.
The check operation of a case checks whether any part of the request and the return value meets expectations, for example whether the length of a certain returned field equals 7; the return value is also stored in json form.
The injection operation of a case injects any part of the request and the return into a variable with a user-defined name for use by a subsequent case, which references it as ${user_define_variable_name}. The life cycle of the user-defined variable is one run of the flow.
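The ${user_define_variable_name} reference can be resolved with a small template substitution; the template syntax matches the notation above, while the function name is an assumption of this sketch.

```python
import re

def substitute(template: str, variables: dict) -> str:
    """Replace ${name} references with injected custom-variable values."""
    return re.sub(
        r"\$\{(\w+)\}",
        lambda m: str(variables.get(m.group(1), m.group(0))),  # keep unknown refs
        template,
    )

# The next case's input references a value injected by the previous case
print(substitute('{"uid": "${user_id}"}', {"user_id": "1234567"}))
```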
The sleep operation specifies how long to wait after a case has executed before the next case is executed, and is generally used to wait for an asynchronous task to return a result.
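The three serial operations of one case (check, then inject, then sleep) can be condensed into a single function; the flat field lookup and all parameter names are simplifying assumptions, and the case returns immediately on a failed check, mirroring the behaviour described above.

```python
import time

def execute_case(send_request, checks, injections, variables, sleep_seconds=0):
    """Run one case's serial operations: check, then inject, then sleep."""
    body = send_request()                  # send data to the interface
    for field, expected in checks.items():
        if body.get(field) != expected:
            return False                   # return immediately on error
    for field, var_name in injections.items():
        variables[var_name] = body[field]  # value feeds a later case in the flow
    if sleep_seconds:
        time.sleep(sleep_seconds)          # wait for asynchronous tasks
    return True

flow_vars = {}
ok = execute_case(lambda: {"id": "1234567"}, {"id": "1234567"},
                  {"id": "user_id"}, flow_vars)
print(ok, flow_vars)  # True {'user_id': '1234567'}
```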
134. Injecting the request data and return data of the previous test case into a user-defined variable, and taking the user-defined variable as an input parameter of the next test case in the test case flow, wherein the life cycle of the user-defined variable is the total running time of the test cases in the test case flow.
By the injection operation of the test cases, the user-defined variables are defined among the test cases in the test case flow, so that a group of test cases are associated and placed in one flow for testing.
136. Generating a test report according to the execution condition of the test cases in each test case flow.
A test report is generated according to the execution condition of the test cases; the report can be forwarded as a link, so it is transparent and uniform, the errors are clear at a glance, back-and-forth interaction is reduced, and the communication cost between testers and developers is lowered.
In this embodiment, an interface testing method is provided in which the input and output of the test cases are processed in a unified format by acquiring the test interfaces and the test cases corresponding to each interface, so that programming language and protocol are not limited; the test cases are combined according to a preset order and user-defined variables are defined within the test case flows to associate the test cases, thereby completing more comprehensive test coverage of the interfaces; and the automatic, timed and manual trigger modes raise the degree of test automation and improve the working efficiency of testers.
EXAMPLE III
Fig. 3 is a schematic structural diagram of an interface testing apparatus according to an embodiment of the present disclosure. As shown in fig. 3, an interface testing apparatus 200 includes an interface obtaining module 210, a use case processing module 220, and an execution reporting module 230, wherein:
the interface acquisition module is configured to acquire the interface included in the test plan and the parameters corresponding to the interface.
The case processing module is configured to determine a test case corresponding to each interface, and perform permutation and combination processing on the test cases according to a preset permutation rule.
Optionally, the use case processing module includes an acquisition preprocessing unit, a flow generating unit, and a flow combining unit.
Fig. 4 is a schematic structural diagram of a use case processing module of an interface test apparatus according to an embodiment of the present disclosure. As shown in fig. 4, the use case processing module 220 includes an acquisition preprocessing unit 222, a flow generating unit 224 and a flow combining unit 226, wherein
The obtaining preprocessing unit 222 is configured to obtain a test case corresponding to each interface, and uniformly process input and output of the test case into a json form.
The flow generating unit 224 is configured to perform permutation and combination processing on the test cases according to a preset permutation rule to form a test case flow.
The flow combining unit 226 is configured to perform permutation and combination processing on the test case flows according to the test plan.
The execution report module is configured to execute the test plan, test the interface by executing the test case, and generate a test report.
Optionally, the execution report module includes a use case execution unit, a flow execution unit, and a report generation unit.
Fig. 5 is a schematic structural diagram of an execution report module of an interface testing apparatus according to an embodiment of the present disclosure. As shown in fig. 5, the execution report module 230 includes a use case execution unit 232, a flow execution unit 234, and a report generation unit 236, wherein
The case execution unit 232 is configured to execute the test cases in each test case flow according to the sequence after permutation and combination, send request data to the specified interface, and check whether the returned data meets expectations; if the returned data meets expectations, the flow execution unit continues execution; if the returned data does not meet expectations, whether the returned data satisfies the fault tolerance mechanism is compared: if so, the flow execution unit continues execution, otherwise execution is terminated and an error report is returned.
The flow execution unit 234 is configured to inject the request data and return data of the previous test case into a custom variable and take the custom variable as an input parameter of the next test case in the test case flow, wherein the life cycle of the custom variable is the total running time of the test cases in the test case flow.
The report generating unit 236 is configured to generate a test report according to the execution condition of the test case in each test case flow.
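A minimal sketch of the execution behavior described by units 232-236 follows. Here `send_request` and `tolerate` are hypothetical hooks standing in for the real interface call and the fault-tolerance mechanism, and the `context` dictionary plays the role of the custom variable that lives only for one flow's run:

```python
import json

def run_flow(flow, send_request, tolerate=lambda resp: False):
    """Execute a flow's cases in order; each case's request and returned
    data are injected into a context the next case may read (a sketch)."""
    context = {}  # custom variables; live only for this flow's run
    report = []
    for case in flow:
        # merge the previous case's injected data into this request
        request = {**json.loads(case["input"]), **context.get("prev", {})}
        response = send_request(case["name"], request)
        ok = response == json.loads(case["expected"])
        if not ok and not tolerate(response):
            report.append({"case": case["name"], "status": "failed"})
            return report  # terminate and return an error report
        # inject this case's request and returned data for the next case
        context["prev"] = {"last_request": request, "last_response": response}
        report.append({"case": case["name"], "status": "passed"})
    return report

fake = lambda name, req: {"code": 0}
flow = [{"name": "login", "input": '{"user": "a"}', "expected": '{"code": 0}'}]
print(run_flow(flow, fake))  # [{'case': 'login', 'status': 'passed'}]
```

The per-case entries collected in `report` correspond to what the report generation unit 236 would assemble into the final test report.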
Optionally, the apparatus 200 further comprises a timing module 202, wherein:
the timing module 202 is configured to parse the configured timing task, add a timing tool, and execute the test plan at a set time.
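As an illustrative sketch of the timing module, Python's standard `sched` module can stand in for the unnamed "timing tool"; `run_plan` is a hypothetical callback representing execution of the test plan:

```python
import datetime
import sched
import time

def schedule_plan(run_plan, at: datetime.datetime):
    """Execute the test plan at the set time (sketch of the timing
    module; run_plan is a hypothetical callback, not the patent's API)."""
    scheduler = sched.scheduler(time.time, time.sleep)
    scheduler.enterabs(at.timestamp(), priority=1, action=run_plan)
    scheduler.run()  # blocks until the scheduled time, then runs the plan

fired = []
schedule_plan(lambda: fired.append("ran"),
              datetime.datetime.now() + datetime.timedelta(seconds=0.1))
print(fired)  # ['ran']
```

A production system would more likely parse a cron-style expression from the configured timing task and hand it to a scheduler service, but the control flow is the same.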
Optionally, the apparatus 200 further comprises a triggering module 204, wherein:
the trigger module 204 is configured to automatically trigger execution of the test plan when monitoring detects that a version iteration has occurred online.
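A hedged sketch of the trigger module's behavior, polling a hypothetical `get_online_version` hook and triggering the plan when the deployed version changes (a real deployment might instead receive a CI/CD webhook rather than poll):

```python
def check_and_trigger(get_online_version, last_seen, run_plan):
    """Poll the deployed version; when it differs from the last seen one
    (a 'version iteration' online), trigger the test plan automatically.
    get_online_version and run_plan are hypothetical hooks."""
    current = get_online_version()
    if current != last_seen:
        run_plan()
        return current  # remember the new version
    return last_seen

runs = []
v = check_and_trigger(lambda: "v1.2.0", "v1.1.9", lambda: runs.append("plan"))
print(v, runs)  # v1.2.0 ['plan']
```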
In this embodiment, an interface testing apparatus 200 is provided, which can implement the functions of the interface testing method; for the corresponding implementation steps and effects, refer to the method embodiments.
Example four
Fig. 6 is a schematic structural diagram of a computing device provided in an embodiment of the present specification. As shown in fig. 6, a computing device 600 comprises a storage device 610 and a processor 620, wherein the storage device 610 is configured to store a computer program, and the processor 620 runs the computer program to cause the computing device 600 to execute the steps of the interface testing method.
The present specification provides a storage medium storing a computer program for use in the above-described computing device 600, which when executed by a processor, implements the steps of the interface testing method.
In summary, the present specification provides an interface testing method and apparatus, a computing device, and a storage medium. The testing method forms a test flow from associated test cases, imposes no coding requirements on the personnel who write the test cases, and provides timed execution and automatic online triggering, so that test results are clear, the labor cost of testing is reduced, and testing efficiency is improved.
Those of ordinary skill in the art will understand that: the figures are merely schematic representations of one embodiment, and the blocks or flow diagrams in the figures are not necessarily required to practice the present invention.
Those of ordinary skill in the art will understand that: modules in the devices in the embodiments may be distributed in the devices in the embodiments according to the description of the embodiments, or may be located in one or more devices different from the embodiments with corresponding changes. The modules of the above embodiments may be combined into one module, or further split into multiple sub-modules.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. An interface testing method, comprising:
S1, acquiring an interface contained in the test plan and parameters corresponding to the interface;
S2, determining a test case corresponding to each interface, and performing permutation and combination processing on the test cases according to a preset permutation rule;
and S3, executing the test plan, testing the interface by executing the test cases, and generating a test report.
2. The method according to claim 1, wherein before the step of obtaining the interface contained in the test plan and the parameters corresponding to the interface, the method further comprises:
and S01, parsing the configured timing task, adding a timing tool, and executing the test plan at the set time.
3. The method according to claim 1, wherein before the step of obtaining the interface contained in the test plan and the parameters corresponding to the interface, the method further comprises:
and S02, automatically triggering execution of the test plan when monitoring detects that a version iteration has occurred online.
4. The method according to claim 1, wherein the step of determining the test cases corresponding to each interface and performing permutation and combination processing on the test cases according to a preset permutation rule comprises:
S21, obtaining a test case corresponding to each interface, and uniformly converting the input and output of the test cases into JSON form;
S22, performing permutation and combination processing on the test cases according to a preset permutation rule to form test case flows;
and S23, performing permutation and combination processing on the test case flows according to the test plan.
5. The method according to claim 1, wherein the step of executing the test plan, testing the interface by executing the test case, and generating a test report comprises:
S31, executing the test cases in each test case flow in the order produced by the permutation and combination, sending request data to a specified interface, and checking whether the returned data meets expectations; if the returned data meets expectations, executing S32; if the returned data does not meet expectations, checking whether the returned data satisfies the fault-tolerance mechanism: if so, executing S32; otherwise, terminating execution and returning an error report;
S32, injecting the request data and returned data of the previous test case into a custom variable, and using the custom variable as an input parameter of the next test case in the test case flow, wherein the life cycle of the custom variable is the total running time of the test cases in the test case flow;
and S33, generating a test report according to the execution condition of the test cases in each test case flow.
6. An interface testing device, comprising an interface acquisition module, a use case processing module and an execution report module, wherein:
the interface acquisition module is configured to acquire an interface contained in the test plan and parameters corresponding to the interface;
the case processing module is configured to determine a test case corresponding to each interface, and perform permutation and combination processing on the test cases according to a preset permutation rule;
the execution report module is configured to execute the test plan, test the interface by executing the test case, and generate a test report.
7. The apparatus of claim 6, further comprising a timing module, wherein:
the timing module is configured to parse the configured timing task, add a timing tool, and execute the test plan at a set time.
8. The apparatus of claim 6, further comprising a triggering module, wherein:
the trigger module is configured to automatically trigger execution of the test plan when monitoring detects that a version iteration has occurred online.
9. A computing device comprising a storage device for storing a computer program and a processor for executing the computer program to cause the computing device to perform the steps of the method of any of claims 1-5.
10. A storage medium, characterized in that it stores a computer program for use in a computing device according to claim 9, which computer program, when being executed by a processor, realizes the steps of the method according to any one of claims 1-5.
CN202010612367.4A 2020-06-30 2020-06-30 Interface testing method and device, computing equipment and storage medium Pending CN111752844A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010612367.4A CN111752844A (en) 2020-06-30 2020-06-30 Interface testing method and device, computing equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111752844A true CN111752844A (en) 2020-10-09

Family

ID=72676652


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112380141A (en) * 2020-12-11 2021-02-19 上海中通吉网络技术有限公司 Interface case set execution method and device
CN112559356A (en) * 2020-12-18 2021-03-26 杭州兑吧网络科技有限公司 Automatic software testing method and system
CN112631914A (en) * 2020-12-24 2021-04-09 上海幻电信息科技有限公司 Data testing method and device
CN113485914A (en) * 2021-06-09 2021-10-08 镁佳(北京)科技有限公司 Vehicle-mounted voice SDK testing method, device and system
CN115525561A (en) * 2022-10-11 2022-12-27 深圳市航盛电子股份有限公司 Protocol interface testing method, device, terminal equipment and storage medium
CN115952100A (en) * 2023-01-10 2023-04-11 北京百度网讯科技有限公司 Interface test method, device, system, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination