CN116578497A - Automatic interface testing method, system, computer equipment and storage medium


Info

Publication number: CN116578497A
Application number: CN202310843177.7A
Authority: CN (China)
Legal status: Pending
Prior art keywords: interface, test, test task, application program, task
Other languages: Chinese (zh)
Inventor: 张家华
Current and original assignee: Suzhou Inspur Intelligent Technology Co Ltd
Application filed by Suzhou Inspur Intelligent Technology Co Ltd; priority to CN202310843177.7A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3688 - Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3684 - Test management for test design, e.g. generating new test cases


Abstract

The invention relates to the technical field of automated testing and discloses an automated interface testing method, system, computer device and storage medium. The method comprises: acquiring an interface document uploaded by a client and accessing at least one application program interface based on the interface document, wherein the interface document conforms to an open application programming interface specification; creating an interface test task based on the business processing logic corresponding to the at least one application program interface; and executing the interface test task and generating an automated test result report. The invention realizes automated testing of multi-version API interfaces and improves resource utilization.

Description

Automatic interface testing method, system, computer equipment and storage medium
Technical Field
The invention relates to the technical field of automated testing, and in particular to an automated interface testing method, an automated interface testing system, a computer device and a storage medium.
Background
With the continuing spread of the microservice concept, more and more systems adopt an architecture in which the front end and back end are separated, and more and more back-end API interfaces (Application Programming Interfaces) are used. In a multi-module application system, the business logic of each module has clear boundaries, yet a single function may involve several modules. Over the life cycle of the application system, each released version therefore has its own corresponding set of back-end API interfaces, which makes the management of multi-version API interfaces more complicated. Moreover, before entering the formal testing stage, developers perform internal testing of the API interfaces. How to test back-end multi-version API interfaces automatically is thus a problem that those skilled in the art urgently need to solve.
Disclosure of Invention
In view of the above, the present invention provides an automated interface testing method, system, computer device and storage medium, so as to solve the problem that back-end multi-version API interfaces are difficult to test.
In a first aspect, the present invention provides an automated interface testing method, including:
acquiring an interface document uploaded by a client, and accessing at least one application program interface based on the interface document; wherein the interface document conforms to an open application programming interface specification;
creating an interface test task based on business processing logic corresponding to at least one application program interface;
and executing the interface test task and generating an automatic test result report.
According to the automated interface testing method provided by the invention, the user generates the interface document at the client from a preset open application programming interface specification file, and at least one application program interface is accessed based on that document, which realizes management of multi-version API interfaces and improves resource utilization. An interface test task is then created based on the business processing logic corresponding to the at least one application program interface and executed to generate an automated test result report, so the interface test task is created automatically and the time cost of building and maintaining a test environment is effectively reduced.
In an alternative embodiment, obtaining an interface document uploaded by a client, accessing at least one application program interface based on the interface document, includes:
extracting target interface information from the interface document, and performing a grouped search of the target interface information according to preset label groups;
and selecting at least one application program interface based on the grouped search result, and accessing the at least one application program interface.
According to the automated interface testing method provided by the invention, searching the target interface information by preset label groups improves the efficiency and accuracy of identifying application program interfaces.
In an alternative embodiment, creating an interface test task based on business processing logic corresponding to at least one application program interface includes:
determining an interface request mode corresponding to at least one application program interface based on the interface document;
determining interface parameters by using a preset parameter list based on at least one application program interface;
determining preset interface state code information and preset interface return data corresponding to at least one application program interface based on an interface document;
and creating an interface test task based on the interface request mode, the interface parameters, the preset interface state code information and the preset interface return data.
According to the automated interface testing method provided by the invention, the interface test task is created based on the interface request mode, the interface parameters, the interface status code information and the interface return data, so that the created interface test task better matches the user's business logic and multi-version API interfaces can be managed effectively.
In an alternative embodiment, performing an interface test task, generating an automated test result report, includes:
inputting an interface request mode and interface parameters into at least one application program interface to generate current interface state code information and current interface return data;
comparing the current interface state code information with preset interface state code information, and detecting a data structure in the returned data of the current interface;
and when the current interface state code information is different from the preset interface state code information and the data structure in the current interface return data is empty, ending the interface test task and generating an automatic test result report.
According to the automated interface testing method provided by the invention, the interface test task corresponding to an application program interface can be monitored in real time through the interface status code information and the interface return data, and the automated test results are fed back to the user, so that the user can follow the progress of the interface test task in real time.
In an alternative embodiment, the interface test task is executed, and an automated test result report is generated, and the method further includes:
comparing the current interface state code information with a preset abnormal value, and ending the interface test task when the current interface state code information is the preset abnormal value and the data structure in the current interface return data is empty.
According to the automatic interface testing method provided by the invention, the current interface state code information is compared with the preset abnormal value, so that the judging speed of the total execution result of the interface testing task is improved.
In an alternative embodiment, the interface test task is executed, and an automated test result report is generated, and the method further includes:
when the interface test task is a multi-stage test task, acquiring an automatic test result corresponding to the current-stage test task;
when the automatic test result corresponding to the test task at the current stage is abnormal, interrupting the execution of the interface test task, generating an interrupt result, and displaying the interrupt result through the client.
According to the automatic interface testing method provided by the invention, the testing time is saved by respectively monitoring the testing task results in different stages, and the real-time monitoring and modification of the multi-stage testing task are realized.
In an alternative embodiment, the method further comprises:
and acquiring the field number of the previous stage, extracting the field number of the current stage in the interface return data, and associating the field number of the current stage with the field number of the previous stage.
In the automated interface testing method, the parameters of each stage follow the same path during the automated testing process, so the field numbers are associated, and the relevant parameters of the previous stage of the interface test task can be identified effectively and accurately by using the field numbers.
In an alternative embodiment, before performing the interface test task and generating the automated test result report, the method includes:
and acquiring user operation data or preset period trigger data uploaded by the client, executing an interface test task based on the user operation data or the preset period trigger data, and generating an automatic test result report.
According to the interface automatic test method, the interface test tasks can be monitored in real time or periodically through the user operation data or the preset period trigger data, and the flexibility of application program interface management is improved.
In an alternative embodiment, the method further comprises:
The automatic test result report is sent to the client for visual display, so that a user can arrange test result feedback information based on the automatic test result report; the automated test result report comprises the total execution result of the interface test task, the total execution time of the interface test task, the execution time of each stage of the interface test task, the execution result of each stage of the interface test task and the coverage rate of the interface test.
According to the interface automatic test method provided by the invention, the automatic test result report is sent to the client for visual display, and the test result focused by the user is intuitively displayed from multiple dimensions, so that the user can arrange the test result feedback information in real time through the automatic test result report.
In an alternative embodiment, the method further comprises:
and acquiring test result feedback information uploaded by a user, and managing the interface test task based on the test result feedback information.
According to the automatic interface testing method, the interface testing task is managed through the feedback information of the testing result uploaded by the user, so that the interface testing task is more in line with the current testing scene, the working efficiency of research personnel is improved, and the flexible management of the back-end multi-version API is realized.
In an alternative embodiment, managing the interface test tasks based on the test result feedback information includes:
updating the interface test task based on the test result feedback information, and generating an updated interface test task;
and deleting the interface test task and the automatic test result report based on the test result feedback information.
In an alternative embodiment, managing the interface test task based on the test result feedback information further includes:
and executing the updated interface test task, generating an interface test modification execution result, and sending the interface test modification execution result to the client for visual display.
According to the automatic test method for the interface, provided by the invention, the interface test task updating unit can be used for creating the interface test task conforming to the current scene based on the feedback information of the test result on the basis of the existing interface test task, so that the creation time of the interface test task is saved, and the flexible management of the back-end multi-version API interface is realized through updating and deleting the interface test task.
In an alternative embodiment, before obtaining the interface document uploaded by the client and accessing at least one application program interface based on the interface document, the method further comprises:
Generating an interface file conforming to the open application programming interface specification by using an interface document generating tool, and displaying the interface file conforming to the open application programming interface specification on a client;
and acquiring the interface file which accords with the open application programming interface specification after the user edits, and taking the interface file which accords with the open application programming interface specification after the user edits as an interface document.
In an alternative embodiment, the interface document generation tool employs a swagger tool.
According to the automatic interface testing method provided by the invention, the interface file conforming to the open application programming interface specification is generated by the interface file generating tool in advance, so that a user can set related data of the interface file more intuitively and simply, the setting requirement on the user is lower, and the application scene of the automatic interface testing method is enlarged.
In a second aspect, the present invention provides an interface automation test system comprising:
the application program interface access module is used for acquiring the interface document uploaded by the client and accessing at least one application program interface based on the interface document; wherein the interface document conforms to an open application programming interface specification;
The test task arrangement module is connected with the application program interface access module and is used for creating an interface test task based on the service processing logic corresponding to at least one application program interface;
the test task execution module is connected with the test task arrangement module and used for executing the interface test task and generating an automatic test result report.
In an alternative embodiment, a test task orchestration module comprises:
the interface association unit is used for determining an interface request mode corresponding to at least one application program interface based on the interface document;
an interface parameter setting unit, configured to determine an interface parameter by using a preset parameter list based on at least one application program interface;
the interface response setting unit is used for determining interface state code information and interface return data corresponding to at least one application program interface based on the interface document;
the test task creation unit is connected with the interface association unit, the interface parameter setting unit and the interface response setting unit and is used for creating an interface test task based on the interface request mode, the interface parameter, the interface state code information and the interface return data.
In an alternative embodiment, the method further comprises:
and the test task management module is used for acquiring the feedback information of the test result uploaded by the user and managing the interface test task based on the feedback information of the test result.
In an alternative embodiment, a test task management module includes:
the interface test task updating unit is used for updating the interface test task based on the feedback information of the test result and generating an updated interface test task;
and the interface test task deleting unit is used for deleting the interface test task and the automatic test result report form based on the test result feedback information.
In a third aspect, the present invention provides a computer device comprising: the memory and the processor are in communication connection, computer instructions are stored in the memory, and the processor executes the computer instructions, so that the interface automation test method of the first aspect or any implementation manner corresponding to the first aspect is executed.
In a fourth aspect, the present invention provides a computer readable storage medium having stored thereon computer instructions for causing a computer to perform a method of automated interface testing according to the first aspect or any of its corresponding embodiments.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described, and it is obvious that the drawings in the description below are some embodiments of the present invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of an automated interface test method according to an embodiment of the present invention;
FIG. 2 is a flow chart of another interface automated test method according to an embodiment of the invention;
FIG. 3 is a flow chart of yet another interface automated test method according to an embodiment of the present invention;
FIG. 4 is a flow chart of yet another interface automated test method according to an embodiment of the present invention;
FIG. 5 is a flow diagram of an interface automation test process according to an embodiment of the present invention;
FIG. 6 is a schematic flow chart of viewing newly created pets in a pet store according to an embodiment of the invention;
FIG. 7 is a flow chart of a multi-stage pet testing task scheduled in a pet store according to an embodiment of the present invention;
FIG. 8 is a block diagram of an interface automation test system in accordance with an embodiment of the present invention;
fig. 9 is a schematic diagram of a hardware structure of a computer device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In accordance with an embodiment of the present invention, an interface automated test method embodiment is provided, it being noted that the steps shown in the flowchart of the figures may be performed in a computer system, such as a set of computer executable instructions, and, although a logical order is shown in the flowchart, in some cases, the steps shown or described may be performed in an order other than that shown or described herein.
In this embodiment, an automated interface testing method is provided, which may be applied to a terminal such as a mobile phone or a tablet computer. Fig. 1 is a flowchart of an automated interface testing method according to an embodiment of the present invention, and as shown in fig. 1, the flow includes the following steps:
step S101, an interface document uploaded by a client is obtained, and at least one application program interface is accessed based on the interface document; wherein the interface document conforms to an open application programming interface specification.
Specifically, before an interface document uploaded by a client is acquired, an interface document generating tool is utilized to generate an interface file conforming to an open application programming interface specification, and the interface file conforming to the open application programming interface specification is displayed on the client; and acquiring the interface file which accords with the open application programming interface specification after the user edits, and taking the interface file which accords with the open application programming interface specification after the user edits as an interface document.
Further, the interface document generation tool adopts a swagger tool (an interface document generation tool).
Specifically, the client stores a preset open application programming interface specification (OpenAPI specification) file, and the user directly selects an API interface (i.e., an application program interface) from the preset open application programming interface specification file stored in the client, sets corresponding interface parameters and an interface request mode, and further generates an interface document.
Further, the OpenAPI specification describes RESTful APIs (a set of conventions specifying how front ends in multiple forms interact with the same back end), and the swagger tool provides SDKs (Software Development Kits) for computer programming languages such as java, python and golang, so a user can also use the swagger tool to compose json (JavaScript Object Notation, a lightweight data-interchange format) or YAML (YAML Ain't Markup Language, a data serialization format) files, i.e. interface documents, conforming to the OpenAPI specification. For development teams that already use the OpenAPI specification, only the corresponding json or yaml file needs to be provided to access the APIs of the different versions of the application system; for development teams that do not use the OpenAPI specification, the functions of the APIs can be entered online.
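For illustration only, the following is a minimal sketch (in Python, using the standard json module) of the kind of json interface document described above; the path, tag and parameter names are assumptions, not content prescribed by the invention.

import json

# Assumed minimal interface document conforming to the OpenAPI specification.
interface_document = {
    "openapi": "3.0.0",
    "info": {"title": "example application", "version": "1.0.0"},
    "paths": {
        "/items": {
            "get": {
                "tags": ["item"],  # the tags group later used for grouped search
                "parameters": [{"name": "tag", "in": "query",
                                "schema": {"type": "string"}}],
                "responses": {"200": {"description": "matching items"}},
            }
        }
    },
}

# The client uploads a file like this; the test system accesses the APIs it declares.
with open("interface_document.json", "w") as f:
    json.dump(interface_document, f, indent=2)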
Step S102, an interface test task is created based on business processing logic corresponding to at least one application program interface.
Specifically, the business processing logic connects one or more APIs (i.e., application program interfaces) to complete a service. Each API has an input and an output: the input of the API represents the parameters passed in, and the output represents the return data and status of the interface. For the case of multiple APIs, the association information between them is recorded, and an interface test task is then created; the interface test task is an execution flowchart with predecessor and successor relationships.
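As a sketch only, one possible in-memory representation of such an interface test task is given below; the class and field names are assumptions chosen to mirror the description above (per-stage input, expected output, predecessor order), not the invention's actual data model.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TestStage:
    name: str
    method: str                        # interface request mode, e.g. "get" or "post"
    path: str                          # the application program interface under test
    params: dict                       # input: parameters passed to the interface
    expected_status: int               # output: preset interface status code
    expected_schema: Optional[str]     # output: preset data structure of the return data
    predecessor: Optional[str] = None  # name of the preceding stage, if any

@dataclass
class InterfaceTestTask:
    name: str
    stages: list = field(default_factory=list)

    def add_stage(self, stage: TestStage) -> None:
        # record the predecessor/successor relation as stages are appended
        if self.stages:
            stage.predecessor = self.stages[-1].name
        self.stages.append(stage)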
And step S103, executing an interface test task and generating an automatic test result report.
Specifically, before executing the interface test task and generating the automated test result report, the method further comprises the following steps: and acquiring user operation data or preset period trigger data uploaded by the client, executing an interface test task based on the user operation data or the preset period trigger data, and generating an automatic test result report.
Further, the execution modes of the interface test task include manual triggering and periodic triggering. In the manual triggering mode, the user performs a click operation on the WEB (World Wide Web) interface of the client to generate user operation data. For periodic triggering, the user only needs to define a trigger period to generate the preset periodic trigger data; for example, the user can specify a particular time each day at which to execute the test task, or execute test tasks weekly, hourly and so on, as required by the development schedule.
Further, the automatic test result report is sent to the client for visual display, so that a user can arrange test result feedback information based on the automatic test result report; the automated test result report includes the total execution result (success or failure, failure reason), the total execution time (time from start to end) of the interface test task, the execution time (response time of the application program interface) of each stage of the interface test task, the execution result (return information and response state of the application program interface) of each stage of the interface test task, and the coverage rate of the interface test.
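A sketch of what such an automated test result report could look like as a data structure; the field names and the sample values are assumptions, shown only to make the listed dimensions concrete.

report = {
    "total_result": {"success": False,
                     "failure_reason": "stage 'query pet' returned an unexpected status"},
    "total_time_ms": 1830,            # time from start to end
    "stages": [
        {"name": "create pet", "result": "success", "status_code": 200, "time_ms": 420},
        {"name": "query pet",  "result": "failure", "status_code": 404, "time_ms": 210},
    ],
    "interface_coverage": 0.5,        # share of documented interfaces exercised
}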
Further, obtaining test result feedback information uploaded by a user, and managing the interface test task based on the test result feedback information, specifically including: updating the interface test task based on the test result feedback information, and generating an updated interface test task; and deleting the interface test task and the automatic test result report based on the test result feedback information.
Further, the copy and update operations on interface test tasks are designed to better cope with multi-version APIs: under a microservice architecture each module iterates at a different pace, so to orchestrate the test task for a new API version more quickly, the user can directly copy the interface test task of the previous version and then modify it, thereby generating the test result feedback information.
Further, the updated interface test task is executed, an interface test modification execution result is generated, and the interface test modification execution result is sent to the client for visual display.
According to the automated interface testing method provided by this embodiment, the user generates an interface document at the client from a preset open application programming interface specification file, and at least one application program interface is accessed based on the interface document, which realizes management of multi-version API interfaces and improves resource utilization. An interface test task is created based on the business processing logic corresponding to the at least one application program interface and then executed to generate an automated test result report, so the interface test task is created automatically and the time cost of building and maintaining a test environment is effectively reduced.
In this embodiment, an automated interface testing method is provided, which may be applied to a terminal such as a mobile phone or a tablet computer. Fig. 2 is a flowchart of an automated interface testing method according to an embodiment of the present invention, and as shown in fig. 2, the flow includes the following steps:
step S201, an interface document uploaded by a client is obtained, and at least one application program interface is accessed based on the interface document; wherein the interface document conforms to an open application programming interface specification.
Specifically, the step S201 includes:
Step S2011, extracting target interface information from the interface document, and performing a grouped search of the target interface information according to preset label groups.
Step S2012, selecting at least one application program interface based on the packet search result, and accessing the at least one application program interface.
Step S202, an interface test task is created based on business processing logic corresponding to at least one application program interface. Please refer to step S102 in the embodiment shown in fig. 1 in detail, which is not described herein.
And step S203, executing an interface test task and generating an automatic test result report. Please refer to step S103 in the embodiment shown in fig. 1 in detail, which is not described herein.
According to the automated interface testing method, the target interface information is searched by groups according to the preset label grouping, thereby improving the efficiency and accuracy of identifying application program interfaces.
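A minimal sketch, assuming the interface_document dict from the earlier sketch, of how the tag-based grouped search could be performed over an OpenAPI document; the function name is illustrative.

from collections import defaultdict

def group_interfaces_by_tag(document: dict) -> dict:
    """Group every (method, path) operation in the interface document by its tags."""
    groups = defaultdict(list)
    for path, operations in document.get("paths", {}).items():
        for method, operation in operations.items():
            for tag in operation.get("tags", ["untagged"]):
                groups[tag].append((method.upper(), path))
    return groups

# Locate the target interface inside its tag group before accessing it.
groups = group_interfaces_by_tag(interface_document)      # dict from the earlier sketch
target_interfaces = groups.get("item", [])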
In this embodiment, an automated interface testing method is provided, which may be applied to a terminal such as a mobile phone or a tablet computer. Fig. 3 is a flowchart of an automated interface testing method according to an embodiment of the present invention, and as shown in fig. 3, the flow includes the following steps:
Step S301, an interface document uploaded by a client is obtained, and at least one application program interface is accessed based on the interface document; wherein the interface document conforms to an open application programming interface specification. Please refer to step S201 in the embodiment shown in fig. 2 in detail, which is not described herein.
Step S302, an interface test task is created based on business processing logic corresponding to at least one application program interface.
Specifically, the step S302 includes:
step S3021, determining an interface request mode corresponding to at least one application program interface based on the interface document.
In step S3022, the interface parameters are determined using the preset parameter list based on the at least one application program interface.
Step S3023, determining preset interface status code information and preset interface return data corresponding to at least one application program interface based on the interface document.
Step S3024, creating an interface test task based on the interface request mode, the interface parameters, the preset interface status code information, and the preset interface return data.
Specifically, an interface request mode and interface parameters in an interface test task are used as input of an application program interface, interface state code information and interface return data are used as output of the application program interface, and then the interface test task is executed, so that an automatic test result is obtained.
Further, the interface test task supports branch flows, serial flows and parallel flows, and supports various types of request modes.
Further, the field number of the previous stage is obtained, the field number of the current stage is extracted from the interface return data, and the field number of the current stage is associated with the field number of the previous stage.
Further, when the interface test task is a multi-stage task, after the interface test task of the previous stage has been executed, the field number of the current stage needs to be associated with the field number of the previous stage when the interface test task of the current stage is executed.
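A minimal sketch of the field association described above, assuming the previous stage's return data is available as a dict; the field name "id" is an assumed example.

def associate_field(previous_return_data: dict, current_params: dict,
                    previous_field: str, current_field: str) -> dict:
    """Fill a parameter of the current stage from a field returned by the previous stage."""
    associated = dict(current_params)
    associated[current_field] = previous_return_data[previous_field]
    return associated

# e.g. the id returned by the creation stage becomes the query stage's path parameter
previous_return_data = {"id": 42, "name": "spot", "tag": "dog"}   # assumed output
query_params = associate_field(previous_return_data, {}, "id", "id")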
And step S303, executing an interface test task and generating an automatic test result report. Please refer to step S203 in the embodiment shown in fig. 2 in detail, which is not described herein.
According to the automated interface testing method provided by this embodiment, the parameters of each stage follow the same path during the automated testing process, so the field numbers are associated, and the relevant parameters of the previous stage of the interface test task can be identified effectively and accurately by using the field numbers.
In this embodiment, an automated interface testing method is provided, which may be applied to a terminal such as a mobile phone or a tablet computer. Fig. 4 is a flowchart of an automated interface testing method according to an embodiment of the present invention, and as shown in fig. 4, the flow includes the following steps:
Step S401, obtaining an interface document uploaded by a client, and accessing at least one application program interface based on the interface document; wherein the interface document conforms to an open application programming interface specification. Please refer to step S301 in the embodiment shown in fig. 3 in detail, which is not described herein.
Step S402, an interface test task is created based on business processing logic corresponding to at least one application program interface. Please refer to step S302 in the embodiment shown in fig. 3 in detail, which is not described herein.
And S403, executing an interface test task and generating an automatic test result report.
Specifically, the step S403 includes:
step S4031, the interface request mode and the interface parameters are input into at least one application program interface to generate the current interface state code information and the current interface return data.
Step S4032, comparing the current interface status code information with the preset interface status code information, and detecting the data structure in the current interface return data.
Step S4033, when the current interface status code information is different from the preset interface status code information and the data structure in the current interface return data is empty, ending the interface test task and generating an automatic test result report.
Or comparing the current interface state code information with a preset abnormal value, and ending the interface test task when the current interface state code information is the preset abnormal value and the data structure in the current interface return data is empty.
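A sketch of the response check described in steps S4032-S4033 and the paragraph above; the helper name and the choice of 500 as the preset abnormal value are assumptions.

def stage_passed(status_code: int, return_data, expected_status: int,
                 abnormal_status: int = 500) -> bool:
    """Return False when the interface test task should end according to the rules above."""
    data_empty = return_data in (None, {}, [], "")
    if status_code != expected_status and data_empty:
        return False   # status code differs from the preset value and the data structure is empty
    if status_code == abnormal_status and data_empty:
        return False   # status code equals the preset abnormal value and the data structure is empty
    return True

# e.g. an unexpected status with an empty body ends the task and triggers the result report
assert stage_passed(200, {"id": 42}, expected_status=200)
assert not stage_passed(404, {}, expected_status=200)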
Further, when the interface test task is a multi-stage test task, an automatic test result corresponding to the current stage test task is obtained; when the automatic test result corresponding to the test task at the current stage is abnormal, interrupting the execution of the interface test task, generating an interrupt result, and displaying the interrupt result through the client.
According to the automated interface testing method provided by this embodiment, the interface test task is created based on the interface request mode, the interface parameters, the interface status code information and the interface return data, so the created interface test task better matches the user's business logic and multi-version API interfaces can be managed effectively.
The following describes a test procedure of an interface automation test method by means of specific embodiments.
Example 1:
as shown in fig. 5, before the automatic test of the API interface, a user manually inputs related information of the API interface in a document conforming to the OpenAPI specification, generates an interface document, and uploads the interface document to the API automatic test system;
The API automatic test system creates an API automatic test task through an interface document and associates an API interface;
the API automatic test system stores test tasks;
the API automatic test system executes the test task and sends the execution result to the user;
the user checks the execution result of the test task, and the API automatic test system displays the execution result of the task, including execution time, failure reasons and the like;
the user sends related data of updating and deleting test tasks to the API automatic test system;
the API automated test system updates or deletes the test tasks accordingly, then re-executes the updated test tasks, and the process iterates in this way.
Example 2:
The following embodiments 2 and 3 use a pet store as the application scenario for interface test task orchestration. The interface documents are json files that define 4 interfaces and 2 data structures (a minimal sketch of such a json interface document is given after this list):
1) A filtering query by pet category returns all matching pet information;
2) A new pet is created in the pet store via /pets with the post method, and the newly created pet information is returned;
3) A query by pet id with the get method returns the pet information;
4) The delete method deletes by pet id;
5) The Pet data structure: pet information, consisting of an id plus the NewPet structure;
6) The NewPet data structure: new pet information, including the pet name and pet category.
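The json interface document itself is not reproduced in the patent text; the sketch below is an assumed reconstruction (following the public OpenAPI pet-store example) of the four operations and the Pet and NewPet structures listed above, expressed as a Python dict.

petstore_document = {
    "openapi": "3.0.0",
    "info": {"title": "petstore", "version": "1.0.0"},
    "paths": {
        "/pets": {
            "get":  {"tags": ["pet"], "summary": "filter pets by category",
                     "responses": {"200": {"description": "all matching pet information"}}},
            "post": {"tags": ["pet"], "summary": "create a new pet",
                     "responses": {"200": {"description": "the newly created pet"}}},
        },
        "/pets/{id}": {
            "get":    {"tags": ["pet"], "summary": "query a pet by id",
                       "responses": {"200": {"description": "the pet information"}}},
            "delete": {"tags": ["pet"], "summary": "delete a pet by id",
                       "responses": {"204": {"description": "the id of the deleted pet"}}},
        },
    },
    "components": {"schemas": {
        "NewPet": {"type": "object",   # new pet information: name and category
                   "properties": {"name": {"type": "string"}, "tag": {"type": "string"}}},
        "Pet": {"allOf": [{"$ref": "#/components/schemas/NewPet"},  # Pet = NewPet plus id
                          {"type": "object", "properties": {"id": {"type": "integer"}}}]},
    }},
}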
As shown in fig. 6, a newly created pet can be viewed in the pet store, and the task orchestration flow is as follows:
1) According to the business logic, satisfying this requirement is divided into two stages: the first stage creates a pet in the pet store, and the second stage queries the pet according to the newly created pet information, i.e. the parts enclosed by the dashed lines in the flowchart; the start and end nodes are markers built into the visual interface;
2) Pet creation stage;
a) API association: API definitions based on the OpenAPI specification carry a tags group, so when API association is performed through the WEB interface the APIs are grouped by tag and the user selects the corresponding API under its group; a grouped search by tag is especially helpful when the application system has a large number of APIs; after the API is selected, the request mode to be used is selected;
b) Parameter setting of the API: after the request mode is selected in the previous step, the corresponding parameter list is displayed on the WEB interface; in this example it contains the pet name and category; a default fill-in value is provided for each parameter type and can be modified by the user, which simplifies the user's operation flow;
c) API interface response setting: in this stage the user tells the system how to determine that the interface response is normal, covering two aspects: first, the status code information of the interface, i.e. its normal and abnormal values; second, whether the data returned by the interface is correct; in this example, the normal state is that the interface returns status code 200 and the returned data is a Pet data structure;
3) Pet query stage;
a) API association: API definitions based on the OpenAPI specification carry a tags group, so when API association is performed through the WEB interface the APIs are grouped by tag and the user selects the corresponding API under its group; a grouped search by tag is especially helpful when the application system has a large number of APIs; after the API is selected, the request mode to be used is selected;
b) Parameter setting of the API: after the request mode is selected in the previous step, the corresponding parameter list is displayed on the WEB interface; in this example the parameter is in the path and is associated with the previous stage, so a parameter association setting is needed;
c) Parameter association setting of the API: in this example the id parameter is the id field in the data returned by the pet creation interface, so the id only needs to be associated with the id field of the previous stage's returned data (see the sketch after this example); this example's scenario is simple, and for complex test scenarios the parameter association setting can be associated with any earlier stage, using either request parameters or returned data;
d) API interface response setting: in this stage the user tells the system how to determine that the interface response is normal, covering two aspects: first, the status code information of the interface, i.e. its normal and abnormal values; second, whether the data returned by the interface is correct; in this example, the normal state is that the interface returns status code 200 and the returned data is a Pet data structure that is not null;
After the interface test task has been created, during execution the next stage is executed only if the previous stage executed successfully and its data is normal; if the task is interrupted, an abnormality prompt can be seen on the WEB interface.
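A minimal executable sketch of the two-stage task of this example, using the requests library; the base URL, the field names and the success criteria are assumptions taken from the description above.

import requests

BASE = "http://localhost:8080"       # assumed address of the pet-store service under test

# Stage 1, pet creation: normal means status code 200 and a Pet data structure is returned.
create = requests.post(f"{BASE}/pets", json={"name": "spot", "tag": "dog"})
if create.status_code != 200 or not create.json():
    raise SystemExit("stage 'create pet' abnormal, interface test task interrupted")

# Parameter association: the id field of stage 1's return data fills stage 2's path parameter.
pet_id = create.json()["id"]

# Stage 2, pet query: normal means status code 200 and a non-null Pet data structure.
query = requests.get(f"{BASE}/pets/{pet_id}")
assert query.status_code == 200 and query.json(), "stage 'query pet' abnormal"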
Example 3:
As shown in fig. 7, the application scenario of this embodiment is: before creating a pet in the pet store, check whether the pet name is available and, if so, create the pet; after creation, query the pet information, delete the pet once the query succeeds, and verify that the deletion succeeded.
1) According to the business logic, satisfying this requirement is divided into five stages: the first stage checks whether the pet already exists, the second stage creates a pet in the pet store, the third stage queries the pet according to the newly created pet information, the fourth stage deletes the newly created pet, and the fifth stage verifies that the pet was deleted successfully, i.e. the parts enclosed by the dashed lines in the flowchart; the start and end nodes are markers built into the visual interface;
2) Pet creation verification stage: the pet interface is simply extended so that it can also filter and query by pet name and category;
a) API association: API definitions based on the OpenAPI specification carry a tags group, so when API association is performed through the WEB interface the APIs are grouped by tag and the user selects the corresponding API under its group; a grouped search by tag is especially helpful when the application system has a large number of APIs; after the API is selected, the request mode to be used is selected;
b) Parameter setting of the API: after the request mode is selected in the previous step, the corresponding parameter list is displayed on the WEB interface; in this example it contains the pet name and category, and the parameters actually sit in the query path; a default fill-in value is provided for each parameter type and can be modified by the user, which simplifies the user's operation flow;
c) API interface response setting: in this stage the user tells the system how to determine that the interface response is normal, covering two aspects: first, the status code information of the interface, i.e. its normal and abnormal values; second, whether the data returned by the interface is correct; in this example, the normal state is that the interface returns status code 200 and the returned data is a Pet data structure whose data is null;
If the verification of the pet succeeds, the flow continues with the pet creation stage; if the verification fails, it ends directly, and the branch taken by the API test task is displayed in the WEB interface.
3) Pet creation stage:
a) API association: API definitions based on the OpenAPI specification carry a tags group, so when API association is performed through the WEB interface the APIs are grouped by tag and the user selects the corresponding API under its group; a grouped search by tag is especially helpful when the application system has a large number of APIs; after the API is selected, the request mode to be used is selected;
b) Parameter setting of the API: after the request mode is selected in the previous step, the corresponding parameter list is displayed on the WEB interface; in this example it contains the pet name and category; a default fill-in value is provided for each parameter type and can be modified by the user, which simplifies the user's operation flow;
c) Parameter association setting of the API: in this example the name parameter is the name field among the parameters of the pet verification interface, so the name only needs to be associated with the name field in the previous stage's parameter setting; this example's scenario is simple, and for complex test scenarios the parameter association setting can be associated with any earlier stage, using either request parameters or returned data;
d) API interface response setting: in this stage the user tells the system how to determine that the interface response is normal, covering two aspects: first, the status code information of the interface, i.e. its normal and abnormal values; second, whether the data returned by the interface is correct; in this example, the normal state is that the interface returns status code 200 and the returned data is a Pet data structure;
4) Pet query stage:
a) API association: API definitions based on the OpenAPI specification carry a tags group, so when API association is performed through the WEB interface the APIs are grouped by tag and the user selects the corresponding API under its group; a grouped search by tag is especially helpful when the application system has a large number of APIs; after the API is selected, the request mode to be used is selected;
b) Parameter setting of the API: after the request mode is selected in the previous step, the corresponding parameter list is displayed on the WEB interface; in this example the parameter is in the path and is associated with the previous stage, so a parameter association setting is needed;
c) Parameter association setting of the API: in this example the id parameter is the id field in the data returned by the pet creation interface, so the id only needs to be associated with the id field of the previous stage's returned data;
d) API interface response setting: in this stage the user tells the system how to determine that the interface response is normal, covering two aspects: first, the status code information of the interface, i.e. its normal and abnormal values; second, whether the data returned by the interface is correct; in this example, the normal state is that the interface returns status code 200 and the returned data is a Pet data structure that is not null;
5) Pet deletion stage:
a) API association: API definitions based on the OpenAPI specification carry a tags group, so when API association is performed through the WEB interface the APIs are grouped by tag and the user selects the corresponding API under its group; a grouped search by tag is especially helpful when the application system has a large number of APIs; after the API is selected, the request mode to be used is selected;
b) Parameter setting of the API: after the request mode is selected in the previous step, the corresponding parameter list is displayed on the WEB interface; in this example the parameter is in the path and is associated with the pet creation stage, so a parameter association setting is needed;
c) Parameter association setting of the API: in this example the id parameter is the id field in the data returned by the pet creation interface, so the id only needs to be associated with the id field of the creation stage's returned data;
d) API interface response setting: in this stage the user tells the system how to determine that the interface response is normal, covering two aspects: first, the status code information of the interface, i.e. its normal and abnormal values; second, whether the data returned by the interface is correct; in this example, the normal state is that the interface returns status code 204 and the returned data is the id of the deleted pet;
6) Deletion verification stage:
a) API association: API definitions based on the OpenAPI specification carry a tags group, so when API association is performed through the WEB interface the APIs are grouped by tag and the user selects the corresponding API under its group; a grouped search by tag is especially helpful when the application system has a large number of APIs; after the API is selected, the request mode to be used is selected;
b) Parameter setting of the API: after the request mode is selected in the previous step, the corresponding parameter list is displayed on the WEB interface; in this example it contains the pet name and category, and the parameters actually sit in the query path; a default fill-in value is provided for each parameter type and can be modified by the user, which simplifies the user's operation flow;
c) Parameter association setting of the API: in this example the name and tag parameters are parameters of the pet verification interface, and only the relevant information needs to be associated;
d) API interface response setting: in this stage the user tells the system how to determine that the interface response is normal, covering two aspects: first, the status code information of the interface, i.e. its normal and abnormal values; second, whether the data returned by the interface is correct; in this example, the normal state is that the interface returns status code 200 and the returned data is a Pet data structure whose data is null;
For complex business logic, the user can also orchestrate the API test task according to actual requirements. After the test task has been executed, the user can clearly see which flow branch the test task took, and the execution results of the interface test task are displayed intuitively, including: the total execution result of the interface test task (success or failure, with failure reason); the total execution time of the interface test task (time from start to end); the execution time of each stage of the interface test task (the response time of the corresponding interface); the execution result of each stage of the interface test task (interface return information, response status, etc.); and the coverage of the interface test.
The above embodiments provide: multi-version API management based on the OpenAPI specification, with which the system can manage API versions throughout the whole software life cycle and effectively manage the APIs of the whole system even when different versions are online at the same time; visual orchestration of business logic, which gives the initiative to the actual user, who can build orchestration tasks according to their own business logic, with support for branch, serial and parallel flows and for various types of request modes; a repeatably executable automated API test flow, since during daily development iterations the definition of an API interface changes little, so the API test task for a new version can be orchestrated quickly from the orchestrated test task of the previous version; and an automated API test result report, which visually presents the test results of interest to the user from multiple dimensions and can be exported.
In this embodiment, an automatic interface testing system is further provided, and the device is used to implement the foregoing embodiments and preferred embodiments, which are not described in detail. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. While the means described in the following embodiments are preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
An embodiment of the present invention provides an automated interface testing system, as shown in fig. 8, including:
an application program interface access module 801, configured to obtain an interface document uploaded by a client, and access at least one application program interface based on the interface document; wherein the interface document conforms to an open application programming interface specification.
The test task orchestration module 802 is connected to the application program interface access module 801, and is configured to create an interface test task based on service processing logic corresponding to at least one application program interface.
The test task execution module 803 is connected to the test task orchestration module 802, and is configured to execute the interface test task and generate an automated test result report.
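An illustrative sketch, not the actual implementation, of how the three modules above could be composed; class and method names are assumptions that mirror the module names.

class ApiAccessModule:
    """Obtains the uploaded interface document and accesses the interfaces it declares."""
    def access(self, interface_document: dict) -> list:
        return list(interface_document.get("paths", {}).keys())

class TestTaskOrchestrationModule:
    """Creates an interface test task from the business processing logic of the interfaces."""
    def __init__(self, access_module: ApiAccessModule):
        self.access_module = access_module
    def create_task(self, interface_document: dict) -> dict:
        return {"stages": self.access_module.access(interface_document)}

class TestTaskExecutionModule:
    """Executes the interface test task and produces an automated test result report."""
    def __init__(self, orchestration_module: TestTaskOrchestrationModule):
        self.orchestration_module = orchestration_module
    def execute(self, interface_document: dict) -> dict:
        task = self.orchestration_module.create_task(interface_document)
        # execution details omitted; the report structure here is an assumed placeholder
        return {"stages": task["stages"], "total_result": "not run"}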
In some alternative embodiments, the application program interface access module 801 includes:
and the grouping searching unit is used for extracting the target interface information in the interface document and carrying out grouping searching on the target interface information according to the preset label grouping.
And the access unit is used for selecting at least one application program interface based on the packet search result and accessing the at least one application program interface.
In some alternative embodiments, test task orchestration module 802 comprises:
And the interface association unit is used for determining an interface request mode corresponding to at least one application program interface based on the interface document.
And the interface parameter setting unit is used for determining interface parameters by utilizing a preset parameter list based on at least one application program interface.
And the interface response setting unit is used for determining interface state code information and interface return data corresponding to at least one application program interface based on the interface document.
The test task creation unit is connected with the interface association unit, the interface parameter setting unit and the interface response setting unit and is used for creating an interface test task based on the interface request mode, the interface parameter, the interface state code information and the interface return data.
In some alternative embodiments, the test task execution module 803 includes:
a generating unit, configured to input the interface request mode and the interface parameters into the at least one application program interface and generate current interface state code information and current interface return data;
a first comparison unit, configured to compare the current interface state code information with the preset interface state code information and detect the data structure in the current interface return data; and
an ending unit, configured to end the interface test task and generate an automated test result report when the current interface state code information is different from the preset interface state code information and the data structure in the current interface return data is empty.
In some optional embodiments, the ending unit is further configured to compare the current interface state code information with a preset abnormal value, and to end the interface test task when the current interface state code information equals the preset abnormal value and the data structure in the current interface return data is empty.
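For illustration, the checks performed by the first comparison unit and the ending unit may be sketched as follows; the use of the requests library, the assumed base URL, and the default abnormal value of 500 are assumptions made only for this example.

```python
import requests

def run_stage(task: dict, preset_abnormal_value: int = 500) -> dict:
    """Execute one stage, compare the current state code with the preset values,
    and check whether the returned data structure is empty."""
    # Assumed base URL of the system under test.
    resp = requests.request(task["request_method"],
                            "http://localhost:8080" + task["url"],
                            json=task["params"])
    current_state_code = resp.status_code
    current_return_data = resp.json() if resp.content else {}

    state_code_differs = current_state_code != task["preset_state_code"]
    is_abnormal_value = current_state_code == preset_abnormal_value
    data_structure_empty = not current_return_data

    # Per the embodiment, the task ends when the current state code differs from the
    # preset state code (or equals the preset abnormal value) and the returned data
    # structure is empty.
    finished = (state_code_differs or is_abnormal_value) and data_structure_empty
    return {"state_code": current_state_code,
            "return_data": current_return_data,
            "finished": finished}
```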
In some alternative embodiments, the test task execution module 803 further includes:
an automated test result acquisition unit, configured to acquire, when the interface test task is a multi-stage test task, an automated test result corresponding to the current-stage test task; and
an interrupt unit, configured to interrupt execution of the interface test task when the automated test result corresponding to the current-stage test task is abnormal, generate an interrupt result, and display the interrupt result through the client.
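A minimal sketch of multi-stage execution with interruption on an abnormal stage result is given below; it reuses the run_stage helper from the previous sketch and treats a finished (failed) stage as abnormal, which is an assumption for this example.

```python
def run_multi_stage(tasks: list[dict]) -> dict:
    """Execute a multi-stage interface test task and interrupt on an abnormal stage."""
    stage_results = []
    for index, task in enumerate(tasks):
        stage_result = run_stage(task)
        stage_results.append(stage_result)
        if stage_result["finished"]:
            # Abnormal automated test result for the current stage: interrupt execution
            # and return an interrupt result for display on the client.
            return {"interrupted": True, "stage": index, "results": stage_results}
    return {"interrupted": False, "results": stage_results}
```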
In some alternative embodiments, the test task orchestration module 802 further includes:
an interface parameter association setting unit 1025, connected to the interface response setting unit 1023 and configured to obtain the field number of the previous stage, extract the field number of the current stage from the interface return data, and associate the field number of the current stage with the field number of the previous stage.
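For example only, associating a field returned by the previous stage with a parameter of the current stage could be sketched as follows; the field names "id" and "user_id" are invented for the example.

```python
def associate_stage_fields(previous_return_data: dict, current_task: dict,
                           previous_field: str, current_param: str) -> dict:
    """Feed the field returned by the previous stage into the current stage's parameters."""
    value = previous_return_data.get(previous_field)   # field of the previous stage
    current_task["params"][current_param] = value      # field of the current stage
    return current_task

# e.g. pass the "id" returned by a create-user stage into the query-user stage:
# next_task = associate_stage_fields(stage_result["return_data"], query_task, "id", "user_id")
```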
In some alternative embodiments, the system further includes:
a triggering unit, configured to acquire user operation data or preset-period trigger data uploaded by the client, execute the interface test task based on the user operation data or the preset-period trigger data, and generate an automated test result report.
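As an illustrative sketch only, a preset-period trigger could be realized with a simple scheduler loop; the hourly interval and the run_interface_test_tasks callable are assumptions.

```python
import time

def periodic_trigger(run_interface_test_tasks, interval_seconds: int = 3600) -> None:
    """Trigger the orchestrated interface test tasks on a preset period."""
    while True:
        report = run_interface_test_tasks()     # executes the interface test task
        print("automated test result report:", report)
        time.sleep(interval_seconds)            # wait for the next preset period
```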
In some alternative embodiments, the system further includes:
a display unit, configured to send the automated test result report to the client for visual display so that a user can arrange test result feedback information based on the automated test result report; the automated test result report includes the total execution result of the interface test task, the total execution time of the interface test task, the execution time of each stage of the interface test task, the execution result of each stage of the interface test task, and the coverage rate of the interface test.
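The report fields enumerated above could be organized as in the following illustrative structure; all field names and values are assumptions for the example.

```python
# Hypothetical shape of an automated test result report sent to the client.
automated_test_result_report = {
    "total_result": "failed",            # total execution result of the interface test task
    "total_elapsed_ms": 1840,            # total execution time of the interface test task
    "stages": [
        {"name": "create user", "result": "passed", "elapsed_ms": 420},
        {"name": "query user",  "result": "failed", "elapsed_ms": 1420},
    ],                                   # execution time and result of each stage
    "interface_coverage": 0.75,          # coverage rate of the interface test
}
```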
In some alternative embodiments, the system further includes:
a test task management module, configured to acquire the test result feedback information uploaded by the user and manage the interface test task based on the test result feedback information.
In some alternative embodiments, the test task management module includes:
an interface test task updating unit, configured to update the interface test task based on the test result feedback information and generate an updated interface test task; and
an interface test task deleting unit, configured to delete the interface test task and the automated test result report based on the test result feedback information.
In some optional embodiments, the interface test task updating unit is further configured to execute the updated interface test task, generate an interface test modification execution result, and send the interface test modification execution result to the client for visual display.
In some alternative embodiments, the system further includes:
an editing unit, configured to generate an interface file conforming to the open application programming interface specification by using an interface document generating tool and display the interface file on the client; and
an interface file acquisition unit, configured to acquire the interface file conforming to the open application programming interface specification after it has been edited by the user, and to use the edited interface file as the interface document.
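For reference, a minimal fragment of an interface file conforming to the open application programming interface specification is shown below, expressed as a Python dict for consistency with the other sketches; the concrete title, path and response are invented for illustration.

```python
# Minimal OpenAPI 3.0-style interface document fragment (illustrative content only).
minimal_interface_document = {
    "openapi": "3.0.0",
    "info": {"title": "demo service", "version": "1.0.0"},
    "paths": {
        "/api/v1/users": {
            "post": {
                "tags": ["user"],
                "responses": {"201": {"description": "created"}},
            }
        }
    },
}
```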
Further functional descriptions of the above modules and units are the same as those of the corresponding method embodiments and are not repeated here.
The automated interface testing system in this embodiment is presented in the form of functional units, where the units refer to ASIC (Application Specific Integrated Circuit) circuits, processors and memory executing one or more software or firmware programs, and/or other devices that can provide the above-described functionality.
An embodiment of the present invention further provides a computer device equipped with the automated interface testing system shown in fig. 8.
Referring to fig. 9, fig. 9 is a schematic structural diagram of a computer device according to an alternative embodiment of the present invention. As shown in fig. 9, the computer device includes: one or more processors 10, a memory 20, and interfaces for connecting the various components, including high-speed interfaces and low-speed interfaces. The various components are communicatively coupled to each other using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executed within the computer device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output device, such as a display device coupled to an interface. In some alternative embodiments, multiple processors and/or multiple buses may be used, if desired, along with multiple memories. Likewise, multiple computer devices may be connected, each providing a portion of the necessary operations (e.g., as a server array, a set of blade servers, or a multiprocessor system). One processor 10 is illustrated in fig. 9.
The processor 10 may be a central processor, a network processor, or a combination thereof. The processor 10 may further include a hardware chip, among others. The hardware chip may be an application specific integrated circuit, a programmable logic device, or a combination thereof. The programmable logic device may be a complex programmable logic device, a field programmable gate array, a general-purpose array logic, or any combination thereof.
The memory 20 stores instructions executable by the at least one processor 10 to cause the at least one processor 10 to perform the method of the above embodiments.
The memory 20 may include a storage program area that may store an operating system, at least one application program required for functions, and a storage data area; the storage data area may store data created according to the use of the computer device, etc. In addition, the memory 20 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some alternative embodiments, memory 20 may optionally include memory located remotely from processor 10, which may be connected to the computer device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Memory 20 may include volatile memory, such as random access memory; the memory may also include non-volatile memory, such as flash memory, hard disk, or solid state disk; the memory 20 may also comprise a combination of the above types of memories.
The computer device further comprises input means 30 and output means 40. The processor 10, memory 20, input device 30, and output device 40 may be connected by a bus or other means, for example by a bus connection in fig. 9.
The input device 30 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the computer device, and may be, for example, a touch screen, a keypad, a mouse, a trackpad, a touchpad, a pointer stick, one or more mouse buttons, a trackball, or a joystick. The output device 40 may include a display device, auxiliary lighting means (e.g., LEDs), tactile feedback means (e.g., vibration motors), and the like. Such display devices include, but are not limited to, liquid crystal displays, light-emitting diode displays, and plasma displays. In some alternative implementations, the display device may be a touch screen.
An embodiment of the present invention further provides a computer-readable storage medium. The method according to the above embodiments of the present invention may be implemented in hardware or firmware, or as computer code that is recorded on a storage medium, or as computer code that is originally stored in a remote storage medium or a non-transitory machine-readable storage medium and is downloaded over a network to be stored in a local storage medium, so that the method described herein can be processed by software stored on such a storage medium using a general-purpose computer, a dedicated processor, or programmable or dedicated hardware. The storage medium can be a magnetic disk, an optical disk, a read-only memory, a random access memory, a flash memory, a hard disk, a solid state disk, or the like; further, the storage medium may also comprise a combination of memories of the above kinds. It will be appreciated that a computer, processor, microprocessor controller or programmable hardware includes a storage element that can store or receive software or computer code that, when accessed and executed by the computer, processor or hardware, implements the methods illustrated by the above embodiments.
Although embodiments of the present invention have been described in connection with the accompanying drawings, various modifications and variations may be made by those skilled in the art without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope of the invention as defined by the appended claims.

Claims (20)

1. An automated interface testing method, comprising:
acquiring an interface document uploaded by a client, and accessing at least one application program interface based on the interface document; wherein the interface document meets an open application programming interface specification;
creating an interface test task based on the business processing logic corresponding to the at least one application program interface;
and executing the interface test task to generate an automatic test result report.
2. The method of claim 1, wherein the obtaining the interface document uploaded by the client, accessing at least one application program interface based on the interface document, comprises:
extracting target interface information in the interface document, and carrying out grouping search on the target interface information according to preset label grouping;
and selecting the at least one application program interface based on the packet search result, and accessing the at least one application program interface.
3. The method of claim 1, wherein creating an interface test task based on the business processing logic corresponding to the at least one application program interface comprises:
determining an interface request mode corresponding to the at least one application program interface based on the interface document;
determining interface parameters by using a preset parameter list based on the at least one application program interface;
determining preset interface state code information and preset interface return data corresponding to the at least one application program interface based on the interface document;
and creating the interface test task based on the interface request mode, the interface parameters, the preset interface state code information and the preset interface return data.
4. The method of claim 3, wherein the performing the interface test task to generate an automated test result report comprises:
inputting the interface request mode and the interface parameters into the at least one application program interface to generate current interface state code information and current interface return data;
comparing the current interface state code information with the preset interface state code information, and detecting a data structure in the current interface return data;
and when the current interface state code information is different from the preset interface state code information and the data structure in the current interface return data is empty, ending the interface test task and generating the automatic test result report.
5. The method of claim 4, wherein the performing the interface test task to generate an automated test result report further comprises:
comparing the current interface state code information with a preset abnormal value, and ending the interface test task when the current interface state code information is the preset abnormal value and the data structure in the current interface return data is empty.
6. The method of claim 5, wherein the performing the interface test task to generate an automated test result report further comprises:
when the interface test task is a multi-stage test task, acquiring an automatic test result corresponding to the current-stage test task;
when the automatic test result corresponding to the current stage test task is abnormal, interrupting the execution of the interface test task, generating an interrupt result, and displaying the interrupt result through the client.
7. The method of claim 3, wherein creating an interface test task based on the business processing logic corresponding to the at least one application program interface further comprises:
and acquiring a field number of a previous stage, extracting a field number of a current stage in the interface return data, and associating the field number of the current stage with the field number of the previous stage.
8. The method of claim 1, further comprising, prior to said performing said interface test task and generating an automated test result report:
and acquiring user operation data or preset period trigger data uploaded by the client, executing the interface test task based on the user operation data or the preset period trigger data, and generating the automatic test result report.
9. The method as recited in claim 1, further comprising:
the automatic test result report is sent to a client for visual display, so that a user can arrange test result feedback information based on the automatic test result report; the automatic test result report comprises the total execution result of the interface test task, the total execution time of the interface test task, the execution time of each stage of the interface test task, the execution result of each stage of the interface test task and the coverage rate of the interface test.
10. The method as recited in claim 9, further comprising:
and acquiring the feedback information of the test result uploaded by the user, and managing the interface test task based on the feedback information of the test result.
11. The method of claim 10, wherein managing the interface test tasks based on the test result feedback information comprises:
updating the interface test task based on the test result feedback information, and generating an updated interface test task;
and deleting the interface test task and the automated test result report based on the test result feedback information.
12. The method of claim 11, wherein managing the interface test tasks based on the test result feedback information further comprises:
and executing the updated interface test task, generating an interface test modification execution result, and sending the interface test modification execution result to a client for visual display.
13. The method of claim 1, further comprising, prior to the acquiring an interface document uploaded by a client and accessing at least one application program interface based on the interface document:
generating an interface file conforming to an open application programming interface specification by using an interface document generating tool, and displaying the interface file conforming to the open application programming interface specification on the client;
and acquiring an interface file which conforms to the open application programming interface specification after editing by the user, and taking the interface file which conforms to the open application programming interface specification after editing by the user as the interface document.
14. The method of claim 13, wherein the interface document generating tool employs a Swagger tool.
15. An automated interface testing system, comprising:
the application program interface access module is used for acquiring an interface document uploaded by the client and accessing at least one application program interface based on the interface document; wherein the interface document meets an open application programming interface specification;
the test task orchestration module is connected with the application program interface access module and is used for creating an interface test task based on the business processing logic corresponding to the at least one application program interface;
and the test task execution module is connected with the test task orchestration module and is used for executing the interface test task and generating an automated test result report.
16. The system of claim 15, wherein the test task orchestration module comprises:
an interface association unit, configured to determine an interface request manner corresponding to the at least one application program interface based on the interface document;
an interface parameter setting unit, configured to determine an interface parameter by using a preset parameter list based on the at least one application program interface;
an interface response setting unit, configured to determine interface state code information and interface return data corresponding to the at least one application program interface based on the interface document;
and the test task creation unit is connected with the interface association unit, the interface parameter setting unit and the interface response setting unit and is used for creating the interface test task based on the interface request mode, the interface parameter, the interface state code information and the interface return data.
17. The system of claim 15, further comprising:
and the test task management module is used for acquiring the test result feedback information uploaded by the user and managing the interface test task based on the test result feedback information.
18. The system of claim 17, wherein the test task management module comprises:
the interface test task updating unit is used for updating the interface test task based on the test result feedback information and generating an updated interface test task;
And the interface test task deleting unit is used for deleting the interface test task and the automatic test result report form based on the test result feedback information.
19. A computer device, comprising:
a memory and a processor communicatively coupled to each other, the memory having stored therein computer instructions, the processor executing the computer instructions to perform the automated interface testing method of any one of claims 1 to 14.
20. A computer readable storage medium having stored thereon computer instructions for causing a computer to perform the automated interface testing method of any one of claims 1 to 14.
CN202310843177.7A 2023-07-11 2023-07-11 Automatic interface testing method, system, computer equipment and storage medium Pending CN116578497A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310843177.7A CN116578497A (en) 2023-07-11 2023-07-11 Automatic interface testing method, system, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310843177.7A CN116578497A (en) 2023-07-11 2023-07-11 Automatic interface testing method, system, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116578497A true CN116578497A (en) 2023-08-11

Family

ID=87543494

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310843177.7A Pending CN116578497A (en) 2023-07-11 2023-07-11 Automatic interface testing method, system, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116578497A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117171056A (en) * 2023-11-02 2023-12-05 绿城科技产业服务集团有限公司 Test method and device based on automatic interface

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111930635A (en) * 2020-09-17 2020-11-13 深圳微品致远信息科技有限公司 Swagger-based rapid automatic testing method and system
CN112306855A (en) * 2019-08-02 2021-02-02 北大方正集团有限公司 Interface automation test method, device, terminal and storage medium
KR20210090575A (en) * 2020-11-27 2021-07-20 베이징 바이두 넷컴 사이언스 앤 테크놀로지 코., 엘티디. A method, an apparatus, an electronic device, a storage medium and a program for testing code

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112306855A (en) * 2019-08-02 2021-02-02 北大方正集团有限公司 Interface automation test method, device, terminal and storage medium
CN111930635A (en) * 2020-09-17 2020-11-13 深圳微品致远信息科技有限公司 Swagger-based rapid automatic testing method and system
KR20210090575A (en) * 2020-11-27 2021-07-20 베이징 바이두 넷컴 사이언스 앤 테크놀로지 코., 엘티디. A method, an apparatus, an electronic device, a storage medium and a program for testing code

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117171056A (en) * 2023-11-02 2023-12-05 绿城科技产业服务集团有限公司 Test method and device based on automatic interface
CN117171056B (en) * 2023-11-02 2024-01-09 绿城科技产业服务集团有限公司 Test method and device based on automatic interface

Similar Documents

Publication Publication Date Title
US8839107B2 (en) Context based script generation
US8731998B2 (en) Three dimensional visual representation for identifying problems in monitored model oriented business processes
CN111125444A (en) Big data task scheduling management method, device, equipment and storage medium
US20060129609A1 (en) Database synchronization using change log
CN104572327A (en) Method, device and system for processing browser crash
CN103257852B (en) The method and apparatus that a kind of development environment of distribution application system is built
CN110865840B (en) Application management method, device, server and storage medium
US8296723B2 (en) Configurable unified modeling language building blocks
CN116578497A (en) Automatic interface testing method, system, computer equipment and storage medium
CN111782452A (en) Method, system, device and medium for interface contrast test
CN111338931B (en) Buried point testing method, device, equipment and storage medium
CN111552521A (en) Application data reporting method, device, server and storage medium
CN107463391A (en) Task processing method, device and equipment
CN112667795B (en) Dialogue tree construction method and device, dialogue tree operation method, device and system
CN109063040B (en) Client program data acquisition method and system
CN110011827A (en) Towards doctor conjuncted multi-user's big data analysis service system and method
CN115984022B (en) Unified account checking method and device for distributed payment system
CN112559525A (en) Data checking system, method, device and server
CN112395333A (en) Method and device for checking data exception, electronic equipment and storage medium
JP5735998B2 (en) Operation system
CN115543423A (en) Method, device and equipment for generating benchmarks and storage medium
CN114936152A (en) Application testing method and device
EP3018576B1 (en) A method for controlling changes in a computer system
CN113568614A (en) Task issuing method, electronic device and storage medium
CN109525642B (en) LIMS system client data automatic reporting method under user mechanism

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20230811