CN116974882A - Interface testing method and related device - Google Patents

Interface testing method and related device

Info

Publication number
CN116974882A
CN116974882A (application CN202210424945.0A)
Authority
CN
China
Prior art keywords
interface
test
server
testing
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210424945.0A
Other languages
Chinese (zh)
Inventor
庄镛鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202210424945.0A
Publication of CN116974882A
Legal status: Pending


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3696 - Methods or tools to render software testable
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3684 - Test management for test design, e.g. generating new test cases
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3688 - Test management for test execution, e.g. scheduling of test suites
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The embodiments of the application provide an interface testing method and a related device. The method includes: obtaining interface selection information, configured by a testing party through a visual interactive interface, that corresponds to a target online service interface; receiving parameter information entered on the visual interactive interface for testing the target online service interface; upon receiving a test instruction for the target online service interface submitted through the visual interactive interface, calling the target online service interface corresponding to the interface selection information according to the parameter information, so as to test the target online service interface; and displaying the test result of the target online service interface on the visual interactive interface. With the scheme provided by the embodiments, no script needs to be written, which lowers the threshold and cost of interface testing and improves its convenience and efficiency. The embodiments of the application can be applied in scenarios such as cloud technology, artificial intelligence, intelligent transportation, and assisted driving.

Description

Interface testing method and related device
Technical Field
The application relates to the technical field of software testing, in particular to an interface testing method and a related device.
Background
Currently, developers can use tools in a local development environment to test local interfaces.
For an interface provided by an online server system, the existing test scheme mainly relies on manually writing a script and uploading it to the server for execution. The main drawbacks of this approach are: because a script must be written and uploaded by hand for each test, the operation is cumbersome and inefficient; moreover, since each test requires someone with specialized knowledge to write the test script, labor costs are high.
Disclosure of Invention
The embodiments of the application provide an interface testing method and a related device, which, at least to some extent, allow interface testing without writing scripts and improve the efficiency of interface testing.
Other features and advantages of the application will be apparent from the following detailed description, or may be learned by the practice of the application.
According to one aspect of the embodiments of the present application, an interface testing method is provided, including: obtaining interface selection information, configured by a testing party through a visual interactive interface, that corresponds to a target online service interface; receiving parameter information entered on the visual interactive interface for testing the target online service interface; upon receiving a test instruction for the target online service interface submitted through the visual interactive interface, calling the target online service interface corresponding to the interface selection information according to the parameter information, so as to test the target online service interface; and displaying the test result of the target online service interface on the visual interactive interface.
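The four claimed steps (obtain the selection information, receive the parameter information, call the interface on a test instruction, display the result) can be sketched in code. The sketch below is illustrative only; the function and parameter names are assumptions, not taken from the patent:

```python
# Minimal sketch of the four claimed steps. All names and data shapes here
# are illustrative assumptions, not taken from the patent itself.

def run_interface_test(selection, params, invoke):
    """selection: the target online service interface chosen on the UI.
    params: the parameter information entered on the visual interface.
    invoke: a callable that actually calls the selected interface."""
    # Step 3: call the target online service interface with the parameters.
    result = invoke(selection, params)
    # Step 4: package the result so the UI layer can display it.
    return {"interface": selection, "result": result}

# A stubbed interface call standing in for the real online service.
def fake_invoke(selection, params):
    return {"status": 200, "echo": params}

outcome = run_interface_test("/meeting/create", {"topic": "demo"}, fake_invoke)
```

In a real deployment the `invoke` callable would issue an HTTP request to the selected server; here it is stubbed so the control flow can be followed end to end.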
According to one aspect of the embodiments of the present application, an interface testing apparatus is provided, including: an obtaining unit, configured to obtain interface selection information, configured by a testing party through a visual interactive interface, that corresponds to a target online service interface; a receiving unit, configured to receive parameter information entered on the visual interactive interface for testing the target online service interface; a testing unit, configured to, upon receiving a test instruction for the target online service interface submitted through the visual interactive interface, call the target online service interface corresponding to the interface selection information according to the parameter information, so as to test the target online service interface; and a display unit, configured to display the test result of the target online service interface on the visual interactive interface.
In some embodiments of the application, based on the foregoing scheme, the parameter information includes interface request parameters and server selection information indicating which designated server providing the target online service interface is to be tested.
In some embodiments of the application, based on the foregoing scheme, the testing unit is configured to: generate a test task for the designated server according to the parameter information, and add the test task to a message queue; and obtain the test task from the message queue and execute it, so as to test the designated server.
In some embodiments of the present application, based on the foregoing scheme, there are multiple designated servers, and the testing unit is configured to: obtain the test task corresponding to each designated server from the message queue in sequence; and, after each test task is obtained, package its content into an interface call request and send the request to the corresponding designated server, so as to test the multiple designated servers that provide the target online service interface.
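The queue-based dispatch described above can be sketched as follows; the task and request shapes are illustrative assumptions, since the patent does not specify them:

```python
from collections import deque

# Sketch of testing several designated servers: one test task per server is
# taken from the message queue in order, packaged into an interface call
# request, and sent to its server. The request shape is an assumption.

def run_all_tasks(tasks, send):
    message_queue = deque(tasks)  # stands in for the real message queue
    results = []
    while message_queue:
        task = message_queue.popleft()
        # Package the task content into an interface call request.
        request = {"url": f"http://{task['server']}{task['path']}",
                   "body": task["params"]}
        results.append(send(request))
    return results

tasks = [
    {"server": "10.0.0.1:80", "path": "/meeting/create", "params": {"topic": "a"}},
    {"server": "10.0.0.2:80", "path": "/meeting/create", "params": {"topic": "b"}},
]
# The send callable is stubbed to just echo the target URL.
responses = run_all_tasks(tasks, lambda req: req["url"])
```

A production version would use a real message broker and an HTTP client in place of the deque and the stubbed `send`; the ordering guarantee (tasks processed in sequence) is what the sketch demonstrates.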
In some embodiments of the application, based on the foregoing scheme, the obtaining unit is further configured to: obtain the interface protocol definition file corresponding to each online service interface, where each interface protocol definition file is generated from the service code of the corresponding online service interface and contains interface information; and, after the interface selection information corresponding to the target online service interface configured by the testing party through the visual interactive interface is obtained, parse the interface protocol definition file corresponding to the target online service interface to obtain the interface information. The display unit is further configured to display the interface information on the visual interactive interface.
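The patent does not name a concrete format for the interface protocol definition file; assuming a small JSON shape purely for illustration, parsing out the interface information might look like:

```python
import json

# The patent does not specify the definition file format; this sketch assumes
# a minimal JSON layout for illustration only.

definition_text = json.dumps({
    "interfaces": [
        {"method": "POST", "path": "/meeting/create", "name": "create meeting"},
        {"method": "POST", "path": "/meeting/query", "name": "query meeting"},
    ]
})

def parse_interface_info(text):
    # Parse the definition file and pull out the interface information
    # (request method, path, name) later shown on the visual interface.
    doc = json.loads(text)
    return [(i["method"], i["path"], i["name"]) for i in doc["interfaces"]]

info = parse_interface_info(definition_text)
```

In practice the definition file might instead be an IDL such as a protobuf or OpenAPI document generated from the service code; only the parse-then-display flow is taken from the text above.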
In some embodiments of the present application, based on the foregoing scheme, the parameter information further includes script configuration parameters, and the obtaining unit is further configured to: parse the interface protocol definition file corresponding to each online service interface to obtain the interface information in each file; and generate an original performance test script for each online service interface according to a script specification and the interface information in its interface protocol definition file, where the original performance test script contains at least one reserved configuration item. The testing unit is configured to: obtain the test task from the message queue; determine the corresponding original performance test script according to the content of the test task, and fill the script configuration parameters into the reserved configuration items of the original performance test script to obtain a complete performance test script; and perform a performance test on the designated server by executing the complete performance test script.
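Filling the reserved configuration items of an original performance test script can be sketched with simple template substitution; the template text and parameter names below are assumptions for illustration, not the patent's script specification:

```python
from string import Template

# Sketch of filling "reserved configuration items" in an original performance
# test script. The template body and parameter names are assumptions.

ORIGINAL_SCRIPT = Template(
    "threads=$threads\nduration=$duration\ntarget=$target\n"
)

def complete_script(script_config):
    # Fill the reserved items with the user's script configuration
    # parameters to obtain a complete, executable performance test script.
    return ORIGINAL_SCRIPT.substitute(script_config)

script = complete_script({"threads": 50, "duration": 60,
                          "target": "/meeting/create"})
```

`Template.substitute` raises `KeyError` if a reserved item is left unfilled, which is a reasonable way to reject an incomplete configuration before execution.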
In some embodiments of the application, based on the foregoing scheme, the testing unit is configured to: determine a target execution machine in an idle state; and send the complete performance test script to the target execution machine, so that the performance test on the designated server is performed by executing the complete performance test script on the target execution machine.
In some embodiments of the present application, based on the foregoing scheme, there are multiple designated servers, and the testing unit is configured to: execute the complete performance test script on the target execution machine to trigger sending a test request to a load-balancing server, where the test request is used to perform a performance test on the designated server selected by the load-balancing server according to its load-balancing strategy.
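Selecting an idle execution machine and dispatching the complete script to it can be sketched as follows; the machine names, the state model, and the `run_on` callable are illustrative stubs, not the patent's implementation:

```python
# Sketch of dispatching the complete performance test script to an idle
# execution machine. In the multi-server case, executing the script would
# send requests through the load-balancing server, which picks the
# designated server under test; that part is stubbed by run_on here.

def pick_idle_machine(machines):
    # Determine a target execution machine currently in an idle state.
    for name, state in machines.items():
        if state == "idle":
            return name
    return None

def dispatch(machines, script, run_on):
    target = pick_idle_machine(machines)
    if target is None:
        raise RuntimeError("no idle execution machine available")
    machines[target] = "busy"  # mark it taken before executing
    return run_on(target, script)

machines = {"exec-1": "busy", "exec-2": "idle"}
result = dispatch(machines, "script-body", lambda m, s: (m, len(s)))
```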
In some embodiments of the application, based on the foregoing scheme, the display unit is configured to: poll the target execution machine periodically to determine whether the complete performance test script has finished executing; if it has, obtain the test result from the target execution machine; and output the test result through the visual interactive interface.
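The periodic polling of the execution machine can be sketched as below; the `ExecutionMachine` class is an illustrative stand-in for a real remote host, and the result fields are invented for the example:

```python
# Sketch of polling the execution machine until the performance script
# finishes, then fetching the result. ExecutionMachine is a local stub;
# a real system would poll over the network on a timer.

class ExecutionMachine:
    def __init__(self, ticks_until_done, result):
        self._ticks = ticks_until_done
        self._result = result

    def is_finished(self):
        # Each poll brings the stub one step closer to completion.
        self._ticks -= 1
        return self._ticks <= 0

    def fetch_result(self):
        return self._result

def poll_for_result(machine, max_polls=10):
    # Poll until the script has finished, then fetch the test result.
    for _ in range(max_polls):
        if machine.is_finished():
            return machine.fetch_result()
    return None  # still running after max_polls checks

result = poll_for_result(ExecutionMachine(3, {"qps": 1200, "p99_ms": 45}))
```

Bounding the loop with `max_polls` (plus a sleep between polls in a real client) avoids hanging forever on a stuck execution machine.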
According to an aspect of the embodiments of the present application, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements an interface test method as described in the above embodiments.
According to an aspect of an embodiment of the present application, there is provided an electronic apparatus including: one or more processors; and a storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the interface testing method as described in the above embodiments.
According to one aspect of the embodiments of the present application, a computer program product is provided, including computer instructions stored in a computer-readable storage medium; a processor of a computer device reads the computer instructions from the storage medium and executes them, causing the computer device to perform the interface testing method described in the above embodiments.
In the technical scheme provided by some embodiments of the present application, the whole interface test is driven by the tester's interaction with the visual interactive interface: the tester configures the interface selection information, enters the parameter information for testing the target online service interface, and submits the corresponding test instruction; upon receiving the test instruction, the platform automatically calls the target online service interface according to the parameter information, thereby testing it, and finally outputs the corresponding test result through the visual interactive interface. The tester therefore needs only a few interactions with the visual interactive interface to test an interface. This makes interface testing visual, requires no test script anywhere in the flow, and makes the test process convenient and fast, greatly lowering the threshold of interface testing, noticeably improving its efficiency, and reducing the cost of testing.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application. It is evident that the drawings in the following description are only some embodiments of the present application and that other drawings may be obtained from these drawings without inventive effort for a person of ordinary skill in the art. In the drawings:
FIG. 1 shows a schematic overall architecture of a technical solution of an embodiment of the present application;
FIG. 2 illustrates a flow chart of an interface test method according to one embodiment of the application;
FIG. 3 shows a block diagram of the architecture of the solution of an embodiment of the application;
FIG. 4 shows a schematic diagram of a visual interactive interface for interface debugging according to one embodiment of the present application;
FIG. 5 shows a schematic diagram of a visual interactive interface for performance testing according to an embodiment of the present application;
FIG. 6 illustrates a flow diagram for displaying interface information through a visual interactive interface, according to an embodiment of the application;
FIG. 7 shows a flowchart of the details of step 230 of FIG. 2, according to one embodiment of the application;
FIG. 8 illustrates a flow chart showing display of test results for a service interface on a target line on a visual interactive interface in accordance with one embodiment of the present application;
FIG. 9 shows a schematic diagram of a visual interactive interface graphically displaying performance test results, according to one embodiment of the present application;
FIG. 10 illustrates an interactive flow diagram before starting debugging and after starting single instance debugging, according to one embodiment of the present application;
FIG. 11 illustrates an interactive flow diagram after initiating a service full debug and after requesting to view the full debug results, according to one embodiment of the present application;
FIG. 12 illustrates an interactive flow diagram before and after initiation of performance testing, according to one embodiment of the application;
FIG. 13 illustrates an interactive flow diagram of asynchronously performing performance tests and asynchronously polling test results, according to one embodiment of the application;
FIG. 14 illustrates an interactive flow diagram of query performance test results according to one embodiment of the application;
FIG. 15 shows a block diagram of an interface test apparatus according to one embodiment of the application;
Fig. 16 shows a schematic diagram of a computer system suitable for use in implementing an embodiment of the application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the application may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known methods, devices, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the application.
The block diagrams depicted in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, the functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only, and do not necessarily include all of the elements and operations/steps, nor must they be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
Interface testing may include interface debugging and performance testing of a server providing the interface.
At present, when performing interface debugging and performance testing on a local interface, a developer can operate conveniently and quickly with visual tools in the local development environment; for example, interface debugging can be done with the Postman tool, and performance testing with the JMeter tool.
However, for services deployed to an online environment, since the system of online servers typically has no UI interface, it is not possible to install and use a visual tool. In the related art, when a developer performs interface debugging and performance testing on an online interface, a manner of writing a script and uploading the script to a server is often needed to perform testing operation.
The current online interface debugging approach can meet basic debugging needs, but the developer must write and run a complex debugging script on the server. Such scripts are typically discarded after use, and if a script is not saved properly it must be rewritten for the next debugging session, causing repeated work. Meanwhile, for large-scale online services, such as a service running on hundreds of servers, debugging the interfaces and verifying the availability of all servers would require modifying hundreds of scripts, putting great pressure on developers.
The online performance testing approach currently adopted likewise writes a script and uploads it to a server for execution, which can meet basic performance testing needs. However, the developer must master specialized performance testing knowledge to write the performance test script, and uploading the script to the server for execution is cumbersome. As for the results, they can only be viewed as plain numeric output on the server, without rich chart display.
Therefore, the application provides an interface testing method that overcomes the above defects: it makes interface debugging and performance testing of the servers providing an interface a visual operation, automatically generates the test script so that none needs to be written by hand, allows large-scale clusters of interface-providing servers to be tested quickly without specialized testing knowledge, improves the efficiency of interface testing, and reduces the cost of testing.
Fig. 1 shows a schematic diagram of an exemplary system architecture to which the technical solution of an embodiment of the present application may be applied. As shown in fig. 1, the system architecture 100 may include: a user terminal 110, a system server 120, and a server cluster 130, where the server cluster 130 specifically includes a first server 131, a second server 132, and a third server 133. The servers in the server cluster 130 provide an interface outwards, which other terminals or servers can invoke by accessing them. Communication connections are established between the user terminal 110 and the system server 120, and between each server in the server cluster 130 and the system server 120. An interface debugging and performance testing automation platform is deployed on the system server 120, and a client capable of communicating with that platform is deployed on the user terminal 110. When the interface testing method provided by the present application is applied to the system architecture shown in fig. 1, one possible flow is as follows: first, the user accesses the interface debugging and performance testing automation platform on the system server 120 through the client of the user terminal 110 and opens the interface debugging page returned by the platform, which contains a button for initiating interface debugging; the user then configures the relevant information in the interface debugging page, specifically selecting the interface to be debugged, entering the interface request parameters for debugging it, and selecting the designated server to be debugged; next, after the user clicks the button for initiating interface debugging, the user terminal 110 sends the configured information to the platform on the system server 120, and the platform initiates interface debugging against the designated server in the server cluster 130; the platform then receives the interface response information and generates the corresponding debugging result; finally, the system server 120 returns the debugging result to the user terminal 110, where it is displayed in the interface debugging page.
In some embodiments of the present application, the debug results are graphically presented in an interface debug interface.
In some embodiments of the present application, the user accesses the interface debugging and performance testing automation platform on the system server 120 through the client of the user terminal 110 and may also open a performance test page provided by the platform. In the performance test page, the user can select the interface to be tested, enter the interface request parameters and script configuration parameters for testing it, and then select the designated server on which the performance test is to be run. After receiving this information, the platform on the system server 120 automatically generates the corresponding performance test script and performs the performance test on the designated server by executing that script.
It should be understood that the number of user terminals, system servers, and servers in a server cluster in fig. 1 is merely illustrative. There may be any number of user terminals, system servers, and any number of servers in a server cluster, as desired. For example, the system server may be a server cluster formed by a plurality of servers, and the number of servers in the server cluster may be less than three or more than three.
It should be noted that fig. 1 shows only one embodiment of the present application. Although in the embodiment of fig. 1 both the implementation terminal and the devices providing the interface are servers, in other embodiments they may be various terminal devices such as desktops, notebooks, iPads, smartphones, and vehicle-mounted terminals. Although in the embodiment of fig. 1 the performance test script is executed on the interface debugging and performance testing automation platform itself, in other embodiments the platform may instead send the generated script to an execution machine, which executes it to perform the performance test on the designated server. Although in the embodiment of fig. 1 each server in the server cluster 130 provides the same interface outwards, in other embodiments the servers may provide different interfaces, and some servers may provide multiple interfaces at once. None of this limits the embodiments or the scope of the application.
It is easy to understand that the interface testing method provided by the embodiment of the present application is generally executed by a server, and accordingly, the interface testing device is generally disposed in the server. However, in other embodiments of the present application, the terminal device may also have a similar function as the server, so as to execute the interface test scheme provided by the embodiment of the present application.
Therefore, the embodiments of the application can be applied to a terminal or a server. The server may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, and basic cloud computing services such as big data and artificial intelligence platforms. The terminal may be, but is not limited to, a smartphone, tablet computer, notebook computer, desktop computer, smart speaker, or smart watch. The terminal and the server may be connected directly or indirectly through wired or wireless communication; the present application is not limited in this respect.
The implementation details of the technical scheme of the embodiment of the application are described in detail below:
Fig. 2 shows a flow chart of an interface test method according to an embodiment of the application, which may be used in particular for interface debugging or performance testing of a server providing an interface. The interface testing method can be performed by various devices capable of calculating and processing, such as a user terminal or a cloud server, wherein the user terminal comprises, but is not limited to, a mobile phone, a computer, an intelligent voice interaction device, an intelligent household appliance, a vehicle-mounted terminal, a wearable device and the like. The embodiment of the application can be applied to various scenes, including but not limited to cloud technology, artificial intelligence, intelligent transportation, auxiliary driving and the like. Referring to fig. 2, the interface testing method at least includes the following steps:
in step 210, interface selection information configured by the testing party through the visual interaction interface and corresponding to the service interface on the target line is obtained.
The tester may be any user able to perform the interface test, for example a tester or developer of the interface. The interface may be an API; an API (Application Programming Interface) is a set of predefined functions whose purpose is to give applications and developers the ability to access a set of routines of some software or hardware without accessing source code or understanding the details of the internal mechanisms.
The visual interactive interface may be a Web page designed using technologies such as HTML (HyperText Markup Language), CSS (Cascading Style Sheets), and JavaScript. The visual interactive interface is displayed in a client capable of accessing Web pages, such as the browser used by the tester.
The interface selection information may be basic information of an interface, such as its name, identifier, or description; by configuring the interface selection information, the tester indicates which interface is to be tested.
In one embodiment of the present application, obtaining the interface selection information configured by the testing party through the visual interactive interface and corresponding to the target online service interface includes: obtaining interface selection information, configured by the tester through a designated service entry of the visual interactive interface, that corresponds to a target online service interface provided by the designated service.
The scheme of the embodiments of the present application will be described in detail with reference to fig. 3, which shows an architecture block diagram of the technical solution. Referring to fig. 3, the architecture is divided into an access layer, a logic layer, and a data layer, with the logic layer as the core. The logic layer includes the interface debugging and performance testing automation platform, an execution machine cluster, an Nginx cluster, and a business service cluster; the interface debugging and performance testing automation platform can serve as the execution body of the embodiments of the present application. The user terminal accesses the platform through the Nginx service in the access layer and thereby obtains the platform's visual page, which is the visual interactive interface of step 210; the user interacts with this page to configure the information related to interface debugging or performance testing and thereby initiate it. The business service cluster contains multiple service instances, each of which can be deployed on one server and can provide one or more interfaces; different instances may expose the same interface outwards or different ones. Other terminal devices normally access the business service cluster through a gateway, thereby calling the interfaces provided by its service instances.
The interface debugging and performance testing automation platform comprises an interface debugging module and a performance testing module, wherein the interface debugging operation can be carried out through the interface debugging module, and the performance testing operation can be carried out on a server providing an interface through the performance testing module.
In the interface debugging page, the user may select an online service interface that requires interface debugging. FIG. 4 shows a schematic diagram of a visual interactive interface for interface debugging, according to an embodiment of the present application. Referring to fig. 4, when the user clicks a conference control service in the service list, the conference control service entry is entered; when the user then clicks the interface debugging control, an interface debugging sub-interface is entered, in which basic information of a plurality of conference interfaces is displayed, the basic information of a conference interface serving as the interface selection information. The basic information of a conference interface may include the http method of the interface request, the interface path, and the interface name. For example, fig. 4 shows that, in the basic information of the first conference interface, the http method of the interface request is POST, the interface path is "/metering/create", and the interface name is "create conference"; in the basic information of the second conference interface, the http method of the interface request is POST, the interface path is "/metering/query", and the interface name is "query conference". When the user clicks the basic information of a certain conference interface, the debugging entry of that conference interface is expanded; when the user configures the corresponding parameter information in the debugging entry and submits a test instruction, the configuration of the corresponding interface selection information is realized automatically.
Therefore, the tester can configure interface selection information corresponding to a plurality of online service interfaces through one visual interactive interface. The create conference interface and the query conference interface both belong to the conference control service, so in the embodiment of fig. 3 they can be provided simultaneously by one service instance; all of the service instances in the embodiment of fig. 3 may be conference control service instances that provide both the create conference interface and the query conference interface. FIG. 5 shows a schematic diagram of a visual interactive interface for performance testing according to an embodiment of the present application. Referring to fig. 5, when the user clicks the "performance test" control in the conference control service entry, a performance test sub-interface is entered, in which performance test operations on an interface may be initiated. The performance test sub-interface likewise displays basic information of a plurality of conference interfaces, and when the user clicks the basic information of a certain conference interface, the performance test entry of that conference interface is expanded. Fig. 5 shows the performance test entry of the query conference interface opened for the user. When the user configures the corresponding parameter information in the performance test entry of the query conference interface and submits a test instruction, the interface selection information of the query conference interface is configured automatically.
FIG. 6 illustrates a flow diagram for displaying interface information through a visual interactive interface, according to an embodiment of the application. Referring to fig. 6, the interface testing method may further include the following steps:
in step 610, an interface protocol definition file corresponding to each on-line service interface is obtained, the interface protocol definition file being generated according to a service code of the corresponding on-line service interface, the interface protocol definition file including interface information.
In one embodiment of the application, the interface protocol definition file corresponding to each online service interface is generated from the annotations added in the service code by a tool dependency library, introduced into the service code, when the service code is built.
Specifically, a tool dependency library such as Swagger can be introduced into the service code, and annotations carrying information such as the interface name, interface description, http method of the interface request, interface request address, interface request parameter definitions, and interface response parameter definitions are added to the interface code of the service code. After the annotations are added, Swagger automatically generates an interface protocol definition file when the service code is built, the file containing the information of all interfaces.
Swagger is a normative and complete framework that removes manual effort from API documentation, providing a series of solutions for generating, visualizing, and maintaining API documents.
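As an illustration of this generation-and-parsing idea, the following sketch pulls the basic interface information out of a minimal Swagger/OpenAPI 2.0-style document. The paths and summaries echo the fig. 4 example, but the document itself and the function name are hypothetical, not the platform's actual code:

```python
import json

# Minimal OpenAPI/Swagger 2.0-style document standing in for the
# interface protocol definition file (hypothetical content).
SWAGGER_DOC = json.dumps({
    "swagger": "2.0",
    "paths": {
        "/metering/create": {"post": {"summary": "create conference"}},
        "/metering/query": {"post": {"summary": "query conference"}},
    },
})

def extract_interface_selection_info(doc_text: str):
    """Parse the protocol definition file and collect the basic
    (selection) information of each interface: http method, path, name."""
    doc = json.loads(doc_text)
    info = []
    for path, methods in doc.get("paths", {}).items():
        for method, spec in methods.items():
            info.append({
                "method": method.upper(),
                "path": path,
                "name": spec.get("summary", ""),
            })
    return info

interfaces = extract_interface_selection_info(SWAGGER_DOC)
```

Parsed this way, each entry corresponds to one row of basic information shown in the interface list of fig. 4.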
In one embodiment of the present application, the interface testing method further includes: parsing the interface protocol definition file corresponding to each online service interface to obtain the interface selection information contained in the interface information of each interface protocol definition file; storing the interface protocol definition file on a disk; and storing the interface selection information, together with the on-disk address of the interface protocol definition file, into the corresponding interface record of a database.
The interface selection information may be the basic information of the interface, and the interface information may be the detailed information of the interface, which includes the interface selection information. Specifically, the interface information may include various information such as the interface name, interface description, http method of the interface request, interface request address, and the like.
Referring to fig. 3, the data layer includes a database and a disk; the database is used for storing the interface selection information and the on-disk address of the interface protocol definition file, and the disk is used for storing the interface protocol definition file itself.
The database and the disk can be located on terminal equipment where the interface debugging and performance testing automation platform is located or terminal equipment which can be accessed by the interface debugging and performance testing automation platform.
According to the embodiment of the application, storing the interface selection information in the database ensures configuration efficiency when the user configures the interface selection information through the visual interactive interface.
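The record layout described above can be sketched as follows. SQLite stands in for whatever database the platform actually uses, and the table schema and file path are hypothetical:

```python
import sqlite3

# Hypothetical sketch: each interface record stores the selection
# information plus the on-disk address of the protocol definition file.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE interface_record (
        method TEXT, path TEXT, name TEXT,
        protocol_file_addr TEXT  -- where the definition file sits on disk
    )""")
conn.execute(
    "INSERT INTO interface_record VALUES (?, ?, ?, ?)",
    ("POST", "/metering/query", "query conference",
     "/data/protocols/meeting_svc.json"),  # hypothetical disk address
)

def lookup_protocol_file_addr(method: str, path: str) -> str:
    """Given interface selection information configured on the visual
    interface, return the on-disk address of its protocol file."""
    row = conn.execute(
        "SELECT protocol_file_addr FROM interface_record "
        "WHERE method = ? AND path = ?", (method, path)).fetchone()
    return row[0] if row else ""

addr = lookup_protocol_file_addr("POST", "/metering/query")
```

The lookup mirrors step 620: from the configured selection information to the on-disk address, and from there to the file to be parsed.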
In step 620, after obtaining the interface selection information configured by the testing party through the visual interaction interface and corresponding to the service interface on the target line, the interface protocol definition file corresponding to the service interface on the target line is parsed to obtain the interface information.
Because the interface selection information and the on-disk address of the interface protocol definition file are stored correspondingly in the database, after the interface selection information configured by the testing party is obtained, the on-disk address of the corresponding interface protocol definition file can be looked up; addressing according to that address yields the interface protocol definition file corresponding to the target online service interface, which is then parsed.
The interface protocol definition file can be parsed according to the Swagger protocol file specification, and complete interface information contained in the file can be obtained by parsing the interface protocol definition file. Parsing the interface protocol definition file may be implemented by a Swagger interface protocol parsing module in the interface debugging and performance testing automation platform shown in fig. 3.
In step 630, the interface information is displayed through the visual interactive interface.
When the user selects the interface to be tested, detailed interface information can be further displayed through the visual interaction interface, so that the user can be guided to input parameter information.
With continued reference to fig. 2, in step 220, parameter information for testing the on-target line service interface entered on the visual interactive interface is received.
In one embodiment of the application, the parameter information includes an interface request parameter and server selection information for indicating testing of a designated server providing the service interface on the target line.
The designated server may be one or more servers that provide the on-target line service interface, or may be all servers that provide the on-target line service interface.
The user can input parameter information through a form provided by the visual interaction interface.
The interface request parameter is a parameter that needs to be input to the target online service interface and is used to generate the request message. For example, in the embodiments of fig. 4 and 5, the interface request parameter is the value "12345678" corresponding to the parameter name "meetingid". The visual interactive interface also displays a parameter description corresponding to the request parameter, including the parameter name and its description; this description may belong to the interface information obtained by parsing the interface protocol definition file in the embodiment of fig. 6. The server selection information may be input by the user through a drop-down menu; for example, in the embodiments of fig. 4 and 5, the selection result in the server drop-down menu is ALL, indicating that the user wishes to test all servers providing the conference interface. Of course, in other embodiments of the present application, a specific server or server group may instead be selected in the drop-down menu, so as to test that specific server or server group.
In step 230, when a test instruction for the on-target line service interface submitted through the visual interactive interface is received, the on-target line service interface corresponding to the interface selection information is called according to the parameter information, so as to test the on-target line service interface.
The test instructions, parameter information, and interface selection information may be received together.
A request button is shown in the visual interactive interface of the embodiments of fig. 4 and 5; by clicking the request button, the user submits a test instruction for the corresponding conference interface. The target online service interface may be invoked by encapsulating the interface request parameters as an http request message and sending the http request message to the designated server providing the target online service interface.
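Encapsulating the interface request parameters as an http request message can be sketched as follows. The host address and parameter values mirror the figures; this is an illustrative sketch, not the platform's actual implementation:

```python
import json

def build_http_request_message(host: str, path: str, params: dict) -> str:
    """Encapsulate the interface request parameters as an http POST
    request message (standard HTTP/1.1 framing)."""
    body = json.dumps(params)
    return (
        f"POST {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Content-Type: application/json\r\n"
        f"Content-Length: {len(body)}\r\n"
        f"\r\n"
        f"{body}"
    )

msg = build_http_request_message(
    "192.168.1.102", "/metering/query", {"meetingid": "12345678"})
```

Sending this message to the designated server is what invokes the target online service interface.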
Fig. 7 shows a flowchart of the details of step 230 of fig. 2, according to one embodiment of the application. Referring to fig. 7, step 230 may specifically include the following steps:
in step 231, a test task corresponding to the specified server is generated according to the parameter information, and the test task corresponding to the specified server is added to the message queue.
Specifically, test tasks may be added to the Kafka message queue.
Kafka is an open-source stream processing platform written in the Scala and Java languages. It is a distributed publish-subscribe messaging system that can provide a unified, high-throughput, low-latency platform for real-time data. The data layer shown in fig. 3 also includes a Kafka queue for storing test tasks.
In step 232, test tasks are obtained from the message queue and executed to test the designated server.
In one embodiment of the present application, there are a plurality of designated servers, and obtaining the test task from the message queue and executing the test task to test the designated servers includes: sequentially acquiring the test task corresponding to each designated server from the message queue; and, after each test task is obtained, encapsulating an interface call request according to the content of the test task and sending the interface call request to the corresponding designated server, so as to test the plurality of designated servers providing the target online service interface.
Each time a test task is acquired and executed, the test of the corresponding designated server providing the target online service interface is realized; when all the test tasks in the message queue have been executed, the testing of all the designated servers is complete. The test tasks in the message queue may be executed asynchronously: after a preceding test task in the message queue starts to execute, a following test task may be started even if the preceding one has not yet finished.
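The asynchronous draining of per-server test tasks can be sketched with a plain in-process queue standing in for Kafka. The server IPs are hypothetical and the actual http call is stubbed out:

```python
import queue
import threading

# Stand-in for the Kafka queue: one test task per designated server
# providing the target online service interface (hypothetical IPs).
task_queue = queue.Queue()
for server_ip in ("10.0.0.1", "10.0.0.2", "10.0.0.3"):
    task_queue.put({"server": server_ip, "path": "/metering/query"})

results, lock = [], threading.Lock()

def worker():
    """Drain tasks from the queue; each task is encapsulated into an
    interface call request and 'sent' to its designated server."""
    while True:
        try:
            task = task_queue.get_nowait()
        except queue.Empty:
            return
        # The real platform would send an http request to task["server"];
        # here we only record that the call was made.
        with lock:
            results.append(task["server"])

# Tasks run asynchronously: a later task need not wait for an earlier
# one to finish before starting.
threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

When the queue is empty, every designated server has been exercised once, matching the description above.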
In one embodiment of the present application, the parameter information further includes script configuration parameters, and the interface test method further includes: analyzing the interface protocol definition file corresponding to each online service interface to obtain interface information in each interface protocol definition file; generating an original performance test script corresponding to each online service interface according to the script specification and interface information in each interface protocol definition file, wherein the original performance test script comprises at least one reserved configuration item;
obtaining a test task from the message queue and executing the test task to test the designated server, including: acquiring a test task from a message queue; determining a corresponding original performance test script according to the content of the test task, and filling script configuration parameters into reserved configuration items in the original performance test script to obtain a complete performance test script; and performing performance test on the specified server by executing the complete performance test script.
After the interface protocol definition files are parsed, the interface debugging and performance testing automation platform generates a corresponding original performance test script from the interface information in each interface protocol definition file according to the JMeter script specification. The content of the original performance test script contains the interface information and also contains specific configuration items reserved in placeholder form, for example the number of request threads, the test duration, and the like. The original performance test script may be saved on disk, and the on-disk storage address of the original performance test script may be saved in the corresponding record of the database for subsequent retrieval. The generation of the original performance test script can be realized by the JMeter running script generation module in the interface debugging and performance testing automation platform shown in fig. 3.
Because the test tasks correspond to online service interfaces and the original performance test scripts also correspond to online service interfaces, there is a correspondence between test tasks and original performance test scripts. Referring to fig. 5, when the user requests a performance test on the query conference interface, two script configuration parameters, namely the thread number and the test duration, are also input on the visual interactive interface; when the interface debugging and performance testing automation platform executes the test task, these two script configuration parameters are filled into the placeholder configuration items in the original performance test script, thereby generating the complete performance test script.
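Filling the script configuration parameters into the reserved configuration items can be sketched as follows. The placeholder syntax ({{thread_num}}, {{duration}}) and the JMeter-flavored XML fragment are hypothetical stand-ins, not the platform's actual format:

```python
# The original performance test script reserves configuration items
# as placeholders (hypothetical placeholder syntax).
ORIGINAL_SCRIPT = """<ThreadGroup>
  <stringProp name="num_threads">{{thread_num}}</stringProp>
  <stringProp name="duration">{{duration}}</stringProp>
</ThreadGroup>"""

def fill_script(original: str, config: dict) -> str:
    """Fill the script configuration parameters entered on the visual
    interface into the reserved configuration items."""
    complete = original
    for key, value in config.items():
        complete = complete.replace("{{%s}}" % key, str(value))
    return complete

# Thread number and test duration, as configured in fig. 5.
complete_script = fill_script(
    ORIGINAL_SCRIPT, {"thread_num": 50, "duration": 60})
```

The result is the complete performance test script that is handed to an execution machine.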
In one embodiment of the application, performance testing of a specified server by executing a complete performance test script includes: determining a target execution machine in an idle state; and sending the complete performance test script to the target execution machine so as to perform performance test on the specified server by executing the complete performance test script on the target execution machine.
The target execution machine may be one execution machine in an idle state in an execution machine cluster, and the execution machine cluster may include a plurality of execution machines, and each execution machine may be a server. Referring to fig. 3, the interface debugging and performance testing automation platform includes a performance testing executor management module, which can interact with executors in an executor cluster, in which some executors are in idle state and some executors are in busy state. The target execution machine in an idle state may test a single service instance by executing a full performance test script.
Because a performance test runs for a relatively long time and occupies considerable resources, the embodiment of the application uses the target execution machine to execute the complete performance test script, so that the script execution function is offloaded from the interface debugging and performance testing automation platform, which improves the maintainability of the platform and avoids making it excessively bloated; meanwhile, by providing an execution machine cluster, performance tests can be run in real time even if some execution machines are busy, ensuring the efficiency of performance testing.
In one embodiment of the application, determining a target execution machine in an idle state includes: acquiring CPU utilization rate and memory utilization rate of each execution machine in the execution machine cluster; and determining the target execution machine in an idle state in the execution machine cluster according to the CPU utilization rate and the memory utilization rate of each execution machine.
In one embodiment of the present application, determining a target execution machine in an idle state in the execution machine cluster according to the CPU utilization rate and the memory utilization rate of each execution machine includes: determining an execution machine whose CPU utilization rate is lower than a preset CPU utilization rate threshold and whose memory utilization rate is lower than a preset memory utilization rate threshold as the target execution machine.
In one embodiment of the present application, determining a target execution machine in an idle state in an execution machine cluster according to a CPU utilization rate and a memory utilization rate of each execution machine includes: for each execution machine, determining a weighted sum of CPU utilization rate and memory utilization rate of the execution machine; and determining an execution machine with the weighted sum lower than a preset weighted sum threshold value as a target execution machine.
The weights for CPU utilization and memory utilization may be set based on expert experience.
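The two idle-state criteria above can be sketched as follows; the utilization figures, thresholds, and weights are illustrative values, not ones given by the embodiment:

```python
# Hypothetical execution machine cluster with current utilization rates.
executors = [
    {"ip": "10.1.0.1", "cpu": 0.95, "mem": 0.40},
    {"ip": "10.1.0.2", "cpu": 0.20, "mem": 0.30},
    {"ip": "10.1.0.3", "cpu": 0.50, "mem": 0.90},
]

def idle_by_threshold(machines, cpu_max=0.7, mem_max=0.7):
    """Idle if both CPU and memory utilization are below their thresholds."""
    return [m["ip"] for m in machines
            if m["cpu"] < cpu_max and m["mem"] < mem_max]

def idle_by_weighted_sum(machines, w_cpu=0.6, w_mem=0.4, limit=0.5):
    """Idle if the weighted sum of CPU and memory utilization is low;
    weights would be set from expert experience."""
    return [m["ip"] for m in machines
            if w_cpu * m["cpu"] + w_mem * m["mem"] < limit]

threshold_idle = idle_by_threshold(executors)
weighted_idle = idle_by_weighted_sum(executors)
```

Either list can then be used to pick the target execution machine for the complete performance test script.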
In one embodiment of the application, determining a target execution machine in an idle state includes: transmitting an idle state acquisition request to each execution machine in the execution machine cluster; acquiring idle state signals returned by each execution machine according to the idle state acquisition request; and determining the target execution machine in the idle state according to the idle state signal.
Specifically, each execution machine can determine whether the execution machine is idle according to the CPU utilization rate and the memory utilization rate of the execution machine, and generate a corresponding idle state signal.
In the embodiment of the application, the target execution machine in the idle state can be determined only by sending the idle state acquisition request, other extra calculation is not needed, and the calculation cost can be reduced.
A JMeter tool can be deployed on each execution machine in the execution machine cluster; when the target execution machine executes the complete performance test script, the JMeter tool is started to execute it. JMeter is a Java-based stress testing tool, originally designed for Web application testing and later extended to other testing areas. It can be used to test static and dynamic resources, such as static files, Java Servlets, CGI scripts, Java objects, databases, and FTP servers. JMeter can simulate huge loads on a server, network, or object in order to test their strength and analyze overall performance under different pressure categories.
In one embodiment of the present application, there are a plurality of designated servers, and performing the performance test on the designated servers by executing the complete performance test script on the target execution machine includes: executing the complete performance test script on the target execution machine to trigger sending a test request to a load balancing server, the test request being used for performing the performance test on the designated server selected by the load balancing server according to its load balancing strategy.
Specifically, before the complete performance test script is sent to the target execution machine, a designated server information list containing the information of each designated server is configured on the load balancing server, and the load-balancing IP address of the load balancing server is obtained. According to the load-balancing IP address, a test request can be triggered and sent to the load balancing server. The load balancing server may be an Nginx server.
In one embodiment of the application, the load balancing server selects the designated server as follows: at every preset time interval, determining the number of test requests received by each designated server within that interval; and selecting the designated server that received the smallest number of test requests.
In one embodiment of the application, the load balancing server selects the designated server by: obtaining the concurrency of test requests of each designated server at the current moment; and selecting a designated server with the concurrency of the test requests smaller than a preset concurrency threshold of the test requests.
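The two selection strategies above can be sketched as follows; the per-server request counts, concurrency numbers, and threshold are illustrative:

```python
# Hypothetical per-server statistics kept by the load balancing server.
servers = {
    "10.2.0.1": {"received_in_window": 120, "concurrency": 35},
    "10.2.0.2": {"received_in_window": 80,  "concurrency": 10},
    "10.2.0.3": {"received_in_window": 95,  "concurrency": 60},
}

def pick_least_loaded(stats):
    """Select the designated server with the smallest number of test
    requests received within the preset time window."""
    return min(stats, key=lambda ip: stats[ip]["received_in_window"])

def pick_under_concurrency(stats, threshold=50):
    """Select designated servers whose current test-request concurrency
    is below the preset concurrency threshold."""
    return [ip for ip, s in stats.items() if s["concurrency"] < threshold]

least = pick_least_loaded(servers)
eligible = pick_under_concurrency(servers)
```

The first strategy balances requests over a recent window; the second caps how much pressure any one server is put under at a given moment.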
Nginx is a high-performance HTTP and reverse proxy web server that also provides IMAP/POP3/SMTP services and can be used as a load balancing server.
With continued reference to fig. 3, when a performance test is required on the full set of servers, a server in an idle state in the Nginx cluster may be used as the load balancing server, so as to achieve load balancing among the multiple service instances, ensure test efficiency, and avoid placing excessive test pressure on any single service instance.
With continued reference to fig. 2, in step 240, the results of the test on the on-target line service interface are displayed on the visual interactive interface.
Fig. 4 shows the debugging results and execution status for a plurality of IP addresses, each IP address representing one server; for example, the execution status of the server with IP address 192.168.1.102 is execution failure. Fig. 5 shows the performance test results of a server. The performance test results include a plurality of indexes: the speed, total amount, success number, failure number, success rate, average time consumption, minimum time consumption, maximum time consumption, and the like. The speed, in tps (Transaction Per Second), is the number of test requests sent per second; the total amount is the total number of test requests sent; the success number is the number of test requests that received a successful response, and the failure number is the number that did not; the success rate is the ratio of the success number to the total number of test requests sent; the average time consumption is the average response time of the test requests, while the minimum and maximum time consumption are the shortest and longest response times of the test requests, respectively.
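The indexes above can be computed from raw per-request records as in the following sketch; the request records and test duration are illustrative figures, not ones from the document:

```python
# Each record is (succeeded, elapsed_ms) for one test request.
requests = [(True, 12.0), (True, 30.0), (False, 55.0), (True, 23.0)]
test_duration_s = 2.0  # assumed wall-clock duration of the test

total = len(requests)                              # total amount
success = sum(1 for ok, _ in requests if ok)       # success number
failure = total - success                          # failure number
success_rate = success / total                     # ratio of successes
tps = total / test_duration_s                      # test requests per second
elapsed = [t for _, t in requests]
avg_ms = sum(elapsed) / total                      # average time consumption
min_ms, max_ms = min(elapsed), max(elapsed)        # min / max time consumption
```

These are exactly the columns shown in the data table of fig. 5.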
The test result of each test task can be obtained in a polling mode, and the test result obtained by polling is displayed on the visual interactive interface in real time, so that the execution condition of each server can be clear for a user.
FIG. 8 illustrates a flow chart showing the results of a test on a target line service interface displayed on a visual interactive interface according to one embodiment of the present application. Referring to fig. 8, displaying the test result of the service interface on the target line on the visual interaction interface may specifically include the following steps:
in step 810, the target execution machine is periodically polled to determine if the complete performance test script is complete.
The execution state of the test task can be obtained from the target execution machine, from which it can be determined whether the complete performance test script has finished executing.
In step 820, if the complete performance test script has finished executing, the test result is obtained from the target execution machine.
Because the complete performance test script is executed by the target execution machine, the target execution machine holds the corresponding test result.
In step 830, the test results are output through the visual interactive interface.
And returning the test result to the terminal equipment where the visual interaction interface is located, and displaying the test result on a screen where the visual interaction interface is displayed on the terminal equipment.
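The polling flow of steps 810 and 820 can be sketched as follows. The executor here is a local stub; a real implementation would sleep between polls and talk to a remote execution machine:

```python
class StubExecutor:
    """Stand-in for the target execution machine: reports 'done' after a
    fixed number of polls and then exposes a (hypothetical) test result."""
    def __init__(self, polls_until_done: int):
        self._remaining = polls_until_done
        self.result = {"success_rate": 0.99}

    def is_done(self) -> bool:
        self._remaining -= 1
        return self._remaining <= 0

def poll_for_result(executor, max_polls: int = 10):
    """Periodically ask whether the complete performance test script has
    finished; once it has, pull the test result from the executor."""
    for _ in range(max_polls):
        if executor.is_done():
            return executor.result
    return None  # gave up: script still running after max_polls

result = poll_for_result(StubExecutor(polls_until_done=3))
```

Once the result is pulled, it is what step 830 renders on the visual interactive interface.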
Specifically, the test results may be displayed as a data table, or graphically as a pie chart, bar chart, sector chart, line chart, or the like. Referring to fig. 5, the performance test results are displayed on the visual interactive interface in the form of a data table. FIG. 9 illustrates a schematic diagram of a visual interactive interface graphically displaying performance test results, according to one embodiment of the present application. Referring to fig. 9, the performance test results are displayed graphically on the visual interactive interface: the time-consumption distribution is illustrated with one graph and the rate change curve with another. In fig. 5 and 9, a "download result" button corresponding to the performance test results is also shown on the visual interactive interface; when the user clicks the "download result" button, the performance test results can be exported locally as a picture, document, or the like.
Next, the solution of the embodiment of the present application will be further described with reference to fig. 10 to 14, and the test platform in fig. 10 to 14 is the interface debugging and performance test automation platform in the foregoing embodiment.
FIG. 10 illustrates an interactive flow diagram before starting debugging and after starting single-instance debugging, according to one embodiment of the present application. Referring to fig. 10, the following flow may take place before debugging is started: first, the user clicks the interface to be debugged on the test platform web page, and the test platform web requests the interface details from the test platform; then, the test platform obtains the on-disk address of the interface protocol file by querying the database; next, the test platform retrieves the interface protocol file from the disk according to that address and parses it; finally, the test platform returns the interface details to the test platform web.
After single-instance debugging is started, the interaction flow is as follows: first, the user starts single-instance debugging on the test platform web, and the test platform web sends a request to the test platform to request single-instance debugging of the service; then, the test platform encapsulates the request parameters into an http request and, by sending this http request, issues an interface request to the instance with the designated IP in the business service cluster; the business service cluster then returns an interface response to the test platform; finally, the test platform returns the interface debugging result to the test platform web.
FIG. 11 illustrates an interactive flow diagram after initiating a service full debug and after requesting to view the full debug results, according to one embodiment of the present application. Referring to fig. 11, after the service full debug is started, the following interaction flow may be included: firstly, a user starts full-scale debugging of service on a test platform web, and the test platform web sends a request to the test platform to request full-scale debugging of the service; the test platform then places the debug task in the Kafka queue and returns the task id to the test platform web. Next, full debug is performed asynchronously, including the following interactive flows: the test platform acquires a task from the Kafka and packages an http request according to the task content; then, the test platform requests an interface from a service instance in the service cluster, and the service instance in the service cluster returns an interface response to the test platform; and finally, the test platform writes the result data into a database.
The following interactive flow is included after the request to view the full debug result: firstly, a user requests to view the full-quantity debugging results on a test platform web, and the test platform web sends a request to the test platform to request to view the debugging results; then, the test platform checks result data in the database according to the task id; finally, the test platform returns the full-scale results to the test platform web.
FIG. 12 shows an interactive flow diagram before and after initiation of a performance test, according to one embodiment of the application. Referring to fig. 12, the following interaction flow may take place before the performance test is initiated: first, the user clicks the interface to be debugged on the test platform web page, and the test platform web requests the interface details from the test platform; then, the test platform obtains the on-disk address of the interface protocol file by querying the database; next, the test platform retrieves the interface protocol file from the disk according to that address and parses it; finally, the test platform returns the interface details to the test platform web.
After the performance test is initiated, the following interaction flow may be included: firstly, a user starts performance test on a test platform web, and the test platform web requests performance test from the test platform; then, the test platform puts the performance test task into Kafka; finally, the test platform returns the task id to the test platform web.
FIG. 13 illustrates an interaction flow diagram of asynchronously performing a performance test and asynchronously polling the test results, according to one embodiment of the application. Referring to FIG. 13, asynchronously performing the performance test includes the following interaction flow: first, the test platform obtains a task from Kafka; the test platform then determines whether a full-capacity performance test is to be executed: if so, the nginx configuration is applied; if a single-instance test is to be executed, this step is skipped; next, the test platform acquires an idle execution machine; the test platform then finds the script according to the task content and fills it in; the test platform then uploads the script to an execution machine in the performance test execution machine cluster, the script is executed on that machine, and the execution machine returns a message to the test platform indicating that the script was received successfully; finally, when executing the script, the execution machine initiates a test against the specified ip, or the load-balancing ip, in the business service cluster and receives the test results returned by the business service cluster.
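The branching and executor-acquisition steps above can be sketched as follows. All field names (`mode`, `load_balancing_ip`, `instance_ip`, `idle`) are illustrative assumptions, not the patent's concrete design; the sketch only shows the decision "full-capacity test targets the load-balancing ip, single-instance test targets the instance directly" and the acquisition of an idle execution machine.

```python
def dispatch(task, executors):
    """Pick the test target and an idle execution machine for one task.

    task: dict with "mode" ("full" or "single"), "load_balancing_ip",
    "instance_ip"; executors: list of dicts like {"host": ..., "idle": bool}.
    """
    if task["mode"] == "full":
        # Full-capacity test: apply the nginx config, target the load-balancing ip.
        target_ip = task["load_balancing_ip"]
    else:
        # Single-instance test: skip the nginx step, hit the designated instance.
        target_ip = task["instance_ip"]
    # Acquire an idle execution machine and mark it busy.
    machine = next(e for e in executors if e["idle"])
    machine["idle"] = False
    return machine["host"], target_ip   # next step: fill and upload the script
```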
Asynchronously polling the test results comprises the following interaction flow: first, the test platform asynchronously polls the performance test execution machine cluster for the test results to determine whether the performance test has finished; the performance test execution machine cluster then reports successful execution to the test platform; next, the test platform pulls the execution result file from the performance test execution machine cluster, and the cluster returns the result file to the test platform; finally, the test platform parses the result file and stores the parsed content in a database.
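The polling loop above can be sketched as follows. `check_done`, `pull_result_file`, and `parse` are hypothetical callables standing in for the RPCs to the execution machine cluster and for the platform's result parser; they are assumptions for illustration, not the patent's API.

```python
import time

def poll_results(task_id, check_done, pull_result_file, parse,
                 interval=0.0, max_polls=100):
    """Poll until the performance test finishes, then pull and parse the result file."""
    for _ in range(max_polls):
        if check_done(task_id):              # "has the performance test finished?"
            raw = pull_result_file(task_id)  # pull the execution result file
            return parse(raw)                # parsed content goes into the database
        time.sleep(interval)                 # back off between polls
    raise TimeoutError(f"performance test task {task_id} did not finish")
```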
FIG. 14 illustrates an interaction flow diagram of querying the performance test results, according to one embodiment of the application. Referring to FIG. 14, querying the performance test results includes the following interaction flow: first, the user requests the test results on the test platform web, and the test platform web sends a request to the test platform to query the test results; then, the test platform looks up the results in the database according to the task id and returns the test results to the test platform web; finally, the test platform web constructs charts and displays the test results graphically.
In summary, the interface testing method provided by the embodiments of the present application implements an automated interface debugging and performance testing platform that offers developers a convenient online way to debug interfaces and run performance tests. Developers need not master complex scripting skills or carry out complicated script writing and execution; professional interface testing can be achieved merely by interacting with the visual interface. For large-scale services, a user can click a designated server under the service to debug it, or debug all servers with one click. The solution of the embodiments of the present application therefore greatly lowers the threshold of interface testing, allowing developers to concentrate on service development and actual performance tuning, and markedly improving working efficiency.
The following describes apparatus embodiments of the present application, which may be used to perform the interface testing method of the above embodiments. For details not disclosed in the apparatus embodiments of the present application, please refer to the embodiments of the interface testing method of the present application.
FIG. 15 shows a block diagram of an interface test apparatus according to one embodiment of the application.
Referring to FIG. 15, an interface testing apparatus 1500 according to an embodiment of the present application includes: an acquisition unit 1510, a receiving unit 1520, a testing unit 1530, and a display unit 1540. The acquisition unit 1510 is configured to acquire interface selection information, configured by the testing party through the visual interaction interface, corresponding to the target online service interface; the receiving unit 1520 is configured to receive parameter information input on the visual interaction interface for testing the target online service interface; the testing unit 1530 is configured to, when a test instruction for the target online service interface submitted through the visual interaction interface is received, call the target online service interface corresponding to the interface selection information according to the parameter information, so as to test the target online service interface; and the display unit 1540 is configured to display the test result of the target online service interface on the visual interaction interface.
In some embodiments of the application, based on the foregoing scheme, the parameter information includes an interface request parameter and server selection information indicating that a specified server providing the target online service interface is to be tested.
In some embodiments of the present application, based on the foregoing scheme, the test unit 1530 is configured to: generating a test task corresponding to the designated server according to the parameter information, and adding the test task corresponding to the designated server into a message queue; and acquiring the test task from the message queue, and executing the test task to test the designated server.
In some embodiments of the present application, based on the foregoing scheme, there are a plurality of specified servers, and the testing unit 1530 is configured to: acquire, in sequence from the message queue, the test task corresponding to each specified server; and, after each test task is acquired, package an interface call request according to the content of the test task and send the interface call request to the corresponding specified server, so as to test the plurality of specified servers providing the target online service interface.
In some embodiments of the present application, based on the foregoing scheme, the acquisition unit 1510 is further configured to: acquire the interface protocol definition file corresponding to each online service interface, wherein the interface protocol definition file is generated according to the service code of the corresponding online service interface and includes interface information; and, after the interface selection information corresponding to the target online service interface configured by the testing party through the visual interaction interface is acquired, parse the interface protocol definition file corresponding to the target online service interface to obtain the interface information. The display unit 1540 is further configured to display the interface information through the visual interaction interface.
In some embodiments of the present application, based on the foregoing scheme, the parameter information further includes script configuration parameters, and the acquisition unit 1510 is further configured to: parse the interface protocol definition file corresponding to each online service interface to obtain the interface information in each interface protocol definition file; and generate an original performance test script corresponding to each online service interface according to a script specification and the interface information in each interface protocol definition file, wherein the original performance test script includes at least one reserved configuration item. The testing unit 1530 is configured to: acquire the test task from the message queue; determine the corresponding original performance test script according to the content of the test task, and fill the script configuration parameters into the reserved configuration items in the original performance test script to obtain a complete performance test script; and perform a performance test on the specified server by executing the complete performance test script.
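The two steps above — generating an original script with reserved configuration items from the parsed interface information, then filling in the tester's script configuration parameters to obtain the complete script — can be sketched as follows. The `wrk`-style command line and the `${...}` placeholder syntax are assumptions chosen for illustration; the patent does not prescribe a particular load-testing tool or script format.

```python
def generate_original_script(interface_info):
    """Build an original performance test script from parsed interface information.

    ${THREADS}, ${CONNECTIONS}, and ${DURATION} are the reserved
    configuration items, left unfilled until a test is actually requested.
    """
    return ("wrk -t${THREADS} -c${CONNECTIONS} -d${DURATION} "
            + interface_info["url"])

def fill_script(original_script, config):
    """Fill the reserved configuration items to obtain the complete test script."""
    script = original_script
    for key, value in config.items():
        script = script.replace("${%s}" % key, str(value))
    return script
```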
In some embodiments of the present application, based on the foregoing scheme, the test unit 1530 is configured to: determining a target execution machine in an idle state; and sending the complete performance test script to the target execution machine so as to perform performance test on the specified server by executing the complete performance test script on the target execution machine.
In some embodiments of the present application, based on the foregoing solution, the specified server is plural, and the test unit 1530 is configured to: executing the complete performance test script on the target execution machine to trigger sending a test request to a load balancing server, wherein the test request is used for performing performance test on a designated server selected by the load balancing server according to a load balancing strategy.
In some embodiments of the present application, based on the foregoing scheme, the display unit 1540 is configured to: periodically poll the target execution machine to determine whether the complete performance test script has finished executing; if so, obtain the test result from the target execution machine; and output the test result through the visual interaction interface.
Fig. 16 shows a schematic diagram of a computer system suitable for use in implementing an embodiment of the application.
It should be noted that, the computer system 1600 of the electronic device shown in fig. 16 is only an example, and should not impose any limitation on the functions and the application scope of the embodiments of the present application.
As shown in fig. 16, the computer system 1600 includes a central processing unit (Central Processing Unit, CPU) 1601 that can perform various appropriate actions and processes, such as performing the methods described in the above embodiments, according to a program stored in a Read-Only Memory (ROM) 1602 or a program loaded from a storage section 1608 into a random access Memory (Random Access Memory, RAM) 1603. In the RAM 1603, various programs and data required for system operation are also stored. The CPU 1601, ROM 1602, and RAM 1603 are connected to each other by a bus 1604. An Input/Output (I/O) interface 1605 is also connected to bus 1604.
The following components are connected to the I/O interface 1605: an input section 1606 including a keyboard, a mouse, and the like; an output section 1607 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), a speaker, and the like; a storage section 1608 including a hard disk and the like; and a communication section 1609 including a network interface card such as a LAN (Local Area Network) card or a modem. The communication section 1609 performs communication processing via a network such as the Internet. A drive 1610 is also connected to the I/O interface 1605 as needed. A removable medium 1611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 1610 as needed, so that a computer program read therefrom is installed into the storage section 1608 as needed.
In particular, according to embodiments of the present application, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program containing program code for performing the methods shown in the flowcharts. In such embodiments, the computer program may be downloaded and installed from a network via the communication section 1609, and/or installed from the removable medium 1611. When executed by the Central Processing Unit (CPU) 1601, the computer program performs the various functions defined in the system of the present application.
It should be noted that, the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-Only Memory (ROM), an erasable programmable read-Only Memory (Erasable Programmable Read Only Memory, EPROM), flash Memory, an optical fiber, a portable compact disc read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Where each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented by software or by hardware, and the described units may also be provided in a processor. In some cases, the names of the units do not constitute a limitation on the units themselves.
As an aspect, the present application also provides a computer-readable medium that may be contained in the electronic device described in the above embodiment; or may exist alone without being incorporated into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement the methods described in the above embodiments.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functions of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the application. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
From the above description of the embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in software combined with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions to cause a computing device (which may be a personal computer, a server, a touch terminal, a network device, etc.) to perform the method according to the embodiments of the present application.
It will be appreciated that, in specific embodiments of the present application, where data relating to interface testing is involved, user approval or consent is required when the above embodiments are applied to specific products or technologies, and the collection, use, and processing of the relevant data must comply with the relevant laws, regulations, and standards of the relevant countries and regions.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains.
It is to be understood that the application is not limited to the precise arrangements and instrumentalities shown in the drawings and described above, and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (10)

1. An interface testing method, the method comprising:
acquiring interface selection information configured by a testing party through a visual interaction interface and corresponding to a target online service interface;
receiving parameter information input on the visual interaction interface for testing the target online service interface;
when a test instruction for the target online service interface submitted through the visual interaction interface is received, calling the target online service interface corresponding to the interface selection information according to the parameter information, so as to test the target online service interface;
and displaying a test result of the target online service interface on the visual interaction interface.
2. The interface testing method according to claim 1, wherein the parameter information comprises an interface request parameter and server selection information indicating that a specified server providing the target online service interface is to be tested.
3. The interface testing method according to claim 2, wherein calling the target online service interface corresponding to the interface selection information according to the parameter information to test the target online service interface comprises:
generating a test task corresponding to the specified server according to the parameter information, and adding the test task corresponding to the specified server to a message queue;
and acquiring the test task from the message queue, and executing the test task to test the specified server.
4. The interface testing method according to claim 3, wherein there are a plurality of specified servers, and acquiring the test task from the message queue and executing the test task to test the specified server comprises:
acquiring, in sequence from the message queue, the test task corresponding to each specified server;
and, after each test task is acquired, packaging an interface call request according to the content of the test task, and sending the interface call request to the corresponding specified server, so as to test the plurality of specified servers providing the target online service interface.
5. The interface testing method of claim 1, further comprising:
acquiring an interface protocol definition file corresponding to each online service interface, wherein the interface protocol definition file is generated according to the service code of the corresponding online service interface and comprises interface information;
after the interface selection information corresponding to the target online service interface configured by the testing party through the visual interaction interface is acquired, parsing the interface protocol definition file corresponding to the target online service interface to obtain the interface information;
and displaying the interface information through the visual interaction interface.
6. The interface testing method of claim 3, wherein the parameter information further comprises script configuration parameters, the method further comprising:
parsing the interface protocol definition file corresponding to each online service interface to obtain the interface information in each interface protocol definition file;
generating an original performance test script corresponding to each online service interface according to a script specification and the interface information in each interface protocol definition file, wherein the original performance test script comprises at least one reserved configuration item;
wherein acquiring the test task from the message queue and executing the test task to test the specified server comprises:
acquiring the test task from the message queue;
determining the corresponding original performance test script according to the content of the test task, and filling the script configuration parameters into the reserved configuration items in the original performance test script to obtain a complete performance test script;
and performing a performance test on the specified server by executing the complete performance test script.
7. The interface testing method according to claim 6, wherein performing the performance test on the specified server by executing the complete performance test script comprises:
determining a target execution machine in an idle state;
and sending the complete performance test script to the target execution machine, so as to perform the performance test on the specified server by executing the complete performance test script on the target execution machine.
8. The interface testing method of claim 7, wherein there are a plurality of specified servers, and performing the performance test on the specified server by executing the complete performance test script on the target execution machine comprises:
executing the complete performance test script on the target execution machine to trigger sending a test request to a load balancing server, wherein the test request is used to perform a performance test on the specified server selected by the load balancing server according to a load balancing strategy.
9. The interface testing method according to claim 7, wherein displaying the test result of the target online service interface on the visual interaction interface comprises:
periodically polling the target execution machine to determine whether the complete performance test script has finished executing;
if the complete performance test script has finished executing, obtaining the test result from the target execution machine;
and outputting the test result through the visual interaction interface.
10. An interface testing apparatus, the apparatus comprising:
an acquisition unit, configured to acquire interface selection information configured by a testing party through a visual interaction interface and corresponding to a target online service interface;
a receiving unit, configured to receive parameter information input on the visual interaction interface for testing the target online service interface;
a testing unit, configured to, when a test instruction for the target online service interface submitted through the visual interaction interface is received, call the target online service interface corresponding to the interface selection information according to the parameter information, so as to test the target online service interface;
and a display unit, configured to display a test result of the target online service interface on the visual interaction interface.
CN202210424945.0A 2022-04-22 2022-04-22 Interface testing method and related device Pending CN116974882A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210424945.0A CN116974882A (en) 2022-04-22 2022-04-22 Interface testing method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210424945.0A CN116974882A (en) 2022-04-22 2022-04-22 Interface testing method and related device

Publications (1)

Publication Number Publication Date
CN116974882A true CN116974882A (en) 2023-10-31

Family

ID=88477242

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210424945.0A Pending CN116974882A (en) 2022-04-22 2022-04-22 Interface testing method and related device

Country Status (1)

Country Link
CN (1) CN116974882A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination