CN113010413B - Automatic interface testing method and device


Info

Publication number
CN113010413B
CN113010413B
Authority
CN
China
Prior art keywords
interface
interfaces
information
external service
input parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110191047.0A
Other languages
Chinese (zh)
Other versions
CN113010413A (en)
Inventor
刘桂秋
卢文博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Optical Network Information Technology Co ltd
Fiberhome Telecommunication Technologies Co Ltd
Original Assignee
Wuhan Optical Network Information Technology Co ltd
Fiberhome Telecommunication Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Optical Network Information Technology Co Ltd and Fiberhome Telecommunication Technologies Co Ltd
Priority to CN202110191047.0A
Publication of CN113010413A
Application granted
Publication of CN113010413B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3684 - Test management for test design, e.g. generating new test cases
    • G06F 11/3676 - Test management for coverage analysis
    • G06F 11/3688 - Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3692 - Test management for test results analysis
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention relates to the field of automated software testing, and provides an automated interface testing method and device. The method comprises the following steps: generating automated test scripts for all external service interfaces based on the software source code; traversing the log files of all external service interfaces and storing the collected information into the simulation module of the corresponding interface; and loading the simulation modules of all external service interfaces and traversing and running the automated test script of the interface to be tested, wherein any call the interface to be tested makes to another interface is handled by accessing that interface's simulation module. According to the invention, the automated interface test scripts are generated from the software source code and the simulation modules are built from log files; during testing, calls to other interfaces are served directly by the simulation modules, so that two mutually dependent interfaces can be tested in series. This reduces the dependencies between interfaces, simplifies test environment deployment, and improves interface testing efficiency.

Description

Automatic interface testing method and device
[Technical Field]
The invention relates to the field of automated software testing, and in particular to an automated interface testing method and device.
[Background Art]
Application testing is the process of operating a program under specified conditions to find program errors, measure software quality, and evaluate whether it meets design requirements.
Automated testing converts human testing activity into machine execution. Typically, after test cases have been designed and have passed review, a tester executes the test step by step according to the procedure described in the test cases and compares the actual results with the expected results. To save manpower, time, and hardware resources and to improve testing efficiency, automated testing is introduced into this process.
Automated testing is an essential part of the software development field. Current automation mainly comprises interface (API) automated testing and UI automated testing: interface automated testing directly calls an interface and checks its returned result, while UI automated testing drives interface controls to perform the same operations as a manual test. Automated interface testing is a means of automatically exercising the interface test scenarios provided by a system, so as to ensure the correctness of the interface's logic functions and to improve regression test coverage.
In actual software development and testing, software interfaces change continually as requirements evolve; common changes include interface parameter adjustments and the addition or deletion of interfaces, all of which affect the operation of automated interface test scripts. The usual practice is to re-run the automated test scripts and, from the error reports, determine whether a failure was caused by new feature development or by environmental or other factors. However, when writing automated tests today, services depend on and call one another, so automated test verification can only be carried out after the interfaces of all the services have been delivered, which makes test execution far too late. Handling these call relationships also affects the operation of the automation, making test data difficult to construct and maintain.
The invention patent application with publication number CN111198813A, published on May 26, 2020, discloses an interface test method and device. In that scheme, processing rules are formulated by analyzing the error information produced after an automated script runs, which simplifies the maintenance of automated scripts but does not consider the calling relationships between interfaces.
The invention patent application with publication number CN104252413A, published on December 31, 2014, discloses an automated script generation method, device, and system. In that scheme, a locally compiled and debugged script can be automatically uploaded to the server side and converted into an automated script through an online web system. The scheme does not address decoupling of the dependencies between services, and as the volume of automated scripts grows, the overall maintenance workload becomes very large.
In view of this, overcoming the defects in the prior art is a problem to be solved in the art.
[Summary of the Invention]
The invention aims to solve the technical problems that:
the existing interface test system lacks of decoupling treatment on the dependence between interfaces, so that the deployment of a test environment is complex, the development and maintenance workload of an automatic script is large, and the test efficiency of the interfaces is low.
The invention achieves the aim through the following technical scheme:
In a first aspect, the present invention provides an automated interface testing method, comprising: generating automated test scripts for all external service interfaces based on the software source code;
traversing to obtain the log files of all external service interfaces, collecting the input parameter information and return value information in the log files of the external service interfaces, and storing them into the simulation modules of the corresponding external service interfaces;
loading the input parameter information and return value information in the simulation modules of all external service interfaces, and traversing and running the automated test script of the interface to be tested; when the interface to be tested needs to call other interfaces, accessing the simulation modules of those interfaces for processing; obtaining the operation result of the interface to be tested; and confirming whether the operation result of the interface to be tested matches the value range of the return value of the interface to be tested defined in the source code;
if yes, determining that the interface to be tested works normally; otherwise, determining that the interface to be tested works abnormally.
Preferably, the input parameter information includes: one or more of an input parameter name, an input parameter type, a value range of the input parameter and a default value of the input parameter;
the return value information includes: one or more of a return value name, a return value type, a value range of the return value, and a default value of the return value.
Preferably, generating automated test scripts for all external service interfaces based on the software source code specifically comprises:
traversing and scanning all source code files of the external services in the software code library, parsing and extracting the information of each external service interface, archiving and storing the information of each external service interface keyed by interface name, and generating the automated test script of each interface based on the value ranges of the key input parameters and the default values of the other input parameters.
Preferably, the information of the external service interface includes one or more of interface name, input parameter type, value range of input parameter, default value of input parameter, return value name, return value type, value range of return value and default value of return value.
Preferably, obtaining the operation result of the interface to be tested specifically comprises:
when the interface to be tested needs to call the simulation modules of other interfaces for processing, if a simulation module of another interface contains input parameters consistent with those passed by the interface to be tested, returning the corresponding return value of that simulation module directly to obtain the operation result of the interface to be tested.
Preferably, the log files of the external service interface include real-time log files generated by running an automated test script of the external service interface and historical log files generated by historically testing the external service interface.
In a second aspect, the present invention provides an automated interface testing device, configured to implement the automated interface testing method of the first aspect, the device comprising an information acquisition unit, a generation module, a simulation module, and a confirmation module, wherein:
the information acquisition unit is used for acquiring information of all external service interfaces in the software code library;
the generation module is used for generating an automatic test script of the interface according to the value range of the key input parameters and the default values of other input parameters;
the simulation module is used for storing the input parameter information and the return value information of each interface in the log file and providing simulation data when the calling of other interfaces is involved;
and the confirmation module is used for confirming whether the operation result is matched with the value range of the return value, if so, determining that the interface to be tested works normally, otherwise, determining that the interface to be tested works abnormally.
Preferably, the confirmation module is further configured to determine whether all source code files are processed, whether all external service interface information is processed, and whether all log files of the external service interface are processed.
Preferably, the automated interface testing device further comprises: an execution module for running the automated test scripts.
Preferably, the automated interface testing device further comprises: a management module for continuously organizing and updating the data in the simulation modules.
Compared with the prior art, the invention has the following beneficial effects:
The invention provides an automated interface testing method and device. Automated interface test scripts are generated from the software source code, and simulation modules are built from log files; during testing, any call to another interface is served directly by that interface's simulation module to obtain the operation result of the interface to be tested. Two mutually dependent interfaces can therefore be tested in series, which reduces the dependencies between interfaces, simplifies test environment deployment, and improves interface testing efficiency.
[Brief Description of the Drawings]
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the embodiments are briefly described below. Evidently, the drawings described below illustrate only some embodiments of the present invention, and a person of ordinary skill in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a flow chart of an automated interface testing method provided by an embodiment of the present invention;
FIG. 2 is a flow chart of generating automated test scripts for all external service interfaces based on software source code, according to an embodiment of the present invention;
FIG. 3 is a flowchart of processing log files of an external service interface according to an embodiment of the present invention;
FIG. 4 is a flow chart of a load simulation module running an automated test script provided by an embodiment of the present invention;
fig. 5 is a block diagram of an automated interface testing apparatus according to an embodiment of the present invention.
[Detailed Description]
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
In the description of the present invention, the terms "inner", "outer", "longitudinal", "transverse", "upper", "lower", "top", "bottom", etc. refer to an orientation or positional relationship based on that shown in the drawings, merely for convenience of describing the present invention and do not require that the present invention must be constructed and operated in a specific orientation, and thus should not be construed as limiting the present invention.
In addition, the technical features of the embodiments of the present invention described below may be combined with each other as long as they do not collide with each other.
Embodiment 1:
An embodiment of the present invention provides an automated interface testing method which, as shown in fig. 1, comprises the following steps:
in step 201, generating an automated test script for all out-of-service interfaces based on software source code; the software source code contains various external services, the external services relate to different interfaces according to different functions, and the interfaces may also need to call other interfaces to realize the functions of the interfaces.
In step 202, log files of all external service interfaces are traversed, input parameter information and return value information of each external service interface in the log files are collected, and the information is stored in a simulation module of the corresponding external service interface.
The log file of the external service interface refers to the log file obtained by the history of all interfaces in the external service or the running of the automatic test script; the simulation module is used for storing the input parameter information and the return value information of each interface in the log file and providing simulation data when the calling of other interfaces is involved.
In step 203, loading all simulation modules of the external service interface, and traversing an automatic test script of the interface to be tested; the simulation modules are arranged in the external service interfaces, different interfaces are provided with different simulation modules, and the interface to be tested is one external service interface waiting for testing in all the external service interfaces.
In step 204, the interface to be tested needs to call other interfaces, and first accesses the simulation modules of the other interfaces to process, so as to obtain the operation result of the interface to be tested; the operation result of the interface to be tested is specifically: if the simulation modules of the other interfaces have the input parameters consistent with the input parameters of the interface to be tested, the return values of the simulation modules of the other interfaces corresponding to the input parameters of the interface to be tested are directly returned to obtain the operation result of the interface to be tested; if the simulation modules of the other interfaces do not have the input parameters consistent with the input parameters of the interfaces to be tested, returning to the matching failure, and marking the input parameters with the matching failure.
In step 205, it is determined whether the operation result of the interface to be tested matches the range of values of the return values of the interface to be tested defined in the source code.
In step 206, if it matches, it is determined that the interface to be tested works normally.
In step 207, otherwise, it is determined that the interface to be tested works abnormally.
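As a minimal sketch of this overall flow (steps 201 to 207), the Python fragment below models each generated test script as a plain callable and classifies its result against the return value range defined in the source code. This is an illustration only, not the patented implementation; all names (run_interface_test, simulations, return_range) are invented.

```python
def run_interface_test(scripts, simulations, return_range):
    """Run every generated script for one interface and classify results.

    scripts:      list of (name, callable) pairs; each callable receives the
                  simulation registry and returns the operation result
    simulations:  registry answering calls to other interfaces (step 204)
    return_range: (lo, hi) value range of the return value defined in the
                  source code (step 205)
    """
    lo, hi = return_range
    results = []
    for name, script in scripts:
        outcome = script(simulations)  # calls to other interfaces hit the mocks
        status = "normal" if lo <= outcome <= hi else "abnormal"  # steps 206-207
        results.append((name, outcome, status))
    return results
```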
In an embodiment of the present invention, the input parameter information includes: one or more of an input parameter name, an input parameter type, a value range of the input parameter, and a default value of the input parameter; the default value of the input parameter is the default value defined by the source code;
the return value information includes: one or more of a return value name, a return value type, a value range of the return value, and a default value of the return value; the default value of the return value is the default value defined by the source code.
In the embodiment of the invention, the automated test scripts for all external service interfaces are generated based on the software source code, specifically:
traverse and scan all source code files of the external services in the software code library, parse and extract the information of each external service interface, archive and store the information of each external service interface keyed by interface name, and generate the automated test script of each interface based on the value ranges of the key input parameters and the default values of the other input parameters, covering typical interval range combinations of all key fields; the key input parameters are the input parameters of higher importance among the input parameters, and the key fields include boundary values, intermediate values, and abnormal data.
The specific steps are shown in fig. 2:
In step 301, the code directory of the software interfaces is set, ensuring that the automated test run has access to the code directory.
In step 302, a database storing the information of all external service interfaces is provided in the software and is connected before testing; it stores the collected information so that the automated test scripts can be generated conveniently.
In step 303, traverse and acquire the source code files in the code directory of the software interfaces.
In step 304, the source code files acquired in step 303 are traversed, so that all external services in the source code files are acquired.
In step 305, all external services acquired in step 304 are traversed, so that the interface information in all external services is acquired.
In step 306, the interface information of the external services acquired in step 305 is collected: one or more of the interface name, input parameter type, value range of the input parameter, default value of the input parameter, return value name, return value type, value range of the return value, and default value of the return value.
In step 307, the automated test script of the interface is generated according to the value ranges of the key input parameters and the default values of the other input parameters, covering typical interval range combinations of boundary values, intermediate values, and abnormal data; a typical interval range combination is a random free combination of boundary values, intermediate values, and abnormal data, and the generated script covers all such combinations (a sketch of this generation step follows this step list).
In step 308, it is determined whether all processing of interface information in the external service is completed; if yes, go to step 309; if not, go to step 305.
In step 309, it is determined whether all processing for external services is completed; if yes, go to step 310; if not, go to step 304.
In step 310, it is determined whether the processing of all source code files is completed; if yes, turning to end; if not, go to step 303.
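As a concrete illustration of step 307, the sketch below enumerates boundary values, an intermediate value, and abnormal (out-of-range) data for each key input parameter and combines them freely, while the other input parameters keep their source-defined defaults. It assumes numeric value ranges; all function and parameter names are hypothetical.

```python
import itertools

def candidate_values(lo, hi):
    """Key-field candidates for a parameter with value range [lo, hi]:
    the boundary values, an intermediate value, and abnormal data."""
    return [lo, hi, (lo + hi) // 2, lo - 1, hi + 1]

def generate_test_cases(key_params, other_params):
    """key_params:   {name: (lo, hi)} value ranges of key input parameters.
    other_params: {name: default} default values of the other parameters.
    Yields one input dict per free combination of candidates (step 307)."""
    names = list(key_params)
    pools = [candidate_values(lo, hi) for lo, hi in key_params.values()]
    for combo in itertools.product(*pools):
        case = dict(other_params)  # non-key parameters keep their defaults
        case.update(zip(names, combo))
        yield case
```

For a single key parameter with range (0, 10000), for example, this emits the candidate inputs 0, 10000, 5000, -1, and 10001, matching the boundary, intermediate, and abnormal coverage described above.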
In the embodiment of the invention, the information of the external service interface includes an interface name, an input parameter type, a value range of the input parameter, a default value of the input parameter, a return value name, a return value type, a value range of the return value and a default value of the return value.
The step of processing the log file of the external service interface in step 202 is as shown in fig. 3:
in step 401, all log files for external services are traversed; the log files comprise real-time log files generated by executing the automatic test script of the external service interface and historical log files generated by historically testing the external service interface.
In step 402, traversing interface information in the log file; the interface information includes an interface name, an input parameter type, a value range of the input parameter, a default value of the input parameter, a return value name, a return value type, a value range of the return value, and a default value of the return value.
In step 403, all input parameter information and corresponding return value information of the interface information are parsed and acquired.
The input parameter information comprises one or more of an input parameter name, an input parameter type, a value range of the input parameter and a default value of the input parameter; the return value information includes one or more of a return value name, a return value type, a range of values for the return value, and a default value for the return value.
In step 404, it is determined whether the interface information obtained by parsing is consistent with the interface information collected from the source code, and whether the names, number, and value ranges of the input parameters are reasonable. If yes, go to step 406; if not, go to step 405.
Checking whether the names, number, and value ranges of the input parameters are reasonable specifically means: checking whether the names of the input parameters in the parsed interface information are consistent with the names of the input parameters in the interface information collected from the source code, whether the number of input parameters in the parsed interface information is consistent with the number collected from the source code, and whether the value ranges of the input parameters in the parsed interface information are consistent with the value ranges collected from the source code.
In step 405, if a parsed input parameter does not exist in the source code input parameter list, it is filled using the default value defined by the source code; the source code parameter list is a tabular summary of the parameters defined by the source code. If a parsed input parameter has been removed from the source code input parameter list, the redundant parsed input parameter is automatically eliminated and a log is recorded; redundant input parameters are input parameters that were used previously but no longer exist in the new version of the software or interface information after being phased out during continuous software or interface updates. If the value range of a parsed input parameter is unreasonable, it is filled using the original value defined by the source code and a log is recorded; an unreasonable value range means that the value of the input parameter is not within its preset value range (this validation and filling logic is sketched after the step list below).
In step 406, the input parameter information and return value information finally obtained by parsing are saved into the simulation module; these are either the values passed through directly from step 404, namely the interface information collected from the source code and the corresponding return values, or the input parameter information and return value information produced by the processing in step 405.
In step 407, it is determined whether or not the processing of all interface information is completed; if yes, go to step 408; if not, go to step 402.
In step 408, it is determined whether all processing of log files for external services is completed; if yes, turning to end; if not, go to step 401.
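Steps 404 to 406 can be pictured as a small normalization routine over one parsed log record. The sketch below is an assumption-laden illustration: the spec layout ({name: {"default": ..., "range": (lo, hi)}}), the numeric ranges, and all names are invented, not taken from the patent.

```python
import logging

def normalize_params(parsed, spec):
    """Validate parsed log parameters against the source-defined spec
    (steps 404-405) and return the values to store in the simulation
    module (step 406). spec: {name: {"default": ..., "range": (lo, hi)}}."""
    normalized = {}
    for name, info in spec.items():
        if name not in parsed:
            # Parameter missing from the log: fill with the source default.
            normalized[name] = info["default"]
            continue
        value = parsed[name]
        lo, hi = info["range"]
        if not lo <= value <= hi:
            # Unreasonable value range: fall back to the source-defined value.
            logging.warning("param %s=%r out of range, using source value", name, value)
            value = info["default"]
        normalized[name] = value
    for name in set(parsed) - set(spec):
        # Parameter no longer in the source list: eliminate it and record a log.
        logging.info("removing redundant parameter %s", name)
    return normalized
```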
In the embodiment of the invention, the log files of the external service interface comprise real-time log files generated by executing the automatic test script of the external service interface and historical log files generated by historically testing the external service interface.
In the embodiment of the invention, when the interface to be tested needs to call the simulation modules of other interfaces for processing, if a simulation module contains input parameters consistent with those of the interface to be tested, the return value corresponding to those input parameters in the simulation module is returned directly to obtain the operation result of the interface to be tested (a sketch of this matching logic follows the step list below).
In step 203, the simulation modules of all external service interfaces are loaded; the specific steps of traversing and running the automated test scripts of the interfaces to be tested are shown in fig. 4:
In step 501, the simulation modules of all external service interfaces are loaded, and the input parameter information and return value information are collected; the input parameter information includes one or more of an input parameter name, an input parameter type, a value range of the input parameter, and a default value of the input parameter, and the return value information includes one or more of a return value name, a return value type, a value range of the return value, and a default value of the return value.
In step 502, the automated test scripts of the interfaces to be tested are traversed and run.
In step 503, the interface calls made while running the automated script are traversed.
In step 504, query whether there are matched input parameters in the simulation module; if yes, go to step 506; if not, go to step 505.
In step 505, the simulation module returns a match failure, marking the input parameters of the match failure.
In step 506, the simulation module returns a return value corresponding to the matched input parameter, and records the running result.
In step 507, it is determined whether all calls of the interface to be tested have been run; if yes, go to step 508; if not, go to step 503.
In step 508, it is determined whether the automated test scripts of all interfaces have finished running; if yes, end; if not, go to step 502.
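The matching behavior of steps 501 to 506 amounts to a lookup table keyed by an interface's input parameters. The sketch below shows one plausible shape for such a simulation module; the class and method names are hypothetical, not the patent's.

```python
class SimulationModule:
    """Hypothetical simulation module: stores (input parameters -> return
    value) pairs collected from log files and answers calls during testing."""

    def __init__(self):
        self._table = {}

    @staticmethod
    def _key(params):
        # Hashable, order-independent key over the input parameters.
        return tuple(sorted(params.items()))

    def record(self, params, return_value):
        """Filled while parsing the log files (step 202)."""
        self._table[self._key(params)] = return_value

    def call(self, params):
        """Answer a call made by the interface to be tested (steps 504-506)."""
        key = self._key(params)
        if key in self._table:
            return True, self._table[key]  # step 506: matched, return the value
        return False, None                 # step 505: mark the match failure
```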
Embodiment 2:
Taking as an example a commodity payment interface of website A that calls the UnionPay payment interface, an embodiment of the present invention provides an automated interface testing method comprising the following steps:
The information acquisition unit acquires the information of the external service interfaces in the software code library and obtains the information of the commodity payment interface, which needs to call the UnionPay payment interface to complete payment; the commodity payment interface information includes the number of commodities, the commodity amount, the commodity brand, and other information.
The generation module generates the automated test script of the interface according to one or more of the input parameter name, input parameter type, value range of the input parameter, default value of the input parameter, return value name, return value type, value range of the return value, and default value of the return value in the commodity payment interface information; for example, it generates the script according to various combinations of the number of commodities, the commodity amount, the commodity brand, and the corresponding amount to be paid, such as a payment of 0 for one commodity, a payment of 0 for multiple commodities, a payment of 10000 for one commodity, a payment for multiple commodities, and the like.
The log files of the UnionPay payment interface are traversed, and the number of commodities, commodity amount, commodity brand, corresponding amount to be paid, and other information obtained historically and by running the automated test script of the interface are collected and stored into the simulation module of the UnionPay payment interface.
During actual testing the real UnionPay interface cannot be called, so the automated test of the commodity payment interface could not otherwise be completed. In this case the simulation module of the UnionPay payment interface is loaded; this module stores the number of commodities, commodity amount, commodity brand, corresponding amount to be paid, and other information of the UnionPay payment interface found in the log files, and a simulated UnionPay service is started. Whenever the UnionPay payment interface would be called, the simulated UnionPay service is called instead, and it provides a simulated corresponding amount to be paid based on the matched number of commodities, commodity amount, commodity brand, and other information.
The confirmation module confirms whether the amount to be paid is within its preset value range; if yes, the commodity payment interface is determined to work normally; if not, it is determined to work abnormally. (A hypothetical usage sketch of the simulation module for this scenario follows.)
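Under the same assumptions as the SimulationModule sketch above, the UnionPay scenario might be exercised as follows; the field names (count, amount, brand) and the value range checked by the confirmation module are invented for illustration.

```python
# Hypothetical usage for Embodiment 2: the commodity payment interface
# calls a simulated UnionPay service instead of the real one.
unionpay_mock = SimulationModule()
unionpay_mock.record({"count": 2, "amount": 100, "brand": "X"}, 200)  # amount due

matched, amount_due = unionpay_mock.call({"count": 2, "amount": 100, "brand": "X"})
assert matched                     # step 506: parameters found in the mock
assert 0 <= amount_due <= 10000    # confirmation module's range check
```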
Embodiment 3:
An embodiment of the present invention provides an automated interface testing device, configured to implement the automated interface testing method described in Embodiment 1. As shown in fig. 5, the device comprises an information acquisition unit, a generation module, a simulation module, and a confirmation module, wherein:
the information acquisition unit is used for acquiring information of all external service interfaces in the software code library;
the generation module is used for generating an automatic test script of the interface according to the value range of the key input parameters and the default values of other input parameters;
the simulation module is used for storing the input parameter information and the return value information of each interface in the log file and providing simulation data when the calling of other interfaces is involved;
and the confirmation module is used for confirming whether the operation result is matched with the value range of the return value, if so, determining that the interface to be tested works normally, otherwise, determining that the interface to be tested works abnormally.
In the embodiment of the invention, the confirmation module is further used for judging whether the processing of all source code files is finished, whether the processing of the information of all external service interfaces is finished, and whether the processing of the log files of all external service interfaces is finished.
In an embodiment of the present invention, the automated interface testing device further comprises an execution module for running the automated test scripts.
In an embodiment of the present invention, the automated interface testing device further comprises a management module for continuously organizing and updating the data in the simulation modules.
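The four units of fig. 5 can be pictured as a thin wiring layer. The sketch below is purely structural and every class and method name is invented; it only shows how the units might hand data to one another, not the patented implementation.

```python
class InterfaceTestDevice:
    """Hypothetical wiring of the four units shown in fig. 5."""

    def __init__(self, collector, generator, simulations, checker):
        self.collector = collector      # information acquisition unit
        self.generator = generator      # generation module
        self.simulations = simulations  # simulation modules, one per interface
        self.checker = checker          # confirmation module

    def test(self, interface):
        info = self.collector.collect(interface)   # interface information
        scripts = self.generator.generate(info)    # automated test scripts
        return [self.checker.check(interface, script(self.simulations))
                for script in scripts]             # run and confirm each script
```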
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the invention.

Claims (10)

1. An automated interface testing method, comprising: generating automated test scripts for all external service interfaces based on the software source code;
traversing to obtain the log files of all external service interfaces, collecting the input parameter information and return value information in the log files of the external service interfaces, and storing them into the simulation modules of the corresponding external service interfaces;
if a parsed input parameter does not exist in the source code input parameter list, filling it using the default value defined by the source code, wherein the source code parameter list is a tabular summary of the parameters defined by the source code; if a parsed input parameter has been removed from the source code input parameter list, automatically eliminating the redundant parsed input parameter and recording a log, wherein redundant input parameters are input parameters that were used previously but no longer exist in the new version of the software or interface information after being phased out during continuous software or interface updates; and if the value range of a parsed input parameter is unreasonable, filling it using the original value defined by the source code and recording a log, wherein an unreasonable value range means that the value of the input parameter is not within its preset value range;
loading the input parameter information and return value information in the simulation modules of all external service interfaces, and traversing and running the automated test script of the interface to be tested; when the interface to be tested needs to call other interfaces, accessing the simulation modules of those interfaces for processing; obtaining the operation result of the interface to be tested; and confirming whether the operation result of the interface to be tested matches the value range of the return value of the interface to be tested defined in the source code;
if yes, determining that the interface to be tested works normally; otherwise, determining that the interface to be tested works abnormally.
2. The automated interface testing method according to claim 1, wherein the input parameter information comprises: one or more of an input parameter name, an input parameter type, a value range of the input parameter, and a default value of the input parameter;
the return value information comprises: one or more of a return value name, a return value type, a value range of the return value, and a default value of the return value.
3. The automated interface testing method according to claim 1, wherein generating automated test scripts for all external service interfaces based on the software source code specifically comprises:
traversing and scanning all source code files of the external services in the software code library, parsing and extracting the information of each external service interface, archiving and storing the information of each external service interface keyed by interface name, and generating the automated test script of each interface based on the value ranges of the key input parameters and the default values of the other input parameters.
4. The method of claim 3, wherein the information of the external service interface includes one or more of an interface name, an input parameter type, a value range of an input parameter, a default value of an input parameter, a return value name, a return value type, a value range of a return value, and a default value of a return value.
5. The automated interface testing method according to claim 1, wherein obtaining the operation result of the interface to be tested specifically comprises:
when the interface to be tested needs to call the simulation modules of other interfaces for processing, if a simulation module of another interface contains input parameters consistent with those passed by the interface to be tested, returning the corresponding return value of that simulation module directly to obtain the operation result of the interface to be tested.
6. The automated interface testing method of claim 1, wherein the log files of the external service interface comprise real-time log files generated by running an automated test script of the external service interface and historical log files generated by historically testing the external service interface.
7. An automated interface testing device for implementing the automated interface testing method of any one of claims 1-6, the device comprising an information acquisition unit, a generation module, a simulation module, and a confirmation module, wherein:
the information acquisition unit is used for acquiring information of all external service interfaces in the software code library;
the generation module is used for generating an automatic test script of the interface according to the value range of the key input parameters and the default values of other input parameters;
the simulation module is used for storing the input parameter information and the return value information of each interface in the log file and providing simulation data when the calling of other interfaces is involved;
and the confirmation module is used for confirming whether the operation result is matched with the value range of the return value, if so, determining that the interface to be tested works normally, otherwise, determining that the interface to be tested works abnormally.
8. The automated interface testing device of claim 7, wherein the confirmation module is further configured to determine one or more of: whether processing of all source code files is complete, whether processing of the information of all external service interfaces is complete, and whether processing of the log files of all external service interfaces is complete.
9. The automated interface testing device of claim 7, further comprising: an execution module for running the automated test scripts.
10. The automated interface testing device of claim 7, further comprising: a management module for continuously organizing and updating the data in the simulation modules.
CN202110191047.0A 2021-02-20 2021-02-20 Automatic interface testing method and device Active CN113010413B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110191047.0A CN113010413B (en) 2021-02-20 2021-02-20 Automatic interface testing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110191047.0A CN113010413B (en) 2021-02-20 2021-02-20 Automatic interface testing method and device

Publications (2)

Publication Number Publication Date
CN113010413A CN113010413A (en) 2021-06-22
CN113010413B (en) 2023-04-25

Family

ID=76403700

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110191047.0A Active CN113010413B (en) 2021-02-20 2021-02-20 Automatic interface testing method and device

Country Status (1)

Country Link
CN (1) CN113010413B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110737590B (en) * 2019-09-16 2023-11-03 上海御渡半导体科技有限公司 Offline debugging method
CN113760759A (en) * 2021-09-02 2021-12-07 广东睿住智能科技有限公司 Debugging method, debugging device, electronic device, and storage medium
CN113987980B (en) * 2021-09-23 2022-05-20 北京连山科技股份有限公司 Popular simulation implementation method for physical PHD (graphical user device)
CN114925516B (en) * 2022-05-16 2024-01-26 北京世冠金洋科技发展有限公司 Automatic modeling and simulating method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108241543A (en) * 2016-12-30 2018-07-03 深圳壹账通智能科技有限公司 Method, service server and the system that business operation breakpoint performs
CN109831440A (en) * 2019-02-21 2019-05-31 中国联合网络通信集团有限公司 Interface packets conversion method, device and storage medium
CN111782613A (en) * 2020-06-23 2020-10-16 南昌航空大学 Method for optimizing operation efficiency of model integration platform

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104360920B (en) * 2014-12-02 2018-06-26 微梦创科网络科技(中国)有限公司 A kind of automatic interface testing method and device
CN108519948A (en) * 2018-04-04 2018-09-11 上海携程商务有限公司 The automatic interface testing method and system of daily record driving
CN112306855B (en) * 2019-08-02 2022-06-17 北大方正集团有限公司 Interface automation test method, device, terminal and storage medium
CN110727596A (en) * 2019-10-15 2020-01-24 北京弘远博学科技有限公司 APP interface automatic testing method
CN111782546B (en) * 2020-07-23 2021-10-01 北京斗米优聘科技发展有限公司 Automatic interface testing method and device based on machine learning
CN112052172B (en) * 2020-09-04 2024-01-30 云账户技术(天津)有限公司 Rapid test method and device for third-party channel and electronic equipment
CN112084113B (en) * 2020-09-16 2024-02-23 上海创景信息科技有限公司 Configurable automatic test method and system based on embedded simulation verification software


Also Published As

Publication number Publication date
CN113010413A (en) 2021-06-22

Similar Documents

Publication Publication Date Title
CN113010413B (en) Automatic interface testing method and device
US8126581B2 (en) Improving design manufacturing, and transportation in mass manufacturing through analysis of defect data
CN102053906A (en) System and method for collecting program runtime information
CN108897686B (en) Full-entry automatic testing method and device
CN111522728A (en) Method for generating automatic test case, electronic device and readable storage medium
CN110764998A (en) Data comparison method, device and equipment based on Django framework and storage medium
CN112817843B (en) Project management method and system
CN115328784A (en) Agile interface-oriented automatic testing method and system
CN111813652B (en) Automatic test method for checking abnormal value of data related to data missing
CN113220597B (en) Test method, test device, electronic equipment and storage medium
CN114490413A (en) Test data preparation method and device, storage medium and electronic equipment
CN117370217B (en) Automatic interface test result generation method based on python
CN117421238A (en) Disaster recovery account supplementing test method and device, electronic equipment and storage medium
CN116069628A (en) Intelligent-treatment software automatic regression testing method, system and equipment
CN115629956A (en) Software defect management method and system based on interface automatic test
CN115437943A (en) Automatic interface document verification method and device and server
US9201771B2 (en) Method for evaluating a production rule for a memory management analysis
CN111198798B (en) Service stability measuring method and device
US7516048B2 (en) Externalized metric calculation engine
CN114116470A (en) Automatic static model checking method and device
CN111813665A (en) Big data platform interface data testing method and system based on python
CN116383068B (en) Quick test method, device and storage medium of C++ program interface
CN112905438A (en) Automatic testing method and device
CN111597101A (en) SDK access state detection method, computer device and computer readable storage medium
CN118331840B (en) Data and interface joint test method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant