CN113010413A - Automatic interface testing method and device - Google Patents

Automatic interface testing method and device

Info

Publication number
CN113010413A
CN113010413A (application number CN202110191047.0A; granted as CN113010413B)
Authority
CN
China
Prior art keywords
interface
interfaces
information
external service
tested
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110191047.0A
Other languages
Chinese (zh)
Other versions
CN113010413B (en)
Inventor
刘桂秋
卢文博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Optical Network Information Technology Co Ltd
Fiberhome Telecommunication Technologies Co Ltd
Original Assignee
Wuhan Optical Network Information Technology Co Ltd
Fiberhome Telecommunication Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Optical Network Information Technology Co Ltd, Fiberhome Telecommunication Technologies Co Ltd filed Critical Wuhan Optical Network Information Technology Co Ltd
Priority to CN202110191047.0A priority Critical patent/CN113010413B/en
Publication of CN113010413A publication Critical patent/CN113010413A/en
Application granted granted Critical
Publication of CN113010413B publication Critical patent/CN113010413B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G06F 11/3676 Test management for coverage analysis
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3692 Test management for test results analysis
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT]
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention relates to the field of software automation testing, and provides an interface automation test method and device. The method comprises the following steps: generating automated test scripts for all external service interfaces based on the software source code; traversing the log files of all external service interfaces and storing their contents in the simulation modules of the corresponding interfaces; and loading the simulation modules of all external service interfaces and traversing and running the automated test script of the interface to be tested, where any other interface the interface under test needs to call is handled by accessing that interface's simulation module. The invention generates automated interface test scripts from the software source code and builds simulation modules from log files; whenever a call to another interface arises during testing, the simulation module is accessed directly for processing, so that two interfaces with a mutual dependency can be executed serially. This reduces the dependencies between interfaces, simplifies test environment deployment, and improves interface testing efficiency.

Description

Automatic interface testing method and device
[ technical field ]
The invention relates to the field of software automation test, in particular to an interface automation test method and device.
[ background of the invention ]
Application testing is the process of running a program under specified conditions to discover defects, measure software quality, and evaluate whether it meets design requirements.
Automated testing translates human-driven testing behavior into machine execution. Typically, after a test case is designed and passes review, a tester executes the test step by step according to the procedure described in the case and compares the actual results with the expected results. Automated testing is introduced into this process to save manpower, time, and hardware resources and to improve testing efficiency.
Automated testing is an essential part of software development. Current automation mainly comprises interface (API) automated testing and user-interface (UI) automated testing: API automated testing directly calls an interface and checks the result it returns, while UI automated testing drives the controls of a user interface to perform the same operations as manual testing. Interface automated testing is a means of automatically exercising the interface test scenarios provided by a system, so as to ensure that the interface logic is correct and to improve regression test coverage.
In actual software development and testing, software interfaces change continuously as requirements change; common changes include interface parameter adjustments and the addition or deletion of interfaces, all of which affect the operation of automated interface test scripts. The common practice for handling such changes is to rerun the automated test scripts and analyze, from the reported errors, whether a failure is caused by the new requirement's development changes or by environmental factors such as the test environment. In current automated test development, however, there are dependencies and mutual calls between services, so automated verification can only begin after the interfaces of all services have been delivered, which makes test execution too late; handling these call relationships also affects the automated runs and makes building and maintaining test data difficult.
The invention patent application with publication number CN111198813A, published on May 26, 2020, discloses an interface testing method and device. That scheme formulates processing rules by analyzing the error information produced after an automated script runs, which simplifies the maintenance of automated scripts, but it does not consider the calling relationships between interfaces.
The invention patent application with publication number CN104252413A, published on December 31, 2014, discloses an automated script generation method, device and system. That scheme automatically uploads locally written and debugged scripts to a server through an online web system and converts them into automated scripts; it does not address the decoupling of dependencies between services, and once the volume of automated scripts grows, the overall maintenance workload becomes very large.
In view of the above, it is an urgent problem in the art to overcome the above-mentioned drawbacks of the prior art.
[ summary of the invention ]
The technical problem to be solved by the invention is as follows:
existing interface test systems lack decoupling of the dependencies between interfaces, which leads to complex test environment deployment, a heavy workload for developing and maintaining automated scripts, and low interface testing efficiency.
The invention achieves the above purpose by the following technical scheme:
in a first aspect, the present invention provides an interface automation test method, where the interface automation test method includes: generating all automatic test scripts of the external service interface based on the software source code;
the method comprises the steps of obtaining log files of all external service interfaces in a traversing mode, collecting input parameter information and return value information in the log files of the external service interfaces, and storing the input parameter information and the return value information in the log files into simulation modules of corresponding external service interfaces;
loading input parameter information and return value information in simulation modules of all external service interfaces, traversing and operating an automatic test script of an interface to be tested, wherein the interface to be tested needs to call other interfaces and firstly accesses the simulation modules of the other interfaces for processing; obtaining an operation result of the interface to be tested; confirming whether the running result of the interface to be tested is matched with the value range of the return value of the interface to be tested defined in the source code;
if so, determining that the interface to be tested works normally; otherwise, determining that the interface to be tested does not work normally.
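The final verification step above, comparing the run result against the return-value range defined in the source code, can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and the `(low, high)` range representation are assumptions.

```python
def check_interface(run_result, value_range):
    """Return True if the interface's run result falls inside the
    return-value range defined for it in the source code."""
    low, high = value_range
    return low <= run_result <= high

# Hypothetical interface whose source code declares a return value in [0, 100]:
assert check_interface(42, (0, 100))       # within range: works normally
assert not check_interface(-5, (0, 100))   # outside range: works abnormally
```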
Preferably, the input parameter information includes one or more of: the input parameter's name, type, value range, and default value;
the return value information includes one or more of: the return value's name, type, value range, and default value.
Preferably, the automated test scripts for all external service interfaces are generated based on the software source code, specifically:
traversing and scanning all source code files of the external services in the software code base, parsing and extracting the information of each external service interface, archiving and storing that information keyed by interface name, and generating an automated test script for each interface based on the value ranges of the key input parameters and the default values of the other input parameters.
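The scanning step can be sketched like this. The declaration syntax matched by the regular expression (`name(param: type[min..max] = default) -> type`) is entirely hypothetical, since the patent does not fix a source language; the point is only extracting interface information keyed by interface name.

```python
import re

def extract_interfaces(source_text):
    """Scan one source file and archive interface information with the
    interface name as the key (hypothetical declaration syntax:
    'name(param: type[min..max] = default) -> type')."""
    pattern = re.compile(
        r"(?P<name>\w+)\((?P<param>\w+):\s*(?P<ptype>\w+)"
        r"\[(?P<lo>-?\d+)\.\.(?P<hi>-?\d+)\]\s*=\s*(?P<default>-?\d+)\)"
        r"\s*->\s*(?P<rtype>\w+)")
    interfaces = {}
    for m in pattern.finditer(source_text):
        interfaces[m.group("name")] = {
            "param_name": m.group("param"),
            "param_type": m.group("ptype"),
            "param_range": (int(m.group("lo")), int(m.group("hi"))),
            "param_default": int(m.group("default")),
            "return_type": m.group("rtype"),
        }
    return interfaces

src = "pay_order(amount: int[0..10000] = 0) -> int"
info = extract_interfaces(src)
assert info["pay_order"]["param_range"] == (0, 10000)
```

In practice the parser would be driven by the actual language of the code base; the archived fields mirror the interface information listed in the claims.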
Preferably, the information of the external service interface includes one or more of an interface name, an input parameter type, a value range of an input parameter, a default value of an input parameter, a return value name, a return value type, a value range of a return value, and a default value of a return value.
Preferably, the run result of the interface to be tested is obtained as follows:
when the interface to be tested needs to call another interface, that interface's simulation module is accessed instead for processing; if the simulation module contains input parameters consistent with those supplied by the interface to be tested, the return value stored for those input parameters is returned directly, yielding the run result of the interface to be tested.
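A minimal sketch of such a simulation module follows. The class and method names are assumptions; the essential idea from the claim is a lookup from recorded input parameters to the recorded return value, with an explicit signal on a failed match.

```python
class SimulationModule:
    """Mock of one external interface: stores (input params -> return value)
    pairs collected from log files, so a dependent interface can be tested
    without calling the real service."""
    def __init__(self):
        self._records = {}

    def record(self, params, return_value):
        # Freeze the parameter dict so it can serve as a lookup key.
        self._records[frozenset(params.items())] = return_value

    def call(self, params):
        key = frozenset(params.items())
        if key in self._records:
            return True, self._records[key]   # matched: return stored value
        return False, None                    # match failure: caller marks params

mock = SimulationModule()
mock.record({"order_id": 7, "amount": 100}, {"status": "paid"})
ok, value = mock.call({"order_id": 7, "amount": 100})
assert ok and value == {"status": "paid"}
ok, _ = mock.call({"order_id": 8, "amount": 100})
assert not ok
```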
Preferably, the log file of the external service interface includes a real-time log file generated by running an automated test script of the external service interface and a historical log file generated by historically testing the external service interface.
In a second aspect, the present invention provides an interface automation test apparatus for implementing the interface automation test method in the first aspect, where the interface automation test apparatus includes:
an information acquisition unit, a generation module, a simulation module, and a confirmation module, specifically:
the information acquisition unit is used for acquiring information of all external service interfaces in the software code base;
the generating module is used for generating an automatic test script of the interface according to the value range of the key input parameter and the default values of other input parameters;
the simulation module is used for storing input parameter information and return value information of each interface in the log file and providing simulation data when calling of other interfaces is involved;
and the confirmation module is used for confirming whether the run result matches the value range of the return value; if so, the interface to be tested is determined to work normally, otherwise it is determined to work abnormally.
Preferably, the confirmation module is further configured to determine whether to complete processing of all source code files, whether to complete processing of information of all external service interfaces, and whether to complete processing of log files of all external service interfaces.
Preferably, the interface automation test device further includes: and the execution module is used for executing the automatic test script.
Preferably, the interface automation test device further includes: and the management module is used for continuously arranging and updating the data in the simulation module.
Compared with the prior art, the invention has the beneficial effects that:
the invention provides an interface automatic test method and device, which are characterized in that an interface automatic test script is generated based on a software source code, a simulation module is established through a log file, and in the test process, other interfaces are called and directly accessed to the simulation module for processing to obtain the operation result of an interface to be tested, so that two interfaces with mutual dependency relationship can be executed in series, the dependency between the interfaces is reduced, the test environment deployment is simplified, and the interface test efficiency is improved.
[ description of the drawings ]
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required to be used in the embodiments of the present invention will be briefly described below. It is obvious that the drawings described below are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 is a flowchart of an interface automation test method according to an embodiment of the present invention;
FIG. 2 is a flowchart of generating automated test scripts for all external service interfaces based on software source code according to an embodiment of the present invention;
FIG. 3 is a flowchart of processing a log file of an external service interface according to an embodiment of the present invention;
FIG. 4 is a flowchart of loading a simulation module to run an automated test script according to an embodiment of the present invention;
fig. 5 is a structural diagram of an interface automation test apparatus according to an embodiment of the present invention.
[ detailed description ]
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the description of the present invention, the terms "inner", "outer", "longitudinal", "lateral", "upper", "lower", "top", "bottom", and the like indicate orientations or positional relationships based on those shown in the drawings, and are for convenience only to describe the present invention without requiring the present invention to be necessarily constructed and operated in a specific orientation, and thus should not be construed as limiting the present invention.
In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Example 1:
an embodiment of the present invention provides an interface automation test method, as shown in fig. 1, the interface automation test method includes:
in step 201, generating all automatic test scripts of the external service interface based on the software source code; the software source code contains various external services, which relate to different interfaces according to different functions, and these interfaces may need to call other interfaces to realize the functions of these interfaces.
In step 202, the log files of all the external service interfaces are traversed, the input parameter information and the return value information of each external service interface in the log files are collected, and the collected information is stored in the simulation module of the corresponding external service interface.
The log file of the external service interface refers to log files obtained in the history of all interfaces in the external service or by running the automatic test script; the simulation module is used for storing input parameter information and return value information of each interface in the log file and providing simulation data when other interfaces are called.
In step 203, loading all simulation modules of the external service interface, and traversing and running the automated test script of the interface to be tested; the simulation module is arranged in the external service interface, different interfaces have different simulation modules, and the interface to be tested is one external service interface waiting for testing in all the external service interfaces.
In step 204, when the interface to be tested needs to call another interface, the other interface's simulation module is accessed instead, and the run result of the interface to be tested is obtained as follows: if the simulation module contains input parameters consistent with those supplied by the interface to be tested, the corresponding return value is returned directly, yielding the run result; if no consistent input parameters exist in the simulation module, a match failure is returned and the failing input parameters are marked.
In step 205, it is determined whether the running result of the interface to be tested matches the value range of the return value of the interface to be tested defined in the source code.
In step 206, if yes, it is determined that the interface to be tested is working normally.
Otherwise, in step 207, it is determined that the interface under test is not functioning properly.
In an embodiment of the present invention, the input parameter information includes: inputting one or more items of parameter names, input parameter types, value ranges of the input parameters and default values of the input parameters; the default value of the input parameter is the default value defined by the source code;
the return value information includes: one or more items of the name of the return value, the type of the return value, the value range of the return value and the default value of the return value; the default value of the return value is the default value defined by the source code.
In the embodiment of the present invention, the automated test scripts of all external service interfaces are generated based on the software source code, specifically:
traversing and scanning all source code files of the external services in the software code base, parsing and extracting the information of each external service interface, archiving and storing that information keyed by interface name, and generating an automated test script for each interface based on the value ranges of the key input parameters and the default values of the other input parameters, covering the typical interval-range combinations of all key fields. The key input parameters are the input parameters with a higher importance level; the key fields include boundary values, intermediate values, and exception data.
Wherein, the specific steps are shown in figure 2:
in step 301, a code catalog of the software interface is set, and it is ensured that the code catalog has an access right when the interface automation test is run.
In step 302, a database for storing information of all external service interfaces is provided in the software, and the database is connected before testing is prepared, and is used for storing the acquired information so as to facilitate subsequent generation of an automatic test script.
In step 303, the source code file in the code directory of the acquired software interface is traversed.
In step 304, the source code file obtained in step 303 is traversed, so as to obtain all the external services in the source code file.
In step 305, all the external services acquired in step 304 are traversed, so as to acquire the interface information in all the external services.
In step 306, the interface information in the external service acquired in step 305 is acquired, and one or more of an interface name, an input parameter type, a value range of the input parameter, a default value of the input parameter, a return value name, a return value type, a value range of the return value, and a default value of the return value in the interface information are acquired.
In step 307, an automated test script for the interface is generated according to the value ranges of the key input parameters and the default values of the other input parameters, covering typical interval-range combinations of boundary values, intermediate values, and exception data. A typical interval-range combination is an arbitrary combination of boundary values, intermediate values, and exception data; the generated automated test script covers all such combinations.
In step 308, it is determined whether to complete the processing of all interface information in the external service; if yes, go to step 309; if not, go to step 305.
In step 309, it is determined whether all external services have been processed; if yes, go to step 310; if not, go to step 304.
In step 310, it is determined whether all the source code files have been processed; if yes, ending; if not, go to step 303.
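Step 307's case generation can be sketched as follows. The function names and the choice of one integer range per key parameter are illustrative assumptions; what the sketch preserves is covering boundary values, an intermediate value, and out-of-range exception data for each key parameter while other parameters keep their source-defined defaults.

```python
import itertools

def key_param_cases(lo, hi):
    """Candidate values for one key input parameter: both boundary
    values, an intermediate value, and out-of-range exception data."""
    return {
        "boundary": [lo, hi],
        "intermediate": [(lo + hi) // 2],
        "exception": [lo - 1, hi + 1],
    }

def generate_cases(key_ranges, other_defaults):
    """Cross the candidate values of all key parameters (the 'typical
    interval-range combinations' of step 307); the remaining parameters
    are filled with their source-defined defaults."""
    names = list(key_ranges)
    pools = [sum(key_param_cases(*key_ranges[n]).values(), []) for n in names]
    for combo in itertools.product(*pools):
        case = dict(other_defaults)
        case.update(zip(names, combo))
        yield case

cases = list(generate_cases({"amount": (0, 10000)}, {"currency": "CNY"}))
assert {"amount": 0, "currency": "CNY"} in cases   # a boundary-value case
assert len(cases) == 5                             # 0, 10000, 5000, -1, 10001
```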
In the embodiment of the present invention, the information of the external service interface includes an interface name, an input parameter type, a value range of an input parameter, a default value of an input parameter, a return value name, a return value type, a value range of a return value, and a default value of a return value.
The step of processing the log file of the external service interface in step 202 is shown in fig. 3:
in step 401, traversing all log files of the external service; the log files comprise real-time log files generated by executing the automatic test scripts of the external service interface and historical log files generated by historically testing the external service interface.
In step 402, traversing the interface information in the log file; the interface information comprises an interface name, an input parameter type, a value range of the input parameter, a default value of the input parameter, a return value name, a return value type, a value range of the return value and a default value of the return value.
In step 403, all the input parameter information and corresponding return value information of the interface information are analyzed and obtained.
The input parameter information comprises one or more items of input parameter names, input parameter types, value ranges of the input parameters and default values of the input parameters; the return value information comprises one or more items of return value names, return value types, value ranges of the return values and default values of the return values.
In step 404, it is judged whether the interface information obtained by parsing is consistent with the interface information collected from the source code, checking whether the names, number, and value ranges of the input parameters are reasonable; if so, go to step 406; if not, go to step 405.
Checking whether the names, number, and value ranges of the input parameters are reasonable specifically means: checking whether the input parameter names, the number of input parameters, and the value ranges of the input parameters obtained by parsing are each consistent with those in the interface information collected from the source code.
In step 405, if an input parameter obtained by parsing does not exist in the source-code input parameter list (the table summarizing the parameters defined by the source code), the parameter is filled with the default value defined in the source code. If an input parameter obtained by parsing has been removed from the source-code input parameter list, the redundant parameter is automatically eliminated and a log is recorded; redundant input parameters are previously used parameters that, as the software or interface is continually updated, no longer exist in the new software or interface information. If the value range of an input parameter obtained by parsing is unreasonable, that is, the value falls outside the preset value range of that parameter, the original value defined in the source code is filled in and a log is recorded.
In step 406, the input parameter information and the return value information obtained by the final analysis are stored in the simulation module; the input parameters obtained through the final analysis are the interface information acquired from the source code and the return value corresponding to the interface information acquired from the source code, which are directly output in step 404, or the input parameters obtained through the final analysis are the input parameter information and the return value information obtained after the input parameters are processed in step 405.
In step 407, it is determined whether processing of all interface information is completed; if yes, go to step 408; if not, go to step 402.
In step 408, it is determined whether all processing of log files of the external service is completed; if yes, ending; if not, go to step 401.
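The reconciliation of steps 404 and 405 can be sketched as below. This is an illustrative reduction under assumptions: parameters are numeric, the repair value for an out-of-range parameter is taken to be the source-defined default, and the function and field names are invented for the sketch.

```python
def reconcile_params(parsed, source_defs):
    """Reconcile input parameters parsed from a log entry against the
    source-code parameter list: fill missing parameters with source
    defaults, discard redundant ones, and repair out-of-range values,
    recording a log message for each discarded or repaired item."""
    cleaned, log = {}, []
    for name, spec in source_defs.items():
        lo, hi = spec["range"]
        if name not in parsed:
            cleaned[name] = spec["default"]      # missing: fill default
        elif not (lo <= parsed[name] <= hi):
            cleaned[name] = spec["default"]      # unreasonable range: repair
            log.append(f"out-of-range value for {name} replaced")
        else:
            cleaned[name] = parsed[name]         # consistent: keep as parsed
    for name in parsed:
        if name not in source_defs:              # redundant parameter
            log.append(f"redundant parameter {name} removed")
    return cleaned, log

defs = {"amount": {"range": (0, 10000), "default": 0}}
cleaned, log = reconcile_params({"amount": 99999, "legacy": 1}, defs)
assert cleaned == {"amount": 0}   # repaired; 'legacy' discarded
assert len(log) == 2
```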
In the embodiment of the present invention, the log file of the external service interface includes a real-time log file generated by executing an automated test script of the external service interface and a historical log file generated by historically testing the external service interface.
In the embodiment of the invention, the interface to be tested needs to call other interfaces, the simulation modules of the other interfaces are accessed for processing, if the other simulation modules have input parameters consistent with the input parameters of the interface to be tested, the return values corresponding to the input parameters in the simulation modules are directly returned, and the operation result of the interface to be tested is obtained.
The specific steps of loading all simulation modules of the external service interface in step 203 and traversing the automated test script running the interface to be tested are shown in fig. 4:
in step 501, loading all simulation modules of the external service interface; collecting input parameter information and return value information; the input parameter information comprises one or more items of input parameter names, input parameter types, value ranges of the input parameters and default values of the input parameters; the return value information comprises one or more items of return value names, return value types, value ranges of the return values and default values of the return values.
In step 502, the automated test script running the interface to be tested is traversed.
In step 503, the interface calls in the run automation script are traversed.
In step 504, querying whether a simulation module has a matched input parameter; if yes, go to step 506; if not, go to step 505.
In step 505, the simulation module returns a match failure, marking the input parameters of the match failure.
In step 506, the simulation module returns a return value corresponding to the matched input parameter, and records the operation result.
In step 507, it is determined whether all the calls of the interface to be tested are completed, if yes, the step 508 is performed, and if not, the step 503 is performed.
In step 508, whether the running of the automated test scripts of all the interfaces is finished is judged, and if yes, the process is finished; if not, go to step 502.
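The run loop of steps 501 to 508 can be condensed into a sketch like the following. The data shapes (scripts as dicts with a `calls` list, mocks as per-interface lookup tables) are assumptions made for illustration.

```python
def run_scripts(scripts, mocks):
    """Traverse the automated test scripts: for every interface call in a
    script, look the call up in the dependency's simulation module; record
    the matched return value, or mark the input parameters on a match
    failure (steps 502-506)."""
    results = []
    for script in scripts:                      # step 502: next script
        for dep, params in script["calls"]:     # step 503: next interface call
            store = mocks.get(dep, {})
            key = frozenset(params.items())
            if key in store:                    # step 504 -> 506: matched
                results.append(("ok", dep, store[key]))
            else:                               # step 504 -> 505: mark failure
                results.append(("match_failure", dep, params))
    return results

mocks = {"unionpay": {frozenset({"amount": 100}.items()): "paid"}}
scripts = [{"calls": [("unionpay", {"amount": 100}),
                      ("unionpay", {"amount": 999})]}]
out = run_scripts(scripts, mocks)
assert out[0] == ("ok", "unionpay", "paid")
assert out[1][0] == "match_failure"
```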
Example 2:
taking an example that a commodity payment interface of a website A calls a Unionpay payment interface, the embodiment of the invention provides an automatic interface testing method, which comprises the following steps:
the information acquisition unit acquires information of an external service interface in a software code base and acquires information of a commodity payment interface, wherein the commodity payment interface needs to call a UnionPay payment interface to pay; the commodity payment interface information comprises information such as the number of commodities, the amount of the commodities, the brands of the commodities and the like;
The generation module generates automated test scripts for the interface according to one or more of the input parameter name, the input parameter type, the value range of the input parameter, the default value of the input parameter, the return value name, the return value type, the value range of the return value and the default value of the return value in the commodity payment interface information. For example, according to the number of commodities, the commodity amount, the commodity brand and the corresponding amount to be paid, the generation module generates automated test scripts covering combinations such as paying 0 yuan for 1 commodity, paying 0 yuan for multiple commodities, paying 10000 yuan for 1 commodity, and paying for multiple commodities.
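One simple way to realize the combination cases described above is to take the Cartesian product of each input parameter's candidate values. The sketch below is illustrative; the parameter names and candidate value lists are assumptions, not taken from the patent:

```python
# Sketch of generating test cases over input-parameter combinations,
# in the spirit of the 0-yuan / 1-commodity, 10000-yuan examples above.
from itertools import product

def generate_cases(param_ranges):
    """Return one test case per combination of candidate parameter values."""
    names = list(param_ranges)
    return [dict(zip(names, combo)) for combo in product(*param_ranges.values())]

cases = generate_cases({
    "quantity": [1, 5],       # number of commodities (illustrative values)
    "amount":   [0, 10000],   # commodity amount in yuan (illustrative values)
})
# 2 x 2 candidate values yield 4 test cases
```

In practice the candidate values would come from the value ranges and default values extracted from the source code, rather than being hard-coded.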
The log file of the UnionPay payment interface is traversed to acquire the number of commodities, the commodity amounts, the commodity brands and the corresponding amounts to be paid, both collected historically and collected by running the automated test scripts of the interface, and this information is stored into the simulation module of the UnionPay payment interface.
During the actual test, the real UnionPay payment interface cannot be called, so the automated test of the commodity payment interface cannot be completed directly. In this case, the simulation module of the UnionPay payment interface is loaded. The simulation module has collected, from the log file, the number of commodities, the commodity amounts, the commodity brands and the corresponding amounts to be paid of the UnionPay payment interface, and it starts a simulated UnionPay payment service. When the commodity payment interface calls the UnionPay payment interface, the simulated UnionPay payment service is called instead, and it provides the simulated corresponding amount to be paid based on the matched number of commodities, commodity amount, commodity brand and other information.
The confirmation module confirms whether the amount to be paid is within the preset value range of the amount to be paid; if so, the commodity payment interface is determined to work normally, otherwise it is determined to work abnormally.
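The simulated UnionPay payment service and the confirmation check of this example can be sketched as follows. The record keys, amounts and range bounds are illustrative assumptions; the embodiment does not prescribe a concrete data layout:

```python
# Sketch of the simulated UnionPay payment service: log-derived records keyed
# on (quantity, amount, brand), plus the confirmation module's range check.

UNIONPAY_LOG_RECORDS = {
    # (quantity, commodity amount in yuan, brand) -> amount to be paid
    (1, 100, "BrandX"): 100,
    (3, 100, "BrandX"): 300,
}

def simulated_unionpay_pay(quantity, amount, brand):
    """Return the recorded amount to be paid, or None if no record matches."""
    return UNIONPAY_LOG_RECORDS.get((quantity, amount, brand))

def confirm(pay_amount, lo, hi):
    """Confirmation module: normal iff the amount falls in the preset range."""
    return pay_amount is not None and lo <= pay_amount <= hi
```

A matched call returns the recorded amount to be paid and passes the range check; an unmatched call yields no amount and is flagged as abnormal, matching the pass/fail decision of the confirmation module.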
Example 3:
An embodiment of the present invention provides an automated interface testing device, which is used to implement the automated interface testing method described in embodiment 1. As shown in fig. 5, the device comprises an information acquisition unit, a generation module, a simulation module and a confirmation module. Specifically:
the information acquisition unit is used for acquiring the information of all external service interfaces in the software code base;
the generation module is used for generating the automated test scripts of the interfaces according to the value ranges of the key input parameters and the default values of the other input parameters;
the simulation module is used for storing the input parameter information and return value information of each interface in the log files, and for providing simulation data when calls to other interfaces are involved;
and the confirmation module is used for confirming whether the running result matches the value range of the return value; if so, the interface to be tested is determined to work normally, otherwise it is determined to work abnormally.
In this embodiment of the present invention, the confirmation module is further configured to determine whether the processing of all source code files is complete, whether the processing of the information of all external service interfaces is complete, and whether the processing of the log files of all external service interfaces is complete.
In an embodiment of the present invention, the automated interface testing device further comprises an execution module, which is used for running the automated test scripts.
In an embodiment of the present invention, the automated interface testing device further comprises a management module, which is used for continuously organizing and updating the data in the simulation module.
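The composition of the device's units described above can be sketched as follows. The class name, constructor arguments and the `info` dictionary keys are illustrative assumptions; the embodiment only specifies the responsibilities of each unit, not a concrete interface:

```python
# Sketch of the device: the information acquisition unit, generation module,
# simulation module and confirmation module composed into one test driver.

class InterfaceTestDevice:
    def __init__(self, acquire_info, generate_scripts, simulate_call, confirm):
        self.acquire_info = acquire_info          # information acquisition unit
        self.generate_scripts = generate_scripts  # generation module
        self.simulate_call = simulate_call        # simulation module lookup
        self.confirm = confirm                    # confirmation module

    def run(self, interface_name):
        """Test one interface: generate cases, simulate calls, confirm results."""
        info = self.acquire_info(interface_name)
        outcomes = []
        for case in self.generate_scripts(info):
            result = self.simulate_call(interface_name, case)
            outcomes.append(self.confirm(result, info["return_range"]))
        return outcomes
```

Each unit is injected as a callable, so an execution module or management module (claims 9 and 10) could wrap `run` or refresh the simulation data without changing the composition.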
The above description covers only preferred embodiments of the present invention and is not intended to limit the invention; any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. An automated interface testing method, comprising: generating automated test scripts for all external service interfaces based on the software source code;
traversing to obtain the log files of all external service interfaces, collecting the input parameter information and return value information in the log file of each external service interface, and storing the input parameter information and return value information into the simulation module of the corresponding external service interface;
loading the input parameter information and return value information in the simulation modules of all external service interfaces, and traversing and running the automated test scripts of the interfaces to be tested, wherein when an interface to be tested needs to call other interfaces, the simulation modules of the other interfaces are accessed first for processing; obtaining the running result of the interface to be tested; and confirming whether the running result of the interface to be tested matches the value range of the return value of the interface to be tested as defined in the source code;
if so, determining that the interface to be tested works normally; otherwise, determining that the interface to be tested works abnormally.
2. The automated interface testing method of claim 1, wherein the input parameter information comprises: one or more of the input parameter name, the input parameter type, the value range of the input parameter and the default value of the input parameter;
and the return value information comprises: one or more of the return value name, the return value type, the value range of the return value and the default value of the return value.
3. The automated interface testing method of claim 1, wherein generating automated test scripts for all external service interfaces based on the software source code specifically comprises:
traversing and scanning all source code files of the external services in the software code base, parsing and extracting the information of each external service interface, archiving and storing the information of each external service interface with the interface name as the keyword, and generating the automated test scripts of the interfaces based on the value ranges of the key input parameters and the default values of the other input parameters.
4. The automated interface testing method according to claim 3, wherein the information of the external service interface comprises one or more of the interface name, the input parameter name, the input parameter type, the value range of the input parameter, the default value of the input parameter, the return value name, the return value type, the value range of the return value and the default value of the return value.
5. The automated interface testing method according to claim 1, wherein obtaining the running result of the interface to be tested specifically comprises:
when the interface to be tested needs to call other interfaces, accessing the simulation modules of the other interfaces for processing; if input parameters consistent with the input parameters of the interface to be tested exist in the simulation modules of the other interfaces, directly returning the return values corresponding to those input parameters from the simulation modules, so as to obtain the running result of the interface to be tested.
6. The automated interface testing method according to claim 1, wherein the log file of the external service interface comprises a real-time log file generated by running the automated test script of the external service interface and a historical log file generated by historical testing of the external service interface.
7. An automated interface testing device, wherein the device is used for implementing the automated interface testing method of any one of claims 1-6, and comprises an information acquisition unit, a generation module, a simulation module and a confirmation module, wherein:
the information acquisition unit is used for acquiring the information of all external service interfaces in the software code base;
the generation module is used for generating the automated test scripts of the interfaces according to the value ranges of the key input parameters and the default values of the other input parameters;
the simulation module is used for storing the input parameter information and return value information of each interface in the log files, and for providing simulation data when calls to other interfaces are involved;
and the confirmation module is used for confirming whether the running result matches the value range of the return value; if so, the interface to be tested is determined to work normally, otherwise it is determined to work abnormally.
8. The automated interface testing device of claim 7, wherein the confirmation module is further configured to determine one or more of: whether the processing of all source code files is complete, whether the processing of the information of all external service interfaces is complete, and whether the processing of the log files of all external service interfaces is complete.
9. The automated interface testing device of claim 7, further comprising an execution module, which is used for running the automated test scripts.
10. The automated interface testing device of claim 7, further comprising a management module, which is used for continuously organizing and updating the data in the simulation module.
CN202110191047.0A 2021-02-20 2021-02-20 Automatic interface testing method and device Active CN113010413B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110191047.0A CN113010413B (en) 2021-02-20 2021-02-20 Automatic interface testing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110191047.0A CN113010413B (en) 2021-02-20 2021-02-20 Automatic interface testing method and device

Publications (2)

Publication Number Publication Date
CN113010413A true CN113010413A (en) 2021-06-22
CN113010413B CN113010413B (en) 2023-04-25

Family

ID=76403700

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110191047.0A Active CN113010413B (en) 2021-02-20 2021-02-20 Automatic interface testing method and device

Country Status (1)

Country Link
CN (1) CN113010413B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104360920A (en) * 2014-12-02 2015-02-18 微梦创科网络科技(中国)有限公司 Automatic testing method and device for interface
CN108241543A (en) * 2016-12-30 2018-07-03 深圳壹账通智能科技有限公司 Method, service server and the system that business operation breakpoint performs
CN108519948A (en) * 2018-04-04 2018-09-11 上海携程商务有限公司 The automatic interface testing method and system of daily record driving
CN109831440A (en) * 2019-02-21 2019-05-31 中国联合网络通信集团有限公司 Interface packets conversion method, device and storage medium
CN110727596A (en) * 2019-10-15 2020-01-24 北京弘远博学科技有限公司 APP interface automatic testing method
CN111782613A (en) * 2020-06-23 2020-10-16 南昌航空大学 Method for optimizing operation efficiency of model integration platform
CN111782546A (en) * 2020-07-23 2020-10-16 北京斗米优聘科技发展有限公司 Automatic interface testing method and device based on machine learning
CN112052172A (en) * 2020-09-04 2020-12-08 云账户技术(天津)有限公司 Rapid testing method and device for third-party channel and electronic equipment
CN112084113A (en) * 2020-09-16 2020-12-15 上海创景信息科技有限公司 Configurable automatic test method and system based on embedded simulation verification software
CN112306855A (en) * 2019-08-02 2021-02-02 北大方正集团有限公司 Interface automation test method, device, terminal and storage medium

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220197782A1 (en) * 2019-09-16 2022-06-23 Shanghai Ncatest Technologies Co., Ltd. Offline debugging method
US11789851B2 (en) * 2019-09-16 2023-10-17 Shanghai Ncatest Technologies Co., Ltd. Offline debugging method
CN113760759A (en) * 2021-09-02 2021-12-07 广东睿住智能科技有限公司 Debugging method, debugging device, electronic device, and storage medium
CN113987980A (en) * 2021-09-23 2022-01-28 北京连山科技股份有限公司 Popular simulation implementation method for physical PHD (graphical user device)
CN113987980B (en) * 2021-09-23 2022-05-20 北京连山科技股份有限公司 Popular simulation implementation method for physical PHD (graphical user device)
CN114925516A (en) * 2022-05-16 2022-08-19 北京世冠金洋科技发展有限公司 Method and device for automatic modeling and simulation
CN114925516B (en) * 2022-05-16 2024-01-26 北京世冠金洋科技发展有限公司 Automatic modeling and simulating method and device

Also Published As

Publication number Publication date
CN113010413B (en) 2023-04-25

Similar Documents

Publication Publication Date Title
CN113010413B (en) Automatic interface testing method and device
CN102122265B (en) System and method for verifying computer software test results
US20080222608A1 (en) Method and system for managing software testing
CN111522728A (en) Method for generating automatic test case, electronic device and readable storage medium
CN108897686B (en) Full-entry automatic testing method and device
CN108509344B (en) Daily cutting batch test method, equipment and readable storage medium
CN112817843B (en) Project management method and system
CN111258881B (en) Intelligent test system for workflow test
CN112363953B (en) Interface test case generation method and system based on crawler technology and rule engine
CN110188036A (en) A kind of method for testing software and device
CN115328784A (en) Agile interface-oriented automatic testing method and system
CN113886262A (en) Software automation test method and device, computer equipment and storage medium
CN113220597B (en) Test method, test device, electronic equipment and storage medium
CN114490413A (en) Test data preparation method and device, storage medium and electronic equipment
CN114138670A (en) Method based on interface automation test and function, performance and safety test fusion
CN117370217A (en) Automatic interface test result generation method based on python
CN115629956A (en) Software defect management method and system based on interface automatic test
CN111198798B (en) Service stability measuring method and device
CN111813665A (en) Big data platform interface data testing method and system based on python
CN113609698A (en) Process reliability analysis method and system based on process fault database
CN112905438A (en) Automatic testing method and device
CN116340187B (en) Rule engine migration test method and device, electronic equipment and storage medium
CN117331847B (en) Automatic test method and system supporting graphic interface
CN116303104B (en) Automated process defect screening management method, system and readable storage medium
CN116010272A (en) Multi-API (application program interface) automatic testing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant