CN112905459A - Service interface testing method and device, electronic equipment and storage medium - Google Patents

Service interface testing method and device, electronic equipment and storage medium

Info

Publication number
CN112905459A
CN112905459A
Authority
CN
China
Prior art keywords
test
interface
case data
information
generating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110170956.6A
Other languages
Chinese (zh)
Other versions
CN112905459B (en)
Inventor
罗秉安
连煜伟
邹大卫
蔡晓惠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC filed Critical Industrial and Commercial Bank of China Ltd ICBC
Priority to CN202110170956.6A
Publication of CN112905459A
Application granted
Publication of CN112905459B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3664 Environments for testing or debugging software
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention discloses a service interface testing method and apparatus, an electronic device, and a storage medium, applicable to the field of finance. The method comprises: acquiring source code of an interface to be tested, and generating a test script according to a test item and the interface source code, wherein the interface source code comprises an interface call function; generating IO information of the interface to be tested according to the interface call function and the test item; generating test case data according to the IO information based on a pre-constructed case data generation model, wherein the test case data comprises assertion information; testing the interface to be tested according to the test case data and the test script; receiving test return information from the interface to be tested, and judging the test run result according to the test return information and the assertion information; and, in response to the test run result indicating a failed test, analyzing the test return information and re-initiating the test run on the interface to be tested according to the analysis result. The invention can improve the efficiency of service interface testing.

Description

Service interface testing method and device, electronic equipment and storage medium
Technical Field
The invention relates to the field of financial technology, and in particular to a service interface testing method and apparatus, an electronic device, and a storage medium.
Background
At present, an open platform exposes many types of service interfaces, but their invocation modes differ: HTTP (a request-response protocol) and RPC (Remote Procedure Call) are both common, the message structures of the various interfaces differ, the entry functions of the services to be tested also differ, and some compensation mechanisms require deliberately throwing exceptions into the called functions in order to test a transaction. Initiating business transaction tests purely by hand is therefore very time-consuming and labor-intensive. Existing automated tests are generally interface-specific: a dedicated test method is written for each interface type, a unified approach is lacking, and the cost of writing test cases is high. At the same time, instability of the test environment and test data often has a large negative effect on automated testing, and the cost of analyzing automated test results is excessively high.
Disclosure of Invention
In view of the above, the present invention provides a service interface testing method and apparatus, an electronic device, and a storage medium to solve at least one of the above-mentioned problems.
According to a first aspect of the present invention, there is provided a service interface testing method, the method comprising:
acquiring source code of an interface to be tested, and generating a test script according to a test item and the interface source code, wherein the interface source code comprises: an interface call function;
generating input/output (IO) information of the interface to be tested according to the interface call function and the test item, wherein the IO information comprises: input field information and output field information;
generating test case data according to the IO information based on a pre-constructed case data generation model, wherein the test case data comprises: assertion information, and the case data generation model is constructed based on historical interfaces to be tested, historical test items, and historical test case data;
testing the interface to be tested according to the test case data and the test script;
receiving test return information from the interface to be tested, and judging the test run result according to the test return information and the assertion information;
and, in response to the test run result indicating a failed test, analyzing the test return information and re-initiating the test run on the interface to be tested according to the analysis result.
According to a second aspect of the present invention, there is provided a service interface testing apparatus, the apparatus comprising:
a source code obtaining unit, configured to obtain source code of an interface to be tested, wherein the interface source code comprises: an interface call function;
a test script generating unit, configured to generate a test script according to a test item and the interface source code;
an IO information generating unit, configured to generate input/output (IO) information of the interface to be tested according to the interface call function and the test item, wherein the IO information comprises: input field information and output field information;
a test case data generating unit, configured to generate test case data according to the IO information based on a pre-constructed case data generation model, wherein the test case data comprises: assertion information, and the case data generation model is constructed based on historical interfaces to be tested, historical test items, and historical test case data;
a testing unit, configured to perform a test run on the interface to be tested according to the test case data and the test script;
a test result unit, configured to receive test return information from the interface to be tested and judge the test run result according to the test return information and the assertion information;
and an analysis unit, configured to, in response to the test run result indicating a failed test, analyze the test return information and re-initiate the test run on the interface to be tested according to the analysis result.
According to a third aspect of the present invention, there is provided an electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the above method when executing the program.
According to a fourth aspect of the present invention, there is provided a computer-readable storage medium on which a computer program is stored, the program implementing the steps of the above method when executed by a processor.
According to the above technical solution, a test script is generated from the acquired source code of the interface to be tested and the test item; IO information of the interface to be tested is generated from the interface call function and the test item; test case data is generated from the IO information based on the pre-constructed case data generation model; and a test run is then performed on the interface to be tested using the test case data and the test script. When a test fails, the test return information can be analyzed based on the assertion information, and the test run can be re-initiated on the interface according to the analysis result. Compared with the prior art, this solution makes test cases easier to write thanks to the case data generation model; at the same time, by analyzing the test return information against the assertion information and re-initiating the test run according to the analysis result, it achieves automatic repair of test data and environment and reduces the cost of analyzing test results, thereby improving the working efficiency of automated service interface testing.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a flow chart of a service interface testing method according to an embodiment of the present invention;
FIG. 2 is a block diagram of a service interface test apparatus according to an embodiment of the present invention;
FIG. 3 is a block diagram of an exemplary architecture of a platform services interface test system according to an embodiment of the present invention;
fig. 4 is a block diagram of the structure of the input-output field generation module 21 according to an embodiment of the present invention;
FIG. 5 is a flowchart of the operation of a platform services interface test system according to an embodiment of the present invention;
fig. 6 is a specific flowchart of analyzing the IO field of the interface of step S3 according to the embodiment of the present invention;
fig. 7 is a schematic block diagram of a system configuration of an electronic apparatus 600 according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are some, but not all, of the embodiments of the present invention. All other embodiments that a person skilled in the art can derive from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
Because existing automated testing of service interfaces is interface-specific, with a dedicated test method for each type of service interface, the cost of writing test cases is high; at the same time, instability of the test environment and test data often has a large negative effect on the tests, the cost of analyzing test results is excessively high, and the working efficiency of automated service interface testing is reduced. Based on this, an embodiment of the present invention provides a service interface testing scheme that, combined with machine learning technology, can make automated test cases easier to write, supports automatic repair of test data and environment, and reduces the cost of analyzing test results, thereby improving the working efficiency of automated service interface testing. Embodiments of the invention are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a service interface testing method according to an embodiment of the present invention, as shown in fig. 1, the method includes:
Step 101, obtaining source code of an interface to be tested, and generating a test script according to a test item and the interface source code, wherein the interface source code comprises: an interface call function.
Step 102, generating input/output (IO) information of the interface to be tested according to the interface call function and the test item, where the IO information includes: input field information and output field information.
Step 103, generating test case data according to the IO information based on a pre-constructed case data generation model, wherein the test case data includes: assertion information, and the case data generation model is constructed based on historical interfaces to be tested, historical test items, and historical test case data. Assertion information here refers to the expected result of a test item.
In one embodiment, assignment operations may be performed on the Input fields and Output fields according to the IO information, based on a pre-constructed case data generation model. Specifically, the attributes of each input and output field may be obtained from the IO information, and the fields may then be assigned values according to those attributes based on the pre-constructed case data generation model.
Taking the Input fields as an example, the member (i.e., field) variables of the Input type are obtained first, and then all member attributes are analyzed. If a member variable is a base type such as int, boolean, Date, or String, it is put into a MAP (a map structure); if the member is a LIST or a MAP, the corresponding LIST or MAP is constructed first; if the member is an entity class, it is traversed recursively from the top in a nested loop, finally yielding the MAP that defines the complete set of input fields. The attribute definition of each field is obtained at the same time: when annotations are used to define whether a field is mandatory, its maximum length, and so on, the field's annotations and parameters can be parsed to obtain the field attributes. After the complete input field attributes of the interface are obtained, a default value for each field is generated automatically based on the pre-constructed case data generation model, completing the assignment operation.
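As a rough illustration of the traversal described above, the sketch below walks a hypothetical Input class with JAVA reflection and builds the field-definition MAP. The class name and fields are invented for the example; the patent does not disclose its implementation at this level of detail.

```java
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.Date;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// TransferInput is an invented Input class used only to drive the example.
class TransferInput {
    String workDate;
    int amount;
    List<String> mediumIds;
}

public class InputFieldScanner {
    // Recursively walk the members of an Input class and build the MAP that
    // defines the complete set of input fields, as described above.
    static Map<String, Object> scan(Class<?> type) {
        Map<String, Object> map = new LinkedHashMap<>();
        for (Field f : type.getDeclaredFields()) {
            Class<?> t = f.getType();
            if (t.isPrimitive() || t == String.class || t == Date.class
                    || Number.class.isAssignableFrom(t) || t == Boolean.class) {
                map.put(f.getName(), null);                  // base variable: default value filled in later
            } else if (List.class.isAssignableFrom(t)) {
                map.put(f.getName(), new ArrayList<>());     // LIST member: construct the LIST first
            } else if (Map.class.isAssignableFrom(t)) {
                map.put(f.getName(), new LinkedHashMap<>()); // MAP member: construct the MAP first
            } else {
                map.put(f.getName(), scan(t));               // entity class: recurse (no cycle guard in this sketch)
            }
        }
        return map;
    }

    public static void main(String[] args) {
        System.out.println(scan(TransferInput.class).keySet());
    }
}
```

In a full implementation, the null placeholders would then be filled with the default values produced by the case data generation model.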
In one embodiment, when the interface call function is a MAP call function, the IO information may be generated according to the valid fields of the output function and input function involved in the MAP call, together with the test item. That is, when the interface has no program-defined Input and Output classes but instead calls and returns directly with a MAP, static code analysis is used to find all fields read from the input-parameter MAP via the GET function in the program, yielding all effective input fields. A default value for each field is then generated automatically based on the pre-constructed case data generation model, completing the assignment operation.
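A minimal sketch of this static-analysis step, assuming the input-parameter MAP is named input in the scanned source and read with get("..."). Real static analysis would work on the parsed syntax tree rather than raw text, so the regular expression here is only illustrative.

```java
import java.util.LinkedHashSet;
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MapGetScanner {
    // Collect every field name read via input.get("...") from interface
    // source text, a rough stand-in for the static-analysis step above.
    static Set<String> effectiveInputFields(String source) {
        Set<String> fields = new LinkedHashSet<>();
        Matcher m = Pattern.compile("input\\.get\\(\"(\\w+)\"\\)").matcher(source);
        while (m.find()) {
            fields.add(m.group(1));
        }
        return fields;
    }

    public static void main(String[] args) {
        String src = "String date = (String) input.get(\"work_date\");\n"
                   + "String card = (String) input.get(\"medium_id\");";
        System.out.println(effectiveInputFields(src));
    }
}
```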
After the assignment operation is completed, test case data can be generated from the assigned input and output fields.
Step 104, testing the interface to be tested according to the test case data and the test script.
Specifically, a job request message may be generated from the test case data, and the interface to be tested may then be tested according to the job request message and the test script.
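The packing of case data into a job request message can be sketched as below. The field names are taken from the case segment in Table 1 further on; the patent does not specify the message format, so a flat JSON body is assumed for illustration.

```java
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

public class RequestMessageBuilder {
    // Pack assigned case-data fields into a minimal JSON request body.
    // Illustrative only: a real implementation would use a JSON library
    // and handle escaping and nested structures.
    static String toJson(Map<String, String> caseData) {
        StringBuilder sb = new StringBuilder("{");
        Iterator<Map.Entry<String, String>> it = caseData.entrySet().iterator();
        while (it.hasNext()) {
            Map.Entry<String, String> e = it.next();
            sb.append('"').append(e.getKey()).append("\":\"").append(e.getValue()).append('"');
            if (it.hasNext()) {
                sb.append(',');
            }
        }
        return sb.append('}').toString();
    }

    public static void main(String[] args) {
        Map<String, String> data = new LinkedHashMap<>();
        data.put("work_date", "20210207");
        data.put("medium_id", "6222020200112233");
        System.out.println(toJson(data));  // {"work_date":"20210207","medium_id":"6222020200112233"}
    }
}
```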
Step 105, receiving test return information from the interface to be tested, and judging the test run result according to the test return information and the assertion information.
Step 106, in response to the test run result indicating a failed test, analyzing the test return information and re-initiating the test run on the interface to be tested according to the analysis result.
Specifically, the test return information may first be analyzed to generate an analysis result; new test case data is then generated from the analysis result based on the case data generation model and the test items, and the test run is re-initiated on the interface to be tested with the new test case data. That is, steps 103-104 above are executed again.
In one embodiment, the case data generation model can be updated according to the test return information, the analysis result, and the new test case data, which improves the accuracy of the case data the model generates.
Compared with the prior art, the embodiment of the invention makes test cases easier to write thanks to the case data generation model; at the same time, by analyzing the test return information based on the assertion information and re-initiating the test run on the interface according to the analysis result, it achieves automatic repair of test data and environment and reduces the cost of analyzing test results, so the working efficiency of automated service interface testing can be improved.
To further explain the present invention, a detailed flow of the service interface test according to an embodiment of the invention is given below:
1. Script generation: the type and calling mode of the interface to be tested are obtained automatically through static code analysis, and a corresponding test script in JAVA format is generated.
2. Automatic generation of test case data: a test case in EXCEL format is generated automatically, and contents such as the case's test data and assertions are supplemented automatically using the case data generation model of a machine learning platform.
Specifically, the input and output fields of the interface are first analyzed and obtained automatically, the MAP definitions of the interface's input and output messages are generated, and a corresponding EXCEL-format test case template is generated automatically from the input and output fields and their attributes.
Then, with the machine learning platform, the built-in variables or most common data values corresponding to the fields are analyzed automatically according to the case data generation model, yielding the best-matching test data; initial values are given to the input and output fields of the test case, case assertions are generated automatically, and the test return result is judged.
The case data generation model here may be a Generalized Linear Model (GLM), which can be constructed and trained from historical test case data, the Input of test initiation requests, and the Output of test return results.
The test case data specifically includes the service name and function name of the interface to be tested in this run, and the assignment definition of each field. An assignment may be a defined built-in variable such as #{WORK_DATE} (the working date), or actual data. These test case data fields are stored in EXCEL format; in the test case segment shown in Table 1 below, the field work_date is assigned the built-in variable #{WORK_DATE}, while the field id_code is assigned an actual identity number.
Parameter name    Default assignment
work_date         #{WORK_DATE}
real_time         #{WORK_TIME}
medium_id         {medium_id}
password          psw({medium_id},112233)
id_code           450521199212010029
TABLE 1
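The built-in variables in Table 1, such as #{WORK_DATE}, have to be resolved to concrete values before a request message can be built. A minimal substitution sketch, assuming a simple #{NAME} syntax and an in-memory variable table (both assumptions of the example, not details given in the source):

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class VariableResolver {
    private static final Pattern VAR = Pattern.compile("#\\{(\\w+)\\}");

    // Replace each #{NAME} built-in variable with its value from the variable
    // table; plain values and unknown variables pass through unchanged.
    static String resolve(String value, Map<String, String> vars) {
        Matcher m = VAR.matcher(value);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            String replacement = vars.getOrDefault(m.group(1), m.group(0));
            m.appendReplacement(out, Matcher.quoteReplacement(replacement));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        Map<String, String> vars = Map.of("WORK_DATE", "20210207");
        System.out.println(resolve("#{WORK_DATE}", vars));  // 20210207
    }
}
```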
The test case data also includes a test assertion definition; an assertion is an expected judgment of the result of the test transaction. As shown in the test assertion definition segment of Table 2 below, MEDIUM_NO is expected to return the account number actually used for this transaction, and MEDIUM_STATUS is expected to return 3.
Domain name       Expected value
MEDIUM_NO         {assertion 3.MEDIUM_NO}
MEDIUM_STATUS     3
TABLE 2
The Input of the test request and the Output of the test result are both actual data produced during the test. A JSON message segment of the transaction test initiation request Input follows, containing actual transaction data such as the serial number, card number, and password. The Output message of the transaction return result generally includes fields such as RETURN_CODE and RETURN_MSG.
(The JSON message segment appears as an image in the source and is not reproduced here.)
The machine learning platform automatically analyzes the service's test case according to the case data generation model. The best-matching test data for a field may be a built-in variable: for a transfer transaction, for example, analyzing the work_date field yields the built-in variable #{WORK_DATE} as the best-matching assignment. It may also be the most common data value among successful transactions: for the medium_no field, for example, the best-matching card number test data is obtained. The test case fields are given initial values accordingly. Similarly, the case data generation model is used to generate case assertions automatically and judge the test return result.
3. Script execution: the tester can further supplement and enrich the test data based on the generated test case and then initiate script test execution; the case data is packed automatically into transaction request messages, and the corresponding transaction request calls (over HTTP, RPC, or other modes) are initiated automatically to the interface to be tested according to the JAVA test script.
4. Assertion judgment: the transaction return information is received, the value of each corresponding field in the return value is parsed out according to the definition of the returned Output fields, and each field value is compared with the corresponding assertion. For example, the returned message Output is parsed for the specific return value of the field MEDIUM_STATUS, which is compared with the assertion that MEDIUM_STATUS is expected to be 3; when the comparison matches the assertion, the transaction is judged successful.
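The comparison step can be sketched as below, with the returned Output and the assertion definition both reduced to field/value maps as in Table 2. This is an illustration, not the patent's actual code; built-in assertion variables like {assertion 3.MEDIUM_NO} would need resolution first.

```java
import java.util.Collections;
import java.util.Map;
import java.util.Objects;

public class AssertionChecker {
    // Judge the test run result: every asserted field must match the value
    // parsed from the returned Output message.
    static boolean passes(Map<String, String> expected, Map<String, String> returned) {
        for (Map.Entry<String, String> e : expected.entrySet()) {
            if (!Objects.equals(e.getValue(), returned.get(e.getKey()))) {
                return false;  // one mismatched assertion fails the whole case
            }
        }
        return true;
    }

    public static void main(String[] args) {
        Map<String, String> asserted = Collections.singletonMap("MEDIUM_STATUS", "3");
        System.out.println(passes(asserted, Collections.singletonMap("MEDIUM_STATUS", "3")));  // true
    }
}
```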
5. Test self-adaptation: for transactions that fail or report errors during the test, the machine learning platform learns from the test process data to determine why the current test failed. If it is a data problem, the test case data is modified automatically; after the test environment is checked, the transaction is re-initiated automatically and verified again, yielding an automated test result in a stable state. At the same time, the test process data is fed to the machine learning platform's case data generation model for learning and training.
In one embodiment, suppose the test item is a withdrawal transaction: if RETURN_CODE is "13001246" and RETURN_MSG is "account balance is insufficient", the analysis determines that the test failed because the account data of the current transaction cannot satisfy it, and the account number medium_no in the case data is automatically changed to the next best-matching account number. If the test reports an error and returns "No provider available", the distributed service environment is judged to be abnormal; the automatic environment repair service is called, and after the test environment passes its check, the transaction is re-initiated automatically and verified again, yielding an automated test result in a stable state. For failures judged to be caused by environment problems that cannot be repaired automatically, or by other unknown reasons where the assertion cannot be satisfied without a service error, the test result is automatically marked as skipped, which avoids the interference of error messages caused by environment problems.
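The failure-classification logic of this embodiment can be sketched as a simple decision, using only the two example cases named above plus the skip fallback. In the patent the classification comes from the machine learning platform, not hard-coded rules, so the codes and messages here are purely illustrative.

```java
public class FailureClassifier {
    enum Action { FIX_DATA, REPAIR_ENV, SKIP }

    // Decide the follow-up action for a failed run, mirroring the examples
    // above; the return codes and messages are illustrative only.
    static Action classify(String returnCode, String returnMsg) {
        if ("13001246".equals(returnCode)) {
            return Action.FIX_DATA;   // known data problem, e.g. insufficient account balance
        }
        if (returnMsg != null && returnMsg.contains("No provider available")) {
            return Action.REPAIR_ENV; // distributed service environment abnormal
        }
        return Action.SKIP;           // unknown or unrepairable: list the result as skipped
    }

    public static void main(String[] args) {
        System.out.println(classify("13001246", "account balance is insufficient"));  // FIX_DATA
    }
}
```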
The embodiment of the invention can automatically identify the interface to be tested and generate the automated test script, adapting to different calling method flows; at the same time, it uses the machine learning platform to generate test case data automatically, reducing the cost of writing test cases, and it automatically corrects problematic test case data with the machine learning platform during testing, improving the success rate of automated tests.
Based on a similar inventive concept, an embodiment of the present invention further provides a service interface testing apparatus, which may preferably be used to implement the flow of the above service interface testing method.
Fig. 2 is a block diagram of the service interface testing apparatus. As shown in Fig. 2, the apparatus includes: a source code obtaining unit 201, a test script generating unit 202, an IO information generating unit 203, a test case data generating unit 204, a testing unit 205, a test result unit 206, and an analysis unit 207, wherein:
The source code obtaining unit 201 is configured to obtain source code of an interface to be tested, where the interface source code includes: an interface call function.
The test script generating unit 202 is configured to generate a test script according to the test item and the interface source code.
The IO information generating unit 203 is configured to generate input/output (IO) information of the interface to be tested according to the interface call function and the test item, where the IO information includes: input field information and output field information.
In an embodiment, when the interface call function is a MAP call function, the IO information generating unit is specifically configured to generate the IO information according to the valid fields of the output and input functions involved in the MAP call, together with the test items.
The test case data generating unit 204 is configured to generate test case data according to the IO information based on a pre-constructed case data generation model, where the test case data includes: assertion information, and the case data generation model is constructed based on historical interfaces to be tested, historical test items, and historical test case data.
In one embodiment, the test case data generating unit 204 includes an assignment module and a test case data generation module, wherein:
the assignment module is configured to perform assignment operations on the input and output fields according to the IO information, based on the pre-constructed case data generation model;
and the test case data generation module is configured to generate test case data from the assigned input and output fields.
The assignment module includes a field attribute acquisition submodule and an assignment submodule, wherein:
the field attribute acquisition submodule is configured to obtain the attributes of each input and output field according to the IO information;
and the assignment submodule is configured to assign a value to each field according to its attributes, based on the pre-constructed case data generation model.
The testing unit 205 is configured to perform a test run on the interface to be tested according to the test case data and the test script.
Specifically, the testing unit 205 includes a message generation module and a test module, wherein:
the message generation module is configured to generate a job request message from the test case data;
and the test module is configured to perform a test run on the interface to be tested according to the job request message and the test script.
The test result unit 206 is configured to receive test return information from the interface to be tested and judge the test run result according to the test return information and the assertion information.
The analysis unit 207 is configured to, in response to the test run result indicating a failed test, analyze the test return information and re-initiate the test run on the interface to be tested according to the analysis result.
In one embodiment, the analysis unit 207 includes an analysis module and a retest module, wherein:
the analysis module is configured to analyze the test return information and generate an analysis result;
and the retest module is configured to generate new test case data from the analysis result based on the case data generation model and the test items, and to re-initiate the test run on the interface to be tested with the new test case data.
In practice, the above apparatus further includes an update unit configured to update the case data generation model according to the test return information, the analysis result, and the new test case data.
For the specific execution processes of the above units, modules, and sub-modules, reference may be made to the description in the foregoing method embodiment; details are not repeated here.
In practice, the above units, modules, and sub-modules may be combined or deployed individually; the present invention is not limited in this respect.
For a better understanding of the present invention, a specific embodiment is given below, taking the platform service interface of the Dubbo framework as an example.
In this embodiment, a test system for a platform service interface is provided. Fig. 3 is a block diagram of an example structure of the system; as shown in Fig. 3, the system includes: a script generation device 1, a case generation device 2, a script execution device 3, an assertion verification device 4, and a test self-adaptation device 5, each described below.
(1) Script generation device 1
The script generation device automatically obtains the type and calling mode of the interface through static code analysis and generates a corresponding JAVA-format test script template. Specifically, the script generation device 1 includes an interface source code analysis module 11 and a JAVA script generation module 12, wherein:
the interface source code analysis module 11: scans and records the program files of the service interfaces, and automatically acquires the names and call-up functions of all service interfaces under the project to be tested by means of static code analysis. For the platform service interface of the Dubbo framework, all platform service interfaces are generally recorded in the /META-INF/dubbo/consumer.xml file. The interface source code analysis module parses and scans the consumer.xml file to obtain all dubbo:reference nodes, then parses each node to obtain its id (identification) and interface values, and then obtains all function names of the interface by calling getDeclaredMethods (a reflection function) on the interface program name using the JAVA reflection mechanism.
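As a rough illustration of this flow, the sketch below parses a consumer.xml fragment for dubbo:reference nodes and lists an interface's function names via reflection. The XML fragment and the use of java.lang.Comparable as a stand-in interface are illustrative assumptions; a real consumer.xml would list the project's own service interfaces.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.*;
import java.io.ByteArrayInputStream;
import java.lang.reflect.Method;
import java.nio.charset.StandardCharsets;
import java.util.*;

public class InterfaceSourceScanner {
    // Extract (id, interface) pairs from every <dubbo:reference> node in consumer.xml.
    static Map<String, String> scanReferences(String consumerXml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(consumerXml.getBytes(StandardCharsets.UTF_8)));
        Map<String, String> refs = new LinkedHashMap<>();
        NodeList nodes = doc.getElementsByTagName("dubbo:reference");
        for (int i = 0; i < nodes.getLength(); i++) {
            Element e = (Element) nodes.item(i);
            refs.put(e.getAttribute("id"), e.getAttribute("interface"));
        }
        return refs;
    }

    // List the function names of an interface via getDeclaredMethods.
    static List<String> functionNames(String interfaceClassName) throws Exception {
        List<String> names = new ArrayList<>();
        for (Method m : Class.forName(interfaceClassName).getDeclaredMethods()) {
            names.add(m.getName());
        }
        return names;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<beans xmlns:dubbo=\"http://dubbo.apache.org/schema/dubbo\">"
                + "<dubbo:reference id=\"debitCardManage\" interface=\"java.lang.Comparable\"/>"
                + "</beans>";
        System.out.println(scanReferences(xml));                    // {debitCardManage=java.lang.Comparable}
        System.out.println(functionNames("java.lang.Comparable")); // [compareTo]
    }
}
```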
The JAVA script generation module 12: automatically generates an automated test script in JAVA program format based on the result obtained by the interface source code analysis module 11, producing a corresponding JAVA test script for each function of the interface program under test. For example, if the interface program under test is named DebitCardManage.java and has two functions FunA and FunB, two JAVA test scripts named DebitCardManageFunATest.java and DebitCardManageFunBTest.java are generated by default. Each JAVA test script can load test data based on the TestNG test framework, supports pre-processing and post-processing such as @BeforeTest and @AfterTest, and automatically generates templates for the key test flows, including data preparation, environment backup, call-up test, environment recovery, and test assertion. The template pattern is as follows:
[Figure: generated JAVA test script template]
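The key-flow template described above can be sketched in plain Java as follows. This is a simplified stand-in for the generated script: the real template would use TestNG annotations (@BeforeTest, @Test, @AfterTest) and real interface calls, and the class and method names here are illustrative.

```java
import java.util.*;

// Hypothetical sketch of a generated test script for DebitCardManage.FunA.
public class DebitCardManageFunATest {
    static List<String> trace = new ArrayList<>();

    static void dataPreparation()    { trace.add("dataPreparation"); }    // load EXCEL case data
    static void environmentBackup()  { trace.add("environmentBackup"); }  // snapshot data/environment
    static void invokeTest()         { trace.add("invokeTest"); }         // call FunA of the interface under test
    static void environmentRestore() { trace.add("environmentRestore"); } // roll back data/environment
    static void testAssertion()      { trace.add("testAssertion"); }      // built-in plus custom assertions

    // Run the key test flow in the order the template generates it.
    public static void run() {
        trace.clear();
        dataPreparation();
        environmentBackup();
        invokeTest();
        environmentRestore();
        testAssertion();
    }

    public static void main(String[] args) {
        run();
        System.out.println(trace);
    }
}
```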
(2) case generation device 2
The case generation device obtains the input and output fields of the interface by means of static code analysis, obtains the input and output message MAP definitions of the interface, automatically generates a corresponding test case in EXCEL format according to the input and output fields and their attributes, fills in default case data, and automatically supplements cases according to grammar paths. Specifically, the case generation device 2 includes: an input/output field generation module 21 and a default case data generation module 22, wherein:
input/output field generation module 21: automatically analyzes and acquires the input and output fields of the interface and generates the complete field definitions of the interface's input and output. The module automatically adapts to various interfaces for input and output interaction, including interfaces with Input and Output program definition classes, and interfaces that have no such definition classes but are called directly with a MAP (map); it obtains by analysis the complete definitions of all input and output fields of the interface function, including attributes such as the field name, type, input/output necessity, and maximum length.
In one embodiment, the input-output field generation module 21 includes the modules shown in FIG. 4: I/O definition program class analysis module 211, MAP program class analysis module 212, wherein:
the I/O definition program class analysis module 211: this module handles the case where the interface program class under test has Input and Output program definitions. For example, when the interface program DebitCardManage.java has a corresponding input definition class DebitCardManageInput.java and output definition class DebitCardManageOutput.java, the module analyzes the input and output definition classes by JAVA reflection to obtain all input and output fields and their specific attributes, and records them in Map format.
Taking the Input definition class as an example, the overall flow is as follows: first acquire the member variables of the Input class using getFields, then analyze all member attributes. If a member variable is a basic variable such as int, boolean, Date, or String, put it into the Map; if the member is a List or a Map, construct the corresponding List or Map first; if the member is an entity class, call the flow again from the beginning in a nested loop, finally obtaining the Map of the complete input field definition. At the same time, acquire the attribute definitions of each field: if the maximum length and necessity of a field are defined by annotations, parse the field's annotations and parameters to obtain the field attributes. Similarly, through the above flow, the complete field definition of the Output definition class can also be obtained.
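The recursive traversal described above might look roughly like this. The input-definition classes are hypothetical, and annotation handling for maximum length and necessity is omitted for brevity.

```java
import java.lang.reflect.Field;
import java.util.*;

public class InputFieldAnalyzer {
    // Types treated as "basic" leaves of the traversal.
    static final Set<Class<?>> BASIC = new HashSet<>(Arrays.asList(
            int.class, Integer.class, boolean.class, Boolean.class,
            Date.class, String.class));

    // Recursively flatten a definition class into a field-name -> type-name map.
    static Map<String, Object> analyze(Class<?> clazz) {
        Map<String, Object> result = new LinkedHashMap<>();
        for (Field f : clazz.getDeclaredFields()) {
            Class<?> t = f.getType();
            if (BASIC.contains(t)) {
                result.put(f.getName(), t.getSimpleName());      // basic variable: record directly
            } else if (List.class.isAssignableFrom(t)) {
                result.put(f.getName(), new ArrayList<>());      // container: build a List layer
            } else if (Map.class.isAssignableFrom(t)) {
                result.put(f.getName(), new LinkedHashMap<>());  // container: build a Map layer
            } else {
                result.put(f.getName(), analyze(t));             // entity class: nested recursive call
            }
        }
        return result;
    }

    // Hypothetical input-definition classes for illustration.
    static class IdInfo { int idType; String idCode; }
    static class DebitCardManageFunInput { String tranSerno; IdInfo info; }

    public static void main(String[] args) {
        System.out.println(analyze(DebitCardManageFunInput.class));
    }
}
```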
MAP program class analysis module 212: this module handles the case where the interface program has no Input and Output program definition classes but is called and returns directly with a MAP; it uses static code analysis to find all fields on which the GET function is used on the input parameter MAP in the program, obtaining all effective input fields.
After the definitions of the complete input field and the complete output field are obtained, the input and output field generation module is combined with a case data generation model of a machine learning platform to automatically generate a default value of each field:
for the most commonly used field names, such as eventNo (event number) and workDate (working date), the case data generation model has undergone model learning and can solidify them into parameters such as the event number and working date, automatically generating built-in variables such as {eventNo} and {workDate} for the case. These built-in variables dynamically generate corresponding values at run time; for example, {eventNo} is random data with a length of 27 digits, and {workDate} is the current date. The machine learning platform can continuously add built-in variables for commonly used fields.
If a field is not within the range of the built-in variables, the most common value assigned to the field name is automatically retrieved on the machine learning platform; for example, the field name CardNo is assigned the most frequently used bank card number.
If retrieval and analysis on the big data platform fail, default initial values are automatically assigned; for example, the int type defaults to 0, the boolean type defaults to true, and the Date type defaults to the current date.
Default case data generation module 22: automatically generates a corresponding test case file in EXCEL format according to the input and output fields of the interface and their default assignments. Case data is shown in Table 3 below, which shows the input field definition style, where Info is a nested Map layer under which there are the tranSerno, idType, and idCode fields. Here, tranSerno is a built-in transaction serial number that can automatically generate a unique number; using the idType certificate type and the idCode certificate number, the case data generation model of the machine learning platform obtains the best-matched certificate information, where the matching information may comprise the certificate information most commonly used in a transaction, tested for a successful steady state of the transaction. Other fields such as account information can also be automatically matched and acquired using the case data generation model.
Input class name | Parameter name | Type | Required | Maximum length | Default assignment
DebitCardManageFunInput | Info | Map | | |
DebitCardManageFunInput.Info | tranSerno | String | Y | 23 | {transaction serial number}
DebitCardManageFunInput.Info | idType | int | Y | 4 | 1
DebitCardManageFunInput.Info | idCode | String | Y | 18 | 440105200001011111

Table 3
Likewise, the output field definitions can be automatically generated in the style of Table 3. Meanwhile, an assertion judging whether the transaction return code transok is 0 can be built in, and various kinds of processing for calculating transaction return field information and judging the result can be provided, such as asserting, for a deposit interface, whether the account balance after the transaction equals the sum of the balance before the transaction and the amount of this deposit.
In addition to the input/output fields of the test interface and their default data, the EXCEL test case file defines an enumeration dictionary of test interface invocation modes, including 0-RPC mode (default), 1-HTTP mode, and so on. Testers can also supplement rich test data according to the business process, select an invocation mode, and fill in a detailed interface test address.
(3) Script execution device 3
The script running device is responsible for packing the case data into a transaction request message in MAP format; the JAVA test script then automatically initiates the corresponding transaction request call (by HTTP, RPC, or another mode) to the interface according to the case definition. Specifically, the script execution device 3 includes: a case data packing module 31 and an interface transaction request module 32, wherein:
case data packing module 31: after the test cases are supplemented and enriched, packs the case data. The main flow is to store all input fields and their data into the request transaction message. Taking a common MAP-format request message as an example, and referring to the case data style, first generate an Info object and store the fields and data below it by calls such as put("tranSerno", {transaction serial number}), put("idType", 1), and put("idCode", "440105200001011111"); then store the Info object into the DebitCardManageFunInput root object of the input class by put("Info", Info); the remaining fields are generated and stored in a similar way.
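A minimal sketch of this packing flow, using the Table 3 fields; the literal serial-number value stands in for the built-in {transaction serial number} variable, which would be generated at run time.

```java
import java.util.*;

public class CaseDataPacker {
    // Pack the flat case rows of Table 3 into a nested MAP request message.
    static Map<String, Object> pack() {
        Map<String, Object> info = new LinkedHashMap<>();
        info.put("tranSerno", "20210207000000000000001");  // stand-in for {transaction serial number}
        info.put("idType", 1);
        info.put("idCode", "440105200001011111");

        Map<String, Object> root = new LinkedHashMap<>();  // root object of the input class
        root.put("Info", info);
        return root;
    }

    public static void main(String[] args) {
        System.out.println(pack());
    }
}
```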
Interface transaction request module 32: parses the interface invocation mode defined by the EXCEL test case file and automatically calls the interface under test in the test environment. Taking the RPC invocation mode as an example, according to the interface program name and function name, the JAVA reflection mechanism is used: first obtain the interface class object by Class.forName, then obtain the function object by getMethod, and finally initiate the call. For an interface requiring special processing, such as a transaction compensation test, an exception is thrown deliberately when the interface is called, so that the exception handling mechanism of the interface under test is triggered.
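The RPC-style reflective call can be sketched as follows. The DebitCardManage interface and its lambda stub are hypothetical stand-ins for a real Dubbo service proxy; only the reflective lookup-and-invoke flow is the point here.

```java
import java.lang.reflect.Method;
import java.util.*;

public class InterfaceTransactionRequester {
    // Reflectively look up `functionName` on the named interface and invoke it on `target`,
    // passing the packed MAP-format request message.
    static Object invoke(Object target, String interfaceName, String functionName,
                         Map<String, Object> request) throws Exception {
        Class<?> clazz = Class.forName(interfaceName);             // interface class object
        Method method = clazz.getMethod(functionName, Map.class);  // function object
        return method.invoke(target, request);                     // initiate the test call
    }

    // Hypothetical service interface under test.
    public interface DebitCardManage {
        Map<String, Object> funA(Map<String, Object> in);
    }

    public static void main(String[] args) throws Exception {
        DebitCardManage stub = in -> {
            Map<String, Object> out = new LinkedHashMap<>();
            out.put("transok", "0");  // success return code
            return out;
        };
        Object ret = invoke(stub,
                InterfaceTransactionRequester.class.getName() + "$DebitCardManage",
                "funA", new LinkedHashMap<>());
        System.out.println(ret);  // {transok=0}
    }
}
```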
(4) Assertion verifying device 4
The assertion verification device performs assertion verification of the transaction automated test result: it receives the transaction return information, parses out the value corresponding to each field in the return value according to the definition of the interface Output return fields, and verifies each field value against the preset assertions. Specifically, the assertion verification device 4 includes: a return information analysis module 41 and a test result judgment module 42, wherein:
the return information analysis module 41: after the transaction request is called, the transaction's return information comes back, generally either information in MAP format or a serialized class object. For the latter, preprocessing is performed automatically: first deserialize, then cyclically call getFields (a reflection function) via the JAVA reflection mechanism to obtain return information in MAP format. For return information in MAP format, the specific content of each field in the transaction return information can be obtained by combining it with the definition of the interface Output return fields.
The test result judgment module 42: judges the assertions on the interface test result (i.e., the return information). It first evaluates the built-in assertions, including whether the transaction return code transok is 0, and then evaluates the user-defined assertions, to judge whether the test result meets expectations. If all assertions pass, i.e., the test results meet expectations, the transaction is successful; otherwise the transaction fails, and the reasons why the assertions did not pass are marked, so that abnormal results can be analyzed and judged at a later stage. The module can also send all test process data to the machine learning platform to facilitate learning and training of the case data generation model.
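A simplified sketch of this two-stage judgment, with the built-in transok check followed by custom field assertions; the return-message fields and the failure-description format are illustrative.

```java
import java.util.*;

public class TestResultJudge {
    // Evaluate the built-in assertion (transok == "0") plus custom expected-field assertions.
    // Returns the descriptions of failed assertions; an empty list means the test passed.
    static List<String> judge(Map<String, Object> ret, Map<String, Object> expected) {
        List<String> failures = new ArrayList<>();
        if (!"0".equals(String.valueOf(ret.get("transok")))) {
            failures.add("built-in: transok != 0, got " + ret.get("transok"));
        }
        for (Map.Entry<String, Object> e : expected.entrySet()) {
            if (!Objects.equals(ret.get(e.getKey()), e.getValue())) {
                failures.add("custom: " + e.getKey() + " expected " + e.getValue()
                        + ", got " + ret.get(e.getKey()));
            }
        }
        return failures;
    }

    public static void main(String[] args) {
        Map<String, Object> ret = new LinkedHashMap<>();
        ret.put("transok", "0");
        ret.put("balance", 100);
        System.out.println(judge(ret, Collections.singletonMap("balance", (Object) 100))); // []
    }
}
```

The marked failure reasons feed the test self-adaptation device described next.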
(5) Test self-adapting device 5
The test self-adaptation device is responsible for automatically performing data self-adaptation, according to the reason the assertion failed, for tests whose results fail, and for re-initiating and then re-verifying the transaction to obtain the test result in a stable state. Specifically, the test self-adaptation device 5 includes: a data adaptation module 51 and a transaction re-verification module 52, wherein:
data adaptation module 51: based on the case data generation model's learning of the test process data, automatically modifies the data of the test case according to the reason for the current test failure. For example, if the test failure is a data problem, such as a tested withdrawal transaction failing because the account balance is insufficient, the module automatically judges that the data may be polluted; it then takes the account information from the test data of the most recent successful deposit or withdrawal transaction and replaces the account information in the test case data with it.
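The adaptation step can be sketched as follows. The failure-reason string matching and field names are illustrative assumptions; a real implementation would consult the case data generation model rather than a substring check.

```java
import java.util.*;

public class DataAdapter {
    // Hypothetical sketch: when the failure reason points to polluted account data
    // (e.g. insufficient balance), replace the account fields of the case data with
    // the account info from the most recent successful deposit/withdrawal transaction.
    static Map<String, Object> adapt(Map<String, Object> caseData,
                                     String failureReason,
                                     Map<String, Object> lastGoodAccount) {
        Map<String, Object> adapted = new LinkedHashMap<>(caseData);
        if (failureReason.contains("insufficient balance")) {
            adapted.putAll(lastGoodAccount);  // swap in known-good account fields
        }
        return adapted;
    }

    public static void main(String[] args) {
        Map<String, Object> caseData = new LinkedHashMap<>();
        caseData.put("accountNo", "622200000000000");
        caseData.put("amount", 500);

        Map<String, Object> goodAccount = new LinkedHashMap<>();
        goodAccount.put("accountNo", "622200000000001");

        // accountNo is replaced; amount is untouched
        System.out.println(adapt(caseData, "insufficient balance", goodAccount));
    }
}
```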
Transaction re-verification module 52: after the data or environment is repaired, re-verifies the test transaction. For example, if the transaction result is an error on the service provider side, a checking mechanism provided by the service is automatically called; if, after the check passes, the environment is judged to have been unstable, the request call of the failed transaction is directly re-initiated.
FIG. 5 is a flowchart of the platform services interface test system, and as shown in FIG. 5, the flowchart is mainly divided into two parts, where the first part is mainly responsible for generating test scripts and cases, including steps S1 to S4; the second part is mainly responsible for testing transaction initiation, result judgment and test adaptation, and includes steps S5 to S9, which are described below.
Step S1: the interface source code analysis module analyzes the interface source code. The interface source code analysis module scans and records the program files of the service interfaces, and automatically acquires the names and the call-up functions of all the service interfaces under the items to be tested in a static code analysis mode.
Step S2: the JAVA script generation module generates test scripts. The JAVA script generation module generates a corresponding JAVA test script for each function of the interface program under test; the script can automatically load test data based on the TestNG framework, supports pre-processing and post-processing such as @BeforeTest and @AfterTest, and automatically generates templates for the key test flows, including data preparation, environment backup, call-up test, environment recovery, and test assertion.
Step S3: the input/output field generation module analyzes the IO fields of the interface. The input/output field generation module automatically analyzes and acquires the input and output fields of the interface and generates the complete field definitions of the interface's input and output. It automatically adapts to various interfaces for input and output interaction, including interfaces with Input and Output program definition classes and interfaces that have no such classes but are called directly with a MAP, and obtains by analysis all input and output field definitions of the interface's functions.
Step S4: the default case data generation module generates case data. And the default case data generation module automatically generates a corresponding test case file in the EXCEL format according to the interface input and output field and the default assignment thereof, and is internally provided with an enumeration dictionary providing an assertion processing function and a calling mode of the test interface.
After the test script and the test case are automatically generated, a tester can supplement rich test data and predicate judgment according to the service flow, and can also select a calling mode and fill a detailed interface test address.
Step S5: the case data packing module packs the case data. And the case data packaging module packages and stores all input fields and data thereof into the transaction request message.
Step S6: the interface transaction request module initiates a test transaction. The interface transaction request module analyzes the interface invocation mode defined by the EXCEL test case file, automatically initiates a test invocation to the tested interface of the test environment by using the transaction message prepared in the step S5.
Step S7: the returned information analysis module analyzes the returned information. The returned information analysis module converts the returned information of the transaction into information in the MAP format in a unified manner, and obtains the specific content of each field in the returned information of the transaction by combining the definition of the Output returned fields of the interface.
Step S8: the test result judgment module judges the test result. The test result judgment module judges the assertions on the interface test result, processing it with the built-in or supplementary assertions to judge whether the test result meets expectations. If all assertions pass, the test result meets expectations and the transaction is successful; otherwise the transaction fails, and the abnormal result is then analyzed and judged according to the marked reasons why the specific assertions did not pass.
Step S9: and the self-adaptive testing device is used for self-adaptive automatic testing. And the test self-adaptation device automatically performs data self-adaptation for the test of the result failure according to the reason of the assertion failure, and re-initiates the transaction and then verifies the transaction to obtain the test result in a stable state.
Fig. 6 is a specific flowchart of analyzing the IO field of the interface in step S3, and as shown in fig. 6, the flowchart includes:
step S301: acquire the member variables of the current layer. The following analysis and judgment are performed for each member variable.
Step S302: judge whether the member variable is a basic variable such as int, boolean, Date, or String.
Step S303: if it is a basic variable, directly add the member variable and its attributes to the result set and end the flow for this member.
Step S304: if not, judge whether the member variable is a LIST/MAP container.
Step S305: if it is a LIST/MAP container, add a new LIST/MAP layer, using the member variable name as the name of the layer.
Step S306: if it is neither a basic variable nor a LIST/MAP container, judge whether it is an entity class. If it is an entity class, acquire the definition of the entity class and call step S301 recursively.
The platform service interface test system provided by the embodiment of the invention can automatically complete the generation of automatic test scripts and test cases of various platform service interfaces, and can start the work of testing and the like and adapt to the test result. In particular, the system has the following advantages:
(1) Different call-up modes such as RPC and HTTP are supported automatically, as are the call-up services of different test interfaces. Testers can automatically generate automated test scripts and test cases by providing only the interface name, without paying attention to the detailed implementation of the interface under test.
(2) The separation of script and data is realized automatically. The automated test script in JAVA format is a high-level abstraction of the automated test, while the automated test data in EXCEL format is the concrete data realizing the business process test. Business testers supplement rich business data in the EXCEL file according to the business flow.
(3) Different interfaces are automatically analyzed and recognized to obtain their input and output field definitions, and default test data are generated automatically at the same time. In particular, by combining the most common built-in variables and automatically obtaining the best-matched common data through the machine learning platform, the ease of writing cases is effectively improved.
(4) Program result assertions are judged automatically: default assertions are provided, modular embedded assertions are supported, transaction test results are analyzed and judged automatically, and abnormal results are analyzed and judged according to the automatically marked specific reasons why assertions did not pass.
(5) Self-adaptive analysis and adaptation of test results is realized. For tests whose results fail, data self-adaptation is performed automatically, in combination with the machine learning platform, according to the specific reason the assertion failed, and the transaction is re-initiated and re-verified to obtain the test result in a stable state. This avoids the influence of the test environment or test data on the automated test and reduces the testers' cost of analyzing the large number of failed automated test cases.
The present embodiment also provides an electronic device, which may be a desktop computer, a tablet computer, a mobile terminal, and the like, but is not limited thereto. In this embodiment, the electronic device may be implemented with reference to the above method embodiment and the embodiment of the service interface testing apparatus, and the contents thereof are incorporated herein, and repeated descriptions are omitted.
Fig. 7 is a schematic block diagram of a system configuration of an electronic apparatus 600 according to an embodiment of the present invention. As shown in fig. 7, the electronic device 600 may include a central processor 100 and a memory 140; the memory 140 is coupled to the central processor 100. Notably, this diagram is exemplary; other types of structures may also be used in addition to or in place of the structure to implement telecommunications or other functions.
In one embodiment, the service interface test function may be integrated into the central processor 100. The central processor 100 may be configured to control as follows:
acquiring an interface source code to be tested, and generating a test script according to a test item and the interface source code, wherein the interface source code comprises: an interface calling function;
generating Input and Output (IO) information of the interface to be tested according to the interface calling function and the test item, wherein the IO information comprises: input field information and output field information;
generating test case data according to the IO information based on a pre-constructed case data generation model, wherein the test case data comprises: asserting information, wherein the case data generation model is constructed based on a historical interface to be tested, historical test items and historical test case data;
testing the interface to be tested according to the test case data and the test script;
receiving test return information from the interface to be tested, and judging a test operation result according to the test return information and the assertion information;
and responding to the test operation result to judge that the test fails, analyzing the test return information, and restarting the test operation on the interface to be tested according to the analysis result.
As can be seen from the above description, the electronic device provided in the embodiment of the present invention generates a test script according to the acquired source code of the interface to be tested and the test item, generates IO information of the interface to be tested according to the interface calling function and the test item, generates test case data from the IO information based on the pre-constructed case data generation model, and then performs a test operation on the interface to be tested according to the test case data and the test script; when the test fails, it can analyze the test return information based on the assertion information and re-initiate the test operation on the interface according to the analysis result. Compared with the prior art, the embodiment of the present invention can improve the ease of writing test cases based on the case data generation model, and at the same time, by analyzing the test return information with the assertion information and re-initiating the test operation according to the analysis result, realizes automatic repair of the test data and environment and reduces the cost of analyzing test results, thereby improving the working efficiency of automated testing of the service interface.
In another embodiment, the service interface testing device may be configured separately from the central processing unit 100, for example, the service interface testing device may be configured as a chip connected to the central processing unit 100, and the service interface testing function is realized by the control of the central processing unit.
As shown in fig. 7, the electronic device 600 may further include: communication module 110, input unit 120, audio processing unit 130, display 160, power supply 170. It is noted that the electronic device 600 does not necessarily include all of the components shown in fig. 7; furthermore, the electronic device 600 may also comprise components not shown in fig. 7, which may be referred to in the prior art.
As shown in fig. 7, the central processor 100, sometimes referred to as a controller or operational control, may include a microprocessor or other processor device and/or logic device, the central processor 100 receiving input and controlling the operation of the various components of the electronic device 600.
The memory 140 may be, for example, one or more of a buffer, a flash memory, a hard drive, a removable media, a volatile memory, a non-volatile memory, or other suitable device. The information relating to the failure may be stored, and a program for executing the information may be stored. And the central processing unit 100 may execute the program stored in the memory 140 to realize information storage or processing, etc.
The input unit 120 provides input to the central processor 100. The input unit 120 is, for example, a key or a touch input device. The power supply 170 is used to provide power to the electronic device 600. The display 160 is used to display an object to be displayed, such as an image or characters. The display may be, for example, an LCD display, but is not limited thereto.
The memory 140 may be a solid-state memory, such as read-only memory (ROM), random access memory (RAM), a SIM card, or the like. It may also be a memory that holds information even when powered off, that can be selectively erased, and that can be provided with more data, an example of which is sometimes called an EPROM or the like. The memory 140 may also be some other type of device. The memory 140 includes a buffer memory 141 (sometimes referred to as a buffer). The memory 140 may include an application/function storage section 142, which is used to store application programs and function programs, or the flow for executing the operation of the electronic device 600 by the central processor 100.
The memory 140 may also include a data store 143, the data store 143 for storing data, such as contacts, digital data, pictures, sounds, and/or any other data used by the electronic device. The driver storage portion 144 of the memory 140 may include various drivers of the electronic device for communication functions and/or for performing other functions of the electronic device (e.g., messaging application, address book application, etc.).
The communication module 110 is a transmitter/receiver 110 that transmits and receives signals via an antenna 111. The communication module (transmitter/receiver) 110 is coupled to the central processor 100 to provide an input signal and receive an output signal, which may be the same as in the case of a conventional mobile communication terminal.
Based on different communication technologies, a plurality of communication modules 110, such as a cellular network module, a bluetooth module, and/or a wireless local area network module, may be provided in the same electronic device. The communication module (transmitter/receiver) 110 is also coupled to a speaker 131 and a microphone 132 via an audio processor 130 to provide audio output via the speaker 131 and receive audio input from the microphone 132 to implement general telecommunications functions. Audio processor 130 may include any suitable buffers, decoders, amplifiers and so forth. In addition, an audio processor 130 is also coupled to the central processor 100, so that recording on the local can be enabled through a microphone 132, and so that sound stored on the local can be played through a speaker 131.
Embodiments of the present invention further provide a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the steps of the service interface testing method.
In summary, the embodiments of the present invention automatically analyze and identify various platform service interface programs to obtain the definitions of interface inputs and outputs, automatically generate the corresponding automated test scripts and cases, support invocation modes such as HTTP and RPC and the different invocation modes of various interfaces, and, in combination with a machine learning platform, improve the ease of writing automated test cases and data and support automatic repair of test data and environment, thereby improving the working efficiency of automated testing of open platform service interfaces.
The preferred embodiments of the present invention have been described above with reference to the accompanying drawings. The many features and advantages of the embodiments are apparent from the detailed specification, and thus, it is intended by the appended claims to cover all such features and advantages of the embodiments which fall within the true spirit and scope thereof. Further, since numerous modifications and changes will readily occur to those skilled in the art, it is not desired to limit the embodiments of the invention to the exact construction and operation illustrated and described, and accordingly, all suitable modifications and equivalents may be resorted to, falling within the scope thereof.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The principles and implementations of the present invention are explained above through specific embodiments, and the description of the embodiments is intended only to aid understanding of the method and its core idea. Meanwhile, those skilled in the art may, following the idea of the present invention, make changes to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (16)

1. A method for testing a service interface, the method comprising:
acquiring interface source code to be tested, and generating a test script according to a test item and the interface source code, wherein the interface source code comprises: an interface call function;
generating input/output (IO) information of the interface to be tested according to the interface call function and the test item, wherein the IO information comprises: input field information and output field information;
generating test case data according to the IO information based on a pre-constructed case data generation model, wherein the test case data comprises: assertion information, and the case data generation model is constructed based on historical interfaces to be tested, historical test items and historical test case data;
performing a test operation on the interface to be tested according to the test case data and the test script;
receiving test return information from the interface to be tested, and determining a test operation result according to the test return information and the assertion information;
and in response to the test operation result indicating that the test fails, analyzing the test return information, and reinitiating the test operation on the interface to be tested according to an analysis result.
2. The method of claim 1, wherein analyzing the test return information and reinitiating the test operation for the interface to be tested according to the analysis result comprises:
analyzing the test return information to generate an analysis result;
and generating new test case data according to the analysis result, based on the case data generation model and the test items, and reinitiating the test operation on the interface to be tested according to the new test case data.
3. The method of claim 2, wherein after reinitiating the testing operation on the interface to be tested according to the analysis result, the method further comprises:
and updating the case data generation model according to the test return information, the analysis result and the new test case data.
4. The method of claim 1, wherein generating test case data from the IO information based on a pre-constructed case data generation model comprises:
performing, based on a pre-constructed case data generation model, assignment operations on an input field and an output field respectively according to the IO information;
and generating test case data according to the assigned input field and output field.
5. The method of claim 4, wherein performing, based on a pre-constructed case data generation model, assignment operations on the input field and the output field respectively according to the IO information comprises:
acquiring each field attribute of the input field and the output field according to the IO information;
and performing, based on the pre-constructed case data generation model, an assignment operation on each field according to each field attribute of the input field and the output field.
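The field-attribute-driven assignment of claims 4 and 5 can be sketched as follows. The attribute schema (`type`, `length`) and the value rules are assumptions, since the claims leave the case data generation model's internals open.

```python
# Illustrative assignment of values driven by field attributes.
# The attribute schema and value rules are assumptions for demonstration.

def assign_value(attr: dict):
    """Pick a sample value that matches a field's declared attributes."""
    if attr.get("type") == "number":
        return 1
    if attr.get("type") == "string":
        return "a" * attr.get("length", 1)  # string of the declared length
    return None

# Hypothetical input-field attributes extracted from the IO information.
input_fields = {
    "account": {"type": "string", "length": 4},
    "amount": {"type": "number"},
}
case_inputs = {name: assign_value(attr) for name, attr in input_fields.items()}
print(case_inputs)  # {'account': 'aaaa', 'amount': 1}
```

The same routine would be applied to the output fields to produce expected values for the assertion information.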
6. The method of claim 1, wherein when the interface call function is a MAP call function, generating the IO information of the interface to be tested according to the interface call function and the test item comprises:
and generating the IO information according to the effective fields of the output function and the input function associated with the MAP call function, and according to the test items.
7. The method of claim 1, wherein performing a test operation on the interface to be tested according to the test case data and the test script comprises:
generating an operation request message according to the test case data;
and testing the interface to be tested according to the operation request message and the test script.
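Claim 7's step of turning test case data into an operation request message could be sketched as below. JSON and the envelope fields are assumed for illustration; the claims do not fix a message format.

```python
import json

# Sketch of building an operation request message from test case data.
# JSON and the envelope fields are assumed; the claims fix no format.

def build_request_message(case_data: dict, interface_name: str) -> str:
    """Wrap case inputs in a request envelope for the interface under test."""
    return json.dumps({"interface": interface_name, "body": case_data},
                      sort_keys=True)

msg = build_request_message({"amount": 1}, "transfer")
print(msg)  # {"body": {"amount": 1}, "interface": "transfer"}
```

The test script would then send this message over the interface's invocation manner (for example HTTP or RPC) and collect the test return information.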
8. A service interface testing apparatus, the apparatus comprising:
a source code obtaining unit, configured to acquire interface source code to be tested, where the interface source code comprises: an interface call function;
the test script generating unit is used for generating a test script according to the test item and the interface source code;
an IO information generating unit, configured to generate input/output IO information of the interface to be tested according to the interface call function and the test item, where the IO information includes: input field information and output field information;
the test case data generation unit is used for generating test case data according to the IO information based on a pre-constructed case data generation model, where the test case data comprises: assertion information, and the case data generation model is constructed based on historical interfaces to be tested, historical test items and historical test case data;
the test unit is used for carrying out test operation on the interface to be tested according to the test case data and the test script;
the test result unit is used for receiving test return information from the interface to be tested and determining a test operation result according to the test return information and the assertion information;
and the analysis unit is used for, in response to the test operation result indicating that the test fails, analyzing the test return information and reinitiating the test operation on the interface to be tested according to an analysis result.
9. The apparatus of claim 8, wherein the analysis unit comprises:
the analysis module is used for analyzing the test return information to generate an analysis result;
and the retesting module is used for generating new test case data based on the case data generation model and the test items according to the analysis result and restarting test operation on the interface to be tested according to the new test case data.
10. The apparatus of claim 9, further comprising:
and the updating unit is used for updating the case data generation model according to the test return information, the analysis result and the new test case data.
11. The apparatus of claim 8, wherein the test case data generating unit comprises:
the assignment module is used for performing, based on a pre-constructed case data generation model, assignment operations on the input field and the output field according to the IO information;
and the test case data generating module is used for generating test case data according to the assigned input field and output field.
12. The apparatus of claim 11, wherein the assignment module comprises:
the field attribute obtaining submodule is used for obtaining each field attribute in the input field and the output field according to the IO information;
and the assignment submodule is used for performing, based on the pre-constructed case data generation model, an assignment operation on each field according to each field attribute in the input field and the output field.
13. The apparatus according to claim 8, wherein when the interface call function is a MAP call function, the IO information generation unit is specifically configured to:
and generating the IO information according to the effective fields of the output function and the input function associated with the MAP call function, and according to the test items.
14. The apparatus of claim 8, wherein the test unit comprises:
the message generating module is used for generating an operation request message according to the test case data;
and the test module is used for carrying out test operation on the interface to be tested according to the operation request message and the test script.
15. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the method of any of claims 1 to 7 are implemented when the processor executes the program.
16. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202110170956.6A 2021-02-08 2021-02-08 Service interface testing method and device, electronic equipment and storage medium Active CN112905459B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110170956.6A CN112905459B (en) 2021-02-08 2021-02-08 Service interface testing method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN112905459A true CN112905459A (en) 2021-06-04
CN112905459B CN112905459B (en) 2024-05-03

Family

ID=76123935

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110170956.6A Active CN112905459B (en) 2021-02-08 2021-02-08 Service interface testing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112905459B (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109474488A (en) * 2018-10-31 2019-03-15 中国银行股份有限公司 Interface test method, device and computer equipment
CN110334021A (en) * 2019-06-25 2019-10-15 深圳前海微众银行股份有限公司 Generation method, device, equipment and the storage medium of interface testing case
CN111813680A (en) * 2020-07-13 2020-10-23 中国建设银行股份有限公司 Test script generation method and device, computer equipment and storage medium
CN111949543A (en) * 2020-08-13 2020-11-17 中国工商银行股份有限公司 Testing method and device based on distributed platform, electronic equipment and storage medium
CN112148620A (en) * 2020-10-12 2020-12-29 中国农业银行股份有限公司 Test case generation method and related equipment


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113742212A (en) * 2021-06-30 2021-12-03 中国工商银行股份有限公司 New and old system migration test method and device
CN113778871A (en) * 2021-09-07 2021-12-10 未鲲(上海)科技服务有限公司 Mock testing method, device, equipment and storage medium
CN113986751A (en) * 2021-11-09 2022-01-28 中国建设银行股份有限公司 Testing method and device suitable for multiple operating systems
CN114090454A (en) * 2021-11-29 2022-02-25 苏州万店掌网络科技有限公司 Interface automatic testing method, device, equipment and storage medium
CN114090454B (en) * 2021-11-29 2023-01-24 苏州万店掌网络科技有限公司 Interface automation test method, device, equipment and storage medium
CN114116520A (en) * 2021-12-08 2022-03-01 北京字节跳动网络技术有限公司 Algorithm evaluation method, device, gateway and storage medium
CN114116520B (en) * 2021-12-08 2023-05-26 抖音视界有限公司 Algorithm evaluation method, device, gateway and storage medium
CN114936358A (en) * 2022-06-10 2022-08-23 知迪汽车技术(北京)有限公司 Human-computer interaction based verification method and verification system

Also Published As

Publication number Publication date
CN112905459B (en) 2024-05-03

Similar Documents

Publication Publication Date Title
CN112905459B (en) Service interface testing method and device, electronic equipment and storage medium
US8356056B2 (en) Functional extensions for business objects
CN107423213A (en) A kind of filec descriptor distribution detection method and device
CN111930617A (en) Automatic testing method and device based on data objectification
CN108111364B (en) Service system testing method and device
CN108664385A (en) A kind of test method and device of application programming interface
CN113051163A (en) Unit testing method, unit testing device, electronic equipment and storage medium
CN111782266A (en) Method and device for determining software performance benchmark
CN107341106A (en) Application compatibility detection method, exploitation terminal and storage medium
US6694290B1 (en) Analyzing an extended finite state machine system model
CN105404574B (en) Smart card and mobile terminal consistency test method and device
CN112711640A (en) Method and device for configuring business handling process
CN111897738A (en) Automatic testing method and device based on atomic service
CN114443039A (en) Input parameter verification method and device, electronic equipment and storage medium
CN112860585B (en) Test script assertion generation method and device
CN112905461A (en) Method and device for executing automatic interface test case
CN112561690A (en) Method, system, equipment and storage medium for testing credit card staging service interface
CN108628750B (en) Test code processing method and device
CN113342600A (en) Method and device for monitoring program dependent plug-in
CN101251824B (en) Method for testing public objects request proxy structure and tools
US20190251015A1 (en) Mainframe testing framework
CN112685317B (en) Custom test method and device based on test pile
JP2016021161A (en) Simulator system, gateway system test device, and gateway system test method
CN116893978B (en) Test plan generation method, system and storage medium based on PTCRB authentication
CN115858012B (en) Program variable configuration method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant