CN112395202B - Interface automation test method and device, computer equipment and storage medium - Google Patents

Info

Publication number
CN112395202B
Authority
CN
China
Prior art keywords
interface
tested
test
test result
interfaces
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011353754.7A
Other languages
Chinese (zh)
Other versions
CN112395202A (en)
Inventor
荆伟
尤长浩
徐勇
徐梅兰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yunwangwandian Technology Co ltd
Original Assignee
Shenzhen Yunwangwandian Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Yunwangwandian Technology Co ltd filed Critical Shenzhen Yunwangwandian Technology Co ltd
Priority to CN202011353754.7A
Publication of CN112395202A
Application granted
Publication of CN112395202B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00: Error detection; Error correction; Monitoring
    • G06F11/36: Preventing errors by testing or debugging software
    • G06F11/3668: Software testing
    • G06F11/3672: Test management
    • G06F11/368: Test management for test version control, e.g. updating test cases to a new software version
    • G06F11/3688: Test management for test execution, e.g. scheduling of test suites

Abstract

The application relates to an interface automation test method, an interface automation test device, computer equipment and a storage medium. The method comprises the following steps: acquiring, from a designated database, input parameter information related to all interfaces to be tested, and generating an interface test script according to that information; executing the interface test script to call all the interfaces to be tested; and acquiring all the return messages received after the calls, then analyzing them to obtain the test result of each interface to be tested. The embodiments of the invention suit automated testing under rapid version iteration: when an urgent release is involved, in particular when a new version must be released after the underlying code has been modified or upgraded, the existing query interfaces can be called quickly and accurately to achieve fast coverage of the main business flow.

Description

Interface automation test method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of interface testing technologies, and in particular, to an interface automated testing method and apparatus, a computer device, and a storage medium.
Background
When an urgent release is involved, in particular when a new version must be released after the underlying code has been modified or upgraded, the distributed remote procedure call (RPC) services that the release depends on must be covered quickly and accurately by tests, so that the impact of the new version's changes on the existing interfaces can be analyzed.
Consider the conventional automated implementation of the order query interface. The existing SAT automation of the query interface (SAT, the Suning Automation Tester, is an automated testing tool developed by Suning) starts from the upstream order-taking link: it writes different variables (constants) into an Excel document according to the different order types, has SAT read several configuration files to generate a specified order, and then initiates the query.
This implementation is complex, with many shared files and many conflicts, so the maintenance cost for testers is relatively high. The call chain is also long, so each test case takes a long time to execute, and the approach cannot keep up with the automated testing demanded by rapid version iteration. When an urgent release is involved, in particular when a new version must be released after the underlying code is modified or upgraded, the approach cannot call the existing query interfaces quickly and accurately enough to cover the main business flow; and if a query interface escapes test coverage, problems such as a circular reference ($ref) that the front end cannot parse may slip through.
Disclosure of Invention
The embodiments of the invention suit automated testing under rapid version iteration: when an urgent release is involved, in particular when a new version must be released after the underlying code has been modified or upgraded, the existing query interfaces can be called quickly and accurately to achieve fast coverage of the main business flow.
The present invention provides, according to a first aspect, an automated interface testing method, which, in one embodiment, is used for testing one interface to be tested or simultaneously testing a plurality of interfaces to be tested; the method comprises the following steps:
acquiring input parameter information related to all interfaces to be tested from a designated database, and generating an interface test script according to the input parameter information related to all the interfaces to be tested;
executing the interface test script to call all the interfaces to be tested;
and acquiring all the return messages received after calling all the interfaces to be tested, and analyzing the test results according to all the received return messages to acquire the test results of all the interfaces to be tested.
In one embodiment, the step of obtaining input parameter information related to all interfaces to be tested from a designated database and generating an interface test script according to the input parameter information related to all interfaces to be tested includes:
acquiring input parameter information related to each interface to be tested from a specified database;
generating a plurality of request messages corresponding to each interface to be tested according to the input parameter information related to each interface to be tested;
and generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested.
In one embodiment, the input parameter information related to each interface to be tested comprises a plurality of groups of entry-parameter information, and each group comprises entry values corresponding to a plurality of interface fields; one group of entry-parameter information is used to generate one request message;
the step of generating a request message corresponding to any interface to be tested according to one group of entry-parameter information included in the input parameter information related to that interface comprises the following step:
adding the entry values corresponding to the interface fields included in that group into the interface field information corresponding to those fields to carry out message assembly, thereby assembling a request message corresponding to the interface to be tested.
In one embodiment, the step of generating an interface test script according to a plurality of request messages corresponding to all interfaces to be tested includes:
acquiring preset test type information corresponding to each interface to be tested;
and generating an interface test script according to the multiple request messages corresponding to all the interfaces to be tested and the preset test type information.
In one embodiment, the step of executing the interface test script to call all the interfaces to be tested includes:
executing the interface test script to send a plurality of request messages corresponding to the interface to be tested to the called party associated with the interface to be tested through each interface to be tested; and the called party associated with any interface to be tested generates a corresponding return message after receiving any request message corresponding to the interface to be tested and returns the corresponding return message.
In one embodiment, the step of obtaining all return messages received after all the interfaces to be tested are called, analyzing the test result according to all the received return messages, and obtaining the test results of all the interfaces to be tested includes:
acquiring a plurality of return messages returned by a called party associated with each interface to be tested;
and analyzing the test result according to a plurality of returned messages returned by the called party associated with each interface to be tested respectively to obtain the test result of each interface to be tested.
In one embodiment, the step of analyzing the test result according to the multiple return messages returned by the called party associated with each interface to be tested to obtain the test result of each interface to be tested includes:
acquiring a pre-specified baseline version test result set related to each interface to be tested, wherein the baseline version test result set related to each interface to be tested comprises a plurality of historical return messages obtained by testing the interface to be tested of the baseline version;
and respectively taking a plurality of returned messages returned by the called party associated with each interface to be tested as a current version test result set related to the interface to be tested, performing binary comparison on the current version test result set related to the interface to be tested and a baseline version test result set by using a preset comparison tool, and performing test result analysis according to a comparison result between the current version test result set related to the interface to be tested and the baseline version test result set to obtain a test result of the interface to be tested.
The invention provides an interface automatic test device according to a second aspect, which is used for testing one interface to be tested or simultaneously testing a plurality of interfaces to be tested in one embodiment; the device includes:
the interface test script generation module is used for acquiring input parameter information related to all interfaces to be tested from the specified database and generating interface test scripts according to the input parameter information related to all the interfaces to be tested;
the interface test script execution module is used for executing the interface test script so as to call all the interfaces to be tested;
and the test result analysis module is used for acquiring all the return messages received after all the interfaces to be tested are called, and analyzing the test results according to all the received return messages to acquire the test results of all the interfaces to be tested.
The present invention provides according to a third aspect a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of an embodiment of any of the methods described above when executing the computer program.
The present invention provides according to a fourth aspect a computer-readable storage medium having stored thereon a computer program which, when being executed by a processor, carries out the steps of the embodiments of the method of any one of the above.
In the embodiment of the invention, during testing, input parameter information related to all interfaces to be tested is obtained from the designated database, and an interface test script is generated from it; the interface test script is executed to call all the interfaces to be tested; and all the return messages received after the calls are collected and analyzed to obtain the test result of each interface to be tested. The embodiments suit automated testing under rapid version iteration: when an urgent release is involved, in particular when a new version must be released after the underlying code has been modified or upgraded, the existing query interfaces can be called quickly and accurately to cover the main business flow, and the execution time of the original SAT automation is cut from dozens of hours to several minutes, which greatly improves testers' working efficiency and saves time cost.
Drawings
FIG. 1 is a flow chart of a conventional automated interface test;
FIG. 2 is a flow chart illustrating a method for automated testing of an interface according to an embodiment;
FIG. 3 is a flow diagram that illustrates the generation of an interface test script, according to one embodiment;
FIG. 4 is a flowchart illustrating an analysis of a test result of an interface to be tested according to an embodiment;
FIG. 5 is a schematic diagram illustrating a process of analyzing a test result of an interface to be tested according to another embodiment;
FIG. 6 is a block diagram of an interface automated testing apparatus according to an embodiment;
FIG. 7 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
If a new service involves calling an existing query interface, all main-flow interfaces of the new service need to be tested before the new version is formally released, in order to analyze the impact of the version's changes on the existing interface. For example, in an order query scenario, a service newly added in the new version needs to call a distributed remote call service to query its orders (concretely, by calling an order query interface of a remote service platform; the precondition for the call is that both the caller and the called party have registered the called system and interface information on the remote service platform). When an urgent release is involved, in particular when a new version must be released after the underlying code is modified or upgraded, the interfaces that need to be called should be covered all the more quickly and accurately by the tests.
However, the existing interface test approach is complex, with many shared files and many conflicts, so the maintenance cost for testers is relatively high; the call chain is long and each test case takes a long time to execute, so the approach cannot keep up with the automated testing demanded by rapid version iteration, and when an urgent release is involved, in particular when a new version must be released after the underlying code is modified or upgraded, it cannot call the existing query interfaces quickly and accurately enough to cover the main business flow. Take testing an order query interface as an example, as shown in FIG. 1. In the conventional test mode, the tester first has to assemble every field of the newly added service manually in an Excel document (an order of a newly added type normally has about 300 fields). The SAT automation tool then reads the local Excel file; this reading consumes a lot of time, and a waiting period must be configured between steps. Incremental information for subsequent orders (such as payment and refund increments of an order) must also be entered manually in Excel, and during reading the SAT tool has to handle problems such as field conflicts and read failures. After the order data is created, the automation interface is called to perform order query and debugging; the query request message is again assembled by reading the local Excel file, and once assembled, the test tool reads the request message and sends it to the tested interface of the remote service platform.
To address the defects of this implementation, the invention provides an automated interface testing method. In one embodiment, the interface automated testing method comprises the steps shown in fig. 2:
s110: and acquiring input parameter information related to all interfaces to be tested from the specified database, and generating an interface test script according to the input parameter information related to all the interfaces to be tested.
S120: and executing the interface test script to call all the interfaces to be tested.
S130: and acquiring all the return messages received after calling all the interfaces to be tested, and analyzing the test results according to all the received return messages to acquire the test results of all the interfaces to be tested.
This embodiment can be applied on common test equipment (for example, a smartphone, a tablet computer, a notebook computer, a desktop computer, a server, and the like). During testing, a tester obtains all interfaces to be tested from the designated database through the test equipment. In this embodiment, a single test may cover one interface to be tested or several interfaces to be tested simultaneously; "all interfaces to be tested" therefore refers to all interfaces covered by one test, and their number may be one or more.
An interface to be tested is an interface called by a service newly added in the new version; in different application scenarios it is a different concrete interface. For example, in an order query scenario the interface to be tested is an order query interface, while in a user information query scenario it is a user information query interface. The designated database is a database specified in advance for storing the input parameter information of all interfaces to be tested; it can be implemented with a MySQL database, and this embodiment does not limit how it is implemented. The input parameter information related to an interface to be tested comprises the parameter values required to call that interface. For example, in an order query scenario, information such as the order number, membership number and calling system of an order must be obtained. Which input parameters an interface needs can be found in the interface document of each called system (each interface's input parameters are defined in its interface document; for instance, if the document says the order number and membership number of an order must be passed in, then those values have to be looked up in the designated database). The input parameter information is then obtained from the designated database using the service's own business code.
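As a concrete illustration of fetching the entry-parameter groups from the designated database, the sketch below uses SQLite in place of the MySQL database mentioned above; the table name, column names and field names (order number, membership number, calling system) are hypothetical stand-ins, not taken from the patent.

```python
import sqlite3

def fetch_input_params(db_path, interface_code):
    """Fetch every group of entry-parameter values stored for one
    interface under test. Each returned dict is one group, and each
    group is later assembled into one request message."""
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT order_no, member_no, calling_system "
            "FROM interface_params WHERE interface_code = ?",
            (interface_code,),
        ).fetchall()
        # one dict per parameter group
        return [
            {"orderNo": o, "memberNo": m, "callingSystem": c}
            for (o, m, c) in rows
        ]
    finally:
        conn.close()
```

Parameterized queries (`?` placeholders) keep the lookup safe even when interface codes come from configuration.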
More specifically, the input parameter information related to each interface to be tested comprises multiple groups of parameter entering information, and each group of parameter entering information comprises parameter entering values corresponding to multiple interface fields; a set of access parameters is used to generate a request message.
The return message is a message which is generated and returned to the test equipment according to the received request message after the called party corresponding to the interface to be tested is called, and one request message corresponds to one return message. In different application scenarios, the return messages include different return fields, for example, in an order query scenario, the return messages may include detail information of the queried order, such as order price, picture, and consignee information, and certainly, the return fields in the return message corresponding to each interface to be tested are different, and the return fields are subject to the definition of the interface document.
In this method, local code is called synchronously to send the request messages to the remote service platform, and the interfaces to be tested are called through the interface test script. During testing, the tester obtains input parameter information from the existing test environment and then assembles the interface test script from it. Assembling the script is a simple operation, and the tester writing it has few points to attend to (for example, in the order query scenario only the interface code, the request message and the filename of the return message of each interface to be tested need attention). The time spent on automated interface testing is therefore shortened markedly, from the dozens of hours the original SAT automation needed per release down to minutes, which greatly improves the testing efficiency for each version upgrade.
Based on the foregoing embodiments, in an embodiment, the step of obtaining input parameter information related to all interfaces to be tested from a designated database, and generating an interface test script according to the input parameter information related to all interfaces to be tested includes, as shown in fig. 3:
s111: acquiring input parameter information related to each interface to be tested from a specified database;
s112: generating a plurality of request messages corresponding to each interface to be tested according to the input parameter information related to each interface to be tested;
s113: and generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested.
In this embodiment, the interface test script can test one interface to be tested or several interfaces to be tested simultaneously. When generating the script, for each interface to be tested the test equipment obtains the related input parameter information from the designated database and generates that interface's request messages from it. Once request messages have been generated for all interfaces to be tested, the test equipment generates the interface test script from all of them.
It should be noted that, the number of interfaces called by the new service is usually multiple, and for any interface to be tested, multiple request messages need to be sent to the interface during testing (it is understandable that specific message parameters of different request messages may be different). In this embodiment, multiple request messages can be sent to the interface to be tested at one time through the interface test script, so that the test efficiency of the interface to be tested is improved.
In one embodiment, the step in which the test equipment generates a request message corresponding to any interface to be tested according to one group of entry-parameter information included in the input parameter information related to that interface comprises: adding the entry values corresponding to the interface fields included in that group into the interface field information corresponding to those fields to carry out message assembly, thereby assembling a request message corresponding to the interface to be tested.
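The assembly step described above can be sketched as follows. The dict-based message shape and the per-field metadata (a `required` flag) are assumptions made for illustration; the patent does not prescribe a concrete message format.

```python
def assemble_request(interface_fields, param_group):
    """Assemble one request message: each entry value from the group
    is written into the interface-field slot it corresponds to."""
    message = {}
    for field, meta in interface_fields.items():
        if field in param_group:
            message[field] = param_group[field]
        elif meta.get("required"):
            # a missing mandatory entry value makes the message invalid
            raise ValueError("missing required field: " + field)
    return message

def assemble_all(interface_fields, param_groups):
    """One request message per entry-parameter group, as described."""
    return [assemble_request(interface_fields, g) for g in param_groups]
```

Because one group yields one message, an interface with N stored groups ends up with N request messages ready for the script.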
In another embodiment, the step of generating an interface test script according to a plurality of request messages corresponding to all interfaces to be tested includes: acquiring preset test type information corresponding to each interface to be tested; and generating an interface test script according to the multiple request messages corresponding to all the interfaces to be tested and the preset test type information.
Specifically, the tester puts the generated request messages into a test class through the test equipment to assemble the interface test script, then selects the corresponding test environment configuration, such as zkService (zookeeper service), appCode, scmService, and the like, and executes the interface test script, so that the generated request messages are sent to a remote service platform, such as the Suning remote service framework (RSF) platform.
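A minimal sketch of bundling the assembled request messages and the environment configuration into one executable script. The configuration keys echo the settings named above (zkService, appCode), but their values and the flat tuple layout are illustrative assumptions, not the platform's real format.

```python
def build_test_script(requests_by_interface, env_config):
    """Bundle assembled request messages into one executable 'script':
    a flat list of (interface_code, message, env) entries that a
    runner can send in a single pass."""
    script = []
    for interface_code, messages in requests_by_interface.items():
        for message in messages:
            script.append((interface_code, message, env_config))
    return script
```

Keeping the environment configuration attached to every entry lets the same request set be replayed against a different test environment by swapping one dict.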
In one embodiment, the step of executing the interface test script to call all the interfaces to be tested includes: and executing the interface test script to send the plurality of request messages corresponding to the interface to be tested to the related called party through each interface to be tested. And after receiving any request message corresponding to the interface to be tested, the called party associated with any interface to be tested generates a corresponding return message and returns the return message to the test equipment.
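The execution step can be sketched as a runner that sends every bundled request through its interface and collects the called party's return messages per interface. The `call_interface` callable stands in for the real remote-platform call, which the text does not specify; injecting it keeps the sketch testable without a platform.

```python
def run_script(script, call_interface):
    """Execute the script: send each (interface_code, message, env)
    entry through its interface and group the return messages by
    interface, one return message per request message."""
    returns_by_interface = {}
    for interface_code, message, env in script:
        reply = call_interface(interface_code, message, env)
        returns_by_interface.setdefault(interface_code, []).append(reply)
    return returns_by_interface
```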
In an embodiment, the step of obtaining all the return messages received after all the interfaces to be tested are called, and performing test result analysis according to all the received return messages to obtain the test results of all the interfaces to be tested, as shown in fig. 4, includes:
s131: acquiring a plurality of return messages returned by a called party associated with each interface to be tested;
s132: and respectively analyzing the test result according to the plurality of returned messages returned by the called party associated with each interface to be tested, and obtaining the test result of each interface to be tested.
Specifically, after receiving a request message corresponding to any interface to be tested, the called party associated with that interface generates a corresponding return message and returns it to the test equipment. Once the test equipment has received the return messages from the called parties associated with all interfaces to be tested, it performs test result analysis on the return messages corresponding to each interface (i.e., the plurality of return messages returned by that interface's associated called party) to obtain the test result of each interface to be tested in this test.
More specifically, in an embodiment, the step of analyzing the test result according to the multiple return messages returned by the called party associated with each interface to be tested, to obtain the test result of each interface to be tested, as shown in fig. 5, includes:
s1321: acquiring a pre-specified baseline version test result set related to each interface to be tested, wherein the baseline version test result set related to each interface to be tested comprises a plurality of historical return messages obtained by testing the interface to be tested of the baseline version;
s1322: and respectively taking a plurality of returned messages returned by the called party associated with each interface to be tested as a current version test result set related to the interface to be tested, performing binary comparison on the current version test result set related to the interface to be tested and a baseline version test result set by using a preset comparison tool, and performing test result analysis according to a comparison result between the current version test result set related to the interface to be tested and the baseline version test result set to obtain a test result of the interface to be tested.
In this embodiment, the test equipment stores each return message after receiving it. Specifically, each interface to be tested has a corresponding local storage path for return messages (for example, a particular folder); on receiving a return message, the test equipment determines the corresponding interface and stores the message under that interface's storage path. More specifically, each return message is stored as its own file (for example, a .txt file); since one interface to be tested corresponds to multiple return messages, one storage path is associated with multiple return message files (that is, one folder contains several files). The test result set obtained for each interface in each version's automated interface test (i.e., the files produced by storing all of that interface's return messages in one test) is retained after the test. In a new test performed after the version is updated again, the current version test result set of an interface (the set obtained in the present test) can therefore be compared with that interface's baseline version test result set (the baseline version may be the previous version or, if there are several earlier versions, any earlier version designated in advance).
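Storing each return message as its own file under a per-interface folder, as described above, might look like the following; the folder layout and the `return_NNN.txt` naming scheme are illustrative assumptions.

```python
import os

def store_return_messages(base_dir, interface_code, return_messages):
    """Persist each return message as its own .txt file under a folder
    named after the interface, so this run's result set can later be
    binary-compared against a baseline set."""
    folder = os.path.join(base_dir, interface_code)
    os.makedirs(folder, exist_ok=True)
    paths = []
    for index, message in enumerate(return_messages):
        path = os.path.join(folder, "return_%03d.txt" % index)
        with open(path, "w", encoding="utf-8") as f:
            f.write(message)
        paths.append(path)
    return paths
```

Deterministic, index-based filenames matter here: they let the comparison step pair each current file with the baseline file of the same name.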
The tester may open a local comparison tool (for example, Beyond Compare or WinMerge) from the development tool and use it to perform a binary comparison between the baseline version test result set and the current version test result set of each interface to be tested. The tool displays the differences wherever the two sets are inconsistent (for example, the filenames of return message files that fail the comparison may be highlighted, marked in red and/or shown in bold to distinguish them from files that match). The tester then opens each differing return message file to compare the results; the tool marks the inconsistent fields inside the differing messages, and the tester analyzes the cause of each inconsistency on that basis, thereby obtaining the test result of each interface to be tested in this test.
Specifically, if the two test result sets are determined to be entirely consistent after comparison, it can be concluded that the service is not affected by the change in the new version; the automated test of the interface to be tested passes, and the new version's test result set is then stored. If the two test result sets are not completely consistent, the cause of the inconsistency may be (1) a code bug, (2) a component upgrade (for example, a MySQL database version upgrade), or (3) fields newly added in the new version, and such cases need to be handled by the tester. In an actual test, a tester executed an edited interface test script in the development tool, selected a test environment, and, after the script finished, checked the specified directory (the pre-specified return message storage path). The total execution time for running 20 test cases with the interface test script was about 16 seconds, whereas executing the same 20 test cases as an ordinary automation task takes about 2 hours. For the same test task, this is an efficiency improvement of more than 400 times and saves a large amount of time.
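The folder-level binary comparison described above can be approximated with Python's standard library (a sketch under the assumption that each test result set is a plain folder of return message files; `filecmp` stands in for an interactive tool such as WinMerge, which additionally highlights the differing fields inside each file):

```python
import filecmp
import os


def compare_result_sets(baseline_dir: str, current_dir: str) -> dict:
    """Binary-compare the baseline version and current version test
    result sets and report which return message files differ."""
    names = sorted(set(os.listdir(baseline_dir)) | set(os.listdir(current_dir)))
    # shallow=False forces a byte-by-byte comparison of file contents.
    match, mismatch, errors = filecmp.cmpfiles(
        baseline_dir, current_dir, names, shallow=False)
    # "errors" collects files present in only one of the two sets
    # (or unreadable ones); these also need the tester's attention.
    return {"match": match, "mismatch": mismatch, "missing": errors}
```

If `mismatch` and `missing` are both empty, the two result sets are entirely consistent and the interface passes; otherwise the tester reviews the listed files to decide between a code bug, a component upgrade, or newly added fields.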
The reason tester intervention is needed to analyze the test results is that, in current industry practice for automated testing, once a failing scenario's message has been located, the follow-up steps usually involve direct communication with the corresponding developer, so manual analysis is more reliable and safer. In addition, in actual testing work, the number of scenarios that fail a test is small; if failures occurred over a large area, the requirement itself would not be suitable for automation.
Based on the same inventive concept, the invention also provides an automatic interface testing device. The device can be used for testing one interface to be tested or simultaneously testing a plurality of interfaces to be tested. In one embodiment, as shown in fig. 6, the interface automation test device includes the following modules:
the interface test script generating module 110 is configured to obtain input parameter information related to all interfaces to be tested from a specified database, and generate an interface test script according to the input parameter information related to all interfaces to be tested;
an interface test script execution module 120, configured to execute the interface test script to call all interfaces to be tested;
the test result analysis module 130 is configured to obtain all return messages received after all the interfaces to be tested are called, and perform test result analysis according to all the received return messages to obtain test results of all the interfaces to be tested.
In one embodiment, the interface test script generating module comprises:
the input parameter acquisition submodule is used for acquiring input parameter information related to each interface to be tested from a specified database;
the request message generation submodule is used for generating a plurality of request messages corresponding to each interface to be tested according to the input parameter information related to each interface to be tested;
and the interface test script generation submodule is used for generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested.
In one embodiment, the input parameter information related to each interface to be tested comprises multiple sets of entry parameter information, and each set of entry parameter information comprises entry parameter values corresponding to multiple interface fields; one set of entry parameter information is used to generate one request message.
The request message generation sub-module is further configured to generate a request message corresponding to the interface to be tested according to a set of entry parameter information included in the input parameter information related to any one of the interfaces to be tested. More specifically, the request message generation sub-module is further configured to add an entry parameter value corresponding to each interface field included in a set of entry parameter information of any interface to be tested to the interface field information corresponding to each interface field to perform message assembly, and assemble to obtain a request message corresponding to the interface to be tested.
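The assembly step above can be sketched minimally as follows (the dictionary-based field information format and the JSON message body are illustrative assumptions; the patent does not fix a message format):

```python
import json


def assemble_request_message(field_info: dict, entry_params: dict) -> str:
    """Fill each interface field's entry parameter value into the
    corresponding interface field information to assemble one request
    message for the interface under test."""
    message = {}
    for field in field_info:
        # The entry parameter value corresponding to each interface
        # field is added under that field to build the message body.
        message[field] = entry_params.get(field)
    return json.dumps(message, sort_keys=True)
```

One set of entry parameter information yields one request message; iterating over all sets in the input parameter information yields the multiple request messages corresponding to each interface to be tested.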
In one embodiment, the interface test script generation sub-module includes:
the test type acquisition unit is used for acquiring preset test type information corresponding to each interface to be tested;
and the interface test script generating unit is used for generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested and the preset test type information.
In an embodiment, the interface test script execution module is specifically configured to execute the interface test script, so as to send multiple request messages corresponding to each interface to be tested to the callee associated with the interface to be tested through each interface to be tested. After receiving any request message corresponding to the interface to be tested, the called party associated with any interface to be tested generates a corresponding return message and returns the return message.
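The execution step can be sketched with the standard library alone (the HTTP endpoint and JSON framing are assumptions for illustration; the patent does not fix a transport for sending request messages to the callee):

```python
from urllib import request


def execute_interface_test(endpoint: str, request_messages: list) -> list:
    """Send each request message through the interface under test to the
    associated callee and collect the return messages it sends back."""
    returns = []
    for body in request_messages:
        # A Request with a data payload is sent as an HTTP POST.
        req = request.Request(
            endpoint,
            data=body.encode("utf-8"),
            headers={"Content-Type": "application/json"})
        with request.urlopen(req) as resp:
            returns.append(resp.read().decode("utf-8"))
    return returns
```

Each collected return message would then be stored under the interface's return message storage path to form the current version test result set.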
In one embodiment, the test result analysis module includes:
the return message acquisition submodule is used for acquiring a plurality of return messages returned by the called party associated with each interface to be tested;
and the test result analysis submodule is used for carrying out test result analysis according to a plurality of returned messages returned by the called party associated with each interface to be tested respectively to obtain the test result of each interface to be tested.
In one embodiment, the test result analysis submodule includes:
a baseline result set acquisition unit, configured to acquire a pre-specified baseline version test result set associated with each interface to be tested, where the baseline version test result set associated with each interface to be tested includes multiple historical return messages obtained by testing the interface to be tested of the baseline version;
and the test result analysis unit is used for respectively taking a plurality of returned messages returned by the called party associated with each interface to be tested as a current version test result set related to the interface to be tested, performing binary comparison on the current version test result set related to the interface to be tested and a baseline version test result set by using a preset comparison tool, and performing test result analysis according to a comparison result between the current version test result set related to the interface to be tested and the baseline version test result set to obtain a test result of the interface to be tested.
For specific definition of the interface automated testing apparatus, reference may be made to the definition of the interface automated testing method in the foregoing, and details are not described here. All or part of each module in the interface automatic testing device can be realized by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a computer device is provided, the internal structure of which may be as shown in FIG. 7. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The database of the computer equipment is used for storing data such as input parameter information of the interface to be tested. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a method for automated testing of an interface.
Those skilled in the art will appreciate that the architecture shown in fig. 7 is merely a block diagram of some of the structures associated with the disclosed aspects and does not limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than those shown, combine certain components, or arrange the components differently.
In one embodiment, a computer device is provided, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
acquiring input parameter information related to all interfaces to be tested from a designated database, and generating an interface test script according to the input parameter information related to all the interfaces to be tested; executing the interface test script to call all the interfaces to be tested; and acquiring all the return messages received after calling all the interfaces to be tested, and analyzing the test results according to all the received return messages to acquire the test results of all the interfaces to be tested.
In one embodiment, when the processor executes the computer program to obtain the input parameter information related to all the interfaces to be tested from the designated database and generate the interface test script according to the input parameter information related to all the interfaces to be tested, the following steps are also implemented:
acquiring input parameter information related to each interface to be tested from a specified database; generating a plurality of request messages corresponding to each interface to be tested according to the input parameter information related to each interface to be tested; and generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested.
In one embodiment, the input parameter information related to each interface to be tested comprises multiple sets of entry parameter information, and each set of entry parameter information comprises entry parameter values corresponding to multiple interface fields; one set of entry parameter information is used to generate one request message.
In one embodiment, when the processor executes the computer program to generate a request message corresponding to any interface to be tested according to a set of parameter information included in input parameter information related to the interface to be tested, the following steps are further implemented:
and adding the parameter values corresponding to the interface fields included in the group of parameter information of any interface to be tested into the interface field information corresponding to the interface fields to carry out message assembly, and assembling to obtain a request message corresponding to the interface to be tested.
In one embodiment, when the processor executes the computer program to generate the interface test script according to the multiple request messages corresponding to all the interfaces to be tested, the following steps are also implemented:
acquiring preset test type information corresponding to each interface to be tested; and generating an interface test script according to the plurality of request messages corresponding to all interfaces to be tested and the preset test type information.
In one embodiment, the processor executes the computer program to implement the following steps when executing the interface test script to call all the interfaces to be tested:
executing the interface test script to send a plurality of request messages corresponding to the interface to be tested to the called party associated with the interface to be tested through each interface to be tested; and the called party associated with any interface to be tested generates a corresponding return message after receiving any request message corresponding to the interface to be tested and returns the corresponding return message.
In one embodiment, the processor executes the computer program to obtain all return messages received after calling all the interfaces to be tested, performs test result analysis according to all the received return messages, and further implements the following steps when obtaining the test results of all the interfaces to be tested:
acquiring a plurality of return messages returned by a called party associated with each interface to be detected; and analyzing the test result according to a plurality of returned messages returned by the called party associated with each interface to be tested respectively to obtain the test result of each interface to be tested.
In one embodiment, when the processor executes the computer program, the processor analyzes the test result according to a plurality of return messages returned by the called party associated with each interface to be tested, and when the processor obtains the test result of each interface to be tested, the processor further implements the following steps:
acquiring a pre-specified baseline version test result set related to each interface to be tested, wherein the baseline version test result set related to each interface to be tested comprises a plurality of historical return messages obtained by testing the interface to be tested of the baseline version; and respectively taking a plurality of returned messages returned by the called party associated with each interface to be tested as a current version test result set related to the interface to be tested, performing binary comparison on the current version test result set related to the interface to be tested and a baseline version test result set by using a preset comparison tool, and performing test result analysis according to a comparison result between the current version test result set related to the interface to be tested and the baseline version test result set to obtain a test result of the interface to be tested.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
acquiring input parameter information related to all interfaces to be tested from a designated database, and generating an interface test script according to the input parameter information related to all the interfaces to be tested; executing the interface test script to call all the interfaces to be tested; and acquiring all the return messages received after calling all the interfaces to be tested, and analyzing the test results according to all the received return messages to acquire the test results of all the interfaces to be tested.
In one embodiment, when the computer program is executed by the processor, and the input parameter information related to all the interfaces to be tested is obtained from the designated database, and the interface test script is generated according to the input parameter information related to all the interfaces to be tested, the following steps are further implemented:
acquiring input parameter information related to each interface to be tested from a specified database; generating a plurality of request messages corresponding to each interface to be tested according to the input parameter information related to each interface to be tested; and generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested.
In one embodiment, the input parameter information related to each interface to be tested comprises multiple sets of entry parameter information, and each set of entry parameter information comprises entry parameter values corresponding to multiple interface fields; one set of entry parameter information is used to generate one request message.
In one embodiment, when the computer program is executed by the processor and generates a request message corresponding to any interface to be tested according to a set of parameter information included in input parameter information related to the interface to be tested, the following steps are further implemented:
and adding the parameter values corresponding to the interface fields included in the group of parameter information of any interface to be tested into the interface field information corresponding to the interface fields to carry out message assembly, and assembling to obtain a request message corresponding to the interface to be tested.
In one embodiment, when the computer program is executed by the processor and generates an interface test script according to a plurality of request messages corresponding to all interfaces to be tested, the following steps are also implemented:
acquiring preset test type information corresponding to each interface to be tested; and generating an interface test script according to the plurality of request messages corresponding to all interfaces to be tested and the preset test type information.
In one embodiment, when the computer program is executed by the processor and executes the interface test script to call all the interfaces to be tested, the following steps are also implemented:
executing the interface test script to send a plurality of request messages corresponding to the interface to be tested to the called party associated with the interface to be tested through each interface to be tested; and the called party associated with any interface to be tested generates a corresponding return message after receiving any request message corresponding to the interface to be tested and returns the corresponding return message.
In one embodiment, when the computer program is executed by the processor, acquires all return messages received after all interfaces to be tested are called, performs test result analysis according to all the received return messages, and acquires the test results of all the interfaces to be tested, the following steps are further implemented:
acquiring a plurality of return messages returned by a called party associated with each interface to be tested; and analyzing the test result according to a plurality of returned messages returned by the called party associated with each interface to be tested respectively to obtain the test result of each interface to be tested.
In one embodiment, when the computer program is executed by the processor, the computer program performs test result analysis according to multiple return messages returned by the called party associated with each interface to be tested, and when the test result of each interface to be tested is obtained, the following steps are further implemented:
acquiring a pre-specified baseline version test result set related to each interface to be tested, wherein the baseline version test result set related to each interface to be tested comprises a plurality of historical return messages obtained by testing the interface to be tested of the baseline version; and respectively taking a plurality of returned messages returned by the called party associated with each interface to be tested as a current version test result set related to the interface to be tested, performing binary comparison on the current version test result set related to the interface to be tested and a baseline version test result set by using a preset comparison tool, and performing test result analysis according to a comparison result between the current version test result set related to the interface to be tested and the baseline version test result set to obtain a test result of the interface to be tested.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM), among others.
For brevity, not all possible combinations of the technical features of the above embodiments are described; however, any combination of these technical features should be considered within the scope of the present disclosure as long as no contradiction exists between them.
The above embodiments express only several implementations of the present application, and their description is specific and detailed, but they should not be understood as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent application shall be subject to the appended claims.

Claims (8)

1. An automatic interface test method is characterized in that the method is used for testing one interface to be tested or simultaneously testing a plurality of interfaces to be tested; the method comprises the following steps:
acquiring input parameter information related to all the interfaces to be tested from a designated database, and generating an interface test script according to the input parameter information related to all the interfaces to be tested;
executing the interface test script to call all the interfaces to be tested;
acquiring all return messages received after all the interfaces to be tested are called, and analyzing test results according to all the received return messages to acquire the test results of all the interfaces to be tested;
the step of obtaining all the return messages received after calling all the interfaces to be tested, and analyzing the test results according to all the received return messages to obtain the test results of all the interfaces to be tested comprises the following steps:
obtaining a plurality of return messages returned by the called party associated with each interface to be tested, and analyzing test results according to the plurality of return messages returned by the called party associated with each interface to be tested respectively to obtain the test result of each interface to be tested;
the step of analyzing the test result according to the multiple returned messages returned by the called party associated with each interface to be tested to obtain the test result of each interface to be tested comprises the following steps:
acquiring a pre-specified baseline version test result set related to each interface to be tested, wherein the baseline version test result set related to each interface to be tested comprises a plurality of historical return messages obtained by testing the interface to be tested of the baseline version;
and respectively taking a plurality of returned messages returned by the called party associated with each interface to be tested as a current version test result set related to the interface to be tested, performing binary comparison on the current version test result set related to the interface to be tested and a baseline version test result set by using a preset comparison tool, and performing test result analysis according to a comparison result between the current version test result set related to the interface to be tested and the baseline version test result set to obtain a test result of the interface to be tested.
2. The method for automatically testing an interface according to claim 1, wherein the step of obtaining all the input parameter information related to the interface to be tested from the designated database and generating an interface test script according to all the input parameter information related to the interface to be tested comprises:
acquiring input parameter information related to each interface to be tested from a specified database;
generating a plurality of request messages corresponding to each interface to be tested according to the input parameter information related to each interface to be tested;
and generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested.
3. The method for automated interface testing of claim 2,
the input parameter information related to each interface to be tested comprises multiple sets of entry parameter information, and each set of entry parameter information comprises entry parameter values corresponding to multiple interface fields; one set of the entry parameter information is used to generate one request message;
the step of generating a request message corresponding to any interface to be tested according to one set of entry parameter information included in the input parameter information related to the interface to be tested comprises:
and adding the parameter values corresponding to the interface fields included in the group of parameter information of any interface to be tested into the interface field information corresponding to the interface fields to carry out message assembly, and assembling to obtain a request message corresponding to the interface to be tested.
4. The automated interface testing method of claim 2,
the step of generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested includes:
acquiring preset test type information corresponding to each interface to be tested;
and generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested and preset test type information.
5. The method for automated interface testing of claim 2,
the step of executing the interface test script to call all the interfaces to be tested includes:
executing the interface test script to send a plurality of request messages corresponding to the interface to be tested to the called party associated with the interface to be tested through each interface to be tested; and the called party associated with any one interface to be tested generates a corresponding return message after receiving any request message corresponding to the interface to be tested and returns the corresponding return message.
6. An automatic interface test device is characterized in that the device is used for testing one interface to be tested or simultaneously testing a plurality of interfaces to be tested; the device comprises:
the interface test script generation module is used for acquiring all input parameter information related to the interface to be tested from a specified database and generating an interface test script according to all input parameter information related to the interface to be tested;
the interface test script execution module is used for executing the interface test script so as to call all the interfaces to be tested;
the test result analysis module is used for acquiring all the return messages received after all the interfaces to be tested are called, and analyzing the test results according to all the received return messages to acquire the test results of all the interfaces to be tested;
the test result analysis module comprises:
the return message acquisition submodule is used for acquiring a plurality of return messages returned by the called party associated with each interface to be tested;
the test result analysis submodule is used for carrying out test result analysis according to a plurality of returned messages returned by the called party associated with each interface to be tested respectively to obtain a test result of each interface to be tested;
the test result analysis submodule comprises:
a baseline result set acquisition unit, configured to acquire a pre-specified baseline version test result set associated with each interface to be tested, where the baseline version test result set associated with each interface to be tested includes multiple historical return messages obtained by testing the interface to be tested of the baseline version;
and the test result analysis unit is used for respectively taking a plurality of returned messages returned by the called party associated with each interface to be tested as a current version test result set related to the interface to be tested, performing binary comparison on the current version test result set related to the interface to be tested and a baseline version test result set by using a preset comparison tool, and performing test result analysis according to a comparison result between the current version test result set related to the interface to be tested and the baseline version test result set to obtain a test result of the interface to be tested.
7. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the method of any of claims 1 to 5 are implemented when the computer program is executed by the processor.
8. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 5.
CN202011353754.7A 2020-11-26 2020-11-26 Interface automation test method and device, computer equipment and storage medium Active CN112395202B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011353754.7A CN112395202B (en) 2020-11-26 2020-11-26 Interface automation test method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011353754.7A CN112395202B (en) 2020-11-26 2020-11-26 Interface automation test method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112395202A CN112395202A (en) 2021-02-23
CN112395202B true CN112395202B (en) 2023-04-14

Family

ID=74604622

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011353754.7A Active CN112395202B (en) 2020-11-26 2020-11-26 Interface automation test method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112395202B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112817867A (en) * 2021-02-25 2021-05-18 平安消费金融有限公司 Interface test script generation method and device, computer equipment and medium
CN113377668A (en) * 2021-06-29 2021-09-10 南京苏宁软件技术有限公司 Automatic testing method and device for service interface and computer equipment
CN113468049B (en) * 2021-06-29 2023-07-04 平安养老保险股份有限公司 Configurable interface-based test method, device, equipment and medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110471842A (en) * 2019-07-12 2019-11-19 平安普惠企业管理有限公司 A kind of test method, device and computer readable storage medium
CN111858376A (en) * 2020-07-29 2020-10-30 平安养老保险股份有限公司 Request message generation method and interface test method


Also Published As

Publication number Publication date
CN112395202A (en) 2021-02-23

Similar Documents

Publication Publication Date Title
CN112395202B (en) Interface automation test method and device, computer equipment and storage medium
CN107992409B (en) Test case generation method and device, computer equipment and storage medium
CN108427613B (en) Abnormal interface positioning method and device, computer equipment and storage medium
CN108459962B (en) Code normalization detection method and device, terminal equipment and storage medium
CN110569035B (en) Code compiling method, device, equipment and storage medium of software development project
CN108874661B (en) Test mapping relation library generation method and device, computer equipment and storage medium
CN109726134B (en) Interface test method and system
CN109977008B (en) Method and terminal for making JS code depended on by application program compatible with native library
CN110058998B (en) Software testing method and device
US8661414B2 (en) Method and system for testing an order management system
US20210048999A1 (en) Automated generation of status chains for software updates
CN112380130A (en) Application testing method and device based on call dependency relationship
CN115391228A (en) Precise test method, device, equipment and medium
CN113282513B (en) Interface test case generation method and device, computer equipment and storage medium
CN113835713B (en) Source code packet downloading method, device, computer equipment and storage medium
US20210026756A1 (en) Deriving software application dependency trees for white-box testing
EP2913757A1 (en) Method, system, and computer software product for test automation
CN113377669A (en) Automatic testing method and device, computer equipment and storage medium
CN112612706A (en) Automated testing method, computer device and storage medium
CN116431522A (en) Automatic test method and system for low-code object storage gateway
CN111159025A (en) Application program interface testing method and device, computer equipment and storage medium
CN115757172A (en) Test execution method and device, storage medium and computer equipment
CN115934129A (en) Software project updating method and device, computer equipment and storage medium
CN113886262A (en) Software automation test method and device, computer equipment and storage medium
CN114461219A (en) Data analysis method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210510

Address after: 518002 unit 3510-130, Luohu business center, 2028 Shennan East Road, Chengdong community, Dongmen street, Luohu District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen yunwangwandian Technology Co.,Ltd.

Address before: No.1-1 Suning Avenue, Xuzhuang Software Park, Xuanwu District, Nanjing, Jiangsu Province, 210000

Applicant before: Suning Cloud Computing Co.,Ltd.

REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40044723

Country of ref document: HK

GR01 Patent grant