CN112395202A - Interface automation test method and device, computer equipment and storage medium - Google Patents


Info

Publication number: CN112395202A (application CN202011353754.7A; granted publication CN112395202B)
Authority: CN (China)
Prior art keywords: interface, tested, test, interfaces, test result
Legal status: Granted
Application number: CN202011353754.7A
Original language: Chinese (zh)
Other versions: CN112395202B (granted publication)
Inventor
荆伟
尤长浩
徐勇
徐梅兰
Current Assignee: Shenzhen Yunwangwandian Technology Co., Ltd.
Original Assignee: Suning Cloud Computing Co., Ltd.
Application filed by Suning Cloud Computing Co., Ltd.
Priority application: CN202011353754.7A
Publication of application: CN112395202A
Application granted; publication of grant: CN112395202B
Legal status: Active

Classifications

    • G06F11/368 — Test management for test version control, e.g. updating test cases to a new software version
    • G06F11/3688 — Test management for test execution, e.g. scheduling of test suites
    (Both fall under G — Physics → G06 — Computing; calculating or counting → G06F — Electric digital data processing → G06F11/00 — Error detection; error correction; monitoring → G06F11/36 — Preventing errors by testing or debugging software → G06F11/3668 — Software testing → G06F11/3672 — Test management.)

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application relates to an interface automation test method and device, computer equipment and a storage medium. The method comprises the following steps: acquiring, from a designated database, input parameter information related to all interfaces to be tested, and generating an interface test script from that information; executing the interface test script to call all the interfaces to be tested; and acquiring all return messages received after the interfaces are called, then analyzing the received return messages to obtain the test result of each interface to be tested. The embodiment of the invention adapts to automated testing under rapid version iteration: when an urgent release is involved, in particular when a new version must be released after the underlying code has been modified or upgraded, the existing query interfaces can be called quickly and accurately so that the main business flow is rapidly covered.

Description

Interface automation test method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of interface testing technologies, and in particular, to an interface automated testing method and apparatus, a computer device, and a storage medium.
Background
When an urgent release is involved, in particular when a new version must be released after the underlying code has been modified or upgraded, the distributed remote procedure call (RPC) services that will be invoked must be covered quickly and accurately in testing, so that the impact of the new version's changes on existing interfaces can be analyzed.
Consider the conventional automation of the central platform's order query interface: the existing SAT automation of the query interface (SAT, Suning Automation Tester, is an automated testing tool developed by Suning) starts from the upstream order-taking link, writes different variables (constants) into an Excel document according to the order type, has SAT read several configuration files to generate a specified order, and only then initiates the query.
This implementation is complex: many files are shared and conflicts are frequent, so the maintenance cost for testers is relatively high; the link is long, and each test case takes a long time to execute. It therefore cannot adapt to automated testing under rapid version iteration. When an urgent release is involved, in particular when a new version must be released after the underlying code has been modified or upgraded, it cannot meet the requirement of quickly and accurately calling the existing query interfaces to cover the main business flow; and if the query interface is not covered by the test, problems such as a circular reference ($ref) that the front end cannot parse may slip through.
Disclosure of Invention
The embodiment of the invention adapts to automated testing under rapid version iteration: when an urgent release is involved, in particular when a new version must be released after the underlying code has been modified or upgraded, the existing query interfaces can be called quickly and accurately so that the main business flow is rapidly covered.
According to a first aspect, the present invention provides an automated interface testing method which, in one embodiment, is used to test a single interface to be tested or several interfaces to be tested simultaneously; the method comprises the following steps:
acquiring, from a designated database, input parameter information related to all interfaces to be tested, and generating an interface test script from that information;
executing the interface test script to call all the interfaces to be tested;
and acquiring all return messages received after the interfaces to be tested are called, and analyzing the received return messages to obtain the test result of each interface to be tested.
In one embodiment, the step of obtaining input parameter information related to all interfaces to be tested from a designated database and generating an interface test script according to the input parameter information related to all interfaces to be tested includes:
acquiring input parameter information related to each interface to be tested from a specified database;
generating a plurality of request messages corresponding to each interface to be tested according to the input parameter information related to each interface to be tested;
and generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested.
In one embodiment, the input parameter information related to each interface to be tested comprises several groups of input parameters, each group containing the input values of several interface fields; one group of input parameters is used to generate one request message;
the step of generating a request message for any interface to be tested from one group of input parameters included in that interface's input parameter information comprises:
adding each input value in the group to the interface field information of the corresponding interface field, thereby assembling the message and obtaining the request message corresponding to the interface to be tested.
In one embodiment, the step of generating an interface test script according to a plurality of request messages corresponding to all interfaces to be tested includes:
acquiring preset test type information corresponding to each interface to be tested;
and generating an interface test script according to the plurality of request messages corresponding to all interfaces to be tested and the preset test type information.
In one embodiment, the step of executing the interface test script to call all the interfaces to be tested includes:
executing the interface test script so as to send, through each interface to be tested, that interface's request messages to its associated called party; upon receiving any request message for its interface, the associated called party generates a corresponding return message and returns it.
In one embodiment, the step of obtaining all return messages received after all the interfaces to be tested are called, analyzing the test result according to all the received return messages, and obtaining the test results of all the interfaces to be tested includes:
acquiring a plurality of return messages returned by a called party associated with each interface to be tested;
and analyzing the test result according to a plurality of returned messages returned by the called party associated with each interface to be tested respectively to obtain the test result of each interface to be tested.
In one embodiment, the step of analyzing the test result according to the multiple return messages returned by the called party associated with each interface to be tested to obtain the test result of each interface to be tested includes:
acquiring a pre-specified baseline version test result set related to each interface to be tested, wherein the baseline version test result set related to each interface to be tested comprises a plurality of historical return messages obtained by testing the interface to be tested of the baseline version;
and taking the several return messages returned by the called party associated with each interface to be tested as that interface's current-version test result set, performing a binary comparison between the current-version test result set and the baseline-version test result set with a preset comparison tool, and analyzing the comparison result to obtain the test result of the interface to be tested.
The invention provides an interface automatic test device according to a second aspect, which is used for testing one interface to be tested or simultaneously testing a plurality of interfaces to be tested in one embodiment; the device includes:
the interface test script generation module is used for acquiring input parameter information related to all interfaces to be tested from the specified database and generating interface test scripts according to the input parameter information related to all the interfaces to be tested;
the interface test script execution module is used for executing the interface test script so as to call all the interfaces to be tested;
and the test result analysis module is used for acquiring all the return messages received after all the interfaces to be tested are called, and analyzing the test results according to all the received return messages to acquire the test results of all the interfaces to be tested.
The present invention provides according to a third aspect a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of an embodiment of any of the methods described above when executing the computer program.
The present invention provides according to a fourth aspect a computer-readable storage medium having stored thereon a computer program which, when being executed by a processor, carries out the steps of the embodiments of the method of any one of the above.
In the embodiment of the invention, during testing, input parameter information related to all interfaces to be tested is obtained from the designated database and an interface test script is generated from it; the interface test script is executed to call all the interfaces to be tested; and all return messages received after the interfaces are called are collected and analyzed to obtain the test result of each interface. The embodiment adapts to automated testing under rapid version iteration; when an urgent release is involved, in particular when a new version must be released after the underlying code has been modified or upgraded, the existing query interfaces can be called quickly and accurately to cover the main business flow. The execution time of the original SAT automation is shortened from dozens of hours to a few minutes, greatly improving tester efficiency and saving time.
Drawings
FIG. 1 is a flow chart of an automated test of a conventional interface;
FIG. 2 is a flow chart illustrating a method for automated testing of an interface according to an embodiment;
FIG. 3 is a flow diagram that illustrates the generation of an interface test script, according to one embodiment;
FIG. 4 is a flowchart illustrating an analysis of a test result of an interface to be tested according to an embodiment;
FIG. 5 is a schematic diagram illustrating a process of analyzing a test result of an interface to be tested according to another embodiment;
FIG. 6 is a block diagram of an exemplary automated interface test equipment;
FIG. 7 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
A new version may add a new service, and that service may call an existing query interface. For example, in an order query scenario, a service newly added in the new version needs to invoke a distributed remote invocation service to query its orders (specifically, the distributed remote invocation service is invoked by calling the order query interface of a remote service platform, the precondition being that both the calling party and the called party have registered the called system and interface information on that platform). All main-flow interfaces of the new service are then tested before the new version is formally released, to analyze the impact of the change on the existing order query interface. If an urgent release is involved, in particular when the new version must be released after the underlying code has been modified or upgraded, the interfaces that need to be called should be covered as quickly and accurately as possible in testing.
However, the existing interface test approach is complex: many files are shared and conflicts are frequent, so tester maintenance cost is relatively high, the link is long, and each test case takes a long time to execute; it cannot adapt to automated testing under rapid version iteration, and when an urgent release is involved, especially when a new version must be released after the underlying code has been modified or upgraded, it cannot quickly and accurately call the existing query interfaces to cover the main business flow. Taking the order query interface as an example, as shown in fig. 1, in the conventional test mode the tester first has to assemble every field of the newly added service manually in an Excel document (an order of a newly added type typically has about 300 fields). The local Excel file is then read by the SAT automation tool; this reading takes a long time, waiting time is configured between steps, incremental order information (such as payment and refund increments of an order) must also be entered into the Excel manually, and field conflicts, read failures and similar problems must be handled while the SAT tool reads the file. After the order data has been created, the automation interface is called to perform the order query and debugging; the query request message is again assembled by reading the local Excel, and once assembled it is read by the test tool and sent to the interface under test on the remote service platform.
To address the defects of this implementation, the invention provides an automated interface testing method. In one embodiment, the method comprises the steps shown in fig. 2:
S110: acquiring, from the designated database, input parameter information related to all interfaces to be tested, and generating an interface test script from that information.
S120: and executing the interface test script to call all the interfaces to be tested.
S130: and acquiring all the return messages received after calling all the interfaces to be tested, and analyzing the test results according to all the received return messages to acquire the test results of all the interfaces to be tested.
This embodiment can run on common test equipment (for example a smartphone, tablet computer, notebook computer, desktop computer or server). During testing, the tester uses the test equipment to obtain, from the designated database, the input parameter information related to all interfaces to be tested. (In this embodiment, one test may exercise a single interface to be tested or several simultaneously; "all interfaces to be tested" therefore means all interfaces covered by one test, and their number may be one or more.) An interface test script is generated from that input parameter information and then executed to call all the interfaces to be tested; the script may test a single interface or several interfaces simultaneously. When any interface to be tested is called, the called party corresponding to it (which may be a called system) returns a corresponding return message to the test equipment; once the test equipment has received all return messages for all called interfaces, it analyzes them to obtain the test result of each interface to be tested.
An interface to be tested is an interface called by a service newly added in the new version; in different application scenarios it is a different concrete interface — in an order query scenario it is an order query interface, and in a user-information query scenario it is a user-information query interface. The designated database is a database designated in advance to store the input parameter information of all interfaces to be tested; it can be implemented with a MySQL database, and this embodiment does not limit the implementation. The input parameter information related to an interface to be tested consists of the parameter values required to exercise that interface; for example, an order query scenario may require the order number, membership number, calling system and similar order-related information. The required inputs can be determined from the interface document of each called system (each interface's input parameters are defined there; if the document requires an order number and membership number, those values are looked up in the designated database), and the values are then fetched from the designated database with service-specific query code.
More specifically, the input parameter information related to each interface to be tested comprises several groups of input parameters, each group containing the input values of several interface fields; one group of input parameters is used to generate one request message.
A return message is the message that the called party corresponding to an interface to be tested generates from a received request message and returns to the test equipment; each request message corresponds to one return message. The return fields differ between application scenarios — in an order query scenario the return message may include the detail information of the queried order, such as its price, pictures and consignee information — and they also differ between interfaces; the interface document is authoritative for the return fields.
The embodiment of the invention abandons the traditional way of fabricating data through Excel in favour of fetching the data related to the interfaces to be tested directly from the designated database (in an order query scenario, for example, the orders of the new service are already stored in the database once the functional test of the new version has stabilized, so the order data can be read directly from it), and it turns the query interface's input parameters into calls by executing the interface test script, which synchronously invokes local code to send request messages to the remote service platform. Because the interfaces to be tested are called through the script, the tester only has to fetch input parameter information from the existing test environment and assemble the script around it. Assembling the script is simple, and the points the tester must attend to while writing it are few (in an order query scenario, only the interface code of the interface to be tested, the request message, the file name of the return message, and the like). Compared with the traditional mode of reading a local Excel file, this visibly shortens automated interface testing, cutting the execution time of the original SAT automation from dozens of hours to a few minutes, greatly improving tester efficiency and saving time.
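Steps S110–S130 can be illustrated with a minimal sketch. All names here (`build_test_script`, `run_script`, the `input_params` table layout, and the `call_interface` stub) are hypothetical illustrations, not part of the patent; SQLite stands in for the designated database, which the patent says may be MySQL.

```python
import json
import sqlite3  # stands in for the designated database (the patent mentions MySQL)

def build_test_script(db, interfaces):
    """S110: fetch each interface's input parameter groups from the database
    and pair every interface with its generated request messages."""
    script = []
    for iface in interfaces:
        rows = db.execute(
            "SELECT params FROM input_params WHERE interface = ?", (iface,)
        ).fetchall()
        script.append((iface, [json.loads(r[0]) for r in rows]))
    return script

def run_script(script, call_interface):
    """S120/S130: call every interface with each of its request messages
    and collect the return messages for later analysis."""
    return {iface: [call_interface(iface, msg) for msg in msgs]
            for iface, msgs in script}

# tiny demo against an in-memory database (hypothetical schema);
# the lambda stands in for the remote called party and just echoes the request
_db = sqlite3.connect(":memory:")
_db.execute("CREATE TABLE input_params (interface TEXT, params TEXT)")
_db.execute("""INSERT INTO input_params VALUES ('orderQuery', '{"orderNo": "1"}')""")
demo_results = run_script(build_test_script(_db, ["orderQuery"]),
                          lambda iface, msg: {"echo": msg})
```

Note how one script run covers every interface and every request message in a single pass, which is the source of the speed-up the embodiment claims over per-case Excel reads.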
Based on the foregoing embodiments, in an embodiment, the step of obtaining input parameter information related to all interfaces to be tested from a designated database, and generating an interface test script according to the input parameter information related to all interfaces to be tested includes, as shown in fig. 3:
S111: acquiring input parameter information related to each interface to be tested from the designated database;
S112: generating the several request messages corresponding to each interface to be tested from that interface's input parameter information;
S113: generating the interface test script from the request messages of all interfaces to be tested.
In this embodiment the interface test script can test one interface to be tested or several simultaneously. When the test equipment generates the script it obtains, for each interface to be tested, the related input parameter information from the designated database and generates that interface's request messages from it; once the request messages of all interfaces to be tested have been generated, the interface test script is generated from them.
It should be noted that a new service usually calls several interfaces, and each interface to be tested must receive several request messages during testing (the concrete message parameters of different request messages may of course differ). In conventional interface automation testing, the SAT tool can test only one interface per test run and can send only one message at a time to the interface under test. In this embodiment, the interface test script can send several request messages to an interface to be tested in one pass, which improves test efficiency.
In one embodiment, the step in which the test equipment generates a request message for any interface to be tested from one group of input parameters included in that interface's input parameter information comprises: adding each input value in the group to the interface field information of the corresponding interface field, thereby assembling the message and obtaining the request message corresponding to the interface to be tested.
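The assembly step — merging one group of input values into the interface's field information — can be sketched as follows. The field-template structure and all names are illustrative assumptions; the patent does not prescribe a concrete message format.

```python
def assemble_request(field_templates, param_group):
    """Add each input value to the field information of its interface field,
    assembling one request message (a dict here) from one parameter group."""
    return {field: dict(template, value=param_group[field])
            for field, template in field_templates.items()}

# one hypothetical parameter group for an order query interface
request = assemble_request(
    {"orderNo": {"type": "string"}, "memberNo": {"type": "string"}},
    {"orderNo": "20201126001", "memberNo": "M42"},
)
```

Each parameter group yields one request message, so an interface with N stored groups yields N request messages for the script.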
In another embodiment, the step of generating an interface test script according to a plurality of request messages corresponding to all interfaces to be tested includes: acquiring preset test type information corresponding to each interface to be tested; and generating an interface test script according to the plurality of request messages corresponding to all interfaces to be tested and the preset test type information.
Specifically, the tester uses the test equipment to put the generated request messages into the test class (i.e. the test type information) to assemble the interface test script, selects the corresponding test environment configuration — such as zkService (ZooKeeper service), appCode and scmService — and executes the script to send the generated request messages to a remote service platform, such as the Suning remote service framework (RSF) platform.
In one embodiment, the step of executing the interface test script to call all the interfaces to be tested includes: executing the interface test script so as to send, through each interface to be tested, that interface's request messages to its associated called party. Upon receiving any request message for its interface, the associated called party generates a corresponding return message and returns it to the test equipment.
In an embodiment, the step of obtaining all the return messages received after all the interfaces to be tested are called, and performing test result analysis according to all the received return messages to obtain the test results of all the interfaces to be tested, as shown in fig. 4, includes:
S131: acquiring the several return messages returned by the called party associated with each interface to be tested;
S132: analyzing, for each interface to be tested, the return messages returned by its associated called party to obtain that interface's test result.
Specifically, upon receiving a request message for its interface to be tested, the associated called party generates a corresponding return message and returns it to the test equipment. After the test equipment has received the return messages from the called parties of all interfaces to be tested, it analyzes the return messages of each interface (i.e. those returned by that interface's called party) to obtain the interface's test result in the current test.
More specifically, in an embodiment, the step of analyzing the test result according to the multiple return messages returned by the called party associated with each interface to be tested, to obtain the test result of each interface to be tested, as shown in fig. 5, includes:
S1321: acquiring the pre-specified baseline-version test result set of each interface to be tested, which comprises the historical return messages obtained by testing the baseline version of that interface;
S1322: taking the return messages returned by the called party associated with each interface to be tested as that interface's current-version test result set, performing a binary comparison between the current-version and baseline-version test result sets with a preset comparison tool, and analyzing the comparison result to obtain the test result of the interface to be tested.
In this embodiment, the test equipment stores each return message after receiving it. Each interface to be tested has a corresponding local return-message storage path (for example a folder); when a return message arrives, the test equipment determines which interface it belongs to and stores it under that interface's path. More specifically, each return message is stored as a return message file (for example a txt file), and since one interface corresponds to several return messages, one storage path holds several return message files (one folder containing several files). The test result set of an interface (the return message files produced by storing all of that interface's return messages in one test) is retained after each new version's automated interface test, so that in a later test run after another version update, the current-version test result set of an interface (the result set obtained when testing the current version) can be compared with its baseline-version test result set (the baseline may be the previous version or, when several past versions exist, any one designated in advance).
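The storage scheme described above — one folder per interface, one file per return message — can be sketched like this. The paths and file-naming convention are illustrative assumptions, not prescribed by the patent.

```python
import os
import tempfile

def store_return_message(base_dir, interface, index, payload):
    """Store one return message as a text file under the interface's own folder."""
    folder = os.path.join(base_dir, interface)
    os.makedirs(folder, exist_ok=True)        # one folder per interface to be tested
    path = os.path.join(folder, "return_%d.txt" % index)
    with open(path, "w", encoding="utf-8") as f:
        f.write(payload)                      # one file per return message
    return path

# demo: two return messages for a hypothetical order query interface
base = tempfile.mkdtemp()
store_return_message(base, "orderQuery", 1, '{"price": 10}')
store_return_message(base, "orderQuery", 2, '{"price": 12}')
stored = sorted(os.listdir(os.path.join(base, "orderQuery")))
```

The folder produced here is exactly what the later comparison step treats as one version's test result set.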
A tester can open a local comparison tool (such as Beyond Compare or WinMerge) from a development tool. The comparison tool performs a binary comparison between the baseline version test result set and the current version test result set of each interface to be tested, and displays the differences between the two sets (for example, by highlighting, marking in red, and/or bolding the file names of the return message files whose contents differ, to distinguish them from the files whose contents match). The tester can then open the difference-marked return message files for closer inspection; the comparison tool can also mark the inconsistent fields within a differing return message across the two test result sets. Based on the difference-marked return message files, the tester can analyze the cause of the inconsistency and thereby obtain the test result of each interface to be tested.
Specifically, if the two test result sets are found to be entirely consistent after comparison, it can be determined that the change in the new version does not affect the service; the automated test of the interface to be tested passes, and the new version's test result set is stored for later use. If the two test result sets are not completely consistent, the cause of the inconsistency may be (1) a code bug, (2) a component upgrade (for example, a version upgrade of the MySQL database), or (3) fields newly added in the new version, and the tester must handle the discrepancy. In an actual test, a tester executed an edited interface test script with a development tool, selected a test environment, and after execution checked the specified directory (the pre-specified return message storage path). The total execution time for running 20 test cases with the interface test script was about 16 seconds, whereas running the same 20 test cases as an ordinary automation task takes about 2 hours. For the same test task, this is an efficiency improvement of more than 400-fold and saves a large amount of time.
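The pass/fail decision described above — pass only when every return message file is byte-identical to its baseline counterpart — can be sketched programmatically. Using Python's `filecmp` module here is an assumption standing in for an interactive tool such as Beyond Compare or WinMerge:

```python
import filecmp
import tempfile
from pathlib import Path

def compare_result_sets(baseline_dir, current_dir):
    """Binary-compare a current version test result set against the
    baseline version test result set, file by file."""
    baseline = Path(baseline_dir)
    names = sorted(p.name for p in baseline.iterdir())
    # shallow=False forces a byte-by-byte comparison of file contents.
    match, mismatch, errors = filecmp.cmpfiles(
        baseline, current_dir, names, shallow=False)
    diff_files = sorted(mismatch + errors)
    # The interface passes only if no return message file differs.
    return len(diff_files) == 0, diff_files

baseline = Path(tempfile.mkdtemp())
current = Path(tempfile.mkdtemp())
for name, old, new in [("r1.txt", "ok", "ok"), ("r2.txt", "ok", "changed")]:
    (baseline / name).write_text(old)
    (current / name).write_text(new)

passed, diffs = compare_result_sets(baseline, current)
```

The files reported in `diffs` are exactly the ones a tester would open in the comparison tool to analyze whether the difference is a bug, a component upgrade, or an intentionally added field.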
The reason tester intervention is needed to analyze the test results is that, in current industry practice for automated testing, pinpointing the scenario behind a differing message usually requires follow-up from the corresponding developers in subsequent steps, so manual analysis is comparatively more reliable and safer. In addition, in actual test work only a small number of scenarios fail the comparison; if failures occurred over a large area, the requirement itself would not be suitable for automation.
Based on the same inventive concept, the invention also provides an automatic interface testing device. The device can be used for testing one interface to be tested or simultaneously testing a plurality of interfaces to be tested. In one embodiment, as shown in fig. 6, the interface automated testing apparatus includes the following modules:
the interface test script generating module 110 is configured to obtain input parameter information related to all interfaces to be tested from a specified database, and generate an interface test script according to the input parameter information related to all interfaces to be tested;
an interface test script execution module 120, configured to execute the interface test script to call all interfaces to be tested;
the test result analysis module 130 is configured to obtain all return messages received after all the interfaces to be tested are called, and perform test result analysis according to all the received return messages to obtain test results of all the interfaces to be tested.
In one embodiment, the interface test script generation module includes:
the input parameter acquisition submodule is used for acquiring input parameter information related to each interface to be tested from a specified database;
the request message generation submodule is used for generating a plurality of request messages corresponding to each interface to be tested according to the input parameter information related to each interface to be tested;
and the interface test script generation submodule is used for generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested.
In one embodiment, the input parameter information related to each interface to be tested includes multiple groups of entry parameter information, and each group of entry parameter information includes entry parameter values corresponding to multiple interface fields; one group of entry parameter information is used to generate one request message.
And the request message generation sub-module is also used for generating a request message corresponding to the interface to be tested according to a group of input parameter information included in the input parameter information related to any interface to be tested. More specifically, the request message generation sub-module is further configured to add an entry parameter value corresponding to each interface field included in a set of entry parameter information of any interface to be tested to the interface field information corresponding to each interface field to perform message assembly, and assemble to obtain a request message corresponding to the interface to be tested.
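Message assembly as described — adding one group of entry parameter values to the corresponding interface field information to form a single request message — might look like the following sketch; the JSON carrier format, the field names, and the `INTERFACE_FIELDS` table are illustrative assumptions:

```python
import json

# Hypothetical interface field information for one interface to be
# tested: field name -> expected value type (assumed, for illustration).
INTERFACE_FIELDS = {"userId": str, "orderId": str, "amount": int}

def assemble_request_message(field_info, entry_params):
    """Add each field's entry parameter value to its interface field
    information and assemble one request message."""
    missing = set(field_info) - set(entry_params)
    if missing:
        raise ValueError(f"missing entry parameter values: {sorted(missing)}")
    body = {field: entry_params[field] for field in field_info}
    return json.dumps(body, sort_keys=True)

# One group of entry parameter values yields one request message;
# several groups per interface yield several request messages.
groups = [
    {"userId": "u001", "orderId": "o100", "amount": 25},
    {"userId": "u002", "orderId": "o101", "amount": 99},
]
messages = [assemble_request_message(INTERFACE_FIELDS, g) for g in groups]
```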
In one embodiment, the interface test script generation submodule includes:
the test type acquisition unit is used for acquiring preset test type information corresponding to each interface to be tested;
and the interface test script generating unit is used for generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested and the preset test type information.
In an embodiment, the interface test script execution module is specifically configured to execute the interface test script, so as to send the multiple request messages corresponding to the interface to be tested to the called party associated with the interface to be tested through each interface to be tested. After receiving any request message corresponding to the interface to be tested, the called party associated with any interface to be tested generates a corresponding return message and returns the return message.
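The execution step above — each interface's request messages sent to its associated called party, each producing one return message — can be sketched as follows, where the in-process `fake_callee` stands in for a real remote called party (an assumption made so the sketch is self-contained):

```python
def fake_callee(request):
    # Stand-in for the called party associated with an interface under
    # test: every request message it receives yields one return message.
    return f"ACK:{request}"

def execute_interface_test_script(interfaces, callee):
    """Send all request messages of every interface to be tested to its
    called party and collect the return messages, grouped per interface."""
    return {name: [callee(req) for req in requests]
            for name, requests in interfaces.items()}

returns = execute_interface_test_script(
    {"order_query": ["req-1", "req-2"], "stock_check": ["req-3"]},
    fake_callee,
)
```

The per-interface lists of return messages collected here are what the test result analysis module subsequently stores and compares against the baseline version test result set.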
In one embodiment, the test result analysis module includes:
the return message acquisition submodule is used for acquiring a plurality of return messages returned by the called party associated with each interface to be tested;
and the test result analysis submodule is used for carrying out test result analysis according to a plurality of returned messages returned by the called party associated with each interface to be tested respectively to obtain the test result of each interface to be tested.
In one embodiment, the test result analysis submodule includes:
a baseline result set obtaining unit, configured to obtain a pre-specified baseline version test result set associated with each interface to be tested, where the baseline version test result set associated with each interface to be tested includes multiple historical return messages obtained by the interface to be tested for testing the baseline version;
and the test result analysis unit is used for respectively taking a plurality of returned messages returned by the called party associated with each interface to be tested as a current version test result set related to the interface to be tested, performing binary comparison on the current version test result set related to the interface to be tested and a baseline version test result set by using a preset comparison tool, and performing test result analysis according to a comparison result between the current version test result set related to the interface to be tested and the baseline version test result set to obtain a test result of the interface to be tested.
For specific limitations of the interface automated testing apparatus, reference may be made to the limitations of the interface automated testing method above, which are not repeated here. All or part of the modules in the interface automated testing apparatus can be implemented by software, by hardware, or by a combination of the two. Each module may be embedded in hardware form in, or independent of, a processor in the computer device, or stored in software form in a memory of the computer device, so that the processor can invoke and execute the operations corresponding to each module.
In one embodiment, a computer device is provided, the internal structure of which may be as shown in FIG. 7. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The database of the computer equipment is used for storing data such as input parameter information of the interface to be tested. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a method for automated testing of an interface.
Those skilled in the art will appreciate that the architecture shown in fig. 7 is merely a block diagram of part of the structure related to the present solution and does not limit the computer device to which the present solution applies; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
acquiring input parameter information related to all interfaces to be tested from a designated database, and generating an interface test script according to the input parameter information related to all the interfaces to be tested; executing the interface test script to call all the interfaces to be tested; and acquiring all the return messages received after calling all the interfaces to be tested, and analyzing the test results according to all the received return messages to acquire the test results of all the interfaces to be tested.
In one embodiment, when the processor executes the computer program to obtain the input parameter information related to all the interfaces to be tested from the designated database and generate the interface test script according to the input parameter information related to all the interfaces to be tested, the following steps are also implemented:
acquiring input parameter information related to each interface to be tested from a specified database; generating a plurality of request messages corresponding to each interface to be tested according to the input parameter information related to each interface to be tested; and generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested.
In one embodiment, the input parameter information related to each interface to be tested includes multiple groups of entry parameter information, and each group of entry parameter information includes entry parameter values corresponding to multiple interface fields; one group of entry parameter information is used to generate one request message.
In one embodiment, when the processor executes the computer program to generate a request message corresponding to any interface to be tested according to a set of parameter information included in input parameter information related to the interface to be tested, the following steps are further implemented:
and adding the parameter values corresponding to the interface fields included in the group of parameter information of any interface to be tested into the interface field information corresponding to the interface fields to carry out message assembly, and assembling to obtain a request message corresponding to the interface to be tested.
In one embodiment, when the processor executes the computer program to generate the interface test script according to the plurality of request messages corresponding to all the interfaces to be tested, the following steps are also implemented:
acquiring preset test type information corresponding to each interface to be tested; and generating an interface test script according to the plurality of request messages corresponding to all interfaces to be tested and the preset test type information.
In one embodiment, the processor executes the computer program to implement the following steps when executing the interface test script to call all the interfaces to be tested:
executing the interface test script to send a plurality of request messages corresponding to the interface to be tested to the called party associated with the interface to be tested through each interface to be tested; and the called party associated with any interface to be tested generates a corresponding return message after receiving any request message corresponding to the interface to be tested and returns the corresponding return message.
In one embodiment, the processor executes the computer program to obtain all return messages received after calling all the interfaces to be tested, performs test result analysis according to all the received return messages, and further implements the following steps when obtaining the test results of all the interfaces to be tested:
acquiring a plurality of return messages returned by a called party associated with each interface to be tested; and analyzing the test result according to a plurality of returned messages returned by the called party associated with each interface to be tested respectively to obtain the test result of each interface to be tested.
In one embodiment, when the processor executes the computer program, the processor analyzes the test result according to a plurality of return messages returned by the called party associated with each interface to be tested, and when the processor obtains the test result of each interface to be tested, the processor further implements the following steps:
acquiring a pre-specified baseline version test result set related to each interface to be tested, wherein the baseline version test result set related to each interface to be tested comprises a plurality of historical return messages obtained by testing the interface to be tested of the baseline version; and respectively taking a plurality of returned messages returned by the called party associated with each interface to be tested as a current version test result set related to the interface to be tested, performing binary comparison on the current version test result set related to the interface to be tested and a baseline version test result set by using a preset comparison tool, and performing test result analysis according to a comparison result between the current version test result set related to the interface to be tested and the baseline version test result set to obtain a test result of the interface to be tested.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
acquiring input parameter information related to all interfaces to be tested from a designated database, and generating an interface test script according to the input parameter information related to all the interfaces to be tested; executing the interface test script to call all the interfaces to be tested; and acquiring all the return messages received after calling all the interfaces to be tested, and analyzing the test results according to all the received return messages to acquire the test results of all the interfaces to be tested.
In one embodiment, when the computer program is executed by the processor, and the input parameter information related to all the interfaces to be tested is obtained from the designated database, and the interface test script is generated according to the input parameter information related to all the interfaces to be tested, the following steps are further implemented:
acquiring input parameter information related to each interface to be tested from a specified database; generating a plurality of request messages corresponding to each interface to be tested according to the input parameter information related to each interface to be tested; and generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested.
In one embodiment, the input parameter information related to each interface to be tested includes multiple groups of entry parameter information, and each group of entry parameter information includes entry parameter values corresponding to multiple interface fields; one group of entry parameter information is used to generate one request message.
In one embodiment, when the computer program is executed by the processor and generates a request message corresponding to any interface to be tested according to a set of parameter information included in input parameter information related to the interface to be tested, the following steps are further implemented:
and adding the parameter values corresponding to the interface fields included in the group of parameter information of any interface to be tested into the interface field information corresponding to the interface fields to carry out message assembly, and assembling to obtain a request message corresponding to the interface to be tested.
In one embodiment, when the computer program is executed by the processor and generates an interface test script according to a plurality of request messages corresponding to all interfaces to be tested, the following steps are also implemented:
acquiring preset test type information corresponding to each interface to be tested; and generating an interface test script according to the plurality of request messages corresponding to all interfaces to be tested and the preset test type information.
In one embodiment, when the computer program is executed by the processor and executes the interface test script to call all the interfaces to be tested, the following steps are also implemented:
executing the interface test script to send a plurality of request messages corresponding to the interface to be tested to the called party associated with the interface to be tested through each interface to be tested; and the called party associated with any interface to be tested generates a corresponding return message after receiving any request message corresponding to the interface to be tested and returns the corresponding return message.
In one embodiment, the computer program is executed by the processor to obtain all return messages received after all the interfaces to be tested are called, perform test result analysis according to all the received return messages, and when the test results of all the interfaces to be tested are obtained, further implement the following steps:
acquiring a plurality of return messages returned by a called party associated with each interface to be tested; and analyzing the test result according to a plurality of returned messages returned by the called party associated with each interface to be tested respectively to obtain the test result of each interface to be tested.
In one embodiment, when the computer program is executed by the processor, the computer program performs test result analysis according to multiple return messages returned by the called party associated with each interface to be tested, and when the test result of each interface to be tested is obtained, the following steps are further implemented:
acquiring a pre-specified baseline version test result set related to each interface to be tested, wherein the baseline version test result set related to each interface to be tested comprises a plurality of historical return messages obtained by testing the interface to be tested of the baseline version; and respectively taking a plurality of returned messages returned by the called party associated with each interface to be tested as a current version test result set related to the interface to be tested, performing binary comparison on the current version test result set related to the interface to be tested and a baseline version test result set by using a preset comparison tool, and performing test result analysis according to a comparison result between the current version test result set related to the interface to be tested and the baseline version test result set to obtain a test result of the interface to be tested.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the method embodiments described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. An automatic interface test method is characterized in that the method is used for testing one interface to be tested or simultaneously testing a plurality of interfaces to be tested; the method comprises the following steps:
acquiring input parameter information related to all the interfaces to be tested from a designated database, and generating an interface test script according to the input parameter information related to all the interfaces to be tested;
executing the interface test script to call all the interfaces to be tested;
and acquiring all return messages received after all the interfaces to be tested are called, and analyzing test results according to all the received return messages to acquire the test results of all the interfaces to be tested.
2. The method for automatically testing an interface according to claim 1, wherein the step of obtaining all the input parameter information related to the interface to be tested from the designated database and generating an interface test script according to all the input parameter information related to the interface to be tested comprises:
acquiring input parameter information related to each interface to be tested from a specified database;
generating a plurality of request messages corresponding to each interface to be tested according to the input parameter information related to each interface to be tested;
and generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested.
3. The automated interface testing method of claim 2,
the input parameter information related to each interface to be tested comprises a plurality of groups of parameter entering information, and each group of parameter entering information comprises parameter entering values corresponding to a plurality of interface fields; a group of the access information is used for generating a request message;
generating a request message corresponding to any interface to be tested according to a group of access information included in input parameter information related to the interface to be tested, wherein the step comprises the following steps:
and adding the parameter values corresponding to the interface fields included in the group of parameter information of any interface to be tested into the interface field information corresponding to the interface fields to carry out message assembly, and assembling to obtain a request message corresponding to the interface to be tested.
4. The automated interface testing method of claim 2,
the step of generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested includes:
acquiring preset test type information corresponding to each interface to be tested;
and generating an interface test script according to the plurality of request messages corresponding to all the interfaces to be tested and preset test type information.
5. The automated interface testing method of claim 2,
the step of executing the interface test script to call all the interfaces to be tested includes:
executing the interface test script to send a plurality of request messages corresponding to the interface to be tested to the called party associated with the interface to be tested through each interface to be tested; and the called party associated with any interface to be tested generates a corresponding return message after receiving any request message corresponding to the interface to be tested and returns the corresponding return message.
6. The method for automated interface testing according to claim 5,
the step of obtaining all the return messages received after calling all the interfaces to be tested, and analyzing the test results according to all the received return messages to obtain the test results of all the interfaces to be tested comprises the following steps:
and acquiring a plurality of return messages returned by the called party associated with each interface to be tested, and analyzing the test result according to the plurality of return messages returned by the called party associated with each interface to be tested to acquire the test result of each interface to be tested.
7. The method for automated interface testing according to claim 6,
the step of analyzing the test result according to the multiple returned messages returned by the called party associated with each interface to be tested to obtain the test result of each interface to be tested comprises the following steps:
acquiring a pre-specified baseline version test result set related to each interface to be tested, wherein the baseline version test result set related to each interface to be tested comprises a plurality of historical return messages obtained by testing the interface to be tested of the baseline version;
and respectively taking a plurality of returned messages returned by the called party associated with each interface to be tested as a current version test result set related to the interface to be tested, performing binary comparison on the current version test result set related to the interface to be tested and a baseline version test result set by using a preset comparison tool, and performing test result analysis according to a comparison result between the current version test result set related to the interface to be tested and the baseline version test result set to obtain a test result of the interface to be tested.
8. An automated interface testing apparatus, characterized in that the apparatus is used for testing one interface to be tested or simultaneously testing a plurality of interfaces to be tested; the apparatus comprises:
the interface test script generation module is used for acquiring all input parameter information related to the interface to be tested from a specified database and generating an interface test script according to all input parameter information related to the interface to be tested;
the interface test script execution module is used for executing the interface test script so as to call all the interfaces to be tested;
and the test result analysis module is used for acquiring all the return messages received after all the interfaces to be tested are called, and analyzing the test results according to all the received return messages to acquire the test results of all the interfaces to be tested.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the method of any of claims 1 to 7 are implemented when the computer program is executed by the processor.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202011353754.7A 2020-11-26 2020-11-26 Interface automation test method and device, computer equipment and storage medium Active CN112395202B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011353754.7A CN112395202B (en) 2020-11-26 2020-11-26 Interface automation test method and device, computer equipment and storage medium


Publications (2)

Publication Number Publication Date
CN112395202A true CN112395202A (en) 2021-02-23
CN112395202B CN112395202B (en) 2023-04-14

Family

ID=74604622

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011353754.7A Active CN112395202B (en) 2020-11-26 2020-11-26 Interface automation test method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112395202B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112817867A (en) * 2021-02-25 2021-05-18 平安消费金融有限公司 Interface test script generation method and device, computer equipment and medium
CN113377668A (en) * 2021-06-29 2021-09-10 南京苏宁软件技术有限公司 Automatic testing method and device for service interface and computer equipment
CN113468049A (en) * 2021-06-29 2021-10-01 平安养老保险股份有限公司 Test method, device, equipment and medium based on configurable interface

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110471842A * 2019-07-12 2019-11-19 平安普惠企业管理有限公司 Test method, device and computer-readable storage medium
CN111858376A (en) * 2020-07-29 2020-10-30 平安养老保险股份有限公司 Request message generation method and interface test method


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112817867A (en) * 2021-02-25 2021-05-18 平安消费金融有限公司 Interface test script generation method and device, computer equipment and medium
CN113377668A (en) * 2021-06-29 2021-09-10 南京苏宁软件技术有限公司 Automatic testing method and device for service interface and computer equipment
CN113468049A (en) * 2021-06-29 2021-10-01 平安养老保险股份有限公司 Test method, device, equipment and medium based on configurable interface
CN113468049B (en) * 2021-06-29 2023-07-04 平安养老保险股份有限公司 Configurable interface-based test method, device, equipment and medium

Also Published As

Publication number Publication date
CN112395202B (en) 2023-04-14

Similar Documents

Publication Publication Date Title
CN112395202B (en) Interface automation test method and device, computer equipment and storage medium
CN107992409B (en) Test case generation method and device, computer equipment and storage medium
CN108427613B (en) Abnormal interface positioning method and device, computer equipment and storage medium
CN108459962B (en) Code normalization detection method and device, terminal equipment and storage medium
CN110058998B (en) Software testing method and device
CN109726134B (en) Interface test method and system
CN109977008B (en) Method and terminal for making JS code depended on by application program compatible with native library
CN108874661B (en) Test mapping relation library generation method and device, computer equipment and storage medium
CN113282513B (en) Interface test case generation method and device, computer equipment and storage medium
CN112631926B (en) Code test coverage rate display method and device, computer equipment and storage medium
CN113886262A (en) Software automation test method and device, computer equipment and storage medium
CN114527974B (en) Method and device for realizing business function of software product and computer equipment
CN115391228A (en) Precise test method, device, equipment and medium
CN114461219A (en) Data analysis method and device, computer equipment and storage medium
CN113377669A (en) Automatic testing method and device, computer equipment and storage medium
CN112612706A (en) Automated testing method, computer device and storage medium
CN112346981A (en) Joint debugging test coverage rate detection method and system
CN116431522A (en) Automatic test method and system for low-code object storage gateway
CN115757172A (en) Test execution method and device, storage medium and computer equipment
CN113468058B (en) Regression testing method and device for software as service platform and electronic equipment
CN115934129A (en) Software project updating method and device, computer equipment and storage medium
CN112486824B (en) Case code generation method, device, computer equipment and storage medium
CN114528213A (en) Automatic baffle plate testing method, device, equipment and storage medium
CN109240906B (en) Database configuration information adaptation method and device, computer equipment and storage medium
KR102111392B1 (en) Test unified administration system and Controlling Method for the Same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210510

Address after: 518002 unit 3510-130, Luohu business center, 2028 Shennan East Road, Chengdong community, Dongmen street, Luohu District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen yunwangwandian Technology Co.,Ltd.

Address before: No.1-1 Suning Avenue, Xuzhuang Software Park, Xuanwu District, Nanjing, Jiangsu Province, 210000

Applicant before: Suning Cloud Computing Co.,Ltd.

REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40044723

Country of ref document: HK

GR01 Patent grant