CN113010424A - Interface automation test processing method, system, computer equipment and storage medium - Google Patents

Interface automation test processing method, system, computer equipment and storage medium

Info

Publication number: CN113010424A
Application number: CN202110286192.7A
Authority: CN (China)
Prior art keywords: test, interface, time sequence diagram, test case
Legal status: Granted; Active
Other languages: Chinese (zh)
Other versions: CN113010424B
Inventor: 高爱家
Assignee (current and original): Ping An E Wallet Electronic Commerce Co Ltd
Priority and filing date: 2021-03-17
Publication date (CN113010424A): 2021-06-22
Grant publication (CN113010424B): 2024-04-02
Application filed by Ping An E Wallet Electronic Commerce Co Ltd; priority to CN202110286192.7A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites

Abstract

The invention discloses an interface automation test processing method, which comprises the following steps: obtaining a tested object and a first test case set corresponding to the tested object, wherein the first test case set comprises a plurality of first test cases for executing different interface tests on the tested object; sequentially recording, as a first time sequence diagram, a first interface node path of the tested object during the test of each first test case in the first test case set, to obtain a first time sequence diagram set corresponding to the first test case set; acquiring a plurality of second interface node paths of the tested object during the test of a plurality of second test cases in a preset time period, and recording them to obtain a second time sequence diagram set, wherein the plurality of second test cases are part or all of the first test cases in the first test case set; and comparing and analyzing the second time sequence diagram set against the first time sequence diagram set to obtain a target test coverage report of the second time sequence diagram set. The invention makes it possible to manage interface tests effectively.

Description

Interface automation test processing method, system, computer equipment and storage medium
Technical Field
The embodiments of the invention relate to the field of interface testing, and in particular to an interface automation test processing method, an interface automation test processing system, a computer device, and a storage medium.
Background
Automated interface testing is an important link in quality assurance during the current software research and development process, and is an efficient means of functional regression testing. At present, interface automation test scripts are not associated with the system under test, and existing automation scripts are in a passive state, maintained and managed manually by staff. Because script generation depends on individual personnel, it is heavily affected by staff turnover, error-prone, and inefficient; in the prior art, the execution of automated cases is also limited by factors such as the environment, code branches, and the implementation method, and generally suffers from long execution times and poor stability. Scripts for automated interface testing therefore need a method of automated management, one that turns manual work into management by automated scripts, so that the current automation coverage, the failed time sequence diagrams, and the remaining automated-design workload can be known quickly.
Disclosure of Invention
In view of the above, an object of the embodiments of the present invention is to provide an interface automation test processing method, system, computer device and storage medium, which can effectively manage interface tests.
In order to achieve the above object, an embodiment of the present invention provides an interface automation test processing method, including:
the method comprises the steps of obtaining a tested object and a first test case set corresponding to the tested object, wherein the first test case set comprises a plurality of first test cases for executing different interface tests on the tested object;
sequentially recording a first interface node path of the tested object during the test of each first test case in the first test case set as a first time sequence diagram to obtain a first time sequence diagram set corresponding to the first test case set;
acquiring a plurality of second interface node paths of the tested object during the test of a plurality of second test cases in a preset time period, and recording to obtain a second time sequence diagram set, wherein the plurality of second test cases are part or all of the first test cases in the first test case set;
and comparing and analyzing the second time sequence diagram set according to the first time sequence diagram set to obtain a target test coverage report of the second time sequence diagram set.
Further, the step of sequentially recording a first interface node path of the object under test when executing the test of each first test case in the first test case set as a first timing diagram to obtain a first timing diagram set corresponding to the first test case set includes:
obtaining test parameters of the first test case, wherein the test parameters comprise request parameters and return parameters of the first test case;
generating a test script corresponding to the first test case according to the test parameters of the first test case;
sequentially executing the test script on the tested object to obtain a first interface node path corresponding to the first test case, wherein the first interface node path comprises a plurality of first interface nodes corresponding to the tested object during testing and the precedence relationship between the tested object and the plurality of first interface nodes;
generating a first time sequence diagram corresponding to the first test case according to the first interface node path;
and carrying out de-duplication processing on the first time sequence diagram to obtain a first time sequence diagram set.
Further, the step of obtaining a plurality of second interface node paths when the tested object executes a test of a plurality of second test cases within a preset time period, and recording to obtain a second time sequence diagram set, wherein the second test cases are part or all of the first test cases, includes:
acquiring an automatic execution script for executing the first test case;
associating the automatic execution script with the first test case to obtain a second test case;
executing the automatic execution script according to a preset timing task within a preset time to obtain a second interface node path corresponding to the second test case, wherein the second interface node path comprises a plurality of second interface nodes corresponding to the tested object during testing and the precedence relationship between the tested object and the plurality of second interface nodes;
and obtaining a second time sequence diagram corresponding to the second test case according to the second interface node path record to obtain a second time sequence diagram set.
Further, the method further comprises:
determining a third time sequence diagram in the second time sequence diagram set according to the target test coverage report, wherein the third time sequence diagram is any time sequence diagram corresponding to the test case which does not complete the test or has a fault in the second test case set;
performing regression processing on the third timing diagram to determine whether the third timing diagram is stable according to a regression result;
storing the third timing graph in the first set of timing graphs when the third timing graph is stable.
Further, the method further comprises:
and comparing the first time sequence diagram, the second time sequence diagram and the third time sequence diagram to generate a time sequence diagram overlay diagram according to the comparison result.
Further, the method further comprises:
receiving a test optimization request of a user, wherein the test optimization request comprises target node data for optimization;
inquiring a target time sequence diagram corresponding to the target node data according to the target node data;
acquiring a target test script corresponding to the target sequence diagram;
and sending the target test script to the user.
Further, the method further comprises:
uploading the target test coverage report of the second timing diagram set into a blockchain.
In order to achieve the above object, an embodiment of the present invention further provides an interface automation test processing system, including:
the device comprises an acquisition module, a test module and a processing module, wherein the acquisition module is used for acquiring a tested object and a first test case set corresponding to the tested object, and the first test case set comprises a plurality of first test cases for executing different interface tests on the tested object;
the first recording module is used for recording a first interface node path of the tested object during the test of each first test case in the first test case set as a first time sequence diagram in sequence to obtain a first time sequence diagram set corresponding to the first test case set;
the second recording module is used for acquiring a plurality of second interface node paths of the tested object during the test of a plurality of second test cases in a preset time period, and recording to obtain a second time sequence diagram set, wherein the plurality of second test cases are part or all of the first test cases in the first test case set;
and the analysis module is used for carrying out comparative analysis on the second time sequence diagram set according to the first time sequence diagram set to obtain a test coverage report of the second time sequence diagram set.
In order to achieve the above object, an embodiment of the present invention provides a computer device, which includes a memory and a processor, where the memory stores a computer program that is executable on the processor, and the computer program, when executed by the processor, implements the steps of the interface automation test processing method as described above.
To achieve the above object, an embodiment of the present invention provides a computer-readable storage medium, in which a computer program is stored, the computer program being executable by at least one processor to cause the at least one processor to execute the steps of the interface automation test processing method as described above.
According to the interface automated test processing method, system, computer device, and storage medium provided by the embodiments of the invention, the first time sequence diagram set is generated by drawing the time sequence diagrams of the first test case set, and the second time sequence diagram set is generated from the interface node paths executed when the second test cases are run against the tested object in a subsequent time period. The second time sequence diagram set is then compared and analyzed against the first time sequence diagram set to obtain the target test coverage report of the second time sequence diagram set, from which the coverage of the current automated tests, the failed time sequence diagrams, and the design workload of the tests not yet automated can be known quickly.
Drawings
Fig. 1 is a flowchart of a first embodiment of an interface automated test processing method according to the present invention.
Fig. 2 is a schematic diagram of program modules of a second embodiment of the interface automated test processing system according to the invention.
Fig. 3 is a schematic diagram of a hardware structure of a third embodiment of the computer device according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
Referring to fig. 1, a flowchart illustrating steps of an interface automation test processing method according to a first embodiment of the present invention is shown. It is to be understood that the flow charts in the embodiments of the present method are not intended to limit the order in which the steps are performed. The following description is made by way of example with the computer device 4 as the execution subject. The details are as follows.
Step S100, a tested object and a first test case set corresponding to the tested object are obtained, wherein the first test case set comprises a plurality of first test cases for executing different interface tests on the tested object.
Specifically, the tested object is an interface to be tested, and the interface data of the tested object is acquired. When different tests are to be performed on the tested object, different first test cases are written for it according to the different test requests, yielding the first test case set. Each first test case is then implemented as a test script, yielding a test script set corresponding to the first test case set.
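As an illustration only, the following sketch shows one way the tested object and its first test case set from this step could be represented in code; the class and field names (TestCase, TestedObject, request_params, expected_return) are assumptions for the example and are not defined in the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class TestCase:
    """One first test case: request parameters plus the expected return parameters."""
    case_id: str
    request_params: Dict[str, str]
    expected_return: Dict[str, str]


@dataclass
class TestedObject:
    """The interface under test and the first test case set written for it."""
    interface_id: str
    interface_data: Dict[str, str]
    first_test_cases: List[TestCase] = field(default_factory=list)


# Example: two first test cases exercising different behaviors of tested interface "A"
tested = TestedObject(
    interface_id="A",
    interface_data={"url": "/api/pay", "method": "POST"},
    first_test_cases=[
        TestCase("case-1", {"amount": "10"}, {"code": "0000"}),
        TestCase("case-2", {"amount": "-1"}, {"code": "E400"}),
    ],
)
print(len(tested.first_test_cases), "first test cases in the set")
```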
And step S120, recording a first interface node path of the tested object when the tested object executes the test of each first test case in the first test case set as a first time sequence diagram in sequence, and obtaining a first time sequence diagram set corresponding to the first test case set.
Specifically, test logs of the tested object executing the first test cases through test scripts for interface testing are collected over a period of time. The test logs contain the downstream service IP and a log serial number, and are parsed to obtain the first interface node path produced by the test request initiated from each tested interface. The systems involved are chained together, through their interface calls, into a first interface node path along which the transaction data flows; the system interface nodes traversed by one complete data flow constitute one time sequence diagram. For example, the first interface node path of a test request initiated from interface node A of the tested object is obtained from the test log; the downstream interface nodes further called by interface node A in that path are analyzed layer by layer until the lowest layer of the path is reached; the interface nodes, with interface node A as the starting point, and their precedence relationships are stored; and the first time sequence diagram of interface A is drawn. The log parsing may be configured with a regular-expression model, or with per-system parsing, according to the different log formats the systems print; for example, a regular expression matching the data format of a system's IP can be configured to extract the downstream interface data of the first interface node path.
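To illustrate the log-parsing step, the sketch below extracts downstream calls from log lines with a configurable regular expression and chains them, in order, into an interface node path rooted at the tested interface. The log line format and field names here are assumptions; a real system would use a regular expression matching its own log format, as the paragraph above notes.

```python
import re
from typing import List

# Assumed log format: "<serial>|<caller node>-><callee service ip>|..."
# e.g. "20210317001|A->10.12.0.7|rt=35ms"
CALL_PATTERN = re.compile(r"^(?P<serial>\d+)\|(?P<caller>[\w.]+)->(?P<callee>[\w.]+)\|")


def build_node_path(log_lines: List[str], root: str) -> List[str]:
    """Chain the interface nodes called downstream of `root`, in log order."""
    path = [root]
    for line in log_lines:
        m = CALL_PATTERN.match(line)
        if not m:
            continue  # the line does not describe a call; skip it
        if m.group("caller") == path[-1]:
            path.append(m.group("callee"))  # follow the call chain one level down
    return path


logs = [
    "20210317001|A->10.12.0.7|rt=35ms",
    "20210317002|10.12.0.7->10.12.0.9|rt=12ms",
]
print(build_node_path(logs, "A"))  # ['A', '10.12.0.7', '10.12.0.9']
```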
In an exemplary embodiment, the step S120 further includes:
step S121, obtaining the test parameters of the first test case, wherein the test parameters comprise the request parameters and the return parameters of the first test case.
And S122, generating a test script corresponding to the first test case according to the test parameters of the first test case.
Step S123, sequentially executing the test script on the object to be tested to obtain a first interface node path corresponding to the first test case, where the first interface node path includes a plurality of first interface nodes corresponding to the object to be tested when the object to be tested is tested and a precedence relationship between the object to be tested and the plurality of first interface nodes.
Step S124, generating a first timing diagram corresponding to the first test case according to the first interface node path.
Step S125, performing deduplication processing on the first timing diagram to obtain a first timing diagram set.
Specifically, the test parameters of the first test case may be stored in a database in text form, and the test parameters are used to generate a test script. The test script is an automated test script that can be started automatically at scheduled times by configuring a timer on an automated test platform such as the Phoenix Framework. The test parameters include request parameters and return parameters: the request parameters carry the parameters related to the test requirements of the tested object, and the return parameters are the result returned after the test script is executed, which can be understood as the expected result. When the test script is executed, it calls the interface node corresponding to the tested object and the interface nodes of the other downstream systems to perform the test, and a first time sequence diagram corresponding to that test script is generated; the first time sequence diagrams of all interface nodes are deduplicated to generate a first time sequence diagram set without repetition. Design work takes the time sequence diagram as the baseline: during design and execution, the scripts corresponding to an interface node's existing time sequence diagrams can be consulted, and duplicate time sequence diagrams and the upstream automation status can be recommended automatically, which facilitates fast maintenance and design of the scripts.
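A minimal sketch of the deduplication in this step, assuming a time sequence diagram can be identified by its ordered node path (a simplification of the description above):

```python
from typing import Dict, List, Tuple

Diagram = Tuple[str, ...]  # a time sequence diagram identified by its ordered node path


def dedupe_diagrams(paths: List[List[str]]) -> Dict[Diagram, List[str]]:
    """Keep one first time sequence diagram per distinct ordered node path."""
    unique: Dict[Diagram, List[str]] = {}
    for path in paths:
        unique.setdefault(tuple(path), path)  # the first occurrence is kept
    return unique


recorded = [["A", "B", "C"], ["A", "B", "C"], ["A", "D"]]
first_set = dedupe_diagrams(recorded)
print(len(first_set))  # 2 distinct diagrams remain after de-duplication
```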
Step S140, obtaining a plurality of second interface node paths when the object under test executes a test of a plurality of second test cases within a preset time period, and recording to obtain a second time sequence diagram set, where the plurality of second test cases are part or all of the first test cases.
Specifically, a second test case may be a first test case for which the user runs part or all of the case tests against the tested interface again within the preset time period; the first test case is tested automatically through the automated execution script, and the second interface node path is recorded accordingly. Within the preset time period the user may also optimize or modify the interface data of the tested interface or of a downstream system interface. The tested object is tested through the second test case by executing the test script corresponding to the second test case, the interface nodes called during execution are obtained, and a second time sequence diagram is generated from the called interface nodes. As testing continues, the background keeps recording the second time sequence diagram of each interface node of the tested object, and the second time sequence diagrams within the preset time period are collected to obtain the second time sequence diagram set.
In an exemplary embodiment, the step S140 further includes:
step S141, obtaining an automatic execution script for executing the first test case; step S142, associating the automatic execution script with the first test case to obtain a second test case; step S143, executing the automated execution script according to a preset timing task within a preset time to obtain a second interface node path corresponding to the second test case, where the second interface node path includes a plurality of second interface nodes corresponding to the tested object during testing and a precedence relationship between the tested object and the plurality of second interface nodes; step S144, obtaining a second timing diagram corresponding to the second test case according to the second interface node path record, and obtaining a second timing diagram set.
Specifically, the plurality of second test cases are part or all of the first test cases in the first test case set. The automated execution scripts can be started automatically by configuring a timer on an automated test platform such as the Phoenix Framework, so that the tested object is tested and each script is executed to generate a corresponding second time sequence diagram. For example, because the test script carries the request parameters corresponding to its test case, the tested interface node A is called for testing, a second interface node path with the tested interface node A as its starting point is generated, the second time sequence diagram is generated from that path, and the second time sequence diagram is associated with the tested interface node A, so that the interface identifier of the tested interface node A can be attached to the second time sequence diagram. In this way the test scripts transition into automated execution scripts, and a basic system for automatically executed interface tests is gradually built up from the drawn test interaction diagrams.
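A minimal sketch of the timed execution described above, using a plain polling loop in place of the scheduler of an automated test platform such as the Phoenix Framework, whose API is not described in the patent; the script function, the window length, and the interval are illustrative assumptions.

```python
import time
from datetime import datetime, timedelta
from typing import Callable, List


def run_timed_task(scripts: List[Callable[[], List[str]]],
                   window: timedelta,
                   interval_seconds: int = 60) -> List[List[str]]:
    """Run each automated execution script repeatedly within the preset time window
    and collect the second interface node paths they record."""
    recorded_paths: List[List[str]] = []
    deadline = datetime.now() + window
    while datetime.now() < deadline:
        for script in scripts:
            recorded_paths.append(script())  # each run yields one second interface node path
        time.sleep(interval_seconds)
    return recorded_paths


def sample_script() -> List[str]:
    # Stand-in for an automated execution script that calls tested interface node "A"
    return ["A", "B", "C"]


# Collect second node paths over a short, illustrative five-second window
paths = run_timed_task([sample_script], timedelta(seconds=5), interval_seconds=2)
print(len(paths), "node paths recorded")
```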
Step S160, analyzing the second time-sequence diagram set by comparing the first time-sequence diagram set to obtain a target test coverage report of the second time-sequence diagram set.
Specifically, the time sequence diagrams in the second time sequence diagram set are compared with the first time sequence diagram set. If the number of time sequence diagrams appearing in the second time sequence diagram set is smaller than the number in the first time sequence diagram set, a test coverage report showing a low coverage rate for the second test cases is generated, where the test coverage rate of the second time sequence diagram set equals the number of second time sequence diagrams divided by the number of time sequence diagrams in the first time sequence diagram set. If the number of time sequence diagrams appearing in the second time sequence diagram set equals the number in the first time sequence diagram set, a test coverage report with a coverage rate of 100 percent for the second test cases is generated. If a time sequence diagram in the second time sequence diagram set does not exist in the first time sequence diagram set, it is determined that a time sequence diagram requiring regression has appeared in the second test cases.
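The comparison rule just described can be sketched as follows, with each time sequence diagram treated as an ordered node path; the coverage rate follows the formula above (number of second time sequence diagrams divided by the number in the first set), and diagrams absent from the first set are flagged for regression. All names are illustrative.

```python
from typing import Dict, Set, Tuple

Diagram = Tuple[str, ...]  # a time sequence diagram as an ordered node path


def coverage_report(first_set: Set[Diagram], second_set: Set[Diagram]) -> Dict[str, object]:
    """Build the target test coverage report of the second time sequence diagram set."""
    rate = len(second_set) / len(first_set) if first_set else 0.0  # |second| / |first|
    return {
        "coverage_rate": rate,
        "missing": sorted(first_set - second_set),            # baseline diagrams not exercised
        "needs_regression": sorted(second_set - first_set),   # diagrams absent from the baseline
    }


first = {("A", "B", "C"), ("A", "D")}
second = {("A", "B", "C")}
print(coverage_report(first, second))
# {'coverage_rate': 0.5, 'missing': [('A', 'D')], 'needs_regression': []}
```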
In an exemplary embodiment, the method further comprises:
step S200, determining a third timing chart in the second timing chart set according to the target test coverage report, wherein the third timing chart is any timing chart corresponding to the test case which does not complete the test or has a fault in the second test case set. Step S220 is performed to perform a regression process on the third timing chart, so as to correct whether the third timing chart is stable according to a regression result. In step S240, when the third timing diagram is stable, the third timing diagram is stored in the first timing diagram set.
Specifically, the third time sequence diagram that requires regression (including the interface nodes of the tested object) is determined from the target test coverage report, and the test case associated with the third time sequence diagram is obtained. The regression process for the third time sequence diagram is as follows: the time sequence diagram closest to the third time sequence diagram is obtained from the first time sequence diagram set, the test script of the time sequence diagram corresponding to the third time sequence diagram is run, and the test script corresponding to the third time sequence diagram is modified according to the test result until the script's test result achieves the expected effect. The modification of the test script is automatic; when the test result of the test script corresponding to the third time sequence diagram reaches the expected effect, the third time sequence diagram is considered stable. When the third time sequence diagram is stable, it is merged into the first time sequence diagram set to strengthen that set.
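A sketch of the "regress, then merge when stable" flow above, with the stability check reduced to comparing the actual result with the expected return parameters; the closest-diagram lookup and the automatic script repair are left abstract, and all names are assumptions.

```python
from typing import Callable, Dict, Set, Tuple

Diagram = Tuple[str, ...]  # a time sequence diagram as an ordered node path


def regress_and_merge(third: Diagram,
                      first_set: Set[Diagram],
                      run_script: Callable[[Diagram], Dict[str, str]],
                      expected: Dict[str, str]) -> bool:
    """Re-run the script behind a failed or incomplete diagram and merge the diagram
    into the baseline set once its result matches the expected return parameters."""
    result = run_script(third)    # re-execute the (possibly repaired) test script
    stable = result == expected   # stable: the result reaches the expected effect
    if stable:
        first_set.add(third)      # strengthen the first time sequence diagram set
    return stable


baseline: Set[Diagram] = {("A", "B", "C")}
ok = regress_and_merge(("A", "E"), baseline,
                       run_script=lambda diagram: {"code": "0000"},
                       expected={"code": "0000"})
print(ok, ("A", "E") in baseline)  # True True
```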
In an exemplary embodiment, the method further comprises:
and comparing the first time sequence diagram, the second time sequence diagram and the third time sequence diagram to generate an overlay diagram according to the comparison result.
Specifically, by executing the first test cases automatically and periodically, a time sequence diagram set can be output and compared with the time sequence diagram set obtained within the preset time period and with the stock time sequence diagram set, where the stock time sequence diagram set is the first time sequence diagram set, so as to generate an automation coverage map and a map of the coverage not yet automated. From the automation coverage map, the current automation coverage, the failed time sequence diagrams, and the time sequence diagrams still to be covered can be known quickly.
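Read as set relations over node paths, the overlay could be assembled as below; this is one possible interpretation of the coverage map, not the patent's exact construction.

```python
from typing import Dict, Set, Tuple

Diagram = Tuple[str, ...]


def overlay(first_set: Set[Diagram],
            second_set: Set[Diagram],
            failed: Set[Diagram]) -> Dict[str, Set[Diagram]]:
    """Split the baseline (stock) set into covered, failed, and not-yet-automated parts."""
    return {
        "covered": first_set & second_set,             # exercised within the preset time period
        "failed": failed & second_set,                 # third diagrams: executed but not stable
        "not_yet_automated": first_set - second_set,   # remaining automation design workload
    }


print(overlay({("A", "B"), ("A", "C")}, {("A", "B")}, {("A", "B")}))
```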
In an exemplary embodiment, the method further comprises:
step S300, receiving a test optimization request of a user, wherein the test optimization request comprises target node data for optimization. Step S320, querying a target timing chart corresponding to the target node data according to the target node data. Step S340, acquiring a target test script corresponding to the target sequence diagram. And step S360, sending the target test script to the user.
Specifically, the test optimization request is a request to change a certain target node in a first interface node path, and is generated from the node data corresponding to that target node. The target time sequence diagrams containing the target node are found, via the target node data, in the database that stores the first time sequence diagram set; the target test scripts associated with those target time sequence diagrams are then found in the database and pushed to the user, so that the user can optimize the target test scripts and obtain the optimized scripts.
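A sketch of the lookup described above: given the target node data, find every stored time sequence diagram containing that node and return the associated test scripts. The in-memory dictionary standing in for the database, and the script names, are assumptions for illustration.

```python
from typing import Dict, List, Tuple

Diagram = Tuple[str, ...]

# Illustrative stand-in for the database that stores the first time sequence diagram set
diagram_to_script: Dict[Diagram, str] = {
    ("A", "B", "C"): "script_abc.py",
    ("A", "D"): "script_ad.py",
}


def scripts_for_node(target_node: str) -> List[str]:
    """Return the test script of every stored diagram that contains the target node."""
    return [script for diagram, script in diagram_to_script.items()
            if target_node in diagram]


print(scripts_for_node("D"))  # ['script_ad.py'], pushed to the requesting user
```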
In an exemplary embodiment, the method further comprises:
uploading the target test coverage report of the second timing diagram set into a blockchain.
Specifically, uploading the target test coverage report to the blockchain can ensure its security and its fairness and transparency to the user. The user equipment can download the target test coverage report from the blockchain to verify whether the report has been tampered with. The blockchain referred to in this example is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, and encryption algorithms. A blockchain is essentially a decentralized database: a chain of data blocks linked by cryptographic methods, where each data block contains information on a batch of network transactions and is used to verify the validity (anti-counterfeiting) of that information and to generate the next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
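As an illustration of the tamper-evidence idea only (not the patent's actual blockchain platform), the sketch below chains each uploaded coverage report to the previous block's hash, so that modifying any stored report invalidates every later hash:

```python
import hashlib
import json
from typing import Any, Dict, List


def add_block(chain: List[Dict[str, Any]], report: Dict[str, Any]) -> None:
    """Append a block whose hash covers the report and the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps(report, sort_keys=True) + prev_hash
    chain.append({"report": report,
                  "prev_hash": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})


def verify(chain: List[Dict[str, Any]]) -> bool:
    """Recompute every hash; a tampered report breaks the chain from that block on."""
    for i, block in enumerate(chain):
        prev_hash = chain[i - 1]["hash"] if i else "0" * 64
        payload = json.dumps(block["report"], sort_keys=True) + prev_hash
        if block["prev_hash"] != prev_hash:
            return False
        if hashlib.sha256(payload.encode()).hexdigest() != block["hash"]:
            return False
    return True


chain: List[Dict[str, Any]] = []
add_block(chain, {"coverage_rate": 0.5, "needs_regression": [["A", "E"]]})
print(verify(chain))  # True until any stored report is modified
```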
Example two
Referring to fig. 2, a program module diagram of a second embodiment of the interface automated test processing system of the invention is shown. In this embodiment, the interface automated test processing system 40 may include or be divided into one or more program modules, which are stored in a storage medium and executed by one or more processors, so as to implement the present invention and carry out the interface automation test processing method. The program modules referred to in the embodiments of the present invention are series of computer program instruction segments capable of performing specific functions, and are better suited than the program itself to describing the execution process of the interface automation test processing system 40 in the storage medium. The following description specifically introduces the functions of the program modules of this embodiment:
an obtaining module 400, configured to obtain a tested object and a first test case set corresponding to the tested object, where the first test case set includes a plurality of first test cases for performing different interface tests on the tested object.
Specifically, the tested object is an interface to be tested, and the interface data of the tested object is acquired. When different tests are to be performed on the tested object, different first test cases are written for it according to the different test requests, yielding the first test case set. Each first test case is then implemented as a test script, yielding a test script set corresponding to the first test case set.
A first recording module 402, configured to record, as a first timing diagram, a first interface node path of the object under test when the object under test executes the test of each first test case in the first test case set, so as to obtain a first timing diagram set corresponding to the first test case set.
Specifically, test logs of the tested object executing the first test cases through automated test scripts for interface testing are collected over a period of time. The test logs contain the downstream service IP and a log serial number, and are parsed to obtain the first interface node path produced by the test request initiated from each tested interface. The systems involved are chained together, through their interface calls, into a first interface node path along which the transaction data flows; the system interface nodes traversed by one complete data flow constitute one time sequence diagram. For example, the first interface node path of a test request initiated from interface node A of the tested object is obtained from the test log; the downstream interface nodes further called by interface node A in that path are analyzed layer by layer until the lowest layer of the path is reached; the interface nodes, with interface node A as the starting point, and their precedence relationships are stored; and the first time sequence diagram of interface A is drawn. The log parsing may be configured with a regular-expression model, or with per-system parsing, according to the different log formats the systems print; for example, a regular expression matching the data format of a system's IP can be configured to extract the downstream interface data of the first interface node path.
In an exemplary embodiment, the first recording module 402 is further configured to:
obtaining test parameters of the first test case, wherein the test parameters comprise request parameters and return parameters of the first test case; generating a test script corresponding to the first test case according to the test parameters of the first test case; sequentially executing the test script on the tested object to obtain a first interface node path corresponding to the first test case, wherein the first interface node path comprises a plurality of first interface nodes corresponding to the tested object during testing and the precedence relationship between the tested object and the plurality of first interface nodes; generating a first time sequence diagram corresponding to the first test case according to the first interface node path; and carrying out de-duplication processing on the first time sequence diagram to obtain a first time sequence diagram set.
Specifically, the test parameters of the first test case may be stored in a database in text form, and the test parameters are used to generate a test script. The test script is an automated test script that can be started automatically at scheduled times by configuring a timer on an automated test platform such as the Phoenix Framework. The test parameters include request parameters and return parameters: the request parameters carry the parameters related to the test requirements of the tested object, and the return parameters are the result returned after the test script is executed, which can be understood as the expected result. When the test script is executed, it calls the interface node corresponding to the tested object and the interface nodes of the other downstream systems to perform the test, and a first time sequence diagram corresponding to that test script is generated; the first time sequence diagrams of all interface nodes are deduplicated to generate a first time sequence diagram set without repetition. Design work takes the time sequence diagram as the baseline: during design and execution, the scripts corresponding to an interface node's existing time sequence diagrams can be consulted, and duplicate time sequence diagrams and the upstream automation status can be recommended automatically, which facilitates fast maintenance and design of the scripts.
A second recording module 404, configured to obtain a plurality of second interface node paths when the tested object executes a test of a plurality of second test cases within a preset time period, and to record them to obtain a second time sequence diagram set, where the plurality of second test cases are part or all of the first test cases.
Specifically, a second test case may be a first test case for which the user runs part or all of the case tests against the tested interface again within the preset time period; the first test case is tested automatically through the automated execution script, and the second interface node path is recorded accordingly. Within the preset time period the user may also optimize or modify the interface data of the tested interface or of a downstream system interface. The tested object is tested through the second test case by executing the test script corresponding to the second test case, the interface nodes called during execution are obtained, and a second time sequence diagram is generated from the called interface nodes. As testing continues, the background keeps recording the second time sequence diagram of each interface node of the tested object, and the second time sequence diagrams within the preset time period are collected to obtain the second time sequence diagram set.
In an exemplary embodiment, the second recording module 404 is further configured to:
acquiring an automatic execution script for executing the first test case; associating the automatic execution script with the first test case to obtain a second test case; executing the automatic execution script according to a preset timing task within a preset time to obtain a second interface node path corresponding to the second test case, wherein the second interface node path comprises a plurality of second interface nodes corresponding to the tested object during testing and the precedence relationship between the tested object and the plurality of second interface nodes; and obtaining a second time sequence diagram corresponding to the second test case according to the second interface node path record to obtain a second time sequence diagram set.
Specifically, the plurality of second test cases are part or all of the first test cases in the first test case set. The automated execution scripts can be started automatically by configuring a timer on an automated test platform such as the Phoenix Framework, so that the tested object is tested and each script is executed to generate a corresponding second time sequence diagram. For example, because the test script carries the request parameters corresponding to its test case, the tested interface node A is called for testing, a second interface node path with the tested interface node A as its starting point is generated, the second time sequence diagram is generated from that path, and the second time sequence diagram is associated with the tested interface node A, so that the interface identifier of the tested interface node A can be attached to the second time sequence diagram. In this way the test scripts transition into automated execution scripts, and a basic system for automatically executed interface tests is gradually built up from the drawn test interaction diagrams.
An analyzing module 406, configured to analyze the second time-sequence chart set according to the first time-sequence chart set by comparison, so as to obtain a test coverage report of the second time-sequence chart set.
Specifically, the time sequence diagrams in the second time sequence diagram set are compared with the first time sequence diagram set. If the number of time sequence diagrams appearing in the second time sequence diagram set is smaller than the number in the first time sequence diagram set, a test coverage report showing a low coverage rate for the second test cases is generated, where the test coverage rate of the second time sequence diagram set equals the number of second time sequence diagrams divided by the number of time sequence diagrams in the first time sequence diagram set. If the number of time sequence diagrams appearing in the second time sequence diagram set equals the number in the first time sequence diagram set, a test coverage report with a coverage rate of 100 percent for the second test cases is generated. If a time sequence diagram in the second time sequence diagram set does not exist in the first time sequence diagram set, it is determined that a time sequence diagram requiring regression has appeared in the second test cases.
EXAMPLE III
Fig. 3 is a schematic diagram of a hardware architecture of a computer device according to a third embodiment of the present invention. In this embodiment, the computer device 4 is a device capable of automatically performing numerical calculation and/or information processing according to a preset or stored instruction. The computer device 4 may be a rack server, a blade server, a tower server or a rack server (including an independent server or a server cluster composed of a plurality of servers), and the like. As shown in FIG. 3, the computer device 4 includes, but is not limited to, at least a memory 41, a processor 42, a network interface 43, and an interface automated test processing system 40, which may be communicatively coupled to each other via a system bus. Wherein:
in this embodiment, the memory 41 includes at least one type of computer-readable storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the memory 41 may be an internal storage unit of the computer device 4, such as a hard disk or a memory of the computer device 4. In other embodiments, the memory 41 may also be an external storage device of the computer device 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like provided on the computer device 4. Of course, the memory 41 may also include both internal and external storage devices of the computer device 4. In this embodiment, the memory 41 is generally used for storing an operating system and various application software installed in the computer device 4, such as the program code of the interface automation test processing system 40 in the second embodiment. Further, the memory 41 may also be used to temporarily store various types of data that have been output or are to be output.
Processor 42 may be a Central Processing Unit (CPU), controller, microcontroller, microprocessor, or other data Processing chip in some embodiments. The processor 42 is typically used to control the overall operation of the computer device 4. In this embodiment, the processor 42 is configured to run the program code stored in the memory 41 or process data, for example, run the interface automation test processing system 40, so as to implement the interface automation test processing method according to the first embodiment.
The network interface 43 may comprise a wireless network interface or a wired network interface, and the network interface 43 is generally used for establishing a communication connection between the computer device 4 and other electronic devices. For example, the network interface 43 is used to connect the computer device 4 to an external terminal via a network and to establish a data transmission channel and a communication connection between the computer device 4 and the external terminal. The network may be a wireless or wired network such as an Intranet, the Internet, the Global System for Mobile communications (GSM), Wideband Code Division Multiple Access (WCDMA), a 4G network, a 5G network, Bluetooth, Wi-Fi, and the like. It is noted that fig. 3 only shows the computer device 4 with components 40-43, but it is to be understood that not all of the shown components are required to be implemented, and that more or fewer components may be implemented instead.
In this embodiment, the interface automatic test processing system 40 stored in the memory 41 may be further divided into one or more program modules, and the one or more program modules are stored in the memory 41 and executed by one or more processors (in this embodiment, the processor 42) to complete the present invention.
For example, fig. 2 shows a schematic diagram of the program modules of the second embodiment implementing the interface automated test processing system 40. In this embodiment, the interface automated test processing system 40 may be divided into the obtaining module 400, the first recording module 402, the second recording module 404, and the analyzing module 406. The program modules referred to in the present invention are series of computer program instruction segments capable of performing specific functions, and are better suited than the program itself to describing the execution process of the interface automation test processing system 40 in the computer device 4. The specific functions of the program modules 400 to 406 have been described in detail in the second embodiment and are not repeated here.
Example four
The present embodiment also provides a computer-readable storage medium, such as a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, a server, an App application mall, or the like, on which a computer program is stored that implements the corresponding functions when executed by a processor. The computer-readable storage medium of this embodiment is used to store a computer program which, when executed by a processor, implements the interface automation test processing method of the first embodiment.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. An interface automation test processing method is characterized by comprising the following steps:
the method comprises the steps of obtaining a tested object and a first test case set corresponding to the tested object, wherein the first test case set comprises a plurality of first test cases for executing different interface tests on the tested object;
sequentially recording a first interface node path of the tested object during the test of each first test case in the first test case set as a first time sequence diagram to obtain a first time sequence diagram set corresponding to the first test case set;
acquiring a plurality of second interface node paths of the tested object during the test of a plurality of second test cases in a preset time period, and recording to obtain a second time sequence diagram set, wherein the plurality of second test cases are part or all of the first test cases in the first test case set;
and comparing and analyzing the second time sequence diagram set according to the first time sequence diagram set to obtain a target test coverage report of the second time sequence diagram set.
2. The method according to claim 1, wherein the step of sequentially recording a first interface node path of the object under test for executing the test of each first test case in the first test case set as a first timing diagram to obtain a first timing diagram set corresponding to the first test case set includes:
obtaining test parameters of the first test case, wherein the test parameters comprise request parameters and return parameters of the first test case;
generating a test script corresponding to the first test case according to the test parameters of the first test case;
sequentially executing the test script on the tested object to obtain a first interface node path corresponding to the first test case, wherein the first interface node path comprises a plurality of first interface nodes corresponding to the tested object during testing and the precedence relationship between the tested object and the plurality of first interface nodes;
generating a first time sequence diagram corresponding to the first test case according to the first interface node path;
and carrying out de-duplication processing on the first time sequence diagram to obtain a first time sequence diagram set.
3. The method according to claim 1, wherein the step of obtaining a plurality of second interface node paths of the tested object during the test of a plurality of second test cases within a preset time period and recording to obtain a second time sequence diagram set, wherein the second test cases are part or all of the first test cases, includes:
acquiring an automatic execution script for executing the first test case;
associating the automatic execution script with the first test case to obtain a second test case;
executing the automatic execution script according to a preset timing task within a preset time to obtain a second interface node path corresponding to the second test case, wherein the second interface node path comprises a plurality of second interface nodes corresponding to the tested object during testing and the precedence relationship between the tested object and the plurality of second interface nodes;
and obtaining a second time sequence diagram corresponding to the second test case according to the second interface node path record to obtain a second time sequence diagram set.
4. The method of automated interface test processing of claim 1, further comprising:
determining a third time sequence diagram in the second time sequence diagram set according to the target test coverage report, wherein the third time sequence diagram is any time sequence diagram corresponding to the test case which does not complete the test or has a fault in the second test case set;
performing regression processing on the third timing diagram to determine whether the third timing diagram is stable according to a regression result;
storing the third timing graph in the first set of timing graphs when the third timing graph is stable.
5. The method of automated interface test processing of claim 4, further comprising:
and comparing the first time sequence diagram, the second time sequence diagram and the third time sequence diagram to generate a time sequence diagram overlay diagram according to the comparison result.
6. The method of automated interface test processing of claim 1, further comprising:
receiving a test optimization request of a user, wherein the test optimization request comprises target node data for optimization;
inquiring a target time sequence diagram corresponding to the target node data according to the target node data;
acquiring a target test script corresponding to the target sequence diagram;
and sending the target test script to the user.
7. The method of automated interface test processing of claim 1, further comprising:
uploading the target test coverage report of the second timing diagram set into a blockchain.
8. An interface automated test processing system, comprising:
the device comprises an acquisition module, a test module and a processing module, wherein the acquisition module is used for acquiring a tested object and a first test case set corresponding to the tested object, and the first test case set comprises a plurality of first test cases for executing different interface tests on the tested object;
the first recording module is used for recording a first interface node path of the tested object during the test of each first test case in the first test case set as a first time sequence diagram in sequence to obtain a first time sequence diagram set corresponding to the first test case set;
the second recording module is used for acquiring a plurality of second interface node paths of the tested object during the test of a plurality of second test cases in a preset time period, and recording to obtain a second time sequence diagram set, wherein the plurality of second test cases are part or all of the first test cases in the first test case set;
and the analysis module is used for carrying out comparative analysis on the second time sequence diagram set according to the first time sequence diagram set to obtain a test coverage report of the second time sequence diagram set.
9. A computer device, characterized in that the computer device comprises a memory and a processor, the memory having stored thereon a computer program executable on the processor, and the computer program, when executed by the processor, implementing the steps of the interface automation test processing method according to any one of claims 1-7.
10. A computer-readable storage medium, having stored therein a computer program executable by at least one processor to cause the at least one processor to perform the steps of the interface automation test processing method according to any one of claims 1 to 7.
CN202110286192.7A 2021-03-17 2021-03-17 Interface automatic test processing method, system, computer equipment and storage medium Active CN113010424B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110286192.7A CN113010424B (en) 2021-03-17 2021-03-17 Interface automatic test processing method, system, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113010424A 2021-06-22
CN113010424B (en) 2024-04-02

Family

ID=76409240

Country Status (1)

Country Link
CN (1) CN113010424B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101645037A (en) * 2009-09-11 2010-02-10 兰雨晴 Coverage analysis method of foundational software platform application program interface integrated test
CN101916225A (en) * 2010-09-02 2010-12-15 于秀山 Graphical user interface software function coverage testing method
WO2016170937A1 (en) * 2015-04-20 2016-10-27 三菱電機株式会社 Test automation device, test automation method, and test automation program
CN107807881A (en) * 2017-09-28 2018-03-16 北京新能源汽车股份有限公司 Method of testing, device and the computer equipment of code coverage
US20190102281A1 (en) * 2017-10-04 2019-04-04 International Business Machines Corporation Measuring and improving test coverage
CN108073519A (en) * 2018-01-31 2018-05-25 百度在线网络技术(北京)有限公司 Method for generating test case and device
CN109857654A (en) * 2019-01-17 2019-06-07 珠海金山网络游戏科技有限公司 A kind of method, apparatus and system of the timing flow chart automatically generating test case
CN111309635A (en) * 2020-03-26 2020-06-19 北京奇艺世纪科技有限公司 Test case generation method, device, server and storage medium
CN111552631A (en) * 2020-03-27 2020-08-18 中国平安人寿保险股份有限公司 System testing method, device and computer readable storage medium

Also Published As

Publication number Publication date
CN113010424B (en) 2024-04-02

Similar Documents

Publication Publication Date Title
US9170821B1 (en) Automating workflow validation
CN110928802A (en) Test method, device, equipment and storage medium based on automatic generation of case
CN110222535B (en) Processing device, method and storage medium for block chain configuration file
CN112380255A (en) Service processing method, device, equipment and storage medium
CN111767227A (en) Recording playback test method and device
CN111190823A (en) UI automation test method, electronic device and computer readable storage medium
CN111679968A (en) Interface calling abnormity detection method and device, computer equipment and storage medium
CN112527484A (en) Workflow breakpoint continuous running method and device, computer equipment and readable storage medium
CN113377667A (en) Scene-based testing method and device, computer equipment and storage medium
EP3514680B1 (en) Identification of changes in functional behavior and runtime behavior of a system during maintenance cycles
CN111142929A (en) Firmware configuration method, device, equipment and medium in equipment production process
CN110781090A (en) Control method and device for data processing test, computer equipment and storage medium
CN107992420B (en) Management method and system for test item
CN110069382B (en) Software monitoring method, server, terminal device, computer device and medium
CN111752789B (en) Pressure testing method, computer device and computer readable storage medium
CN112132544B (en) Inspection method and device of business system
CN116431522A (en) Automatic test method and system for low-code object storage gateway
CN113010424A (en) Interface automation test processing method, system, computer equipment and storage medium
CN114385498A (en) Performance test method, system, computer equipment and readable storage medium
CN114416420A (en) Equipment problem feedback method and system
CN114461219A (en) Data analysis method and device, computer equipment and storage medium
CN111274128A (en) Test method, test device, computer equipment and computer readable storage medium
CN113360389A (en) Performance test method, device, equipment and storage medium
CN112463431A (en) BIOS error positioning method, device, equipment and storage medium
CN111752600A (en) Code anomaly detection method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant