CN113010424B - Interface automatic test processing method, system, computer equipment and storage medium

Info

Publication number: CN113010424B
Application number: CN202110286192.7A
Authority: CN (China)
Other versions: CN113010424A (application publication, Chinese)
Inventor: 高爱家
Assignee (original and current): Ping An E Wallet Electronic Commerce Co Ltd
Legal status: Active (granted)
Prior art keywords: test, timing diagram, interface, test case, tested object

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3688 - Test management for test execution, e.g. scheduling of test suites

Abstract

The invention discloses an interface automation test processing method, which comprises the following steps: acquiring a tested object and a first test case set corresponding to the tested object, wherein the first test case set comprises a plurality of first test cases for executing different interface tests on the tested object; sequentially recording, as a first timing diagram, the first interface node path along which the tested object executes the test of each first test case in the first test case set, to obtain a first timing diagram set corresponding to the first test case set; acquiring a plurality of second interface node paths when the tested object executes the tests of a plurality of second test cases within a preset time period, and recording them to obtain a second timing diagram set, wherein the plurality of second test cases are part or all of the first test cases; and comparing and analyzing the second timing diagram set against the first timing diagram set to obtain a target test coverage report of the second timing diagram set. The invention enables interface tests to be managed effectively.

Description

Interface automatic test processing method, system, computer equipment and storage medium
Technical Field
Embodiments of the present invention relate to the field of interface testing, and in particular to an interface automation test processing method, an interface automation test processing system, a computer device and a storage medium.
Background
Automated interface testing is an important part of quality assurance in the current software development process and an efficient means of functional regression testing. At present, interface automation test scripts are not associated with the system under test; existing automated scripts are in a passive state and have to be actively maintained and managed by staff. Because the automated scripts are written by people, they are strongly affected by personnel changes, error-prone and inefficient. In addition, automated cases are constrained by factors such as the environment, code branches and the implementation approach, and generally suffer from long execution times and poor stability. A method is therefore needed to manage interface automation test scripts automatically, instead of converting and managing the automated scripts manually, so that the current automation coverage, the failed timing diagrams and the design workload of unfinished automation can be known quickly.
Disclosure of Invention
Accordingly, an object of the embodiments of the present invention is to provide an interface automation test processing method, system, computer device and storage medium that can manage interface tests effectively.
To achieve the above object, an embodiment of the present invention provides an interface automation test processing method, including:
acquiring a tested object and a first test case set corresponding to the tested object, wherein the first test case set comprises a plurality of first test cases for executing different interface tests on the tested object;
sequentially recording, as a first timing diagram, the first interface node path along which the tested object executes the test of each first test case in the first test case set, to obtain a first timing diagram set corresponding to the first test case set;
acquiring a plurality of second interface node paths when the tested object executes the tests of a plurality of second test cases within a preset time period, and recording them to obtain a second timing diagram set, wherein the plurality of second test cases are part or all of the first test cases;
and comparing and analyzing the second timing diagram set against the first timing diagram set to obtain a target test coverage report of the second timing diagram set.
Further, the step of sequentially recording, as a first timing diagram, a first interface node path when the tested object executes the test of each first test case in the first test case set, to obtain a first timing diagram set corresponding to the first test case set includes:
acquiring test parameters of the first test case, wherein the test parameters comprise request parameters and return parameters of the first test case;
generating a test script corresponding to the first test case according to the test parameters of the first test case;
executing the test script on the tested object in sequence to obtain a first interface node path corresponding to the first test case, wherein the first interface node path comprises a plurality of first interface nodes corresponding to the tested object in test and a precedence relationship between the tested object and the plurality of first interface nodes;
generating a first timing diagram corresponding to the first test case according to the first interface node path;
and performing de-duplication processing on the first timing diagrams to obtain a first timing diagram set.
Further, the step of acquiring a plurality of second interface node paths when the tested object executes the tests of a plurality of second test cases within a preset time period, and recording them to obtain a second timing diagram set, where the plurality of second test cases are part or all of the first test cases, includes:
acquiring an automatic execution script for executing the first test case;
associating the automatic execution script with the first test case to obtain a second test case;
executing the automatic execution script according to a preset timing task within a preset time period to obtain a second interface node path corresponding to the second test case, wherein the second interface node path comprises a plurality of second interface nodes corresponding to the tested object and a precedence relationship between the tested object and the plurality of second interface nodes;
and recording, according to the second interface node path, a second timing diagram corresponding to the second test case, to obtain a second timing diagram set.
Further, the method further comprises:
determining a third timing diagram in the second timing diagram set according to the target test coverage report, wherein the third timing diagram is any timing diagram corresponding to a test case, among the plurality of second test cases, whose test has not been completed or has failed;
performing regression processing on the third timing diagram to determine, according to the regression result, whether the third timing diagram is stable;
when the third timing diagram is stable, storing the third timing diagram in the first timing diagram set.
Further, the method further comprises:
comparing the first time sequence diagram, the second time sequence diagram and the third time sequence diagram to generate a time sequence diagram coverage diagram according to the comparison result.
Further, the method further comprises:
receiving a test optimization request of a user, wherein the test optimization request comprises target node data for optimization;
inquiring a target time sequence diagram corresponding to the target node data according to the target node data;
acquiring a target test script corresponding to the target time sequence diagram;
and sending the target test script to the user.
Further, the method further comprises:
uploading the target test coverage report of the second timing diagram set into a blockchain.
To achieve the above object, an embodiment of the present invention further provides an interface automation test processing system, including:
the acquisition module is used for acquiring a tested object and a first test case set corresponding to the tested object, wherein the first test case set comprises a plurality of first test cases for executing different interface tests on the tested object;
the first recording module is used for sequentially recording, as a first timing diagram, the first interface node path along which the tested object executes the test of each first test case in the first test case set, to obtain a first timing diagram set corresponding to the first test case set;
the second recording module is used for acquiring a plurality of second interface node paths when the tested object executes the tests of a plurality of second test cases within a preset time period, and recording them to obtain a second timing diagram set, wherein the plurality of second test cases are part or all of the first test cases;
and the analysis module is used for comparing and analyzing the second timing diagram set against the first timing diagram set to obtain a test coverage report of the second timing diagram set.
To achieve the above object, an embodiment of the present invention provides a computer device including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the interface automation test processing method described above.
To achieve the above object, an embodiment of the present invention provides a computer-readable storage medium having stored therein a computer program executable by at least one processor to cause the at least one processor to perform the steps of the interface automation test processing method as described above.
According to the interface automation test processing method, system, computer device and storage medium described above, the first timing diagram set is generated by drawing timing diagrams for the first test case set; when the second test cases are executed on the tested object in a subsequent time period, second timing diagrams are generated from the execution node paths of the second test cases, and the second timing diagram set is then compared with and analyzed against the first timing diagram set, so that a target coverage test report of the second timing diagram set can be obtained. From this target coverage test report, the coverage of the current automated tests, the failed timing diagrams and the design workload of unfinished automation still to be assigned can be known quickly.
Drawings
FIG. 1 is a flowchart of an embodiment of an interface automated test handling method according to the present invention.
FIG. 2 is a schematic diagram illustrating a program module of a second embodiment of an interface automatic test processing system according to the present invention.
Fig. 3 is a schematic diagram of a hardware structure of a third embodiment of the computer device of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Example 1
Referring to fig. 1, a flowchart of the steps of an interface automation test processing method according to a first embodiment of the present invention is shown. It will be appreciated that the flowcharts in the method embodiments are not intended to limit the order in which the steps are performed. The following exemplary description takes the computer device 4 as the execution subject. The details are as follows.
Step S100, a tested object and a first test case set corresponding to the tested object are obtained, wherein the first test case set comprises a plurality of first test cases for executing different interface tests on the tested object.
Specifically, the tested object is an interface that needs to be tested, and the interface data of the tested object is acquired. When different tests are to be performed on the tested object, different first test cases are written for the tested object according to the different test requests, so as to obtain the first test case set. The first test cases are written into test scripts to obtain a test script set corresponding to the first test case set.
Step S120, recording, as a first timing diagram, a first interface node path of the tested object when executing the test of each first test case in the first test case set in sequence, so as to obtain a first timing diagram set corresponding to the first test case set.
Specifically, test logs of the interface tests that the tested object performs via the test scripts of the first test cases over a period of time are collected; the test logs contain the downstream service IPs and log serial numbers. The test logs are parsed to obtain the first interface node paths generated by the test requests initiated from each tested interface. For example, in online shopping, the data flow of one complete transaction, from the buyer placing an order to the final receipt of goods, needs to pass through several systems, such as an ERP system, a warehouse system, a distribution system and a terminal system. These systems are connected by calling one another's interfaces to form a first interface node path, and the transaction data flows along this path. The interface nodes of the systems that one complete data flow passes through constitute the timing diagram. For example, the first interface node path of a test request initiated from interface node A of the tested object is obtained from the test logs, and the downstream interface nodes further called by interface node A along this path are parsed layer by layer until the bottommost layer of the first interface node path is reached; with interface node A as the starting point, the precedence relationship of these interface nodes is stored, and the first timing diagram of interface A is drawn. For the log parsing, a regular-expression model may be configured, or customized parsing may be performed according to the printing format of each system, to obtain the downstream interface data; for example, the downstream interface data of the first interface node path may be matched by configuring a regular expression corresponding to the data format of the system IP.
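For illustration only, the following Python sketch shows one way the log parsing described above could be organised: calls are grouped by log serial number into ordered node paths and then rendered as text sequence diagrams. The log format, the trace/serial field and all function names are assumptions made for the sketch, not details fixed by this embodiment, which only requires that downstream calls be extracted (for example by configured regular expressions) and ordered into a first interface node path.

```python
import re
from collections import defaultdict

# Assumed log line format (hypothetical):
# "2021-03-17 10:00:01 [trace:8f3a] 10.0.0.1 -> 10.0.0.2 orderService.createOrder"
LOG_PATTERN = re.compile(
    r"\[trace:(?P<serial>\w+)\]\s+(?P<caller>\S+)\s+->\s+(?P<callee>\S+)\s+(?P<interface>\S+)"
)

def build_first_node_paths(log_lines):
    """Group parsed interface calls by log serial number, preserving their order of
    appearance, so that each serial number yields one first interface node path."""
    paths = defaultdict(list)
    for line in log_lines:
        match = LOG_PATTERN.search(line)
        if match:
            paths[match.group("serial")].append(
                (match.group("caller"), match.group("callee"), match.group("interface"))
            )
    return paths

def to_timing_diagram(path):
    """Render one ordered node path as a simple PlantUML-style text sequence diagram."""
    lines = ["@startuml"]
    for caller, callee, interface in path:
        lines.append(f"{caller} -> {callee} : {interface}")
    lines.append("@enduml")
    return "\n".join(lines)
```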
In an exemplary embodiment, the step S120 further includes:
step S121, obtaining test parameters of the first test case, where the test parameters include request parameters and return parameters of the first test case.
Step S122, generating a test script corresponding to the first test case according to the test parameters of the first test case.
Step S123, executing the test script on the tested object in sequence to obtain a first interface node path corresponding to the first test case, where the first interface node path includes a plurality of first interface nodes corresponding to the tested object when the tested object is tested and a precedence relationship between the tested object and the plurality of first interface nodes.
Step S124, generating a first timing chart corresponding to the first test case according to the first interface node path.
And step S125, performing de-duplication processing on the first timing diagrams to obtain a first timing diagram set.
Specifically, the test parameters of the first test cases may be stored in a database in text form and used to generate the test scripts, which are automated test scripts. The test of the tested object can be carried out by configuring a timing mode on an automated test platform such as Phoenix Framework and starting the automated test scripts at the scheduled times. The test parameters include request parameters and return parameters: the request parameters include the parameters related to the test requirements of the tested object, and the return parameters refer to the result returned after the test script is executed, which can be understood as the expected result. When a test script is executed, it calls the interface node corresponding to the tested object and the interface nodes of other downstream systems for testing, and a first timing diagram corresponding to the test script is generated; de-duplication processing is performed on the first timing diagrams of all interface nodes to generate a non-repeating first timing diagram set. Taking the timing diagrams as the baseline for design, the scripts corresponding to the existing timing diagrams of an interface node can be looked up during design and execution, and duplicate timing diagrams and the upstream automation status can be recommended automatically, which facilitates rapid maintenance and design of the scripts.
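As a minimal sketch of the de-duplication step, assuming each first timing diagram is represented as the ordered list of (caller, callee, interface) calls produced by the parsing above, duplicates can be collapsed by using the call chain itself as a fingerprint; this representation is an assumption made for the example only.

```python
def deduplicate_diagrams(diagrams):
    """Collapse first timing diagrams that describe the same call chain.
    Each diagram is assumed to be an ordered list of (caller, callee, interface)
    tuples, so the tuple of calls can serve directly as the de-duplication key."""
    seen = set()
    unique = []
    for diagram in diagrams:
        key = tuple(diagram)  # the ordered call chain acts as the fingerprint
        if key not in seen:
            seen.add(key)
            unique.append(diagram)
    return unique
```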
Step S140, a plurality of second interface node paths along which the tested object executes the tests of a plurality of second test cases within a preset time period are acquired and recorded to obtain a second timing diagram set, where the plurality of second test cases are part or all of the first test cases.
Specifically, the second test cases may be first test cases for which the user performs the tests of part or all of the cases again on the tested interface within the preset time period; the first test cases are tested automatically by the automatic execution scripts, and the second interface node paths are recorded correspondingly. Within the preset time period, the user may optimize or modify the interface data of the tested interface or of a downstream system interface. The tested object is tested through the second test cases by executing the test scripts corresponding to the second test cases, the interface nodes called during execution are acquired, and second timing diagrams are generated from the called interface nodes. As the testing work continues, the background keeps recording the second timing diagram of each interface node of the tested object, and the second timing diagrams within the preset time period are acquired to obtain the second timing diagram set.
In an exemplary embodiment, the step S140 further includes:
step S141, acquiring an automatic execution script for executing the first test case; step S142, associating the automatic execution script with the first test case to obtain a second test case; step S143, executing the automated execution script according to a preset timing task within a preset time to obtain a second interface node path corresponding to the second test case, where the second interface node path includes a plurality of second interface nodes corresponding to the tested object and a precedence relationship between the tested object and the plurality of second interface nodes when the tested object is tested; step S144, obtaining a second time sequence diagram corresponding to the second test case according to the second interface node path record, and obtaining a second time sequence diagram set.
Specifically, the plurality of second test cases are part or all of the first test cases. The test of the tested object can be carried out by configuring a timing mode on an automated test platform such as Phoenix Framework and starting the automatic execution scripts on schedule, and a corresponding second timing diagram is generated when each test script is executed. Because a test script carries the request parameters of its test case, when, for example, tested interface node A is called for testing, a second interface node path with tested interface node A as its starting point is generated; the second timing diagram is generated from this second interface node path, and by associating the second timing diagram with tested interface node A, the interface identifier of tested interface node A can be related to the second timing diagram. In this way the test scripts transition into automatic execution scripts and are changed gradually according to the drawn test interaction diagrams, so as to establish a basic system for automated interface test execution.
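The timed execution described above could be approximated as in the following sketch; the plain loop with time.sleep stands in for whatever scheduler the automated test platform actually provides, and execute_case is a hypothetical callable rather than a Phoenix Framework API.

```python
import time
from datetime import datetime, timedelta

def run_scheduled_tests(execute_case, second_cases, interval_s=3600, window_s=86400):
    """Within a preset time window, repeatedly run the automatic execution scripts of
    the second test cases on a fixed interval and collect the second interface node
    paths they produce. `execute_case` runs one case and returns its recorded path."""
    deadline = datetime.now() + timedelta(seconds=window_s)
    recorded_paths = []
    while datetime.now() < deadline:
        for case in second_cases:
            recorded_paths.append(execute_case(case))  # one second node path per run
        time.sleep(interval_s)
    return recorded_paths
```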
And step S160, comparing and analyzing the second timing diagram set against the first timing diagram set to obtain a target test coverage report of the second timing diagram set.
Specifically, the timing diagrams in the second timing diagram set are compared with the first timing diagram set to check whether they are the same. If the number of timing diagrams appearing in the second timing diagram set is smaller than the number in the first timing diagram set, a test coverage report indicating low coverage of the second test cases is generated accordingly, where the test coverage rate of the second timing diagram set = the number of second timing diagrams / the number of timing diagrams in the first timing diagram set. If the number of timing diagrams appearing in the second timing diagram set equals that of the first timing diagram set, a test coverage report indicating 100% coverage of the second test cases is generated. If a timing diagram in the second timing diagram set does not exist in the first timing diagram set, it is determined that a timing diagram requiring regression has appeared in the second test cases.
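One possible reading of this comparison in code, assuming both sets hold diagrams in the call-chain representation used in the earlier sketches; the report fields are illustrative only.

```python
def coverage_report(first_diagram_set, second_diagram_set):
    """Compare the second timing diagram set against the first (baseline) set.
    The coverage rate is taken as the number of baseline diagrams that reappear in
    the second set divided by the size of the baseline set; second-set diagrams with
    no baseline counterpart are flagged as needing regression."""
    baseline = {tuple(d) for d in first_diagram_set}
    current = {tuple(d) for d in second_diagram_set}
    covered = baseline & current
    rate = len(covered) / len(baseline) if baseline else 0.0
    return {
        "coverage_rate": rate,
        "missing": baseline - current,           # baseline diagrams not exercised
        "needs_regression": current - baseline,  # diagrams with no baseline counterpart
    }
```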
In an exemplary embodiment, the method further comprises:
step S200, determining a third timing chart in the second timing chart set according to the target test coverage report, where the third timing chart is any timing chart corresponding to a test case in the second test case set that has not completed a test or has failed. Step S220, performing regression processing on the third timing chart to correct whether the third timing chart is stable according to the regression result. Step S240, when the third timing chart is stable, storing the third timing chart in the first timing chart set.
Specifically, the third timing diagram that requires regression (including the interface nodes of the tested object) is determined according to the target test coverage report, and the test cases associated with the third timing diagram are obtained. The process of regressing the third timing diagram is as follows: the timing diagram closest to the third timing diagram is acquired from the first timing diagram set, the test script of the timing diagram corresponding to the third timing diagram is executed, and the test script corresponding to the tested third timing diagram is modified according to the test result, so that the test result of the test script corresponding to the third timing diagram reaches the expected effect. This amounts to automatically correcting the test script: when the test result of the test script corresponding to the third timing diagram reaches the expected effect, the third timing diagram is stable. After the third timing diagram becomes stable, it is merged into the first timing diagram set to enrich the first timing diagram set.
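A sketch of how such a regression pass might be wired up is shown below; the use of SequenceMatcher similarity as the notion of "closest", and the run_script and expected_result placeholders, are assumptions for illustration rather than part of the described method.

```python
from difflib import SequenceMatcher

def closest_baseline(third_diagram, first_diagram_set):
    """Pick the baseline timing diagram most similar to the failed or unfinished one,
    using call-chain similarity as a rough notion of 'closest'."""
    return max(first_diagram_set,
               key=lambda d: SequenceMatcher(None, list(d), list(third_diagram)).ratio())

def regress_third_diagram(third_diagram, first_diagram_set, run_script,
                          expected_result, max_attempts=3):
    """Re-run the script associated with the third diagram against the closest baseline
    until its result matches the expectation; at that point the diagram is treated as
    stable and can be merged into the first timing diagram set. `run_script` is a
    hypothetical callable standing in for the platform's script executor."""
    baseline = closest_baseline(third_diagram, first_diagram_set)
    for _ in range(max_attempts):
        if run_script(third_diagram, baseline) == expected_result:
            return True
    return False
```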
In an exemplary embodiment, the method further comprises:
comparing the first time sequence diagram, the second time sequence diagram and the third time sequence diagram to generate a coverage diagram according to a comparison result.
Specifically, by executing the first test cases automatically and periodically, a timing diagram set can be output within a preset time period and compared with the stored timing diagram set, namely the first timing diagram set, so that an automation coverage map and a map of unfinished automation coverage are generated. From the automation coverage map, the current automation coverage, the failed timing diagrams and the timing diagrams whose automation has not yet been fully assigned can be known quickly.
In an exemplary embodiment, the method further comprises:
step S300, receiving a test optimization request of a user, wherein the test optimization request comprises target node data for optimization. Step S320, querying a target timing diagram corresponding to the target node data according to the target node data. Step S340, obtaining a target test script corresponding to the target timing diagram. And step S360, the target test script is sent to the user.
Specifically, the test optimization request arises when a certain target node in the first interface node path is modified: a test optimization request is generated from the node data corresponding to the target node, the target timing diagram containing the target node is searched for, using the target node data, in the database storing the first timing diagram set, the target test script associated with the target timing diagram is then looked up in the database, and the target test script is pushed to the user, so that the user can optimize it to obtain an optimized test script.
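A minimal sketch of this lookup, assuming the stored timing diagrams and their associated scripts are kept in simple in-memory structures; in the embodiment itself they live in a database, and the names here are hypothetical.

```python
def find_optimization_targets(target_node, stored_diagrams, script_index):
    """Given the node data of a modified target node, return the target timing diagrams
    that pass through that node together with their associated test scripts, so the
    scripts can be pushed to the user for optimisation. `stored_diagrams` is an iterable
    of call-chain diagrams and `script_index` maps a diagram (as a tuple) to its script."""
    hits = []
    for diagram in stored_diagrams:
        if any(target_node in (caller, callee) for caller, callee, _ in diagram):
            hits.append((diagram, script_index.get(tuple(diagram))))
    return hits
```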
In an exemplary embodiment, the method further comprises:
uploading the target test coverage report of the second timing diagram set into a blockchain.
Specifically, uploading the target test coverage report to the blockchain ensures its security and its fairness and transparency to the user. The user device may download the target test coverage report from the blockchain to verify whether it has been tampered with. The blockchain referred to in this example is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms and encryption algorithms. A blockchain is essentially a decentralised database, a chain of data blocks generated in association by cryptographic methods, each data block containing a batch of network transaction information used to verify the validity of the information (anti-counterfeiting) and to generate the next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
Example two
With continued reference to fig. 2, a schematic diagram of the program modules of a second embodiment of the interface automation test processing system of the present invention is shown. In this embodiment, the interface automation test processing system 40 may include or be divided into one or more program modules, which are stored in a storage medium and executed by one or more processors, to complete the present invention and implement the interface automation test processing method described above. A program module referred to in the embodiments of the present invention is a series of computer program instruction segments capable of performing a specific function, and is more suitable than the program itself for describing the execution of the interface automation test processing system 40 in the storage medium. The following description specifically introduces the functions of each program module of this embodiment:
the obtaining module 400 is configured to obtain a tested object and a first test case set corresponding to the tested object, where the first test case set includes a plurality of first test cases that execute different interface tests on the tested object.
Specifically, the tested object is an interface that needs to be tested, and the interface data of the tested object is acquired. When different tests are to be performed on the tested object, different first test cases are written for the tested object according to the different test requests, so as to obtain the first test case set. The first test cases are written into test scripts to obtain a test script set corresponding to the first test case set.
The first recording module 402 is configured to record, as a first timing diagram, a first interface node path when the tested object executes the test of each first test case in the first test case set in sequence, so as to obtain a first timing diagram set corresponding to the first test case set.
Specifically, test logs of the interface tests that the tested object performs via the automated test scripts of the first test cases over a period of time are collected; the test logs contain the downstream service IPs and log serial numbers. The test logs are parsed to obtain the first interface node paths generated by the test requests initiated from each tested interface. For example, in online shopping, the data flow of one complete transaction, from purchase to the final receipt of goods, needs to pass through several systems, such as an ERP system, a warehouse system, a distribution system and a terminal system. These systems are connected by calling one another's interfaces to form a first interface node path, and the transaction data flows along this path. The interface nodes of the systems that one complete data flow passes through constitute the timing diagram. For example, the first interface node path of a test request initiated from interface node A of the tested object is obtained from the test logs, and the downstream interface nodes further called by interface node A along this path are parsed layer by layer until the bottommost layer of the first interface node path is reached; with interface node A as the starting point, the precedence relationship of these interface nodes is stored, and the first timing diagram of interface A is drawn. For the log parsing, a regular-expression model may be configured, or customized parsing may be performed according to the printing format of each system, to obtain the downstream interface data; for example, the downstream interface data of the first interface node path may be matched by configuring a regular expression corresponding to the data format of the system IP.
In an exemplary embodiment, the first recording module 402 is further configured to:
acquiring test parameters of the first test case, wherein the test parameters comprise request parameters and return parameters of the first test case; generating a test script corresponding to the first test case according to the test parameters of the first test case; executing the test script on the tested object in sequence to obtain a first interface node path corresponding to the first test case, wherein the first interface node path comprises a plurality of first interface nodes corresponding to the tested object in test and a precedence relationship between the tested object and the plurality of first interface nodes; generating a first timing diagram corresponding to the first test case according to the first interface node path; and performing de-duplication processing on the first timing diagrams to obtain a first timing diagram set.
Specifically, the test parameters of the first test cases may be stored in a database in text form and used to generate the test scripts, which are automated test scripts. The test of the tested object can be carried out by configuring a timing mode on an automated test platform such as Phoenix Framework and starting the automated test scripts at the scheduled times. The test parameters include request parameters and return parameters: the request parameters include the parameters related to the test requirements of the tested object, and the return parameters refer to the result returned after the test script is executed, which can be understood as the expected result. When a test script is executed, it calls the interface node corresponding to the tested object and the interface nodes of other downstream systems for testing, and a first timing diagram corresponding to the test script is generated; de-duplication processing is performed on the first timing diagrams of all interface nodes to generate a non-repeating first timing diagram set. Taking the timing diagrams as the baseline for design, the scripts corresponding to the existing timing diagrams of an interface node can be looked up during design and execution, and duplicate timing diagrams and the upstream automation status can be recommended automatically, which facilitates rapid maintenance and design of the scripts.
The second recording module 404 is configured to obtain a plurality of second interface node paths when the tested object executes the test of a plurality of second test cases in a preset period of time, and record to obtain a second timing chart set, where the plurality of second test cases are part or all of the first test cases.
Specifically, the second test cases may be first test cases for which the user performs the tests of part or all of the cases again on the tested interface within the preset time period; the first test cases are tested automatically by the automatic execution scripts, and the second interface node paths are recorded correspondingly. Within the preset time period, the user may optimize or modify the interface data of the tested interface or of a downstream system interface. The tested object is tested through the second test cases by executing the test scripts corresponding to the second test cases, the interface nodes called during execution are acquired, and second timing diagrams are generated from the called interface nodes. As the testing work continues, the background keeps recording the second timing diagram of each interface node of the tested object, and the second timing diagrams within the preset time period are acquired to obtain the second timing diagram set.
In an exemplary embodiment, the second recording module 404 is further configured to:
acquiring an automatic execution script for executing the first test case; associating the automatic execution script with the first test case to obtain a second test case; executing the automatic execution script according to a preset timing task within a preset time period to obtain a second interface node path corresponding to the second test case, wherein the second interface node path comprises a plurality of second interface nodes corresponding to the tested object and a precedence relationship between the tested object and the plurality of second interface nodes; and recording, according to the second interface node path, a second timing diagram corresponding to the second test case, to obtain a second timing diagram set.
Specifically, the plurality of second test cases are part or all of the first test cases. The test of the tested object can be carried out by configuring a timing mode on an automated test platform such as Phoenix Framework and starting the automatic execution scripts on schedule, and a corresponding second timing diagram is generated when each test script is executed. Because a test script carries the request parameters of its test case, when, for example, tested interface node A is called for testing, a second interface node path with tested interface node A as its starting point is generated; the second timing diagram is generated from this second interface node path, and by associating the second timing diagram with tested interface node A, the interface identifier of tested interface node A can be related to the second timing diagram. In this way the test scripts transition into automatic execution scripts and are changed gradually according to the drawn test interaction diagrams, so as to establish a basic system for automated interface test execution.
And the analysis module 406 is configured to compare and analyze the second timing diagram set against the first timing diagram set to obtain a test coverage report of the second timing diagram set.
Specifically, the timing diagrams in the second timing diagram set are compared with the first timing diagram set to check whether they are the same. If the number of timing diagrams appearing in the second timing diagram set is smaller than the number in the first timing diagram set, a test coverage report indicating low coverage of the second test cases is generated accordingly, where the test coverage rate of the second timing diagram set = the number of second timing diagrams / the number of timing diagrams in the first timing diagram set. If the number of timing diagrams appearing in the second timing diagram set equals that of the first timing diagram set, a test coverage report indicating 100% coverage of the second test cases is generated. If a timing diagram in the second timing diagram set does not exist in the first timing diagram set, it is determined that a timing diagram requiring regression has appeared in the second test cases.
Example III
Referring to fig. 3, a hardware architecture diagram of a computer device according to a third embodiment of the present invention is shown. In this embodiment, the computer device 4 is a device capable of automatically performing numerical calculation and/or information processing according to preset or stored instructions. The computer device 4 may be a rack server, a blade server, a tower server or a cabinet server (including a stand-alone server, or a server cluster made up of multiple servers), or the like. As shown in FIG. 3, the computer device 4 includes, but is not limited to, at least a memory 41, a processor 42, a network interface 43 and the interface automation test processing system 40, which are communicatively connected to one another via a system bus. Wherein:
in this embodiment, the memory 41 includes at least one type of computer-readable storage medium including flash memory, a hard disk, a multimedia card, a card memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the memory 41 may be an internal storage unit of the computer device 4, such as a hard disk or a memory of the computer device 4. In other embodiments, the memory 41 may also be an external storage device of the computer device 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the computer device 4. Of course, the memory 41 may also comprise both an internal memory unit of the computer device 4 and an external memory device. In this embodiment, the memory 41 is typically used to store an operating system and various types of application software installed on the computer device 4, such as program codes of the interface automation test processing system 40 of the second embodiment. In addition, the memory 41 may also be used to temporarily store various types of data that have been output or are to be output.
Processor 42 may be a central processing unit (Central Processing Unit, CPU), controller, microcontroller, microprocessor, or other data processing chip in some embodiments. The processor 42 is typically used to control the overall operation of the computer device 4. In this embodiment, the processor 42 is configured to execute the program code stored in the memory 41 or process data, for example, to execute the interface automation test processing system 40, so as to implement the interface automation test processing method of the first embodiment.
The network interface 43 may comprise a wireless network interface or a wired network interface, and is typically used to establish a communication connection between the computer device 4 and other electronic devices. For example, the network interface 43 is used to connect the computer device 4 to an external terminal through a network, and to establish a data transmission channel and a communication connection between the computer device 4 and the external terminal. The network may be an Intranet, the Internet, a global system for mobile communications (Global System of Mobile communication, GSM), wideband code division multiple access (Wideband Code Division Multiple Access, WCDMA), a 4G network, a 5G network, Bluetooth, Wi-Fi, or another wireless or wired network. It is noted that fig. 3 only shows the computer device 4 with components 40-43, but it should be understood that not all of the illustrated components are required to be implemented, and that more or fewer components may be implemented instead.
In this embodiment, the interface automation test processing system 40 stored in the memory 41 may be further divided into one or more program modules, which are stored in the memory 41 and executed by one or more processors (the processor 42 in this embodiment) to complete the present invention.
For example, fig. 2 shows a schematic diagram of the program modules of the second embodiment of the interface automation test processing system 40, where the interface automation test processing system 40 may be divided into the acquisition module 400, the first recording module 402, the second recording module 404 and the analysis module 406. A program module in the present invention refers to a series of computer program instruction segments capable of performing a specific function, and is more suitable than the program itself for describing the execution of the interface automation test processing system 40 in the computer device 4. The specific functions of the program modules 400-406 have been described in detail in the second embodiment and are not repeated here.
Example IV
The present embodiment also provides a computer-readable storage medium, such as a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, a server or an App application store, on which a computer program is stored that performs the corresponding functions when executed by a processor. The computer-readable storage medium of this embodiment stores a computer program which, when executed by a processor, implements the interface automation test processing method of the first embodiment.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, and may of course also be implemented by hardware, but in many cases the former is the preferred implementation.
The foregoing description covers only preferred embodiments of the present invention and is not intended to limit the scope of the invention; any equivalent structural or process transformation made using the contents of this specification and the drawings, or any direct or indirect application in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (9)

1. An interface automation test processing method, comprising:
acquiring a tested object and a first test case set corresponding to the tested object, wherein the first test case set comprises a plurality of first test cases for executing different interface tests on the tested object;
sequentially recording, as a first timing diagram, the first interface node path along which the tested object executes the test of each first test case in the first test case set, to obtain a first timing diagram set corresponding to the first test case set;
acquiring a plurality of second interface node paths when the tested object executes the tests of a plurality of second test cases within a preset time period, and recording them to obtain a second timing diagram set, wherein the plurality of second test cases are part or all of the first test cases;
comparing and analyzing the second timing diagram set against the first timing diagram set to obtain a target test coverage report of the second timing diagram set;
the method further comprises the steps of:
determining a third timing diagram in the second timing diagram set according to the target test coverage report, wherein the third timing diagram is any timing diagram corresponding to a test case, among the plurality of second test cases, whose test has not been completed or has failed;
performing regression processing on the third timing diagram to determine, according to the regression result, whether the third timing diagram is stable;
when the third timing diagram is stable, storing the third timing diagram in the first timing diagram set.
2. The method according to claim 1, wherein the step of sequentially recording, as a first timing diagram, the first interface node path along which the tested object executes the test of each first test case in the first test case set, to obtain a first timing diagram set corresponding to the first test case set comprises:
acquiring test parameters of the first test case, wherein the test parameters comprise request parameters and return parameters of the first test case;
generating a test script corresponding to the first test case according to the test parameters of the first test case;
executing the test script on the tested object in sequence to obtain a first interface node path corresponding to the first test case, wherein the first interface node path comprises a plurality of first interface nodes corresponding to the tested object in test and a precedence relationship between the tested object and the plurality of first interface nodes;
generating a first timing diagram corresponding to the first test case according to the first interface node path;
and performing de-duplication processing on the first timing diagrams to obtain a first timing diagram set.
3. The interface automation test processing method according to claim 1, wherein the step of acquiring a plurality of second interface node paths when the tested object executes the tests of a plurality of second test cases within a preset time period, and recording them to obtain a second timing diagram set, where the plurality of second test cases are part or all of the first test cases, comprises:
acquiring an automatic execution script for executing the first test case;
associating the automatic execution script with the first test case to obtain a second test case;
executing the automatic execution script according to a preset timing task within a preset time period to obtain a second interface node path corresponding to the second test case, wherein the second interface node path comprises a plurality of second interface nodes corresponding to the tested object and a precedence relationship between the tested object and the plurality of second interface nodes;
and recording, according to the second interface node path, a second timing diagram corresponding to the second test case, to obtain a second timing diagram set.
4. The interface automation test processing method of claim 1, further comprising:
comparing the first time sequence diagram, the second time sequence diagram and the third time sequence diagram to generate a time sequence diagram coverage diagram according to the comparison result.
5. The interface automation test processing method of claim 1, further comprising:
receiving a test optimization request of a user, wherein the test optimization request comprises target node data for optimization;
inquiring a target time sequence diagram corresponding to the target node data according to the target node data;
acquiring a target test script corresponding to the target time sequence diagram;
and sending the target test script to the user.
6. The interface automation test processing method of claim 1, further comprising:
uploading the target test coverage report of the second timing diagram set into a blockchain.
7. An interface automation test processing system for implementing the interface automation test processing method of any one of claims 1 to 6, comprising:
the acquisition module is used for acquiring a tested object and a first test case set corresponding to the tested object, wherein the first test case set comprises a plurality of first test cases for executing different interface tests on the tested object;
the first recording module is used for sequentially recording, as a first timing diagram, the first interface node path along which the tested object executes the test of each first test case in the first test case set, to obtain a first timing diagram set corresponding to the first test case set;
the second recording module is used for acquiring a plurality of second interface node paths when the tested object executes the tests of a plurality of second test cases within a preset time period, and recording them to obtain a second timing diagram set, wherein the plurality of second test cases are part or all of the first test cases;
and the analysis module is used for comparing and analyzing the second timing diagram set against the first timing diagram set to obtain a test coverage report of the second timing diagram set.
8. A computer device, characterized in that it comprises a memory and a processor, the memory storing a computer program that can be run on the processor, wherein the computer program, when executed by the processor, implements the steps of the interface automation test processing method according to any one of claims 1-6.
9. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program executable by at least one processor to cause the at least one processor to perform the steps of the interface automation test processing method according to any of claims 1-6.
CN202110286192.7A 2021-03-17 2021-03-17 Interface automatic test processing method, system, computer equipment and storage medium Active CN113010424B (en)

Priority Applications (1)

CN202110286192.7A (priority date 2021-03-17, filing date 2021-03-17) - Interface automatic test processing method, system, computer equipment and storage medium
Publications (2)

Publication Number - Publication Date
CN113010424A (en) - 2021-06-22
CN113010424B (en) - 2024-04-02

Family

ID=76409240

Family Applications (1)

CN202110286192.7A - Active - CN113010424B - filed 2021-03-17 - Interface automatic test processing method, system, computer equipment and storage medium

Country Status (1)

CN: CN113010424B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101645037A (en) * 2009-09-11 2010-02-10 兰雨晴 Coverage analysis method of foundational software platform application program interface integrated test
CN101916225A (en) * 2010-09-02 2010-12-15 于秀山 Graphical user interface software function coverage testing method
WO2016170937A1 (en) * 2015-04-20 2016-10-27 三菱電機株式会社 Test automation device, test automation method, and test automation program
CN107807881A (en) * 2017-09-28 2018-03-16 北京新能源汽车股份有限公司 Method of testing, device and the computer equipment of code coverage
CN108073519A (en) * 2018-01-31 2018-05-25 百度在线网络技术(北京)有限公司 Method for generating test case and device
CN109857654A (en) * 2019-01-17 2019-06-07 珠海金山网络游戏科技有限公司 A kind of method, apparatus and system of the timing flow chart automatically generating test case
CN111309635A (en) * 2020-03-26 2020-06-19 北京奇艺世纪科技有限公司 Test case generation method, device, server and storage medium
CN111552631A (en) * 2020-03-27 2020-08-18 中国平安人寿保险股份有限公司 System testing method, device and computer readable storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10725894B2 (en) * 2017-10-04 2020-07-28 International Business Machines Corporation Measuring and improving test coverage




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant