CN113190453A - User interface testing method, device, server and medium

Info

Publication number
CN113190453A
Authority
CN
China
Prior art keywords
tested
test
user interface
codes
page
Prior art date
Legal status
Pending
Application number
CN202110506115.8A
Other languages
Chinese (zh)
Inventor
李一伟 (Li Yiwei)
Current Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Wodong Tianjun Information Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Wodong Tianjun Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd and Beijing Wodong Tianjun Information Technology Co Ltd
Priority to CN202110506115.8A
Publication of CN113190453A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3676 Test management for coverage analysis
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Abstract

The embodiment of the invention discloses a user interface testing method, apparatus, server and medium, wherein the method comprises the following steps: acquiring manual test operation data of each page to be tested in a system to be tested, and analyzing the operation data to generate an automatic execution script; executing the automatic execution script to test each page to be tested, and reading program running data of the system to be tested during the test; and calculating the test coverage rate of the user interface based on the program running data and the call relation graph of all codes of the system to be tested. The technical scheme of the embodiment solves the problems of low efficiency in developing test scripts for automated user interface testing and low input-output ratio of such testing, realizes a complete automated user interface testing process from test script generation and automated testing to test result measurement, and improves the efficiency and input-output ratio of automated user interface testing.

Description

User interface testing method, device, server and medium
Technical Field
The embodiments of the invention relate to the field of computer technology, and in particular to a user interface testing method, apparatus, server and medium.
Background
Automated user interface (UI) testing is an important branch of the automated testing field; it directly reflects end-to-end user operations and comes close to testing real scenarios. The common approach is for test developers to write code implementing UI operation cases and to simulate user operation scenarios through a browser driver (WebDriver) or similar means. Automated test coverage is a common measure of test completeness, and a test is generally measured with indexes such as code coverage rate.
However, in the process of implementing the present invention, the inventors found at least the following technical problems in the prior art: test developers develop test scripts inefficiently, execution of the test scripts is unstable, the input-output ratio of automated UI testing is low, and testing efficiency cannot be improved in actual tests.
Disclosure of Invention
The embodiments of the invention provide a user interface testing method, apparatus, server and medium, which realize a complete automated user interface testing process from test script generation and automated testing to test result measurement, and improve the efficiency and input-output ratio of automated user interface testing.
In a first aspect, an embodiment of the present invention provides a user interface testing method, where the method includes:
acquiring manual test operation data of each page to be tested in a system to be tested, and analyzing the operation data to generate an automatic execution script;
executing the automatic execution script to test each page to be tested, and reading program running data of the system to be tested during the test;
and calculating the test coverage rate of the user interface based on the program running data and the call relation graph of all codes of the system to be tested.
In a second aspect, an embodiment of the present invention further provides a user interface testing apparatus, where the apparatus includes:
the test script generating module is used for acquiring manual test operation data of each page to be tested in the system to be tested and analyzing the operation data to generate an automatic execution script;
the test module is used for executing the automatic execution script to test each page to be tested and reading program running data of the system to be tested in the test process;
and the test statistical module is used for calculating the test coverage rate of the user interface based on the program running data and the call relation graph of all the codes of the system to be tested.
In a third aspect, an embodiment of the present invention further provides a server, where the server includes:
one or more processors;
a memory for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the user interface testing method as provided by any embodiment of the invention.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the user interface testing method provided in any embodiment of the present invention.
The embodiment of the invention has the following advantages or beneficial effects:
In the embodiment of the invention, the manual test operation data of each front-end page to be tested in the system to be tested is first obtained by recording, and the operation data is analyzed to generate an automatic execution script; the automatic execution script is then executed to test each page to be tested, and the program running data of the system to be tested is read during the test; finally, the test coverage rate of the user interface is calculated based on the program running data and the call relation graph of all codes of the system to be tested. This solves the problems of low efficiency in developing test scripts for user interface testing and low input-output ratio of the tests, realizes a complete automated user interface testing process from test script generation and automated testing to test result measurement, improves the efficiency and input-output ratio of automated user interface testing, and provides better test stability.
Drawings
FIG. 1 is a flowchart of a method for testing a user interface according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating a code invocation relationship according to an embodiment of the present invention;
FIG. 3 is a flowchart of a user interface testing method according to a second embodiment of the present invention;
fig. 4 is a schematic structural diagram of a user interface testing apparatus according to a third embodiment of the present invention;
fig. 5 is a schematic structural diagram of a server according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a user interface testing method according to an embodiment of the present invention, which is applicable to a case where an automatic user interface test is performed in a program development process. The method can be executed by a user interface testing device, which can be implemented by software and/or hardware and is integrated in an electronic device with application development function.
As shown in fig. 1, the user interface testing method includes the steps of:
s110, acquiring manual test operation data of each page to be tested in the system to be tested, and analyzing the operation data to generate an automatic execution script.
The manual test operation data of a page to be tested is data produced by a human operating the front-end page of the system to be tested. A program tester or other user first traverses the page elements of the page to be tested once, and this operation process is then converted into an automated test script, so that the automatic execution script of the page to be tested is generated automatically.
Specifically, the manual test operation data of each page to be tested can be acquired through a page operation monitoring agent plug-in. The page operation monitoring agent runs in the same life cycle as the system to be tested, records the data of the manual test operations, and sends the data to the server side used for system testing so as to generate the automatic execution script. In the process of generating the automatic execution script, a corresponding operation event test script is first matched in a preset script library for each operation event in the manual test data, and the matched test scripts are then stored according to a preset data structure and format to obtain the final automatic execution script, where the automatic execution script comprises page element automatic execution scripts and interface test automatic execution scripts. That is, the operations on the front-end user interface may be recorded as a plurality of scripts, namely the automatic execution scripts of the user interface elements, with one operation event corresponding to one or more such scripts. The HTTP or HTTPS requests (XMLHttpRequest) through which front-end user interface elements interact with the back-end engineering may be recorded as a plurality of interface automation test scripts, with one operation event corresponding to one or more HTTP interface test scripts. When execution of a page element script is unstable, the interface test script of the corresponding HTTP or HTTPS request event may be used as a supplement to that page element script.
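As an illustration of this matching step, the following sketch pairs each recorded operation event with a template from a preset script library; the RecordedEvent fields, the library contents and the pipe-separated output are assumptions chosen for the example, not structures taken from the patent.

```java
import java.util.Map;

// Hypothetical sketch of matching recorded operation events against a
// preset script library keyed by event type. In a real implementation the
// library would hold executable script fragments rather than text templates.
public class ScriptMatcher {

    record RecordedEvent(String type, String elementPath, String value) {}

    static final Map<String, String> PRESET_LIBRARY = Map.of(
            "click",  "click | %s | %s",
            "select", "select | %s | %s",
            "type",   "type | %s | %s");

    static String toScriptLine(RecordedEvent e) {
        String template = PRESET_LIBRARY.get(e.type());
        if (template == null) {
            throw new IllegalArgumentException("no preset script for event: " + e.type());
        }
        return String.format(template, e.elementPath(), e.value());
    }
}
```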
Further, the file format definition of the automatic execution script may include a file name and a command set, and the command set contains command, target, value and targets information. Here, command represents the operation event, such as click, select or type (input); target represents the operated element, for example id=bgDiv, or an XPath expression such as //*[@id="content"]/div/div/div (exemplarily representing the search path of the operated element); value represents the parameter value of the operation; and targets represents the other positioning modes of the operated element, i.e. alternative search paths.
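Expressed as a data structure, and using Java purely for illustration, the script file described above might be modeled as follows; the class and field names mirror the patent's terms, and everything else is an assumption.

```java
import java.util.List;

// Sketch of the automatic execution script file: a file name plus an
// ordered command set, with field names mirroring the format definition.
public class AutoExecScript {
    String fileName;
    List<Command> commandSet;

    static class Command {
        String command;       // operation event, e.g. "click", "select", "type"
        String target;        // primary search path, e.g. "id=bgDiv" or an XPath
        String value;         // parameter value of the operation, e.g. input text
        List<String> targets; // alternative search paths for the same element
    }
}
```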
S120, executing the automatic execution script to test each page to be tested, and reading program running data of the system to be tested during the test.
In the process of testing the user interface, the automatic execution script is executed to test the page to be tested based on a page driver and a script parser. The test comprises positioning the operated element (target) in the automatic execution script and sending the operation command (command) of the operation event to the server so that the system to be tested executes the corresponding operation. If positioning the operated element times out, the information in the operated element positioning path list (targets) is further parsed until the element is positioned through one of the alternative paths, and the parameter value is then input to complete the operation.
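A minimal sketch of this locate-with-fallback behavior, assuming Selenium WebDriver as the page driver (the patent does not name one); the five-second timeout and the command dispatch are illustrative assumptions.

```java
import java.time.Duration;
import java.util.List;
import org.openqa.selenium.By;
import org.openqa.selenium.TimeoutException;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class ScriptStepExecutor {

    // Locate the element by the primary target path; if locating times out,
    // fall back to each alternative path in targets, as described above.
    static WebElement locate(WebDriver driver, By target, List<By> targets) {
        WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(5));
        try {
            return wait.until(ExpectedConditions.presenceOfElementLocated(target));
        } catch (TimeoutException primaryTimedOut) {
            for (By alternative : targets) {
                try {
                    return wait.until(ExpectedConditions.presenceOfElementLocated(alternative));
                } catch (TimeoutException ignored) {
                    // try the next positioning path
                }
            }
            throw primaryTimedOut; // no path located the element
        }
    }

    // Dispatch the operation command against the located element.
    static void execute(String command, WebElement element, String value) {
        switch (command) {
            case "click" -> element.click();
            case "type"  -> element.sendKeys(value);
            default      -> throw new IllegalArgumentException("unsupported command: " + command);
        }
    }
}
```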
Meanwhile, in this embodiment, the program running data of the system to be tested during the test is collected in agent or aspect (AOP) mode based on the Spring framework, where the program running data includes the program running log of the system to be tested, and the program running log records the name of the tested function, the calling method, and the entry and exit parameters.
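A minimal aspect along these lines, assuming a Spring AOP setup; the pointcut package com.example.app is a placeholder, and in practice the recorded entries would be shipped to the test server rather than only written to a local log.

```java
import java.util.Arrays;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Component;

// Records the tested function's name, entry parameters and exit parameter
// for every call in the (placeholder) application package.
@Aspect
@Component
public class RunLogAspect {

    private static final Logger log = LoggerFactory.getLogger(RunLogAspect.class);

    @Around("execution(* com.example.app..*(..))")
    public Object record(ProceedingJoinPoint jp) throws Throwable {
        String method = jp.getSignature().toShortString();
        log.info("enter {} args={}", method, Arrays.toString(jp.getArgs()));
        Object result = jp.proceed(); // invoke the tested function itself
        log.info("exit {} result={}", method, result);
        return result;
    }
}
```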
S130, calculating the test coverage rate of the user interface based on the program running data and the call relation graph of all codes of the system to be tested.
Based on the obtained program running data, the whole call chain of the tested function code, i.e. a directed call relation graph, can be obtained. Illustratively, in the directed graph data structure shown in fig. 2, a function name serves as a node, and the parameter passed between two functions having a calling relationship serves as the edge between the two nodes. Of two connected nodes, the output parameter of the upstream node is the input parameter of the downstream node. In fig. 2, the output parameter of node 1 is passed to nodes 2 and 3 as their respective input parameters, and the output parameter of node 3 is passed to node 4 as its input parameter.
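Such a graph can be kept in a simple adjacency structure; the sketch below is an illustrative assumption and reproduces the fig. 2 example, with node 1 feeding nodes 2 and 3 and node 3 feeding node 4.

```java
import java.util.LinkedHashMap;
import java.util.LinkedHashSet;
import java.util.Map;
import java.util.Set;

// Directed call graph: function names are nodes; an edge from caller to
// callee stands for the parameter passed between the two functions.
public class CallGraph {

    final Map<String, Set<String>> edges = new LinkedHashMap<>();

    void addCall(String caller, String callee) {
        edges.computeIfAbsent(caller, k -> new LinkedHashSet<>()).add(callee);
        edges.computeIfAbsent(callee, k -> new LinkedHashSet<>());
    }

    Set<String> nodes() { return edges.keySet(); }

    public static void main(String[] args) {
        CallGraph g = new CallGraph();   // the fig. 2 example:
        g.addCall("node1", "node2");     // node 1's output -> node 2's input
        g.addCall("node1", "node3");     // node 1's output -> node 3's input
        g.addCall("node3", "node4");     // node 3's output -> node 4's input
        System.out.println(g.edges);     // {node1=[node2, node3], node2=[], node3=[node4], node4=[]}
    }
}
```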
Further, the directed relation graph of the tested functions is matched against the directed relation graph of all functions to be tested, which is acquired in advance, and the matching rate between the two graphs is taken as the test coverage rate. The directed relation graph of all functions to be tested can be a syntactic relation graph determined by statically analyzing all the codes to be tested.
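The patent specifies a matching rate but not its granularity; taking node-level matching as one plausible reading, the coverage computation might be sketched as follows (edge-level matching would follow the same pattern).

```java
import java.util.Set;

// Coverage as the fraction of statically known function nodes that also
// appear in the runtime (tested) call graph; node-level matching is an
// assumption made for this sketch.
public class CoverageCalculator {

    static double coverage(Set<String> testedNodes, Set<String> allNodes) {
        if (allNodes.isEmpty()) return 0.0;
        long matched = allNodes.stream().filter(testedNodes::contains).count();
        return (double) matched / allNodes.size();
    }
}
```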
According to the technical scheme of this embodiment, the manual test operation data of each front-end page to be tested in the system to be tested is obtained by recording, and the operation data is analyzed to generate an automatic execution script; the automatic execution script is then executed to test each page to be tested, and the program running data of the system to be tested is read during the test; finally, the test coverage rate of the user interface is calculated based on the program running data and the call relation graph of all codes of the system to be tested. This solves the problems of low efficiency and low input-output ratio in testing the user interface with hand-developed test scripts, realizes a complete automated user interface testing process from test script generation and automated testing to test result measurement, improves the efficiency and input-output ratio of automated user interface testing, and reduces the possibility of errors compared with manually developed test code.
Example two
Fig. 3 is a flowchart of a user interface testing method provided in the second embodiment of the present invention, which is applicable to automatically testing a user interface and obtaining the test coverage rate during program development, and further to obtaining the call relation graph of all programs to be tested. The method may be performed by a user interface testing apparatus, which may be implemented by software and/or hardware and integrated in a server device having an application development function.
As shown in fig. 3, the user interface testing method includes the steps of:
s210, all codes of the system to be tested are obtained and analyzed, and a calling relation graph of all the codes is generated.
In this embodiment, all the codes to be tested are statically analyzed to obtain the call relation graph of all the codes. First, lexical analysis is performed on the class files in all the codes to generate abstract syntax trees of all the codes; the abstract syntax trees are then analyzed to determine the calling relations among the files, and the call relation graph of all the codes is generated based on these calling relations. The calling relations among the code classes are stored in multi-linked-list form, and finally the multi-linked list is parsed to obtain the directed code call relation graph. Alternatively, the abstract syntax trees may be browsed through an AST (Abstract Syntax Tree) view to determine the call relation graph of all the codes.
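For Java sources, one way to realize this static pass is with the JavaParser library, as sketched below; matching callees by simple name instead of using JavaParser's symbol solver is a deliberate simplification for the example.

```java
import com.github.javaparser.StaticJavaParser;
import com.github.javaparser.ast.CompilationUnit;
import com.github.javaparser.ast.body.MethodDeclaration;
import com.github.javaparser.ast.expr.MethodCallExpr;
import java.io.File;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Parses each source file into an abstract syntax tree, then records
// which methods each method body calls (by simple name only).
public class StaticCallGraphBuilder {

    static Map<String, List<String>> build(List<File> sources) throws Exception {
        Map<String, List<String>> graph = new LinkedHashMap<>();
        for (File src : sources) {
            CompilationUnit cu = StaticJavaParser.parse(src);
            for (MethodDeclaration method : cu.findAll(MethodDeclaration.class)) {
                List<String> callees = new ArrayList<>();
                for (MethodCallExpr call : method.findAll(MethodCallExpr.class)) {
                    callees.add(call.getNameAsString());
                }
                graph.put(method.getNameAsString(), callees);
            }
        }
        return graph;
    }
}
```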
In a preferred embodiment, since code modification is routine during the code testing stage, the update state of all the codes can be detected at regular intervals; if any of the codes has been updated, the call relation graph of all the codes is updated correspondingly, so that the test coverage rate can be calculated accurately in subsequent tests.
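Assuming a Spring environment with scheduling enabled (@EnableScheduling), the periodic check might look like the sketch below; fetchRevision and rebuildGraph are hypothetical hooks standing in for the version check and the analysis of S210.

```java
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class CallGraphRefresher {

    private String lastRevision = "";

    @Scheduled(fixedRate = 600_000) // check for code updates every 10 minutes
    public void refreshIfUpdated() {
        String current = fetchRevision();  // e.g. the latest VCS commit id
        if (!current.equals(lastRevision)) {
            rebuildGraph();                // regenerate the full call relation graph
            lastRevision = current;
        }
    }

    private String fetchRevision() { return ""; } // hypothetical hook
    private void rebuildGraph() { }               // hypothetical hook
}
```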
S220, acquiring manual test operation data of each page to be tested in the system to be tested, and analyzing the operation data to generate an automatic execution script.
S230, executing the automatic execution script to test each page to be tested, and reading program running data of the system to be tested during the test.
S240, calculating the test coverage rate of the user interface based on the program running data and the call relation graph of all codes of the system to be tested.
When the amount of code testable by one test script is too large, the amount of program running data is large and the resulting directed call relation graph of the tested code is large; the test script may also miss the tests of some operation events, so that the nodes in the directed call relation graph of the tested code are not completely consistent with the nodes in the directed relation graph of all the codes, and the directed call relation graph of the tested code may ultimately fail to be matched within the call relation graph of all the codes, which affects the calculation of the test coverage rate. In this case, the directed call relation graph of the tested code can be cut: according to the distribution of nodes and node branches in the tested code call relation graph, it is cut into a plurality of tested code call relation graph subgraphs. The plurality of tested code call relation graph subgraphs are then matched against the call relation graph of all the codes of the system to be tested, and the final user interface test coverage rate is determined.
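The patent fixes the goal of cutting but not the algorithm; one plausible strategy, sketched below, splits the tested graph into linear chains, starting a new subgraph at every node with multiple outgoing branches so that each small subgraph can be matched independently.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Cuts a directed call graph (adjacency map) into chain-shaped subgraphs,
// breaking at branch nodes; the cutting strategy is an illustrative assumption.
public class GraphCutter {

    static List<List<String>> cut(Map<String, Set<String>> edges, String root) {
        List<List<String>> subgraphs = new ArrayList<>();
        Deque<String> starts = new ArrayDeque<>(List.of(root));
        Set<String> visited = new HashSet<>();
        while (!starts.isEmpty()) {
            String node = starts.pop();
            if (visited.contains(node)) continue;
            List<String> chain = new ArrayList<>();
            // follow single-successor links; stop at a branch and enqueue its children
            while (node != null && visited.add(node)) {
                chain.add(node);
                Set<String> next = edges.getOrDefault(node, Set.of());
                if (next.size() == 1) {
                    node = next.iterator().next();
                } else {
                    next.forEach(starts::push);
                    node = null;
                }
            }
            if (!chain.isEmpty()) subgraphs.add(chain);
        }
        return subgraphs;
    }
}
```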
Preferably, when the call relation graphs are matched, a test coverage identifier can be set for each node in the call relation graph of all the codes. Each time a node in the two matched relation graphs is matched successfully, the value of its test coverage identifier bit is incremented by 1; finally, the nodes whose test coverage identifier bits are still 0 can be filtered out by ordering the nodes of the call relation graph of all the codes by identifier-bit value. In this way, the uncovered method nodes can be prompted. Compared with a test coverage calculation method that judges which lines of code were not executed, the calculation method in this embodiment determines the untested operation events more intuitively.
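A sketch of this identifier-bit bookkeeping, assuming nodes are identified by function name; nodes left at 0 surface as the uncovered methods to be prompted.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Sets a coverage identifier of 0 for every node of the full graph,
// increments it on each successful match, and reports the nodes still at 0.
public class CoverageFlags {

    static List<String> uncovered(Set<String> allNodes, Set<String> testedNodes) {
        Map<String, Integer> flag = new LinkedHashMap<>();
        allNodes.forEach(n -> flag.put(n, 0));
        testedNodes.forEach(n -> flag.computeIfPresent(n, (k, v) -> v + 1));
        List<String> misses = new ArrayList<>();
        flag.forEach((n, v) -> { if (v == 0) misses.add(n); });
        return misses;
    }
}
```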
According to the technical scheme of this embodiment, the call relation graph of all the codes to be tested is first obtained by static analysis in advance; the manual test operation data of each front-end page to be tested in the system to be tested is then recorded, and the operation data is analyzed to generate an automatic execution script; the automatic execution script is executed to test each page to be tested, and the program running data of the system to be tested is read during the test; finally, the test coverage rate of the user interface is calculated based on the program running data and the call relation graph of all codes of the system to be tested. This solves the problems of low efficiency and low input-output ratio in testing the user interface with hand-developed test scripts, realizes a complete automated user interface testing process from test script generation and automated testing to test result measurement, improves the efficiency and input-output ratio of automated user interface testing, and provides better test stability. Untested operation events or system functions can be determined explicitly.
The following is an embodiment of the user interface testing apparatus provided in the embodiments of the present invention. The apparatus belongs to the same inventive concept as the user interface testing method of the above embodiments and can implement that method. For details not elaborated in the apparatus embodiment, reference may be made to the above embodiments of the user interface testing method.
EXAMPLE III
Fig. 4 is a schematic structural diagram of a user interface testing apparatus according to a third embodiment of the present invention, which is applicable to performing automated user interface tests and obtaining the test coverage rate during program development.
As shown in fig. 4, the user interface testing apparatus includes a test script generating module 310, a testing module 320, and a test statistic module 330.
The test script generating module 310 is configured to obtain manual test operation data of each page to be tested in the system to be tested, and analyze the operation data to generate an automatic execution script; the test module 320 is configured to execute the automatic execution script to test each page to be tested, and read program running data of the system to be tested in a test process; and the test statistic module 330 is configured to calculate a user interface test coverage rate based on the program running data and the call relationship graph of all the codes of the system to be tested.
According to the technical scheme of this embodiment, the manual test operation data of each front-end page to be tested in the system to be tested is obtained by recording, and the operation data is analyzed to generate an automatic execution script; the automatic execution script is then executed to test each page to be tested, and the program running data of the system to be tested is read during the test; finally, the test coverage rate of the user interface is calculated based on the program running data and the call relation graph of all codes of the system to be tested. This solves the problems of low efficiency and low input-output ratio in testing the user interface with hand-developed test scripts, realizes a complete automated user interface testing process from test script generation and automated testing to test result measurement, improves the efficiency and input-output ratio of automated user interface testing, and provides better test stability.
Optionally, the test script generating module 310 is specifically configured to:
acquiring manual test operation data of each page to be tested through a page operation monitoring agent plug-in;
and matching corresponding test scripts for each operation event in the manual test data, and storing the test scripts according to a preset data structure and format to obtain the automatic execution scripts, wherein the automatic execution scripts comprise page element automatic execution scripts and interface test automatic execution scripts.
Optionally, the test module 320 is specifically configured to:
executing the automatic execution script to test the page to be tested based on the page driver and the script resolver;
program running data of the system to be tested during the test is collected in agent or aspect (AOP) mode, wherein the program running data comprises a program running log of the system to be tested.
Optionally, the test statistic module 330 is specifically configured to:
determining the name of the function called and executed in the test process and the parameter transfer relation of the called function according to the program running log;
generating a tested code calling relation graph based on the function name and the parameter transfer relation;
and matching the tested code calling relation graph with the calling relation graphs of all codes of the system to be tested, and determining the test coverage rate of the user interface.
Optionally, the test statistic module 330 is further configured to:
and taking the function name as a node, and taking the parameter transfer relationship as a connection relationship between two associated nodes to generate a directed tested code calling relationship graph.
Optionally, the user interface testing apparatus further includes a call relation clipping module, configured to:
cut the tested code call relation graph into a plurality of tested code call relation graph subgraphs according to the distribution of nodes and node branches in the tested code call relation graph;
and match the plurality of tested code call relation graph subgraphs against the call relation graph of all codes of the system to be tested to determine the test coverage rate of the user interface.
Optionally, the user interface testing apparatus further includes a complete code call relation graph prefabricating module, configured to obtain and analyze all codes of the system to be tested before testing the page to be tested, and generate a call relation graph of all the codes.
Optionally, the complete code call relation graph prefabricating module is specifically configured to:
performing lexical analysis on class files in all the codes to generate abstract syntax trees of all the codes;
analyzing the abstract syntax trees and determining the calling relations among the files;
and generating a calling relation graph of all the codes based on the calling relation.
Optionally, the user interface testing apparatus further includes a relationship diagram updating module, configured to detect the update state of all the codes at regular intervals;
and, when the codes have been updated, to update the calling relation graph of all the codes.
The user interface testing device provided by the embodiment of the invention can execute the user interface testing method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example four
Fig. 5 is a schematic structural diagram of a server according to a fourth embodiment of the present invention. Fig. 5 illustrates a block diagram of an exemplary server 12 suitable for implementing embodiments of the present invention. The server 12 shown in fig. 5 is only an example and should not impose any limitation on the function or scope of use of the embodiments of the present invention. The server 12 may be any terminal device with computing capability, such as an intelligent controller, a server or a mobile phone.
As shown in FIG. 5, the server 12 is in the form of a general purpose computing device. The components of the server 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
The server 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by server 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. The server 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in fig. 5, and commonly referred to as a "hard drive"). Although not shown in fig. 5, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. System memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in system memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
The server 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with the server 12, and/or with any devices (e.g., network card, modem, etc.) that enable the server 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, the server 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with the other modules of the server 12 via the bus 18. It should be appreciated that although not shown in FIG. 5, other hardware and/or software modules may be used in conjunction with the server 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by executing programs stored in the system memory 28, for example, to implement a user interface testing method provided by the embodiment of the present invention, the method includes:
acquiring manual test operation data of each page to be tested in a system to be tested, and analyzing the operation data to generate an automatic execution script;
executing the automatic execution script to test each page to be tested, and reading program operation data of the system to be tested in the test process;
and calculating the test coverage rate of the user interface based on the program operation data and the call relation graph of all codes of the system to be tested.
EXAMPLE five
This fifth embodiment provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements a user interface testing method according to any embodiment of the present invention, and the method includes:
acquiring manual test operation data of each page to be tested in a system to be tested, and analyzing the operation data to generate an automatic execution script;
executing the automatic execution script to test each page to be tested, and reading program operation data of the system to be tested in the test process;
and calculating the test coverage rate of the user interface based on the program operation data and the call relation graph of all codes of the system to be tested.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer-readable storage medium may be, for example but not limited to: an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It will be understood by those skilled in the art that the modules or steps of the invention described above may be implemented by a general purpose computing device, they may be centralized on a single computing device or distributed across a network of computing devices, and optionally they may be implemented by program code executable by a computing device, such that it may be stored in a memory device and executed by a computing device, or it may be separately fabricated into various integrated circuit modules, or it may be fabricated by fabricating a plurality of modules or steps thereof into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (12)

1. A user interface testing method, comprising:
acquiring manual test operation data of each page to be tested in a system to be tested, and analyzing the operation data to generate an automatic execution script;
executing the automatic execution script to test each page to be tested, and reading program running data of the system to be tested during the test;
and calculating the test coverage rate of the user interface based on the program running data and the call relation graph of all codes of the system to be tested.
2. The method of claim 1, wherein the obtaining manual test operation data of each page to be tested in the system to be tested and analyzing the operation data to generate an automatic execution script comprises:
acquiring manual test operation data of each page to be tested through a page operation monitoring agent plug-in;
and matching corresponding test scripts for each operation event in the manual test data, and storing the test scripts according to a preset data structure and format to obtain the automatic execution scripts, wherein the automatic execution scripts comprise page element automatic execution scripts and interface test automatic execution scripts.
3. The method of claim 1, wherein the executing the automatic execution script to test each page to be tested and reading program running data of the system to be tested during the test comprises:
executing the automatic execution script to test the page to be tested based on the page driver and the script resolver;
collecting, in agent or aspect (AOP) mode, program running data of the system to be tested during the test, wherein the program running data comprises a program running log of the system to be tested.
4. The method of claim 3, wherein the calculating the test coverage rate of the user interface based on the program running data and the call relation graph of all codes of the system to be tested comprises:
determining the name of the function called and executed in the test process and the parameter transfer relation of the called function according to the program running log;
generating a tested code calling relation graph based on the function name and the parameter transfer relation;
and matching the tested code calling relation graph with the calling relation graphs of all codes of the system to be tested, and determining the test coverage rate of the user interface.
5. The method of claim 4, wherein generating a tested code call relationship graph based on the function name and the parameter passing relationship comprises:
and taking the function name as a node, and taking the parameter transfer relationship as a connection relationship between two associated nodes to generate a directed tested code calling relationship graph.
6. The method of claim 5, further comprising:
according to the distribution condition of nodes and node branches in a tested code calling relation graph, cutting the tested code calling relation graph into a plurality of tested code calling relation graph subgraphs;
and matching the plurality of tested code calling relation graph subgraphs with the calling relation graphs of all codes of the system to be tested, and determining the testing coverage rate of the user interface.
7. The method of any of claims 1-6, wherein prior to performing testing of the page to be tested, the method further comprises:
and acquiring and analyzing all codes of the system to be tested, and generating a calling relation graph of all the codes.
8. The method of claim 7, wherein the obtaining and analyzing all code of the system under test to generate a call relation graph of all code comprises:
performing lexical analysis on class files in all the codes to generate abstract syntax trees of all the codes;
analyzing the abstract syntax trees and determining the calling relations among the files;
and generating a calling relation graph of all the codes based on the calling relation.
9. The method of claim 6, further comprising:
detecting the update state of all the codes at regular intervals;
and updating the calling relation graph of all the codes when the codes have been updated.
10. A user interface testing apparatus, comprising:
the test script generating module is used for acquiring manual test operation data of each page to be tested in the system to be tested and analyzing the operation data to generate an automatic execution script;
the test module is used for executing the automatic execution script to test each page to be tested and reading program running data of the system to be tested in the test process;
and the test statistical module is used for calculating the test coverage rate of the user interface based on the program running data and the call relation graph of all the codes of the system to be tested.
11. A server, characterized in that the server comprises:
one or more processors;
a memory for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the user interface testing method as claimed in any one of claims 1-9.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the user interface testing method according to any one of claims 1 to 9.
CN202110506115.8A 2021-05-10 2021-05-10 User interface testing method, device, server and medium Pending CN113190453A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110506115.8A CN113190453A (en) 2021-05-10 2021-05-10 User interface testing method, device, server and medium


Publications (1)

Publication Number Publication Date
CN113190453A (en) 2021-07-30

Family

ID=76988646

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110506115.8A Pending CN113190453A (en) 2021-05-10 2021-05-10 User interface testing method, device, server and medium

Country Status (1)

Country Link
CN (1) CN113190453A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination