CN117149638A - UI (user interface) automatic testing method and device, computer equipment and storage medium


Info

Publication number
CN117149638A
Authority
CN
China
Prior art keywords
test
testing
service process
task
requirement
Prior art date
Legal status
Pending
Application number
CN202311126238.4A
Other languages
Chinese (zh)
Inventor
李军 (Li Jun)
Current Assignee
Mgjia Beijing Technology Co ltd
Original Assignee
Mgjia Beijing Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Mgjia Beijing Technology Co ltd filed Critical Mgjia Beijing Technology Co ltd
Priority to CN202311126238.4A
Publication of CN117149638A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3688 - Test management for test execution, e.g. scheduling of test suites
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3692 - Test management for test results analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention relates to the technical field of computers, and discloses a UI (user interface) automated testing method and device, computer equipment, and a storage medium. After receiving the file package transmitted by the test end, the device under test first checks the survival state of the service process to judge whether it meets the precondition for automated testing. When the service process is in a survival state, the device under test can disconnect from the test end and receive the test requirement, so that the service process completes the automated testing process based on that requirement. In this way, the method avoids a long-lived connection to the test end, solves the problem that testing cannot continue once the connection is broken, and achieves offline execution while reducing the hardware occupied during execution.

Description

UI (user interface) automatic testing method and device, computer equipment and storage medium
Technical Field
The invention relates to the technical field of computers, in particular to a UI (user interface) automatic test method, a UI automatic test device, computer equipment and a storage medium.
Background
Android runs on diverse hardware, such as mobile phones, tablets, televisions, in-vehicle head units, robots, and other smart hardware. In practice, during system-level automated testing, the device under test often cannot stay connected through a data cable or a network for long periods. Once the connection to the device under test is lost, the test cannot continue.
How to avoid this situation and still complete the automated system test is therefore a problem that needs to be solved.
Disclosure of Invention
In view of the above, the present invention provides a UI automated testing method and apparatus, a computer device, and a storage medium, so as to solve the problem that a test cannot be completed because the device under test cannot remain connected for a long time.
In a first aspect, the present invention provides a UI automation test method, the method being performed by a device under test, the method comprising:
receiving a file package transmitted by a test end, wherein the file package comprises a test application; checking the survival state of a service process; when the service process is in a survival state, disconnecting from the test end and receiving a test requirement, wherein the test requirement is generated by the test application; and based on the test requirement, completing the automated test of the device under test by using the service process.
According to the UI automated testing method provided by this embodiment, after the file package transmitted by the test end is received, whether the device under test meets the precondition for automated testing is judged by checking the survival state of the service process. When the service process is in a survival state, the device under test can disconnect from the test end and receive the test requirement, so that the service process completes the automated testing process based on that requirement. In this way, the embodiment avoids a long-lived connection to the test end, solves the problem that testing cannot continue once the connection is broken, and achieves offline execution while reducing the hardware occupied during execution.
In an alternative embodiment, the method further comprises:
when the service process is in a non-survival state, connection is established with the test end so as to start the service process.
In an alternative embodiment, the automatic test of the device under test is accomplished by using a service process based on the test requirements, comprising:
performing test task configuration based on the test requirement to generate a test configuration file; extracting task configuration parameters from the test configuration file; judging whether the current test task is an interrupted test task or not based on the task configuration parameters, and obtaining a judging result; and automatically testing the equipment to be tested based on the judgment result.
In an optional implementation, the file package further includes a parsing script, a first configuration file, and a second configuration file, and performing the automated test on the device under test based on the judgment result includes:
when the judgment result is that the current test task is not an interrupted test task, parsing the first configuration file by using the parsing script to obtain dependent functions and variables; parsing the second configuration file by using the parsing script to obtain a use case execution list; and running each use case according to the use case execution list by using the service process, in combination with the dependent functions and variables, so as to complete the automated test of the device under test.
In an alternative embodiment, when the current test task is an interrupted test task, before the parsing script is used to parse the first configuration file to obtain the dependent functions and variables, the method further includes:
acquiring the execution result recorded at the time the interruption occurred; and updating the initial execution state of the automated test based on that execution result.
With the UI automated testing method provided by this embodiment, execution can resume from the point of interruption after the service process is abnormally interrupted and restarted, which avoids the resource cost of rerunning from the beginning, saves test time, and improves test efficiency.
In an alternative embodiment, after the automated testing of the device under test is completed with the service process based on the test requirements, the method further comprises:
and generating prompt information, and displaying the prompt information on an interactive interface of the test application.
In an alternative embodiment, after generating the prompt information and displaying the prompt information on the interactive interface of the test application, the method further includes:
when a query instruction is received, test data, test logs and test results of an automatic test are obtained; and processing the test data, the test log and the test result, and displaying the processing result on the interactive interface.
According to the embodiment, all the test data, the test logs and the test results are processed, and the processing results are displayed on the interactive interface, so that a tester can analyze the test process based on the processing results.
In a second aspect, the present invention provides a UI automation test device, the device comprising:
the first receiving module is used for receiving a file package transmitted by the test end, wherein the file package comprises a test application; the checking module is used for checking the survival state of the service process; the second receiving module is used for disconnecting from the test end when the service process is in a survival state and receiving a test requirement, wherein the test requirement is generated by the test application; and the testing module is used for completing the automated test of the device under test by using the service process based on the test requirement.
In a third aspect, the present invention provides a computer device, comprising a memory and a processor that are communicatively connected, wherein the memory stores computer instructions, and the processor executes the computer instructions to perform the UI automated testing method of the first aspect or any corresponding implementation thereof.
In a fourth aspect, the present invention provides a computer readable storage medium having stored thereon computer instructions for causing a computer to perform the UI automation test method of the first aspect or any of its corresponding embodiments.
Drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. The drawings described below show only some embodiments of the present invention; a person skilled in the art may derive other drawings from them without inventive effort.
FIG. 1 is a flow diagram of a UI automation test method in accordance with an embodiment of the invention;
FIG. 2 is a flow diagram of another UI automation test method in accordance with an embodiment of the invention;
FIG. 3 is a flow diagram of yet another UI automation test method in accordance with an embodiment of the invention;
FIG. 4 is a block diagram of a UI automation test apparatus according to an embodiment of the invention;
fig. 5 is a schematic diagram of a hardware structure of a computer device according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are some, but not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on these embodiments without inventive effort fall within the protection scope of the present invention.
In conventional automated system testing, the test end and the device under test must remain connected throughout. Once the connection is broken, the test cannot be completed. To solve this problem, an embodiment of the present invention provides a UI automated testing method in which the device under test only needs to be connected to the test end for a short time: after receiving the file package transmitted by the test end and establishing the connection, the device under test starts its own service process, and it can then disconnect from the test end. When the device under test receives a test requirement, the automated test is completed by the service process. This embodiment avoids a long-lived connection to the test end, solves the problem that testing cannot be completed once the connection is broken, and achieves the effect of completing the test without a persistent connection.
In accordance with an embodiment of the present invention, a UI automated testing method embodiment is provided. It should be noted that the steps shown in the flowcharts of the figures may be performed in a computer system such as a set of computer-executable instructions, and although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from the one given here.
In this embodiment, a UI automated testing method is provided, which may be used on any Android hardware device, such as a mobile phone, a watch, a television, or an in-vehicle head unit. FIG. 1 is a flowchart of the UI automated testing method according to an embodiment of the present invention; as shown in FIG. 1, the flow includes the following steps:
step S101, a file packet transmitted by a test end is received.
Specifically, the test end is connected to the device under test through the Android Debug Bridge (adb). The file package includes a test application, that is, the application program and files used to execute the test functions.
For example, after the device under test establishes a connection with the test end and receives the file package transmitted by the test end, the test application is installed on the device under test; the device only needs to receive the test application and does not need to respond during installation.
Illustratively, in addition to the test application (i.e., test.apk), the package includes dependency files: other test applications (i.e., a custom 2.0 apk and actions.apk), Android-side command-line tool executables (am, pm, dumpsys, etc.), shell parsing scripts, and the configuration files relied on during the test. The configuration files comprise a first configuration file with the extension .sl and a second configuration file with the extension .tc; the first configuration file contains the basic functions and variables relied on while executing the use cases, and the second configuration file contains the test cases. The dependency files described in this embodiment provide the file foundation for the testing process.
Step S102, checking the survival state of the service process.
In particular, there are two kinds of surviving states of the service process, namely, a surviving state and a non-surviving state. The manner of checking the survival state is not particularly limited herein, and may be selected by one skilled in the art according to the actual circumstances.
Illustratively, in this embodiment, the survival state of the service process is determined as follows. When the device under test is an Android device, a background process has a process number (pid). Because Android runs on the Linux kernel, processes follow the /proc pseudo-filesystem convention: $pid is the process number, and /proc/$pid/cmdline is the file containing the command line and its parameters. Therefore, when the directory /proc/$pid exists, the service process is determined to be in a surviving state.
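A minimal Kotlin sketch of this check follows; the /proc/$pid rule is taken from the description above, while the pid-file path and helper names are illustrative assumptions rather than part of the patent:

```kotlin
import java.io.File

// Returns true when the process with the given pid is alive, by checking
// whether its /proc/<pid> directory exists (the /proc rule described above).
fun isProcessAlive(pid: Int): Boolean = File("/proc/$pid").isDirectory

fun main() {
    // Illustrative assumption: the service process recorded its pid in a
    // known file when it was started; the path below is not from the patent.
    val pidFile = File("/sdcard/ui_test/service.pid")
    val alive = pidFile.takeIf { it.exists() }
        ?.readText()
        ?.trim()
        ?.toIntOrNull()
        ?.let { isProcessAlive(it) } ?: false
    println(if (alive) "service process is alive" else "service process is not alive")
}
```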
Step S103, when the service process is in a survival state, the device under test disconnects from the test end and receives the test requirement.
Specifically, when the service process is in a surviving state, the device under test meets the precondition for testing, and it can be disconnected from the test end. When the test requirement generated by the test application is received, the test can start.
Illustratively, in another alternative embodiment, when the service process is in a non-surviving state, a connection is established with the testing end to initiate the service process.
Specifically, when the service process is in a non-survival state, the device under test does not yet meet the precondition for testing; to run the test, the device must establish a connection with the test end so that the service process can be started, after which the test can begin.
Step S104, based on the test requirement, the service process is utilized to complete the automatic test of the device to be tested.
Specifically, the service process executes the test cases to complete the automated testing of the device under test.
According to the UI automated testing method provided by this embodiment, after the file package transmitted by the test end is received, whether the device under test meets the precondition for automated testing is judged by checking the survival state of the service process. When the service process is in a survival state, the device under test can disconnect from the test end and receive the test requirement, so that the service process completes the automated testing process based on that requirement. In this way, the embodiment avoids a long-lived connection to the test end, solves the problem that testing cannot continue once the connection is broken, and achieves offline execution while reducing the hardware occupied during execution.
In this embodiment, a UI automated testing method is provided, which may be used on any Android hardware device, such as a mobile phone, a watch, a television, or an in-vehicle head unit. FIG. 2 is a flowchart of the UI automated testing method according to an embodiment of the present invention; as shown in FIG. 2, the flow includes the following steps:
step S201, a file packet transmitted by a test end is received. Please refer to step S101 in the embodiment shown in fig. 1 in detail, which is not described herein.
Step S202, checking the survival status of the service process. Please refer to step S102 in the embodiment shown in fig. 1 in detail, which is not described herein.
Step S203, when the service process is in a survival state, the service process is disconnected with the testing terminal, and the testing requirement is received. Please refer to step S103 in the embodiment shown in fig. 1 in detail, which is not described herein.
Step S204, based on the test requirement, the service process is utilized to complete the automatic test of the device to be tested.
Specifically, the step S204 includes:
step S2041, test task configuration is performed based on the test requirements, and a test configuration file is generated.
Specifically, each test requirement corresponds to a test task, and each test task has its own test configuration file. The test configuration file comprises the task parameter configuration, the loop configuration, and the parallel background script configuration.
The task configuration parameters are used for uniquely identifying the test tasks, namely, the task configuration parameters are in one-to-one correspondence with the test configuration files.
The loop configuration comprises a loop type and a loop value; the loop type is either a loop count or a loop duration, and the loop value corresponds to the loop type. Suppose, for example, that the use cases corresponding to a test requirement are use case a, use case b, and use case c. When the configured loop type is loop count and the loop value is 10, each use case is run 10 times in sequence: use case a is run 10 times, then use case b 10 times, and finally use case c 10 times, after which the run ends. When the configured loop type is loop duration and the loop value is 30 seconds, use case b starts 30 seconds after use case a starts, use case c starts another 30 seconds later, and the run ends 30 seconds after that (a sketch of the two loop modes follows below).
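The following Kotlin sketch illustrates the two loop modes described above; the class and function names, and the reading of the duration mode as one fixed time slot per use case, are assumptions for illustration only:

```kotlin
// Loop configuration as described above: either a fixed number of runs per
// case, or a fixed time slot per case. Names and structure are illustrative.
sealed class LoopConfig {
    data class ByCount(val times: Int) : LoopConfig()
    data class ByDurationSeconds(val seconds: Long) : LoopConfig()
}

fun runWithLoop(cases: List<String>, loop: LoopConfig, runCase: (String) -> Unit) {
    for (case in cases) {
        when (loop) {
            is LoopConfig.ByCount ->
                repeat(loop.times) { runCase(case) }           // e.g. case a 10 times, then case b, then case c
            is LoopConfig.ByDurationSeconds -> {
                val slotEnd = System.currentTimeMillis() + loop.seconds * 1000
                runCase(case)                                   // each case gets one fixed-length time slot
                val remaining = slotEnd - System.currentTimeMillis()
                if (remaining > 0) Thread.sleep(remaining)      // wait out the slot before the next case
            }
        }
    }
}

fun main() {
    runWithLoop(listOf("a", "b", "c"), LoopConfig.ByCount(10)) { println("running case $it") }
}
```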
The background script configuration lists other tasks that need to run while the current test task is running, for example a performance-monitoring script that runs during the test to collect performance data while the device operates. Configuring background scripts adds testing capabilities such as performance testing and load/stress testing, extending the method beyond UI automation testing alone.
Step S2042, extracting task configuration parameters from the test configuration file.
Specifically, the test configuration file is parsed according to a preset parsing rule, and the task configuration parameters are extracted from the parsed file. It should be noted that the task configuration parameters identify the test configuration file.
For example, the configuration file is parsed by the parsing script according to a preset parsing rule; the rule may split each line on tab separators into numbered columns, with different rules mapping particular column numbers, or the English name of a designated column, to configuration items.
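A Kotlin sketch of such a tab-separated parsing rule is given below; the file layout and column names are assumed for illustration and are not the patent's actual format:

```kotlin
import java.io.File

// Illustrative sketch of the tab-separated parsing rule described above: each
// line is split on tabs, and values are looked up either by column number or
// by the English name of a designated header column.
fun parseTabSeparated(file: File): List<Map<String, String>> {
    val lines = file.readLines().filter { it.isNotBlank() && !it.startsWith("#") }
    if (lines.isEmpty()) return emptyList()
    val header = lines.first().split("\t")                     // English column names from the first row
    return lines.drop(1).map { line ->
        val cells = line.split("\t")
        header.indices.associate { i -> header[i] to cells.getOrElse(i) { "" } }
    }
}

fun main() {
    val tmp = File.createTempFile("cases", ".tc").apply {
        writeText("case_name\tloop_type\tloop_value\nx001\tcount\t10\n")
        deleteOnExit()
    }
    parseTabSeparated(tmp).forEach(::println)                   // {case_name=x001, loop_type=count, loop_value=10}
}
```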
Step S2043, based on the task configuration parameters, judge whether the current test task is an interrupted test task, and obtain a judgment result.
Specifically, it is judged whether historical test data corresponding to the task configuration parameters exists at the preset storage location. If such data exists, the current test task has been started but not completed, i.e. the current test task is an interrupted test task. If no historical test data corresponding to the task configuration parameters exists at the preset storage location, the current test task has not been started, i.e. it is not an interrupted test task. The preset storage location stores the test data generated during the test.
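A Kotlin sketch of this interruption check, assuming one result directory per task identifier at the preset storage location (the paths and names are illustrative):

```kotlin
import java.io.File

// Sketch of the interruption check described above: if the preset storage
// location already holds test data keyed by this task's configuration
// parameter (its unique identifier), the task was started earlier and is
// therefore an interrupted test task.
fun isInterruptedTask(taskId: String, storageRoot: File = File("/sdcard/ui_test/results")): Boolean {
    val taskDir = File(storageRoot, taskId)
    return taskDir.isDirectory && taskDir.listFiles()?.isNotEmpty() == true
}

fun main() {
    println(isInterruptedTask("task_20230901_001"))   // hypothetical task id
}
```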
Step S2044, automatically testing the device under test based on the judgment result.
In some alternative embodiments, step S2044 includes:
and a step a1 of analyzing the first configuration file by utilizing an analysis script to obtain a dependent function and a variable when the judgment result is that the current test task is not the test task after interruption.
Specifically, the parsing scheme used by the parsing script for the first configuration file is the same as that in step S2042. The first configuration file has the extension .sl and stores the dependent functions and variables required while a use case runs.
Step a2, parsing the second configuration file by using the parsing script to obtain a use case execution list.
Specifically, the second configuration file contains all use cases required for the test. After the second configuration file is parsed by the parsing script, the use cases corresponding to the current test requirement are extracted from the parsed file according to a preset mapping between test requirements and use cases. A use case execution list is then generated according to the loop configuration in the test configuration file, and each use case in the list is given a corresponding file name with the extension .tc. It should be noted that the mapping between test requirements and use cases is constructed by a person skilled in the art before the test, based on the test requirements, and may be stored in advance in the memory of the device under test.
For example, according to the mapping between test requirements and use cases, the use cases corresponding to the current test requirement are determined to be use case a, use case b, and use case c. A use case execution list is generated according to the configured loop count; with a loop count of 10, the list contains 30 entries, and each entry is given a file name that identifies the use case. It should be noted that, during the run, each use case is handled in a "burn after reading" manner: as soon as a use case finishes executing, its test data is written to the preset storage location immediately, and the use case is then deleted from the execution list (a sketch of this mode follows below).
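A Kotlin sketch of this "burn after reading" mode, assuming one result file per use case at the preset storage location (paths, file naming, and the result format are illustrative assumptions):

```kotlin
import java.io.File

// Sketch of the "burn after reading" mode described above: after each case
// finishes, its test data is written to the preset storage location at once
// and the case is removed from the execution list, so the remaining list is
// always exactly what still has to run.
fun runCases(executionList: MutableList<String>, storageDir: File, runCase: (String) -> String) {
    storageDir.mkdirs()
    while (executionList.isNotEmpty()) {
        val caseFile = executionList.first()                            // e.g. "x001.tc"
        val testData = runCase(caseFile)                                // execute the case via the service process
        File(storageDir, caseFile.removeSuffix(".tc") + ".result").writeText(testData)
        executionList.removeAt(0)                                       // "burn" the case once its data is persisted
    }
}

fun main() {
    val list = MutableList(3) { "x%03d.tc".format(it + 1) }             // x001.tc, x002.tc, x003.tc
    runCases(list, File("results")) { "PASS $it" }
}
```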
Step a3, running each use case according to the use case execution list by using the service process, in combination with the dependent functions and variables, so as to complete the automated test of the device under test.
Specifically, during the run, the service process executes each use case in the execution list in sequence, loading the dependent functions and variables for each use case as it runs. The automated test of the device under test ends once all use cases in the execution list have been executed.
Meanwhile, while the service process runs the use cases, a corresponding process is started according to the parallel background script configuration recorded in step S2041; this process runs in parallel with the service process so as to collect performance data during the run.
In some alternative embodiments, when the current test task is a test task after interruption, before the step a1, the method further includes:
and b1, acquiring a corresponding execution result when the interrupt occurs.
Specifically, when the current test task is a post-interrupt retest task, the last piece of test data is obtained from the preset storage location, and the file name corresponding to the last piece of test data is used for determining the use case (namely, the execution result corresponding to the interrupt) which completes the operation when the interrupt occurs.
Illustratively, one piece of test data is generated each time a use case is executed. Suppose the execution list contains 30 use cases with file names x001.tc, x002.tc, ..., x030.tc. If the service process is interrupted by an exception after the 14th use case has finished and while the 15th is running, then after the service restarts, the last piece of test data is read from the preset storage location and the use case file name x014.tc is extracted from it. From this file name it is determined that the 14th use case had finished when the interruption occurred.
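A Kotlin sketch of recovering the resume point, assuming the per-case result files of the previous sketch (the file naming is an illustrative assumption):

```kotlin
import java.io.File

// Sketch of recovering the resume point after an abnormal interruption, as
// described above: the most recent test-data record in the preset storage
// location names the last case that finished, so execution restarts at the
// case that follows it.
fun resumeIndex(executionList: List<String>, storageDir: File): Int {
    val lastResult = storageDir.listFiles { f -> f.extension == "result" }
        ?.maxByOrNull { it.lastModified() }
        ?: return 0                                             // nothing recorded yet: start from the first case
    val lastCase = lastResult.nameWithoutExtension + ".tc"      // e.g. "x014.tc" was the last completed case
    val idx = executionList.indexOf(lastCase)
    return if (idx >= 0) idx + 1 else 0                         // resume at the 15th case in the example above
}

fun main() {
    val list = (1..30).map { "x%03d.tc".format(it) }
    println("resume at index ${resumeIndex(list, File("results"))}")
}
```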
Step b2, updating the initial execution state of the automated test based on the execution result.
Illustratively, continuing the example in step b1, the execution result at the time of the interruption is that the 14th use case had been run to completion. The 15th use case in the execution list can therefore be used directly as the initial use case (i.e., the initial execution state) of the automated test, and when the run starts, it starts directly from the 15th use case.
With the UI automated testing method provided by this embodiment, execution can resume from the point of interruption after the service process is abnormally interrupted and restarted, which avoids the resource cost of rerunning from the beginning, saves test time, and improves test efficiency.
In this embodiment, a UI automated testing method is provided, which may be used on any Android hardware device, such as a mobile phone, a watch, a television, or an in-vehicle head unit. FIG. 3 is a flowchart of the UI automated testing method according to an embodiment of the present invention; as shown in FIG. 3, the flow includes the following steps:
step S301, a file packet transmitted by the testing end is received.
Step S302, checking the survival status of the service process.
Step S303, when the service process is in a surviving state, disconnecting from the test end and receiving the test requirement.
Step S304, based on the test requirement, the service process is utilized to complete the automatic test of the device to be tested.
After the step S304, the method further includes:
step S305, generating prompt information, and displaying the prompt information on the interactive interface of the test application.
Specifically, the prompt information notifies the tester that the test has completed; displaying it on the interactive interface of the test application lets the tester keep track of the test progress and check the test result.
Step S306, when a query instruction is received, test data, test logs and test results of the automatic test are obtained.
Specifically, after each use case finishes, the test data, test log, and test result for that use case are generated. When a query instruction is received, the test data, test logs, and test results for every use case in the automated test are obtained from the preset storage location.
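A Kotlin sketch of reducing the stored per-case records to a summary for the interactive interface; the result-file format (text beginning with PASS or FAIL) is an assumption, not the patent's format:

```kotlin
import java.io.File

// Sketch of the query handling described above: when a query instruction is
// received, the per-case records in the preset storage location are collected
// and reduced to a simple summary for display on the interactive interface.
fun summarizeResults(storageDir: File): String {
    val results = storageDir.listFiles { f -> f.extension == "result" }.orEmpty()
    val passed = results.count { it.readText().trim().startsWith("PASS") }
    return "cases: ${results.size}, passed: $passed, failed: ${results.size - passed}"
}

fun main() {
    println(summarizeResults(File("results")))
}
```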
Step S307, processing the test data, the test log and the test result, and displaying the processing result on the interactive interface.
Specifically, the test data, the test log and the test result are respectively analyzed and processed, and the obtained processing results are respectively displayed. The manner of the analysis processing in this embodiment is not particularly limited, and those skilled in the art can select according to actual display requirements.
According to the embodiment, all the test data, the test logs and the test results are subjected to statistical processing, and the processing results are displayed on the interactive interface, so that a tester can analyze the test process based on the processing results.
In this embodiment, a UI automation testing device is further provided, and the device is used to implement the foregoing embodiments and preferred embodiments, and will not be described in detail. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. While the means described in the following embodiments are preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
The present embodiment provides a UI automation testing device, as shown in fig. 4, including:
the first receiving module 401 is configured to receive a file packet transmitted by the testing end, where the file packet includes a testing application.
A checking module 402, configured to check a survival status of the service process.
The second receiving module 403 is configured to disconnect from the testing terminal when the service process is in a surviving state, and receive a testing requirement, where the testing requirement is generated by the testing application.
And the testing module 404 is configured to complete an automated test on the device under test by using the service process based on the testing requirement.
In some alternative embodiments, the apparatus further comprises:
and the connection module is used for establishing connection with the test end when the service process is in a non-survival state so as to start the service process.
In some alternative embodiments, the test module 404 includes:
and the generating sub-module is used for carrying out test task configuration based on the test requirement and generating a test configuration file.
And the extraction submodule is used for extracting task configuration parameters from the test configuration file.
And the judging sub-module is used for judging whether the current test task is the test task after interruption based on the task configuration parameters, and obtaining a judging result.
And the testing sub-module is used for automatically testing the equipment to be tested based on the judging result.
In some optional embodiments, the file package in the first receiving module 401 further includes a parsing script, a first configuration file, and a second configuration file, and the testing sub-module includes:
and the first analysis unit is used for analyzing the first configuration file by utilizing an analysis script to obtain a dependent function and a variable when the judgment result is that the current test task is not the test task after interruption.
And the second analysis unit is used for analyzing the second configuration file by using the analysis script to obtain a use case execution list.
And the operation unit is used for operating each use case according to the use case execution list by utilizing the service process in combination with the dependent function and the variable so as to complete the automatic test of the equipment to be tested.
In some optional embodiments, when the judging sub-module determines that the current test task is an interrupted test task, the testing sub-module further includes, before the first parsing unit:
the acquisition unit is used for acquiring the corresponding execution result when the interrupt occurs.
And the updating unit is used for updating the initial execution state of the automatic test based on the execution result.
In some alternative embodiments, after the test module 404, the apparatus further comprises:
and the generating module is used for generating prompt information and displaying the prompt information on the interactive interface of the test application.
In some alternative embodiments, after the generating module, the apparatus further comprises:
and the acquisition module is used for acquiring test data, test logs and test results of the automatic test when receiving the query instruction.
And the storage module is used for processing the test data, the test log and the test result and displaying the processing result on the interactive interface.
Further functional descriptions of the above respective modules and units are the same as those of the above corresponding embodiments, and are not repeated here.
The UI automation testing device in this embodiment is presented in the form of functional units, where a unit refers to an ASIC (Application-Specific Integrated Circuit), a processor and memory executing one or more software or firmware programs, and/or other devices that can provide the functionality described above.
An embodiment of the present invention further provides a computer device equipped with the UI automated testing apparatus shown in FIG. 4.
Referring to FIG. 5, FIG. 5 is a schematic structural diagram of a computer device according to an alternative embodiment of the present invention. As shown in FIG. 5, the computer device includes one or more processors 10, a memory 20, and interfaces for connecting the various components, including high-speed and low-speed interfaces. The components are communicatively coupled to one another over different buses and may be mounted on a common motherboard or in other manners as required. The processor may process instructions executed within the computer device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output device, such as a display device coupled to an interface. In some alternative embodiments, multiple processors and/or multiple buses may be used together with multiple memories, if desired. Likewise, multiple computer devices may be connected, each providing part of the necessary operations (for example, as a server array, a group of blade servers, or a multiprocessor system). One processor 10 is illustrated in FIG. 5.
The processor 10 may be a central processor, a network processor, or a combination thereof. The processor 10 may further include a hardware chip, among others. The hardware chip may be an application specific integrated circuit, a programmable logic device, or a combination thereof. The programmable logic device may be a complex programmable logic device, a field programmable gate array, a general-purpose array logic, or any combination thereof.
The memory 20 stores instructions executable by the at least one processor 10, so as to cause the at least one processor 10 to perform the methods of the above embodiments.
The memory 20 may include a storage program area that may store an operating system, at least one application program required for functions, and a storage data area; the storage data area may store data created according to the use of the computer device, etc. In addition, the memory 20 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some alternative embodiments, memory 20 may optionally include memory located remotely from processor 10, which may be connected to the computer device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Memory 20 may include volatile memory, such as random access memory; the memory may also include non-volatile memory, such as flash memory, hard disk, or solid state disk; the memory 20 may also comprise a combination of the above types of memories.
The computer device also includes a communication interface 30 for the computer device to communicate with other devices or communication networks.
An embodiment of the present invention further provides a computer-readable storage medium. The method according to the above embodiments may be implemented in hardware or firmware, or as computer code that can be recorded on a storage medium, or as code originally stored on a remote storage medium or a non-transitory machine-readable storage medium, downloaded over a network, and stored on a local storage medium, so that the method described herein can be processed as such software on a storage medium by a general-purpose computer, a dedicated processor, or programmable or dedicated hardware. The storage medium can be a magnetic disk, an optical disc, a read-only memory, a random access memory, a flash memory, a hard disk, a solid-state disk, or the like; further, the storage medium may also comprise a combination of such memories. It will be appreciated that a computer, processor, microprocessor controller, or programmable hardware includes a storage element that can store or receive software or computer code which, when accessed and executed by the computer, processor, or hardware, implements the methods illustrated by the above embodiments.
Although embodiments of the present invention have been described in connection with the accompanying drawings, various modifications and variations may be made by those skilled in the art without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope of the invention as defined by the appended claims.

Claims (10)

1. A UI automation test method, wherein the method is performed by a device under test, the method comprising:
receiving a file packet transmitted by a test end, wherein the file packet comprises a test application;
checking the survival state of the service process;
when the service process is in a survival state, disconnecting from the test end and receiving a test requirement, wherein the test requirement is generated by the test application;
and based on the test requirement, completing the automated test of the device under test by using the service process.
2. The method according to claim 1, wherein the method further comprises:
and when the service process is in a non-survival state, establishing connection with the test end so as to start the service process.
3. The method according to claim 1 or 2, wherein said performing an automated test of said device under test with said service process based on said test requirements comprises:
performing test task configuration based on the test requirement to generate a test configuration file;
extracting task configuration parameters from the test configuration file;
judging whether the current test task is a test task after interruption or not based on the task configuration parameters, and obtaining a judging result;
and automatically testing the equipment to be tested based on the judging result.
4. The method of claim 3, wherein the file package further includes a parsing script, a first configuration file, and a second configuration file, and wherein automatically testing the device under test based on the judgment result includes:
when the judgment result is that the current test task is not the test task after interruption, parsing the first configuration file by using the parsing script to obtain a dependent function and a variable;
parsing the second configuration file by using the parsing script to obtain a use case execution list;
and running each use case according to the use case execution list by using the service process, in combination with the dependent functions and the variables, so as to complete the automated test of the device under test.
5. The method of claim 4, wherein when the current test task is the test task after interruption, before parsing the first configuration file by using the parsing script to obtain the dependent function and the variable, the method further comprises:
acquiring a corresponding execution result when interruption occurs;
and updating the initial execution state of the automatic test based on the execution result.
6. The method of claim 1, wherein after the automated testing of the device under test with the service process is completed based on the test requirements, the method further comprises:
and generating prompt information, and displaying the prompt information on an interactive interface of the test application.
7. The method of claim 6, wherein after the prompt information is generated and displayed on the interactive interface of the test application, the method further comprises:
when a query instruction is received, test data, test logs and test results of an automatic test are obtained;
and processing the test data, the test log and the test result, and displaying the processing result on an interactive interface.
8. A UI automation test device, the device comprising:
the first receiving module is used for receiving a file packet transmitted by the testing end, wherein the file packet comprises a testing application;
the checking module is used for checking the survival state of the service process;
the second receiving module is used for disconnecting from the testing end when the service process is in a survival state, and receiving a testing requirement, wherein the testing requirement is generated by the testing application;
and the testing module is used for completing automatic testing of the equipment to be tested by utilizing the service process based on the testing requirement.
9. A computer device, comprising:
a memory and a processor, the memory and the processor being communicatively connected to each other, the memory having stored therein computer instructions, the processor executing the computer instructions to perform the UI automated test method of any one of claims 1 to 7.
10. A computer-readable storage medium having stored thereon computer instructions for causing a computer to perform the UI automation test method of any one of claims 1 to 7.
CN202311126238.4A 2023-09-01 2023-09-01 UI (user interface) automatic testing method and device, computer equipment and storage medium Pending CN117149638A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311126238.4A CN117149638A (en) 2023-09-01 2023-09-01 UI (user interface) automatic testing method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311126238.4A CN117149638A (en) 2023-09-01 2023-09-01 UI (user interface) automatic testing method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117149638A 2023-12-01

Family

ID=88911489

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311126238.4A Pending CN117149638A (en) 2023-09-01 2023-09-01 UI (user interface) automatic testing method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117149638A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105487966A (en) * 2014-09-17 2016-04-13 腾讯科技(深圳)有限公司 Program testing method, device and system
CN106021095A (en) * 2016-05-06 2016-10-12 北京邮电大学 A push mechanism-based Android application automated testing method
US10038711B1 (en) * 2017-01-30 2018-07-31 XM Ltd. Penetration testing of a networked system
CN113760704A (en) * 2020-09-16 2021-12-07 北京沃东天骏信息技术有限公司 Web UI (user interface) testing method, device, equipment and storage medium
CN115118646A (en) * 2022-06-29 2022-09-27 苏州浪潮智能科技有限公司 Data interaction method and device for switch test system and electronic equipment
CN115454869A (en) * 2022-09-21 2022-12-09 中国平安人寿保险股份有限公司 Interface automation test method, device, equipment and storage medium
CN115509913A (en) * 2022-09-27 2022-12-23 上海易景信息科技有限公司 Software automation test method, device, machine readable medium and equipment

Similar Documents

Publication Publication Date Title
CN107451040B (en) Method and device for positioning fault reason and computer readable storage medium
CN109510742B (en) Server network card remote test method, device, terminal and storage medium
CN111382048B (en) Management method and device for mobile equipment on true machine test platform
CN104809045A (en) Operation method and device of monitoring script
CN110659198A (en) Application program test case execution method and device and software test system
CN110990289B (en) Method and device for automatically submitting bug, electronic equipment and storage medium
CN103731663A (en) Method and device for testing smart television
CN112948190A (en) Hardware testing method, system and related device of server
CN112115055A (en) Multi-machine automatic testing method and device and computer equipment
CN112068852A (en) Method, system, equipment and medium for installing open source software based on domestic server
CN112269697B (en) Equipment storage performance testing method, system and related device
CN113315675A (en) White box switch U-Boot automatic testing method, system and storage medium
CN113127329B (en) Script debugging method and device and computer storage medium
CN117149638A (en) UI (user interface) automatic testing method and device, computer equipment and storage medium
CN110955601A (en) Android platform stability-based automatic testing method and device and electronic equipment
CN115470141A (en) Fault simulation method, device and related equipment
CN115145381A (en) Method, system, storage medium and equipment for remotely resetting BMC chip
CN111026667B (en) Script execution method and device and electronic equipment
CN113986263A (en) Code automation test method, device, electronic equipment and storage medium
CN116382968B (en) Fault detection method and device for external equipment
CN112560041B (en) Method, apparatus and computer storage medium for automated quality verification detection
CN116303067B (en) Testing method, device, equipment and medium based on cloud testing platform
CN115098403A (en) Network environment checking and repairing method, system, device and medium for testing machine
CN110347409B (en) Automatic control method, client and server
CN115794642A (en) Test environment deployment method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination