CN111143200A - Method and device for recording and playing back touch event, storage medium and equipment - Google Patents

Method and device for recording and playing back touch event, storage medium and equipment

Info

Publication number
CN111143200A
CN111143200A (Application CN201911288956.5A)
Authority
CN
China
Prior art keywords
event
recording
operation event
touch
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911288956.5A
Other languages
Chinese (zh)
Inventor
张德恒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Huaduo Network Technology Co Ltd
Original Assignee
Guangzhou Huaduo Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Huaduo Network Technology Co Ltd filed Critical Guangzhou Huaduo Network Technology Co Ltd
Priority to CN201911288956.5A priority Critical patent/CN111143200A/en
Publication of CN111143200A publication Critical patent/CN111143200A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Abstract

The method comprises: generating a transparent window placed on the uppermost layer; during recording, intercepting the user's operation events through this window and recording their related information; and during playback, simulating the corresponding touch events based on the recorded information to reproduce the user's operations, so that repetitive manual operation is eliminated.

Description

Method and device for recording and playing back touch event, storage medium and equipment
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method, an apparatus, a storage medium, and a device for recording and playing back a touch event.
Background
With the development of the mobile phone market, the group of Android phone users keeps growing and phones are used more and more heavily. In some fields, such as the testing field, people frequently perform large numbers of repetitive operations on a phone; this work is tedious and wastes manpower. A processing scheme that can record user operations and replace manual repetition with playback is therefore desired in the market.
At present, user operations on Android systems are recorded and played back mainly in two ways: through the MonkeyRunner tool, or based on Instrumentation. MonkeyRunner is a coordinate-based Android black-box automated testing tool that provides a set of APIs (application programming interfaces) for the user/tester to call; however, recording and playing back a user's phone operations through MonkeyRunner depends on a computer, i.e., the phone must first be connected to the computer, and the relevant operations are then executed on the computer. Instrumentation is the testing framework of the Android package; through Instrumentation all interaction events of an application being installed or used in the system can be monitored, but recording user operations based on Instrumentation requires the test application and the application under test to have the same package name and signature, so recording across applications is impossible. The applicability of both approaches is clearly limited. In addition, the related art includes a scheme that records and plays back by monitoring a system event file; however, reading the system event file requires adb shell or root permission, which is not possible on a device that is not connected to a computer or has not been rooted.
Disclosure of Invention
To overcome the problems in the related art, the present specification provides a method, an apparatus, a storage medium, and a device for recording and playing back a touch event.
According to a first aspect of embodiments of the present specification, there is provided a method for recording and playing back a touch event, including:
generating a transparent window and placing the transparent window on the uppermost layer;
after receiving a recording instruction, detecting a user touch signal received by the transparent window;
recording relevant information of an operation event triggered by the touch signal, wherein the relevant information comprises the type of the operation event, screen coordinates of a received user touch signal and time associated with the operation event;
when the operation event is determined to be played back, simulating a corresponding touch event based on the recorded related information of the operation event so as to play back the operation event.
In some examples, the simulating of the corresponding touch event based on the recorded related information of the operation event is implemented through an accessibility service.
In some examples, the simulating a corresponding touch event based on the recorded related information of the operation event includes:
based on the recorded related information of the operation event, describing the gesture through a gesture description class and a drawing path class, and then simulating the corresponding touch event through the gesture-dispatch method of an accessibility service class.
In some examples, before simulating the corresponding touch event based on the recorded information related to the operation event, the method further comprises the following steps:
calculating, according to the time associated with the operation events, the relative time between the operation events recorded after the recording instruction was received, so that the response order of the touch events is consistent with the response order of the operation events.
In some examples, the above playing back the operation event includes:
notifying an operating system to deliver the touch event to a top-level application so that the top-level application responds to the touch event.
In some examples, after recording information related to the operation event triggered by the touch signal, the method further includes:
distributing the operation event to a lower-layer window so that the lower-layer window responds to the operation event.
In some examples, the method further comprises:
displaying at least one control for receiving the recording instruction and/or a playback instruction.
In some examples, the method is performed by an APP.
In some examples, the method further comprises:
generating a recording text for storing the related information of all operation events recorded during recording.
According to a second aspect of the embodiments of the present specification, there is provided an apparatus for recording and playing back a touch event, including:
the generating module is used for generating a transparent window and placing the transparent window on the uppermost layer;
the detection module is used for detecting a user touch signal received by the transparent window after receiving a recording instruction;
the recording module is used for recording relevant information of an operation event triggered by the touch signal, wherein the relevant information comprises the type of the operation event, screen coordinates of a received user touch signal and time associated with the operation event;
and the playback module is used for simulating a corresponding touch event based on the recorded related information of the operation event when the operation event is determined to be played back so as to play back the operation event.
According to a third aspect of embodiments of the present specification, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs any one of the methods of the embodiments of the specification.
According to a fourth aspect of embodiments herein, there is provided a computer apparatus comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements any of the methods in the embodiments herein when executing the program.
The technical scheme provided by the embodiment of the specification can have the following beneficial effects:
In the method, a transparent window placed on the uppermost layer is generated; during recording, the window intercepts the user's operation events and records their related information, and during playback the corresponding touch events are simulated based on the recorded information, thereby realizing recording and playback of the user's operations and eliminating repetitive manual work. The method of the embodiments of this specification obtains the information about the user's operations through a window that is independent of the application the user actually needs to operate, rather than reading it from a system file, and it skips operating-system authorization by simulating the touch event directly during playback; it therefore neither depends on a computer nor requires root permission, and has better applicability.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the specification.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present specification and together with the description, serve to explain the principles of the specification.
FIG. 1 is a flow chart illustrating a method for recording and playback of touch events according to an exemplary embodiment of the present description;
FIG. 2 is a schematic diagram of an application interface shown in accordance with an application example;
FIG. 3 is a diagram illustrating a gesture path according to an example application in accordance with an embodiment of the present description;
FIG. 4 is a hardware block diagram of a computer device in which a recording and playback apparatus for touch events according to an embodiment of the present disclosure is located;
FIG. 5 is a block diagram of a touch event recording and playback device shown in the present specification according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present specification. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the specification, as detailed in the appended claims.
The terminology used in the description herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the description. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, this information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, the first information may also be referred to as second information, and similarly the second information may also be referred to as first information, without departing from the scope of the present specification. The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining", depending on the context.
In the use of smart devices, some fields often require repetitive operations, for example the testing field and the gaming field. Taking testing as an example, in web black-box testing (i.e., web functional testing) a tester has to repeatedly click the corresponding controls for functions such as form submission and result queries, which is clearly cumbersome and inefficient. At present there are processing schemes that record operations by means of recording tools or scripts and then replace manual repetition by playing back the recording or running the script.
At present, user operations on Android systems are recorded and played back mainly in two ways: through the MonkeyRunner tool, or based on Instrumentation. MonkeyRunner is a coordinate-based Android black-box automated testing tool that provides a set of APIs (application programming interfaces) for the user/tester to call; however, recording and playing back a user's phone operations through MonkeyRunner depends on a computer, i.e., the phone must first be connected to the computer, and the relevant operations are then executed on the computer. Instrumentation is the testing framework of the Android package; through Instrumentation all interaction events of an application being installed or used in the system can be monitored, but recording user operations based on Instrumentation requires the test application and the application under test to have the same package name and signature, so recording across applications is impossible. In addition, the related art includes a scheme that records and plays back by monitoring a system event file: when a user interacts with an Android application, the sensors of the Android device generate events and send them to the kernel, and the information is stored in device files under the dev/input/event path, forming the system event files. However, reading a system event file requires adb shell or root permission, and it cannot be read on a device that is not connected to a computer or has not been rooted.
The following provides a detailed description of examples of the present specification.
As shown in fig. 1, fig. 1 is a flowchart illustrating a method for recording and playing back a touch event according to an exemplary embodiment, the method comprising:
in step 101, a transparent window is generated and placed on the uppermost layer;
In some examples, the transparent window mentioned in this step may be a transparent full-screen floating window. In this step, a transparent full-screen floating window is placed on the uppermost layer of all windows, and the required information is recorded through this window when the user performs an operation.
A window is the most important part of the user interface; it is the visual interface between the user and the application that created it. In general, each time the user starts an application, the application creates and displays a window; when the user manipulates an object in the window, the program reacts accordingly. In this step, placing the transparent window on the uppermost layer may mean setting the transparent window on top, so that it is displayed in front of all other windows of the user interface. The transparent window is then not covered by any other window, which ensures that the window the user operates on is the transparent window.
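For illustration only, the following is a minimal Java sketch, not taken from the patent, of one way such a transparent full-screen window could be placed above other application windows using the standard Android WindowManager API. The class name RecordingOverlay is hypothetical, and the TYPE_APPLICATION_OVERLAY window type assumed here requires the user to grant the "draw over other apps" (SYSTEM_ALERT_WINDOW) permission.

```java
// Hedged sketch: place a transparent, full-screen view on top of other windows.
// RecordingOverlay is a hypothetical helper class, not part of the patent text.
import android.content.Context;
import android.graphics.PixelFormat;
import android.view.View;
import android.view.WindowManager;

public class RecordingOverlay {
    private final WindowManager windowManager;
    private final View overlayView;

    public RecordingOverlay(Context context, View overlayView) {
        this.windowManager = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        this.overlayView = overlayView;
    }

    public void show() {
        WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.MATCH_PARENT,              // full-screen width
                WindowManager.LayoutParams.MATCH_PARENT,              // full-screen height
                WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,  // drawn above other apps (API 26+)
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,        // do not take keyboard focus
                PixelFormat.TRANSLUCENT);                             // keep the window transparent
        windowManager.addView(overlayView, params);
    }

    public void hide() {
        windowManager.removeView(overlayView);
    }
}
```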
In step 102, after a recording instruction is received, a user touch signal received by the transparent window is detected;
in some examples, the user touch signal mentioned in this step may represent an input operation performed by a user on the device, and generally represents an input operation performed by a user on a touch screen of the device. Common input operations performed by a user on a touch screen of a device include single click, double click, short press, long press, sliding, double finger zooming, and the like. One touch event may be composed of only one input operation or multiple input operations, for example, one touch event may be a single click on several positions of the touch screen, or a single click may be followed by a sliding motion.
In step 103, related information of the operation event triggered by the touch signal is recorded, wherein the related information includes the type of the operation event, the screen coordinates at which the user touch signal is received, and the time associated with the operation event;
In some examples, the information related to the operation event triggered by the touch signal mentioned in this step may be obtained through the event-dispatch function (dispatchTouchEvent()) of the transparent window. dispatchTouchEvent() is one of the three important functions in the Android event-distribution mechanism; the other two are onInterceptTouchEvent() and onTouchEvent(). When the user operates the device, the operation event triggered by the user touch signal is called back first to the event-dispatch function of the transparent window, and the type of the operation event, the screen coordinates at which the user touch signal is received, and the time associated with the operation event can all be obtained in that function. Unlike the related-art approach of reading the system event file to obtain the data of the user operation, the present method intercepts the data of the user operation through the transparent window before that data is consumed and records it at interception time. Because the system event file is not read, there is no need to read it through an adb shell command from a computer or after obtaining root permission.
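As a concrete illustration, here is a hedged Java sketch of recording the event type, raw screen coordinates and timestamp inside dispatchTouchEvent(), as described above. RecordingView and EventRecord are hypothetical names introduced only for this example.

```java
// Hedged sketch: a transparent view that records touch-event information
// in dispatchTouchEvent() before passing the event on. Names are illustrative.
import android.content.Context;
import android.os.SystemClock;
import android.view.MotionEvent;
import android.view.View;

import java.util.ArrayList;
import java.util.List;

public class RecordingView extends View {

    /** One recorded operation event: type, raw screen coordinates, associated time. */
    public static class EventRecord {
        public final int action;     // MotionEvent.ACTION_DOWN / ACTION_MOVE / ACTION_UP
        public final float rawX;     // screen coordinates of the touch signal
        public final float rawY;
        public final long uptimeMs;  // time associated with the operation event

        public EventRecord(int action, float rawX, float rawY, long uptimeMs) {
            this.action = action;
            this.rawX = rawX;
            this.rawY = rawY;
            this.uptimeMs = uptimeMs;
        }
    }

    private final List<EventRecord> records = new ArrayList<>();

    public RecordingView(Context context) {
        super(context);
    }

    @Override
    public boolean dispatchTouchEvent(MotionEvent event) {
        // Record the related information before the event is consumed.
        records.add(new EventRecord(event.getAction(), event.getRawX(),
                event.getRawY(), SystemClock.uptimeMillis()));
        return super.dispatchTouchEvent(event);
    }

    public List<EventRecord> getRecords() {
        return records;
    }
}
```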
The operation event type may be any one of the following: press, lift, and slide. When the user touches the screen, an operation event is triggered and the Android system packages it into a MotionEvent; pressing means the finger touches down on the screen, lifting means the finger leaves the screen, and sliding means the finger moves across the screen. A single-click operation can be regarded as the combination of a press operation event and a lift operation event; a slide operation can be regarded as the combination of a press operation event, a number of slide operation events, and a lift operation event.
In some examples, after the related information of the operation event triggered by the touch signal has been recorded, this step further includes: distributing the operation event to a lower-layer window so that the lower-layer window responds to the operation event. The interception of the user operation by the transparent window in steps 101 to 103 is only for recording the related information of the operation; after recording, the operation event is distributed to the lower-layer window for response, so that the user's operation takes effect on the application the user actually wants to operate. Corresponding to step 101, since the transparent window is placed on the uppermost layer, the lower-layer window here refers to the foremost of all windows of the user interface once the transparent window is excluded. The transparent window is meant to intercept operation events destined for the application the user actually wants to operate; therefore, the method of this embodiment may begin executing after that application has been started, or that application may be started after the method begins. With the transparent window set on top, the lower-layer window is the window of the application the user actually wants to operate.
In step 104, when it is determined that the operation event is to be played back, a corresponding touch event is simulated based on the recorded related information of the operation event so as to play back the operation event.
After the user's operations have been recorded, playback means repeating the same operations on the current interface, that is, simulating the user's operations as they were at recording time. Normally in the Android system, for a touch event originating from one application to be responded to by another application, the touch event must be sent to the operating system, which then delivers it to the application that should respond, and this process requires root permission. In this step the touch event is simulated directly, and the operating system is skipped. In some examples, simulating the corresponding touch event based on the recorded related information of the operation event may be implemented through an accessibility service. The accessibility service (AccessibilityService) is an assistive capability provided by the Android system; it can run in the background, monitor certain state transitions of the user interface, such as page switches, focus changes, and notifications, and receive system callbacks when such accessibility events are triggered. Unlike current schemes for simulating operations on Android devices, which are often constrained by root permission, here the simulated touch event can be sent directly through the accessibility service to the application the user actually wants to operate and responded to by that application, so root permission is not needed and applicability is broadened.
The specific process of this step may include: based on the recorded related information of the operation event, describing the gesture through a gesture description class and a drawing path class, and then simulating the corresponding touch event through the gesture-dispatch method of the accessibility service class. The gesture description class (GestureDescription) is used to describe gestures; to perform a simulated operation, the gesture to be simulated must be described first. The drawing path class (android.graphics.Path) is used to describe the path, and combined with the gesture description class it can describe the path of the simulated gesture. For example, in a slide operation event the gesture may consist of several strokes and the path is the outline followed by the strokes, while in a single-click operation event the path length is zero and the stroke is a click without movement. After the path of the gesture to be simulated has been described, the operation is simulated through the gesture-dispatch method of the accessibility service class (dispatchGesture). By controlling the number of playbacks in this step and playing back multiple times, the problem of repetitive manual operation can be solved.
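The following is a minimal sketch, under the assumption of a hypothetical AccessibilityService subclass named PlaybackService, of replaying a recorded single click through the APIs named above (GestureDescription, android.graphics.Path, dispatchGesture). The service must be declared in the manifest and enabled by the user in the system accessibility settings; dispatchGesture is available from API level 24.

```java
// Hedged sketch: simulate a tap at recorded screen coordinates from an accessibility service.
// PlaybackService is a hypothetical class name used only for illustration.
import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.GestureDescription;
import android.graphics.Path;
import android.view.accessibility.AccessibilityEvent;

public class PlaybackService extends AccessibilityService {

    /** Simulates a single click at the recorded screen coordinates (x, y). */
    public void replayTap(float x, float y) {
        Path path = new Path();
        path.moveTo(x, y); // a click corresponds to a path of zero length
        GestureDescription gesture = new GestureDescription.Builder()
                .addStroke(new GestureDescription.StrokeDescription(path, 0L, 50L)) // start at 0 ms, press for 50 ms
                .build();
        // The system delivers the simulated gesture to the top-level application.
        dispatchGesture(gesture, null, null);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Not needed for gesture playback in this sketch.
    }

    @Override
    public void onInterrupt() {
    }
}
```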
Since the time associated with an operation event in the recorded information is usually the absolute time at which it occurred, before the corresponding touch event is simulated based on the recorded related information the method may further include the step of: calculating, according to the time associated with the operation events, the relative time between the operation events recorded after the recording instruction was received, so that the response order of the touch events is consistent with the response order of the operation events. By calculating the time interval between operation events, the events can be dispatched one by one at those intervals during playback, thereby accurately reproducing the user's operations.
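A small sketch of this timing step follows, reusing the hypothetical EventRecord type from the earlier example; it converts the absolute timestamps into per-event delays so that playback can dispatch the events one by one with the same spacing.

```java
// Hedged sketch: compute the relative delay between consecutive recorded events.
import java.util.ArrayList;
import java.util.List;

public final class ReplayTiming {

    private ReplayTiming() {
    }

    /** Returns, for each recorded event, the delay in milliseconds since the previous event. */
    public static List<Long> relativeDelays(List<RecordingView.EventRecord> records) {
        List<Long> delays = new ArrayList<>(records.size());
        long previous = records.isEmpty() ? 0L : records.get(0).uptimeMs;
        for (RecordingView.EventRecord record : records) {
            delays.add(record.uptimeMs - previous); // 0 for the first event
            previous = record.uptimeMs;
        }
        return delays;
    }
}
```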
In some examples, the process of playing back the operation event mentioned in this step may include: notifying an operating system to deliver the touch event to a top-level application so that the top-level application responds to the touch event. For example, when the simulated touch event is a two-finger zoom operation, the operating system is notified to deliver the two-finger zoom to the application so that the application responds to it.
In the embodiments of this specification, recording and playback of user operations are realized by recording, through the transparent window, the related information of the user's operation events during recording, and by simulating the corresponding touch events based on that recorded information during playback. Repetitive manual operation is thereby eliminated, and since the method of the embodiments of this specification neither depends on a computer nor requires root permission, it has wider applicability.
In addition, so that user operations can be recorded or played back at any time, in some examples the method further includes: displaying at least one control for receiving the recording instruction and/or a playback instruction. The controls may include buttons such as start recording, stop recording, play back recording, and cancel recording, which make control convenient for the user.
During recording, the related information of all operation events in the recording period can be recorded and used to generate a recording text; the recording text is stored locally and loaded during playback. The methods of the embodiments of this specification may be packaged as an application program and executed by that application. In this way, the user can send the recording text to another device on which the application is installed and play it back there, enabling cross-device use.
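As one possible illustration of the recording text, the following sketch serializes the hypothetical EventRecord list to a local file, one comma-separated line per event; the file format is an assumption made for this example and is not specified by the patent.

```java
// Hedged sketch: save recorded events to a local file so they can be reloaded for playback
// on this device or copied to another device running the same application.
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.util.List;

public final class RecordingFile {

    private RecordingFile() {
    }

    public static void save(File file, List<RecordingView.EventRecord> records) throws IOException {
        try (BufferedWriter writer = new BufferedWriter(new FileWriter(file))) {
            for (RecordingView.EventRecord record : records) {
                // Format: action,rawX,rawY,uptimeMs
                writer.write(record.action + "," + record.rawX + ","
                        + record.rawY + "," + record.uptimeMs);
                writer.newLine();
            }
        }
    }
}
```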
In order to describe the embodiments of the present disclosure in more detail, a practical application example is set forth below.
In this application example, the method of this specification is applied to testing a piece of game software. In the original processing scheme, the tester works as follows: manually click "start game" on the test-level interface so that the game runs automatically; after the automatic run finishes, manually click any position on the screen so that the game-level interface appears; then click "end game" to return to the test-level interface. Because hundreds of sets of test data are needed, these steps have to be repeated hundreds of times. In this application example, the method of the embodiments of this specification is executed by an application program (hereinafter referred to as the APP) and includes the following steps:
in step 201, after the APP is started, the APP provides a floating control for controlling recording and playback, including buttons for starting recording, ending recording, playing back recording, and the like;
in step 202, the APP generates a full-screen transparent floating window, and the window is placed on the uppermost layer;
In this application example, as shown in FIG. 2, which is a schematic diagram of the application interface according to an application example of this specification, the user opens the APP after opening the test-level interface of the game software that is actually to be operated. Because the floating window generated by the APP is transparent, the window content of the game software is not covered; the user can clearly see the interface of the game software actually being operated, and therefore knows the screen position corresponding to each position to be operated, for example the position of the "start game" control;
in step 203, after the user starts recording through the control, the user touch signals received by the floating window are detected;
in step 204, when the user operates the device, the triggered operation event is called back first to the event-dispatch function of the floating window; in that function the operation event type in the MotionEvent (MotionEvent.getAction()), the screen coordinates at which the user touch signal is received (getRawX(), getRawY()), and the time associated with the operation event (SystemClock.uptimeMillis()) are obtained, and this related information is recorded and stored;
In this application example, the user performs three click operations in total during recording: clicking "start game", clicking a certain position on the screen, and clicking "end game". Each click operation can be regarded as a press operation event plus a lift operation event, and the APP records, through the floating window, the operation event type, the screen coordinates corresponding to each operation event, and the associated time;
in step 205, after the related information of each operation event has been recorded and saved, the operation event is distributed to the lower-layer window so that it can respond to the operation event;
In this application example, the lower-layer window beneath the floating window is the test-level interface of the game software under test that the user actually operates; for example, when the operation event corresponding to the user clicking "start game" is distributed to the lower-layer window, the game software responds to the operation event and the game runs automatically;
in step 206, when the user ends recording through the control, a recording file is generated from the related information of all operation events recorded during the recording period and stored locally;
in step 207, when the user plays back the operation through the control, the APP loads the record file, and calculates the relative time between the operation events according to the time associated with the operation events recorded in the record file;
in step 208, based on the recorded related information of the operation events, the operation events are dispatched one by one according to their relative times: each is described through the gesture description class and the drawing path class, the corresponding touch event is then simulated through the gesture-dispatch method of the accessibility service class, and the operating system is notified to deliver the simulated touch event to the top-level application so that the top-level application responds to the touch event;
In this application example, as shown in FIG. 3, which is a schematic diagram of a gesture path according to an application example of this specification, the touch events in this example are single clicks, so the path length is zero and each circle in the diagram represents one touch event. During playback the floating window is not displayed and the top-level application is the game software under test. As playback proceeds, the touch events simulated by the APP comprise a click on the "start game" control position, a click on a certain position on the screen, and a click on the "end game" control position; the game software responds to each touch event at its recorded time interval, so the operations the user performed during the earlier test are repeated;
in step 209, by setting the number of automatic playbacks and the interval between them, the user plays back the recorded operations multiple times, which solves the problem of repetitive manual operation;
in step 210, when the user needs to run the test on another device, the APP is installed on the new device, the recording text stored on the original device is transferred to the new device, and playback is performed directly during the test on the new device.
Corresponding to the foregoing method embodiments, this specification also provides embodiments of an apparatus for recording and playing back touch events and of a terminal to which the apparatus is applied.
The embodiments of the apparatus for recording and playing back touch events in this specification can be applied to computer equipment, such as a server or a terminal device. The apparatus embodiments may be implemented by software, by hardware, or by a combination of hardware and software. Taking a software implementation as an example, as a logical apparatus it is formed by the processor of the device in which it resides reading the corresponding computer program instructions from non-volatile memory into memory and running them. In terms of hardware, FIG. 4 is a hardware structure diagram of the computer device in which the apparatus for recording and playing back touch events of the embodiments of this specification resides. Besides the processor 510, memory 530, network interface 520, and non-volatile memory 540 shown in FIG. 4, the server or electronic device in which the apparatus 531 resides may also include other hardware according to the actual functions of the computer device, which will not be described again here.
Accordingly, the embodiments of the present specification also provide a computer storage medium, in which a program is stored, and the program, when executed by a processor, implements the method in any of the above embodiments.
Embodiments of the present description may take the form of a computer program product embodied on one or more storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having program code embodied therein. Computer-usable storage media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of the storage medium of the computer include, but are not limited to: phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technologies, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium, may be used to store information that may be accessed by a computing device.
As shown in fig. 5, fig. 5 is a block diagram of a touch event recording and playback apparatus according to an exemplary embodiment, the apparatus including:
a generating module 51, configured to generate a transparent window and place the transparent window on the uppermost layer;
the detection module 52 is configured to detect a user touch signal received by the transparent window after receiving a recording instruction;
a recording module 53, configured to record relevant information of an operation event triggered by the touch signal, where the relevant information includes a type of the operation event, a screen coordinate at which a user touch signal is received, and a time associated with the operation event;
a playback module 54, configured to, when it is determined to play back the operation event, simulate a corresponding touch event based on the recorded related information of the operation event to play back the operation event.
The implementation process of the functions and actions of each module in the above device is specifically described in the implementation process of the corresponding step in the above method, and is not described herein again.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, wherein the modules described as separate parts may or may not be physically separate, and the parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution in the specification. One of ordinary skill in the art can understand and implement it without inventive effort.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Other embodiments of the present description will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This specification is intended to cover any variations, uses, or adaptations of the specification following, in general, the principles of the specification and including such departures from the present disclosure as come within known or customary practice within the art to which the specification pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the specification being indicated by the following claims.
It will be understood that the present description is not limited to the precise arrangements described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the present description is limited only by the appended claims.
The above description is only a preferred embodiment of the present disclosure, and should not be taken as limiting the present disclosure, and any modifications, equivalents, improvements, etc. made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.

Claims (10)

1. A method for recording and playing back touch events is characterized by comprising the following steps:
generating a transparent window and placing the transparent window on the uppermost layer;
after receiving a recording instruction, detecting a user touch signal received by the transparent window;
recording relevant information of an operation event triggered by the touch signal, wherein the relevant information comprises the type of the operation event, screen coordinates of a received user touch signal and time associated with the operation event;
when the operation event is determined to be played back, simulating a corresponding touch event based on the recorded related information of the operation event so as to play back the operation event.
2. The method of claim 1, wherein simulating the corresponding touch event based on the recorded information related to the operational event comprises:
based on the recorded related information of the operation event, describing the gesture through a gesture description class and a drawing path class, and then simulating the corresponding touch event through the gesture-dispatch method of an accessibility service class.
3. The method of claim 1, further comprising, before simulating a corresponding touch event based on the recorded information related to the operational event, the steps of:
calculating, according to the time associated with the operation events, the relative time between the operation events recorded after the recording instruction was received, so that the response order of the touch events is consistent with the response order of the operation events.
4. The method of claim 1, wherein the process of playing back the operational event comprises:
notifying an operating system to deliver the touch event to a top-level application so that the top-level application responds to the touch event.
5. The method of claim 1, wherein after recording information related to the operation event triggered by the touch signal, the method further comprises:
distributing the operation event to a lower-layer window so that the lower-layer window responds to the operation event.
6. The method of claim 1, further comprising:
displaying at least one control for receiving the recording instruction and/or a playback instruction.
7. The method of claim 1, further comprising:
generating a recording text, wherein the recording text is used for storing the related information of all operation events recorded during recording.
8. An apparatus for recording and playback of touch events, comprising:
the generating module is used for generating a transparent window and placing the transparent window on the uppermost layer;
the detection module is used for detecting a user touch signal received by the transparent window after receiving a recording instruction;
the recording module is used for recording relevant information of an operation event triggered by the touch signal, wherein the relevant information comprises the type of the operation event, screen coordinates of a received user touch signal and time associated with the operation event;
and the playback module is used for simulating a corresponding touch event based on the recorded related information of the operation event when the operation event is determined to be played back so as to play back the operation event.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program implements the method of any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method of any one of claims 1 to 7.
CN201911288956.5A 2019-12-12 2019-12-12 Method and device for recording and playing back touch event, storage medium and equipment Pending CN111143200A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911288956.5A CN111143200A (en) 2019-12-12 2019-12-12 Method and device for recording and playing back touch event, storage medium and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911288956.5A CN111143200A (en) 2019-12-12 2019-12-12 Method and device for recording and playing back touch event, storage medium and equipment

Publications (1)

Publication Number Publication Date
CN111143200A true CN111143200A (en) 2020-05-12

Family

ID=70518249

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911288956.5A Pending CN111143200A (en) 2019-12-12 2019-12-12 Method and device for recording and playing back touch event, storage medium and equipment

Country Status (1)

Country Link
CN (1) CN111143200A (en)



Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102841845A (en) * 2012-07-30 2012-12-26 广东欧珀移动通信有限公司 Automatic testing method of Android device software
CN104346276A (en) * 2013-08-08 2015-02-11 腾讯科技(深圳)有限公司 Method and device for software tests
US20170262130A1 (en) * 2016-03-11 2017-09-14 Spirent Communications, Inc. Performance test application sequence script
CN107025165A (en) * 2017-03-07 2017-08-08 腾讯科技(深圳)有限公司 Game automated testing method and relevant apparatus
CN107357724A (en) * 2017-06-27 2017-11-17 深圳市泰衡诺科技有限公司上海分公司 Automatic software test method and device in Android system
CN109800135A (en) * 2017-11-17 2019-05-24 腾讯科技(深圳)有限公司 A kind of information processing method and terminal
CN108021494A (en) * 2017-12-27 2018-05-11 广州优视网络科技有限公司 A kind of method for recording of application operating, back method and related device
CN109165062A (en) * 2018-07-24 2019-01-08 苏宁易购集团股份有限公司 A kind of terminal remote assists control method and system

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111708474A (en) * 2020-05-26 2020-09-25 广州朗国电子科技有限公司 Method and device for simulating user touch operation, storage medium and all-in-one machine equipment
CN112000271A (en) * 2020-08-13 2020-11-27 努比亚技术有限公司 Touch signal identification control method and device and computer readable storage medium
CN112131117A (en) * 2020-09-25 2020-12-25 腾讯科技(深圳)有限公司 Game testing method and device, electronic equipment and storage medium
CN112131117B (en) * 2020-09-25 2022-04-01 腾讯科技(深圳)有限公司 Game testing method and device, electronic equipment and storage medium
WO2022127130A1 (en) * 2020-12-18 2022-06-23 华为技术有限公司 Method for adding operation sequence, electronic device, and system
CN112817790A (en) * 2021-03-02 2021-05-18 腾讯音乐娱乐科技(深圳)有限公司 Method for simulating user behavior
EP4064043A1 (en) * 2021-03-26 2022-09-28 Beijing Xiaomi Mobile Software Co., Ltd. Method and apparatus for generating combined instruction during application operation
US11899926B2 (en) 2021-03-26 2024-02-13 Beijing Xiaomi Mobile Software Co., Ltd. Method and apparatus for generating combined instruction during application operation, and storage medium
CN113032273A (en) * 2021-04-01 2021-06-25 广州虎牙科技有限公司 Application program debugging method and device, computer equipment and storage medium
CN113032273B (en) * 2021-04-01 2024-04-19 广州虎牙科技有限公司 Application program debugging method and device, computer equipment and storage medium
CN115686334A (en) * 2022-10-31 2023-02-03 荣耀终端有限公司 Operation control method, electronic device and readable storage medium
CN115686334B (en) * 2022-10-31 2023-11-28 荣耀终端有限公司 Operation control method, electronic device and readable storage medium

Similar Documents

Publication Publication Date Title
CN111143200A (en) Method and device for recording and playing back touch event, storage medium and equipment
CN103810089B (en) Automatically testing gesture-based applications
CN110090444B (en) Game behavior record creating method and device, storage medium and electronic equipment
CN110362483A (en) Performance data acquisition method, device, equipment and storage medium
CN102147756B (en) Methods and systems for testing terminal
CN108664380B (en) After-execution software debugging system with performance display and debugging method
CN105653438B (en) The striding course automated testing method and device of Android device
CN112947969B (en) Page off-screen rendering method, device, equipment and readable medium
KR20090084905A (en) Method and system for graphical user interface testing
CN106843663B (en) Method, device, equipment and medium for positioning pre-jump position in page
CN107608609A (en) A kind of event object sending method and device
CN108509348A (en) A kind of test method and mobile terminal of system aging
CN107250979B (en) Application event tracking
US11179644B2 (en) Videogame telemetry data and game asset tracker for session recordings
CN108984380A (en) A kind of server test method, device and medium based on linux system
CN107704391A (en) A kind of method of testing and device based on Selenium
US8621486B2 (en) Significance level automation
CN112988304B (en) Recording method and device of operation mode, electronic equipment and storage medium
CN111104017B (en) Sliding positioning processing method and device
US20090265156A1 (en) Dynamically varying simulation precision
US20210349666A1 (en) Transparent interactive interface for ballot marking and methods of using the same
US10496524B2 (en) Separating test coverage in software processes using shared memory
CN111708704A (en) Cloud real machine testing method and device, terminal and storage medium
CN108170593A (en) The method and device of application program operation
CN112306838A (en) Page layout compatibility testing method, device and equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20200512

Assignee: GUANGZHOU CUBESILI INFORMATION TECHNOLOGY Co.,Ltd.

Assignor: GUANGZHOU HUADUO NETWORK TECHNOLOGY Co.,Ltd.

Contract record no.: X2021440000031

Denomination of invention: Touch event recording and playback method, device, storage medium and device

License type: Common License

Record date: 20210125
