CN115686334A - Operation control method, electronic device and readable storage medium


Info

Publication number
CN115686334A
CN115686334A (application no. CN202211343134.4A)
Authority
CN
China
Prior art keywords
touch operation
touch
application
recording
behavior data
Prior art date
Legal status
Granted
Application number
CN202211343134.4A
Other languages
Chinese (zh)
Other versions
CN115686334B (en)
Inventor
王傲飞
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202211343134.4A priority Critical patent/CN115686334B/en
Publication of CN115686334A publication Critical patent/CN115686334A/en
Application granted granted Critical
Publication of CN115686334B publication Critical patent/CN115686334B/en
Status: Active


Abstract

The application discloses an operation control method, an electronic device, and a readable storage medium, belonging to the field of terminal technologies. The method comprises: receiving a recording start operation, which triggers the start of recording the user's touch operations on the screen of the electronic device; recording behavior data of the user's touch operations in a first application interface of a first application program; and, when the recording has ended, if the first application interface is opened, sequentially responding to at least one recorded touch operation in the first application interface, in the order in which the operations were executed, based on the recorded behavior data of the at least one touch operation. By recording the behavior data of the user's touch operations and automatically re-executing the corresponding events in the first application interface according to the recorded data, the method removes the need for the user to repeat the operations manually, avoids the drop in operating efficiency caused by user fatigue, and improves the task execution effect.

Description

Operation control method, electronic device and readable storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an operation control method, an electronic device, and a readable storage medium.
Background
With the rapid development of terminal technology, electronic devices are widely used. During use of an electronic device, some scenarios require the user to perform a certain operation mechanically and repeatedly; for example, when grabbing red envelopes in an application, the user must repeatedly pull down a red-envelope image displayed in the application interface. The electronic device then performs the corresponding operations, such as opening the red envelope and displaying its amount.
However, these repetitive operations gradually fatigue the user, and over time the user's fatigue tends to reduce operating efficiency, which in turn affects task performance.
Disclosure of Invention
The application provides an operation control method, an electronic device, and a readable storage medium, which can solve the problem that operating efficiency drops, and the task execution effect suffers, when a user performs repeated operations over a long time. The technical solution is as follows:
In a first aspect, an operation control method is provided, where the method includes:
receiving a recording starting operation, wherein the recording starting operation is used for triggering the start of recording the touch operation of a user on a screen of the electronic equipment;
recording behavior data of touch operation of the user in a first application interface of a first application program;
and under the condition that the recording is finished, if the first application interface is opened, sequentially responding to at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the recorded behavior data of the at least one touch operation.
Therefore, by recording the behavior data of the user's touch operations and automatically re-executing the events corresponding to the at least one touch operation in the first application interface according to the recorded behavior data, the user no longer needs to repeat the operations manually, the drop in operating efficiency caused by user fatigue is avoided, and the task execution effect is improved.
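The record-then-replay flow described above can be sketched as follows. This is a minimal, illustrative model only; all names (TouchRecorder, replay, etc.) are hypothetical and not from the patent, and a real implementation would inject events through the platform's input subsystem.

```python
from dataclasses import dataclass, field

@dataclass
class TouchRecord:
    """Behavior data for one recorded touch operation (illustrative)."""
    op_type: str   # e.g. "click", "slide", "long_press"
    data: dict     # coordinates, timestamps, etc.
    seq: int       # execution order within the recording

@dataclass
class TouchRecorder:
    records: list = field(default_factory=list)
    recording: bool = False

    def start(self):
        """The 'recording start operation': begin capturing touch operations."""
        self.recording = True
        self.records.clear()

    def capture(self, op_type, data):
        """Called for each user touch in the first application interface."""
        if self.recording:
            self.records.append(TouchRecord(op_type, data, len(self.records)))

    def stop(self):
        """Recording ends; the ordered behavior data is returned."""
        self.recording = False
        return list(self.records)

def replay(records, dispatch):
    """Respond to each touch operation in its original execution order."""
    for rec in sorted(records, key=lambda r: r.seq):
        dispatch(rec)  # inject the event into the first application interface
```

Here `dispatch` stands in for whatever mechanism actually delivers the simulated touch event to the application.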
As an example of the present application, if the first application interface is opened when the recording is finished, sequentially responding to at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the recorded behavior data of the at least one touch operation in the first application interface includes:
under the condition that the recording is finished, generating a recording file, wherein the recording file comprises behavior data of the at least one touch operation;
and under the condition that the first application interface is opened, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation on the basis of the behavior data of the at least one touch operation in the recording file.
Therefore, by generating the recording file, the at least one touch operation can be replayed based on the recording file, so that the corresponding touch events can be automatically and sequentially executed in the first application interface according to the execution sequence of the at least one touch operation, and further the manual operation of a user is avoided.
As an example of the present application, when the first application interface is opened, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation in the recording file includes:
displaying a target interface, wherein the target interface comprises the recording file;
responding to a playing instruction of the recording file in the target interface, and sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation in the recording file under the condition that the first application interface is opened.
Therefore, after the recording file is generated, the target interface comprising the recording file is displayed, so that the user can play the recording file in the target interface conveniently, the user can play the recording file according to the requirement of the user, and the user experience can be improved.
As an example of the present application, when the first application interface is opened in response to a play instruction for the recording file in the target interface, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation in the recording file includes:
receiving a repeat-count setting instruction in the target interface, where the instruction carries a target count and is used to indicate the target number of times the at least one touch operation is to be repeatedly executed;
responding to a playing instruction of the recorded file in the target interface, and sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation in the recorded file under the condition that the first application interface is opened;
after the current round of playing is finished, counting the repeated playing times of the recorded file;
if the repeated playing times do not reach the target times, continuously responding to the at least one touch operation in the first application interface in sequence according to the execution sequence of the at least one touch operation on the basis of the behavior data of the at least one touch operation in the recording file;
and if the repeated playing times reach the target times, determining that the response is finished.
Therefore, a repeat-count setting option is provided in the target interface, so that the user can set the number of times the recording file is replayed as needed, which saves the user from manually replaying the recording file and improves the user experience. Moreover, the electronic device can automatically play the recording file multiple times, executing the corresponding touch events in the first application interface multiple times, which improves the task execution effect.
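The counted-replay loop described above can be sketched as below. The function and parameter names are illustrative, not the patent's; the sketch just shows the described control flow of playing one round, counting it, and comparing against the user-set target count.

```python
def play_with_repeats(records, dispatch, target_count):
    """Replay a recording file `target_count` times.

    After each round of playback ends, the repeat count is checked against
    the target count before starting the next round, mirroring the flow
    described in the text.
    """
    rounds_played = 0
    while rounds_played < target_count:
        for rec in records:        # one round: all operations in order
            dispatch(rec)
        rounds_played += 1         # count the finished round
    return rounds_played           # response is finished when target reached
```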
As an example of the present application, the at least one touch operation includes a first touch operation that calls a second application program; the sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the recorded behavior data of the at least one touch operation under the condition that the first application interface is opened includes:
under the condition that the first application interface is opened, if the current execution sequence of the first touch operation is determined based on the behavior data of the first touch operation, monitoring whether the second application program is in a running state;
and starting the second application program under the condition that the second application program is not in a running state.
Therefore, while the recording file is playing, another application program, namely the second application program, can be pulled up as needed, which extends what the service can implement and enriches the service.
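The "monitor and pull up" step above can be sketched as follows. This is a hypothetical model: `running_apps` and `launch` stand in for whatever the platform uses to query running state and start an application.

```python
def ensure_running(app_name, running_apps, launch):
    """Before replaying a touch operation that invokes a second application,
    check whether that application is running and start it if not."""
    if app_name not in running_apps:
        launch(app_name)          # "pull up" the second application program
        running_apps.add(app_name)
    return app_name in running_apps
```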
As an example of the present application, the method further comprises:
receiving the touch operation of the user in the application interface of a third application program in the process of sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation;
responding to the touch operation of the user in the third application program.
In this way, while the touch event corresponding to each touch operation is being sequentially executed in the first application interface based on the behavior data of the at least one touch operation, a new touch operation from the user can still be responded to normally, which improves the user experience.
As an example of the present application, when the first application interface is opened, sequentially responding to at least one touch operation in the first application interface according to an execution sequence of the at least one touch operation based on the recorded behavior data of the at least one touch operation includes:
when the first application interface is opened, and it is determined, based on the behavior data of the second touch operation, that the execution turn of the second touch operation has been reached, traversing the currently opened views in top-to-bottom display order, where the second touch operation is any one of the at least one touch operation;
when a view is traversed, whether an application program to which the currently traversed view belongs is the first application program is inquired;
under the condition that the application program to which the currently traversed view belongs is not the first application program, continuously traversing the next view;
and sending the behavior data of the second touch operation to the first application program for processing under the condition that the application program to which the currently traversed view belongs is the first application program, and finishing the traversing operation.
Therefore, touch events are issued by traversing view by view, so that each touch event is responded to in order; even when a new touch event is received during replay, a normal response to it is still ensured, and touch-event responses are neither disordered nor lost.
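The view-by-view traversal described above can be sketched as below. The representation of a view as a dict with an `"app"` key is an assumption made purely for illustration; on a real device the views would be the window/view hierarchy maintained by the system.

```python
def dispatch_to_owner(views, target_app, send):
    """Traverse currently open views top to bottom; when a view belonging to
    the target (first) application is found, send the behavior data to that
    application and end the traversal. Returns True if an owner was found."""
    for view in views:                  # top-to-bottom display order
        if view["app"] == target_app:   # owner query succeeds
            send(view)                  # deliver the behavior data
            return True                 # traversal ends here
        # otherwise continue traversing the next view
    return False
```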
As an example of the present application, the electronic device includes a touch event management module, an interception feedback module, and an event receiving module; when each view is traversed, inquiring whether the application program to which the currently traversed view belongs is the first application program or not, wherein the inquiring comprises the following steps:
when the touch event management module traverses to a view, it sends the event attribute information of the second touch event to the application program to which the currently traversed view belongs, where the event attribute information of the second touch event is determined based on the behavior data of the second touch operation;
the application program sends first indication information to the interception feedback module under the condition that the second touch event is determined to be intercepted according to the event attribute information of the second touch event, wherein the first indication information is used for indicating the interception of the second touch event;
the interception feedback module feeds back the first indication information to the touch event management module;
the touch event management module determines that the application to which the currently traversed view belongs is the first application.
Therefore, the touch event management module is responsible for issuing the events, the interception feedback module is responsible for feeding back the interception results of the events and feeding the interception results back to the touch event management module, and the touch events can be responded in order.
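The interplay of the two modules above can be modeled roughly as follows. The class names mirror the modules named in the text, but the data shapes (apps as dicts with an `intercepts_event` callable) are illustrative assumptions, not the patent's actual structures.

```python
class InterceptionFeedback:
    """Carries an application's interception decision back to the manager."""
    def __init__(self):
        self.decision = False

    def report(self, intercepted):
        # the application indicates whether it intercepts the event
        self.decision = intercepted

class TouchEventManager:
    """Issues event attribute info view by view and reads back the decision."""
    def __init__(self, feedback):
        self.feedback = feedback

    def find_interceptor(self, views, event_info):
        for view in views:              # top-to-bottom display order
            app = view["app"]
            # send the event attribute info to the app owning this view;
            # the app answers via the interception feedback module
            self.feedback.report(app["intercepts_event"](event_info))
            if self.feedback.decision:  # first indication information received
                return app["name"]      # this view's app is the first app
        return None
```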
As an example of the present application, in a case that the application program to which the currently traversed view belongs is the first application program, sending behavior data of the second touch operation to the first application program for processing includes:
the touch event management module sends the behavior data of the second touch operation to the event receiving module;
the first application program acquires the behavior data of the second touch event from the event receiving module;
the first application responds to the second touch event based on the behavior data of the second touch event.
In this way, the touch event management module sends the behavior data to the event receiving module, and the first application program obtains the behavior data from the event receiving module and processes the response. That is, a data transmission channel is established through the event receiving module, so that the touch event management module can hand off behavior data to different application programs in a uniform data-packaging manner.
As an example of the application, when the at least one touch operation comprises a plurality of touch operations, the operation types of the plurality of touch operations include one or more of a click operation, a slide operation, and a long-press operation.
Therefore, the behavior data of the touch operation of different operation types can be recorded, and the richness of the service can be improved.
As an example of the present application,
in the case that the at least one touch operation comprises the click operation, behavior data of the click operation comprises click position coordinates and click time of the click operation;
in a case where the at least one touch operation includes the slide operation, behavior data of the slide operation includes a slide start position coordinate, a slide end position coordinate, and a slide start time of the slide operation;
and in the case that the at least one touch operation comprises the long press operation, the behavior data of the long press operation comprises long press position coordinates, a long press starting time and a long press duration of the long press operation.
Therefore, the behavior data of the touch operation of different operation types can be recorded, and the richness of the service can be improved.
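The per-type behavior data fields listed above can be captured as simple record types. These field names are an illustrative rendering of the data items the text enumerates (click position/time; slide start and end positions and start time; long-press position, start time, and duration), not a prescribed format.

```python
from dataclasses import dataclass

@dataclass
class ClickData:
    x: float            # click position coordinates
    y: float
    click_time: float   # when the click occurred

@dataclass
class SlideData:
    start_x: float      # slide start position coordinates
    start_y: float
    end_x: float        # slide end position coordinates
    end_y: float
    start_time: float   # slide start time

@dataclass
class LongPressData:
    x: float            # long-press position coordinates
    y: float
    start_time: float   # long-press start time
    duration: float     # long-press duration
```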
As an example of the present application, after receiving a recording start operation, the method further includes:
each time a touch operation of the user in the first application interface is received, responding to that touch operation in the first application program.
Therefore, in the process of recording the touch operation of the user, each recorded touch operation can still be responded normally, and the problem that the normal response is influenced by the recording operation can be avoided.
As an example of the present application, if the first application interface is opened after the recording is finished, sequentially responding to at least one touch operation in the first application interface according to an execution sequence of the at least one touch operation based on the recorded behavior data of the at least one touch operation in the first application interface includes:
under the condition that a recording ending operation is received, if the first application interface is opened, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation; or,
and under the condition that the recording duration reaches a preset recording duration, if the first application interface is opened, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation.
Therefore, the recording can be ended either by a recording ending operation or by reaching a preset recording duration, which provides more ways to end the recording.
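The two end-of-recording conditions above can be combined in a single check, sketched here with hypothetical names:

```python
def should_stop(now, start_time, preset_duration, stop_requested):
    """Recording ends when the user issues a recording ending operation, or
    when the elapsed recording time reaches the preset recording duration."""
    return stop_requested or (now - start_time) >= preset_duration
```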
In a second aspect, an operation control apparatus is provided, having the function of implementing the method behavior of the operation control in the first aspect. The operation control apparatus comprises at least one module, and the at least one module is configured to implement the operation control method provided in the first aspect.
In a third aspect, an electronic device is provided, where the structure of the electronic device includes a processor and a memory, and the memory is used to store a program that supports the electronic device to execute the method for operation control provided in the first aspect, and to store data used to implement the method for operation control in the first aspect. The processor is configured to execute programs stored in the memory. The electronic device may further comprise a communication bus for establishing a connection between the processor and the memory.
In a fourth aspect, a computer-readable storage medium is provided, which has instructions stored therein, which when run on a computer, cause the computer to perform the method of operation control of the first aspect described above.
In a fifth aspect, there is provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the method of operational control of the first aspect described above.
The technical effects obtained by the second, third, fourth and fifth aspects are similar to the technical effects obtained by the corresponding technical means in the first aspect, and are not described herein again.
Drawings
FIG. 1 is a schematic diagram illustrating an application scenario in accordance with an exemplary embodiment;
FIG. 2 is a schematic diagram of an application scenario shown in accordance with another exemplary embodiment;
FIG. 3 is a schematic diagram of an application scenario shown in accordance with another exemplary embodiment;
FIG. 4 is a schematic diagram illustrating a target interface in accordance with an exemplary embodiment;
FIG. 5 is a schematic illustration of a target interface shown in accordance with another exemplary embodiment;
FIG. 6 is a schematic diagram of a target interface shown in accordance with another exemplary embodiment;
FIG. 7 is a schematic diagram illustrating a target interface in accordance with another exemplary embodiment;
FIG. 8 is a software architecture diagram of an electronic device shown in accordance with an exemplary embodiment;
FIG. 9 is a schematic flow chart diagram illustrating a method of operational control in accordance with an exemplary embodiment;
FIG. 10 is a schematic diagram illustrating an application scenario in accordance with another illustrative embodiment;
FIG. 11 is a schematic diagram illustrating an operation type of a touch operation in accordance with an exemplary embodiment;
FIG. 12 is a diagram illustrating a flow of issuing a touch event in accordance with an exemplary embodiment;
FIG. 13 is a diagram illustrating a flow of issuing a touch event in accordance with another exemplary embodiment;
FIG. 14 is a diagram illustrating a flow of issuing a touch event in accordance with another exemplary embodiment;
FIG. 15 is a diagram illustrating a flow of issuing a touch event in accordance with another exemplary embodiment;
FIG. 16 is a schematic diagram illustrating the recording of touch operations in accordance with an exemplary embodiment;
FIG. 17 is a flow chart diagram illustrating a method of operational control in accordance with another exemplary embodiment;
FIG. 18 is a schematic diagram illustrating the monitoring of an application operational status in accordance with an exemplary embodiment;
FIG. 19 is a diagram illustrating a flow of issuing a touch event in accordance with another exemplary embodiment;
FIG. 20 is a schematic diagram illustrating an intervening touch event in accordance with an exemplary embodiment;
FIG. 21 is a schematic structural diagram of an electronic device shown in accordance with an exemplary embodiment.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It should be understood that reference to "a plurality" in this application means two or more. In the description of this application, "/" indicates an "or" relationship; for example, A/B may indicate either A or B. "And/or" merely describes an association between associated objects and means that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, for clarity in describing the technical solutions of this application, the terms "first", "second", and the like are used to distinguish between identical or similar items having substantially the same functions and effects. Those skilled in the art will appreciate that the terms "first", "second", and the like do not denote any number, order, or importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
During use of the electronic device, in some scenarios the user may be required to perform mechanical, repetitive operations on an application interface. For example, taking a mobile phone as the electronic device: in one possible scenario, application A on the phone supports a red-envelope grabbing function. The user opens application A and enters its red-envelope interface; as shown in (a) of FIG. 1, a red-envelope picture is displayed in that interface, and the user can repeatedly pull the picture down. After the user has pulled down a certain number of times, application A is triggered to open the red envelope and display its amount, as shown in (b) of FIG. 1.
However, in an application scenario like the one above, the user must frequently repeat the touch operation, which is cumbersome; over time, user fatigue may reduce the efficiency of the touch operations and thereby affect how well the task is performed. The embodiments of this application therefore provide an operation control method that records the behavior data of a series of touch operations while the user performs them and then automatically executes that series of touch operations based on the recording. This removes the need for the user to repeat the touch operations, avoids the efficiency loss caused by fatigue, and prevents the task's execution from being affected.
Before describing the method provided by the embodiment of the present application, an application scenario related to the embodiment of the present application is described.
As an example of the present application, application B on the mobile phone supports a virtual garden function in which the user can raise virtual animals, which requires the user to perform certain tasks at intervals. For example, (a) of FIG. 2 is a schematic diagram of a virtual garden interface in application B according to an exemplary embodiment. The virtual garden interface includes a virtual chicken 21, named, for example, "sprouted chicken"; for the sprouted chicken to produce virtual eggs, the user must perform some tasks in the virtual garden interface at regular intervals. Referring to (b) of FIG. 2, to avoid frequent repetitive actions, the user can pull down the notification bar after entering the virtual garden interface and before starting the task; the notification bar may include a screen recording option 22. The user can long-press the screen recording option 22, and in response to this trigger operation the mobile phone redisplays the virtual garden interface, as shown in (c) of FIG. 2, and starts recording the user's touch operations in it. Illustratively, as shown in (c) of FIG. 2, the user can click the sprouted chicken to interact with it; in response, application B displays an interactive utterance near the sprouted chicken, such as "eat well while busy". To raise the sprouted chicken's mood index, the user can click it multiple times, and the utterance application B displays may differ each time; for example, if the user clicks twice in a row, referring to (d) of FIG. 2, the second displayed utterance may differ from the first.
In addition, the user can feed the sprouted chicken. For example, as shown in (e) of FIG. 2, the user can click the virtual grain store 23 in the virtual garden interface. Referring to (f) of FIG. 2, in response to the user's click on the virtual grain store 23, application B adds virtual grain to the sprouted chicken's food tray 24 and reduces the remaining amount of grain in the virtual grain store 23.
Thereafter, referring to (a) of FIG. 3, when the user wants to end the recording, the notification bar can be pulled down again; the mobile phone displays it in response, and the user can long-press the screen recording option 22 again, upon which the phone ends the recording of touch operations. In one example, referring to (b) of FIG. 3, after the user long-presses the screen recording option 22 again, the phone displays a target interface that includes a recording file 31, which contains the behavior data of the touch operations recorded in the process above. A play option 32 corresponding to the recording file 31 is also displayed in the target interface; when the user wants the series of operations above to be performed automatically on the virtual garden interface based on the recording file 31, the user can click the play option 32. Referring to (c) of FIG. 3, in response to the user's trigger on the play option 32, the phone displays the virtual garden interface and then automatically executes, in sequence, the events triggered by the series of touch operations. For example, as shown in (d) of FIG. 3, the phone may first simulate clicking the sprouted chicken, and application B displays an interactive utterance near it, such as "thank you for playing with me". Referring to (e) of FIG. 3, the phone simulates clicking the sprouted chicken again, and application B displays another utterance near it, such as "beautiful today". The phone then simulates clicking the virtual grain store 23 in the virtual garden interface; application B again adds virtual grain to the sprouted chicken's food tray 24 and reduces the remaining grain in the virtual grain store 23. As shown in (f) of FIG. 3, the remaining grain in the virtual grain store 23 is then 80 g.
As an example of the present application, during playing of the recording file, a touch action corresponding to each touch operation may be displayed in the virtual garden interface. For example, for a click operation, a schematic diagram of a cursor click may be displayed at the clicked position in the virtual garden interface, and for a slide operation, a schematic diagram of a cursor slide may be displayed along the slide path.
As an example of the present application, referring to fig. 3 (c), during the process of playing the recording file, a pause playing option 33 may be further displayed in the virtual garden interface. When the user no longer needs the mobile phone to automatically execute the remaining unexecuted events of the recording file in the virtual garden interface, the user may click the pause playing option 33. In response to the user clicking the pause playing option 33, the mobile phone no longer executes the subsequent unexecuted events based on the recording file, that is, stops playing the recording file.
In some scenarios, the user may not want to play the recording file immediately after the touch operations are recorded. For this reason, referring to fig. 4 (a), a start time setting option may also be provided in the target interface for the user to set the time at which playing of the recording file starts. The user may click the start time setting option; referring to fig. 4 (b), in response to the user's trigger operation on the start time setting option, the mobile phone displays a time setting interface, so that the user can set the time for starting playing the recording file based on the time setting interface, including setting the date and the time. In this case, the user does not need to click the play option again to play the recording file; instead, the recording file is played according to the set start time. That is, when it is detected that the system time reaches the set start time, the mobile phone plays the recording file under the condition that the application program B is in the running state and the virtual garden interface is opened, that is, the series of operations are executed in the virtual garden interface.
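The start condition described above can be sketched as a simple check, evaluated for example on a timer tick. This is a minimal illustration with assumed names, not the patent's implementation:

```java
// Sketch of the scheduled-playback start condition: the recording file starts
// playing only once the system time reaches the set start time while the
// application is running and the recorded application interface is open.
// All names here are illustrative assumptions.
public class ScheduledPlayback {
    // Returns true when playback of the recording file should begin.
    public static boolean shouldStartPlayback(long systemTimeMs, long scheduledStartMs,
                                              boolean appRunning, boolean interfaceOpen) {
        return systemTimeMs >= scheduledStartMs && appRunning && interfaceOpen;
    }
}
```

A real implementation would also need to bring the first application interface to the foreground if it is not open when the start time arrives, or skip playback; the patent text only covers the case where the interface is already open.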
In some scenarios, the user may also need to play the recording file multiple times. To this end, referring to fig. 5 (a), the target interface further includes a repeat number setting option for the user to set the number of times the recording file is repeatedly played. The user may click the repeat number setting option; referring to fig. 5 (b), in response to the user's trigger operation on the repeat number setting option, the mobile phone displays a repeat number setting interface, so that the user can set the number of times of repeatedly playing the recording file based on the repeat number setting interface. Therefore, after the recording file starts playing, the mobile phone counts the number of repeated plays, and when the counted number of repeated plays reaches the set number, the repeated playing is stopped.
Further, referring to fig. 4 (a), the target interface may further include description information of the recording file, such as recording time, recording duration, and the like of the recording file, which is not limited in this embodiment of the application.
In some scenarios, the user may instead want the recording file to be played repeatedly for a set total duration. To this end, referring to fig. 6 (a), the target interface further includes a repeat duration setting option for the user to set the total duration of repeatedly playing the recording file. The user may click the repeat duration setting option; referring to fig. 6 (b), in response to the user's trigger operation on the repeat duration setting option, the mobile phone displays a repeat duration setting interface, so that the user can set the total duration of repeatedly playing the recording file, including settings of hours, minutes, and seconds, based on the repeat duration setting interface. Therefore, after the recording file starts playing, the mobile phone counts the playing duration, and when the playing duration reaches the set duration, the repeated playing is stopped.
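The two stop conditions described above (a set repeat count and a set total duration) can be combined into one policy object. The sketch below uses assumed names and treats a limit of 0 as "not set":

```java
// Illustrative sketch of the replay stop conditions: stop once the counted
// number of repeats reaches the set repeat count, or once the counted playing
// time reaches the set total duration. Field names are assumptions.
public class ReplayStopPolicy {
    private final int maxRepeats;      // from the repeat number setting interface; 0 = unset
    private final long maxDurationMs;  // from the repeat duration setting interface; 0 = unset

    public ReplayStopPolicy(int maxRepeats, long maxDurationMs) {
        this.maxRepeats = maxRepeats;
        this.maxDurationMs = maxDurationMs;
    }

    // Checked after each pass over the recording file.
    public boolean shouldStop(int repeatsPlayed, long elapsedMs) {
        return (maxRepeats > 0 && repeatsPlayed >= maxRepeats)
            || (maxDurationMs > 0 && elapsedMs >= maxDurationMs);
    }
}
```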
As an example of the present application, the target interface may further include a plurality of recording files, displayed in order of recording time. Referring to fig. 7 (a), when the number of recording files is large, the user may slide the icons of the recording files to the left (or right); referring to fig. 7 (b), in response to the user's left-slide operation, the mobile phone displays the next page of recording files in the target interface. Each recording file corresponds to a file name, and the user can select the recording file to be played from the plurality of recording files as required. For example, as shown in (c) of fig. 7, assuming that the recording file of the touch operations recorded in the virtual garden interface is recording file 1, the user can select recording file 1 and then perform the relevant configuration (such as the number of repetitions) on recording file 1, or can directly click the playing option, so that the mobile phone automatically executes the corresponding operations in the virtual garden interface based on recording file 1.
It should be noted that the screen recording option described above may also be used to trigger the mobile phone to perform a screen recording operation; for example, after the screen recording option is clicked, the mobile phone starts to record the screen. In another example, an operation recording option dedicated to triggering the mobile phone to start or end recording touch operations may also be provided in the pull-down notification bar, for example an "operation recording" option; in this case, the user may trigger the mobile phone to start or end recording touch operations by clicking the operation recording option. Alternatively, the operation recording option provided in the pull-down notification bar is used only to trigger the mobile phone to start recording touch operations, and an ending option is then displayed in the virtual garden interface. When the user wants to end the recording operation, the ending option can be triggered, and accordingly the mobile phone ends the recording operation. In this way, the recording operation can be ended directly in the virtual garden interface without the user opening the pull-down notification bar, which improves operation convenience and can improve user experience.
It should be noted that the application scenarios described above are only exemplary, and the application scenarios of the method provided in the embodiment of the present application are not limited, that is, in other application interfaces, when a user wants to automatically trigger a series of touch operations of the user in the other application interfaces subsequently by the electronic device, the electronic device may operate in the above manner.
Next, a software system of the electronic device according to the embodiment of the present application will be described. The software system of the electronic device may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the application takes an Android (Android) system with a layered architecture as an example to exemplarily explain a software system of an electronic device.
Fig. 8 is a block diagram of a software system of an electronic device according to an embodiment of the present disclosure. Referring to fig. 8, the layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers from top to bottom: an application layer, an application framework layer, the Android runtime (Android Runtime) and system libraries, and a kernel layer.
The application layer may include a series of application packages. As shown in fig. 8, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application programs of the application layer. The application framework layer includes a number of predefined functions. As an example of the present application, the application framework layer includes a view system, a touch event management module, an interception feedback module, an event receiving module, an event destruction module, and the like.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system can be used for constructing a display interface of an application program, and the display interface can be composed of one or more views, such as a view for displaying a short message notification icon, a view for displaying characters and a view for displaying pictures.
The touch event management module is used for distributing touch events when a touch event occurs, so as to issue the touch event to the corresponding application program for processing. In one example, the touch event management module distributes the touch event to an Activity.
The interception feedback module is used for feeding back to the touch event management module whether the application program intercepts the touch event. In one example, the interception feedback module is onInterceptTouchEvent().
The event receiving module is used for receiving the behavior data of the touch event and for being called by the application program, so that the application program can process the corresponding touch event based on the behavior data. In one example, the event receiving module is onTouchEvent().
The event destruction module is used for destroying the waste events, wherein the waste events refer to touch events which are not processed by the application program.
Further, as shown in FIG. 8, the application framework layer may include a window manager, a content provider, a phone manager, a resource manager, a notification manager, and the like. The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The content provider is used to store and retrieve data, which may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc., and makes the data accessible to applications. The telephone manager is used for providing communication functions of the electronic equipment, such as management (including connection, disconnection and the like) of call states. The resource manager provides various resources, such as localized strings, icons, pictures, layout files, video files, etc., to the application. The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a brief dwell, and does not require user interaction. For example, a notification manager is used to notify download completion, message alerts, and the like. The notification manager may also be a notification that appears in the form of a chart or scrollbar text at the top status bar of the system, such as a notification of a background running application. The notification manager may also be a notification that appears on the screen in the form of a dialog window, such as prompting a text message in a status bar, sounding a prompt tone, vibrating the electronic device, flashing an indicator light, etc.
The Android runtime comprises a core library and a virtual machine, and is responsible for scheduling and managing the Android system. The core library comprises two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system libraries may include a plurality of functional modules, such as a surface manager (Surface Manager), media libraries (Media Libraries), a three-dimensional graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL). The surface manager is used to manage the display subsystem and provides fusion of 2D and 3D layers for multiple applications. The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files, among others. The media libraries may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. The three-dimensional graphics processing library is used for implementing three-dimensional graphics drawing, image rendering, synthesis, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
The workflow of the software and hardware of the electronic device is exemplarily described below in connection with capturing a photo scene.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into an original input event (including touch coordinates, a time stamp of the touch operation, and other information). The raw input events are stored at the kernel layer. And the application program framework layer acquires the original input event from the kernel layer and identifies the control corresponding to the original input event. Taking the touch operation as a click operation, and taking a control corresponding to the click operation as a control of a camera application icon as an example, the camera application calls an interface of an application program framework layer, starts the camera application, then calls a kernel layer to start a camera drive, and captures a still image or a video through the camera 193.
The method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings. Referring to fig. 9, fig. 9 is a flowchart illustrating an operation control method according to an exemplary embodiment, by way of example and not limitation. The method is applied to the electronic device illustrated in fig. 8 and is implemented through interaction among a plurality of modules of the electronic device. The method may include some or all of the following:
step 901: in the process of displaying the first application interface of the first application program, the touch event management module receives a recording starting operation, and the recording starting operation is used for triggering the start of recording the touch operation of the user in the first application interface of the first application program.
The first application program is any application program in the electronic device. For example, the first application program may be, but is not limited to, a shopping application program (such as Taobao™), a payment application program (such as Alipay™), or a desktop program.
In one example, the first application interface is an interface of a certain service provided in the first application program. For example, for a virtual garden service provided in the first application program, a virtual garden icon may be provided in a second application interface of the first application program; the user may click on the virtual garden icon, and in response to the user's click, the electronic device displays the corresponding virtual garden interface, as shown in fig. 2 (a). In this case, the virtual garden interface is the first application interface. In one example, the electronic device may display the virtual garden interface through the view system.
In one example, an operation recording option (which may be a multiplexed screen recording option) is provided in the electronic device, and the user may cause the electronic device to start performing a recording operation by triggering the operation recording option.
For example, referring to fig. 2, while playing the virtual garden of the application B, the user is required to repeatedly perform some operations in the virtual garden interface, such as interacting with the sprouted chicken multiple times, feeding the sprouted chicken, and dressing the sprouted chicken. To avoid the need for manual repetition of these operations, the user may trigger the electronic device to record the subsequent touch operations in the virtual garden interface. To do so, the user slides down from the upper edge of the screen of the electronic device. As shown in fig. 2 (b), in response to the user's operation of sliding down from the upper edge of the screen, the electronic device displays a pull-down notification bar including a screen recording option, and the user can long press the screen recording option in the pull-down notification bar. Accordingly, the electronic device receives a recording start operation and starts the operation recording function, that is, the function of recording the touch operations of the user on the screen of the electronic device.
As an example of the present application, after the user presses the record screen option for a long time, the electronic device returns to the first application interface before displaying the drop-down notification bar, for example, please see (c) in fig. 2, and after the user presses the record screen option for a long time, the electronic device returns to the virtual garden interface.
Step 902: the touch event management module receives a touch operation of a user on the first application interface.
That is, after the operation recording function is started, the user may perform a touch operation in the first application interface of the first application program, and it is understood that, during the operation recording process, the user may perform one touch operation in the first application interface of the first application program or may perform a plurality of touch operations (for example, continuously or in a time-sharing manner). Accordingly, the touch event management module of the electronic device receives each touch operation.
In one possible scenario, after the user touches a control in the first application interface, the user may switch to a third application interface of the first application program. For example, referring to fig. 10 (a), a "fertilizer" icon is provided in the virtual garden interface, and when the user clicks the "fertilizer" icon, the mobile phone displays the virtual farm interface of the application B in response to the clicking operation of the user, as shown in fig. 10 (B). After switching to the third application interface, the user may continue to perform some touch operations in the third application interface, in which case the touch event management module receives each touch operation of the user on the third application interface.
Referring to fig. 11, in the case that the touch operation is multiple, the operation types of the multiple touch operations may generally include one or more of a click operation, a slide operation, and a long press operation. In fig. 11, (a) indicates a click operation, (b) indicates a slide operation, and (c) indicates a long press operation.
Step 903: and the touch event management module records the behavior data of the touch operation.
When a touch operation is received on a first application interface of a first application program, a touch event management module records behavior data corresponding to the touch operation, namely, executes an operation recording function.
In one example, if the first application interface is switched to a third application interface after a certain touch operation is performed in the first application interface, and the user continues to perform the touch operation in the third application interface, the touch event management module records behavior data of each touch operation in the third application interface.
The behavior data of the touch operation is different according to different operation types of the touch operation. Specifically, the following cases are included:
In the first case: when the touch operation is a click operation, the behavior data of the click operation comprises the click position coordinate and the click time of the click operation. The click position coordinate is used for representing the position at which the click operation acts on the first application interface of the first application program, and the click time is used for representing how long after the start of the operation recording the click operation occurs. For example, a click position coordinate of (x, y) = (12.1, 22.5) means that the click operation acts on the position (12.1, 22.5) of the first application interface.
In one example, if the first application interface of the first application program is displayed in a full screen, that is, the display size of the first application interface of the first application program is the same as the screen size of the electronic device, the click position coordinate is the coordinate of the position point on the screen acted by the click operation.
In another example, if the first application interface of the first application program is not displayed in a full screen, that is, the display size of the first application interface of the first application program is smaller than the screen size of the electronic device, for example, in a tablet computer, the first application interface of the first application program only occupies a portion of the tablet screen. In this case, the click position coordinate may be determined according to the relative position of the first application interface and the screen and the coordinate of the position point on the screen where the click operation acts. For example, the coordinates of the position point acted on by the click operation on the screen can be determined, and then the determined coordinates are subjected to coordinate system conversion according to the relative position of the first application interface and the screen so as to be converted into the coordinate system of the first application interface, so that the click position coordinates of the click operation on the first application interface of the first application program can be determined.
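The coordinate-system conversion for the non-full-screen case can be sketched as a simple translation by the interface's top-left offset on the screen. This is a minimal sketch under the assumption that the interface is axis-aligned and unscaled; the class and field names are illustrative, not from the source:

```java
// Converts a screen-space touch position into the coordinate system of the
// first application interface, given the interface's top-left corner offset
// on the screen. For a full-screen interface the offset is (0, 0) and the
// coordinates pass through unchanged.
public class ClickPositionMapper {
    private final double interfaceLeft;
    private final double interfaceTop;

    public ClickPositionMapper(double interfaceLeft, double interfaceTop) {
        this.interfaceLeft = interfaceLeft;
        this.interfaceTop = interfaceTop;
    }

    // Returns {x, y} in the application interface's own coordinate system.
    public double[] toInterfaceCoordinates(double screenX, double screenY) {
        return new double[] { screenX - interfaceLeft, screenY - interfaceTop };
    }
}
```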
In the second case: when the touch operation is a slide operation, the behavior data of the slide operation includes the slide start position coordinate, the slide end position coordinate, and the slide start time of the slide operation. The slide start position coordinate is used for representing the start position at which the slide operation acts on the first application interface of the first application program, the slide end position coordinate is used for representing the end position at which the slide operation acts on the first application interface of the first application program, and the slide start time is used for representing how long after the start of the operation recording the slide operation occurs. For example, when the slide start position coordinate of the slide operation is (x1, y1) = (4.7, 6.5), the slide end position coordinate is (x2, y2) = (4.9, 11.0), and the slide start time is 02:10, the slide operation is performed from the position (4.7, 6.5) to the position (4.9, 11.0) of the first application interface, and the slide operation occurs at the 2nd minute 10 seconds after the start of the operation recording.
In one example, if the first application interface of the first application program is displayed in a full screen, that is, the display size of the first application interface of the first application program is the same as the screen size of the electronic device, the sliding start position coordinate is the coordinate of the position point acting on the screen when the sliding starts, and the sliding end position coordinate is the coordinate of the position point acting on the screen when the sliding ends.
In another example, if the first application interface of the first application program is not displayed in a full screen, the sliding start position coordinate may be determined according to the relative position of the first application interface and the screen and the coordinate of the position point acting on the screen when the sliding starts. Similarly, the coordinates of the sliding end position can be determined according to the relative position of the first application interface and the screen and the coordinates of the position point acting on the screen when the sliding is finished.
In the third case: when the touch operation is a long-press operation, the behavior data of the long-press operation includes the long-press position coordinate, the long-press start time, and the long-press duration of the long-press operation. The long-press position coordinate is used for representing the position at which the long-press operation acts on the first application interface of the first application program, the long-press start time is used for representing how long after the start of the operation recording the long-press operation occurs, and the long-press duration is used for representing how long the long-press operation acts on the screen. For example, a long-press position coordinate of (x, y) = (9.1, 2.4) means that the long-press operation acts on the position (9.1, 2.4) of the first application interface.
In one example, if the first application interface of the first application program is displayed in a full screen mode, that is, the display size of the first application interface of the first application program is the same as the screen size of the electronic device, the long-press position coordinate is the coordinate of the position point acted on the screen by the long-press operation. In another example, if the first application interface of the first application program is not displayed in full screen, the long press position coordinate may be determined according to the relative position of the application interface and the screen and the coordinate of the position point on the screen acted on by the long press operation.
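The three kinds of behavior data described above can be sketched as a single record type with per-type factory methods. The field and method names below are assumptions for illustration; all times are offsets, in milliseconds, from the start of the operation recording:

```java
// Minimal sketch of the recorded behavior data for the three operation types:
// click (position + time), slide (start/end positions + start time), and
// long press (position + start time + duration).
public class BehaviorData {
    public enum Type { CLICK, SLIDE, LONG_PRESS }

    public final Type type;
    public final double x, y;        // click/long-press position, or slide start position
    public final double endX, endY;  // slide end position (only meaningful for SLIDE)
    public final long startTimeMs;   // click time / slide start time / long-press start time
    public final long durationMs;    // long-press duration (only meaningful for LONG_PRESS)

    private BehaviorData(Type type, double x, double y, double endX, double endY,
                         long startTimeMs, long durationMs) {
        this.type = type; this.x = x; this.y = y;
        this.endX = endX; this.endY = endY;
        this.startTimeMs = startTimeMs; this.durationMs = durationMs;
    }

    public static BehaviorData click(double x, double y, long startTimeMs) {
        return new BehaviorData(Type.CLICK, x, y, 0, 0, startTimeMs, 0);
    }

    public static BehaviorData slide(double x1, double y1, double x2, double y2,
                                     long startTimeMs) {
        return new BehaviorData(Type.SLIDE, x1, y1, x2, y2, startTimeMs, 0);
    }

    public static BehaviorData longPress(double x, double y, long startTimeMs,
                                         long durationMs) {
        return new BehaviorData(Type.LONG_PRESS, x, y, 0, 0, startTimeMs, durationMs);
    }
}
```

During playback, a list of such records sorted by `startTimeMs` gives the execution order of the recorded touch operations.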
Step 904: the touch event management module responds to the touch operation.
That is, in the process of recording the behavior data of the touch operation of the user on the first application interface of the first application program, the electronic device normally responds to each touch operation of the user. In one example, if the recording process is switched from the first application interface to the third application interface and the user continues to perform the touch operation in the third application interface, the electronic device will normally respond to each touch operation in the third application interface by the user. Therefore, the normal response of the operation can be prevented from being influenced by the execution of the operation recording function.
In implementation, the touch event management module issues a touch event corresponding to the touch operation. When the touch event is issued, traversal is performed starting from the top view. As shown in fig. 12, fig. 12 is a flowchart of a touch event delivery mechanism according to an exemplary embodiment. If the application program to which the top view (which may also be referred to as the top parent view) belongs intercepts the touch event, the behavior data of the touch event is delivered to that application program for processing; if it does not intercept the touch event, traversal continues to the next view (which may be referred to as a parent view). If the application program to which the parent view belongs intercepts the touch event, the behavior data of the touch event is delivered to that application program for processing; if not, traversal continues to the view of the next layer (which may be referred to as a child view). If the application program to which the child view belongs intercepts the touch event, the behavior data of the touch event is delivered to that application program for processing; if the application program to which the child view belongs does not intercept the touch event and no other views exist, the touch event is destroyed.
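The delivery mechanism above can be sketched as a top-down loop over the currently opened views, where each view's owning application is asked whether it intercepts the event. The names below are illustrative; in Android the corresponding hooks are onInterceptTouchEvent() and onTouchEvent():

```java
import java.util.List;
import java.util.function.Predicate;

// Sketch of the traversal described above: views are visited from the top view
// downward; the first view whose application intercepts the touch event
// receives its behavior data, and an event intercepted by no view is destroyed
// as a waste event.
public class TouchEventDispatcher {
    // Each predicate plays the role of the interception feedback module
    // (true = the view's application intercepts the event). Returns the index
    // of the intercepting view, or -1 if the event is destroyed.
    public static int dispatch(List<Predicate<long[]>> viewsTopToBottom, long[] event) {
        for (int i = 0; i < viewsTopToBottom.size(); i++) {
            if (viewsTopToBottom.get(i).test(event)) {
                return i; // behavior data delivered to this view's application
            }
        }
        return -1; // no view intercepted: the event destruction module discards it
    }
}
```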
Specifically, referring to fig. 13, each time one touch operation of the user is received, responding to the one touch operation may include the following sub-steps 9041 to 9046:
9041: and the touch event management module performs traversal according to the display sequence of the currently opened views from top to bottom.
It is understood that a user may open a plurality of application interfaces during the use of the electronic device, for example, the user opens the application program a, and in response to the opening operation of the application program a by the user, the electronic device opens the application interface v1 of the application program a; when the user clicks the option for opening the application interface v2 of the application program A in the application interface v1, responding to the clicking operation of the user, and the electronic equipment opens the application interface v2 of the application program A; then, when the user opens the first application program, in response to the opening operation of the user on the first application program, the electronic equipment opens a second application interface v3 of the first application program; when the user opens the virtual garden in the second application interface v3 of the first application, the electronic device opens the virtual garden interface v4. Each application interface can be understood as a view, so that the views currently opened by the electronic device are an application interface v4, an application interface v3, an application interface v2 and an application interface v1 from top to bottom, and the application programs to which the application interfaces belong are a first application program, an application program A and an application program A respectively. It can be seen that the application that was opened first is displayed on the lowest layer and the application that was opened last is displayed on the top layer. In order to determine which view corresponds to the application program to process the touch operation, the touch event management module performs traversal according to a display order from top to bottom.
9042: each time a view is traversed, the touch event management module queries whether the application program to which the currently traversed view belongs intercepts the touch operation.
As an example of the present application, when a user performs a touch operation on an application interface, the touch event management module obtains behavior data and event attribute information of the touch operation. The event attribute information includes, in addition to a part of the behavior data, information such as the event type (e.g., the click type or the slide type); that is, there is an intersection between the event attribute information and the behavior data. The part of the behavior data included in the event attribute information may include, but is not limited to, the position coordinates. In the traversing process, each time a view is traversed, the touch event management module sends the event attribute information of the touch operation to the application program to which the currently traversed view belongs. In one example, the application program includes a view module, through which the application program to which the currently traversed view belongs receives the event attribute information sent by the touch event management module, and then determines whether to intercept the touch operation according to the event attribute information. If the application program to which the currently traversed view belongs needs to process the touch event, it intercepts the touch operation; otherwise, if it does not need to process the touch operation, it does not intercept the touch operation.
If the application program to which the currently traversed view belongs determines to intercept the touch operation, it calls the interception feedback module, which notifies the touch event management module that the touch operation is intercepted, for example by returning true to the touch event management module. If the application program to which the currently traversed view belongs determines not to intercept the touch operation, it likewise calls the interception feedback module, which notifies the touch event management module that the touch operation is not intercepted, for example by returning false. In this way, the touch event management module can determine whether the application program to which the currently traversed view belongs intercepts the touch operation.
Under the condition that the application program to which the currently traversed view belongs does not intercept the touch operation, the following operations of 9043 to 9045 are carried out; in the case where the application program to which the currently traversed view belongs intercepts the one touch operation, the following operation of 9046 is entered.
9043: The touch event management module determines whether any untraversed view exists.
Since the number of currently opened views is limited, the touch event management module queries, when the application program to which the currently traversed view belongs does not intercept the touch operation, whether any untraversed view exists. If an untraversed view exists, the touch event management module performs operation 9044 below; otherwise, it performs operation 9045 below.
9044: the touch event management module continues to traverse the next view.
If the application program to which the currently traversed view belongs does not intercept the touch event and an untraversed view exists, the touch event management module continues to traverse the next view to determine whether the application program to which that view belongs will process the touch event. That is, after moving on to the next view, the touch event management module determines, according to operation 9042, whether the application program to which the newly traversed view belongs intercepts the touch operation.
9045: The touch event management module destroys the touch event.
If the application program to which the currently traversed view belongs does not intercept the touch event and no untraversed view exists, this indicates that no currently opened application program processes the touch event. For example, a touch operation may land on a border of the first application interface.
In one example, the touch event management module sends data (e.g., including behavior data and event attribute information) related to the touch event to the event destruction module, which destroys the touch event.
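The top-down traversal of operations 9042 to 9045 can be sketched as a small, framework-independent model (hypothetical names; in the real flow interception depends on the event attribute information, which is reduced here to a boolean feedback result):

```java
import java.util.List;
import java.util.Optional;

// Minimal sketch of the top-down traversal (steps 9042-9045).
public class TouchEventManager {
    // One currently opened view: the application it belongs to, and the
    // interception-feedback result (true/false) that application would return.
    public record View(String owner, boolean intercepts) {}

    // Walk the views from top to bottom; return the owner of the first view
    // whose application intercepts the event (9046), or empty if no view
    // intercepts it, in which case the event is destroyed (9045).
    public static Optional<String> dispatch(List<View> topToBottom) {
        for (View view : topToBottom) {          // 9042: query each view in order
            if (view.intercepts()) {
                return Optional.of(view.owner());
            }                                    // 9043/9044: continue traversal
        }
        return Optional.empty();
    }
}
```

With the view stack of the earlier example (v4 and v3 belonging to the first application program above v2 and v1 of application program A), the first view whose application intercepts the event wins, and an event nobody intercepts falls through to destruction.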
9046: the touch event management module sends the behavior data of the touch operation to the event receiving module.
The event receiving module provides a callable interface for the application program, so that after the touch event management module sends the behavior data of the touch operation to the event receiving module, the application program to which the currently traversed view belongs can acquire the behavior data of the touch event from the event receiving module and then process the touch operation based on that behavior data, that is, consume the touch operation.
For example, referring to fig. 14, assume that the touch operation is a click on a control. During traversal, the touch event processing module delivers the touch event to the view in which the parent control of the clicked control is located (referred to as the parent view) through dispatchTouchEvent(), and the parent view's dispatchTouchEvent() calls ViewGroup.onInterceptTouchEvent() to determine whether to intercept the event. If intercepted, the touch event is no longer passed to the child views, and the parent view processes it. If not intercepted, the touch event continues to be delivered downward: the child views are traversed to find the child view in which the clicked control is located, and if that child view is found, its dispatchTouchEvent() is called, thereby delivering the touch event. Otherwise, if the child view in which the clicked control is located cannot be found, the touch event may be destroyed by the touch event processing module itself, or by the parent view; for example, the parent view may call its onClick() for destruction. By way of example and not limitation, the touch event processing module may reach onClick() through the onTouch() -> onTouchEvent() -> performClick() -> onClick() path.
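The parent/child delivery described above can be modeled outside of Android as follows. This is an illustrative sketch, not the framework's actual implementation: `DispatchSketch`, `Child` and `Rect` are hypothetical names, and hit-testing is reduced to a bounds check.

```java
import java.util.List;

// Illustrative model of parent/child dispatch: if the parent intercepts, it
// consumes the event; otherwise the event is passed to the child whose bounds
// contain the click; if no child matches, the event is destroyed.
public class DispatchSketch {
    public record Rect(float left, float top, float right, float bottom) {
        boolean contains(float x, float y) {
            return x >= left && x < right && y >= top && y < bottom;
        }
    }
    public record Child(String name, Rect bounds) {}

    /** Returns the consumer of a click at (x, y), or "destroyed" if nobody handles it. */
    public static String dispatchTouchEvent(boolean parentIntercepts,
                                            List<Child> children, float x, float y) {
        if (parentIntercepts) {
            return "parent";             // onInterceptTouchEvent() returned true
        }
        for (Child c : children) {       // traverse child views for the clicked control
            if (c.bounds().contains(x, y)) {
                return c.name();         // the child's dispatchTouchEvent() is called
            }
        }
        return "destroyed";              // no consumer found -> event destroyed
    }
}
```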
At this point, the operation responding to the touch operation of the user on the first application interface of the first application program is completed.
As shown in fig. 15, after recording is started, when a touch operation of the user on the first application interface of the first application program is received, the event is issued to the bottom layer normally to respond to the touch operation, and the behavior data of the touch operation is recorded, so that recording does not affect the electronic device's normal response to the touch operation.
It should be noted that step 904 is an optional operation in the embodiment of the present application.
It should be further noted that step 904 and step 903 do not have a strict sequential execution order, and in an example, step 904 and step 903 may be executed in parallel.
Step 905: and the touch event management module receives a recording end instruction.
In one example, referring to fig. 3 (a), when the user wants to end the recording function, the user may slide down from the top of the electronic device; in response to the sliding operation, the electronic device displays a pull-down notification bar, and the user may then long-press the screen recording option in the pull-down notification bar. In response to the user's long-press operation on the screen recording option in the pull-down notification bar, the touch event management module determines that a recording end instruction has been received, at which point it may be determined that recording has ended.
The above description takes determining that recording has ended upon receiving a recording end instruction as an example. In another embodiment, a preset recording duration may be set in the electronic device, so that after the recording start operation is received, the recording duration is timed, for example by starting a timer configured in the electronic device. When the recording duration reaches the preset recording duration, it is determined that recording has ended. The preset recording duration may be set according to actual requirements, for example 5 minutes, which is not limited in the embodiments of the present application.
Step 906: and under the condition that the recording is finished, the touch event management module generates a recording file, wherein the recording file comprises the recorded behavior data of at least one touch operation.
And under the condition that the recording is finished, the touch event management module generates a recording file based on the recorded behavior data of each touch operation in at least one touch operation.
In an example, referring to fig. 16, the recording file includes behavior data of three touch operations, namely a click operation, a slide operation and a long-press operation; that is, after recording is started, the user performs the click operation, the slide operation and the long-press operation in sequence in the first application interface of the first application program. The click event corresponding to the click operation occurs at 1 minute 10 seconds after recording starts, and the coordinates of the click position are (12.1, 22.5); the slide event corresponding to the slide operation occurs at 2 minutes 10 seconds after recording starts, the coordinates of the slide start position are (4.7, 6.5), and the coordinates of the slide end position are (4.9, 11.0); the long-press event corresponding to the long-press operation occurs at 5 minutes 29 seconds after recording starts, the coordinates of the long-press position are (9.1, 2.4), and the long-press duration is 2.1 seconds.
That is, as shown in fig. 17, in the process of processing the touch operation, all touch operations are recorded to obtain a recorded file. In one example, an event list may be established in the recording file, and the event list includes a corresponding relationship between each touch operation and its start time, so that when the recording file is played later, the execution sequence of each touch operation may be determined based on the event list.
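The event list described above can be sketched as follows (hypothetical `RecordingFile`/`Entry` types; the real recording file also stores coordinates, press durations and other behavior data): each entry keeps a touch-operation type and its start time relative to the start of recording, and playback recovers the execution order by sorting on start time.

```java
import java.util.Comparator;
import java.util.List;

// Hypothetical shape of the event list in the recording file.
public class RecordingFile {
    public record Entry(String type, long startMillis) {}

    // Execution order used during playback: ascending start time.
    public static List<String> executionOrder(List<Entry> entries) {
        return entries.stream()
                .sorted(Comparator.comparingLong(Entry::startMillis))
                .map(Entry::type)
                .toList();
    }
}
```

With the fig. 16 example (click at 1 min 10 s, slide at 2 min 10 s, long press at 5 min 29 s), sorting yields the order: click, slide, long press.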
It should be noted that, generating the recording file when the recording is finished is an optional operation in the embodiment of the present application.
Step 907: and the touch event management module displays a target interface, wherein the target interface comprises a recording file.
As an example of the present application, please refer to fig. 3 (b), where the target interface includes the recording file and a play option corresponding to the recording file, and the play option is used to trigger the electronic device to automatically execute a corresponding operation based on behavior data in the recording file.
As an example of the present application, please refer to fig. 4, fig. 5, or fig. 6, the target interface includes description information of the recording file, for example, the description information includes recording time, recording duration, and the like. Further, the file name of the recording file may also be included.
In one example, the target interface may further include configuration options related to playing the recording file, based on which the user may perform configuration operations. For example, as shown in fig. 4, the target interface includes, but is not limited to, a repeat count setting option and a repeat duration setting option. The repeat count setting option is used to set the number of times the recording file is played repeatedly; for example, the count may be set to a target count, which may be set according to actual requirements, such as 3 times. The repeat duration setting option is used to set the total duration for which the recording file is played repeatedly; for example, the duration may be set to a target duration, which may likewise be set according to actual requirements, such as 15 minutes. It can be understood that, when setting the target duration, the user may refer to the recording duration of the recording file; by way of example and not limitation, the target duration may be set to an integral multiple of the recording duration.
In one example, the target interface may further include a start time setting option for setting the time at which playback of the recording file starts, for example setting the start time to 08:00 every day from July 1, 2022 to July 5, 2022.
It should be noted that step 907 is an optional operation in the embodiment of the present application. In another example, the target interface may not be displayed automatically; instead, the user may manually open the target interface containing the recording file. For example, after the touch event management module generates the recording file, the recording file may be stored under a specified path, so that the user can open the target interface where the recording file is located based on that path, where the specified path may be set according to actual requirements. Or, in another embodiment, the touch event management module directly performs the following operation of step 908, or performs it after a second preset time period elapses. The second preset time period may be set according to actual requirements, which is not limited in the embodiments of the present application.
Step 908: and the touch event management module receives the playing operation of the recorded file in the target interface.
The playing operation is used for indicating that the at least one touch operation is automatically triggered in the first application program based on the recording file.
In an example, as shown in fig. 4, 5, or 6, in a case that the target interface includes a play option, when the user wants the electronic device to automatically perform an operation corresponding to the at least one touch operation in the first application interface of the first application based on the recording file, the play option corresponding to the recording file may be clicked, that is, the play operation is triggered.
Step 909: In response to the play operation, when the first application interface is opened and it is determined, based on the behavior data of the second touch operation, that the execution order of the second touch operation has been reached, the touch event management module traverses the currently opened views in display order from top to bottom.
The second touch operation is any one of at least one touch operation.
To respond to the corresponding touch operations based on the behavior data in the recording file when a play operation for the recording file is received, whether the first application program is in a running state may be detected, where the running state of the first application program includes running in the foreground or running in the background. When the first application program is running and the first application interface is opened, the at least one touch operation can be responded to sequentially in the first application program, in the execution order of the at least one touch operation, based on the behavior data of the at least one touch operation in the recording file; in this case, the electronic device performs the operations of steps 909 to 915.
In step 909, in response to the play operation and with the first application interface opened, the touch event management module may call the recording file. In one example, as shown in fig. 17, the touch event management module may determine the execution order of each touch operation according to the event list in the recording file, and then execute the corresponding events sequentially in that order. For example, referring to fig. 16, since the start times of the touch operations recorded in the recording file are 1 minute 10 seconds, 2 minutes 10 seconds and 5 minutes 29 seconds in sequence, the execution order of the touch operations is: click operation, slide operation, long-press operation. For any one of the multiple touch operations (namely, the second touch operation), the electronic device may issue the second touch operation through the touch event management module to query which application program will process it; to this end, the touch event management module traverses the currently opened views in display order from top to bottom (a view opened later lies above a view opened earlier).
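To preserve the recorded timing during playback, a scheduler can derive, from the sorted start times, how long to wait before injecting each event. This is a hypothetical sketch (`ReplayScheduler` is an illustrative name; the patent does not specify how the timing is implemented):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical replay timing: convert absolute start times (ms since recording
// began, assumed sorted ascending) into waits before injecting each event.
public class ReplayScheduler {
    public static List<Long> delaysBetweenEvents(List<Long> startTimes) {
        List<Long> delays = new ArrayList<>();
        long previous = 0;
        for (long t : startTimes) {
            delays.add(t - previous);  // wait this long before the next injection
            previous = t;
        }
        return delays;
    }
}
```

For the fig. 16 example, the click is injected 70 s after playback starts, the slide 60 s after the click, and the long press 199 s after the slide.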
Step 910: and when a view is traversed, the touch event management module sends the event attribute information of the second touch event to the application program to which the currently traversed view belongs.
The event attribute information of the second touch event includes, in addition to a part of the behavior data of the second touch event, information such as an event type (e.g., a click type and a slide type) of the second touch event, that is, an intersection exists between the event attribute information and the behavior data. The partial behavior data included in the event attribute information of the second touch event may include, but is not limited to, position coordinates and the like.
Each time a view is traversed, the touch event management module sends the event attribute information of the second touch event to the application program to which the currently traversed view belongs, to determine whether that application program processes the second touch event. It can be understood that, since the second touch operation is a touch operation performed by the user in the first application program during recording, the first application program will usually process the second touch operation; therefore, sending the event attribute information of the second touch event to the application program to which the currently traversed view belongs amounts to querying whether that application program is the first application program.
Step 911: and the application program sends first indication information to the interception feedback module under the condition of determining to intercept the second touch event according to the event attribute information of the second touch event, wherein the first indication information is used for indicating to intercept the second touch event.
In a possible case, if the application program to which the currently traversed view belongs is the first application program, the first application program determines to intercept the second touch event according to event attribute information of the second touch event, and at this time, the first application program may send first indication information to the interception feedback module, for example, the first indication information is true.
In another possible case, if the application program to which the currently traversed view belongs is not the first application program, the application program to which the currently traversed view belongs determines not to intercept the second touch event after receiving the event attribute information of the second touch event, and in this case, the application program to which the currently traversed view belongs sends second indication information to the interception feedback module, where the second indication information is used to indicate that the second touch event is not intercepted. For example, the second indication is false. In this case, the touch event management module continues to traverse the next view.
Step 912: and the interception feedback module feeds back the first indication information to the touch event management module.
That is, the interception feedback module notifies the touch event management module of the application program belonging to the currently traversed view to intercept the second touch event. In this way, after receiving the first indication information, the touch event management module may determine that the application to which the currently traversed view belongs is the first application.
Step 913: and the touch event management module sends the behavior data of the second touch operation to the event receiving module.
Step 914: the application program acquires the behavior data of the second touch event from the event receiving module.
It is understood that the application program at this time is the first application program, that is, the first application program obtains the behavior data of the second touch event from the event receiving module.
Step 915: the application responds to the second touch event based on the behavior data of the second touch event.
For example, referring to fig. 3 (d), in the case that the second touch operation is a click operation, the first application program may determine, according to the behavior data of the second touch operation, that the pet chick is clicked at this time; the first application program then increases the number of interactions with the pet chick and displays an interactive session with it. In this way, each time the execution order of one of the at least one touch operation is reached, that touch operation can be responded to according to the above flow.
The electronic device responds to the at least one touch operation in sequence according to the above flow. In one example, if the touch operations involved in the recording file further include an operation of switching from the first application interface to a third application interface and touch operations in the third application interface, then during playback of the recording file, after the touch events are issued according to the above flow, the first application interface may be switched to the third application interface automatically, and the events corresponding to the touch operations are executed in the third application interface.
In one example, the at least one touch operation includes a first touch operation for invoking a second application program; for example, the first application interface of the first application program includes a link address capable of invoking the second application program. When the first touch operation is a click operation on the link address, the touch event management module queries whether the second application program is in a running state; if the second application program is not running, it may be started. If the second application program is already running, no other operation may be performed, or an application interface of the second application program may be displayed.
In an example, referring to fig. 18, the touch event processing module may include an application state listening module, which is configured to monitor the running state of application programs. For example, when an application program is started, it may register with the application state listening module to notify it that the application has entered the running state, and may carry an application identifier during registration so that the application state listening module knows which application program it is. When the application program is closed, it may unregister from the application state listening module to notify it that the application has entered the closed state, again carrying the application identifier. In this way, when the second application program is invoked based on the recording file, the touch event management module may query the application state listening module for the state of the second application program. In one example, the application state listening module is ActivityManager().
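The register/unregister bookkeeping described above can be modeled with a simple registry. This is a sketch with hypothetical names (`AppStateMonitor`), not the actual ActivityManager implementation:

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical model of the application state listening module: applications
// register their identifier when started and unregister when closed, so the
// touch event management module can query whether an application is running.
public class AppStateMonitor {
    private final Set<String> running = new HashSet<>();

    public void register(String appId)   { running.add(appId); }    // app started
    public void unregister(String appId) { running.remove(appId); } // app closed
    public boolean isRunning(String appId) { return running.contains(appId); }
}
```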
Further, if a touch operation is performed in the second application program after the second application program is started, the touch operation in the second application program is responded by the second application program in the process of playing the recording file.
As an example of the present application, please refer to fig. 19. While the at least one touch operation is being responded to in sequence according to the recording file, the user may also perform a touch operation in the application interface of a third application program. In this case, the electronic device can still issue the touch events corresponding to the respective touch operations separately, so as to respond normally both to the user's touch operation in the application interface of the third application program and to each of the at least one touch operation. That is, performing the automatic response to the at least one touch operation based on the recording file does not affect the user's other touch operations on the electronic device. The third application program is any application program in the electronic device; in one example, the third application program is the same application program as the first application program.
That is, if, while the at least one touch operation is being responded to sequentially in the first application program in the execution order of the at least one touch operation based on its behavior data, a touch operation of the user in the application interface of the third application program is received, that touch operation of the user is responded to in the third application program. For example, referring to fig. 20, during the automatic response to the at least one touch operation based on the recording file, the user performs a click operation in the application interface of the third application program at 4 minutes 11 seconds after playback of the recording file starts, and the click position coordinates of the click operation are (5.5, 7.0). After receiving the click operation, the touch event management module may issue the click event corresponding to the click operation according to the event issuing flow, so as to send the behavior data of the click operation to the third application program, and the third application program responds to the click operation based on that behavior data.
As an example of the present application, as shown in fig. 17, when the user has further set a repeat play count for the recording file based on the target interface, after the current round of response is finished, that is, after the response based on the behavior data of the touch operation that is last in the execution order in the recording file is finished, the repeat play count may be counted, where the repeat play count is the number of times the recording file has been played repeatedly. If the repeat play count has not reached the target count, the at least one touch operation continues to be responded to sequentially in the first application program in the execution order of the at least one touch operation based on the behavior data of the at least one touch operation in the recording file. If the repeat play count has reached the target count, it is determined that the response is finished.

As an example of the present application, when the user has further set a repeat play duration for the recording file based on the target interface, after the current round of response is finished, that is, after the response based on the behavior data of the touch operation that is last in the execution order in the recording file is finished, the total repeat play duration may be counted, where the total repeat play duration is the total time for which the recording file has been played repeatedly. If the total repeat play duration has not reached the target duration, the at least one touch operation continues to be responded to sequentially in the first application program in the execution order of the at least one touch operation based on the behavior data of the at least one touch operation in the recording file. If the total repeat play duration has reached the target duration, it is determined that the response is finished.
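The two stop conditions above (target count, target duration) can be combined into one check performed after each round of playback. A sketch under stated assumptions: `RepeatPolicy` and its nullable parameters are hypothetical (a null target means the user did not configure that option).

```java
// Hypothetical stop condition for repeated playback: replay another round only
// while the repeat count has not reached the target count AND the accumulated
// playback time has not reached the target duration (unset limits are null).
public class RepeatPolicy {
    public static boolean shouldReplay(int playedRounds, Integer targetCount,
                                       long playedMillis, Long targetMillis) {
        if (targetCount != null && playedRounds >= targetCount) return false;
        if (targetMillis != null && playedMillis >= targetMillis) return false;
        return true;
    }
}
```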
It should be noted that, in this embodiment, when a play operation of the user on the recording file in the target interface is received, if the first application interface is already opened, the at least one touch operation is responded to sequentially in the first application program in the execution order of the at least one touch operation based on the recording file. In another example, in the case that a start time for playing the recording file has been set, when it is detected that the current time has reached that start time, if the first application interface is opened, the electronic device responds to the at least one touch operation sequentially in the first application program in the execution order of the at least one touch operation based on the recording file. For example, if the start time for playing the recording file is set to 08:00 every day from July 1, 2022 to July 5, 2022, then at eight o'clock each morning from July 1 to July 5, 2022, if the first application interface is opened, the electronic device automatically responds to the at least one touch operation in sequence in the first application program in the execution order of the at least one touch operation based on the recording file. In yet another example, after the recording file is generated and in the case that the first application interface is opened, the electronic device may also automatically respond to the at least one touch operation in sequence in the first application program in the execution order of the at least one touch operation based on the recording file. The embodiments of the present application do not limit this.
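The scheduled-start example ("08:00 every day from July 1 to July 5, 2022") can be expressed with `java.time`. A hypothetical helper (`StartSchedule` is an illustrative name), computing the set of start instants the electronic device would watch for:

```java
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.LocalTime;
import java.util.ArrayList;
import java.util.List;

// Hypothetical: enumerate the scheduled playback start instants for a daily
// start time over an inclusive date range.
public class StartSchedule {
    public static List<LocalDateTime> dailyStarts(LocalDate from, LocalDate to, LocalTime at) {
        List<LocalDateTime> starts = new ArrayList<>();
        for (LocalDate d = from; !d.isAfter(to); d = d.plusDays(1)) {
            starts.add(LocalDateTime.of(d, at));  // one start per day in the range
        }
        return starts;
    }
}
```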
It should be noted that, if the first application interface is not opened, the electronic device may not perform any operation. In another example, in a case that the first application interface is not opened, the electronic device may further display a prompt window on the desktop, where prompt information is displayed in the prompt window, and the prompt information is used for prompting a user whether to start the first application program and open the first application interface, for example, the prompt information is "please confirm whether to start the first application program and open the first application interface". In addition, the prompt window may further include a confirmation option and a cancel option, the confirmation option may be triggered when the user wants to start the first application program and open the first application interface, and in response to a triggering operation of the confirmation option by the user, the electronic device starts the first application program and opens the first application interface, and then sequentially responds to at least one touch operation according to an execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation in the recording file. Otherwise, if the user does not want to start the first application program and open the first application interface, the cancel option may be triggered, and in response to the user triggering the cancel option, the electronic device closes the prompt window, that is, does not perform other operations related to the recording file.
In the embodiment of the application, after receiving the recording start operation, behavior data of touch operation of a user on a first application interface of a first application program is recorded. And after the recording is finished, under the condition that the first application interface is opened, sequentially responding to at least one touch operation according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation. Therefore, by recording the behavior data of the touch operation of the user and automatically and repeatedly executing the event corresponding to at least one touch operation of the user in the first application interface according to the recorded behavior data, the requirement for manual repeated operation of the user is avoided, the problem that the operation efficiency is reduced due to fatigue of the user is avoided, and the task execution effect can be improved.
The electronic device provided by the embodiment of the application can be, but is not limited to, a mobile phone, a tablet computer, a portable computer, and the like, and the electronic device can support touch operation or touch screen operation. The electronic device is capable of installing applications, such as including the first application, the second application, the third application, and so on, as described above. Fig. 21 is a schematic structural diagram of an electronic device according to an embodiment of the present application. Referring to fig. 21, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may serve as a neural center and a command center of the electronic device 100. The controller generates an operation control signal according to an instruction operation code and a timing signal, to control instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it may invoke them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces, such as an inter-integrated circuit (I2C) interface, an inter-integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, and the like.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device 100 through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is an integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function, for example, saving files of music, video, and the like in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications and data processing of the electronic device 100 by running the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (such as audio data and a phone book) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting thereon or nearby. The touch sensor 180K may pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
In the above embodiments, the implementation may be realized wholly or partly by software, hardware, firmware, or any combination thereof. When implemented in software, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of this application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (for example, infrared, radio, or microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or a data center integrating one or more available media. The available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a digital versatile disc (DVD)), a semiconductor medium (for example, a solid state disk (SSD)), or the like.
The above description is not intended to limit the present application to the particular embodiments disclosed, but rather, the present application is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present application.

Claims (15)

1. A method for operation control, applied to an electronic device, the method comprising:
receiving a recording starting operation, wherein the recording starting operation is used for triggering the start of recording the touch operation of a user on a screen of the electronic equipment;
recording behavior data of touch operation of the user in a first application interface of a first application program;
and under the condition that the recording is finished, if the first application interface is opened, sequentially responding to at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the recorded behavior data of the at least one touch operation.
2. The method according to claim 1, wherein in the case that the recording is finished, if the first application interface is opened, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the recorded behavior data of the at least one touch operation comprises:
under the condition that the recording is finished, generating a recording file, wherein the recording file comprises behavior data of the at least one touch operation;
and under the condition that the first application interface is opened, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation on the basis of the behavior data of the at least one touch operation in the recording file.
3. The method of claim 2, wherein sequentially responding to the at least one touch operation in the first application interface in the order of execution of the at least one touch operation based on the behavior data of the at least one touch operation in the recording file in the case that the first application interface has been opened comprises:
displaying a target interface, wherein the target interface comprises the recording file;
responding to a playing instruction of the recording file in the target interface, and sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation in the recording file under the condition that the first application interface is opened.
4. The method of claim 3, wherein the responding to a playing instruction of the recording file in the target interface and, in a case that the first application interface is opened, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation in the recording file comprises:
receiving a repetition number setting instruction in the target interface, wherein the repetition number setting instruction carries a target number of times and is used for indicating that the at least one touch operation is to be repeatedly executed the target number of times;
responding to a playing instruction of the recording file in the target interface, and sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation in the recording file under the condition that the first application interface is opened;
after the current round of playing is finished, counting the number of times the recording file has been repeatedly played;
if the number of repeated playing times does not reach the target number of times, continuing to sequentially respond to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation in the recording file;
and if the number of repeated playing times reaches the target number of times, determining that the response is finished.
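The repetition-counting playback of claim 4 amounts to a simple counted replay loop. The sketch below is an illustration outside the claims; the function name and parameters are assumptions.

```python
from typing import Callable, List


def play_with_repetitions(ops: List[str],
                          dispatch: Callable[[str], None],
                          target_count: int) -> int:
    """Replay a recording repeatedly until the number of completed
    rounds reaches the target count. Returns the rounds played."""
    rounds = 0
    while rounds < target_count:
        for op in ops:      # one full round of playback, in order
            dispatch(op)
        rounds += 1         # count the finished round
    return rounds


calls: List[str] = []
done = play_with_repetitions(["click", "slide"], calls.append, 3)
print(done, len(calls))  # → 3 6   (2 operations × 3 rounds)
```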
5. The method of any of claims 1-4, wherein the at least one touch operation comprises a first touch operation invoking a second application; the sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the recorded behavior data of the at least one touch operation under the condition that the first application interface is opened includes:
in a case that the first application interface is opened, when it is determined, based on the behavior data of the first touch operation, that the execution sequence currently reaches the first touch operation, monitoring whether the second application program is in a running state;
and starting the second application program under the condition that the second application program is not in a running state.
6. The method of any one of claims 1-5, further comprising:
receiving the touch operation of the user in the application interface of a third application program in the process of sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation;
responding to the touch operation of the user in the third application program.
7. The method according to any one of claims 1-6, wherein the sequentially responding to the at least one touch operation in the first application interface in the execution order of the at least one touch operation based on the recorded behavior data of the at least one touch operation in the case that the first application interface is opened comprises:
in a case that the first application interface is opened, when it is determined, based on behavior data of a second touch operation, that the execution sequence reaches the second touch operation, traversing the currently opened views in display order from top to bottom, wherein the second touch operation is any one of the at least one touch operation;
when a view is traversed, whether an application program to which the currently traversed view belongs is the first application program is inquired;
under the condition that the application program to which the currently traversed view belongs is not the first application program, continuously traversing the next view;
and sending the behavior data of the second touch operation to the first application program for processing under the condition that the application program to which the currently traversed view belongs is the first application program, and finishing the traversing operation.
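The top-to-bottom view traversal of claim 7 can be sketched as follows. This is an illustrative model only; the `View` type and package-style names are assumptions, not part of the claims.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class View:
    name: str
    app: str  # application program the view belongs to (illustrative)


def dispatch_to_first_app(views: List[View],
                          first_app: str,
                          behavior_data: dict) -> Optional[View]:
    """Traverse currently opened views from top to bottom; when a view
    belonging to the first application program is found, deliver the
    touch operation's behavior data there and end the traversal."""
    for view in views:            # top-most view first
        if view.app == first_app:
            # Send behavior data of the second touch operation to this
            # application and stop traversing.
            return view
        # Not the first application program: continue with the next view.
    return None                   # first application interface not found


stack = [View("toast", "com.example.other"), View("game", "com.example.first")]
hit = dispatch_to_first_app(stack, "com.example.first", {"x": 1, "y": 2})
print(hit.name)  # → game
```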
8. The method of claim 7, wherein the electronic device comprises a touch event management module, an intercept feedback module, and an event receiving module; when a view is traversed, inquiring whether the application program to which the currently traversed view belongs is the first application program or not, wherein the inquiring includes:
when the touch event management module traverses to one view, the touch event management module sends event attribute information of the second touch operation to an application program to which the currently traversed view belongs, wherein the event attribute information is determined based on the behavior data of the second touch operation;
in a case that the application program determines, according to the event attribute information, to intercept the second touch operation, the application program sends first indication information to the interception feedback module, wherein the first indication information is used for indicating that the second touch operation is intercepted;
the interception feedback module feeds back the first indication information to the touch event management module;
and the touch event management module determines that the application program to which the currently traversed view belongs is the first application program.
9. The method of claim 8, wherein in the case that the application program to which the currently traversed view belongs is the first application program, sending behavior data of the second touch operation to the first application program for processing comprises:
the touch event management module sends the behavior data of the second touch operation to the event receiving module;
the first application program acquires the behavior data of the second touch operation from the event receiving module;
the first application program responds to the second touch operation based on the behavior data of the second touch operation.
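The module interaction of claims 8 and 9 — offering event attribute information to each view's application, receiving the interception feedback, and handing the behavior data over through an event receiving module — can be modeled as below. All module, method, and application names are illustrative assumptions.

```python
from typing import Callable, Dict, List, Optional


class EventReceivingModule:
    """Holds behavior data for the intercepting application to fetch."""
    def __init__(self) -> None:
        self.pending: Optional[dict] = None


class TouchEventManager:
    """Sketch of the claims-8/9 flow: the manager offers event attribute
    information to each view's application in turn; the application that
    intercepts receives the behavior data via the event receiving module."""

    def __init__(self, intercepts: Dict[str, Callable[[dict], bool]]) -> None:
        # intercepts: app name -> predicate over event attribute information
        self.intercepts = intercepts
        self.receiver = EventReceivingModule()

    def dispatch(self, view_apps: List[str], attrs: dict,
                 behavior_data: dict) -> Optional[str]:
        for app in view_apps:                # views traversed top to bottom
            if self.intercepts[app](attrs):  # app inspects attribute info
                # Interception fed back: stop traversing and hand the
                # behavior data to the event receiving module.
                self.receiver.pending = behavior_data
                return app
        return None


mgr = TouchEventManager({
    "other": lambda a: False,
    "first": lambda a: True,   # the first application intercepts the event
})
who = mgr.dispatch(["other", "first"], {"type": "click"}, {"x": 3, "y": 4})
print(who, mgr.receiver.pending)  # → first {'x': 3, 'y': 4}
```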
10. The method according to any one of claims 1 to 9, wherein in a case where the number of the at least one touch operation is plural, operation types of the plural touch operations include one or more of a click operation, a slide operation, and a long press operation.
11. The method of claim 10, wherein:
in the case that the at least one touch operation comprises the click operation, behavior data of the click operation comprises click position coordinates and click time of the click operation;
in a case where the at least one touch operation includes the slide operation, behavior data of the slide operation includes a slide start position coordinate, a slide end position coordinate, and a slide start time of the slide operation;
and in the case that the at least one touch operation comprises the long press operation, the behavior data of the long press operation comprises long press position coordinates, a long press starting time and a long press duration of the long press operation.
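The three kinds of behavior data enumerated in claim 11 map naturally onto three record types. The sketch below is illustrative only; field names are assumptions chosen to mirror the claim language.

```python
from dataclasses import dataclass


@dataclass
class Click:
    """Click operation: position coordinates and click time."""
    x: float
    y: float
    click_time: float


@dataclass
class Slide:
    """Slide operation: start/end coordinates and slide start time."""
    start_x: float
    start_y: float
    end_x: float
    end_y: float
    slide_start_time: float


@dataclass
class LongPress:
    """Long-press operation: position, start time, and duration."""
    x: float
    y: float
    press_start_time: float
    press_duration: float


op = LongPress(x=50, y=80, press_start_time=0.0, press_duration=1.5)
print(op.press_duration)  # → 1.5
```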
12. The method of any one of claims 1-11, wherein after the receiving a recording start operation, the method further comprises:
responding, in the first application program, to each touch operation of the user in the first application interface every time the touch operation is received.
13. The method according to any one of claims 1 to 12, wherein, in a case where the recording is finished, if the first application interface is opened, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the recorded behavior data of the at least one touch operation comprises:
under the condition that a recording ending operation is received, if the first application interface is opened, sequentially responding to the at least one touch operation according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation; or,
and under the condition that the recording duration reaches the preset recording duration, if the first application interface is opened, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation.
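Claim 13 names two alternative end conditions for the recording: an explicit recording ending operation, or the recording duration reaching a preset duration. As an illustrative sketch (names assumed, not from the claims):

```python
def recording_finished(received_end_op: bool,
                       elapsed: float,
                       preset_duration: float) -> bool:
    """Recording ends when the user issues an explicit end operation,
    or when the recording duration reaches the preset duration."""
    return received_end_op or elapsed >= preset_duration


assert recording_finished(True, 1.0, 60.0)       # explicit end operation
assert recording_finished(False, 60.0, 60.0)     # preset duration reached
assert not recording_finished(False, 10.0, 60.0) # still recording
```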
14. An electronic device, wherein the structure of the electronic device comprises a processor and a memory;
the memory is used for storing a program for supporting the electronic device to execute the method according to any one of claims 1-13 and storing data involved in implementing the method according to any one of claims 1-13;
the processor is configured to execute programs stored in the memory.
15. A computer-readable storage medium having stored therein instructions which, when run on a computer, cause the computer to perform the method of any one of claims 1-13.
CN202211343134.4A 2022-10-31 2022-10-31 Operation control method, electronic device and readable storage medium Active CN115686334B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211343134.4A CN115686334B (en) 2022-10-31 2022-10-31 Operation control method, electronic device and readable storage medium

Publications (2)

Publication Number Publication Date
CN115686334A true CN115686334A (en) 2023-02-03
CN115686334B CN115686334B (en) 2023-11-28

Family

ID=85046802

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116185268A (en) * 2023-02-17 2023-05-30 深圳市和风科技有限公司 Interaction method, system, medium and computer of code scanning gun and terminal equipment

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110191676A1 (en) * 2010-01-29 2011-08-04 Microsoft Corporation Cross-Browser Interactivity Recording, Playback, and Editing
CN108038059A (en) * 2017-12-22 2018-05-15 广州酷狗计算机科技有限公司 Interface traversal method and device
CN109542553A (en) * 2018-10-26 2019-03-29 北京慧流科技有限公司 The information extraction method and device and storage medium of user interface UI element
CN110389802A (en) * 2019-06-05 2019-10-29 华为技术有限公司 A kind of display methods and electronic equipment of flexible screen
CN110703948A (en) * 2019-10-09 2020-01-17 展讯通信(上海)有限公司 Touch screen operation recording and broadcasting system and method
CN110928787A (en) * 2019-11-22 2020-03-27 北京博睿宏远数据科技股份有限公司 Automatic test script recording and playback method, device, equipment and storage medium
CN111143200A (en) * 2019-12-12 2020-05-12 广州华多网络科技有限公司 Method and device for recording and playing back touch event, storage medium and equipment
CN114443447A (en) * 2021-12-17 2022-05-06 苏州浪潮智能科技有限公司 Webpage operation playback method and device, computer equipment and medium
CN114650330A (en) * 2020-12-18 2022-06-21 华为技术有限公司 Method, electronic equipment and system for adding operation sequence
CN114692049A (en) * 2022-03-29 2022-07-01 医渡云(北京)技术有限公司 Browser-based screen recording method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant