CN115686334B - Operation control method, electronic device and readable storage medium - Google Patents


Info

Publication number
CN115686334B
Authority
CN
China
Prior art keywords
touch operation
application
touch
behavior data
recording
Prior art date
Legal status
Active
Application number
CN202211343134.4A
Other languages
Chinese (zh)
Other versions
CN115686334A (en)
Inventor
王傲飞
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202211343134.4A priority Critical patent/CN115686334B/en
Publication of CN115686334A publication Critical patent/CN115686334A/en
Application granted granted Critical
Publication of CN115686334B publication Critical patent/CN115686334B/en

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an operation control method, an electronic device, and a readable storage medium, belonging to the field of terminal technologies. The method includes: receiving a recording start operation, which triggers the start of recording the user's touch operations on the screen of the electronic device; recording behavior data of the user's touch operations in a first application interface of a first application program; and, once recording has finished, if the first application interface is open, sequentially responding to at least one touch operation in the first application interface, in the execution order of the at least one touch operation, based on the recorded behavior data of the at least one touch operation. By recording the behavior data of the user's touch operations and then automatically and repeatedly executing the events corresponding to the at least one touch operation in the first application interface according to the recorded data, the method spares the user from repeating the operations manually, avoids the drop in operation efficiency caused by user fatigue, and improves the task execution effect.

Description

Operation control method, electronic device and readable storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a method for controlling operations, an electronic device, and a readable storage medium.
Background
With the rapid development of terminal technology, electronic devices are widely used. In some scenarios, a user may need to mechanically and repeatedly perform certain operations on an electronic device, such as continuously pulling down a red envelope displayed in an application interface during a red-envelope grabbing activity in a certain application. In response, the electronic device performs the corresponding operations, such as opening a red envelope and displaying its amount.
However, such repetitive operations gradually tire the user, and over time the resulting fatigue easily lowers operation efficiency, which in turn degrades the task execution effect.
Disclosure of Invention
The application provides an operation control method, an electronic device, and a readable storage medium, which can solve the problem that long-term repetitive operation fatigues the user, lowering operation efficiency and degrading the task execution effect. The technical solution is as follows:
in a first aspect, an operation control method applied to an electronic device is provided, where the method includes:
receiving a recording start operation, where the recording start operation triggers the start of recording the user's touch operations on the screen of the electronic device;
recording behavior data of the user's touch operations in a first application interface of a first application program;
and, once recording has finished, if the first application interface is open, sequentially responding to at least one touch operation in the first application interface, in the execution order of the at least one touch operation, based on the recorded behavior data of the at least one touch operation.
In this way, by recording the behavior data of the user's touch operations and then automatically re-executing the events corresponding to the at least one touch operation in the first application interface according to the recorded data, the drop in operation efficiency caused by user fatigue is avoided, and the task execution effect can be improved.
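The record-and-replay flow described above can be sketched as follows. This is a minimal Python illustration, not the patent's implementation; the `TouchRecorder` class and its method names are hypothetical.

```python
import time
from dataclasses import dataclass

@dataclass
class TouchOp:
    kind: str       # "click", "slide", or "long_press"
    data: dict      # behavior data, e.g. coordinates and timestamps
    t: float        # offset from the start of recording, in seconds

class TouchRecorder:
    """Records touch operations, then replays them in execution order."""
    def __init__(self):
        self.ops = []
        self.recording = False
        self._t0 = 0.0

    def start(self):                    # the "recording start operation"
        self.ops.clear()
        self.recording = True
        self._t0 = time.monotonic()

    def record(self, kind, data):       # called for each user touch
        if self.recording:
            self.ops.append(TouchOp(kind, data, time.monotonic() - self._t0))

    def stop(self):                     # recording finished
        self.recording = False
        return list(self.ops)           # the recorded behavior data

    def replay(self, dispatch):
        """Respond to each recorded operation in its execution order."""
        for op in sorted(self.ops, key=lambda o: o.t):
            dispatch(op)
```

In practice `dispatch` would inject the touch event into the application interface; here it is just a callback, which keeps the sketch testable.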
As an example of the present application, in the case where recording is finished, if the first application interface is already open, sequentially responding to the at least one touch operation in the first application interface according to the execution order of the at least one touch operation based on the recorded behavior data of the at least one touch operation includes:
generating a recording file when recording finishes, where the recording file includes the behavior data of the at least one touch operation;
and, when the first application interface is open, sequentially responding to the at least one touch operation in the first application interface according to the execution order of the at least one touch operation based on the behavior data of the at least one touch operation in the recording file.
In this way, by generating the recording file, the at least one touch operation can be replayed based on it, so the corresponding touch events are automatically executed in the first application interface in the execution order of the at least one touch operation, sparing the user manual operation.
As an example of the present application, when the first application interface is open, sequentially responding to the at least one touch operation in the first application interface according to the execution order of the at least one touch operation based on the behavior data in the recording file includes:
displaying a target interface, where the target interface includes the recording file;
in response to a play instruction for the recording file in the target interface, when the first application interface is open, sequentially responding to the at least one touch operation in the first application interface according to the execution order of the at least one touch operation based on the behavior data of the at least one touch operation in the recording file.
In this way, after the recording file is generated, a target interface including the recording file is displayed, so the user can conveniently play the recording file from the target interface as needed, which improves the user experience.
As an example of the present application, responding to a play instruction for the recording file in the target interface and, when the first application interface is open, sequentially responding to the at least one touch operation in the first application interface according to the execution order of the at least one touch operation based on the behavior data in the recording file includes:
receiving a repeat-count setting instruction in the target interface, where the repeat-count setting instruction carries a target number of times and indicates that the at least one touch operation is to be executed repeatedly that number of times;
in response to a play instruction for the recording file in the target interface, when the first application interface is open, sequentially responding to the at least one touch operation in the first application interface according to the execution order of the at least one touch operation based on the behavior data of the at least one touch operation in the recording file;
after the current round of playing finishes, counting the number of times the recording file has been replayed;
if the replay count has not reached the target number of times, continuing to respond to the at least one touch operation in the first application interface according to the execution order of the at least one touch operation based on the behavior data in the recording file;
and if the replay count has reached the target number of times, determining that the response has ended.
In this way, a repeat-count setting option is provided in the target interface, so the user can set how many times the recording file is replayed, sparing the user from replaying it manually and improving the user experience. Moreover, the electronic device can automatically play the recording file multiple times, executing the corresponding touch events in the first application interface multiple times, which improves the task execution effect.
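The repeat-count playback loop described above can be sketched as follows; a minimal illustration in Python, with names that are illustrative rather than the patent's.

```python
def play_with_repeat(recording, target_times, dispatch):
    """Replay the operations in `recording` repeatedly: after each round,
    count the completed replays and stop once the target count is reached."""
    replay_count = 0
    while replay_count < target_times:
        for op in recording:                # one round, in execution order
            dispatch(op)
        replay_count += 1                   # counted after the round ends
    return replay_count                     # the response ends here
```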
As one example of the present application, the at least one touch operation includes a first touch operation that invokes a second application program; and, when the first application interface is open, sequentially responding to the at least one touch operation in the first application interface according to the execution order of the at least one touch operation based on the recorded behavior data includes:
when the first application interface is open, if it is determined based on the behavior data of the first touch operation that its position in the execution order has been reached, monitoring whether the second application program is running;
and starting the second application program if it is not running.
In this way, while the recording file is playing, another application program, namely the second application program, can be pulled up as needed, which extends the capability and richness of the service.
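The check-and-launch step above can be sketched as follows; the data shapes and names (`invokes`, `launch_app`) are hypothetical, used only to illustrate monitoring the second application's running state during replay.

```python
def replay_step(op, running_apps, launch_app, dispatch):
    """Before dispatching an operation that invokes another application,
    check whether that application is running and start it if not."""
    target = op.get("invokes")          # e.g. the second application's name
    if target is not None and target not in running_apps:
        launch_app(target)              # pull up the second application
        running_apps.add(target)
    dispatch(op)                        # then respond to the operation
```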
As an example of the present application, the method further includes:
receiving touch operations from the user in an application interface of a third application program while sequentially responding to the at least one touch operation in the first application interface according to the execution order of the at least one touch operation based on the behavior data of the at least one touch operation;
and responding to the user's touch operations in the third application program.
In this way, while the touch events corresponding to the recorded touch operations are being executed in sequence in the first application interface, a new touch operation from the user can still receive a normal response, which improves the user experience.
As an example of the present application, sequentially responding to the at least one touch operation in the first application interface according to the execution order of the at least one touch operation based on the recorded behavior data when the first application interface is open includes:
when the first application interface is open and it is determined based on the behavior data of a second touch operation that its position in the execution order has been reached, traversing the currently open views in top-to-bottom display order, where the second touch operation is any one of the at least one touch operation;
on traversing to a view, querying whether the application program that the currently traversed view belongs to is the first application program;
if it is not the first application program, continuing to traverse the next view;
and if it is the first application program, sending the behavior data of the second touch operation to the first application program for processing and ending the traversal.
In this way, touch events are delivered by traversing the views one by one, so touch events can be responded to in order; even when a new touch event is received during playback, a normal response is ensured, avoiding disordered or lost touch event responses.
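The top-to-bottom view traversal above can be sketched as follows; the view dictionaries and function names are assumptions for illustration, not the patent's data structures.

```python
def deliver_touch_event(open_views, first_app, send_to_app):
    """Traverse the currently open views in top-to-bottom display order
    and deliver the event to the first view whose owning application is
    the target (first) application, then end the traversal."""
    for view in open_views:                 # index 0 = top-most view
        if view["app"] != first_app:
            continue                        # not the target app: next view
        send_to_app(view["app"])            # hand over the behavior data
        return view                         # traversal ends here
    return None                             # no view of the target app
```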
As one example of the present application, the electronic device includes a touch event management module, an interception feedback module, and an event receiving module; and querying, on traversing to a view, whether the application program that the currently traversed view belongs to is the first application program includes:
when the touch event management module traverses to a view, sending event attribute information of the second touch event to the application program that the currently traversed view belongs to, where the event attribute information of the second touch event is determined based on the behavior data of the second touch operation;
if the application program intercepts the second touch event according to its event attribute information, sending first indication information to the interception feedback module, where the first indication information indicates that the second touch event has been intercepted;
the interception feedback module feeding back the first indication information to the touch event management module;
and the touch event management module determining that the application program that the currently traversed view belongs to is the first application program.
In this way, the touch event management module is responsible for delivering the event, and the interception feedback module is responsible for feeding the event's interception result back to the touch event management module, so touch events can be responded to in order.
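The probe-and-feedback protocol between the modules can be sketched as follows; a simplified model where `wants_event` stands in for the application's own interception logic, and all names are hypothetical.

```python
class InterceptionFeedback:
    """Relays an application's intercept decision to the event manager."""
    def __init__(self):
        self._intercepted = False

    def report_intercepted(self):       # called by the application
        self._intercepted = True

    def take(self):                     # read and reset, by the manager
        hit, self._intercepted = self._intercepted, False
        return hit

def manager_probe(views, wants_event, feedback):
    """Touch event management module: send the event's attribute info to
    each traversed view's owning app; stop at the app that reports
    interception via the feedback module."""
    for view in views:
        if wants_event(view["app"]):        # app inspects the attributes
            feedback.report_intercepted()   # app -> interception feedback
        if feedback.take():                 # feedback -> event manager
            return view["app"]              # this is the first application
    return None
```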
As an example of the present application, when the application program that the currently traversed view belongs to is the first application program, sending the behavior data of the second touch operation to the first application program for processing includes:
the touch event management module sending the behavior data of the second touch operation to the event receiving module;
the first application program obtaining the behavior data of the second touch event from the event receiving module;
and the first application program responding to the second touch event based on its behavior data.
In this way, the touch event management module sends the behavior data to the event receiving module, and the first application program obtains it from there and responds to it; that is, the event receiving module establishes a data transmission channel, through which the touch event management module can hand behavior data to different application programs in a uniform data encapsulation.
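The event receiving module as a hand-off channel can be sketched as follows; a minimal queue-based model, with names that are illustrative rather than the patent's.

```python
from collections import deque

class EventReceiver:
    """A single hand-off point between the touch event management module
    and applications: the manager deposits uniformly packaged behavior
    data, and the target application fetches and consumes it."""
    def __init__(self):
        self._channel = deque()

    def put(self, behavior_data):       # called by the event manager
        self._channel.append(behavior_data)

    def get(self):                      # called by the first application
        return self._channel.popleft() if self._channel else None
```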
As an example of the present application, when there are multiple touch operations, their operation types include one or more of a click operation, a slide operation, and a long-press operation.
In this way, behavior data of touch operations of different types can be recorded, which increases the richness of the service.
As an example of the present application,
when the at least one touch operation includes the click operation, the behavior data of the click operation includes the click position coordinates and the click time of the click operation;
when the at least one touch operation includes the slide operation, the behavior data of the slide operation includes the slide start position coordinates, the slide end position coordinates, and the slide start time of the slide operation;
when the at least one touch operation includes the long-press operation, the behavior data of the long-press operation includes the long-press position coordinates, the long-press start time, and the long-press duration of the long-press operation.
In this way, behavior data of touch operations of different types can be recorded, which increases the richness of the service.
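The per-type behavior data listed above maps naturally onto simple records; a sketch, with field names chosen for illustration.

```python
from dataclasses import dataclass

@dataclass
class ClickData:
    x: float            # click position coordinates
    y: float
    t: float            # click time

@dataclass
class SlideData:
    x_start: float      # slide start position coordinates
    y_start: float
    x_end: float        # slide end position coordinates
    y_end: float
    t_start: float      # slide start time

@dataclass
class LongPressData:
    x: float            # long-press position coordinates
    y: float
    t_start: float      # long-press start time
    duration: float     # long-press duration
```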
As an example of the present application, after the recording start operation is received, the method further includes:
each time a touch operation of the user in the first application interface is received, responding to the touch operation in the first application program.
In this way, while the user's touch operations are being recorded, each recorded touch operation can still receive a normal response, so recording does not interfere with normal responses.
As an example of the present application, in the case where recording is finished, if the first application interface is already open, sequentially responding to the at least one touch operation in the first application interface according to the execution order of the at least one touch operation based on the recorded behavior data includes:
when a recording end operation is received, if the first application interface is open, sequentially responding to the at least one touch operation in the first application interface according to the execution order of the at least one touch operation based on the behavior data of the at least one touch operation; or,
when the recording duration reaches a preset recording duration, if the first application interface is open, sequentially responding to the at least one touch operation in the first application interface according to the execution order of the at least one touch operation based on the behavior data of the at least one touch operation.
In this way, recording can be ended either by an explicit recording end operation or by setting a preset recording duration, which adds ways to end recording.
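The two ways of ending recording can be expressed as one predicate; a sketch with assumed parameter names.

```python
def recording_finished(end_op_received, elapsed, preset_duration=None):
    """Recording ends either when an explicit recording end operation is
    received, or when the elapsed recording time reaches a preset
    recording duration (if one was configured)."""
    if end_op_received:
        return True
    return preset_duration is not None and elapsed >= preset_duration
```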
In a second aspect, an operation control apparatus is provided, having the function of implementing the operation control method behavior of the first aspect. The operation control apparatus includes at least one module for implementing the operation control method provided in the first aspect.
In a third aspect, an electronic device is provided. The electronic device includes a processor and a memory, where the memory is configured to store a program supporting the electronic device in executing the operation control method provided in the first aspect, and to store data involved in implementing that method. The processor is configured to execute the program stored in the memory. The electronic device may further include a communication bus for establishing a connection between the processor and the memory.
In a fourth aspect, there is provided a computer readable storage medium having instructions stored therein, which when run on a computer, cause the computer to perform the method of operation control of the first aspect described above.
In a fifth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of operational control of the first aspect described above.
The technical effects obtained by the second, third, fourth and fifth aspects are similar to the technical effects obtained by the corresponding technical means in the first aspect, and are not described in detail herein.
Drawings
FIG. 1 is a schematic diagram of an application scenario shown in accordance with an exemplary embodiment;
FIG. 2 is a schematic diagram of an application scenario illustrated in accordance with another exemplary embodiment;
FIG. 3 is a schematic diagram of an application scenario illustrated in accordance with another exemplary embodiment;
FIG. 4 is a schematic diagram of a target interface shown according to an example embodiment;
FIG. 5 is a schematic diagram of a target interface shown according to another exemplary embodiment;
FIG. 6 is a schematic diagram of a target interface shown according to another exemplary embodiment;
FIG. 7 is a schematic diagram of a target interface shown according to another exemplary embodiment;
FIG. 8 is a schematic diagram of a software architecture of an electronic device, according to an example embodiment;
FIG. 9 is a flowchart of a method of operational control, according to an example embodiment;
FIG. 10 is a schematic diagram of an application scenario illustrated in accordance with another exemplary embodiment;
FIG. 11 is a schematic diagram illustrating the type of operation for one touch operation, according to an example embodiment;
FIG. 12 is a schematic diagram illustrating a flow of issuing a touch event according to an example embodiment;
FIG. 13 is a schematic diagram illustrating a flow of issuing a touch event according to another exemplary embodiment;
FIG. 14 is a schematic diagram illustrating a flow of issuing a touch event according to another exemplary embodiment;
FIG. 15 is a schematic diagram illustrating a flow of issuing a touch event according to another exemplary embodiment;
FIG. 16 is a schematic diagram illustrating one method of recording touch operations according to an example embodiment;
FIG. 17 is a flow chart illustrating a method of operational control according to another exemplary embodiment;
FIG. 18 is a schematic diagram illustrating monitoring of an application running state according to an exemplary embodiment;
FIG. 19 is a schematic diagram illustrating a flow of issuing a touch event according to another exemplary embodiment;
FIG. 20 is a schematic diagram illustrating an insert touch event, according to an example embodiment;
fig. 21 is a schematic diagram showing a structure of an electronic device according to an exemplary embodiment.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It should be understood that references to "a plurality" in this application mean two or more. In the descriptions of this application, "/" means "or" unless otherwise indicated; for example, A/B may represent A or B. "And/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone. In addition, to describe the technical solution of this application clearly, the words "first", "second", and the like are used to distinguish between items that are the same or similar and whose functions and purposes are substantially the same. A person skilled in the art will appreciate that the words "first", "second", and the like do not limit a quantity or an execution order, and do not necessarily indicate a difference.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
In some scenarios, a user may need to perform mechanical, tedious, repetitive operations on an application interface while using the electronic device. For example, taking a mobile phone as the electronic device, in one possible scenario an application program A in the phone supports a red-envelope grabbing function. The user opens application A and enters its red-envelope interface; as shown in fig. 1 (a), a red-envelope picture is displayed in the interface, and the user can continuously pull down the red-envelope picture. After the user has pulled down repeatedly a certain number of times, application A is triggered to open a red envelope and display its amount, as shown in fig. 1 (b).
However, in an application scenario like the above, the user must frequently repeat the touch operation, which is tedious, and over time user fatigue may lower the efficiency of the touch operations and thereby the execution efficiency of the task. Therefore, an embodiment of this application provides an operation control method that can record the behavior data of a series of the user's touch operations while the user performs them, and then automatically execute the series of touch operations based on the recorded result, avoiding the efficiency loss caused by user fatigue and thus protecting the execution efficiency of the task.
Before introducing the method provided by the embodiment of the application, the application scene related to the embodiment of the application is introduced.
As an example of this application, application B in a mobile phone supports a virtual garden function in which the user can raise virtual animals, which requires the user to perform certain tasks at intervals. For example, referring to fig. 2 (a), fig. 2 (a) is a schematic diagram of a virtual garden interface in application B according to an exemplary embodiment. The virtual garden interface includes a virtual chicken 21, such as a "lovely chicken", and the user must perform tasks in the virtual garden interface at intervals so that the lovely chicken can lay virtual eggs. Referring to fig. 2 (b), to avoid frequent repetitive actions, the user may pull down a notification bar after entering the virtual garden interface and before performing the task; the notification bar includes a screen recording option 22. The user can long-press the screen recording option 22; in response, the phone redisplays the virtual garden interface as shown in fig. 2 (c) and starts recording the user's touch operations in it. For example, as shown in fig. 2 (c), the user may click the lovely chicken to interact with it; in response, application B displays an interactive phrase near the lovely chicken, such as a greeting about eating well. To raise the lovely chicken's mood index, the user may click it multiple times, and the phrase displayed may differ each time; for example, if the user clicks twice in a row, referring to fig. 2 (d), the second phrase may be "a child somewhere received help because of your actions".
Alternatively, the user may feed the lovely chicken; for example, as shown in fig. 2 (e), the user may click the virtual grain library 23 in the virtual garden interface. Referring to fig. 2 (f), in response to the user's click on the virtual grain library 23, application B adds virtual grain to the lovely chicken's food tray 24 and reduces the amount of grain remaining in the virtual grain library 23.
Then, referring to fig. 3 (a), when the user wants to end the operation, the user may pull down the notification bar again; in response, the phone displays the notification bar, and the user may long-press the screen recording option 22 again, whereupon the phone ends recording the touch operations. In one example, referring to fig. 3 (b), after the user long-presses the screen recording option 22 again, the phone displays a target interface. The target interface includes a recording file 31, which contains the behavior data of the touch operations recorded in the process above. The target interface also displays a play option 32 corresponding to the recording file 31; when the user wants the phone to automatically execute the series of operations in the virtual garden interface based on the recording file 31, the user can click the play option 32. Referring to fig. 3 (c), in response to the user triggering the play option 32, the phone displays the virtual garden interface and then automatically executes the events triggered by the series of touch operations in it. For example, as shown in fig. 3 (d), the phone may simulate clicking the lovely chicken, and application B displays an interactive phrase near it, such as "thank you for playing with me"; continuing with fig. 3 (e), the phone again simulates clicking the lovely chicken, and application B again displays an interactive phrase, such as one about being in a good mood today. The phone then simulates clicking the virtual grain library 23 in the virtual garden interface, and application B again adds virtual grain to the lovely chicken's food tray 24 and reduces the grain remaining in the virtual grain library 23; as shown in fig. 3 (f), the grain remaining in the virtual grain library 23 is 80 g.
As an example of the present application, during playing of the recorded file, the touch action corresponding to each touch operation may also be displayed in the virtual garden interface. For example, for a click operation, a schematic diagram of a cursor click may be displayed at the click position in the virtual garden interface; for a slide operation, a schematic diagram of a cursor slide may be displayed along the slide track; and so on.
As an example of the present application, referring to fig. 3 (c), during playing of the recorded file, a pause play option 33 may also be displayed in the virtual garden interface, and the user may click the pause play option 33 when the user no longer needs the mobile phone to automatically execute the unexecuted events in the recorded file in the virtual garden interface. In response to the user clicking the pause play option 33, the mobile phone does not execute the subsequent unexecuted events based on the recorded file, i.e., stops playing the recorded file.
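The playback and pause behavior described above can be illustrated with a short sketch. This is not code from the embodiment; `play_recording`, `is_paused`, and the event tuples are illustrative assumptions, and each callable stands in for simulating one recorded touch operation.

```python
import time

def play_recording(events, sleep=time.sleep, is_paused=lambda: False):
    """Replay recorded touch events at their recorded time offsets.

    `events` is a list of (offset_seconds, execute_fn) pairs sorted by
    offset; `is_paused` models the pause play option 33: once it returns
    True, no further unexecuted events are played.
    """
    previous = 0.0
    executed = []
    for offset, execute in events:
        if is_paused():              # user tapped the pause option
            break
        sleep(offset - previous)     # wait until the recorded offset
        previous = offset
        executed.append(execute())   # simulate the recorded touch
    return executed
```

In a real system `execute` would inject a simulated touch at the recorded coordinates; here it simply returns a label so the ordering is visible.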
In some scenarios, the user may not want to play the recorded file immediately after the touch operations are recorded. For this reason, referring to fig. 4 (a), a start time setting option may also be provided in the target interface for the user to set the time at which playing of the recorded file starts. The user may click the start time setting option; referring to fig. 4 (b), in response to the user's triggering operation on the start time setting option, the mobile phone displays a time setting interface, so that the user can set the time for starting playing the recorded file based on the time setting interface, which specifically includes setting the date and setting the time of day. In this case, the user does not need to click the play option to play the recorded file; instead, the recorded file is played according to the set start time. That is, when it is detected that the system time reaches the set start time, in the case that the application program B is in the running state and the virtual garden interface is opened, the mobile phone plays the recorded file, i.e., executes the series of operations in the virtual garden interface.
In some scenarios, the user may also need to play the recorded file repeatedly. For this reason, referring to fig. 5 (a), the target interface further includes a repetition number setting option for the user to set the number of times the recorded file is played repeatedly. The user may click the repetition number setting option; referring to fig. 5 (b), in response to the user's triggering operation on the repetition number setting option, the mobile phone displays a repetition number setting interface, and the user may set the number of repeated plays of the recorded file based on the repetition number setting interface. Each time the recorded file finishes playing, the mobile phone counts the repeated plays, and stops the repeated playing when the counted number of repeated plays reaches the set number.
Further, referring to fig. 4 (a), the target interface may further include description information of the recording file, such as recording time, recording duration, etc., which is not limited in the embodiment of the present application.
In some scenarios, the user may also need the recorded file to be played repeatedly for a certain total duration. For this reason, referring to fig. 6 (a), the target interface further includes a repetition duration setting option for the user to set the total duration for which the recorded file is played repeatedly. The user may click the repetition duration setting option; referring to fig. 6 (b), in response to the user's triggering operation on the repetition duration setting option, the mobile phone displays a repetition duration setting interface, and the user may set the total duration for repeatedly playing the recorded file based on the repetition duration setting interface, including hour, minute and second settings. After the recorded file starts playing, the mobile phone counts the playing time, and stops the repeated playing when the playing time reaches the set duration.
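The repetition-number option of fig. 5 and the repetition-duration option of fig. 6 both bound the same replay loop: the phone counts finished replays and elapsed playing time, and stops once the configured limit is reached. A minimal sketch under that reading; all names are illustrative, not from the embodiment:

```python
import time

def play_with_limits(play_once, repeat_count=None, total_seconds=None,
                     clock=time.monotonic):
    """Repeat playback of a recorded file until either the configured
    number of repeated plays or the configured total playing duration is
    reached, whichever limit has been set."""
    if repeat_count is None and total_seconds is None:
        repeat_count = 1                    # default: play once
    start = clock()
    plays = 0
    while True:
        if repeat_count is not None and plays >= repeat_count:
            break
        if total_seconds is not None and clock() - start >= total_seconds:
            break
        play_once()                         # one full replay of the file
        plays += 1
    return plays
```

Injecting `clock` keeps the duration limit testable without real waiting; a production implementation would simply use the system clock.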
As an example of the present application, the target interface may further include a plurality of recording files, where the plurality of recording files are displayed in order of recording time. Referring to fig. 7 (a), when the number of recording files is large, the user may slide the icons of the recording files to the left (or to the right); referring to fig. 7 (b), in response to the user's left-slide operation, the mobile phone displays the next page of recording files in the target interface. Each recording file corresponds to a file name, and the user can select the recording file currently required to be played from the plurality of recording files according to requirements. For example, as shown in fig. 7 (c), if the recording file of the touch operations recorded in the virtual garden interface is recording file 1, the user can select recording file 1, then perform relevant configuration (such as the number of repetitions) for recording file 1, or directly click the play option, so that the mobile phone automatically performs the corresponding operations in the virtual garden interface based on recording file 1.
It should be noted that the screen recording option described above may also be used to trigger the mobile phone to perform a screen recording operation; for example, after the user clicks the screen recording option, the mobile phone starts recording the screen. In another example, an operation recording option dedicated to triggering the mobile phone to start or end recording touch operations may be provided in the drop-down notification bar, for example, an "operation recording" option, in which case the user may trigger the mobile phone to start or end recording touch operations by clicking the operation recording option. Alternatively, the operation recording option provided in the drop-down notification bar is used only to trigger the mobile phone to start recording touch operations, and an end option can then be displayed in the virtual garden interface. When the user wants to end the recording operation, the end option can be triggered, and accordingly the mobile phone ends the recording operation. In this way the recording operation can be ended directly in the virtual garden interface, which avoids the need for the user to open the drop-down notification bar, improves operation convenience, and can improve user experience.
It should be noted that the above application scenario is merely exemplary, and the application scenario of the method provided by the embodiment of the present application is not limited, that is, in other application interfaces, when the user wants the electronic device to automatically trigger a series of subsequent touch operations of the user in other application interfaces, the user may operate in the above manner.
Next, a software system of an electronic device according to an embodiment of the present application will be described. The software system of the electronic device may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the application, an Android (Android) system with a layered architecture is taken as an example, and a software system of electronic equipment is exemplified.
Fig. 8 is a block diagram of a software system of an electronic device according to an embodiment of the present application. Referring to fig. 8, the layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android Runtime) and system library layer, and a kernel layer.
The application layer may include a series of application packages. As shown in fig. 8, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions. As an example of the present application, the application framework layer includes a view system, a touch event management module, an interception feedback module, an event receiving module, an event destruction module, and the like.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to construct a display interface for an application, which may be comprised of one or more views, such as a view that includes displaying a text notification icon, a view that includes displaying text, and a view that includes displaying a picture.
The touch event management module is used for distributing a touch event when the touch event occurs, so that the touch event can be delivered to the corresponding application program for processing. In one example, the touch event management module is Activity.dispatchTouchEvent(), which distributes touch events through Window.superDispatchTouchEvent() and DecorView.superDispatchTouchEvent().
The interception feedback module is used for feeding back to the touch event management module whether the application program intercepts the touch event. In one example, the interception feedback module is onInterceptTouchEvent().
The event receiving module is used for receiving the behavior data of the touch event and providing it to the application program for calling, so that the application program can process the corresponding touch event based on the behavior data. In one example, the event receiving module is onTouchEvent().
The event destruction module is used for destroying abandoned events, wherein the abandoned events refer to touch events which are not processed by the application program.
Further, as shown in FIG. 8, the application framework layer may include a window manager, a content provider, a telephony manager, a resource manager, a notification manager, and the like. The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The content provider is used to store and retrieve data, which may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc., and make such data accessible to the application. The telephony manager is used to provide communication functions of the electronic device, such as management of call status (including on, off, etc.). The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like. The notification manager allows the application to display notification information in a status bar, can be used to communicate notification type messages, can automatically disappear after a short dwell, and does not require user interaction. For example, a notification manager is used to inform that the download is complete, a message alert, etc. The notification manager may also be a notification that appears in the system top status bar in the form of a chart or a scroll bar text, such as a notification of a background running application. The notification manager may also be a notification that appears on the screen in the form of a dialog window, such as a text message being prompted in a status bar, a notification sound being emitted, the electronic device vibrating, a flashing indicator light, etc.
Android Runtime includes a core library and a virtual machine. Android Runtime is responsible for scheduling and management of the Android system. The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, such as: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc. The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications. Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The workflow of the electronic device software and hardware is illustrated below in connection with capturing a photo scene.
When touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into an original input event (including information such as the touch coordinates and the time stamp of the touch operation). The original input event is stored at the kernel layer. The application framework layer acquires the original input event from the kernel layer and identifies the control corresponding to the original input event. Taking as an example that the touch operation is a click operation and the corresponding control is the control of the camera application icon: the camera application calls an interface of the application framework layer to start the camera application, which then calls the kernel layer to start the camera driver and captures a still image or video through the camera 193.
The method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings. Referring to fig. 9, fig. 9 is a schematic flow chart of an operation control method according to an exemplary embodiment. The method is described here, by way of example and not limitation, as applied to the electronic device shown in fig. 8, and is illustrated as being implemented through the interaction of a plurality of modules. The method may include some or all of the following:
Step 901: in the process of displaying the first application interface of the first application program, the touch event management module receives a recording start operation, wherein the recording start operation is used for triggering the start of recording the touch operation of the user in the first application interface of the first application program.
The first application is any one application in the electronic device. For example, the first application may be, but is not limited to, any one of a shopping application (e.g., Taobao™), a payment application (e.g., Alipay™), and a desktop program.
In one example, the first application interface is an interface of a service provided in the first application program, such as a virtual garden service. A second application interface of the first application program may be provided with a virtual garden icon; the user may click the virtual garden icon, and in response to the user's clicking operation on the virtual garden icon, the electronic device displays the corresponding virtual garden interface, as shown in fig. 2 (a), where the virtual garden interface is the first application interface. In one example, the electronic device may display the virtual garden interface through the view system.
In one example, an operation recording option (which may be multiplexed with the screen recording option) is provided in the electronic device, and the option may be triggered by the user to cause the electronic device to start performing the recording operation.
For example, referring to fig. 2, during the process of playing the virtual garden of application program B, the user is required to repeatedly perform operations in the virtual garden interface, such as interacting with the loved chicken multiple times, feeding the loved chicken, dressing the loved chicken, and so on. To avoid the need for the user to manually repeat these operations, the user may trigger the electronic device to record the subsequent touch operations in the virtual garden interface. To this end, the user slides down from the top of the screen of the electronic device. As shown in fig. 2 (b), in response to the user sliding down from the top of the screen, the electronic device displays a drop-down notification bar including a screen recording option, and the user can long-press the screen recording option in the notification bar. Accordingly, the electronic device receives the recording start operation and starts the operation recording function, namely the function of recording the touch operations of the user on the screen of the electronic device.
As an example of the present application, after the user presses the screen recording option for a long time, the electronic device returns to the first application interface before the drop-down notification bar is displayed, for example, see (c) in fig. 2, and after the user presses the screen recording option for a long time, the electronic device returns to the virtual garden interface.
Step 902: the touch event management module receives a touch operation of a user on the first application interface.
That is, after the recording function is started, the user may perform touch operations in the first application interface of the first application program. It is easy to understand that, during the recording operation, the user may perform one touch operation in the first application interface of the first application program, or may perform multiple touch operations (consecutively or at intervals). Accordingly, the touch event management module of the electronic device receives each touch operation.
In one possible scenario, the user may also switch to a third application interface of the first application program after touching a control in the first application interface. For example, referring to fig. 10 (a), a "fertilizer" icon is provided in the virtual garden interface, and when the user clicks the "fertilizer" icon, the mobile phone displays the virtual farm interface of the application program B in response to the user's clicking operation, as shown in fig. 10 (b). After switching to the third application interface, the user may continue to perform some touch operations in the third application interface, in which case the touch event management module receives each touch operation of the user on the third application interface.
Referring to fig. 11, in the case where the touch operation is plural, the operation type of the plural touch operations may generally include one or more of a click operation, a slide operation, and a long press operation. Wherein fig. 11 (a) shows a click operation, fig. 11 (b) shows a slide operation, and fig. 11 (c) shows a long press operation.
Step 903: the touch event management module records behavior data of the touch operation.
And each time a touch operation is received on the first application interface of the first application program, the touch event management module records behavior data corresponding to the touch operation, namely, an operation recording function is executed.
In one example, if a certain touch operation is performed in the first application interface and then the touch operation is performed in the third application interface, the touch event management module records behavior data of each touch operation in the third application interface.
The behavior data of the touch operation is different according to the operation type of the touch operation. Specifically, the following cases are included:
first case: in the case where the touch operation is a click operation, the behavior data of the click operation includes click position coordinates and a click time. The click position coordinates indicate the position at which the click operation acts on the first application interface of the first application program, and the click time indicates how long after the start of operation recording the click operation occurs. For example, if the click position coordinates are (x, y) = (12.1, 22.5) and the click time is 01:10, this indicates that the user performs a click operation at the position (12.1, 22.5) on the first application interface of the first application program, the click operation occurring 1 minute 10 seconds after the start of operation recording.
In one example, if the first application interface of the first application program is full-screen display, that is, the display size of the first application interface of the first application program is the same as the screen size of the electronic device, the click position coordinate is the coordinate of the position point of the click operation on the screen.
In another example, if the first application interface of the first application program is not displayed full screen, that is, the display size of the first application interface of the first application program is smaller than the screen size of the electronic device, for example, in a tablet computer, the first application interface of the first application program only occupies a portion of the tablet screen. In this case, the click position coordinates may be determined according to the relative position of the first application interface and the screen, and the coordinates of the position point on the screen where the click operation is applied. For example, coordinates of a location point of the clicking operation on the screen may be determined, and then the determined coordinates are transformed into a coordinate system in which the first application interface is located according to a relative position of the first application interface and the screen, so as to determine coordinates of a clicking location of the clicking operation on the first application interface of the first application program.
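For an axis-aligned interface, the transformation described above reduces to subtracting the on-screen offset of the interface's top-left corner from the touch point; a minimal sketch under that assumption (function and parameter names are illustrative, not from the embodiment):

```python
def to_interface_coords(screen_point, interface_origin):
    """Map a touch point from screen coordinates into the coordinate
    system of a non-full-screen application interface.

    `interface_origin` is the screen position of the interface's top-left
    corner (its relative position on the screen); for a full-screen
    interface it is (0, 0), and the two coordinate systems coincide.
    """
    sx, sy = screen_point
    ox, oy = interface_origin
    return (sx - ox, sy - oy)
```

The same transformation applies unchanged to the slide start/end positions and the long-press position discussed below, since each is just a point on the screen.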
Second case: in the case where the touch operation is a slide operation, behavior data of the slide operation includes a slide start position coordinate, a slide end position coordinate, and a slide start time of the slide operation. The slide start position coordinates are used to indicate a start position of a slide operation on the first application interface of the first application program, the slide end position coordinates are used to indicate an end position of the slide operation on the first application interface of the first application program, and the slide start time is used to indicate how long after the slide operation has occurred since the start of the operation recording. For example, the coordinates of the slide start position of the slide operation is (x 1, y 1) = (4.7,6.5), the coordinates of the slide end position of the slide operation is (x 2, y 2) = (4.9,11.0), and the slide start time is 02:10, which indicates that the slide operation is slid from the (4.7,6.5) position to the (4.9,11.0) position of the first application interface, which occurs 2 minutes 10 seconds from the start of the operation recording.
In one example, if the first application interface of the first application program is full-screen display, that is, the display size of the first application interface of the first application program is the same as the screen size of the electronic device, the coordinates of the sliding start position are the coordinates of the position point acting on the screen when the sliding starts, and the coordinates of the sliding end position are the coordinates of the position point acting on the screen when the sliding ends.
In another example, if the first application interface of the first application program is not displayed full screen, the coordinates of the sliding start position may be determined according to the relative position of the first application interface and the screen, and the coordinates of the position point acting on the screen when the sliding starts. Similarly, the coordinates of the sliding end point position may be determined according to the relative position of the first application interface and the screen, and the coordinates of the position point acting on the screen at the end of sliding.
Third case: in the case where the touch operation is a long press operation, the behavior data of the long press operation includes long press position coordinates of the long press operation, a long press start time, and a long press duration. The long press position coordinates are used to represent the position of the long press operation acting on the first application interface of the first application program, the long press start time is used to represent how long the long press operation occurs since the start of the operation recording, and the long press duration is used to represent the duration of the long press operation acting on the screen. For example, the long press position coordinate of the long press operation is (x, y) = (9.1,2.4), the long press start time is 05:29, and the long press duration is 2.1s, which indicates that the user performs the long press operation at the position (9.1,2.4) of the screen, the long press operation occurs at 5 minutes and 29 seconds from the start of operation recording, and the duration is 2.1 seconds.
In one example, if the first application interface of the first application program is displayed in full screen, that is, the display size of the first application interface of the first application program is the same as the screen size of the electronic device, the long press position coordinate is the coordinate of the position point acted on the screen by the long press operation. In another example, if the first application interface of the first application program is not displayed full screen, the long press position coordinate may be determined according to the relative position of the application interface and the screen and the coordinate of the position point where the long press operation acts on the screen.
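The three kinds of behavior data above can be modeled as plain records, with the times from the examples converted to seconds since recording started; the field names are illustrative, not from the embodiment:

```python
from dataclasses import dataclass

@dataclass
class ClickRecord:
    x: float          # click position on the application interface
    y: float
    t: float          # seconds since operation recording started

@dataclass
class SlideRecord:
    x1: float         # slide start position
    y1: float
    x2: float         # slide end position
    y2: float
    t: float          # slide start time since recording started

@dataclass
class LongPressRecord:
    x: float          # long-press position
    y: float
    t: float          # long-press start time since recording started
    duration: float   # how long the press lasted, in seconds

# The examples from the text, with mm:ss times converted to seconds:
click = ClickRecord(x=12.1, y=22.5, t=70.0)                     # 01:10
slide = SlideRecord(x1=4.7, y1=6.5, x2=4.9, y2=11.0, t=130.0)   # 02:10
press = LongPressRecord(x=9.1, y=2.4, t=329.0, duration=2.1)    # 05:29
```

A recording file would then be, in essence, a time-ordered list of such records, which is sufficient for the replay described in steps 901–904.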
Step 904: the touch event management module is responsive to the touch operation.
That is, in the process of recording the behavior data of the touch operation of the user on the first application interface of the first application program, the electronic device normally responds to each touch operation of the user. In one example, if the recording process is switched from the first application interface to the third application interface and the user continues to perform the touch operation in the third application interface, the electronic device will also respond normally for each touch operation of the user in the third application interface. Thus, the normal response of the operation can be prevented from being influenced by executing the operation recording function.
In implementation, the touch event management module issues a touch event corresponding to the touch operation. When the touch event is issued, the views are traversed starting from the top view. As shown in fig. 12, fig. 12 is a flowchart of a touch event delivery mechanism according to an exemplary embodiment. If the application program to which the top view (also referred to as the top parent view) belongs intercepts the touch event, the behavior data of the touch event is submitted to that application program for processing; if it does not intercept the touch event, the view of the next layer (which may be referred to as a parent view) is traversed next. If the application program to which the parent view belongs intercepts the touch event, the behavior data of the touch event is submitted to that application program for processing; if it does not intercept the touch event, the view of the next layer (which may be referred to as a child view) is traversed next. If the application program to which the child view belongs intercepts the touch event, the behavior data of the touch event is submitted to that application program for processing; and if the application program to which the child view belongs does not intercept the touch event and no other views exist, the touch event is destroyed.
Specifically, referring to fig. 13, each time one touch operation of the user is received, responding to the one touch operation may include the following sub-steps 9041 to 9046:
9041: the touch event management module traverses according to the display sequence of the currently opened views from top to bottom.
It will be appreciated that a user may open a plurality of application interfaces during use of the electronic device, and illustratively, the user opens application a, and the electronic device opens application interface v1 of application a in response to an opening operation of application a by the user; when a user clicks an option for opening an application interface v2 of the application program a in the application interface v1, the electronic device opens the application interface v2 of the application program a in response to the clicking operation of the user; then, when the user opens the first application program, responding to the opening operation of the user on the first application program, and opening a second application interface v3 of the first application program by the electronic equipment; when the user opens the virtual garden in the second application interface v3 of the first application, the electronic device opens the virtual garden interface v4. Each application interface can be understood as a view, so that each view currently opened by the electronic device is an application interface v4, an application interface v3, an application interface v2 and an application interface v1 sequentially from top to bottom, and application programs to which each application interface belongs are a first application program, an application program a and an application program a respectively. It can be seen that the application that is opened first is displayed at the lowest layer and the application that is opened last is displayed at the uppermost layer. To determine which view corresponds to the application that will handle the one touch operation, the touch event management module traverses in the top-down display order.
9042: each time a view is traversed, the touch event management module queries whether the application to which the currently traversed view belongs intercepts the one touch operation.
As an example of the present application, when a user performs a touch operation on an application interface, a touch event management module acquires behavior data and event attribute information of the touch operation, wherein the event attribute information includes information of an event type (such as a click type, a slide type) and the like, in addition to a part of the behavior data in the behavior data, that is, an intersection exists between the event attribute information and the behavior data. The partial behavior data included in the event attribute information may include, but is not limited to, location coordinates. In this way, in the traversing process, when one view is traversed, the touch event management module sends the event attribute information of the one touch operation to the application program to which the currently traversed view belongs. In one example, the application program includes a view module, the application program to which the currently traversed view belongs receives event attribute information sent by the touch event management module through the view module, and then determines whether to intercept the one touch operation according to the event attribute information. If the application program to which the currently traversed view belongs is to process the one touch event, the one touch operation is intercepted, otherwise, if the application program to which the currently traversed view belongs is not to process the one touch operation, the one touch operation is not intercepted. 
In a case where the application program to which the currently traversed view belongs determines to intercept the one touch operation, it calls an interception feedback module, and the interception feedback module notifies the touch event management module that the one touch operation is intercepted, for example by returning true to the touch event management module. In a case where the application program to which the currently traversed view belongs determines not to intercept the one touch operation, it likewise calls the interception feedback module, and the interception feedback module notifies the touch event management module that the one touch operation is not intercepted, for example by returning false to the touch event management module. Thus, the touch event management module can determine whether the application program to which the currently traversed view belongs intercepts the one touch operation.
If the application program to which the currently traversed view belongs does not intercept the one touch operation, entering the operations from 9043 to 9045 as follows; if the application program to which the currently traversed view belongs intercepts this touch operation, the operation of 9046 is entered as follows.
9043: the touch event management module determines whether there are non-traversed views.
Since the number of views currently opened is limited, the touch event management module inquires whether there are non-traversed views in the case that the application to which the currently traversed view belongs does not intercept the one touch operation. If there is an un-traversed view, the touch event management module performs the operations of 9044, otherwise if there is no un-traversed view, the touch event management module performs the operations of 9045.
9044: the touch event management module continues to traverse the next view.
In the case where the application to which the currently traversed view belongs does not intercept the one touch event, if there is an un-traversed view, the touch event management module continues to traverse the next view in order to determine whether the application to which the next view belongs will process the one touch event. That is, after traversing to the next view, the touch event management module determines, according to operation 9042, whether the application to which the next traversed view belongs intercepts the one touch operation.
9045: the touch event management module destroys the one touch event.
When the application program to which the currently traversed view belongs does not intercept the one touch event and there is no un-traversed view, it indicates that none of the currently opened application programs will process the one touch event. In this case, the touch event management module destroys the one touch event; that is, the one touch operation is invalid. For example, the touch operation may have touched the boundary of the first application interface.
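The traversal of 9042 to 9045 can be sketched as a minimal Java loop; the class and method names are illustrative, not the patent's actual implementation:

```java
import java.util.List;

public class TraversalDemo {
    /** A view paired with a flag standing in for the owning application's
     *  intercept decision (operation 9042). */
    record View(String name, boolean intercepts) {}

    /** Traverse views in top-down display order; return the name of the
     *  first view whose application intercepts the operation, or null if
     *  no view intercepts it (the event is then destroyed, 9045). */
    static String dispatch(List<View> topDown) {
        for (View v : topDown) {              // 9042/9044: query, then next view
            if (v.intercepts()) {
                return v.name();              // 9046: deliver behavior data here
            }
        }
        return null;                          // 9045: destroy the touch event
    }

    public static void main(String[] args) {
        // Stack from the earlier example: v4 (first application) on top.
        List<View> views = List.of(
                new View("v4", true),         // first application intercepts
                new View("v3", false),
                new View("v2", false),
                new View("v1", false));
        System.out.println(dispatch(views)); // v4 consumes the operation
    }
}
```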
In one example, the touch event management module sends relevant data (e.g., including behavior data and event attribute information) for the one touch event to the event destruction module, which destroys the one touch event.
9046: the touch event management module transmits behavior data of the one touch operation to the event receiving module.
The event receiving module provides a callable interface for the application program, so that after the touch event management module sends the behavior data of the one touch operation to the event receiving module, the application program to which the currently traversed view belongs can acquire the behavior data of the one touch event from the event receiving module, and then the one touch operation is processed based on the behavior data of the one touch event, namely the one touch operation is consumed.
For example, referring to fig. 14, if the one touch operation is a click on a control, in the traversal process the touch event processing module issues the touch event to the view where the parent control of the control is located (called the parent view), that is, to the dispatchTouchEvent() of the parent view, and the dispatchTouchEvent() of the parent view calls the onInterceptTouchEvent() of the view group. If the event is intercepted, the touch event is no longer passed to the child view, and the parent view processes the one touch event. Otherwise, if the event is not intercepted, the touch event continues to be passed to the child views: the child views are traversed to query whether the child view where the clicked control is located can be found, and if it is found, the dispatchTouchEvent() of that child view is called, thereby realizing the transfer of the touch event. Otherwise, if the child view where the clicked control is located cannot be found, the touch event may be destroyed by the touch event processing module itself or consumed by the parent view; for example, the parent view may call its own onClick(), reached through the path onTouch() -> onTouchEvent() -> performClick() -> onClick().
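The parent-to-child dispatch chain referenced here mirrors Android's dispatchTouchEvent/onInterceptTouchEvent pattern. The following framework-free sketch models that pattern; the method names follow the real Android API, but the classes are simplified stand-ins, not the framework code:

```java
public class DispatchChainDemo {
    /** Minimal stand-in for a child View. Real Android signatures take a
     *  MotionEvent; a string event type is used here for brevity. */
    interface Child { boolean onTouchEvent(String event); }

    static class ClickableChild implements Child {
        boolean clicked = false;
        @Override public boolean onTouchEvent(String event) {
            if ("click".equals(event)) { performClick(); return true; }
            return false;
        }
        // Mirrors the path onTouchEvent() -> performClick() -> onClick().
        void performClick() { onClick(); }
        void onClick() { clicked = true; }
    }

    /** Stand-in for a ViewGroup holding one child. */
    static class Parent {
        private final Child child;
        private final boolean intercept;
        Parent(Child child, boolean intercept) { this.child = child; this.intercept = intercept; }

        /** Like ViewGroup.dispatchTouchEvent(): first ask onInterceptTouchEvent();
         *  if true the parent consumes the event, otherwise pass it to the child. */
        boolean dispatchTouchEvent(String event) {
            if (onInterceptTouchEvent()) {
                return onTouchEvent(event);   // parent processes; child never sees it
            }
            return child.onTouchEvent(event); // pass down to the child view
        }
        boolean onInterceptTouchEvent() { return intercept; }
        boolean onTouchEvent(String event) { return true; }
    }

    public static void main(String[] args) {
        ClickableChild c = new ClickableChild();
        new Parent(c, false).dispatchTouchEvent("click");
        System.out.println(c.clicked); // true: event reached the child's onClick()
    }
}
```

When the parent intercepts, the child's onClick() is never reached, which is exactly the "no longer passed to the child view" branch above.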
Thus, the operation of responding to the touch operation of the user on the first application interface of the first application program is completed.
In this way, referring to fig. 15, after the recording operation is started, when the touch operation of the user on the first application interface of the first application program is received, an event is normally issued to the bottom layer to respond to the touch operation, and behavior data of the touch operation is recorded, so that the normal response of the electronic device to the touch operation is not affected in the recording operation process.
It should be noted that step 904 is an optional operation in the embodiment of the present application.
It should be noted that, there is no strict order of execution between the step 904 and the step 903, and in one example, the step 904 and the step 903 may be executed in parallel.
Step 905: and the touch event management module receives a recording ending instruction.
In one example, referring to fig. 3 (a), when the user wants to end the recording operation function, the user can slide down from the top of the screen of the electronic device, and in response to the sliding operation, the electronic device displays a pull-down notification bar, after which the user can long-press the recording option in the pull-down notification bar. In response to the user's long-press operation on the recording option in the pull-down notification bar, the touch event management module determines that the recording end instruction is received, whereupon the end of recording can be determined.
The above description takes, as an example, the case where the end of recording is determined when the recording end instruction is received. In another embodiment, the electronic device may be provided with a preset recording duration, so that after receiving the recording start operation the recording duration is counted, for example by starting a timer configured in the electronic device. When the recording duration reaches the preset recording duration, the end of recording is determined. The preset recording duration may be set according to actual requirements, for example 5 minutes, which is not limited in the embodiment of the present application.
Step 906: and under the condition that recording is finished, the touch event management module generates a recording file, and the recording file comprises recorded behavior data of at least one touch operation.
And under the condition that recording is finished, the touch event management module generates a recorded file based on the recorded behavior data of each touch operation in the at least one touch operation.
In an example, referring to fig. 16, the recorded file includes behavior data of three touch operations, namely a click operation, a slide operation and a long-press operation; that is, after recording is started, the user performs the click operation, the slide operation and the long-press operation in sequence in the first application interface of the first application program. The click event corresponding to the click operation occurs 1 minute 10 seconds after the start of recording, and the coordinates of the click position are (12.1, 22.5); the slide event corresponding to the slide operation occurs 2 minutes 10 seconds after the start of recording, the coordinates of the slide start position are (4.7, 6.5), and the coordinates of the slide end position are (4.9, 11.0); the long-press event corresponding to the long-press operation occurs 5 minutes 29 seconds after the start of recording, the coordinates of the long-press position are (9.1, 2.4), and the long-press duration is 2.1 seconds.
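A recorded file of this shape can be modeled as a list of timestamped entries whose execution order is recovered by sorting on start time. The Java sketch below uses the fig. 16 example values; the field names are illustrative assumptions:

```java
import java.util.Comparator;
import java.util.List;

public class RecordFileDemo {
    /** One recorded entry; the offset is the event's start time measured
     *  from the beginning of recording. */
    record RecordedEvent(String type, long startOffsetMillis, double[] coords) {}

    /** Execution order is recovered by sorting the event list on start time. */
    static List<RecordedEvent> executionOrder(List<RecordedEvent> events) {
        return events.stream()
                .sorted(Comparator.comparingLong(RecordedEvent::startOffsetMillis))
                .toList();
    }

    public static void main(String[] args) {
        // The three events of the fig. 16 example, deliberately out of order.
        List<RecordedEvent> recorded = List.of(
                new RecordedEvent("slide", (2 * 60 + 10) * 1000L, new double[]{4.7, 6.5, 4.9, 11.0}),
                new RecordedEvent("click", (1 * 60 + 10) * 1000L, new double[]{12.1, 22.5}),
                new RecordedEvent("longPress", (5 * 60 + 29) * 1000L, new double[]{9.1, 2.4}));
        executionOrder(recorded).forEach(e -> System.out.println(e.type()));
        // click, slide, longPress
    }
}
```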
That is, as shown in fig. 17, in the process of processing the touch operation, all the touch operations are recorded, and a recorded file is obtained. In one example, an event list may be established in the recording file, where the event list includes a correspondence between each touch operation and its start time, so that the execution order of the respective touch operations may be determined based on the event list when the recording file is played later.
It should be noted that, generating the recording file in the case of recording end is an optional operation in the embodiment of the present application.
Step 907: the touch event management module displays a target interface, wherein the target interface comprises a recorded file.
As an example of the present application, please refer to fig. 3 (b), wherein the target interface includes the recording file and a play option corresponding to the recording file, and the play option is used for triggering the electronic device to automatically execute a corresponding operation based on the behavior data in the recording file.
As an example of the present application, please refer to fig. 4, 5 or 6, the target interface includes description information of the recording file, such as description information including recording time, recording duration, etc. Further, the file name of the recorded file may also be included.
In one example, the target interface may also include relevant configuration options for playing the recorded file, based on which the user may perform some configuration operations. For example, as shown in fig. 4, the target interface includes, but is not limited to, a repeat count setting option and a repeat duration setting option. The repeat count setting option is used for setting the number of times the recorded file is repeatedly played; for example, the number of repeated plays can be set to a target number, and the target number can be set according to actual requirements, for example 3 times. The repeat duration setting option is used for setting the total duration for which the recorded file is repeatedly played; for example, the total duration can be set to a target duration, and the target duration can be set according to actual requirements, for example 15 minutes. It will be appreciated that when setting the target duration, the user may set it according to the recording duration of the recorded file; by way of example and not limitation, the target duration may generally be set to an integer multiple of the recording duration.
In one example, the target interface may further include a start time setting option, where the start time setting option is used to set the time at which playing of the recorded file starts; for example, the start time may be set to 08:00 each day from July 1 to July 5, 2022.
It should be noted that step 907 is an optional operation in the embodiment of the present application. In another example, instead of displaying the target interface, the user may manually open the target interface including the recorded file, for example, the touch event management module may store the recorded file under a specified path after generating the recorded file, so that the user may open the target interface where the recorded file is located based on the specified path, where the specified path may be set according to actual requirements. Or in yet another embodiment, the touch event management module performs the operations of step 908 directly, or after a second preset period of time has elapsed, the touch event management module performs the operations of step 908. The second preset duration may be set according to actual requirements, which is not limited in the embodiment of the present application.
Step 908: the touch event management module receives a play operation of the recorded file in the target interface.
The play operation is used for indicating that the at least one touch operation is automatically triggered in the first application program based on the recorded file.
In an example, as shown in fig. 4, fig. 5, or fig. 6, in the case that the target interface includes a play option, when the user wants the electronic device to automatically perform an operation corresponding to the at least one touch operation in the first application interface of the first application based on the recorded file, the play option corresponding to the recorded file may be clicked, that is, the play operation is triggered.
Step 909: in response to the play operation, when it is determined that the execution order of the second touch operation is reached based on the behavior data of the second touch operation in a case where the first application interface has been opened, the touch event management module traverses in the display order from top to bottom of the respective views currently opened.
The second touch operation is any one of the at least one touch operation.
In the case of receiving a play operation on the recorded file, in order to respond to the corresponding touch operations based on the behavior data in the recorded file, whether the first application program is in a running state may be detected, where the running state of the first application program includes running in the foreground or running in the background. In the case where the first application program is in the running state and the first application interface has been opened, the at least one touch operation can be sequentially responded to in the first application program in the execution order of the at least one touch operation based on the behavior data of the at least one touch operation in the recorded file; in this case, the electronic device performs the operations of steps 909 to 915.
In response to the play operation, the touch event management module may call the recording file in a case where the first application interface is opened, and in one example, as shown in fig. 17, the touch event management module may determine an execution order of each touch operation according to an event list in the recording file, and then sequentially execute the corresponding events according to the execution order. For example, referring to fig. 16, since the start time of the plurality of touch operations corresponding to the behavior data in the record file is sequentially 1 min 10 seconds, 2 min 10 seconds, and 5 min 29 seconds, the execution sequence of the plurality of touch operations is sequentially: click operation, slide operation, long press operation. For any one of the plurality of touch operations (i.e., the second touch operation), the electronic device may issue the second touch operation through the touch event management module to query which application will process the second touch operation, and for this purpose, the touch event management module traverses the currently opened views in a top-to-bottom display order (the later opened view is located at the top of the earlier opened view).
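The playback step — waiting until each recorded start time is reached and then issuing the event — can be sketched as follows. A real player would sleep or schedule a timer where this sketch merely records the computed delays, and all names are illustrative:

```java
import java.util.ArrayList;
import java.util.List;

public class PlaybackDemo {
    /** One entry of the recorded file; names are illustrative. */
    record RecordedEvent(String type, long startOffsetMillis) {}

    /** Replays events in execution order. Returns the delay waited before
     *  each event; a real player would sleep or schedule a timer here, then
     *  issue the event through the dispatch flow of steps 910 to 915. */
    static List<Long> replay(List<RecordedEvent> inOrder, List<String> dispatched) {
        List<Long> delays = new ArrayList<>();
        long previous = 0;
        for (RecordedEvent e : inOrder) {
            delays.add(e.startOffsetMillis() - previous); // wait until the recorded offset
            dispatched.add(e.type());                     // then issue the touch event
            previous = e.startOffsetMillis();
        }
        return delays;
    }

    public static void main(String[] args) {
        List<String> dispatched = new ArrayList<>();
        // fig. 16 offsets: click at 1:10, slide at 2:10, long press at 5:29.
        List<Long> delays = replay(List.of(
                new RecordedEvent("click", 70_000),
                new RecordedEvent("slide", 130_000),
                new RecordedEvent("longPress", 329_000)), dispatched);
        System.out.println(dispatched); // [click, slide, longPress]
        System.out.println(delays);     // [70000, 60000, 199000]
    }
}
```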
Step 910: and when traversing to one view, the touch event management module sends event attribute information of the second touch event to an application program to which the currently traversed view belongs.
The event attribute information of the second touch event includes information such as an event type (e.g., click type, slide type) of the second touch event, in addition to part of the behavior data of the second touch event, that is, an intersection exists between the event attribute information and the behavior data. The partial behavior data included in the event attribute information of the second touch event may include, but is not limited to, position coordinates and the like.
When one view is traversed, the touch event management module sends the event attribute information of the second touch event to the application program to which the currently traversed view belongs, so as to determine whether that application program will process the second touch event. It will be understood that, since the second touch operation is a touch operation that the user performed on the first application program during the foregoing recording process, the first application program will typically process the second touch operation; therefore, sending the event attribute information of the second touch event to the application program to which the currently traversed view belongs amounts to querying whether that application program is the first application program.
Step 911: and the application program sends first indication information to the interception feedback module under the condition that the interception of the second touch event is determined according to the event attribute information of the second touch event, wherein the first indication information is used for indicating interception of the second touch event.
In one possible case, if the application program to which the currently traversed view belongs is a first application program, the first application program determines to intercept the second touch event according to the event attribute information of the second touch event, and at this time, the first application program may send first indication information, for example, the first indication information is true, to the intercept feedback module.
In another possible case, if the application program to which the currently traversed view belongs is not the first application program, the application program to which the currently traversed view belongs determines not to intercept the second touch event after receiving the event attribute information of the second touch event, and in this case, the application program to which the currently traversed view belongs sends second instruction information to the interception feedback module, where the second instruction information is used to instruct not to intercept the second touch event. For example, the second indication information is false. In this case, the touch event management module continues to traverse the next view.
Step 912: the interception feedback module feeds back the first indication information to the touch event management module.
The interception feedback module informs the application program to which the view currently traversed by the touch event management module belongs to intercept the second touch event. In this way, the touch event management module may determine, after receiving the first indication information, that the application to which the currently traversed view belongs is the first application.
Step 913: the touch event management module sends the behavior data of the second touch operation to the event receiving module.
Step 914: the application program obtains the behavior data of the second touch event from the event receiving module.
It is easy to understand that the application program at this time is the first application program, that is, the first application program obtains the behavior data of the second touch event from the event receiving module.
Step 915: the application responds to the second touch event based on the behavior data of the second touch event.
For example, referring to fig. 3 (d), in the case where the second touch operation is a click operation, the first application program may determine from the behavior data of the second touch operation that the chick is clicked at this time, so the first application program increases the number of interactions with the chick and displays the interaction dialogue with the chick. In this way, each time the execution order of a certain touch operation of the at least one touch operation is reached, that touch operation can be responded to according to the above-described flow.
And the electronic equipment responds to at least one touch operation in turn according to the flow. In an example, if the touch operation involved in the recording file further includes an operation of switching from the first application interface to the third application interface and a touch operation in the third application interface, after a touch event is issued according to the above procedure in the process of playing the recording file, the touch operation may be automatically switched from the first application interface to the third application interface, and an event corresponding to the touch operation is executed in the third application interface.
In one example, the at least one touch operation includes a first touch operation that invokes a second application program; for example, the first application interface of the first application program includes a link address capable of invoking the second application program. When the first touch operation is a click operation on the link address, the touch event management module queries whether the second application program is in a running state; if the second application program is not in a running state, the second application program may be started. If the second application program is in a running state, no other operation may be performed, or an application interface of the second application program may be displayed.
In one example, referring to fig. 18, the touch event processing module may include an application state monitoring module, where the application state monitoring module is configured to monitor the running state of application programs. For example, when an application program is started, it may register with the application state monitoring module to notify the application state monitoring module that it has entered the running state, and may carry an application identifier when registering so that the application state monitoring module knows which application program it is. When the application program is closed, it may unregister from the application state monitoring module to notify the application state monitoring module that it has entered the closed state, and may likewise carry the application identifier when unregistering. In this manner, when executing, based on the recorded file, the operation that invokes the second application program, the touch event management module may query the application state monitoring module for the state of the second application program. In one example, the application state monitoring module is ActivityManager.
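The register/unregister/query pattern of the application state monitoring module can be sketched without the Android framework. In real Android this role is played by ActivityManager; the class below is an illustrative stand-in, and the names are assumptions:

```java
import java.util.HashSet;
import java.util.Set;

public class AppStateListenerDemo {
    /** Simplified stand-in for the application state monitoring module. */
    static class AppStateListener {
        private final Set<String> running = new HashSet<>();
        void register(String appId)   { running.add(appId); }    // app started
        void unregister(String appId) { running.remove(appId); } // app closed
        boolean isRunning(String appId) { return running.contains(appId); }
    }

    /** The step from the text: before invoking the second application,
     *  query its state and start it only if it is not already running. */
    static boolean ensureStarted(AppStateListener listener, String appId) {
        if (!listener.isRunning(appId)) {
            listener.register(appId); // stands in for actually launching the app
            return true;              // had to start it
        }
        return false;                 // already running, nothing to do
    }

    public static void main(String[] args) {
        AppStateListener l = new AppStateListener();
        System.out.println(ensureStarted(l, "secondApp")); // true: started now
        System.out.println(ensureStarted(l, "secondApp")); // false: already running
    }
}
```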
Further, if the touch operation is performed in the second application after the second application is started, the touch operation in the second application is responded by the second application in the process of playing the recorded file.
As an example of the present application, please refer to fig. 19, in the process of sequentially responding to at least one touch operation according to the recording file, the user may also perform touch operations in the application interface of the third application program, and in this case, the electronic device may still issue touch events corresponding to the respective touch operations, so as to normally respond to the touch operation of the user in the application interface of the third application program, and normally respond to a certain touch operation of the at least one touch operation, that is, when performing automatic response to the at least one touch operation based on the recording file, other touch operations of the user in the electronic device may not be affected. Wherein the third application is any one of the applications in the electronic device, in one example, the third application is the same application as the first application.
That is, in the process of sequentially responding to the at least one touch operation in the first application program in the execution order of the at least one touch operation based on the behavior data of the at least one touch operation, a touch operation of the user in the application interface of the third application program is received, and the user's touch operation is responded to in the third application program. For example, referring to fig. 20, in the process of automatically responding to the at least one touch operation based on the recorded file, the user performs a click operation in the application interface of the third application program 4 minutes 11 seconds after the start of playing the recorded file, and the click position coordinates of the click operation are (5.5, 7.0). After receiving the click operation, the touch event management module can issue a click event corresponding to the click operation according to the event issuing flow described above, so as to send the behavior data of the click operation to the third application program, so that the third application program responds to the click operation based on the behavior data of the click operation.
As an example of the present application, as shown in fig. 17, in the case where the user has set the number of repeated plays for the recorded file based on the target interface, after the end of the present round of response, that is, after responding based on the behavior data of the touch operation that is last in the execution order in the recorded file, the number of times the recorded file has been repeatedly played may be counted. If the number of repeated plays has not reached the target number, the electronic device continues to respond to the at least one touch operation in the first application program in the execution order of the at least one touch operation based on the behavior data of the at least one touch operation in the recorded file. If the number of repeated plays has reached the target number, the response is determined to be ended.
As an example of the present application, in the case where the user has set the target duration for the recorded file based on the target interface, after the end of the present round of response, that is, after responding based on the behavior data of the touch operation that is last in the execution order in the recorded file, the total duration for which the recorded file has been repeatedly played may be counted. If the total duration of repeated playing has not reached the target duration, the electronic device continues to respond to the at least one touch operation in the first application program in the execution order of the at least one touch operation based on the behavior data of the at least one touch operation in the recorded file. If the total duration of repeated playing has reached the target duration, the response is determined to be ended.
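The two repeat conditions above — a target number of plays or a target total duration — can be sketched as a single predicate. The names and the mode-selection logic (count takes precedence if both are set) are illustrative assumptions:

```java
public class RepeatPlaybackDemo {
    /** Decide whether another playback round should run, given either a
     *  target repeat count or a target total duration. A zero target
     *  means that mode is not configured. */
    static boolean shouldReplay(int playedRounds, int targetCount,
                                long elapsedMillis, long targetMillis) {
        if (targetCount > 0) return playedRounds < targetCount;     // count mode
        if (targetMillis > 0) return elapsedMillis < targetMillis;  // duration mode
        return false;
    }

    public static void main(String[] args) {
        // Count mode: target of 3 repeats.
        int rounds = 0;
        while (shouldReplay(rounds, 3, 0, 0)) rounds++;  // one playback per loop
        System.out.println(rounds); // 3

        // Duration mode: 15-minute target, 5-minute recording per round.
        long recording = 5 * 60_000, elapsed = 0;
        int durationRounds = 0;
        while (shouldReplay(0, 0, elapsed, 15 * 60_000)) { elapsed += recording; durationRounds++; }
        System.out.println(durationRounds); // 3
    }
}
```

Setting the target duration to an integer multiple of the recording duration, as the text suggests, makes the duration mode end exactly at a round boundary.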
It should be noted that this embodiment takes as an example that, upon receiving the user's play operation on the recorded file in the target interface, if the first application interface is opened, the at least one touch operation is sequentially responded to in the first application program in the execution order of the at least one touch operation based on the recorded file. In another example, in the case where a time for starting to play the recorded file has been set, when the electronic device monitors that the current time reaches that time, if the first application interface is opened, it sequentially responds to the at least one touch operation in the first application program in the execution order of the at least one touch operation. For example, if the time for starting to play the recorded file is 08:00 each day from July 1 to July 5, 2022, then at eight o'clock each morning from July 1 to July 5, 2022, if the first application interface is opened, the electronic device automatically responds to the at least one touch operation in the first application program in the execution order of the at least one touch operation based on the recorded file. In yet another example, after the recorded file is generated, in the case where the first application interface has been opened, the electronic device may automatically respond to the at least one touch operation in the first application program in the execution order of the at least one touch operation based on the recorded file. The embodiment of the present application is not limited thereto.
It should be noted that, if the first application interface is not opened, the electronic device may not perform any operation. In another example, in a case where the first application interface is not opened, the electronic device may further display a prompt window on the desktop, where prompt information is displayed in the prompt window, where the prompt information is used to prompt the user whether to start the first application program and open the first application interface, for example, the prompt information is "please confirm whether to start the first application program and open the first application interface". In addition, the prompt window may further include a confirmation option and a cancel option, when the user wants to start the first application program and open the first application interface, the confirmation option may be triggered, and in response to the triggering operation of the user on the confirmation option, the electronic device starts the first application program and opens the first application interface, and then sequentially responds to at least one touch operation according to the execution sequence of the at least one touch operation based on behavior data of the at least one touch operation in the record file. Otherwise, if the user does not want to start the first application program and open the first application interface, the cancel option may be triggered, and in response to the triggering operation of the user on the cancel option, the electronic device closes the prompt window, that is, does not execute other operations related to recording the file.
In the embodiment of the application, after receiving the recording start operation, behavior data of touch operation of a user on a first application interface of a first application program is recorded. After the recording is finished, under the condition that the first application interface is opened, the at least one touch operation is responded in sequence according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation. Therefore, through recording the behavior data of the touch operation of the user, then automatically repeatedly executing the event corresponding to at least one touch operation of the user in the first application interface according to the recorded behavior data, the problem that the operation efficiency is reduced due to fatigue of the user is avoided, and further the task execution effect can be improved.
The electronic device provided by the embodiment of the present application may be, but is not limited to, a mobile phone, a tablet computer, a portable computer, or the like, and supports touch operation or touch screen operation. The electronic device can install application programs, including, for example, the first application program, the second application program, and the third application program described above. Fig. 21 is a schematic structural diagram of an electronic device according to an embodiment of the present application. Referring to fig. 21, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identification module (subscriber identification module, SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100. The controller can generate operation control signals according to instruction operation codes and timing signals, to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves system efficiency.
In some embodiments, the processor 110 may include one or more interfaces, for example, an inter-integrated circuit (inter-integrated circuit, I2C) interface, an inter-integrated circuit sound (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver/transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (general-purpose input/output, GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is merely illustrative and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ an interfacing manner different from those in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device 100 through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charge management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to supply power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In some other embodiments, the power management module 141 may alternatively be disposed in the processor 110. In still other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is an integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function, for example, to store files such as music and videos in the external memory card.
The internal memory 121 may be used to store computer-executable program code, and the code includes instructions. The processor 110 executes various functional applications and data processing of the electronic device 100 by running the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (such as audio data and a phone book) created during use of the electronic device 100, and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (universal flash storage, UFS).
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen, also called a "touch screen". The touch sensor 180K is configured to detect a touch operation acting on or near it. The touch sensor 180K may transfer the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may alternatively be disposed on a surface of the electronic device 100 at a position different from that of the display 194.
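Determining the touch event type from a raw touch could look like the following. A hypothetical Python sketch: the thresholds and function names are invented for illustration and are not specified by the patent.

```python
LONG_PRESS_S = 0.5  # illustrative long-press threshold, in seconds
SLIDE_PX = 10       # illustrative movement threshold, in pixels

def classify_touch(down, up):
    """Classify a (x, y, t) touch-down / touch-up pair into one of the
    operation types used in this document: click, slide, or long press."""
    (x0, y0, t0), (x1, y1, t1) = down, up
    if abs(x1 - x0) > SLIDE_PX or abs(y1 - y0) > SLIDE_PX:
        return "slide"       # finger moved: a sliding operation
    if (t1 - t0) >= LONG_PRESS_S:
        return "long_press"  # held in place past the threshold
    return "click"           # short stationary contact
```

The classified type then selects which behavior data fields (coordinates, times, duration) are recorded for the operation.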
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line (digital subscriber line, DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a digital versatile disc (digital versatile disc, DVD)), or a semiconductor medium (e.g., a solid state disk (solid state disk, SSD)), or the like.
The above embodiments are not intended to limit the present application, and any modifications, equivalent substitutions, improvements, etc. within the technical scope of the present application should be included in the scope of the present application.

Claims (13)

1. A method of operational control, for use in an electronic device, the method comprising:
receiving a recording start operation, wherein the recording start operation is used for triggering the start of recording the touch operation of a user on the screen of the electronic equipment;
recording behavior data of touch operation of the user in a first application interface of a first application program;
under the condition that recording is finished, if the first application interface is opened, sequentially responding to at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the recorded behavior data of the at least one touch operation;
receiving touch operations of the user in an application interface of a third application program in a process of sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation;
responding to the touch operation of the user in the third application program, wherein the third application program is any application program in the electronic equipment;
wherein, when the first application interface is opened, based on the recorded behavior data of at least one touch operation, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation, including:
when the first application interface is opened and the execution sequence of the second touch operation is determined to be reached based on the behavior data of the second touch operation, traversing is carried out according to the display sequence from top to bottom of each currently opened view, wherein the second touch operation is any one touch operation in the at least one touch operation, each view corresponds to one application interface, and different application interfaces belong to the same application program or to different application programs;
querying whether an application program to which a currently traversed view belongs is the first application program or not when traversing to one view;
if the application program to which the currently traversed view belongs is not the first application program, continuing to traverse the next view;
and sending the behavior data of the second touch operation to the first application program for processing and ending the traversing operation when the application program to which the currently traversed view belongs is the first application program.
2. The method of claim 1, wherein, in the case that the recording is finished, if the first application interface is opened, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the recorded behavior data of the at least one touch operation comprises:
generating a recording file under the condition that recording is finished, wherein the recording file comprises behavior data of the at least one touch operation;
and under the condition that the first application interface is opened, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation in the recorded file.
3. The method of claim 2, wherein the sequentially responding to the at least one touch operation in the first application interface in the execution order of the at least one touch operation based on behavior data of the at least one touch operation in the recorded file in the case that the first application interface has been opened, comprises:
displaying a target interface, wherein the target interface comprises the recorded file;
responding to a playing instruction of the recorded file in the target interface, and under the condition that the first application interface is opened, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on behavior data of the at least one touch operation in the recorded file.
4. The method of claim 3, wherein responding to the play instruction of the recording file in the target interface, in the case that the first application interface is opened, based on the behavior data of the at least one touch operation in the recording file, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation, comprises:
receiving a repetition number setting instruction in the target interface, wherein the repetition number setting instruction carries a target number of times, and the repetition number setting instruction is used for indicating that the at least one touch operation is to be repeatedly executed the target number of times;
responding to a playing instruction of the recorded file in the target interface, and sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on behavior data of the at least one touch operation in the recorded file under the condition that the first application interface is opened;
after the current round of playing is finished, counting the repeated playing times of the recorded file;
if the repeated playing times do not reach the target times, continuing to respond to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation in the recorded file;
and if the repeated playing times reach the target times, determining that the response is ended.
5. The method of any of claims 1-4, wherein the at least one touch operation comprises a first touch operation invoking a second application; and under the condition that the first application interface is opened, based on the recorded behavior data of at least one touch operation, sequentially responding to the at least one touch operation on the first application interface according to the execution sequence of the at least one touch operation, wherein the method comprises the following steps:
if the execution sequence of the first touch operation is determined to be currently reached based on the behavior data of the first touch operation under the condition that the first application interface is opened, monitoring whether the second application program is in a running state;
and starting the second application program under the condition that the second application program is not in a running state.
6. The method of claim 1, wherein the electronic device comprises a touch event management module, an intercept feedback module, and an event receiving module; and when traversing to one view, inquiring whether the application program to which the currently traversed view belongs is the first application program or not, wherein the method comprises the following steps:
when the touch event management module traverses to one view, event attribute information of a second touch event is sent to an application program to which the currently traversed view belongs, and the event attribute information of the second touch event is determined based on behavior data of the second touch event;
the application program sends first indication information to the interception feedback module under the condition that the second touch event is intercepted according to the event attribute information of the second touch event, wherein the first indication information is used for indicating interception of the second touch event;
the interception feedback module feeds back the first indication information to the touch event management module;
the touch event management module determines that the application to which the currently traversed view belongs is the first application.
7. The method of claim 6, wherein, in the case that the application to which the currently traversed view belongs is the first application, sending the behavior data of the second touch operation to the first application for processing includes:
the touch event management module sends the behavior data of the second touch operation to the event receiving module;
the first application program obtains the behavior data of the second touch event from the event receiving module;
the first application responds to the second touch event based on behavior data of the second touch event.
8. The method of any one of claims 1-4, wherein, in the case that the number of the at least one touch operation is a plurality, operation types of the plurality of touch operations include one or more of a click operation, a slide operation, and a long press operation.
9. The method of claim 8, wherein,
in the case that the at least one touch operation includes the click operation, behavior data of the click operation includes click position coordinates and click time of the click operation;
in the case where the at least one touch operation includes the sliding operation, behavior data of the sliding operation includes a sliding start position coordinate, a sliding end position coordinate, and a sliding start time of the sliding operation;
in the case that the at least one touch operation includes the long press operation, behavior data of the long press operation includes long press position coordinates of the long press operation, a long press start time and a long press duration.
10. The method of any one of claims 1-4, wherein after said receiving a recording start operation, further comprising:
each time a touch operation of the user in the first application interface is received, responding to the touch operation in the first application program.
11. The method according to any one of claims 1-4, wherein, in the case of the recording ending, if the first application interface is already opened, sequentially responding to the at least one touch operation in the first application interface according to the execution order of the at least one touch operation based on the recorded behavior data of the at least one touch operation, including:
Under the condition that recording ending operation is received, if the first application interface is opened, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation; or,
and under the condition that the recording duration reaches the preset recording duration, if the first application interface is opened, sequentially responding to the at least one touch operation in the first application interface according to the execution sequence of the at least one touch operation based on the behavior data of the at least one touch operation.
12. An electronic device, wherein the electronic device comprises a processor and a memory;
the memory being for storing a program for supporting the electronic device to perform the method of any one of claims 1-11, and for storing data for implementing the method of any one of claims 1-11;
the processor is configured to execute a program stored in the memory.
13. A computer readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the method of any of claims 1-11.
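The view traversal in claim 1 — walk the currently opened views in top-to-bottom display order, query the application program each view belongs to, and deliver the behavior data to the first view owned by the first application program — can be sketched as follows. This is an illustrative Python sketch, not part of the claims; the data shapes and names are invented.

```python
def dispatch_to_first_app(views, first_app, behavior_data):
    """Traverse views in top-to-bottom display order and send the touch
    behavior data to the first application program's view, then stop.
    `views` is a list of (owning_app, handler) pairs; names illustrative."""
    for owning_app, handler in views:   # top-to-bottom display order
        if owning_app == first_app:     # query the owning application
            handler(behavior_data)      # send data for processing
            return True                 # end the traversal
        # not the first application: continue to the next view
    return False  # no view of the first application is currently open
```

Only the topmost matching view receives the operation, which mirrors the "ending the traversing operation" step of the claim.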
CN202211343134.4A 2022-10-31 2022-10-31 Operation control method, electronic device and readable storage medium Active CN115686334B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211343134.4A CN115686334B (en) 2022-10-31 2022-10-31 Operation control method, electronic device and readable storage medium


Publications (2)

Publication Number Publication Date
CN115686334A CN115686334A (en) 2023-02-03
CN115686334B true CN115686334B (en) 2023-11-28

Family

ID=85046802

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211343134.4A Active CN115686334B (en) 2022-10-31 2022-10-31 Operation control method, electronic device and readable storage medium

Country Status (1)

Country Link
CN (1) CN115686334B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116185268B (en) * 2023-02-17 2024-01-30 深圳市和风科技有限公司 Interaction method, system, medium and computer of code scanning gun and terminal equipment

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108038059A (en) * 2017-12-22 2018-05-15 广州酷狗计算机科技有限公司 Interface traversal method and device
CN109542553A (en) * 2018-10-26 2019-03-29 北京慧流科技有限公司 The information extraction method and device and storage medium of user interface UI element
CN110389802A (en) * 2019-06-05 2019-10-29 华为技术有限公司 A kind of display methods and electronic equipment of flexible screen
CN110703948A (en) * 2019-10-09 2020-01-17 展讯通信(上海)有限公司 Touch screen operation recording and broadcasting system and method
CN110928787A (en) * 2019-11-22 2020-03-27 北京博睿宏远数据科技股份有限公司 Automatic test script recording and playback method, device, equipment and storage medium
CN111143200A (en) * 2019-12-12 2020-05-12 广州华多网络科技有限公司 Method and device for recording and playing back touch event, storage medium and equipment
CN114443447A (en) * 2021-12-17 2022-05-06 苏州浪潮智能科技有限公司 Webpage operation playback method and device, computer equipment and medium
CN114650330A (en) * 2020-12-18 2022-06-21 华为技术有限公司 Method, electronic equipment and system for adding operation sequence
CN114692049A (en) * 2022-03-29 2022-07-01 医渡云(北京)技术有限公司 Browser-based screen recording method and device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110191676A1 (en) * 2010-01-29 2011-08-04 Microsoft Corporation Cross-Browser Interactivity Recording, Playback, and Editing


Also Published As

Publication number Publication date
CN115686334A (en) 2023-02-03


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant