CN117396834A - Gesture processing method, device and equipment based on processor - Google Patents


Info

Publication number
CN117396834A
CN117396834A (application CN202280006545.6A)
Authority
CN
China
Prior art keywords: interface, touch, task, touch operation, display screen
Prior art date
Legal status: Pending (an assumption, not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number
CN202280006545.6A
Other languages
Chinese (zh)
Inventor
Wang Jiahao (王嘉浩)
Current Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Original Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Shiyuan Electronics Thecnology Co Ltd, Guangzhou Shirui Electronics Co Ltd filed Critical Guangzhou Shiyuan Electronics Thecnology Co Ltd
Publication of CN117396834A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/445: Program loading or initiating

Abstract

The application provides a processor-based gesture processing method, apparatus and device, relating to touch control technology. The method includes the following steps: if the currently running desktop application is determined to be a preset launcher among at least two launchers, determining a gesture recognition mode corresponding to the preset launcher, where the gesture recognition mode indicates gesture recognition information (101); in response to a user's touch operation, determining a touch instruction corresponding to the touch operation according to the gesture recognition mode (102); entering a task interface corresponding to the touch instruction, where the task interface displays the applications running in the background on the electronic device (103); and, in response to a click operation on the task interface, processing an application in the task interface (104). The method makes the electronic device compatible with multiple launchers, solving the technical problem that an electronic device supports multi-task management under only a single type of launcher.

Description

Gesture processing method, device and equipment based on processor
Technical Field
The present disclosure relates to touch control technologies, and in particular, to a processor-based gesture processing method, apparatus and device.
Background
Currently, an electronic device in use needs a background multi-task management function, and therefore a launcher needs to be deployed on the electronic device.
In the prior art, a user can perform multi-task gesture operations in the native launcher of the electronic device, thereby realizing the multi-task management function.
However, in the prior art, the electronic device supports multi-task gesture operations only in the native launcher and is not compatible with other launchers, so the multi-task management function applies to only a single scenario, and the electronic device supports multi-task management under only a single type of launcher.
Disclosure of Invention
The application provides a processor-based gesture processing method, apparatus and device to solve the technical problem that an electronic device supports multi-task management under only a single type of launcher.
In a first aspect, the present application provides a processor-based gesture processing method applied to an electronic device on which at least two launchers are deployed, a launcher being used to start a desktop application; the method includes:
if the currently running desktop application is determined to be a preset launcher among the at least two launchers, determining a gesture recognition mode corresponding to the preset launcher, where the gesture recognition mode indicates gesture recognition information;
in response to a user's touch operation, determining a touch instruction corresponding to the touch operation according to the gesture recognition mode;
entering a task interface corresponding to the touch instruction, where the task interface displays the applications running in the background on the electronic device;
and, in response to a click operation on the task interface, processing an application in the task interface.
Further, if the currently running desktop application is determined to be a preset launcher among the at least two launchers, determining the gesture recognition mode corresponding to the preset launcher includes:
if the currently running desktop application is determined to be a preset launcher among the at least two launchers, determining the gesture recognition mode corresponding to the preset launcher according to a mapping relation between preset launchers and gesture recognition modes, where the gesture recognition mode indicates gesture recognition information.
Further, if the electronic device supports screen rotation, a preset area is provided on the display screen of the electronic device, and determining, in response to a user's touch operation, a touch instruction corresponding to the touch operation according to the gesture recognition mode includes:
in response to a user's touch operation that represents a touch point sliding from the preset area onto the display screen outside the preset area, determining a touch instruction corresponding to the touch operation according to the gesture recognition mode.
Further, if the electronic device does not support screen rotation, determining, in response to a user's touch operation, a touch instruction corresponding to the touch operation according to the gesture recognition mode includes:
in response to a user's touch operation that represents a touch point sliding from the preset area onto the display screen outside the preset area, determining the display type of the display screen;
and determining a touch instruction corresponding to the touch operation according to the gesture recognition mode and the display type of the display screen.
Further, determining the touch instruction corresponding to the touch operation according to the gesture recognition mode and the display type of the display screen includes:
determining an input event consumer corresponding to the display type of the display screen, where the input event consumer represents the processing logic information corresponding to that display type;
and determining the touch instruction corresponding to the touch operation according to the gesture recognition mode and the processing logic information corresponding to the display type.
Further, determining the input event consumer corresponding to the display type of the display screen includes:
if the display type of the display screen is determined to be a full-screen interface, determining that the input event consumer corresponding to the display screen is a full-screen-interface input event consumer, where the full-screen-interface input event consumer represents the processing logic information corresponding to the full-screen interface;
if the display type of the display screen is determined to be a portrait interface, determining that the input event consumer corresponding to the display screen is a portrait-interface input event consumer, where the portrait-interface input event consumer represents the processing logic information corresponding to the portrait interface.
Further, determining the touch instruction corresponding to the touch operation according to the gesture recognition mode and the processing logic information corresponding to the display type includes:
determining the last input event of the touch operation on the display screen according to the processing logic information corresponding to the display type;
and, if the last input event is determined to be executing the current touch operation, generating a touch instruction corresponding to the touch operation.
Further, the method further includes:
if the last input event is determined to be cancelling the current touch operation, returning to the interface displayed before the touch operation.
Further, processing an application in the task interface in response to a click operation on the task interface includes:
in response to a click operation on any application in the task interface, displaying that application.
Further, the touch operation carries touch information, the touch information including a speed value, an acceleration value, coordinate values, and the touch dwell time of the touch point.
Further, the method further includes:
generating an interpolator according to the speed value, the acceleration value, the coordinate values and the touch dwell time of the touch point, and using the current interface of the display screen as a task thumbnail source, where the interpolator characterizes the animation change effect of the current interface, namely its scaling and transparency;
and, in response to the touch operation, displaying the current interface according to the animation change effect of the interpolator.
Further, the task interface includes a multi-task overview interface, the interface of the display screen, or the interface displayed before the touch operation;
the multi-task overview interface includes at least one task card group, and a task card group includes an application icon and the task thumbnail source corresponding to that application icon.
Further, the multi-task overview interface also includes a one-tap clear button, and the method further includes:
in response to a trigger operation on the one-tap clear button, clearing the task card groups in the multi-task overview interface, displaying a cleared prompt message, and returning to the interface of the display screen after a preset time period, where the prompt message fades out by gradually changing its transparency over the preset time period.
Further, the method further includes:
in response to a click operation on a blank area of the multi-task overview interface outside the task card groups, returning from the multi-task overview interface to the interface of the display screen in a preset transparency-fade manner.
In a second aspect, the present application provides a processor-based gesture processing apparatus applied to an electronic device on which at least two launchers are deployed, a launcher being used to start a desktop application; the apparatus includes:
a first determining unit, configured to determine a gesture recognition mode corresponding to a preset launcher if the currently running desktop application is determined to be the preset launcher among the at least two launchers, where the gesture recognition mode indicates gesture recognition information;
a second determining unit, configured to determine, in response to a user's touch operation, a touch instruction corresponding to the touch operation according to the gesture recognition mode;
an entering unit, configured to enter a task interface corresponding to the touch instruction, where the task interface displays the applications running in the background on the electronic device;
and a processing unit, configured to process an application in the task interface in response to a click operation on the task interface.
Further, the first determining unit is specifically configured to:
determine, if the currently running desktop application is determined to be a preset launcher among the at least two launchers, a gesture recognition mode corresponding to the preset launcher according to a mapping relation between preset launchers and gesture recognition modes, where the gesture recognition mode indicates gesture recognition information.
Further, if the electronic device supports screen rotation, a preset area is provided on the display screen of the electronic device, and the second determining unit is specifically configured to:
determine, in response to a user's touch operation that represents a touch point sliding from the preset area onto the display screen outside the preset area, a touch instruction corresponding to the touch operation according to the gesture recognition mode.
Further, if the electronic device does not support screen rotation, the second determining unit includes:
a first determining module, configured to determine, in response to a user's touch operation that represents a touch point sliding from the preset area onto the display screen outside the preset area, the display type of the display screen;
and a second determining module, configured to determine a touch instruction corresponding to the touch operation according to the gesture recognition mode and the display type of the display screen.
Further, the second determining module includes:
a first determining submodule, configured to determine an input event consumer corresponding to the display type of the display screen, where the input event consumer represents the processing logic information corresponding to that display type;
and a second determining submodule, configured to determine a touch instruction corresponding to the touch operation according to the gesture recognition mode and the processing logic information corresponding to the display type.
Further, the first determining submodule includes:
a third determining submodule, configured to determine, if the display type of the display screen is determined to be a full-screen interface, that the input event consumer corresponding to the display screen is a full-screen-interface input event consumer, where the full-screen-interface input event consumer represents the processing logic information corresponding to the full-screen interface;
and a fourth determining submodule, configured to determine, if the display type of the display screen is determined to be a portrait interface, that the input event consumer corresponding to the display screen is a portrait-interface input event consumer, where the portrait-interface input event consumer represents the processing logic information corresponding to the portrait interface.
Further, the second determining submodule includes:
a fifth determining submodule, configured to determine the last input event of the touch operation on the display screen according to the processing logic information corresponding to the display type;
and a generating submodule, configured to generate a touch instruction corresponding to the touch operation if the last input event is determined to be executing the current touch operation.
Further, the apparatus further includes:
a returning submodule, configured to return to the interface displayed before the touch operation if the last input event is determined to be cancelling the current touch operation.
Further, the processing unit is specifically configured to:
display, in response to a click operation on any application in the task interface, that application.
Further, the touch operation carries touch information, the touch information including a speed value, an acceleration value, coordinate values, and the touch dwell time of the touch point.
Further, the apparatus further includes:
a generating unit, configured to generate an interpolator according to the speed value, the acceleration value, the coordinate values and the touch dwell time of the touch point, and to use the current interface of the display screen as a task thumbnail source, where the interpolator characterizes the animation change effect of the current interface, namely its scaling and transparency;
and a display unit, configured to display, in response to the touch operation, the current interface according to the animation change effect of the interpolator.
Further, the task interface includes a multi-task overview interface, the interface of the display screen, or the interface displayed before the touch operation;
the multi-task overview interface includes at least one task card group, and a task card group includes an application icon and the task thumbnail source corresponding to that application icon.
Further, the multi-task overview interface also includes a one-tap clear button, and the apparatus further includes:
a clearing unit, configured to clear, in response to a trigger operation on the one-tap clear button, the task card groups in the multi-task overview interface, display a cleared prompt message, and return to the interface of the display screen after a preset time period, where the prompt message fades out by gradually changing its transparency over the preset time period.
Further, the apparatus further includes:
a returning unit, configured to return, in response to a click operation on a blank area of the multi-task overview interface outside the task card groups, from the multi-task overview interface to the interface of the display screen in a preset transparency-fade manner.
In a third aspect, the present application provides an electronic device comprising a memory and a processor, the memory storing a computer program executable on the processor, where the processor implements the method of the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium having stored therein computer-executable instructions for performing the method of the first aspect when executed by a processor.
In a fifth aspect, the present application provides a computer program product comprising a computer program which, when executed by a processor, implements the method of the first aspect.
According to the processor-based gesture processing method, apparatus and device provided by the application, if the currently running desktop application is determined to be the preset launcher among the at least two launchers, a gesture recognition mode corresponding to the preset launcher is determined, where the gesture recognition mode indicates gesture recognition information. In response to the user's touch operation, a touch instruction corresponding to the touch operation is determined according to the gesture recognition mode. A task interface corresponding to the touch instruction is entered, where the task interface displays the applications running in the background on the electronic device. In response to a click operation on the task interface, an application in the task interface is processed. In this scheme, the gesture recognition mode corresponding to the preset launcher can be determined, the touch instruction corresponding to the touch operation is then determined according to that gesture recognition mode, the task interface corresponding to the touch instruction is entered, and the user performs multi-task processing on the applications in the task interface. Because a gesture recognition mode is developed on the electronic device for each launcher, multi-task processing can be performed under whichever launcher is running according to that launcher's gesture recognition mode, so the electronic device is compatible with multiple launchers, solving the technical problem that an electronic device supports multi-task management under only a single type of launcher.
Drawings
FIG. 1 is a schematic flow chart of a processor-based gesture processing method according to an embodiment of the present application;
FIG. 2 is a schematic view of a multi-task overview interface scenario according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of another processor-based gesture processing method according to an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a processor-based gesture processing apparatus according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of another processor-based gesture processing apparatus according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 7 is a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure.
In one example, an electronic device in use needs a background multi-task management function, and therefore a launcher needs to be deployed on the electronic device. In the prior art, a user can perform multi-task gesture operations in the native launcher of the electronic device, thereby realizing the multi-task management function. However, in the prior art, the electronic device supports multi-task gesture operations only in the native launcher and is not compatible with other launchers, so the multi-task management function applies to only a single scenario, and the electronic device supports multi-task management under only a single type of launcher.
The application provides a processor-based gesture processing method, apparatus and device that aim to solve the above technical problems in the prior art.
The technical solutions of the present application, and how they solve the above technical problems, are described in detail below with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present application are described below with reference to the accompanying drawings.
FIG. 1 is a schematic flow chart of a processor-based gesture processing method provided in an embodiment of the present application. As shown in FIG. 1, the method is applied to an electronic device on which at least two launchers are deployed, a launcher being used to start a desktop application; the method includes the following steps:
101. If the currently running desktop application is determined to be a preset launcher among the at least two launchers, determine a gesture recognition mode corresponding to the preset launcher; the gesture recognition mode indicates gesture recognition information.
For example, the execution body of this embodiment may be an electronic device, a terminal device, a processor-based gesture processing apparatus or device, or any other apparatus or device that can execute this embodiment, which is not limited here. In this embodiment, the execution body is described as an electronic device.
First, at least two launchers are deployed on the electronic device for starting desktop applications. The electronic device runs the Android system; one launcher is the device's native launcher, and the others are launchers downloaded and installed by the user, i.e. third-party launchers, of which several may be deployed. The preset launcher is the launcher the user has set to start by default at boot. When the electronic device is running, it first judges whether the currently running desktop application is a preset launcher among the at least two launchers. If so, the gesture recognition mode corresponding to the preset launcher can be determined according to the mapping relation between preset launchers and gesture recognition modes; the gesture recognition mode indicates gesture recognition information, and the gesture recognition information is used to recognize touch operations.
For example, after downloading at least one third-party launcher, the user may set the preset launcher in the system settings, so that the next time the electronic device is turned on it enters the desktop application corresponding to the preset launcher. Alternatively, if the user has not set a preset launcher, a selection pop-up appears at each boot in which the user can choose one desktop application to start. When the desktop application is entered, the package manager of the Android system judges whether the launcher the user is currently using is the preset one. If the current launcher is determined to be the native Android launcher, the flow jumps directly to the processing logic of the native launcher; if the current launcher is determined to be the preset launcher, i.e. not the native Android launcher, the gesture recognition mode corresponding to the preset launcher is determined.
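The launcher check above can be sketched in plain Java with no Android dependencies: the current launcher's package name is looked up in the mapping relation between preset launchers and gesture recognition modes, falling back to the native launcher's own logic. The package names and mode strings here are illustrative assumptions, not identifiers from the patent; on a real device the current launcher package would be obtained through the Android PackageManager.

```java
import java.util.Map;

// Sketch of step 101: look up the gesture recognition mode registered
// for the currently running desktop application (launcher).
// Package names and mode values are illustrative assumptions.
public class GestureModeRegistry {
    public static final String NATIVE_LAUNCHER = "com.android.launcher3";

    // Mapping relation between preset launchers and gesture recognition modes.
    private static final Map<String, String> MODE_BY_LAUNCHER = Map.of(
            "com.example.thirdparty.launcher", "THIRD_PARTY_SWIPE_UP",
            NATIVE_LAUNCHER, "NATIVE");

    /** Returns the gesture recognition mode for the running launcher,
     *  or "NATIVE" to fall through to the native launcher's own logic. */
    public static String resolveMode(String currentLauncherPackage) {
        return MODE_BY_LAUNCHER.getOrDefault(currentLauncherPackage, "NATIVE");
    }

    public static void main(String[] args) {
        System.out.println(resolveMode("com.example.thirdparty.launcher")); // THIRD_PARTY_SWIPE_UP
        System.out.println(resolveMode("com.unknown.launcher"));            // NATIVE
    }
}
```

Keeping the mapping in one registry is what lets the same gesture pipeline serve every deployed launcher.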
102. In response to the user's touch operation, determine a touch instruction corresponding to the touch operation according to the gesture recognition mode.
The touch operation carries touch information about the movement of the user's finger, including the speed value, acceleration value, coordinate values and touch dwell time of the touch point. The electronic device has a display screen on which a preset area is provided; the preset area may, without limitation, be a rectangular area located at the bottom of the display screen. When a touch operation of the user is received that represents a touch point sliding from the preset area onto the display screen outside the preset area, the touch instruction corresponding to the touch operation is determined according to the gesture recognition mode. The touch instruction may be to enter the multi-task overview interface, to enter the interface of the display screen, to enter the interface displayed before the touch operation, and so on.
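The qualifying gesture described above, a touch that starts inside the preset area and ends on the display screen outside it, can be sketched as a pure geometry check. The screen and area dimensions are illustrative assumptions.

```java
// Sketch of the preset-area gesture: a touch point that slides from the
// preset (bottom) area onto the display screen outside that area.
// Screen size and area height are illustrative assumptions.
public class PresetAreaGesture {
    // Preset rectangular area: full width, bottom 48 px of the screen.
    static final int SCREEN_W = 1080, SCREEN_H = 1920, AREA_HEIGHT = 48;

    static boolean inPresetArea(int x, int y) {
        return x >= 0 && x < SCREEN_W && y >= SCREEN_H - AREA_HEIGHT && y < SCREEN_H;
    }

    /** True when the touch starts in the preset area and ends outside it. */
    static boolean isSwipeOutOfPresetArea(int downX, int downY, int upX, int upY) {
        return inPresetArea(downX, downY) && !inPresetArea(upX, upY)
                && upX >= 0 && upX < SCREEN_W && upY >= 0 && upY < SCREEN_H;
    }

    public static void main(String[] args) {
        // Swipe up from the bottom edge toward the screen centre.
        System.out.println(isSwipeOutOfPresetArea(540, 1910, 540, 900));  // true
        // A short drag that stays inside the preset area does not qualify.
        System.out.println(isSwipeOutOfPresetArea(540, 1910, 560, 1905)); // false
    }
}
```

In a real implementation the down and up coordinates would come from the platform's touch events rather than literals.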
103. Enter the task interface corresponding to the touch instruction according to the touch instruction; the task interface displays the applications running in the background on the electronic device.
For example, the electronic device may calculate, according to the touch instruction and in combination with the global state variable of the system interface recorded in the Android system, the last input event on the display screen, and trigger entry into the task interface corresponding to the touch instruction when the corresponding state condition is reached. The global state variable stores the state of the interface shown on the display screen. For example: sliding down from the top of the display screen shows a menu of common setting items; in the state where this menu is displayed, the electronic device enters the interface of the display screen corresponding to the touch instruction; in the state where the menu is not displayed, the electronic device enters the multi-task overview interface corresponding to the touch instruction; and in the state where the menu is not displayed, if the touch dwell time of the touch operation is short, the electronic device returns to the interface displayed before the touch operation.
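The three-way dispatch above can be modelled as a small decision function over the recorded interface state and the touch dwell time. The 300 ms threshold for a "short" touch and all names are illustrative assumptions.

```java
// Sketch of step 103's dispatch: which task interface the touch
// instruction leads to, given the menu state and touch dwell time.
// The dwell threshold is an illustrative assumption.
public class TaskInterfaceDispatcher {
    enum Target { DISPLAY_SCREEN_INTERFACE, MULTI_TASK_OVERVIEW, PREVIOUS_INTERFACE }

    static final long SHORT_DWELL_MS = 300; // assumed cutoff for a "short" touch

    static Target dispatch(boolean settingsMenuShown, long touchDwellMs) {
        if (settingsMenuShown) {
            return Target.DISPLAY_SCREEN_INTERFACE; // menu shown: enter screen interface
        }
        if (touchDwellMs < SHORT_DWELL_MS) {
            return Target.PREVIOUS_INTERFACE;       // short dwell: restore prior interface
        }
        return Target.MULTI_TASK_OVERVIEW;          // otherwise: multi-task overview
    }

    public static void main(String[] args) {
        System.out.println(dispatch(true, 500));  // DISPLAY_SCREEN_INTERFACE
        System.out.println(dispatch(false, 100)); // PREVIOUS_INTERFACE
        System.out.println(dispatch(false, 500)); // MULTI_TASK_OVERVIEW
    }
}
```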
104. In response to a click operation on the task interface, process an application in the task interface.
For example, the task interface includes the multi-task overview interface, the interface of the display screen, or the interface displayed before the touch operation. FIG. 2 is a schematic view of a multi-task overview interface scenario provided in an embodiment of the present application. As shown in FIG. 2, the multi-task overview interface includes a top status bar, the preset area at the bottom, a one-tap clear button, and at least one task card group; a task card group includes an application icon and the task thumbnail source corresponding to that icon. A callback event method is defined for the one-tap clear button: when a trigger operation on the button is received, all task card groups of background-running applications displayed in the multi-task overview interface slide up and disappear in response, a "No recent content" prompt is displayed for a preset time period (for example, 1 s), and the interface of the display screen is then restored with a transparency-fade animation.
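The one-tap clear callback just described can be sketched as follows: the callback empties the list of task card groups and yields the prompt text shown before the fade-out. The class and method names are illustrative assumptions, not the patent's identifiers.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the one-tap clear behaviour: clear all task card groups in
// the multi-task overview interface and report the prompt to display
// before returning to the display screen interface.
public class MultiTaskOverview {
    private final List<String> taskCardGroups = new ArrayList<>();

    void addTaskCard(String appName) { taskCardGroups.add(appName); }

    int taskCount() { return taskCardGroups.size(); }

    /** Callback for the one-tap clear button; returns the prompt text
     *  shown for the preset period before the fade-out animation. */
    String onOneTapClear() {
        taskCardGroups.clear();
        return "No recent content";
    }

    public static void main(String[] args) {
        MultiTaskOverview overview = new MultiTaskOverview();
        overview.addTaskCard("Browser");
        overview.addTaskCard("Notes");
        System.out.println(overview.onOneTapClear()); // No recent content
        System.out.println(overview.taskCount());     // 0
    }
}
```

The slide-up and transparency-fade animations themselves would be driven by the platform's animation framework and are omitted here.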
In the embodiment of the application, if the currently running desktop application is determined to be the preset launcher among the at least two launchers, the gesture recognition mode corresponding to the preset launcher is determined, where the gesture recognition mode indicates gesture recognition information. In response to the user's touch operation, the touch instruction corresponding to the touch operation is determined according to the gesture recognition mode. The task interface corresponding to the touch instruction is entered, where the task interface displays the applications running in the background on the electronic device. In response to a click operation on the task interface, an application in the task interface is processed. In this scheme, the gesture recognition mode corresponding to the preset launcher can be determined, the touch instruction corresponding to the touch operation is then determined according to that gesture recognition mode, the task interface corresponding to the touch instruction is entered, and the user performs multi-task processing on the applications in the task interface. Because a gesture recognition mode is developed on the electronic device for each launcher, multi-task processing can be performed under whichever launcher is running according to that launcher's gesture recognition mode, so the electronic device is compatible with multiple launchers, solving the technical problem that an electronic device supports multi-task management under only a single type of launcher.
FIG. 3 is a schematic flowchart of another processor-based gesture processing method provided in an embodiment of the present application. As shown in FIG. 3, the method is applied to an electronic device on which at least two launchers are deployed, where a launcher is used to start a desktop application. The method includes the following steps:
201. If it is determined that the currently running desktop application is a preset launcher among the at least two launchers, determine the gesture recognition mode corresponding to the preset launcher according to the mapping relation between preset launchers and gesture recognition modes; the gesture recognition mode is used to indicate gesture recognition information.
Illustratively, this step may refer to step 101 in fig. 1, and will not be described in detail.
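The mapping relation in step 201 can be sketched as a simple lookup table from launcher identity to gesture recognition mode. This is a minimal illustration under stated assumptions, not the patent's implementation; the package names and mode identifiers below are invented for the example.

```java
import java.util.Map;

public class GestureModeRegistry {
    // Hypothetical mapping relation between preset launchers and gesture
    // recognition modes; keys and values are illustrative assumptions.
    private static final Map<String, String> LAUNCHER_TO_MODE = Map.of(
            "com.example.launcher3", "MODE_FULLSCREEN_SWIPE",
            "com.example.classic",   "MODE_PORTRAIT_WINDOW");

    /** Returns the gesture recognition mode for the currently running desktop
     *  application, or null if it is not a preset launcher. */
    public static String modeFor(String currentLauncherPackage) {
        return LAUNCHER_TO_MODE.get(currentLauncherPackage);
    }

    public static void main(String[] args) {
        System.out.println(modeFor("com.example.launcher3")); // preset launcher
        System.out.println(modeFor("com.other.home"));        // not preset: null
    }
}
```

A null result corresponds to the case where the running desktop application is not one of the preset launchers, so no launcher-specific gesture handling is applied.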
202. In response to a touch operation of the user, determine a touch instruction corresponding to the touch operation according to the gesture recognition mode.
In one example, step 202 includes two implementations:
The first implementation of step 202: if the electronic device supports screen rotation, a preset area is provided on the display screen of the electronic device, and step 202 includes: in response to a touch operation of the user, where the touch operation characterizes a touch point sliding from within the preset area onto the display screen outside the preset area, determining the touch instruction corresponding to the touch operation according to the gesture recognition mode.
The second implementation of step 202: if the electronic device does not support screen rotation, step 202 includes: in response to a touch operation of the user, where the touch operation characterizes a touch point sliding from within the preset area onto the display screen outside the preset area, determining the display type of the display screen, and then determining the touch instruction corresponding to the touch operation according to the gesture recognition mode and the display type of the display screen.
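The branch between the two implementations can be sketched as follows. This is a hedged illustration in plain Java: the enum name, method name, and returned instruction strings are assumptions rather than the patent's actual types.

```java
public class Step202Dispatch {
    public enum DisplayType { FULL_SCREEN, VERTICAL_SCREEN }

    /**
     * First implementation: with screen rotation supported, the instruction
     * follows from the gesture recognition mode alone.
     * Second implementation: without rotation support, the display type is
     * determined first and also enters into the decision.
     */
    public static String resolveInstruction(boolean supportsRotation,
                                            String gestureMode,
                                            DisplayType displayType) {
        if (supportsRotation) {
            return "ENTER_TASK_INTERFACE[" + gestureMode + "]";
        }
        return "ENTER_TASK_INTERFACE[" + gestureMode + "," + displayType + "]";
    }
}
```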
In one example, "determining a touch instruction corresponding to a touch operation according to a gesture recognition mode and a display type of a display screen" includes: determining an input event consumer corresponding to the display type of the display screen; wherein, the input event consumer characterizes the processing logic information corresponding to the display type; and determining a touch instruction corresponding to the touch operation according to the gesture recognition mode and the processing logic information corresponding to the display type.
In one example, "determining an input event consumer corresponding to a display type of a display screen" includes:
If the display type of the display screen is determined to be a full-screen interface, the input event consumer corresponding to the display screen is determined to be a full-screen-interface input event consumer, which characterizes the processing logic information corresponding to the full-screen interface. If the display type of the display screen is determined to be a vertical-screen interface, the input event consumer corresponding to the display screen is determined to be a vertical-screen-interface input event consumer, which characterizes the processing logic information corresponding to the vertical-screen interface.
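The consumer selection above resembles a strategy lookup keyed on the display type. A minimal sketch, assuming an interface name and trivial handler bodies (the real consumers would wrap the launcher's processing logic):

```java
public class ConsumerSelector {
    public interface InputEventConsumer {
        String handle(String motionEvent);
    }

    // Each consumer carries the processing logic for its display type;
    // the string results here just make the routing observable.
    public static final InputEventConsumer FULL_SCREEN_CONSUMER =
            e -> "fullscreen-handled:" + e;
    public static final InputEventConsumer VERTICAL_SCREEN_CONSUMER =
            e -> "vertical-handled:" + e;

    public static InputEventConsumer forDisplayType(boolean isFullScreen) {
        return isFullScreen ? FULL_SCREEN_CONSUMER : VERTICAL_SCREEN_CONSUMER;
    }
}
```

Routing every touch event through the consumer chosen once per gesture keeps the full-screen and vertical-screen processing logic isolated from each other.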
In one example, "determining a touch instruction corresponding to a touch operation according to a gesture recognition mode and processing logic information corresponding to a display type" includes: determining the last input event of the touch operation on the display screen according to the processing logic information corresponding to the display type; and if the last input event is determined to be the execution of the current touch operation, generating a touch instruction corresponding to the touch operation.
In one example, if it is determined that the last input event is to cancel the current touch operation, the interface before the touch operation is returned.
Illustratively, the touch operation carries touch information about the movement of the user's finger, including a speed value, an acceleration value, a coordinate value, and a touch dwell time of the touch point. The electronic device is provided with a display screen on which a preset area is set; the preset area may be a rectangular area located at the bottom of the display screen. When a touch operation of the user is received, where the touch operation characterizes a touch point sliding from within the preset area onto the display screen outside the preset area, the touch instruction corresponding to the touch operation is determined according to the gesture recognition mode; the touch instruction includes entering the multitask overview interface, entering the interface of the display screen, returning to the interface displayed before the touch operation, and the like.
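The touch information and the bottom preset area described here can be sketched as a small data class plus a rectangle hit test. The field names are taken from the description above; the coordinate convention (origin top-left, y growing downward) and any concrete area height are assumptions, since the patent leaves the area's dimensions unspecified.

```java
public class TouchPoint {
    public final double speed;        // speed value of the touch point
    public final double acceleration; // acceleration value
    public final double x, y;         // coordinate value
    public final long dwellMillis;    // touch dwell time

    public TouchPoint(double speed, double acceleration,
                      double x, double y, long dwellMillis) {
        this.speed = speed;
        this.acceleration = acceleration;
        this.x = x;
        this.y = y;
        this.dwellMillis = dwellMillis;
    }

    /** True if (x, y) lies inside a rectangular preset area spanning the
     *  full width of the screen and areaHeight pixels up from the bottom. */
    public static boolean inPresetArea(double x, double y,
                                       int screenWidth, int screenHeight,
                                       int areaHeight) {
        return x >= 0 && x <= screenWidth
                && y >= screenHeight - areaHeight && y <= screenHeight;
    }
}
```

A gesture qualifies as a multitask gesture when its first touch point satisfies `inPresetArea` and a later point does not, i.e. the finger has slid out of the preset area onto the rest of the screen.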
In the first implementation, the electronic device supports screen rotation. If a touch operation of the user is received, where the touch operation characterizes a touch point sliding from within the preset area onto the display screen outside the preset area, the touch instruction corresponding to the touch operation is determined according to the gesture recognition mode.
In the second implementation, the electronic device does not support screen rotation. If a touch operation of the user is received, where the touch operation characterizes a touch point sliding from within the preset area onto the display screen outside the preset area, the received touch operation is transmitted to the touch interaction service of the native launcher through a preset system interface (SystemUI). Because the electronic device in this implementation is designed not to support screen rotation, the vertical-screen display logic of an application interface is implemented by the device terminal based on the Android window mode; and because of limitations of the native launcher, a multitask gesture cannot be performed on an application interface displayed in window mode. Therefore, the display type of the current foreground interface of the display screen is determined through the touch interaction service, where the display type includes full-screen display or vertical-screen display, and the input event consumer corresponding to that display type is determined, where the input event consumer characterizes the processing logic information corresponding to the display type. Finally, the touch instruction corresponding to the touch operation is determined according to the gesture recognition mode and the processing logic information corresponding to the display type.
If the display type of the display screen is determined to be a full-screen interface, the input event consumer corresponding to the display screen is determined to be a full-screen-interface input event consumer. The full-screen-interface input event consumer is obtained by modifying the "non-native activity input event consumer" of the native launcher in the Android system; it characterizes the processing logic information corresponding to the full-screen interface and is used to process touch operations under the full-screen interface. If the display type of the display screen is determined to be a vertical-screen interface, the input event consumer corresponding to the display screen is determined to be a vertical-screen-interface input event consumer. The vertical-screen-interface input event consumer is obtained by modifying the "backup input event consumer" of the native launcher in the Android system; it characterizes the processing logic information corresponding to the vertical-screen interface and is used to process input events in the vertical-screen mode.
In the process of determining the touch instruction according to the gesture recognition mode and the processing logic information corresponding to the display type, when the user's finger leaves the display screen, the electronic device determines, according to that processing logic information, whether the state of the last input event of the touch operation on the display screen indicates that the user cancelled the current gesture. If the last input event cancels the current touch operation, the interface displayed before the touch operation is returned to and the whole flow ends; if the last input event executes the current touch operation, the touch instruction corresponding to the touch operation is generated.
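The finger-lift decision above amounts to a two-way branch on the state of the last input event. A hedged sketch, where the enum and the returned action strings are illustrative names rather than the patent's types:

```java
public class GestureCompletion {
    public enum LastEventState { EXECUTE, CANCEL }

    /** Called when the user's finger leaves the display screen. */
    public static String onFingerLift(LastEventState last) {
        if (last == LastEventState.CANCEL) {
            // User cancelled mid-gesture: go back to the pre-gesture interface.
            return "RETURN_TO_PREVIOUS_INTERFACE";
        }
        // Gesture completed: generate the touch instruction.
        return "GENERATE_TOUCH_INSTRUCTION";
    }
}
```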
203. Entering a task interface corresponding to the touch instruction according to the touch instruction; the task interface is used for displaying application programs in a background running state in the electronic equipment.
Illustratively, this step may refer to step 103 in fig. 1, which is not described herein.
204. In response to a click operation on any application program in the task interface, display that application program.
Illustratively, this step may refer to step 104 in fig. 1, and will not be described in detail.
205. The touch operation includes touch information, and the touch information includes a speed value, an acceleration value, a coordinate value, and a touch dwell time of the touch point. Generate an interpolator according to the speed value, the acceleration value, the coordinate value, and the touch dwell time of the touch point, and take the current interface of the display screen as a task thumbnail source; the interpolator characterizes the animation change effect of both the scaling and the transparency of the current interface. In response to the touch operation, display the current interface according to the animation change effect of the interpolator.
In one example, the task interface includes a multitasking overview interface, an interface of a display screen, or an interface prior to a touch operation; the multitasking overview interface includes at least one task card group; the task card group comprises application program icons and task thumbnail sources corresponding to the application program icons.
By way of example, through the corresponding input event consumer, the electronic device records the movement state of the user's finger according to the touch operation. After comprehensively calculating the acceleration, speed, coordinates, touch dwell time, and other states of the touch operation, it generates an animation interpolator for scaling and transparency change, takes a snapshot of the application interface currently displayed on the display screen as the task thumbnail source, and performs gesture-following display of the current full-screen/vertical-screen interface according to the calculated interpolator, so that in response to the touch operation the current interface is displayed according to the animation change effect of the interpolator.
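An interpolator of the kind described can be sketched as a pure function of swipe progress (a 0..1 value derived from the finger's position and speed). The specific easing (linear) and end values (60% scale, 50% alpha) are assumptions chosen for illustration; the patent does not fix the curve.

```java
public class SwipeInterpolator {
    private static double clamp01(double p) {
        return Math.max(0.0, Math.min(1.0, p));
    }

    /** Scale applied to the task thumbnail: 1.0 at rest, 0.6 at full swipe,
     *  giving the outside-to-inside shrink effect. */
    public static double scaleAt(double progress) {
        return 1.0 - 0.4 * clamp01(progress);
    }

    /** Transparency (alpha): 1.0 at rest, fading toward 0.5 at full swipe. */
    public static double alphaAt(double progress) {
        return 1.0 - 0.5 * clamp01(progress);
    }
}
```

During gesture following, the UI would re-query both functions each frame with the current progress and apply the results to the snapshot used as the task thumbnail source.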
The task interface includes the multitask overview interface, the interface of the display screen, the interface displayed before the touch operation, or the like. The multitask overview interface includes at least one task card group, and a task card group includes an application icon and the task thumbnail source corresponding to that application icon. The layout sizes of the task card groups are calculated according to the screen resolution used by the electronic device and a quantified value of the interaction habits of its users; the layout of the preset application icons and the relative layout size of the corresponding task thumbnail sources are calculated by the system, with each application icon laid out directly above its task thumbnail source, and the task thumbnail sources are filled with task snapshots.
For example, when no application program is open on the display screen and the multitask gesture is triggered by sliding up from the preset area at the bottom, the multitask overview interface performs gesture following directly from the interface of the display screen. According to the sliding distance of the finger, all tasks running in the background are gradually displayed with an animation combining a gradual change of interface transparency and an outside-to-inside shrink; during gesture following, the one-key clear button and the application icon of each task card group are gradually displayed with an expanding animation.
When the multitask gesture is triggered by sliding up from the preset area at the bottom of an application program's interface, the task overview interface follows the gesture during the slide, displaying only the task thumbnail source of the currently running program with an outside-to-inside shrink effect. When the finger leaves the display screen, if it is judged from the state of the last input event that the multitask overview interface is finally entered, the application icon of the current task and the one-key clear button are gradually displayed with an expanding animation, and the adjacent task card groups are gradually displayed with a transparency gradient.
206. The multitask overview interface further includes a one-key clear button. In response to a trigger operation on the one-key clear button, clear the task card groups in the multitask overview interface, display a clear prompt, and return to the interface of the display screen after a preset time period; the clear prompt disappears within the preset time period with a transparency gradient.
Illustratively, the multitask overview interface includes a one-key clear button. The electronic device defines a callback event method for the one-key clear button: if a trigger operation on the one-key clear button is received, then in response to that operation the task card groups displayed in the multitask overview interface slide up and disappear, a "nothing recent" prompt is displayed for 1 s, and the interface then returns to the interface of the display screen with a transparency-fade animation.
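The callback sequence for the one-key clear button can be sketched as follows. The step strings and the 1000 ms prompt duration mirror the description above, while the class and method names are invented for the example:

```java
import java.util.ArrayList;
import java.util.List;

public class OneKeyClearButton {
    /** Clears the background task cards and returns the follow-up UI steps
     *  in the order they would be animated. */
    public static List<String> onClicked(List<String> taskCardGroups) {
        taskCardGroups.clear();  // cards slide up and disappear
        List<String> steps = new ArrayList<>();
        steps.add("SHOW_PROMPT:nothing recent:1000ms");
        steps.add("FADE_TO_DISPLAY_SCREEN_INTERFACE");
        return steps;
    }
}
```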
207. In response to a click operation on a blank area of the multitask overview interface other than the task card groups, return from the multitask overview interface to the interface of the display screen with a preset transparency gradient.
Illustratively, in the multitask overview interface, when the user clicks on a blank area outside the task card groups or slides up from the preset area at the bottom, a return to the interface of the display screen with a transparency gradient is triggered. The electronic device can also preview background multitasks by sliding left and right in the area of the task card groups in the multitask overview interface, highlighting the centered task card group, and can switch background tasks by clicking or pulling down a designated task card.
In the embodiment of the application, if it is determined that the currently running desktop application is a preset launcher among the at least two launchers, the gesture recognition mode corresponding to the preset launcher is determined according to the mapping relation between preset launchers and gesture recognition modes; the gesture recognition mode is used to indicate gesture recognition information. In response to a touch operation of the user, a touch instruction corresponding to the touch operation is determined according to the gesture recognition mode, and the task interface corresponding to the touch instruction is entered; the task interface is used to display the application programs running in the background on the electronic device. In response to a click operation on any application program in the task interface, that application program is displayed. The touch operation includes touch information including a speed value, an acceleration value, a coordinate value, and a touch dwell time of the touch point. An interpolator is generated according to these values, and the current interface of the display screen is taken as a task thumbnail source; the interpolator characterizes the animation change effect of both the scaling and the transparency of the current interface, and in response to the touch operation the current interface is displayed according to that animation change effect.
The multitask overview interface further includes a one-key clear button. In response to a trigger operation on the one-key clear button, the task card groups in the multitask overview interface are cleared, a clear prompt is displayed, and the interface of the display screen is returned to after a preset time period; the clear prompt disappears within the preset time period with a transparency gradient. In response to a click operation on a blank area of the multitask overview interface other than the task card groups, the interface of the display screen is returned to from the multitask overview interface with a preset transparency gradient. Because a gesture recognition mode is developed on the electronic device for each launcher, multitask processing can be carried out under whichever launcher is running according to that launcher's gesture recognition mode; the electronic device is therefore compatible with various launchers, which solves the technical problem that electronic devices supporting multitask management support only a single type of launcher.
Fig. 4 is a schematic structural diagram of a processor-based gesture processing apparatus provided in an embodiment of the present application. As shown in Fig. 4, the apparatus is applied to an electronic device on which at least two launchers are deployed, where a launcher is used to start a desktop application. The apparatus includes:
a first determining unit 31, configured to determine a gesture recognition mode corresponding to a preset launcher if it is determined that the currently running desktop application is the preset launcher among the at least two launchers; the gesture recognition mode is used to indicate gesture recognition information.
The second determining unit 32 is configured to determine, in response to a touch operation of the user, a touch instruction corresponding to the touch operation according to the gesture recognition mode.
An entering unit 33, configured to enter a task interface corresponding to the touch instruction according to the touch instruction; the task interface is used for displaying application programs in a background running state in the electronic equipment.
And the processing unit 34 is used for responding to clicking operation for the task interface and processing the application program in the task interface.
The device of the embodiment may execute the technical scheme in the above method, and the specific implementation process and the technical principle are the same and are not described herein again.
Fig. 5 is a schematic structural diagram of another gesture processing apparatus based on a processor according to an embodiment of the present application, and based on the embodiment shown in fig. 4, as shown in fig. 5, the first determining unit 31 is specifically configured to:
if it is determined that the currently running desktop application is a preset launcher among the at least two launchers, determine the gesture recognition mode corresponding to the preset launcher according to the mapping relation between preset launchers and gesture recognition modes; the gesture recognition mode is used to indicate gesture recognition information.
In one example, if the electronic device supports the screen rotation process, a preset area is set on a display screen of the electronic device; the second determining unit 32 is specifically configured to:
and responding to touch operation of a user, wherein the touch operation represents operation that a touch point operated in a preset area slides to a display screen outside the preset area, and a touch instruction corresponding to the touch operation is determined according to a gesture recognition mode.
In one example, the electronic device does not support screen rotation processing; the second determination unit 32 includes:
the first determining module 321 is configured to respond to a touch operation of a user, where the touch operation characterizes an operation of sliding a touch point that is operated in a preset area to a display screen that is outside the preset area, and determine a display type of the display screen.
The second determining module 322 is configured to determine a touch instruction corresponding to the touch operation according to the gesture recognition mode and the display type of the display screen.
In one example, the second determination module 322 includes:
a first determining submodule 3221, configured to determine an input event consumer corresponding to a display type of the display screen; wherein the input event consumer characterizes the processing logic information corresponding to the display type.
The second determining submodule 3222 is configured to determine a touch instruction corresponding to the touch operation according to the gesture recognition mode and the processing logic information corresponding to the display type.
In one example, the first determination submodule 3221 includes:
a third determining submodule 32211, configured to determine that the input event consumer corresponding to the display screen is a full-screen-interface input event consumer if it is determined that the display type of the display screen is a full-screen interface; the full-screen-interface input event consumer characterizes the processing logic information corresponding to the full-screen interface.
A fourth determining submodule 32212, configured to determine that the input event consumer corresponding to the display screen is a vertical-screen-interface input event consumer if it is determined that the display type of the display screen is a vertical-screen interface; the vertical-screen-interface input event consumer characterizes the processing logic information corresponding to the vertical-screen interface.
In one example, the second determination submodule 3222 includes:
a fifth determining sub-module 32221 is configured to determine a last input event of the touch operation on the display screen according to the processing logic information corresponding to the display type.
The generating sub-module 32222 is configured to generate a touch instruction corresponding to the touch operation if it is determined that the last input event is to perform the current touch operation.
In one example, the apparatus further comprises:
a return sub-module 32223 is configured to return to the interface before the touch operation if it is determined that the last input event is to cancel the current touch operation.
In one example, the processing unit 34 is specifically configured to:
and displaying any application program in response to clicking operation for any application program in the task interface.
In one example, the touch operation includes touch information including a velocity value, an acceleration value, a coordinate value, and a touch dwell time of the touch point.
In one example, the apparatus further comprises:
a generating unit 41, configured to generate an interpolator according to the speed value, the acceleration value, the coordinate value, and the touch dwell time of the touch point, and take the current interface of the display screen as a task thumbnail source; the interpolator characterizes the animation change effect of both the scaling and the transparency of the current interface.
And a display unit 42, configured to display, in response to the touch operation, the current interface according to the animation change effect of the interpolator.
In one example, the task interface includes a multitasking overview interface, an interface of a display screen, or an interface prior to a touch operation.
The multitasking overview interface includes at least one task card group; the task card group comprises application program icons and task thumbnail sources corresponding to the application program icons.
In one example, the multitask overview interface further includes a one-key clear button; the apparatus further includes:
a clearing unit 43, configured to, in response to a trigger operation on the one-key clear button, clear the task card groups in the multitask overview interface, display a clear prompt, and return to the interface of the display screen after a preset time period; the clear prompt disappears within the preset time period with a transparency gradient.
In one example, the apparatus further comprises:
and the returning unit 44 is configured to return, from the multitasking overview interface, to the interface of the display screen according to a preset transparency gradient manner in response to a clicking operation for a blank area other than the task card group in the multitasking overview interface.
The device of the embodiment may execute the technical scheme in the above method, and the specific implementation process and the technical principle are the same and are not described herein again.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application, where, as shown in fig. 6, the electronic device includes: a memory 51, and a processor 52.
The memory 51 stores a computer program executable on the processor 52.
The processor 52 is configured to perform the method as provided by the above-described embodiments.
The electronic device further comprises a receiver 53 and a transmitter 54. The receiver 53 is for receiving instructions and data transmitted from an external device, and the transmitter 54 is for transmitting instructions and data to the external device.
Fig. 7 is a block diagram of an electronic device, which may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, etc., provided in an embodiment of the present application.
The apparatus 600 may include one or more of the following components: a processing component 602, a memory 604, a power component 606, a multimedia component 608, an audio component 610, an input/output (I/O) interface 612, a sensor component 614, and a communication component 616.
The processing component 602 generally controls overall operation of the apparatus 600, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 602 may include one or more processors 620 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 602 can include one or more modules that facilitate interaction between the processing component 602 and other components. For example, the processing component 602 may include a multimedia module to facilitate interaction between the multimedia component 608 and the processing component 602.
The memory 604 is configured to store various types of data to support operations at the apparatus 600. Examples of such data include instructions for any application or method operating on the apparatus 600, contact data, phonebook data, messages, pictures, videos, and the like. The memory 604 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply component 606 provides power to the various components of the device 600. The power supply components 606 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 600.
The multimedia component 608 includes a screen between the device 600 and the user that provides an output interface. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or sliding action, but also the duration and pressure associated with the touch or sliding operation. In some embodiments, the multimedia component 608 includes a front camera and/or a rear camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 600 is in an operational mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 610 is configured to output and/or input audio signals. For example, the audio component 610 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 600 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 604 or transmitted via the communication component 616. In some embodiments, audio component 610 further includes a speaker for outputting audio signals.
The I/O interface 612 provides an interface between the processing component 602 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 614 includes one or more sensors for providing status assessment of various aspects of the apparatus 600. For example, the sensor assembly 614 may detect the on/off state of the device 600, the relative positioning of the assemblies, such as the display and keypad of the device 600, the sensor assembly 614 may also detect the change in position of the device 600 or one of the assemblies of the device 600, the presence or absence of user contact with the device 600, the orientation or acceleration/deceleration of the device 600, and the change in temperature of the device 600. The sensor assembly 614 may include a proximity sensor configured to detect the presence of nearby objects in the absence of any physical contact. The sensor assembly 614 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 614 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 616 is configured to facilitate communication between the apparatus 600 and other devices in a wired or wireless manner. The device 600 may access a wireless network based on a communication standard, such as Wi-Fi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 616 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 616 further includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra-Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 600 may be implemented by one or more Application-Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium is also provided, such as the memory 604, including instructions executable by the processor 620 of the apparatus 600 to perform the above-described method. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, etc.
Embodiments of the present application also provide a non-transitory computer-readable storage medium storing instructions which, when executed by a processor of an electronic device, enable the electronic device to perform the method provided by the above embodiments.
An embodiment of the present application also provides a computer program product, comprising a computer program stored in a readable storage medium; at least one processor of an electronic device can read the computer program from the readable storage medium, and execution of the computer program by the at least one processor causes the electronic device to perform the solution provided by any one of the embodiments described above.
The foregoing is merely illustrative of the present invention and does not limit it; variations or substitutions that would readily occur to any person skilled in the art fall within its scope. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (31)

  1. A processor-based gesture processing method, applied to an electronic device, wherein the electronic device is provided with at least two launchers, and a launcher is used to start a desktop application; the method comprises the following steps:
    if it is determined that the currently running desktop application is a preset launcher among the at least two launchers, determining a gesture recognition mode corresponding to the preset launcher, wherein the gesture recognition mode is used to indicate gesture recognition information;
    in response to a touch operation of a user, determining a touch instruction corresponding to the touch operation according to the gesture recognition mode;
    entering a task interface corresponding to the touch instruction according to the touch instruction, wherein the task interface is used to display application programs in a background running state in the electronic device;
    in response to a click operation on the task interface, processing an application program in the task interface.
  2. The method of claim 1, wherein determining, if it is determined that the currently running desktop application is a preset launcher among the at least two launchers, the gesture recognition mode corresponding to the preset launcher comprises:
    if it is determined that the currently running desktop application is a preset launcher among the at least two launchers, determining the gesture recognition mode corresponding to the preset launcher according to a mapping relationship between preset launchers and gesture recognition modes, wherein the gesture recognition mode is used to indicate gesture recognition information.
  3. The method of claim 1, wherein if the electronic device supports screen rotation processing, a preset area is set on a display screen of the electronic device; and determining, in response to the touch operation of the user, the touch instruction corresponding to the touch operation according to the gesture recognition mode comprises:
    in response to a touch operation of the user, wherein the touch operation represents an operation in which a touch point slides from within the preset area to a part of the display screen outside the preset area, determining the touch instruction corresponding to the touch operation according to the gesture recognition mode.
  4. The method of claim 1, wherein the electronic device does not support screen rotation processing; and determining, in response to the touch operation of the user, the touch instruction corresponding to the touch operation according to the gesture recognition mode comprises:
    in response to a touch operation of the user, wherein the touch operation represents an operation in which a touch point slides from within a preset area to a part of the display screen outside the preset area, determining a display type of the display screen;
    determining the touch instruction corresponding to the touch operation according to the gesture recognition mode and the display type of the display screen.
  5. The method of claim 4, wherein determining the touch instruction corresponding to the touch operation according to the gesture recognition mode and the display type of the display screen comprises:
    determining an input event consumer corresponding to the display type of the display screen, wherein the input event consumer characterizes processing logic information corresponding to the display type;
    determining the touch instruction corresponding to the touch operation according to the gesture recognition mode and the processing logic information corresponding to the display type.
  6. The method of claim 5, wherein determining the input event consumer corresponding to the display type of the display screen comprises:
    if it is determined that the display type of the display screen is a full-screen interface, determining that the input event consumer corresponding to the display screen is a full-screen-interface input event consumer, wherein the full-screen-interface input event consumer characterizes processing logic information corresponding to the full-screen interface;
    if it is determined that the display type of the display screen is a portrait interface, determining that the input event consumer corresponding to the display screen is a portrait-interface input event consumer, wherein the portrait-interface input event consumer characterizes processing logic information corresponding to the portrait interface.
  7. The method of claim 5, wherein determining the touch instruction corresponding to the touch operation according to the gesture recognition mode and the processing logic information corresponding to the display type comprises:
    determining the final input event of the touch operation on the display screen according to the processing logic information corresponding to the display type;
    if it is determined that the final input event indicates execution of the current touch operation, generating the touch instruction corresponding to the touch operation.
  8. The method of claim 7, further comprising:
    if it is determined that the final input event cancels the current touch operation, returning to the interface displayed before the touch operation.
  9. The method of claim 1, wherein processing the application program in the task interface in response to the click operation on the task interface comprises:
    in response to a click operation on any application program in the task interface, displaying that application program.
  10. The method of claim 1, wherein the touch operation carries touch information, the touch information including a velocity value, an acceleration value, a coordinate value, and a touch dwell time of a touch point.
  11. The method of claim 10, further comprising:
    generating an interpolator according to the velocity value, the acceleration value, the coordinate value, and the touch dwell time of the touch point, and taking the current interface of the display screen as a task thumbnail source, wherein the interpolator characterizes an animation change effect of the current interface, namely its scaling and transparency;
    in response to the touch operation, displaying the current interface according to the animation change effect of the interpolator.
  12. The method of claim 11, wherein the task interface comprises a multitasking overview interface, the interface of the display screen, or the interface displayed before the touch operation;
    the multitasking overview interface includes at least one task card group, and a task card group includes an application program icon and the task thumbnail source corresponding to the application program icon.
  13. The method of claim 12, wherein the multitasking overview interface further comprises a one-tap clear button; the method further comprising:
    in response to a trigger operation on the one-tap clear button, clearing the task card groups in the multitasking overview interface, displaying a clearing prompt, and returning to the interface of the display screen after a preset time period, wherein the clearing prompt disappears by fading its transparency over the preset time period.
  14. The method of any one of claims 1-13, further comprising:
    in response to a click operation on a blank area of the multitasking overview interface other than the task card groups, returning from the multitasking overview interface to the interface of the display screen in a preset transparency fade manner.
  15. A processor-based gesture processing apparatus, applied to an electronic device, wherein the electronic device is provided with at least two launchers, and a launcher is used to start a desktop application; the apparatus comprises:
    a first determining unit, configured to determine, if it is determined that the currently running desktop application is a preset launcher among the at least two launchers, a gesture recognition mode corresponding to the preset launcher, wherein the gesture recognition mode is used to indicate gesture recognition information;
    a second determining unit, configured to determine, in response to a touch operation of a user, a touch instruction corresponding to the touch operation according to the gesture recognition mode;
    an entering unit, configured to enter, according to the touch instruction, a task interface corresponding to the touch instruction, wherein the task interface is used to display application programs in a background running state in the electronic device;
    a processing unit, configured to process, in response to a click operation on the task interface, an application program in the task interface.
  16. The apparatus of claim 15, wherein the first determining unit is specifically configured to:
    if it is determined that the currently running desktop application is a preset launcher among the at least two launchers, determine the gesture recognition mode corresponding to the preset launcher according to a mapping relationship between preset launchers and gesture recognition modes, wherein the gesture recognition mode is used to indicate gesture recognition information.
  17. The apparatus of claim 15, wherein if the electronic device supports screen rotation processing, a preset area is set on a display screen of the electronic device; and the second determining unit is specifically configured to:
    in response to a touch operation of the user, wherein the touch operation represents an operation in which a touch point slides from within the preset area to a part of the display screen outside the preset area, determine the touch instruction corresponding to the touch operation according to the gesture recognition mode.
  18. The apparatus of claim 15, wherein the electronic device does not support screen rotation processing; and the second determining unit comprises:
    a first determining module, configured to determine, in response to a touch operation of the user, wherein the touch operation represents an operation in which a touch point slides from within a preset area to a part of the display screen outside the preset area, a display type of the display screen;
    a second determining module, configured to determine the touch instruction corresponding to the touch operation according to the gesture recognition mode and the display type of the display screen.
  19. The apparatus of claim 18, wherein the second determining module comprises:
    a first determining submodule, configured to determine an input event consumer corresponding to the display type of the display screen, wherein the input event consumer characterizes processing logic information corresponding to the display type;
    a second determining submodule, configured to determine the touch instruction corresponding to the touch operation according to the gesture recognition mode and the processing logic information corresponding to the display type.
  20. The apparatus of claim 19, wherein the first determining submodule comprises:
    a third determining submodule, configured to determine, if it is determined that the display type of the display screen is a full-screen interface, that the input event consumer corresponding to the display screen is a full-screen-interface input event consumer, wherein the full-screen-interface input event consumer characterizes processing logic information corresponding to the full-screen interface;
    a fourth determining submodule, configured to determine, if it is determined that the display type of the display screen is a portrait interface, that the input event consumer corresponding to the display screen is a portrait-interface input event consumer, wherein the portrait-interface input event consumer characterizes processing logic information corresponding to the portrait interface.
  21. The apparatus of claim 19, wherein the second determining submodule comprises:
    a fifth determining submodule, configured to determine the final input event of the touch operation on the display screen according to the processing logic information corresponding to the display type;
    a generating submodule, configured to generate, if it is determined that the final input event indicates execution of the current touch operation, the touch instruction corresponding to the touch operation.
  22. The apparatus of claim 21, further comprising:
    a returning submodule, configured to return, if it is determined that the final input event cancels the current touch operation, to the interface displayed before the touch operation.
  23. The apparatus of claim 15, wherein the processing unit is specifically configured to:
    in response to a click operation on any application program in the task interface, display that application program.
  24. The apparatus of claim 15, wherein the touch operation carries touch information, the touch information including a velocity value, an acceleration value, a coordinate value, and a touch dwell time of a touch point.
  25. The apparatus of claim 24, further comprising:
    a generating unit, configured to generate an interpolator according to the velocity value, the acceleration value, the coordinate value, and the touch dwell time of the touch point, and to take the current interface of the display screen as a task thumbnail source, wherein the interpolator characterizes an animation change effect of the current interface, namely its scaling and transparency;
    a display unit, configured to display, in response to the touch operation, the current interface according to the animation change effect of the interpolator.
  26. The apparatus of claim 25, wherein the task interface comprises a multitasking overview interface, the interface of the display screen, or the interface displayed before the touch operation;
    the multitasking overview interface includes at least one task card group, and a task card group includes an application program icon and the task thumbnail source corresponding to the application program icon.
  27. The apparatus of claim 26, wherein the multitasking overview interface further comprises a one-tap clear button; the apparatus further comprising:
    a clearing unit, configured to clear, in response to a trigger operation on the one-tap clear button, the task card groups in the multitasking overview interface, display a clearing prompt, and return to the interface of the display screen after a preset time period, wherein the clearing prompt disappears by fading its transparency over the preset time period.
  28. The apparatus of any one of claims 15-27, further comprising:
    a returning unit, configured to return, in response to a click operation on a blank area of the multitasking overview interface other than the task card groups, from the multitasking overview interface to the interface of the display screen in a preset transparency fade manner.
  29. An electronic device comprising a memory and a processor, the memory storing a computer program executable on the processor, wherein the processor implements the method of any one of claims 1-14 when executing the computer program.
  30. A computer-readable storage medium having stored therein computer-executable instructions which, when executed by a processor, implement the method of any one of claims 1-14.
  31. A computer program product comprising a computer program which, when executed by a processor, implements the method of any of claims 1-14.
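Claims 1-2 describe resolving a gesture recognition mode from the currently running desktop launcher via a mapping relationship. The patent discloses no code; the following is a minimal illustrative sketch in which the launcher names and mode values are hypothetical:

```python
# Illustrative sketch only: launcher names and mode values are
# hypothetical, not disclosed by the patent.

GESTURE_MODE_BY_LAUNCHER = {
    "default_launcher": "system_gestures",
    "education_launcher": "fullscreen_gestures",
}

def resolve_gesture_mode(current_launcher, preset_launchers):
    """Return the gesture recognition mode for the currently running
    desktop application if it is one of the preset launchers (looked up
    via a launcher-to-mode mapping relationship); otherwise None."""
    if current_launcher in preset_launchers:
        return GESTURE_MODE_BY_LAUNCHER.get(current_launcher)
    return None
```

The mapping table stands in for whatever persistent configuration associates each preset launcher with its gesture recognition information.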
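Claims 5-8 describe selecting an "input event consumer" by the display type of the screen, then committing or cancelling the gesture based on the final input event of the touch operation. A hedged sketch of that dispatch, with hypothetical names and string-valued events standing in for real input events:

```python
# Illustrative sketch only: consumer names, event strings, and return
# values are hypothetical, not disclosed by the patent.

def handle_fullscreen_gesture(final_event: str) -> str:
    # If the final input event indicates the touch operation was
    # executed (finger lifted normally), emit the touch instruction
    # that enters the task interface.
    if final_event == "up":
        return "enter_task_interface"
    # A cancelled gesture returns to the interface shown before it.
    return "return_to_previous_interface"

def handle_portrait_gesture(final_event: str) -> str:
    # The portrait consumer carries its own processing logic; this
    # sketch reuses the same commit/cancel split for brevity.
    return handle_fullscreen_gesture(final_event)

# One input event consumer per display type.
INPUT_EVENT_CONSUMERS = {
    "fullscreen": handle_fullscreen_gesture,
    "portrait": handle_portrait_gesture,
}

def dispatch(display_type: str, final_event: str) -> str:
    """Pick the consumer for the display type and let its processing
    logic turn the gesture's final event into a touch instruction."""
    return INPUT_EVENT_CONSUMERS[display_type](final_event)
```

The commit/cancel split mirrors the usual distinction between an up event and a cancel event in touch input pipelines.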
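Claims 10-11 derive an interpolator (rendered as a "difference device" or "differentiator" in the machine translation) from the touch information and use it to animate the scaling and transparency of the current interface, which doubles as the task thumbnail source. A sketch under the assumption of a simple linear mapping from swipe progress to scale and alpha; the patent does not specify the formula:

```python
# Illustrative sketch only: the progress and interpolation formulas are
# hypothetical stand-ins for whatever the implementation derives from
# the velocity, acceleration, coordinate, and dwell-time values.

def gesture_progress(y0: float, y: float, screen_height: float) -> float:
    """Normalized upward-swipe progress in [0, 1], from the starting
    and current y coordinates of the touch point."""
    return max(0.0, min(1.0, (y0 - y) / screen_height))

def interpolate(progress: float) -> tuple:
    """Map gesture progress to (scale, alpha) for the shrinking
    current-interface thumbnail: full size and opacity at zero
    progress, half size and fully transparent at full progress."""
    scale = 1.0 - 0.5 * progress
    alpha = 1.0 - progress
    return scale, alpha
```

In practice the interpolator would also weight velocity and dwell time so that a fast flick completes the animation even at low positional progress.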
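Claims 13-14 describe a one-tap clear button that empties the task card groups, shows a clearing prompt that fades out over a preset period, and returns to the home interface. An illustrative sketch of that sequence; the data structures and fade schedule are hypothetical:

```python
# Illustrative sketch only: the card list, fade schedule, and "next"
# marker are hypothetical, not disclosed by the patent.

def clear_all(task_cards: list, fade_steps: int = 4) -> dict:
    """Empty the task card groups, compute the alpha values of the
    clearing prompt as it fades linearly to zero over the preset
    period, and mark the home interface as the next destination."""
    task_cards.clear()
    hint_alphas = [1.0 - i / fade_steps for i in range(fade_steps + 1)]
    return {"cards": task_cards, "hint_alphas": hint_alphas, "next": "home"}
```

A real implementation would schedule the alpha updates on an animation timer rather than precomputing them, but the linear fade captures the claimed transparency-gradient disappearance.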
CN202280006545.6A 2022-05-07 2022-05-07 Gesture processing method, device and equipment based on processor Pending CN117396834A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/091360 WO2023216012A1 (en) 2022-05-07 2022-05-07 Processor-based gesture processing method and apparatus, and device

Publications (1)

Publication Number Publication Date
CN117396834A

Family

ID=88729366

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280006545.6A Pending CN117396834A (en) 2022-05-07 2022-05-07 Gesture processing method, device and equipment based on processor

Country Status (2)

Country Link
CN (1) CN117396834A (en)
WO (1) WO2023216012A1 (en)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120159395A1 (en) * 2010-12-20 2012-06-21 Microsoft Corporation Application-launching interface for multiple modes
JP5762912B2 (en) * 2011-09-30 2015-08-12 京セラ株式会社 Apparatus, method, and program
CN104461599A (en) * 2013-09-25 2015-03-25 上海琥智数码科技有限公司 Integration method of multiple desktop starters
CN104461333A (en) * 2013-09-25 2015-03-25 上海琥智数码科技有限公司 Switching method of multiple desktop starters
US10503352B2 (en) * 2015-01-19 2019-12-10 Microsoft Technology Licensing, Llc Control of representation interaction within an application launcher
CN106339173A (en) * 2016-08-31 2017-01-18 新诺商桥科技(北京)有限公司 Smart desktop system
CN109831586A (en) * 2019-02-28 2019-05-31 努比亚技术有限公司 A kind of user interface switching display methods, mobile terminal and storage medium
CN110908739B (en) * 2019-11-28 2020-12-01 广东汉鼎蜂助手网络技术有限公司 Method, device and equipment for realizing data docking with third-party Launcher
CN111158788B (en) * 2019-12-31 2023-05-30 科大讯飞股份有限公司 Desktop starter control method and device and storage medium

Also Published As

Publication number Publication date
WO2023216012A1 (en) 2023-11-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination