CN105302452B - Operation method and device based on gesture interaction


Info

Publication number: CN105302452B
Authority: CN (China)
Prior art keywords: application program, function, environment, instruction, gesture
Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Application number: CN201410351674.6A
Other languages: Chinese (zh)
Other versions: CN105302452A (en)
Inventor: 周兰兰 (Zhou Lanlan)
Current Assignee: Tencent Technology Shenzhen Co Ltd
Original Assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201410351674.6A
Published as CN105302452A (application); granted and published as CN105302452B


Abstract

The invention provides an operation method based on gesture interaction, comprising the steps of: acquiring an input touch gesture track; identifying the operating environment in which the input touch gesture track occurs; searching, in that operating environment, for a function instruction matching the input touch gesture track; and, if a matching function instruction is found, executing it. The method and device remove the series of cumbersome operations that a user otherwise performs to find a target application program or a target function within it, operations that consume a great deal of the user's time, cause considerable inconvenience, and make for a poor interaction experience; they achieve simple and convenient operation, savings in user time, and a good interaction experience.

Description

Operation method and device based on gesture interaction
Technical Field
The invention relates to the technical field of computers, in particular to an operation method and device based on gesture interaction.
Background
The touch screen technology of smartphones such as the iPhone brought a novel, simple, easy-to-use human-machine interface that freed the mobile phone from cumbersome keyboard input. As a result, typical mobile phone applications realize human-machine interaction through events, where an event is a user operation on the graphical interface. Touch events include press, release, slide, double click, and so on. Existing smartphone controls provide the following event-handling methods: first, the current graphical interface of the phone is partitioned by click area, and the function event is dispatched in the corresponding area; second, a touch listener is set on the corresponding control through the event-handling methods that the smartphone control provides, and the response is controlled accordingly.
However, for the feature-rich applications on current mobile phones, this touch event handling is cumbersome. Because such an application contains many internal functions, to use one of them the user must manually tap a primary menu key to enter the primary menu interface, tap a secondary menu key to enter the secondary menu interface, and only then tap the target function key to enter the target function interface and perform the target operation. Searching for the target application program, or clicking through to the target function interface within it, requires many repeated operations; the procedure is complicated, consumes a great deal of the user's time, causes considerable inconvenience, and results in a poor operating experience.
Disclosure of Invention
Embodiments of the invention provide an operation method based on gesture interaction, aiming to solve the prior-art problems that a user must perform a series of operations to find a target application program and click through to the target function interface within it; the procedure is complicated, consumes a great deal of the user's time, causes considerable inconvenience, and results in a poor operating experience.
An embodiment of the invention is realized as an operation method based on gesture interaction, comprising the following steps:
acquiring an input touch gesture track;
identifying an operation environment in which the input touch gesture track occurs;
searching a function instruction matched with the input touch gesture track in the operation environment;
and if the functional instruction matched with the input touch gesture track is found, executing the functional instruction.
The embodiment of the invention also provides an operation device based on gesture interaction, which comprises:
the gesture acquisition module is used for acquiring an input touch gesture track;
the environment recognition module is used for recognizing an operation environment in which the input touch gesture track occurs;
the gesture matching module is used for searching a function instruction matched with the input touch gesture track under the operation environment;
and the function execution module is used for executing the function instruction if the function instruction matched with the input touch gesture track is found.
According to embodiments of the invention, an input touch gesture track is acquired, the operating environment in which it occurs is identified, a function instruction matching the track in that environment is searched for, and, if found, the function instruction is executed. Embodiments of the invention thus remove the series of cumbersome operations that users currently perform to find a target application program or a target function within it, operations that consume a great deal of the user's time, cause considerable inconvenience, and give a poor interaction experience; the beneficial effects are simple operation, savings in user time, and a good interaction experience.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of an operation terminal of an operation method based on gesture interaction in the embodiment of the present invention.
FIG. 2 is a flowchart of a first embodiment of the gesture interaction based operation method of the present invention.
FIG. 3 is a flowchart of a second embodiment of the gesture interaction based operation method of the present invention.
Fig. 4 is a schematic structural diagram of a first embodiment of the operating device based on gesture interaction according to the present invention.
Fig. 5 is a schematic structural diagram of a second embodiment of the operating device based on gesture interaction according to the present invention.
Fig. 6 is a bus diagram of a terminal device where an operation device based on gesture interaction is located in the embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Fig. 1 is a schematic structural diagram of an operation terminal for the gesture interaction based operation method in an embodiment of the present invention; the terminal may be used to implement the operation method based on gesture interaction provided in the embodiments of the present invention. Specifically:
the terminal may include RF (Radio Frequency) circuitry 110, memory 120 including one or more computer-readable storage media, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a WiFi (Wireless Fidelity) module 170, a processor 180 including one or more processing cores, and a power supply 190. Those skilled in the art will appreciate that the terminal structure shown in fig. 1 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the RF circuit 110 may be used for receiving and transmitting signals during information transmission and reception or during a call, and in particular, receives downlink information from a base station and then sends the received downlink information to the one or more processors 180 for processing; in addition, data relating to uplink is transmitted to the base station. In general, the RF circuitry 110 includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, the RF circuitry 110 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (short messaging Service), etc.
The memory 120 may be used to store software programs and modules, and the processor 180 executes various functional applications and data processing by running the software programs and modules stored in the memory 120. The memory 120 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the terminal. Further, the memory 120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. Accordingly, the memory 120 may further include a memory controller to provide the processor 180 and the input unit 130 with access to the memory 120.
The input unit 130 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. The input unit 130 may include a touch surface 131 and other input devices 132. The touch surface 131, also referred to as a touch display screen or a touch pad, may collect a touch operation by a user (e.g., a screen unlock operation performed by the user on the touch surface 131 or an operation near the touch surface 131 using any suitable object or accessory such as a finger, a stylus, etc.) on or near the touch surface 131 and drive a corresponding connection device according to a preset program. Alternatively, the touch surface 131 may include two portions, a touch detection device and a touch controller. The touch detection device detects a touch gesture track input by a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 180, and can receive and execute commands sent by the processor 180. In addition, the touch surface 131 can be implemented using various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch surface 131, the input unit 130 may also include other input devices 132. In particular, other input devices 132 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 140 may be used to display information input by or provided to a user and various graphic user interfaces of the terminal, which may be configured by graphics, text, icons, video, and any combination thereof. The Display unit 140 may include a Display panel 141, and optionally, the Display panel 141 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, the touch surface 131 can cover the display panel 141, and when the touch surface 131 detects a touch operation on or near the touch surface, the touch operation is transmitted to the processor 180 to determine the type of the touch event, and then the processor 180 provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although in FIG. 1, the touch surface 131 and the display panel 141 are shown as two separate components to implement input and output functions, in some embodiments, the touch surface 131 and the display panel 141 may be integrated to implement input and output functions.
The terminal may also include at least one sensor 150, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel 141 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 141 and/or a backlight when the terminal is moved to the ear. As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when the terminal device is stationary, and can be used for applications (such as horizontal and vertical screen switching, related games, magnetometer attitude calibration) for recognizing the attitude of the terminal device, and related functions (such as pedometer and tapping) for vibration recognition; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured in the terminal, detailed description is omitted here.
The audio circuitry 160, speaker 161, microphone 162 may provide an audio interface between the user and the terminal. The audio circuit 160 may transmit the electrical signal converted from the received audio data to the speaker 161, and convert the electrical signal into a sound signal for output by the speaker 161; on the other hand, the microphone 162 converts the collected sound signal into an electric signal, converts the electric signal into audio data after being received by the audio circuit 160, and then outputs the audio data to the processor 180 for processing, and then transmits the audio data to another terminal through the RF circuit 110, or outputs the audio data to the memory 120 for further processing. The audio circuit 160 may also include an earbud jack to provide communication of peripheral headphones with the terminal.
WiFi belongs to a short-distance wireless transmission technology, and the terminal can help a user to send and receive e-mails, browse webpages, access streaming media and the like through the WiFi module 170, and provides wireless broadband internet access for the user. Although fig. 1 shows the WiFi module 170, it is understood that it does not belong to the essential constitution of the terminal, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 180 is a control center of the terminal, connects various parts of the entire terminal device using various interfaces and lines, and performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby integrally monitoring the terminal device. Optionally, processor 180 may include one or more processing cores; preferably, the processor 180 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 180.
The terminal also includes a power supply 190 (e.g., a battery) for powering the various components, which may preferably be logically coupled to the processor 180 via a power management system to manage charging, discharging, and power consumption via the power management system. The power supply 190 may also include any component including one or more of a dc or ac power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown, the terminal may further include a camera, a Bluetooth module, and the like, which are not described here. In this embodiment, the display unit of the terminal is a touch screen display, and the terminal further includes a memory and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs contain operation instructions for carrying out the gesture interaction based operation method in the following embodiments of the present invention.
The invention provides an operation method based on gesture interaction. Fig. 2 is a flowchart illustrating a first embodiment of an operation method based on gesture interaction according to the present invention. The operation method based on gesture interaction mentioned in this embodiment includes the steps of:
step S210, acquiring an input touch gesture track;
in this embodiment, a listener is provided to monitor a touch signal of a user, analyze the touch signal, and display and process a touch gesture formed by sliding a finger on a screen through the class of gettureoverlaverlayview provided by the system.
Step S220, identifying an operation environment where the input touch gesture track occurs;
the operation environment related to the method comprises a system operation environment and an application program environment, and based on the detection of operation environment factors, whether the operation environment for acquiring the touch gesture track of the current input is the system operation environment or the application program environment is identified.
The system operating environment may be a system interactive interface, such as a computer like Windows and Mac OS, or a smart phone like ios, android, and Windows phone, or a similar operating system interactive interface, and is mainly used for implementing basic functions, settings, and the like of system operation.
The application program environment may be an application program interactive interface having some item or some items of functions running on the operating system, and the application program may be a third party application program, or may be an application program carried by the operating system, such as a game program, a news client program, a weather forecast program, and the like.
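One plausible way to sketch this identification on older Android versions is to inspect the foreground task; getRunningTasks is a real but long-deprecated API that historically required the GET_TASKS permission, and the launcher-package check below is an illustrative assumption, not the patent's prescribed mechanism:

```java
import android.app.ActivityManager;
import android.content.Context;

public final class EnvironmentRecognizer {

    /** Returns "SYSTEM" for the system operating environment, or the foreground
     *  application's package name for an application program environment. */
    public static String currentEnvironment(Context context) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        String topPackage =
                am.getRunningTasks(1).get(0).topActivity.getPackageName();

        // Hypothetical launcher package; a fuller implementation would resolve
        // the default home application via PackageManager.
        if ("com.android.launcher".equals(topPackage)) {
            return "SYSTEM";
        }
        return topPackage;
    }
}
```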
Step S230, searching a function instruction matched with the input touch gesture track in the operation environment;
When the current operating environment is the system operating environment, a function instruction matching the input touch gesture track is searched for within the system operating environment; when the current operating environment is a particular application program environment, the search is performed within that application program environment.
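A minimal sketch of such a per-environment lookup, assuming a two-level map from environment key to gesture name to function instruction (GestureLibrary and Prediction are real Android classes; the 2.0 score cutoff is an arbitrary illustrative threshold):

```java
import android.gesture.Gesture;
import android.gesture.GestureLibrary;
import android.gesture.Prediction;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.Map;

public final class GestureMatcher {
    private final GestureLibrary library;  // named touch gesture tracks
    private final Map<String, Map<String, Runnable>> table = new HashMap<>();

    public GestureMatcher(GestureLibrary library) {
        this.library = library;
    }

    /** Binds one gesture name to one function instruction within one environment. */
    public void bind(String environment, String gestureName, Runnable instruction) {
        table.computeIfAbsent(environment, k -> new HashMap<>())
             .put(gestureName, instruction);
    }

    /** Returns the matched function instruction, or null when nothing matches. */
    public Runnable match(String environment, Gesture gesture) {
        ArrayList<Prediction> predictions = library.recognize(gesture);
        if (predictions.isEmpty() || predictions.get(0).score < 2.0) {
            return null;  // no sufficiently confident recognition
        }
        Map<String, Runnable> byGesture = table.get(environment);
        return byGesture == null ? null : byGesture.get(predictions.get(0).name);
    }
}
```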
Step S240: if a function instruction matching the input touch gesture track is found, the function instruction is executed.
In the system operating environment, if a function instruction matching the input touch gesture track is found, the function instruction is executed. Here the function instruction comprises a call instruction for an application program; executing it means finding the process ID of the application program and invoking the program corresponding to that process ID, so as to run the application program.
In an application program environment, if a function instruction matching the input touch gesture track is found, the function instruction is executed. Here the function instruction comprises a function call instruction within the application program; executing it means finding the function entry address in the application program that corresponds to the function instruction and executing the function at that entry address.
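The patent's process-ID and function-entry-address descriptions do not map one-to-one onto Android's public API; a hedged sketch of the closest idiomatic equivalents uses a launch Intent for the system environment and a pre-bound Runnable for an in-app function module (both are substitutions for illustration, not the patent's literal mechanism):

```java
import android.content.Context;
import android.content.Intent;

public final class InstructionExecutor {

    /** System operating environment: start the matched application. */
    public static void launchApplication(Context context, String packageName) {
        Intent intent =
                context.getPackageManager().getLaunchIntentForPackage(packageName);
        if (intent != null) {
            context.startActivity(intent);  // brings the target application up
        }
    }

    /** Application program environment: jump straight to the matched function module. */
    public static void invokeFunction(Runnable functionInstruction) {
        functionInstruction.run();
    }
}
```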
By acquiring an input touch gesture track, identifying the operating environment in which it occurs, searching for a function instruction matching the track in that environment, and executing the function instruction if one is found, the method and device solve the prior-art problems that finding a target application program or a target function of a target application program requires a series of cumbersome operations, consumes a great deal of the user's time, causes considerable inconvenience, and gives a poor interaction experience; the beneficial effects are simple and convenient operation, savings in user time, and a good interaction experience.
Fig. 3 is a flowchart illustrating an operation method based on gesture interaction according to a second embodiment of the present invention. The operation method based on gesture interaction mentioned in this embodiment includes the steps of:
step S310, setting a matching relation between a touch gesture track and a function instruction;
and establishing gesture matching information, and setting a matching relation among an operating environment in which the touch gesture track occurs, the touch gesture track and a function calling instruction. The operating environment in which the touch gesture trajectory occurs includes a system operating environment and an application environment.
When the operating environment where the touch gesture track occurs is the system operating environment, the function instruction comprises a calling instruction of an application program, a matching relation I of the touch gesture track and the application program is set, and the system operating environment and the matching relation I are bound.
When the operation environment where the touch gesture track occurs is an application program environment, the function instruction comprises a function calling instruction in the application program, a matching relation II between the touch gesture track and a function module in the application program is set, and the application program environment and the matching relation II are bound.
In this step, a custom class file is created; a function such as the system onCreate function or a custom initialization function is overridden in the custom class file, the configured gesture matching information is loaded inside that function, and the result is stored in an object of the gesture database provided by the system.
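A minimal sketch of loading the configured gesture matching information during initialization, as described above; R.raw.gestures is a hypothetical resource (for example, one created with Android's Gesture Builder tool), while GestureLibraries and GestureLibrary are real SDK classes:

```java
import android.app.Activity;
import android.gesture.GestureLibraries;
import android.gesture.GestureLibrary;
import android.os.Bundle;

public class GestureConfigActivity extends Activity {
    private GestureLibrary gestureLibrary;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Load the gesture database object provided by the system; each entry
        // pairs a named touch gesture track with a bindable function instruction.
        gestureLibrary = GestureLibraries.fromRawResource(this, R.raw.gestures);
        if (!gestureLibrary.load()) {
            finish();  // gesture configuration missing or unreadable
        }
    }
}
```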
The expression form of the gesture matching information is not limited, and is exemplified in the form of a gesture matching information table, which is shown in table 1 below:
TABLE 1
Operating environment            Touch gesture trajectory    Function instruction
System operating environment     0                           Application 1
System operating environment     1                           Application 2
System operating environment     2                           Application 3
System operating environment     3                           Application 4
Application A environment        0                           Function module a1
Application A environment        1                           Function module a2
Application A environment        2                           Function module a3
Application A environment        3                           Function module a4
Application B environment        0                           Function module b1
Application B environment        1                           Function module b2
Application B environment        2                           Function module b3
Application B environment        3                           Function module b4
The touch gesture trajectory may be any number, letter, or other character that the user can customize. Taking numeric trajectories as an example, in Table 1: in the system operating environment, touch gesture trajectories 0, 1, 2, and 3 correspond in sequence to Application 1, Application 2, Application 3, and Application 4; in the Application A environment, trajectories 0, 1, 2, and 3 correspond in sequence to function modules a1, a2, a3, and a4; in the Application B environment, trajectories 0, 1, 2, and 3 correspond in sequence to function modules b1, b2, b3, and b4.
Touch gesture trajectories 0, 1, 2, and 3 in Table 1 may be used in the system operating environment, the Application A environment, or the Application B environment; that is, the same touch gesture trajectory can be reused across different operating environments without affecting which function instruction is matched and called in each environment, as the illustrative wiring below shows. Of course, different touch gesture trajectories may also be used in different operating environments. The user therefore need not consider the current operating environment and can set the matching relationship between touch gesture trajectories and function instructions according to personal preference or habit.
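Purely for illustration, a few rows of Table 1 could be wired into the GestureMatcher sketch above as follows; the "SYSTEM" key matches the EnvironmentRecognizer sketch, and every com.example.* name is invented:

```java
import android.content.Context;
import android.gesture.GestureLibrary;
import android.util.Log;

public final class Table1Bindings {

    public static GestureMatcher build(Context context, GestureLibrary library) {
        GestureMatcher matcher = new GestureMatcher(library);
        // System operating environment rows: gestures "0" and "1" open applications.
        matcher.bind("SYSTEM", "0",
                () -> InstructionExecutor.launchApplication(context, "com.example.app1"));
        matcher.bind("SYSTEM", "1",
                () -> InstructionExecutor.launchApplication(context, "com.example.app2"));
        // Application A environment rows: log statements stand in for modules a1 and a2.
        matcher.bind("com.example.appA", "0",
                () -> Log.i("Gesture", "function module a1"));
        matcher.bind("com.example.appA", "1",
                () -> Log.i("Gesture", "function module a2"));
        return matcher;
    }
}
```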
Meanwhile, touch gesture trajectories may be single-point or multi-point gestures. A single-point gesture is performed by a single contact point, such as a single touch from one finger, a palm, or a stylus. A multi-point gesture is performed by multiple points, for example multiple touches from several fingers, a finger and a palm, a finger and a stylus, several styluses, or any combination thereof.
Step S320, acquiring an input touch gesture track;
and a monitor is additionally arranged to monitor the touch signal of a user, analyze the touch signal, and display and process the touch gesture formed by the sliding of a finger on a screen through the GestureOverLayView type provided by the system.
Step S330, identifying an operation environment where the input touch gesture track occurs;
the operation environment related to the method comprises a system operation environment and an application program environment, and based on the detection of operation environment factors, whether the operation environment for acquiring the touch gesture track of the current input is the system operation environment or the application program environment is identified.
The system operating environment may be a system interactive interface, such as a computer like Windows and Mac OS, or a smart phone like ios, android, and Windows phone, or a similar operating system interactive interface, and is mainly used for implementing basic functions, settings, and the like of system operation.
The application program environment may be an application program interactive interface having some item or some items of functions running on the operating system, and the application program may be a third party application program, or may be an application program carried by the operating system, such as a game program, a news client program, a weather forecast program, and the like.
Step S340, searching a function instruction matched with the input touch gesture track in the operation environment;
When the current operating environment is the system operating environment, a function instruction matching the input touch gesture track is searched for within the system operating environment; when the current operating environment is a particular application program environment, the search is performed within that application program environment.
Step S351: if a function instruction matching the input touch gesture track is found, the function instruction is executed.
In the system operating environment, if a function instruction matching the input touch gesture track is found, the function instruction is executed. Here the function instruction comprises a call instruction for an application program; executing it means finding the process ID of the application program and invoking the program corresponding to that process ID, so as to run the application program.
In an application program environment, if a function instruction matching the input touch gesture track is found, the function instruction is executed. Here the function instruction comprises a function call instruction within the application program; executing it means finding the function entry address in the application program that corresponds to the function instruction and executing the function at that entry address.
Step S352: if no function instruction matching the input touch gesture track is found, gesture guidance information is displayed.
Gesture guidance information is established, and the matching relationship among the operating environment in which the touch gesture track occurs, the touch gesture track, and the gesture guidance information is set. The operating environment in which the touch gesture track occurs includes a system operating environment and an application program environment.
In this step, a custom class file is created; a function such as the system onCreate function or a custom initialization function is overridden in the custom class file, the matching relationship among the operating environment, the touch gesture trajectory, and the gesture guidance information is loaded inside that function, and the result is stored in an object of the gesture database provided by the system.
The matching relationship among the operating environment where the touch gesture trajectory occurs, the touch gesture trajectory, and the gesture guidance information is shown in table 2 below, for example:
TABLE 2
Operating environment            Touch gesture trajectory    Function instruction
System operating environment     0                           Application 1
System operating environment     1                           Application 2
System operating environment     2                           Application 3
System operating environment     3                           Application 4
System operating environment     4                           Gesture guidance information I
Application A environment        0                           Function module a1
Application A environment        1                           Function module a2
Application A environment        2                           Function module a3
Application A environment        3                           Function module a4
Application A environment        4                           Gesture guidance information II
Application B environment        0                           Function module b1
Application B environment        1                           Function module b2
Application B environment        2                           Function module b3
Application B environment        3                           Function module b4
Application B environment        4                           Gesture guidance information III
The touch gesture trajectory may be any number, letter, or other character that the user can customize. Taking numeric trajectories as an example in Table 2: in the system operating environment, touch gesture trajectory 4 calls gesture guidance information I; in the Application A environment, trajectory 4 calls gesture guidance information II; and in the Application B environment, trajectory 4 calls gesture guidance information III.
The gesture guidance information contains the gesture matching information for the corresponding operating environment, and its display form is not limited. Taking gesture guidance information I in the system operating environment as an example, its content is shown in table form in Table 3 below:
TABLE 3
Touch gesture trajectory    Function instruction
0                           Application 1
1                           Application 2
2                           Application 3
3                           Application 4
Based on Table 3, in the system operating environment, if the user does not know or has forgotten the matching relationship between touch gesture tracks and function instructions, the user can draw touch gesture track 4: the function instruction matching track 4 in the system operating environment is found, namely gesture guidance information I, which is then called. Calling gesture guidance information I comprises finding its address and triggering it according to that address, so the user can consult it. Likewise, when an input touch gesture track is acquired and no matching function instruction is found in the system operating environment, gesture guidance information I is displayed to prompt the user to input a correct touch gesture track or to input again.
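A hedged sketch of this fallback: when no function instruction matches, show the guidance table for the current environment so the user can redraw a correct track (the dialog title, button label, and map contents are illustrative assumptions):

```java
import android.app.AlertDialog;
import android.content.Context;
import java.util.Map;

public final class GestureGuide {

    /** Displays the gesture-to-instruction matching table for one environment. */
    public static void show(Context context, String environment,
                            Map<String, String> gestureToInstruction) {
        StringBuilder body = new StringBuilder("Touch gesture track -> Function instruction\n");
        for (Map.Entry<String, String> row : gestureToInstruction.entrySet()) {
            body.append(row.getKey()).append(" -> ").append(row.getValue()).append('\n');
        }
        new AlertDialog.Builder(context)
                .setTitle("Gestures in " + environment)
                .setMessage(body.toString())
                .setPositiveButton("Input again", null)  // prompt the user to retry
                .show();
    }
}
```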
In this embodiment, an input touch gesture track is acquired, the operating environment in which it occurs is identified, a function instruction matching the track in that environment is searched for, and the function instruction is executed if found. If no matching function instruction is found, gesture guidance information is displayed to prompt the user to input a correct touch gesture track. If the user is unsure of, or has forgotten, the matching relationship between touch gesture tracks and function instructions, the user can call up the gesture guidance information by drawing a designated touch gesture track. This lets the user review the matching relationships in the current operating environment, helps the user draw a correct touch gesture track, and achieves one-step invocation of a function instruction; the operation is simple, the user's time is saved, and the interaction experience is good.
Meanwhile, the same touch gesture track can be used in different operating environments without affecting which function instruction is matched in each environment; of course, different touch gesture tracks may also be used in different operating environments. The user need not consider the current operating environment and can set the matching relationship between touch gesture tracks and function instructions according to personal preference or habit, making operation simpler and more convenient.
The invention further provides an operation device based on gesture interaction. As shown in fig. 4, fig. 4 is a schematic structural diagram of a first embodiment of the operation device based on gesture interaction according to the present invention. The operation device based on gesture interaction in this embodiment includes:
a gesture obtaining module 410, configured to obtain an input touch gesture trajectory;
an environment recognition module 420 for recognizing an operation environment where an input touch gesture trajectory occurs;
the gesture matching module 430 is configured to search for a function instruction matched with the input touch gesture trajectory in the operating environment;
and the function executing module 440 is configured to execute the function instruction if the function instruction matched with the input touch gesture track is found.
The operation device based on gesture interaction acquires an input touch gesture track, identifies the operating environment in which it occurs, searches for a function instruction matching the track in that environment, and executes the function instruction if one is found. It thus solves the prior-art problems that finding a target application program or a target function of a target application program requires a series of cumbersome operations, consumes a great deal of the user's time, causes considerable inconvenience, and gives a poor interaction experience; the beneficial effects are simple and direct operation, savings in user time, and a good interaction experience.
As shown in fig. 5, fig. 5 is a schematic structural diagram of a second embodiment of the operation device based on gesture interaction according to the present invention. The operation device based on gesture interaction in this embodiment includes:
the gesture setting module 510 is used for setting the matching relationship between the touch gesture track and the function instruction;
and establishing gesture matching information, and setting a matching relation among an operating environment in which the touch gesture track occurs, the touch gesture track and a function calling instruction. The operating environment in which the touch gesture trajectory occurs includes a system operating environment and an application environment.
When the operating environment where the touch gesture track occurs is the system operating environment, the function instruction comprises a calling instruction of an application program, a matching relation I of the touch gesture track and the application program is set, and the system operating environment and the matching relation I are bound.
When the operation environment where the touch gesture track occurs is an application program environment, the function instruction comprises a function calling instruction in the application program, a matching relation II between the touch gesture track and a function module in the application program is set, and the application program environment and the matching relation II are bound.
In this step, a custom class file is created; a function such as the system onCreate function or a custom initialization function is overridden in the custom class file, the configured gesture matching information is loaded inside that function, and the result is stored in an object of the gesture database provided by the system.
The expression form of the gesture matching information is not limited, and is now exemplified in the form of a gesture matching information table, as shown in table 1.
The touch gesture trajectory may be any number, letter, or other character that the user can customize. Taking numeric trajectories as an example, in Table 1: in the system operating environment, touch gesture trajectories 0, 1, 2, and 3 correspond in sequence to Application 1, Application 2, Application 3, and Application 4; in the Application A environment, trajectories 0, 1, 2, and 3 correspond in sequence to function modules a1, a2, a3, and a4; in the Application B environment, trajectories 0, 1, 2, and 3 correspond in sequence to function modules b1, b2, b3, and b4.
Touch gesture trajectories 0, 1, 2, and 3 in Table 1 may be used in the system operating environment, the Application A environment, or the Application B environment; that is, the same touch gesture trajectory can be reused across different operating environments without affecting which function instruction is matched and called in each environment. Of course, different touch gesture trajectories may also be used in different operating environments. The user therefore need not consider the current operating environment and can set the matching relationship between touch gesture trajectories and function instructions according to personal preference or habit.
Meanwhile, touch gesture trajectories may be single-point or multi-point gestures. A single-point gesture is performed by a single contact point, such as a single touch from one finger, a palm, or a stylus. A multi-point gesture is performed by multiple points, for example multiple touches from several fingers, a finger and a palm, a finger and a stylus, several styluses, or any combination thereof.
A gesture obtaining module 520, configured to obtain an input touch gesture trajectory;
and a monitor is additionally arranged to monitor the touch signal of a user, analyze the touch signal, and display and process the touch gesture formed by the sliding of a finger on a screen through the GestureOverLayView type provided by the system.
An environment recognition module 530 for recognizing an operation environment where an input touch gesture trajectory occurs;
the operation environment related to the method comprises a system operation environment and an application program environment, and based on the detection of operation environment factors, whether the operation environment for acquiring the touch gesture track of the current input is the system operation environment or the application program environment is identified.
The system operating environment may be a system interactive interface, such as a computer like Windows and Mac OS, or a smart phone like ios, android, and Windows phone, or a similar operating system interactive interface, and is mainly used for implementing basic functions, settings, and the like of system operation.
The application program environment may be an application program interactive interface having some item or some items of functions running on the operating system, and the application program may be a third party application program, or may be an application program carried by the operating system, such as a game program, a news client program, a weather forecast program, and the like.
The gesture matching module 540 is configured to search for a function instruction matched with the input touch gesture trajectory in the operation environment;
When the current operating environment is the system operating environment, a function instruction matching the input touch gesture track is searched for within the system operating environment; when the current operating environment is a particular application program environment, the search is performed within that application program environment.
And the function execution module 551 is used for executing the function instruction if the function instruction matched with the input touch gesture track is found.
In the system operating environment, if a function instruction matching the input touch gesture track is found, the function instruction is executed. Here the function instruction comprises a call instruction for an application program; executing it means finding the process ID of the application program and invoking the program corresponding to that process ID, so as to run the application program.
In an application program environment, if a function instruction matching the input touch gesture track is found, the function instruction is executed. Here the function instruction comprises a function call instruction within the application program; executing it means finding the function entry address in the application program that corresponds to the function instruction and executing the function at that entry address.
The gesture guidance module 552 is configured to display gesture guidance information, prompting the user to input a correct touch gesture track, if no function instruction matching the input touch gesture track is found.
Gesture guidance information is established, and the matching relationship among the operating environment in which the touch gesture track occurs, the touch gesture track, and the gesture guidance information is set. The operating environment in which the touch gesture track occurs includes a system operating environment and an application program environment.
In this step, a custom class file is created; a function such as the system onCreate function or a custom initialization function is overridden in the custom class file, the matching relationship among the operating environment, the touch gesture trajectory, and the gesture guidance information is loaded inside that function, and the result is stored in an object of the gesture database provided by the system.
The matching relationship among the operating environment where the touch gesture trajectory occurs, the touch gesture trajectory, and the gesture guidance information is shown in table 2, for example.
The touch gesture trajectory may be any number, letter, or other character that the user can customize. Taking numeric trajectories as an example in Table 2: in the system operating environment, touch gesture trajectory 4 calls gesture guidance information I; in the Application A environment, trajectory 4 calls gesture guidance information II; and in the Application B environment, trajectory 4 calls gesture guidance information III.
The gesture guidance information contains the gesture matching information for the corresponding operating environment, and its display form is not limited. Taking gesture guidance information I in the system operating environment as an example, its content is displayed in table form as shown in Table 3.
Based on Table 3, in the system operating environment, if the user does not know or has forgotten the matching relationship between touch gesture tracks and function instructions, the user can draw touch gesture track 4: the function instruction matching track 4 in the system operating environment is found, namely gesture guidance information I, which is then called. Calling gesture guidance information I comprises finding its address and triggering it according to that address, so the user can consult it. Likewise, when an input touch gesture track is acquired and no matching function instruction is found in the system operating environment, gesture guidance information I is displayed to prompt the user to input a correct touch gesture track or to input again.
In this embodiment, an input touch gesture track is acquired, the operating environment in which it occurs is identified, a function instruction matching the track in that environment is searched for, and the function instruction is executed if found. If no matching function instruction is found, gesture guidance information is displayed to prompt the user to input a correct touch gesture track. If the user is unsure of, or has forgotten, the matching relationship between touch gesture tracks and function instructions, the user can call up the gesture guidance information by drawing a designated touch gesture track. This lets the user review the matching relationships in the current operating environment, helps the user draw a correct touch gesture track, and achieves one-step invocation of a function instruction; the operation is simple, the user's time is saved, and the interaction experience is good.
Meanwhile, the same touch gesture track can be used in different operating environments without affecting which function instruction is matched in each environment; of course, different touch gesture tracks may also be used in different operating environments. The user need not consider the current operating environment and can set the matching relationship between touch gesture tracks and function instructions according to personal preference or habit, making operation simpler and more convenient.
As shown in fig. 6, fig. 6 is a bus diagram of a terminal device where an operation device based on gesture interaction is located in the embodiment of the present invention. The terminal device may include: at least one processor 610, such as a CPU, at least one network interface 640, a user interface 630, memory 650, at least one communication bus 620. Wherein a communication bus 620 is used to enable connective communication between these components. The user interface 630 may include a Display (Display), a Keyboard (Keyboard), a standard wired interface, and a wireless interface. Network interface 640 may include a standard wired interface, a wireless interface (e.g., a WIFI interface). Memory 650 may be a high-speed RAM memory or a non-volatile memory, such as at least one disk memory. The memory 650 may also be at least one storage device located remotely from the processor 610. The memory 650, which is a type of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and an operating program based on gesture interaction.
In the terminal device where the operation device based on gesture interaction shown in fig. 6 is located, the network interface 640 is mainly used for connecting to a server and performing data communication with the server; the user interface 630 is mainly used for receiving user instructions and interacting with the user; and the processor 610 may be configured to invoke the gesture interaction based operation program stored in the memory 650 and perform the following operations:
acquiring an input touch gesture track;
identifying an operation environment in which the input touch gesture track occurs;
searching a function instruction matched with the input touch gesture track in the operation environment;
and if the functional instruction matched with the input touch gesture track is found, executing the functional instruction.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. The term "comprising" specifies the presence of the stated features, integers, steps, operations, elements, and components.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (6)

1. An operation method based on gesture interaction, the method comprising:
establishing gesture matching information, and setting a matching relationship among an operating environment in which a touch gesture track occurs, the touch gesture track, a function call instruction, and gesture guidance information; the operating environment comprises a system operating environment and an application program environment; the system operating environment is an operating system interactive interface, and the application program environment is an application program interactive interface that presents the application program's functions while the application program runs;
acquiring an input touch gesture track;
identifying an operation environment in which the input touch gesture track occurs; the operating environment comprises a system operating environment and an application program environment; when the operating environment where the touch gesture track occurs is the system operating environment, the function instruction comprises a calling instruction of an application program, a matching relation I of the touch gesture track and the application program is set, and the system operating environment and the matching relation I are bound; when the operation environment where the touch gesture track occurs is an application program environment, the function instruction comprises a function calling instruction in the application program, a matching relation II between the touch gesture track and a function module in the application program is set, and the application program environment and the matching relation II are bound;
searching, in the operating environment, for a function instruction matching the input touch gesture track; wherein, in the system operating environment, the function instruction comprises a call instruction of an application program, and in the application program environment, the function instruction comprises a function call instruction within the application program;
if a function instruction matching the input touch gesture track is found, executing the function instruction; in the system operating environment, executing the function instruction comprises executing the call instruction of the application program, looking up the process ID of the application program, and calling the program corresponding to the process ID to run the application program; in the application program environment, executing the function instruction comprises executing the function call instruction within the application program, looking up the function entry address in the application program corresponding to the function instruction, and executing the function corresponding to that entry address;
and if no function instruction matching the input touch gesture track can be found, displaying the corresponding gesture guidance information according to the operating environment, wherein the gesture guidance information comprises the matching relations between touch gesture tracks and function instructions in that operating environment.
2. The method according to claim 1, wherein the function instruction comprises a call instruction of gesture guidance information, and executing the function instruction comprises executing the call instruction of the gesture guidance information, namely looking up the address of the gesture guidance information and triggering the gesture guidance information according to that address.
3. An operation device based on gesture interaction, the device comprising:
a gesture setting module, configured to create gesture matching information and to set a matching relation among the operating environment in which a touch gesture track occurs, the touch gesture track, the function instruction to be called, and gesture guidance information; the operating environment comprises a system operating environment and an application program environment; the system operating environment is the interactive interface of the operating system, and the application program environment is the interactive interface of an application program, which presents the application program's functions while the application program is running;
a gesture acquisition module, configured to acquire an input touch gesture track;
an environment recognition module, configured to identify the operating environment in which the input touch gesture track occurs; the operating environment comprises a system operating environment and an application program environment; when the operating environment in which the touch gesture track occurs is the system operating environment, the function instruction comprises a call instruction of an application program, a matching relation I between the touch gesture track and the application program is set, and the system operating environment is bound to matching relation I; when the operating environment in which the touch gesture track occurs is an application program environment, the function instruction comprises a function call instruction within the application program, a matching relation II between the touch gesture track and a function module in the application program is set, and the application program environment is bound to matching relation II;
a gesture matching module, configured to search, in the operating environment, for a function instruction matching the input touch gesture track; wherein, in the system operating environment, the function instruction comprises a call instruction of an application program, and in the application program environment, the function instruction comprises a function call instruction within the application program;
a function execution module, configured to execute the function instruction if a function instruction matching the input touch gesture track is found; in the system operating environment, executing the function instruction comprises executing the call instruction of the application program, looking up the process ID of the application program, and calling the program corresponding to the process ID to run the application program; in the application program environment, executing the function instruction comprises executing the function call instruction within the application program, looking up the function entry address in the application program corresponding to the function instruction, and executing the function corresponding to that entry address;
and a gesture guidance module, configured to display, after the step of searching in the operating environment for a function instruction matching the input touch gesture track, the corresponding gesture guidance information according to the operating environment if no matching function instruction can be found, wherein the gesture guidance information comprises the matching relations between touch gesture tracks and function instructions in that operating environment.
4. The apparatus according to claim 3, wherein the function instruction comprises a call instruction of gesture guidance information, and executing the function instruction comprises executing the call instruction of the gesture guidance information, namely looking up the address of the gesture guidance information and triggering the gesture guidance information according to that address.
5. A storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the operation method based on gesture interaction according to claim 1 or 2.
6. A terminal device comprising a storage medium, a processor, and a computer program stored on the storage medium and executable on the processor, wherein the processor, when executing the program, implements the operation method based on gesture interaction according to claim 1 or 2.
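For readers of claims 1 and 2, the following hypothetical Java sketch pictures matching relations I and II and the call instruction of gesture guidance information. It is an illustration under assumptions, not the patented implementation: relationI and relationII stand for the two matching relations, a map key stands in for the address of the gesture guidance information, and no identifier is taken from the patent text.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of matching relations I and II from claim 1 and the
// guidance-information call instruction from claim 2. All identifiers
// (FunctionModule, relationI, triggerGuidance, ...) are illustrative only.
public class EnvironmentBoundRelations {

    public interface FunctionModule { void invoke(); }

    // Matching relation I (bound to the system operating environment):
    // touch gesture track -> application identifier.
    private final Map<String, String> relationI = new HashMap<>();

    // Matching relation II (bound to one application program environment):
    // touch gesture track -> function module inside the application.
    private final Map<String, FunctionModule> relationII = new HashMap<>();

    // Gesture guidance texts, addressable by a key that stands in for the
    // "address of the gesture guidance information" in claim 2.
    private final Map<String, String> guidance = new HashMap<>();

    public void bindSystemGesture(String track, String appId) { relationI.put(track, appId); }
    public void bindApplicationGesture(String track, FunctionModule m) { relationII.put(track, m); }
    public void registerGuidance(String key, String info) { guidance.put(key, info); }

    public boolean executeSystemGesture(String gestureTrack) {
        String appId = relationI.get(gestureTrack);
        if (appId == null) return false;
        // Stands in for looking up the application's process ID and calling
        // the corresponding program to run the application.
        System.out.println("launching application " + appId);
        return true;
    }

    public boolean executeApplicationGesture(String gestureTrack) {
        FunctionModule module = relationII.get(gestureTrack);
        if (module == null) return false;
        module.invoke(); // stands in for jumping to the function entry address
        return true;
    }

    public void triggerGuidance(String guidanceKey) {
        // Claim 2's call instruction: resolve the guidance information by its
        // address-like key, then trigger (here: display) it.
        System.out.println(guidance.getOrDefault(guidanceKey, "no guidance registered"));
    }
}
```

Here an application identifier takes the place of the process ID lookup, and invoking a FunctionModule takes the place of jumping to a function entry address, since both are operating-system-level details the claims leave to the implementation.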
CN201410351674.6A 2014-07-22 2014-07-22 Operation method and device based on gesture interaction Active CN105302452B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410351674.6A CN105302452B (en) 2014-07-22 2014-07-22 Operation method and device based on gesture interaction

Publications (2)

Publication Number Publication Date
CN105302452A CN105302452A (en) 2016-02-03
CN105302452B true CN105302452B (en) 2020-10-30

Family

ID=55199768

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410351674.6A Active CN105302452B (en) 2014-07-22 2014-07-22 Operation method and device based on gesture interaction

Country Status (1)

Country Link
CN (1) CN105302452B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105759962B (en) * 2016-02-04 2019-08-06 腾讯科技(深圳)有限公司 A kind of method and mobile terminal of application starting
CN106227350B (en) * 2016-07-28 2019-07-09 青岛海信电器股份有限公司 The method and smart machine of operation control are carried out based on gesture
WO2018112803A1 (en) * 2016-12-21 2018-06-28 华为技术有限公司 Touch screen-based gesture recognition method and device
CN107329574A (en) * 2017-06-30 2017-11-07 联想(北京)有限公司 Input method and system for electronic equipment
CN108694913B (en) * 2018-06-21 2021-02-23 昆山龙腾光电股份有限公司 Terminal equipment and working parameter adjusting method thereof
CN111176534A (en) * 2018-11-12 2020-05-19 奇酷互联网络科技(深圳)有限公司 Mobile terminal, method for rapidly displaying application or function and storage medium
CN110389702A (en) * 2019-07-19 2019-10-29 珠海格力电器股份有限公司 A kind of screenshot method, device and storage medium
CN111131952B (en) * 2019-12-27 2023-11-03 深圳春沐源控股有限公司 Earphone assembly control method, earphone assembly and computer readable storage medium
CN112418080A (en) * 2020-11-20 2021-02-26 江苏奥格视特信息科技有限公司 Finger action recognition method of laser scanning imager

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102841682A (en) * 2012-07-12 2012-12-26 宇龙计算机通信科技(深圳)有限公司 Terminal and gesture manipulation method
CN103324425A (en) * 2012-12-13 2013-09-25 重庆优腾信息技术有限公司 Command execution method and device based on gestures
CN103809842A (en) * 2012-11-07 2014-05-21 上海揆志网络科技有限公司 Method and device for executing system functions by hand gesture identification

Similar Documents

Publication Publication Date Title
CN105302452B (en) Operation method and device based on gesture interaction
US10725646B2 (en) Method and apparatus for switching screen interface and terminal
WO2016107501A1 (en) Intelligent device control method and device
CN105786878B (en) Display method and device of browsing object
CN104852885B (en) Method, device and system for verifying verification code
CN106293308B (en) Screen unlocking method and device
CN108039963B (en) Container configuration method and device and storage medium
CN108958606B (en) Split screen display method and device, storage medium and electronic equipment
WO2015043189A1 (en) Message display method and apparatus, and terminal device
WO2018027551A1 (en) Message display method, user terminal and graphic user interface
WO2014206138A1 (en) Webpage data update method, apparatus and terminal device
CN106951143B (en) Method and device for hiding application icons
WO2015067142A1 (en) Webpage display method and device
CN113050863A (en) Page switching method and device, storage medium and electronic equipment
CN109688611B (en) Frequency band parameter configuration method, device, terminal and storage medium
CN106339391B (en) Webpage display method and terminal equipment
CN105631059B (en) Data processing method, data processing device and data processing system
CN105320858B (en) Method and device for rapidly displaying application icons
CN105095161B (en) Method and device for displaying rich text information
CN111475066B (en) Background switching method of application program and electronic equipment
CN108897467B (en) Display control method and terminal equipment
WO2015124060A1 (en) Login interface displaying method and apparatus
CN108234275B (en) Method and device for releasing communication information
CN104954231B (en) Method and device for sending and displaying recommendation information
WO2015067206A1 (en) File searching method and terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant