CN110096213B - Terminal operation method based on gestures, mobile terminal and readable storage medium - Google Patents

Info

Publication number
CN110096213B
CN110096213B (application CN201910366850.6A)
Authority
CN
China
Prior art keywords
touch event
gesture
application
mobile terminal
infrared sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910366850.6A
Other languages
Chinese (zh)
Other versions
CN110096213A (en)
Inventor
吴启军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201910366850.6A
Publication of CN110096213A
Application granted
Publication of CN110096213B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/70Reducing energy consumption in communication networks in wireless communication networks

Abstract

The application discloses a gesture-based terminal operation method comprising the following steps: when an application launch is detected, determining whether the auxiliary operation mode is enabled for the application; when it is, monitoring gesture actions based on the infrared sensor; when a gesture action is detected by the infrared sensor, determining a first touch event corresponding to the gesture action based on a preset mapping relation; determining a third touch event based on the first touch event and a second touch event triggered on the screen of the mobile terminal; and executing, through the application, the operation corresponding to the third touch event. The application also discloses a mobile terminal and a readable storage medium. Applied to a mobile terminal equipped with an infrared sensor, the method captures gestures through the infrared sensor and executes the corresponding operations, providing more operation combinations, enabling more complex operations, and making gesture-based terminal operations concurrent.

Description

Terminal operation method based on gestures, mobile terminal and readable storage medium
Technical Field
The present application relates to the field of intelligent terminals, and in particular, to a gesture-based terminal operation method, a mobile terminal, and a readable storage medium.
Background
With the development of smartphones, their operability has grown ever stronger. A smartphone generally has only one front touch screen, and the user's touch operations are normally performed on that front screen, through which the user can carry out a wide range of operations.
With the popularity of mobile games, many terminal manufacturers have introduced gaming terminals. Because these mobile terminals have no physical game keys, users playing games on devices such as smartphones and tablet computers generally control functions such as up, down, left, and right through virtual keys on the game interface. However, mobile games and other applications with relatively complex operations often require compound input, that is, the user must tap several virtual keys on the screen to complete the corresponding application operation.
Disclosure of Invention
The main purpose of the application is to provide a gesture-based terminal operation method, a mobile terminal, and a readable storage medium, aiming to solve the technical problems that existing gesture-based mobile terminals offer little operating space, can complete only limited complex operations, and have low operation concurrency.
In order to achieve the above object, the present application provides a gesture-based terminal operation method, including:
when an application launch is detected, determining whether the application has enabled an auxiliary operation mode;
monitoring gesture actions based on the infrared sensor when it is determined that the application has enabled the auxiliary operation mode;
when a gesture action is monitored based on the infrared sensor, determining a first touch event corresponding to the gesture action based on a preset mapping relation;
determining a third touch event based on a second touch event triggered by a screen of the mobile terminal and the first touch event;
and executing the operation corresponding to the third touch event through the application.
Optionally, the step of determining, when a gesture action is monitored based on the infrared sensor, a first touch event corresponding to the gesture action based on a preset mapping relation includes:
when a gesture action is monitored based on the infrared sensor, determining whether the gesture action meets a preset condition based on the preset mapping relation;
and when the gesture action meets the preset condition, determining the first touch event corresponding to the gesture action based on the preset mapping relation.
Optionally, the step of determining whether the gesture action meets a preset condition based on the preset mapping relation includes:
determining whether the gesture action exists in the preset mapping relation, wherein the gesture action meets the preset condition when it exists in the preset mapping relation.
Optionally, the step of determining a third touch event based on the second touch event triggered by the screen of the mobile terminal and the first touch event includes:
determining whether a first operation instruction corresponding to the first touch event is consistent with a second operation instruction corresponding to the second touch event;
and when the first operation instruction and the second operation instruction are consistent, taking the second touch event as the third touch event and treating the first touch event as an invalid touch event.
Optionally, after the step of determining whether the first operation instruction corresponding to the first touch event and the second operation instruction corresponding to the second touch event are consistent, the method further includes:
and synthesizing the first touch event and the second touch event into the third touch event when the first operation instruction and the second operation instruction are inconsistent.
Optionally, before the step of determining whether the application starts the auxiliary operation mode when the application is detected to be started, the method further includes:
when a setting instruction is received, determining an application to be set, and setting the application to be set into an auxiliary operation mode;
and acquiring gesture actions based on the infrared sensors, and generating a preset mapping relation between the gesture actions and the touch events based on preset rules and the gesture actions.
Optionally, after the step of determining whether the application starts the auxiliary operation mode when the application is detected to be started, the method further includes:
if a switching instruction is received, switching the auxiliary operation mode into a single-screen operation mode;
and closing the monitoring function of the infrared sensor.
Optionally, the step of executing, by the application, the operation corresponding to the third touch event includes:
and determining an operation instruction corresponding to the third touch event based on a comparison table of the touch event and the operation instruction, and executing an operation corresponding to the operation instruction.
In addition, in order to achieve the above object, the present application also provides a mobile terminal including a memory, a processor, and a gesture-based terminal operation program stored on the memory and executable on the processor, which when executed by the processor, implements the steps of the gesture-based terminal operation method as described above.
In addition, to achieve the above object, the present application also provides a readable storage medium having stored thereon a gesture-based terminal operation program which, when executed by a processor, implements the steps of the gesture-based terminal operation method as described above.
According to the method, when an application launch is detected, it is determined whether the auxiliary operation mode is enabled for the application; when it is, gesture actions are monitored with the infrared sensor. When a gesture action is detected, a first touch event corresponding to the gesture action is determined based on a preset mapping relation, a third touch event is determined based on the first touch event and a second touch event triggered on the screen of the mobile terminal, and the operation corresponding to the third touch event is executed through the application. Applied to a mobile terminal equipped with an infrared sensor, the method allows different operations to be performed on the mobile terminal simultaneously, provides more operation combinations, enables more complex operations, and makes gesture-based terminal operations concurrent.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present application;
fig. 2 is a schematic diagram of a communication network system according to an embodiment of the present application;
FIG. 3 is a flowchart of a gesture-based terminal operation method according to a first embodiment of the present application;
fig. 4 is a flowchart of a gesture-based terminal operation method according to a second embodiment of the present application.
The achievement of the object, functional features, and advantages of the present application will be further described with reference to the embodiments and the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
In the following description, suffixes such as "module", "component", or "unit" for representing elements are used only for facilitating the description of the present application, and have no specific meaning per se. Thus, "module," "component," or "unit" may be used in combination.
The terminal may be implemented in various forms. For example, the terminals described in the present application may include mobile terminals such as cell phones, tablet computers, notebook computers, palm computers, personal digital assistants (PDAs), portable media players (PMPs), navigation devices, wearable devices, smart bracelets, and pedometers, as well as fixed terminals such as digital TVs and desktop computers.
The following description takes a mobile terminal as an example; those skilled in the art will understand that, apart from elements specifically intended for mobile use, the configuration according to the embodiments of the present application can also be applied to fixed terminals.
Referring to fig. 1, a schematic diagram of the hardware structure of a mobile terminal implementing various embodiments of the present application, the mobile terminal 100 may include: an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will appreciate that the structure shown in fig. 1 does not limit the mobile terminal; the mobile terminal may include more or fewer components than shown, combine certain components, or arrange the components differently.
The following describes the components of the mobile terminal in detail with reference to fig. 1:
the radio frequency unit 101 may be used for receiving and transmitting signals during the information receiving or communication process, specifically, after receiving downlink information of the base station, processing the downlink information by the processor 110; and, the uplink data is transmitted to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System of Mobile communication, global System for Mobile communications), GPRS (General Packet Radio Service ), CDMA2000 (Code Division Multiple Access, CDMA 2000), WCDMA (Wideband Code Division Multiple Access ), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access, time Division synchronous code Division multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution, frequency Division Duplex Long term evolution), and TDD-LTE (Time Division Duplexing-Long Term Evolution, time Division Duplex Long term evolution), etc.
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user send and receive e-mail, browse web pages, access streaming media, and so on, providing wireless broadband Internet access. Although fig. 1 shows the WiFi module 102, it is not an essential component of the mobile terminal and may be omitted as needed without changing the essence of the application.
The audio output unit 103 may convert voice data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a talk mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output (e.g., a call signal reception sound, a message reception sound, etc.) related to a specific function performed by the mobile terminal 100. The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. It may include a graphics processor (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in video capture or image capture mode, and the processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (voice data) in phone call, recording, and voice recognition modes, and can process such sound into voice data. In phone call mode, the processed audio (voice) data may be converted into a format that can be transmitted to the mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As one kind of motion sensor, the accelerometer can detect the magnitude of acceleration in all directions (generally three axes) and the magnitude and direction of gravity when stationary, and can be used in applications that recognize the phone's orientation (such as landscape/portrait switching, related games, and magnetometer pose calibration) and in vibration-recognition functions (such as pedometers and tap detection). Other sensors that may also be configured in the mobile phone, such as fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, and infrared sensors, are not described in detail here.
The display unit 106 is used to display information input by a user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. In particular, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, can collect the user's touch operations on or near it (for example, operations performed on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. The touch panel 1071 may include a touch detection device and a touch controller. The touch detection device detects the position and direction of the user's touch, detects the signals produced by the touch operation, and passes them to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch coordinates, sends them to the processor 110, and can receive and execute commands sent by the processor 110. The touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may include other input devices 1072, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick; these are not limited here.
Further, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 1, the touch panel 1071 and the display panel 1061 are two independent components for implementing the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 may be integrated with the display panel 1061 to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 108 serves as an interface through which at least one external device can be connected with the mobile terminal 100. For example, the external devices may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and an external device.
Memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area that may store an operating system, application programs required for at least one function (such as a sound playing function, an image playing function, etc.), and a storage data area; the storage data area may store data (such as voice data, phonebook, etc.) created according to the use of the handset, etc. In addition, memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
In addition, in the mobile terminal 100 shown in fig. 1, the processor 110 is configured to invoke the gesture-based terminal operation program stored in the memory 109, and execute the steps of the gesture-based terminal operation method provided in the embodiments of the present application.
The mobile terminal 100 may further include a power source 111 (e.g., a battery) for supplying power to the respective components, and preferably, the power source 111 may be logically connected to the processor 110 through a power management system, so as to perform functions of managing charging, discharging, and power consumption management through the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described herein.
In order to facilitate understanding of the embodiments of the present application, a communication network system on which the mobile terminal of the present application is based will be described below.
Referring to fig. 2, fig. 2 is a schematic diagram of a communication network system according to an embodiment of the present application. The communication network system is an LTE system of the universal mobile telecommunications technology, and includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an operator's IP services 204, connected in communication in sequence.
Specifically, the UE201 may be the mobile terminal 100 described above, and will not be described herein.
The E-UTRAN202 includes an eNodeB2021, other eNodeBs 2022, and so on. The eNodeB2021 may be connected with the other eNodeBs 2022 through a backhaul (e.g., an X2 interface); the eNodeB2021 is connected to the EPC203 and may provide access from the UE201 to the EPC203.
The EPC203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and so on. The MME2031 is a control node that handles signaling between the UE201 and the EPC203, providing bearer and connection management. The HSS2032 provides registers such as the home location register (not shown) and holds user-specific information about service characteristics, data rates, and the like. All user data may be sent through the SGW2034; the PGW2035 may provide IP address allocation and other functions for the UE201; and the PCRF2036 is the policy and charging control decision point for service data flows and IP bearer resources, selecting and providing available policy and charging control decisions for the policy and charging enforcement function (not shown).
The IP services 204 may include the Internet, intranets, an IMS (IP Multimedia Subsystem), or other IP services.
Although the LTE system is described above as an example, it should be understood by those skilled in the art that the present application is not limited to LTE systems, but may be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems.
Based on the above-mentioned terminal hardware structure and the communication network system, various embodiments of the gesture-based terminal operation method of the present application are presented.
The application provides a terminal operation method based on gestures.
Referring to fig. 3, fig. 3 is a flowchart illustrating a first embodiment of a gesture-based terminal operation method according to the present application.
This embodiment provides a gesture-based terminal operation method. It should be noted that, although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in a different order.
In this embodiment, the gesture-based terminal operation method is applied to a mobile terminal provided with an infrared sensor, and the gesture-based terminal operation method includes:
step S10, when the starting of an application is detected, determining whether the application starts an auxiliary operation mode or not;
in this embodiment, the present application is applied to a mobile terminal provided with infrared sensors, the number of which is at least 2, and preferably, the infrared sensors are provided at the center position of the mobile terminal. The mobile terminal detects the starting condition of the application in real time, and when receiving a starting instruction of the application, starts the corresponding application and displays an application interface, wherein the starting instruction can be the starting instruction triggered by the touch operation detected by the mobile terminal or the voice instruction containing the application to be started is received according to the touch operation, and the method is not limited. When the mobile terminal detects that the application is started, judging whether the application is started in an auxiliary operation mode, and starting a detection function only in the auxiliary operation mode by an infrared sensor of the mobile terminal, wherein the specific method for judging whether the application is started in the auxiliary operation mode is to correlate the starting of the application with the auxiliary operation mode in advance, and when the application in the associated auxiliary operation mode is started, the auxiliary operation mode which is preset can be correspondingly started.
It should be noted that when the mobile terminal detects that an application has started but determines that the auxiliary operation mode is not enabled, it displays and operates in the normal single-screen operation mode.
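For illustration only, the following minimal Kotlin sketch shows one way the launch-time check of step S10 could look. All names here (GestureMonitor, AuxModeRegistry, onApplicationLaunched) are assumptions of this sketch, not identifiers from the patent.

```kotlin
// Hypothetical abstraction over the infrared gesture hardware.
interface GestureMonitor {
    fun startMonitoring()
    fun stopMonitoring()
}

// Applications the user has associated with the auxiliary operation mode
// (populated by the setting flow of steps S60/S70 described later).
object AuxModeRegistry {
    private val auxModeApps = mutableSetOf<String>()

    fun associate(packageName: String) { auxModeApps += packageName }

    fun isAuxiliaryMode(packageName: String): Boolean = packageName in auxModeApps
}

fun onApplicationLaunched(packageName: String, irSensor: GestureMonitor) {
    if (AuxModeRegistry.isAuxiliaryMode(packageName)) {
        irSensor.startMonitoring() // step S20: begin infrared gesture monitoring
    }
    // Otherwise the application runs in the normal single-screen mode
    // and the infrared sensor stays off.
}
```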
Step S20, monitoring gesture actions based on the infrared sensor when the application starting auxiliary operation mode is determined;
in this embodiment, after the mobile terminal detects that the application is started, if the mobile terminal determines that the application is started in the auxiliary operation mode, the mobile terminal detects a gesture through the infrared sensor, where the gesture is acquired in a sensing area of the infrared sensor, for example, a finger is swung in the sensor, and the gesture is determined to be sliding, and may be specifically further classified into sliding leftwards, sliding rightwards, sliding upwards, sliding downwards, and the like.
Step S30, when a gesture action is detected based on the infrared sensor, determining a first touch event corresponding to the gesture action based on a preset mapping relation;
in this embodiment, when the mobile terminal monitors the gesture motion through the infrared sensor, the first touch event corresponding to the gesture motion is further determined based on the mapping relationship between the gesture motion and the touch event. The touch event is specifically a gesture that can be recognized by the mobile terminal and corresponds to a track formed by a user operating a screen or an operation on a virtual key, for example, a gesture that swings to the right in a sensing area of the infrared sensor, and if the gesture is determined to be a gesture sliding to the right, the gesture is mapped to a touch event of a certain virtual key, and a corresponding relationship between the gesture and the touch event is preset by the user and stored in a mapping relationship.
Specifically, step S30 includes:
step S31, when the gesture motion is monitored based on the infrared sensor, determining whether the gesture motion meets a preset condition based on the preset mapping relation;
in this embodiment, when the mobile terminal detects that the application is started and determines that the application is started in the auxiliary operation mode, the mobile terminal detects a gesture action through the infrared sensor, and when the gesture action is detected, it is further required to determine whether the gesture action meets a preset condition, that is, whether the gesture is a valid gesture or not, so as to trigger the operation instruction to execute a related operation.
Specifically, step S31 includes: determining whether the gesture motion exists in the preset mapping relation, wherein when the gesture motion exists in the preset mapping relation, the gesture motion meets preset conditions.
In this embodiment, the correspondence between gesture actions and touch events is stored in the preset mapping relation. When the mobile terminal detects a gesture action through the infrared sensor, it checks whether the gesture action exists in the preset mapping relation. If it does, the gesture action is judged to meet the preset condition, and the touch event corresponding to it can then be obtained from the preset mapping relation.
Step S32, when the gesture meets a preset condition, determining a first touch event corresponding to the gesture based on the preset mapping relationship.
In this embodiment, when the gesture action exists in the preset mapping relation, it is determined that the gesture action meets the preset condition. In the preset mapping relation, gesture actions correspond one-to-one to touch events, so detecting one gesture action uniquely determines one touch event. For example, the gesture of a user's hand approaching the infrared sensor may be classified as a tap and mapped to the touch event that opens the camera, while a gesture waved to the right in the sensing area may be classified as a rightward slide and mapped to the touch event of the jump virtual key. The correspondences stored in the preset mapping relation are then: the tap gesture corresponds to the open-camera touch event, and the rightward wave corresponds to the jump touch event.
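A minimal sketch of steps S31 and S32, reusing the GestureAction type from the sketch above: the preset mapping itself doubles as the preset condition, since a gesture is valid exactly when it has an entry. The coordinates are made up for illustration.

```kotlin
// A recognizable touch event: where on the screen it acts and how.
data class TouchEvent(val x: Float, val y: Float, val action: String)

val presetMapping: Map<GestureAction, TouchEvent> = mapOf(
    GestureAction.TAP to TouchEvent(x = 980f, y = 60f, action = "TAP"),         // e.g. open-camera key
    GestureAction.SLIDE_RIGHT to TouchEvent(x = 640f, y = 880f, action = "TAP") // e.g. jump key
)

// Returns null when the gesture fails the preset condition;
// such gestures are simply ignored.
fun firstTouchEventFor(gesture: GestureAction): TouchEvent? = presetMapping[gesture]
```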
Step S40, determining a third touch event based on a second touch event triggered by a screen of the mobile terminal and the first touch event;
in this embodiment, after the auxiliary operation mode is started, if the mobile terminal detects the first touch event determined by the gesture detected by the infrared sensor and the second touch event triggered by the screen of the mobile terminal at the same time, the detected first touch event and the detected second touch event are combined into the third touch event. For the mobile terminal, two sets of information are received at the same time, so that the mobile terminal can combine the first touch event and the second touch event into a third touch event, wherein the third touch event is a new touch event after combining the first touch event and the second touch event, and it can be understood that the third touch event comprises all the characteristics of the first touch event and the second touch event.
It can be understood that in the auxiliary operation mode both the screen and the infrared sensor can detect touch events. If both produce a touch event in the same period, the two events are collected together; if only one of them detects a touch event in that period, only that event is collected.
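A sketch of this collection rule, assuming events are bucketed into fixed sampling windows (the window length is an assumed parameter, not specified in the patent):

```kotlin
const val SAMPLE_WINDOW_MS = 50L

// Events gathered within one sampling window of length SAMPLE_WINDOW_MS.
fun collectEvents(
    gestureEvent: TouchEvent?, // from the infrared gesture, if any this window
    screenEvent: TouchEvent?   // from the touch screen, if any this window
): List<TouchEvent> =
    listOfNotNull(gestureEvent, screenEvent) // zero, one, or both events
```

The resulting list is the third touch event handed to the dispatch of step S50; the conflict handling of the second embodiment (steps S41 to S43) refines how the two-event case is merged.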
Step S50, executing, by the application, an operation corresponding to the third touch event.
In this embodiment, after the third touch event is synthesized, the mobile terminal obtains the operation instruction triggered by the third touch event and, according to that instruction, executes the corresponding operation in the application.
Specifically, step S50 includes:
step S51, determining an operation instruction corresponding to the third touch event based on a comparison table of the touch event and the operation instruction, and executing an operation corresponding to the operation instruction.
In this embodiment, the mobile terminal establishes a mapping between the third touch event and its operation instructions: it determines the operation instruction corresponding to the touch event from the comparison table of touch events and operation instructions stored on the terminal and executes the corresponding operation. For example, the rightward-wave gesture corresponds to the jump touch event, so when it is detected a jump is performed; the tap gesture corresponds to the open-camera touch event, so when it is detected the camera is opened.
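A sketch of step S51: a comparison table from touch events to operation instructions, and a dispatcher over the (possibly composite) third touch event. The instruction names and println placeholders are illustrative assumptions.

```kotlin
enum class OperationInstruction { JUMP, OPEN_CAMERA, OPEN_BACKPACK }

// Comparison table of touch events and operation instructions,
// as stored on the mobile terminal.
val comparisonTable: Map<TouchEvent, OperationInstruction> = mapOf(
    TouchEvent(640f, 880f, "TAP") to OperationInstruction.JUMP,
    TouchEvent(980f, 60f, "TAP") to OperationInstruction.OPEN_CAMERA
)

fun execute(thirdTouchEvent: List<TouchEvent>) {
    for (event in thirdTouchEvent) {
        when (comparisonTable[event]) {
            OperationInstruction.JUMP -> println("perform jump")
            OperationInstruction.OPEN_CAMERA -> println("open camera")
            OperationInstruction.OPEN_BACKPACK -> println("open backpack")
            null -> Unit // no matching instruction: ignore the event
        }
    }
}
```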
Further, in an embodiment, before step S10, the method further includes:
step S60, when a setting instruction is received, determining an application to be set, and setting the application to be set into an auxiliary operation mode;
in this embodiment, when the mobile terminal receives the setting instruction, the mobile terminal enters a setting mode according to the setting instruction, and in the setting mode, displays a setting interface for a user to select an application to be set, specifically, the mobile terminal obtains all applications in the mobile terminal, and displays an application icon list on the setting interface for the user to select a corresponding icon. After receiving an application to be set selected by a user, setting the application as an auxiliary operation mode, wherein the method for setting the auxiliary operation mode can be selected as follows: and associating the starting of the application with the auxiliary operation mode, and correspondingly starting the auxiliary operation mode when the application is started.
Step S70, gesture actions are acquired based on the infrared sensors, and a preset mapping relation between the gesture actions and the touch events is generated based on preset rules and the gesture actions.
In this embodiment, in the auxiliary operation mode, a gesture action is first collected with the infrared sensor, for example a tap gesture. The user then drags a small floating icon onto the game virtual key to be controlled on the screen, for example the key that opens the backpack. From the position of the floating icon, the coordinates of its center point are obtained and taken as the start point or click position of the operation; at this point, the binding between the tap gesture and the open-backpack touch event is complete. Repeating the steps of collecting a gesture action and dragging the floating icon completes the mapping of the various gesture actions to touch events. It should be noted that gesture actions correspond one-to-one to touch events.
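A sketch of this binding step: the center of the dragged floating icon becomes the tap coordinate stored against the recorded gesture. The FloatingIcon type and its field names are assumptions of this sketch.

```kotlin
data class FloatingIcon(val left: Float, val top: Float, val width: Float, val height: Float)

// Bind a recorded gesture to the virtual key under the floating icon by
// storing the icon's center point as the click position (one gesture,
// one touch event).
fun bindGestureToKey(
    gesture: GestureAction,
    icon: FloatingIcon,
    mapping: MutableMap<GestureAction, TouchEvent>
) {
    val centerX = icon.left + icon.width / 2
    val centerY = icon.top + icon.height / 2
    mapping[gesture] = TouchEvent(centerX, centerY, "TAP")
}
```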
Further, in an embodiment, after step S10, the method further includes:
step S80, if a switching instruction is received, switching the auxiliary operation mode into a single-screen operation mode;
step S90, turning off the monitoring function of the infrared sensor.
In this embodiment, after it is determined that the application has enabled the auxiliary operation mode, if the mobile terminal receives a mode-switching instruction, the auxiliary operation mode is switched to the normal single-screen operation mode.
Specifically, the mobile terminal displays a transparent virtual switch key on the display interface of the application in which the auxiliary operation mode is enabled, for example at the upper-left corner for convenient access. When the user no longer needs assisted operation in that application and intends to switch to the single-screen operation mode, the user taps the switch key directly; on receiving the mode-switching instruction, the mobile terminal switches the application to the single-screen operation mode and closes the detection function of the infrared sensor.
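A minimal sketch of the switch flow of steps S80 and S90, reusing the hypothetical GestureMonitor interface from the earlier sketch:

```kotlin
class OperationModeController(private val irSensor: GestureMonitor) {
    var auxiliaryMode = true
        private set

    // Called when the transparent virtual switch key is tapped.
    fun onSwitchKeyTapped() {
        auxiliaryMode = false     // step S80: back to single-screen operation
        irSensor.stopMonitoring() // step S90: close the monitoring function
    }
}
```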
In the gesture-based terminal operation method of this embodiment, when an application launch is detected, it is determined whether the auxiliary operation mode is enabled for the application; when it is, gesture actions are monitored with the infrared sensor. When a gesture action is detected, a first touch event corresponding to the gesture action is determined based on a preset mapping relation, a third touch event is then determined based on the first touch event and a second touch event triggered on the screen of the mobile terminal, and the operation corresponding to the third touch event is executed through the application. Applied to a mobile terminal equipped with an infrared sensor, the method captures gestures through the infrared sensor and executes the corresponding operations, providing more operation combinations, enabling more complex operations, and making gesture-based terminal operations concurrent.
Based on the first embodiment, a second embodiment of the gesture-based terminal operation method of the present application is proposed. Referring to fig. 4, in this embodiment, step S40 includes:
step S41, determining whether a first operation instruction corresponding to the first touch event is consistent with a second operation instruction corresponding to the second touch event;
in this embodiment, after the auxiliary operation mode is started, if the mobile terminal detects the first touch event determined by the gesture detected by the infrared sensor and the second touch event triggered by the screen of the mobile terminal at the same time, only one touch event is default in the same area or the same virtual button, and when two touch events are simultaneously present, in order to avoid the system disorder caused by repeated execution of the operations corresponding to the first touch event and the second touch event, whether the first operation instruction corresponding to the first touch event and the second operation instruction corresponding to the second touch event are consistent needs to be confirmed.
And step S42, when the first operation instruction and the second operation instruction are consistent, taking the second touch event as the third touch event and treating the first touch event as an invalid touch event.
In this embodiment, to avoid two duplicate touch events collected in the same area or on the same virtual key, the priority-execution principle applies: when the first and second operation instructions are consistent, the second touch event becomes the third touch event, meaning only the screen-based touch operation is executed, and the first touch event is treated as invalid, meaning the mobile terminal does not respond to the gesture action.
And step S43, when the first operation instruction and the second operation instruction are inconsistent, synthesizing the first touch event and the second touch event into the third touch event.
In this embodiment, when the first and second operation instructions are inconsistent, the first touch event determined from the gesture sensed by the infrared sensor and the second touch event triggered on the screen act on different areas or different virtual keys. The two are therefore synthesized into the third touch event, and the operation of the third touch event is executed.
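Pulling steps S41 to S43 together, a sketch of the conflict rule, reusing the comparisonTable from the earlier sketch: when both events map to the same operation instruction the screen event wins, otherwise the two are synthesized.

```kotlin
fun resolveThirdTouchEvent(
    first: TouchEvent, // from the infrared gesture
    second: TouchEvent // from the screen
): List<TouchEvent> {
    val firstInstruction = comparisonTable[first]
    val secondInstruction = comparisonTable[second]
    return if (firstInstruction != null && firstInstruction == secondInstruction) {
        listOf(second)        // consistent: keep only the screen-triggered event
    } else {
        listOf(first, second) // inconsistent: synthesize both into the third touch event
    }
}
```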
In the gesture-based terminal operation method of this embodiment, when the mobile terminal detects both the first touch event determined from the infrared-sensed gesture and the second touch event triggered on the screen, only one touch event is expected by default in the same area or on the same virtual key; when the two simultaneous touch events are identical, only the second touch event is kept, avoiding repeated execution of the same touch event.
In addition, the embodiment of the application also provides a readable storage medium.
The readable storage medium has stored thereon a gesture-based terminal operation program which, when executed by a processor, implements the steps of the gesture-based terminal operation method as in any of the embodiments described above.
The specific implementation manner of the readable storage medium of the present application is substantially the same as the above embodiments of the gesture-based terminal operation method, and will not be described herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or system that includes that element.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to perform the method according to the embodiments of the present application.
The foregoing description covers only preferred embodiments of the present application and does not limit its patent scope. Any equivalent structural or process transformation made using the contents of the specification and drawings of the present application, applied directly or indirectly in other related technical fields, is likewise included within the patent protection scope of the present application.

Claims (6)

1. A gesture-based terminal operation method, which is applied to a mobile terminal provided with an infrared sensor, the gesture-based terminal operation method comprising:
when an application launch is detected, determining whether the application has enabled an auxiliary operation mode;
monitoring gesture actions based on the infrared sensor when it is determined that the application has enabled the auxiliary operation mode;
when a gesture action is monitored based on the infrared sensor, determining a first touch event corresponding to the gesture action based on a preset mapping relation;
determining a third touch event based on a second touch event triggered by a screen of the mobile terminal and the first touch event;
executing the operation corresponding to the third touch event through the application;
wherein the step of determining, when a gesture action is monitored based on the infrared sensor, a first touch event corresponding to the gesture action based on a preset mapping relation includes:
when a gesture action is monitored based on the infrared sensor, determining whether the gesture action meets a preset condition based on the preset mapping relation;
when the gesture action meets the preset condition, determining the first touch event corresponding to the gesture action based on the preset mapping relation;
wherein the step of determining whether the gesture action meets a preset condition based on the preset mapping relation includes:
determining whether the gesture action exists in the preset mapping relation, wherein the gesture action meets the preset condition when it exists in the preset mapping relation;
the step of determining a third touch event based on the second touch event triggered by the screen of the mobile terminal and the first touch event includes:
determining whether a first operation instruction corresponding to the first touch event is consistent with a second operation instruction corresponding to the second touch event;
when the first operation instruction and the second operation instruction are consistent, the second touch event is used as the third touch event, and the first touch event is used as an invalid touch event;
after the step of determining whether the first operation instruction corresponding to the first touch event is consistent with the second operation instruction corresponding to the second touch event, the method further includes:
and synthesizing the first touch event and the second touch event into the third touch event when the first operation instruction and the second operation instruction are inconsistent.
2. The gesture-based terminal operation method of claim 1, wherein the step of determining whether the application starts the auxiliary operation mode when the application start is detected, further comprises:
when a setting instruction is received, determining an application to be set, and setting the application to be set into an auxiliary operation mode;
and acquiring gesture actions based on the infrared sensors, and generating a preset mapping relation between the gesture actions and the touch events based on preset rules and the gesture actions.
3. The gesture-based terminal operation method of claim 1, wherein after the step of determining whether the application starts the auxiliary operation mode when the application start is detected, further comprising:
if a switching instruction is received, switching the auxiliary operation mode into a single-screen operation mode;
and closing the monitoring function of the infrared sensor.
4. The gesture-based terminal operation method of any one of claims 1 to 3, wherein the step of executing, by the application, the operation corresponding to the third touch event includes:
and determining an operation instruction corresponding to the third touch event based on a comparison table of the touch event and the operation instruction, and executing an operation corresponding to the operation instruction.
5. A mobile terminal comprising a memory, a processor and a gesture-based terminal operating program stored on the memory and executable on the processor, which when executed by the processor, implements the steps of the gesture-based terminal operating method of any of claims 1 to 4.
6. A readable storage medium, characterized in that it has stored thereon a gesture-based terminal operation program, which, when executed by a processor, implements the steps of the gesture-based terminal operation method according to any of claims 1 to 4.
CN201910366850.6A 2019-04-30 2019-04-30 Terminal operation method based on gestures, mobile terminal and readable storage medium Active CN110096213B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910366850.6A CN110096213B (en) 2019-04-30 2019-04-30 Terminal operation method based on gestures, mobile terminal and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910366850.6A CN110096213B (en) 2019-04-30 2019-04-30 Terminal operation method based on gestures, mobile terminal and readable storage medium

Publications (2)

Publication Number Publication Date
CN110096213A CN110096213A (en) 2019-08-06
CN110096213B true CN110096213B (en) 2023-12-08

Family

ID=67446757

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910366850.6A Active CN110096213B (en) 2019-04-30 2019-04-30 Terminal operation method based on gestures, mobile terminal and readable storage medium

Country Status (1)

Country Link
CN (1) CN110096213B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111831138B (en) * 2020-06-22 2024-03-15 歌尔科技有限公司 Signal identification method and device of wireless earphone and wireless earphone
CN113315871B (en) * 2021-05-25 2022-11-22 广州三星通信技术研究有限公司 Mobile terminal and operating method thereof
CN116755565A (en) * 2023-08-21 2023-09-15 南京维赛客网络科技有限公司 Method, system and storage medium for realizing virtual interaction by transmitting signal by finger

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107422853A (en) * 2017-06-29 2017-12-01 努比亚技术有限公司 A kind of gesture identification method, mobile terminal and computer-readable recording medium
WO2018076523A1 (en) * 2016-10-25 2018-05-03 科世达(上海)管理有限公司 Gesture recognition method and apparatus, and in-vehicle system
CN108897478A (en) * 2018-06-28 2018-11-27 努比亚技术有限公司 Terminal operation method, mobile terminal and computer readable storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101943435B1 (en) * 2012-04-08 2019-04-17 삼성전자주식회사 Flexible display apparatus and operating method thereof
WO2014172334A1 (en) * 2013-04-15 2014-10-23 Flextronics Ap, Llc User gesture control of vehicle features

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018076523A1 (en) * 2016-10-25 2018-05-03 科世达(上海)管理有限公司 Gesture recognition method and apparatus, and in-vehicle system
CN107422853A (en) * 2017-06-29 2017-12-01 努比亚技术有限公司 A kind of gesture identification method, mobile terminal and computer-readable recording medium
CN108897478A (en) * 2018-06-28 2018-11-27 努比亚技术有限公司 Terminal operation method, mobile terminal and computer readable storage medium

Also Published As

Publication number Publication date
CN110096213A (en) 2019-08-06

Similar Documents

Publication Publication Date Title
CN108037893B (en) Display control method and device of flexible screen and computer readable storage medium
CN108108081B (en) Information display method based on double-sided screen, mobile terminal and readable storage medium
CN107861663B (en) Method and device for displaying dockbar under comprehensive screen
CN109542325B (en) Double-sided screen touch method, double-sided screen terminal and readable storage medium
CN107888768B (en) Unlocking control method, terminal and computer readable storage medium
CN109195213B (en) Mobile terminal screen control method, mobile terminal and computer readable storage medium
CN107809534B (en) Control method, terminal and computer storage medium
CN107329572B (en) Control method, mobile terminal and computer-readable storage medium
CN110096213B (en) Terminal operation method based on gestures, mobile terminal and readable storage medium
CN108958936B (en) Application program switching method, mobile terminal and computer readable storage medium
CN108055366A (en) Terminal call interface processing method, mobile terminal and computer readable storage medium
CN108563388B (en) Screen operation method, mobile terminal and computer-readable storage medium
CN112306366A (en) Operation method, mobile terminal and storage medium
CN107707755B (en) Key using method, terminal and computer readable storage medium
CN109389394B (en) Multi-screen payment control method, equipment and computer readable storage medium
CN110058767B (en) Interface operation method, wearable terminal and computer-readable storage medium
CN109408187B (en) Head portrait setting method and device, mobile terminal and readable storage medium
CN108810262B (en) Application configuration method, terminal and computer readable storage medium
CN109117069B (en) Interface operation method, terminal and computer readable storage medium
CN108153477B (en) Multi-touch operation method, mobile terminal and computer-readable storage medium
CN111443818B (en) Screen brightness regulation and control method, equipment and computer readable storage medium
CN109710168B (en) Screen touch method and device and computer readable storage medium
CN111443850A (en) Terminal operation method, terminal and storage medium
CN107580106B (en) Call control method, mobile terminal and computer readable storage medium
CN107315523B (en) Split screen processing method, mobile terminal and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant