CN106873887B - System and method for controlling application function based on gesture - Google Patents


Info

Publication number
CN106873887B
CN106873887B (application CN201710003117.9A)
Authority
CN
China
Prior art keywords
application
gesture track
area
gesture
instruction
Prior art date
Legal status
Active
Application number
CN201710003117.9A
Other languages
Chinese (zh)
Other versions
CN106873887A (en)
Inventor
代启帅
王猛
Current Assignee
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201710003117.9A
Publication of CN106873887A
Application granted
Publication of CN106873887B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a method for controlling application functions based on gestures. The method comprises: creating a first gesture track and instruction corresponding relation list and a second gesture track and instruction corresponding relation list; acquiring the menu items under an application's settings and displaying them in the area where the application is located; detecting a first gesture track within the area where the application is located, and then detecting a second gesture track outside that area; when a first gesture track within the area where the application is located is received, switching the menu item displayed in that area according to the first gesture track and instruction corresponding relation list; and when a second gesture track outside the area where the application is located is subsequently received, adjusting the parameter value of the displayed menu item according to the second gesture track and instruction corresponding relation list. The invention also discloses a system for controlling the starting of application functions based on gestures. The invention improves the user experience of the mobile terminal.

Description

System and method for controlling application function based on gesture
Technical Field
The invention relates to the technical field of gesture recognition, in particular to a system and a method for controlling application functions based on gestures.
Background
Existing mobile terminals have many applications, and each application has its own settings. When a user needs to change a particular setting of an application, the user must first open the application, find the settings option under that application, locate the menu item to be modified within the settings menu, and then tap it to adjust or modify the parameter. Operating an application's settings is therefore very cumbersome and brings considerable inconvenience to the user; for example, an application may have so many functions that the settings option cannot easily be found.
Meanwhile, gesture-slide detection on existing touch screens is a very mature technology. If a gesture track could be used as a trigger instruction for controlling an application function (such as a settings function), the speed of opening that application function would be greatly improved, bringing great convenience to the user.
Therefore, there is a need for a system and a method for controlling application functions based on gestures, which provide a new way of controlling the application functions of a mobile terminal while bringing convenience to the user and improving the user experience of the mobile terminal.
Disclosure of Invention
The invention mainly aims to provide a system and a method for controlling application functions based on gestures, so as to solve the technical problem that existing mobile terminals cannot rapidly control application functions.
In order to achieve the above object, the present invention provides a system for controlling application functions based on gestures, the system being run on a mobile terminal, the system comprising:
the gesture list creating module is used for creating a first gesture track and instruction corresponding relation list and a second gesture track and instruction corresponding relation list;
the acquisition module is used for acquiring a menu item under application setting;
the display module is used for displaying a menu item under the application setting in the area where the application is located;
the first detection module is used for detecting a first gesture track in the area where the application is located;
the second detection module is used for detecting a second gesture track outside the area where the application is located after receiving the first gesture track inside the area where the application is located;
the switching module is used for switching menu items displayed in the area where the application is located according to the first gesture track and the instruction corresponding relation list when the first gesture track detected by the first detection module in the area where the application is located is received;
and the adjusting module is used for adjusting the parameter value of the menu item displayed in the area where the application is located according to the corresponding relation list of the second gesture track and the instruction when the second gesture track outside the area where the application is located, which is detected by the second detecting module, is received.
Optionally, the second gesture track is connected to the first gesture track, wherein the first gesture track and the second gesture track include sliding left, sliding right, sliding up, or sliding down.
Optionally, the first gesture track and instruction correspondence list includes that sliding the corresponding instruction to the left is to switch to a previous menu of the application setting, and sliding the corresponding instruction to the right is to switch to a next menu of the application setting; the list of the corresponding relation between the second gesture track and the instruction comprises that the corresponding instruction is slid upwards to adjust one parameter value in the menu items displayed in the area where the application is located, and the corresponding instruction is slid downwards to adjust another parameter value of the menu items displayed in the area where the application is located.
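To make the two correspondence lists concrete, the sketch below models them as simple lookup tables in Python. The menu items, gesture names, and function names are invented for illustration and are not taken from the patent.

```python
# Hypothetical sketch of the two gesture-to-instruction correspondence lists.
# The sample menu structure and all names are illustrative assumptions.

MENUS = ["brightness", "volume", "font size"]  # menu items under the application's settings

# First list: gestures inside the application's area switch the displayed menu item.
FIRST_LIST = {
    "slide_left": "previous_menu",
    "slide_right": "next_menu",
}

# Second list: gestures outside the application's area adjust the displayed item's value.
SECOND_LIST = {
    "slide_up": +1,
    "slide_down": -1,
}

def switch_menu(current_index: int, gesture: str) -> int:
    """Apply a first-gesture instruction, wrapping around the menu list."""
    instruction = FIRST_LIST.get(gesture)
    if instruction == "previous_menu":
        return (current_index - 1) % len(MENUS)
    if instruction == "next_menu":
        return (current_index + 1) % len(MENUS)
    return current_index  # unrecognized gesture: no change

def adjust_value(value: int, gesture: str) -> int:
    """Apply a second-gesture instruction to the displayed menu item's value."""
    return value + SECOND_LIST.get(gesture, 0)
```

For example, sliding right from "brightness" selects "volume", and sliding up then increments that item's parameter value by one step.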
Optionally, the gesture list creating module is further configured to create a third gesture track and instruction correspondence list; the menu item includes setting a shortcut operation; the parameter values of the shortcut operation include on and off; the system further comprises:
the third detection module is used for detecting the shortcut operation starting of the application and detecting a third gesture track;
and the control module is used for controlling the application to start a corresponding function or page according to the third gesture track and the instruction corresponding list when receiving the third gesture track of a preset touch area.
Optionally, the gesture list creating module is further configured to create a list of correspondence between the third gesture track and the instruction according to the application function or the page usage ranking.
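A minimal sketch of how the third correspondence list could be derived from usage ranking, assuming per-function usage counts are available. The gesture ordering, function names, and counts below are hypothetical.

```python
# Hypothetical sketch: rank the application's functions/pages by usage
# frequency, then assign the most-used ones to a fixed ordering of gesture
# tracks to form the third correspondence list.

GESTURES = ["slide_up", "slide_down", "slide_left", "slide_right"]

def build_third_list(usage_counts: dict) -> dict:
    """Map gesture tracks to the most frequently used functions or pages."""
    ranked = sorted(usage_counts, key=usage_counts.get, reverse=True)
    return dict(zip(GESTURES, ranked))

# Invented usage statistics for illustration.
usage = {"camera_page": 42, "scan_qr": 17, "flashlight": 8, "settings_page": 3}
third_list = build_third_list(usage)
```

With these counts, the most-used page ("camera_page") is bound to the first gesture in the ordering, and so on down the ranking.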
Compared with the prior art, the system for controlling application functions based on gestures provided by the invention creates a first gesture track and instruction corresponding relation list and a second gesture track and instruction corresponding relation list; the menu items under the application's settings are switched by a first gesture track detected within the area where the application is located, and the menu parameters are then adjusted by a second gesture track detected outside that area. The application is thereby configured through these operations; moreover, once the shortcut operation is switched on, a third gesture can directly open a function of the application or a particular page of the application. This brings convenience to the user and improves the user experience of the mobile terminal.
In addition, to achieve the above object, the present invention further provides a method for controlling an application function based on a gesture, the method being applied to a mobile terminal, the method including:
creating a first gesture track and instruction corresponding relation list and a second gesture track and instruction corresponding relation list;
acquiring a menu item under application setting;
displaying a menu item under the application setting in the area of the application;
detecting a first gesture track in the area where the application is located;
after receiving a first gesture track in the area where the application is located, detecting a second gesture track outside the area where the application is located;
when a first gesture track in the area where the application is located is received, switching menu items displayed in the area where the application is located according to the first gesture track and an instruction corresponding relation list;
and when a second gesture track outside the application area is received after the first gesture track in the application area is received, adjusting the parameter value of the menu item displayed in the application area according to the second gesture track and the instruction corresponding relation list.
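The two-phase flow described above (a first gesture inside the application's area must precede a second gesture outside it) can be sketched as a small state machine. The coordinate scheme, area bounds, and event format are assumptions made for illustration.

```python
# Hypothetical sketch of the two-phase detection flow: a gesture recognized
# INSIDE the application's area arms the controller, after which gestures
# OUTSIDE the area are interpreted as parameter adjustments.

APP_AREA = (0, 0, 200, 300)  # (left, top, right, bottom) of the application's area

def inside_app_area(x: int, y: int) -> bool:
    left, top, right, bottom = APP_AREA
    return left <= x <= right and top <= y <= bottom

class GestureController:
    def __init__(self):
        self.armed = False    # True once a first gesture arrived inside the area
        self.menu_index = 0
        self.value = 0

    def on_gesture(self, gesture: str, x: int, y: int) -> str:
        if inside_app_area(x, y):
            # First gesture: switch menus and arm second-gesture detection.
            self.armed = True
            if gesture == "slide_right":
                self.menu_index += 1
            elif gesture == "slide_left":
                self.menu_index = max(0, self.menu_index - 1)
            return "switched"
        if self.armed:
            # Second gesture, outside the area: adjust the displayed item's value.
            self.value += 1 if gesture == "slide_up" else -1
            return "adjusted"
        return "ignored"  # gestures outside the area are ignored until armed
```

Note that an outside gesture received before any inside gesture is ignored, matching the claim that the second gesture track is detected only after the first is received.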
Optionally, the second gesture track is connected to the first gesture track, wherein the first gesture track and the second gesture track include sliding left, sliding right, sliding up, or sliding down.
Optionally, the first gesture track and instruction correspondence list includes that sliding the corresponding instruction to the left is to switch to a previous menu of the application setting, and sliding the corresponding instruction to the right is to switch to a next menu of the application setting; the list of the corresponding relation between the second gesture track and the instruction comprises that the corresponding instruction is slid upwards to adjust one parameter value in the menu items displayed in the area where the application is located, and the corresponding instruction is slid downwards to adjust another parameter value of the menu items displayed in the area where the application is located.
Optionally, the menu item includes setting a shortcut operation; the parameter values of the shortcut operation include on and off; the method further comprises:
creating a third gesture track and instruction corresponding relation list;
detecting the shortcut operation starting of the application and detecting a third gesture track;
and when a third gesture track of a preset touch area is received, controlling the application to start a corresponding function or page according to the third gesture track and the instruction corresponding list.
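The shortcut path described above might be sketched as a simple guarded lookup: the third gesture takes effect only when the shortcut operation is switched on and the gesture falls inside the preset touch area. The region bounds and list contents below are invented for illustration.

```python
# Hypothetical sketch of third-gesture handling in a preset touch area.
# PRESET_AREA and THIRD_LIST are illustrative assumptions, not from the patent.

PRESET_AREA = (0, 400, 200, 600)   # a reserved strip of the touch screen
THIRD_LIST = {"slide_up": "open_camera_page", "slide_down": "open_scan_page"}

def handle_third_gesture(shortcut_enabled: bool, gesture: str, x: int, y: int):
    """Return the function/page to launch, or None if nothing should happen."""
    left, top, right, bottom = PRESET_AREA
    in_preset = left <= x <= right and top <= y <= bottom
    if shortcut_enabled and in_preset:
        return THIRD_LIST.get(gesture)
    return None
```

If the shortcut operation is off, or the gesture lands outside the preset area, or the gesture has no entry in the third list, nothing is launched.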
Optionally, the method further includes creating a list of correspondence between the third gesture track and the instruction according to the application function or page usage ranking.
Compared with the prior art, the method for controlling application functions based on gestures provided by the invention creates a first gesture track and instruction corresponding relation list and a second gesture track and instruction corresponding relation list; the menu under the application's settings is switched by a first gesture track detected within the area where the application is located, and the menu parameters are then adjusted by a second gesture track detected outside that area. The application is thereby configured through these operations; moreover, once the shortcut operation is switched on, a third gesture can directly open a function of the application or a particular page of the application. This brings convenience to the user and improves the user experience of the mobile terminal.
Drawings
Fig. 1 is a schematic hardware configuration diagram of an alternative mobile terminal implementing various embodiments of the present invention;
FIG. 2 is a diagram of a wireless communication system for the mobile terminal shown in FIG. 1;
FIG. 3 is a block diagram of a system for controlling application functions based on gestures according to the present invention;
FIG. 4 is a schematic diagram of first and second gesture tracks in the application area of the system for controlling application functions based on gestures according to the present invention;
FIG. 5 is a schematic diagram of a third gesture trajectory of a preset touch area of the system for controlling application functions based on gestures according to the present invention;
FIG. 6 is a flow chart illustrating a method for controlling application functions based on gestures according to the present invention;
FIG. 7 is a flowchart illustrating a first embodiment of a method for controlling application functions based on gestures according to the present invention;
FIG. 8 is a flowchart illustrating a second embodiment of a method for controlling application functions based on gestures according to the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
A mobile terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for facilitating the explanation of the present invention and have no specific meaning in themselves. Thus, "module" and "component" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a navigation device, and the like, as well as stationary terminals such as a digital TV, a desktop computer, and the like. In the following, it is assumed that the terminal is a mobile terminal. However, it will be understood by those skilled in the art that the configuration according to the embodiments of the present invention can also be applied to fixed-type terminals, except for any elements configured specifically for mobile purposes.
Fig. 1 is a schematic diagram of a hardware structure of an alternative mobile terminal for implementing various embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an a/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190, etc. Fig. 1 illustrates a mobile terminal having various components, but it is to be understood that not all illustrated components are required to be implemented. More or fewer components may alternatively be implemented. Elements of the mobile terminal will be described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information, or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like. Also, the broadcast signal may further include a data broadcast signal combined with a TV or radio broadcast signal. The broadcast associated information may also be provided via a mobile communication network, in which case the broadcast associated information may be received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of Digital Video Broadcasting-Handheld (DVB-H), and the like. The broadcast receiving module 111 may receive a signal broadcast by using various types of broadcasting systems. In particular, the broadcast receiving module 111 may receive digital broadcasting by using a digital broadcasting system such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), Forward Link Only (MediaFLO), Integrated Services Digital Broadcasting-Terrestrial (ISDB-T), and the like. The broadcast receiving module 111 may be constructed to be suitable for various broadcasting systems that provide broadcast signals in addition to the above-mentioned digital broadcasting systems.
The broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
The mobile communication module 112 transmits and/or receives radio signals to and/or from at least one of a base station (e.g., access point, node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module 113 supports wireless internet access of the mobile terminal. The module may be internally or externally coupled to the terminal. The wireless internet access technologies to which the module relates may include WLAN (Wireless LAN, Wi-Fi), WiBro (Wireless Broadband), WiMAX (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee™, and the like.
The location information module 115 is a module for checking or acquiring location information of the mobile terminal. A typical example of the location information module is a GPS (global positioning system). According to the current technology, the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current location information according to longitude, latitude, and altitude. Currently, a method for calculating position and time information uses three satellites and corrects an error of the calculated position and time information by using another satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating current position information in real time.
The A/V input unit 120 is used to receive an audio or video signal. The A/V input unit 120 may include a camera unit 121 and a microphone 122. The camera unit 121 processes image data of still pictures or videos obtained by an image capturing apparatus in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 151. The image frames processed by the camera unit 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras may be provided according to the configuration of the mobile terminal. The microphone 122 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. In the case of a phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the mobile communication module 112 for output. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to a command input by a user to control various operations of the mobile terminal. The user input unit 130 allows a user to input various types of information, and may include a keyboard, dome sheet, touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, and the like due to being touched), scroll wheel, joystick, and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen may be formed.
The sensing unit 140 detects a current state of the mobile terminal 100 (e.g., an open or closed state of the mobile terminal 100), a position of the mobile terminal 100, presence or absence of contact (i.e., touch input) by a user with the mobile terminal 100, an orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates a command or signal for controlling an operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 may sense whether the slide-type phone is opened or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a fingerprint sensor 141 as will be described below in connection with a touch screen.
The interface unit 170 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification module may store various information for authenticating a user using the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, a device having an identification module (hereinafter, referred to as an "identification device") may take the form of a smart card, and thus, the identification device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may be used as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or an image and related functions, and the like.
Meanwhile, when the display unit 151 and the touch pad are overlapped with each other in the form of a layer to form a touch screen, the display unit 151 may serve as an input device and an output device. The display unit 151 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light emitting diode) display or the like. Depending on the particular desired implementation, the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect a touch input pressure as well as a touch input position and a touch input area.
The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output as sound when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.
The alarm unit 153 may provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm unit 153 may provide output in different ways to notify the occurrence of an event. For example, the alarm unit 153 may provide an output in the form of vibration: when a call, a message, or some other incoming communication is received, the alarm unit 153 may provide a tactile output (i.e., vibration) to inform the user thereof. By providing such a tactile output, the user can recognize the occurrence of various events even when the mobile phone is in the user's pocket. The alarm unit 153 may also provide an output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. Also, the mobile terminal 100 may cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communications, video calls, and the like. In addition, the controller 180 may include a multimedia module 1810 for reproducing (or playing back) multimedia data, and the multimedia module 1810 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
The power supply unit 190 receives external power or internal power and provides appropriate power required to operate various elements and components under the control of the controller 180.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180. For a software implementation, implementations such as procedures or functions may be implemented with separate software modules, each of which performs at least one function or operation. The software codes may be implemented by software applications (or programs) written in any suitable programming language, which may be stored in the memory 160 and executed by the controller 180.
Up to this point, mobile terminals have been described in terms of their functionality. Hereinafter, a slide-type mobile terminal among various types of mobile terminals, such as folder-type, bar-type, swing-type, and slide-type mobile terminals, will be described as an example for the sake of brevity. However, the present invention is not limited to a slide-type mobile terminal and can be applied to any type of mobile terminal.
The mobile terminal 100 as shown in fig. 1 may be configured to operate with communication systems such as wired and wireless communication systems and satellite-based communication systems that transmit data via frames or packets.
A communication system in which a mobile terminal according to the present invention is operable will now be described with reference to fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interface used by the communication system includes, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), and Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), global system for mobile communications (GSM), and the like. By way of non-limiting example, the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
Referring to fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of Base Stations (BSs) 270, Base Station Controllers (BSCs) 275, and a Mobile Switching Center (MSC) 280. The MSC 280 is configured to interface with a Public Switched Telephone Network (PSTN) 290. The MSC 280 is also configured to interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul lines. The backhaul lines may be constructed according to any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be understood that a system as shown in fig. 2 may include a plurality of BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector being covered by an omni-directional antenna or an antenna pointed in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be configured to support multiple frequency allocations, with each frequency allocation having a particular frequency spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency allocation may be referred to as a CDMA channel. The BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or by other equivalent terminology. In such a case, the term "base station" may be used to refer generically to a single BSC 275 and at least one BS 270. The base stations may also be referred to as "cells". Alternatively, each sector of a particular BS 270 may be referred to as a plurality of cell sites.
As shown in fig. 2, a Broadcast Transmitter (BT)295 transmits a broadcast signal to the mobile terminal 100 operating within the system. A broadcast receiving module 111 as shown in fig. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295. In fig. 2, several Global Positioning System (GPS) satellites 300 are shown. The satellite 300 assists in locating at least one of the plurality of mobile terminals 100.
In fig. 2, a plurality of satellites 300 are depicted, but it is understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 as shown in fig. 1 is typically configured to cooperate with the satellites 300 to obtain the desired positioning information. Other techniques that can track the location of the mobile terminal may be used instead of or in addition to GPS tracking techniques. In addition, at least one GPS satellite 300 may selectively or additionally handle satellite DMB transmissions.
As a typical operation of the wireless communication system, the BS270 receives reverse link signals from various mobile terminals 100. The mobile terminal 100 is generally engaged in conversations, messaging, and other types of communications. Each reverse link signal received by a particular base station 270 is processed within the particular BS 270. The obtained data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions including coordination of soft handoff procedures between BSs 270. The BSCs 275 also route the received data to the MSC280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN290 interfaces with the MSC280, the MSC interfaces with the BSCs 275, and the BSCs 275 accordingly control the BS270 to transmit forward link signals to the mobile terminal 100.
Based on the above mobile terminal hardware structure and communication system, various embodiments of the system of the present invention are proposed.
First, the present invention provides a system 400 for controlling application functions based on gestures.
FIG. 3 is a schematic diagram of the modules for controlling application functions based on gestures according to the present invention. In this embodiment, the system 400 may be divided into one or more modules, which are stored in the memory 160 and executed by one or more controllers (the controller 180 in this embodiment) to accomplish the present invention. The system operates in the mobile terminal 100. For example, in fig. 3, the system 400 may be divided into a gesture list creation module 410, an acquisition module 420, a display module 430, a first detection module 440, a second detection module 450, a switching module 460, and an adjustment module 470. The functions of the modules 410-470 are described in detail below.
The gesture list creating module 410 is configured to create a first gesture track and instruction correspondence list and a second gesture track and instruction correspondence list, and store them in the memory 160.
Specifically, in this embodiment, the first gesture track and instruction correspondence list specifies that the instruction corresponding to a leftward slide is to switch to the previous menu item of the application setting, and the instruction corresponding to a rightward slide is to switch to the next menu item of the application setting. In other embodiments, the list may instead specify that an upward slide switches to the next menu item and a downward slide switches to the previous menu item; the user can preset which switching instruction corresponds to which gesture track. Meanwhile, the second gesture track and instruction correspondence list specifies that the instruction corresponding to an upward slide is to adjust one parameter value of the menu item displayed in the area where the application is located, and the instruction corresponding to a downward slide is to adjust the other parameter value of that menu item. In this embodiment, if the menu item sets the application sound level, the upward-slide instruction increases the sound and the downward-slide instruction decreases it. In other embodiments, other gesture tracks may be used to adjust the sound parameter of the application.
It should be noted that a gesture track may take various forms: a simple slide up, down, left, or right, or a traced figure of various shapes, such as a "W".
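The two correspondence lists described above behave like simple lookup tables. The sketch below illustrates that idea; the gesture names and instruction strings are illustrative assumptions, not identifiers from the patent.

```python
# Hypothetical sketch of the first and second gesture-track/instruction
# correspondence lists as lookup tables (all names are illustrative).
FIRST_GESTURE_INSTRUCTIONS = {
    "slide_left": "switch_to_previous_menu_item",
    "slide_right": "switch_to_next_menu_item",
}

SECOND_GESTURE_INSTRUCTIONS = {
    "slide_up": "increase_parameter",    # e.g. raise the application sound
    "slide_down": "decrease_parameter",  # e.g. lower the application sound
}

def lookup_instruction(table, gesture):
    """Return the instruction mapped to a gesture track, or None if unmapped."""
    return table.get(gesture)
```

A user-preset mapping, as the embodiment allows, would simply replace the entries of these tables.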
The obtaining module 420 is configured to obtain a menu item under an application setting.
Specifically, each application of the mobile terminal 100 has a setting function, under which a number of options are presented; we refer to these as the menu items under the setting. The obtaining module 420 obtains the menu items under the setting of each application; the items are kept separate per application and stored in the memory 160.
The display module 430 is configured to display a menu item under the application setting in the area where the application is located.
Specifically, fig. 4 illustrates the first and second gesture tracks relative to the area where the application is located in the system for controlling an application function based on a gesture. As shown in the figure, the menu item displayed in the area 501 where the application is located is setting A or setting B. In other embodiments, the menu item under the setting of the application may be displayed as a default setting N.
The first detection module 440 is configured to detect a first gesture track in an area where the application is located.
Specifically, whether a gesture track exists in the area where the application is located is detected, and if the gesture track exists, it is determined that a first gesture track in the area where the application is located is detected.
The second detecting module 450 is configured to detect a second gesture track outside the area where the application is located after receiving the first gesture track inside the area where the application is located.
Specifically, with reference to fig. 4, after the first gesture track in the area 501 where the application is located is detected, whether a gesture track exists outside the area where the application is located is continuously detected, and if the gesture track exists, it is determined that the second gesture track is detected. In some embodiments, the first gesture track is connected to the second gesture track.
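The detection logic of the two modules above reduces to testing whether touch points fall inside the rectangular area where the application is located, and splitting a continuous trace where the finger leaves that area. The coordinate conventions and function names below are assumptions for illustration.

```python
# Minimal sketch of distinguishing the first gesture track (inside the
# application's area) from the second one (after the finger leaves it).
def in_area(point, area):
    """area = (left, top, right, bottom); point = (x, y)."""
    x, y = point
    left, top, right, bottom = area
    return left <= x <= right and top <= y <= bottom

def split_track(points, area):
    """Split one continuous touch trace into the first gesture track
    (points while still inside the area) and the second gesture track
    (all points once the finger has left the area)."""
    first, second = [], []
    left_area = False
    for p in points:
        if not left_area and in_area(p, area):
            first.append(p)
        else:
            left_area = True
            second.append(p)
    return first, second
```

When the two tracks are obtained in separate steps rather than as one connected trace, each touch sequence would instead be classified whole by where its points lie.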
The switching module 460 is configured to, when the first gesture track detected by the first detection module 440 in the area 501 where the application is located is received, switch the menu item displayed in the area where the application is located according to the first gesture track and instruction correspondence list.
Specifically, with reference to fig. 4, a first gesture track is detected in the area 501 where an application of the mobile terminal 100 is located. In this embodiment, the first gesture track is a leftward or rightward slide: when a rightward finger slide is detected, the display switches to the menu item "setting B" of the application setting, and when a leftward finger slide is detected, it switches to the menu item "setting A". In other embodiments, the menu items of the application setting may instead be switched by sliding up and down. In this embodiment, when a gesture track of the user is detected in the area 501 where the application is located, the first gesture track and instruction correspondence list is consulted automatically, so that the instruction corresponding to the gesture track in that list is executed.
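The switching behavior can be sketched as stepping through an index over the menu items. Wrap-around at the ends of the list is an assumption here; the patent does not specify what happens past the last item.

```python
# Illustrative sketch of the switching module: a rightward slide moves to
# the next menu item, a leftward slide to the previous one (wrap-around
# behavior assumed; item names taken from the fig. 4 example).
MENU_ITEMS = ["setting A", "setting B", "setting N"]

def switch_menu_item(current_index, gesture):
    """Return the index of the menu item to display after the gesture."""
    if gesture == "slide_right":
        return (current_index + 1) % len(MENU_ITEMS)
    if gesture == "slide_left":
        return (current_index - 1) % len(MENU_ITEMS)
    return current_index  # unrecognized gestures leave the display unchanged
```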
The adjusting module 470 is configured to, when receiving a second gesture track outside the area where the application is located and detected by the second detecting module 450, adjust a parameter value of a menu item displayed in the area where the application is located according to the second gesture track and the instruction corresponding relationship list.
Specifically, in this embodiment, with reference to fig. 4, when the first gesture track in the area 501 where the application is located slides to the right, the display in the area 501 is switched to "setting B" according to the first gesture track and instruction correspondence list. Suppose "setting B" is "whether to send a push message". After the second detection module 450 detects a second gesture track outside the area where the application is located, the adjusting module 470 adjusts the parameter of "setting B" according to the second gesture track and instruction correspondence list; for example, when the second gesture track slides upwards, "yes" is selected.
It should be noted that, in this embodiment, the first gesture track is connected to the second gesture track; that is, after the user slides rightward in the area 501 where the application is located to reach "setting B", the finger continues sliding upwards without leaving the touch screen to enable push messages. In other embodiments, the first and second gesture tracks may be disconnected and obtained in separate steps: after the first gesture track is detected in the area 501 where the application is located, any gesture track subsequently detected outside that area may be regarded as the second gesture track, and the parameter of the menu item currently displayed for the application to which the preceding first gesture track applied is adjusted accordingly.
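The adjusting step can be sketched as mapping the second gesture track onto a parameter value for the currently displayed menu item. The binary yes/no parameter follows the "whether to send a push message" example; the function and value names are assumptions.

```python
# Hypothetical sketch of the adjusting module for a binary menu item such
# as "whether to send a push message" (names are illustrative).
def adjust_parameter(menu_item, gesture):
    """Return (menu_item, selected_value) for the second gesture track."""
    if gesture == "slide_up":
        return (menu_item, "yes")  # e.g. enable push messages
    if gesture == "slide_down":
        return (menu_item, "no")   # e.g. disable push messages
    return (menu_item, None)       # gesture not in the correspondence list
```

For a graded setting such as the sound level, the same shape of function would instead increment or decrement a numeric value.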
It should be added that the gesture list creating module 410 is further configured to create a third gesture track and instruction correspondence list.
Specifically, in this embodiment, the third gesture track and instruction correspondence list specifies that the instruction corresponding to an upper-left slide is to open a certain page of the application, and the instruction corresponding to a lower-right slide is to open a certain function of the application. In other embodiments, the third gesture track and instruction correspondence list is created according to the usage ranking of the application's functions or pages; for example, the instruction corresponding to a downward slide opens the application's most-used function, the instruction corresponding to an upward slide opens its most-used page, and so on.
It should be noted that, when the application's "shortcut operation" setting is on, a gesture track detected in the predetermined touch area may be regarded as a third gesture track. In this embodiment, with reference to fig. 4, "setting B" is "shortcut operation", which is enabled when the second gesture track slides upward; at this time, the application can be operated quickly by sliding a third gesture track in the predetermined area. Fig. 5 is a schematic diagram of the third gesture track of the system for controlling an application function based on a gesture according to the present invention: the user slides a finger in the predetermined touch area 502, and the third detection module (not shown) detects the third gesture track in the predetermined touch area 502. In this embodiment, the application is WeChat, and the third gesture track and instruction correspondence list specifies that the instruction corresponding to an upper-right slide is to open the WeChat friend circle page, the instruction corresponding to a lower-left slide is to open the scanning function, the instruction corresponding to an upper-left slide is to open a certain public account, and the instruction corresponding to a lower-right slide is to open the chat page of friend A. In other embodiments, the instruction corresponding to an upper-right slide may open the WeChat page with the highest usage rate within a predetermined time, such as the friend circle page, or the page with the longest usage time within a predetermined time, such as a chat page with a friend, and so on.
It is to be added that the system further includes a control module (not shown). When a third gesture track in the preset touch area is received, the control module controls the application to open the corresponding function or page according to the third gesture track and instruction correspondence list. In this embodiment, the preset touch area is a blank area below the screen of the mobile terminal 100; in other embodiments, the preset touch area may be another area of the mobile terminal 100.
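The control module's dispatch can be sketched as a third lookup table gated by the "shortcut operation" flag, using the WeChat example above. Gesture names and action strings are illustrative assumptions.

```python
# Sketch of the control module's dispatch for the third gesture track
# (the WeChat mapping from the embodiment; names are illustrative).
THIRD_GESTURE_INSTRUCTIONS = {
    "slide_up_right": "open_friend_circle_page",
    "slide_down_left": "open_scan_function",
    "slide_up_left": "open_public_account",
    "slide_down_right": "open_chat_with_friend_A",
}

def control(gesture, shortcut_enabled):
    """Open the mapped function or page only while 'shortcut operation' is on."""
    if not shortcut_enabled:
        return None  # gestures in the touch area are ignored when disabled
    return THIRD_GESTURE_INSTRUCTIONS.get(gesture)
```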
Through the modules 410-470, the system for controlling an application function based on a gesture provided by the invention creates a first gesture track and instruction correspondence list and a second gesture track and instruction correspondence list, switches menu items with a first gesture track detected in the area where the application is located, and finally adjusts menu parameters with a second gesture track detected outside that area. In other words, the application is set through these operations, which brings convenience to the user and improves the user experience of the mobile terminal.
In addition, after the shortcut operation has been enabled through the above setting operations, a function or a certain page of the application can be opened directly through a third gesture, further enhancing the user experience of the mobile terminal.
Further, the invention also provides a method for controlling the application function based on the gesture.
Fig. 6 is a flowchart illustrating a method for controlling an application function based on a gesture according to the present invention. The order of execution of the steps in the flowchart shown in fig. 6 may be changed, some steps may be omitted, or other steps may be added, according to different needs.
The method for controlling the application function based on the gesture can be divided into the following steps:
step S610: and creating a first gesture track and instruction corresponding relation list and a second gesture track and instruction corresponding relation list.
Specifically, in this embodiment, the first gesture track and instruction correspondence list specifies that the instruction corresponding to a leftward slide is to switch to the previous menu item of the application setting, and the instruction corresponding to a rightward slide is to switch to the next menu item of the application setting. In other embodiments, the list may instead specify that an upward slide switches to the next menu item and a downward slide switches to the previous menu item; the user can preset which switching instruction corresponds to which gesture track. Meanwhile, the second gesture track and instruction correspondence list specifies that the instruction corresponding to an upward slide is to adjust one parameter value of the menu item displayed in the area where the application is located, and the instruction corresponding to a downward slide is to adjust the other parameter value of that menu item.
Step S620: a menu item under an application setting is obtained.
Specifically, each application of the mobile terminal 100 has a setting function, under which a number of options are presented; these are called the menu items under the setting. The menu items under the setting are obtained for each application, kept separate per application, and stored in the memory 160.
Step S630: and displaying the menu item under the application setting in the area where the application is located.
Specifically, fig. 4 illustrates the first and second gesture tracks relative to the area where the application is located in the system for controlling an application function based on a gesture. As shown in the figure, the menu item displayed in the area 501 where the application is located is setting A or setting B. In other embodiments, the menu item under the setting of the application may be displayed as a default setting N.
Step S640, detecting a first gesture track in the area where the application is located.
Specifically, whether a gesture track exists in the area where the application is located is detected, and if the gesture track exists, it is determined that a first gesture track in the area where the application is located is detected.
Step S650, after receiving the first gesture track in the area where the application is located, detects a second gesture track outside the area where the application is located.
Specifically, with reference to fig. 4, after the first gesture track in the area 501 where the application is located is detected, whether a gesture track exists outside the area where the application is located is continuously detected, and if the gesture track exists, it is determined that the second gesture track is detected. In some embodiments, the first gesture track is connected to the second gesture track.
Step S660, when the first gesture track in the area where the application is located is received, switching the menu item displayed in the area where the application is located according to the first gesture track and the instruction correspondence list.
Specifically, a first gesture track is detected in an area where an application of the mobile terminal 100 is located, an instruction corresponding to the first gesture track in the first gesture track and instruction correspondence list is queried, and a menu item is switched according to the queried instruction.
Step S670, when a second gesture track outside the area where the application is located is received after the first gesture track in the area where the application is located, adjusting a parameter value of the menu item displayed in the area where the application is located according to the second gesture track and instruction correspondence list.
Specifically, the setting menu item of the application is switched through step S660; a second gesture track is then detected outside the area where the application is located, and an instruction is obtained according to the second gesture track and instruction correspondence list to adjust the parameter value of the menu item currently displayed in the area where the application is located.
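Steps S610-S670 can be condensed into a minimal end-to-end sketch: the first gesture track switches the displayed menu item, then the second gesture track selects its parameter value. Gesture names, wrap-around behavior, and the yes/no values are illustrative assumptions.

```python
# End-to-end sketch of the method's steps S660-S670 over a list of menu
# items created per steps S610-S630 (all names are illustrative).
def handle_gestures(menu_items, first_gesture, second_gesture):
    index = 0  # the menu item displayed by default
    # S660: switch the displayed menu item per the first gesture track
    if first_gesture == "slide_right":
        index = (index + 1) % len(menu_items)
    elif first_gesture == "slide_left":
        index = (index - 1) % len(menu_items)
    # S670: adjust the displayed item's parameter per the second gesture track
    value = {"slide_up": "yes", "slide_down": "no"}.get(second_gesture)
    return menu_items[index], value
```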
Based on the above flow, the process steps are further illustrated below through two specific embodiments.
The first embodiment is as follows:
fig. 7 is a flowchart illustrating an embodiment of a method for controlling an application function based on a gesture according to a first embodiment of the present invention. In this embodiment, the execution order of the steps in the flowchart shown in fig. 7 may be changed and some steps may be omitted according to different requirements.
Step S710, a first gesture track and instruction correspondence list and a second gesture track and instruction correspondence list are created and stored in the memory 160, and execution continues with step S720.
Specifically, in this embodiment, the first gesture track and instruction correspondence list specifies that the instruction corresponding to a rightward slide is to switch to the next menu item. In other embodiments, the list may specify that the instruction corresponding to an upward slide is to switch to the next menu item. Meanwhile, the second gesture track and instruction correspondence list specifies that the instruction corresponding to an upward slide is to adjust one parameter value of the menu item displayed in the area where the application is located, and the instruction corresponding to a downward slide is to adjust the other parameter value of that menu item.
In step S720, a rightward-sliding gesture track is detected in the area where the application is located.
It should be noted that the first gesture track is a gesture track in the area where the application is located. Therefore, step S720 detects the rightward-sliding gesture track in that area and regards it as the first gesture track. Execution continues with step S730.
And step S730, obtaining, according to the first gesture track and instruction correspondence list, that the rightward-slide instruction is to switch to the next menu item of the application setting.
Specifically, as shown in fig. 4, the menu item displayed by default in the area 501 where the application is located is "setting A". According to the first gesture track and instruction correspondence list, the instruction for the first gesture's rightward slide is to switch to the next menu item of the application setting; in this embodiment, sliding rightward switches to the menu item "setting B". In other embodiments, step S730 may instead switch to the menu item "setting B" by sliding leftward. Execution continues with step S740.
Step S740, determining whether the menu item is a menu item that the user wants to set, if not, continuing to execute step S730, and if so, executing step S750.
Specifically, with continued reference to FIG. 4, if the "settings B" are not the menu item that the user wants to set, the user may continue to slide right within the area 501 of the application until the menu item to be set is found.
Step S750, detecting a gesture track sliding upward outside the area where the application is located.
Specifically, with continued reference to fig. 4, in this embodiment, it is detected that the second gesture track is a sliding-up track, that is, an upward sliding track is detected on the right side of the area where the application is located. The process continues to step S760.
Step S760, obtaining, according to the second gesture track and instruction correspondence list, that the upward-slide instruction is to adjust a parameter value of the menu item under the application setting.
Specifically, with reference to fig. 4, according to the second gesture track and instruction correspondence list, the instruction for the second gesture's upward slide is to adjust a parameter value of "setting B". In this embodiment, if "setting B" is whether to send application push messages, the upward-slide instruction selects "yes". In other embodiments, "setting B" may be another menu item, and the upward slide may select a parameter value of that menu item.
Through steps S710 to S760, the first embodiment of the method for controlling an application function based on a gesture provided by the present invention creates the first gesture track and instruction correspondence list and the second gesture track and instruction correspondence list, and finally adjusts the menu parameter with the second gesture track detected outside the area where the application is located. In other words, the application is set through these operations, which brings convenience to the user and improves the user experience of the mobile terminal.
Example two:
fig. 8 is a flowchart illustrating an embodiment of a method for controlling an application function based on a gesture according to a second embodiment of the present invention. In this embodiment, the execution order of the steps in the flowchart shown in fig. 8 may be changed and some steps may be omitted according to different requirements.
Step S810 is to create a first gesture track and instruction corresponding relation list, a second gesture track and instruction corresponding relation list, and a third gesture track and instruction corresponding list.
Specifically, in this embodiment, the first gesture track and instruction correspondence list specifies that the instruction corresponding to a rightward slide is to switch to the next menu item. In other embodiments, the list may specify that the instruction corresponding to an upward slide is to switch to the next menu item. Meanwhile, the second gesture track and instruction correspondence list specifies that the instruction corresponding to an upward slide is to adjust one parameter value of the menu item displayed in the area where the application is located, and the instruction corresponding to a downward slide is to adjust the other parameter value of that menu item. Meanwhile, the third gesture track and instruction correspondence list specifies that the instruction corresponding to an upper-right slide is to open a certain page of the application, and the instruction corresponding to an upper-left slide is to open a certain function of the application. Execution continues with step S820.
Step S820, sliding a gesture in the area where the application is located, according to the first gesture track and instruction correspondence list, to find the "shortcut operation" menu item.
Specifically, referring to fig. 4, in this embodiment, the "setting B" is a "shortcut operation" setting menu item that the user needs to find, and the step S830 is continuously performed.
And step S830, enabling the "shortcut operation" through a sliding gesture outside the area where the application is located, according to the second gesture track and instruction correspondence list.
Specifically, with reference to fig. 4, in the second gesture track and instruction correspondence list the instruction corresponding to an upward slide opens the shortcut operation and the instruction corresponding to a downward slide closes it. When the shortcut operation needs to be opened, the user slides an upward gesture track outside the area where the application is located. In this embodiment, the first gesture track is connected to the second gesture track, and the area outside the area where the application is located is to its right, which conveniently joins the rightward-sliding first gesture track. Of course, in other embodiments, the first and second gesture tracks may be disconnected and obtained in separate steps: after the first gesture track is detected in the area 501 where the application is located, any gesture track subsequently detected outside that area may be regarded as the second gesture track, and the parameter of the menu item currently displayed for the application to which the preceding first gesture track applied is adjusted accordingly. Execution continues with step S840.
Step S840: according to the third gesture track and instruction correspondence list, swipe in the predetermined touch area to open a function or a certain page of the application.
Specifically, fig. 5 is a schematic diagram of the third gesture track of the system for controlling an application function based on a gesture according to the present invention. The user swipes a finger in the predetermined touch area 502. In this embodiment the application is WeChat, and the third gesture track and instruction correspondence list specifies that a swipe toward the upper right opens the WeChat Moments page, a swipe toward the lower left opens the Scan function, a swipe toward the upper left opens a certain official account, and a swipe toward the lower right opens the chat page of friend A. In other embodiments, the instruction corresponding to a swipe toward the upper right may open the WeChat page with the highest usage rate within a predetermined time, such as the Moments page, or the page used for the longest time within a predetermined time, such as a chat page with a friend. Thus, when a third gesture track is received in the predetermined touch area, the application is controlled to open the corresponding function or page according to the third gesture track and instruction correspondence list.
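The third correspondence list of this embodiment can be sketched the same way; the action strings paraphrase the WeChat pages named in the text, and the fallback behavior for unmapped gestures is an assumption:

```python
# Illustrative third gesture track / instruction correspondence list for the
# WeChat embodiment: diagonal swipes in the predetermined touch area map to
# concrete pages or functions.

THIRD_LIST = {
    "swipe_up_right": "open Moments page",
    "swipe_down_left": "open Scan function",
    "swipe_up_left": "open official account",
    "swipe_down_right": "open chat with friend A",
}

def handle_third_track(track):
    """Dispatch a third gesture track; unmapped tracks are ignored."""
    action = THIRD_LIST.get(track)
    return action if action is not None else "ignored"
```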
It should be added that the third gesture track and instruction correspondence list may be created according to the usage ranking of the application's functions or pages; for example, a downward swipe corresponds to the application's most-used function, and an upward swipe to its most-used page.
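Building the list from a usage ranking can be sketched as pairing a fixed order of gestures with functions or pages sorted by descending usage count; the gesture ordering and the usage-log format are assumptions for illustration:

```python
# Hedged sketch: derive the third correspondence list from usage ranking,
# so the most convenient gesture maps to the most-used function or page.

from collections import Counter

def build_third_list(usage_log, gestures=("swipe_down", "swipe_up",
                                          "swipe_up_right", "swipe_up_left")):
    """Pair gestures with functions/pages ordered by descending usage count."""
    ranked = [name for name, _ in Counter(usage_log).most_common()]
    return dict(zip(gestures, ranked))
```

With a log such as `["moments", "moments", "scan", "moments", "scan", "pay"]`, the most-used entry ("moments") is assigned to the first gesture in the tuple, and any surplus gestures are simply left unmapped.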
In addition, once the application's shortcut operation is enabled, the preceding gesture tracks are no longer needed to operate the application quickly. The application here is the application currently in use on the mobile terminal.
Through steps S810 to S840, the second embodiment of the method for controlling an application function based on a gesture according to the present invention directly opens a function or a certain page of the application through the third gesture track once the "shortcut operation" setting has been enabled, further improving the user experience of the mobile terminal.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (8)

1. A system for controlling application functions based on gestures, the system being run on a mobile terminal, the system comprising:
the gesture list creating module is used for creating a first gesture track and instruction corresponding relation list and a second gesture track and instruction corresponding relation list;
the acquisition module is used for acquiring a menu item under application setting;
the display module is used for displaying a menu item under the application setting in the area where the application is located;
the first detection module is used for detecting a first gesture track in the area where the application is located;
the second detection module is used for detecting a second gesture track outside the area where the application is located after receiving the first gesture track inside the area where the application is located;
the switching module is used for switching menu items displayed in the area where the application is located according to the first gesture track and the instruction corresponding relation list when the first gesture track detected by the first detection module in the area where the application is located is received;
the adjusting module is used for adjusting the parameter values of the menu items displayed in the area where the application is located according to the corresponding relation list of the second gesture track and the instruction when the second gesture track outside the area where the application is located and detected by the second detecting module is received;
the first gesture track and instruction correspondence list comprises an instruction corresponding to a leftward swipe to switch to the previous menu item of the application setting, and an instruction corresponding to a rightward swipe to switch to the next menu item of the application setting; the second gesture track and instruction correspondence list comprises an instruction corresponding to an upward swipe to adjust one parameter value of the menu item displayed in the area where the application is located, and an instruction corresponding to a downward swipe to adjust another parameter value of that menu item.
2. The system for controlling an application function based on a gesture of claim 1, wherein the second gesture track is connected to the first gesture track, and the first gesture track and the second gesture track each comprise a leftward swipe, a rightward swipe, an upward swipe, or a downward swipe.
3. The system for controlling application functions based on gestures of claim 1, wherein the gesture list creation module is further configured to create a third gesture track and instruction correspondence list; the menu item comprises a shortcut operation setting; the parameter values of the shortcut operation comprise on and off; and the system further comprises:
the third detection module is used for detecting the shortcut operation starting of the application and detecting a third gesture track in a preset touch area;
and the control module is used for controlling the application to start a corresponding function or page according to the third gesture track and the instruction corresponding list when receiving the third gesture track of a preset touch area.
4. The system for controlling application functions based on gestures of claim 3, wherein the gesture list creation module is further configured to create the third gesture track and instruction correspondence list according to the application function or page usage ranking.
5. A method for controlling application functions based on gestures is applied to a mobile terminal, and is characterized in that the method comprises the following steps:
creating a first gesture track and instruction corresponding relation list and a second gesture track and instruction corresponding relation list;
acquiring a menu item under application setting;
displaying menu items under the application setting in the area where the application is located;
detecting a first gesture track within an area in which the application is located;
after receiving a first gesture track in an area where the application is located, detecting a second gesture track outside the area where the application is located;
when a first gesture track in the area where the application is located is received, switching menu items displayed in the area where the application is located according to the first gesture track and an instruction corresponding relation list;
when a first gesture track in the area where the application is located is received and a second gesture track outside the area where the application is located is received, adjusting parameter values of menu items displayed in the area where the application is located according to the second gesture track and an instruction corresponding relation list;
the first gesture track and instruction correspondence list comprises an instruction corresponding to a leftward swipe to switch to the previous menu item of the application setting, and an instruction corresponding to a rightward swipe to switch to the next menu item of the application setting; the second gesture track and instruction correspondence list comprises an instruction corresponding to an upward swipe to adjust one parameter value of the menu item displayed in the area where the application is located, and an instruction corresponding to a downward swipe to adjust another parameter value of that menu item.
6. The method for controlling an application function based on a gesture of claim 5, wherein the second gesture track is connected with the first gesture track, and the first gesture track and the second gesture track each comprise a leftward swipe, a rightward swipe, an upward swipe, or a downward swipe.
7. The method for controlling an application function based on a gesture according to claim 5, wherein the menu item comprises a shortcut operation setting; the parameter values of the shortcut operation comprise on and off; and the method further comprises:
creating a third gesture track and instruction corresponding relation list;
detecting the shortcut operation opening of the application and detecting a third gesture track in a preset touch area;
and when a third gesture track of a preset touch area is received, controlling the application to start a corresponding function or page according to the third gesture track and the instruction corresponding list.
8. The method of gesture-based control of application functions of claim 7, further comprising creating the third gesture track and instruction correspondence list according to the application function or page usage ranking.
CN201710003117.9A 2017-01-03 2017-01-03 System and method for controlling application function based on gesture Active CN106873887B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710003117.9A CN106873887B (en) 2017-01-03 2017-01-03 System and method for controlling application function based on gesture


Publications (2)

Publication Number Publication Date
CN106873887A CN106873887A (en) 2017-06-20
CN106873887B true CN106873887B (en) 2021-05-25





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant