CN106254597A - Grip recognition system based on a proximity sensor
- Publication number
- CN106254597A CN106254597A CN201610872700.9A CN201610872700A CN106254597A CN 106254597 A CN106254597 A CN 106254597A CN 201610872700 A CN201610872700 A CN 201610872700A CN 106254597 A CN106254597 A CN 106254597A
- Authority
- CN
- China
- Prior art keywords
- panel
- recognition
- mobile terminal
- recognition panel
- identification
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/0202—Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
- H04M1/0279—Improving the user comfort or ergonomics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72454—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
Abstract
The invention discloses a grip recognition system based on a proximity sensor, and relates to the technical field of mobile terminals. The system includes a proximity sensor on which a recognition panel is arranged, the recognition panel being disposed in a preset holding area of a mobile terminal. When the mobile terminal receives a holding operation from a user, it judges whether the capacitance value collected by the recognition panel meets a preset trigger condition, and if so, triggers the corresponding application. The invention recognizes the user's grip operation through the proximity sensor and completes the corresponding function, improving the user's interactive experience with the terminal.
Description
Technical Field
The invention relates to the technical field of mobile terminals, in particular to a holding identification system based on a proximity sensor.
Background
Gesture recognition is an emerging user-interface mode. It is often used in applications such as building and industrial control panels, where a user can interact with a device simply by moving a hand or making gestures. It is particularly valuable in scenarios where a touch-screen interface is difficult to use, such as in humid environments, when the user is wearing gloves, or when the user cannot easily touch the control panel.
Traditional gesture recognition comes in many modes and types. Vision-based gesture recognition, for example, was developed early and is relatively mature, but it places strict requirements on equipment and environment, which considerably limits its use.
Disclosure of Invention
The invention mainly aims to provide a grip recognition system based on a proximity sensor, which can recognize a user's holding gesture operation through the proximity sensor, complete the corresponding function, and improve the user's interactive experience with the terminal.
In order to achieve the above object, the present invention provides a grip recognition system based on a proximity sensor, including a proximity sensor, wherein an identification panel is arranged on the proximity sensor and is disposed in a preset holding area on a mobile terminal; when the mobile terminal receives a holding operation of a user, whether the capacitance value acquired by the identification panel meets a preset trigger condition is judged, and if so, a corresponding application is triggered.
Optionally, the holding area comprises a left side, a right side and a back of the mobile terminal.
Optionally, the recognition panel includes a left recognition panel and a right recognition panel, the left recognition panel is disposed on the left side of the mobile terminal, the right recognition panel is disposed on the right side of the mobile terminal, the left recognition panel is provided with a left trigger capacitance threshold, and the right recognition panel is provided with a right trigger capacitance threshold.
Optionally, the determining whether the capacitance value acquired by the identification panel meets a preset trigger condition includes: if the capacitance values acquired by the left side identification panel and the right side identification panel are both larger than the corresponding trigger capacitance thresholds, the holding operation is recognized as a preset holding gesture.
Optionally, the identification panel further includes a back identification panel, the back identification panel includes a first back identification panel and a second back identification panel, the first back identification panel is disposed on the back of the mobile terminal close to the bottom, the second back identification panel is disposed on the back of the mobile terminal close to the top, a first back trigger capacitance threshold is disposed on the first back identification panel, and a second back trigger capacitance threshold is disposed on the second back identification panel.
Optionally, the determining whether the capacitance value acquired by the identification panel meets a preset trigger condition includes: if the capacitance values collected by the left side identification panel, the right side identification panel and the first back identification panel are all larger than the corresponding trigger capacitance thresholds, and the capacitance value collected by the second back identification panel is smaller than the second back trigger capacitance threshold, the holding operation is recognized as a preset holding gesture.
Optionally, the identification panel includes a left side identification panel and a right side identification panel, the left side identification panel is disposed on the back of the mobile terminal close to the left side, and the right side identification panel is disposed on the back of the mobile terminal close to the right side.
Optionally, the left side identification panel is disposed on a left side and/or a back of the mobile terminal near the bottom, and the right side identification panel is disposed on a right side and/or a back of the mobile terminal near the bottom.
Optionally, the left side identification panel is composed of a plurality of identification panel arrays, and the right side identification panel is composed of a plurality of identification panel arrays.
Optionally, the left side identification panel further includes a left holding identification panel, and the left holding identification panel is disposed on a left side surface of the mobile terminal near the top; the right side identification panel further comprises a right hand holding identification panel, and the right hand holding identification panel is arranged on the right side face, close to the top, of the mobile terminal.
The invention provides a holding identification system based on a proximity sensor, which comprises a proximity sensor, wherein an identification panel is arranged on the proximity sensor and disposed in a preset holding area on a mobile terminal; when the mobile terminal receives a holding operation of a user, whether the capacitance value acquired by the identification panel meets a preset trigger condition is judged, and if so, a corresponding application is triggered.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of a mobile terminal implementing various embodiments of the present invention;
Fig. 2 is a diagram of a wireless communication system for the mobile terminal shown in Fig. 1;
Fig. 3 is a block diagram illustrating an exemplary structure of a proximity sensor based grip recognition system according to an embodiment of the present invention;
Fig. 4 is a schematic diagram illustrating a holding gesture according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of a capacitive proximity sensor according to an embodiment of the present invention;
Fig. 6 is a block diagram illustrating an exemplary structure of a proximity sensor based grip recognition system according to a second embodiment of the present invention;
Fig. 7 is a schematic diagram of a holding gesture according to a second embodiment of the present invention;
Fig. 8 is a block diagram illustrating an exemplary structure of a proximity sensor based grip recognition system according to a third embodiment of the present invention;
Fig. 9 is a schematic diagram of a holding gesture according to a third embodiment of the present invention;
Fig. 10 is a block diagram illustrating an exemplary structure of a proximity sensor based grip recognition system according to a fourth embodiment of the present invention;
Fig. 11 is a schematic diagram illustrating a holding gesture according to a fourth embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
A mobile terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for facilitating the explanation of the present invention, and have no specific meaning in themselves. Thus, "module" and "component" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the terminal described in the present invention may include a mobile terminal such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a navigation device, and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. In the following, it is assumed that the terminal is a mobile terminal. However, it will be understood by those skilled in the art that the configuration according to the embodiments of the present invention can also be applied to fixed-type terminals, except for any elements particularly configured for mobile purposes.
Fig. 1 is a schematic hardware configuration of a mobile terminal implementing various embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an a/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190, etc. Fig. 1 illustrates a mobile terminal having various components, but it is to be understood that not all illustrated components are required to be implemented. More or fewer components may alternatively be implemented. Elements of the mobile terminal will be described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like. Also, the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast associated information may also be provided via a mobile communication network, and in this case, the broadcast associated information may be received by the mobile communication module 112. The broadcast signal may exist in various forms, for example, it may exist in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of digital video broadcasting-handheld (DVB-H), and the like. The broadcast receiving module 111 may receive a signal broadcast by using various types of broadcasting systems. In particular, the broadcast receiving module 111 may receive digital broadcasting by using a digital broadcasting system such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), forward link media (MediaFLO), integrated services digital broadcasting-terrestrial (ISDB-T), and the like. The broadcast receiving module 111 may be constructed to be suitable for various broadcasting systems that provide broadcast signals as well as the above-mentioned digital broadcasting systems. The broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
The mobile communication module 112 transmits and/or receives radio signals to and/or from at least one of a base station (e.g., access point, node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module 113 supports wireless internet access of the mobile terminal. The module may be internally or externally coupled to the terminal. The wireless internet access technology to which the module relates may include WLAN (wireless LAN) (Wi-Fi), Wibro (wireless broadband), Wimax (worldwide interoperability for microwave access), HSDPA (high speed downlink packet access), and the like.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include bluetooth (TM), Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), zigbee (TM), and the like.
The location information module 115 is a module for checking or acquiring location information of the mobile terminal. A typical example of the location information module is a GPS (global positioning system). According to the current technology, the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current location information according to longitude, latitude, and altitude. Currently, a method for calculating position and time information uses three satellites and corrects an error of the calculated position and time information by using another satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating current position information in real time.
The A/V input unit 120 is used to receive an audio or video signal. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capturing apparatus in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display module 151. The image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the construction of the mobile terminal. The microphone 122 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. In a phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the mobile communication module 112 and output. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to a command input by a user to control various operations of the mobile terminal. The user input unit 130 allows a user to input various types of information, and may include a keyboard, dome sheet, touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, and the like due to being touched), scroll wheel, joystick, and the like. In particular, when the touch pad is superimposed on the display module 151 in the form of a layer, a touch screen may be formed.
The sensing unit 140 detects a current state of the mobile terminal 100 (e.g., an open or closed state of the mobile terminal 100), a position of the mobile terminal 100, presence or absence of contact (i.e., touch input) by a user with the mobile terminal 100, an orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates a command or signal for controlling an operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 may sense whether the slide-type phone is opened or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 1410 as will be described below in connection with a touch screen.
The interface unit 170 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification module may store various information for authenticating a user using the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, a device having an identification module (hereinafter, referred to as an "identification device") may take the form of a smart card, and thus, the identification device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may be used as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner. The output unit 150 may include a display module 151, an audio output module 152, an alarm module 153, and the like.
The display module 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display module 151 may display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display module 151 may display a captured image and/or a received image, a UI or GUI showing a video or an image and related functions, and the like.
Meanwhile, when the display module 151 and the touch pad are stacked on each other in the form of layers to form a touch screen, the display module 151 may serve as an input device and an output device. The display module 151 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light emitting diode) display or the like. The mobile terminal 100 may include two or more display modules (or other display devices) according to a particular desired implementation, for example, the mobile terminal may include an external display module (not shown) and an internal display module (not shown). The touch screen may be used to detect a touch input pressure as well as a touch input position and a touch input area.
The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output as sound when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.
The alarm module 153 may provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm module 153 may provide output in different ways to notify the occurrence of an event. For example, the alarm module 153 may provide an output in the form of a vibration, and when a call, a message, or some other incoming communication is received, the alarm module 153 may provide a tactile output (i.e., a vibration) to inform the user thereof. By providing such a tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm module 153 may also provide an output notifying the occurrence of an event via the display module 151 or the audio output module 152.
The memory 160 may store software programs and the like for processing and controlling operations performed by the controller 180, or may temporarily store data (e.g., a phonebook, messages, still images, videos, and the like) that has been or will be output. Also, the memory 160 may store data regarding various ways of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. Also, the mobile terminal 100 may cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communications, video calls, and the like. In addition, the controller 180 may include a multimedia module 1810 for reproducing (or playing back) multimedia data, and the multimedia module 1810 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
The power supply unit 190 receives external power or internal power and provides appropriate power required to operate various elements and components under the control of the controller 180.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, an electronic unit designed to perform the functions described herein, and in some cases, such embodiments may be implemented in the controller 180. For a software implementation, the implementation such as a process or a function may be implemented with a separate software module that allows performing at least one function or operation. The software codes may be implemented by software applications (or programs) written in any suitable programming language, which may be stored in the memory 160 and executed by the controller 180.
Up to this point, mobile terminals have been described in terms of their functionality. Hereinafter, a slide-type mobile terminal among various types of mobile terminals, such as a folder-type, bar-type, swing-type, slide-type mobile terminal, and the like, will be described as an example for the sake of brevity. However, the present invention can be applied to any type of mobile terminal and is not limited to a slide-type mobile terminal.
The mobile terminal 100 as shown in fig. 1 may be configured to operate with communication systems such as wired and wireless communication systems and satellite-based communication systems that transmit data via frames or packets.
A communication system in which a mobile terminal according to the present invention is operable will now be described with reference to fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interface used by the communication system includes, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), and Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), global system for mobile communications (GSM), and the like. By way of non-limiting example, the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
Referring to fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of Base Stations (BSs) 270, Base Station Controllers (BSCs) 275, and a Mobile Switching Center (MSC) 280. The MSC 280 is configured to interface with a Public Switched Telephone Network (PSTN) 290. The MSC 280 is also configured to interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul lines. The backhaul lines may be constructed according to any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be understood that a system as shown in fig. 2 may include multiple BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector being covered by an omnidirectional antenna or an antenna pointed in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be configured to support multiple frequency allocations, with each frequency allocation occupying a particular spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency allocation may be referred to as a CDMA channel. The BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or other equivalent terminology. In such a case, the term "base station" may be used to generically refer to a single BSC 275 and at least one BS 270. The base stations may also be referred to as "cells". Alternatively, individual sectors of a particular BS 270 may be referred to as cell sites.
As shown in fig. 2, a Broadcast Transmitter (BT)295 transmits a broadcast signal to the mobile terminal 100 operating within the system. A broadcast receiving module 111 as shown in fig. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295. In fig. 2, several Global Positioning System (GPS) satellites 300 are shown. The satellite 300 assists in locating at least one of the plurality of mobile terminals 100.
In fig. 2, a plurality of satellites 300 are depicted, but it is understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 as shown in fig. 1 is generally configured to cooperate with satellites 300 to obtain desired positioning information. Other techniques that can track the location of the mobile terminal may be used instead of or in addition to GPS tracking techniques. In addition, at least one GPS satellite 300 may selectively or additionally process satellite DMB transmission.
As a typical operation of the wireless communication system, the BS270 receives reverse link signals from various mobile terminals 100. The mobile terminal 100 is generally engaged in conversations, messaging, and other types of communications. Each reverse link signal received by a particular base station 270 is processed within the particular BS 270. The obtained data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions including coordination of soft handoff procedures between BSs 270. The BSCs 275 also route the received data to the MSC280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN290 interfaces with the MSC280, the MSC interfaces with the BSCs 275, and the BSCs 275 accordingly control the BS270 to transmit forward link signals to the mobile terminal 100.
Based on the above mobile terminal hardware structure and communication system, the present invention provides various embodiments of the method.
Embodiment One
As shown in fig. 3, in the present embodiment a grip recognition system based on a proximity sensor includes a proximity sensor, wherein an identification panel is arranged on the proximity sensor and is disposed in a preset holding area on the mobile terminal; when the mobile terminal receives a holding operation of a user, whether the capacitance value acquired by the identification panel meets a preset trigger condition is judged, and if so, a corresponding application is triggered.
In the embodiment, the holding gesture operation of the user is recognized through the proximity sensor, and the corresponding function is completed, so that the interactive experience of the user terminal is improved.
In this embodiment, the number of the proximity sensors is one, and the proximity sensors are provided with a plurality of identification panels respectively used for acquiring capacitance data of different positions; as another embodiment, the number of the proximity sensors may be multiple, and each proximity sensor corresponds to one identification panel.
As shown in fig. 4, in this embodiment the holding operation is an operation in which the user holds the mobile terminal (e.g., a mobile phone) with the left hand or the right hand. Normally the user does not grip the phone with much force; the fingers or palm only lightly contact the phone, the contact area between the fingers and the identification panel on the phone is small, and the collected capacitance value is correspondingly small, so the application corresponding to the holding gesture is not triggered, which prevents frequent accidental triggering. When the user grips the phone firmly, the contact area between the fingers and the identification panel increases and the collected capacitance value increases; when the capacitance value collected by the identification panel meets the preset trigger condition, the application corresponding to the holding gesture is triggered.
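By way of illustration only (this sketch is not part of the patent), the threshold comparison described above could be expressed as follows; the threshold value, its units, and the triggered action are assumptions.

```python
# Minimal sketch of the threshold-based grip trigger described above.
# The threshold value and the triggered action are illustrative assumptions,
# not values taken from the patent.

GRIP_THRESHOLD_PF = 2.5  # assumed trigger capacitance threshold, in picofarads

def is_grip_triggered(panel_capacitance_pf: float,
                      threshold_pf: float = GRIP_THRESHOLD_PF) -> bool:
    """A light touch yields a small capacitance (below threshold) and is ignored;
    a firm grip yields a larger capacitance (above threshold) and triggers the action."""
    return panel_capacitance_pf > threshold_pf

def on_holding_operation(panel_capacitance_pf: float) -> None:
    if is_grip_triggered(panel_capacitance_pf):
        print("trigger the application bound to the holding gesture")
    else:
        print("light contact only - no trigger (prevents accidental operation)")

if __name__ == "__main__":
    on_holding_operation(1.2)  # light touch
    on_holding_operation(3.8)  # firm grip
```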
In this embodiment, the holding gesture includes a left-hand holding gesture and a right-hand holding gesture, each of which may be set to trigger a different application, where the applications include application programs, keys, services, or functions. The application programs include a camera, a map, QQ, WeChat, and the like; the keys include a home key, a return key, a confirmation key, an attribute key, a volume adjusting key, and the like; the services include telephone, short message, and the like; the functions include screenshot, volume adjustment, page turning, and the like.
In this embodiment, fig. 5 illustrates the working principle of the capacitive proximity sensor. Assuming that the effective projection area of the approaching object (such as a finger) on the sensor is A, and that the distance between the object and the sensor is d, the capacitance value C can be expressed by the following formula:

C = ε0 · εr · A / d

where ε0 and εr denote the permittivity of vacuum and the relative permittivity of air, respectively. According to this formula, both the effective projection area of the approaching object and its distance change the value of the capacitance C, so certain interaction states can be determined from the change in the value of C and from the differences between several capacitances.
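For illustration, a small numerical sketch of the relationship above (the area and distance values are assumed examples, not figures from the patent):

```python
# Numerical sketch of C = eps0 * eps_r * A / d for the capacitive proximity
# sensor model above. The area and distance values are illustrative only.

EPS0 = 8.854e-12    # vacuum permittivity, F/m
EPS_R_AIR = 1.0006  # relative permittivity of air (approximately 1)

def capacitance(area_m2: float, distance_m: float) -> float:
    return EPS0 * EPS_R_AIR * area_m2 / distance_m

# A fingertip pressed firmly (larger area, smaller gap) yields a larger C
# than a light touch, which is what the trigger threshold discriminates.
light_touch = capacitance(area_m2=20e-6, distance_m=1.0e-3)   # ~0.18 pF
firm_grip   = capacitance(area_m2=120e-6, distance_m=0.3e-3)  # ~3.5 pF
print(f"light touch: {light_touch*1e12:.2f} pF, firm grip: {firm_grip*1e12:.2f} pF")
```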
In this embodiment, the grip recognition system further includes a gesture predefining unit configured to define the specific positions of the recognition panels of the proximity sensor, the division of capacitance thresholds, and the possible gestures for the corresponding panels. When a gesture operation starts, the capacitance value of a recognition panel of the proximity sensor changes as the hand approaches, and a gesture data acquisition unit completes the acquisition of the sensed distance between the proximity sensor and the hand. The proximity sensor reports the acquired raw data to a gesture data processing unit for processing; after converting the raw capacitance data, the gesture data processing unit divides and converts it, according to the capacitance thresholds, into data that can be recognized by the algorithm. New data is collected after a certain interval (in milliseconds); a group of data is formed according to the number of recognition panels, processed by the gesture data processing unit, and passed to a gesture recognition unit. The gesture recognition unit performs gesture algorithm recognition on multiple groups of integrated data and matches gestures against those of the gesture predefining unit. When a defined gesture is matched during recognition, the corresponding trigger event is sent to the corresponding application execution unit, and the application execution unit (a service or an application) executes the operation associated with the gesture.
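The following is a hedged sketch of how the pipeline above (predefining, acquisition, processing, recognition, execution) might be organized in software; the thresholds, sampling interval, fake sensor readings, and the application binding are assumptions for illustration, not details from the patent.

```python
# Illustrative sketch of the pipeline described above: predefined gestures ->
# raw capacitance acquisition -> threshold conversion -> gesture matching ->
# application dispatch. All values are assumptions.

import time
from typing import Callable, Dict, List, Tuple

PANELS = ["S0", "S1", "S2", "S3"]                            # recognition panels (per fig. 3)
THRESHOLDS = {"S0": 2.0, "S1": 2.0, "S2": 2.0, "S3": 2.0}    # pF, assumed

# Gesture predefining unit: map a tuple of per-panel triggered flags to an action.
GESTURES: Dict[Tuple[bool, ...], Callable[[], None]] = {
    # S0 (back-bottom), S1 (right), S2 (left) triggered; S3 (back-top) not triggered
    (True, True, True, False): lambda: print("launch camera (assumed binding)"),
}

def read_raw_capacitance() -> List[float]:
    """Gesture data acquisition unit: stand-in for the sensor driver."""
    return [3.1, 2.8, 2.6, 0.4]  # fake sample: firm grip, back-top panel untouched

def to_trigger_flags(raw: List[float]) -> Tuple[bool, ...]:
    """Gesture data processing unit: threshold the raw value of each panel."""
    return tuple(raw[i] > THRESHOLDS[p] for i, p in enumerate(PANELS))

def recognize_and_dispatch() -> None:
    """Gesture recognition unit + application execution unit."""
    flags = to_trigger_flags(read_raw_capacitance())
    action = GESTURES.get(flags)
    if action:
        action()

if __name__ == "__main__":
    for _ in range(3):
        recognize_and_dispatch()
        time.sleep(0.05)  # assumed sampling interval of ~50 ms
```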
In this embodiment, the shape of the identification panel is rectangular, circular, triangular or L-shaped.
As shown in fig. 3, in the present embodiment the holding area includes the left side (marked "left" in the figure), the right side (marked "right" in the figure) and the back of the mobile terminal. In addition, for convenience of description, the front, back, left side, right side, bottom and top are defined as the six faces of the mobile terminal, where the front refers to the face of the mobile terminal carrying the display screen, the back refers to the face opposite the front, the left side and the right side refer to the left and right frames seen when facing the front, and the bottom and the top refer to the lower and upper frames seen when facing the front.
As shown in fig. 3, in the present embodiment, the recognition panel includes a left recognition panel S2 and a right recognition panel S1, the left recognition panel S2 is disposed on the left side of the mobile terminal, the right recognition panel S1 is disposed on the right side of the mobile terminal, a left trigger capacitance threshold is disposed on the left recognition panel S2, and a right trigger capacitance threshold is disposed on the right recognition panel S1.
In this embodiment, the determining whether the capacitance value collected by the identification panel meets a preset trigger condition includes: if the capacitance values acquired by the left side identification panel and the right side identification panel are both larger than the corresponding trigger capacitance thresholds, the holding operation is recognized as a preset holding gesture.
In this embodiment, the holding operation can be recognized using only the left and right recognition panels. As can be seen from fig. 4, during a holding operation the capacitance values collected by a recognition panel contacted by the fingers and by a recognition panel contacted by the palm differ because of the gaps between the fingers, so the left-hand holding gesture and the right-hand holding gesture can be distinguished.
As another embodiment, the left-hand holding gesture and the right-hand holding gesture may be distinguished by respectively providing a left-hand holding recognition panel and a right-hand holding recognition panel on the left side and the right side, where when a capacitance value collected on the left-hand holding recognition panel is greater than a preset trigger capacitance threshold value, it indicates that the current holding gesture is the left-hand holding gesture, and when a capacitance value collected on the right-hand holding recognition panel is greater than a preset trigger capacitance threshold value, it indicates that the current holding gesture is the right-hand holding gesture.
As another embodiment, as shown in fig. 3 and 4, the identification panel further includes a back identification panel, the back identification panel includes a first back identification panel S0 and a second back identification panel S3, the first back identification panel S0 is disposed on the back of the mobile terminal near the bottom, the second back identification panel S3 is disposed on the back of the mobile terminal near the top, a first back trigger capacitance threshold is set on the first back identification panel S0, and a second back trigger capacitance threshold is set on the second back identification panel S3.
In this embodiment, the determining whether the capacitance value collected by the identification panel meets a preset trigger condition includes: if the capacitance values collected by the left side identification panel, the right side identification panel and the first back identification panel are all larger than the corresponding trigger capacitance thresholds, and the capacitance value collected by the second back identification panel is smaller than the second back trigger capacitance threshold, the holding operation is recognized as a preset holding gesture.
In this embodiment, the left and right recognition panels and the back recognition panels recognize the holding operation together. During a holding operation the user's palm inevitably touches the back of the mobile terminal, but to prevent false operation when something merely approaches the back, one recognition panel is arranged near the bottom of the back and another near the top: the operation is recognized as a holding operation only when the recognition panel S0 near the bottom is triggered while the recognition panel S3 near the top is not triggered.
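A minimal sketch of this combined condition, using the panel labels of fig. 3 and fig. 4 and assumed threshold values (not specified in the patent):

```python
# Illustrative sketch of the combined trigger condition described above:
# left (S2), right (S1) and back-bottom (S0) panels must all exceed their
# thresholds while the back-top panel (S3) stays below its threshold.
# Threshold and reading values are assumptions.

THRESHOLDS_PF = {"S0": 2.0, "S1": 2.0, "S2": 2.0, "S3": 2.0}

def is_holding_gesture(cap_pf: dict) -> bool:
    return (cap_pf["S2"] > THRESHOLDS_PF["S2"]       # left side gripped
            and cap_pf["S1"] > THRESHOLDS_PF["S1"]   # right side gripped
            and cap_pf["S0"] > THRESHOLDS_PF["S0"]   # palm on back near bottom
            and cap_pf["S3"] < THRESHOLDS_PF["S3"])  # back near top untouched

# A one-hand grip satisfies the condition; an object covering the whole back
# (which also triggers S3) does not - the false-operation case S3 guards against.
print(is_holding_gesture({"S0": 3.0, "S1": 2.7, "S2": 2.5, "S3": 0.3}))  # True
print(is_holding_gesture({"S0": 3.0, "S1": 2.7, "S2": 2.5, "S3": 2.4}))  # False
```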
Embodiment Two
As shown in fig. 6, in the present embodiment, the identification panels include a left identification panel S2 and a right identification panel S1, the left identification panel S2 is disposed at the back of the mobile terminal near the left side, and the right identification panel S1 is disposed at the back of the mobile terminal near the right side.
As shown in fig. 7, in the present embodiment, since the user usually grips the bottom of the mobile terminal, the left recognition panel S2 is disposed on the left side and/or the back of the mobile terminal near the bottom, and the right recognition panel S1 is disposed on the right side and/or the back of the mobile terminal near the bottom.
In the embodiment, the holding gesture operation of the user is recognized through the proximity sensor, and the corresponding function is completed, so that the interactive experience of the user terminal is improved.
In this embodiment, the number of the proximity sensors is one, and the proximity sensors are provided with a plurality of identification panels respectively used for acquiring capacitance data of different positions; as another embodiment, the number of the proximity sensors may be multiple, and each proximity sensor corresponds to one identification panel.
In this embodiment, the determining whether the capacitance value collected by the identification panel meets a preset trigger condition includes: if the capacitance values acquired by the left side identification panel and the right side identification panel are both larger than the corresponding trigger capacitance thresholds, the holding operation is recognized as a preset holding gesture.
In this embodiment, the holding operation can be recognized using only the left and right recognition panels. As can be seen from fig. 7, during a holding operation the capacitance values collected by a recognition panel contacted by the fingers and by a recognition panel contacted by the palm differ because of the gaps between the fingers, so the left-hand holding gesture and the right-hand holding gesture can be distinguished.
As another embodiment, the left-hand holding gesture and the right-hand holding gesture may be distinguished by respectively providing a left-hand holding recognition panel and a right-hand holding recognition panel on the left side and the right side, where when a capacitance value collected on the left-hand holding recognition panel is greater than a preset trigger capacitance threshold value, it indicates that the current holding gesture is the left-hand holding gesture, and when a capacitance value collected on the right-hand holding recognition panel is greater than a preset trigger capacitance threshold value, it indicates that the current holding gesture is the right-hand holding gesture.
As another embodiment, as shown in fig. 6 and 7, the identification panel further includes a back identification panel, the back identification panel includes a first back identification panel S0 and a second back identification panel S3, the first back identification panel S0 is disposed on the back of the mobile terminal near the bottom, the second back identification panel S3 is disposed on the back of the mobile terminal near the top, a first back trigger capacitance threshold is set on the first back identification panel S0, and a second back trigger capacitance threshold is set on the second back identification panel S3.
In this embodiment, the determining whether the capacitance value collected by the identification panel meets a preset trigger condition includes: if the capacitance values collected by the left side identification panel, the right side identification panel and the first back identification panel are all larger than the corresponding trigger capacitance thresholds, and the capacitance value collected by the second back identification panel is smaller than the second back trigger capacitance threshold, the holding operation is recognized as a preset holding gesture.
In this embodiment, the left and right recognition panels and the back recognition panels recognize the holding operation together. During a holding operation the user's palm inevitably touches the back of the mobile terminal, but to prevent false operation when something merely approaches the back, one recognition panel is arranged near the bottom of the back and another near the top: the operation is recognized as a holding operation only when the recognition panel S0 near the bottom is triggered while the recognition panel S3 near the top is not triggered.
Embodiment Three
As shown in fig. 8, in the present embodiment the left recognition panel is composed of an array of recognition panels S20…S2m, and the right recognition panel is composed of an array of recognition panels S10…S1n.
As shown in fig. 9, when the user grips the mobile terminal with the left hand, the left recognition panel array S20 … S2m is fully activated, and the right recognition panel array S10 … S1n is partially activated, whereby it can be determined as a left-hand holding gesture; similarly, if the right recognition panel array S10 … S1n is fully activated and the left recognition panel array S20 … S2m is partially activated, it can be determined as a right-hand holding gesture.
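As a hedged illustration of the fully/partially activated distinction just described, the following sketch assumes arbitrary per-panel thresholds and readings; S2* denotes the left panel array and S1* the right panel array (per fig. 8 and fig. 9).

```python
# Illustrative sketch of the array-based left/right-hand detection described
# above. Readings and the per-panel threshold are assumptions.

THRESHOLD_PF = 2.0  # assumed per-panel trigger threshold

def activation_ratio(readings_pf: list) -> float:
    triggered = sum(1 for c in readings_pf if c > THRESHOLD_PF)
    return triggered / len(readings_pf)

def detect_holding_hand(left_array_pf: list, right_array_pf: list) -> str:
    left_ratio, right_ratio = activation_ratio(left_array_pf), activation_ratio(right_array_pf)
    if left_ratio == 1.0 and 0.0 < right_ratio < 1.0:
        return "left-hand holding gesture"   # palm covers the left array fully
    if right_ratio == 1.0 and 0.0 < left_ratio < 1.0:
        return "right-hand holding gesture"  # palm covers the right array fully
    return "no holding gesture recognized"

# Left-hand grip: every left panel triggered, only some right panels (fingertips).
print(detect_holding_hand([2.6, 2.9, 2.7, 3.0], [2.4, 0.5, 2.2, 0.4]))
```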
In the embodiment, the holding gesture operation of the user is recognized through the proximity sensor, and the corresponding function is completed, so that the interactive experience of the user terminal is improved.
In this embodiment, the number of the proximity sensors is one, and the proximity sensors are provided with a plurality of identification panels respectively used for acquiring capacitance data of different positions; as another embodiment, the number of the proximity sensors may be multiple, and each proximity sensor corresponds to one identification panel.
In this embodiment, the determining whether the capacitance value collected by the identification panel meets a preset trigger condition includes: if the capacitance values acquired by the left side identification panel and the right side identification panel are both larger than the corresponding trigger capacitance thresholds, the holding operation is recognized as a preset holding gesture.
In this embodiment, the holding operation can be recognized using only the left and right recognition panels. As can be seen from fig. 7, during a holding operation the capacitance values collected by a recognition panel contacted by the fingers and by a recognition panel contacted by the palm differ because of the gaps between the fingers, so the left-hand holding gesture and the right-hand holding gesture can be distinguished.
As another embodiment, the left-hand holding gesture and the right-hand holding gesture may be distinguished by respectively providing a left-hand holding recognition panel and a right-hand holding recognition panel on the left side and the right side, where when a capacitance value collected on the left-hand holding recognition panel is greater than a preset trigger capacitance threshold value, it indicates that the current holding gesture is the left-hand holding gesture, and when a capacitance value collected on the right-hand holding recognition panel is greater than a preset trigger capacitance threshold value, it indicates that the current holding gesture is the right-hand holding gesture.
As another embodiment, as shown in fig. 8 and 9, the identification panel further includes a back identification panel, the back identification panel includes a first back identification panel S0 and a second back identification panel S3, the first back identification panel S0 is disposed on the back of the mobile terminal near the bottom, the second back identification panel S3 is disposed on the back of the mobile terminal near the top, a first back trigger capacitance threshold is set on the first back identification panel S0, and a second back trigger capacitance threshold is set on the second back identification panel S3.
Embodiment Four
As shown in fig. 10, in the present embodiment, the left side identification panel further includes a left hand holding identification panel SL, and the left hand holding identification panel SL is disposed on the left side surface of the mobile terminal near the top; the right side identification panel further comprises a right hand holding identification panel SR, and the right hand holding identification panel SR is arranged on the right side surface, close to the top, of the mobile terminal.
As shown in fig. 11, when the user grips the mobile terminal with the left hand, the SL array can be operated with the thumb for interaction; since the system has recognized a left-hand grip, the array SR ignores external input, which prevents false operation caused by accidentally touching SR. Similarly, when the user grips the mobile terminal with the right hand, the SR array can be operated with the thumb for interaction, and since the system has recognized a right-hand grip, the array SL ignores external input at that moment, which prevents false operation caused by accidentally touching SL.
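The input-masking behaviour described above could be sketched as follows; the event routing is an assumption for illustration, with only the SL/SR panel names taken from fig. 10 and fig. 11.

```python
# Illustrative sketch: once a left-hand grip is recognized, input from the
# right-side thumb panel SR is ignored, and vice versa. Event handling details
# are assumptions, not the patent's implementation.

class ThumbPanelRouter:
    def __init__(self) -> None:
        self.holding_hand = None  # "left", "right", or None

    def set_holding_hand(self, hand: str) -> None:
        self.holding_hand = hand

    def on_panel_event(self, panel: str) -> None:
        # SL is usable only during a left-hand grip, SR only during a right-hand grip.
        if (panel == "SL" and self.holding_hand == "left") or \
           (panel == "SR" and self.holding_hand == "right"):
            print(f"{panel}: accepted - dispatch thumb interaction")
        else:
            print(f"{panel}: ignored - prevents accidental touch")

router = ThumbPanelRouter()
router.set_holding_hand("left")
router.on_panel_event("SL")  # accepted
router.on_panel_event("SR")  # ignored
```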
In the present embodiment, the arrangement of the left-hand holding identification panel and the right-hand holding identification panel is also applicable to Embodiments One to Three.
In the embodiment, the holding gesture operation of the user is recognized through the proximity sensor, and the corresponding function is completed, so that the interactive experience of the user terminal is improved.
In this embodiment, the number of the proximity sensors is one, and the proximity sensors are provided with a plurality of identification panels respectively used for acquiring capacitance data of different positions; as another embodiment, the number of the proximity sensors may be multiple, and each proximity sensor corresponds to one identification panel.
Embodiment Five
In this embodiment, a mobile terminal includes, in addition to the functional modules shown in fig. 1, the grip recognition system described in the above embodiments; it can recognize the user's holding gesture operation through the proximity sensor and complete the corresponding function, thereby improving the user's interaction experience with the terminal.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
Claims (10)
1. A proximity sensor based grip recognition system, comprising: a proximity sensor arranged on a mobile terminal, wherein a recognition panel is arranged on the proximity sensor, the recognition panel is disposed in a preset holding area on the mobile terminal, and when the mobile terminal receives a holding operation of a user, it is determined whether a capacitance value acquired by the recognition panel meets a preset trigger condition, and if so, a corresponding application is triggered.
2. The proximity sensor based grip recognition system of claim 1, wherein the grip area comprises a left side, a right side, and a back of the mobile terminal.
3. The proximity sensor based grip recognition system of claim 2, wherein the recognition panel comprises a left recognition panel and a right recognition panel, the left recognition panel is disposed on the left side of the mobile terminal, the right recognition panel is disposed on the right side of the mobile terminal, the left recognition panel is provided with a left trigger capacitance threshold, and the right recognition panel is provided with a right trigger capacitance threshold.
4. The proximity sensor based grip recognition system of claim 3, wherein the determining whether the capacitance value collected by the recognition panel satisfies a preset trigger condition comprises: when the capacitance values acquired by the left recognition panel and the right recognition panel are both greater than the corresponding trigger capacitance thresholds, the holding operation is determined to be a preset holding gesture.
5. The proximity sensor based grip recognition system of claim 3, wherein the recognition panel further comprises a back recognition panel, the back recognition panel comprises a first back recognition panel and a second back recognition panel, the first back recognition panel is disposed on the back of the mobile terminal near the bottom, the second back recognition panel is disposed on the back of the mobile terminal near the top, the first back recognition panel is provided with a first back trigger capacitance threshold, and the second back recognition panel is provided with a second back trigger capacitance threshold.
6. The proximity sensor based grip recognition system of claim 5, wherein the determining whether the capacitance value collected by the recognition panel satisfies a preset trigger condition comprises: when the capacitance values collected by the left recognition panel, the right recognition panel, and the first back recognition panel are all greater than the corresponding trigger capacitance thresholds, and the capacitance value collected by the second back recognition panel is less than the second back trigger capacitance threshold, the holding operation is determined to be a preset holding gesture.
7. The proximity sensor based grip recognition system of claim 2, wherein the recognition panel comprises a left recognition panel and a right recognition panel, the left recognition panel is disposed on the back of the mobile terminal near the left side, and the right recognition panel is disposed on the back of the mobile terminal near the right side.
8. The proximity-sensor-based grip recognition system according to any one of claims 1 to 7, wherein the left recognition panel is disposed on a left side and/or a back side of the mobile terminal near the bottom, and the right recognition panel is disposed on a right side and/or a back side of the mobile terminal near the bottom.
9. The proximity sensor based grip recognition system of claim 8, wherein the left recognition panel is composed of a plurality of recognition panel arrays, and the right recognition panel is composed of a plurality of recognition panel arrays.
10. The proximity sensor based grip recognition system according to claim 9, wherein the left recognition panel further comprises a left hand grip recognition panel disposed on the left side surface of the mobile terminal near the top; and the right recognition panel further comprises a right hand grip recognition panel disposed on the right side surface of the mobile terminal near the top.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610872700.9A CN106254597A (en) | 2016-09-29 | 2016-09-29 | A kind of gripping identification system based on proximity transducer |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106254597A (en) | 2016-12-21 |
Family
ID=57611318
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610872700.9A Pending CN106254597A (en) | 2016-09-29 | 2016-09-29 | A kind of gripping identification system based on proximity transducer |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106254597A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011159947A1 (en) * | 2010-06-16 | 2011-12-22 | Qualcomm Incorporated | Layout design of proximity sensors to enable shortcuts |
CN103513763A (en) * | 2012-06-26 | 2014-01-15 | Lg电子株式会社 | Mobile terminal and control method thereof |
CN105278781A (en) * | 2014-06-02 | 2016-01-27 | 辛纳普蒂克斯公司 | Side sensing for electronic devices |
WO2016054190A1 (en) * | 2014-09-30 | 2016-04-07 | Pcms Holdings, Inc. | Mobile device and method for interperting gestures without obstructing the screen |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107368717A (en) * | 2017-06-05 | 2017-11-21 | 深圳市金立通信设备有限公司 | The method and terminal of a kind of identification |
US10838541B2 (en) | 2018-09-03 | 2020-11-17 | Htc Corporation | Method for operating handheld device, handheld device and computer-readable recording medium thereof |
TWI715058B (en) * | 2018-09-03 | 2021-01-01 | 宏達國際電子股份有限公司 | Method for operating handheld device, handheld device and computer-readable recording medium thereof |
WO2023226031A1 (en) * | 2022-05-27 | 2023-11-30 | 北京小米移动软件有限公司 | Shortcut operation execution method and apparatus, device, and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106527933B (en) | Control method and device for edge gesture of mobile terminal | |
CN106130734A (en) | The control method of mobile terminal and control device | |
CN105138260A (en) | Application switching method and terminal | |
CN106412301A (en) | Screen control system based on proximity sensor and mobile terminal | |
CN106657602A (en) | Conversation screen-off system based on proximity sensor and mobile terminal | |
CN106648324B (en) | Hidden icon control method and device and terminal | |
CN106357905A (en) | Mobile terminal | |
CN106843723A (en) | A kind of application program associates application method and mobile terminal | |
CN106354268A (en) | Screen control system based on proximity sensor and mobile terminal | |
CN106406621B (en) | A kind of mobile terminal and its method for handling touch control operation | |
CN109542317B (en) | Display control method, device and storage medium of double-sided screen mobile terminal | |
CN106412295A (en) | Mobile terminal based on virtual key | |
CN106254597A (en) | A kind of gripping identification system based on proximity transducer | |
CN106254596A (en) | A kind of kneading identification system based on proximity transducer and mobile terminal | |
CN106484271A (en) | A kind of device, method and mobile terminal controlling mobile terminal based on user gesture | |
CN106648052A (en) | Application triggering system based on proximity detector and mobile terminal | |
CN106341502A (en) | Proximity sensor-based application triggering system and mobile terminal | |
CN106095308B (en) | The method of mobile terminal and dummy keyboard false-touch prevention | |
CN105183327B (en) | A kind of method of terminal and terminal operation | |
CN106325503B (en) | Interactive operation identification device and method | |
CN105094598A (en) | Device and method for processing operational objects | |
CN104932711A (en) | Method and device for realizing input mode switching | |
CN106896988B (en) | Application icon alignment device, terminal and method | |
CN106484183A (en) | A kind of pressing identifying system based on proximity transducer, method and mobile terminal | |
CN105208193A (en) | Button-free device and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | |
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20161221 |