CN112334860A - Touch method of wearable device, wearable device and system - Google Patents


Info

Publication number
CN112334860A
CN112334860A (application CN201880094859.XA)
Authority
CN
China
Prior art keywords
wearable device
fingerprint
fingerprint sensor
touch operation
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201880094859.XA
Other languages
Chinese (zh)
Other versions
CN112334860B (en)
Inventor
龚树强
龚建勇
仇存收
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN112334860A publication Critical patent/CN112334860A/en
Application granted granted Critical
Publication of CN112334860B publication Critical patent/CN112334860B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/70Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The application discloses a touch method for a wearable device, a wearable device, and a system. It relates to the field of communications technologies and can improve the sensitivity and accuracy with which a wearable device recognizes user gestures, while reducing the probability that the wearable device or a terminal is falsely triggered. The method includes the following steps: the wearable device detects a touch operation input by a user using a fingerprint sensor; the wearable device determines whether the touch operation includes input of a fingerprint; if it does, the wearable device recognizes the control gesture corresponding to the touch operation; the wearable device then sends the control gesture to the terminal, or sends an operation instruction corresponding to the control gesture to the terminal, so that the terminal executes that instruction. A communication connection is established between the wearable device and the terminal.

Description

Touch method of wearable device, wearable device and system Technical Field
The present application relates to the field of communications technologies, and in particular, to a touch method for a wearable device, a wearable device, and a system.
Background
At present, terminals such as mobile phones and tablet computers support connecting to accessories such as earphones. For example, after a Bluetooth connection is established between a mobile phone and a Bluetooth headset, the user can use the headset to play songs stored on the phone and to talk with contacts.
Generally, one or more function keys (e.g., volume up, volume down) are arranged on a Bluetooth headset, and the user can control the mobile phone to perform audio-related functions by operating these keys. Some Bluetooth headsets are also provided with a touchpad, on which the user can perform preset gestures (e.g., tapping, sliding) to trigger the function of the corresponding key. For example, if a tap on the touchpad is detected, the headset may generate a play instruction corresponding to the tap and send it to the mobile phone, which then executes the play function in response.
However, because a Bluetooth headset is small, the touchpad disposed on it is correspondingly small, so the number of sensing units (e.g., sensing capacitors) available in the touchpad for recognizing user gestures is limited. As a result, the sensitivity and accuracy with which the headset recognizes user gestures using this limited number of sensing units is difficult to improve.
Disclosure of Invention
The application provides a touch method for a wearable device, a wearable device, and a system, which can improve the sensitivity and accuracy with which the wearable device recognizes user gestures and reduce the probability that the wearable device or a terminal is falsely triggered.
To achieve the above objective, the following technical solutions are adopted:
in a first aspect, the present application provides a touch method for a wearable device in which a fingerprint sensor is disposed. The wearable device detects a touch operation input by a user using the fingerprint sensor and then determines whether the touch operation includes input of a fingerprint. If it does, the touch operation is not an accidental touch, so the wearable device recognizes the control gesture corresponding to the touch operation and sends the control gesture to the terminal, or sends the operation instruction corresponding to the control gesture to the terminal, so that the terminal executes that instruction. A communication connection is established between the wearable device and the terminal.
That is, the embodiments of the application exploit the small size and high integration of the sensing units in a fingerprint sensor: the fingerprint sensor is arranged in the wearable device to recognize the user's touch operations, replacing the larger touchpad used in traditional wearable devices and thereby improving the integration of the wearable device. At the same time, because a fingerprint sensor contains more sensing units, it can identify whether a touch operation was actually triggered by a finger rather than by accident, which improves the sensitivity and accuracy of gesture recognition and reduces the probability that the wearable device or the terminal is falsely triggered.
In one possible design, after the wearable device detects the touch operation input by the user using the fingerprint sensor, the method further includes: in response to the touch operation, the wearable device captures N consecutive images formed on the fingerprint sensor, where at least one of the N images contains a fingerprint pattern and N is an integer greater than 1. In this case, recognizing the control gesture corresponding to the touch operation includes: the wearable device recognizes the control gesture according to how the fingerprint pattern changes across the N consecutive images.
In one possible design, capturing the N consecutive images includes: when the fingerprint sensor detects that the user's finger contacts it, starting to capture images formed on the sensor at a preset frequency; and when the sensor detects that the finger has left, stopping the capture, thereby obtaining the N consecutive images.
In one possible design, capturing the N consecutive images includes: when the fingerprint sensor detects that the user's finger contacts it, starting to capture images formed on the sensor at a preset frequency; when the sensor detects that the finger has left, continuing to capture images for a preset time; and if no finger contact is detected within that preset time, stopping the capture, thereby obtaining the N consecutive images.
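The two capture strategies above differ only in whether capture continues for a grace period after the finger lifts off. A minimal Python sketch of the second strategy follows; the sampling rate, grace period, and the shape of the sensor samples are assumptions for illustration, not values from the application:

```python
def collect_touch_frames(samples, rate_hz=50, grace_s=0.3):
    """Collect the N consecutive frames that make up one touch.

    `samples` is an iterable of (finger_present, frame) pairs read from the
    sensor at `rate_hz`. Capture starts at the first frame where a finger is
    present and continues for a grace period of `grace_s` after lift-off, so
    that a quick re-touch (e.g. the second tap of a double click) lands in
    the same frame sequence.
    """
    grace_ticks = int(round(grace_s * rate_hz))
    frames, started, absent = [], False, 0
    for present, frame in samples:
        if not started:
            if not present:
                continue               # still waiting for first contact
            started = True
        frames.append(frame)
        if present:
            absent = 0                 # finger is down: reset the grace timer
        else:
            absent += 1
            if absent > grace_ticks:
                break                  # no re-touch within the grace period
    return frames
```

Setting `grace_s` to zero approximates the first design (stop as soon as the finger leaves); a positive `grace_s` implements the second, and frames captured during the grace period simply contain no fingerprint pattern.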
In one possible design, recognizing the control gesture according to how the fingerprint pattern changes across the N consecutive images specifically includes: the wearable device identifies, according to preset fingerprint features, which of the N images contain a fingerprint pattern; it then recognizes the control gesture according to the size change and/or position change of the fingerprint pattern across those images. In other words, the application uses the fingerprint sensor's ability to capture fingerprint patterns to recognize the user's control gesture from the continuous change of those patterns.
In one possible design, recognizing the control gesture according to the size change and/or position change of the fingerprint pattern across the N consecutive images includes: when X consecutive images among the N contain the fingerprint pattern in the same position, the control gesture is a click operation, where X ≤ N; when Y consecutive images among the N contain the fingerprint pattern in the same position, the control gesture is a long-press operation, where X < Y ≤ N; when Z consecutive images among the N contain the fingerprint pattern and the displacement of the pattern across the Z images is greater than a distance threshold, the control gesture is a slide operation, where Z ≤ N; and when, among the N images, L3 images without the fingerprint pattern lie between L1 consecutive images containing the pattern and L2 consecutive images containing the pattern, the control gesture is a double-click operation, where L3 is less than a preset threshold and 1 < L1 + L2 + L3 ≤ N.
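Under these frame-based rules, gesture classification reduces to splitting the image sequence into runs of fingerprint-bearing frames and comparing run lengths, gaps, and displacement against thresholds. A minimal Python sketch, in which each frame is represented by the (x, y) centroid of its fingerprint pattern, or `None` when no print is present; the threshold values are illustrative, not taken from the application:

```python
def classify_gesture(positions, long_press_frames=20, dist_thresh=5.0, gap_max=8):
    """Classify a touch from per-frame fingerprint centroid positions."""
    # Split the sequence into runs of consecutive frames containing a print.
    runs = []                         # each entry: (start_index, [positions])
    i, n = 0, len(positions)
    while i < n:
        if positions[i] is None:
            i += 1
            continue
        j = i
        while j < n and positions[j] is not None:
            j += 1
        runs.append((i, positions[i:j]))
        i = j
    if not runs:
        return "none"
    # Two print runs separated by a short blank gap -> double click.
    if len(runs) >= 2:
        gap = runs[1][0] - (runs[0][0] + len(runs[0][1]))
        if gap < gap_max:
            return "double_click"
    # Otherwise classify the first run by displacement and duration.
    pts = runs[0][1]
    dx = pts[-1][0] - pts[0][0]
    dy = pts[-1][1] - pts[0][1]
    if (dx * dx + dy * dy) ** 0.5 > dist_thresh:
        return "slide"
    return "long_press" if len(pts) > long_press_frames else "click"
```

A real implementation would also tolerate small jitter in the "same position" tests rather than requiring exact equality of centroids.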
In one possible design, before the wearable device sends the control gesture, or the operation instruction corresponding to it, to the terminal, the method further includes: the wearable device determines that the touch operation is not a false touch. That is, only when the wearable device recognizes that the touch operation received by the fingerprint sensor includes fingerprint input does it continue to capture the images formed on the sensor, recognize the corresponding control gesture, and send the recognized gesture or instruction to the terminal.
In one possible design, determining that the touch operation is not a false touch includes: if P1 of the N consecutive images contain a fingerprint pattern and P1 is greater than a preset value, the wearable device determines that the touch operation is not a false touch.
In one possible design, if P2 of the N consecutive images contain a fingerprint pattern and P2 is smaller than the preset value, the wearable device determines that the touch operation is a false touch. The wearable device then switches the fingerprint sensor from the working state to the dormant state, which prevents an accidental touch on the wearable device from waking the terminal and reduces the power consumption of both the wearable device and the mobile phone.
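The false-touch check in the two designs above counts the images that actually contain a fingerprint pattern, compares the count against a preset value, and puts the sensor to sleep on failure. It might be sketched as follows; the threshold, the state names, and the `has_print` predicate are assumptions for illustration:

```python
class FingerprintTouchFilter:
    """Reject touches whose frame sequence contains too few fingerprint
    frames, treating them as accidental brushes (e.g. clothing) rather
    than a deliberate finger gesture."""

    def __init__(self, min_print_frames=3):
        self.min_print_frames = min_print_frames
        self.state = "active"

    def handle_touch(self, frames, has_print):
        """frames: captured images; has_print(frame) -> bool."""
        p = sum(1 for f in frames if has_print(f))
        if p < self.min_print_frames:
            self.state = "sleep"      # false touch: sensor goes dormant
            return None               # nothing is forwarded to the terminal
        return frames                 # real touch: continue gesture recognition
```

Only when `handle_touch` returns the frames would the device proceed to gesture recognition and send anything to the terminal.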
In one possible design, before the wearable device receives the touch operation input by the user on the fingerprint sensor, the method further includes: in response to a wake-up operation input by the user, the wearable device switches the fingerprint sensor from the dormant state to the working state. That is, the fingerprint sensor may remain dormant before receiving a touch operation, reducing the power consumption of the wearable device.
In a second aspect, the present application provides a wearable device, including: a fingerprint sensor, one or more processors, a memory, and one or more programs. The processor is coupled to the memory, and the one or more programs are stored in the memory; when the wearable device runs, the processor executes the one or more programs stored in the memory so that the wearable device performs any of the touch methods described above.
The fingerprint sensor may be arranged on a side that does not contact the user when the wearable device is worn. The wearable device may be a Bluetooth headset, smart glasses, a smart watch, or the like.
In a third aspect, the present application provides a computer storage medium including computer instructions that, when run on a wearable device, cause the wearable device to perform any of the touch methods described above.
In a fourth aspect, the present application provides a computer program product that, when run on a wearable device, causes the wearable device to perform any of the touch methods described above.
In a fifth aspect, the present application provides a touch system including a wearable device and a terminal, where the wearable device is provided with a fingerprint sensor and a communication connection is established between the two. The wearable device is configured to: detect a touch operation input by a user using the fingerprint sensor; determine whether the touch operation includes input of a fingerprint; if it does, recognize the control gesture corresponding to the touch operation; and send the control gesture, or an operation instruction corresponding to it, to the terminal. The terminal is configured to: receive the control gesture, or the corresponding operation instruction, sent by the wearable device; and execute the operation instruction corresponding to the control gesture.
In a sixth aspect, the present application provides a touch system including a wearable device and a terminal, where the wearable device is provided with a fingerprint sensor and a communication connection is established between the two. The wearable device is configured to: detect a touch operation input by a user using the fingerprint sensor; in response to the touch operation, capture N consecutive images formed on the fingerprint sensor, where at least one of the N images contains a fingerprint pattern and N is an integer greater than 1; and send the N consecutive images to the terminal. The terminal is configured to: receive the N consecutive images sent by the wearable device; recognize the control gesture corresponding to the touch operation according to how the fingerprint pattern changes across the N images; and execute the operation instruction corresponding to the control gesture.
It can be understood that the wearable device of the second aspect, the computer storage medium of the third aspect, the computer program product of the fourth aspect, and the systems of the fifth and sixth aspects are all configured to perform the corresponding methods provided above. Their beneficial effects therefore correspond to those of the respective methods and are not repeated here.
Drawings
Fig. 1 is a first schematic view of a touch scene of a wearable device according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a fingerprint sensor according to an embodiment of the present application;
fig. 3 is a first schematic structural diagram of a wearable device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of smart glasses according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 6 is an interaction diagram of a touch method of a wearable device according to an embodiment of the present application;
fig. 7 is a second schematic view of a touch scene of a wearable device according to an embodiment of the present application;
fig. 8 is a third schematic view of a touch scene of a wearable device according to an embodiment of the present application;
fig. 9 is a fourth schematic view of a touch scene of a wearable device according to an embodiment of the present application;
fig. 10 is a second schematic structural diagram of a wearable device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
As shown in fig. 1, the touch method provided in the embodiments of the present application may be applied to a touch system formed by a wearable device 11 and a terminal 12, between which a wireless or wired communication connection may be established.
The wearable device 11 may be a wireless headset, a wired headset, smart glasses, a smart helmet, a smart watch, or the like. The terminal 12 may be a mobile phone, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a personal digital assistant (PDA), or the like, which is not limited in the embodiments.
Taking a Bluetooth headset as an example of the wearable device 11, as shown in fig. 1, a fingerprint sensor 201 is disposed on the headset in the embodiments of the present application. The fingerprint sensor 201 may be arranged on a side that does not directly contact the user when the headset is worn; for example, it may be disposed on the housing of the Bluetooth headset, or it may be provided as a separate control module connected to the housing.
When a user's finger touches the exposed collecting surface of the fingerprint sensor 201, the sensor can capture the fingerprint pattern the finger forms on that surface. Illustratively, the fingerprint sensor 201 shown in fig. 2 includes a plurality of sensing units 201b arranged in an array, and a collecting surface 201a covering the sensing units 201b. A fingerprint typically consists of concave valleys and convex ridges. After the finger contacts the collecting surface 201a, because the human body is a conductive medium, each sensing unit 201b generates an electrical signal corresponding to the valley or ridge above it. Taking a sensing capacitor as an example of the sensing unit 201b, the capacitance difference produced at a valley is a first value and the difference produced at a ridge is a second value, so the user's fingerprint pattern can be reconstructed from the capacitance differences at different positions on the fingerprint sensor 201.
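In the simplest terms, the capacitive read-out described above amounts to thresholding the per-cell capacitance differences of the sensing array into a binary ridge/valley map. A minimal sketch under that assumption (a real driver would also calibrate per-cell baselines and normalize the readings):

```python
def frame_to_ridge_map(cap_diffs, threshold):
    """Turn a grid of per-cell capacitance differences into a binary
    ridge/valley map: cells whose difference exceeds `threshold` are taken
    as ridges (fingerprint crest close to the cell), the rest as valleys."""
    return [[1 if c > threshold else 0 for c in row] for row in cap_diffs]
```

The resulting binary maps are the "images formed on the fingerprint sensor" that the later designs compare frame by frame.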
In addition, if the fingerprint sensor 201 is an optical fingerprint sensor, the sensing unit 201b may instead be a photosensor (e.g., a photodiode or phototransistor). In general, the fingerprint sensor 201 may be a capacitive, optical, radio-frequency, or ultrasonic fingerprint sensor, which is not limited in the embodiments of the present application.
It can be understood that the more sensing units 201b there are, the higher the accuracy and sensitivity of capturing and recognizing the user's fingerprint pattern. Because the sensing units 201b in the fingerprint sensor 201 are highly integrated, a fingerprint sensor contains far more sensing units than an ordinary touchpad of the same size. For example, a 2 cm x 2 cm fingerprint sensor 201 may include more than 50 sensing units 201b, while a 2 cm x 2 cm ordinary touchpad may include only ten or so.
In the embodiments of the present application, the fingerprint sensor 201 can therefore replace the ordinary touchpad originally disposed in a wearable device, reducing the device's size. At the same time, because the fingerprint sensor 201 contains many sensing units, the captured fingerprint patterns are clear and accurate, so the wearable device can recognize specific gestures the user's finger performs on the sensor, such as slides and double clicks, from the sequence of fingerprint patterns the sensor continuously captures.
In this way, the wearable device can determine, from a preset correspondence between gestures and operation instructions, the instruction corresponding to the recognized gesture, such as a play, volume-adjustment, or pause instruction. The wearable device can then send that instruction to the terminal for execution, so that the user's touch operations on the wearable device control the related functions of the terminal.
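The preset correspondence between gestures and operation instructions can be as simple as a lookup table. The mapping below is hypothetical; the application does not specify which gesture maps to which instruction:

```python
# Hypothetical gesture-to-instruction table; both the pairings and the
# instruction names are illustrative assumptions.
GESTURE_TO_INSTRUCTION = {
    "click": "play_pause",
    "double_click": "next_track",
    "long_press": "voice_assistant",
    "slide": "volume_adjust",
}

def instruction_for(gesture):
    """Look up the operation instruction to send to the terminal; unknown
    gestures are ignored rather than forwarded."""
    return GESTURE_TO_INSTRUCTION.get(gesture)
```

Whether this table lives on the wearable device (which then sends the instruction) or on the terminal (which receives the raw gesture) corresponds to the two alternatives described in the text.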
Alternatively, after the wearable device recognizes a specific gesture from the fingerprint patterns captured by the fingerprint sensor 201, it may send the recognized gesture itself to the terminal, and the terminal executes the corresponding operation instruction according to that gesture.
Further, as shown in fig. 3, in addition to the fingerprint sensor 201, the wearable device 11 may include a microphone 202 (e.g., a bone conduction microphone), an acceleration sensor 203, a proximity light sensor 204, a communication module 205, a speaker 206, a computing module 207, a storage module 208, and a power supply 209. It can be appreciated that the wearable device 11 may have more or fewer components than shown in fig. 3, may combine two or more components, or may arrange the components differently. The components shown in fig. 3 may be implemented in hardware, software, or a combination of the two, including one or more signal-processing or application-specific integrated circuits.
It should be noted that the above embodiments take a Bluetooth headset as the wearable device 11 for illustration. It can be understood that the fingerprint sensor 201 may also be disposed in other wearable devices such as smart glasses, a smart helmet, or a smart bracelet, to recognize gestures the user performs on it.
For example, as shown in fig. 4, the fingerprint sensor 201 may be integrated into smart glasses 301, for instance on the frame or a temple. When the user touches the fingerprint sensor 201 on the smart glasses 301 with a finger, the sensor captures fingerprint patterns at a certain frequency. The smart glasses 301 can then recognize the specific gesture the user performed on the sensor from the changes in position and size of the finger across the captured fingerprint patterns.
As shown in fig. 5, the terminal 12 in the touch system may be a mobile phone 100. The mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a USB interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a radio frequency module 150, a communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display 194, a SIM card interface 195, and the like. The sensor module may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor, and the like.
The structure illustrated in this embodiment does not constitute a limitation on the mobile phone 100, which may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a Neural-Network Processing Unit (NPU), etc. The different processing units may be independent devices or may be integrated in the same processor.
The controller directs the various components of the mobile phone 100 to work in concert as instructed; it is the neural center and command center of the mobile phone 100. The controller generates operation control signals according to the instruction operation code and timing signals, completing the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, this memory is a cache that saves instructions or data the processor has just used or may reuse. If the processor needs the instruction or data again, it can be fetched directly from this memory, avoiding repeated accesses and reducing processor latency, thereby increasing system efficiency.
In some embodiments, the processor 110 may include an interface. The interface may include an inter-integrated circuit (I2C) interface, an inter-IC sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, the processor may include multiple sets of I2C buses. The processor may be coupled to the touch sensor, charger, flash, camera, etc. via different I2C bus interfaces. For example: the processor may be coupled to the touch sensor via an I2C interface, such that the processor and the touch sensor communicate via an I2C bus interface to implement the touch functionality of the cell phone 100.
The I2S interface may be used for audio communication. In some embodiments, the processor may include multiple sets of I2S buses. The processor may be coupled to the audio module via an I2S bus to enable communication between the processor and the audio module. In some embodiments, the audio module can transmit audio signals to the communication module through the I2S interface, so as to realize the function of answering the call through the bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module and the communication module may be coupled by a PCM bus interface. In some embodiments, the audio module may also transmit the audio signal to the communication module through the PCM interface, so as to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication, with different sampling rates for the two interfaces.
The UART interface is a universal serial data bus used for asynchronous communication. The bus is a bidirectional communication bus that converts data to be transmitted between serial and parallel forms. In some embodiments, a UART interface is typically used to connect the processor with the communication module 160. For example, the processor communicates with the Bluetooth module through the UART interface to implement the Bluetooth function. In some embodiments, the audio module may transmit the audio signal to the communication module through the UART interface, enabling music to be played through a Bluetooth headset.
The MIPI interface can be used to connect a processor with peripheral devices such as a display screen and a camera. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, the processor and the camera communicate through a CSI interface to implement the camera function of the handset 100. The processor and the display screen communicate through a DSI interface to implement the display function of the mobile phone 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, the GPIO interface may be used to connect the processor with a camera, display screen, communication module, audio module, sensor, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 may be a Mini USB interface, a Micro USB interface, a USB Type-C interface, etc. The USB interface may be used to connect a charger to charge the mobile phone 100, or to transmit data between the mobile phone 100 and a peripheral device. It may also be used to connect headphones and play audio through them, or to connect other electronic devices such as AR devices.
The interface connection relationship between the modules in the embodiment of the present invention is only schematically illustrated, and does not limit the structure of the mobile phone 100. The mobile phone 100 may adopt different interface connection modes or a combination of multiple interface connection modes in the embodiment of the present invention.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module may receive charging input from a wired charger via a USB interface. In some wireless charging embodiments, the charging management module may receive a wireless charging input through a wireless charging coil of the cell phone 100. The charging management module can also supply power to the terminal device through the power management module 141 while charging the battery.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module receives the input of the battery and/or the charging management module and supplies power to the processor, the internal memory, the external memory, the display screen, the camera, the communication module and the like. The power management module may also be used to monitor parameters such as battery capacity, battery cycle number, battery state of health (leakage, impedance), etc. In some embodiments, the power management module 141 may also be disposed in the processor 110. In some embodiments, the power management module 141 and the charging management module may also be disposed in the same device.
The wireless communication function of the mobile phone 100 can be implemented by the antenna module 1, the antenna module 2, the RF module 150, the communication module 160, a modem, and a baseband processor.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the cellular network antenna may be multiplexed into a wireless local area network diversity antenna. In some embodiments, the antenna may be used in conjunction with a tuning switch.
The RF module 150 may provide a communication processing module that includes solutions for wireless communication such as 2G/3G/4G/5G applied to the mobile phone 100. The RF module may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc. The RF module receives electromagnetic waves through the antenna 1, performs filtering, amplification, and other processing on the received electromagnetic waves, and transmits them to the modem for demodulation. The RF module can also amplify the signal modulated by the modem and convert it into electromagnetic waves for radiation via the antenna 1. In some embodiments, at least some of the functional modules of the RF module 150 may be disposed in the processor 110. In some embodiments, at least some functional modules of the RF module 150 may be disposed in the same device as at least some modules of the processor 110.
The modem may include a modulator and a demodulator. The modulator is used to modulate the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to a speaker, a receiver, etc.) or displays an image or video through the display screen. In some embodiments, the modem may be a stand-alone device. In some embodiments, the modem may be independent of the processor and disposed in the same device as the RF module or another functional module.
The communication module 160 may provide a communication processing module of a solution for wireless communication applied to the mobile phone 100, including Wireless Local Area Networks (WLAN), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The communication module 160 may be one or more devices integrating at least one communication processing module. The communication module receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor. The communication module 160 may also receive a signal to be transmitted from the processor, frequency-modulate and amplify the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the handset 100 is coupled to the RF module and the antenna 2 is coupled to the communication module, so that the handset 100 can communicate with networks and other devices via wireless communication technologies. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The mobile phone 100 implements the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing and is connected with a display screen and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a mini-LED, a micro-LED, a micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the handset 100 may include 1 or N display screens, with N being a positive integer greater than 1.
As also shown in fig. 1, the cell phone 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen, an application processor, and the like.
The ISP is used for processing data fed back by the camera. For example, when a photo is taken, the shutter opens, light is transmitted through the lens to the camera's photosensitive element, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP, which processes it and converts it into an image visible to the naked eye. The ISP can also apply algorithmic optimization to the image's noise, brightness, and skin tone. The ISP can also optimize parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be disposed in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the handset 100 may include 1 or N cameras, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the handset 100 is in frequency bin selection, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. Handset 100 may support one or more codecs. Thus, the handset 100 can play or record video in a variety of encoding formats, such as: MPEG1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons of the human brain, it processes input information quickly and can also continuously learn by itself. The NPU enables intelligent-cognition applications of the mobile phone 100, such as image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 100. The external memory card communicates with the processor through the external memory interface to realize the data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 implements various functional applications and data processing of the mobile phone 100 by executing the instructions stored in the internal memory 121. The memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (e.g., audio data, a phonebook, etc.) created during use of the mobile phone 100. Further, the memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, another nonvolatile solid-state storage device, a universal flash storage (UFS), and the like.
The mobile phone 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module is used for converting digital audio information into analog audio signals to be output and converting the analog audio input into digital audio signals. The audio module may also be used to encode and decode audio signals. In some embodiments, the audio module may be disposed in the processor 110, or some functional modules of the audio module may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The cellular phone 100 can listen to music through a speaker or listen to a hands-free call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the handset 100 receives a call or voice information, it can receive voice by placing the receiver close to the ear.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, a user can speak with the mouth close to the microphone to input a sound signal into it. The handset 100 may be provided with at least one microphone. In some embodiments, the handset 100 may be provided with two microphones to achieve a noise reduction function in addition to collecting sound signals. In some embodiments, the mobile phone 100 may further include three, four, or more microphones to collect sound signals and reduce noise, and may further identify sound sources to implement a directional recording function.
The headphone interface 170D is used to connect a wired headphone. The earphone interface may be a USB interface, or may be an open mobile platform (OMTP) standard interface of 3.5mm, or a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor may be disposed on the display screen. There are many types of pressure sensors, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor, the capacitance between the electrodes changes, and the handset 100 determines the intensity of the pressure from the change in capacitance. When a touch operation acts on the display screen, the mobile phone 100 detects the intensity of the touch operation via the pressure sensor. The mobile phone 100 can also calculate the touched position based on the detection signal of the pressure sensor. In some embodiments, touch operations applied to the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing a short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
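The intensity-dependent dispatch described above can be sketched as follows. This is an illustrative example rather than the patent's implementation; the threshold value and instruction names are assumptions.

```python
# Hypothetical sketch: map touch intensity at the same position to
# different operation instructions, as in the short-message example above.

FIRST_PRESSURE_THRESHOLD = 0.5  # assumed normalized intensity threshold

def dispatch_touch_on_sms_icon(intensity: float) -> str:
    """Return the operation instruction for a touch on the short message icon."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"  # light press: view the short message
    return "new_sms"       # firm press: create a new short message
```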
The gyro sensor 180B may be used to determine the motion attitude of the mobile phone 100. In some embodiments, the angular velocity of the mobile phone 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor. The gyro sensor may be used for image stabilization during photographing. Illustratively, when the shutter is pressed, the gyro sensor detects the shake angle of the mobile phone 100 and calculates the compensation distance for the lens module according to the shake angle, so that the lens can counteract the shake of the mobile phone 100 through reverse movement, thereby achieving image stabilization. The gyro sensor can also be used in navigation and motion-sensing gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the handset 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by a barometric pressure sensor.
The magnetic sensor 180D includes a Hall sensor. The handset 100 may use the magnetic sensor to detect the opening and closing of a flip holster. In some embodiments, when the handset 100 is a flip phone, it may detect the opening and closing of the flip cover based on the magnetic sensor. Features such as automatic unlocking upon flipping open can then be set according to the detected open or closed state of the holster or flip cover.
The acceleration sensor 180E can detect the magnitude of the acceleration of the mobile phone 100 in various directions (typically along three axes). When the mobile phone 100 is stationary, the magnitude and direction of gravity can be detected. The sensor can also be used to recognize the posture of the terminal and is applied to landscape/portrait switching, pedometers, and other applications.
The distance sensor 180F is used to measure distance. The mobile phone 100 may measure distance by infrared or laser. In some embodiments, such as when photographing a scene, the mobile phone 100 may use the distance sensor to measure distance to achieve fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode. The light emitting diode may be an infrared light emitting diode. Infrared light is emitted outward through the light emitting diode, and infrared light reflected from nearby objects is detected by the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the mobile phone 100; when insufficient reflected light is detected, it can be determined that there is no object near the mobile phone 100. The mobile phone 100 can use the proximity light sensor to detect that the user is holding the mobile phone 100 close to the ear, so as to automatically turn off the screen and save power. The proximity light sensor can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
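The reflected-light decision above reduces to a simple threshold check. The sketch below is illustrative only; the threshold value and action names are assumptions, not values from the patent.

```python
# Hypothetical sketch of the proximity decision: sufficient reflected
# infrared light means an object (e.g. an ear) is near the phone.

REFLECTION_THRESHOLD = 0.6  # assumed normalized photodiode reading

def screen_action(reflected_light: float) -> str:
    """Decide whether the screen should be turned off for a nearby object."""
    if reflected_light >= REFLECTION_THRESHOLD:
        return "screen_off"  # sufficient reflected light: object nearby
    return "screen_on"       # insufficient reflected light: no object nearby
```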
The ambient light sensor 180L is used to sense the ambient light level. The mobile phone 100 may adaptively adjust the display screen brightness according to the perceived ambient light level. The ambient light sensor can also be used to automatically adjust the white balance when taking a picture. The ambient light sensor may also cooperate with the proximity light sensor to detect whether the cell phone 100 is in a pocket to prevent inadvertent contact.
The fingerprint sensor 180H is used to collect fingerprints. The mobile phone 100 can use the collected fingerprint characteristics to unlock the device, access an application lock, take a photo with a fingerprint, answer an incoming call with a fingerprint, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the handset 100 implements a temperature processing strategy using the temperature detected by the temperature sensor. For example, when the temperature reported by the temperature sensor exceeds a threshold, the mobile phone 100 throttles the performance of a processor located near the temperature sensor, so as to reduce power consumption and implement thermal protection.
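A minimal sketch of such a thermal-protection strategy is shown below; the threshold and throttling factor are illustrative assumptions rather than values from the patent.

```python
# Hypothetical sketch: throttle the nearby processor's performance
# when the reported temperature exceeds a threshold.

TEMP_THRESHOLD_C = 45.0  # assumed threshold in degrees Celsius

def performance_scale(temp_c: float) -> float:
    """Return a performance multiplier for the reported temperature."""
    if temp_c > TEMP_THRESHOLD_C:
        return 0.5  # throttle to reduce power consumption and heat
    return 1.0      # normal operation
```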
The touch sensor 180K is also referred to as a "touch panel". It may be disposed on the display screen and is used to detect touch operations on or near it. The detected touch operation may be passed to the application processor to determine the touch event type, and a corresponding visual output is provided via the display screen.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor may acquire the vibration signal of a bone mass vibrated by the human voice. The bone conduction sensor can also contact the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor may also be disposed in the earpiece. The audio module 170 may parse a voice signal from the vibration signal of the vibrating bone mass obtained by the bone conduction sensor, thereby implementing a voice function. The application processor can parse heart rate information from the blood pressure pulsation signal acquired by the bone conduction sensor, thereby implementing a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys may be mechanical keys or touch keys. The mobile phone 100 receives key input and generates key signal input related to user settings and function control of the mobile phone 100.
The motor 191 may generate a vibration cue. The motor can be used for incoming call vibration prompt and can also be used for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The touch operation on different areas of the display screen can also correspond to different vibration feedback effects. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a Subscriber Identity Module (SIM) card. A SIM card can be attached to or detached from the mobile phone 100 by being inserted into or pulled out of the SIM card interface. The handset 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface can support a Nano SIM card, a Micro SIM card, a standard SIM card, and the like. Multiple cards can be inserted into the same SIM card interface at the same time; the types of the cards may be the same or different. The SIM card interface may also be compatible with different types of SIM cards and with external memory cards. The mobile phone 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the mobile phone 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the mobile phone 100 and cannot be separated from it.
In the embodiments of this application, with reference to fig. 1 to 5, when the wearable device 11 detects a touch operation on the fingerprint sensor 201, the fingerprint sensor 201 may acquire, at a certain frequency, N (N > 1) consecutive images formed on the fingerprint sensor 201. Then, by comparing the variation of parameters such as the size and position of the fingerprint pattern across the N consecutive images, the wearable device 11 can recognize the gesture performed by the user on the fingerprint sensor 201. In this way, the wearable device 11 can send a corresponding operation instruction to the terminal 12 according to the recognized gesture, thereby enabling the wearable device 11 to control related functions in the terminal 12.
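As a rough sketch of the comparison described above (not the patent's actual algorithm), the centroid of the fingerprint pattern can be tracked across the N consecutive images and the dominant displacement classified as a gesture. The image representation and the movement threshold are assumptions for illustration.

```python
# Hypothetical sketch: each image is a 2-D grid of capacitance readings
# in which nonzero cells belong to the fingerprint pattern. The pattern's
# centroid is tracked from the first to the last frame, and the dominant
# displacement direction is reported as the gesture.

from typing import List, Tuple

Image = List[List[int]]

def centroid(img: Image) -> Tuple[float, float]:
    """Centroid (row, col) of the nonzero fingerprint pixels."""
    pts = [(r, c) for r, row in enumerate(img) for c, v in enumerate(row) if v]
    n = len(pts)
    return (sum(r for r, _ in pts) / n, sum(c for _, c in pts) / n)

def recognize_slide(images: List[Image]) -> str:
    """Classify the gesture from N consecutive fingerprint images."""
    (r0, c0), (r1, c1) = centroid(images[0]), centroid(images[-1])
    dr, dc = r1 - r0, c1 - c0
    if abs(dr) < 1 and abs(dc) < 1:   # assumed movement threshold: 1 cell
        return "tap"
    if abs(dc) >= abs(dr):
        return "slide_right" if dc > 0 else "slide_left"
    return "slide_down" if dr > 0 else "slide_up"
```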
That is to say, in the embodiments of this application, taking advantage of the small size and high integration level of the sensing units in the fingerprint sensor, a fingerprint sensor is arranged in the wearable device to recognize gestures performed by the user, replacing the large touchpad used in a traditional wearable device for gesture recognition and thereby improving the integration level of the wearable device. Meanwhile, because the fingerprint sensor contains a larger number of sensing units, the sensitivity and accuracy with which the wearable device recognizes the user's gestures are improved, and the probability that the wearable device and the terminal are triggered by mistake is reduced.
For convenience of understanding, a touch method of a wearable device provided in the embodiments of the present application is specifically described below with reference to the accompanying drawings. In the following embodiments, a mobile phone is taken as a terminal, and a bluetooth headset is taken as a wearable device for illustration.
Fig. 6 is a flowchart illustrating a touch method of a wearable device according to an embodiment of the present disclosure. As shown in fig. 6, the touch method may include:
S601: the mobile phone and the Bluetooth headset establish a Bluetooth connection.
When the user wishes to use the Bluetooth headset, the Bluetooth function of the Bluetooth headset may be turned on. At this point, the Bluetooth headset may broadcast a pairing request. If Bluetooth is enabled on the mobile phone, the mobile phone may receive the pairing broadcast and prompt the user that a related Bluetooth device has been scanned. After the user selects the Bluetooth headset on the mobile phone, the mobile phone can pair with the Bluetooth headset and establish a Bluetooth connection. Subsequently, the mobile phone and the Bluetooth headset can communicate over the Bluetooth connection. Of course, if the mobile phone and the Bluetooth headset have been successfully paired before, the mobile phone may automatically establish a Bluetooth connection with the scanned Bluetooth headset.
In addition, if the headset used by the user has Wi-Fi functionality, the user can also operate the handset to establish a Wi-Fi connection with the headset. Or, if the earphone used by the user is a wired earphone, the user may also insert a plug of an earphone cord into a corresponding earphone interface of the mobile phone to establish a wired connection, which is not limited in this embodiment of the present application.
After the mobile phone establishes a Bluetooth connection with the Bluetooth headset, the mobile phone may treat the connected Bluetooth headset as a legitimate Bluetooth device. For example, the mobile phone may store an identifier of the legitimate Bluetooth device (e.g., the MAC address of the Bluetooth headset) locally. Subsequently, when the mobile phone receives an operation instruction or data sent by a Bluetooth device, it can determine, based on the stored identifier, whether the Bluetooth device it is communicating with is a legitimate Bluetooth device. When the mobile phone determines that an illegitimate Bluetooth device is currently sending an operation instruction or data to it, the mobile phone can discard the operation instruction or data, improving the security of the mobile phone during use. Of course, a mobile phone may manage one or more legitimate Bluetooth devices. As shown in fig. 7, the user can enter the legitimate-device management interface 701 from the settings function and add or delete legitimate Bluetooth devices in the management interface 701.
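The legitimate-device check described above can be sketched as a small registry keyed by MAC address. This is a hypothetical illustration; the class and method names are assumptions, and real handsets implement this inside the Bluetooth stack.

```python
# Hypothetical sketch: the phone stores identifiers (e.g. MAC addresses)
# of legitimate Bluetooth devices and discards instructions from others.

class LegitimateDeviceRegistry:
    def __init__(self):
        self._allowed = set()

    def add(self, mac: str) -> None:
        """Register a paired headset as a legitimate device."""
        self._allowed.add(mac.lower())

    def remove(self, mac: str) -> None:
        """Delete a device from the legitimate-device list."""
        self._allowed.discard(mac.lower())

    def accept(self, sender_mac: str, instruction: str):
        """Return the instruction if the sender is legitimate, else discard."""
        if sender_mac.lower() not in self._allowed:
            return None  # instruction from an illegitimate device is dropped
        return instruction
```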
S602 (optional): in response to a preset operation input by the user on the Bluetooth headset, the Bluetooth headset wakes the fingerprint sensor into the working state.
To reduce the power consumption of the Bluetooth headset, the fingerprint sensor on the Bluetooth headset may be set, by default, to a lower-power sleep state after the Bluetooth headset is started. In the sleep state, the Bluetooth headset may scan the electrical signals generated by the sensing units in the fingerprint sensor at a low operating frequency, or the Bluetooth headset may temporarily turn off the fingerprint sensor (e.g., power it down).
In addition, the Bluetooth headset may be preset with one or more operations for waking up the fingerprint sensor. For example, the Bluetooth headset may be preset with a wake-up word (e.g., "hello, small E"). When the Bluetooth headset detects through the microphone that voice information input by the user includes the wake-up word, the user has performed the preset operation for waking the fingerprint sensor, and the Bluetooth headset can switch the fingerprint sensor from the sleep state to the working state. Alternatively, the Bluetooth headset may be preset with a tap operation (e.g., tapping twice) for waking up the fingerprint sensor. When the Bluetooth headset detects through the acceleration sensor that the user has performed the tap operation, it can switch the fingerprint sensor from the sleep state to the working state. Alternatively, the Bluetooth headset may be preset with a touch operation (e.g., a click operation) for waking up the fingerprint sensor. If the fingerprint sensor in the sleep state detects that the user has performed the touch operation, it can switch from the sleep state to the working state. After the fingerprint sensor enters the working state, it may begin scanning its sensing units at a relatively high working frequency (e.g., 10 Hz) to capture the images formed on the fingerprint sensor.
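The wake-up logic above can be sketched as a small state machine. The event names, the wake-word string, and the scan frequencies are assumptions for illustration, not values confirmed by the patent (other than the 10 Hz example).

```python
# Hypothetical sketch: switch the fingerprint sensor from the sleep state
# to the working state on any of the preset wake-up operations.

WAKE_EVENTS = {
    ("voice", "hello, small E"),  # wake-up word detected via microphone
    ("tap", 2),                   # double tap detected via accelerometer
    ("touch", "click"),           # click detected by the sensor itself
}

class FingerprintSensor:
    def __init__(self):
        self.state = "sleep"
        self.scan_hz = 1          # assumed low-frequency scan while asleep

    def on_event(self, kind: str, value) -> None:
        """Wake the sensor when a preset wake-up operation occurs."""
        if self.state == "sleep" and (kind, value) in WAKE_EVENTS:
            self.state = "working"
            self.scan_hz = 10     # higher working frequency, e.g. 10 Hz
```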
Of course, after the Bluetooth headset is started, the fingerprint sensor of the Bluetooth headset may instead be set to the working state by default; in that case, the Bluetooth headset may skip step S602 and perform the following steps S603-S606.
In other embodiments of this application, after the Bluetooth headset is started, if no user operation on the Bluetooth headset is detected within a preset time, the Bluetooth headset may automatically enter the sleep state. For example, the Bluetooth headset may enter a Bluetooth Low Energy (BLE) mode, thereby further reducing its power consumption. When the Bluetooth headset enters the sleep mode, some sensors (such as the acceleration sensor or the microphone) may continue to operate at a lower frequency; after the preset operation input by the user on the Bluetooth headset is received, the Bluetooth headset can switch from the sleep mode to the working mode and then perform the following steps S603-S606.
S603, the Bluetooth headset collects N continuous images formed on the fingerprint sensor, at least one of the N continuous images contains a fingerprint pattern, and N is an integer greater than 1.
After the fingerprint sensor enters the working state, the Bluetooth headset may use the fingerprint sensor to continuously acquire, at a certain working frequency, N consecutive images formed on the acquisition surface of the fingerprint sensor. Because the user's finger is a conductive object, the finger contacting and leaving the fingerprint sensor changes the corresponding capacitance signal in the fingerprint sensor. Therefore, after entering the working state, the fingerprint sensor can sense the moments at which the user's finger contacts and leaves it. The Bluetooth headset may then continuously capture the images formed on the fingerprint sensor from the time the sensor senses that the user's finger has touched it until the sensor senses that the finger has left it, thereby obtaining the N consecutive images.
Alternatively, because the user's finger does not always stay in contact with the fingerprint sensor while inputting a gesture (for example, during a double-click the finger may briefly leave the sensor between the two clicks), the fingerprint sensor may continue to operate for a certain time (e.g., 2 seconds) after sensing that the finger has left, that is, it continues to capture the images formed on the sensor within those 2 seconds. If no finger contact is sensed within the 2 seconds, the gesture input by the user this time has ended, and the Bluetooth headset may control the fingerprint sensor to re-enter the dormant state, thereby reducing the power consumption of the Bluetooth headset. In this case, the N consecutive images are acquired continuously from the time the fingerprint sensor senses that the finger has touched it until 2 seconds after the sensor senses that the finger has left it, and some of the N consecutive images may not include the fingerprint pattern.
Alternatively, the Bluetooth headset may start continuously acquiring the images formed on the fingerprint sensor as soon as the sensor enters the working state, and stop when the sensor senses that the user's finger has left it (or a preset time after the finger has left it), so as to obtain the N consecutive images. Because the user may not touch the fingerprint sensor immediately after it enters the working state, the first few of the N consecutive images may not include the fingerprint pattern.
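The capture-window variants above share one idea: collect frames from the first finger contact until the finger has been absent for some grace period. A minimal sketch, assuming each frame is a (has_finger, image) pair and a frame count stands in for the 2-second timeout:

```python
def collect_session(frames, grace_frames):
    """Collect consecutive images from the first frame in which a finger
    is sensed until `grace_frames` frames pass with no finger contact.
    Each frame is (has_finger: bool, image). This is a sketch of the
    capture window described above; frame-rate details are assumptions."""
    session, started, idle = [], False, 0
    for has_finger, image in frames:
        if not started:
            if has_finger:
                started = True
                session.append(image)
                idle = 0
            continue
        session.append(image)
        idle = 0 if has_finger else idle + 1
        if idle >= grace_frames:       # finger absent for the grace period
            break
    return session
```

Note that frames captured during the grace period are kept, matching the text's observation that the N consecutive images may include images without a fingerprint pattern.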
Furthermore, because the fingerprint of an ordinary user's finger is a regular pattern formed by ridges and valleys, the Bluetooth headset may learn the features of common fingerprints in advance from samples of fingerprint patterns. The fingerprint feature may be stored in the Bluetooth headset in the form of a model or a vector. The fingerprint feature may indicate the fingerprint of a particular user (e.g., legitimate user A), or may indicate features common to the fingerprints of most users. Of course, the fingerprint feature may also be obtained by the Bluetooth headset from another device (for example, a mobile phone or a cloud server), which is not limited in this embodiment of the present application.
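As a hedged illustration of matching an image against a stored fingerprint feature vector, the sketch below uses cosine similarity as a stand-in for whatever model or vector the headset actually stores; the threshold value is an assumption:

```python
def has_fingerprint_feature(image_vec, template_vec, threshold=0.8):
    """Decide whether an image exhibits the pre-learned fingerprint
    feature. Cosine similarity against a stored template vector is an
    illustrative stand-in for the headset's real model; the 0.8
    threshold is an assumption."""
    dot = sum(a * b for a, b in zip(image_vec, template_vec))
    na = sum(a * a for a in image_vec) ** 0.5
    nb = sum(b * b for b in template_vec) ** 0.5
    if na == 0 or nb == 0:
        return False                   # empty image or template: no match
    return dot / (na * nb) >= threshold
```

The same predicate can serve all three false-touch strategies described next (per-image, periodic sampling, or counting over N images).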
Therefore, the Bluetooth headset can check whether the fingerprint feature is present in one or more of the images collected by the fingerprint sensor, so as to determine whether the touch operation on the fingerprint sensor is a false touch operation. If the Bluetooth headset determines that the touch operation is not a false touch operation, it may continue to perform the following steps S604-S606; otherwise, the Bluetooth headset may discard the acquired images and re-enter the dormant state to reduce its power consumption.
For example, from the time the fingerprint sensor starts capturing images, the Bluetooth headset may determine in real time whether each captured image has the above fingerprint feature (i.e., whether it includes a fingerprint pattern). If M (M ≤ N) consecutive images do not include the fingerprint pattern, the user may not have touched the fingerprint sensor, or an object other than the user's finger (such as hair, clothing, or the face) may have accidentally touched it; therefore, the Bluetooth headset does not need to continue capturing the images formed on the fingerprint sensor, and does not need to perform the following steps S604-S606.
Alternatively, the Bluetooth headset may periodically detect whether the images collected by the fingerprint sensor have the fingerprint feature. For example, after the fingerprint sensor starts capturing images, the Bluetooth headset may randomly select one image out of every 5 captured images and detect whether it has the fingerprint feature. For another example, the Bluetooth headset may randomly select one image from those captured in each 500 ms interval and detect whether it has the fingerprint feature. If an image with the fingerprint feature is detected, it is determined that the touch operation occurring on the fingerprint sensor this time is not a false touch operation, and the Bluetooth headset may continue to capture the images formed on the fingerprint sensor and perform the following steps S604-S606.
Alternatively, after acquiring the N consecutive images collected by the fingerprint sensor, the Bluetooth headset may count the number of images among them that contain the fingerprint pattern. If the number of images containing the fingerprint pattern is smaller than a first threshold (for example, smaller than 3), the user may have touched the fingerprint sensor by mistake; otherwise, the Bluetooth headset may determine that the touch operation occurring on the fingerprint sensor this time is not a false touch operation. Likewise, if the number of images that do not include the fingerprint pattern in the N consecutive images is greater than a second threshold (for example, greater than 10), the user may have touched the fingerprint sensor by mistake; otherwise, the Bluetooth headset may determine that the touch operation is not a false touch operation.
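The count-based false-touch check can be sketched directly. The thresholds (fewer than 3 fingerprint images, or more than 10 non-fingerprint images) are the illustrative values from the text, not fixed parameters:

```python
def is_false_touch(images_have_fp, first_threshold=3, second_threshold=10):
    """Count-based false-touch check sketched from the description:
    too few fingerprint frames (< first_threshold), or too many
    non-fingerprint frames (> second_threshold), suggests a false
    touch. `images_have_fp` is one bool per image in the N consecutive
    images; threshold defaults are the text's illustrative values."""
    with_fp = sum(1 for has_fp in images_have_fp if has_fp)
    without_fp = len(images_have_fp) - with_fp
    return with_fp < first_threshold or without_fp > second_threshold
```

A True result means the headset can discard the images and return the sensor to the dormant state; False means it proceeds to steps S604-S606.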
That is to say, in this embodiment of the application, only when the Bluetooth headset recognizes that the touch operation received by the fingerprint sensor includes the input of a fingerprint does it continue to acquire the images formed on the fingerprint sensor, recognize the control gesture corresponding to the touch operation, and send the recognized control gesture or operation instruction to the terminal. Otherwise, if the touch operation received by the fingerprint sensor does not contain the input of a fingerprint, that is, the touch operation input by the user this time is a false touch operation, the Bluetooth headset does not need to perform the following steps S604-S606. This prevents a false touch on the Bluetooth headset from waking up the mobile phone, and also reduces the power consumption of both the Bluetooth headset and the mobile phone.
It should be noted that if, while acquiring the N consecutive images, the Bluetooth headset identifies that the touch operation input by the user this time is a false touch operation (that is, the touch operation does not include the input of a fingerprint), the Bluetooth headset may stop acquiring the images formed on the fingerprint sensor; if it identifies the false touch operation while recognizing the touch operation, it may stop recognizing the touch operation; and if it identifies the false touch operation after recognizing the control gesture corresponding to the touch operation, it may stop sending the recognized control gesture or operation instruction to the mobile phone.
And S604, the Bluetooth headset identifies the control gesture input by the user according to the fingerprint patterns in the N continuous images.
In step S604, after the Bluetooth headset obtains the N consecutive images collected by the fingerprint sensor, it can identify how the fingerprint pattern changes across the N consecutive images. For example, the size of the fingerprint pattern may gradually decrease or even disappear completely, and its position may move continuously across the images. The Bluetooth headset may then determine, according to the changes in the fingerprint pattern, which specific control gesture the user input on the fingerprint sensor in step S603.
For example, the bluetooth headset may recognize the control gesture input by the user on the fingerprint sensor according to the variation of the position and size of the fingerprint pattern in the N consecutive images. As shown in (a) of fig. 8, if a fingerprint pattern 801 is included in X (X is less than a first preset value) consecutive images among the above N consecutive images, and the size of the fingerprint pattern 801 does not significantly change, it may be determined that the user has input a click operation on the fingerprint sensor. As shown in (b) of fig. 8, if there are consecutive Y (Y is greater than the second preset value, which is greater than or equal to the first preset value) images among the N consecutive images including the fingerprint pattern 801, and the size of the fingerprint pattern 801 does not significantly change, it may be determined that the user has input the long press operation on the fingerprint sensor. As shown in (c) of fig. 8, if Z (Z ≦ N) consecutive images among the N consecutive images include the fingerprint pattern 801, and the fingerprint pattern 801 gradually moves from the point a to the point B, and the distance between the point a and the point B is greater than the distance threshold, it may be determined that the user has input the slide operation on the fingerprint sensor.
Further, if the bluetooth headset determines that the control gesture input by the user on the fingerprint sensor is a sliding operation, the bluetooth headset may further determine parameters such as a sliding direction and a motion trajectory of the sliding operation according to the positions of the fingerprint patterns 801 in the N consecutive images. For example, the sliding direction of the sliding operation may be upward or downward, and the movement locus of the sliding operation may be a closed figure such as a circle.
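The click / long-press / slide classification described above and in fig. 8 can be sketched as follows, assuming each frame is either None (no fingerprint) or an (x, y) fingerprint position; the values of x_max, y_min, and dist_threshold stand in for the first preset value, second preset value, and distance threshold, and are assumptions:

```python
def classify_gesture(frames, x_max=3, y_min=8, dist_threshold=5.0):
    """Classify the control gesture from N consecutive frames, each
    either None (no fingerprint pattern) or an (x, y) fingerprint
    position. Thresholds are illustrative stand-ins for the preset
    values and distance threshold in the text."""
    touched = [p for p in frames if p is not None]
    if not touched:
        return "none"
    (x0, y0), (x1, y1) = touched[0], touched[-1]
    displacement = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    if displacement > dist_threshold:
        return "slide"                 # fingerprint moved from point A to B
    if len(touched) >= y_min:
        return "long_press"            # many frames, position unchanged
    if len(touched) < x_max:
        return "click"                 # few frames, position unchanged
    return "unknown"
```

For a slide, the per-frame positions in `touched` would further give the sliding direction and motion trajectory described in the text.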
Because the sensing units in the fingerprint sensor are small and highly integrated, the fingerprint pattern in each image that the Bluetooth headset acquires with the fingerprint sensor is relatively clear and accurate. The Bluetooth headset can therefore recognize the control gesture input by the user with higher accuracy and sensitivity based on these fingerprint patterns, and the fingerprint sensor is small enough that it does not increase the size of the Bluetooth headset.
The process of recognizing the control gesture by the bluetooth headset may be executed by a computing module (e.g., a CPU) in the bluetooth headset. Alternatively, the computing module may be integrated into a fingerprint sensor of the bluetooth headset, and the fingerprint sensor may perform the steps S603-S604.
In addition, after the bluetooth headset recognizes the control gesture, the following steps S605a or S605b may be performed.
And S605a, the Bluetooth headset sends the control gesture to the mobile phone.
In step S605a, the bluetooth headset may send the control gesture (e.g., long press operation) recognized in step S604 to the mobile phone, and the mobile phone may determine a corresponding operation command according to the control gesture, and execute the operation command corresponding to the control gesture according to step S606 described below.
For example, after the Bluetooth headset recognizes that the control gesture input by the user on the fingerprint sensor is a long-press operation, it may send an identifier of the long-press operation (e.g., 01) to the mobile phone. The mobile phone stores in advance, for each application, the correspondence between different control gestures on the Bluetooth headset and different operation instructions, so the mobile phone can determine from this correspondence that the operation instruction corresponding to the long-press operation in the running application is an instruction to pause playback. The mobile phone can then execute this instruction, pausing the audio being played. In this way, the user can control the running application on the mobile phone by performing a control gesture on the fingerprint sensor of the Bluetooth headset.
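A minimal sketch of the phone-side lookup: a per-application table mapping gesture identifiers to operation instructions. The gesture identifier "01" follows the example in the text; the application and instruction names are assumptions:

```python
# Illustrative phone-side correspondence table: per application, which
# operation instruction each control gesture maps to. All names except
# the "01" long-press identifier are assumptions.
GESTURE_MAP = {
    "music_player": {"01": "pause_playback", "slide_up": "volume_up"},
    "call":         {"01": "hang_up"},
}

def resolve_instruction(running_app, gesture_id):
    """Return the operation instruction bound to a gesture in the
    currently running application, or None if the gesture is unbound."""
    return GESTURE_MAP.get(running_app, {}).get(gesture_id)
```

The same lookup run on the headset side corresponds to step S605b, where the headset resolves the instruction itself before sending it.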
And S605b, the Bluetooth headset sends an operation instruction corresponding to the control gesture to the mobile phone.
In step S605b, the bluetooth headset may store the corresponding relationship between different control gestures and different operation commands in advance. For example, the operation command corresponding to the upward sliding operation is a command for increasing the volume, and the operation command corresponding to the downward sliding operation is a command for decreasing the volume. Then, after the bluetooth headset recognizes the control gesture of the user on the fingerprint sensor, the operation instruction corresponding to the control gesture can be determined according to the corresponding relationship. Furthermore, the bluetooth headset may send the operation instruction corresponding to the control gesture to the mobile phone, so that the mobile phone may execute the operation instruction corresponding to the control gesture according to the following step S606.
Optionally, after the Bluetooth connection is established between the Bluetooth headset and the mobile phone, the mobile phone may set its Bluetooth module to a dormant state. For example, if no data is transmitted over the Bluetooth connection for a certain time (e.g., 1 minute), the mobile phone may switch its Bluetooth module to the dormant state to reduce power consumption. For another example, when the mobile phone enters the screen-locked state, it may automatically switch the Bluetooth module to the dormant state. Therefore, before sending the recognized control gesture, or the operation instruction corresponding to the control gesture, to the mobile phone, the Bluetooth headset may first send a wake-up instruction to the mobile phone. In response to the wake-up instruction, the mobile phone switches its Bluetooth module back to the working state, restoring the Bluetooth connection with the Bluetooth headset in advance so that the mobile phone can respond quickly once it receives the operation instruction sent over the connection.
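The wake-before-send sequence can be sketched as below; `BluetoothLink` and its fields are hypothetical stand-ins for the real Bluetooth connection object, and "WAKEUP" is an illustrative payload:

```python
class BluetoothLink:
    """Hypothetical stand-in for the headset's Bluetooth connection."""
    def __init__(self, dormant):
        self.dormant = dormant
        self.sent = []

    def send(self, payload):
        self.sent.append(payload)

def send_with_wakeup(link, payload):
    """If the phone's Bluetooth module may be dormant, send a wake-up
    instruction first so the connection is restored before the control
    gesture or operation instruction arrives (a sketch of the sequence
    described above)."""
    if link.dormant:
        link.send("WAKEUP")            # restore the connection in advance
        link.dormant = False
    link.send(payload)
```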
In other embodiments of the present application, the Bluetooth headset may instead send the N consecutive images to the mobile phone after acquiring them from the fingerprint sensor. The mobile phone then identifies the control gesture input by the user according to the fingerprint patterns in the N consecutive images, and determines and executes the operation instruction corresponding to the recognized control gesture. In this way, the Bluetooth headset does not need to perform gesture recognition itself, which reduces its implementation complexity and power consumption.
And S606, executing the operation instruction corresponding to the control gesture by the mobile phone.
In step S606, if the mobile phone receives an operation instruction sent by the Bluetooth headset, it may execute the operation instruction directly. If the mobile phone receives a recognized control gesture sent by the Bluetooth headset, it may first determine the operation instruction corresponding to the control gesture and then execute it. Because different applications may bind different operation instructions to the same control gesture, after receiving a control gesture from the Bluetooth headset, the mobile phone determines the operation instruction corresponding to that gesture in the currently running application.
In some embodiments of the present application, when sending the control gesture or the operation instruction to the mobile phone, the Bluetooth headset may also send its own device identifier (for example, its MAC address). Because the mobile phone stores the identifiers of legitimate Bluetooth devices that have passed authentication, it can determine from the received device identifier whether the currently connected Bluetooth headset is a legitimate Bluetooth device. If it is, the mobile phone may execute the operation instruction corresponding to the control gesture recognized by the Bluetooth headset; otherwise, the mobile phone may discard the control gesture or operation instruction sent by the Bluetooth headset, thereby avoiding the security risk of the mobile phone being maliciously controlled by an illegitimate Bluetooth device.
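The legitimacy check on the phone side can be sketched as a set lookup on the received device identifier; the MAC address and return values below are illustrative:

```python
# Identifiers of Bluetooth devices that have passed authentication
# (illustrative value; the phone would persist these).
AUTHORIZED_MACS = {"AA:BB:CC:DD:EE:FF"}

def accept_command(device_mac, command):
    """Execute the command only if the sending headset's device
    identifier is on the phone's list of authenticated (legitimate)
    Bluetooth devices; otherwise discard it. A minimal sketch of the
    check described above."""
    if device_mac not in AUTHORIZED_MACS:
        return None                    # discard: unauthenticated device
    return f"executed:{command}"
```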
In addition, as shown in fig. 9, the user may also enter a setting interface 901 in the mobile phone for managing legitimate devices. In the setting interface 901, the user may manually add a new control gesture to, or delete an old control gesture from, the corresponding legitimate device. The user can also manually set the operation instruction corresponding to each control gesture, obtaining a customized touch experience on the legitimate device.
Through the above steps S601-S606, the Bluetooth headset can use the fingerprint sensor to identify the control gesture input by the user on it, and can thereby control the mobile phone to execute the operation instruction corresponding to the control gesture. Because the sensing units in the fingerprint sensor are small and highly integrated, the Bluetooth headset recognizes user gestures with high sensitivity and accuracy; at the same time, the fingerprint sensor can identify false operations triggered by objects other than the user's finger. This touch method therefore reduces the probability of the Bluetooth headset and the mobile phone being falsely triggered, and reduces the power consumption of both devices.
In some embodiments of the present application, a wearable device is disclosed, which may include, as shown in fig. 10: a fingerprint sensor 1001; one or more processors 1002; a memory 1003; a communication interface 1004; one or more application programs (not shown); and one or more computer programs 1005, which may be connected via one or more communication buses 1006. The one or more computer programs 1005 are stored in the memory 1003 and configured to be executed by the one or more processors 1002, and include instructions that may be used to perform the steps in fig. 6 and the corresponding embodiments.
In addition, in connection with the wearable device shown in fig. 2, the processor 1002 may be the computing module 207, the memory 1003 may be the storage module 208, and the communication interface 1004 may be the communication module 205 in fig. 2. Of course, the wearable device shown in fig. 10 may further include the microphone 201, the acceleration sensor 203, the proximity light sensor 204, the speaker 206, and the power supply 209 shown in fig. 2, which is not limited in this embodiment.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or make a contribution to the prior art, or all or part of the technical solutions may be implemented in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: flash memory, removable hard drive, read only memory, random access memory, magnetic or optical disk, and the like.
The above description is only a specific implementation of the embodiments of the present application, but the scope of the embodiments of the present application is not limited thereto, and any changes or substitutions within the technical scope disclosed in the embodiments of the present application should be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (24)

  1. A touch control method of a wearable device, wherein a fingerprint sensor is arranged in the wearable device, the method comprising:
    the wearable device detects a touch operation input by a user by using the fingerprint sensor;
    the wearable device judges whether the touch operation contains input of a fingerprint;
    if the touch operation comprises input of a fingerprint, the wearable device recognizes a control gesture corresponding to the touch operation;
    the wearable device sends the control gesture to a terminal, or the wearable device sends an operation instruction corresponding to the control gesture to the terminal, so that the terminal executes the operation instruction corresponding to the control gesture, wherein a communication connection is established between the wearable device and the terminal.
  2. The touch control method of the wearable device according to claim 1, further comprising, after the wearable device detects a touch operation of a user input using the fingerprint sensor:
    in response to the touch operation, the wearable device collects N continuous images formed on the fingerprint sensor, wherein at least one of the N continuous images contains a fingerprint pattern, and N is an integer greater than 1;
    wherein, the wearable device recognizes a control gesture corresponding to the touch operation, including:
    and the wearable equipment identifies a control gesture corresponding to the touch operation according to the change condition of the fingerprint pattern in the N continuous images.
  3. The touch method of the wearable device according to claim 2, wherein the wearable device collects N consecutive images formed on the fingerprint sensor, and comprises:
    starting to acquire an image formed on the fingerprint sensor at a preset frequency when the fingerprint sensor detects that a user's finger contacts the fingerprint sensor;
    and when the fingerprint sensor detects that the finger of the user leaves the fingerprint sensor, stopping collecting the images formed on the fingerprint sensor to obtain the N continuous images.
  4. The touch method of the wearable device according to claim 2, wherein the wearable device collects N consecutive images formed on the fingerprint sensor, and comprises:
    starting to acquire an image formed on the fingerprint sensor at a preset frequency when the fingerprint sensor detects that a user's finger contacts the fingerprint sensor;
    when the fingerprint sensor detects that the finger of the user leaves the fingerprint sensor, continuously collecting the image formed on the fingerprint sensor within a preset time;
    and if the finger of the user is not detected to contact the fingerprint sensor within the preset time, stopping collecting the images formed on the fingerprint sensor to obtain the N continuous images.
  5. The touch control method of the wearable device according to any one of claims 2 to 4, wherein the wearable device recognizes a control gesture corresponding to the touch operation according to a change condition of the fingerprint pattern in the N consecutive images, including:
    the wearable device identifies images including fingerprint patterns in the N continuous images according to preset fingerprint characteristics;
    and the wearable equipment identifies the control gesture corresponding to the touch operation according to the size change and/or the position change of the fingerprint pattern in the N continuous images.
  6. The touch method of the wearable device according to claim 5, wherein the wearable device recognizes a control gesture corresponding to the touch operation according to a size change and/or a position change of the fingerprint pattern in the N consecutive images, and the method includes:
    when X continuous images in the N continuous images contain the fingerprint pattern and the position of the fingerprint pattern in the X images is the same, the control gesture corresponding to the touch operation is a click operation, and X ≤ N; or,
    when Y continuous images in the N continuous images contain the fingerprint pattern and the position of the fingerprint pattern in the Y images is the same, the control gesture corresponding to the touch operation is a long-press operation, and X < Y ≤ N; or,
    when Z continuous images in the N continuous images contain the fingerprint pattern and the displacement of the fingerprint pattern in the Z images is greater than a distance threshold, the control gesture corresponding to the touch operation is a sliding operation, and Z ≤ N; or,
    when L3 images that do not contain the fingerprint pattern exist between L1 continuous images containing the fingerprint pattern and L2 continuous images containing the fingerprint pattern in the N continuous images, the control gesture corresponding to the touch operation is a double-click operation, L3 is smaller than a preset threshold, and 1 < L1 + L2 + L3 ≤ N.
  7. The touch control method of the wearable device according to any one of claims 1-6, wherein before the wearable device sends the control gesture to the terminal, or before the wearable device sends the operation instruction corresponding to the control gesture to the terminal, the method further comprises:
    the wearable device determines that the touch operation is not a false touch operation.
  8. The touch control method of the wearable device according to claim 7, wherein the wearable device determining that the touch operation is not a false touch operation comprises:
    if P1 images in the N continuous images contain fingerprint patterns and P1 is larger than a preset value, the wearable device determines that the touch operation is not a false touch operation.
  9. The touch method of the wearable device according to claim 8, further comprising:
    if P2 images in the N continuous images contain fingerprint patterns and P2 is smaller than the preset value, the wearable device determines that the touch operation is false touch operation;
    the wearable device switches the fingerprint sensor from an operating state to a dormant state.
  10. The touch control method of the wearable device according to any one of claims 1 to 9, further comprising, before the wearable device receives a touch operation input by a user to the fingerprint sensor:
    in response to a wake-up operation input by a user, the wearable device switches the fingerprint sensor from a sleep state to an active state.
  11. A wearable device comprising a processor, and a memory, a fingerprint sensor, and a communication interface coupled to the processor, wherein,
    the fingerprint sensor is configured to: receiving touch operation input by a user;
    the processor is configured to: judging whether the touch operation contains input of a fingerprint; if the touch operation comprises input of a fingerprint, identifying a control gesture corresponding to the touch operation;
    the communication interface is configured to: send the control gesture to a terminal, or send an operation instruction corresponding to the control gesture to the terminal, so that the terminal executes the operation instruction corresponding to the control gesture, wherein a communication connection is established between the wearable device and the terminal.
  12. The wearable device of claim 11,
    the fingerprint sensor is further configured to: acquiring N continuous images formed on the fingerprint sensor by the touch operation, wherein at least one of the N continuous images contains a fingerprint pattern, and N is an integer greater than 1;
    the processor is configured to identify a control gesture corresponding to the touch operation, and specifically:
    the processor is specifically configured to: and identifying a control gesture corresponding to the touch operation according to the change condition of the fingerprint pattern in the N continuous images.
  13. The wearable device of claim 12, wherein the fingerprint sensor being configured to acquire the N consecutive images formed on the fingerprint sensor by the touch operation specifically comprises:
    the fingerprint sensor is specifically configured to: start acquiring images formed on the fingerprint sensor at a preset frequency when it is detected that a user's finger contacts the fingerprint sensor; and stop acquiring images formed on the fingerprint sensor when it is detected that the user's finger leaves the fingerprint sensor, to obtain the N consecutive images.
  14. The wearable device of claim 12, wherein the fingerprint sensor being configured to acquire the N consecutive images formed on the fingerprint sensor by the touch operation specifically comprises:
    the fingerprint sensor is specifically configured to: start acquiring images formed on the fingerprint sensor at a preset frequency when it is detected that a user's finger contacts the fingerprint sensor; continue acquiring images formed on the fingerprint sensor for a preset time when it is detected that the user's finger leaves the fingerprint sensor; and stop acquiring images formed on the fingerprint sensor if no contact of the user's finger with the fingerprint sensor is detected within the preset time, to obtain the N consecutive images.
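The two capture strategies in claims 13 and 14 differ only in whether sampling stops immediately when the finger lifts or only after a preset time with no contact. A minimal sketch, with the preset time expressed as a number of sample periods and a hypothetical `sensor` driver (a real driver would also sleep one sample period per loop iteration to honour the preset frequency):

```python
def capture_touch_images(sensor, release_samples=20):
    """Capture the N consecutive images a touch forms on the sensor.

    release_samples > 0 gives the claim-14 behaviour: sampling continues
    for that many ticks after the finger lifts, so the brief gap inside a
    double click is still captured. release_samples = 0 gives the
    claim-13 behaviour of stopping as soon as the finger leaves.
    """
    images = []
    touched = False
    missed = 0
    while True:
        if sensor.finger_present():
            touched = True
            missed = 0
            images.append(sensor.read_image())
        elif not touched:
            continue            # still waiting for the first contact
        else:
            missed += 1
            if missed > release_samples:
                break           # preset time elapsed with no new contact
            images.append(sensor.read_image())  # gap frames, no fingerprint
    return images


class ScriptedSensor:
    """Toy sensor driven by a list of booleans (finger present per tick)."""
    def __init__(self, contacts):
        self._contacts = list(contacts)
        self._frame = 0

    def finger_present(self):
        return self._contacts.pop(0) if self._contacts else False

    def read_image(self):
        self._frame += 1
        return self._frame
```

In firmware the wait-for-contact branch would block on a sensor interrupt rather than busy-loop as this sketch does.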
  15. The wearable device according to any one of claims 12-14, wherein the processor being configured to identify the control gesture corresponding to the touch operation specifically comprises:
    the processor is specifically configured to: identify, according to preset fingerprint features, which of the N consecutive images contain the fingerprint pattern; and identify the control gesture corresponding to the touch operation according to the size change and/or position change of the fingerprint pattern across the N consecutive images.
  16. The wearable device of claim 15, wherein
    the processor is specifically configured to: when X consecutive images in the N consecutive images contain the fingerprint pattern and the fingerprint pattern is at the same position in the X images, determine that the control gesture corresponding to the touch operation is a click operation, wherein X is less than or equal to N; or when Y consecutive images in the N consecutive images contain the fingerprint pattern and the fingerprint pattern is at the same position in the Y images, determine that the control gesture corresponding to the touch operation is a long-press operation, wherein Y is greater than X and less than or equal to N; or when Z consecutive images in the N consecutive images contain the fingerprint pattern and the displacement of the fingerprint pattern across the Z images is greater than a distance threshold, determine that the control gesture corresponding to the touch operation is a slide operation, wherein Z is less than or equal to N; or when L3 images containing no fingerprint pattern lie between L1 consecutive images containing the fingerprint pattern and L2 consecutive images containing the fingerprint pattern in the N consecutive images, determine that the control gesture corresponding to the touch operation is a double-click operation, wherein L3 is less than a preset threshold and the sum of L1, L2, and L3 is greater than 1 and less than or equal to N.
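The per-gesture rules of claim 16 amount to run-length analysis over the N captured images. A sketch under assumed threshold values (the frame counts and distances are illustrative choices, not taken from the patent, and "same position" is approximated as displacement below the distance threshold):

```python
import math


def classify_gesture(frames, long_press_frames=30,
                     dist_threshold=5.0, gap_threshold=10):
    """Classify a control gesture from per-frame fingerprint positions.

    frames: one entry per captured image, either an (x, y) fingerprint
    centre or None when the image holds no fingerprint pattern.
    """
    # Collect runs of consecutive fingerprint-bearing frames as
    # (start_index, points); a trailing sentinel closes the final run.
    runs, start = [], None
    for i, p in enumerate(list(frames) + [None]):
        if p is not None and start is None:
            start = i
        elif p is None and start is not None:
            runs.append((start, list(frames[start:i])))
            start = None
    if not runs:
        return "none"
    # Double click: two runs (L1, L2) separated by a short
    # fingerprint-free gap (L3 below the preset threshold).
    if len(runs) >= 2:
        gap = runs[1][0] - (runs[0][0] + len(runs[0][1]))
        if gap < gap_threshold:
            return "double_click"
    pts = runs[0][1]
    # Slide: fingerprint moved farther than the distance threshold.
    if math.dist(pts[0], pts[-1]) > dist_threshold:
        return "slide"
    # Stationary contact: duration separates click from long press.
    return "long_press" if len(pts) >= long_press_frames else "click"
```

The same position lists could come straight out of the capture step, with each image reduced to the centroid of its detected fingerprint pattern.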
  17. The wearable device according to any one of claims 12-16, wherein
    the processor is further configured to: determine that the touch operation is not a false touch operation.
  18. The wearable device of claim 17, wherein the processor being configured to determine that the touch operation is not a false touch operation specifically comprises:
    the processor is specifically configured to: determine that the touch operation is not a false touch operation if P1 images in the N consecutive images contain the fingerprint pattern and P1 is greater than a preset value.
  19. The wearable device of claim 18, wherein
    the processor is further configured to: determine that the touch operation is a false touch operation if P2 images in the N consecutive images contain the fingerprint pattern and P2 is less than the preset value; and switch the fingerprint sensor from the active state to the sleep state.
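The false-touch rule of claims 18 and 19 is a simple count-and-threshold check over the captured images; a sketch with an illustrative preset value (the value 3 is an assumption, not specified by the patent):

```python
def is_false_touch(images, preset_value=3):
    """Claims 18-19 sketch: count the images that contain a fingerprint
    pattern (modelled here as non-None entries). A count below the
    preset value means the contact is treated as accidental, after which
    the device would return the fingerprint sensor to its sleep state.
    """
    fingerprint_count = sum(1 for img in images if img is not None)
    return fingerprint_count < preset_value
```

This filter runs before gesture classification, so brief brushes against the sensor (a sleeve, a stray touch) never reach the terminal as control gestures.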
  20. The wearable device of any one of claims 12-19, wherein
    the processor is further configured to: switch the fingerprint sensor from the sleep state to the active state if a wake-up operation input by the user is detected.
  21. The wearable device according to any one of claims 11-20, wherein the fingerprint sensor is disposed on a side of the wearable device that is not in contact with the user when the wearable device is worn; and the wearable device is a Bluetooth headset, smart glasses, or a smart watch.
  22. A computer-readable storage medium having instructions stored therein that, when run on a wearable device, cause the wearable device to perform the touch control method of the wearable device according to any one of claims 1-10.
  23. A computer program product comprising instructions that, when the computer program product is run on a wearable device, cause the wearable device to perform the touch control method of the wearable device according to any one of claims 1-10.
  24. A touch control system, comprising a wearable device and a terminal, wherein a fingerprint sensor is disposed in the wearable device, and a communication connection is established between the wearable device and the terminal; wherein
    the wearable device is configured to: detect, by using the fingerprint sensor, a touch operation input by a user; determine whether the touch operation contains a fingerprint input; if the touch operation contains a fingerprint input, identify a control gesture corresponding to the touch operation; and send the control gesture to the terminal, or send an operation instruction corresponding to the control gesture to the terminal; and
    the terminal is configured to: receive the control gesture sent by the wearable device, or receive the operation instruction corresponding to the control gesture sent by the wearable device; and execute the operation instruction corresponding to the control gesture.
CN201880094859.XA 2018-07-27 2018-07-27 Touch control method of wearable device, wearable device and system Active CN112334860B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/097675 WO2020019355A1 (en) 2018-07-27 2018-07-27 Touch control method for wearable device, and wearable device and system

Publications (2)

Publication Number Publication Date
CN112334860A true CN112334860A (en) 2021-02-05
CN112334860B CN112334860B (en) 2023-06-02

Family

ID=69182163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880094859.XA Active CN112334860B (en) 2018-07-27 2018-07-27 Touch control method of wearable device, wearable device and system

Country Status (2)

Country Link
CN (1) CN112334860B (en)
WO (1) WO2020019355A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113709617A (en) * 2021-08-27 2021-11-26 Oppo广东移动通信有限公司 Wireless earphone control method and device, wireless earphone and storage medium
CN115562472A (en) * 2022-02-11 2023-01-03 荣耀终端有限公司 Gesture interaction method, medium and electronic equipment

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
CN111736688A (en) * 2020-02-27 2020-10-02 珠海市杰理科技股份有限公司 Bluetooth headset, system and gesture recognition method thereof
CN111814586A (en) * 2020-06-18 2020-10-23 维沃移动通信有限公司 Fingerprint module control method and device, electronic equipment and readable storage medium
CN115665313A (en) * 2021-07-09 2023-01-31 华为技术有限公司 Device control method and electronic device

Citations (10)

Publication number Priority date Publication date Assignee Title
CN1722152A (en) * 2004-07-05 2006-01-18 日本电气英富醍株式会社 Fingerprint reading method, fingerprint reading system and program
US20110038513A1 * 2004-04-23 2011-02-17 Sony Corporation Fingerprint image reconstruction based on motion estimate across a narrow fingerprint sensor
CN104320591A (en) * 2014-11-21 2015-01-28 广东欧珀移动通信有限公司 Method and device for controlling front-rear switching of camera and intelligent terminal
CN104536561A (en) * 2014-12-10 2015-04-22 金硕澳门离岸商业服务有限公司 Wearable device and method for controlling terminal device in operation by wearable device
CN104700079A (en) * 2015-03-06 2015-06-10 南昌欧菲生物识别技术有限公司 Fingerprint recognition module and touch screen based on fingerprint recognition
CN105354544A (en) * 2015-10-29 2016-02-24 小米科技有限责任公司 Fingerprint identification method and apparatus
CN105739897A (en) * 2016-01-29 2016-07-06 宇龙计算机通信科技(深圳)有限公司 Touch operation processing method and device, and terminal
CN105938403A (en) * 2016-06-14 2016-09-14 无锡天脉聚源传媒科技有限公司 Cursor control method and device based on fingerprint recognition
CN106462342A (en) * 2016-09-29 2017-02-22 深圳市汇顶科技股份有限公司 Fingerprint navigation method and fingerprint navigation signal generation device
CN106469265A (en) * 2016-09-30 2017-03-01 北京小米移动软件有限公司 Electronic equipment awakening method, device and electronic equipment

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
CN104581480A (en) * 2014-12-18 2015-04-29 周祥宇 Touch control headset system and touch control command recognition method
CN104503577B (en) * 2014-12-19 2017-10-24 广东欧珀移动通信有限公司 A kind of method and device by wearable device control mobile terminal
CN106062778B (en) * 2016-04-01 2019-05-07 深圳市汇顶科技股份有限公司 Fingerprint identification method, device and terminal
CN106547465A (en) * 2016-10-14 2017-03-29 青岛海信移动通信技术股份有限公司 A kind of fast operating method and mobile terminal of mobile terminal
CN107748648A (en) * 2017-10-27 2018-03-02 维沃移动通信有限公司 Prevent the method and terminal device of fingerprint sensor false triggering

Non-Patent Citations (1)

Title
宋伟刚 (Song Weigang): "Mechatronics Engineering Experiment Course (Advanced)" (《机械电子工程实验教程(高)》), 30 June 2009, Beijing: Metallurgical Industry Press *

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN113709617A (en) * 2021-08-27 2021-11-26 Oppo广东移动通信有限公司 Wireless earphone control method and device, wireless earphone and storage medium
CN115562472A (en) * 2022-02-11 2023-01-03 荣耀终端有限公司 Gesture interaction method, medium and electronic equipment
CN115562472B (en) * 2022-02-11 2023-09-22 荣耀终端有限公司 Gesture interaction method, medium and electronic equipment

Also Published As

Publication number Publication date
CN112334860B (en) 2023-06-02
WO2020019355A1 (en) 2020-01-30

Similar Documents

Publication Publication Date Title
EP3822831B1 (en) Voice recognition method, wearable device and electronic device
CN110989852B (en) Touch screen, electronic equipment and display control method
CN112334860B (en) Touch control method of wearable device, wearable device and system
CN111369988A (en) Voice awakening method and electronic equipment
CN110730114B (en) Method and equipment for configuring network configuration information
CN110687998A (en) Application management method and device
CN110742580A (en) Sleep state identification method and device
CN112334977B (en) Voice recognition method, wearable device and system
CN110658975A (en) Mobile terminal control method and device
CN114490174A (en) File system detection method, electronic device and computer readable storage medium
CN110691165A (en) Navigation operation method and electronic equipment
CN114221402A (en) Charging method and device of terminal equipment and terminal equipment
CN113467735A (en) Image adjusting method, electronic device and storage medium
CN109285563B (en) Voice data processing method and device in online translation process
CN113129916A (en) Audio acquisition method, system and related device
CN115119336A (en) Earphone connection system, earphone connection method, earphone, electronic device and readable storage medium
CN111309130B (en) Mobile terminal and method for realizing water inflow protection
CN113867520A (en) Device control method, electronic device, and computer-readable storage medium
CN114116610A (en) Method, device, electronic equipment and medium for acquiring storage information
CN113821129A (en) Display window control method and electronic equipment
CN114089902A (en) Gesture interaction method and device and terminal equipment
CN114125144B (en) Method, terminal and storage medium for preventing false touch
CN114115513B (en) Key control method and key device
CN113364067B (en) Charging precision calibration method and electronic equipment
WO2023071497A1 (en) Photographing parameter adjusting method, electronic device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant