WO2020019355A1 - Touch control method for wearable device, and system and wearable device - Google Patents

Touch control method for wearable device, and system and wearable device

Info

Publication number
WO2020019355A1
Authority
WO
WIPO (PCT)
Prior art keywords
wearable device
fingerprint
fingerprint sensor
touch operation
images
Prior art date
Application number
PCT/CN2018/097675
Other languages
English (en)
Chinese (zh)
Inventor
龚树强
龚建勇
仇存收
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Priority to PCT/CN2018/097675
Priority to CN201880094859.XA
Publication of WO2020019355A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00 Reducing energy consumption in communication networks
    • Y02D 30/70 Reducing energy consumption in communication networks in wireless communication networks

Definitions

  • The present application relates to the field of communications technologies, and in particular to a touch control method for a wearable device, a wearable device, and a system.
  • At present, terminals such as mobile phones and tablets support access to accessories such as headphones.
  • Taking a mobile phone and a Bluetooth headset as an example, after a Bluetooth connection is established between the mobile phone and the Bluetooth headset, the user can use the Bluetooth headset to play songs on the mobile phone, talk to contacts, and so on.
  • Generally, a Bluetooth headset is provided with one or more function keys (such as volume up and volume down keys), and the user can control the mobile phone to implement functions related to audio playback by operating these function keys.
  • Some Bluetooth headsets are also equipped with a touchpad, and users can implement the functions of the corresponding function keys by performing preset gestures (for example, tapping or sliding) on the touchpad. For example, if it is detected that the user has performed a tap operation on the touchpad of the Bluetooth headset, the Bluetooth headset may generate a playback instruction corresponding to the tap operation and send the playback instruction to the mobile phone, so that the mobile phone performs the playback function in response to the playback instruction.
  • Because a Bluetooth headset itself is small, the area of the touchpad provided on the Bluetooth headset is correspondingly small, and the number of sensing units (such as sensing capacitors) in the touchpad is also relatively small. This makes it difficult for the Bluetooth headset to achieve high sensitivity and accuracy when it uses these limited sensing units to recognize user gestures.
  • the present application provides a touch method, a wearable device, and a system for a wearable device, which can improve the sensitivity and accuracy of the wearable device when recognizing a user's gesture, and reduce the chance of the wearable device or terminal being triggered by mistake.
  • the present application provides a touch method for a wearable device.
  • The wearable device is provided with a fingerprint sensor. The wearable device uses the fingerprint sensor to detect a touch operation input by the user, and then determines whether the touch operation includes a fingerprint input. If the touch operation includes a fingerprint input, it means that the touch operation is not an accidental touch by the user, so the wearable device recognizes the control gesture corresponding to the touch operation and sends the control gesture to a terminal, or sends an operation instruction corresponding to the control gesture to the terminal, so that the terminal executes the operation instruction corresponding to the control gesture. A communication connection is established between the wearable device and the terminal.
  • Moreover, the fingerprint sensor has a small size and a high degree of integration. Setting the fingerprint sensor in the wearable device to identify the user's touch operation, instead of using the larger touchpad used in traditional wearable devices to recognize gestures performed by the user, improves the integration of the wearable device.
  • In addition, because the number of sensing units in the fingerprint sensor is greater, and the fingerprint sensor can identify whether the user's touch operation is triggered by a finger rather than being an accidental touch, the sensitivity and accuracy of the wearable device when recognizing the user's gesture are increased, while the chance of the wearable device and the terminal being triggered by mistake is reduced.
  • The method further includes: in response to the touch operation, the wearable device collects N consecutive images formed on the fingerprint sensor, where at least one of the N consecutive images includes a fingerprint pattern, and N is an integer greater than 1.
  • The wearable device recognizing the control gesture corresponding to the touch operation includes: the wearable device recognizing the control gesture corresponding to the touch operation according to the change of the fingerprint pattern in the N consecutive images.
  • The wearable device collecting N consecutive images formed on the fingerprint sensor includes: when the fingerprint sensor detects that the user's finger comes into contact with the fingerprint sensor, starting to collect the images formed on the fingerprint sensor at a preset frequency; and when the fingerprint sensor detects that the user's finger leaves the fingerprint sensor, stopping collecting the images formed on the fingerprint sensor, so as to obtain the N consecutive images.
  • Alternatively, the wearable device collecting N consecutive images formed on the fingerprint sensor includes: when the fingerprint sensor detects that the user's finger comes into contact with the fingerprint sensor, starting to collect the images formed on the fingerprint sensor at a preset frequency; when the fingerprint sensor detects that the user's finger leaves the fingerprint sensor, continuing to collect the images formed on the fingerprint sensor within a preset time; and if no touch of the user's finger on the fingerprint sensor is detected within the preset time, stopping collecting the images formed on the fingerprint sensor, so as to obtain the N consecutive images.
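As an illustration of the second collection variant described above, the following minimal sketch shows a polling loop that starts collecting when the finger first touches the sensor and stops once the finger has stayed away for a preset time. The sensor interface (finger_present(), capture_image()) and the timing constants are assumptions made for this sketch, not details from the patent.

```python
import time

SAMPLE_PERIOD_S = 0.1   # assumed "preset frequency" of 10 Hz
GRACE_PERIOD_S = 2.0    # assumed "preset time" to wait for the finger to return

def collect_consecutive_images(sensor, sample_period=SAMPLE_PERIOD_S,
                               grace_period=GRACE_PERIOD_S):
    """Collect N consecutive images from first touch until the finger has
    stayed away from the sensor for `grace_period` seconds."""
    images = []
    # Wait until the finger first contacts the collection surface.
    while not sensor.finger_present():
        time.sleep(sample_period)
    last_contact = time.monotonic()
    while True:
        images.append(sensor.capture_image())  # frame may or may not contain a fingerprint
        if sensor.finger_present():
            last_contact = time.monotonic()
        elif time.monotonic() - last_contact > grace_period:
            break                              # finger did not return within the preset time
        time.sleep(sample_period)
    return images                              # the N consecutive images
```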
  • The wearable device recognizing the control gesture corresponding to the touch operation according to the change of the fingerprint pattern in the N consecutive images specifically includes: the wearable device identifies, according to preset fingerprint characteristics, the images that include a fingerprint pattern among the N consecutive images; further, the wearable device recognizes the control gesture corresponding to the touch operation according to the size change and/or position change of the fingerprint pattern in the N consecutive images. That is, this application uses the principle that a fingerprint sensor can collect a fingerprint pattern to recognize the control gesture input by the user from the continuous changes of the fingerprint pattern.
  • For example, the wearable device recognizing the control gesture corresponding to the touch operation according to the size change and/or position change of the fingerprint pattern in the N consecutive images includes: when there are X consecutive images containing the fingerprint pattern among the N consecutive images and the position of the fingerprint pattern in the X images is the same, the control gesture corresponding to the touch operation is a click operation, X ≤ N; or, when there are Y consecutive images containing the fingerprint pattern among the N consecutive images and the position of the fingerprint pattern in the Y images is the same, the control gesture corresponding to the touch operation is a long-press operation, X < Y ≤ N; or, when there are Z consecutive images containing the fingerprint pattern among the N consecutive images and the displacement of the fingerprint pattern in the Z images is greater than a distance threshold, the control gesture corresponding to the touch operation is a sliding operation, Z ≤ N; or, when, among the N consecutive images, there are L3 images not containing the fingerprint pattern between L1 consecutive images containing the fingerprint pattern and L2 consecutive images containing the fingerprint pattern, the control gesture corresponding to the touch operation is a double-click operation.
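A rough classifier following the rules above might look like the sketch below. The frame representation (each image reduced to a fingerprint centroid tuple, or None when no fingerprint is present) and the numeric thresholds standing in for X, Y, the distance threshold, and the run gap are illustrative assumptions only.

```python
CLICK_MIN = 2            # X: shortest run of fingerprint frames counted as a click
LONG_PRESS_MIN = 10      # Y: run length from which a press is treated as "long"
DISTANCE_THRESHOLD = 5.0 # displacement (pixels) above which a run is a slide

def fingerprint_runs(frames):
    """Split frames (centroid tuples or None) into consecutive runs containing a fingerprint."""
    runs, current = [], []
    for f in frames:
        if f is not None:
            current.append(f)
        elif current:
            runs.append(current)
            current = []
    if current:
        runs.append(current)
    return runs

def classify_gesture(frames):
    runs = fingerprint_runs(frames)
    if len(runs) >= 2:
        return "double_click"            # two runs separated by frames without a fingerprint
    if not runs:
        return None
    run = runs[0]
    dx = run[-1][0] - run[0][0]
    dy = run[-1][1] - run[0][1]
    if (dx * dx + dy * dy) ** 0.5 > DISTANCE_THRESHOLD:
        return "slide"                   # Z frames with displacement above the threshold
    if len(run) >= LONG_PRESS_MIN:
        return "long_press"              # Y frames at the same position
    if len(run) >= CLICK_MIN:
        return "click"                   # X frames at the same position
    return None
```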
  • Before the wearable device sends the control gesture to the terminal, or before the wearable device sends the operation instruction corresponding to the control gesture to the terminal, the method further includes: the wearable device determines that the touch operation is not an accidental touch operation. That is, in this application, only when the wearable device recognizes that the touch operation received by the fingerprint sensor includes a fingerprint input will it continue to collect the images formed on the fingerprint sensor, identify the control gesture corresponding to the touch operation, and send the identified control gesture or operation instruction to the terminal.
  • Specifically, the wearable device determining that the above-mentioned touch operation is not an accidental touch operation includes: if P1 images among the N consecutive images contain a fingerprint pattern and P1 is greater than a preset value, the wearable device determines that the touch operation is not an accidental touch operation.
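A minimal sketch of this accidental-touch test, assuming the same frame representation as in the classifier sketch above and an arbitrary preset value:

```python
FINGERPRINT_COUNT_THRESHOLD = 3   # assumed preset value for P1

def is_intentional_touch(frames, threshold=FINGERPRINT_COUNT_THRESHOLD):
    p1 = sum(1 for f in frames if f is not None)  # images that contain a fingerprint pattern
    return p1 > threshold
```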
  • Conversely, if P1 is not greater than the preset value, the wearable device determines that the touch operation is an accidental touch operation; in this case, the wearable device switches the fingerprint sensor from the working state to the sleep state, thereby preventing an accidental touch on the wearable device from waking up the terminal and reducing the power consumption of the wearable device and the mobile phone.
  • Before the wearable device receives the touch operation input by the user to the fingerprint sensor, the method further includes: in response to a wake-up operation input by the user, the wearable device switches the fingerprint sensor from a sleep state to a working state. That is, the fingerprint sensor may be in the sleep state before receiving the touch operation input by the user, so as to reduce the power consumption of the wearable device.
  • The present application further provides a wearable device, including: a fingerprint sensor, one or more processors, a memory, and one or more programs; wherein the processor is coupled to the memory, the one or more programs are stored in the memory, and when the wearable device runs, the processor executes the one or more programs stored in the memory, so that the wearable device performs any one of the above touch methods for a wearable device.
  • the fingerprint sensor may be disposed on a side that is not in contact with the user when the wearable device is worn; the wearable device may be a Bluetooth headset, smart glasses, or a smart watch.
  • the present application provides a computer storage medium including computer instructions, and when the computer instructions are run on the wearable device, the wearable device executes the touch method of the wearable device according to any one of the foregoing.
  • the present application provides a computer program product that, when the computer program product runs on the wearable device, causes the wearable device to perform the touch method of the wearable device according to any one of the above.
  • the present application provides a touch control system including a wearable device and a terminal, wherein the wearable device is provided with a fingerprint sensor, and a communication connection is established between the wearable device and the terminal;
  • The wearable device is configured to: use the fingerprint sensor to detect a touch operation input by a user; determine whether the touch operation includes a fingerprint input; and, if the touch operation includes a fingerprint input, identify the control gesture corresponding to the touch operation and send the control gesture to the terminal, or send an operation instruction corresponding to the control gesture to the terminal.
  • The terminal is configured to: receive the control gesture sent by the wearable device, or receive the operation instruction corresponding to the control gesture sent by the wearable device; and execute the operation instruction corresponding to the control gesture.
  • the present application provides a touch system including a wearable device and a terminal, wherein the wearable device is provided with a fingerprint sensor, and a communication connection is established between the wearable device and the terminal;
  • The wearable device is configured to: use the fingerprint sensor to detect a touch operation input by a user; in response to the touch operation, collect N consecutive images formed on the fingerprint sensor, where at least one of the N consecutive images includes a fingerprint pattern and N is an integer greater than 1; and send the N consecutive images to the terminal.
  • The terminal is configured to: receive the N consecutive images sent by the wearable device; recognize the control gesture corresponding to the touch operation according to the change of the fingerprint pattern in the N consecutive images; and execute an operation instruction corresponding to the control gesture.
  • The wearable device described in the second aspect, the computer storage medium described in the third aspect, the computer program product described in the fourth aspect, and the systems described in the fifth and sixth aspects are all used to execute the corresponding methods provided above. Therefore, for the beneficial effects that they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, and details are not described herein again.
  • FIG. 1 is a first schematic view of a touch scenario of a wearable device according to an embodiment of the present application
  • FIG. 2 is a schematic structural diagram of a fingerprint sensor according to an embodiment of the present application.
  • FIG. 3 is a first schematic structural diagram of a wearable device according to an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of smart glasses according to an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of a terminal according to an embodiment of the present application.
  • FIG. 6 is a schematic interaction diagram of a touch method of a wearable device according to an embodiment of the present application.
  • FIG. 7 is a second schematic diagram of a touch scenario of a wearable device according to an embodiment of the present application.
  • FIG. 8 is a third schematic view of a touch scenario of a wearable device according to an embodiment of the present application.
  • FIG. 9 is a fourth schematic view of a touch scenario of a wearable device according to an embodiment of the present application.
  • FIG. 10 is a second schematic structural diagram of a wearable device according to an embodiment of the present application.
  • a touch method of a wearable device provided in an embodiment of the present application can be applied to a touch system composed of a wearable device 11 and a terminal 12.
  • a wireless communication connection or a wired communication connection may be established between the wearable device 11 and the terminal 12.
  • the wearable device 11 may be a wireless headset, a wired headset, smart glasses, a smart helmet, or a smart watch.
  • the terminal 12 may be a device such as a mobile phone, a tablet computer, a notebook computer, an Ultra-mobile Personal Computer (UMPC), a Personal Digital Assistant (PDA), and the like in this embodiment of the present application.
  • a fingerprint sensor 201 is provided on the Bluetooth headset in the embodiment of the present application.
  • the fingerprint sensor 201 may be disposed on a side that is not directly in contact with the user when the user wears it.
  • For example, the fingerprint sensor 201 may be disposed on the housing of the Bluetooth headset, or the fingerprint sensor 201 may be configured as a separate control module connected to the housing of the Bluetooth headset.
  • the fingerprint sensor 201 can collect a fingerprint pattern formed by the user's finger on the collection surface.
  • the fingerprint sensor 201 shown in FIG. 2 includes a plurality of sensing units 201b arranged in an array, and a collection surface 201a covering the sensing units 201b.
  • Fingerprints on a user's finger generally include concave troughs (valleys) and convex crests (ridges).
  • the sensing unit 201b in the fingerprint sensor 201 can generate electrical signals corresponding to the valley and the ridge, respectively.
  • For example, when the sensing unit 201b is a sensing capacitor, the capacitance difference generated by the sensing capacitor corresponding to a valley in the fingerprint is a first capacitance difference, and the capacitance difference generated by the sensing capacitor corresponding to a ridge in the fingerprint is a second capacitance difference. The user's fingerprint pattern can therefore be drawn from the capacitance differences at different positions on the fingerprint sensor 201.
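The following sketch illustrates the idea of drawing a fingerprint pattern from per-unit capacitance differences. The threshold and the 0/1 ridge/valley encoding are assumptions; the patent only states that valleys and ridges produce different capacitance differences from which the pattern can be drawn.

```python
def capacitance_map_to_pattern(cap_diffs, threshold):
    """cap_diffs: 2-D list of capacitance differences, one per sensing unit 201b.
    Returns a 2-D list where 1 marks a ridge (second capacitance difference)
    and 0 marks a valley (first capacitance difference)."""
    return [[1 if value >= threshold else 0 for value in row] for row in cap_diffs]

# Example: a 3x3 patch where larger differences are assumed to correspond to ridges.
patch = [[0.2, 0.8, 0.9],
         [0.1, 0.7, 0.8],
         [0.2, 0.3, 0.9]]
print(capacitance_map_to_pattern(patch, threshold=0.5))
```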
  • the sensing unit 201b may be a photoelectric sensor (such as a photodiode or a phototransistor).
  • the above-mentioned fingerprint sensor 201 may be a capacitive fingerprint sensor, an optical fingerprint sensor, a radio frequency fingerprint sensor, or an ultrasonic fingerprint sensor, and the embodiment of the present application does not place any restrictions on this.
  • Generally, the larger the number of sensing units 201b, the higher the accuracy and sensitivity when collecting and identifying the user's fingerprint pattern. Because the sensing units 201b in the fingerprint sensor 201 are highly integrated, for a fingerprint sensor 201 and an ordinary touchpad of the same size, the number of sensing units 201b in the fingerprint sensor 201 is much larger than the number of sensing units in the ordinary touchpad. For example, a 2 cm x 2 cm fingerprint sensor 201 may include more than 50 x 50 sensing units, while a 2 cm x 2 cm ordinary touchpad may contain only about a dozen sensing units.
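Taking the example figures above at face value (and treating "about a dozen" as 12, an assumed value), a quick back-of-the-envelope comparison of sensing-unit density:

```python
# Rough density comparison based on the example figures above.
area_mm2 = 20 * 20           # a 2 cm x 2 cm surface expressed in mm^2
fingerprint_units = 50 * 50  # "more than 50 * 50" sensing units
touchpad_units = 12          # "about a dozen" sensing units (assumed value)

print(fingerprint_units / area_mm2)        # ~6.25 sensing units per mm^2
print(touchpad_units / area_mm2)           # ~0.03 sensing units per mm^2
print(fingerprint_units / touchpad_units)  # roughly 200x more sensing units
```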
  • the above-mentioned fingerprint sensor 201 may be used instead of the ordinary touchpad originally provided in the wearable device to reduce the size of the wearable device.
  • Moreover, through the multiple fingerprint patterns continuously collected by the fingerprint sensor 201, the wearable device can recognize specific gestures performed by the user's finger on the fingerprint sensor 201, such as a slide gesture, a double-tap gesture, and the like.
  • Furthermore, the wearable device may determine the operation instruction corresponding to the recognized gesture, such as a play instruction, a volume adjustment instruction, or a pause instruction, according to a preset correspondence between different gestures and different operation instructions. The wearable device may then send the corresponding operation instruction to the terminal, so that the terminal executes the operation instruction; in this way, the wearable device controls related functions in the terminal through the user's touch operation on the wearable device.
  • Of course, the wearable device may also send the recognized gesture to the terminal, and the terminal executes the corresponding operation instruction according to the gesture; this embodiment of the present application does not place any restriction on this.
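To illustrate the two reporting modes just described, the sketch below maps a recognized gesture to an operation instruction via a preset correspondence table and sends either the gesture or the instruction to the terminal. The gesture names, instruction names, and the send_to_terminal callback are placeholders, not identifiers from the patent.

```python
GESTURE_TO_INSTRUCTION = {
    "click": "play",
    "double_click": "pause",
    "slide": "adjust_volume",
    "long_press": "next_track",
}

def report_touch(gesture, send_to_terminal, send_raw_gesture=False):
    if gesture is None:
        return                                   # nothing recognized, nothing sent
    if send_raw_gesture:
        send_to_terminal({"type": "gesture", "value": gesture})
    else:
        instruction = GESTURE_TO_INSTRUCTION.get(gesture)
        send_to_terminal({"type": "instruction", "value": instruction})
```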
  • In addition to the fingerprint sensor 201, the wearable device 11 may further include a microphone (for example, a bone conduction microphone), an acceleration sensor 203, a proximity light sensor 204, a communication module 205, a speaker 206, a computing module 207, a storage module 208, a power supply 209, and other components.
  • the various components shown in FIG. 3 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing or application specific integrated circuits.
  • In the above description, a Bluetooth headset is used as an example of the wearable device 11. It can be understood that the fingerprint sensor 201 can also be provided on other wearable devices such as smart glasses, a smart helmet, or a smart bracelet, and used to recognize gestures performed by the user on the fingerprint sensor 201.
  • the above-mentioned fingerprint sensor 201 may be integrated in the smart glasses 301.
  • the fingerprint sensor 201 may be disposed on a frame or temple of the smart glasses 301.
  • the fingerprint sensor 201 can collect a fingerprint pattern on the fingerprint sensor 201 at a certain frequency.
  • the smart glasses 301 can recognize specific gestures performed by the user on the fingerprint sensor 201 according to changes in the position and size of the user's finger in the collected multiple fingerprint patterns.
  • The terminal 12 in the above touch control system may be a mobile phone 100.
  • the mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a USB interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a radio frequency module 150, a communication module 160, and an audio module.
  • The sensor module can include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not limit the mobile phone 100. It may include more or fewer parts than shown, or some parts may be combined, or some parts may be split, or different parts may be arranged.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated in the same processor.
  • the controller may be a decision maker that instructs the various components of the mobile phone 100 to coordinate work according to instructions. It is the nerve center and command center of the mobile phone 100.
  • the controller generates operation control signals according to the instruction operation code and timing signals, and completes the control of fetching and executing the instructions.
  • the processor 110 may further include a memory for storing instructions and data.
  • The memory in the processor 110 is a cache memory, which can store instructions or data that the processor has just used or used cyclically. If the processor needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor, and improves system efficiency.
  • the processor 110 may include an interface.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
  • the I2C interface is a two-way synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • the processor may include multiple sets of I2C buses.
  • the processor can be coupled to touch sensors, chargers, flashes, cameras, etc. through different I2C bus interfaces.
  • the processor may couple the touch sensor through the I2C interface, so that the processor and the touch sensor communicate through the I2C bus interface to implement the touch function of the mobile phone 100.
  • the I2S interface can be used for audio communication.
  • the processor may include multiple sets of I2S buses.
  • the processor may be coupled to the audio module through an I2S bus to implement communication between the processor and the audio module.
  • the audio module can transmit audio signals to the communication module through the I2S interface, so as to implement the function of receiving calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communications, sampling, quantizing, and encoding analog signals.
  • the audio module and the communication module may be coupled through a PCM bus interface.
  • the audio module can also transmit audio signals to the communication module through the PCM interface, so as to implement the function of receiving calls through a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication, and the sampling rates of the two interfaces are different.
  • the UART interface is a universal serial data bus for asynchronous communication. This bus is a two-way communication bus. It converts the data to be transferred between serial and parallel communications.
  • a UART interface is typically used to connect the processor and the communication module 160.
  • the processor communicates with the Bluetooth module through a UART interface to implement the Bluetooth function.
  • the audio module can transmit audio signals to the communication module through the UART interface, so as to implement the function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect processors with peripheral devices such as displays, cameras, etc.
  • the MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), and the like.
  • the processor and the camera communicate through a CSI interface to implement a shooting function of the mobile phone 100.
  • the processor and the display communicate through a DSI interface to implement a display function of the mobile phone 100.
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor with a camera, a display screen, a communication module, an audio module, a sensor, and the like.
  • GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface can be used to connect a charger to charge the mobile phone 100, and can also be used to transfer data between the mobile phone 100 and peripheral devices. It can also be used to connect headphones and play audio through headphones. It can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules shown in the embodiments of the present invention is only a schematic description, and does not constitute a limitation on the structure of the mobile phone 100.
  • the mobile phone 100 may use different interface connection modes or a combination of multiple interface connection modes in the embodiments of the present invention.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module may receive a charging input of a wired charger through a USB interface.
  • the charging management module may receive a wireless charging input through a wireless charging coil of the mobile phone 100. While the charging management module is charging the battery, it can also supply power to the terminal device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charge management module 140 and the processor 110.
  • the power management module receives inputs from the battery and / or charge management module, and supplies power to a processor, an internal memory, an external memory, a display screen, a camera, and a communication module.
  • the power management module can also be used to monitor battery capacity, battery cycle times, battery health (leakage, impedance) and other parameters.
  • the power management module 141 may also be disposed in the processor 110.
  • the power management module 141 and the charge management module may also be provided in the same device.
  • The wireless communication function of the mobile phone 100 can be implemented by the antenna 1, the antenna 2, the radio frequency module 150, the communication module 160, the modem, and the baseband processor.
  • the antenna 1 and the antenna 2 are used for transmitting and receiving electromagnetic wave signals.
  • Each antenna in the mobile phone 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization. For example, a cellular network antenna can be multiplexed into a wireless LAN diversity antenna. In some embodiments, the antenna may be used in conjunction with a tuning switch.
  • the radio frequency module 150 may provide a communication processing module applied to the mobile phone 100 and including a wireless communication solution such as 2G / 3G / 4G / 5G. It may include at least one filter, switch, power amplifier, Low Noise Amplifier (LNA), and the like.
  • the radio frequency module receives electromagnetic waves from the antenna 1, and processes the received electromagnetic waves by filtering, amplifying, etc., and transmitting them to the modem for demodulation.
  • the radio frequency module can also amplify the signal modulated by the modem and turn it into electromagnetic wave radiation through the antenna 1.
  • In some embodiments, at least part of the functional modules of the radio frequency module 150 may be disposed in the processor 110.
  • at least part of the functional modules of the radio frequency module 150 may be provided in the same device as at least part of the modules of the processor 110.
  • the modem may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs sound signals through audio equipment (not limited to speakers, receivers, etc.), or displays images or videos through a display screen.
  • the modem may be a separate device.
  • the modem may be independent of the processor and disposed in the same device as the radio frequency module or other functional modules.
  • The communication module 160 can provide communication processing modules for wireless communication solutions applied to the mobile phone 100, including wireless local area network (WLAN), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the communication module 160 may be one or more devices that integrate at least one communication processing module.
  • the communication module receives the electromagnetic wave through the antenna 2, frequency-modulates and filters the electromagnetic wave signal, and sends the processed signal to the processor.
  • the communication module 160 may also receive a signal to be transmitted from the processor, frequency-modulate it, amplify it, and turn it into electromagnetic wave radiation through the antenna 2.
  • the antenna 1 of the mobile phone 100 is coupled to a radio frequency module, and the antenna 2 is coupled to a communication module.
  • the mobile phone 100 can communicate with a network and other devices through wireless communication technology.
  • The wireless communication technology may include the global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology.
  • The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the mobile phone 100 implements a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, which connects the display screen and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, and the like.
  • the display includes a display panel.
  • The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), and the like.
  • the mobile phone 100 may include one or N display screens, where N is a positive integer greater than 1.
  • the mobile phone 100 can implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen, and an application processor.
  • The ISP is used to process data fed back by the camera. For example, when taking a picture, the shutter is opened and light is transmitted to the photosensitive element of the camera through the lens; the optical signal is converted into an electrical signal, and the photosensitive element of the camera passes the electrical signal to the ISP, which processes it and converts it into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and skin tone of the image, as well as the exposure, color temperature, and other parameters of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • An object generates an optical image through a lens and projects it onto a photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs digital image signals to the DSP for processing.
  • DSP converts digital image signals into image signals in standard RGB, YUV and other formats.
  • the mobile phone 100 may include one or N cameras, where N is a positive integer greater than 1.
  • A digital signal processor is used to process digital signals. In addition to digital image signals, it can also process other digital signals. For example, when the mobile phone 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital video.
  • the mobile phone 100 may support one or more codecs. In this way, the mobile phone 100 can play or record videos in multiple encoding formats, such as: MPEG1, MPEG2, MPEG3, MPEG4, and so on.
  • The NPU is a neural-network (NN) computing processor. With the NPU, applications such as intelligent cognition of the mobile phone 100 can be implemented, for example image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone 100.
  • the external memory card communicates with the processor through an external memory interface to implement a data storage function. For example, save music, videos and other files on an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the mobile phone 100 by running instructions stored in the internal memory 121.
  • the memory 121 may include a storage program area and a storage data area.
  • the storage program area may store an operating system, at least one application required by a function (such as a sound playback function, an image playback function, etc.) and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone 100.
  • The memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, other solid-state storage devices, a universal flash storage (UFS), and the like.
  • the mobile phone 100 can implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, and an application processor. Such as music playback, recording, etc.
  • the audio module is used to convert digital audio information into an analog audio signal output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module can also be used to encode and decode audio signals.
  • the audio module may be disposed in the processor 110, or some functional modules of the audio module may be disposed in the processor 110.
  • the speaker 170A also called a "horn" is used to convert audio electrical signals into sound signals.
  • the mobile phone 100 can listen to music through a speaker or listen to a hands-free call.
  • The receiver 170B, also referred to as the "handset", is used to convert audio electrical signals into sound signals.
  • When the mobile phone 100 answers a call or plays a voice message, the receiver can be held close to the ear to hear the voice.
  • The microphone 170C, also called a "mike" or "mic", is used to convert sound signals into electrical signals.
  • the user can make a sound through the mouth close to the microphone, and input the sound signal into the microphone.
  • the mobile phone 100 may be provided with at least one microphone.
  • In some embodiments, the mobile phone 100 may be provided with two microphones, which, in addition to collecting sound signals, can also implement a noise reduction function.
  • the mobile phone 100 may further be provided with three, four, or more microphones to collect sound signals, reduce noise, and also identify sound sources, and implement a directional recording function.
  • the headset interface 170D is used to connect a wired headset.
  • The earphone interface can be a USB interface, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense a pressure signal, and can convert the pressure signal into an electrical signal.
  • the pressure sensor may be disposed on the display screen.
  • A capacitive pressure sensor may include at least two parallel plates made of conductive material; when a force is applied to the pressure sensor, the capacitance between the electrodes changes.
  • the mobile phone 100 determines the intensity of the pressure according to the change in capacitance.
  • the mobile phone 100 detects the intensity of the touch operation according to a pressure sensor.
  • the mobile phone 100 may also calculate the touched position according to the detection signal of the pressure sensor.
  • touch operations acting on the same touch position but different touch operation intensities may correspond to different operation instructions. For example, when a touch operation with a touch operation intensity lower than the first pressure threshold is applied to the short message application icon, an instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold is applied to the short message application icon, an instruction for creating a short message is executed.
  • the gyro sensor 180B may be used to determine the movement posture of the mobile phone 100.
  • the angular velocity of the mobile phone 100 about three axes may be determined by a gyro sensor.
  • a gyroscope sensor can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor detects the shake angle of the mobile phone 100, and calculates the distance that the lens module needs to compensate according to the angle, so that the lens can cancel the shake of the mobile phone 100 by the reverse movement to achieve anti-shake.
  • the gyroscope sensor can also be used for navigation and somatosensory game scenes.
  • the barometric pressure sensor 180C is used to measure air pressure.
  • the mobile phone 100 calculates altitude by using the air pressure value measured by the air pressure sensor to assist in positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the mobile phone 100 can detect the opening and closing of the flip leather case by using a magnetic sensor.
  • the mobile phone 100 can detect the opening and closing of the flip according to a magnetic sensor. Further, according to the opened and closed state of the holster or the opened and closed state of the flip cover, characteristics such as automatic unlocking of the flip cover are set.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the mobile phone 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the mobile phone 100 is stationary. It can also be used to identify the posture of the terminal, and is used in applications such as switching between horizontal and vertical screens, and pedometers.
  • The distance sensor 180F is used to measure distance; the mobile phone 100 can measure distance by infrared or laser. In some embodiments, when shooting a scene, the mobile phone 100 may use the distance sensor to measure distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode. Infrared light is emitted outward through a light emitting diode.
  • the mobile phone 100 can use a proximity light sensor to detect that the user is holding the mobile phone 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • The proximity light sensor can also be used in leather case mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the mobile phone 100 can adaptively adjust the brightness of the display screen according to the perceived ambient light brightness.
  • the ambient light sensor can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor can also cooperate with the proximity light sensor to detect whether the mobile phone 100 is in a pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the mobile phone 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, access application lock, fingerprint photographing, fingerprint answering calls, etc.
  • the temperature sensor 180J is used to detect the temperature.
  • the mobile phone 100 uses the temperature detected by the temperature sensor to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor exceeds a threshold, the mobile phone 100 performs a performance reduction of a processor located near the temperature sensor in order to reduce power consumption and implement thermal protection.
  • The touch sensor 180K, also called a "touch panel", may be disposed on the display screen and is used to detect touch operations on or near it. The detected touch operation can be passed to the application processor to determine the type of touch event, and corresponding visual output is provided through the display screen.
  • the bone conduction sensor 180M can acquire vibration signals.
  • The bone conduction sensor can acquire the vibration signal of the vibrating bone of the human vocal part.
  • The bone conduction sensor can also contact the human pulse and receive the blood pressure pulse signal.
  • a bone conduction sensor may also be provided in the headset.
  • The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone of the vocal part obtained by the bone conduction sensor, so as to implement a voice function.
  • The application processor may parse heart rate information based on the blood pressure pulse signal obtained by the bone conduction sensor, so as to implement a heart rate detection function.
  • the keys 190 include a power-on key, a volume key, and the like.
  • the keys can be mechanical keys. It can also be a touch button.
  • the mobile phone 100 receives key input, and generates key signal inputs related to user settings and function control of the mobile phone 100.
  • the motor 191 may generate a vibration alert.
  • the motor can be used for incoming vibration alert and touch vibration feedback.
  • the touch operation applied to different applications can correspond to different vibration feedback effects.
  • Touch operations on different areas of the display can also correspond to different vibration feedback effects.
  • Different application scenarios (such as time reminders, receiving information, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • Touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging status, power change, and can also be used to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 195 is used to connect to a subscriber identity module (SIM).
  • A SIM card can be brought into contact with or separated from the mobile phone 100 by inserting it into or removing it from the SIM card interface.
  • the mobile phone 100 may support one or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface can support Nano SIM cards, Micro SIM cards, SIM cards, etc. Multiple SIM cards can be inserted into the same SIM card interface at the same time. The types of the multiple cards may be the same or different.
  • the SIM card interface is also compatible with different types of SIM cards.
  • the SIM card interface is also compatible with external memory cards.
  • the mobile phone 100 interacts with the network through the SIM card to implement functions such as calling and data communication.
  • the mobile phone 100 uses an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the mobile phone 100 and cannot be separated from the mobile phone 100.
  • In this embodiment of the present application, the wearable device 11 can use the fingerprint sensor 201 to collect, at a certain frequency, N (N > 1) consecutive images formed on the fingerprint sensor 201. Furthermore, the wearable device 11 can recognize the gesture performed by the user on the fingerprint sensor 201 by comparing changes in parameters such as the size and position of the fingerprint pattern in the N consecutive images. In this way, the wearable device 11 can send a corresponding operation instruction to the terminal 12 according to the recognized gesture, so that the wearable device 11 controls related functions in the terminal 12.
  • Since the fingerprint sensor has a small size and a high degree of integration, setting the fingerprint sensor in the wearable device to recognize gestures performed by the user, instead of using the larger touchpad used in traditional wearable devices, improves the integration of the wearable device.
  • In addition, because the number of sensing units in the fingerprint sensor is greater, the sensitivity and accuracy of the wearable device when recognizing the user's gesture are improved, thereby reducing the chance of the wearable device and the terminal being accidentally triggered.
  • a touch method of a wearable device provided by an embodiment of the present application will be specifically introduced below with reference to the accompanying drawings.
  • In the following embodiments, a mobile phone is used as an example of the terminal, and a Bluetooth headset is used as an example of the wearable device.
  • FIG. 6 is a schematic flowchart of a touch method of a wearable device according to an embodiment of the present application. As shown in FIG. 6, the touch method may include:
  • S601: The mobile phone establishes a Bluetooth connection with the Bluetooth headset.
  • When the user uses the Bluetooth headset, the Bluetooth function of the Bluetooth headset can be turned on. At this time, the Bluetooth headset can send a pairing broadcast to the outside. If the mobile phone has its Bluetooth function turned on, the mobile phone can receive the pairing broadcast and prompt the user that a relevant Bluetooth device has been scanned. After the user selects the Bluetooth headset on the phone, the phone can pair with the Bluetooth headset and establish a Bluetooth connection. Subsequently, the mobile phone and the Bluetooth headset can communicate through the Bluetooth connection. Of course, if the mobile phone and the Bluetooth headset have been successfully paired before this Bluetooth connection is established, the mobile phone can automatically establish a Bluetooth connection with the scanned Bluetooth headset.
  • the user can also operate the mobile phone to establish a Wi-Fi connection with the headset.
  • Alternatively, if the earphone used by the user is a wired earphone, the user can also insert the plug of the earphone cable into the corresponding earphone interface of the mobile phone to establish a wired connection, which is not limited in this embodiment of the present application.
  • the mobile phone can also use the Bluetooth headset connected at this time as a legitimate Bluetooth device. For example, the mobile phone may save the identification of the legal Bluetooth device (such as the MAC address of a Bluetooth headset, etc.) locally on the mobile phone. In this way, when a subsequent mobile phone receives an operation instruction or data from a Bluetooth device, the mobile phone can determine whether the Bluetooth device communicating at this time is a legitimate Bluetooth device according to the saved identifier of the legal Bluetooth device. When the mobile phone determines that an illegal Bluetooth device is currently sending an operation instruction or data to the mobile phone, the mobile phone may discard the operation instruction or data to improve the security during the use of the mobile phone.
  • Exemplarily, a mobile phone can manage one or more legitimate Bluetooth devices. As shown in FIG. 7, the user can enter the management interface 701 of legal devices from the settings function, and the user can add or delete legal Bluetooth devices in the management interface 701.
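A minimal sketch of the legitimate-device check described above, assuming the phone keys trusted headsets by their MAC addresses; the data structure and function names are illustrative only.

```python
legal_devices = {"AA:BB:CC:DD:EE:FF"}   # saved when a headset is paired and trusted

def handle_incoming(mac_address, payload, execute):
    if mac_address not in legal_devices:
        return None                      # discard instructions or data from unknown devices
    return execute(payload)              # run the operation instruction or process the data
```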
  • S602: In response to a preset operation input by the user to the Bluetooth headset, the Bluetooth headset wakes up the fingerprint sensor so that it enters the working state.
  • the fingerprint sensor on the Bluetooth headset is set to a sleep state with low power consumption by default.
  • the Bluetooth headset can scan the electrical signals generated by each sensing unit in the fingerprint sensor at a lower working frequency, or the Bluetooth headset can also temporarily turn off the fingerprint sensor (for example, power off the fingerprint sensor).
  • One or more preset operations for waking the fingerprint sensor can be configured on the Bluetooth headset in advance.
  • For example, the Bluetooth headset may be preset with a wake-up word (for example, "Hello, Little E"). When the Bluetooth headset detects, through the microphone, that the voice information input by the user includes the wake-up word, it indicates that the user has performed the preset operation for waking the fingerprint sensor. At this time, the Bluetooth headset can switch the fingerprint sensor from the sleep state to the working state.
  • For another example, the Bluetooth headset may be preset with a tapping operation (for example, a double tap) that wakes up the fingerprint sensor. When the Bluetooth headset detects, through the acceleration sensor, that the user has performed the tapping operation, the Bluetooth headset can switch the fingerprint sensor from the sleep state to the working state.
  • Alternatively, the Bluetooth headset may be preset with a touch operation (for example, a tap) that wakes up the fingerprint sensor. If the fingerprint sensor in the sleep state detects that the user has performed this touch operation, it can switch from the sleep state to the working state. After the fingerprint sensor enters the working state, each sensing unit in the fingerprint sensor can be scanned at a higher working frequency (for example, 10 Hz) to collect the images formed on the fingerprint sensor. A minimal sketch of this wake-up logic is shown below.
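  • The following is a hypothetical sketch of the wake-up logic described above, switching the fingerprint sensor between a low-power sleep state and a working state; the wake-up word and event sources are taken from the examples above, while the class and method names are assumptions for illustration only.

```python
# Hypothetical sketch of the fingerprint-sensor wake-up logic.
# Event sources: wake-up word (microphone), double tap (acceleration sensor),
# or a touch on the sensor itself. All names are illustrative assumptions.
from enum import Enum


class SensorState(Enum):
    SLEEP = "sleep"      # low scan frequency, or sensor powered off
    WORKING = "working"  # higher scan frequency, e.g. 10 Hz


WAKE_WORD = "Hello, Little E"  # example wake-up word from the description


class FingerprintSensorController:
    def __init__(self):
        self.state = SensorState.SLEEP

    def on_voice(self, text: str) -> None:
        # Microphone path: wake the sensor when the wake-up word is heard.
        if WAKE_WORD.lower() in text.lower():
            self._wake()

    def on_tap_count(self, taps: int) -> None:
        # Acceleration-sensor path: e.g. a double tap wakes the sensor.
        if taps >= 2:
            self._wake()

    def on_touch(self) -> None:
        # Touch path: a tap on the fingerprint sensor itself wakes it.
        self._wake()

    def on_idle_timeout(self) -> None:
        # Drop back to the low-power state when nothing happens for a while.
        self.state = SensorState.SLEEP

    def _wake(self) -> None:
        if self.state is SensorState.SLEEP:
            self.state = SensorState.WORKING  # start scanning at ~10 Hz


if __name__ == "__main__":
    ctrl = FingerprintSensorController()
    ctrl.on_voice("hello, little e, play some music")
    print(ctrl.state)  # SensorState.WORKING
```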
  • In other embodiments, the fingerprint sensor of the Bluetooth headset may be set to the working state by default. In this case, the Bluetooth headset may skip step S602 and directly perform the following steps S603-S606.
  • When the Bluetooth headset is not being operated, it may also automatically enter a sleep state, for example a BLE (Bluetooth Low Energy) mode, thereby further reducing the power consumption of the Bluetooth headset. In the sleep state, some sensors (such as the acceleration sensor or the microphone described above) can remain active, so that when a preset operation is detected, the Bluetooth headset can be switched from the sleep state back to the working state and perform the following steps S603-S606.
  • S603. The Bluetooth headset collects N consecutive images formed on the fingerprint sensor, where at least one of the N consecutive images includes a fingerprint pattern, and N is an integer greater than 1.
  • After entering the working state, the Bluetooth headset can use the fingerprint sensor to continuously acquire, at a certain working frequency, the N consecutive images formed on the collection surface of the fingerprint sensor. Since the user's finger is a conductive object, the corresponding capacitance signal in the fingerprint sensor changes when the user touches or leaves the fingerprint sensor. Therefore, after the fingerprint sensor enters the working state, it can sense the moments at which the user's finger touches and leaves it. The Bluetooth headset can then continuously acquire the images formed on the fingerprint sensor from the moment the fingerprint sensor senses that the user's finger has touched it until the fingerprint sensor senses that the user's finger has left it, thereby obtaining the above-mentioned N consecutive images.
  • After the acquisition ends, the Bluetooth headset can control the fingerprint sensor to enter the sleep state again to reduce the power consumption of the Bluetooth headset.
  • In this case, the above-mentioned N consecutive images refer to the images continuously acquired from the time the fingerprint sensor detects that the user's finger touches it (for example, within 2 seconds after the touch is detected) until the fingerprint sensor detects that the user's finger leaves it, so these images generally all include a fingerprint pattern.
  • Alternatively, the Bluetooth headset can start to continuously capture the images formed on the fingerprint sensor as soon as the fingerprint sensor enters the working state, and stop when the fingerprint sensor senses that the user's finger has left it, thereby obtaining the above N consecutive images. Since the user may not touch the fingerprint sensor immediately after the fingerprint sensor enters the working state, in this case the first few of the N consecutive images may not include a fingerprint pattern. A minimal sketch of such an acquisition loop is shown below.
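  • The following is a hypothetical sketch of the acquisition loop in step S603; the `sensor` object and its methods (`finger_present`, `capture_image`, `enter_sleep`) are assumptions standing in for the real fingerprint-sensor driver.

```python
# Hypothetical sketch of collecting the N consecutive images described in S603.
# The `sensor` object and its methods are illustrative assumptions only.
import time


def collect_touch_images(sensor, scan_hz=10, max_frames=64):
    """Collect frames from finger-down until finger-up (or a frame budget)."""
    frames = []
    period = 1.0 / scan_hz

    # Wait until the capacitance change indicates that a finger touched the sensor.
    while not sensor.finger_present():
        time.sleep(period)

    # Keep grabbing frames until the finger leaves the collection surface.
    while sensor.finger_present() and len(frames) < max_frames:
        frames.append(sensor.capture_image())
        time.sleep(period)

    sensor.enter_sleep()  # drop back to the low-power state after acquisition
    return frames         # the "N consecutive images" of step S603
```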
  • In advance, the Bluetooth headset can learn the fingerprint characteristics of general fingerprints from a number of sample fingerprint patterns. The fingerprint characteristics can be stored in the Bluetooth headset in the form of a model or a vector. The fingerprint characteristics can be used to describe the fingerprint of a specific user (such as legal user A), or to describe features common to the fingerprints of most ordinary users. Alternatively, the above-mentioned fingerprint characteristics may be obtained by the Bluetooth headset from another device (such as a mobile phone or a cloud server), which is not limited in this embodiment of the present application. A minimal sketch of matching an image against stored fingerprint characteristics is shown below.
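  • The following is a hypothetical sketch of checking whether a collected image exhibits the stored fingerprint characteristics when the characteristics are kept as a feature vector; the feature extractor here is a crude stand-in, whereas a real device would use a trained fingerprint model.

```python
# Hypothetical sketch of matching an image against stored fingerprint characteristics.
# The feature extractor is a stand-in; real devices would use a trained model.
import numpy as np


def extract_features(image: np.ndarray) -> np.ndarray:
    # Stand-in "feature vector": a coarse, normalised downsampling of the image.
    small = image[::8, ::8].astype(np.float32).ravel()
    norm = np.linalg.norm(small)
    return small / norm if norm else small


def has_fingerprint(image: np.ndarray, template: np.ndarray, threshold=0.8) -> bool:
    # Cosine similarity between the image's features and the stored template.
    score = float(np.dot(extract_features(image), template))
    return score >= threshold


if __name__ == "__main__":
    enrolled = np.random.rand(64, 64)                     # stands in for enrolled samples
    template = extract_features(enrolled)
    print(has_fingerprint(enrolled, template))            # True: matches its own template
    print(has_fingerprint(np.zeros((64, 64)), template))  # False: no ridge signal
```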
  • Then, the Bluetooth headset can identify whether one or more of the images collected by the fingerprint sensor exhibit the above-mentioned fingerprint characteristics, thereby determining whether the current touch operation on the fingerprint sensor is an accidental touch operation. If the Bluetooth headset determines that the touch operation on the fingerprint sensor is not an accidental touch operation, the Bluetooth headset may continue to perform the following steps S604-S606; otherwise, the Bluetooth headset may discard the captured images and enter the sleep state again to reduce the power consumption of the Bluetooth headset.
  • For example, the Bluetooth headset may start to collect images from the fingerprint sensor and determine in real time whether each image captured by the fingerprint sensor exhibits the above-mentioned fingerprint characteristics (that is, whether the captured image includes a fingerprint pattern). If M (M ≤ N) consecutive images do not include the fingerprint pattern, the user may not be touching the fingerprint sensor at this time, or an object other than the user's finger (such as hair, clothing, or the user's face) may have touched the fingerprint sensor by mistake. In this case, the Bluetooth headset does not need to continue collecting the images formed on the fingerprint sensor, and does not need to perform the following steps S604-S606.
  • Alternatively, the Bluetooth headset may periodically detect whether the images collected by the fingerprint sensor exhibit the above-mentioned fingerprint characteristics. For example, after the fingerprint sensor starts collecting images, the Bluetooth headset may randomly select one image out of every five collected images and detect whether it exhibits the fingerprint characteristics. As another example, the Bluetooth headset may randomly select one image from the images collected within every 500 ms and detect whether it exhibits the fingerprint characteristics. If an image with the fingerprint characteristics is detected, it can be determined that the current touch operation on the fingerprint sensor is not an accidental touch operation, and the Bluetooth headset can continue to capture the images formed on the fingerprint sensor and perform the following steps S604-S606.
  • In other embodiments, after acquiring the N consecutive images collected by the fingerprint sensor, the Bluetooth headset may count the number of images in the N consecutive images that include a fingerprint pattern. If the number of images containing a fingerprint pattern in the N consecutive images is less than a first threshold (for example, less than 3), it may indicate that the user merely touched the fingerprint sensor by mistake; otherwise, the Bluetooth headset may determine that the touch operation that occurred on the fingerprint sensor this time is not an accidental touch operation. Alternatively, if the number of images in the N consecutive images that do not contain a fingerprint pattern is greater than a second threshold (for example, greater than 10), this may also indicate that the user merely touched the fingerprint sensor by mistake; otherwise, the Bluetooth headset can determine that the touch operation that occurred on the fingerprint sensor this time is not an accidental touch operation. A minimal sketch of this counting check follows this item.
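  • The following is a hypothetical sketch of the counting check described above; the function name, the fingerprint-detection callback, and the default thresholds (taken from the examples of 3 and 10 above) are assumptions for illustration only.

```python
# Hypothetical sketch of the accidental-touch check based on counting how many of the
# N consecutive images do or do not contain a fingerprint pattern.
def is_accidental_touch(images, contains_fingerprint, min_with_fp=3, max_without_fp=10):
    """Return True if the touch looks accidental and steps S604-S606 should be skipped."""
    with_fp = sum(1 for img in images if contains_fingerprint(img))
    without_fp = len(images) - with_fp

    if with_fp < min_with_fp:        # too few frames show a fingerprint pattern
        return True
    if without_fp > max_without_fp:  # too many frames show no fingerprint pattern
        return True
    return False


if __name__ == "__main__":
    # Toy frames: True means "this frame contains a fingerprint pattern".
    frames = [True] * 8 + [False] * 2
    print(is_accidental_touch(frames, contains_fingerprint=lambda f: f))        # False
    print(is_accidental_touch([False] * 12, contains_fingerprint=lambda f: f))  # True
```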
  • In other words, only when the Bluetooth headset recognizes that the touch operation received by the fingerprint sensor includes a fingerprint input will it continue to collect the images formed on the fingerprint sensor, identify the control gesture corresponding to the touch operation, and send the identified control gesture or operation instruction to the terminal. Otherwise, if the touch operation received on the fingerprint sensor does not include a fingerprint input, that is, the touch operation input by the user is an accidental touch operation, the mobile phone does not need to perform the following steps S604-S606, which prevents the user's accidental touch operation from waking up the mobile phone and also reduces the power consumption of the Bluetooth headset and the mobile phone.
  • In addition, if the Bluetooth headset determines that the touch operation input by the user is an accidental touch operation, the Bluetooth headset may stop collecting images from the fingerprint sensor. If the Bluetooth headset recognizes that the touch operation is accidental while it is identifying the control gesture corresponding to the touch operation, it can stop the recognition; if the Bluetooth headset only recognizes that the touch operation is accidental after the control gesture corresponding to the touch operation has been identified, the Bluetooth headset may stop sending the identified control gesture or operation instruction to the mobile phone.
  • S604. The Bluetooth headset recognizes the control gesture input by the user according to the fingerprint patterns in the N consecutive images.
  • In step S604, after the Bluetooth headset obtains the N consecutive images collected by the fingerprint sensor, it can identify how the fingerprint pattern changes across the N consecutive images. For example, the size of the fingerprint pattern may gradually shrink or even disappear completely across the N consecutive images, and the position of the fingerprint pattern may keep moving across the N consecutive images. Then, according to the change of the fingerprint pattern, the Bluetooth headset may determine which specific control gesture the user input on the fingerprint sensor in step S603.
  • That is, the Bluetooth headset can recognize the control gesture input by the user on the fingerprint sensor according to the changes in the position and size of the fingerprint pattern across the N consecutive images.
  • For example, if each of the N consecutive images includes the fingerprint pattern 801, the distance X by which the fingerprint pattern 801 moves across these images is small (X is less than a first preset value), and the size of the fingerprint pattern 801 does not change significantly, it can be determined that the user has entered a long-press operation on the fingerprint sensor. If the fingerprint pattern 801 gradually moves from point A to point B across the N consecutive images, and the distance between point A and point B is greater than a distance threshold, it can be determined that the user has entered a sliding operation on the fingerprint sensor.
  • In addition, the Bluetooth headset may determine parameters such as the sliding direction and the motion trajectory of the sliding operation according to the positions of the fingerprint pattern 801 in the N consecutive images. For example, the sliding direction of the sliding operation may be upward or downward, and the motion trajectory of the sliding operation may be a closed figure such as a circle. A minimal sketch of such gesture classification is shown below.
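  • The following is a hypothetical sketch of classifying a control gesture from the positions of the fingerprint pattern across the N consecutive images; the threshold values, gesture labels, and coordinate convention (image y grows downward) are assumptions for illustration only.

```python
# Hypothetical sketch of classifying a control gesture from the fingerprint pattern's
# centroid position in each of the N consecutive images.
import math


def classify_gesture(centroids, move_threshold=20.0):
    """centroids: list of (x, y) positions of the fingerprint pattern, one per image."""
    if len(centroids) < 2:
        return "tap"  # pattern seen in too few frames to measure movement

    (x0, y0), (x1, y1) = centroids[0], centroids[-1]
    distance = math.hypot(x1 - x0, y1 - y0)  # distance between point A and point B

    if distance < move_threshold:
        # Pattern stays in roughly the same place across the frames -> long press.
        return "long_press"

    # Pattern moved far enough from A to B -> a sliding operation; report direction.
    if abs(y1 - y0) >= abs(x1 - x0):
        return "slide_up" if y1 < y0 else "slide_down"
    return "slide_right" if x1 > x0 else "slide_left"


if __name__ == "__main__":
    still = [(50, 50)] * 10
    upward = [(50, 100 - 8 * i) for i in range(10)]
    print(classify_gesture(still))   # long_press
    print(classify_gesture(upward))  # slide_up
```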
  • Because the sensing units in the fingerprint sensor are small and highly integrated, the fingerprint pattern in each image collected by the Bluetooth headset using the fingerprint sensor is clearer and more accurate. Based on these fingerprint patterns, the Bluetooth headset has higher accuracy and sensitivity when recognizing the control gestures entered by the user. At the same time, the volume of the Bluetooth headset will not increase because of the size of the fingerprint sensor.
  • the process of identifying the control gesture by the Bluetooth headset may be performed by a computing module (such as a CPU) in the Bluetooth headset.
  • Alternatively, the computing module may be integrated into the fingerprint sensor of the Bluetooth headset, in which case the above steps S603-S604 are performed by the fingerprint sensor.
  • S605. The Bluetooth headset sends the control gesture to the mobile phone.
  • In some embodiments, the Bluetooth headset may send the control gesture (for example, a long-press operation) identified in step S604 to the mobile phone. The mobile phone then determines the operation instruction corresponding to the control gesture and executes that operation instruction according to the following step S606. For example, after the Bluetooth headset recognizes that the control gesture input by the user on the fingerprint sensor is a long-press operation, it can send an identifier of the long-press operation (for example, 01) to the mobile phone.
  • The correspondence between different control gestures on the Bluetooth headset and different operation instructions in each application is stored in the mobile phone in advance. The mobile phone can then determine, according to this correspondence, the operation instruction corresponding to the long-press operation in the application currently running, for example, an instruction to pause playback. The mobile phone can then execute the instruction to pause playback, so that the audio being played is paused. In this way, the user can control a running application on the mobile phone to implement related functions by performing control gestures on the fingerprint sensor of the Bluetooth headset.
  • Alternatively, in step S605, the Bluetooth headset may send an operation instruction corresponding to the control gesture to the mobile phone.
  • In this implementation, the correspondence between different control gestures and different operation instructions may be stored in the Bluetooth headset in advance. For example, the operation instruction corresponding to an upward sliding operation is an instruction to increase the volume, and the operation instruction corresponding to a downward sliding operation is an instruction to decrease the volume. After the Bluetooth headset recognizes the user's control gesture on the fingerprint sensor, it can determine the operation instruction corresponding to the control gesture according to this correspondence. Further, the Bluetooth headset may send the operation instruction corresponding to the control gesture to the mobile phone, so that the mobile phone executes that operation instruction according to the following step S606. A minimal sketch of such a gesture-to-instruction mapping is shown below.
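  • The following is a hypothetical sketch of the correspondence table stored on the headset; the instruction names and the fallback behaviour are assumptions for illustration only (the slide-up, slide-down, and long-press examples follow the description above).

```python
# Hypothetical gesture-to-instruction correspondence stored on the Bluetooth headset.
# Instruction names are illustrative assumptions only.
GESTURE_TO_INSTRUCTION = {
    "slide_up": "VOLUME_UP",         # upward slide   -> increase the volume
    "slide_down": "VOLUME_DOWN",     # downward slide -> decrease the volume
    "long_press": "PAUSE_PLAYBACK",  # long press     -> pause playback (music app example)
}


def instruction_for(gesture: str):
    """Return the operation instruction to send to the phone, or None if unmapped."""
    return GESTURE_TO_INSTRUCTION.get(gesture)


if __name__ == "__main__":
    print(instruction_for("slide_up"))    # VOLUME_UP
    print(instruction_for("double_tap"))  # None -> fall back to sending the raw gesture
```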
  • In some scenarios, the mobile phone may set the Bluetooth module in the mobile phone to a sleep state. For example, if no data is transmitted over the Bluetooth connection between the Bluetooth headset and the mobile phone for a certain period of time (for example, 1 minute), the mobile phone can switch the Bluetooth module in the mobile phone to the sleep state to reduce the power consumption of the mobile phone. For another example, when the mobile phone enters the lock-screen state, the Bluetooth module can be automatically switched to the sleep state. Therefore, before the Bluetooth headset sends the identified control gesture or the operation instruction corresponding to the control gesture to the mobile phone, the Bluetooth headset may first send a wake-up command to the mobile phone.
  • In response to the wake-up command, the mobile phone can switch the Bluetooth module back to the working state and restore the Bluetooth connection with the Bluetooth headset in advance, so that the mobile phone can respond quickly once it receives the operation instruction sent by the Bluetooth headset over the Bluetooth connection. A minimal sketch of this send sequence is shown below.
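  • The following is a hypothetical sketch of the headset-side send sequence: a wake-up message is sent first, followed by the identified control gesture or the corresponding operation instruction. The `link` object, the message format, and the delay value are assumptions for illustration only.

```python
# Hypothetical sketch of the wake-then-send sequence on the headset side.
# Connection object, message format, and timing are illustrative assumptions.
import json
import time


def send_to_phone(link, payload: dict, wake_delay_s: float = 0.05):
    # 1. Ask the phone to switch its Bluetooth module back to the working state.
    link.write(json.dumps({"type": "WAKE_UP"}).encode())
    time.sleep(wake_delay_s)  # give the phone a moment to restore the connection

    # 2. Deliver the identified control gesture or the corresponding instruction.
    link.write(json.dumps(payload).encode())


class FakeLink:
    """Toy stand-in for the Bluetooth connection, used only to run the sketch."""
    def write(self, data: bytes) -> None:
        print("->", data.decode())


if __name__ == "__main__":
    send_to_phone(FakeLink(), {"type": "GESTURE", "gesture": "long_press",
                               "device_id": "AA:BB:CC:DD:EE:FF"})
```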
  • In other embodiments, the Bluetooth headset may also send the N consecutive images to the mobile phone after collecting them from the fingerprint sensor. The mobile phone then recognizes the control gesture input by the user based on the fingerprint patterns in the N consecutive images, and determines and executes the operation instruction corresponding to the recognized control gesture. In this way, the Bluetooth headset does not need to perform gesture recognition and related tasks, which reduces the implementation complexity and power consumption of the Bluetooth headset.
  • S606. The mobile phone executes the operation instruction corresponding to the control gesture.
  • In step S606, if the mobile phone receives the operation instruction sent by the Bluetooth headset, the mobile phone can execute the operation instruction directly. If the mobile phone receives the identified control gesture sent by the Bluetooth headset, the mobile phone may first determine the operation instruction corresponding to the control gesture and then execute that operation instruction. Since the operation instruction assigned to the same control gesture may differ between applications, after the mobile phone receives the control gesture sent by the Bluetooth headset, it can determine the operation instruction corresponding to the control gesture in the current application according to the specific application being run.
  • In addition, when the Bluetooth headset sends the control gesture or operation instruction to the mobile phone, it may also send its own device identification (such as a MAC address) to the mobile phone. Because the mobile phone stores the identifications of the legal Bluetooth devices that have passed authentication, the mobile phone can determine whether the currently connected Bluetooth headset is a legal Bluetooth device based on the received device identification. If the Bluetooth headset is a legal Bluetooth device, the mobile phone may go on to execute the operation instruction corresponding to the control gesture recognized by the Bluetooth headset; otherwise, the mobile phone may discard the control gesture or operation instruction sent by the Bluetooth headset, thereby avoiding the security issues caused by an illegal Bluetooth device maliciously manipulating the mobile phone. A minimal sketch of this phone-side check is shown below.
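  • The following is a hypothetical sketch of the phone-side handling described above: the sender's device identification is checked against the list of legal devices, and the gesture or instruction is then executed for the current application or discarded. The message fields and the per-application mapping are assumptions for illustration only.

```python
# Hypothetical phone-side dispatch: verify the sender, then resolve and execute.
# All names, message fields, and mappings are illustrative assumptions.
LEGAL_DEVICES = {"aa:bb:cc:dd:ee:ff"}   # identifiers of authenticated headsets

# The same control gesture can map to different instructions in different apps (S606).
APP_GESTURE_MAP = {
    "music_player": {"long_press": "PAUSE_PLAYBACK", "slide_up": "VOLUME_UP"},
    "camera": {"long_press": "TAKE_PHOTO"},
}


def handle_message(message: dict, current_app: str):
    if message.get("device_id", "").lower() not in LEGAL_DEVICES:
        return None  # illegal device: discard the gesture or instruction

    if "instruction" in message:       # the headset already resolved the instruction
        return message["instruction"]

    gesture = message.get("gesture")   # otherwise resolve it for the running app
    return APP_GESTURE_MAP.get(current_app, {}).get(gesture)


if __name__ == "__main__":
    msg = {"device_id": "AA:BB:CC:DD:EE:FF", "gesture": "long_press"}
    print(handle_message(msg, "music_player"))  # PAUSE_PLAYBACK
    print(handle_message({"device_id": "11:22:33:44:55:66", "gesture": "long_press"},
                         "music_player"))       # None -> discarded
```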
  • In addition, the user may also enter a setting interface 901 in the mobile phone for managing legal devices. In the setting interface 901, the user may manually add a new control gesture or delete an old control gesture for a corresponding legal device, and may manually set the operation instruction corresponding to each control gesture, so that the user can obtain a customized touch experience with a legitimate device.
  • In summary, the Bluetooth headset can use the fingerprint sensor to recognize a control gesture input by the user on the fingerprint sensor, and then control the mobile phone to execute the operation instruction corresponding to the control gesture. Because the sensing units in the fingerprint sensor are smaller and more highly integrated, the Bluetooth headset has higher sensitivity and accuracy when recognizing user gestures. At the same time, because the fingerprint sensor can recognize accidental operations triggered by something other than the user's finger, using the above touch control method to control the mobile phone reduces the probability of the Bluetooth headset and the mobile phone being triggered by mistake and reduces the power consumption of the Bluetooth headset and the mobile phone.
  • As shown in FIG. 10, an embodiment of the present application further discloses a wearable device.
  • The wearable device may include: a fingerprint sensor 1001; one or more processors 1002; a memory 1003; a communication interface 1004; one or more application programs (not shown); and one or more computer programs 1005.
  • the above-mentioned devices may be connected through one or more communication buses 1006.
  • the one or more computer programs 1005 are stored in the memory 1003 and are configured to be executed by the one or more processors 1002.
  • The one or more computer programs 1005 include instructions that can be used to perform the steps shown in FIG. 6 and the corresponding steps in the respective embodiments.
  • For example, the processor 1002 may be the computing module 207 in FIG. 2, the memory 1003 may be the storage module 208 in FIG. 2, and the communication interface 1004 may be the communication module 205 in FIG. 2.
  • the wearable device shown in FIG. 10 may further include components such as a microphone 201, an acceleration sensor 203, a proximity light sensor 204, a speaker 206, and a power supply 209 shown in FIG. 2, which are not limited in the embodiment of the present application.
  • Each functional unit in each of the embodiments of the present application may be integrated into one processing unit, or each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above integrated unit may be implemented in the form of hardware or in the form of software functional unit.
  • the integrated unit When the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium.
  • The technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product.
  • The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application.
  • The foregoing storage medium includes any medium that can store program code, such as a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

Disclosed are a touch control method for a wearable device, a system, and a wearable device, which relate to the field of communications technologies, can improve the sensitivity and accuracy of a wearable device when a user gesture is recognized, and can reduce the probability of a wearable device or a terminal being triggered by mistake. The method comprises the following steps: a wearable device uses a fingerprint sensor to detect a touch operation input by a user; the wearable device determines whether the touch operation includes a fingerprint input; if the touch operation includes a fingerprint input, the wearable device recognizes a control gesture corresponding to the touch operation; and the wearable device sends the control gesture to a terminal, or the wearable device sends to the terminal an operation instruction corresponding to the control gesture, so that the terminal executes the operation instruction corresponding to the control gesture, wherein a communication connection is established between the wearable device and the terminal.
PCT/CN2018/097675 2018-07-27 2018-07-27 Procédé de commande tactile pour dispositif vestimentaire, et système et dispositif vestimentaire WO2020019355A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2018/097675 WO2020019355A1 (fr) 2018-07-27 2018-07-27 Procédé de commande tactile pour dispositif vestimentaire, et système et dispositif vestimentaire
CN201880094859.XA CN112334860B (zh) 2018-07-27 2018-07-27 一种可穿戴设备的触控方法、可穿戴设备及系统

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/097675 WO2020019355A1 (fr) 2018-07-27 2018-07-27 Procédé de commande tactile pour dispositif vestimentaire, et système et dispositif vestimentaire

Publications (1)

Publication Number Publication Date
WO2020019355A1 true WO2020019355A1 (fr) 2020-01-30

Family

ID=69182163

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/097675 WO2020019355A1 (fr) 2018-07-27 2018-07-27 Procédé de commande tactile pour dispositif vestimentaire, et système et dispositif vestimentaire

Country Status (2)

Country Link
CN (1) CN112334860B (fr)
WO (1) WO2020019355A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111736688A (zh) * 2020-02-27 2020-10-02 珠海市杰理科技股份有限公司 蓝牙耳机、系统及其手势识别的方法
CN111814586A (zh) * 2020-06-18 2020-10-23 维沃移动通信有限公司 指纹模组控制方法、装置、电子设备及可读存储介质
CN114205708A (zh) * 2021-12-17 2022-03-18 深圳市鑫正宇科技有限公司 一种骨传导蓝牙耳机的智能语音触控系统和方法
CN115665313A (zh) * 2021-07-09 2023-01-31 华为技术有限公司 设备控制方法及电子设备

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113709617A (zh) * 2021-08-27 2021-11-26 Oppo广东移动通信有限公司 无线耳机的控制方法、装置、无线耳机及存储介质
CN115562472B (zh) * 2022-02-11 2023-09-22 荣耀终端有限公司 手势交互方法、介质及电子设备

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104503577A (zh) * 2014-12-19 2015-04-08 广东欧珀移动通信有限公司 一种通过可穿戴式设备控制移动终端的方法及装置
CN104536561A (zh) * 2014-12-10 2015-04-22 金硕澳门离岸商业服务有限公司 采用可穿戴设备控制终端设备操作的方法及可穿戴设备
CN104581480A (zh) * 2014-12-18 2015-04-29 周祥宇 一种触控耳机系统及其触控命令的识别方法
CN106462342A (zh) * 2016-09-29 2017-02-22 深圳市汇顶科技股份有限公司 指纹导航方法及指纹导航信号产生的装置

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7194116B2 (en) * 2004-04-23 2007-03-20 Sony Corporation Fingerprint image reconstruction based on motion estimate across a narrow fingerprint sensor
JP4411152B2 (ja) * 2004-07-05 2010-02-10 Necインフロンティア株式会社 指紋読取方法、指紋読取システム及びプログラム
CN104320591B (zh) * 2014-11-21 2018-07-03 广东欧珀移动通信有限公司 一种控制摄像头前后切换的方法、装置和一种智能终端
CN104700079A (zh) * 2015-03-06 2015-06-10 南昌欧菲生物识别技术有限公司 指纹识别模组及基于指纹识别的触控屏
CN105354544A (zh) * 2015-10-29 2016-02-24 小米科技有限责任公司 指纹识别方法及装置
CN105739897A (zh) * 2016-01-29 2016-07-06 宇龙计算机通信科技(深圳)有限公司 一种触控操作处理的方法、装置以及终端
CN106062778B (zh) * 2016-04-01 2019-05-07 深圳市汇顶科技股份有限公司 指纹识别方法、装置和终端
CN105938403A (zh) * 2016-06-14 2016-09-14 无锡天脉聚源传媒科技有限公司 一种基于指纹识别的光标控制方法及装置
CN106469265A (zh) * 2016-09-30 2017-03-01 北京小米移动软件有限公司 电子设备唤醒方法、装置以及电子设备
CN106547465A (zh) * 2016-10-14 2017-03-29 青岛海信移动通信技术股份有限公司 一种移动终端的快速操作方法及移动终端
CN107748648A (zh) * 2017-10-27 2018-03-02 维沃移动通信有限公司 防止指纹传感器误触发的方法和终端设备

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104536561A (zh) * 2014-12-10 2015-04-22 金硕澳门离岸商业服务有限公司 采用可穿戴设备控制终端设备操作的方法及可穿戴设备
CN104581480A (zh) * 2014-12-18 2015-04-29 周祥宇 一种触控耳机系统及其触控命令的识别方法
CN104503577A (zh) * 2014-12-19 2015-04-08 广东欧珀移动通信有限公司 一种通过可穿戴式设备控制移动终端的方法及装置
CN106462342A (zh) * 2016-09-29 2017-02-22 深圳市汇顶科技股份有限公司 指纹导航方法及指纹导航信号产生的装置

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111736688A (zh) * 2020-02-27 2020-10-02 珠海市杰理科技股份有限公司 蓝牙耳机、系统及其手势识别的方法
CN111814586A (zh) * 2020-06-18 2020-10-23 维沃移动通信有限公司 指纹模组控制方法、装置、电子设备及可读存储介质
CN115665313A (zh) * 2021-07-09 2023-01-31 华为技术有限公司 设备控制方法及电子设备
CN114205708A (zh) * 2021-12-17 2022-03-18 深圳市鑫正宇科技有限公司 一种骨传导蓝牙耳机的智能语音触控系统和方法
CN114205708B (zh) * 2021-12-17 2024-05-31 深圳市鑫正宇科技有限公司 一种骨传导蓝牙耳机的智能语音触控系统和方法

Also Published As

Publication number Publication date
CN112334860B (zh) 2023-06-02
CN112334860A (zh) 2021-02-05

Similar Documents

Publication Publication Date Title
EP3822831B1 (fr) Procédé de reconnaissance vocale, dispositif portable et dispositif électronique
CN112351322B (zh) 一种通过遥控器实现一碰投屏的终端设备、方法以及系统
WO2020019355A1 (fr) Procédé de commande tactile pour dispositif vestimentaire, et système et dispositif vestimentaire
CN111369988A (zh) 一种语音唤醒方法及电子设备
WO2020019176A1 (fr) Procédé de mise à jour de voix de réveil d'un assistant vocal par un terminal, et terminal
WO2021213151A1 (fr) Procédé de commande d'affichage et dispositif portable
WO2022089000A1 (fr) Procédé de vérification de système de fichiers, dispositif électronique et support de stockage lisible par ordinateur
WO2020034104A1 (fr) Procédé de reconnaissance vocale, dispositif pouvant être porté et système
WO2020051852A1 (fr) Procédé d'enregistrement et d'affichage d'informations dans un processus de communication, et terminaux
CN113676339B (zh) 组播方法、装置、终端设备及计算机可读存储介质
WO2020221062A1 (fr) Procédé d'opération de navigation et dispositif électronique
CN113728295A (zh) 控屏方法、装置、设备及存储介质
WO2020062304A1 (fr) Procédé de transmission de fichier et dispositif électronique
CN113467735A (zh) 图像调整方法、电子设备及存储介质
CN114554012A (zh) 来电接听方法、电子设备及存储介质
CN114089902A (zh) 手势交互方法、装置及终端设备
CN115119336B (zh) 耳机连接系统、方法、耳机、电子设备及可读存储介质
CN113129916A (zh) 一种音频采集方法、系统及相关装置
CN115665632A (zh) 音频电路、相关装置和控制方法
CN113467747B (zh) 音量调节方法、电子设备及存储介质
CN114116610A (zh) 获取存储信息的方法、装置、电子设备和介质
CN113867520A (zh) 设备控制方法、电子设备和计算机可读存储介质
CN115525366A (zh) 一种投屏方法及相关装置
CN114125144B (zh) 一种防误触的方法、终端及存储介质
CN114115513B (zh) 一种按键控制方法和一种按键装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18927890

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18927890

Country of ref document: EP

Kind code of ref document: A1