WO2021052139A1 - Gesture input method and electronic device - Google Patents

Gesture input method and electronic device

Info

Publication number
WO2021052139A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
electronic device
sub-gesture
user interface
frame rate
Application number
PCT/CN2020/112039
Other languages
English (en)
Chinese (zh)
Inventor
葛振华
姚家雄
杨琪
吕臻凯
陈松林
Original Assignee
Huawei Technologies Co., Ltd.
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2021052139A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • This application relates to the field of artificial intelligence (AI) technology, and in particular to a gesture input method and electronic device.
  • At present, the ways in which people interact with mobile terminals mainly include touch-screen sliding, voice, and so on.
  • Dynamic gesture interaction plays an irreplaceable role in enhancing the interactive experience of mobile terminals.
  • In the prior art, the interaction process of dynamic gestures is as follows: the user inputs a specified single gesture and holds it for a period of time to activate the gesture interaction function, and then changes the gesture so that the terminal performs the corresponding function. During this interaction, the user waits for a long time, the interaction efficiency is low, the gesture must be changed during the interaction, and the user experience is poor.
  • To solve this, the present application discloses a gesture input method and electronic device, which enable a user to interact with a terminal through gestures. Because the input gesture itself includes the wake-up gesture, no additional wake-up gesture needs to be input; the interaction efficiency during the interaction process is high, and the user experience is good.
  • In a first aspect, an embodiment of the present application provides a gesture input method, which is applied to an electronic device. The method includes: the electronic device acquires a first gesture, where the first gesture includes a first sub-gesture and a second sub-gesture; when the first sub-gesture matches a preset gesture, the function corresponding to the second sub-gesture is recognized; and the function corresponding to the second sub-gesture is executed.
  • In the embodiments of the present application, the operation gesture may include a wake-up gesture; that is, the initial gesture of the operation gesture may serve as the wake-up gesture. This reduces the gesture operation delay, improves the response efficiency of the electronic device, and improves the interaction efficiency between the user and the electronic device.
  • The electronic device acquiring the first gesture includes: the electronic device acquires the first sub-gesture at a first frame rate. When the first sub-gesture matches the preset gesture, recognizing the function corresponding to the second sub-gesture includes: when the first sub-gesture matches the preset gesture, continuing to acquire the first sub-gesture at a second frame rate; when the continuously acquired first sub-gesture matches the preset gesture, acquiring the second sub-gesture at a third frame rate, where the third frame rate is greater than the first frame rate; and determining the function corresponding to the second sub-gesture.
  • After the first sub-gesture first matches the preset gesture, the first sub-gesture may be further collected to determine whether the subsequently collected first sub-gesture still matches the preset gesture. This can reduce the probability of false triggers and avoid the waste of power consumption caused by falsely triggered gestures.
  • Increasing the frame rate of image collection can increase the speed at which the electronic device recognizes the gesture, thereby improving the response speed of the electronic device and the interaction efficiency between the user and the electronic device.
  • the second frame rate is equal to the third frame rate.
  • The embodiment of the application can switch the frame rate of image capture immediately after detecting for the first time that the first sub-gesture matches the preset gesture, and then collect the subsequent first sub-gesture and second sub-gesture at the third frame rate.
  • the probability of false triggering can be reduced, the response speed of the electronic device can be improved to the greatest extent, and the interaction efficiency between the user and the electronic device can be improved.
  • the second frame rate is equal to the first frame rate.
  • the above-mentioned first frame rate may be 10 frames per second, that is, the camera of the electronic device may collect 10 frames of images per second.
  • the captured multi-frame images may include the first sub-gesture.
  • the aforementioned third frame rate may be 30 frames per second, that is, the camera of the electronic device may collect 30 frames of images per second.
  • the captured image stream may include a second sub-gesture.
  • When the continuously acquired first sub-gesture matches the preset gesture, acquiring the second sub-gesture at the third frame rate includes: when x of X consecutive first sub-gestures match the preset gesture, acquiring the second sub-gesture at the third frame rate, where X and x are both positive integers and X is greater than or equal to x.
  • the present application provides a specific method for determining the match between the first sub-gesture and the preset gesture to ensure the accuracy of recognizing the first sub-gesture, reduce the probability of false triggering, and avoid waste of power consumption due to false triggering of the gesture.
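To make the two-stage acquisition concrete, the following is a minimal sketch in Python of the logic described above, assuming hypothetical camera.capture, matches_preset, and recognize_function interfaces; the frame rates and the X/x values are illustrative placeholders (the application only gives 10 and 30 frames per second as examples).

    # Hedged sketch of the first aspect's acquisition flow; every name and
    # constant here is a hypothetical stand-in, not the patented implementation.
    FIRST_RATE = 10   # first frame rate (low, power-saving)
    SECOND_RATE = 10  # second frame rate (may equal the first or the third)
    THIRD_RATE = 30   # third frame rate (high, fast recognition)
    X, x = 5, 3       # x matches among X consecutive frames confirm the gesture

    def acquire_gesture(camera, matches_preset, recognize_function):
        # Stage 1: watch for the first sub-gesture at the first frame rate.
        while True:
            frame = camera.capture(fps=FIRST_RATE)
            if matches_preset(frame):
                break
        # Stage 2: keep collecting the first sub-gesture at the second frame
        # rate; require x matches among X consecutive frames to cut false triggers.
        window = [matches_preset(camera.capture(fps=SECOND_RATE)) for _ in range(X)]
        if sum(window) < x:
            return None  # likely a false trigger; caller returns to stage 1
        # Stage 3: acquire the second sub-gesture at the third frame rate
        # and determine its corresponding function.
        frames = [camera.capture(fps=THIRD_RATE) for _ in range(THIRD_RATE)]
        return recognize_function(frames)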
  • the method further includes: the electronic device displays a first user interface.
  • When the first sub-gesture matches the preset gesture, recognizing the function corresponding to the second sub-gesture includes: displaying an indicator in the first user interface and recognizing the function corresponding to the second sub-gesture, where the indicator is used to indicate that the electronic device is recognizing the gesture.
  • In the process of recognizing the second sub-gesture, an indicator may be displayed on the display interface of the electronic device to prompt the user that the gesture is currently being recognized, which may improve the interaction experience between the user and the electronic device.
  • When the first sub-gesture matches the preset gesture, recognizing the function corresponding to the second sub-gesture may also include: displaying a prompt in the first user interface and recognizing the function corresponding to the second sub-gesture, where the prompt is used to indicate the respective functions corresponding to the multiple gestures supported by the electronic device.
  • A prompt whose display style is associated with the supported functional gestures can be displayed in the user interface, which can reduce the user's memory burden and improve the interaction efficiency between the user and the electronic device.
  • The electronic device displays a first user interface, and the function corresponding to the second sub-gesture is a first function. After the function corresponding to the second sub-gesture is executed, the method further includes: the electronic device displays a second user interface, where the second user interface is different from the first user interface; the electronic device acquires the first gesture; when the first sub-gesture matches the preset gesture, the function corresponding to the second sub-gesture is recognized as a second function, where the first function is different from the second function; and the second function is executed.
  • In this way, when the electronic device displays different user interfaces, the user can input the same gesture to achieve different functions.
  • the gesture response function corresponding to the user interface can be set according to the attributes of the content currently displayed on the user interface, which can fully improve the utilization rate of gesture input and enhance the interest of the interaction between the user and the electronic device.
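As an illustration of this per-interface binding, here is a hypothetical sketch; the interface names, gesture names, and function names are all invented for the example and are not taken from the application.

    # Hypothetical: the same second sub-gesture resolves to a different
    # function depending on the currently displayed user interface.
    GESTURE_BINDINGS = {
        "video_player_ui": {"palm_push": "pause_playback"},
        "reader_ui":       {"palm_push": "turn_page"},
    }

    def function_for(ui, second_sub_gesture):
        # Returns None when the current interface does not support the gesture.
        return GESTURE_BINDINGS.get(ui, {}).get(second_sub_gesture)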
  • In a second aspect, an embodiment of the present application provides an electronic device, including: one or more processors, a memory, a display screen, and a camera. The memory, the display screen, and the camera are coupled to the one or more processors; the memory is used to store computer program code, and the computer program code includes computer instructions.
  • When the one or more processors execute the computer instructions, the electronic device executes: acquiring a first gesture, where the first gesture includes a first sub-gesture and a second sub-gesture; when the first sub-gesture matches a preset gesture, recognizing the function corresponding to the second sub-gesture; and executing the function corresponding to the second sub-gesture.
  • The above-mentioned electronic device executing the acquisition of the first gesture specifically executes: acquiring the first sub-gesture at the first frame rate. The above-mentioned electronic device executing, when the first sub-gesture matches the preset gesture, the recognition of the function corresponding to the second sub-gesture specifically executes: when the first sub-gesture matches the preset gesture, continuing to acquire the first sub-gesture at the second frame rate; when the continuously acquired first sub-gesture matches the preset gesture, acquiring the second sub-gesture at the third frame rate, where the third frame rate is greater than the first frame rate; and determining the function corresponding to the second sub-gesture.
  • the second frame rate is equal to the third frame rate.
  • the second frame rate is equal to the first frame rate.
  • the above-mentioned first frame rate may be 10 frames per second, that is, the camera of the electronic device may collect 10 frames of images per second.
  • the captured multi-frame images may include the first sub-gesture.
  • the aforementioned third frame rate may be 30 frames per second, that is, the camera of the electronic device may collect 30 frames of images per second.
  • the captured image stream may include a second sub-gesture.
  • The above electronic device executing, when the continuously acquired first sub-gesture matches the preset gesture, the acquisition of the second sub-gesture at the third frame rate specifically executes: when x of X consecutive first sub-gestures match the preset gesture, acquiring the second sub-gesture at the third frame rate, where X and x are both positive integers and X is greater than or equal to x.
  • Before the above-mentioned electronic device acquires the first gesture, the electronic device further executes: displaying the first user interface.
  • The above electronic device executing, when the first sub-gesture matches the preset gesture, the recognition of the function corresponding to the second sub-gesture specifically executes: when the first sub-gesture matches the preset gesture, displaying an indicator on the first user interface and recognizing the function corresponding to the second sub-gesture, where the indicator is used to indicate that the electronic device is recognizing the gesture.
  • The above-mentioned electronic device executing, when the first sub-gesture matches the preset gesture, the recognition of the function corresponding to the second sub-gesture may also specifically execute: when the first sub-gesture matches the preset gesture, displaying a prompt in the first user interface and recognizing the function corresponding to the second sub-gesture, where the prompt is used to indicate the respective functions corresponding to the multiple gestures supported by the electronic device.
  • The electronic device displays a first user interface, and the function corresponding to the second sub-gesture is the first function. After the electronic device executes the function corresponding to the second sub-gesture, the electronic device further executes: displaying a second user interface, where the second user interface is different from the first user interface; acquiring the first gesture; when the first sub-gesture matches the preset gesture, recognizing the function corresponding to the second sub-gesture as the second function, where the first function is different from the second function; and executing the second function.
  • In a third aspect, the embodiments of the present application provide a computer-readable storage medium with instructions stored therein which, when run on an electronic device, cause the electronic device to execute the method provided by the first aspect or any one of the implementations of the first aspect.
  • In a fourth aspect, the embodiments of the present application provide a computer program product that, when run on an electronic device, causes the electronic device to execute the method provided by the first aspect or any one of the implementations of the first aspect.
  • The electronic device provided in the second aspect, the computer storage medium provided in the third aspect, and the computer program product provided in the fourth aspect are all used to implement the gesture input method provided in the first aspect. Therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding method, which will not be repeated here.
  • FIG. 1 is a schematic diagram of the structure of an electronic device provided by an embodiment of the application.
  • FIG. 2 is a block diagram of the software structure of an electronic device provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a user input gesture provided by an embodiment of the application.
  • FIG. 4 is a schematic diagram of the structure of another electronic device provided by an embodiment of this application.
  • FIG. 5 is a schematic diagram of an image acquisition process provided by an embodiment of this application.
  • FIG. 6 is a schematic diagram of another image acquisition process provided by an embodiment of the application.
  • FIG. 7 is a schematic diagram of application scenario 1 provided by an embodiment of the application.
  • FIG. 8 is a schematic diagram of some user interfaces of application scenario 1 provided by an embodiment of the application.
  • FIG. 9 is a schematic diagram of other user interfaces of application scenario 1 provided by an embodiment of this application.
  • FIG. 10 is a schematic diagram of application scenario 2 provided by an embodiment of the application.
  • FIG. 11 is a schematic diagram of some user interfaces of application scenario 2 provided by an embodiment of the application.
  • FIG. 12 is a schematic diagram of some user interfaces of application scenario 3 provided by an embodiment of the application.
  • FIG. 13 is a gesture interaction prompt provided by an embodiment of this application.
  • FIG. 14 is another gesture interaction prompt provided by an embodiment of this application.
  • FIG. 15 is a schematic flowchart of a gesture input method provided by an embodiment of the application.
  • the embodiment of the application provides a gesture input method.
  • When the electronic device recognizes the initial action of any gesture input by the user, it can first determine whether the gesture is a gesture supported by the electronic device; if so, it continues to recognize the gesture input by the user and determines the function corresponding to this gesture.
  • the initial action of each operation gesture supported by the electronic device can be used as a wake-up gesture of the gesture interaction function.
  • Compared with the prior art, the present application can omit the wake-up gesture, reduce the time for the user to input an air gesture, reduce the gesture operation delay, and improve the interaction efficiency between the user and the electronic device.
  • The electronic devices involved in the embodiments of this application may be mobile phones, tablet computers, desktop computers, laptop computers, notebook computers, ultra-mobile personal computers (UMPC), handheld computers, netbooks, personal digital assistants (PDA), wearable electronic devices, virtual reality devices, and so on.
  • FIG. 1 shows a schematic diagram of the structure of an electronic device 100.
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and ambient light Sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • The processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • The memory can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
  • the processor 110 may include one or more interfaces.
  • The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, which includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses.
  • the processor 110 may couple the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the electronic device 100.
  • the I2S interface can be used for audio communication.
  • the processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through an I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a two-way communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with the display screen 194, the camera 193 and other peripheral devices.
  • the MIPI interface includes a camera serial interface (camera serial interface, CSI), a display serial interface (display serial interface, DSI), and so on.
  • the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the electronic device 100.
  • the processor 110 and the display screen 194 communicate through a DSI interface to realize the display function of the electronic device 100.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that complies with the USB standard specification, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transfer data between the electronic device 100 and peripheral devices. It can also be used to connect earphones and play audio through earphones. This interface can also be used to connect to other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is merely a schematic description, and does not constitute a structural limitation of the electronic device 100.
  • the electronic device 100 may also adopt different interface connection modes in the foregoing embodiments, or a combination of multiple interface connection modes.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive the wireless charging input through the wireless charging coil of the electronic device 100. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves by the antenna 1, and perform processing such as filtering, amplifying and transmitting the received electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic wave radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After the low-frequency baseband signal is processed by the baseband processor, it is passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and so on.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 may also receive the signal to be sent from the processor 110, perform frequency modulation, amplify it, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, connected to the display 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, and the like.
  • the display screen 194 includes a display panel.
  • The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may include one or N display screens 194, and N is a positive integer greater than one.
  • the electronic device 100 can realize a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing and is converted into an image visible to the naked eye. ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, the ISP may be provided in the camera 193. In this embodiment of the present application, the ISP may process the images collected by the camera 193 that include user gestures.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and is projected to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 100 may include one or N cameras 193, and N is a positive integer greater than one.
  • the camera 193 may be used to capture an image stream at a first frequency, and when the NPU recognizes that the image stream contains the initial part of a gesture supported by the electronic device, the image stream is captured at the second frequency. Among them, the first frequency is lower than the second frequency.
  • the camera 193 involved in the embodiment of the present application may be a front camera.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • NPU is a neural-network (NN) computing processor.
  • the NPU can be used to process the image collected by the camera 193, and analyze whether the gesture contained in the image is the initial part of the gesture supported by the electronic device 100, or whether the gesture contained in the image is supported by the electronic device 100 gesture.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, at least one application program (such as a sound playback function, an image playback function, etc.) required by at least one function.
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by running instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the internal memory 121 may be used to store the model of the gesture supported by the electronic device 100 and the function corresponding to each supported gesture. Possibly, in the application interface of different applications, the gestures supported may be different, and the functions implemented by the same gesture may also be different.
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • The speaker 170A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B also called “earpiece” is used to convert audio electrical signals into sound signals.
  • the electronic device 100 answers a call or voice message, it can receive the voice by bringing the receiver 170B close to the human ear.
  • The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
  • The user can make a sound with the mouth close to the microphone 170C, inputting the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
  • the earphone interface 170D may be a USB interface 130, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be provided on the display screen 194.
  • the capacitive pressure sensor may include at least two parallel plates with conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations that act on the same touch position but have different touch operation strengths may correspond to different operation instructions. For example, when a touch operation whose intensity of the touch operation is less than the first pressure threshold is applied to the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
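As a toy illustration of such pressure-threshold dispatch (the threshold value and handler names below are invented for the example, not taken from the application):

    # Hypothetical sketch: dispatch a touch on the short message icon by
    # comparing the sensed pressure against the first pressure threshold.
    FIRST_PRESSURE_THRESHOLD = 0.5  # normalized pressure; illustrative value

    def on_sms_icon_touch(pressure):
        if pressure < FIRST_PRESSURE_THRESHOLD:
            return "view_short_message"      # lighter press: view messages
        return "create_new_short_message"    # firmer press: compose a message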
  • the gyro sensor 180B may be used to determine the movement posture of the electronic device 100.
  • In some embodiments, the angular velocity of the electronic device 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for shooting anti-shake.
  • the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shake of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip holster.
  • The electronic device 100 can detect the opening and closing of the flip according to the magnetic sensor 180D, and then set features such as automatic unlocking of the flip cover according to the detected opening and closing state.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and apply to applications such as horizontal and vertical screen switching, pedometers and so on.
  • The distance sensor 180F is used to measure distance; the electronic device 100 can measure distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F to measure the distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • the electronic device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 can determine that there is no object near the electronic device 100.
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense the brightness of the ambient light.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived brightness of the ambient light.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • the electronic device 100 when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 due to low temperature.
  • the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
  • The touch sensor 180K is also called a "touch device".
  • the touch sensor 180K may be disposed on the display screen 194, and the touch screen is composed of the touch sensor 180K and the display screen 194, which is also called a “touch screen”.
  • the touch sensor 180K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100, which is different from the position of the display screen 194.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can obtain the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the human pulse and receive the blood pressure pulse signal.
  • the bone conduction sensor 180M may also be provided in the earphone, combined with the bone conduction earphone.
  • the audio module 170 can parse the voice signal based on the vibration signal of the vibrating bone block of the voice obtained by the bone conduction sensor 180M, and realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal obtained by the bone conduction sensor 180M, and realize the heart rate detection function.
  • the button 190 includes a power-on button, a volume button, and so on.
  • the button 190 may be a mechanical button. It can also be a touch button.
  • the electronic device 100 may receive key input, and generate key signal input related to user settings and function control of the electronic device 100.
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations that act on different applications can correspond to different vibration feedback effects.
  • For touch operations acting on different areas of the display screen 194, the motor 191 can also produce different vibration feedback effects.
  • Different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 195 is used to connect to the SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact and separation with the electronic device 100.
  • the electronic device 100 may support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 195 can insert multiple cards at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 may also be compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present invention takes an Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100 by way of example.
  • FIG. 2 is a block diagram of the software structure of the electronic device 100 according to an embodiment of the present invention.
  • The layered architecture divides the software into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, etc.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include a window manager, a content provider, a view system, a phone manager, a resource manager, and a notification manager.
  • the window manager is used to manage window programs.
  • the window manager can obtain the size of the display, determine whether there is a status bar, lock the screen, take a screenshot, etc.
  • the content provider is used to store and retrieve data and make these data accessible to applications.
  • the data may include videos, images, audios, phone calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, and so on.
  • the view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface that includes a short message notification icon may include a view that displays text and a view that displays pictures.
  • the phone manager is used to provide the communication function of the electronic device 100. For example, the management of the call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • The notification manager enables the application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, and so on.
  • The notification manager can also present notifications in the status bar at the top of the system in the form of a chart or scroll-bar text, such as a notification of an application running in the background, or present notifications on the screen in the form of a dialog window. For example, text messages are prompted in the status bar, a prompt sound is played, the electronic device vibrates, and the indicator light flashes.
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), three-dimensional graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides a combination of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes the touch operation into the original input event (including touch coordinates, time stamp of the touch operation, etc.).
  • the original input events are stored in the kernel layer.
  • The application framework layer obtains the original input event from the kernel layer and identifies the control corresponding to the input event. Taking, as an example, the touch operation being a touch click operation whose corresponding control is the camera application icon: the camera application calls the interface of the application framework layer to start the camera application, which then starts the camera driver by calling the kernel layer.
  • the camera 193 captures still images or videos.
  • Fig. 3 exemplarily shows a schematic diagram of an air input gesture.
  • The user can make the initial part of the functional gesture within a preset range in front of the front camera 193 of the electronic device 100, such as 5 cm to 30 cm.
  • the functional gesture is a gesture supported by the electronic device 100.
  • the image data collected by the front camera 193 may be RGB data or grayscale data.
  • the depth data of the user’s gesture can also be collected through a depth camera or infrared sensor.
  • In this case, the functional gesture data stored in the internal memory 121 is depth data, that is, a three-dimensional model of the functional gesture.
  • the depth camera may be, for example, a time of flight (tof) camera.
  • the depth data of the user's gesture can also be collected through a radar sensor.
  • Fig. 4 exemplarily shows a schematic structural diagram of another electronic device provided by an embodiment of the present application.
  • the electronic device 200 may include a bright screen detection module 210, an image acquisition module 220, a low-power motion detection module 230, a gesture recognition module 240, and an execution module 250.
  • the bright screen detection module 210 may be used to detect whether the electronic device 200 is in a bright-screen state, that is, whether the display screen of the electronic device 200 is lit. If the electronic device 200 is in the bright-screen state, the image acquisition module 220 starts to acquire multiple frames of images at the first frame rate. The captured frames are fed, frame by frame, into the low-power motion detection module 230. If the low-power motion detection module 230 outputs that the initial part of a supported gesture exists in the frames, the image acquisition module 220 acquires an image stream at the third frame rate and feeds it to the gesture recognition module 240. The result output by the gesture recognition module 240 may be the recognition result of a functional gesture.
  • the execution module 250 may execute an event corresponding to the functional gesture according to the recognition result output by the gesture recognition module 240.
  • the image stream may include multiple frames of continuous images.
  • the first frame rate and the third frame rate are the frequencies at which the image acquisition module 220 acquires images.
  • the first frame rate is less than the third frame rate. That is, when no gesture has been recognized, the image acquisition module 220 acquires images at the first frame rate; once the input gesture is determined to be the initial part of a functional gesture, it acquires the image stream at the third frame rate. This preserves the rate at which the electronic device 200 recognizes and responds to gestures while saving power consumption as much as possible and improving the battery endurance of the electronic device 200.
  • the above-mentioned first frame rate may be, for example, 10 frames/sec, and the third frame rate may be, for example, 30 frames/sec.
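  • As a rough sketch of the dual frame-rate logic above: the `capture`, `detect_initial_gesture`, and `recognize_gesture` callables below are hypothetical placeholders standing in for the image acquisition module 220, the low-power motion detection module 230, and the gesture recognition module 240, and the frame-rate values are the example values of this embodiment.

```python
FIRST_FRAME_RATE = 10   # frames/sec while no gesture has been detected
THIRD_FRAME_RATE = 30   # frames/sec once an initial gesture is suspected

def acquisition_loop(capture, detect_initial_gesture, recognize_gesture):
    """Stay at the low frame rate until the initial part of a functional
    gesture is seen, then switch up and run full gesture recognition."""
    fps = FIRST_FRAME_RATE
    while True:
        frame = capture(fps)                   # grab one frame at `fps`
        if fps == FIRST_FRAME_RATE:
            if detect_initial_gesture(frame):  # low-power motion detection
                fps = THIRD_FRAME_RATE         # switch to the high rate
        else:
            result = recognize_gesture(frame)  # gesture recognition module
            if result is not None:
                return result                  # functional gesture found
```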
  • the image acquisition module 220 may be the front camera 193, depth camera, infrared sensor or radar sensor mentioned in FIG. 3.
  • the aforementioned low-power consumption action detection module 230 and gesture recognition module 240 may both be implemented by algorithms executed by the NPU.
  • the low power consumption action detection module 230 may include a first neural network model and a post-processing module.
  • the first neural network model can be used to identify whether the gesture contained in the input picture is the initial gesture.
  • the electronic device 100 may input the picture containing the gesture collected by the image acquisition module 220 into the model, and the model will output the recognition result of the initial gesture.
  • the recognition result can be, but is not limited to being, characterized by 0 or 1. For example, when the recognition result is 1, the input gesture is characterized as the initial gesture; when the recognition result is 0, the input gesture is characterized as not the initial gesture.
  • the post-processing module can be used to process the recognition results corresponding to each frame of the multiple frames of images output by the first neural network model, and determine whether the gesture recognition module 240 needs to be awakened according to the aforementioned recognition results.
  • the gesture recognition module 240 may include a second neural network model.
  • the second neural network model can be used to identify gestures that may be included in the input image stream and the probability corresponding to each gesture.
  • the electronic device 100 can input the image stream collected by the image collection module 220 into the model, and the model outputs the result of gesture recognition.
  • the recognition result may include multiple possible functional gestures and the probability of each functional gesture.
  • the above-mentioned execution module 250 may be a CPU.
  • the recognition result of the functional gesture can be used to characterize what the functional gesture input by the user is, for example, whether the wrist is turned up or down.
  • the CPU may determine the event corresponding to the functional gesture according to the recognition result of the functional gesture, and execute the event.
  • the recognition result of the functional gesture can be used to characterize the event corresponding to the functional gesture input by the user, and the CPU can execute the event.
  • the recognition result of the functional gesture may indicate that the gesture input by the user does not belong to any kind of functional gesture, and the gesture recognition module 240 may not send the recognition result to the CPU. Or the gesture recognition module 240 may send the recognition result to the CPU, but the CPU may not make any response.
  • if the image acquisition module 220 of the electronic device 100 does not detect any gesture within a period of time (such as 1 second, 2 seconds, or 5 seconds), the image acquisition module 220 can reduce the frame rate from the third frame rate back to the first frame rate to save power consumption.
  • the embodiment of the application enables the user to operate the electronic device by inputting a functional gesture without additionally inputting an activation gesture; that is, the initial part of the functional gesture serves as the activation gesture, and the gesture is input in one go. This reduces the gesture operation delay, speeds up the response of the electronic device, and avoids the power consumption wasted by falsely triggered gestures.
  • the electronic device 200 may not include the bright screen detection module 210 described above. That is to say, the image acquisition module 220 can acquire an image stream when the electronic device 200 is in a state where the screen is off.
  • the electronic device 200 may determine that there is a gesture in the captured image when the screen is turned off, and then perform identity verification. After confirming that the user is the owner, the gesture recognition process is triggered.
  • the method of identity verification can be, but is not limited to, palmprint recognition, iris recognition, fingerprint recognition, voiceprint recognition, and so on.
  • the two image acquisition processes are introduced below.
  • the premise of the image acquisition process is that the electronic device is in a bright screen state to ensure the privacy of the owner.
  • the following two processes are described by taking the image acquisition module 220 as the front camera 193 as an example.
  • FIG. 5 exemplarily shows a schematic diagram of an image acquisition process.
  • the front camera 193 continuously collects multiple frames of images at the first frame rate (for example, 10 frames/sec) when the electronic device is in the on-screen state.
  • the front camera 193 can input the image collected at the first frame rate into the low power consumption action detection module 230.
  • if the low-power motion detection module 230 recognizes an initial gesture (that is, the initial part of a certain functional gesture) in the fifth frame of image, it switches the frame rate to the third frame rate (such as 30 frames/sec) and further determines whether the initial gesture of that functional gesture still exists in the next few frames of images.
  • if it does, the low-power motion detection module 230 wakes up the gesture recognition module 240 to recognize the image stream collected by the front camera 193, further determining whether a functional gesture exists in the image stream. If one exists, the electronic device 100 executes the event corresponding to the functional gesture.
  • the process of how the low-power motion detection module 230 determines that there is an initial gesture of a certain functional gesture is exemplarily as follows:
  • the first neural network model in the low-power motion detection module 230 can respectively identify whether there is an initial gesture of a certain functional gesture for each frame of continuous X frames of images.
  • the post-processing module may determine whether the gesture recognition module 240 needs to be awakened according to the recognition result of the above-mentioned consecutive X frames of images. If, starting from the current frame, more than x frames of images in the previous consecutive X frames (including the current frame) all contain the initial gesture, the post-processing module can wake up the gesture recognition module 240.
  • X and x are both positive integers, and x is less than or equal to X. For example, X can be but not limited to 7, and x can be but not limited to 5.
  • otherwise, the first neural network model continues to identify whether the initial gesture exists in the next frame of image, until at least x of X consecutive frames contain the initial gesture.
  • the post-processing module can wake up the gesture recognition module 240.
  • the above-mentioned current frame is the 13th frame.
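  • A minimal sketch of the x-of-X wake-up rule above, assuming the example values X=7 and x=5 and treating the per-frame outputs of the first neural network model as a stream of 0/1 results:

```python
from collections import deque

def should_wake(frame_results, X=7, x=5):
    """Wake the gesture recognition module once at least x of the last X
    per-frame results (1 = initial gesture present, 0 = absent) are 1."""
    window = deque(maxlen=X)
    for result in frame_results:
        window.append(result)
        if len(window) == X and sum(window) >= x:
            return True
    return False

# Example mirroring the text: the initial gesture first appears in frame 5,
# and the condition (5 of the last 7 frames) is first met at frame 13.
results = [0, 0, 0, 0, 1, 0, 1, 0, 1, 0, 1, 1, 1]
print(should_wake(results))  # -> True
```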
  • the process of how the gesture recognition module 240 determines that there is a functional gesture is illustratively as follows:
  • frames 14 to 23 can be taken as one sliding window (hereinafter referred to as the sliding window).
  • the length of the sliding window is 10, that is, the sliding window contains 10 consecutive frames of images, and the stride is 1, that is, the sliding window moves forward one frame at a time.
  • the gesture recognition module 240 can analyze the content of the 10 frames of images (frames 14 to 23) contained in each sliding window and output a recognition result. That is, the 10 frames of images contained in the sliding window are input into the second neural network model, which outputs the recognition result.
  • the recognition result may specifically be various gestures and corresponding probabilities of various gestures that may exist in the image stream contained in the sliding window. The sum of the probabilities corresponding to each gesture is 1.
  • the length of the above-mentioned sliding window may also be other values, which is not limited in the embodiment of the present application.
  • the sliding window containing frames 14 to 23 can be called sliding window 1, the sliding window containing frames 15 to 24 sliding window 2, and the sliding window containing frames 16 to 25 sliding window 3.
  • when the gesture recognition module 240 analyzes the image stream contained in sliding window 1, the output result can be: gesture 1, probability 50%; gesture 2, probability 30%; gesture 3, probability 20%.
  • when the gesture recognition module 240 analyzes the image stream contained in sliding window 2, the output result can be: gesture 1, probability 60%; gesture 2, probability 20%; gesture 3, probability 15%; gesture 4, probability 5%.
  • a weighted summation of the probabilities corresponding to each gesture in the results output for sliding window 1 and sliding window 2 yields the comprehensive probability of each gesture, that is, the comprehensive result of sliding window 2.
  • similarly, the comprehensive result of sliding window 3 may be obtained by a weighted summation of the output result of sliding window 3 and the comprehensive result of sliding window 2, and the comprehensive result of sliding window 4 by a weighted summation of the output result of sliding window 4 and the comprehensive result of sliding window 3. The iteration continues until the comprehensive result of the last sliding window is calculated. According to the comprehensive result of the last sliding window, the gesture with the highest probability is determined to be the functional gesture input by the user.
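  • A minimal sketch of the iterative weighted summation above; the actual weights are not specified here, so an exponential blend with a hypothetical factor `alpha` is assumed purely for illustration.

```python
def aggregate_windows(window_outputs, alpha=0.5):
    """Blend per-window gesture probabilities iteratively and return the
    gesture with the highest comprehensive probability.  Each element of
    `window_outputs` maps a gesture name to its probability."""
    combined = dict(window_outputs[0])
    for output in window_outputs[1:]:
        gestures = set(combined) | set(output)
        combined = {g: (1 - alpha) * combined.get(g, 0.0)
                       + alpha * output.get(g, 0.0)
                    for g in gestures}
    return max(combined, key=combined.get)

# Using the example outputs of sliding windows 1 and 2 from the text:
window1 = {"gesture 1": 0.50, "gesture 2": 0.30, "gesture 3": 0.20}
window2 = {"gesture 1": 0.60, "gesture 2": 0.20,
           "gesture 3": 0.15, "gesture 4": 0.05}
print(aggregate_windows([window1, window2]))  # -> gesture 1
```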
  • in the image acquisition process shown in FIG. 5, the frame rate is switched to the third frame rate as soon as the low-power motion detection module 230 recognizes the initial gesture, which ensures the speed of gesture recognition and the response speed of the electronic device.
  • the low-power motion detection module 230 wakes up the gesture recognition module 240 after determining that the gesture input by the user is the initial gesture of the functional gesture, which can prevent the gesture recognition module from being woken up by mistake and causing waste of power consumption.
  • Fig. 6 exemplarily shows a schematic diagram of another image acquisition process.
  • the difference between FIG. 6 and FIG. 5 is that the time point for switching the frame rate is different.
  • the front camera 193 may switch from the first frame rate (for example, 10 frames/sec) to the third frame rate (30 frames/sec) after waking up the gesture recognition module 240.
  • the other processes are the same as the image acquisition process shown in FIG. 5, and will not be repeated here.
  • the image acquisition process can also be performed when the screen of the electronic device 100 is off.
  • the electronic device can determine that there is a gesture in the captured image when the screen is turned off, and then perform identity verification. After confirming that the user is the owner, the gesture recognition process is triggered.
  • the method of identity verification can be, but is not limited to, palmprint recognition, iris recognition, fingerprint recognition, voiceprint recognition, and so on.
  • the electronic device 100 can respond to functional gestures input by the user in the air.
  • the user can directly make a functional gesture without having to input a wake-up gesture first, and then switch to a functional gesture, and the process of inputting a functional gesture is done in one go. That is, the initial gesture of each functional gesture can be used as a wake-up gesture, and the wake-up gesture of different functional gestures can be different.
  • this application omits the wake-up gesture, reduces the gesture operation time delay, improves the response speed of the electronic device, and improves the efficiency of gesture input.
  • the user may input a wrist-up (or wrist-down) functional gesture when the electronic device 100 displays a user interface, so that the electronic device 100 can update and display the content displayed in the user interface.
  • the following describes application scenario one involved in the embodiment of the present application and a user interface (UI) embodiment in the application scenario.
  • Application scenario 1: viewing a WeChat conversation list.
  • Fig. 7 exemplarily shows a user interface for displaying a list of WeChat conversations.
  • WeChat is a kind of instant messaging software.
  • the user interface 30 for displaying the WeChat conversation list may include: a status bar 301, a conversation list 302, a "WeChat" menu control 303, a "Contacts" menu control 304, a "Discover" menu control 305, and a "Me" menu control 306. Among them:
  • the status bar 301 may include: an operator indicator (for example, the operator's name "China Mobile"), one or more signal strength indicators of wireless fidelity (Wi-Fi) signals, one or more signal strength indicators of mobile communication signals (also called cellular signals), a time indicator, and a battery status indicator.
  • the conversation list 302 may be used to display the conversation boxes of the contacts contacted with the user in chronological order.
  • the electronic device 100 can detect a user operation (such as a slide-up operation) acting on the conversation list, and in response to the user operation, the electronic device 100 can display an earlier conversation in the conversation list.
  • the “WeChat” menu control 303 can be used to display the conversation list 302.
  • the electronic device 100 can detect a click operation acting on the “WeChat” menu control 303, and in response to the click operation, the electronic device 100 can display a user interface of the “WeChat” menu, which is the user interface 30.
  • in response to the click operation, the electronic device 100 can also display the "WeChat" menu control 303 specially, for example, by emphasizing or changing its display color.
  • the "address book” menu control 304 can be used to display the user's contact list and the like.
  • the electronic device 100 can detect a click operation acting on the "address book” menu control 304, and in response to the click operation, the electronic device 100 can display a user interface of the "address book” menu, and the user interface can include the user's contact list .
  • in response to the click operation, the electronic device 100 can also display the "address book" menu control 304 specially, for example, by emphasizing or changing its display color.
  • the "Discover” menu control 305 can be used to display a list of various function entries, such as “Moments of Friends", “Scan”, “Shake”, “Take a Look”, and “Search One”. Search” and so on.
  • the electronic device 100 can detect a click operation acting on the "discover” menu control 305, and in response to the click operation, the electronic device 100 can display a user interface of the "discover” menu, and the user interface can include a list of the aforementioned various function entries. In response to the click operation, the electronic device 100 may also make the "discovery" menu control 305 special display, for example, emphasize the display color or change the display color.
  • the "me” menu control 306 can be used to display options related to the user, such as "payment”, "favorite", “album” and so on.
  • the electronic device 100 can detect a click operation acting on the "me” menu control 306, and in response to the click operation, the electronic device 100 can display a user interface of the "me” menu, and the user interface can include options related to the user. In response to the click operation, the electronic device 100 can also make the "me” menu control 306 a special display, for example, to emphasize the display color or change the display color.
  • FIG. 8 exemplarily shows the response of the electronic device 100 after the user enters a function gesture (wrist up or wrist down) when the user interface 30 is displayed on the electronic device 100.
  • the electronic device 100 may display the indicator 307 in the user interface 30 after waking up the gesture recognition module 240.
  • the indicator 307 may be used to indicate that the electronic device 100 is currently detecting a gesture input by the user in the air.
  • the electronic device 100 may update the conversation list 302 displayed in the user interface 30 after detecting the functional gesture of turning the wrist up; the updated conversation list 302 in the user interface 30 is shown on the right in Figure 8. Comparing the parts in the dashed boxes (i.e., the conversation list 302), it can be seen that the updated conversation list 302 displays earlier conversations. That is, the user can input the functional gesture of turning the wrist up to view earlier conversations.
  • similarly, after detecting the functional gesture of turning the wrist down, the electronic device 100 may update the conversation list 302 displayed in the user interface 30; the updated conversation list 302 in the user interface 30 is shown on the left in Figure 8. Comparing the parts in the dashed boxes (i.e., the conversation list 302), it can be seen that the updated conversation list 302 displays later conversations. That is, the user can input the functional gesture of turning the wrist down to view later conversations.
  • the electronic device 100 may also display an earlier conversation in the user interface 30.
  • the electronic device 100 may also display a later session in the user interface 30.
  • the event corresponding to the functional gesture of turning the wrist up can be consistent with the touch operation of sliding up
  • the event corresponding to the functional gesture of turning the wrist down can be consistent with the touch operation of sliding down.
  • the wrist turning up or the wrist turning down may also correspond to other events, which are not limited in the embodiment of the present application.
  • the user may input a functional gesture of turning the wrist to the left (or to the right) in the air when the electronic device 100 displays a certain user interface, so that the electronic device 100 displays another user interface.
  • FIG. 9 exemplarily shows the response of the electronic device 100 after the user enters a function gesture (turning the wrist left or the wrist right) when the user interface 30 is displayed on the electronic device 100.
  • the electronic device 100 may display the user interface 40 shown in the right image in FIG. 9 after detecting the left-turning movement of the user's wrist.
  • the user interface 30 shown in the left figure is the user interface of the "WeChat” menu
  • the user interface 40 shown in the right figure is the user interface of the "Contacts” menu. Comparing the display modes of the "WeChat” menu control 303 and the "Contacts" menu control 304 in the left figure and the right figure can also see the change of the currently displayed user interface.
  • the electronic device 100 may display the user interface 30 shown in the left image in FIG. 9 after detecting the right-turning action of the user's wrist.
  • the electronic device 100 can also switch the user interface 30 to the user interface 40.
  • the electronic device 100 may also switch the user interface 40 to the user interface 30.
  • the event corresponding to the functional gesture of turning the wrist to the left may be consistent with the touch operation of sliding left
  • the event corresponding to the functional gesture of turning the wrist to the right may be consistent with the touch operation of sliding to the right.
  • the electronic device 100 may display the user interface of the "Discover" menu control 305 after detecting the left-turning movement of the user's wrist. If the currently displayed user interface is the user interface of the "Discover" menu control 305, the electronic device 100 may display the user interface of the "address book" menu control 304 after detecting the right-turning action of the user's wrist, that is, the user interface 40 shown in the right figure of FIG. 9.
  • the electronic device 100 may display the user interface of the "me” menu control 306 after detecting the left-turning movement of the user's wrist. If the currently displayed user interface is the user interface of the "me” menu control 306, the electronic device 100 may display the user interface of the "discover” menu control 305 after detecting the left-turning movement of the user's wrist.
  • a functional gesture of turning the wrist to the left or to the right may also be input in other application scenarios to make the electronic device switch the displayed user interface.
  • the embodiment of the present application does not limit the above-listed application scenarios.
  • turning the wrist left or right may also correspond to other events, which are not limited in the embodiments of the present application.
  • for the same functional gesture input in different user interfaces, the electronic device 100 can make different responses.
  • the user can input a functional gesture of turning up the wrist (or turning down the wrist) when the electronic device 100 displays a user interface, so that the electronic device 100 can update and display the content displayed in the user interface, as shown in FIG. 8 shown.
  • the user may input a functional gesture of turning up the wrist (or turning down the wrist) when the electronic device 100 displays a user interface, so that the electronic device 100 increases the playback volume of the speaker 170A.
  • Application scenario 2: playing music.
  • FIG. 10 exemplarily shows a user interface 50 for playing music.
  • the user interface 50 may include: a return control 501, a download control 502, a sharing control 503, a previous control 504, a pause control 505, a next control 506, and a progress bar 507. Among them:
  • the return control 501 can be used to return to the upper level user interface.
  • the electronic device 100 can detect a user operation on the return control 501 (for example, a click operation on the return control 501), and in response to the operation, the electronic device 100 can display the upper level user interface of the user interface 50 in the application.
  • the download control 502 can be used to download music data to the internal memory 121.
  • the electronic device 100 can detect a user operation on the download control 502 (such as a click operation on the download control 502), and in response to the operation, the electronic device 100 can download the data of the currently played music from the server and save it to the internal memory 121 .
  • the sharing control 503 can be used to share the currently playing music to other users.
  • the electronic device 100 can detect a user operation on the sharing control 503 (for example, a click operation on the sharing control 503), and in response to the operation, the electronic device 100 can share the currently playing music to other users.
  • the previous control 504 can be used to switch to the previous music.
  • the electronic device 100 can detect a user operation on the previous control 504 (such as a click operation on the previous control 504), and in response to the operation, the electronic device 100 can switch the currently playing music to the current music playlist The last music in.
  • the pause control 505 can be used to pause playing music.
  • the electronic device 100 can detect a user operation on the pause control 505 (for example, a click operation on the pause control 505), and in response to the operation, the electronic device 100 can pause the currently playing music.
  • the next control 506 can be used to switch to the next music.
  • the electronic device 100 can detect a user operation on the next control 506 (such as a click operation on the next control 506), and in response to the operation, the electronic device 100 can switch the currently playing music to the current music playlist The next music in.
  • the progress bar 507 can be used to display and adjust the current music playing progress.
  • the electronic device 100 can detect a user operation (such as a click operation on the progress bar 507) acting on the progress bar 507, and in response to the operation, the electronic device 100 can adjust the current music playback progress.
  • FIG. 11 exemplarily shows the response of the electronic device 100 after the user enters a function gesture (wrist up or wrist down) when the user interface 50 is displayed on the electronic device 100.
  • the electronic device 100 may display the indicator 508 in the user interface 50 after waking up the gesture recognition module 240.
  • the indicator 508 may be used to indicate that the electronic device 100 is currently detecting a gesture input by the user in the air.
  • if the user inputs a functional gesture of turning the wrist up in the air, the electronic device 100 may display a volume adjuster 509 in the user interface, and may also increase the playback volume of the speaker 170A according to the amplitude of the upward turn of the user's wrist. If the user inputs a functional gesture of turning the wrist down in the air, the electronic device 100 may display the volume adjuster 509 in the user interface, and may also reduce the playback volume of the speaker 170A according to the amplitude of the downward turn of the user's wrist.
  • the wrist turning up or the wrist turning down may also correspond to other events, which are not limited in the embodiment of the present application.
  • the user interface 50 can also support other functional gestures, such as turning the wrist left or right to switch to the previous or next song, or turning the wrist left or right to adjust the current music playback progress.
  • the embodiments of the present application do not limit this.
  • the following introduces the third application scenario involved in the embodiment of the present application and a user interface (UI) embodiment in the application scenario.
  • Application scenario 3: playing a video.
  • FIG. 12 exemplarily shows a user interface 60 for playing a video and a response of the electronic device 100 after a function gesture is inputted in the user interface 60.
  • the user interface 60 may include: an exit control 601, a content display area 602, a pause control 603, a previous episode control 604, a next episode control 605, and a progress bar 606. Among them:
  • the exit control 601 can be used to exit the current user interface.
  • the electronic device 100 can detect a user operation (such as a click operation on the exit control 601) acting on the exit control 601, and in response to the operation, the electronic device 100 can exit the current user interface.
  • the content display area 602 can be used to display video content.
  • the content display area 602 can also adjust the playback brightness and playback volume.
  • the electronic device 100 can detect a touch operation (such as an up or down operation) acting on the left side of the content display area 602, and in response to the operation, the electronic device 100 can adjust the current playback brightness (increase or decrease).
  • the electronic device 100 can detect a touch operation (such as an up or down operation) acting on the right side of the content display area 602, and in response to the operation, the electronic device 100 can adjust the current playback volume (increase or decrease).
  • the pause control 603 can be used to pause playing the video.
  • the electronic device 100 can detect a user operation acting on the pause control 603 (such as a click operation on the pause control 603), and in response to the operation, the electronic device 100 can pause playing the video.
  • the previous episode control 604 can be used to switch the playback content to the previous episode.
  • the electronic device 100 can detect a user operation that acts on the previous episode control 604 (such as a click operation on the previous episode control 604), and in response to this operation, the electronic device 100 can switch the playback content to the previous episode.
  • the next episode control 605 can be used to switch the playback content to the next episode.
  • the electronic device 100 can detect a user operation on the next episode control 605 (such as a click operation on the next episode control 605), and in response to the operation, the electronic device 100 can switch the playback content to the next episode.
  • the progress bar 606 is used to display the current video playback progress, and can be used to adjust the playback progress.
  • the electronic device 100 can detect a user operation (such as a sliding operation on the progress bar 606) acting on the progress bar 606, and in response to the operation, the electronic device 100 can adjust the current playback progress.
  • the electronic device 100 may display the indicator 607 in the user interface 60 after waking up the gesture recognition module 240.
  • the indicator 607 may be used to indicate that the electronic device 100 is currently detecting a gesture input by the user in the air.
  • if the user inputs a functional gesture of turning the wrist up in the air, the electronic device 100 can increase the video playback volume according to the amplitude of the upward turn of the user's wrist. If the user inputs a functional gesture of turning the wrist down in the air, the electronic device 100 may reduce the video playback volume according to the amplitude of the downward turn of the user's wrist.
  • if the user inputs a functional gesture of turning the wrist to the left, the electronic device 100 may adjust the video playback progress backward according to the amplitude of the left turn of the user's wrist. If the user inputs a functional gesture of turning the wrist to the right, the electronic device 100 may adjust the video playback progress forward according to the amplitude of the right turn of the user's wrist.
  • Figures 6 to 12 above exemplarily show several application scenarios to which the embodiments of the present application are applicable. The embodiments are not limited to the application scenarios listed above and may also be applicable to other application scenarios, which are not limited in the embodiments of the present application.
  • users can input a functional gesture of changing from an open palm to a fist to realize the screen capture function. That is, when the electronic device 100 detects that the gesture input by the user in the air changes from a palm to a fist, the electronic device 100 can save the currently displayed user interface to the internal memory 121. The screen capture function is not limited to this palm-to-fist gesture; in specific implementations, other functional gestures may enable the electronic device 100 to realize it, which is not limited in the embodiment of the present application.
  • the electronic device 100 may display a prompt on the user interface to remind the user of multiple functional gestures that can be supported in the user interface.
  • the prompt may include icons of the events corresponding to the multiple functional gestures supported by the user interface, and the display orientation of each icon may correspond to its functional gesture, thereby reducing the user's memory burden and improving the user's experience with the electronic device 100.
  • Figure 13 and Figure 14 both take Application Scenario 3 as an example, and provide two display modes of prompts respectively.
  • the electronic device 100 may display a prompt in the user interface 60 after waking up the gesture recognition module 240.
  • the prompt may be used to indicate that the electronic device 100 is currently detecting a gesture inputted by the user in an airspace, and an icon of an event corresponding to multiple functional gestures that can be supported in the user interface.
  • the prompt 608 may include an icon indicating that a gesture input in the air is currently being detected, and icons of the events corresponding to each functional gesture arranged around a circle. Combining the functional gestures supported by the user interface 60 listed in FIG. 12 and their corresponding events, it can be seen from FIG. 13 that the icon for increasing the video playback volume is displayed above the circle, representing that the functional gesture of turning the wrist up can increase the video playback volume; the icon for reducing the video playback volume is displayed below the circle, representing that the functional gesture of turning the wrist down can reduce the video playback volume; the progress-backward icon is displayed on the left of the circle, representing that the functional gesture of turning the wrist left can move the video playback progress backward; and the progress-forward icon is displayed on the right of the circle, representing that the functional gesture of turning the wrist right can move the video playback progress forward.
  • in FIG. 14, the prompt 609 may include an icon indicating that a gesture input in the air is currently being detected, and arrow icons pointing to the events corresponding to each functional gesture. Combining the functional gestures supported by the user interface 60 listed in FIG. 12 and their corresponding events, it can be seen from FIG. 14 that the upward arrow icon points to the icon for increasing the video playback volume, indicating that the functional gesture of turning the wrist up can increase the video playback volume; the downward arrow icon points to the icon for reducing the video playback volume, indicating that the functional gesture of turning the wrist down can reduce the video playback volume; the left arrow icon points to the progress-backward icon, indicating that the functional gesture of turning the wrist left can move the video playback progress backward; and the right arrow icon points to the progress-forward icon, indicating that the functional gesture of turning the wrist right can move the video playback progress forward.
  • the user's memory burden can be reduced, and the interaction efficiency between the user and the electronic device 100 can be improved.
  • FIG. 15 exemplarily shows a schematic flowchart of a gesture input method provided by an embodiment of the present application.
  • the gesture input method can include at least the following steps:
  • the electronic device acquires a first gesture, where the first gesture includes a first sub-gesture and a second sub-gesture.
  • the first gesture may be a wrist turning up, a wrist turning down, a wrist turning left, or a wrist turning right as shown in FIGS. 8-12.
  • the first gesture may also be a gesture from a palm to a fist.
  • the first gesture may also be other gestures, which are not limited in the embodiment of the present application.
  • the first sub-gesture may be the starting part of the first gesture.
  • the second sub-gesture may be other parts of the first gesture except the initial gesture.
  • the first gesture may include multiple first sub-gestures.
  • the first sub-gesture may take a frame as a unit; specifically, it may be a gesture included in the multi-frame images input to the low-power motion detection module 230 mentioned in the embodiment of FIG. 4.
  • the second sub-gesture may take an image stream as a unit; specifically, it is a gesture included in the image stream input to the gesture recognition module 240 mentioned in the embodiment of FIG. 4.
  • the preset gesture may be the initial gesture of each gesture supported by the above-mentioned electronic device.
  • the first neural network model may be trained with a large number of preset gestures and non-preset gestures.
  • the recognition result output by the first neural network model for a preset gesture is 1, and the recognition result output for a non-preset gesture is 0.
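  • A minimal training sketch for such a binary first model (label 1 = preset initial gesture, 0 = non-preset gesture); the architecture, input size, and optimizer below are assumptions for illustration, since they are not specified here.

```python
import torch
import torch.nn as nn

# Hypothetical binary classifier over 64x64 grayscale frames.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(64 * 64, 128),
    nn.ReLU(),
    nn.Linear(128, 1),  # one logit: initial gesture or not
)
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(images, labels):
    """images: (N, 1, 64, 64) frames; labels: (N, 1) in {0, 1}."""
    logits = model(images)
    loss = loss_fn(logits, labels.float())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```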
  • the first gesture may be turning the wrist up or down as shown in the embodiment of FIG. 8, and the function corresponding to the second sub-gesture may be updating the content displayed in the user interface 30.
  • the first gesture may be turning the wrist left (or right) as shown in the embodiment of FIG. 9, and the function corresponding to the second sub-gesture may be switching the user interface 30 to the user interface 40 (or switching the user interface 40 to the user interface 30).
  • the first gesture may be turning the wrist up (or down) as shown in the embodiment of FIG. 11, and the function corresponding to the second sub-gesture may be increasing (or decreasing) the playback volume of the speaker 170A.
  • the first gesture may be turning the wrist up (or down) as shown in the embodiment of FIG. 12, and the function corresponding to the second sub-gesture may be increasing (or decreasing) the video playback volume.
  • the first gesture may be turning the wrist left (or right) as shown in the embodiment of FIG. 12, and the function corresponding to the second sub-gesture may be adjusting the video playback progress backward (or forward).
  • the first gesture may also be other gestures, and the second sub-gesture may also correspond to other functions, which are not limited in the embodiments of the present application.
  • the functional gestures in the embodiments of the present application may include a wake-up gesture that wakes up the gesture recognition function of the electronic device. That is, the initial gesture of a functional gesture can serve as the wake-up gesture, and the wake-up gestures of different functional gestures can differ.
  • the embodiment of the present application can omit the process of inputting a separate wake-up gesture, reduce the time delay of the gesture operation, improve the response speed of the electronic device, and improve the interaction efficiency between the user and the electronic device.
  • the electronic device may acquire the first sub-gesture at the first frame rate through the image acquisition module 220. After determining for the first time that the first sub-gesture matches the preset gesture, it switches to the second frame rate and continues to acquire the first sub-gesture at the second frame rate. After the low-power motion detection module 230 determines that the continuously acquired first sub-gestures match the preset gesture, the second sub-gesture is acquired at the third frame rate, and the gesture recognition module 240 determines the function corresponding to the second sub-gesture.
  • the third frame rate is greater than the first frame rate.
  • the third frame rate may be 30 frames/sec, for example, and the first frame rate may be 10 frames/sec, for example.
  • the second frame rate may be the third frame rate.
  • for example, the electronic device may determine for the first time that the first sub-gesture matches the preset gesture in the 5th frame of image (that is, the low-power motion detection module 230 recognizes the initial gesture), and, starting from the 6th frame, switch the frame rate from the first frame rate to the third frame rate.
  • the first sub-gesture continues to be acquired at the third frame rate until the match is confirmed in the 13th frame, and the second sub-gesture is acquired at the third frame rate from the 14th frame.
  • the embodiment of the application can switch the frame rate of the captured images immediately after detecting for the first time that the first sub-gesture matches the preset gesture, and then collect the subsequent first sub-gestures and the second sub-gesture at the third frame rate.
  • the probability of false triggering can be reduced, the response speed of the electronic device can be improved to the greatest extent, and the interaction efficiency between the user and the electronic device can be improved.
  • the second frame rate may be the first frame rate.
  • for example, the electronic device may determine for the first time that the first sub-gesture matches the preset gesture in the 5th frame of image (that is, the low-power motion detection module 230 recognizes the initial gesture), and continue to acquire the first sub-gesture at the first frame rate until it is determined in the 13th frame that the first sub-gesture is the initial gesture of a functional gesture.
  • then, starting from the 14th frame, the second sub-gesture is acquired at the third frame rate.
  • for example, the electronic device can recognize the first sub-gesture in each of the 7 consecutive frames from the 7th frame to the 13th frame; when 5 of the 7 first sub-gestures in these 7 frames match the preset gesture, it can be determined that the first sub-gesture is the initial gesture of a functional gesture, and the gesture recognition module 240 is awakened to acquire the second sub-gesture at the third frame rate.
  • this ensures that the user's intention is clear when the gesture recognition module 240 is awakened, rather than the module being awakened by an inadvertent gesture of the user, reducing the power consumption caused by false triggering of gestures.
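  • The two strategies above differ only in which rate is used between the first match of the initial gesture (frame 5 in the example) and the x-of-X confirmation (frame 13); a compact sketch follows, assuming the example frame-rate values of this embodiment.

```python
def next_frame_rate(initial_match_seen, wake_confirmed,
                    switch_early=True, first=10, third=30):
    """Return the capture frame rate for the next frame.

    `switch_early=True` models the second frame rate being equal to the
    third frame rate (switch right after the first match); False models
    it being equal to the first frame rate (switch only after the
    gesture recognition module is awakened)."""
    if wake_confirmed:
        return third
    if initial_match_seen:
        return third if switch_early else first
    return first
```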
  • in one implementation, before the electronic device acquires the first gesture, the method further includes: the electronic device displays a first user interface.
  • the first user interface may be the user interface 30 shown in FIG. 7.
  • the first user interface may also be the user interface 50 shown in FIG. 10.
  • the first user interface may also be the user interface 60 shown in FIG. 12.
  • the first user interface may also be any user interface that can be displayed by other electronic devices, which is not limited in the embodiment of the present application.
  • the electronic device may display an indicator in the first user interface and recognize the function corresponding to the second sub-gesture.
  • the indicator can be used to indicate that the electronic device is recognizing a gesture.
  • the electronic device 100 may display the indicator in the first user interface after waking up the gesture recognition module 240.
  • the indicator may be the indicator 307 shown in FIG. 8, the indicator 508 shown in FIG. 11, and the indicator 607 shown in FIG. 12.
  • an indicator may be displayed on the display interface of the electronic device to prompt the user that the gesture is currently being recognized, which may improve the interaction between the user and the electronic device.
  • the electronic device may display a prompt in the first user interface and recognize the function corresponding to the second sub-gesture.
  • the prompt may be used to indicate the respective functions corresponding to the various gestures supported by the electronic device.
  • the electronic device 100 may display a prompt in the first user interface after waking up the gesture recognition module 240.
  • the prompt may be the prompt 608 shown in FIG. 13 or the prompt 609 shown in FIG. 14.
  • the electronic device can support four functional gestures when displaying the current user interface.
  • the functions corresponding to the functional gestures differ. It can be seen from Figure 13 that the icon for increasing the video playback volume is displayed above the circle, representing that the functional gesture of turning the wrist up can increase the video playback volume; the icon for reducing the video playback volume is displayed below the circle, representing that the functional gesture of turning the wrist down can reduce the video playback volume; the progress-backward icon is displayed on the left of the circle, representing that the functional gesture of turning the wrist left can move the video playback progress backward; and the progress-forward icon is displayed on the right of the circle, representing that the functional gesture of turning the wrist right can move the video playback progress forward.
  • Figure 14 exemplarily shows another way of displaying prompts.
  • the prompt 609 may include an icon indicating that a gesture input in the air is currently being detected, and arrow icons pointing to the events corresponding to each functional gesture. Combining the functional gestures supported by the user interface 60 listed in FIG. 12 and their corresponding events, it can be seen from FIG. 14 that the upward arrow icon points to the icon for increasing the video playback volume, indicating that the functional gesture of turning the wrist up can increase the video playback volume; the downward arrow icon points to the icon for reducing the video playback volume, indicating that the functional gesture of turning the wrist down can reduce the video playback volume; the left arrow icon points to the progress-backward icon, indicating that the functional gesture of turning the wrist left can move the video playback progress backward; and the right arrow icon points to the progress-forward icon, indicating that the functional gesture of turning the wrist right can move the video playback progress forward.
  • the electronic device can associate different functional gestures with their corresponding functions through the display orientation of the icon of the function corresponding to each functional gesture, which can reduce the user's memory burden and improve the interaction efficiency between the user and the electronic device.
  • in one implementation, the electronic device displays the first user interface, and the function corresponding to the second sub-gesture is a first function. After the function corresponding to the second sub-gesture is executed, the method further includes: the electronic device displays a second user interface, the second user interface being different from the first user interface; the electronic device acquires the first gesture; when the first sub-gesture matches the preset gesture, the function corresponding to the second sub-gesture is recognized as a second function, the second function being different from the first function; and the second function is executed.
  • that is, when the electronic device displays different user interfaces, the user can input the same gesture to achieve different functions.
  • the gesture response function corresponding to the user interface can be set according to the attributes of the content currently displayed on the user interface, which can fully improve the utilization rate of gesture input and enhance the interest of the interaction between the user and the electronic device.
  • for example, the first user interface may be the user interface 30 shown in FIG. 8, the first gesture may be turning the wrist up, and the first function may be updating the content displayed in the user interface 30.
  • the second user interface may be, for example, the user interface 50 shown in FIG. 11, and the second function may be increasing the playback volume of the speaker 170A.
  • the embodiments are not limited to the first user interface, first function, second user interface, and second function listed above; in specific implementations, there may be other first user interfaces, first functions, second user interfaces, and second functions.
  • the embodiment of the application does not limit this.
  • the embodiments are not limited to recognizing the functional gestures described above to make the electronic device perform corresponding functions.
  • for example, the type of sport can be recognized by recognizing the movements of the user's whole body, the user's facial expressions can be recognized to realize a snapshot-capture function, or shooting special effects can be added or replaced by recognizing the user's gestures.
  • the embodiment of the application does not limit this.
  • the embodiment of the present application also provides a computer-readable storage medium that stores instructions; when the instructions run on a computer or a processor, the computer or the processor executes one or more steps of any of the above methods. If the component modules of the above-mentioned signal processing device are implemented in the form of software functional units and sold or used as independent products, they can be stored in the computer-readable storage medium.
  • the above-mentioned embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • when implemented in software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted through the computer-readable storage medium.
  • the computer instructions can be transmitted from one website, computer, server, or data center to another website, computer, server, or data center via wired (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (such as infrared, radio, or microwave) means.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or a data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).
  • the process can be completed by a computer program instructing relevant hardware.
  • the program can be stored in a computer-readable storage medium; when executed, it may include the processes of the foregoing method embodiments.
  • the aforementioned storage media include: ROM, random access memory (RAM), magnetic disks, optical disks, and other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to a gesture input method and an electronic device. The method comprises: an electronic device acquiring a first gesture, the first gesture comprising a first sub-gesture and a second sub-gesture; when the first sub-gesture matches a preset gesture, recognizing a function corresponding to the second sub-gesture; and executing the function corresponding to the second sub-gesture. An input gesture according to the present invention can include a wake-up gesture, that is, an initial gesture of the input gesture can serve as the wake-up gesture, thereby reducing the gesture operation delay, increasing the response efficiency of the electronic device, and increasing the interaction efficiency between a user and the electronic device.
PCT/CN2020/112039 2019-09-18 2020-08-28 Procédé d'entrée de geste et dispositif électronique WO2021052139A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910883139.8A CN112527093A (zh) 2019-09-18 2019-09-18 手势输入方法及电子设备
CN201910883139.8 2019-09-18

Publications (1)

Publication Number Publication Date
WO2021052139A1 true WO2021052139A1 (fr) 2021-03-25

Family

ID=74883340

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/112039 WO2021052139A1 (fr) 2019-09-18 2020-08-28 Procédé d'entrée de geste et dispositif électronique

Country Status (2)

Country Link
CN (1) CN112527093A (fr)
WO (1) WO2021052139A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114415830A (zh) * 2021-12-31 2022-04-29 科大讯飞股份有限公司 隔空输入方法及设备、计算机可读存储介质
CN115484391A (zh) * 2021-06-16 2022-12-16 荣耀终端有限公司 一种拍摄方法及电子设备
CN117149046A (zh) * 2023-10-25 2023-12-01 荣耀终端有限公司 交互手势强度阈值调整方法及电子设备

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113031776A (zh) * 2021-03-25 2021-06-25 恒大新能源汽车投资控股集团有限公司 一种手势处理方法、装置及设备
CN117111727A (zh) * 2023-02-22 2023-11-24 荣耀终端有限公司 手方向的检测方法、电子设备及可读介质

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011204019A * 2010-03-25 2011-10-13 Sony Corp Gesture input device, gesture input method, and program
US9785217B2 (en) * 2012-09-28 2017-10-10 Synaptics Incorporated System and method for low power input object detection and interaction
CN104267819B * 2014-10-09 2017-07-14 Suzhou Chuda Information Technology Co., Ltd. Electronic device capable of being woken up by gesture, and gesture wake-up method for an electronic device
CN105302301B * 2015-10-15 2018-02-13 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Wake-up method and apparatus for a mobile terminal, and mobile terminal
CN107479816B * 2017-07-28 2019-09-24 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Black-screen gesture recognition method and apparatus, storage medium, and mobile terminal
CN107479700B * 2017-07-28 2020-05-12 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Black-screen gesture control method and apparatus, storage medium, and mobile terminal

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101702106A * 2009-11-04 2010-05-05 Shenzhen Goodix Technology Co., Ltd. Wake-up method and system for a touch screen terminal
CN103380405A * 2010-12-30 2013-10-30 Thomson Licensing User interface, apparatus and method for gesture recognition
WO2015053451A1 * 2013-10-10 2015-04-16 Lg Electronics Inc. Mobile terminal and operating method thereof
CN105183144A * 2015-04-29 2015-12-23 BYD Company Limited Control method and apparatus for a mobile terminal
US10007777B1 * 2015-09-14 2018-06-26 Google Llc Single input unlock for computing devices
CN109085885A * 2018-08-14 2018-12-25 Li Xingwei Smart ring
CN111158467A * 2019-12-12 2020-05-15 Qingdao Pico Technology Co., Ltd. Gesture interaction method and terminal

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115484391A (zh) * 2021-06-16 2022-12-16 Honor Device Co., Ltd. Photographing method and electronic device
CN115484391B (zh) * 2021-06-16 2023-12-12 Honor Device Co., Ltd. Photographing method and electronic device
CN114415830A (zh) * 2021-12-31 2022-04-29 iFlytek Co., Ltd. Air input method and device, and computer-readable storage medium
CN117149046A (zh) * 2023-10-25 2023-12-01 Honor Device Co., Ltd. Interactive gesture intensity threshold adjustment method and electronic device
CN117149046B (zh) * 2023-10-25 2024-03-15 Honor Device Co., Ltd. Interactive gesture intensity threshold adjustment method and electronic device

Also Published As

Publication number Publication date
CN112527093A (zh) 2021-03-19

Similar Documents

Publication Publication Date Title
EP3872807B1 Voice control method and electronic device
WO2021052263A1 Voice assistant display method and device
WO2020211701A1 Model training method, emotion recognition method, and related apparatus and device
WO2020259452A1 Full-screen display method for a mobile terminal, and apparatus
WO2021017889A1 Video call display method applied to an electronic device, and related apparatus
WO2020182065A1 Shortcut function activation method and electronic device
CN113645351B Application interface interaction method, electronic device, and computer-readable storage medium
WO2020134869A1 Method for operating an electronic device, and electronic device
WO2021000807A1 Processing method and apparatus for a waiting scenario in an application
WO2021052139A1 Gesture input method and electronic device
WO2021036770A1 Split-screen processing method and terminal device
WO2021104485A1 Photographing method and electronic device
WO2019072178A1 Notification processing method, and electronic device
WO2022127787A1 Image display method and electronic device
CN112492193B Callback stream processing method and device
WO2020073288A1 Method for triggering an electronic device to execute a function, and related electronic device
WO2021052070A1 Frame rate identification method and electronic device
WO2021218429A1 Application window management method, terminal device, and computer-readable storage medium
WO2022042766A1 Information display method, terminal device, and computer-readable storage medium
WO2020029094A1 Voice control instruction generation method and terminal
WO2021082815A1 Display element display method and electronic device
WO2022166435A1 Image sharing method and electronic device
WO2021052388A1 Video communication method and video communication apparatus
WO2023029916A1 Annotation display method and apparatus, terminal device, and readable storage medium
WO2021129453A1 Screenshot method and related device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20864774

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20864774

Country of ref document: EP

Kind code of ref document: A1