WO2019105376A1 - Gesture recognition method, terminal and storage medium - Google Patents

Gesture recognition method, terminal and storage medium

Info

Publication number
WO2019105376A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture recognition
terminal
gesture
ultrasonic wave
application
Prior art date
Application number
PCT/CN2018/117864
Other languages
English (en)
Chinese (zh)
Inventor
刘永霞
史波
易科
李瑞
刘杉
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 filed Critical 腾讯科技(深圳)有限公司
Publication of WO2019105376A1

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G: PHYSICS
    • G08: SIGNALLING
    • G08C: TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C 23/00: Non-electrical signal transmission systems, e.g. optical systems
    • G08C 23/02: Non-electrical signal transmission systems, e.g. optical systems, using infrasonic, sonic or ultrasonic waves

Definitions

  • the present application relates to the field of computer technologies, and in particular, to a gesture recognition method, a terminal, and a storage medium.
  • the input mode of the user on the mobile phone browser is mainly contact input, that is, the user needs to manually input on the touch screen or physical button of the mobile phone terminal.
  • Users can also use non-contact input methods, such as gesture recognition, which is more convenient for users.
  • A gesture refers to the various actions made by the human hand under conscious control, such as finger bending, finger stretching, and hand movement in space; it may be performed to carry out a task or to communicate a meaning or intent to others.
  • Among three-dimensional interactive input technologies based on gesture recognition, data-glove-based and vision-based (e.g., camera-based) gesture recognition are commonly used.
  • An example of the present application provides a gesture recognition method, where the method is applied to a terminal, where a speaker and a microphone are disposed, and an application is installed on the terminal, and the method includes:
  • Corresponding gesture instructions are executed in the application according to the gesture recognition result.
  • the application example further provides a terminal, where the terminal is configured with a speaker and a microphone, and the terminal is installed with an application program, and the terminal includes:
  • a mode determining module configured to determine, according to an input control indication of the application, whether the terminal turns on a gesture recognition mode
  • An ultrasonic transmitting module configured to: when the terminal determines to enable the gesture recognition mode, play the first ultrasonic wave through the speaker;
  • An ultrasonic acquisition module configured to receive a second ultrasonic wave by using the microphone, where the second ultrasonic wave is waveform data obtained after collecting the first ultrasonic wave;
  • a gesture recognition module configured to perform gesture recognition according to the received second ultrasonic wave, to obtain a gesture recognition result
  • an instruction execution module configured to execute a corresponding gesture instruction in the application according to the gesture recognition result.
  • the application example further provides a terminal, where the terminal includes: a speaker, a microphone, a processor, and a memory;
  • the processor and the speaker, the microphone, and the memory communicate with each other;
  • the memory is for storing instructions
  • the speaker for playing a first ultrasonic wave under the control of the processor
  • the microphone for receiving a second ultrasonic wave under the control of the processor
  • the processor is operative to execute the instructions in the memory, performing the method of any of the preceding aspects.
  • the present application examples provide a computer readable storage medium having instructions stored therein that, when executed on a computer, cause the computer to perform the methods described in the various aspects above.
  • FIG. 1a is a schematic structural diagram of a system involved in an example of the present application.
  • FIG. 1b is a schematic block diagram of a gesture recognition method provided by an example of the present application.
  • FIG. 1c is a schematic structural diagram of a system involved in an example of the present application.
  • FIG. 3 is a schematic flow chart of ultrasonic receiving provided by an example of the present application.
  • FIG. 4 is a schematic flowchart of a one-dimensional gesture recognition process provided by an example of the present application.
  • FIG. 5 is a schematic flowchart of a two-dimensional gesture recognition process provided by an example of the present application.
  • FIG. 6 is a schematic flowchart of gesture recognition action management provided by an example of the present application.
  • FIG. 7-a is a schematic diagram of the action of gesture recognition provided by the example of the present application.
  • FIG. 7-b is a schematic diagram of the end of the action of gesture recognition provided by the example of the present application.
  • FIG. 8 is a schematic diagram of a basic operation of gesture recognition provided by an example of the present application.
  • FIG. 9-a is a schematic structural diagram of a terminal provided by an example of the present application.
  • FIG. 9-b is a schematic structural diagram of a mode determining module provided by an example of the present application.
  • FIG. 9-c is a schematic structural diagram of another terminal provided by an example of the present application.
  • FIG. 9-d is a schematic structural diagram of another terminal provided by an example of the present application.
  • FIG. 10 is a schematic structural diagram of a gesture recognition method provided by an example of the present application applied to a terminal;
  • FIG. 11 is a schematic structural diagram of another terminal according to an example of the present application.
  • The application example provides a gesture recognition method and a terminal that reduce the complexity of gesture recognition for applications on the terminal and lower the computing performance required of the terminal.
  • the terminal can recognize the gesture of the user based on the image of the gesture made by the user.
  • However, image recognition algorithms are complex, computationally intensive, power-hungry, and susceptible to lighting conditions, and they place high performance requirements on the terminal device. Therefore, this method cannot achieve accurate recognition on the terminal.
  • the present application provides a gesture recognition method.
  • the method may be specifically applied to a gesture recognition scene of a terminal to a user.
  • the gesture recognition method provided by the example of the present application is applied to the terminal 101.
  • the terminal 101 is configured with a speaker 102 and a microphone 103.
  • the terminal 101 also has an application 104 installed thereon.
  • The applications installed on the terminal include a browser APP, office software APP, game APP, and the like, which is not limited here.
  • FIG. 1b is a schematic flow chart of a gesture recognition method according to an illustrative example of the present application.
  • the gesture recognition method includes the following steps:
  • S101: Determine, according to an input control indication of the application, whether the terminal turns on the gesture recognition mode.
  • an application is installed on the terminal, and the user can use the application, for example, the user can use the browser APP to query the webpage.
  • The input control indication may be set in the application's settings menu, or it may be set in the application's configuration file.
  • Alternatively, the input control indication may be set in the installation configuration file and configured by the user when the application is installed on the terminal.
  • A page for enabling the gesture recognition mode can be provided. If the user chooses to enable the gesture recognition mode, it can be determined that the terminal has turned the mode on; if the user does not enable it, it is determined that the terminal has turned the gesture recognition mode off.
  • If the terminal determines to enable the gesture recognition mode, the subsequent steps are triggered; otherwise, the gesture recognition method in the example of the present application ends.
  • the input control indication of the application can control whether the terminal enables the gesture recognition mode.
  • step S101 determines whether the terminal turns on the gesture recognition mode according to the input control indication of the application, including:
  • the terminal turns on the gesture recognition mode by default, it is determined that the gesture recognition mode is enabled in the terminal.
  • The input control indication of the application can be configured to open the gesture recognition mode by default, so that the mode is turned on automatically after the application runs on the terminal, sparing the user from having to set it manually.
  • step S101 determines whether the terminal turns on the gesture recognition mode according to the input control indication of the application, including:
  • the opening and closing of the gesture recognition mode in the example of the present application may also be determined by some preset gesture actions.
  • Some two-dimensional gestures may be preset, and these preset two-dimensional gestures let the user control whether the terminal turns on the gesture recognition mode, meeting the user's need for real-time control of the terminal. For example, when it is determined from the application's configuration file that the input control indication opens the gesture recognition mode upon a first two-dimensional gesture, the terminal may transmit ultrasonic waves through the speaker and collect them through the microphone to identify whether the user has made one of the preset gesture actions (for example, the first two-dimensional gesture); the mode can be turned on only when that preset gesture action is detected.
  • Likewise, the terminal can transmit ultrasonic waves through the speaker and collect them through the microphone to identify whether the user has made one of the preset gesture actions (for example, a second two-dimensional gesture); the gesture recognition mode can be turned off only when that preset gesture action is detected, as in the sketch below.
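  • As a minimal illustrative sketch only (not the claimed implementation), the on/off control described above can be pictured as a small state machine; the gesture labels "2d_circle" and "2d_square" are assumptions chosen for this sketch:

```python
class GestureModeController:
    """Sketch of mode control via preset two-dimensional gestures:
    one preset gesture opens the gesture recognition mode, another
    closes it. Gesture labels are illustrative assumptions."""

    def __init__(self, open_gesture="2d_circle", close_gesture="2d_square"):
        self.open_gesture = open_gesture    # first preset 2D gesture
        self.close_gesture = close_gesture  # second preset 2D gesture
        self.enabled = False

    def on_gesture_detected(self, gesture_name):
        # The mode only toggles when the matching preset gesture is seen.
        if not self.enabled and gesture_name == self.open_gesture:
            self.enabled = True
        elif self.enabled and gesture_name == self.close_gesture:
            self.enabled = False
        return self.enabled
```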
  • the gesture recognition method provided by the example of the present application may further include the following steps:
  • the waveform data of the specified gesture action is recorded through the microphone
  • the gesture recognition result corresponding to the specified gesture action is determined according to the gesture waveform template library.
  • A gesture waveform template library is established in the example of the present application. For example, to address the limited computing power of the terminal, a cloud-server-based gesture waveform template library can also be established. That is, in some examples, the gesture recognition method provided by the examples of the present application can also be applied to the system architecture shown in FIG. 1c.
  • The system architecture includes a terminal 101 and a cloud server 105 that interact via one or more networks 106 (e.g., the Internet).
  • The user records the waveform data of specified gesture actions through the terminal 101; based on this interaction, the cloud server 105 can establish a gesture waveform template library and adopt machine-learning methods for gesture recognition, improving the accuracy of gesture matching.
  • cloud server 105 can include two components: one is offline gesture waveform template library training and the other is online gesture recognition.
  • The terminal 101 may extract ultrasonic data from the received second ultrasonic wave and send the ultrasonic data to the cloud server 105; the cloud server 105 may obtain the gesture recognition result through the gesture waveform template library by online cloud recognition and then send the gesture recognition result back to the terminal 101, so that different terminal types can obtain gesture recognition results applicable to each type.
  • The type information and configuration information of the terminal can be obtained from the terminal's configuration file, for example the system type of the terminal, the operating-system version, the number of processors used by the terminal, and the size of the memory space; the gesture waveform template library set for that terminal can then be selected according to this type and configuration information, improving the accuracy of gesture matching.
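  • As an illustrative sketch of how a gesture waveform template library might be built and queried (a deliberately simple stand-in for the machine-learning matching the text describes; equal-length feature vectors and cosine similarity are our assumptions):

```python
import numpy as np

def build_template_library(recordings):
    # recordings: dict mapping gesture label -> list of equal-length
    # 1-D feature vectors extracted from recorded waveform data.
    library = {}
    for label, examples in recordings.items():
        normed = [e / (np.linalg.norm(e) + 1e-12) for e in examples]
        library[label] = np.mean(normed, axis=0)  # one template per gesture
    return library

def match_gesture(features, library):
    # Return the label whose template is most similar (cosine) to the
    # observed feature vector.
    f = features / (np.linalg.norm(features) + 1e-12)
    return max(library, key=lambda label: float(f @ library[label]))
```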
  • The terminal has a built-in speaker; the terminal can first generate an ultrasonic signal and then play it through the speaker. The ultrasonic wave played by the speaker is defined as the "first ultrasonic wave". After the first ultrasonic wave is emitted from the position of the terminal, it hits the target obstacle and is reflected back toward the position of the terminal.
  • the target obstacle is mainly a user's hand, such as a single or multiple fingers of the user, or one or two palms of the user.
  • the terminal can use the built-in ultrasonic generator to generate ultrasonic waves, and then transmit the ultrasonic waves through the built-in speakers of the terminal.
  • the ultrasonic generator generates ultrasonic waves by mechanical vibration, and generally propagates in an elastic medium in a longitudinal wave manner, which is a form of energy propagation.
  • the ultrasonic wave has a short wavelength and good directivity.
  • The ultrasonic generator produces ultrasonic waves with a frequency higher than 20,000 Hz; such waves have good directionality and strong penetrating power, and it is easy to obtain concentrated sound energy.
  • In some examples, after step S102 of playing the first ultrasonic wave through the speaker, the gesture recognition method provided by the example of the present application further includes: performing abnormality detection on the played data; when no abnormality is detected, triggering the following step S103: receiving the second ultrasonic wave through the microphone.
  • abnormality detection may be performed on the playing data, for example, the signal waveform of the first ultrasonic wave is detected from multiple dimensions such as time domain and frequency domain.
  • The terminal can detect whether the played data has dropped frames, for example, whether the points sampled in one second are the same as the points played by the speaker in that second.
  • The terminal can check in the frequency domain whether the energy of the signal is normal; for example, if the played ultrasonic signal is 20 kHz, it is determined whether the received ultrasonic energy is concentrated at 20 kHz.
  • When the terminal plays an ultrasonic wave, it can also detect whether there is frame loss by playing fast-paced music.
  • In this way, the success rate of correctly receiving the ultrasonic wave can be improved.
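  • A minimal sketch of the playback checks described above, assuming a 20 kHz tone and a 48 kHz sampling rate (both taken from the examples in this text; the thresholds are our assumptions):

```python
import numpy as np

def playback_is_abnormal(samples, sample_rate=48_000, tone_hz=20_000,
                         seconds=1.0, tolerance=0.01):
    # Frame-loss check: one second of data should contain exactly
    # sample_rate points.
    expected = int(sample_rate * seconds)
    if abs(len(samples) - expected) > expected * tolerance:
        return True  # dropped frames

    # Frequency-domain check: the spectral peak should sit at the
    # frequency of the played tone.
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    peak_hz = freqs[np.argmax(spectrum)]
    return abs(peak_hz - tone_hz) > 100  # energy not at the played frequency
```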
  • S103: Receive a second ultrasonic wave through a microphone, the second ultrasonic wave being waveform data obtained by collecting the first ultrasonic wave.
  • the sound wave signal can be received through the microphone built in the terminal, and the sound wave signal received by the microphone is defined as the “second ultrasonic wave”.
  • the microphone built in the terminal may be one or more, which is not limited herein.
  • The second ultrasonic wave normally received from the surroundings of the terminal is a mixed signal containing the ultrasonic waves reflected by the target obstacle, the ultrasonic waves arriving directly from the speaker, and ultrasonic noise emitted near the terminal.
  • In some examples, after step S103, the gesture recognition method provided by the example of the present application further includes: performing abnormality detection on the recorded data; when no abnormality is detected, triggering the following step: performing gesture recognition according to the received second ultrasonic wave.
  • After the terminal acquires the second ultrasonic wave, it can also determine whether the recorded signal waveform is abnormal, for example whether the received waveform is complete, that is, whether frames were lost; the detection can be performed in multiple dimensions such as the frequency domain and the time domain, since an abnormal waveform affects the subsequent gesture recognition result. There are various detection methods: for example, determining whether the received signal waveform energy equals the played signal waveform energy, whether the number of points recorded per second equals the sampling frequency, or examining the energy of the signal's time-frequency diagram.
  • the gesture recognition is performed only when there is no abnormality in the received second ultrasonic wave, thereby improving the accuracy of the gesture recognition.
  • step S103 receives the second ultrasonic wave through the microphone, including:
  • the recorded sound wave signal is stored in the recording buffer of the terminal, wherein the sound wave signal stored in the recording buffer is the second ultrasonic wave.
  • the terminal preset recording parameters can be various, such as mono or multi-channel.
  • the terminal can select mono, multi-channel, etc. according to the needs.
  • The recording parameters may further include a sampling frequency that is matched to the transmission frequency of the ultrasonic wave (per the Nyquist discussion below, the sampling frequency is twice the playback frequency).
  • the terminal can store the acoustic signal in the recording buffer.
  • The size of the recording buffer can be set as follows: if the recording buffer is set too small, it may cause frame loss; generally, the buffer should be more than twice the minimum buffer size.
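  • A small sketch of the buffer sizing just described, assuming 48 kHz mono recording and 512-sample analysis windows (values taken from the examples elsewhere in this text); the platform minimum below is a placeholder:

```python
import numpy as np

SAMPLE_RATE = 48_000      # matches the ultrasonic sampling frequency
FRAME_SAMPLES = 512       # one analysis window, about 10.7 ms at 48 kHz

# Use at least twice the platform's minimum buffer so a briefly stalled
# recording thread does not drop frames.
platform_min_samples = FRAME_SAMPLES          # placeholder for the OS value
buffer_samples = 2 * platform_min_samples
recording_buffer = np.zeros(buffer_samples, dtype=np.int16)
```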
  • Gesture recognition may then be performed on the received second ultrasonic wave. Because the first ultrasonic wave emitted by the speaker is blocked by the user's hand, the microphone collects the ultrasonic wave affected by the hand; therefore, by analyzing the receiving position, waveform energy, and the like of the second ultrasonic wave received by the microphone, the gesture made by the user can be recognized and the gesture recognition result obtained.
  • In some examples, after step S104 performs gesture recognition according to the received second ultrasonic wave and the gesture recognition result is obtained, the gesture recognition method provided by the example of the present application may further include the following steps:
  • the prompt information for successful gesture recognition is displayed through the display interface of the application.
  • The display interface of the terminal's application can also interact with the user in real time. When step S104 performs gesture recognition on the received second ultrasonic wave and obtains a gesture recognition result, the current gesture recognition has succeeded. To prevent the user from repeating the same action multiple times, the display interface of the application can display prompt information, for example an animation that informs the user of the current gesture recognition progress, or a text or sound prompt, which is not limited here.
  • the terminal may respond to the gesture of the user, and may execute a corresponding gesture instruction according to the gesture recognition result, for example, operating the application of the terminal in response to the gesture instruction of the user.
  • For example, if the gesture recognition result is a circle, the terminal executes the gesture instruction corresponding to that result on the application: opening the application or operating one of the application's menus.
  • The correspondence between the gesture recognition result and the gesture instruction may be determined according to a list pre-configured by the user on the terminal.
  • the terminal recognizes the gesture of the user to obtain a gesture recognition result, and the terminal may execute a gesture instruction of the user on the browser application, for example, inputting a text in a search box of the browser.
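  • The pre-configured correspondence between gesture recognition results and gesture instructions can be pictured as a simple lookup table; the gesture names and handler functions below are hypothetical, for illustration only:

```python
def open_application():
    print("opening application")     # stand-in for a real action

def open_menu():
    print("opening a menu")          # stand-in for a real action

# Hypothetical pre-configured list mapping results to instructions.
GESTURE_INSTRUCTIONS = {
    "circle": open_application,
    "square": open_menu,
}

def execute_gesture(gesture_recognition_result):
    handler = GESTURE_INSTRUCTIONS.get(gesture_recognition_result)
    if handler is not None:
        handler()  # execute the instruction bound to this gesture
```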
  • In some examples, step S102 of playing the first ultrasonic wave through the speaker includes: playing first ultrasonic waves of N frequencies through the speaker, N being a positive integer greater than or equal to 1;
  • step S103 receives the second ultrasonic wave through the microphone, including:
  • a second ultrasonic wave of N frequencies is collected from the surroundings of the terminal through a microphone.
  • The terminal can transmit ultrasonic waves of one or more frequencies; for example, it can transmit multiple ultrasonic waves of different frequencies, spaced at equal frequency intervals.
  • the terminal can separately acquire ultrasonic waves of one or more frequencies played by the speaker through the microphone.
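  • A sketch of generating the N equally spaced ultrasonic tones described here, using the 17,500-23,000 Hz band and 48 kHz sampling rate given elsewhere in this text (N = 8 is an assumption):

```python
import numpy as np

def make_multitone(n_tones=8, f_low=17_500, f_high=23_000,
                   sample_rate=48_000, seconds=1.0):
    t = np.arange(int(sample_rate * seconds)) / sample_rate
    # Equal spacing between adjacent frequencies, as the text requires.
    freqs = np.linspace(f_low, f_high, n_tones)
    signal = sum(np.sin(2 * np.pi * f * t) for f in freqs)
    return signal / n_tones, freqs  # normalize the sum into [-1, 1]
```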
  • step S104 performs gesture recognition according to the received second ultrasonic wave to obtain a gesture recognition result, including:
  • When the value of N is 1, one-dimensional gesture recognition is performed according to the waveform energy of the second ultrasonic wave to obtain a first gesture recognition result.
  • The first gesture recognition result includes: the gesture approaching the terminal, or moving away from the terminal.
  • The terminal can transmit one ultrasonic wave through the speaker, collect it through the microphone, and perform coarse-grained one-dimensional gesture recognition on the received wave; for example, the first gesture recognition result is obtained from the Doppler-induced change in waveform energy, and includes: the gesture approaching the terminal, or moving away from the terminal.
  • step S104 performs gesture recognition according to the received second ultrasonic wave to obtain a gesture recognition result, including:
  • When the value of N is greater than or equal to 2, the moving distance of the gesture is calculated for each of the N second ultrasonic waves.
  • The gesture recognition result includes: the gesture approaching the terminal, moving away from the terminal, moving left relative to the terminal, moving right relative to the terminal, moving up relative to the terminal, or moving down relative to the terminal.
  • The terminal can transmit multiple ultrasonic waves through the speaker, collect each of them separately through the microphone, and perform fine-grained one-dimensional gesture recognition on the received waves. For example, using the phase changes of N frequencies to calculate distance makes the response more sensitive: N ultrasonic waves between 17,500 Hz and 23,000 Hz are emitted, with the same spacing between adjacent frequencies; at a sampling frequency of 48,000 Hz, 512 points can be sampled to calculate the distance change in real time, giving a reaction time of 10.7 ms. During sound transmission there are multipath effects and reflections from static background objects.
  • The background interference can be subtracted from the dynamic signal, and N distances can be calculated from the N frequencies.
  • The least-squares method is used to calculate the deviation of each distance; frequency distances with large deviation are removed and those with small deviation are retained. By excluding the abnormal distances, the calculation accuracy of the distance can be improved.
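  • A sketch of the phase-to-distance step, assuming 512-sample windows at 48 kHz as in the text; the speed of sound and the single-bin DFT are our assumptions:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumption)

def tone_phase(window, freq, sample_rate=48_000):
    # Phase of one tone within a window, via a single-bin DFT.
    t = np.arange(len(window)) / sample_rate
    return np.angle(np.sum(window * np.exp(-2j * np.pi * freq * t)))

def distance_change(prev_window, curr_window, freq, sample_rate=48_000):
    # Wrap the phase difference into (-pi, pi].
    dphi = np.angle(np.exp(1j * (tone_phase(curr_window, freq, sample_rate)
                                 - tone_phase(prev_window, freq, sample_rate))))
    wavelength = SPEED_OF_SOUND / freq
    # One full cycle of phase = one wavelength of reflected-path change;
    # the hand moves half of that, since the wave travels out and back.
    return dphi / (2 * np.pi) * wavelength / 2.0
```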
  • step S103 receives the second ultrasonic wave through the microphone, including:
  • the second ultrasonic wave is separately collected by the two microphones of the terminal.
  • step S104 performs gesture recognition according to the received second ultrasonic wave to obtain a gesture recognition result, including:
  • Two-dimensional gesture recognition is performed according to the calculated relative position and initial position to obtain a second gesture recognition result, which includes: the two-dimensional coordinates of the gesture change.
  • At least two microphones may be disposed in the terminal to separately acquire the second ultrasonic waves. Then, for the second ultrasonic wave received by each microphone, the relative position and initial position of the gesture can be calculated.
  • the relative position of the gesture refers to the position of the gesture relative to the terminal based on the phase measurement of the second ultrasonic wave
  • the initial position of the gesture refers to the position of the user gesture at the time of initial recognition.
  • The description of the present application shows that a speaker and a microphone are disposed in the terminal and an application is installed on it. First, it is determined, according to the input control indication of the application, whether the terminal turns on the gesture recognition mode; when the terminal determines to turn the mode on, the first ultrasonic wave is played through the speaker; the second ultrasonic wave, i.e., the waveform data obtained by collecting the first ultrasonic wave, is then received through the microphone; gesture recognition is performed on the received second ultrasonic wave to obtain a gesture recognition result; and finally the corresponding gesture instruction is executed in the application according to that result.
  • In this way, gesture recognition can be realized using the terminal's built-in speaker and microphone; that is, the user's motion can be recognized through the transmission and detection of ultrasonic waves, which reduces the complexity of gesture recognition, lowers the computing performance required of the terminal, and implements the user's gesture control of the application. Further, by computing on the second ultrasonic wave, the moving distance of the gesture or the two-dimensional change of the gesture is obtained, improving the accuracy of gesture recognition.
  • the application example can be applied to a mobile phone browser and a game application (Application, APP).
  • the application example can realize high-precision, low-delay, no-peripheral, non-contact one-dimensional and two-dimensional gesture recognition.
  • The example of the present application performs ultrasonic gesture detection based on the terminal's speaker and microphone and requires no external equipment. For example, the user does not need to operate the terminal's touch screen manually or enter any command from the keyboard; the user only needs to make a gesture action.
  • The terminal in the example of the present application can detect, by sending and receiving ultrasonic waves, the gesture action made by the user, and then execute the gesture instruction corresponding to that action on the terminal's application; for example, by making gestures the user can operate a game application, controlling the movement of characters in the game and opening and closing the application.
  • In a browser, the user can perform basic browsing operations such as up, down, left, and right.
  • the terminal can adopt multiple playback modes such as static and streaming, and can be set according to user requirements.
  • The requirements on ultrasonic playback are strict: abnormal playback or frame loss leads to calculation errors. Therefore, the data for the ultrasonic playback process needs to be prepared in advance, and the playback logic must ensure that the played waveform data has no dropped frames, that is, that the waveform data to be played is complete.
  • The terminal can detect whether the played data is abnormal in multiple dimensions such as the time domain and the frequency domain. There are many detection methods: for example, detecting whether the played data has frame loss, i.e., whether the number of points played in one second equals the sampling frequency; detecting whether the playback module has reported an exception; or checking in the frequency domain whether the signal energy is normal, e.g., whether a 20 kHz played tone also appears at 20 kHz in the received signal. In addition, the playback process can detect frame loss by playing fast-paced music.
  • The sampling frequency is twice the transmission frequency of the ultrasonic wave: according to Nyquist's sampling law, the sampling frequency must be at least twice the playback frequency (for example, a 23,000 Hz tone requires at least 46,000 Hz sampling, which a 48,000 Hz rate satisfies). If the recording buffer is set too small, frames may be lost; generally, the buffer should be more than twice the minimum buffer size.
  • The logic of the recording thread should be kept as light as possible, ensuring that a full second's worth of samples at the sampling frequency is received each second.
  • Whether the recorded signal waveform is complete, i.e., whether frames were lost, needs to be detected in multiple dimensions such as the frequency domain and the time domain; otherwise, the subsequent gesture recognition result is affected. There are many detection methods, such as checking whether the received signal energy matches the energy at the played frequency, or whether the number of points recorded per second equals the sampling frequency.
  • the ultrasonic processing may be specifically a one-dimensional gesture recognition process or a two-dimensional gesture recognition process, which is respectively illustrated by way of example.
  • the one-dimensional gesture recognition processing will be described.
  • The one-dimensional gesture recognition processing can be implemented in two schemes.
  • The first, coarse-grained gesture recognition, mainly includes:
  • Modeling multi-dimensional features of the waveform, such as velocity and acceleration, to obtain the one-dimensional gesture change. This relies mainly on the Doppler effect: as the gesture approaches or recedes, the waveform shows different trends, and the calculation is based on the direction and acceleration of the waveform change. For example, an ultrasonic wave of 20,000 Hz is transmitted at a sampling frequency of 44,100 Hz, and 4,096 points can be sampled each time for calculation.
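  • A sketch of this coarse-grained Doppler scheme using the 20,000 Hz tone, 44,100 Hz sampling rate, and 4,096-point windows from the text; the sideband width and threshold are assumptions:

```python
import numpy as np

def doppler_direction(window, sample_rate=44_100, tone_hz=20_000,
                      band_hz=200, threshold=1.5):
    # A hand approaching the device shifts reflected energy above the
    # carrier; a receding hand shifts it below. Compare the sidebands.
    spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window)))) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / sample_rate)

    upper = spectrum[(freqs > tone_hz + 20) & (freqs <= tone_hz + band_hz)].sum()
    lower = spectrum[(freqs >= tone_hz - band_hz) & (freqs < tone_hz - 20)].sum()

    if upper + lower < 1e-12:
        return "static"
    if upper > threshold * lower:
        return "approaching"   # gesture moving toward the terminal
    if lower > threshold * upper:
        return "moving away"   # gesture moving away from the terminal
    return "static"

# usage: doppler_direction(samples[-4096:]) on each new 4,096-point window
```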
  • The second, fine-grained gesture recognition, includes:
  • Using the phase changes of N frequencies to calculate distance, which makes the response more sensitive. N ultrasonic waves between 17,500 Hz and 23,000 Hz are emitted, with the same spacing between adjacent frequencies; at a sampling frequency of 48,000 Hz, 512 points can be sampled to calculate the distance change in real time. The reaction time is 10.7 ms; that is, in this example the gesture recognition responds at the millisecond level.
  • The distances calculated from the phase changes of the N frequencies are filtered by an algorithm that eliminates abnormal values, improving the calculation accuracy of the distance.
  • The background interference is subtracted, and N distances are calculated from the N frequencies. According to the least-squares method, the deviation of each distance is calculated; frequency distances with large deviation are removed and those with small deviation are kept.
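  • A sketch of this outlier rejection across the N per-frequency distances; since the least-squares estimate of a single common distance is the mean, deviation from the mean is used here, and the kept fraction is an assumption:

```python
import numpy as np

def robust_distance(distances, keep_fraction=0.75):
    # distances: the N per-frequency distance estimates for one frame.
    d = np.asarray(distances, dtype=float)
    deviation = np.abs(d - d.mean())   # residual of each frequency
    keep = np.argsort(deviation)[: max(1, int(len(d) * keep_fraction))]
    return float(d[keep].mean())       # re-average the retained distances
```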
  • Because the gesture affects the waveform received at each of the two microphones differently, the change in two directions can be obtained.
  • The two-dimensional gesture recognition processing mainly solves the recognition of the two-dimensional coordinates of the gesture action, thereby realizing the recognition of two-dimensional graphics (drawing circles, squares, triangles, etc.) or the writing of Chinese characters. With two microphones, two channels of data can be sampled.
  • Steps 503 and 504 are performed for each microphone channel.
  • Based on the phase method of the one-dimensional gesture recognition, the relative distance of the gesture movement can be calculated.
  • the coarse-grained initial position of the gesture can be calculated.
  • the initial position is the initial position of the gesture detected when the ultrasonic wave is received.
  • Two-dimensional gesture recognition combines the coordinates in the x and y directions, so the precise coordinates (x, y) of the gesture can be obtained; from the initial position and the relative position changes, the real-time gesture position is obtained.
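  • A sketch of the two-microphone combination; the simplifying assumption that one microphone channel's distance change maps to x and the other's to y is ours, for illustration, not the text's:

```python
import numpy as np

class Gesture2DTracker:
    def __init__(self, initial_xy=(0.0, 0.0)):
        # Coarse-grained initial position of the gesture.
        self.xy = np.array(initial_xy, dtype=float)

    def update(self, d_mic1, d_mic2):
        # d_mic1 / d_mic2: per-frame distance changes (metres) from the
        # phase-based estimate at each microphone channel.
        self.xy += np.array([d_mic1, d_mic2])
        return tuple(self.xy)  # real-time (x, y) of the gesture
```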
  • the management of gesture recognition is exemplified next.
  • The action management of gesture recognition avoids misoperation and unnecessary interruption, and mainly includes:
  • The gesture action that starts recognition may be drawing a circle (see FIG. 7-a).
  • The gesture action that ends recognition may be drawing a square (see FIG. 7-b).
  • the user can set the gesture recognition action to meet the basic operational needs.
  • the basic operation of the gesture recognition may include basic actions such as up, down, left, and right.
  • the playback and recording of the ultrasonic wave may be affected, thereby affecting the accuracy of the gesture recognition.
  • To reduce computation time, an efficient programming language may be used for the calculation logic, for example a language with pointer operations such as C, optimizing high-cost processor logic operations and minimizing memory copies and input/output operations. The time consumed by one-dimensional gesture recognition calculation in the example of the present application can therefore be reduced to the millisecond level.
  • In gesture recognition, accuracy and the fastest possible response time are the most important considerations.
  • the user's misoperation can be prevented, and the opening and ending of the gesture recognition mode can be recognized by the two-dimensional gesture.
  • The present application example proposes one-dimensional gesture recognition manners, wherein the ultrasonic phase-based approach can achieve millimeter (mm) level accuracy.
  • the example of the present application can improve the calculation performance as much as possible and save the calculation time.
  • a terminal 900 is provided.
  • the terminal is configured with a speaker and a microphone.
  • The terminal is installed with an application program, and the terminal may include: a mode determining module 901, an ultrasonic sending module 902, an ultrasonic acquisition module 903, a gesture recognition module 904, and an instruction execution module 905.
  • the mode determining module 901 is configured to determine, according to an input control indication of the application, whether the terminal turns on the gesture recognition mode;
  • the ultrasonic sending module 902 is configured to: when the terminal determines to enable the gesture recognition mode, play the first ultrasonic wave through the speaker;
  • the ultrasonic acquisition module 903 is configured to receive a second ultrasonic wave by using the microphone, where the second ultrasonic wave is waveform data obtained after collecting the first ultrasonic wave;
  • a gesture recognition module 904 configured to perform gesture recognition according to the received second ultrasonic wave, to obtain a gesture recognition result
  • the instruction execution module 905 is configured to execute a corresponding gesture instruction in the application according to the gesture recognition result.
  • the ultrasonic transmitting module 902 is specifically configured to play a first ultrasonic wave of N frequencies through the speaker, where N is a positive integer greater than or equal to 1;
  • the ultrasonic acquisition module is configured to collect, by the microphone, a second ultrasonic wave of N frequencies from a surrounding environment of the terminal.
  • the gesture recognition module 904 is specifically configured to perform one-dimensional gesture recognition according to the waveform energy of the second ultrasonic wave when the value of the N is 1, to obtain a first A gesture recognition result, the first gesture recognition result includes: the gesture is close to the terminal, or is away from the terminal.
  • the mode determining module 901 includes:
  • the control indication parsing module 9011 is configured to determine, according to the configuration file of the application, whether the input control indication of the application is a default open gesture recognition mode;
  • the program detection module 9012 is configured to detect whether the application runs successfully on the terminal
  • The mode-opening module 9013 is configured to determine, when the application runs successfully and the terminal turns on the gesture recognition mode by default, that the terminal has enabled the gesture recognition mode.
  • the mode determining module 901 is specifically configured to determine, according to the configuration file of the application, that the input control indication of the application is a first two-dimensional gesture open gesture recognition mode, or a second two-dimensional gesture Turning off the gesture recognition mode; when detecting the first two-dimensional gesture, determining that the terminal has turned on the gesture recognition mode; and when detecting the second two-dimensional gesture, determining that the terminal turns off the gesture recognition mode.
  • the terminal 900 further includes: a template library establishing module 906 and a gesture action matching module 907, where
  • the ultrasonic acquisition module 903 is configured to record waveform data of the specified gesture action through the microphone when the speaker plays the training ultrasonic wave;
  • a template library establishing module 906, configured to establish a gesture waveform template library according to the waveform data of the specified gesture action
  • the gesture action matching module 907 is configured to determine a gesture recognition result corresponding to the specified gesture action according to the gesture waveform template library.
  • the terminal 900 further includes:
  • The display module 908 is configured to, after gesture recognition is performed according to the received second ultrasonic wave and a gesture recognition result is obtained, display prompt information for successful gesture recognition through the display interface of the application.
  • the gesture recognition module 904 includes:
  • a distance calculating module configured to calculate a moving distance of the gesture for each of the N second ultrasonic waves when the value of the N is greater than or equal to 2;
  • a one-dimensional identification module configured to exclude abnormal moving distances from the moving distances of the N gestures and perform one-dimensional gesture recognition on the retained moving distances to obtain a first gesture recognition result, where the first gesture recognition result includes: the gesture approaching the terminal, moving away from the terminal, moving left relative to the terminal, moving right relative to the terminal, moving up relative to the terminal, or moving down relative to the terminal.
  • the ultrasonic acquisition module 903 is specifically configured to separately collect a second ultrasonic wave through two microphones of the terminal;
  • the gesture recognition module 904 includes:
  • a position calculation module configured to calculate a relative position and an initial position of the gesture according to the second ultrasonic waves respectively received by the two microphones;
  • the two-dimensional recognition module is configured to perform two-dimensional gesture recognition according to the calculated relative position and the initial position to obtain a second gesture recognition result, where the second gesture recognition result includes: two-dimensional coordinates of the gesture change.
  • the ultrasonic acquisition module 903 includes:
  • a sound wave recording module configured to control the microphone to record an acoustic signal in a surrounding environment of the terminal according to preset recording parameters
  • a signal storage module configured to store the recorded acoustic wave signal into a recording buffer of the terminal, wherein the sound wave signal stored in the recording buffer is the second ultrasonic wave.
  • a speaker and a microphone are disposed in the terminal, and an application is installed on the terminal.
  • It is first determined, according to the input control indication of the application, whether the terminal turns on the gesture recognition mode; when the terminal determines to turn the mode on, the first ultrasonic wave is played through the speaker; the second ultrasonic wave, i.e., the waveform data obtained by collecting the first ultrasonic wave, is then received through the microphone; gesture recognition is performed on the received second ultrasonic wave to obtain a gesture recognition result; and finally the corresponding gesture instruction is executed in the application according to the gesture recognition result.
  • In this way, gesture recognition can be realized using the terminal's built-in speaker and microphone; that is, the user's motion can be recognized through the transmission and detection of ultrasonic waves, which reduces the complexity of gesture recognition, lowers the computing performance required of the terminal, and implements the user's gesture control of the application. Further, by computing on the second ultrasonic wave, the moving distance of the gesture or the two-dimensional change of the gesture is obtained, improving the accuracy of gesture recognition.
  • The present application also provides a terminal, as shown in FIG. 10. For convenience of description, only the parts related to the examples of the present application are shown; for specific technical details not disclosed, refer to the method examples of the present application.
  • The terminal may be any terminal device, including a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, or an in-vehicle computer; the following takes a mobile phone as an example:
  • FIG. 10 is a block diagram showing a partial structure of a mobile phone associated with a terminal provided by an example of the present application.
  • the mobile phone includes: a radio frequency (RF) circuit 1010, a memory 1020, an input unit 1030, a display unit 1040, a sensor 1050, an audio circuit 1060, a wireless fidelity (WiFi) module 1070, and a processor 1080. And power supply 1090 and other components.
  • The RF circuit 1010 can be used for receiving and transmitting signals during the transmission or reception of information or during a call. In particular, after receiving downlink information from the base station, the RF circuit passes it to the processor 1080 for processing; in addition, it sends uplink data to the base station. Generally, the RF circuit 1010 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 1010 can also communicate with the network and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System of Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
  • the memory 1020 can be used to store software programs and modules, and the processor 1080 executes various functional applications and data processing of the mobile phone by running software programs and modules stored in the memory 1020.
  • the memory 1020 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may be stored according to Data created by the use of the mobile phone (such as audio data, phone book, etc.).
  • The memory 1020 can include high-speed random access memory and can also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • Input unit 1030 can be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the handset.
  • the input unit 1030 may include a touch panel 1031 and other input devices 1032.
  • The touch panel 1031, also referred to as a touch screen, can collect the user's touch operations on or near it (such as operations performed by the user with a finger, a stylus, or another suitable object on or near the touch panel 1031) and drive the corresponding connection device according to a preset program.
  • the touch panel 1031 can include two portions of a touch detection device and a touch controller.
  • The touch detection device detects the user's touch orientation, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 1080, and can receive commands from the processor 1080 and execute them.
  • the touch panel 1031 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic waves.
  • the input unit 1030 may also include other input devices 1032.
  • other input devices 1032 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackballs, mice, joysticks, and the like.
  • the display unit 1040 can be used to display information input by the user or information provided to the user as well as various menus of the mobile phone.
  • the display unit 1040 may include a display panel 1041.
  • the display panel 1041 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • The touch panel 1031 may cover the display panel 1041; when the touch panel 1031 detects a touch operation on or near it, it transmits the operation to the processor 1080 to determine the type of the touch event, and the processor 1080 then provides a corresponding visual output on the display panel 1041 according to the type of the touch event.
  • Although in FIG. 10 the touch panel 1031 and the display panel 1041 are two independent components implementing the input and output functions of the mobile phone, in some examples the touch panel 1031 and the display panel 1041 may be integrated to implement the input and output functions of the phone.
  • the handset can also include at least one type of sensor 1050, such as a light sensor, motion sensor, and other sensors.
  • The light sensor may include an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1041 according to the ambient light, and the proximity sensor can turn off the display panel 1041 and/or the backlight when the mobile phone moves to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in all directions (usually three axes). When it is stationary, it can detect the magnitude and direction of gravity.
  • It can be used to identify the attitude of the mobile phone (such as switching between landscape and portrait, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tap detection). The mobile phone may also be configured with a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and other sensors, which are not described here.
  • The audio circuit 1060, the speaker 1061, and the microphones 1062 and 1063 can provide an audio interface between the user and the handset.
  • The audio circuit 1060 can transmit the electrical signal converted from received audio data to the speaker 1061, which converts it into a sound signal for output; conversely, the microphone 1062 converts collected sound signals into electrical signals, which the audio circuit 1060 receives and converts into audio data. The audio data is then processed by the processor 1080 and sent via the RF circuit 1010 to another mobile phone, or output to the memory 1020 for further processing.
  • the speaker 1061 can generate an ultrasonic signal, and then the microphone 1063 can acquire an ultrasonic signal from around the mobile phone.
  • WiFi is a short-range wireless transmission technology.
  • Through the WiFi module 1070, the mobile phone can help the user send and receive e-mail, browse web pages, and access streaming media; it provides the user with wireless broadband Internet access.
  • Although FIG. 10 shows the WiFi module 1070, it can be understood that it is not an essential part of the mobile phone and can be omitted as needed without changing the essence of the application.
  • The processor 1080 is the control center of the handset; it connects the various parts of the entire handset using various interfaces and lines and, by running or executing the software programs and/or modules stored in the memory 1020 and invoking the data stored in the memory 1020, performs the phone's various functions and processes data, thereby monitoring the phone as a whole.
  • Optionally, the processor 1080 can include one or more processing units; preferably, the processor 1080 can integrate an application processor and a modem processor, where the application processor primarily handles the operating system, user interface, and applications, and the modem processor primarily handles wireless communications. It will be appreciated that the modem processor may also not be integrated into the processor 1080.
  • the mobile phone also includes a power source 1090 (such as a battery) that supplies power to various components.
  • the power source can be logically coupled to the processor 1080 through a power management system to manage functions such as charging, discharging, and power management through the power management system.
  • the mobile phone may further include a camera, a Bluetooth module, and the like, and details are not described herein again.
  • the processor 1080 included in the terminal further has a flow of controlling a gesture recognition method performed by the terminal.
  • the process of identifying the ultrasonic waves by the processor 1080 can be referred to the description in the foregoing examples.
  • the terminal 1100 includes:
  • the speaker 1101, the microphone 1102, the processor 1103, and the memory 1104 (wherein the number of the processors 1103 in the terminal 1100 may be one or more, and one processor in FIG. 11 is taken as an example).
  • the speaker 1101, the microphone 1102, the processor 1103, and the memory 1104 may be connected by a bus or other means, wherein FIG. 11 is exemplified by a bus connection.
  • The memory 1104 can include read-only memory and random access memory, and provides instructions and data to the processor 1103. A portion of the memory 1104 may also include non-volatile random access memory (NVRAM).
  • the memory 1104 stores operating systems and operational instructions, executable modules or data structures, or a subset thereof, or an extended set thereof, wherein the operational instructions can include various operational instructions for implementing various operations.
  • the operating system can include a variety of system programs for implementing various basic services and handling hardware-based tasks.
  • The processor 1103 controls the operation of the terminal; the processor 1103 can also be referred to as a central processing unit (CPU).
  • the components of the terminal are coupled together by a bus system.
  • the bus system may include a power bus, a control bus, and a status signal bus in addition to the data bus.
  • the various buses are referred to as bus systems in the figures.
  • the method disclosed in the above examples of the present application may be applied to the processor 1103 or implemented by the processor 1103.
  • the processor 1103 can be an integrated circuit chip with signal processing capabilities. In the implementation process, each step of the above method may be completed by an integrated logic circuit of hardware in the processor 1103 or an instruction in a form of software.
  • The processor 1103 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • The general-purpose processor may be a microprocessor or any conventional processor.
  • the steps of the method disclosed in connection with the examples of the present application may be directly embodied by the execution of the hardware decoding processor or by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in a conventional storage medium such as random access memory, flash memory, read only memory, programmable read only memory or electrically erasable programmable memory, registers, and the like.
  • the storage medium is located in the memory 1104, and the processor 1103 reads the information in the memory 1104 and performs the steps of the above method in combination with its hardware.
  • the speaker 1101 is configured to play a first ultrasonic wave under the control of the processor
  • the microphone 1102 is configured to receive a second ultrasonic wave under the control of the processor
  • the processor 1103 is configured to execute the instructions in the memory, and perform the method in the foregoing example.
• The device examples described above are merely illustrative: the units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; they can be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this example.
• The connection relationship between the modules indicates that there is a communication connection between them, which may be implemented as one or more communication buses or signal lines.
• The storage medium may be a USB flash drive, a mobile hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc, and includes a number of instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform the methods described in the examples of the present application.

Abstract

Examples of the present application provide a gesture recognition method. The method is applied to a terminal that is provided with a speaker and a microphone and on which an application program is installed. The method includes: determining, according to an input control instruction of the application program, whether the terminal starts a gesture recognition mode; when it is determined that the terminal starts the gesture recognition mode, playing a first ultrasonic wave by means of the speaker; receiving a second ultrasonic wave by means of the microphone, the second ultrasonic wave being waveform data obtained by collecting the first ultrasonic wave; performing gesture recognition according to the received second ultrasonic wave to obtain a gesture recognition result; and executing a corresponding gesture instruction in the application program according to the gesture recognition result.
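The abstract does not spell out how the recognition step interprets the second ultrasonic wave. One common strategy in ultrasonic gesture sensing, offered here only as an illustration and not as the algorithm actually claimed, is Doppler-shift analysis: a hand moving toward the device pushes reflected energy above the emitted frequency, and a hand moving away pushes it below. A minimal sketch, reusing the assumed constants from the earlier snippet:

```python
# Illustrative Doppler-sideband classifier for a captured frame of the
# second ultrasonic wave. The band edges and the 2x energy-ratio
# threshold are assumptions for the sketch, not values from the patent.
import numpy as np

SAMPLE_RATE = 48000   # Hz, matching the playback sketch
TONE_FREQ = 20000     # Hz, the emitted carrier

def recognize_gesture(second_wave: np.ndarray) -> str:
    """Classify hand motion from the spectrum of the echo."""
    windowed = second_wave * np.hanning(len(second_wave))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(second_wave), d=1.0 / SAMPLE_RATE)
    # The direct speaker-to-microphone path dominates at the carrier
    # itself, so compare energy in sidebands just above and below it.
    upper = spectrum[(freqs > TONE_FREQ + 20) & (freqs < TONE_FREQ + 200)].sum()
    lower = spectrum[(freqs > TONE_FREQ - 200) & (freqs < TONE_FREQ - 20)].sum()
    if upper > 2.0 * lower:
        return "approach"   # hand moving toward the device
    if lower > 2.0 * upper:
        return "retreat"    # hand moving away from the device
    return "none"
```

The returned label would then be mapped to the corresponding gesture instruction of the application program, for example scrolling or page turning in a browser.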
PCT/CN2018/117864 2017-11-30 2018-11-28 Procédé de reconnaissance de geste, terminal et support de stockage WO2019105376A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711237225.9 2017-11-30
CN201711237225.9A CN109857245B (zh) 2017-11-30 2017-11-30 一种手势识别方法和终端

Publications (1)

Publication Number Publication Date
WO2019105376A1 true WO2019105376A1 (fr) 2019-06-06

Family

ID=66664706

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/117864 WO2019105376A1 (fr) 2017-11-30 2018-11-28 Procédé de reconnaissance de geste, terminal et support de stockage

Country Status (2)

Country Link
CN (1) CN109857245B (fr)
WO (1) WO2019105376A1 (fr)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021007733A1 (fr) * 2019-07-12 2021-01-21 Oppo广东移动通信有限公司 Procédé de reconnaissance de geste pour faire fonctionner un dispositif terminal, et dispositif terminal
CN111722715A (zh) * 2020-06-17 2020-09-29 上海思立微电子科技有限公司 一种手势控制系统及电子设备
CN111929689B (zh) * 2020-07-22 2023-04-07 杭州电子科技大学 一种基于手机自带传感器的物体成像方法
CN114647301A (zh) * 2020-12-17 2022-06-21 上海交通大学 一种基于声音信号的车载应用手势交互方法及系统
CN112860070A (zh) * 2021-03-03 2021-05-28 北京小米移动软件有限公司 设备交互方法、设备交互装置、存储介质及终端
CN112987925B (zh) * 2021-03-04 2022-11-25 歌尔科技有限公司 一种耳机及其控制方法、装置
CN112965639A (zh) * 2021-03-17 2021-06-15 北京小米移动软件有限公司 手势识别方法及装置、电子设备、存储介质
CN115981454A (zh) * 2021-10-13 2023-04-18 华为技术有限公司 非接触式手势控制方法和电子设备
CN115002278B (zh) * 2022-05-12 2023-10-10 中国电信股份有限公司 无线设备手势控制方法及装置、存储介质及电子设备

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103577107A (zh) * 2013-10-29 2014-02-12 广东欧珀移动通信有限公司 一种利用多点触控快速启动应用的方法及智能终端
CN105718064A (zh) * 2016-01-22 2016-06-29 南京大学 基于超声波的手势识别系统与方法
CN105807923A (zh) * 2016-03-07 2016-07-27 中国科学院计算技术研究所 一种基于超声波的凌空手势识别方法及系统

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11106273B2 (en) * 2015-10-30 2021-08-31 Ostendo Technologies, Inc. System and methods for on-body gestural interfaces and projection displays
US10632500B2 (en) * 2016-05-10 2020-04-28 Invensense, Inc. Ultrasonic transducer with a non-uniform membrane
CN107066086A (zh) * 2017-01-17 2017-08-18 上海与德信息技术有限公司 一种基于超声波的手势识别方法及装置
CN107291308A (zh) * 2017-07-26 2017-10-24 上海科世达-华阳汽车电器有限公司 一种手势识别装置及其识别方法

Also Published As

Publication number Publication date
CN109857245A (zh) 2019-06-07
CN109857245B (zh) 2021-06-15

Similar Documents

Publication Publication Date Title
WO2019105376A1 (fr) Procédé de reconnaissance de geste, terminal et support de stockage
US10466961B2 (en) Method for processing audio signal and related products
US11237724B2 (en) Mobile terminal and method for split screen control thereof, and computer readable storage medium
TWI679585B (zh) 指紋識別區域顯示方法及移動終端
US7650445B2 (en) System and method for enabling a mobile device as a portable character input peripheral device
WO2016119580A1 (fr) Procédé, dispositif et terminal de démarrage d'une fonction d'entrée vocale d'un terminal
CN108334272B (zh) 一种控制方法及移动终端
TW201839594A (zh) 行動終端、指紋識別控制方法及裝置
WO2019000287A1 (fr) Procédé et dispositif d'affichage d'icône
CN107870674B (zh) 一种程序启动方法和移动终端
WO2018166204A1 (fr) Procédé de commande de module de reconnaissance d'empreintes digitales, terminal mobile et support de stockage
WO2018076380A1 (fr) Dispositif électronique et procédé de génération de vignette vidéo dans un dispositif électronique
CN111050370A (zh) 网络切换方法、装置、存储介质及电子设备
WO2015000429A1 (fr) Procédé et dispositif de sélection intelligente de mots
CN108536509B (zh) 一种应用分身方法及移动终端
WO2017193496A1 (fr) Procédé et appareil de traitement de données d'application et dispositif de terminal
TW201516844A (zh) 一種物件選擇的方法和裝置
JP2018504798A (ja) ジェスチャ制御方法、デバイス、およびシステム
WO2017215635A1 (fr) Procédé de traitement d'effet sonore et terminal mobile
WO2019154360A1 (fr) Procédé de commutation d'interface et terminal mobile
CN108073405B (zh) 一种应用程序卸载方法及移动终端
WO2019052551A1 (fr) Procédé d'interaction de dispositif terminal, support d'informations et dispositif terminal
WO2019061512A1 (fr) Procédé de changement de tâche, et terminal
WO2019011108A1 (fr) Procédé de reconnaissance d'iris et produit associé
WO2015014135A1 (fr) Procédé et appareil de commande de pointeur de souris et dispositif de terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18882926

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18882926

Country of ref document: EP

Kind code of ref document: A1