CN112860070A - Device interaction method, device interaction apparatus, storage medium and terminal - Google Patents


Info

Publication number
CN112860070A
CN112860070A
Authority
CN
China
Prior art keywords
gesture
distance
preset
determining
movement
Prior art date
Legal status
Pending
Application number
CN202110236455.3A
Other languages
Chinese (zh)
Inventor
周岭松
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Beijing Xiaomi Pinecone Electronic Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Beijing Xiaomi Pinecone Electronic Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd, Beijing Xiaomi Pinecone Electronic Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202110236455.3A
Publication of CN112860070A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to a device interaction method, a device interaction apparatus, a storage medium, and a terminal. The method is applied to an electronic device and includes the following steps: determining a corresponding gesture movement distance in response to a change in an ultrasonic sound field generated by the electronic device; recognizing the gesture when the gesture movement distance satisfies a preset condition; and determining and executing a corresponding interaction instruction according to the recognized number of movements of the gesture. By setting a gesture recognition threshold, the accuracy of gesture recognition can be improved. Because the corresponding interaction instruction is determined and executed according to the recognized number of gesture movements, more types of gestures can be recognized, with gestures of different movement counts corresponding to different interaction instructions; thus, without increasing hardware cost, the device interaction procedure is simplified and the user experience is improved.

Description

Device interaction method, device interaction apparatus, storage medium and terminal
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to a device interaction method, apparatus, storage medium, and terminal.
Background
With the development of voice interaction technology, users can keep both hands free by controlling devices by voice in many scenarios. However, in some scenarios, for example stopping the playback of a song, the user must first wake up the smart device by voice and then input a voice command to stop playback, which makes the interaction somewhat cumbersome. Therefore, other more convenient ways of operating the device need to be introduced. In the related art, device interaction can be controlled through gestures; however, gesture recognition suffers from poor accuracy and easy false triggering, so the accuracy of device interaction control is not high.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides a device interaction method, apparatus, storage medium, and terminal.
According to a first aspect of the embodiments of the present disclosure, there is provided a device interaction method applied to an electronic device, the method including:
responding to the change of an ultrasonic sound field generated by the electronic equipment, and determining a corresponding gesture movement distance;
under the condition that the gesture movement distance meets a preset condition, recognizing the gesture;
and determining and executing a corresponding interaction instruction according to the recognized number of movements of the gesture.
In some possible embodiments, the change in the ultrasonic sound field is determined based on the following steps:
controlling the electronic equipment to play a preset ultrasonic signal;
analyzing the preset ultrasonic signals collected by the electronic equipment;
and determining the change of the ultrasonic sound field according to the analyzed phase change information.
In some possible embodiments, the determining the corresponding gesture movement distance in response to the change in the ultrasonic sound field comprises:
acquiring phase variation corresponding to the variation of the ultrasonic sound field;
and determining the gesture movement distance corresponding to the phase variation according to a preset corresponding relation.
In some possible embodiments, the recognizing the gesture motion in the case that the gesture motion distance satisfies a preset condition includes:
judging whether the gesture movement distance is in a preset gesture movement interval or not;
under the condition that the gesture movement distance is within the preset gesture movement interval, recognizing the gesture;
the preset gesture motion interval is determined by the gesture motion distance and a steady-state distance, and the steady-state distance is used for representing the motion state of the gesture.
In some possible embodiments, in the case that the gesture motion distance is within the preset gesture motion interval, recognizing the gesture includes:
under the condition that the difference value between the gesture movement distance and the steady-state distance is larger than a first preset threshold value, starting to recognize the gesture;
and acquiring the duration that the absolute value of the difference value between the gesture movement distance and the steady-state distance is smaller than a second preset threshold, and stopping the recognition of the gesture under the condition that the duration exceeds a preset time threshold.
In some possible embodiments, in the case that the gesture motion distance is within the preset gesture motion interval, recognizing the gesture includes:
determining at least one distance peak value in the preset gesture motion interval in the case that the gesture movement distance is within the preset gesture motion interval;
and determining the number of movements of the gesture according to the total number of the distance peaks.
According to a second aspect of the embodiments of the present disclosure, there is provided a device interaction apparatus applied to an electronic device, the apparatus including:
the determining module is used for responding to the change of an ultrasonic sound field generated by the electronic equipment and determining the corresponding gesture movement distance;
the recognition module is used for recognizing the gesture under the condition that the gesture movement distance meets a preset condition;
and the execution module is used for determining and executing the corresponding interaction instruction according to the recognized number of movements of the gesture.
In some possible embodiments, the determining module comprises:
the ultrasonic transmitting module is used for controlling the electronic equipment to play a preset ultrasonic signal;
the ultrasonic receiving module is used for analyzing the preset ultrasonic signals collected by the electronic equipment;
the determining module is specifically configured to determine a change of the ultrasonic sound field according to the analyzed phase change information.
In some possible embodiments, the determining module further comprises:
the acquisition module is used for acquiring phase variation corresponding to the variation of the ultrasonic sound field;
the determining module is specifically configured to determine, according to a preset correspondence, a gesture movement distance corresponding to the phase change amount.
In some possible embodiments, the identification module further comprises:
the judging module is used for judging whether the gesture movement distance is in a preset gesture movement interval or not;
the recognition module is specifically configured to recognize the gesture when the gesture movement distance is within the preset gesture movement interval;
the preset gesture motion interval is determined by the gesture motion distance and a steady-state distance, and the steady-state distance is used for representing the motion state of the gesture.
In some possible embodiments, the identification module is specifically configured to:
under the condition that the difference value between the gesture movement distance and the steady-state distance is larger than a first preset threshold value, starting to recognize the gesture;
and acquiring the duration that the absolute value of the difference value between the gesture movement distance and the steady-state distance is smaller than a second preset threshold, and stopping the recognition of the gesture under the condition that the duration exceeds a preset time threshold.
In some possible embodiments, the identification module is specifically configured to:
determining at least one distance peak value in the preset gesture motion interval in the case that the gesture movement distance is within the preset gesture motion interval;
and determining the number of movements of the gesture according to the total number of the distance peaks.
According to a third aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the steps of the device interaction method provided by the first aspect of the present disclosure.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a terminal, including: a memory having a computer program stored thereon; a processor for executing the computer program in the memory to implement the steps of the device interaction method provided by the first aspect of the present disclosure.
The technical solution provided by the embodiments of the present disclosure can have the following beneficial effects: a corresponding gesture movement distance is determined by detecting a change in the ultrasonic sound field generated by the electronic device, and the gesture is recognized when the gesture movement distance satisfies a preset condition. Therefore, the accuracy of gesture recognition can be improved by setting a gesture recognition threshold. By determining and executing the corresponding interaction instruction according to the recognized number of gesture movements, more types of gestures can be recognized, with gestures of different movement counts corresponding to different interaction instructions; thus, without increasing hardware cost, the device interaction procedure is simplified and the user experience is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart illustrating a method of device interaction according to an exemplary embodiment of the present disclosure;
FIG. 2 is a schematic diagram illustrating a method of device interaction according to an exemplary embodiment of the present disclosure;
FIG. 3 is a schematic diagram illustrating a signal acquisition according to an exemplary embodiment of the present disclosure;
FIG. 4 is a schematic diagram illustrating a phase change information according to an exemplary embodiment of the present disclosure;
FIG. 5 is a flow chart illustrating another method of device interaction according to an exemplary embodiment of the present disclosure;
FIG. 6 is a flow chart illustrating yet another method of device interaction according to an exemplary embodiment of the present disclosure;
FIG. 7 is a schematic diagram illustrating another phase change information according to an exemplary embodiment of the present disclosure;
FIG. 8 is a schematic diagram illustrating a statistical algorithm for a number of gesture movements according to an exemplary embodiment of the present disclosure;
FIG. 9 is a flowchart illustrating a contactless device interaction method according to an exemplary embodiment of the present disclosure;
FIG. 10 is a schematic diagram illustrating a configuration of a device interaction apparatus, according to an exemplary embodiment of the present disclosure;
FIG. 11 is a block diagram illustrating a terminal 900 according to an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
In the description that follows, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or order.
With the development of voice interaction technology, users can keep both hands free by controlling devices by voice in many scenarios. However, in some scenarios, for example stopping the playback of a song, the user must first wake up the smart device by voice and then input a voice command to stop playback, which makes the interaction somewhat cumbersome. Therefore, other more convenient ways of operating the device need to be introduced. In the related art, device interaction can be controlled through gestures; however, gesture recognition suffers from poor accuracy and easy false triggering, so the accuracy of device interaction control is not high.
In order to overcome the problems in the related art, the present disclosure provides a device interaction method, apparatus, storage medium, and terminal, and the present disclosure is described below with reference to specific embodiments.
Fig. 1 is a flowchart illustrating a device interaction method according to an exemplary embodiment. The method is applied to an electronic device, which may be a fixed device or a mobile device, for example: a smart terminal, a smart tablet, a smart wearable device, a personal digital assistant, a personal computer, or other electronic equipment.
As shown in fig. 1, the method includes:
step S101, responding to the change of an ultrasonic sound field generated by the electronic equipment, and determining a corresponding gesture movement distance;
step S102, under the condition that the gesture movement distance meets a preset condition, recognizing the gesture;
and S103, determining and executing a corresponding interaction instruction according to the recognized number of movements of the gesture.
The equipment interaction method in the embodiment of the disclosure is realized in an ultrasonic mode. In some possible embodiments, the electronic device may include an ultrasonic wave emitting device and an ultrasonic wave receiving device for sensing a sound field change of the ultrasonic wave. For example, the electronic device may include a speaker and a microphone.
Fig. 2 is a schematic diagram illustrating a device interaction method according to an exemplary embodiment of the present disclosure.
As shown in fig. 2, the electronic device is taken to be a smart speaker device as an example. The smart speaker device is provided with a speaker and a microphone, through which it can sense the change of the ultrasonic sound field, judge whether a control gesture exists, and further determine the gesture movement distance. When the gesture movement distance satisfies a preset condition, the gesture is recognized, and a corresponding interaction instruction is determined and executed according to the recognized number of movements.
In some possible embodiments, when the smart speaker device is operating, the user may tap once with a hand above the smart speaker device without touching it; the smart speaker device then wakes up and directly enters the speech recognition stage, so the user only needs to input a voice instruction directly instead of first speaking a wake-up word, waiting for the smart speaker device to respond, and only then entering the speech recognition stage. When the smart speaker device is playing music, the user may tap twice in the air above it, and the smart speaker device executes an interaction instruction to stop playing the music; when the user taps twice again, the smart speaker device executes an interaction instruction to resume playing the music. When the user taps three times, the smart device can broadcast the current time and the day's weather. The gesture capable of waking up the electronic device may be set as required, for example a "tap" gesture or a "wave" gesture. Likewise, the number of movements of the gesture and the interaction instruction executed by the electronic device may be set as required, which is not limited by the present disclosure.
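The mapping from recognized movement counts to actions described above can be sketched as a simple dispatch table. This is a minimal illustration only; the instruction names below are hypothetical and not part of the disclosure:

```python
# Hypothetical mapping of recognized movement counts to interaction
# instructions, following the examples in the text above.
INSTRUCTIONS = {
    1: "wake_and_listen",        # one tap: wake up, enter speech recognition
    2: "toggle_playback",        # two taps: stop or resume music
    3: "announce_time_weather",  # three taps: broadcast time and weather
}

def dispatch(movement_count):
    # Unrecognized counts are ignored rather than falsely triggering.
    return INSTRUCTIONS.get(movement_count, "no_op")
```

A lookup table like this makes adding new gesture counts a one-line change, which matches the disclosure's point that the count-to-instruction mapping can be set as required.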
According to the device interaction method described above, the corresponding gesture movement distance is determined by detecting the change of the ultrasonic sound field generated by the electronic device, and the gesture is recognized when the gesture movement distance satisfies the preset condition, so the accuracy of gesture recognition can be improved by setting a gesture recognition threshold. The corresponding interaction instruction is determined and executed according to the recognized number of gesture movements, so more types of gestures can be recognized, with gestures of different movement counts corresponding to different interaction instructions; thus, without increasing hardware cost, the device interaction procedure is simplified and the user experience is improved.
In some possible embodiments, the change in the ultrasonic sound field is determined based on the following steps:
controlling the electronic equipment to play a preset ultrasonic signal;
analyzing the preset ultrasonic signals collected by the electronic equipment;
and determining the change of the ultrasonic sound field according to the analyzed phase change information.
It should be noted that the electronic device may play a preset ultrasonic signal through an ultrasonic transmitting device such as a speaker, and then collect the preset ultrasonic signal reflected back within the space where the electronic device is located through an ultrasonic receiving device such as a microphone. When there is no gesture motion, the sound field near the electronic device is stable; when there is gesture motion, the sound field fluctuates accordingly. The electronic device is controlled to analyze the collected preset ultrasonic signal, and the change of the ultrasonic sound field can be determined from the phase change information obtained by the analysis. For example, taking a "tap" gesture as an example, the waving motion of the hand changes the ultrasonic phase information collected by the microphone, and the change of the ultrasonic sound field can be determined through the phase change.
In some embodiments, at least two single-frequency signals may be selected to form the predetermined ultrasonic signal, and the frequency bands of the single-frequency signals are different. The frequency band selection of the single-frequency signal is not limited in the embodiment of the disclosure, and the selection can be performed according to the requirement of subsequent processing. For example, two single-frequency signals having frequencies of 22kHz and 22.7kHz may be selected to be combined, or two single-frequency signals having frequency bands separated by an interval of 350Hz may be selected to be adapted to parameters of a subsequent filtering process. After preset ultrasonic signals are played, the preset ultrasonic signals are transmitted through different paths and are picked up by an ultrasonic receiving device of the electronic equipment, such as a microphone; and analyzing the acquired ultrasonic signals to obtain corresponding phase change information, thereby representing the change condition of the ultrasonic sound field.
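The composition of the preset ultrasonic signal from single-frequency tones can be sketched as below. The 48 kHz sample rate and buffer length are assumptions for illustration; only the 22 kHz and 22.7 kHz example frequencies come from the text:

```python
import math

def make_probe_signal(freqs_hz=(22000.0, 22700.0), fs=48000, n=480):
    # Sum of single-frequency tones forming the preset ultrasonic signal.
    # Sample rate fs and length n are illustrative, not fixed by the disclosure.
    return [sum(math.sin(2 * math.pi * f * t / fs) for f in freqs_hz)
            for t in range(n)]

signal = make_probe_signal()
```

Both example tones sit below the 24 kHz Nyquist limit of a 48 kHz sample rate, which is one reason near-ultrasonic carriers around 22 kHz are practical on ordinary speaker and microphone hardware.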
Fig. 3 is a schematic diagram illustrating a signal acquisition according to an exemplary embodiment of the present disclosure.
Fig. 4 is a schematic diagram illustrating phase change information according to an exemplary embodiment of the disclosure, in which curve A represents the phase change information, curve B represents the distance change information, and curve C represents the distance change information after Gaussian smoothing.
Signals collected by the microphone are shown in fig. 3, and the corresponding phase information change can be obtained by analyzing the sound signals and is shown as a curve a in fig. 4. As can be seen from fig. 3 and 4, in the case where the ultrasonic sound field is stable, the phase is substantially constant; when the ultrasonic sound field changes, the phase changes. Therefore, the phase change information obtained by analysis can be used for representing the change situation of the ultrasonic sound field.
In some possible embodiments, the determining the corresponding gesture movement distance in response to the change of the ultrasonic sound field includes:
acquiring phase variation corresponding to the variation of the ultrasonic sound field;
and determining the gesture movement distance corresponding to the phase variation according to a preset corresponding relation.
As shown in fig. 4, the distance profile of the gesture can be mapped according to the phase information. When the sound field near the electronic equipment has no gesture motion, the phase is basically constant, and the corresponding distance information is also stable; when the sound field near the electronic device has gesture motion, the phase changes, and the gesture motion distance corresponding to the phase change amount can be determined according to the preset corresponding relation.
In some embodiments, the preset correspondence between the phase variation and the gesture movement distance may be set as the following formula 1:
Δd = (Δθ / 2π) × (λ / 2) = (Δθ × λ) / (4π)    (Formula 1)

where Δd in Formula 1 is the gesture movement distance, Δθ is the phase variation, and λ is the wavelength of the ultrasonic signal; the factor of two in λ/2 accounts for the round trip of the reflected signal.
The change of the gesture movement distance corresponding to the phase variation is shown by the curve B in fig. 4.
In some embodiments, signal processing may be performed on the gesture movement distance corresponding to the phase change amount. For example, Gaussian smoothing may be applied to the distance change curve representing the gesture movement distance, which can effectively filter out distance changes caused by short-lived sudden disturbances and improve the overall robustness. The change of the gesture movement distance after Gaussian smoothing is shown by curve C in fig. 4.
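The phase-to-distance conversion and the Gaussian smoothing step can be sketched as follows. The speed of sound, the carrier frequency, the round-trip relation Δd = Δθ·λ/(4π), and the kernel parameters are assumptions for illustration; the disclosure does not fix them:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed)

def phase_to_distance(delta_theta, freq_hz=22000.0):
    # Assumed round-trip relation: delta_d = delta_theta * wavelength / (4*pi),
    # since a hand movement of delta_d changes the reflected path by 2*delta_d.
    wavelength = SPEED_OF_SOUND / freq_hz
    return delta_theta * wavelength / (4 * math.pi)

def gaussian_smooth(samples, sigma=2.0, radius=4):
    # Filter short sudden disturbances out of the distance change curve.
    kernel = [math.exp(-0.5 * (k / sigma) ** 2)
              for k in range(-radius, radius + 1)]
    norm = sum(kernel)
    out = []
    for i in range(len(samples)):
        acc = 0.0
        for k in range(-radius, radius + 1):
            j = min(max(i + k, 0), len(samples) - 1)  # clamp at the edges
            acc += kernel[k + radius] * samples[j]
        out.append(acc / norm)
    return out
```

With these assumptions, a full phase cycle of Δθ = 4π maps to one wavelength of hand movement, about 1.6 cm at 22 kHz, which is why small gestures are resolvable.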
Fig. 5 is a flowchart illustrating another device interaction method according to an exemplary embodiment of the present disclosure.
In some possible embodiments, in step S102, in the case that the gesture movement distance satisfies a preset condition, recognizing the gesture movement includes:
step S501, judging whether the gesture movement distance is in a preset gesture movement interval or not;
step S502: under the condition that the gesture movement distance is within the preset gesture movement interval, recognizing the gesture;
the preset gesture motion interval is determined by the gesture motion distance and a steady-state distance, and the steady-state distance is used for representing the motion state of the gesture.
In some possible embodiments, by analyzing fluctuation information of the ultrasonic signal collected by the electronic device, a gesture movement distance and a movement acceleration may be acquired, and when the gesture movement distance or the movement acceleration exceeds a set threshold, it is determined that the user has made a system-specified gesture, and a response is then made. This implementation is simple, since the judgment uses a fixed threshold; however, gestures at different distances cause different degrees of disturbance, so a fixed-threshold judgment leads to low recognition accuracy.
The device interaction method according to the embodiments of the present disclosure introduces the concept of a steady-state distance. The steady-state distance can be estimated with the Minima-Controlled Recursive Averaging (MCRA) algorithm, a preset gesture motion interval is determined from the gesture movement distance and the steady-state distance, and gesture motion analysis and gesture recognition are performed within that motion interval.
The steady-state distance is a distance parameter introduced to characterize the motion state of the gesture. Embodiments of the present disclosure employ the Minima-Controlled Recursive Averaging (MCRA) algorithm to calculate the steady-state distance while the gesture moves. The implementation logic of the method is based on two hypotheses about the presence or absence of gesture motion: when no gesture motion is present, the state is labeled H0(t), and the steady-state distance estimate is updated in real time; when gesture motion is present, the state is labeled H1(t), and the steady-state distance is not updated. In some embodiments, the steady-state distance update method may be set as the following formula 2:

d̂(t) = α · d̂(t−1) + (1 − α) · d(t),   if H0(t)
d̂(t) = d̂(t−1),   if H1(t)    (Formula 2)

where d(t) in formula 2 is the current actual distance value, d̂(t) is the estimated steady-state distance, t is time, and α is a smoothing factor whose value can be set as desired, for example to 0.98. The steady-state distance curve estimated by MCRA is shown as curve C in fig. 4.
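The steady-state update above amounts to a one-line recursive filter that freezes while a gesture is active. A minimal sketch, assuming the H0/H1 flag is supplied by the activity detector:

```python
def update_steady_distance(d_hat_prev, d_current, gesture_active, alpha=0.98):
    # Recursively average while no gesture motion is present (H0);
    # freeze the estimate while gesture motion is present (H1).
    if gesture_active:  # H1(t): do not update
        return d_hat_prev
    return alpha * d_hat_prev + (1 - alpha) * d_current  # H0(t): update
```

Freezing during H1 keeps the baseline from chasing the hand, so the deviation d(t) − d̂(t) stays a meaningful measure of gesture amplitude throughout the motion.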
Fig. 6 is a flowchart illustrating yet another device interaction method according to an exemplary embodiment of the present disclosure.
In some possible embodiments, in step S102, in the case that the gesture movement distance is within the preset gesture motion interval, recognizing the gesture includes:
step S601, starting to recognize the gesture in the case that the difference value between the gesture movement distance and the steady-state distance is greater than a first preset threshold;
step S602, acquiring the duration during which the absolute value of the difference value between the gesture movement distance and the steady-state distance is less than a second preset threshold, and stopping recognizing the gesture in the case that the duration exceeds a preset time threshold.
It should be noted that after the steady-state distance is estimated by the MCRA algorithm, a preset gesture motion interval is determined according to the gesture motion distance and the steady-state distance, the gesture motion interval is an interval for recognizing a moving gesture, and the gesture motion interval may determine the time for starting and stopping recognition of the gesture.
Fig. 7 is a schematic diagram of another phase change information according to an exemplary embodiment of the disclosure, in which curve A represents the distance change information after Gaussian smoothing, curve B represents the steady-state distance curve estimated by MCRA, curve C represents the preset gesture motion interval detection range, and curve D represents the counted number of peak points.
in some embodiments, the preset gesture motion interval may be constructed by:
Let d(t) denote the actual gesture movement distance and d̄(t) the steady-state distance, where Thr_a is the first preset threshold, Thr_b is the second preset threshold, and T_0 is the preset time threshold. The values of the first preset threshold, the second preset threshold and the preset time threshold can be set as required, which is not limited in this disclosure; for example, the first preset threshold may be set to 5 cm, the second preset threshold to 2 cm, and the preset time threshold to 200 ms.
When the amount by which the gesture movement distance d(t) exceeds the steady-state distance d̄(t) is greater than the first preset threshold Thr_a, that is, when d(t) − d̄(t) > Thr_a, it is determined that gesture movement has started and recognition of the gesture begins. This state change is characterized as H0(t) → H1(t), i.e., the state in which no gesture motion exists transitions to the state in which gesture motion exists.
After gesture recognition has been entered, when d(t) deviates from the steady-state distance d̄(t) by less than the second preset threshold Thr_b, that is, when |d(t) − d̄(t)| < Thr_b, timing starts; if the duration of this state exceeds the preset time threshold T_0, it is determined that the state in which no gesture motion exists has been entered. This state change is characterized as H1(t) → H0(t), i.e., the state in which gesture motion exists transitions to the state in which no gesture motion exists. Through this activity detection algorithm, a gesture motion interval can be effectively delimited: the interval bounded by the times of these two state changes is taken as the preset gesture motion interval, as shown by curve C in fig. 7. Detecting gesture motion within this interval improves the accuracy and robustness of gesture motion recognition.
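The start/stop logic above can be sketched as a small state machine. Distances are in metres, matching the 5 cm and 2 cm example thresholds; the time threshold is expressed in frames, which is an assumed discretization standing in for the 200 ms example.

```python
def detect_motion_interval(d, d_bar, thr_a=0.05, thr_b=0.02, t0_frames=20):
    """Return (start, stop) frame indices of one detected gesture interval,
    or None if no interval is found. d is the smoothed gesture movement
    distance, d_bar the steady-state distance, per frame."""
    start = None
    quiet = 0  # consecutive frames with |d - d_bar| < thr_b
    for i, (di, bi) in enumerate(zip(d, d_bar)):
        if start is None:
            if di - bi > thr_a:           # H0(t) -> H1(t): motion begins
                start = i
                quiet = 0
        else:
            if abs(di - bi) < thr_b:      # candidate return to steady state
                quiet += 1
                if quiet >= t0_frames:    # H1(t) -> H0(t): motion ends
                    return start, i
            else:
                quiet = 0
    return None
```

For example, a curve that rises 10 cm above a flat steady-state baseline and then settles back yields one interval whose stop index lands t0_frames after the curve returns to the baseline.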
In some possible embodiments, recognizing the gesture in the case that the gesture movement distance is within the preset gesture movement interval includes:
determining at least one distance peak within the preset gesture motion interval when the gesture movement distance is within the preset gesture movement interval;
and determining the number of movements of the gesture according to the total number of the distance peaks.
In some possible embodiments, the device interaction method of the present disclosure may further determine and execute a corresponding interaction instruction according to the number of movements of the recognized gesture. A peak statistical algorithm is used to identify the number of movements of the gesture within the preset gesture motion interval. In this method, the number of peaks appearing in the distance change curve over the whole gesture recognition process determines the number of movements of the gesture.
The peak statistical algorithm requires a third preset threshold Thr_Δ to be set in advance, and a counted distance peak d_peak(t) needs to satisfy the following conditions: in both the left and the right adjacent intervals of each distance peak there exists a distance value that differs from the peak by more than the third preset threshold. These conditions can be understood as: 1) the peak value d_peak(t) is the maximum value within its local interval; 2) within the local interval, there are distance values on both the left and the right of the peak point that are less than d_peak(t) − Thr_Δ. The number of distance peaks is then counted, and the number of movements of the gesture is determined from the total number of distance peaks. The third preset threshold Thr_Δ can be set as required; for example, it may be set to 5 cm. Counting the peaks feeds back the number of times the user performed the preset gesture; for example, when the preset gesture is a 'one beat' gesture, the number of 'beats' performed by the user is fed back according to the counted number of peaks. After the number of 'beats' is obtained, this information can be fed back to the upper-layer service logic, and a corresponding interaction state is entered according to the scheme configured by the user, enabling a more convenient interaction experience.
As shown in fig. 7, the number of peak points counted within the detection range of the preset gesture motion interval is 4, indicating that the preset gesture was performed 4 times. The device can then determine the current interaction instruction according to a preset correspondence between the number of movements of the preset gesture and interaction instructions, and execute that instruction. By introducing the third preset threshold to qualify the distance peaks within the preset gesture motion interval, the peak statistical algorithm of the disclosed embodiments rejects ultrasonic fluctuation interference and further improves the accuracy of detecting the number of gesture movements.
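A minimal sketch of the peak statistical algorithm under the two conditions above. The local-window size is an assumption, the 0.05 m threshold follows the 5 cm example, and the sketch assumes distinct peak values.

```python
def count_peaks(d, thr_delta=0.05, window=3):
    """Count distance peaks: a sample is a peak if it is the maximum of its
    local window AND some sample on each side within the window falls below
    peak - thr_delta (a prominence check against ultrasonic fluctuation)."""
    n = len(d)
    count = 0
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        if d[i] != max(d[lo:hi]):
            continue  # condition 1: maximum of the local interval
        left, right = d[lo:i], d[i + 1:hi]
        # condition 2: both sides dip below d_peak(t) - Thr_delta
        if left and right and min(left) < d[i] - thr_delta and min(right) < d[i] - thr_delta:
            count += 1
    return count
```

On a curve with four well-separated bumps of roughly 10 cm, this returns 4, matching the four-peak example of fig. 7.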
In other embodiments, a peak-valley statistical algorithm may also be used to determine the number of movements of the preset gesture. Still taking the preset gesture as a 'one beat' gesture as an example: while the device is running, a 'one beat' detection phase starts when the detected change in gesture distance exceeds a threshold, and detection ends when the change in distance has stayed below the threshold for a period of time. The peak-valley statistical algorithm then counts how many peaks and valleys the distance change curve contains over the whole 'one beat' process, and the number of 'beats' performed by the user is fed back through the counted number of peaks/valleys.
Fig. 8 shows a schematic diagram of a statistical algorithm for the number of gesture movements. As shown in fig. 8, 4 peaks and 3 valleys are counted in the 'one beat' distance change curve, indicating that the user performed 4 downward hand movements in total. The number of movements of the preset gesture within the preset gesture motion interval can thus also be determined by the peak-valley statistical algorithm, so that the device can further determine and execute the interaction instruction according to the number of movements of the preset gesture.
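The peak-valley variant can be sketched by counting strict local maxima and minima of the distance curve:

```python
def count_peaks_and_valleys(d):
    """Peak/valley statistics sketch: count strict local maxima (peaks)
    and strict local minima (valleys) of the distance change curve.
    A result of 4 peaks and 3 valleys reports 4 downward hand movements."""
    peaks = valleys = 0
    for i in range(1, len(d) - 1):
        if d[i] > d[i - 1] and d[i] > d[i + 1]:
            peaks += 1
        elif d[i] < d[i - 1] and d[i] < d[i + 1]:
            valleys += 1
    return peaks, valleys
```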
Fig. 9 illustrates a contactless device interaction method. The method uses the loudspeaker and microphone of a smart device: the loudspeaker plays a designed high-frequency sound signal (ultrasonic wave), which propagates along different paths after being played and is picked up by the device's microphone. After obtaining the sound signal collected by the microphone, the device's processor analyzes it further. By computational analysis of the sound signal, the corresponding phase information can be obtained, and from the phase information the distance change of the hand can be derived. When the hand is not moving, the phase is essentially constant and the corresponding distance information is stable; when the hand moves, the phase changes, so the relative movement distance of the hand can be determined. Gaussian smoothing of the distance change curve effectively filters out short transient interference caused by movement of the device itself. A Minima-Controlled Recursive Averaging (MCRA) algorithm is then applied to the smoothed distance curve to estimate the steady-state distance when the hand is not moving, and a possible motion interval is estimated from the steady-state distance. The number of gesture movements within the motion interval is counted by the peak statistical algorithm, and the device determines and executes the interaction instruction according to the number of gesture movements and a pre-stored correspondence between numbers of gesture movements and interaction instructions.
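The front end of this pipeline, mapping phase change to distance and Gaussian-smoothing the distance curve, might look as follows. The carrier frequency, speed of sound, and kernel parameters are illustrative assumptions, and the half-wavelength-per-2π mapping assumes a round-trip (loudspeaker → hand → microphone) path.

```python
import math

def phase_to_distance(delta_phi, freq_hz=20000.0, c=343.0):
    """Map a measured phase change (radians) of the reflected ultrasonic
    tone to hand movement distance (metres). On a round-trip path a full
    2*pi phase change corresponds to half a wavelength of movement."""
    wavelength = c / freq_hz                      # ~17 mm at 20 kHz
    return delta_phi / (2 * math.pi) * wavelength / 2

def gaussian_smooth(d, sigma=2.0, radius=4):
    """Gaussian smoothing of the distance curve to suppress short
    transients caused by the device's own movement (edge-clamped)."""
    kernel = [math.exp(-(k * k) / (2 * sigma * sigma))
              for k in range(-radius, radius + 1)]
    total = sum(kernel)
    kernel = [w / total for w in kernel]          # normalize to unit sum
    n = len(d)
    out = []
    for i in range(n):
        acc = 0.0
        for k, w in zip(range(-radius, radius + 1), kernel):
            j = min(max(i + k, 0), n - 1)         # clamp at the edges
            acc += w * d[j]
        out.append(acc)
    return out
```

A full 2π phase change at 20 kHz thus maps to about 8.6 mm of hand movement, and a constant distance curve passes through the smoother unchanged.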
Unlike conventional schemes that recognize contactless gestures with a camera, the above device interaction method is implemented with ultrasound and can further be applied to devices that support voice-wake interaction. Without increasing the device's hardware cost, it makes interaction more varied and convenient and faster to respond; it can also recognize multiple types of gestures, effectively filter out false triggers caused by other interference, and provide more stable recognition robustness.
Fig. 10 is a schematic structural diagram illustrating a device interaction apparatus according to an exemplary embodiment, where the device interaction apparatus is applied to an electronic device. Referring to fig. 10, the device interaction apparatus 100 includes a determination module 101, an identification module 102, and an execution module 103.
A determining module 101, configured to determine a corresponding gesture movement distance in response to a change in an ultrasonic sound field generated by the electronic device;
the recognition module 102 is configured to recognize the gesture when the gesture movement distance meets a preset condition;
and an execution module 103, configured to determine and execute a corresponding interaction instruction according to the number of movements of the recognized gesture.
In certain embodiments, the determining module 101 comprises:
the ultrasonic transmitting module is used for controlling the electronic equipment to play a preset ultrasonic signal;
the ultrasonic receiving module is used for analyzing the preset ultrasonic signals collected by the electronic equipment;
the determining module 101 is specifically configured to determine a change of the ultrasonic sound field according to the analyzed phase change information.
In some embodiments, the determining module 101 further comprises:
the acquisition module is used for acquiring phase variation corresponding to the variation of the ultrasonic sound field;
the determining module 101 is specifically configured to determine, according to a preset corresponding relationship, a gesture movement distance corresponding to the phase change amount.
In certain embodiments, the identification module 102 comprises:
the judging module is used for judging whether the gesture movement distance is in a preset gesture movement interval or not;
the recognition module 102 is specifically configured to recognize the gesture when the gesture movement distance is within the preset gesture movement interval;
the preset gesture motion interval is determined by the gesture motion distance and a steady-state distance, and the steady-state distance is used for representing the motion state of the gesture.
In some embodiments, the identification module 102 is specifically configured to:
under the condition that the difference value between the gesture movement distance and the steady-state distance is larger than a first preset threshold value, starting to recognize the gesture;
and acquiring the duration that the absolute value of the difference value between the gesture movement distance and the steady-state distance is smaller than a second preset threshold, and stopping the recognition of the gesture under the condition that the duration exceeds a preset time threshold.
In some embodiments, the identification module 102 is specifically configured to:
determining at least one distance peak value in the preset gesture interval under the condition that the gesture movement distance is in the preset gesture movement interval;
and determining the movement times of the gesture according to the sum of the number of the distance peaks.
Fig. 11 is a block diagram illustrating a terminal 900 according to an example embodiment. For example, terminal 900 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, fitness device, personal digital assistant, and the like.
Referring to fig. 11, terminal 900 can include one or more of the following components: a processing component 902, a memory 904, a power component 906, a multimedia component 908, an audio component 910, an input/output (I/O) interface 912, a sensor component 914, and a communication component 916.
Processing component 902 generally controls overall operation of terminal 900, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 902 may include one or more processors 920 to execute instructions to perform all or some of the steps of the device interaction method described above. Further, processing component 902 can include one or more modules that facilitate interaction between processing component 902 and other components. For example, the processing component 902 can include a multimedia module to facilitate interaction between the multimedia component 908 and the processing component 902.
Memory 904 is configured to store various types of data to support operation at terminal 900. Examples of such data include instructions for any application or method operating on terminal 900, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 904 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power component 906 provides power to the various components of terminal 900. Power components 906 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for terminal 900.
The multimedia components 908 include a screen providing an output interface between the terminal 900 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 908 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the terminal 900 is in an operation mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 910 is configured to output and/or input audio signals. For example, audio component 910 includes a Microphone (MIC) configured to receive external audio signals when terminal 900 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 904 or transmitted via the communication component 916. In some embodiments, audio component 910 also includes a speaker for outputting audio signals. I/O interface 912 provides an interface between processing component 902 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 914 includes one or more sensors for providing various aspects of state assessment for the terminal 900. For example, sensor assembly 914 can detect an open/closed state of terminal 900, a relative positioning of components, such as a display and keypad of terminal 900, a change in position of terminal 900 or a component of terminal 900, the presence or absence of user contact with terminal 900, an orientation or acceleration/deceleration of terminal 900, and a change in temperature of terminal 900. The sensor assembly 914 may include a proximity sensor configured to detect the presence of a nearby object in the absence of any physical contact. The sensor assembly 914 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 914 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
Communication component 916 is configured to facilitate communications between terminal 900 and other devices in a wired or wireless manner. Terminal 900 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 916 receives a broadcast signal or broadcast associated information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 916 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the terminal 900 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the device interaction method described above.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as memory 904 comprising instructions, executable by processor 920 of terminal 900 to perform the device interaction method described above is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In another exemplary embodiment, a computer program product is also provided, which contains a computer program executable by a programmable apparatus, the computer program having code portions for performing the device interaction method described above when executed by the programmable apparatus.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (14)

1. A device interaction method is applied to an electronic device, and comprises the following steps:
responding to the change of an ultrasonic sound field generated by the electronic equipment, and determining a corresponding gesture movement distance;
under the condition that the gesture movement distance meets a preset condition, recognizing the gesture;
and determining and executing a corresponding interaction instruction according to the recognized movement times of the gesture.
2. The method of claim 1, wherein the change in the ultrasonic sound field is determined based on:
controlling the electronic equipment to play a preset ultrasonic signal;
analyzing the preset ultrasonic signals collected by the electronic equipment;
and determining the change of the ultrasonic sound field according to the analyzed phase change information.
3. The method of claim 1 or 2, wherein determining the corresponding gesture movement distance in response to the change in the ultrasonic sound field comprises:
acquiring phase variation corresponding to the variation of the ultrasonic sound field;
and determining the gesture movement distance corresponding to the phase variation according to a preset corresponding relation.
4. The method according to claim 1, wherein the recognizing the gesture in the case that the gesture movement distance satisfies a preset condition comprises:
judging whether the gesture movement distance is in a preset gesture movement interval or not;
under the condition that the gesture movement distance is within the preset gesture movement interval, recognizing the gesture;
the preset gesture motion interval is determined by the gesture motion distance and a steady-state distance, and the steady-state distance is used for representing the motion state of the gesture.
5. The method of claim 4, wherein recognizing the gesture if the gesture motion distance is within the preset gesture motion interval comprises:
under the condition that the difference value between the gesture movement distance and the steady-state distance is larger than a first preset threshold value, starting to recognize the gesture;
and acquiring the duration that the absolute value of the difference value between the gesture movement distance and the steady-state distance is smaller than a second preset threshold, and stopping the recognition of the gesture under the condition that the duration exceeds a preset time threshold.
6. The method of claim 4, wherein recognizing the gesture if the gesture motion distance is within the preset gesture motion interval comprises:
determining at least one distance peak value in the preset gesture interval under the condition that the gesture movement distance is in the preset gesture movement interval;
and determining the movement times of the gesture according to the sum of the number of the distance peaks.
7. A device interaction device applied to electronic equipment is characterized by comprising:
the determining module is used for responding to the change of an ultrasonic sound field generated by the electronic equipment and determining the corresponding gesture movement distance;
the recognition module is used for recognizing the gesture under the condition that the gesture movement distance meets a preset condition;
and the execution module is used for determining and executing the corresponding interaction instruction according to the recognized movement times of the gesture.
8. The apparatus of claim 7, wherein the determining module comprises:
the ultrasonic transmitting module is used for controlling the electronic equipment to play a preset ultrasonic signal;
the ultrasonic receiving module is used for analyzing the preset ultrasonic signals collected by the electronic equipment;
the determining module is specifically configured to determine a change of the ultrasonic sound field according to the analyzed phase change information.
9. The apparatus of claim 7 or 8, wherein the determining module further comprises:
the acquisition module is used for acquiring phase variation corresponding to the variation of the ultrasonic sound field;
the determining module is specifically configured to determine, according to a preset correspondence, a gesture movement distance corresponding to the phase change amount.
10. The apparatus of claim 9, wherein the identification module comprises:
the judging module is used for judging whether the gesture movement distance is in a preset gesture movement interval or not;
the recognition module is specifically configured to recognize the gesture when the gesture movement distance is within the preset gesture movement interval;
the preset gesture motion interval is determined by the gesture motion distance and a steady-state distance, and the steady-state distance is used for representing the motion state of the gesture.
11. The apparatus of claim 10, wherein the identification module is specifically configured to:
under the condition that the difference value between the gesture movement distance and the steady-state distance is larger than a first preset threshold value, starting to recognize the gesture;
and acquiring the duration that the absolute value of the difference value between the gesture movement distance and the steady-state distance is smaller than a second preset threshold, and stopping the recognition of the gesture under the condition that the duration exceeds a preset time threshold.
12. The apparatus of claim 10, wherein the identification module is specifically configured to:
determining at least one distance peak value in the preset gesture interval under the condition that the gesture movement distance is in the preset gesture movement interval;
and determining the movement times of the gesture according to the sum of the number of the distance peaks.
13. A computer-readable storage medium, on which computer program instructions are stored, which program instructions, when executed by a processor, carry out the steps of the method according to any one of claims 1 to 6.
14. A terminal, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to carry out the steps of the method of any one of claims 1 to 6.
CN202110236455.3A 2021-03-03 2021-03-03 Device interaction method, device interaction apparatus, storage medium and terminal Pending CN112860070A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110236455.3A CN112860070A (en) 2021-03-03 2021-03-03 Device interaction method, device interaction apparatus, storage medium and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110236455.3A CN112860070A (en) 2021-03-03 2021-03-03 Device interaction method, device interaction apparatus, storage medium and terminal

Publications (1)

Publication Number Publication Date
CN112860070A true CN112860070A (en) 2021-05-28

Family

ID=75991367

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110236455.3A Pending CN112860070A (en) 2021-03-03 2021-03-03 Device interaction method, device interaction apparatus, storage medium and terminal

Country Status (1)

Country Link
CN (1) CN112860070A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114153311A (en) * 2021-11-19 2022-03-08 北京小米移动软件有限公司 Device control method, device and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105094298A (en) * 2014-05-13 2015-11-25 华为技术有限公司 Terminal and terminal based gesture recognition method
US20160091308A1 (en) * 2014-09-30 2016-03-31 Invensense, Inc. Microelectromechanical systems (mems) acoustic sensor-based gesture recognition
CN105718064A (en) * 2016-01-22 2016-06-29 南京大学 Gesture recognition system and method based on ultrasonic waves
CN107491254A (en) * 2016-06-13 2017-12-19 中兴通讯股份有限公司 A kind of gesture operation method, device and mobile terminal
US20190041994A1 (en) * 2017-08-04 2019-02-07 Center For Integrated Smart Sensors Foundation Contactless gesture recognition system and method thereof
US20190073040A1 (en) * 2017-09-05 2019-03-07 Future Mobility Corporation Limited Gesture and motion based control of user interfaces
CN109857245A (en) * 2017-11-30 2019-06-07 腾讯科技(深圳)有限公司 A kind of gesture identification method and terminal
CN110031827A (en) * 2019-04-15 2019-07-19 吉林大学 A kind of gesture identification method based on ultrasonic distance measurement principle
US20190317606A1 (en) * 2018-04-12 2019-10-17 International Business Machines Corporation Multiple User Interaction with Audio Devices Using Speech and Gestures
CN111796792A (en) * 2020-06-12 2020-10-20 瑞声科技(新加坡)有限公司 Gesture action judgment method and device, electronic equipment and storage medium



Similar Documents

Publication Publication Date Title
CN108052079B (en) Device control method, device control apparatus, and storage medium
CN105488464B (en) Fingerprint identification method and device
EP3196736A1 (en) Method and apparatus for recognizing gesture
EP3933570A1 (en) Method and apparatus for controlling a voice assistant, and computer-readable storage medium
EP4184506A1 (en) Audio processing
US20180144176A1 (en) Fingerprint template acquisition method and device
CN109145679A (en) A kind of method, apparatus and system issuing warning information
CN108108683A (en) Touch-control response method, mobile terminal and storage medium
CN109599104A (en) Multi-beam choosing method and device
CN107958239B (en) Fingerprint identification method and device
CN106127132B (en) The reminding method and device, electronic equipment of slidingtype typing fingerprint
CN111580773A (en) Information processing method, device and storage medium
CN108090441A (en) Method for controlling fingerprint identification, mobile terminal and storage medium
CN107025041B (en) Fingerprint input method and terminal
CN107566615B (en) Message treatment method, device and computer readable storage medium
CN112860070A (en) Device interaction method, device interaction apparatus, storage medium and terminal
CN112509596A (en) Wake-up control method and device, storage medium and terminal
CN112346571A (en) Equipment control method and device and storage medium
CN114185444A (en) Method and device for preventing mistaken touch of touch screen and storage medium
CN104615457B (en) Show the method and device of control
EP3200127B1 (en) Method and device for fingerprint recognition
CN109144317A (en) The gesture detecting method and device of screen
CN112863511B (en) Signal processing method, device and storage medium
CN108877742A (en) Luminance regulating method and device
CN107948876A (en) Control the method, apparatus and medium of sound-box device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination