CN107066086A - Gesture recognition method and device based on ultrasonic waves - Google Patents
- Publication number: CN107066086A (application CN201710031687.9A)
- Authority: CN (China)
- Prior art keywords: gesture, ultrasonic, ultrasonic wave, position information, model
- Prior art date: 2017-01-17
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Embodiments of the invention disclose a gesture recognition method and device based on ultrasonic waves, applied to a smart glove. Two ultrasonic transmitters are arranged at the wrist of the smart glove, and an ultrasonic receiver is arranged on at least one fingertip. The method includes: obtaining the transmission time at which each ultrasonic transmitter emits an ultrasonic wave and the reception time at which each ultrasonic receiver receives it; determining the position of each ultrasonic receiver from the transmission time and the reception time; and building a gesture model from the position information. Embodiments of the invention address the high error rates of prior-art optical and inertial-sensor gesture recognition, effectively reducing the gesture recognition error.
Description
Technical field
Embodiments of the present invention relate to gesture recognition technology, and in particular to a gesture recognition method and device based on ultrasonic waves.
Background technology
In computer science, gesture recognition is the problem of recognizing human gestures by means of mathematical algorithms. Gestures can originate from the motion of any part of the body, but commonly refer to motions of the face and hands.
Current gesture recognition takes two main forms: optical gesture recognition and inertial-sensor gesture recognition. Optical gesture recognition captures images with a camera, then analyzes and models them algorithmically; inertial-sensor gesture recognition uses an inertial measurement unit to capture and compute motion trajectories.
The drawbacks of optical gesture recognition are that it works only within a specific region, tolerates no occlusion, and its algorithms have comparatively high error rates, particularly for users whose index and ring fingers are the same length. The drawback of inertial-sensor gesture recognition is error accumulation: over long periods of use, the accumulated error distorts recognition.
Summary of the invention
Embodiments of the present invention provide a gesture recognition method and device based on ultrasonic waves, with the aim of reducing the gesture recognition error rate.
In a first aspect, an embodiment of the invention provides a gesture recognition method based on ultrasonic waves, applied to a smart glove, where two ultrasonic transmitters are arranged at the wrist of the smart glove and an ultrasonic receiver is arranged on at least one fingertip. The method includes:
obtaining the transmission time at which each ultrasonic transmitter emits an ultrasonic wave and the reception time at which each ultrasonic receiver receives it;
determining the position of each ultrasonic receiver from the transmission time and the reception time;
building a gesture model from the position information.
Preferably, determining the position of each ultrasonic receiver from the transmission time and the reception time includes:
computing, from the transmission time and the reception time, the distance from each ultrasonic receiver to each of the two ultrasonic transmitters;
determining the position of each ultrasonic receiver relative to the two ultrasonic transmitters from the distance information.
Preferably, building a gesture model from the position information includes:
determining image points for the fingertips and the wrist from the position information;
drawing the gesture model from the image points.
Preferably, after the gesture model is built from the position information, the method further includes:
matching the gesture model against a preset gesture library that stores at least one preset gesture;
when the match succeeds, taking the matched preset gesture as the user's gesture.
Preferably, after the gesture model is matched against the preset gesture library, the method further includes:
when the match fails, issuing a match-failure prompt, or a prompt to add a new preset gesture.
In a second aspect, an embodiment of the invention further provides a gesture recognition device based on ultrasonic waves, applied to a smart glove, where two ultrasonic transmitters are arranged at the wrist of the smart glove and an ultrasonic receiver is arranged on at least one fingertip. The device includes:
an acquisition module, configured to obtain the transmission time at which each ultrasonic transmitter emits an ultrasonic wave and the reception time at which each ultrasonic receiver receives it;
a first determining module, configured to determine the position of each ultrasonic receiver from the transmission time and the reception time;
a model building module, configured to build a gesture model from the position information.
Preferably, the first determining module includes:
a computing unit, configured to compute, from the transmission time and the reception time, the distance from each ultrasonic receiver to each of the two ultrasonic transmitters;
a first determining unit, configured to determine the position of each ultrasonic receiver relative to the two ultrasonic transmitters from the distance information.
Preferably, the model building module includes:
a second determining unit, configured to determine image points for the fingertips and the wrist from the position information;
a drawing unit, configured to draw the gesture model from the image points.
Preferably, the device further includes:
a matching module, configured to match the gesture model against a preset gesture library after the gesture model is built from the position information, the preset gesture library storing at least one preset gesture;
a second determining module, configured to take the matched preset gesture as the user's gesture when the match succeeds.
Preferably, the device further includes:
a prompting module, configured to issue a match-failure prompt, or a prompt to add a new preset gesture, when the match fails.
By arranging two ultrasonic transmitters at the wrist of a smart glove and an ultrasonic receiver on at least one fingertip, obtaining the transmission time at which each transmitter emits an ultrasonic wave and the reception time at which each receiver receives it, determining each receiver's position from those times, and building a gesture model from the positions, embodiments of the invention address the high error rates of prior-art optical and inertial-sensor gesture recognition, effectively reducing the gesture recognition error.
Brief description of the drawings
Fig. 1a is a flow diagram of a gesture recognition method based on ultrasonic waves provided by embodiment one of the present invention;
Fig. 1b is a schematic diagram of the placement of the ultrasonic transmitters and ultrasonic receivers provided by embodiment one;
Fig. 2a is a flow diagram of a gesture recognition method based on ultrasonic waves provided by embodiment two;
Fig. 2b shows the preset gestures in the preset gesture library provided by embodiment two;
Fig. 3 is a structural diagram of a gesture recognition device based on ultrasonic waves provided by embodiment three.
Detailed description
The present invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the invention, not to limit it. Note also that, for ease of description, the drawings show only the parts relevant to the invention rather than the entire structure.
Embodiment one
Fig. 1a is a flow diagram of a gesture recognition method based on ultrasonic waves provided by embodiment one of the present invention. This embodiment is applicable to recognition by gesture; the method can be executed by an ultrasonic gesture recognition device, which is typically integrated in a smart wearable device and particularly suited to a smart glove. With reference to Fig. 1a, the method specifically includes:
S110: obtain the transmission time at which each ultrasonic transmitter emits an ultrasonic wave and the reception time at which each ultrasonic receiver receives it.
Here, the two ultrasonic transmitters at the wrist of the smart glove emit ultrasonic waves of a given frequency, and the ultrasonic receiver on at least one fingertip can receive the waves sent by both transmitters. Preferably, the two transmitters emit simultaneously at a fixed frequency, and each receiver can receive the waves from both transmitters at the same time.
Specifically, when the two transmitters emit an ultrasonic wave, the current time is recorded as the transmission time. The wave propagates through the air and is received by the receivers at different positions on the glove, and the time of arrival at each receiver is recorded as that receiver's reception time. Although both transmitters sit at the wrist of the glove, they occupy different positions on the wrist, so that each receiver's position can later be determined by triangulation. Because the receivers on different fingertips are at different positions, their reception times for the same wave may differ; likewise, a single receiver's reception times for the two simultaneously emitted waves may differ.
S120: determine the position of each ultrasonic receiver from the transmission time and the reception time.
Given the known propagation speed of ultrasound in air and the propagation time, the distance the wave has traveled can be computed. On this principle, the distance from each receiver to each of the two transmitters is obtained; since the distance between the two transmitters can be a preset value, the position of each receiver relative to the two transmitters can then be determined by triangulation.
S130: build a gesture model from the position information.
Because the ultrasonic receivers sit on the fingertips of the smart glove, determining each receiver's position relative to the two transmitters amounts to determining each fingertip's position relative to the wrist, from which the user's actual gesture can be reconstructed.
In the technical scheme of this embodiment, two ultrasonic transmitters are arranged at the wrist of a smart glove and an ultrasonic receiver on at least one fingertip; the transmission time of each transmitter and the reception time of each receiver are obtained, each receiver's position is determined from those times, and a gesture model is built from the positions. This addresses the high error rates of prior-art optical and inertial-sensor gesture recognition, effectively reducing the gesture recognition error.
On the basis of the above, S120 (determining the position of each ultrasonic receiver from the transmission time and the reception time) is preferably further refined as:
A1: compute, from the transmission time and the reception time, the distance from each ultrasonic receiver to each of the two ultrasonic transmitters.
As shown in Fig. 1b, assume the wrist of the smart glove carries ultrasonic transmitters Q1 and Q2, and each of the five fingertips carries one ultrasonic receiver, denoted F1 through F5 from the thumb to the little finger.
Taking the receiver F1 on the thumb as an example: the time at which Q1 and Q2 simultaneously emit ultrasonic waves is recorded as transmission time T, and the times at which F1 receives the two waves are recorded as reception times T1 and T2 respectively. The distance from F1 to Q1 is then a1 = V × (T1 - T), and the distance from F1 to Q2 is a2 = V × (T2 - T), where V is the propagation speed of ultrasound in air.
A2: determine the position of each ultrasonic receiver relative to the two ultrasonic transmitters from the distance information.
For example, the distance b between the two transmitters Q1 and Q2 can be a known preset value. Once the distances a1 and a2 from the thumb receiver F1 to Q1 and Q2 are determined, the position of F1, that is, the position of the thumb tip, follows by triangulation; the positions of the other fingertips are determined in the same way.
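The A1/A2 steps amount to time-of-flight ranging followed by two-circle trilateration. A minimal sketch in Python (illustrative only, not code from the patent: the speed-of-sound constant, the planar 2D geometry, and the choice of the non-negative y solution are assumptions; with only two transmitters the sign of the out-of-plane coordinate is inherently ambiguous):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed value for V)

def distances_from_times(t_emit, t1, t2, v=SPEED_OF_SOUND):
    """A1: distances from one receiver to Q1 and Q2,
    a1 = V * (T1 - T) and a2 = V * (T2 - T)."""
    return v * (t1 - t_emit), v * (t2 - t_emit)

def trilaterate_2d(a1, a2, b):
    """A2: position of a receiver given its distances a1, a2 to two
    emitters a known baseline b apart. Q1 is placed at (0, 0) and Q2
    at (b, 0); the solution with y >= 0 is returned."""
    x = (a1 ** 2 - a2 ** 2 + b ** 2) / (2 * b)
    y = math.sqrt(max(a1 ** 2 - x ** 2, 0.0))  # clamp tiny negatives from noise
    return x, y
```

With Q1 at the origin and Q2 at (b, 0), the receiver lies on the intersection of the two range circles, giving x = (a1² - a2² + b²) / (2b) and y = √(a1² - x²).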
On the basis of the above, S130 (building a gesture model from the position information) is preferably further refined as:
B1: determine image points for the fingertips and the wrist from the position information.
Specifically, each ultrasonic receiver's position corresponds to a fingertip position, and the positions of the two ultrasonic transmitters give the wrist position; image points for the fingertips and the wrist can be established accordingly.
B2: draw the gesture model from the image points.
Specifically, once the image points of the fingertips and the wrist are determined, the current gesture model can be drawn.
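The B1/B2 steps can be sketched as follows (illustrative only; the point names Q1/Q2 and F1..F5 follow Fig. 1b, but the wrist-midpoint drawing rule and the coordinate frame are assumptions, not specified by the patent):

```python
def build_gesture_model(tip_positions, baseline=0.05):
    """B1: collect image points -- the two wrist-mounted transmitter
    positions Q1/Q2 plus one point per fingertip, all expressed in the
    wrist coordinate frame used for trilateration."""
    points = {"Q1": (0.0, 0.0), "Q2": (baseline, 0.0)}
    points.update(tip_positions)
    # B2: a minimal "drawing" -- one segment from the wrist midpoint to
    # each fingertip; a real renderer would rasterize a hand skeleton.
    wrist_mid = (baseline / 2.0, 0.0)
    segments = [(wrist_mid, tip) for tip in tip_positions.values()]
    return points, segments
```

A caller would feed in the fingertip coordinates produced by trilateration, e.g. `build_gesture_model({"F1": (0.06, 0.08), ...})`, and hand the segments to whatever drawing layer the device uses.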
Embodiment two
Fig. 2a is a flow diagram of a gesture recognition method based on ultrasonic waves provided by embodiment two of the present invention. On the basis of the above embodiment, this embodiment further refines the operations performed after the gesture model is built from the position information. With reference to Fig. 2a, the method specifically includes:
S210: obtain the transmission time at which each ultrasonic transmitter emits an ultrasonic wave and the reception time at which each ultrasonic receiver receives it.
S220: determine the position of each ultrasonic receiver from the transmission time and the reception time.
S230: build a gesture model from the position information.
S240: match the gesture model against a preset gesture library that stores at least one preset gesture.
The preset gesture library stores a large number of preset gestures, for example those shown in Fig. 2b. Specifically, the gesture model can be compared with each preset gesture and the similarity of the two gestures judged: if the similarity reaches a preset threshold, the match is considered successful; otherwise it is not. Various algorithms can evaluate the similarity of two gestures. For example, the two gestures can be scaled to the same size and their overlapping area compared; the similarity is then the overlapping area as a percentage of the preset gesture's total area.
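The overlap-area comparison described above can be sketched on flattened binary silhouette masks of equal size. This is an illustrative reconstruction: the mask representation, the library layout, and the 0.8 threshold are assumptions; the description specifies only that similarity is the overlap area as a fraction of the preset gesture's total area.

```python
def overlap_similarity(model_mask, preset_mask):
    """Similarity of two same-sized binary gesture silhouettes: the
    overlapping area as a fraction of the preset gesture's total area."""
    overlap = sum(m and p for m, p in zip(model_mask, preset_mask))
    preset_area = sum(preset_mask)
    return overlap / preset_area if preset_area else 0.0

def match_gesture(model_mask, gesture_library, threshold=0.8):
    """Return the name of the best-matching preset gesture, or None if
    no preset reaches the similarity threshold (match failure)."""
    best_name, best_sim = None, 0.0
    for name, preset in gesture_library.items():
        sim = overlap_similarity(model_mask, preset)
        if sim > best_sim:
            best_name, best_sim = name, sim
    return best_name if best_sim >= threshold else None
```

In practice the masks would come from rasterizing the drawn gesture model and the library entries after scaling both to a common size, as the description suggests.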
S250: when the match succeeds, take the matched preset gesture as the user's gesture.
For example, if the gesture model above matches the fist gesture of panel (b) in Fig. 2b, the fist gesture of panel (b) is taken as the user's gesture.
In the technical scheme of this embodiment, two ultrasonic transmitters are arranged at the wrist of a smart glove and an ultrasonic receiver on at least one fingertip; the transmission and reception times are obtained, each receiver's position is determined from them, and a gesture model is built, addressing the high error rates of prior-art optical and inertial-sensor gesture recognition and effectively reducing the gesture recognition error. In addition, this embodiment matches the gesture model against a preset gesture library and, on success, takes the matched preset gesture as the user's gesture; the user's current gesture can thus be recognized accurately, and a corresponding command can subsequently be executed for it.
On the basis of the above, the operations after the gesture model is matched against the preset gesture library are preferably further refined as:
when the match fails, issue a match-failure prompt, or a prompt to add a new preset gesture.
Specifically, a failed match may mean that the user's current gesture is indistinct and cannot be matched to a corresponding preset gesture, or that no such preset gesture is stored in the library. In that case a match-failure prompt can be issued, and further, a prompt to add a new preset gesture. For example, the display of the smart wearable device may show "Gesture match failed, please provide a correct gesture"; after N consecutive failures, it may show "Gesture matching has failed N times in a row. Confirm adding a new gesture?". Prompting the user in this way improves the user experience.
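The prompting logic can be sketched as a small policy function (illustrative only; the message wording and the failure limit of three are assumptions, since the description leaves N and the exact prompts open):

```python
def handle_match_result(matched, fail_count, max_fails=3):
    """Return (prompt_message, new_fail_count). A successful match
    resets the failure counter; after max_fails consecutive failures
    the device offers to add the current gesture as a new preset."""
    if matched is not None:
        return "recognized gesture: %s" % matched, 0
    fail_count += 1
    if fail_count >= max_fails:
        return ("gesture matching failed %d times in a row; "
                "confirm adding a new gesture?" % fail_count), fail_count
    return "gesture match failed, please provide a correct gesture", fail_count
```

The caller would thread `fail_count` through successive recognition attempts and route the returned message to the wearable's display.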
Embodiment three
Fig. 3 is a structural diagram of a gesture recognition device based on ultrasonic waves provided by embodiment three of the present invention. This embodiment is applicable to recognition by gesture; the device is typically integrated in a smart wearable device and is particularly suited to a smart glove. With reference to Fig. 3:
The gesture recognition device based on ultrasonic waves provided by this embodiment is applied to a smart glove, where two ultrasonic transmitters are arranged at the wrist of the smart glove and an ultrasonic receiver is arranged on at least one fingertip. The device includes:
an acquisition module 310, configured to obtain the transmission time at which each ultrasonic transmitter emits an ultrasonic wave and the reception time at which each ultrasonic receiver receives it;
a first determining module 320, configured to determine the position of each ultrasonic receiver from the transmission time and the reception time;
a model building module 330, configured to build a gesture model from the position information.
In this embodiment, the first determining module 320 includes:
a computing unit, configured to compute, from the transmission time and the reception time, the distance from each ultrasonic receiver to each of the two ultrasonic transmitters;
a first determining unit, configured to determine the position of each ultrasonic receiver relative to the two ultrasonic transmitters from the distance information.
In this embodiment, the model building module 330 includes:
a second determining unit, configured to determine image points for the fingertips and the wrist from the position information;
a drawing unit, configured to draw the gesture model from the image points.
In this embodiment, the device further includes:
a matching module, configured to match the gesture model against a preset gesture library after the gesture model is built from the position information, the preset gesture library storing at least one preset gesture;
a second determining module, configured to take the matched preset gesture as the user's gesture when the match succeeds.
In this embodiment, the device further includes:
a prompting module, configured to issue a match-failure prompt, or a prompt to add a new preset gesture, when the match fails.
The gesture recognition device based on ultrasonic waves provided by this embodiment shares a common inventive concept with the gesture recognition method provided by any embodiment of the invention; it can execute that method and possesses the functional modules and beneficial effects corresponding to it.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will appreciate that the invention is not limited to the specific embodiments described here; various obvious changes, readjustments and substitutions can be made without departing from the scope of protection of the invention. Therefore, although the invention has been described in some detail through the above embodiments, it is not limited to them and may include other equivalent embodiments without departing from the inventive concept; the scope of the invention is determined by the appended claims.
Claims (10)
1. A gesture recognition method based on ultrasonic waves, applied to a smart glove, characterized in that two ultrasonic transmitters are arranged at the wrist of the smart glove and an ultrasonic receiver is arranged on at least one fingertip of the smart glove, the method comprising:
obtaining the transmission time at which each ultrasonic transmitter emits an ultrasonic wave and the reception time at which each ultrasonic receiver receives the ultrasonic wave;
determining the position of each ultrasonic receiver from the transmission time and the reception time;
building a gesture model from the position information.
2. The gesture recognition method based on ultrasonic waves according to claim 1, characterized in that determining the position of each ultrasonic receiver from the transmission time and the reception time comprises:
computing, from the transmission time and the reception time, the distance from each ultrasonic receiver to each of the two ultrasonic transmitters;
determining the position of each ultrasonic receiver relative to the two ultrasonic transmitters from the distance information.
3. The gesture recognition method based on ultrasonic waves according to claim 1, characterized in that building a gesture model from the position information comprises:
determining image points for the fingertips and the wrist from the position information;
drawing the gesture model from the image points.
4. The gesture recognition method based on ultrasonic waves according to claim 1, characterized in that, after the gesture model is built from the position information, the method further comprises:
matching the gesture model against a preset gesture library storing at least one preset gesture;
when the match succeeds, taking the matched preset gesture as the user's gesture.
5. The gesture recognition method based on ultrasonic waves according to claim 4, characterized in that, after the gesture model is matched against the preset gesture library, the method further comprises:
when the match fails, issuing a match-failure prompt, or a prompt to add a new preset gesture.
6. A gesture recognition device based on ultrasonic waves, applied to a smart glove, characterized in that two ultrasonic transmitters are arranged at the wrist of the smart glove and an ultrasonic receiver is arranged on at least one fingertip of the smart glove, the device comprising:
an acquisition module, configured to obtain the transmission time at which each ultrasonic transmitter emits an ultrasonic wave and the reception time at which each ultrasonic receiver receives the ultrasonic wave;
a first determining module, configured to determine the position of each ultrasonic receiver from the transmission time and the reception time;
a model building module, configured to build a gesture model from the position information.
7. The gesture recognition device based on ultrasonic waves according to claim 6, characterized in that the first determining module comprises:
a computing unit, configured to compute, from the transmission time and the reception time, the distance from each ultrasonic receiver to each of the two ultrasonic transmitters;
a first determining unit, configured to determine the position of each ultrasonic receiver relative to the two ultrasonic transmitters from the distance information.
8. The gesture recognition device based on ultrasonic waves according to claim 6, characterized in that the model building module comprises:
a second determining unit, configured to determine image points for the fingertips and the wrist from the position information;
a drawing unit, configured to draw the gesture model from the image points.
9. The gesture recognition device based on ultrasonic waves according to claim 6, characterized in that the device further comprises:
a matching module, configured to match the gesture model against a preset gesture library after the gesture model is built from the position information, the preset gesture library storing at least one preset gesture;
a second determining module, configured to take the matched preset gesture as the user's gesture when the match succeeds.
10. The gesture recognition device based on ultrasonic waves according to claim 9, characterized in that the device further comprises:
a prompting module, configured to issue a match-failure prompt, or a prompt to add a new preset gesture, when the match fails.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710031687.9A CN107066086A (en) | 2017-01-17 | 2017-01-17 | Gesture recognition method and device based on ultrasonic waves |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710031687.9A CN107066086A (en) | 2017-01-17 | 2017-01-17 | Gesture recognition method and device based on ultrasonic waves |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107066086A true CN107066086A (en) | 2017-08-18 |
Family
ID=59598885
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710031687.9A Pending CN107066086A (en) | 2017-01-17 | 2017-01-17 | Gesture recognition method and device based on ultrasonic waves |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107066086A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103729058A (en) * | 2013-12-20 | 2014-04-16 | 北京智谷睿拓技术服务有限公司 | Wearing type input system and input method |
CN103760970A (en) * | 2013-12-20 | 2014-04-30 | 北京智谷睿拓技术服务有限公司 | Wearable input system and method |
CN104898844A (en) * | 2015-01-23 | 2015-09-09 | 瑞声光电科技(常州)有限公司 | Gesture recognition and control device based on ultrasonic positioning and gesture recognition and control method based on ultrasonic positioning |
CN105204650A (en) * | 2015-10-22 | 2015-12-30 | 上海科世达-华阳汽车电器有限公司 | Gesture recognition method, controller, gesture recognition device and equipment |
2017-01-17: CN patent application CN201710031687.9A filed; status: Pending
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109857245A (en) * | 2017-11-30 | 2019-06-07 | 腾讯科技(深圳)有限公司 | A kind of gesture identification method and terminal |
CN109857245B (en) * | 2017-11-30 | 2021-06-15 | 腾讯科技(深圳)有限公司 | Gesture recognition method and terminal |
CN109857251A (en) * | 2019-01-16 | 2019-06-07 | 珠海格力电器股份有限公司 | Gesture identification control method, device, storage medium and the equipment of intelligent appliance |
CN109905183A (en) * | 2019-02-19 | 2019-06-18 | 杨世惟 | A kind of underwater communication device and subsurface communication method based on ultrasonic transmission |
CN109905183B (en) * | 2019-02-19 | 2021-08-10 | 杨世惟 | Underwater communication device and method based on ultrasonic transmission |
CN111103980A (en) * | 2019-12-18 | 2020-05-05 | 南京航空航天大学 | VR (virtual reality) environment interaction system and method based on FMCW (frequency modulated continuous wave) |
WO2021120971A1 (en) * | 2019-12-18 | 2021-06-24 | 南京航空航天大学 | Fmcw-based vr environment interaction system and method |
CN112121280A (en) * | 2020-08-31 | 2020-12-25 | 浙江大学 | Control method and control system of heart sound box |
CN114442803A (en) * | 2021-12-16 | 2022-05-06 | 鹏城实验室 | Data glove based on ultrasonic soft sensing |
CN114442803B (en) * | 2021-12-16 | 2024-06-21 | 鹏城实验室 | Data glove based on ultrasonic soft sensing |
WO2023230964A1 (en) * | 2022-06-01 | 2023-12-07 | 深圳市韶音科技有限公司 | Human body posture recognition system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107066086A (en) | A kind of gesture identification method and device based on ultrasonic wave | |
US10649549B2 (en) | Device, method, and system to recognize motion using gripped object | |
US8525876B2 (en) | Real-time embedded vision-based human hand detection | |
US20110115892A1 (en) | Real-time embedded visible spectrum light vision-based human finger detection and tracking method | |
CN105260024B (en) | A kind of method and device that gesture motion track is simulated on screen | |
US8292833B2 (en) | Finger motion detecting apparatus and method | |
KR100630806B1 (en) | Command input method using motion recognition device | |
US20090153499A1 (en) | Touch action recognition system and method | |
US10372223B2 (en) | Method for providing user commands to an electronic processor and related processor program and electronic circuit | |
US20140307926A1 (en) | Expression estimation device, control method, control program, and recording medium | |
US20110268365A1 (en) | 3d hand posture recognition system and vision based hand posture recognition method thereof | |
CN106127152B (en) | A kind of fingerprint template update method and terminal device | |
WO2019000817A1 (en) | Control method and electronic equipment for hand gesture recognition | |
CN105094301A (en) | Control method and device, and electronic equipment | |
US10078374B2 (en) | Method and system enabling control of different digital devices using gesture or motion control | |
CN107272892A (en) | A kind of virtual touch-control system, method and device | |
KR20210052874A (en) | An electronic device for recognizing gesture of user using a plurality of sensor signals | |
CN110262767B (en) | Voice input wake-up apparatus, method, and medium based on near-mouth detection | |
JP2016115310A (en) | Electronic apparatus | |
EP4276591A1 (en) | Interaction method, electronic device, and interaction system | |
US20150070325A1 (en) | Image control apparatus, image processing system, and computer program product | |
CN112540686A (en) | Intelligent ring, method for determining working mode of ring and electronic equipment | |
TWI629646B (en) | Gesture recognition device | |
US10901814B2 (en) | Information processing apparatus and information processing method | |
JP2020201755A (en) | Concentration degree measurement device, concentration degree measurement method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 2017-08-18