CN109871857A - Method and apparatus for identifying a gesture - Google Patents


Info

Publication number
CN109871857A
CN109871857A (application CN201711265640.5A)
Authority
CN
China
Prior art keywords
gesture
vector
similarity
bending angle
finger
Prior art date
Legal status
Pending
Application number
CN201711265640.5A
Other languages
Chinese (zh)
Inventor
王维辉
杨星
何莉
赵如彦
Current Assignee
Bosch Automotive Products Suzhou Co Ltd
Original Assignee
Bosch Automotive Products Suzhou Co Ltd
Priority date
Filing date
Publication date
Application filed by Bosch Automotive Products Suzhou Co Ltd filed Critical Bosch Automotive Products Suzhou Co Ltd
Priority claimed from application CN201711265640.5A
Publication of CN109871857A


Landscapes

  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to human-computer interaction technology, and in particular to a method for identifying a gesture, a device implementing the method, a wearable glove comprising the device, and a computer storage medium implementing the method. According to one aspect of the invention, the method for identifying a gesture comprises the steps of: generating a first vector associated with the bending angles of one or more finger joints and a second vector associated with the spatial orientation of the palm; determining the similarity of the first vector and the second vector to respective reference vectors; and identifying the gesture according to the similarity, wherein the bending angles and the spatial orientation are obtained using sensors.

Description

Method and apparatus for identifying a gesture
Technical field
The present invention relates to human-computer interaction technology, and in particular to a method for identifying a gesture, a device and system implementing the method, a wearable device comprising the device, and a computer storage medium implementing the method.
Background technique
Gesture recognition is an important aspect of human-computer interaction. Its purpose is to allow users to control or interact with devices using simple gestures, thereby establishing a richer and more convenient means of communication between humans and machines than text-based and graphical user interfaces.
In the prior art, whether a gesture is static or dynamic, the recognition process generally comprises the following steps: acquisition of gesture images, gesture detection and segmentation, gesture analysis, and gesture recognition. Gesture segmentation is the key step in the recognition process, and its result directly affects the performance of the subsequent steps. Commonly used segmentation methods are mainly monocular-vision-based and stereo-vision-based: the former acquires the gesture with a single image-capture device and builds a planar model of the gesture, while the latter acquires different images of the gesture with multiple image-capture devices and converts them into a three-dimensional model. Gesture recognition itself is the process of classifying trajectories in the model parameter space into subsets of that space; common recognition algorithms include template matching, neural networks, and hidden Markov models.
In practical applications, gesture recognition is often disturbed by environmental factors that cause misrecognition (for example, lighting that is too bright or too dark, or a small difference between the gesture and the background, either of which is likely to make gesture segmentation inaccurate). In addition, the above recognition algorithms need to be trained on large amounts of data, which is time-consuming and laborious. There is therefore an urgent need for a gesture recognition method and device that overcome these drawbacks of the prior art.
Summary of the invention
It is an object of the present invention to provide a method for identifying a gesture that has the advantages of being easy to implement and having a high recognition accuracy.
According to one aspect of the invention, the method for identifying a gesture comprises the steps of:
generating a first vector associated with the bending angles of one or more finger joints and a second vector associated with the spatial orientation of the palm;
determining the similarity of the first vector and the second vector to respective reference vectors; and
identifying the gesture according to the similarity,
wherein the bending angles and the spatial orientation are obtained using sensors.
Preferably, in the above method, the bending angles are obtained by arranging sensors on a wearable device in regions corresponding to the phalanges adjacent to the finger joints.
Preferably, in the above method, the spatial orientation is obtained by arranging a sensor on the wearable device in a region corresponding to the back or the palm of the hand.
Preferably, in the above method, the second vector takes the form of a quaternion.
Preferably, in the above method, the similarity between the first vector and its reference vector is measured by the Euclidean distance between the two, and the similarity between the second vector and its reference vector is measured by the angle between the two.
According to another aspect of the invention, the method for identifying a gesture comprises the steps of:
generating a first vector associated with the bending angles of one or more finger joints;
determining the similarity of the first vector to a reference vector; and
identifying the gesture according to the similarity,
wherein the bending angles are obtained using sensors.
It is a further object of the present invention to provide a device for identifying a gesture that has the advantages of being easy to implement and having a high recognition accuracy.
According to another aspect of the invention, the device comprises:
a first module for generating a first vector associated with the bending angles of one or more finger joints and a second vector associated with the spatial orientation of the palm;
a second module for determining the similarity of the first vector and the second vector to respective reference vectors; and
a third module for identifying the gesture according to the similarity,
wherein the bending angles and the spatial orientation are obtained using sensors.
According to another aspect of the invention, the device comprises:
a first module for generating a first vector associated with the bending angles of one or more finger joints;
a second module for determining the similarity of the first vector to a reference vector; and
a third module for identifying the gesture according to the similarity,
wherein the bending angles are obtained using sensors.
A device according to another aspect of the invention comprises a memory, a processor, and a computer program stored on the memory and runnable on the processor, wherein the program is executed to implement the method described above.
It is also an object of the present invention to provide a wearable device that is easy to implement and highly accurate when identifying gestures.
According to another aspect of the invention, the wearable device comprises:
a first sensor arranged on the wearable device in a region corresponding to the phalanges adjacent to the finger joints, to obtain the bending angles of the finger joints;
a second sensor arranged on the wearable device in a region corresponding to the back or the palm of the hand, to obtain the spatial orientation of the palm; and
a device for identifying a gesture, comprising a memory, a processor, and a computer program stored on the memory and runnable on the processor, wherein the program is executed to implement the following steps:
generating a first vector associated with the bending angles of one or more finger joints and a second vector associated with the spatial orientation of the palm;
determining the similarity of the first vector and the second vector to respective reference vectors; and
identifying the gesture according to the similarity.
According to another aspect of the invention, the wearable device comprises:
a first sensor arranged on the wearable device in a region corresponding to the phalanges adjacent to the finger joints, to obtain the bending angles of the finger joints; and
a device for identifying a gesture, comprising a memory, a processor, and a computer program stored on the memory and runnable on the processor, wherein the program is executed to implement the following steps:
generating a first vector associated with the bending angles of one or more finger joints;
determining the similarity of the first vector to a reference vector; and
identifying the gesture according to the similarity.
It is a further object of the present invention to provide a system for identifying a gesture that has the advantages of being easy to implement and having a high recognition accuracy.
According to another aspect of the invention, the system for identifying a gesture comprises:
a wearable device, comprising:
a first sensor arranged on the wearable device in a region corresponding to the phalanges adjacent to the finger joints, to obtain the bending angles of the finger joints; and
a second sensor arranged on the wearable device in a region corresponding to the back or the palm of the hand, to obtain the spatial orientation of the palm; and
a computing device comprising a memory, a processor, and a computer program stored on the memory and runnable on the processor, wherein the program is executed to implement the following steps:
generating a first vector associated with the bending angles of one or more finger joints and a second vector associated with the spatial orientation of the palm;
determining the similarity of the first vector and the second vector to respective reference vectors; and
identifying the gesture according to the similarity.
According to a further aspect of the invention, the system for identifying a gesture comprises:
a wearable device, comprising:
a first sensor arranged on the wearable device in a region corresponding to the phalanges adjacent to the finger joints, to obtain the bending angles of the finger joints; and
a computing device comprising a memory, a processor, and a computer program stored on the memory and runnable on the processor, wherein the program is executed to implement the following steps:
generating a first vector associated with the bending angles of one or more finger joints;
determining the similarity of the first vector to a reference vector; and
identifying the gesture according to the similarity.
The present invention also provides a computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method described above.
In the present invention, gesture features are characterized by the bending angles of the finger joints and the spatial orientation of the palm. Since both the bending angles and the spatial orientation are measured with sensors, recognition accuracy is improved compared with the prior art, and the complexity introduced by image recognition algorithms is avoided. Moreover, since sensors are well suited to being arranged on a wearable device, the gesture identification device of the invention can readily be integrated into a wearable device.
Detailed description of the invention
The above and other aspects and advantages of the invention will become clearer and easier to understand from the following description of its various aspects in conjunction with the accompanying drawings, in which the same or similar elements are denoted by the same reference numerals. The drawings include:
Fig. 1 is a schematic diagram illustrating the distribution of multiple sensors on a wearable device.
Fig. 2 is a schematic diagram of the method for identifying a gesture according to one embodiment of the invention.
Fig. 3 is a schematic diagram of the method for identifying a gesture according to another embodiment of the invention.
Fig. 4 is a schematic block diagram of the device for identifying a gesture according to another embodiment of the invention.
Fig. 5 is a schematic block diagram of the device for identifying a gesture according to a further embodiment of the invention.
Fig. 6 is a schematic block diagram of the device for identifying a gesture according to a further embodiment of the invention.
Fig. 7 is a schematic block diagram of the wearable device according to a further embodiment of the invention.
Fig. 8 is a schematic block diagram of the wearable device according to a further embodiment of the invention.
Fig. 9 is a schematic block diagram of the system for identifying a gesture according to a further embodiment of the invention.
Fig. 10 is a schematic block diagram of the system for identifying a gesture according to a further embodiment of the invention.
Specific embodiment
The invention is described more fully below with reference to the accompanying drawings, which illustrate exemplary embodiments of the invention. The invention may, however, be embodied in different forms and should not be construed as limited to the embodiments set forth herein. These embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art.
In this specification, terms such as "comprising" and "including" indicate that, in addition to the elements and steps directly and explicitly stated in the specification and claims, the technical solution of the invention does not exclude other elements and steps that are not directly or explicitly stated.
In this specification, terms such as "first" and "second" do not denote an ordering of elements in time, space, or size, but are used only to distinguish one element from another.
In this specification, "coupled" should be understood to cover the case where electrical energy or an electrical signal is transmitted directly between two elements, as well as the case where it is transmitted indirectly via one or more third elements.
In this specification, the "bending angle" of a finger joint refers to the relative angle between the phalanges adjacent to the joint, and the "spatial orientation" of the palm refers to the normal direction of the plane in which the palm lies.
According to one aspect of the invention, gesture features are characterized by the bending angles of one or more finger joints; that is, the bending angles serve as the feature parameters for gesture recognition. To improve recognition accuracy and support a diverse set of gesture types, gesture features may also be characterized by the combination of the bending angles of the finger joints and the spatial orientation of the palm, i.e. both the bending angles and the spatial orientation serve as feature parameters for gesture recognition.
According to another aspect of the invention, the bending angles of the finger joints and the spatial orientation of the palm are measured using sensors. Preferably, sensors may be arranged on a wearable device in regions corresponding to the phalanges adjacent to the finger joints to measure the Euler angles of the phalanges, from which the angle between the phalanges, i.e. the bending angle of the joint, can be determined. Furthermore, a sensor may preferably be arranged on the wearable device in a region corresponding to the back or the palm of the hand to measure the spatial orientation of the palm.
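As an illustrative sketch (not part of the patent text), the bending angle of a joint can be recovered from the Euler angles measured on the two adjoining phalanges by converting each set of Euler angles into the phalanx's pointing direction and taking the angle between the two directions; the Z-Y-X (yaw-pitch-roll) convention used here is an assumption, since the patent does not fix one:

```python
import math

def direction_from_euler(yaw, pitch, roll):
    """Unit vector along the phalanx's local x-axis after a Z-Y-X
    (yaw-pitch-roll) rotation; roll does not move the x-axis."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    return (cy * cp, sy * cp, -sp)

def bending_angle(euler_a, euler_b):
    """Relative angle (radians) between the two phalanges adjoining a joint."""
    ax, ay, az = direction_from_euler(*euler_a)
    bx, by, bz = direction_from_euler(*euler_b)
    dot = max(-1.0, min(1.0, ax * bx + ay * by + az * bz))
    return math.acos(dot)

# Proximal phalanx level, distal phalanx pitched down by 60 degrees:
angle = bending_angle((0.0, 0.0, 0.0), (0.0, math.radians(60), 0.0))
print(round(math.degrees(angle)))  # -> 60
```

The clamp on the dot product guards against floating-point values slightly outside [-1, 1] before `acos`.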
Fig. 1 is a schematic diagram illustrating the distribution of multiple sensors on a wearable device, where the digits denote the sensor numbers. As can be seen from Fig. 1, sensors 1-11 are arranged on the phalanges on either side of the finger joints, and sensor 12 is arranged on the back of the hand.
The above bending angles and spatial orientation can be measured with various kinds of sensors. In one embodiment of the invention, a MEMS sensor may illustratively be used, which can measure the acceleration of an object along the three axes of a Cartesian coordinate system, the angular velocity of the object's rotation about those three axes, and the magnetic field along those three axes.
Fig. 2 is a schematic diagram of the method for identifying a gesture according to one embodiment of the invention.
As shown in Fig. 2, in step 210 the device for identifying a gesture receives the Euler angles of the finger phalanges and of the back of the hand, measured by the sensors mounted on the wearable device. As described above, the sensors may be arranged on the wearable device in the regions corresponding to the phalanges adjacent to the finger joints and in a region corresponding to the back or the palm of the hand.
The method then proceeds to step 220, in which the device for identifying a gesture generates a first vector and a second vector from the sensor measurements, wherein each element of the first vector represents the bending angle of one finger joint and the second vector represents the spatial orientation of the palm.
Illustratively, the first vector is denoted below as J_r = {J_r(1), J_r(2), ..., J_r(i), ..., J_r(n)}, where i is the index of a finger joint, n is the number of finger-joint feature parameters used for gesture recognition, and J_r(i) represents the bending angle of the i-th finger joint, i.e. the angle between the phalanges on either side of that joint. The reference vector of the first vector corresponding to the k-th gesture type is denoted as J_k = {J_k(1), J_k(2), ..., J_k(i), ..., J_k(n)}, where J_k(i) represents the i-th component of the reference vector.
In this embodiment, the second vector preferably takes the form of a quaternion, i.e. Q_r = {Q_1, Q_2, Q_3, Q_4}. Although quaternions and Euler angles are mathematically equivalent, using the quaternion form to represent the spatial orientation of the palm is more convenient for computing orientation transformations, and it avoids the gimbal-lock singularities that can interrupt the computation when Euler angles are used.
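The Euler-to-quaternion conversion alluded to here can be sketched as follows; this is the standard Z-Y-X (yaw-pitch-roll) conversion, and the choice of convention is an assumption, since the patent does not specify one:

```python
import math

def euler_to_quaternion(yaw, pitch, roll):
    """Quaternion (w, x, y, z) for a Z-Y-X (yaw-pitch-roll) rotation.
    Unlike Euler angles, this representation has no gimbal-lock singularity."""
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    w = cr * cp * cy + sr * sp * sy
    x = sr * cp * cy - cr * sp * sy
    y = cr * sp * cy + sr * cp * sy
    z = cr * cp * sy - sr * sp * cy
    return (w, x, y, z)

q = euler_to_quaternion(0.0, 0.0, 0.0)
print(q)  # -> (1.0, 0.0, 0.0, 0.0): the identity rotation
```

The resulting quaternion is always unit-length, which is what the angle-based similarity used later assumes.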
The method then proceeds to step 230, in which the device for identifying a gesture determines the similarity of the first vector J_r and the second vector Q_r to their respective reference vectors. It should be noted that for every gesture to be recognized there is a corresponding pair of reference vectors, one for the first vector and one for the second vector; the first and second vectors therefore have multiple similarities, each corresponding to one gesture type.
Preferably, for each gesture, the similarity between the first vector J_r and the corresponding reference vector J_k can be measured by the Euclidean distance between them, where k denotes the index of the gesture type. Specifically, the similarity S_k1 with the k-th gesture type can be calculated with the following formula:

S_k1 = sqrt( Σ_{i=1..n} (J_r(i) − J_k(i))² )    (1)

Here, J_r(i) is the i-th element of the first vector J_r, J_k(i) is the i-th element of the reference vector J_k of the k-th gesture type for the first vector, and n is the number of elements of the first vector.
Preferably, the similarity between the second vector Q_r and the corresponding reference vector Q_k can be measured by the angle between them, expressed through its cosine. Specifically, the similarity S_k2 with the k-th gesture type can be calculated with the following formula:

S_k2 = ( Σ_{i=1..4} Q_r(i)·Q_k(i) ) / ( |Q_r|·|Q_k| )    (2)

Here, Q_r(i) is the i-th element of the second vector Q_r, and Q_k(i) is the i-th element of the reference vector Q_k for the second vector.
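Under the definitions above, the two similarity measures might be implemented as follows (a non-authoritative sketch: the Euclidean distance for the joint-angle vector, where smaller means more similar, and the cosine of the angle between quaternions, where a value near 1 means more similar):

```python
import math

def euclidean_similarity(j_r, j_k):
    """S_k1: Euclidean distance between the first vector and the
    reference vector of gesture type k (smaller means more similar)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(j_r, j_k)))

def quaternion_similarity(q_r, q_k):
    """S_k2: cosine of the angle between the second vector and its
    reference quaternion (closer to 1 means more similar)."""
    dot = sum(a * b for a, b in zip(q_r, q_k))
    norm = math.sqrt(sum(a * a for a in q_r)) * math.sqrt(sum(b * b for b in q_k))
    return dot / norm

print(euclidean_similarity([0.0, 3.0], [4.0, 0.0]))       # -> 5.0
print(quaternion_similarity([1, 0, 0, 0], [1, 0, 0, 0]))  # -> 1.0
```

In practice both functions would be evaluated once per stored gesture type k, giving one (S_k1, S_k2) pair per candidate gesture.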
The method then proceeds to step 240, in which each similarity determined in step 230 is normalized to obtain a normalized similarity with a value in the range from 0 to 1.
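The patent only requires the similarities to be mapped into the range 0 to 1; a simple min-max rule over the scores is one plausible choice (an assumption, since the normalization method is not specified in the text):

```python
def normalize(values):
    """Min-max normalization of a list of similarity scores to [0, 1].
    This particular rule is an illustrative assumption."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]  # degenerate case: all scores equal
    return [(v - lo) / (hi - lo) for v in values]

print(normalize([2.0, 4.0, 6.0]))  # -> [0.0, 0.5, 1.0]
```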
The method then proceeds to step 250, in which the gesture is identified according to the normalized similarities.
In the present embodiment, the values of the normalized similarities can be considered jointly to identify the gesture. Taking the normalized similarities S'_k1 and S'_k2 of the first and second vectors with respect to the k-th gesture type as an example, the similarity of the first and second vectors to their respective reference vectors can be judged using the following rules:
If S'_1 < 0.05, the first vector is judged highly similar to the reference vector; if 0.05 ≤ S'_1 < 0.1, moderately similar; if 0.1 ≤ S'_1 < 0.15, slightly similar; and if S'_1 ≥ 0.15, dissimilar. Similar decision rules can be applied to the normalized similarity S'_2: if S'_2 > 0.95, the second vector is judged highly similar to the reference vector; if 0.90 < S'_2 ≤ 0.95, moderately similar; if 0.85 < S'_2 ≤ 0.90, slightly similar; and if S'_2 ≤ 0.85, dissimilar.
After the judgments about the first and second vectors have been obtained according to the above rules, it can further be determined from them whether the currently detected gesture is of the k-th gesture type. Table 1 below gives an illustrative example of judging the degree of similarity between the currently detected gesture and the k-th gesture type from the results for the first and second vectors.
Table 1
Similarity         | S'_1 < 0.05        | 0.05 ≤ S'_1 < 0.1  | 0.1 ≤ S'_1 < 0.15 | S'_1 ≥ 0.15
S'_2 > 0.95        | highly similar     | moderately similar | slightly similar  | dissimilar
0.90 < S'_2 ≤ 0.95 | moderately similar | moderately similar | slightly similar  | dissimilar
0.85 < S'_2 ≤ 0.90 | slightly similar   | slightly similar   | dissimilar        | dissimilar
S'_2 ≤ 0.85        | dissimilar         | dissimilar         | dissimilar        | dissimilar
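The threshold rules and Table 1 can be transcribed directly into a lookup, sketched below ("slightly similar" renders the patent's "low similar"; the grading helpers and the table transcription are the only content, nothing is added beyond the stated rules):

```python
def grade_first(s1):
    """Grade the normalized similarity S'_k1 of the first vector:
    3 = highly, 2 = moderately, 1 = slightly similar, 0 = dissimilar."""
    if s1 < 0.05:
        return 3
    if s1 < 0.10:
        return 2
    if s1 < 0.15:
        return 1
    return 0

def grade_second(s2):
    """Grade the normalized similarity S'_k2 of the second vector on the same scale."""
    if s2 > 0.95:
        return 3
    if s2 > 0.90:
        return 2
    if s2 > 0.85:
        return 1
    return 0

# TABLE[g2][g1] transcribes Table 1; rows are the S'_2 bands, columns the S'_1 bands.
TABLE = [
    ["dissimilar", "dissimilar",       "dissimilar",         "dissimilar"],
    ["dissimilar", "dissimilar",       "slightly similar",   "slightly similar"],
    ["dissimilar", "slightly similar", "moderately similar", "moderately similar"],
    ["dissimilar", "slightly similar", "moderately similar", "highly similar"],
]

def combined_similarity(s1, s2):
    """Overall similarity of the detected gesture to gesture type k."""
    return TABLE[grade_second(s2)][grade_first(s1)]

print(combined_similarity(0.03, 0.97))  # -> highly similar
```

A direct table lookup is used rather than, say, taking the weaker of the two grades, because Table 1 is not a pure minimum: two "slightly similar" judgments combine to "dissimilar".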
Fig. 3 is a schematic diagram of the method for identifying a gesture according to another embodiment of the invention.
Compared with the embodiment shown in Fig. 2, the present embodiment uses only the bending angles of the fingers as the features for gesture recognition.
As shown in Fig. 3, in step 310 the device for identifying a gesture receives the Euler angles of the finger phalanges measured by the sensors mounted on the wearable device. The method then proceeds to step 320, in which the device generates the first vector J_r = {J_r(1), J_r(2), ..., J_r(i), ..., J_r(n)} from the sensor measurements, where i is the index of a finger joint, n is the number of finger-joint feature parameters used for gesture recognition, and J_r(i) represents the bending angle of the i-th finger joint, i.e. the angle between the phalanges on either side of that joint.
The method then proceeds to step 330, in which the device determines the similarity between the first vector J_r and the reference vectors. As before, for every gesture to be recognized there is a corresponding reference vector for the first vector, J_k = {J_k(1), J_k(2), ..., J_k(i), ..., J_k(n)}, where J_k(i) represents the i-th component of the reference vector. The first vector therefore has multiple similarities, each corresponding to one gesture type.
Preferably, for the k-th gesture type, the similarity can be measured by the Euclidean distance between the first vector J_r and the corresponding reference vector J_k. Specifically, the similarity S_k1 can be calculated using formula (1) above.
The method then proceeds to step 340, in which the similarities determined in step 330 are normalized to obtain normalized similarities with values in the range from 0 to 1.
The method then proceeds to step 350, in which the gesture is identified according to the normalized similarities.
Taking the normalized similarity S'_k1 of the first vector with respect to the k-th gesture type as an example, the following rules can be used to judge whether the currently detected gesture is of the k-th gesture type: if S'_1 < 0.05, the currently detected gesture is judged highly similar to the k-th gesture type; if 0.05 ≤ S'_1 < 0.1, moderately similar; if 0.1 ≤ S'_1 < 0.15, slightly similar; and if S'_1 ≥ 0.15, dissimilar.
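The single-feature decision rules above can be sketched as follows (illustrative only; the labels again render the patent's high/moderate/low/dissimilar grades):

```python
def classify_against_type_k(s1):
    """Fig. 3 variant: judge the detected gesture against gesture type k
    from the normalized first-vector similarity S'_k1 alone."""
    if s1 < 0.05:
        return "highly similar"
    if s1 < 0.10:
        return "moderately similar"
    if s1 < 0.15:
        return "slightly similar"
    return "dissimilar"

print(classify_against_type_k(0.02))  # -> highly similar
print(classify_against_type_k(0.20))  # -> dissimilar
```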
Fig. 4 shows a device for identifying a gesture according to another embodiment of the invention.
The device 40 shown in Fig. 4 comprises a first module 410, a second module 420, and a third module 430. In the present embodiment, the first module 410 is used to generate a first vector associated with the bending angles of one or more finger joints and a second vector associated with the spatial orientation of the palm, the second module 420 is used to determine the similarity of the first vector and the second vector to respective reference vectors, and the third module 430 is used to identify the gesture according to the similarity.
Fig. 5 shows a device for identifying a gesture according to another embodiment of the invention.
The device 50 shown in Fig. 5 comprises a first module 510, a second module 520, and a third module 530. In the present embodiment, the first module 510 is used to generate a first vector associated with the bending angles of one or more finger joints, the second module 520 is used to determine the similarity of the first vector to a reference vector, and the third module 530 is used to identify the gesture according to the similarity.
Fig. 6 is a schematic block diagram of a device for identifying a gesture according to a further embodiment of the invention.
The device 60 for identifying a gesture shown in Fig. 6 comprises a memory 610, a processor 620, and a computer program 630 stored on the memory 610 and runnable on the processor 620, wherein executing the computer program 630 implements the methods for identifying a gesture described above with reference to Figs. 2 and 3.
Fig. 7 is a schematic block diagram of a wearable device according to a further embodiment of the invention.
As shown in Fig. 7, the wearable device 70 of the present embodiment comprises a first sensor 710, a second sensor 720, and a device 730 for identifying a gesture. In the present embodiment, the first sensor 710 is arranged on the wearable device in a region corresponding to the phalanges adjacent to the finger joints to obtain the bending angles of the finger joints, and the second sensor 720 is arranged on the wearable device in a region corresponding to the back or the palm of the hand to obtain the spatial orientation. The device 730 for identifying a gesture can be implemented with the device described in connection with Fig. 2.
Fig. 8 is a schematic block diagram of a wearable device according to a further embodiment of the invention.
As shown in Fig. 8, the wearable device 80 of the present embodiment comprises a first sensor 810 and a device 820 for identifying a gesture. In the present embodiment, the first sensor 810 is arranged on the wearable device in a region corresponding to the phalanges adjacent to the finger joints to obtain the bending angles of the finger joints. The device 820 for identifying a gesture can be implemented with the device described in connection with Fig. 3.
In the embodiments shown in Figs. 7 and 8, the wearable devices 70 and 80 may be wearable gloves.
Fig. 9 is a schematic block diagram of a system for identifying a gesture according to a further embodiment of the invention.
As shown in Fig. 9, the system 90 for identifying a gesture of the present embodiment comprises a wearable device 910 and a computing device 920. In the present embodiment, the wearable device 910 comprises a first sensor 911 and a second sensor 912, wherein the first sensor 911 is arranged on the wearable device in a region corresponding to the phalanges adjacent to the finger joints to obtain the bending angles of the finger joints, and the second sensor 912 is arranged on the wearable device in a region corresponding to the back or the palm of the hand to obtain the spatial orientation of the palm.
Unlike the embodiments shown in Figs. 7 and 8, in the present embodiment the gesture recognition processing is performed by a computing device 920 located outside the wearable device. Referring to Fig. 9, the computing device 920 comprises a memory 921, a processor 922, and a computer program 923 stored on the memory 921 and runnable on the processor 922, wherein the processor 922 is coupled with the first sensor 911 and the second sensor 912 to obtain the data on the bending angles of the finger joints and the spatial orientation of the palm, and executes the computer program 923 to implement the method for identifying a gesture described above with reference to Fig. 2.
Fig. 10 is a schematic block diagram of a system for identifying a gesture according to a further embodiment of the invention.
As shown in Fig. 10, the system 100 for identifying a gesture of the present embodiment comprises a wearable device 1010 and a computing device 1020. In the present embodiment, the wearable device 1010 comprises a first sensor 1011, which is arranged on the wearable device in a region corresponding to the phalanges adjacent to the finger joints to obtain the bending angles of the finger joints.
Unlike the embodiments shown in Figs. 7 and 8, in the present embodiment the gesture recognition processing is performed by a computing device 1020 located outside the wearable device. Referring to Fig. 10, the computing device 1020 comprises a memory 1021, a processor 1022, and a computer program 1023 stored on the memory 1021 and runnable on the processor 1022, wherein the processor 1022 is coupled with the first sensor 1011 to obtain the data on the bending angles of the finger joints, and executes the computer program 1023 to implement the method for identifying a gesture described above with reference to Fig. 3.
In the embodiments shown in Figs. 9 and 10, the wearable devices 910 and 1010 may be wearable gloves, and the computing devices 920 and 1020 may be personal computers, tablet computers, mobile phones, personal digital assistants, and the like.
According to another aspect of the invention, a computer-readable storage medium is also provided, on which a computer program is stored; when executed by a processor, the program implements the methods for identifying a gesture described above with reference to Figs. 2 and 3.
The embodiments and examples set forth herein are provided to best describe embodiments in accordance with the present technique and its particular applications, and thereby to enable those skilled in the art to implement and use the invention. Those skilled in the art will recognize, however, that the above description and examples are provided for the purposes of illustration only. The description as presented is not intended to be exhaustive of the various aspects of the invention or to limit the invention to the precise forms disclosed.
In view of the above, the scope of the present disclosure is determined by following claims.

Claims (18)

1. A method for identifying a gesture, characterized in that it comprises the steps of:
generating a primary vector associated with the bending angles of one or more finger joints and a secondary vector associated with the spatial orientation of the palm;
determining the similarity of the primary vector and of the secondary vector to their respective reference vectors; and
identifying a gesture according to the similarities,
wherein the bending angles and the spatial orientation are obtained by means of sensors.
2. the method for claim 1, wherein by by sensor be set on wearable device near finger-joint Phalanges corresponding region obtain the bending angle.
3. the method for claim 1, wherein by the way that sensor to be set on wearable device and the back of the hand or the palm of the hand pair The region answered obtains the spatial orientation.
4. the method for claim 1, wherein the secondary vector uses the form of quaternary number.
5. the method for claim 1, wherein the primary vector is with the similarity of corresponding reference vector with the two Euclidean distance is measured, and the secondary vector and the similarity of corresponding reference vector are weighed with the angle of the two Amount.
6. A method for identifying a gesture, characterized in that it comprises the steps of:
generating a primary vector associated with the bending angles of one or more finger joints;
determining the similarity of the primary vector to a reference vector; and
identifying a gesture according to the similarity,
wherein the bending angles are obtained by means of a sensor.
7. The method of claim 6, wherein the bending angle is obtained by arranging a sensor on a wearable device in a region corresponding to the phalanx near the finger joint.
8. The method of claim 6, wherein the similarity of the primary vector to its corresponding reference vector is measured by the Euclidean distance between the two.
9. A device for identifying a gesture, characterized by comprising:
a first module for generating a primary vector associated with the bending angles of one or more finger joints and a secondary vector associated with the spatial orientation of the palm;
a second module for determining the similarity of the primary vector and of the secondary vector to their respective reference vectors; and
a third module for identifying a gesture according to the similarities,
wherein the bending angles and the spatial orientation are obtained by means of sensors.
10. A device for identifying a gesture, characterized by comprising:
a first module for generating a primary vector associated with the bending angles of one or more finger joints;
a second module for determining the similarity of the primary vector to a reference vector; and
a third module for identifying a gesture according to the similarity,
wherein the bending angles are obtained by means of a sensor.
11. A device for identifying a gesture, comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, characterized in that the program is executed to implement the method of any one of claims 1-8.
12. A wearable device, comprising:
a first sensor arranged on the wearable device in a region corresponding to the phalanx near a finger joint so as to obtain the bending angle of the finger joint;
a second sensor arranged on the wearable device in a region corresponding to the back of the hand or the palm of the hand so as to obtain the spatial orientation of the palm; and
a device for identifying a gesture, comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the program is executed to implement the following steps:
generating a primary vector associated with the bending angles of one or more finger joints and a secondary vector associated with the spatial orientation of the palm;
determining the similarity of the primary vector and of the secondary vector to their respective reference vectors; and
identifying a gesture according to the similarities.
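Outside the claim language itself, the two-vector matching steps above might combine the two similarity tests as follows; the tolerance values, the template contents, and the rule that both tests must pass are illustrative assumptions, not requirements of the claim.

```python
import math

def euclidean_distance(a, b):
    """Similarity of the primary (bending-angle) vector."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def quaternion_angle(q1, q2):
    """Similarity of the secondary (orientation) vector: angle in
    radians between two unit quaternions (w, x, y, z)."""
    dot = min(1.0, abs(sum(a * b for a, b in zip(q1, q2))))
    return 2.0 * math.acos(dot)

def match_gesture(angles, orientation, templates,
                  angle_tol=30.0, orient_tol=math.pi / 6):
    """Return the first template whose reference bending-angle vector
    is within angle_tol (Euclidean distance) AND whose reference
    orientation quaternion is within orient_tol radians, else None."""
    for name, (ref_angles, ref_quat) in templates.items():
        if (euclidean_distance(angles, ref_angles) <= angle_tol
                and quaternion_angle(orientation, ref_quat) <= orient_tol):
            return name
    return None

# Hypothetical template: a fist with the palm in the reference orientation.
templates = {"fist_down": ([90.0] * 5, (1.0, 0.0, 0.0, 0.0))}
print(match_gesture([88.0] * 5, (1.0, 0.0, 0.0, 0.0), templates))  # fist_down
```

Requiring both tests to pass lets the same finger posture map to different gestures depending on palm orientation, which is what distinguishes the two-sensor claims from the bending-angle-only ones.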
13. A wearable device, comprising:
a first sensor arranged on the wearable device in a region corresponding to the phalanx near a finger joint so as to obtain the bending angle of the finger joint; and
a device for identifying a gesture, comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the program is executed to implement the following steps:
generating a primary vector associated with the bending angles of one or more finger joints;
determining the similarity of the primary vector to a respective reference vector; and
identifying a gesture according to the similarity.
14. The wearable device of claim 12 or 13, wherein the wearable device is a wearable glove.
15. A system for identifying a gesture, comprising:
a wearable device comprising:
a first sensor arranged on the wearable device in a region corresponding to the phalanx near a finger joint so as to obtain the bending angle of the finger joint, and
a second sensor arranged on the wearable device in a region corresponding to the back of the hand or the palm of the hand so as to obtain the spatial orientation of the palm; and
a computing device comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the program is executed to implement the following steps:
generating a primary vector associated with the bending angles of one or more finger joints and a secondary vector associated with the spatial orientation of the palm;
determining the similarity of the primary vector and of the secondary vector to their respective reference vectors; and
identifying a gesture according to the similarities.
16. A system for identifying a gesture, comprising:
a wearable device comprising:
a first sensor arranged on the wearable device in a region corresponding to the phalanx near a finger joint so as to obtain the bending angle of the finger joint; and
a computing device comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the program is executed to implement the following steps:
generating a primary vector associated with the bending angles of one or more finger joints;
determining the similarity of the primary vector to a respective reference vector; and
identifying a gesture according to the similarity.
17. The system of claim 15 or 16, wherein the wearable device is a wearable glove and the computing device is one of a personal computer, a tablet computer, a mobile phone, and a personal digital assistant.
18. A computer-readable storage medium on which a computer program is stored, characterized in that, when executed by a processor, the program implements the method of any one of claims 1-8.
CN201711265640.5A 2017-12-05 2017-12-05 Method and apparatus for identifying a gesture Pending CN109871857A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711265640.5A CN109871857A (en) 2017-12-05 2017-12-05 Method and apparatus for identifying a gesture


Publications (1)

Publication Number Publication Date
CN109871857A (en) 2019-06-11

Family

ID=66916269

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711265640.5A Pending CN109871857A (en) 2017-12-05 2017-12-05 Method and apparatus for identifying a gesture

Country Status (1)

Country Link
CN (1) CN109871857A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102063825A * 2010-12-15 2011-05-18 Beijing Institute of Technology Sign language recognition device based on data gloves
CN102193633A * 2011-05-25 2011-09-21 Guangzhou Changtu Software Co., Ltd. Dynamic sign language recognition method for a data glove
CN104898847A * 2015-06-12 2015-09-09 Hefei Huiteng Network Technology Co., Ltd. Novel sign language recognition and acquisition method and device
CN105404397A * 2015-12-21 2016-03-16 Guangdong University of Technology Glove controller and control method therefor
CN107037878A * 2016-12-14 2017-08-11 Shenyang Institute of Automation, Chinese Academy of Sciences Gesture-based human-computer interaction method


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110991319A * 2019-11-29 2020-04-10 Guangzhou Baiguoyuan Information Technology Co., Ltd. Hand key point detection method, gesture recognition method and related device
CN111158478A * 2019-12-26 2020-05-15 Vivo Mobile Communication Co., Ltd. Response method and electronic device
CN111158478B * 2019-12-26 2023-02-03 Vivo Mobile Communication Co., Ltd. Response method and electronic device

Similar Documents

Publication Publication Date Title
CN108875524B Gaze estimation method, device, system, and storage medium
US10373244B2 System and method for virtual clothes fitting based on video augmented reality in mobile phone
CN110322500A Real-time optimization method and device for simultaneous localization and mapping, medium, and electronic device
US8107688B2 Gaze detection apparatus and the method of the same
JP6815707B2 Face pose detection method, device, and storage medium
US20160343165A1 Method for displaying augmented reality content based on 3D point cloud recognition, and apparatus and system for executing the method
CN106030610B Real-time 3D gesture recognition and tracking system for mobile devices
EP3016027A2 Human body part detection system and human body part detection method
CN103824072B Method and device for detecting the font structure of handwritten characters
KR102057531B1 Mobile devices transmitting and receiving data using gestures
CN104881673B Pattern recognition method and system based on information integration
CN107368820B Refined gesture recognition method, device, and equipment
Almasre et al. A real-time letter recognition model for Arabic sign language using Kinect and Leap Motion Controller v2
CN110349212A Real-time optimization method and device for simultaneous localization and mapping, medium, and electronic device
CN105718776A Three-dimensional gesture verification method and system
CN111459269A Augmented reality display method, system, and computer-readable storage medium
CN109871116A Device and method for identifying a gesture
CN109871857A Method and apparatus for identifying a gesture
CN103925922B Star recognition method applicable to ICCD star maps under highly dynamic conditions
CN110222651A Face pose detection method, device, terminal device, and readable storage medium
CN106529480A Fingertip detection and gesture recognition method and system based on depth information
US20230326251A1 Work estimation device, work estimation method, and non-transitory computer readable medium
CN113112321A Smart energy method, device, electronic device, and storage medium
CN105100501B Mobile phone computing system based on the Internet of Things
US10678337B2 Context aware movement recognition system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination