CN107272878A - Recognition method and device for complex gestures - Google Patents

Recognition method and device for complex gestures Download PDF

Info

Publication number
CN107272878A
CN107272878A (application CN201710103875.8A)
Authority
CN
China
Prior art keywords
gesture
data
hand
similarity
complicated
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710103875.8A
Other languages
Chinese (zh)
Other versions
CN107272878B (en)
Inventor
李元龙
周言明
黄昌正
林晓峰
陈曦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dongguan Yilian Interation Information Technology Co ltd
Fantasy Zhuhai Technology Co ltd
Guangzhou Huantek Co ltd
Original Assignee
Guangzhou Science And Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Science And Technology Co Ltd filed Critical Guangzhou Science And Technology Co Ltd
Priority to CN201710103875.8A priority Critical patent/CN107272878B/en
Publication of CN107272878A publication Critical patent/CN107272878A/en
Application granted granted Critical
Publication of CN107272878B publication Critical patent/CN107272878B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a complex-gesture recognition method, comprising: calculating the spatial position of the back of the hand; controlling data acquisition according to the relation between the back of the hand and a gesture-recognition spatial region; preprocessing the acquired data; recognizing the preprocessed data; and outputting the gesture recognition result with the highest similarity. The method is particularly suitable for complex gesture recognition in 3D scenes, and the recognition algorithm meets the recognition requirements with simpler processing.

Description

Recognition method and device for complex gestures
Technical field
The present invention relates to the field of artificial intelligence, and in particular to a device and method for recognizing complex gestures.
Background technology
Because users view video equipment such as displays from a distance, such equipment is mostly operated with a remote control or a mouse and keyboard. As products grow more intelligent, remote-control operation can no longer meet users' needs. Gesture control, as a new mode of control, is gradually being applied to products such as displays.
Existing gesture recognition methods mostly rely on vision systems: the primitives they acquire and recognize are images, so the data volume is large, processing is complex, and anti-interference capability is low. In terms of gesture types, there is simple hand-shape recognition, which considers only the relative features of the fingers and palm and cannot recognize the orientation and spatial features of a complex gesture; and there is simple trajectory recognition, which considers only changes in palm position and cannot recognize the whole-arm and hand-shape features of a complex gesture. In terms of recognition methods, there are neural networks, fuzzy recognition, and the like, which recognize static gestures and cannot recognize the temporal features of a gesture.
Accordingly, the prior art needs to be improved.
Summary of the invention
The present invention provides a recognition method and device for complex gestures that can recognize non-image data, meet the requirements of temporal-sequence recognition, and are suitable for complex gesture recognition in 3D scenes; the device is simple and easy to use.
To achieve the above technical purpose and technical effect, the present invention is realized through the following technical solutions.
In a first aspect, a recognition method for complex gestures is provided, comprising: calculating the spatial position of the back of the hand; controlling data acquisition according to the relation between the back of the hand and the gesture-recognition spatial region; preprocessing the acquired data; recognizing the preprocessed data; and outputting the gesture recognition result with the highest similarity.
Preferably, calculating the spatial position of the back of the hand comprises: presetting the gesture-recognition spatial region; and calculating the position of the back of the hand from the preset shoulder bone length, upper-arm bone length, forearm bone length, back-of-hand-to-wrist distance, and the quaternion data captured by a single-arm inertial motion-capture device.
Preferably, the mode bag of the relation control gathered data for judging the back of the hand and the gesture identification area of space Include:When the back of the hand enters default gesture identification area of space, start collection;When the back of the hand leaves default gesture identification space During region, terminate collection.
Preferably, preprocessing the acquired data comprises: performing noise reduction with a sliding filter; and integrating each frame of the acquired data into a 32-dimensional vector.
Preferably, recognizing the preprocessed data comprises: setting a slope constraint; initializing the distance map according to the programmed data structure and the slope constraint; solving the optimal path distance of the map by dynamic programming; and calculating the similarity between the acquired data and each template gesture.
Preferably, outputting the gesture recognition result with the highest similarity comprises: taking the template gesture with the highest similarity as the recognition result; and setting a similarity threshold, rejecting recognition when the similarity is below the threshold and outputting the identifier of the gesture when the similarity is not below the threshold.
Preferably, the acquired data may instead be skeleton rotation information obtained from a vision-system device.
Preferably, recognition may also be performed with a hidden Markov model, a neural network, or a similar algorithm.
In a second aspect, a recognition device for complex gestures is provided, comprising: a trigger-detection module for calculating the spatial position of the back of the hand; a data-acquisition module for controlling acquisition according to the relation between the back of the hand and the gesture-recognition spatial region; a preprocessing module for preprocessing the acquired data; a recognition module for recognizing the preprocessed data; and a judgment-output module for outputting the gesture recognition result with the highest similarity.
Preferably, the trigger-detection module comprises: a preset-region submodule for presetting the gesture-recognition spatial region; and a position-calculation submodule for calculating the position of the back of the hand from the preset shoulder bone length, upper-arm bone length, forearm bone length, back-of-hand-to-wrist distance, and the quaternion data captured by the single-arm inertial motion-capture device.
Preferably, the data-acquisition module starts acquisition when the back of the hand enters the preset gesture-recognition spatial region and ends acquisition when the back of the hand leaves the preset gesture-recognition spatial region.
Preferably, the preprocessing module comprises: a filtering submodule for performing noise reduction with a sliding filter; and a data-integration submodule for integrating each frame of the acquired data into a 32-dimensional vector.
Preferably, the recognition module comprises: a slope submodule for setting the slope constraint; an initialization submodule for initializing the distance map according to the programmed data structure and the slope constraint; a path-planning submodule for solving the optimal path distance of the map by dynamic programming; and a similarity-calculation submodule for calculating the similarity between the acquired data and each template gesture.
Preferably, the judgment-output module comprises: an output submodule for taking the template gesture with the highest similarity as the recognition result; and a judging submodule for setting a similarity threshold, rejecting recognition when the similarity is below the threshold, and outputting the identifier of the gesture when the similarity is not below the threshold.
Preferably, the data type of the data-acquisition module may instead be skeleton rotation information obtained from a vision-system device.
Preferably, the recognition module may also recognize with a hidden Markov model, a neural network, or a similar algorithm.
As described above, the recognition method and device for complex gestures of the present invention have the following beneficial effects. The gesture-recognition equipment uses only inertial sensing units, so it is simple, easy to use, and convenient to produce; the units combine into a single capture device that is simple to wear. The gesture data are compact and processed rapidly, meeting the demands of real-time recognition; using inertial-sensing quaternions as the recognition object highlights the features of complex gestures. The preprocessing method highlights the spatial features of the hand shape and arm and is well suited to complex gesture recognition in 3D scenes; the recognition algorithm meets the recognition requirements with simpler processing.
Brief description of the drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of the single-arm inertial motion-capture device of the present invention.
Fig. 2 is a flow chart of the complex-gesture recognition method of the present invention.
Fig. 3 is a schematic diagram of the gesture-recognition spatial region of the present invention.
Fig. 4 is a module diagram of the complex-gesture recognition device of the present invention.
Embodiment
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. All other embodiments obtained by those of ordinary skill in the art without creative effort, based on the embodiments of the present invention, fall within the scope of protection of the present invention.
As shown in Fig. 2, the complex-gesture recognition method of the present invention comprises: calculating the spatial position of the back of the hand; controlling data acquisition according to the relation between the back of the hand and the gesture-recognition spatial region; preprocessing the acquired data; recognizing the preprocessed data; and outputting the gesture recognition result with the highest similarity.
Specifically, as shown in Fig. 3, the gesture-recognition spatial region is preset, for example the cuboid region shown by the dotted frame in Fig. 3. The shoulder bone length, upper-arm bone length, forearm bone length, and back-of-hand-to-wrist distance are preset; each length or distance is taken as the x-axis offset of a three-dimensional coordinate (with y-axis and z-axis offsets of 0) and homogenized into a quaternion. From the quaternion data captured by the single-arm inertial motion-capture device, the spatial position of the back of the hand is calculated by operating on the preset lengths L (converted into quaternions) and the sensor quaternions, which yields the three-dimensional position of the back of the hand.
As shown in Fig. 1, the quaternion data captured by the single-arm inertial motion-capture device include five-finger inertial-unit data, back-of-hand inertial-unit data, forearm inertial-unit data, upper-arm inertial-unit data, and chest inertial-unit data. The placement is as follows. Node 01 is the chest inertial unit, acquiring the body's facing orientation q01 = (w01, x01, y01, z01); node 11 is the upper-arm inertial unit, acquiring the upper-arm data q11 = (w11, x11, y11, z11); node 21 is the forearm inertial unit, acquiring the forearm data q21 = (w21, x21, y21, z21); node 31 is the back-of-hand inertial unit, acquiring the back-of-hand data q31 = (w31, x31, y31, z31); node 41 is the thumb inertial unit, acquiring q41 = (w41, x41, y41, z41); node 42 is the index-finger inertial unit, acquiring q42 = (w42, x42, y42, z42); node 43 is the middle-finger inertial unit, acquiring q43 = (w43, x43, y43, z43); node 44 is the ring-finger inertial unit, acquiring q44 = (w44, x44, y44, z44); node 45 is the little-finger inertial unit, acquiring q45 = (w45, x45, y45, z45).
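The position formula itself appears only as an image in the original and is not reproduced here. As a hedged sketch of the kind of computation described, chaining each preset bone length (as an x-axis vector) through its sensor quaternion from chest to back of hand, one plausible implementation follows; the function names and the x-axis bone convention are assumptions for illustration, not taken from the patent:

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate 3-vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def back_of_hand_position(q_chest, q_upper, q_fore, q_hand,
                          l_shoulder, l_upper, l_fore, l_hand):
    """Chain the preset bone lengths, each rotated by its sensor
    quaternion, to estimate the back-of-hand position relative to
    the chest node."""
    p = quat_rotate(q_chest, np.array([l_shoulder, 0.0, 0.0]))
    p = p + quat_rotate(q_upper, np.array([l_upper, 0.0, 0.0]))
    p = p + quat_rotate(q_fore, np.array([l_fore, 0.0, 0.0]))
    p = p + quat_rotate(q_hand, np.array([l_hand, 0.0, 0.0]))
    return p
```

With all sensors at the identity orientation the bones line up along x, so the result is simply the sum of the preset lengths, which is a useful sanity check for the chain.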
When the back of the hand enters the preset gesture-recognition spatial region, i.e. (Px, Py, Pz) ∈ Ω, the data-acquisition step 202 starts; when the back of the hand leaves the preset gesture-recognition spatial region, i.e. (Px, Py, Pz) ∉ Ω, data acquisition ends.
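The entry/exit trigger above can be sketched as a small state machine over an axis-aligned cuboid Ω; the class and method names here are illustrative assumptions, not from the patent:

```python
def in_region(p, lo, hi):
    """True when point p lies inside the axis-aligned cuboid [lo, hi]."""
    return all(l <= c <= h for c, l, h in zip(p, lo, hi))

class AcquisitionTrigger:
    """Start collecting frames when the hand enters the region,
    and return the collected sequence when it leaves."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
        self.frames = None  # None while idle (hand outside the region)

    def update(self, hand_pos, frame):
        inside = in_region(hand_pos, self.lo, self.hi)
        if inside and self.frames is None:
            self.frames = []          # entered Ω: start acquisition
        if self.frames is not None:
            if inside:
                self.frames.append(frame)
            else:                     # left Ω: acquisition complete
                done, self.frames = self.frames, None
                return done
        return None
```

Each call feeds one sampling instant; the caller gets the full frame sequence exactly once, at the moment the hand exits the region.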
After data acquisition is completed, noise reduction is performed on the gesture data with a sliding filter, reducing the noise caused by hand shake during the gesture and by the limited precision of the sensors themselves.
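The patent does not specify the sliding filter's window or weighting; a minimal sketch, assuming a plain moving average over each frame vector, would be:

```python
import numpy as np

def sliding_filter(frames, window=5):
    """Moving-average noise reduction over a sequence of frame vectors.
    Each output frame averages up to `window` most recent input frames."""
    frames = np.asarray(frames, dtype=float)
    out = np.empty_like(frames)
    for i in range(len(frames)):
        lo = max(0, i - window + 1)
        out[i] = frames[lo:i + 1].mean(axis=0)
    return out
```

For quaternion components this is only an approximation: averaging followed by renormalization is reasonable for nearby rotations, which is the expected case between consecutive sensor frames.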
In addition, all inertial-sensor data acquired at each moment are taken as one frame X. If r frames were acquired in total from the start to the end of acquisition, these data form an ordered set, from which m key frames are extracted at equal intervals.
If the acquired frame count r is less than m, this recognition request is rejected.
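The equidistant key-frame extraction and the r < m rejection rule can be sketched as follows; the exact rounding of the interval positions is an assumption, since the patent's formula image is not reproduced:

```python
def extract_keyframes(frames, m):
    """Pick m frames at (approximately) equal intervals from the r
    acquired frames; reject the request when r < m."""
    r = len(frames)
    if r < m:
        return None  # too few frames: refuse this recognition request
    if m == 1:
        return [frames[0]]
    # spread key-frame indices evenly over [0, r-1]
    idx = [round(k * (r - 1) / (m - 1)) for k in range(m)]
    return [frames[i] for i in idx]
```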
Coordinate-system conversion is then performed on each key frame: each quaternion is expressed as a rotation in the local coordinate system of its parent node, highlighting the relative-rotation and spatial-position features of each bone in the key frame. The resulting expression is expanded according to quaternion multiplication.
The body's facing orientation q0 is discarded, and each frame is integrated into a 32-dimensional vector:
X = [q1, ..., q8] (4)
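A sketch of this preprocessing step: each of the 8 non-chest quaternions is expressed relative to its parent node (parent quaternion's conjugate composed with the child's), the chest orientation q0 is dropped, and the 8 relative quaternions are flattened into 32 dimensions. The parent mapping (chest → upper arm → forearm → back of hand → each finger) is inferred from the node layout in Fig. 1 and should be treated as an assumption:

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of quaternions in (w, x, y, z) order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def quat_conj(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def frame_to_vector(sensors, parent):
    """Express each of the 8 arm/hand/finger quaternions relative to its
    parent node, discard the chest orientation, and flatten to 32 dims.
    `sensors` maps node id -> unit quaternion; `parent` maps node id ->
    parent node id (chest node '01' is the root)."""
    vec = []
    for node in ('11', '21', '31', '41', '42', '43', '44', '45'):
        rel = quat_mul(quat_conj(sensors[parent[node]]), sensors[node])
        vec.extend(rel)
    return np.array(vec)  # 8 quaternions x 4 components = 32 dims
```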
Preferably, the preprocessed gesture data are then recognized. The computation is as follows.
For a gesture template y and gesture data x, with each frame containing 8 quaternions, let the sequence length of the template y be m and that of the gesture data x be n. Then:
1. Set the slope, i.e. set the maximum difference k between the loop variables i and j: |i - j| ≤ k.
2. Initialize the distance map according to the programmed data structure and the slope constraint.
3. Solve the optimal path distance of the map by dynamic programming:
d(0, 0) = value(0, 0)
d(i, j) = min{ d(i-1, j-1) + value(i, j),
d(i-1, j) + value(i, j),   (within the slope constraint)
d(i, j-1) + value(i, j) }
Computed values are stored during the process to avoid repeated computation.
4. Solving for d(m, n) characterizes the similarity between gesture data x and template gesture y: the larger d(m, n), the lower the similarity; the smaller d(m, n), the higher the similarity.
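The recurrence above is dynamic time warping (DTW) under a Sakoe-Chiba-style band |i - j| ≤ k. A minimal sketch, assuming value(i, j) is the Euclidean distance between 32-dimensional frame vectors (the patent does not state the per-cell cost explicitly):

```python
import numpy as np

def dtw_distance(x, y, k):
    """Dynamic time warping between frame sequences x (n frames) and
    y (m frames) under the slope constraint |i - j| <= k; a smaller
    distance means a higher similarity."""
    n, m = len(x), len(y)
    INF = float('inf')
    d = np.full((n, m), INF)
    for i in range(n):
        # only cells inside the slope band are ever filled in
        for j in range(max(0, i - k), min(m, i + k + 1)):
            cost = np.linalg.norm(np.asarray(x[i]) - np.asarray(y[j]))
            if i == 0 and j == 0:
                d[i, j] = cost
                continue
            best = INF
            for pi, pj in ((i - 1, j - 1), (i - 1, j), (i, j - 1)):
                if pi >= 0 and pj >= 0 and d[pi, pj] < best:
                    best = d[pi, pj]
            d[i, j] = best + cost
    return d[n - 1, m - 1]
```

Storing the whole band in `d` is the memoization step 3 mentions: each cell is computed once and reused by its three successors.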
The template gesture with the highest similarity is taken as the recognition result. A similarity threshold is set: when the similarity is below the threshold, recognition is rejected; when it is not below the threshold, the identifier of the gesture is output.
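The final decision step can be sketched as follows; since similarity here is inversely related to the DTW distance, the threshold is expressed as a maximum acceptable distance. The distance function is passed in as a parameter, and all names are illustrative assumptions:

```python
def classify(x, templates, distance, max_distance=10.0):
    """Pick the template with the smallest distance (highest similarity);
    reject recognition when even the best match exceeds the threshold.
    `templates` maps gesture id -> template sequence; `distance` is any
    sequence-distance function such as a DTW implementation."""
    best_id, best_d = None, float('inf')
    for gid, y in templates.items():
        d = distance(x, y)
        if d < best_d:
            best_id, best_d = gid, d
    # similarity below threshold (distance too large): refuse recognition
    return best_id if best_d <= max_distance else None
```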
Preferably, the acquired data may instead be skeleton rotation information obtained from a vision-system device.
Preferably, recognition may also be performed with a hidden Markov model, a neural network, or a similar algorithm.
In a second aspect, a recognition device for complex gestures is provided, as shown in Fig. 4, comprising: a trigger-detection module for calculating the spatial position of the back of the hand; a data-acquisition module for controlling acquisition according to the relation between the back of the hand and the gesture-recognition spatial region; a preprocessing module for preprocessing the acquired data; a recognition module for recognizing the preprocessed data; and a judgment-output module for outputting the gesture recognition result with the highest similarity.
Specifically, the trigger-detection module presets the gesture-recognition spatial region as shown in Fig. 3, for example the cuboid region shown by the dotted frame. The shoulder bone length, upper-arm bone length, forearm bone length, and back-of-hand-to-wrist distance are preset; each length or distance is taken as the x-axis offset of a three-dimensional coordinate (with y-axis and z-axis offsets of 0) and homogenized into a quaternion. From the quaternion data captured by the single-arm inertial motion-capture device, the module calculates the spatial position of the back of the hand by operating on the preset lengths L (converted into quaternions) and the sensor quaternions, which yields the three-dimensional position.
As shown in Fig. 1, the single-arm inertial motion-capture device includes a five-finger inertial unit, a back-of-hand inertial unit, a forearm inertial unit, an upper-arm inertial unit, and a chest inertial unit, placed at nodes 01 (chest), 11 (upper arm), 21 (forearm), 31 (back of hand), and 41 through 45 (thumb through little finger), each acquiring its inertial quaternion data qij = (wij, xij, yij, zij) as described above for the method.
The data-acquisition unit starts the data-acquisition step when the back of the hand enters the preset gesture-recognition spatial region, i.e. (Px, Py, Pz) ∈ Ω, and ends data acquisition when the back of the hand leaves the region, i.e. (Px, Py, Pz) ∉ Ω.
After data acquisition is completed, the preprocessing module performs noise reduction on the gesture data with a sliding filter, reducing the noise caused by hand shake during the gesture and by the limited precision of the sensors themselves.
In addition, all inertial-sensor data acquired at each moment are taken as one frame X. If r frames were acquired in total from the start to the end of acquisition, these data form an ordered set, from which m key frames are extracted at equal intervals.
If the acquired frame count r is less than m, this recognition request is rejected.
Coordinate-system conversion is then performed on each key frame: each quaternion is expressed as a rotation in the local coordinate system of its parent node, highlighting the relative-rotation and spatial-position features of each bone in the key frame. The resulting expression is expanded according to quaternion multiplication.
The body's facing orientation q0 is discarded, and each frame is integrated into a 32-dimensional vector:
X = [q1, ..., q8] (4)
The recognition method of the recognition module is as follows.
For a gesture template y and gesture data x, with each frame containing 8 quaternions, let the sequence length of the template y be m and that of the gesture data x be n. Then:
1. Set the slope, i.e. set the maximum difference k between the loop variables i and j: |i - j| ≤ k.
2. Initialize the distance map according to the programmed data structure and the slope constraint.
3. Solve the optimal path distance of the map by dynamic programming:
d(0, 0) = value(0, 0)
d(i, j) = min{ d(i-1, j-1) + value(i, j),
d(i-1, j) + value(i, j),   (within the slope constraint)
d(i, j-1) + value(i, j) }
Computed values are stored during the process to avoid repeated computation.
4. Solving for d(m, n) characterizes the similarity between gesture data x and template gesture y: the larger d(m, n), the lower the similarity; the smaller d(m, n), the higher the similarity.
The template gesture with the highest similarity is taken as the recognition result. A similarity threshold is set: when the similarity is below the threshold, recognition is rejected; when it is not below the threshold, the identifier of the gesture is output.
Preferably, the data of the data-acquisition module may instead be skeleton rotation information obtained from a vision-system device.
Preferably, the recognition module may also recognize with a hidden Markov model, a neural network, or a similar algorithm.

Claims (18)

1. A recognition method for complex gestures, characterized by comprising: calculating the spatial position of the back of the hand; controlling data acquisition according to the relation between the back of the hand and the gesture-recognition spatial region; preprocessing the acquired data; recognizing the preprocessed data; and outputting the gesture recognition result with the highest similarity.
2. The recognition method for complex gestures according to claim 1, characterized in that calculating the spatial position of the back of the hand comprises: presetting the gesture-recognition spatial region; and calculating the spatial position of the back of the hand from the preset shoulder bone length, upper-arm bone length, forearm bone length, back-of-hand-to-wrist distance, and the quaternion data captured by a single-arm inertial motion-capture device.
3. The recognition method for complex gestures according to claim 2, characterized in that the quaternion data captured by the single-arm inertial motion-capture device at least include five-finger inertial-unit data, back-of-hand inertial-unit data, forearm inertial-unit data, upper-arm inertial-unit data, and chest inertial-unit data.
4. The recognition method for complex gestures according to claim 1 or 2, characterized in that controlling data acquisition according to the relation between the back of the hand and the gesture-recognition spatial region comprises: starting acquisition when the back of the hand enters the preset gesture-recognition spatial region; and ending acquisition when the back of the hand leaves the preset gesture-recognition spatial region.
5. The recognition method for complex gestures according to claim 1 or 2, characterized in that preprocessing the acquired data comprises: performing noise reduction with a sliding filter; and integrating each frame of the acquired data into a 32-dimensional vector.
6. The recognition method for complex gestures according to claim 1 or 2, characterized in that recognizing the preprocessed data comprises: setting a slope constraint; initializing the distance map according to the programmed data structure and the slope constraint; solving the optimal path distance of the map by dynamic programming; and calculating the similarity between the acquired data and each template gesture.
7. The recognition method for complex gestures according to claim 1 or 2, characterized in that outputting the gesture recognition result with the highest similarity comprises: taking the template gesture with the highest similarity as the recognition result; and setting a similarity threshold, rejecting recognition when the similarity is below the threshold and outputting the identifier of the gesture when the similarity is not below the threshold.
8. The recognition method for complex gestures according to claim 1 or 2, characterized in that the acquired data may be skeleton rotation information obtained from a vision-system device.
9. The recognition method for complex gestures according to claim 1 or 2, characterized in that the method for recognizing the preprocessed data may be a hidden Markov model, a neural network, or a similar algorithm.
10. A recognition device for complex gestures, characterized by comprising: a trigger-detection module for calculating the spatial position of the back of the hand; a data-acquisition module for controlling data acquisition according to the relation between the back of the hand and the gesture-recognition spatial region; a preprocessing module for preprocessing the acquired data; a recognition module for recognizing the preprocessed data; and a judgment-output module for outputting the gesture recognition result with the highest similarity.
11. The recognition device for complex gestures according to claim 10, characterized in that the trigger-detection module comprises: a preset-region submodule for presetting the gesture-recognition spatial region; and a position-calculation submodule for calculating the spatial position of the back of the hand from the preset shoulder bone length, upper-arm bone length, forearm bone length, back-of-hand-to-wrist distance, and the quaternion data captured by the single-arm inertial motion-capture device.
12. The recognition device for complex gestures according to claim 11, characterized in that the single-arm inertial motion-capture device at least includes a five-finger inertial unit, a back-of-hand inertial unit, a forearm inertial unit, an upper-arm inertial unit, and a chest inertial unit.
13. The recognition device for complex gestures according to claim 10 or 11, characterized in that the data-acquisition module starts acquisition when the back of the hand enters the preset gesture-recognition spatial region and ends acquisition when the back of the hand leaves the preset gesture-recognition spatial region.
14. The recognition device for complex gestures according to claim 10 or 11, characterized in that the preprocessing module comprises: a filtering submodule for performing noise reduction with a sliding filter; and a data-integration submodule for integrating each frame of the acquired data into a 32-dimensional vector.
15. The recognition device for complex gestures according to claim 10 or 11, characterized in that the recognition module comprises: a slope submodule for setting the slope constraint; an initialization submodule for initializing the distance map according to the programmed data structure and the slope constraint; a path-planning submodule for solving the optimal path distance of the map by dynamic programming; and a similarity-calculation submodule for calculating the similarity between the acquired data and each template gesture.
16. The recognition device for complex gestures according to claim 10 or 11, characterized in that the judgment-output module comprises: an output submodule for taking the template gesture with the highest similarity as the recognition result; and a judging submodule for setting a similarity threshold, rejecting recognition when the similarity is below the threshold, and outputting the identifier of the gesture when the similarity is not below the threshold.
17. The recognition device for complex gestures according to claim 10 or 11, characterized in that the data type of the data-acquisition module may be skeleton rotation information obtained from a vision-system device.
18. The recognition device for complex gestures according to claim 10 or 11, characterized in that the method by which the recognition module recognizes the preprocessed data may be a hidden Markov model, a neural network, or a similar algorithm.
CN201710103875.8A 2017-02-24 2017-02-24 Identification method and device suitable for complex gesture Active CN107272878B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710103875.8A CN107272878B (en) 2017-02-24 2017-02-24 Identification method and device suitable for complex gesture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710103875.8A CN107272878B (en) 2017-02-24 2017-02-24 Identification method and device suitable for complex gesture

Publications (2)

Publication Number Publication Date
CN107272878A true CN107272878A (en) 2017-10-20
CN107272878B CN107272878B (en) 2020-06-16

Family

ID=60052539

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710103875.8A Active CN107272878B (en) 2017-02-24 2017-02-24 Identification method and device suitable for complex gesture

Country Status (1)

Country Link
CN (1) CN107272878B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103984928A * 2014-05-20 2014-08-13 Guilin University of Electronic Technology Finger gesture recognition method based on depth image
KR20140139726A * 2013-05-28 2014-12-08 LSIS Co., Ltd. Apparatus for recognizing gesture in Human Machine Interface
CN104777775A * 2015-03-25 2015-07-15 Beijing University of Technology Two-wheeled self-balancing robot control method based on Kinect device
CN105807926A * 2016-03-08 2016-07-27 Sun Yat-sen University Unmanned aerial vehicle human-machine interaction method based on three-dimensional continuous gesture recognition
CN105809144A * 2016-03-24 2016-07-27 Chongqing University of Posts and Telecommunications Gesture recognition system and method adopting action segmentation
CN106055091A * 2016-05-16 2016-10-26 University of Electronic Science and Technology of China Hand posture estimation method based on depth information and calibration method
CN106227368A * 2016-08-03 2016-12-14 Beijing University of Technology Human joint angle calculation method and device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109948511A * 2019-03-14 2019-06-28 Guangdong Midea White Goods Technology Innovation Center Co., Ltd. Gesture recognition method and device
CN110110646A * 2019-04-30 2019-08-09 Zhejiang Sci-Tech University Gesture image key frame extraction method based on deep learning
CN110110646B (en) * 2019-04-30 2021-05-04 浙江理工大学 Gesture image key frame extraction method based on deep learning

Also Published As

Publication number Publication date
CN107272878B (en) 2020-06-16

Similar Documents

Publication Publication Date Title
WO2018040757A1 (en) Wearable device and method of using same to monitor motion state
CN101344816B Human-machine interaction method and device based on gaze tracking and gesture recognition
WO2018120964A1 (en) Posture correction method based on depth information and skeleton information
CN107908288A A fast human motion recognition method for human-computer interaction
Chen et al. A real-time dynamic hand gesture recognition system using kinect sensor
CN109597485B (en) Gesture interaction system based on double-fingered-area features and working method thereof
CN107168527A First-person-view gesture recognition and interaction method based on region-based convolutional neural networks
CN101976330B (en) Gesture recognition method and system
CN102402289B (en) Mouse recognition method for gesture based on machine vision
CN104484644B Gesture recognition method and device
CN106778477A (en) Tennis racket action identification method and device
CN108268132B (en) Gesture recognition method based on glove acquisition and man-machine interaction device
CN103926999A (en) Palm opening and closing gesture recognition method and device and man-machine interaction method and device
CN102640085A (en) System and method for recognizing gestures
CN114265498B (en) Method for combining multi-mode gesture recognition and visual feedback mechanism
CN107122707A Video pedestrian re-identification method and system based on compact representation of macroscopic features
CN106599762A (en) Motion information recognition method and system
CN106919958A A human finger action recognition method based on a smart watch
CN106625658A (en) Method for controlling anthropomorphic robot to imitate motions of upper part of human body in real time
CN106200971A Human-machine interaction system device and operation method based on gesture recognition
CN104731307A Body action recognition method and human-machine interaction device
CN104517100B (en) Gesture pre-judging method and system
CN106073793B Attitude tracking and recognition method based on micro-inertial sensors
CN110837792A (en) Three-dimensional gesture recognition method and device
Linqin et al. Dynamic hand gesture recognition using RGB-D data for natural human-computer interaction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201211

Address after: Room 01, 17 / F, Xingguang Yingjing, 117 Shuiyin Road, Yuexiu District, Guangzhou City, Guangdong Province 510075

Patentee after: GUANGZHOU HUANTEK Co.,Ltd.

Patentee after: DONGGUAN YILIAN INTERATION INFORMATION TECHNOLOGY Co.,Ltd.

Address before: Room 01, 17 / F, Xingguang Yingjing, 119 Shuiyin Road, Yuexiu District, Guangzhou City, Guangdong Province 510075

Patentee before: GUANGZHOU HUANTEK Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20230626

Address after: Room 01-011, Floor 3, No. 721, Tianhe North Road, Tianhe District, Guangzhou, Guangdong 510630 (office only)

Patentee after: GUANGZHOU HUANTEK Co.,Ltd.

Patentee after: Fantasy (Zhuhai) Technology Co.,Ltd.

Patentee after: DONGGUAN YILIAN INTERATION INFORMATION TECHNOLOGY Co.,Ltd.

Address before: Room 01, 17 / F, Xingguang Yingjing, 117 Shuiyin Road, Yuexiu District, Guangzhou City, Guangdong Province 510075

Patentee before: GUANGZHOU HUANTEK Co.,Ltd.

Patentee before: DONGGUAN YILIAN INTERATION INFORMATION TECHNOLOGY Co.,Ltd.
