CN103294996B - 3D gesture recognition method - Google Patents

3D gesture recognition method

Info

Publication number
CN103294996B
CN103294996B CN201310168123.1A
Authority
CN
China
Prior art keywords
gesture
finger
image
time
descriptor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310168123.1A
Other languages
Chinese (zh)
Other versions
CN103294996A (en)
Inventor
程洪
代仲君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN201310168123.1A priority Critical patent/CN103294996B/en
Publication of CN103294996A publication Critical patent/CN103294996A/en
Application granted granted Critical
Publication of CN103294996B publication Critical patent/CN103294996B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The present invention relates to the fields of computer vision and human-computer interaction, and specifically to a 3D gesture recognition method comprising the following steps: acquiring RGB images and depth images from an image input device as training samples; detecting and segmenting the human hand with an adaptive dynamic depth threshold; removing the arm by morphological image operations to obtain the palm center; extracting the gesture contour by edge extraction and describing it with a time-series curve; obtaining the fingertip positions and finger junction points in the gesture by morphological image operations, segmenting the curve at the corresponding positions on the time-series curve, and combining the per-finger time-series segments into a finger descriptor; and further specific steps. The invention realizes natural interaction with the computer, effectively extends traditional modes of human-computer interaction, and greatly improves the recognition rate, which can reach 99%.

Description

3D gesture recognition method
Technical field
The present invention relates to the fields of computer vision and human-computer interaction, and specifically to a 3D gesture recognition method.
Background art
With the development of computer technology, human-computer interaction has become an indispensable part of daily life, yet most interaction is still two-dimensional, based on the mouse, keyboard, handheld devices, and window interfaces; how to make interaction more natural has become a hot research topic in recent years. Gesture is one of the main means of human communication, with a history even older than spoken language, so using gestures for human-computer interaction is friendly, convenient, concise, and intuitive, and forms a natural extension of conventional interaction. To recognize a gesture it must first be sensed, and three kinds of gesture-sensing devices exist: devices based on handheld hardware, such as Microsoft's digital gesture ring and infrared gesture-sensing systems; devices based on touch, such as the iPhone; and devices based on vision, such as TOF cameras and Kinect.
Summary of the invention
The object of the present invention is to provide a 3D gesture recognition method that solves the problems that existing human-computer interaction methods are not convenient and fast enough and cannot operate in complex environments.
To solve the above technical problems, the present invention adopts the following technical solution:
A 3D gesture recognition method comprises the following steps:
Step 1: acquire RGB images and depth images from an image input device as training samples;
Step 2: detect and segment the human hand by setting an adaptive dynamic depth threshold;
Step 3: remove the arm by morphological image operations and obtain the palm center;
Step 4: extract the gesture contour by edge extraction and describe the gesture contour with a time-series curve;
Step 5: obtain the fingertip positions and finger junction points in the gesture by morphological image operations, segment the curve at the corresponding positions on the time-series curve, and combine the per-finger time-series segments into a finger descriptor feature vector;
Step 6: express each class of the gesture training samples as a finger descriptor feature matrix;
Step 7: acquire RGB images and depth images from the image input device in real time, and perform Steps 2 to 5 to express the real-time input gesture as a finger descriptor feature vector;
Step 8: classify the finger descriptor feature vector obtained in Step 7 against the feature matrices obtained in Step 6 by image-to-class dynamic time warping to finally obtain the gesture recognition result.
In a further technical scheme, in Step 2, the hand is assumed to be the frontmost object and an adaptive dynamic threshold is set: specifically, binarization with the adaptive dynamic threshold separates the human hand from the background, yielding a binarized gesture map.
In a further technical scheme, in Step 3, the arm is removed by morphological image operations and the palm center is obtained: specifically, the arm geometry is detected, its lowest point is selected as the wrist position, the part below the wrist is removed, erosion and dilation are applied to the remaining image to remove the fingers and obtain the palm, and the palm's geometric center is computed.
In a further technical scheme, in Step 4, the gesture contour is obtained by edge extraction: specifically, a circle centered at the palm's geometric center with the palm's circumscribed-circle radius covers the palm in the binary map, yielding a gesture binary map, from which edges are extracted to obtain the gesture contour map.
In a further technical scheme, in Step 4, a time-series curve is used to describe the gesture contour: specifically, angles and distances computed at the contour vertices capture the shape-related information contained in the data and are plotted as a time-series curve, realizing the extraction of shape features. For example, let the palm center be $P_0$, the contour start point be $P_1$, and the contour vertices be $P_i$ $(i = 2, \dots, n)$; then the angle $\angle P_1 P_0 P_i = \langle P_0 P_1, P_0 P_i \rangle$ $(i = 2, \dots, n)$, normalized by $360°$, serves as the abscissa of the time-series curve, and the Euclidean distance $|P_0 P_i|$ between contour vertex and palm center, normalized by $|P_0 P_i| \leftarrow |P_0 P_i| / \max\{|P_0 P_1|, |P_0 P_2|, \dots, |P_0 P_n|\}$, serves as the ordinate, yielding the time-series curve that describes the gesture contour.
In a further technical scheme, in Step 5, the fingertip positions and finger junction points in the gesture are obtained by morphological image operations to realize the segmentation of the time-series curve. Specifically, a polygon approximation of the gesture contour is computed, then its convex and concave points are detected, where convex points are fingertips and concave points are finger junctions; finally the detected points are filtered to obtain the fingertip and finger-junction positions, which are mapped onto the time-series curve to realize the segmentation of the curve. For example, the polygon approximation can be obtained with the approxPolyDP function in OpenCV, and the convex and concave points with the convexHull and convexityDefects functions; the detected points whose y coordinate lies below the palm center are all filtered out, realizing the segmentation into fingers.
In a further technical scheme, in Step 5, specific combinations of the per-finger time-series segments yield the finger descriptor: specifically, the finger curve segments obtained from the curve segmentation are combined into the finger descriptor feature vector $f = [f_1, f_2, \dots, f_s, \dots, f_S]$, where the index $s$ ranges over $s \in \{1, 2, \dots, k,\; 12, 23, \dots, (k-1)k,\; 123, 234, \dots, (k-2)(k-1)k,\; \dots,\; 123 \cdots k\}$, with $k \in \{1, 2, \dots, K\}$ and $K$ the maximum number of fingers contained in any gesture.
In a further technical scheme, in Step 6, each class of the gesture training samples is expressed as a finger descriptor feature matrix composed of the finger descriptor feature vectors of all training samples of that class. The matrix $G_c$ of finger descriptor feature vectors is an $N_c \times M_c$ matrix, where $c$ denotes a gesture class, $f_{c,n}$ is the $n$-th training sample of class $c$, $N_c$ is the number of training samples, and $M_c$ is the total number of finger descriptors of class $c$, which varies with $c$.
In a further technical scheme, in Step 7, RGB images and depth images are acquired from the image input device in real time, and Steps 2 to 5 are performed to express the real-time input gesture as the finger descriptor feature vector $f_{test} = [f'_1, f'_2, \dots, f'_s, \dots, f'_S]$.
In a further technical scheme, in Step 8, the image-to-class dynamic time warping classification specifically computes the image-to-class dynamic time warping between the test sample and the training samples of each class, obtaining the similarity between the test sample and each class of training samples; the class with the highest similarity, i.e., the shortest image-to-class dynamic time warping path, is taken as the gesture type of the test sample.
Specifically, dynamic time warping is computed between the test data and the training samples:

$$\mathrm{I2C\text{-}DTW}(G_c, f_{test}) = \sum_{s=1}^{S} \min_{n \in \{1, 2, \dots, N_c\}} \{ \mathrm{DTW}(f_{c,n,s}, f'_s) \}$$

where $\mathrm{DTW}(f_{c,n,s}, f'_s)$ denotes the shortest warping path between $f_{c,n,s}$ and $f'_s$, and $f_{c,n,s}$ is one combination of fingers in $f_{c,n}$; finally the minimum $\mathrm{I2C\text{-}DTW}(G_c, f_{test})$ selects the gesture type corresponding to the test data, i.e., the gesture recognition result.
Compared with the prior art, the beneficial effects of the invention are:
The present invention lets the user interact with the computer through gesture information: the user's hand contour information supplements the traditional keyboard-and-mouse mode and enriches the modes of human-computer interaction. Using only a Kinect to acquire, in real time, images containing the user's hand, it analyzes the hand information on the computer and converts the analysis results into control instructions for an application, realizing natural interaction with the computer; this effectively extends traditional human-computer interaction, and the recognition rate is much improved, reaching 99%.
Description of the drawings
Fig. 1 is an overall flowchart and example diagram of the 3D gesture recognition method of the present invention.
Fig. 2 illustrates hand detection and segmentation, gesture contour extraction, and the corresponding feature extraction in the 3D gesture recognition method of the present invention.
Fig. 3 shows a training-sample finger descriptor feature matrix in an embodiment of the 3D gesture recognition method of the present invention.
Fig. 4 shows a test-data finger descriptor feature vector in an embodiment of the 3D gesture recognition method of the present invention.
Embodiments
To make the object, technical solution, and advantages of the present invention clearer, the present invention is further elaborated below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention and are not intended to limit it.
Fig. 1 shows an embodiment of the 3D gesture recognition method of the present invention; the middle of Fig. 1 is a block diagram and the two sides are concrete examples. The 3D gesture recognition method comprises the following steps:
Step 1: acquire RGB images and depth images from an image input device as training samples;
Step 2: detect and segment the human hand by setting an adaptive dynamic depth threshold;
Step 3: remove the arm by morphological image operations and obtain the palm center;
Step 4: extract the gesture contour by edge extraction and describe the gesture contour with a time-series curve;
Step 5: obtain the fingertip positions and finger junction points in the gesture by morphological image operations, segment the curve at the corresponding positions on the time-series curve, and combine the per-finger time-series segments into a finger descriptor feature vector;
Step 6: express each class of the gesture training samples as a finger descriptor feature matrix;
Step 7: acquire RGB images and depth images from the image input device in real time, and perform Steps 2 to 5 to express the real-time input gesture as a finger descriptor feature vector;
Step 8: classify the finger descriptor feature vector obtained in Step 7 against the feature matrices obtained in Step 6 by image-to-class dynamic time warping (Image-to-Class Dynamic Time Warping, I2C-DTW) to obtain the gesture recognition result.
According to a preferred embodiment of the 3D gesture recognition method of the present invention, in Step 2, the hand is assumed to be the frontmost object and an adaptive dynamic threshold is set: specifically, binarization with the adaptive dynamic threshold separates the human hand from the background, yielding the binarized gesture map shown in Fig. 2(a). The dynamic threshold is obtained as follows: face detection on the color image yields the face coordinates; the corresponding region of the depth image gives the face's average depth value, which separates the person's body from the background; Otsu's automatic thresholding algorithm is then applied.
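The following is a minimal sketch of this segmentation step in Python with OpenCV, assuming an 8-bit depth map in which smaller gray values mean closer objects; the function name and the face_depth parameter are illustrative, not from the patent.

import cv2
import numpy as np

def segment_hand(depth, face_depth):
    # depth: uint8 depth map, smaller gray value = closer to the camera
    # face_depth: average depth value of the detected face region
    body = depth.copy()
    # push invalid pixels and everything at or behind the face to "far"
    body[(depth == 0) | (depth >= face_depth)] = 255
    # Otsu picks the split between the frontmost object (the hand) and the
    # rest; THRESH_BINARY_INV maps the closer (darker) hand pixels to 255
    _, hand_mask = cv2.threshold(body, 0, 255,
                                 cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return hand_mask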
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 3, the arm is removed by morphological image operations and the palm center is obtained. Specifically, as shown in Fig. 2(b), the arm geometry is detected, its lowest point is selected as the wrist position, the part below the wrist is removed, erosion and dilation are applied to the remaining image to remove the fingers and obtain the palm, and the palm's geometric center is computed.
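A minimal sketch of the erode-then-dilate step and the palm-center computation, again in Python with OpenCV; the kernel size is an illustrative guess that would be tuned to the image resolution.

import cv2

def palm_center(hand_mask):
    # erosion followed by dilation (morphological opening) strips the
    # thin fingers from the binary hand mask, leaving the palm blob
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
    palm = cv2.morphologyEx(hand_mask, cv2.MORPH_OPEN, kernel)
    # image moments give the geometric center of the remaining palm region
    m = cv2.moments(palm, binaryImage=True)
    if m["m00"] == 0:
        return None, palm
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])), palm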
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 4, the gesture contour is obtained by edge extraction: specifically, a circle centered at the palm's geometric center with the palm's circumscribed-circle radius covers the palm in the binary map, yielding the gesture binary map of Fig. 2(c); edges are then extracted from the gesture binary map to obtain the gesture contour map of Fig. 2(d).
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 4, a time-series curve is used to describe the gesture contour: specifically, angles and distances computed at the contour vertices capture the shape-related information contained in the data and are plotted as a time-series curve, realizing the extraction of shape features. For example, let the palm center be $P_0$, the contour start point be $P_1$, and the contour vertices be $P_i$ $(i = 2, \dots, n)$; then the angle $\angle P_1 P_0 P_i = \langle P_0 P_1, P_0 P_i \rangle$ $(i = 2, \dots, n)$, normalized by $360°$, serves as the abscissa of the time-series curve, and the Euclidean distance $|P_0 P_i|$ between contour vertex and palm center, normalized by $|P_0 P_i| \leftarrow |P_0 P_i| / \max\{|P_0 P_1|, |P_0 P_2|, \dots, |P_0 P_n|\}$, serves as the ordinate, yielding the time-series curve that describes the gesture contour, as in Fig. 2(e). The time-series curve is thus converted from the gesture contour so as to capture the shape-related information contained in the data, realizing the extraction of shape features.
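A minimal sketch of this conversion, assuming the contour is an (n, 2) array of vertex coordinates; all names are illustrative.

import numpy as np

def contour_to_time_series(contour, palm_center):
    p0 = np.asarray(palm_center, dtype=float)
    v = contour.astype(float) - p0            # vectors from P_0 to each P_i
    # angle of each vertex relative to the first contour point P_1
    ang = np.degrees(np.arctan2(v[:, 1], v[:, 0]))
    x = np.mod(ang - ang[0], 360.0) / 360.0   # abscissa: angle normalized by 360 degrees
    dist = np.linalg.norm(v, axis=1)
    y = dist / dist.max()                     # ordinate: distance normalized by the maximum
    order = np.argsort(x)                     # traverse the curve in angular order
    return x[order], y[order]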
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 5, the fingertip positions and finger junction points in the gesture are obtained by morphological image operations to realize the segmentation of the time-series curve. Specifically, a polygon approximation of the gesture contour is computed, then its convex and concave points are detected, where convex points are fingertips and concave points are finger junctions; finally the detected points are filtered to obtain the fingertip and finger-junction positions, which are mapped onto the time-series curve to realize the segmentation of the curve. For example, the polygon approximation can be obtained with the approxPolyDP function in OpenCV, and the convex and concave points with the convexHull and convexityDefects functions; the detected points whose y coordinate lies below the palm center are all filtered out, realizing the segmentation into fingers.
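A minimal sketch of this detection using the OpenCV functions named above; the approximation epsilon and the y-based filter are illustrative choices.

import cv2
import numpy as np

def fingertips_and_valleys(contour, palm_center):
    cx, cy = palm_center
    # polygon approximation of the gesture contour
    eps = 0.01 * cv2.arcLength(contour, True)
    poly = cv2.approxPolyDP(contour, eps, True)
    # convex hull vertices are fingertip candidates
    hull_idx = cv2.convexHull(poly, returnPoints=False)
    hull_idx = np.sort(hull_idx, axis=0)   # convexityDefects wants monotonic indices
    tips = [tuple(poly[i][0]) for i in hull_idx.flatten()]
    # convexity defects between hull points are finger-junction candidates
    valleys = []
    defects = cv2.convexityDefects(poly, hull_idx)
    if defects is not None:
        valleys = [tuple(poly[f][0]) for _, _, f, _ in defects[:, 0]]
    # keep only points above the palm center (image y grows downward),
    # discarding wrist-side artifacts as described in the text
    tips = [p for p in tips if p[1] < cy]
    valleys = [p for p in valleys if p[1] < cy]
    return tips, valleys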
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 5, specific combinations of the per-finger time-series segments yield the finger descriptor: specifically, the finger curve segments obtained from the curve segmentation are combined into the finger descriptor feature vector $f = [f_1, f_2, \dots, f_s, \dots, f_S]$, where the index $s$ ranges over $s \in \{1, 2, \dots, k,\; 12, 23, \dots, (k-1)k,\; 123, 234, \dots, (k-2)(k-1)k,\; \dots,\; 123 \cdots k\}$, with $k \in \{1, 2, \dots, K\}$ and $K$ the maximum number of fingers contained in any gesture.
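The index set s is every contiguous run of finger indices: single fingers, adjacent pairs, adjacent triples, and so on up to all k fingers. A minimal, purely illustrative enumeration:

def contiguous_finger_combinations(k):
    combos = []
    for length in range(1, k + 1):
        for start in range(1, k - length + 2):
            combos.append(tuple(range(start, start + length)))
    return combos

# e.g. contiguous_finger_combinations(3)
# -> [(1,), (2,), (3,), (1, 2), (2, 3), (1, 2, 3)]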
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 6, each class of the gesture training samples is expressed as a finger descriptor feature matrix composed of the finger descriptor feature vectors of all training samples of that class. The matrix $G_c$ of finger descriptor feature vectors is an $N_c \times M_c$ matrix, where $c$ denotes a gesture class, $f_{c,n}$ is the $n$-th training sample of class $c$, $N_c$ is the number of training samples, and $M_c$ is the total number of finger descriptors of class $c$, which varies with $c$.
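Because each descriptor entry is itself a variable-length curve segment, a nested structure can stand in for G_c in practice; the following illustrative sketch groups descriptor lists by class, one row per training sample.

def build_class_matrices(training_samples):
    # training_samples: iterable of (class_label, descriptor_list) pairs,
    # where descriptor_list holds the finger-descriptor curves of one sample
    G = {}
    for c, descriptors in training_samples:
        G.setdefault(c, []).append(descriptors)   # G[c] has N_c rows
    return G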
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 7, RGB images and depth images are acquired from the image input device in real time, and Steps 2 to 5 are performed to express the real-time input gesture as the finger descriptor feature vector $f_{test} = [f'_1, f'_2, \dots, f'_s, \dots, f'_S]$, as shown in Fig. 4.
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 8, the image-to-class dynamic time warping classification specifically computes the image-to-class dynamic time warping between the test sample and the training samples of each class, obtaining the similarity between the test sample and each class of training samples; the class with the highest similarity, i.e., the shortest image-to-class dynamic time warping path, is taken as the gesture type of the test sample.
Specifically, dynamic time warping is computed between the test data and the training samples:

$$\mathrm{I2C\text{-}DTW}(G_c, f_{test}) = \sum_{s=1}^{S} \min_{n \in \{1, 2, \dots, N_c\}} \{ \mathrm{DTW}(f_{c,n,s}, f'_s) \}$$

where $\mathrm{DTW}(f_{c,n,s}, f'_s)$ denotes the shortest warping path between $f_{c,n,s}$ and $f'_s$, and $f_{c,n,s}$ is one combination of fingers in $f_{c,n}$; finally the minimum $\mathrm{I2C\text{-}DTW}(G_c, f_{test})$ selects the gesture type corresponding to the test data, i.e., the gesture recognition result.
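A minimal sketch of this classification with a textbook dynamic-time-warping distance, reusing the per-class structure G sketched under Step 6; it assumes each training sample carries a segment for every index s of the test descriptor, whereas the patent lets M_c vary with the class. All names are illustrative.

import numpy as np

def dtw(a, b):
    # shortest warping-path cost between two 1-D sequences
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def i2c_dtw(G_c, f_test):
    # sum over segments s of the class-wide minimum DTW distance
    return sum(min(dtw(sample[s], seg) for sample in G_c)
               for s, seg in enumerate(f_test))

def classify(G, f_test):
    # the class with the smallest I2C-DTW distance is the recognition result
    return min(G, key=lambda c: i2c_dtw(G[c], f_test))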
In addition, the dynamic threshold is computed as follows: detect the $k$ points with the smallest gray values $r_i$ $(i = 1, 2, \dots, k)$ in the depth image, removing noise points with gray values of 0 to 10 ($k = 100$ in this patent); the dynamic threshold is then derived from these $k$ values together with an offset $\rho$, where $\rho$ is an experimentally estimated value that the user can change as the case requires.
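The exact threshold formula is not reproduced in the text, so the following sketch uses the mean of the k smallest valid depth values plus the offset rho as a plausible stand-in; k = 100 and the 0-10 noise band follow the patent, everything else is an assumption.

import numpy as np

def dynamic_threshold(depth, k=100, rho=5.0):
    vals = depth[depth > 10]   # drop gray values 0-10, treated as noise
    r = np.sort(vals)[:k]      # the k smallest (closest) depth values
    return r.mean() + rho      # rho: experimentally estimated offset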
In each of the above embodiments, the RGB image may be a color image, and the depth image may be a grayscale image representing the distance of objects from the image input device, with larger gray values meaning greater distance; the relation between gray value and distance depends on the specific image input device.
The present invention comprises a hand detection technology, a hand recognition technology, and a complete system that integrates the two; both technologies and their integrated application system achieve real-time, stable performance against natural, complex backgrounds. Hand detection combines depth information with color information and exploits the particular characteristics of the human hand to accurately detect the hand and wrist and to segment the hand's position out of the image. The image-to-class dynamic time warping algorithm creatively merges image-to-class matching with dynamic time warping and can accurately obtain the state of the gesture in each frame, including the number of fingers, their positions, and the fingertip angles. The two technologies combine to build a human-machine gesture interaction system. The present invention can be widely applied to smart homes, healthcare, education, computer games, and other fields.

Claims (9)

1. A 3D gesture recognition method, characterized by comprising the following steps:
Step 1: acquiring RGB images and depth images from an image input device as training samples;
Step 2: detecting and segmenting the human hand by setting an adaptive dynamic depth threshold;
Step 3: removing the arm by morphological image operations and obtaining the palm center;
Step 4: extracting the gesture contour by edge extraction and describing the gesture contour with a time-series curve;
Step 5: obtaining the fingertip positions and finger junction points in the gesture by morphological image operations, segmenting the curve at the corresponding positions on the time-series curve, and combining the per-finger time-series segments into a finger descriptor feature vector;
Step 6: expressing each class of the gesture training samples as a finger descriptor feature matrix;
Step 7: acquiring RGB images and depth images from the image input device in real time, and performing Steps 2 to 5 to express the real-time input gesture as a finger descriptor feature vector;
Step 8: classifying the finger descriptor feature vector obtained in Step 7 against the feature matrices obtained in Step 6 by image-to-class dynamic time warping, finally obtaining the gesture recognition result;
wherein in Step 2, the hand is assumed to be the frontmost object and an adaptive dynamic threshold is set: specifically, binarization with the adaptive dynamic threshold separates the human hand from the background, yielding a binarized gesture map.
2. The 3D gesture recognition method according to claim 1, characterized in that: in Step 3, the arm is removed by morphological image operations and the palm center is obtained; specifically, the arm geometry is detected, its lowest point is selected as the wrist position, the part below the wrist is removed, erosion and dilation are applied to the remaining image to remove the fingers and obtain the palm, and the palm's geometric center is computed.
3. The 3D gesture recognition method according to claim 1, characterized in that: in Step 4, the gesture contour is obtained by edge extraction; specifically, a circle centered at the palm's geometric center with the palm's circumscribed-circle radius covers the palm in the binary map, yielding a gesture binary map, from which edges are extracted to obtain the gesture contour map.
4. The 3D gesture recognition method according to any one of claims 1 to 3, characterized in that: in Step 4, a time-series curve is used to describe the gesture contour; specifically, angles and distances computed at the contour vertices capture the shape-related information contained in the data and are plotted as a time-series curve, realizing the extraction of shape features.
5. The 3D gesture recognition method according to claim 4, characterized in that: in Step 5, the fingertip positions and finger junction points in the gesture are obtained by morphological image operations to realize the segmentation of the time-series curve; specifically, a polygon approximation of the gesture contour is computed, its convex and concave points are detected, where convex points are fingertips and concave points are finger junctions, and the detected points are filtered to obtain the fingertip and finger-junction positions, which are mapped onto the time-series curve to realize the segmentation of the curve.
6. The 3D gesture recognition method according to claim 5, characterized in that: in Step 5, specific combinations of the per-finger time-series segments yield the finger descriptor; specifically, the finger curve segments obtained from the curve segmentation are combined into the finger descriptor feature vector.
7. The 3D gesture recognition method according to claim 6, characterized in that: in Step 6, each class of the gesture training samples is expressed as a finger descriptor feature matrix, which is composed of the finger descriptor feature vectors of all training samples of that class.
8. The 3D gesture recognition method according to claim 7, characterized in that: in Step 7, RGB images and depth images are acquired from the image input device in real time, and Steps 2 to 5 are performed to express the real-time input gesture as a finger descriptor feature vector.
9. The 3D gesture recognition method according to claim 8, characterized in that: in Step 8, the image-to-class dynamic time warping classification specifically computes the image-to-class dynamic time warping between the test sample and the training samples of each class, obtaining the similarity between the test sample and each class of training samples; the class with the highest similarity, i.e., the shortest image-to-class dynamic time warping path, is taken as the gesture type of the test sample.
CN201310168123.1A 2013-05-09 2013-05-09 3D gesture recognition method Active CN103294996B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310168123.1A CN103294996B (en) 2013-05-09 2013-05-09 3D gesture recognition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310168123.1A CN103294996B (en) 2013-05-09 2013-05-09 3D gesture recognition method

Publications (2)

Publication Number Publication Date
CN103294996A CN103294996A (en) 2013-09-11
CN103294996B true CN103294996B (en) 2016-04-27

Family

ID=49095827

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310168123.1A Active CN103294996B (en) 2013-05-09 2013-05-09 3D gesture recognition method

Country Status (1)

Country Link
CN (1) CN103294996B (en)

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104375631A (en) * 2013-10-22 2015-02-25 安徽寰智信息科技股份有限公司 Non-contact interaction method based on mobile terminal
KR101534742B1 (en) * 2013-12-10 2015-07-07 현대자동차 주식회사 System and method for gesture recognition of vehicle
CN103679213B (en) * 2013-12-13 2017-02-08 电子科技大学 3D gesture recognition method
TWI528271B (en) * 2013-12-16 2016-04-01 緯創資通股份有限公司 Method, apparatus and computer program product for polygon gesture detection and interaction
KR101526426B1 * 2013-12-31 2015-06-05 현대자동차 주식회사 Gesture recognition apparatus and method
RU2014101965A * 2014-01-22 2015-07-27 LSI Corporation Image processor comprising a gesture recognition system with static hand-pose recognition based on dynamic time warping
CN104978012B 2014-04-03 2018-03-16 华为技术有限公司 Pointing interaction method, apparatus and system
CN104268138B * 2014-05-15 2017-08-15 西安工业大学 Human body motion capture method fusing depth maps and three-dimensional models
CN105336005B * 2014-06-27 2018-12-14 华为技术有限公司 Method, apparatus and terminal for obtaining sign data of a target object
CN105354812B * 2014-07-10 2020-10-16 北京中科盘古科技发展有限公司 Contour recognition interaction method based on multi-Kinect cooperation and a depth-threshold segmentation algorithm
CN105654103B (en) * 2014-11-12 2020-03-24 联想(北京)有限公司 Image identification method and electronic equipment
CN104333794A (en) * 2014-11-18 2015-02-04 电子科技大学 Channel selection method based on depth gestures
CN104766055A (en) * 2015-03-26 2015-07-08 济南大学 Method for removing wrist image in gesture recognition
CN104899600B * 2015-05-28 2018-07-17 北京工业大学 Hand feature point detection method based on depth map
CN105320937B * 2015-09-25 2018-08-14 北京理工大学 Kinect-based traffic police gesture recognition method
CN106610716B 2015-10-21 2019-08-27 华为技术有限公司 Gesture recognition method and device
CN105893929A (en) * 2015-12-27 2016-08-24 乐视致新电子科技(天津)有限公司 Finger and wrist distinguishing method and device
CN105739702B * 2016-01-29 2019-01-22 电子科技大学 Multi-pose fingertip tracking method for natural human-computer interaction
CN107292904B * 2016-03-31 2018-06-15 北京市商汤科技开发有限公司 Palm tracking method and system based on depth images
CN106446911B * 2016-09-13 2018-09-18 李志刚 Human hand recognition method based on image edge sequences and distance features
CN106503620A * 2016-09-26 2017-03-15 深圳奥比中光科技有限公司 Gesture-based numeric password input method and system
CN107977070B (en) * 2016-10-25 2021-09-28 中兴通讯股份有限公司 Method, device and system for controlling virtual reality video through gestures
CN106529480A * 2016-11-14 2017-03-22 江汉大学 Fingertip detection and gesture recognition method and system based on depth information
DE102017210317A1 (en) * 2017-06-20 2018-12-20 Volkswagen Aktiengesellschaft Method and device for detecting a user input by means of a gesture
CN107491763A * 2017-08-24 2017-12-19 歌尔科技有限公司 Finger region segmentation method and device based on depth images
CN108255298B (en) * 2017-12-29 2021-02-19 安徽慧视金瞳科技有限公司 Infrared gesture recognition method and device in projection interaction system
CN110874179B (en) * 2018-09-03 2021-09-14 京东方科技集团股份有限公司 Fingertip detection method, fingertip detection device, and medium
CN109375766A * 2018-09-13 2019-02-22 何艳玲 Learning method based on gesture control
CN110046603B (en) * 2019-04-25 2020-11-27 合肥工业大学 Gesture action recognition method for Chinese pule sign language coding
CN111309149B (en) * 2020-02-21 2022-08-19 河北科技大学 Gesture recognition method and gesture recognition device
CN111466882A (en) * 2020-04-23 2020-07-31 上海祉云医疗科技有限公司 Intelligent traditional Chinese medicine hand diagnosis analysis system and method
CN111723698A * 2020-06-05 2020-09-29 中南民族大学 Method and device for controlling lighting based on gestures
CN111753771A (en) * 2020-06-29 2020-10-09 武汉虹信技术服务有限责任公司 Gesture event recognition method, system and medium
CN112507924B (en) * 2020-12-16 2024-04-09 深圳荆虹科技有限公司 3D gesture recognition method, device and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7308112B2 (en) * 2004-05-14 2007-12-11 Honda Motor Co., Ltd. Sign based human-machine interaction

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102769802A (en) * 2012-06-11 2012-11-07 西安交通大学 Man-machine interactive system and man-machine interactive method of smart television
CN102968178A (en) * 2012-11-07 2013-03-13 电子科技大学 Gesture-based PPT (Power Point) control system
CN102945079A (en) * 2012-11-16 2013-02-27 武汉大学 Intelligent recognition and control-based stereographic projection system and method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Human-computer interaction technology framework based on multi-point gesture recognition; Li Wensheng et al.; Computer Engineering and Design; 2011-06-30; Vol. 32, No. 6; full text *
Research on vision-based gesture recognition technology; Jia Jianjun; China Master's Theses Full-text Database (Electronic Journal); 2011-12-15; pp. 15-42 *

Also Published As

Publication number Publication date
CN103294996A (en) 2013-09-11

Similar Documents

Publication Publication Date Title
CN103294996B (en) 3D gesture recognition method
JP6079832B2 (en) Human computer interaction system, hand-to-hand pointing point positioning method, and finger gesture determination method
EP2802975B1 (en) Intelligent touchscreen keyboard with finger differentiation
Feng et al. Features extraction from hand images based on new detection operators
CN104899600A (en) Depth map based hand feature point detection method
RU2013154102A (en) FINGER RECOGNITION AND TRACKING SYSTEM
Zhu et al. Vision based hand gesture recognition using 3D shape context
CN103971102A (en) Static gesture recognition method based on finger contour and decision-making trees
CN103226388A (en) Kinect-based handwriting method
CN105045399A (en) Electronic device with 3D camera assembly
Wu et al. Vision-based fingertip tracking utilizing curvature points clustering and hash model representation
WO2017114002A1 (en) Device and method for inputting one-dimensional handwritten text
CN102402289A (en) Gesture mouse recognition method based on machine vision
CN111414837A (en) Gesture recognition method and device, computer equipment and storage medium
CN103092437A (en) Portable touch interactive system based on image processing technology
CN106503619B (en) Gesture recognition method based on BP neural network
CN105242776A (en) Control method for intelligent glasses and intelligent glasses
US20140105453A1 (en) Gesture identification with natural images
CN105068662A (en) Electronic device used for man-machine interaction
CN105046249B (en) Man-machine interaction method
US9529446B2 (en) Re-anchorable virtual panel in three-dimensional space
CN109189219A (en) Implementation method of a contactless virtual mouse based on gesture recognition
Choudhury et al. A CNN-LSTM based ensemble framework for in-air handwritten Assamese character recognition
CN107450717B (en) Information processing method and wearable device
CN102194097A (en) Multifunctional method for identifying hand gestures

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant