CN102467657A - Gesture recognizing system and method - Google Patents

Gesture recognizing system and method

Info

Publication number
CN102467657A
CN102467657A
Authority
CN
China
Prior art keywords
gesture
finger tip
image
template
point
Prior art date
Legal status
Pending
Application number
CN2010105518019A
Other languages
Chinese (zh)
Inventor
王西颖
任海兵
Current Assignee
Beijing Samsung Telecommunications Technology Research Co Ltd
Samsung Electronics Co Ltd
Original Assignee
Beijing Samsung Telecommunications Technology Research Co Ltd
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Samsung Telecommunications Technology Research Co Ltd, Samsung Electronics Co Ltd filed Critical Beijing Samsung Telecommunications Technology Research Co Ltd
Priority to CN2010105518019A priority Critical patent/CN102467657A/en
Publication of CN102467657A publication Critical patent/CN102467657A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention provides a gesture recognition system and method. The gesture recognition method comprises the steps of: selecting gesture templates; generating applicable rules according to the selected gesture templates; capturing images, wherein a gesture image is contained in the images; and recognizing and matching the gesture image according to the selected gesture templates and the generated applicable rules. The gesture recognition system and method of the invention can be used with large-scale display equipment, so that people can interact better with large displays through gestures.

Description

Gesture recognition system and method
Technical field
The present invention relates to interaction between a user and a computer and, more particularly, to gesture recognition for immersive interaction with large-scale display devices.
Background art
Virtual reality based on immersive large-scale display devices requires convenient means of human-computer interaction. Unfortunately, current human-computer interaction poses obstacles even for simple user interactions, which makes multimedia and virtual reality applications difficult for consumers to accept. For example, a computer keyboard provides extensive interaction capabilities but is not intuitive; a TV remote control, by comparison, is more intuitive but provides only limited interaction capabilities. There are also flexible display interfaces, but they are inconvenient and expensive.
A large display provides people with an immersive environment, so interaction between the human hand and the display is very convenient. At present, most existing systems consider only the position or motion of the hand and ignore the meaning expressed by its shape. In fact, because different finger configurations express different meanings, the shape of the hand is a very useful and effective channel for human-computer interaction.
The present invention therefore provides a system and method for interacting with a large display through gesture recognition.
Summary of the invention
An object of the present invention is to provide an efficient gesture recognition system and method.
According to an aspect of the present invention, there is provided a gesture recognition method comprising: selecting gesture templates; generating applicable rules according to the selected gesture templates; capturing an image, the image containing a gesture image; and recognizing and matching the gesture image according to the selected gesture templates and the generated applicable rules.
According to another aspect of the present invention, there is provided a gesture recognition system comprising: a mode selection module that selects gesture templates; a rule module that generates applicable rules according to the selected gesture templates; an image capture module that captures an image containing a gesture image; and a recognition and matching module that recognizes and matches the gesture image according to the selected gesture templates and the generated applicable rules.
The gesture recognition system and method of the present invention achieve a high matching accuracy, enabling people to interact better with large displays through gestures.
Description of drawings
Fig. 1 is a block diagram showing the structure of a gesture recognition system for immersive interaction with a large-scale display device according to an exemplary embodiment of the present invention.
Fig. 2 is a flowchart of the process by which the mode selection module 1 selects gesture templates according to an exemplary embodiment of the present invention.
Fig. 3 is a flowchart of the hand-shape recognition process performed by the recognition and matching module 6 according to an exemplary embodiment of the present invention.
Fig. 4 is a flowchart of the hand detection process of step 601.
Fig. 5A shows the histogram of all depth values, and Fig. 5B shows the segmentation result obtained by the threshold method of step 6013.
Fig. 6 is a diagram illustrating the extraction of fingertip feature points in step 603 of Fig. 3.
Fig. 7 is a flowchart of the process of matching a hand-shape sample against a gesture template according to the present invention.
Fig. 8(a) shows the shape context feature of an anchor point, Fig. 8(b) illustrates rotation invariance, and Fig. 8(c) shows a finally matched gesture template and gesture sample according to the present invention.
Fig. 9 shows a gesture recognition method according to an exemplary embodiment of the present invention.
Fig. 10 shows a gesture set selected according to an exemplary embodiment of the present invention.
Embodiment
Fig. 1 is a block diagram showing the structure of a gesture recognition system for immersive interaction with a large-scale display device according to an exemplary embodiment of the present invention. The gesture recognition system comprises a mode selection module 1, a rule module 2, a pattern database 3, a rule database 4, an image capture module 5, and a recognition and matching module 6. The mode selection module 1 outputs the selected gesture templates and stores them in the pattern database 3, while also notifying the rule module 2 of the available templates. The rule module 2 generates applicable rules and stores them in the rule database 4. The image capture module 5 captures images and sends them to the recognition and matching module 6, which performs gesture recognition. In addition, the pattern database 3 supplies the gesture templates, and the rule database 4 supplies the applicable rules, to the recognition and matching module 6.
In the present invention, the image capture module 5 can be realized with, for example, two kinds of cameras: a CCD camera and a TOF depth camera. The CCD (charge-coupled device) camera provides color and gray-scale information, and the TOF (time-of-flight) depth camera provides depth information about the scene.
Fig. 9 shows a gesture recognition method according to an exemplary embodiment of the present invention.
In step 901, the mode selection module 1 outputs the selected gesture templates and stores them in the pattern database 3.
In step 902, the rule module 2 generates applicable rules according to the selected gesture templates and stores the rules in the rule database 4.
In step 903, the image capture module 5 captures images and sends them to the recognition and matching module 6.
In step 904, the recognition and matching module 6 recognizes and matches the gesture image in the captured images according to the gesture templates in the pattern database 3 and the applicable rules in the rule database 4.
Although step 903 is shown in Fig. 9 as following steps 901 and 902, step 903 may also be performed simultaneously with them. The order of steps shown in Fig. 9 is merely exemplary and may be modified by those skilled in the art as circumstances require.
The operation of the mode selection module 1 is described in detail below with reference to Fig. 2.
Fig. 2 is a flowchart of the process by which the mode selection module 1 selects gesture templates according to an exemplary embodiment of the present invention.
A gesture pattern is a gesture template for hand-shape matching. Because there are many different hand shapes, some of them are similar and not easy to distinguish. It is therefore necessary to first select, as templates, gestures that are easy to distinguish.
Referring to Fig. 2, in step 101 the mode selection module 1 sets up several candidate gesture templates, whose number is greater than the desired number of templates. In step 102, the similarity distance between each pair of candidate templates is computed: the mode selection module 1 extracts a feature value for each candidate template and computes the pairwise similarity distances. The template feature may be any general image contour feature, for example a Fourier descriptor feature or a contour-moments feature. Then, for each template, the sum of its similarity distances to all other templates is computed, and the candidate template with the smallest sum (that is, the least distinctive one) is eliminated. This is repeated until the number of remaining candidates equals the desired number. In this way, the desired number of candidate templates is selected in step 103. Subsequently, in step 104, the selected candidate templates are stored in the pattern database 3.
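The elimination loop described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: `distance` stands in for the pairwise similarity distance between template features (e.g. between Fourier-descriptor vectors), and the function name is an assumption.

```python
def select_templates(distance, candidates, target_count):
    """Greedy elimination: repeatedly drop the candidate whose summed
    similarity distance to the remaining candidates is smallest,
    i.e. the template that is least distinctive."""
    pool = list(candidates)
    while len(pool) > target_count:
        # Sum of distances from each template to all others in the pool.
        sums = [sum(distance(t, u) for j, u in enumerate(pool) if j != i)
                for i, t in enumerate(pool)]
        pool.pop(sums.index(min(sums)))  # eliminate the least distinctive one
    return pool
```

With scalar stand-in features and an absolute-difference distance, the templates that survive are the mutually most dissimilar ones.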
The operation of the rule module 2 is described in detail below. In the present invention, rules are defined to assist gesture recognition. The mode selection module 1 notifies the rule module 2 of the available gesture templates (that is, the selected candidate templates). For each gesture template, the rule module 2 extracts features of the template, for example the number of fingertips and the angle between each fingertip and the line to the palm center. As an example of rule generation: if the posture type is A, then there are two fingertips, and the angle between the two lines connecting the fingertips to the palm center lies between 15 and 45 degrees. The rule module 2 stores the generated rules in the rule database 4.
After the recognition and matching module 6 has recognized a gesture, it verifies the recognition result against these rules.
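The example rule for posture type A (exactly two fingertips, with the angle between the two fingertip-to-palm-center lines in the 15 to 45 degree range) can be sketched as a verification predicate. The function names and the 2-D coordinate convention are illustrative assumptions, not from the patent.

```python
import math

def angle_between(v1, v2):
    """Angle in degrees between two 2-D vectors."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def check_rule_two_fingers(fingertips, palm, lo=15.0, hi=45.0):
    """Rule for the example pose: exactly two fingertips, and the angle
    between the two fingertip-to-palm-center lines lies in [lo, hi] degrees."""
    if len(fingertips) != 2:
        return False
    (x1, y1), (x2, y2) = fingertips
    v1 = (x1 - palm[0], y1 - palm[1])
    v2 = (x2 - palm[0], y2 - palm[1])
    return lo <= angle_between(v1, v2) <= hi
```

A recognized pose would pass through a check of this kind before its label is accepted.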
The operation of the recognition and matching module 6 is described in detail below with reference to Fig. 3.
Fig. 3 is a flowchart of the hand-shape recognition process performed by the recognition and matching module 6 according to an exemplary embodiment of the present invention.
In step 601, the recognition and matching module 6 receives the captured image from the image capture module 5, detects and segments the hand, and outputs the segmentation result of the hand region. Step 601 is described in detail below with reference to Fig. 4.
In step 602, contour extraction is performed on the hand region, and the hand contour (the set of edge points of the hand) is output.
In step 604, hand-contour features are extracted from the hand contour output in step 602.
In step 603, fingertip detection is performed by extracting fingertip feature points from the hand segmentation result output in step 601.
In step 605, the hand-contour features and the fingertip feature points (hereinafter called the gesture sample) are received, and the gesture sample is compared with the gesture templates and the rules to recognize the gesture; that is, the gesture sample is matched against the templates.
Fig. 4 is a flowchart of the hand detection process of step 601.
First, in step 6011, skin-like regions are detected in the image captured by the image capture module 5 in order to find the face. If a face is found, it is removed from the skin-like regions in step 6012.
If no face is found, depth segmentation is performed on the captured image in step 6013. Depth segmentation exploits the difference in depth between the hand and the face: a depth segmentation threshold is extracted automatically (the automatic threshold method), and the face is removed from the image.
Subsequently, connected-component detection is performed in step 6014: the parts of the image with similar depth values are detected, so that the hand is detected as a connected component.
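The connected-component detection of step 6014 can be sketched as a flood fill over the binary mask left after depth segmentation. This is a generic 4-connected labelling sketch, not the patent's implementation; in practice a library routine would be used instead.

```python
def connected_components(mask):
    """4-connected component labelling on a binary mask (list of lists of
    0/1): pixels kept by the depth threshold are grouped into components,
    and the hand would be taken as one (typically the largest) component."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    comps = []
    for sy in range(h):
        for sx in range(w):
            if mask[sy][sx] and not labels[sy][sx]:
                # Flood fill from (sx, sy) with an explicit stack.
                stack, comp = [(sx, sy)], []
                labels[sy][sx] = len(comps) + 1
                while stack:
                    x, y = stack.pop()
                    comp.append((x, y))
                    for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                        if 0 <= nx < w and 0 <= ny < h and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = len(comps) + 1
                            stack.append((nx, ny))
                comps.append(comp)
    return comps
```

Two separated blobs in the mask come back as two distinct components.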
The hand detection process described above is only an example. For instance, the face search of step 6011 and the face removal of step 6012 may be omitted; in other words, step 6013 may be performed directly on the captured image. Moreover, in the hand detection process the face is merely one example of a reference object; it may be any object in the image lying at a different depth.
Fig. 5A shows the histogram of all depth values. Points A and B in the figure are the two largest peaks of the depth distribution, representing the dominant depths of the hand and of the face, respectively. The point p between them can be used as the threshold for hand-face segmentation.
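The automatic threshold choice illustrated in Fig. 5A can be sketched as follows. This is an assumption-laden simplification: it finds the two dominant local maxima of the depth histogram (the hand and face peaks A and B) and takes the midpoint bin as the point p; the patent does not specify this exact peak-picking rule.

```python
def depth_threshold(hist):
    """Pick a hand/face segmentation threshold from a depth histogram:
    find the two tallest local maxima (assumed to exist) and return the
    midpoint bin between them, playing the role of point p in Fig. 5A."""
    # Local maxima: bins strictly larger than both neighbours.
    peaks = [i for i in range(1, len(hist) - 1)
             if hist[i] > hist[i - 1] and hist[i] > hist[i + 1]]
    # Keep the two tallest peaks, in increasing bin order.
    a, b = sorted(sorted(peaks, key=lambda i: hist[i], reverse=True)[:2])
    return (a + b) // 2
```

Pixels with depth below the returned bin would then be kept as the hand, and the rest discarded.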
Fig. 5B shows the segmentation result obtained by the threshold method of step 6013.
Image (1) in Fig. 5B shows the face and the hand in the depth image. Image (2) shows the depth image of the hand after the face has been removed from the image. Image (3) shows the hand detected as a connected component.
Fig. 6 is a diagram illustrating the extraction of fingertip feature points in step 603 of Fig. 3. Correct fingertip candidate points are found by extracting the fingertip feature points.
Fingertip detection is realized by computing the curvature at each point on the hand contour, using the neighborhood of contour segments around the point, generally a length of 5 pixels on each side. After the curvature of each point has been computed, any point whose curvature exceeds a set threshold is judged to be a fingertip candidate point. The points shown in Fig. 6 are the resulting candidates. The elimination of false candidates from the selected fingertip candidates by rules is described below with reference to Fig. 6. A candidate matching any of the following rules is a false fingertip candidate:
(1) rls > R
(2) the angle α between Lla and Lc exceeds Ta
(3) the candidate point lies in the wrist region
Here rls denotes the major-to-minor axis ratio of the ellipse fitted to the fingertip candidate point; R denotes the threshold on that ratio; Lla denotes the major axis of the candidate's ellipse; Lc denotes the line from the ellipse center to the palm center; and Ta denotes the threshold on the angle between the major axis and the line from the ellipse center to the palm center. R and Ta are determined experimentally: R lies between 1.2 and 2.0, and Ta is about 60 degrees. The ellipse of a fingertip candidate point is obtained by fitting the curve segment in the point's neighborhood.
As shown in Fig. 6, false candidate 1 matches rules (2) and (3): the angle between its ellipse major axis and the line from the ellipse center to the palm center is too large (greater than Ta), and the candidate lies in the wrist region. False candidate 2 matches rule (1): the major-to-minor axis ratio of its ellipse is too large (greater than R).
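The three elimination rules can be sketched as a single predicate over one candidate. This is an illustrative sketch only: the argument layout, the default values R = 1.6 and Ta = 60, and the assumption that the major-axis direction is given as a unit vector are all choices made here, not specified by the patent.

```python
import math

def is_false_fingertip(axis_ratio, tip_center, palm_center, major_axis_dir,
                       in_wrist_region, R=1.6, Ta=60.0):
    """Apply the three elimination rules to one fingertip candidate.
    axis_ratio: major/minor axis ratio of the fitted ellipse (rls);
    major_axis_dir: UNIT vector along the ellipse major axis (Lla);
    the line Lc runs from the ellipse center to the palm center.
    R in (1.2, 2.0) and Ta of about 60 degrees follow the text."""
    if axis_ratio > R:            # rule (1): ellipse too elongated
        return True
    if in_wrist_region:           # rule (3): lies in the wrist area
        return True
    lc = (palm_center[0] - tip_center[0], palm_center[1] - tip_center[1])
    dot = abs(major_axis_dir[0] * lc[0] + major_axis_dir[1] * lc[1])
    alpha = math.degrees(math.acos(min(1.0, dot / math.hypot(*lc))))
    return alpha > Ta             # rule (2): major axis misaligned with Lc
```

A fingertip whose ellipse points toward the palm center, with a moderate axis ratio, outside the wrist region, survives the filter.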
The similarity of a hand-shape sample and a template is determined by the result of matching them. The present invention proposes an improved shape context method for hand-shape matching. The original shape context method (S. Belongie, J. Malik, "Shape matching and object recognition using shape contexts", PAMI 2002) is time-consuming and is not sufficiently robust to rotation and shape noise. Fig. 7 is a flowchart of the process of matching a hand-shape sample against a gesture template according to the present invention.
Fingertip detection is performed on the gesture sample in step 603 to find the fingertip candidate points. The rules described with reference to Fig. 6 are then applied to find the correct fingertip candidate points, which serve as anchor points.
In step 701, anchor point matching is realized by computing the similarity distance between the anchor points of the gesture sample and those of the gesture template. The matching is performed according to the following equation (1), the total anchor matching cost:
c(s(i), s(j)) = ω_1 Σ_m dis(h(i_m), h(j_m)) + ω_2 Σ_m dis(len(i_m), len(j_m)) + ω_3 Σ_m dis(ang(i_m), ang(j_m))    (1)
The similarity distance of two anchor points is obtained by evaluating equation (1). Here each ω is a weight for the corresponding part, determined experimentally; dis is a distance function; and m ranges over the anchor points. The equation has three parts: the first matches the shape context feature of each point, that is, the histograms of the shape context features, where h(i_m) and h(j_m) are the histograms of the two anchor points i and j; the second matches the lengths of the lines from the anchor points to the centroid, where len(i_m) and len(j_m) are the lengths of the lines from anchor points i and j to the centroid o; the third matches the angle between the line from the anchor point to the centroid and the horizontal, where ang(i_m) and ang(j_m) are those angles for anchor points i and j. The anchor pair that minimizes the cost of equation (1) is taken as the match.
The first part is computed by the following equation (2), the matching cost between the two shape context histograms of anchor points i and j. Here K is the number of histogram bins, and g(k) and h(k) are the values of the k-th bin of the shape contexts of the two anchor points. The cost Cs lies between 0 and 1; the smaller the cost, the greater the similarity and the higher the matching degree.
Cs = (1/2) Σ_{k=1}^{K} [g(k) − h(k)]² / (g(k) + h(k))    (2)
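Equations (1) and (2) can be sketched together as follows. This is a simplified illustration: the per-anchor feature layout (a dict per anchor), the weight values, and the use of absolute difference as `dis` for the length and angle terms are all assumptions made here.

```python
def shape_context_cost(g, h):
    """Equation (2): chi-square cost between two shape-context histograms.
    For normalized histograms the result lies in [0, 1]; smaller means
    more similar."""
    return 0.5 * sum((gk - hk) ** 2 / (gk + hk)
                     for gk, hk in zip(g, h) if gk + hk > 0)

def anchor_match_cost(anchors_i, anchors_j, w=(1.0, 0.5, 0.5)):
    """Equation (1): weighted sum over the m anchor points of the histogram
    term, the anchor-to-centroid line-length term, and the centroid-line
    angle term. Each anchor is a dict {'hist': [...], 'len': float,
    'ang': float}; the weights w are experimental placeholders."""
    total = 0.0
    for ai, aj in zip(anchors_i, anchors_j):
        total += w[0] * shape_context_cost(ai['hist'], aj['hist'])
        total += w[1] * abs(ai['len'] - aj['len'])
        total += w[2] * abs(ai['ang'] - aj['ang'])
    return total
```

Identical anchor sets give a total cost of zero; the anchor assignment with the lowest total cost is kept as the match.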
Fig. 8(a) shows the shape context feature of an anchor point. Rotation invariance is obtained by considering the angle β of Fig. 8(b) rather than the angle α used in the original method, which yields a more robust shape context feature. As shown in Fig. 8(b), the angle β is the angle between the line from point p to point q and the line from point p to the centroid o.
The similarity distance between the anchor points of the sample and those of the gesture template is measured by equation (1). If the anchor points match well, the gesture contour is divided at these anchor points (step 702), and the contour segments delimited by the anchor points are matched in turn (step 703). If the anchor points cannot be matched successfully, the two hand shapes do not match.
Equation (3) describes the similarity distance between two contour segments P and Q.
D_sc(P, Q) = (1/n) Σ_{p∈P} min_{q∈Q} C(p, q) + (1/m) Σ_{q∈Q} min_{p∈P} C(q, p)    (3)
Here P and Q are two edge segments and C is the cost function; p is a point on segment P and q is a point on segment Q. The similarity is measured by first considering all points on P: for each, the smallest similarity distance to a point on Q is computed, and these are summed. Likewise, the smallest distance from each point on Q to a point on P is computed and summed. In this way the minimum pairwise distances between the points of P and the points of Q are computed, and their mean is the similarity distance between P and Q.
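Equation (3) can be sketched as follows. A minimal illustration: the point cost `C` is passed in as a function (in the patent it would be the shape-context cost of equation (2); here it may be any caller-supplied cost).

```python
def segment_distance(P, Q, cost):
    """Equation (3): symmetric nearest-neighbour distance between two
    contour segments. For each point of P take its cheapest match in Q and
    average over P; do the same with the roles swapped; the sum of the two
    averages is D_sc(P, Q)."""
    d_pq = sum(min(cost(p, q) for q in Q) for p in P) / len(P)
    d_qp = sum(min(cost(q, p) for p in P) for q in Q) / len(Q)
    return d_pq + d_qp
```

Two identical segments score zero; the farther apart the segments, the larger the distance.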
Finally, in step 704, the total matching cost is computed by equation (4): the final similarity distance of the two hand contours, that is, the sum of the anchor point similarity and the contour segment similarity.
MC = Σ D_sc(P, Q) + Σ_{a∈A, a′∈A′} C(a, a′)    (4)
Fig. 8(c) shows a finally matched gesture template and gesture sample according to the present invention, where a denotes an anchor point of the gesture template and a′ denotes the corresponding anchor point of the gesture sample.
In an experiment according to the present invention, 20 types of gestures were designed, and 5 of them were selected by the mode selection module 1 to form the gesture set (see Fig. 10). Gesture image data were collected from ten different people, and for each person gestures were collected at different angles. The recognition rates obtained with the method of the invention are shown in Table 1.
Table 1
A B C D E
A 94.2 2.5 0.8 1.06 1.44
B 2.92 93.5 1.06 1.46 1.06
C 1.22 3.88 92.9 1.56 0.44
D 0.76 1.72 2.58 94.1 0.84
E 1.28 2.02 1.88 1.22 93.6
As the above data show, the recognition performance of the gesture recognition system according to the present invention is high. The gesture recognition system and method according to the present invention can be applied, for example, to 3D televisions, home entertainment devices, and video games equipped with a camera: the camera is mounted on the electronic device, and the user interacts with the application or game through gestures in front of the camera.
Although the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention as defined by the claims.

Claims (18)

1. A gesture recognition method, comprising:
selecting gesture templates;
generating applicable rules according to the selected gesture templates;
capturing an image, the image containing a gesture image; and
recognizing and matching the gesture image according to the selected gesture templates and the generated applicable rules.
2. The gesture recognition method of claim 1, wherein selecting the gesture templates comprises: setting up several candidate gesture templates, computing the similarity between each pair of candidate templates, and selecting the desired number of candidate templates.
3. The gesture recognition method of claim 1, wherein generating applicable rules according to the selected gesture templates comprises: extracting, for each selected gesture template, features that serve as the applicable rules.
4. The gesture recognition method of claim 1, wherein recognizing and matching the gesture image according to the selected gesture templates and the generated applicable rules comprises:
performing hand detection on the image captured by the image capture module so as to output the gesture image;
performing hand-contour extraction on the gesture image and outputting the hand contour;
extracting hand-contour features from the output hand contour;
extracting fingertip feature points from the captured gesture image to perform fingertip detection; and
receiving the hand-contour features and the fingertip feature points as a gesture sample, and comparing the gesture sample with the gesture templates and the applicable rules to recognize the gesture.
5. The gesture recognition method of claim 3, wherein the extracted features comprise: the number of fingertips of the gesture, and the angle between each fingertip and the line to the palm center.
6. The gesture recognition method of claim 4, wherein performing hand detection on the image captured by the image capture module comprises:
performing skin-like-region detection on the captured image to find an object at a different depth in the captured image, and removing the object at the different depth from the skin-like regions;
if the object at the different depth is not found, performing depth segmentation on the captured image so as to remove the object at the different depth from the image; and
performing connected-component detection on the image from which the object at the different depth has been removed, thereby finding the hand.
7. The gesture recognition method of claim 4, wherein fingertip detection is realized by computing the curvature at points on the hand contour so as to find fingertip candidate points, and wherein candidate points matching any of the following rules are excluded so as to find the correct fingertip candidate points:
(1) rls > R
(2) the angle α between Lla and Lc exceeds Ta
(3) the candidate point lies in the wrist region
where rls denotes the major-to-minor axis ratio of the ellipse of the fingertip candidate point; R denotes the threshold on that ratio; Lla denotes the major axis of the candidate's ellipse; Lc denotes the line from the ellipse center to the palm center; and Ta denotes the threshold on the angle between the major axis and the line from the ellipse center to the palm center.
8. The gesture recognition method of claim 7, wherein recognizing the gesture comprises:
realizing anchor point matching by computing the similarity distance between the fingertips of the gesture sample and those of the gesture template;
dividing the gesture contour at the matched anchor points;
matching the gesture contour segments delimited by the anchor points; and
computing the sum of the anchor point similarity and the contour segment similarity.
9. The gesture recognition method of claim 8, wherein the similarity distance between the fingertips of the gesture sample and those of the gesture template is computed by matching the histograms of the shape context features, matching the lengths of the lines from the anchor points to the centroid, and matching the angles between the anchor-to-centroid lines and the horizontal.
10. The gesture recognition method of claim 9, wherein the gesture contour segments delimited by the anchor points are matched by computing the similarity distance between two contour segments P and Q through the following equation:
D_sc(P, Q) = (1/n) Σ_{p∈P} min_{q∈Q} C(p, q) + (1/m) Σ_{q∈Q} min_{p∈P} C(q, p),
where C is the cost function, p is a point on contour segment P, and q is a point on contour segment Q.
11. A gesture recognition system, comprising:
a mode selection module that selects gesture templates;
a rule module that generates applicable rules according to the selected gesture templates;
an image capture module that captures an image, the image containing a gesture image; and
a recognition and matching module that recognizes and matches the gesture image according to the selected gesture templates and the generated applicable rules.
12. The gesture recognition system of claim 11, wherein the mode selection module selects the desired number of candidate templates by setting up several candidate gesture templates and computing the similarity between each pair of candidate templates.
13. The gesture recognition system of claim 11, wherein the rule module extracts, for each selected gesture template, features that serve as the applicable rules.
14. The gesture recognition system of claim 11, wherein the recognition and matching module comprises:
a device that performs hand detection on the image captured by the image capture module so as to output the gesture image;
a device that performs hand-contour extraction on the gesture image and outputs the hand contour;
a device that extracts hand-contour features from the output hand contour;
a device that extracts fingertip feature points from the captured gesture image to perform fingertip detection; and
a device that receives the hand-contour features and the fingertip feature points as a gesture sample and compares the gesture sample with the gesture templates and the applicable rules to recognize the gesture.
15. The gesture recognition system of claim 13, wherein the extracted features comprise: the number of fingertips of the gesture, and the angle between each fingertip and the line to the palm center.
16. The gesture recognition system of claim 14, wherein the device that performs hand detection on the image captured by the image capture module so as to output the gesture image comprises:
a device that performs skin-like-region detection on the captured image to find an object at a different depth in the captured image and removes the object at the different depth from the skin-like regions;
a device that, if the object at the different depth is not found, performs depth segmentation on the captured image so as to remove the object at the different depth from the image; and
a device that performs connected-component detection on the image from which the object at the different depth has been removed, thereby finding the hand.
17. The gesture recognition system of claim 14, wherein the device that extracts fingertip feature points from the captured gesture image and performs fingertip detection comprises: a device that computes the curvature of points on the hand contour to find fingertip candidate points, wherein candidate points meeting any of the following rules are excluded, so that the correct fingertip candidates remain:
(1) rls > R
(2) the angle α between Lla and Lc is greater than Ta
(3) the candidate point lies in the wrist region
where rls denotes the major-to-minor axis ratio of the ellipse fitted at the fingertip candidate point; R denotes the threshold on that ratio; Lla denotes the major axis of the candidate's ellipse; Lc denotes the line connecting the candidate's ellipse center to the palm center; and Ta denotes the angle threshold between the major axis and the line connecting the ellipse center to the palm center.
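The three exclusion rules of claim 17 can be sketched as a single predicate. The thresholds `R` and `Ta`, the dict-based candidate representation, and the `in_wrist` flag (standing in for the wrist-region test, whose geometry the claim does not detail) are all illustrative:

```python
import math

def keep_fingertip(cand, palm_center, R=3.0, Ta=45.0):
    """Apply the claim-17 exclusion rules to one fingertip candidate.
    `cand` holds the fitted ellipse: center (x, y), major/minor axis
    lengths, and the major-axis angle in degrees. Returns True when the
    candidate survives all three rules."""
    rls = cand["major"] / cand["minor"]
    if rls > R:                                   # rule (1): axis ratio too large
        return False
    cx, cy = cand["center"]
    # angle of the line Lc from the ellipse center to the palm center
    lc = math.degrees(math.atan2(palm_center[1] - cy, palm_center[0] - cx))
    alpha = abs((cand["angle"] - lc + 180) % 360 - 180)
    alpha = min(alpha, 180 - alpha)               # axes are undirected lines
    if alpha > Ta:                                # rule (2): Lla vs Lc angle
        return False
    if cand.get("in_wrist", False):               # rule (3): wrist region
        return False
    return True
```

Rule (2) encodes the anatomical expectation that a fingertip's long axis points roughly back toward the palm center; curvature maxima on the wrist or between knuckles tend to fail it.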
18. The gesture recognition system of claim 17, wherein the device for recognizing the gesture comprises:
a device for performing anchor-point matching by calculating the similarity distance between fingertips in the gesture sample and fingertips in the gesture template;
a device for dividing the gesture contour at the matched anchor points;
a device for matching the gesture contour segments divided by the anchor points;
a device for calculating the sum of the anchor-point similarity and the contour-segment similarity.
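Claim 18's score combines an anchor-point (fingertip) term with a contour-segment term. A minimal sketch follows; the greedy nearest-neighbor pairing and the point-to-point segment distance are assumptions, since the claim fixes only the sum, not the matching algorithm:

```python
import math

def dist(p, q):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def match_score(sample_tips, template_tips, sample_segs, template_segs):
    """Claim-18 style score: anchor-point cost plus contour-segment
    cost (lower means a better match). Segments are assumed to be
    resampled to equal point counts and paired in order."""
    # anchor term: each sample fingertip pairs with its nearest template fingertip
    anchor = sum(min(dist(s, t) for t in template_tips) for s in sample_tips)
    # segment term: mean point-to-point distance for each segment pair
    seg = sum(
        sum(dist(p, q) for p, q in zip(a, b)) / len(a)
        for a, b in zip(sample_segs, template_segs)
    )
    return anchor + seg
```

Splitting the contour at matched fingertips before comparing segments keeps the two curves in correspondence, so the segment term penalizes shape differences between the same fingers rather than arbitrary misalignments of the whole outline.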
CN2010105518019A 2010-11-16 2010-11-16 Gesture recognizing system and method Pending CN102467657A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010105518019A CN102467657A (en) 2010-11-16 2010-11-16 Gesture recognizing system and method


Publications (1)

Publication Number Publication Date
CN102467657A true CN102467657A (en) 2012-05-23

Family

ID=46071281

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010105518019A Pending CN102467657A (en) 2010-11-16 2010-11-16 Gesture recognizing system and method

Country Status (1)

Country Link
CN (1) CN102467657A (en)

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103544472A (en) * 2013-08-30 2014-01-29 Tcl集团股份有限公司 Processing method and processing device based on gesture images
CN103699225A (en) * 2013-12-17 2014-04-02 深圳市威富多媒体有限公司 Method for interacting with mobile terminal through hand shape and device for implementing same
CN103793053A (en) * 2013-12-27 2014-05-14 天津三星电子有限公司 Gesture projection method and device for mobile terminals
CN103870801A (en) * 2012-12-18 2014-06-18 现代自动车株式会社 Method and system for recognizing gesture
CN104038799A (en) * 2014-05-21 2014-09-10 南京大学 Three-dimensional television-oriented gesture manipulation method
CN104063059A (en) * 2014-07-13 2014-09-24 华东理工大学 Real-time gesture recognition method based on finger division
CN104134061A (en) * 2014-08-15 2014-11-05 上海理工大学 Number gesture recognition method for support vector machine based on feature fusion
CN104376298A (en) * 2013-08-16 2015-02-25 联想(北京)有限公司 Matching method and electronic device
CN104598915A (en) * 2014-01-24 2015-05-06 深圳奥比中光科技有限公司 Gesture recognition method and gesture recognition device
CN104627094A (en) * 2013-11-08 2015-05-20 现代自动车株式会社 Vehicle recognizing user gesture and method for controlling the same
CN104680132A (en) * 2015-01-30 2015-06-03 哈尔滨工程大学 Sonar target identification method based on shape context method
CN104914989A (en) * 2014-03-12 2015-09-16 欧姆龙株式会社 Gesture recognition apparatus and control method of gesture recognition apparatus
CN104915011A (en) * 2015-06-28 2015-09-16 合肥金诺数码科技股份有限公司 Open environment gesture interaction game system
CN105095849A (en) * 2014-05-23 2015-11-25 财团法人工业技术研究院 Object identification method and device
CN105760828A (en) * 2016-02-04 2016-07-13 山东大学 Visual sense based static gesture identification method
CN106250700A (en) * 2016-08-09 2016-12-21 京东方科技集团股份有限公司 Interactive rehabilitation system, interactive convalescence device and interactive method of rehabilitation
CN106295463A (en) * 2015-05-15 2017-01-04 济南大学 A kind of gesture identification method of feature based value
CN104123712B (en) * 2013-04-26 2017-07-28 富士通株式会社 Similarity Measure apparatus and method and object identification device and method
CN107018121A (en) * 2016-10-13 2017-08-04 阿里巴巴集团控股有限公司 The method and device of subscriber authentication
CN107220634A (en) * 2017-06-20 2017-09-29 西安科技大学 Based on the gesture identification method for improving D P algorithms and multi-template matching
CN107430680A (en) * 2015-03-24 2017-12-01 英特尔公司 Multilayer skin detection and fusion gesture matching
CN108475070A (en) * 2017-04-28 2018-08-31 深圳市大疆创新科技有限公司 A kind of control method, control device and the unmanned plane of the landing of unmanned plane palm
CN109271847A (en) * 2018-08-01 2019-01-25 阿里巴巴集团控股有限公司 Method for detecting abnormality, device and equipment in unmanned clearing scene
CN109359566A (en) * 2018-09-29 2019-02-19 河南科技大学 The gesture identification method of hierarchical classification is carried out using finger characteristic
CN109726646A (en) * 2018-12-14 2019-05-07 中国联合网络通信集团有限公司 A kind of gesture identification method and system, display methods and system
CN111062312A (en) * 2019-12-13 2020-04-24 RealMe重庆移动通信有限公司 Gesture recognition method, gesture control method, device, medium and terminal device
CN112001334A (en) * 2020-08-27 2020-11-27 闽江学院 Portrait recognition device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050089225A1 (en) * 2003-01-29 2005-04-28 Industrial Technology Research Institute Method for aligning gesture features of image
US20080181459A1 (en) * 2007-01-25 2008-07-31 Stmicroelectronics Sa Method for automatically following hand movements in an image sequence
CN101470800A (en) * 2007-12-30 2009-07-01 沈阳工业大学 Hand shape recognition method


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
SERGE BELONGIE ET AL.: "Shape matching and object recognition using shape contexts", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 4, 30 April 2002 (2002-04-30) *
TAEHEE LEE ET AL.: "Handy AR: markerless inspection of augmented reality objects using fingertip tracking", 2007 11th IEEE International Symposium on Wearable Computers, 11 October 2007 (2007-10-11), pages 84 - 85 *
DING HAIYANG, RUAN QIUQI: "A gesture recognition algorithm combining multi-scale models and moment descriptors", Journal of Northern Jiaotong University, vol. 28, no. 2, 30 April 2004 (2004-04-30), pages 42 - 44 *
YU YANG: "Static gesture recognition based on hand-shape features", China Master's Theses Full-text Database, Information Science and Technology, no. 11, 15 November 2008 (2008-11-15) *

Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103870801A (en) * 2012-12-18 2014-06-18 现代自动车株式会社 Method and system for recognizing gesture
CN104123712B (en) * 2013-04-26 2017-07-28 富士通株式会社 Similarity Measure apparatus and method and object identification device and method
CN104376298A (en) * 2013-08-16 2015-02-25 联想(北京)有限公司 Matching method and electronic device
CN103544472B (en) * 2013-08-30 2018-06-19 Tcl集团股份有限公司 A kind of processing method and processing unit based on images of gestures
CN103544472A (en) * 2013-08-30 2014-01-29 Tcl集团股份有限公司 Processing method and processing device based on gesture images
CN104627094A (en) * 2013-11-08 2015-05-20 现代自动车株式会社 Vehicle recognizing user gesture and method for controlling the same
CN104627094B (en) * 2013-11-08 2018-10-09 现代自动车株式会社 Identify the vehicle of user gesture and the method for controlling the vehicle
CN103699225B (en) * 2013-12-17 2017-02-15 深圳市威富多媒体有限公司 Method for interacting with mobile terminal through hand shape and device for implementing same
CN103699225A (en) * 2013-12-17 2014-04-02 深圳市威富多媒体有限公司 Method for interacting with mobile terminal through hand shape and device for implementing same
CN103793053A (en) * 2013-12-27 2014-05-14 天津三星电子有限公司 Gesture projection method and device for mobile terminals
CN103793053B (en) * 2013-12-27 2017-04-12 天津三星电子有限公司 Gesture projection method and device for mobile terminals
CN104598915B (en) * 2014-01-24 2017-08-11 深圳奥比中光科技有限公司 A kind of gesture identification method and device
CN104598915A (en) * 2014-01-24 2015-05-06 深圳奥比中光科技有限公司 Gesture recognition method and gesture recognition device
CN104914989A (en) * 2014-03-12 2015-09-16 欧姆龙株式会社 Gesture recognition apparatus and control method of gesture recognition apparatus
CN104914989B (en) * 2014-03-12 2018-08-28 欧姆龙株式会社 The control method of gesture recognition device and gesture recognition device
CN104038799A (en) * 2014-05-21 2014-09-10 南京大学 Three-dimensional television-oriented gesture manipulation method
CN105095849A (en) * 2014-05-23 2015-11-25 财团法人工业技术研究院 Object identification method and device
CN104063059B (en) * 2014-07-13 2017-01-04 华东理工大学 A kind of real-time gesture recognition method based on finger segmentation
CN104063059A (en) * 2014-07-13 2014-09-24 华东理工大学 Real-time gesture recognition method based on finger division
CN104134061A (en) * 2014-08-15 2014-11-05 上海理工大学 Number gesture recognition method for support vector machine based on feature fusion
CN104680132A (en) * 2015-01-30 2015-06-03 哈尔滨工程大学 Sonar target identification method based on shape context method
CN104680132B (en) * 2015-01-30 2017-11-21 哈尔滨工程大学 A kind of sonar target recognition methods based on Shape context method
CN107430680B (en) * 2015-03-24 2023-07-14 英特尔公司 Multi-layer skin detection and fusion gesture matching
CN107430680A (en) * 2015-03-24 2017-12-01 英特尔公司 Multilayer skin detection and fusion gesture matching
CN106295463A (en) * 2015-05-15 2017-01-04 济南大学 A kind of gesture identification method of feature based value
CN106295463B (en) * 2015-05-15 2019-05-07 济南大学 A kind of gesture identification method based on characteristic value
CN104915011A (en) * 2015-06-28 2015-09-16 合肥金诺数码科技股份有限公司 Open environment gesture interaction game system
CN105760828B (en) * 2016-02-04 2019-03-22 山东大学 A kind of static gesture identification method of view-based access control model
CN105760828A (en) * 2016-02-04 2016-07-13 山东大学 Visual sense based static gesture identification method
CN106250700A (en) * 2016-08-09 2016-12-21 京东方科技集团股份有限公司 Interactive rehabilitation system, interactive convalescence device and interactive method of rehabilitation
CN106250700B (en) * 2016-08-09 2019-10-01 京东方科技集团股份有限公司 Interactive rehabilitation system, interactive convalescence device and interactive method of rehabilitation
CN107018121A (en) * 2016-10-13 2017-08-04 阿里巴巴集团控股有限公司 The method and device of subscriber authentication
CN107018121B (en) * 2016-10-13 2021-07-20 创新先进技术有限公司 User identity authentication method and device
CN108475070A (en) * 2017-04-28 2018-08-31 深圳市大疆创新科技有限公司 A kind of control method, control device and the unmanned plane of the landing of unmanned plane palm
US11449076B2 (en) 2017-04-28 2022-09-20 SZ DJI Technology Co., Ltd. Method for controlling palm landing of unmanned aerial vehicle, control device, and unmanned aerial vehicle
CN108475070B (en) * 2017-04-28 2021-11-30 深圳市大疆创新科技有限公司 Control method and control equipment for palm landing of unmanned aerial vehicle and unmanned aerial vehicle
CN107220634B (en) * 2017-06-20 2019-02-15 西安科技大学 Based on the gesture identification method for improving D-P algorithm and multi-template matching
CN107220634A (en) * 2017-06-20 2017-09-29 西安科技大学 Based on the gesture identification method for improving D P algorithms and multi-template matching
CN109271847A (en) * 2018-08-01 2019-01-25 阿里巴巴集团控股有限公司 Method for detecting abnormality, device and equipment in unmanned clearing scene
US11132559B2 (en) 2018-08-01 2021-09-28 Advanced New Technologies Co., Ltd. Abnormality detection method, apparatus, and device for unmanned checkout
CN109359566B (en) * 2018-09-29 2022-03-15 河南科技大学 Gesture recognition method for hierarchical classification by using finger characteristics
CN109359566A (en) * 2018-09-29 2019-02-19 河南科技大学 The gesture identification method of hierarchical classification is carried out using finger characteristic
CN109726646A (en) * 2018-12-14 2019-05-07 中国联合网络通信集团有限公司 A kind of gesture identification method and system, display methods and system
CN111062312A (en) * 2019-12-13 2020-04-24 RealMe重庆移动通信有限公司 Gesture recognition method, gesture control method, device, medium and terminal device
CN111062312B (en) * 2019-12-13 2023-10-27 RealMe重庆移动通信有限公司 Gesture recognition method, gesture control device, medium and terminal equipment
CN112001334A (en) * 2020-08-27 2020-11-27 闽江学院 Portrait recognition device
CN112001334B (en) * 2020-08-27 2024-01-19 闽江学院 Portrait recognition device

Similar Documents

Publication Publication Date Title
CN102467657A (en) Gesture recognizing system and method
Aldoma et al. Multimodal cue integration through hypotheses verification for rgb-d object recognition and 6dof pose estimation
Feng et al. Features extraction from hand images based on new detection operators
Nai et al. Fast hand posture classification using depth features extracted from random line segments
Yang et al. Dynamic hand gesture recognition using hidden Markov models
CN102231093B (en) Screen locating control method and device
Zhu et al. Vision based hand gesture recognition using 3D shape context
CN103279770B (en) Based on the person's handwriting recognition methods of stroke fragment and contour feature
JP2016014954A (en) Method for detecting finger shape, program thereof, storage medium of program thereof, and system for detecting finger shape
Weiyao et al. Human action recognition using multilevel depth motion maps
She et al. A real-time hand gesture recognition approach based on motion features of feature points
CN106022227A (en) Gesture identification method and apparatus
JP2017211938A (en) Biological information processor, biological information processing method and biological information processing program
Gawali et al. 3d face recognition using geodesic facial curves to handle expression, occlusion and pose variations
Jin et al. Essential body-joint and atomic action detection for human activity recognition using longest common subsequence algorithm
Wolin et al. Sort, merge, repeat: An algorithm for effectively finding corners in hand-sketched strokes
Mehryar et al. Automatic landmark detection for 3d face image processing
Liu et al. Circuit sketch recognition
Shan et al. Adaptive slice representation for human action classification
Półrola et al. Real-time hand pose estimation using classifiers
CN109978829B (en) Detection method and system for object to be detected
Vieriu et al. Background invariant static hand gesture recognition based on Hidden Markov Models
CN116229556A (en) Face recognition method and device, embedded equipment and computer readable storage medium
Wang et al. Skin Color Weighted Disparity Competition for Hand Segmentation from Stereo Camera.
Dehankar et al. Using AEPI method for hand gesture recognition in varying background and blurred images

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
AD01 Patent right deemed abandoned

Effective date of abandoning: 20170630