CN102368290B - Hand gesture identification method based on finger advanced characteristic - Google Patents


Info

Publication number
CN102368290B
CN102368290B (application CN201110258962A)
Authority
CN
China
Prior art keywords
gesture
finger
edge
operator
fingers
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN 201110258962
Other languages
Chinese (zh)
Other versions
CN102368290A (en)
Inventor
林耀荣
江国来
李映辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN 201110258962 priority Critical patent/CN102368290B/en
Publication of CN102368290A publication Critical patent/CN102368290A/en
Application granted granted Critical
Publication of CN102368290B publication Critical patent/CN102368290B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention provides a hand gesture recognition method based on high-level finger features. The method comprises the following steps: (1) coarse segmentation of the gesture; (2) extraction of the gesture edge; (3) extraction of the finger center areas; (4) extraction of high-level gesture features; (5) gesture recognition: using the high-level gesture features obtained in step (4), namely the palm center position, the gesture direction, the wrist position, the locating points of the five fingers, the angle features of the five fingers, and the extended/folded states of the five fingers, a gesture model is constructed and a recognition result is obtained. According to the invention, the extended/folded states of the five fingers alone allow 2^5 = 32 different hand gestures to be identified, independent of the rotation angle of the hand. In application scenarios that involve a gesture direction, the obtained direction can be used to identify further user gestures.

Description

Gesture recognition method based on high-level finger features
Technical field
The present invention relates to a gesture recognition method, specifically a gesture recognition method based on high-level finger features, and belongs to the field of human-computer interaction.
Background technology
With the continuous development of computer technology, interaction between people and computers is gradually becoming an important part of daily life. Against this background, human-computer interaction technologies that are more natural and better match human communication habits have developed considerably. These technologies include gesture recognition, speech recognition, eye tracking, and so on.
Gesture recognition based on computer vision provides a natural human-machine interface. Image feature extraction is an important component of gesture recognition and directly affects the speed and robustness of recognition. The most commonly used gesture features at present are:
(1) Skin color. Human skin is strongly clustered in certain color spaces, so using skin color to separate the hand from the background is a common coarse gesture segmentation method.
(2) Edges. Edge features are less affected by illumination and reflect the structure of the gesture; they are often used for image matching and hence for locating and identifying gestures.
(3) Contour, i.e., the gesture outline obtained after segmentation. Contour features can be used directly for image matching to identify gestures, or analyzed through local properties such as curvature to obtain high-level features such as fingertip positions.
(4) Fingertips, i.e., the positions of the fingertips in the image. Fingertip features can be obtained by fingertip template matching or by curvature analysis of the gesture contour. Fingertip positions and count directly reflect the gesture.
(5) Gesture center, i.e., the position of the gesture in space; the continuous motion of the center can be used to define dynamic gestures. Locating the gesture center directly from the centroid of the gesture region or of its contour points is a common method, but it is strongly affected by differences in gesture shape.
(6) Gesture direction, i.e., the orientation of the gesture in space, which can be used to define pointing gestures. The direction can be determined from the arm, but the arm cannot always be detected in an image sequence; it can also be determined by fitting the principal axis of the gesture contour, but this does not suit all gestures.
Because the shape of the hand is complex, even the same gesture can look very different in the image as position, angle, and finger angles vary slightly. Existing gesture recognition methods have the following shortcomings:
(1) More accurate real-time gesture estimation methods usually require the user to wear colored markers, which is inconvenient in practice;
(2) Template-matching methods usually require a large number of prepared templates and can only recognize a limited number of gestures;
(3) Regarding high-level features, existing finger detection methods can generally only detect extended fingers, not folded ones, which hinders estimation of the whole gesture;
(4) Recognition results can be disturbed by exposed arm regions;
(5) It is difficult to estimate the gesture direction accurately;
(6) Recognition results are affected by changes in finger angles and by hand rotation.
Summary of the invention
The objective of the invention is to overcome the above defects of the prior art and to provide a gesture recognition method based on high-level finger features. The user need not wear any markers, and no large set of matching templates must be trained in advance. High-level gesture features such as the palm center position, wrist position, gesture direction, and the positions of all fingers (both extended and folded) can be extracted in real time, and the gesture can then be recognized. The method is rotation-invariant, runs in real time, and is not affected by the skin-colored region of a bare arm.
The present invention is realized through the following technical scheme:
A gesture recognition method based on high-level finger features comprises the steps of: (1) coarse gesture segmentation; (2) extraction of the gesture edge; (3) extraction of the finger center areas; (4) extraction of high-level gesture features; (5) gesture recognition; characterized in that:
Extraction of the finger center areas: using the approximately parallel edges on the two sides of a finger, the finger center areas are extracted with a designed operator; the operator is an isotropic annular operator, which can be expressed as:

[operator formula, given only as an image in the source]

where N = 0.25·D_finger, D_finger is the mean finger width, d denotes the distance from a point of the operator to the operator center, and K is a positive integer. This operator is convolved with the gesture edge and the result is binarized with a threshold whose admissible range is

[threshold range, given only as an image in the source]

A dilation operation is applied to the binarization result to obtain the finger center area map.
Extraction of the high-level gesture features: the centroid of the finger center area map is computed and taken as the finger center point; the hand region is re-segmented around the finger center point to remove the arm part and obtain a more accurate hand region; an erosion operation is applied to the segmented hand region, and the centroid of the erosion result is taken as the palm center; the gesture direction is determined from the finger center and the palm center; the wrist position is located from the palm center, the gesture direction, and the palm radius. With the wrist position as the pole, a weighted radial projection transform, with the polar radius as the weighting coefficient, is applied to the extracted finger center area map, yielding a one-dimensional gesture feature sequence; this sequence is low-pass filtered and normalized to obtain the processed gesture feature sequence. The 5 largest local maxima of this sequence are extracted, and within the angular range corresponding to each local maximum, the point on the gesture contour farthest from the wrist position is detected as the locating point of that finger; the angle of the vector from the wrist position to each finger locating point is taken as the finger angle feature; and the distance from the wrist position to each finger locating point determines whether the finger is extended or folded.
Gesture recognition: the high-level gesture features obtained in step (4), namely the palm center, the gesture direction, the wrist position, the locating points of the 5 fingers, the angle features of the 5 fingers, and the extended/folded states of the 5 fingers, are used directly to construct a gesture model and obtain the gesture recognition result.
The coarse gesture segmentation of step (1): skin color segmentation based on the YCbCr color space is adopted to initially separate the background; morphological filtering is adopted to segment the hand region and obtain the gesture contour.
The extraction of the gesture edge in step (2): the gesture edge is first extracted by the Canny edge detection algorithm; by thresholding the gray values of pixels on the Canny edge, and exploiting the fact that the edges on both sides of a folded finger have lower gray values, the edges of folded fingers are extracted from the Canny edge map; a union with the outer contour then yields the gesture edge.
For a discrete image signal, the annular operator is a (6N+1) × (6N+1) annular operator.
The method can be implemented in software on a computer system, or in firmware on embedded systems, digital signal processors, and FPGAs.
Compared with the prior art, the present invention has the following advantages and effects:
(1) The user need not wear any markers;
(2) Multiple gestures can be recognized effectively without training a large number of matching templates in advance;
(3) The features of all fingers, including folded fingers, can be extracted;
(4) Interference from exposed arm regions is effectively avoided;
(5) The gesture direction feature can be extracted accurately;
(6) The result is not affected by hand rotation or by changes in finger angles;
(7) The algorithm is simple and can run in real time.
Description of drawings
Fig. 1 is the flowchart of the gesture recognition method of the present invention.
Fig. 2 is a schematic diagram of the annular operator for detecting the center area between parallel lines: Fig. 2(a) shows the detection operator for a continuous signal, Fig. 2(b) the operator for a discrete signal (size 7 × 7), and Fig. 2(c) the operator for a discrete signal (size 13 × 13).
Fig. 3 is a schematic diagram of the parallel-line center area detected by the annular operator.
Fig. 4 is a schematic diagram of the method for removing the arm region and determining features such as the finger center point, gesture direction, and wrist position.
Fig. 5 is a schematic diagram of the finger locating method.
Embodiment
The specific embodiment of the invention is further described below with reference to the accompanying drawings; it should be noted, however, that the embodiment does not limit the scope of protection claimed by the present invention.
The flow of the implementation is shown in Fig. 1; the main steps are as follows:
Step 1: coarse gesture segmentation
This embodiment adopts a skin color segmentation method based on the YCbCr color space. A color image in the RGB color space, acquired with a monocular camera, is transformed into the YCbCr color space according to formula (1):

Y = 0.299R + 0.587G + 0.114B
Cb = 0.564(B - Y) + 128
Cr = 0.713(R - Y) + 128    (1)

Skin color is then discriminated with the fixed-threshold boundary model of formula (2):

R_st: 84 < Cb < 127, 137 < Cr < 177, 190 < (Cb + 0.6·Cr) < 215    (2)

If the YCbCr value of a pixel lies within R_st, the pixel is classified as skin and set to 1; otherwise it is set to 0. This yields the skin map I_skin. Morphological filtering is applied to I_skin to remove small noise points, and the connected components whose area satisfies a threshold condition are extracted to obtain the hand region mask I_hand_mask. The input RGB image is converted into a gray-scale image I, and I_hand_mask is used to extract the gray-scale image of the hand region, I_hand, from I.
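The per-pixel color conversion and skin test of formulas (1) and (2) can be sketched in plain Python (a minimal illustration; the function names are not from the patent):

```python
def rgb_to_ycbcr(r, g, b):
    # Formula (1): RGB -> YCbCr conversion used by the embodiment
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.564 * (b - y) + 128
    cr = 0.713 * (r - y) + 128
    return y, cb, cr

def is_skin(r, g, b):
    # Formula (2): fixed-threshold boundary model R_st
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return (84 < cb < 127) and (137 < cr < 177) and (190 < cb + 0.6 * cr < 215)
```

In the full method this test is applied to every pixel to build the skin map I_skin, which is then cleaned by morphological filtering.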
Step 2: extraction of the gesture edge
The Canny edge detection algorithm is first applied to the hand-region gray-scale image I_hand to obtain the Canny edge map I_canny. Because the gray values of the edge pixels on both sides of a folded finger are low, while interfering edges such as skin folds have higher gray values, I_hand is binarized with a threshold T1 (T1 may be taken as 50) to obtain the low-gray map of the hand region, I_dark:

I_dark = 1 if I_hand < T1; 0 if I_hand ≥ T1    (3)

The hand contour I_contour is obtained from the outer contour of the hand region; the gesture edge is then I_edge = (I_canny ∩ I_dark) ∪ I_contour, where ∩ denotes intersection and ∪ denotes union.
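The low-gray map of formula (3) and the combination I_edge = (I_canny ∩ I_dark) ∪ I_contour reduce to element-wise Boolean operations on binary maps; a minimal sketch using list-of-lists images (names are illustrative):

```python
def dark_map(gray, t1=50):
    # Formula (3): mark pixels darker than T1 (folded-finger edges are dark)
    return [[1 if v < t1 else 0 for v in row] for row in gray]

def gesture_edge(canny, dark, contour):
    # I_edge = (I_canny AND I_dark) OR I_contour, element-wise on binary maps
    h, w = len(canny), len(canny[0])
    return [[(canny[y][x] & dark[y][x]) | contour[y][x] for x in range(w)]
            for y in range(h)]
```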
Step 3: extraction of the finger center areas
Fig. 2 shows the detection operator G used in this implementation to extract the finger center areas. For a discrete image signal, G is a (6N+1) × (6N+1) annular operator, where N = 0.25·D_finger and D_finger is the mean finger width. Figs. 2(b) and 2(c) show the operator G for N = 1 and N = 2, respectively. Letting d denote the distance from a point to the operator center, the annular operator can be expressed as:

[formula (4), given only as an image in the source]

where K is a positive integer; for ease of computation K may be taken as 1.
The finger center areas are extracted as follows: first, the annular operator G is convolved with I_edge to obtain I_w = I_edge * G; then I_w is binarized with a threshold T2 (whose admissible range is given only as an image in the source) to obtain I'_finger(x, y), see formula (5):

I'_finger(x, y) = 1 if I_w(x, y) > T2; 0 if I_w(x, y) ≤ T2    (5)

Fig. 3 is a schematic diagram of a parallel-line center area extracted by the detection operator. Finally, the intersection of I'_finger(x, y) and I_skin is taken and dilated, yielding the finger center area map I_finger(x, y).
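The operator G itself is given only as an image in the source, so the following is a hypothetical reconstruction, consistent with the stated properties: isotropic, size (6N+1) × (6N+1), and responding to edge pairs spaced D_finger = 4N apart, i.e. a ring of value K at radius about 2N:

```python
import math

def annular_operator(n, k=1):
    # Assumed ring kernel: value K at distance ~2N (half the mean finger
    # width) from the center of a (6N+1) x (6N+1) grid, 0 elsewhere.
    size = 6 * n + 1
    c = 3 * n
    op = [[0] * size for _ in range(size)]
    for y in range(size):
        for x in range(size):
            if abs(math.hypot(x - c, y - c) - 2 * n) <= 0.5:
                op[y][x] = k
    return op
```

Convolving such a kernel with the edge map I_edge produces high responses on finger centerlines, where the two parallel edges both fall on the ring.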
Step 4: extraction of the high-level gesture features
In this embodiment, the extraction of the high-level gesture features is illustrated in Figs. 4 and 5.
Fig. 4 illustrates the extraction of the finger center, gesture direction, and wrist position. The steps are as follows: compute the centroid of the finger center area map I_finger and denote it the finger center P_fc. Re-segment the hand region with a circle of center P_fc and radius 2.5·R_palm, where R_palm is the mean palm radius, to remove the arm region and obtain the corrected gesture contour I_hand_f. Apply an erosion operation to I_hand_f and compute the centroid of the result to obtain the palm center P_hc. The unit vector N_p of the gesture direction is obtained from formula (6):

N_p = vec(P_hc P_fc) / |vec(P_hc P_fc)|    (6)

The wrist position P_w is obtained from formula (7):

P_w = P_hc - R_palm · N_p    (7)
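Formulas (6) and (7) amount to a few lines of vector arithmetic; a sketch with illustrative names:

```python
def direction_and_wrist(p_fc, p_hc, r_palm):
    # Formula (6): unit vector from palm center P_hc toward finger center P_fc
    dx, dy = p_fc[0] - p_hc[0], p_fc[1] - p_hc[1]
    norm = (dx * dx + dy * dy) ** 0.5
    n_p = (dx / norm, dy / norm)
    # Formula (7): the wrist lies one palm radius behind the palm center
    p_w = (p_hc[0] - r_palm * n_p[0], p_hc[1] - r_palm * n_p[1])
    return n_p, p_w
```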
Fig. 5 illustrates the finger locating method, whose purpose is to compute the angle and fingertip position of each finger. The finger center area map I_finger(x, y) is converted into a polar representation I_f(r, θ) with the wrist position P_w as the pole and cut into equal angular intervals, and the weighted radial projection in each interval is computed:

p_n = ∫_{r1}^{r2} ∫_{θ_n}^{θ_{n+1}} r · I_f(r, θ) dθ dr,  n = 0, 1, 2, ..., M-1    (8)

where p_n is the feature value for the n-th polar-angle range [θ_n, θ_{n+1}); the polar radius r serves as the weighting coefficient; r1 and r2 bound the integration variable r and may be taken as r1 = 0.5·R_palm and r2 = 2.5·R_palm; and M is the number of angular intervals (typically 36 to 360). Arranging the p_n in order of increasing angle gives the finger feature sequence {p_n}. The sequence {p_n} is low-pass filtered to obtain the smoothed result {p'_n}, which is finally normalized to obtain {p_n_norm}:

p_n_norm = p'_n / Σ_{i=0}^{M-1} p'_i,  n = 0, 1, 2, ..., M-1    (9)

The five local maximum points of the normalized feature sequence {p_n_norm} are then found; their polar-angle ranges are the angular ranges of the five fingers. The angle of a finger can be taken as (θ_n + θ_{n+1})/2. Within the angular range of each finger, the point P_ti on the gesture contour farthest from the wrist position P_w is detected, together with its distance D_ti. When D_ti > α·R_palm, P_ti is classified as the fingertip of an extended finger; when D_ti ≤ α·R_palm, P_ti corresponds to the knuckle of a folded finger. For the middle three fingers the threshold α may be taken as 3; for the two outer fingers (thumb and little finger) it may be taken as 2.2. The angle of the vector from the wrist position P_w to P_ti is taken as the finger angle feature.
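A discrete counterpart of formulas (8) and (9), and of the D_ti > α·R_palm test, can be sketched as follows (pixel lists instead of integrals; smoothing omitted; names illustrative):

```python
import math

def radial_projection(points, p_w, m=36):
    # Formula (8), discretized: each finger-area pixel adds its polar
    # radius r (the weighting coefficient) to its angle bin.
    p = [0.0] * m
    step = 2 * math.pi / m
    for x, y in points:
        r = math.hypot(x - p_w[0], y - p_w[1])
        theta = math.atan2(y - p_w[1], x - p_w[0]) % (2 * math.pi)
        p[int(theta / step) % m] += r
    total = sum(p)
    # Formula (9): normalize the sequence to unit sum
    return [v / total for v in p] if total else p

def finger_extended(d_ti, r_palm, alpha=3.0):
    # A finger is extended if its farthest contour point exceeds alpha * R_palm
    return d_ti > alpha * r_palm
```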
Step 5: gesture recognition
The high-level gesture features obtained in the above steps, namely the palm center, the gesture direction, the wrist position, the locating points of the 5 fingers, the angle features of the 5 fingers, and the extended/folded states of the 5 fingers, are used directly to construct a gesture model and obtain the gesture recognition result.
In this implementation, the extended/folded states of the 5 fingers alone distinguish 2^5 = 32 different gestures, independent of the rotation angle of the hand. In application scenarios that require a gesture direction, the obtained direction can be used to recognize further user gestures.
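The 32-gesture claim follows from treating the five extended/folded flags as a 5-bit code; a minimal sketch (the encoding order is an assumption, not specified by the patent):

```python
def gesture_code(finger_states):
    # finger_states: 5 booleans, e.g. thumb..little finger; True = extended.
    # Each combination indexes one of 2**5 = 32 gestures, independent of
    # hand rotation because the flags themselves are rotation-invariant.
    code = 0
    for i, extended in enumerate(finger_states):
        code |= (1 if extended else 0) << i
    return code
```

A pointing direction, when available, can further split each of these 32 codes into direction-dependent gestures.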

Claims (4)

1. A gesture recognition method based on high-level finger features, comprising the steps of: (1) coarse gesture segmentation; (2) extraction of the gesture edge; (3) extraction of the finger center areas; (4) extraction of high-level gesture features; (5) gesture recognition; characterized in that:
Extraction of the finger center areas: using the approximately parallel edges on the two sides of a finger, the finger center areas are extracted with a designed operator; the operator is an isotropic annular operator, which can be expressed as:

[operator formula, given only as an image in the source]

where N = 0.25·D_finger, D_finger is the mean finger width, d denotes the distance from a point of the operator to the operator center, and K is a positive integer. This operator is convolved with the gesture edge and the result is binarized with a threshold whose admissible range is

[threshold range, given only as an image in the source]

A dilation operation is applied to the binarization result to obtain the finger center area map.
Extraction of the high-level gesture features: the centroid of the finger center area map is computed and taken as the finger center point; the hand region is re-segmented around the finger center point to remove the arm part and obtain a more accurate hand region; an erosion operation is applied to the segmented hand region, and the centroid of the erosion result is taken as the palm center; the gesture direction is determined from the finger center and the palm center; the wrist position is located from the palm center, the gesture direction, and the palm radius. With the wrist position as the pole, a weighted radial projection transform, with the polar radius as the weighting coefficient, is applied to the extracted finger center area map, yielding a one-dimensional gesture feature sequence; this sequence is low-pass filtered and normalized to obtain the processed gesture feature sequence. The 5 largest local maxima of this sequence are extracted, and within the angular range corresponding to each local maximum, the point on the gesture contour farthest from the wrist position is detected as the locating point of that finger; the angle of the vector from the wrist position to each finger locating point is taken as the finger angle feature; and the distance from the wrist position to each finger locating point determines whether the finger is extended or folded.
Gesture recognition: the high-level gesture features obtained in step (4), namely the palm center, the gesture direction, the wrist position, the locating points of the 5 fingers, the angle features of the 5 fingers, and the extended/folded states of the 5 fingers, are used directly to construct a gesture model and obtain the gesture recognition result.
2. The recognition method according to claim 1, characterized in that the coarse gesture segmentation of step (1) comprises: skin color segmentation based on the YCbCr color space is adopted to initially separate the background; morphological filtering is adopted to segment the hand region and obtain the gesture contour.
3. The recognition method according to claim 1, characterized in that the extraction of the gesture edge in step (2) comprises: the gesture edge is first extracted by the Canny edge detection algorithm; by thresholding the gray values of pixels on the Canny edge, and according to the characteristic that the edges on both sides of a folded finger have lower gray values, the edges of folded fingers are extracted from the Canny edge; a union with the outer contour then yields the gesture edge.
4. The recognition method according to claim 1, characterized in that, for a discrete image signal, the annular operator in step (3) is a (6N+1) × (6N+1) annular operator.
CN 201110258962 2011-09-02 2011-09-02 Hand gesture identification method based on finger advanced characteristic Expired - Fee Related CN102368290B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110258962 CN102368290B (en) 2011-09-02 2011-09-02 Hand gesture identification method based on finger advanced characteristic

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201110258962 CN102368290B (en) 2011-09-02 2011-09-02 Hand gesture identification method based on finger advanced characteristic

Publications (2)

Publication Number Publication Date
CN102368290A CN102368290A (en) 2012-03-07
CN102368290B true CN102368290B (en) 2012-12-26

Family

ID=45760853

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110258962 Expired - Fee Related CN102368290B (en) 2011-09-02 2011-09-02 Hand gesture identification method based on finger advanced characteristic

Country Status (1)

Country Link
CN (1) CN102368290B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103413080A (en) * 2013-08-20 2013-11-27 苏州跨界软件科技有限公司 Password protection realization method based on gesture
CN103426000A (en) * 2013-08-28 2013-12-04 天津大学 Method for detecting static gesture fingertip

Families Citing this family (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622601A (en) * 2012-03-12 2012-08-01 李博男 Fingertip detection method
CN102915111B (en) * 2012-04-06 2017-05-31 寇传阳 A kind of wrist gesture control system and method
US9646200B2 (en) * 2012-06-08 2017-05-09 Qualcomm Incorporated Fast pose detector
CN102854983B (en) * 2012-09-10 2015-12-02 中国电子科技集团公司第二十八研究所 A kind of man-machine interaction method based on gesture identification
CN102880865B (en) * 2012-09-28 2015-06-17 东南大学 Dynamic gesture recognition method based on complexion and morphological characteristics
CN102938060A (en) * 2012-12-07 2013-02-20 上海电机学院 Dynamic gesture recognition system and method
CN103034333A (en) * 2012-12-18 2013-04-10 福建华映显示科技有限公司 Gesture recognition device and gesture recognition method
CN104007865B (en) * 2013-02-27 2017-04-19 联想(北京)有限公司 Recognition method and electronic device
TWI499937B (en) * 2013-10-15 2015-09-11 Univ Nat Taipei Technology Remote control method and remote control device using gestures and fingers
KR101526426B1 (en) * 2013-12-31 2015-06-05 현대자동차 주식회사 Gesture recognize apparatus and method
CN104794472A (en) * 2014-01-20 2015-07-22 富士通株式会社 Method and device used for extracting gesture edge image and gesture extracting method
CN104063059B (en) * 2014-07-13 2017-01-04 华东理工大学 A kind of real-time gesture recognition method based on finger segmentation
CN104123007B (en) * 2014-07-29 2017-01-11 电子科技大学 Multidimensional weighted 3D recognition method for dynamic gestures
US9971442B2 (en) * 2014-10-29 2018-05-15 Microchip Technology Germany Gmbh Human interface device and method
JP6606335B2 (en) * 2015-02-25 2019-11-13 株式会社メガチップス Image recognition device
CN106295464A (en) * 2015-05-15 2017-01-04 济南大学 Gesture identification method based on Shape context
CN105451029B (en) * 2015-12-02 2019-04-02 广州华多网络科技有限公司 A kind of processing method and processing device of video image
CN106909872A (en) * 2015-12-22 2017-06-30 江苏达科智能科技有限公司 Staff outline identification method
CN105759967B (en) * 2016-02-19 2019-07-09 电子科技大学 A kind of hand overall situation attitude detecting method based on depth data
CN105868715B (en) * 2016-03-29 2020-02-07 苏州科达科技股份有限公司 Gesture recognition method and device and gesture learning system
CN106846403B (en) * 2017-01-04 2020-03-27 北京未动科技有限公司 Method and device for positioning hand in three-dimensional space and intelligent equipment
CN107180224B (en) * 2017-04-10 2020-06-19 华南理工大学 Finger motion detection and positioning method based on space-time filtering and joint space Kmeans
CN107589850A (en) * 2017-09-26 2018-01-16 深圳睛灵科技有限公司 A kind of recognition methods of gesture moving direction and system
CN110837766B (en) * 2018-08-17 2023-05-05 北京市商汤科技开发有限公司 Gesture recognition method, gesture processing method and device
CN109359566B (en) * 2018-09-29 2022-03-15 河南科技大学 Gesture recognition method for hierarchical classification by using finger characteristics
CN109919039B (en) * 2019-02-14 2023-07-25 上海磐启微电子有限公司 Static gesture recognition method based on palm and finger characteristics
CN110046603B (en) * 2019-04-25 2020-11-27 合肥工业大学 Gesture action recognition method for Chinese pule sign language coding
CN111046734B (en) * 2019-11-12 2022-10-18 重庆邮电大学 Multi-modal fusion sight line estimation method based on expansion convolution
CN111062312B (en) * 2019-12-13 2023-10-27 RealMe重庆移动通信有限公司 Gesture recognition method, gesture control device, medium and terminal equipment
CN111523435A (en) * 2020-04-20 2020-08-11 安徽中科首脑智能医疗研究院有限公司 Finger detection method, system and storage medium based on target detection SSD
CN111461059A (en) * 2020-04-21 2020-07-28 哈尔滨拓博科技有限公司 Multi-zone multi-classification extensible gesture recognition control device and control method
US20230031200A1 (en) * 2021-07-30 2023-02-02 Jadelynn Kim Dao Touchless, Gesture-Based Human Interface Device
CN115100747B (en) * 2022-08-26 2022-11-08 山东宝德龙健身器材有限公司 Treadmill intelligent auxiliary system based on visual detection

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101853071A (en) * 2010-05-13 2010-10-06 重庆大学 Gesture identification method and system based on visual sense
CN101901350A (en) * 2010-07-23 2010-12-01 北京航空航天大学 Characteristic vector-based static gesture recognition method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8135181B2 (en) * 2007-03-26 2012-03-13 The Hong Kong Polytechnic University Method of multi-modal biometric recognition using hand-shape and palmprint

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101853071A (en) * 2010-05-13 2010-10-06 重庆大学 Gesture identification method and system based on visual sense
CN101901350A (en) * 2010-07-23 2010-12-01 北京航空航天大学 Characteristic vector-based static gesture recognition method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
江国来, 林耀荣. A skin color segmentation algorithm combining an adaptive model and a fixed model. Journal of Computer Applications (《计算机应用》), 2010, Vol. 30, No. 10, full text. *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103413080A (en) * 2013-08-20 2013-11-27 苏州跨界软件科技有限公司 Password protection realization method based on gesture
CN103426000A (en) * 2013-08-28 2013-12-04 天津大学 Method for detecting static gesture fingertip
CN103426000B (en) * 2013-08-28 2016-12-28 天津大学 A kind of static gesture Fingertip Detection

Also Published As

Publication number Publication date
CN102368290A (en) 2012-03-07

Similar Documents

Publication Publication Date Title
CN102368290B (en) Hand gesture identification method based on finger advanced characteristic
CN104899600B (en) A kind of hand-characteristic point detecting method based on depth map
CN104063059B (en) A kind of real-time gesture recognition method based on finger segmentation
CN101901350B (en) Characteristic vector-based static gesture recognition method
CN101593022B (en) Method for quick-speed human-computer interaction based on finger tip tracking
CN102402680B (en) Hand and indication point positioning method and gesture confirming method in man-machine interactive system
US8254627B2 (en) Method for automatically following hand movements in an image sequence
CN101470800B (en) Hand shape recognition method
CN103294996A (en) 3D gesture recognition method
Feng et al. Features extraction from hand images based on new detection operators
CN104978012B (en) One kind points to exchange method, apparatus and system
CN109684959B (en) Video gesture recognition method and device based on skin color detection and deep learning
CN109919039B (en) Static gesture recognition method based on palm and finger characteristics
CN105739702A (en) Multi-posture fingertip tracking method for natural man-machine interaction
CN103971102A (en) Static gesture recognition method based on finger contour and decision-making trees
CN109359566B (en) Gesture recognition method for hierarchical classification by using finger characteristics
CN102609683A (en) Automatic labeling method for human joint based on monocular video
CN106503626A (en) Being mated with finger contours based on depth image and refer to gesture identification method
CN109190460B (en) Hand-shaped arm vein fusion identification method based on cumulative matching and equal error rate
CN102402289A (en) Mouse recognition method for gesture based on machine vision
CN106650628B (en) Fingertip detection method based on three-dimensional K curvature
CN106503619B (en) Gesture recognition method based on BP neural network
Ren et al. Hand gesture recognition with multiscale weighted histogram of contour direction normalization for wearable applications
CN105335711A (en) Fingertip detection method in complex environment
CN103870071A (en) Touch source identification method and system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20121226