CN102968618A - Static hand gesture recognition method fused with BoF model and spectral clustering algorithm

Static hand gesture recognition method fused with BoF model and spectral clustering algorithm

Info

Publication number
CN102968618A
CN102968618A
Authority
CN
China
Prior art keywords
feature point
bof
recognition
histogram data
sample
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012104206339A
Other languages
Chinese (zh)
Inventor
陈岭
闯跃龙
王敬昌
赵江奇
解正宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Hongcheng Computer Systems Co Ltd
Original Assignee
Zhejiang Hongcheng Computer Systems Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Hongcheng Computer Systems Co Ltd filed Critical Zhejiang Hongcheng Computer Systems Co Ltd
Priority to CN2012104206339A priority Critical patent/CN102968618A/en
Publication of CN102968618A publication Critical patent/CN102968618A/en
Pending legal-status Critical Current

Abstract

The invention relates to object recognition in static images, and in particular to a static hand gesture recognition method that fuses a bag-of-features (BoF) model with a spectral clustering algorithm. The method comprises a recognition training procedure and a recognition application procedure. A hierarchical BoF model is built to capture the distribution of hand gesture features quickly and accurately against a complex background, and feature points belonging to the background are filtered out by the spectral clustering algorithm, which safeguards both the efficiency and the accuracy of recognition. The method has the following advantages: the hierarchical BoF model retains the high running efficiency and relatively good accuracy of the traditional BoF model while overcoming its lack of spatial distribution information for feature points; and a filtering algorithm based on spectral clustering and the histogram intersection kernel (Spectral-HIK) is proposed, which removes most background feature points while preserving foreground feature points as far as possible, effectively improving the recognition efficiency and accuracy of the whole algorithm.

Description

A static hand gesture recognition method fusing a BoF model and spectral clustering
Technical field
The present invention relates to object recognition techniques for still images, and in particular to a static hand gesture recognition method that fuses a BoF model with spectral clustering.
Background technology
Current gesture recognition methods fall roughly into two classes. The first class relies on auxiliary devices, such as data gloves, magnetic sensors, and inertial sensors. The second class comprises gesture recognition algorithms based on computer vision. Compared with device-based approaches, vision-based methods acquire data with only a camera and do not require the user to wear any additional equipment. However, vision-based gesture recognition still faces two pressing problems. 1) The structural complexity of the human hand: a single hand has roughly 14 joints, which makes it extremely flexible; for such a non-rigid object, distinguishing different gestures accurately and efficiently is one of the core difficulties of gesture recognition. 2) The complexity of the background: a non-rigid object such as the hand is easily affected by the environment, and the effect on recognition is particularly severe when the background contains colors similar to skin. To cope with these two problems, some existing methods strictly constrain the background and use a skin-color model to detect the hand region; clearly such methods cannot be applied to gesture recognition against complex backgrounds. Another class of methods models the background and detects the positions of the fingers by analyzing the characteristics of the gesture itself, finally achieving recognition. These methods give good results when the background is known, but cannot be applied when the background is unknown.
Summary of the invention
The present invention overcomes the above shortcomings. Its purpose is to provide a static hand gesture recognition method that fuses a BoF model with spectral clustering, building a hierarchical BoF model to capture the gesture feature distribution quickly and accurately against a complex background, and then filtering out the feature points that belong to the background with a spectral clustering algorithm, thereby guaranteeing the efficiency and accuracy of recognition.
The present invention achieves the above object through the following technical scheme: a static hand gesture recognition method fusing a BoF model and spectral clustering, comprising a recognition training procedure and a recognition application procedure, the recognition training procedure comprising the following steps:
1) input positive and negative training samples;
2) extract the feature points of the positive and negative training samples;
3) filter the extracted feature points with a filtering algorithm based on spectral clustering to obtain the feature points that belong to the foreground;
4) cluster the foreground feature points of all samples with a clustering algorithm based on the histogram intersection kernel (HIK);
5) partition all feature points according to their spatial distribution relative to the hand center: with the hand center as the origin, divide the image into several concentric regions and group the feature points by region;
6) project the feature points of each group onto the horizontal and vertical axes, build a histogram data structure for each projection direction, and compute and store the spatial distribution information of the feature points;
7) combine the spatial partition information and the projection information of every feature point to build the hierarchical BoF model;
8) pass the training samples of each class through the hierarchical BoF model to obtain their histogram data, and finally build the classifier by normalization;
9) invoke the constructed classifier and classify a sample with a voting mechanism: compare the histogram data of the sample with the histogram data of the classifier for similarity; the more data the two histograms share in the sub-structures at the same positions, the more similar the sample is to that classifier; after comparison with the BoF models of all classes, the closest one is selected as the recognition result;
10) output the classification result;
The recognition application procedure comprises the following steps:
1) input the gesture image to be recognized;
2) extract the feature points of the input gesture image;
3) filter the extracted feature points with a filtering algorithm based on spectral clustering to obtain the feature points that belong to the foreground;
4) cluster the foreground feature points with the HIK-based clustering algorithm;
5) partition all feature points according to their spatial distribution relative to the hand center: with the hand center as the origin, divide the image into several concentric regions and group the feature points by region;
6) project the feature points of each group onto the horizontal and vertical axes, build a histogram data structure for each projection direction, and compute and store the spatial distribution information of the feature points;
7) combine the spatial partition information and the projection information of every feature point to construct the hierarchical BoF model;
8) invoke the classifier obtained by the recognition training procedure and classify the sample with a voting mechanism: compare the histogram data of the sample with the histogram data of the classifier for similarity; the more data the two histograms share in the sub-structures at the same positions, the more similar the sample is to that classifier; after comparison with the BoF models of all classes, the closest one is selected as the recognition result;
9) output the classification result.
Preferably, the extraction of the feature points of the positive and negative training samples in step 2) of the recognition training procedure and the extraction of the feature points of the input gesture image in step 2) of the recognition application procedure both use the ASIFT algorithm.
The beneficial effects of the present invention are: 1. the hierarchical BoF model retains the high running efficiency and relatively good accuracy of the traditional BoF model while remedying its lack of spatial distribution information for feature points; 2. to further improve the recognition efficiency and accuracy of the whole algorithm, a filtering algorithm based on spectral clustering and HIK (Spectral-HIK) is proposed, which removes most background feature points while preserving foreground feature points as far as possible, effectively improving the recognition efficiency and accuracy of the whole algorithm.
Description of drawings
Fig. 1 is a flow chart of the steps of the present invention;
Fig. 2 is a flow chart of the steps of filtering the extracted feature points with the filtering algorithm based on spectral clustering;
Fig. 3 is a flow chart of the steps of classification and recognition by the classifier;
Fig. 4a shows the positive and negative samples input in the recognition training procedure;
Fig. 4b shows the gesture image to be recognized input in the recognition application procedure;
Fig. 5a shows the spatial distribution and projection of the feature points of the positive and negative samples in the recognition training procedure;
Fig. 5b shows the spatial distribution and projection of the feature points of the gesture image in the recognition application procedure.
Embodiment
The present invention is described further below in conjunction with a specific embodiment, but the scope of protection of the present invention is not limited thereto:
Embodiment 1: as shown in Fig. 1, a static hand gesture recognition method fusing a BoF model and spectral clustering comprises a recognition training procedure and a recognition application procedure;
(1) The recognition training procedure comprises the following steps:
Step 110: input positive and negative training samples; the input positive and negative samples are shown in Fig. 4a;
Step 120: extract the feature points of the positive and negative training samples: the more feature points there are, the better the recognition; the present invention uses the ASIFT algorithm to extract the feature points of the positive and negative training samples, taking the image region containing the gesture in each sample as the area from which feature points are chosen, so that the shape of the gesture is described by its feature points;
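For illustration only, the following Python sketch shows one way this feature extraction step could be realized. It assumes OpenCV 4.5 or later, whose cv2.AffineFeature class wraps a SIFT backend to implement ASIFT; the image file name is hypothetical and not part of the specification.

# Illustrative sketch of Step 120 (ASIFT feature extraction), assuming OpenCV >= 4.5.
import cv2

def extract_asift_features(image_path):
    """Return ASIFT keypoint coordinates and descriptors for one gesture image."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(image_path)
    # AffineFeature simulates affine viewpoint changes around the SIFT detector (ASIFT).
    asift = cv2.AffineFeature_create(cv2.SIFT_create())
    keypoints, descriptors = asift.detectAndCompute(img, None)
    points = [kp.pt for kp in keypoints]  # (x, y) positions, used later for spatial grouping
    return points, descriptors

# Example (hypothetical file name):
# points, descriptors = extract_asift_features("gesture_sample.png")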
Step 130: filter the extracted feature points with the filtering algorithm based on spectral clustering to obtain the feature points that belong to the foreground, with the steps shown in Fig. 2:
Step 131: input the feature point set;
Step 132: from the input feature point set, build the affinity matrix A (A ∈ R^(n×n)), where A_ij denotes the similarity between two points P_i and P_j; from the input feature point set, build the diagonal matrix D, where D_ii is the sum over row i of the affinity matrix A;
Step 133: compute the difference matrix L = D − A of the diagonal matrix D and the affinity matrix A;
Step 134: compute the eigenvectors of the difference matrix L and build the matrix U (U ∈ R^(n×k)), where {u_1, ..., u_k} are the column vectors of U and {y_1, ..., y_n} are the row vectors of U;
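The following numpy sketch illustrates steps 132 to 134. The specification does not fix the similarity function used for A_ij, so a Gaussian kernel on the point coordinates is assumed here purely for illustration; likewise, taking the eigenvectors of the k smallest eigenvalues of L is the usual choice for the unnormalized Laplacian rather than something stated in the text.

# Illustrative sketch of Steps 132-134: affinity matrix A, diagonal matrix D, L = D - A, eigenvectors U.
import numpy as np

def spectral_embedding(points, k, sigma=20.0):
    """points: (n, 2) array of feature point coordinates; returns the n x k matrix U."""
    pts = np.asarray(points, dtype=float)
    # Step 132: A_ij = similarity of P_i and P_j (Gaussian kernel assumed here).
    sq_dists = np.sum((pts[:, None, :] - pts[None, :, :]) ** 2, axis=-1)
    A = np.exp(-sq_dists / (2.0 * sigma ** 2))
    # Diagonal matrix D, with D_ii equal to the sum of row i of A.
    D = np.diag(A.sum(axis=1))
    # Step 133: difference matrix L = D - A.
    L = D - A
    # Step 134: eigenvectors of L; the k columns u_1..u_k form U, whose rows y_1..y_n are clustered next.
    eigenvalues, eigenvectors = np.linalg.eigh(L)
    U = eigenvectors[:, :k]  # eigenvectors of the k smallest eigenvalues
    return U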
Step 135: using the HIK clustering algorithm

K_HI(h_p, h_q) = Σ_i min(h_p^i, h_q^i)    (1)

and the cluster similarity measure

||p − q||^2 = ||φ(h_p) − φ(h_q)||^2 = K_HI(h_p, h_p) + K_HI(h_q, h_q) − 2·K_HI(h_p, h_q)    (2)

cluster all row vectors of the matrix U to obtain k clusters C_1, ..., C_k, where h_p and h_q denote histogram data and i indexes the sub-regions of a histogram;
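Formulas (1) and (2) translate directly into code. The sketch below is a plain numpy transcription; how the resulting distances are embedded in a k-means-style clustering loop is not spelled out in the specification, so only the kernel and the induced squared distance are shown.

# Illustrative transcription of formulas (1) and (2).
import numpy as np

def hik(h_p, h_q):
    """Formula (1): K_HI(h_p, h_q) = sum_i min(h_p^i, h_q^i)."""
    return float(np.minimum(h_p, h_q).sum())

def hik_sq_distance(h_p, h_q):
    """Formula (2): squared distance in the feature space induced by the HIK kernel."""
    return hik(h_p, h_p) + hik(h_q, h_q) - 2.0 * hik(h_p, h_q)

# Example with two 4-bin histograms:
# hik_sq_distance(np.array([3, 1, 0, 2]), np.array([1, 2, 2, 2]))  ->  6 + 7 - 2*4 = 5.0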
Step 136: apply the spatial distribution criterion to the clustering result; the subset with the largest average distance is judged to be the background subset, according to

g_background = arg max{average_Euc(g)}, g ∈ G    (3)

where G denotes the set of all feature point subsets and average_Euc(·) denotes the average Euclidean distance between pairs of points in a subset;
Step 137: output the final filtering result: 23 feature points in total belong to the foreground;
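The background-subset criterion of step 136 (formula (3)) can be sketched as follows, assuming each cluster C_1, ..., C_k is given as an array of the two-dimensional coordinates of its member feature points; the helper names are illustrative.

# Illustrative sketch of Step 136: the cluster with the largest average pairwise distance is treated as background.
import numpy as np
from itertools import combinations

def average_euclidean(points):
    """average_Euc(g): mean Euclidean distance over all point pairs within one subset."""
    pts = np.asarray(points, dtype=float)
    if len(pts) < 2:
        return 0.0
    return float(np.mean([np.linalg.norm(p - q) for p, q in combinations(pts, 2)]))

def split_foreground_background(clusters):
    """clusters: list of (m_i, 2) coordinate arrays; returns (foreground clusters, background cluster)."""
    background_idx = max(range(len(clusters)), key=lambda i: average_euclidean(clusters[i]))
    foreground = [c for i, c in enumerate(clusters) if i != background_idx]
    return foreground, clusters[background_idx]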
Step 140: cluster the foreground feature points output by step 137 using formula (1); the clustering result is 11 triangular feature points, 7 pentagram feature points, and 5 circular feature points;
Step 150: partition all feature points according to their spatial distribution relative to the hand center: with the hand center as the origin, divide the image into an inner region, a middle region and an outer region, and group the feature points by region. The inner region contains 6 feature points (5 triangular, 1 pentagram); the middle region contains 8 feature points (4 triangular, 3 pentagram, 1 circular); the outer region contains 9 feature points (2 triangular, 3 pentagram, 4 circular);
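A sketch of the concentric-region grouping of step 150 follows. The two radii separating the inner, middle and outer regions are illustrative parameters; the specification gives no numeric boundaries.

# Illustrative sketch of Step 150: group feature points into concentric regions around the hand center.
import numpy as np

def group_by_region(points, hand_center, radii=(40.0, 80.0)):
    """Assign each point to the inner (0), middle (1) or outer (2) region by its distance to the hand center."""
    pts = np.asarray(points, dtype=float)
    center = np.asarray(hand_center, dtype=float)
    dists = np.linalg.norm(pts - center, axis=1)
    # searchsorted maps each distance to the index of the first radius exceeding it.
    region_ids = np.searchsorted(radii, dists)
    return [pts[region_ids == r] for r in range(len(radii) + 1)]  # [inner, middle, outer]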
Step 160: as shown in Fig. 5a, project the feature points of each group onto the horizontal and vertical axes, build a histogram data structure for each projection direction, and compute and store the spatial distribution information of the feature points;
Step 170: combine the spatial partition information and the projection information of every feature point to construct the hierarchical BoF model;
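Steps 160 and 170 can be sketched as below: per-region projection histograms along the horizontal and vertical axes are concatenated into one hierarchical descriptor. Only the spatial projection part is shown; the visual-word identities produced by step 140 would be histogrammed analogously and appended. The number of bins is an illustrative choice, not fixed by the specification.

# Illustrative sketch of Steps 160-170: per-region projection histograms forming the hierarchical BoF descriptor.
import numpy as np

def projection_histograms(region_points, image_size, bins=8):
    """Histogram the x and y coordinates of the feature points of one region."""
    pts = np.asarray(region_points, dtype=float).reshape(-1, 2)
    width, height = image_size
    hist_x, _ = np.histogram(pts[:, 0], bins=bins, range=(0, width))
    hist_y, _ = np.histogram(pts[:, 1], bins=bins, range=(0, height))
    return np.concatenate([hist_x, hist_y])

def hierarchical_bof(region_groups, image_size, bins=8):
    """Concatenate the projection histograms of all concentric regions into one vector."""
    return np.concatenate([projection_histograms(g, image_size, bins) for g in region_groups])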
Step 180: pass the training samples of each class through the hierarchical BoF model to obtain the histogram data of each class, and finally build the classifier by normalization;
Step 190: invoke the classifier obtained by the recognition training procedure and classify the sample with a voting mechanism: compare the histogram data of the sample with the histogram data of the classifier for similarity; the more data the two histograms share in the sub-structures at the same positions, the more similar the sample is to that classifier; after comparison with the BoF models of all classes, the closest one is selected as the recognition result, as shown in Fig. 3:
Step 191: the image to be classified is processed by the hierarchical BoF model to obtain the hierarchical histogram data set H_s;
Step 192: each gesture class yields a data set H_m; compare H_s with each H_m for similarity, and the data set with the largest similarity value gives the classification result:

<H_m, H_s> = Σ_{i=1..N} w_i · ( Σ_{j=1..K} Sim(h_ij^m, h_ij^s) )    (4)

Sim(h_1, h_2) = K_HI(h_1, h_2) = Σ_i min(h_1^i, h_2^i)    (5)

where H_m denotes the histogram data model obtained by training, H_s denotes the histogram data set of the input image, w_i is the weight of the i-th gesture sub-region, N is the number of gesture sub-regions, K is the number of clusters of the feature point set, Sim(·,·) is the HIK-based similarity function, and h_1 and h_2 denote histogram data;
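The voting comparison of formulas (4) and (5) is sketched below, assuming each trained model H_m and the input H_s are stored as nested lists indexed first by sub-region i and then by cluster j; the container layout and weights are illustrative.

# Illustrative sketch of Step 192: weighted HIK similarity between the input and each class model.
import numpy as np

def sim(h1, h2):
    """Formula (5): HIK similarity of two histograms."""
    return float(np.minimum(h1, h2).sum())

def model_similarity(H_m, H_s, weights):
    """Formula (4): <H_m, H_s> = sum_i w_i * sum_j Sim(h_ij^m, h_ij^s)."""
    return sum(w_i * sum(sim(h_m, h_s) for h_m, h_s in zip(region_m, region_s))
               for w_i, region_m, region_s in zip(weights, H_m, H_s))

def classify(models, H_s, weights):
    """models: dict mapping gesture label -> H_m; returns the label with the largest similarity."""
    return max(models, key=lambda label: model_similarity(models[label], H_s, weights))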
Step 200: output the classification result.
(2) The recognition application procedure comprises the following steps:
Step 310: input the gesture image to be recognized, as shown in Fig. 4b;
Step 320: extract the feature points of the input gesture image;
Step 330: filter the extracted feature points with the filtering algorithm based on spectral clustering to obtain the feature points that belong to the foreground; 23 feature points in total belong to the foreground;
Step 340: cluster the foreground feature points with the HIK-based clustering algorithm; the clustering result is 11 triangular feature points, 7 pentagram feature points, and 5 circular feature points;
Step 350: partition all feature points according to their spatial distribution relative to the hand center: with the hand center as the origin, divide the image into an inner region, a middle region and an outer region, and group the feature points by region. The inner region contains 6 feature points (5 triangular, 1 pentagram); the middle region contains 8 feature points (4 triangular, 3 pentagram, 1 circular); the outer region contains 9 feature points (2 triangular, 3 pentagram, 4 circular);
Step 360: as shown in Fig. 5b, project the feature points of each group onto the horizontal and vertical axes, build a histogram data structure for each projection direction, and compute and store the spatial distribution information of the feature points;
Step 370: combine the spatial partition information and the projection information of every feature point to construct the hierarchical BoF model;
Step 380: compare the histogram data of the sample with the histogram data of the classifier for similarity; the classifier whose histogram data agrees most closely with the sample's histogram data in the sub-structures at the same positions is selected as the recognition result;
Step 390: output the recognition result.
What has been described above is a specific embodiment of the present invention and the technical principles it employs. Any modification made according to the concept of the present invention, whose resulting function does not go beyond the spirit covered by the specification and the drawings, shall still fall within the scope of protection of the present invention.

Claims (2)

1. A static hand gesture recognition method fusing a BoF model and spectral clustering, characterized by comprising a recognition training procedure and a recognition application procedure, the recognition training procedure comprising the following steps:
1) input positive and negative training samples;
2) extract the feature points of the positive and negative training samples;
3) filter the extracted feature points with a filtering algorithm based on spectral clustering to obtain the feature points that belong to the foreground;
4) cluster the foreground feature points of all samples with a clustering algorithm based on the histogram intersection kernel (HIK);
5) partition all feature points according to their spatial distribution relative to the hand center: with the hand center as the origin, divide the image into several concentric regions and group the feature points by region;
6) project the feature points of each group onto the horizontal and vertical axes, build a histogram data structure for each projection direction, and compute and store the spatial distribution information of the feature points;
7) combine the spatial partition information and the projection information of every feature point to build the hierarchical BoF model;
8) pass the training samples of each class through the hierarchical BoF model to obtain the histogram data of each class, and finally build the classifier by normalization;
9) invoke the constructed classifier and classify a sample with a voting mechanism: compare the histogram data of the sample with the histogram data of the classifier for similarity; the more data the two histograms share in the sub-structures at the same positions, the more similar the sample is to that classifier; after comparison with the BoF models of all classes, the closest one is selected as the recognition result;
10) output the classification result;
the recognition application procedure comprising the following steps:
1) input the gesture image to be recognized;
2) extract the feature points of the input gesture image;
3) filter the extracted feature points with a filtering algorithm based on spectral clustering to obtain the feature points that belong to the foreground;
4) cluster the foreground feature points with the HIK-based clustering algorithm;
5) partition all feature points according to their spatial distribution relative to the hand center: with the hand center as the origin, divide the image into several concentric regions and group the feature points by region;
6) project the feature points of each group onto the horizontal and vertical axes, build a histogram data structure for each projection direction, and compute and store the spatial distribution information of the feature points;
7) combine the spatial partition information and the projection information of every feature point to construct the hierarchical BoF model;
8) invoke the classifier obtained by the recognition training procedure and classify the sample with a voting mechanism: compare the histogram data of the sample with the histogram data of the classifier for similarity; the more data the two histograms share in the sub-structures at the same positions, the more similar the sample is to that classifier; after comparison with the BoF models of all classes, the closest one is selected as the recognition result;
9) output the classification result.
2. The static hand gesture recognition method fusing a BoF model and spectral clustering according to claim 1, characterized in that the extraction of the feature points of the positive and negative training samples in step 2) of the recognition training procedure and the extraction of the feature points of the input gesture image in step 2) of the recognition application procedure both use the ASIFT algorithm.
CN2012104206339A 2012-10-24 2012-10-24 Static hand gesture recognition method fused with BoF model and spectral clustering algorithm Pending CN102968618A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2012104206339A CN102968618A (en) 2012-10-24 2012-10-24 Static hand gesture recognition method fused with BoF model and spectral clustering algorithm


Publications (1)

Publication Number Publication Date
CN102968618A true CN102968618A (en) 2013-03-13

Family

ID=47798750

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012104206339A Pending CN102968618A (en) 2012-10-24 2012-10-24 Static hand gesture recognition method fused with BoF model and spectral clustering algorithm

Country Status (1)

Country Link
CN (1) CN102968618A (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090299999A1 (en) * 2009-03-20 2009-12-03 Loui Alexander C Semantic event detection using cross-domain knowledge
CN101661556A (en) * 2009-09-25 2010-03-03 哈尔滨工业大学深圳研究生院 Static gesture identification method based on vision
CN102663446A (en) * 2012-04-24 2012-09-12 南方医科大学 Building method of bag-of-word model of medical focus image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Yuelong Chuang et al., "Hierarchical bag-of-features for hand posture recognition", 2011 18th IEEE International Conference on Image Processing (ICIP), 2011, pp. 1777-1780 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103559504A (en) * 2013-11-04 2014-02-05 北京京东尚科信息技术有限公司 Image target category identification method and device
CN103559504B (en) * 2013-11-04 2016-08-31 北京京东尚科信息技术有限公司 Image target category identification method and device
CN105279526A (en) * 2014-06-13 2016-01-27 佳能株式会社 Trajectory segmentation method and device
CN105279526B (en) * 2014-06-13 2019-11-29 佳能株式会社 Divide the method and apparatus of track
CN105989266A (en) * 2015-02-11 2016-10-05 北京三星通信技术研究有限公司 Electrocardiosignal-based authentication method, apparatus and system
US10089451B2 (en) 2015-02-11 2018-10-02 Samsung Electronics Co., Ltd. Electrocardiogram (ECG)-based authentication apparatus and method thereof, and training apparatus and method thereof for ECG-based authentication
CN105989266B (en) * 2015-02-11 2020-04-03 北京三星通信技术研究有限公司 Authentication method, device and system based on electrocardiosignals
CN109472307A (en) * 2018-11-07 2019-03-15 郑州云海信息技术有限公司 A kind of method and apparatus of training image disaggregated model


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
DD01 Delivery of document by public notice

Addressee: Zhejiang Hongcheng Computer Systems Co., Ltd.

Document name: the First Notification of an Office Action

DD01 Delivery of document by public notice

Addressee: Zhejiang Hongcheng Computer Systems Co., Ltd.

Document name: Notification that Application Deemed to be Withdrawn

C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20130313