CN106529594A - Supervised dimension reduction algorithm for big data behavior recognition - Google Patents


Info

Publication number
CN106529594A
CN106529594A (application CN201610982038.2A)
Authority
CN
China
Prior art keywords
lasrc
class
algorithm
sample
classification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610982038.2A
Other languages
Chinese (zh)
Other versions
CN106529594B (en)
Inventor
简献忠
周小朋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN201610982038.2A priority Critical patent/CN106529594B/en
Publication of CN106529594A publication Critical patent/CN106529594A/en
Application granted granted Critical
Publication of CN106529594B publication Critical patent/CN106529594B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2136 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on sparsity criteria, e.g. with an overcomplete basis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/28 Determining representative reference patterns, e.g. by averaging or distorting; Generating dictionaries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a supervised dimension-reduction algorithm (OP-LASRC) for big-data behavior recognition. The algorithm is coupled with the linearly approximated sparse representation based classification (LASRC) algorithm so that class information is preserved when high-dimensional behavior data are projected onto a low-dimensional space, effectively reducing the data dimension. OP-LASRC takes the classification residual as its criterion and pursues a linear orthogonal projection, which gives OP-LASRC its supervised character. High-dimensional behavior images are converted into compact features that retain discriminative information before classification, which reduces computation and storage and improves classification efficiency, so the LASRC algorithm achieves a higher recognition rate. On the KTH behavior database, OP-LASRC is verified in terms of accuracy, speed, and robustness, confirming that it matches the LASRC algorithm well. Coupling dimension reduction with the classification structure in this way forms a behavior-recognition system that can be applied efficiently to big-data behavior recognition.

Description

Supervised dimension-reduction algorithm applied to big-data behavior recognition
Technical field
The present invention relates to an image-data processing technology, and more particularly to a supervised dimension-reduction algorithm applied to big-data behavior recognition.
Background technology
Human behavior recognition has attracted wide attention in pattern recognition and machine vision. It has broad application prospects not only in intelligent surveillance, motion analysis, identity verification, and human-computer interaction, but also in monitoring abnormal behavior in places with safety hazards, for example traffic accidents, electrical safety, and medical monitoring (Document 1: Chen L, Wei H, Ferryman J. A survey of human motion analysis using depth imagery [J]. Pattern Recognition Letters, 2013, 34(15): 1995-2006). Human actions are spatially complex: in big-data behavior recognition, describing the spatio-temporal motion of a target in a two-dimensional image sequence requires many frames of surveillance video for even a single action, so training data composed of large numbers of human behaviors are often huge, and processing them requires a large amount of computation time. Overall, however, two-dimensional behavior recognition is faster than three-dimensional (Document 2: Gu Junxia, Ding Xiaoqing, Wang Shengjin. 2D behavior recognition based on 3D human behavior models [J]. Acta Automatica Sinica, 2010, 36(1): 46-53) and is therefore suitable for big data. How to classify large volumes of two-dimensional behavior data quickly, accurately, and stably under varying illumination, viewpoint, and background remains an urgent open problem in behavior recognition (Document 3: Candamo J, Shreve M, Goldgof D B, et al. Understanding Transit Scenes: A Survey on Human Behavior-Recognition Algorithms [J]. IEEE Transactions on Intelligent Transportation Systems, 2010, 11(1): 206-224). To reach this goal, researchers at home and abroad have carried out numerous studies on both dimension reduction and faster classifiers (Document 4: Huang Kaiqi, Chen Xiaotang, Kang Yun, et al. A survey of intelligent video surveillance technology [J]. Chinese Journal of Computers, 2015(6): 1093-1118).
Over the past twenty years, many classifiers have been proposed, but few have been studied for big-data behavior recognition; such a classifier must handle behavior data of many kinds and quantities while keeping recognition fast, accurate, and stable. Traditional classifiers include the support vector machine (SVM) and k-nearest neighbors (NN). Document 5 (Ren Xiaofang, Qin Jianyong, Yang Jie, et al. Application of energy-based LS-TSVM in human action recognition [J]. Application Research of Computers, 2016, 33(2): 598-601) uses an energy-based LS-TSVM classification method with two hyperplanes, introducing an energy parameter for each hyperplane to reduce the impact of noise and outliers and improve recognition efficiency; but SVM classifiers remain hard to optimize and computationally intensive. Document 6 (Liu L, Shao L, Rockett P. Human action recognition based on boosted feature selection and naive Bayes nearest-neighbor classification [J]. Signal Processing, 2013, 93(6): 1521-1530) uses the NB-NN classification algorithm, which needs no training time and only queries for the minimum-distance sample, but its recognition rate needs improvement. John Wright proposed the SRC algorithm (Document 7: Wright J, Yang A Y, Ganesh A, et al. Robust Face Recognition via Sparse Representation [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2009, 31(2): 210-227), famous for its strong robustness to occlusion, noise, and illumination. Document 8 (Liu C, Yang Y, Chen Y. Human Action Recognition using Sparse Representation [C]. IEEE International Conference on Intelligent Computing and Intelligent Systems, 2009, 4: 184-188) applied SRC to behavior recognition, achieving a higher recognition rate than NN, but a large amount of time is consumed solving the L1-norm problem. To accelerate classification, Document 9 (Zhang L, Yang M, Feng X. Sparse representation or collaborative representation: Which helps face recognition? [C]. International Conference on Computer Vision, 2011, 6669(5): 471-478) proposed the collaborative representation classification (CRC) algorithm, which solves with the L2 norm and greatly improves recognition efficiency, but reduces the robustness of the algorithm. The LASRC algorithm proposed by Document 10 (Ortiz E G, Becker B C. Face recognition for web-scale datasets [J]. Computer Vision and Image Understanding, 2014, 118(1): 153-170) first estimates the coefficient vector quickly with the L2 norm, screens the sample library to find the samples corresponding to the k largest coefficients, and then classifies the resulting low-capacity sample set with SRC, accelerating the algorithm. LASRC can therefore classify large numbers of images quickly, and this work finds that it can be used for big-data behavior recognition.
Even with the LASRC classifier, classifying big data is computationally heavy and speed still needs improvement. High-dimensional behavior data contain a large amount of irrelevant and redundant information (Document 11: Hu Jie. A review of feature dimension reduction for high-dimensional data [J]. Application Research of Computers, 2008, 25(09): 2601-2606), which causes the curse of dimensionality and severely affects classification speed. Using feature dimension reduction to select a low-dimensional feature set from the initial high-dimensional one can greatly improve classification efficiency. Classical methods are principal component analysis (PCA) (Document 12: Turk M A, Pentland A P. Face recognition using eigenfaces [C]. Proceedings of the 1991 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), 1991: 586-591) and linear discriminant analysis (LDA) (Document 13: Belhumeur P, Hespanha P, Kriegman D. Recognition using class specific linear projection [J]. IEEE Trans Pattern Anal Mach Intell, 1997, 19(7): 711-720). PCA is unsupervised and finds its mapping matrix by maximizing variance; LDA is supervised and obtains a projection matrix by maximizing between-class scatter while minimizing within-class scatter. Neither method reveals the critical structure embedded in high-dimensional nonlinear data. To overcome this limitation, a class of manifold-learning methods was proposed, the classic one being locality preserving projections (LPP) (Document 14: He X, Yan S, Hu Y, et al. Face recognition using laplacianfaces [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(3): 328-340), which assumes the low-dimensional data are sampled from a latent manifold in the high-dimensional space. Being unsupervised, this algorithm ignores class labels and has shortcomings for classification. Document 15 (Zhao Z S, Zhang L, Zhao M, et al. Gabor face recognition by multi-channel classifier fusion of supervised kernel manifold learning [J]. Neurocomputing, 2012, 97: 398-404) proposed supervised kernel manifold learning for dimension reduction (SLPP), which uses class-label information through a similarity matrix to strengthen within-class information and reduce between-class information, allowing classification to achieve better results.
Although supervised dimension reduction favors better classification, the above methods have no direct link to sparse-representation classification. Document 16 (Qiao L S, Chen S C, Tan X Y. Sparsity preserving projections with applications to face recognition [J]. Pattern Recognition, 2010, 43: 331-341) proposed sparsity preserving projections (SPP), which represents each given sample sparsely with all training samples and seeks a linear projection that preserves the sparse representation coefficients. Document 17 (Yang J, Chu D, Zhang L, et al. Sparse representation classifier steered discriminative projection with applications to face recognition [J]. IEEE Transactions on Neural Networks & Learning Systems, 2013, 24(7): 1023-1035) proposed the sparse-representation-classifier steered discriminative projection (SRC-DP), which establishes a natural link between dimension reduction and classification, using the residual-computation rule of the SRC algorithm as a new criterion to steer feature extraction; but its projection matrix requires iterative computation, which takes long and is unfavorable for big-data behavior recognition. Document 18 (Hua J, Wang H, Ren M, et al. Dimension Reduction Using Collaborative Representation Reconstruction Based Projections [J]. Neurocomputing, 2016, 193: 1-6) and Document 19 (Yin J, Wei L, Song M, et al. Optimized projection for Collaborative Representation based Classification and its applications to face recognition [J]. Pattern Recognition Letters, 2016, 73: 83-90) applied this idea to the CRC algorithm: the CRC-steered discriminative projection (CRC-DP) reduces dimension and improves classification efficiency, but the reduction still requires iterative computation. To eliminate the iteration cost, Document 20 (Lu C Y, Huang D S. Optimized projections for sparse representation based classification [J]. Neurocomputing, 2013, 113: 213-219) proposed optimized projections for sparse representation classification (OP-SRC), which computes the projection matrix for supervised dimension reduction directly and matches SRC classification, improving the classification efficiency of SRC.
Content of the invention
The present invention addresses the poor real-time performance and low recognition rate of big-data human behavior recognition. Based on the idea of OP-SRC, it proposes a supervised dimension-reduction algorithm, optimized projection for linearly approximated sparse representation classification (OP-LASRC), coupled with the fast LASRC classification algorithm. When high-dimensional behavior data are projected onto a low-dimensional space, class information is retained and the data dimension is effectively reduced. OP-LASRC takes the classification residual as its criterion and pursues a linear orthogonal projection, which gives OP-LASRC its supervised character. High-dimensional behavior images are converted into compact features carrying discriminative information for classification, so the amount of computation is small, storage is reduced, and classification efficiency improves, allowing the fast LASRC classification algorithm to reach a higher recognition rate. The OP-LASRC algorithm is verified on the KTH behavior database in terms of accuracy, speed, and robustness, confirming that OP-LASRC matches the LASRC algorithm well; coupling dimension reduction and classification in this structure forms a behavior-recognition system that can be applied efficiently to big-data behavior recognition.
The technical scheme of the invention is a supervised dimension-reduction algorithm applied to big-data behavior recognition, comprising the following steps:
1) Apply principal component analysis (PCA) dimension-reduction treatment to the training samples and the test sample y, keeping feature information.
2) Form the over-complete dictionary A from the training samples after dimension reduction:
A = [A_1, A_2, ..., A_c] = [v_{1,1}, v_{1,2}, ..., v_{1,j}, v_{2,1}, v_{2,2}, ..., v_{i,j}], i = c, j = e,
A = [A_1, A_2, ..., A_c] ∈ R^{N×M}, where A is a matrix of N rows and M columns with c classes and e images per class, c × e = n images in total, each sample being v.
Separate out each training sample in order as a test sample and compute its corresponding sparse coefficient x̂ = A^+ y, where A^+ is the pseudo-inverse of A.
3) Compute the within-class reconstructed residual R_W and the between-class reconstructed residual R_B with the formulas below, and obtain the matrix P from (β R_B − R_W) p_k = λ_k p_k, k = 1, 2, ..., d, d ≪ N, where λ_k is an eigenvalue of β R_B − R_W, p_k is the corresponding eigenvector, and β is a constant parameter that balances the information of the within-class and between-class reconstructed residuals:
R_W = (1/n) Σ_{i=1}^{c} Σ_{j=1}^{e} (v_{i,j} − A δ_i(x_0)) (v_{i,j} − A δ_i(x_0))^T
R_B = (1/(n(c−1))) Σ_{i=1}^{c} Σ_{j=1}^{e} Σ_{l≠i} (v_{i,j} − A δ_l(x_0)) (v_{i,j} − A δ_l(x_0))^T
where δ_l(x_0) denotes the whole sparse coefficient vector and δ_i(x_0) the coefficients of each class.
4) Form the supervised reduced matrix B = P^T A, normalize it with the L2 norm, and then estimate the corresponding sparse coefficient x̂ = B^+ y.
5) From the sparse coefficient x̂ select the w largest coefficients, find the classes of the corresponding training samples, and form a new over-complete dictionary Ω from those classes' training samples; the test sample is expressed as y = Ω x_0.
6) Solve with the L1 norm to obtain the sparse coefficient x_0: x_0 = argmin_x ( ||y − Ω x||_2^2 + λ ||x||_1 ), where λ is the sparse control coefficient.
7) Compute the residuals r_i(y) = ||y − Ω δ_i(x_0)||_2; the class with the minimum residual is the recognition result.
The beneficial effects of the invention are as follows. The supervised dimension-reduction algorithm for big-data behavior recognition starts from both dimension reduction and fast classification: the proposed OP-LASRC, matched with LASRC, is successfully applied to big-data behavior recognition. LASRC's fast classification computes sparse coefficients quickly with the L2 norm and selects the k largest coefficients to form a new training set, successfully excluding highly dissimilar samples; the reduced sample library is then computed accurately with the L1 norm, guaranteeing the recognition rate, and the reduction in sample count shrinks the width of the training data. The dimension reduction is the supervised OP-LASRC: when high-dimensional image data are projected to low-dimensional data, the optimization retains discriminative features, and this discriminative low-dimensional data lets the residual computation of sparse-representation classification avoid gross errors, achieving a high recognition rate with little computation while keeping the strong robustness of sparse-representation classification; the dimension reduction shrinks the height of the training data. Experimentally, the recognition rate is 96.5%, robustness is strong, and the execution time is short, showing that OP-LASRC matches LASRC well and makes classification highly efficient. Reducing the amount of data processed in both width and height opens a new line of thought for big-data behavior recognition.
Description of the drawings
Fig. 1 compares overall recognition rate versus dimension for PCA, LDA, and LPP dimension reduction and the OP-LASRC dimension-reduction algorithm with 4 samples per class;
Fig. 2 compares overall recognition rate versus dimension for PCA, LDA, and LPP dimension reduction and the OP-LASRC dimension-reduction algorithm with 5 samples per class;
Fig. 3 compares overall recognition rate versus dimension for PCA, LDA, and LPP dimension reduction and the OP-LASRC dimension-reduction algorithm with 6 samples per class;
Fig. 4 compares overall recognition rate versus dimension for PCA, LDA, and LPP dimension reduction and the OP-LASRC dimension-reduction algorithm with 7 samples per class;
Fig. 5 shows test samples of the invention with added noise damage.
Specific embodiment
1. Principle: the OP-LASRC algorithm
OP-LASRC is a method that optimizes the projection on top of PCA dimension reduction. When high-dimensional data are projected to low-dimensional data, retaining discriminative features amounts to keeping the class labels of the images, and this discriminative information is particularly important for sparse-representation classification.
The LASRC classification algorithm requires an over-complete dictionary A = [A_1, A_2, ..., A_c] ∈ R^{N×M} (A is a matrix of N rows and M columns) with c classes in total and e images per class, c × e = n images altogether. Each class of images is denoted A_i, i = 1, 2, ..., c; each sample is v, with A_i = [v_{i,1}, v_{i,2}, v_{i,3}, ..., v_{i,j}], j = e. The over-complete dictionary A can be written as

A = [A_1, A_2, ..., A_c] = [v_{1,1}, v_{1,2}, ..., v_{1,j}, v_{2,1}, v_{2,2}, ..., v_{i,j}]

Each test sample y can be expressed linearly by the training samples:

y = α_{i,1} v_{i,1} + α_{i,2} v_{i,2} + α_{i,3} v_{i,3} + ... + α_{i,j} v_{i,j} = A x_0

where x_0 is the coefficient vector of the equation. If x_0 is sparse, then ideally only the coefficients of the training samples of the same class as the test sample are nonzero and all other coefficients are 0. When n is sufficiently large, x_0 is expressed as

x_0 = [0, ..., 0, α_{i,1}, α_{i,2}, ..., α_{i,j}, 0, ..., 0]^T
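As a small illustration of this linear-representation model (a sketch with made-up sizes: c = 3 classes, e = 4 images per class, N = 8), a test sample built from class-2 training samples alone is reproduced by a coefficient vector that is nonzero only in the class-2 block:

```python
import numpy as np

rng = np.random.default_rng(0)
c, e, N = 3, 4, 8                    # classes, images per class, feature dimension
A = rng.normal(size=(N, c * e))      # over-complete dictionary, one sample per column

# Ideal sparse coefficient vector: nonzero only in the class-2 block.
x0 = np.zeros(c * e)
x0[e:2 * e] = [0.5, 0.2, 0.2, 0.1]   # alpha_{2,1}, ..., alpha_{2,e}

y = A @ x0                           # test sample as a class-2 combination
```

Because y lies in the span of the class-2 columns alone, reconstructing it from just that block leaves zero residual, which is exactly what the class-wise residual test of formula (4) exploits.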
The coefficient vector x_0 can be obtained by solving the equation y = A x_0, which is first turned into an L2-norm problem whose least-squares solution is computed with the pseudo-inverse:

x̂ = A^+ y  (2)

where A^+ is the Moore-Penrose pseudo-inverse of A. The pseudo-inverse, least-squares computation is more convenient and faster than solving with the L1 norm. From x̂, select the w largest coefficients, find the training samples corresponding to these w largest coefficients, and form a new over-complete dictionary Ω; the test sample can be written y = Ω x_0. This equation is solved with the L1 norm:

x_0 = argmin_x ( ||y − Ω x||_2^2 + λ ||x||_1 )  (3)

where λ is the sparse control coefficient, taken as 0.01 following Document 10.
With the sparse coefficients δ_i(x_0) of each class, compute the residual

r_i(y) = ||y − Ω δ_i(x_0)||_2  (4)

The recognition result is

I(y) = min_i r_i(y)  (5)
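The LASRC steps just described (formulas (2) through (5)) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the ISTA loop is an assumed L1 solver (the source only requires that formula (3) be solved with the L1 norm), and the parameter defaults are illustrative.

```python
import numpy as np

def lasrc_classify(A, labels, y, w=10, lam=0.01, n_iter=300):
    """Sketch of LASRC: L2 screening followed by an L1 sparse-coding step.

    A      : (N, M) over-complete dictionary, one training sample per column
    labels : (M,) class index of each column
    y      : (N,) test sample
    """
    # Formula (2): fast L2 estimate via the Moore-Penrose pseudo-inverse.
    x_hat = np.linalg.pinv(A) @ y
    # Keep the training samples behind the w largest coefficients.
    keep = np.argsort(np.abs(x_hat))[::-1][:w]
    Omega, kept = A[:, keep], labels[keep]
    # Formula (3): L1-regularized solve on the reduced dictionary (ISTA, assumed).
    L = np.linalg.norm(Omega, ord=2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(Omega.shape[1])
    for _ in range(n_iter):
        g = x - Omega.T @ (Omega @ x - y) / L        # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    # Formulas (4)-(5): the class with minimum residual is the result.
    classes = np.unique(kept)
    res = [np.linalg.norm(y - Omega @ np.where(kept == cl, x, 0.0))
           for cl in classes]
    return classes[int(np.argmin(res))]
```

The screening step is what makes LASRC fast: the expensive L1 solve only sees the w retained columns instead of the full dictionary.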
It is easy to see that the LASRC algorithm can be used for big-data behavior recognition. For the over-complete dictionary A, however, a mapping matrix P ∈ R^{N×d} (a matrix of d columns, d ≪ N) can be applied: under the linear transform y = P^T v, a sample v is mapped from N dimensions to d dimensions, each sample v_{i,j} being transformed as y_{i,j} = P^T v_{i,j}, yielding a new dictionary B = P^T A. This new dictionary replaces the original over-complete dictionary A in the LASRC algorithm for computing the minimum residual and obtaining the recognition result.
To construct P, separate one sample y_{i,j} at a time from the training set, in order, as a test sample; compute its coefficient vector x_0 by the pseudo-inverse of formula (2); let δ_l(x_0) denote the whole sparse coefficient vector and δ_i(x_0) the sparse coefficients of each class, with residual r(y_{i,j}) = ||y_{i,j} − A δ_i(x_0)||_2.
The within-class reconstructed residual matrix is defined as

R_W = (1/n) Σ_{i=1}^{c} Σ_{j=1}^{e} (v_{i,j} − A δ_i(x_0)) (v_{i,j} − A δ_i(x_0))^T  (6)

The between-class reconstructed residual matrix is defined as

R_B = (1/(n(c−1))) Σ_{i=1}^{c} Σ_{j=1}^{e} Σ_{l≠i} (v_{i,j} − A δ_l(x_0)) (v_{i,j} − A δ_l(x_0))^T  (7)

The statistics residual matrix combines the two as β R_B − R_W  (8)
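The accumulation of formulas (6) and (7) can be sketched as below. One detail is an assumption: each training sample is held out in turn and coded over the full dictionary (the source does not state whether the held-out column should be removed from A before coding).

```python
import numpy as np

def residual_matrices(A, labels):
    """Sketch of formulas (6)-(7): within-class (R_W) and between-class (R_B)
    reconstructed residual matrices, built from leave-each-sample residuals."""
    N, n = A.shape
    classes = np.unique(labels)
    R_W, R_B = np.zeros((N, N)), np.zeros((N, N))
    pinv = np.linalg.pinv(A)
    for j in range(n):
        v = A[:, j]
        x = pinv @ v                                  # L2 estimate, as in formula (2)
        for cl in classes:
            delta = np.where(labels == cl, x, 0.0)    # keep only class-cl coefficients
            r = v - A @ delta                         # reconstruction residual
            if cl == labels[j]:
                R_W += np.outer(r, r)                 # this-class residual term
            else:
                R_B += np.outer(r, r)                 # between-class residual term
    return R_W / n, R_B / (n * (len(classes) - 1))
```

The criterion matrix of formula (11) is then beta * R_B - R_W with beta = 0.25.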
To obtain better recognition results from the residual minimum during the execution of the LASRC algorithm, the within-class reconstructed residual should be made as small as possible and the between-class reconstructed residual as large as possible, so a maximization criterion is selected. β is a constant parameter that balances the information of the within-class and between-class reconstructed residuals and, following the reference, is taken as 0.25. With B = P^T A, the projected residual matrices P^T R_W P and P^T R_B P are readily obtained, so the criterion is

J(P) = tr(P^T (β R_B − R_W) P)  (11)
To prevent degenerate solutions, P = [p_1, p_2, ..., p_d] must consist of unit vectors with p_k^T p_k = 1, k = 1, ..., d. Other constraints are of course possible; for example, one can set tr(P^T R_W P) = 1 and then maximize tr(P^T R_B P). Requiring P^T P = I produces an orthogonal projection that retains the distribution structure of the data, so the objective function can be recast as the optimization problem

max_P tr(P^T (β R_B − R_W) P)  subject to  P^T P = I  (12)
Converting the objective function with a Lagrange multiplier gives

L(P, λ) = tr(P^T (β R_B − R_W) P) − λ (P^T P − I)  (13)

Taking the derivative with respect to p_k and setting it equal to 0,

∂L/∂p_k = 2 (β R_B − R_W) p_k − 2 λ_k p_k = 0  (14)

one obtains

(β R_B − R_W) p_k = λ_k p_k,  k = 1, 2, ..., d  (15)
λ_k is an eigenvalue of β R_B − R_W and p_k is the corresponding eigenvector. P is therefore composed of the eigenvectors corresponding to the d largest eigenvalues, and J(P) is maximized. Since P is orthogonal and β R_B − R_W is symmetric, multiplying by P during dimension reduction forms a supervised orthogonal projection; this projection retains a large amount of discriminative information, which is highly beneficial to LASRC classification. The matrix P so obtained is the effect of supervised dimension reduction: before the algorithm executes, PCA is first applied, the samples after PCA form the over-complete dictionary A, and B = P^T A is obtained; the dictionary B remains over-complete and is then used for LASRC classification, improving recognition efficiency.
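Formula (15) is an ordinary symmetric eigenvalue problem, so P can be obtained directly, with no iteration, as sketched here:

```python
import numpy as np

def supervised_projection(R_W, R_B, d, beta=0.25):
    """Sketch of formula (15): eigen-decompose beta*R_B - R_W and keep the
    eigenvectors of the d largest eigenvalues as the columns of P."""
    M = beta * R_B - R_W
    M = (M + M.T) / 2.0                    # symmetrize against round-off
    vals, vecs = np.linalg.eigh(M)         # eigh returns ascending eigenvalues
    return vecs[:, ::-1][:, :d]            # (N, d), orthonormal columns
```

The supervised reduced dictionary of Step 4 is then B = P.T @ A, followed by L2 normalization of its columns before LASRC classification.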
2. Algorithm flow
Step 1: Apply PCA dimension-reduction treatment to the training samples and the test sample y, keeping feature information.
Step 2: Form the over-complete dictionary A from the reduced training samples; separate out each training sample in order as a test sample and compute its corresponding sparse coefficient with formula (2).
Step 3: Compute the within-class and between-class reconstructed residuals with formula (10), and obtain the matrix P with formula (15).
Step 4: Form the supervised reduced matrix B = P^T A and normalize it with the L2 norm; then B replaces the matrix A in formula (2) to estimate the corresponding sparse coefficient x̂.
Step 5: From the sparse coefficient x̂ select the w largest coefficients, find the classes of the corresponding training samples, and form the new over-complete dictionary Ω from those classes' training samples.
Step 6: Solve with the L1 norm via formula (3) to obtain the sparse coefficient x_0.
Step 7: Compute the residuals with formula (4); the class with the minimum residual is the recognition result.
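The seven steps above can be strung together in one compact end-to-end sketch. All sizes (`pca_dim`, `d`, `w`) are illustrative choices, and the ISTA loop is an assumed L1 solver; the source fixes only lam = 0.01 and beta = 0.25.

```python
import numpy as np

def op_lasrc(train, labels, y, pca_dim, d, w=20, lam=0.01, beta=0.25):
    """End-to-end sketch of Steps 1-7 of the OP-LASRC + LASRC flow."""
    # Step 1: PCA on the centered training data; project y with the same basis.
    mean = train.mean(axis=1, keepdims=True)
    U = np.linalg.svd(train - mean, full_matrices=False)[0][:, :pca_dim]
    A = U.T @ (train - mean)
    y = U.T @ (y - mean.ravel())
    # Steps 2-3: leave-each-sample residuals -> R_W, R_B -> projection P.
    N, n = A.shape
    classes = np.unique(labels)
    R_W, R_B = np.zeros((N, N)), np.zeros((N, N))
    pinv = np.linalg.pinv(A)
    for j in range(n):
        x = pinv @ A[:, j]                       # formula (2) on sample j
        for cl in classes:
            r = A[:, j] - A @ np.where(labels == cl, x, 0.0)
            if cl == labels[j]:
                R_W += np.outer(r, r)
            else:
                R_B += np.outer(r, r)
    R_W /= n
    R_B /= n * (len(classes) - 1)
    M = beta * R_B - R_W
    vals, vecs = np.linalg.eigh((M + M.T) / 2.0)  # ascending eigenvalues
    P = vecs[:, ::-1][:, :d]                      # formula (15): top-d eigenvectors
    # Step 4: supervised reduction and L2 column normalization.
    B = P.T @ A
    B = B / np.linalg.norm(B, axis=0)
    yb = P.T @ y
    # Step 5: L2 screening keeps the w largest coefficients.
    keep = np.argsort(np.abs(np.linalg.pinv(B) @ yb))[::-1][:w]
    Omega, kept = B[:, keep], labels[keep]
    # Step 6: L1 solve on the reduced dictionary (ISTA sketch).
    L = np.linalg.norm(Omega, ord=2) ** 2
    x = np.zeros(Omega.shape[1])
    for _ in range(300):
        g = x - Omega.T @ (Omega @ x - yb) / L
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)
    # Step 7: the class with minimum residual is the recognition result.
    res = [np.linalg.norm(yb - Omega @ np.where(kept == cl, x, 0.0))
           for cl in classes]
    return classes[int(np.argmin(res))]
```

Note how the data shrink twice, matching the patent's "width and height" description: the projection P reduces each sample's dimension, and the screening step reduces the number of dictionary columns the L1 solver sees.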
3. Experiments
1. Accuracy experiment
To verify the algorithm, the KTH behavior database was chosen as experimental data. In the tests, PCA, LDA, and LPP dimension reduction are compared against OP-LASRC dimension reduction, followed in each case by LASRC classification. Recognition rates differ at different reduced dimensions; the relation between overall recognition rate and dimension is shown in Figs. 1 to 4. The number of samples in each class also affects the recognition rate, so the experiment uses 4, 5, 6, and 7 samples per class. The original sample library collects 10 images per class; if, say, 5 of them are taken as training samples, the other 5 serve as test samples, and the average recognition rate is taken over successive tests. Figs. 1 to 4 compare overall recognition rate against dimension for PCA, LDA, LPP, and OP-LASRC dimension reduction with 4, 5, 6, and 7 samples per class respectively.
As Figs. 1 to 4 show, recognition rates differ with the number of samples and rise as the dimension increases, essentially peaking at 200 dimensions. The figures clearly show that the recognition rate with OP-LASRC dimension reduction is higher than with PCA, LDA, or LPP; the maximum average recognition rates of OP-LASRC for the different sample numbers are 93% with 4 samples, 96.5% with 5 samples, 96.8% with 6 samples, and 97.0% with 7 samples. That OP-LASRC beats PCA, LDA, and LPP on recognition rate illustrates the feasibility of OP-LASRC, which is the premise of accurate behavior recognition.
2. Classification comparison experiment
With LASRC after supervised dimension reduction to 200 dimensions, the maximum average recognition rate is compared with other algorithms; the recognition rates of the six algorithms appear in Table 1 below.
Compared with the other algorithms, the LASRC recognition rate with supervised dimension reduction alone is higher than NB-NN (Document 6), linear SVM (Moayedi F, Azimifar Z, Boostani R. Structured sparse representation for human action recognition [J]. Neurocomputing, 2015, 161(C): 38-46), and the CRC algorithm (Document 9), and the same as the SRC algorithm (Document 7); without supervised dimension reduction, LASRC reaches a recognition rate of only 85.4%. It can be seen that after OP-LASRC supervised dimension reduction, LASRC is guaranteed some advantage in recognition rate.
Table 1
3rd, robustness experiment:
LASRC algorithms improve recognition speed on the basis of SRC algorithms, still do not affect its degree of accuracy and robustness, Add noise in the test to test sample, damage percentage such as Fig. 5 of picture, 1~5 width figure in Fig. 5 is separately added into:Average is 0, variance corresponds to 0.2,0.5,0.1,0.2,0.3 Gaussian noise.PCA dimensionality reductions and OP-LASRC supervision dimensionality reductions is adopted during classification To 200 dimensions, tested with 5 sample classifications, its discrimination such as table 2.
Table 2 shows that when the image corruption ratio is below 60%, the recognition rate under OP-LASRC supervised dimension reduction stays above 90% and is overall higher than that under PCA dimension reduction. This shows that even though the data is reduced under OP-LASRC supervised dimension reduction, the LASRC algorithm remains highly robust.
Table 2
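The noise corruption used in the robustness experiment can be sketched as follows. This is a minimal illustration, assuming test images are stored as NumPy arrays scaled to [0, 1]; the function name and the clipping step are assumptions, not part of the patent.

```python
import numpy as np

def add_gaussian_noise(img, mean=0.0, var=0.2, seed=None):
    """Corrupt an image with additive Gaussian noise of the given mean and
    variance, then clip back to the valid [0, 1] intensity range."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(mean, np.sqrt(var), size=img.shape)
    return np.clip(img + noise, 0.0, 1.0)

# Variances used for the five test images in Fig. 5 (mean 0 throughout)
variances = [0.2, 0.5, 0.1, 0.2, 0.3]
noisy_set = [add_gaussian_noise(np.full((32, 32), 0.5), var=v, seed=0)
             for v in variances]
```

Each corrupted image would then be vectorized and classified after PCA or OP-LASRC dimension reduction, as in the experiment above.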

Claims (1)

1. A supervised dimension reduction algorithm applied to big data behavior recognition, characterized in that it comprises the following steps:
1) apply principal component analysis (PCA) dimension reduction to the training samples and the test sample y, preserving the feature information;
2) form the over-complete dictionary A from the training samples after dimension reduction:
A = [A_1, A_2, ..., A_c] = [v_{1,1}, v_{1,2}, ..., v_{1,j}, v_{2,1}, v_{2,2}, ..., v_{i,j}], i = c, j = e,
with A = [A_1, A_2, ..., A_c] ∈ R^{N×M}, i.e. A is a matrix of N rows and M columns containing c classes with e images per class, c × e = n images in total, each sample denoted v,
each training sample is isolated in order as a test sample, and the corresponding sparse coefficient x_0 is calculated;
3) compute the within-class reconstruction residual R_W and the between-class reconstruction residual R_B with the formulas below, and obtain the matrix P from (βR_B − R_W)p_k = λ_k p_k, k = 1, 2, ..., d, d ≪ N, where λ_k is an eigenvalue of βR_B − R_W, p_k is the corresponding eigenvector, and β is a constant parameter that balances the within-class and between-class reconstruction residual information:
R_W = (1/n) Σ_{i=1}^{c} Σ_{j=1}^{e} (v_{ij} − A δ_i(x_0))(v_{ij} − A δ_i(x_0))^T
R_B = (1/(n(c−1))) Σ_{i=1}^{c} Σ_{j=1}^{e} Σ_{l≠i} (v_{ij} − A δ_l(x_0))(v_{ij} − A δ_l(x_0))^T
where δ_l(x_0) is the whole sparse coefficient vector x_0 with only the entries belonging to class l retained, i.e. the coefficients of each class;
4) obtain the matrix B = P^T A after supervised dimension reduction, apply L2 normalization, and then estimate the corresponding sparse coefficients;
5) select the w largest coefficients from the sparse coefficients, find the classes of the corresponding training samples, and form a new over-complete dictionary Ω from the training samples of those classes, so that the test sample is expressed as y = Ωx_0;
6) obtain the sparse coefficient x_0 by solving, with the L1 norm, x_0 = argmin_x ( ||y − Ωx||_2^2 + λ||x||_1 ), where λ is the sparsity control coefficient;
7) calculate the residual r_i(y) = ||y − Ωδ_i(x_0)||_2 for each class i; the class with the minimum residual is the recognition result.
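The supervised projection (steps 2-3) and the classification stage (steps 4-7) of the claim can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the patented implementation: the per-sample sparse coefficients of step 2 and the L1-regularized solve of step 6 are replaced by ordinary least squares as stand-ins, and the function names, leave-one-out details, and parameter defaults are assumptions.

```python
import numpy as np

def op_lasrc_projection(A, labels, beta=1.0, d=200):
    """Steps 2-3 (sketch): build the within-class residual R_W and the
    between-class residual R_B from leave-one-out reconstructions, then take
    the d leading eigenvectors of beta*R_B - R_W as the projection P."""
    N, M = A.shape
    classes = np.unique(labels)
    c = len(classes)
    R_W = np.zeros((N, N))
    R_B = np.zeros((N, N))
    for j in range(M):
        v = A[:, j]
        mask = np.arange(M) != j          # isolate sample j as a "test" sample
        coef = np.linalg.lstsq(A[:, mask], v, rcond=None)[0]  # LS stand-in for x_0
        x0 = np.zeros(M)
        x0[mask] = coef
        for l in classes:
            delta_l = np.where(labels == l, x0, 0.0)  # delta_l(x_0): class-l entries
            r = v - A @ delta_l
            if l == labels[j]:
                R_W += np.outer(r, r)     # within-class reconstruction residual
            else:
                R_B += np.outer(r, r)     # between-class reconstruction residual
    R_W /= M
    R_B /= M * (c - 1)
    # (beta*R_B - R_W) p_k = lambda_k p_k; keep the d leading eigenvectors
    vals, vecs = np.linalg.eigh(beta * R_B - R_W)
    return vecs[:, np.argsort(vals)[::-1][:d]]

def classify(y, A, labels, P, w=3):
    """Steps 4-7 (sketch): project with B = P^T A, L2-normalize, pre-select the
    classes of the w largest coefficients to form the new dictionary Omega, and
    assign the class with minimum reconstruction residual (least squares again
    stands in for the L1 solve of step 6)."""
    B = P.T @ A
    B = B / np.linalg.norm(B, axis=0)      # L2 normalization of the columns
    yp = P.T @ y
    x = np.linalg.lstsq(B, yp, rcond=None)[0]
    cand = np.unique(labels[np.argsort(np.abs(x))[::-1][:w]])
    keep = np.isin(labels, cand)
    Omega, sub = B[:, keep], labels[keep]  # dictionary Omega from selected classes
    x0 = np.linalg.lstsq(Omega, yp, rcond=None)[0]
    resid = {l: np.linalg.norm(yp - Omega[:, sub == l] @ x0[sub == l])
             for l in cand}
    return min(resid, key=resid.get)       # class of minimum residual
```

In the patent's setting d would be 200 and step 6 would use a dedicated L1 solver (e.g. a homotopy or LARS method); the least-squares stand-ins here only illustrate the data flow of the claimed steps.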
CN201610982038.2A 2016-11-08 2016-11-08 Supervision dimension reduction method applied to big data Activity recognition Active CN106529594B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610982038.2A CN106529594B (en) 2016-11-08 2016-11-08 Supervision dimension reduction method applied to big data Activity recognition


Publications (2)

Publication Number Publication Date
CN106529594A true CN106529594A (en) 2017-03-22
CN106529594B CN106529594B (en) 2019-07-23

Family

ID=58351348

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610982038.2A Active CN106529594B (en) 2016-11-08 2016-11-08 Supervision dimension reduction method applied to big data Activity recognition

Country Status (1)

Country Link
CN (1) CN106529594B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107784293A (en) * 2017-11-13 2018-03-09 中国矿业大学(北京) A kind of Human bodys' response method classified based on global characteristics and rarefaction representation
CN110210443A (en) * 2019-06-11 2019-09-06 西北工业大学 A kind of gesture identification method of the sparse classification of optimization projection symmetry approximation

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1466270A4 (en) * 2001-12-06 2005-03-23 Univ New York Logic arrangement, data structure, system and method for multilinear representation of multimodal data ensembles for synthesis, recognition and compression
JP2015133085A (en) * 2014-01-15 2015-07-23 キヤノン株式会社 Information processing device and method thereof
CN105095866A (en) * 2015-07-17 2015-11-25 重庆邮电大学 Rapid behavior identification method and system
CN105631420A (en) * 2015-12-23 2016-06-01 武汉工程大学 Multi-angle indoor human action recognition method based on 3D skeleton
CN105930790A (en) * 2016-04-19 2016-09-07 电子科技大学 Human body behavior recognition method based on kernel sparse coding


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ORUGANTI, VRM 等: "Dimensionality reduction of Fisher vectors for human action recognition", 《IET COMPUTER VISION》 *
SHENG, JY 等: "Assigning PLS Based Descriptors by SVM in Action Recognition", 《INTELLIGENCE SCIENCE AND BIG DATA ENGINEERING: IMAGE AND VIDEO DATA ENGINEERING》 *
蔡加欣 等: "基于姿势字典学习的人体行为识别", 《光学学报》 *
黄文丽 等: "结合时空拓扑特征和稀疏表达的人体行为识别算法", 《计算机应用》 *


Also Published As

Publication number Publication date
CN106529594B (en) 2019-07-23

Similar Documents

Publication Publication Date Title
Yan et al. Graph embedding and extensions: A general framework for dimensionality reduction
Cai et al. Isometric projection
Cai et al. Multi-view super vector for action recognition
Yuan et al. Fractional-order embedding canonical correlation analysis and its applications to multi-view dimensionality reduction and recognition
CN108647690B (en) Non-constrained face image dimension reduction method based on discrimination sparse preserving projection
Shirazi et al. Clustering on Grassmann manifolds via kernel embedding with application to action analysis
CN109241813B (en) Non-constrained face image dimension reduction method based on discrimination sparse preservation embedding
Chanti et al. Improving bag-of-visual-words towards effective facial expressive image classification
Zhao et al. Bisecting k-means clustering based face recognition using block-based bag of words model
Zhou et al. Novel Gaussianized vector representation for improved natural scene categorization
Zheng et al. Improved sparse representation with low-rank representation for robust face recognition
Gao et al. Median null (sw)-based method for face feature recognition
De la Torre et al. Representational oriented component analysis (ROCA) for face recognition with one sample image per training class
Lee et al. Guided co-training for multi-view spectral clustering
Wan et al. Feature extraction based on fuzzy local discriminant embedding with applications to face recognition
Beham et al. Face recognition using appearance based approach: A literature survey
Kumar et al. Max-margin non-negative matrix factorization
CN106529594A (en) Supervised dimension reduction algorithm for big data behavior recognition
Wei et al. Kernel locality-constrained collaborative representation based discriminant analysis
Culpepper et al. Building a better probabilistic model of images by factorization
Qiu et al. Learning transformations for classification forests
CN104361337A (en) Sparse kernel principal component analysis method based on constrained computation and storage space
Wang et al. Canonical principal angles correlation analysis for two-view data
Tang et al. Robust L1-norm matrixed locality preserving projection for discriminative subspace learning
Wang et al. Unsupervised discriminant canonical correlation analysis for feature fusion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant