CN101159021A - Feature extracting method, device and pattern recognition method and device - Google Patents

Feature extracting method, device and pattern recognition method and device

Info

Publication number
CN101159021A
CN101159021A (application CN200710178410A)
Authority
CN
China
Prior art keywords
matrix
sample
class
bidirectional
unidirectional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2007101784105A
Other languages
Chinese (zh)
Other versions
CN100589118C (en)
Inventor
Wang Lei (王磊)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhongxingtianshi Technology Co ltd
Original Assignee
Vimicro Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vimicro Corp filed Critical Vimicro Corp
Priority to CN200710178410A priority Critical patent/CN100589118C/en
Publication of CN101159021A publication Critical patent/CN101159021A/en
Application granted granted Critical
Publication of CN100589118C publication Critical patent/CN100589118C/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Image Analysis (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention discloses a feature extraction method. The method includes the steps of: obtaining a unidirectional projection matrix formed by a first determined number of eigenvectors corresponding to the largest eigenvalues of the difference matrix of a first between-class scatter matrix and a first within-class scatter matrix of a sample matrix containing more than one class of samples, and projecting the sample matrix onto the unidirectional projection matrix to obtain a unidirectional feature sample matrix; and obtaining a bidirectional projection matrix formed by a second determined number of eigenvectors corresponding to the largest eigenvalues of the difference matrix of a second between-class scatter matrix and a second within-class scatter matrix of the transpose of the unidirectional feature sample matrix, and projecting the transpose of the unidirectional feature sample matrix onto the bidirectional projection matrix to obtain bidirectional feature samples. The invention also discloses a feature extraction device and a pattern recognition method and device. Applying the method and the device improves the stability and speed of feature extraction and pattern recognition.

Description

Feature extraction method and device, and pattern recognition method and device
Technical field
The present invention relates to the field of image processing, and in particular to a feature extraction method and device and a pattern recognition method and device.
Background technology
Feature extraction is a key step in pattern recognition, and the feature extraction device is an important component of a pattern recognition system. Feature extraction methods and devices are widely used in image processing applications such as face detection and face recognition. The core idea of feature extraction is to compress the D raw features of a sample into d feature quantities (d < D), so that a feature sample having the d feature quantities can represent the original sample having D raw feature quantities. Such feature extraction can be realized according to mathematical principles such as matrix theory. The mathematical transformation can be stated as follows: suppose there are N training samples X = [x_1, x_2, ..., x_N], each with D raw feature quantities, and we wish to compress each sample into a feature sample Y = [y_1, y_2, ..., y_N] with d feature quantities, under the transformation relation X = f(Y); the task of feature extraction is to determine the transform f(Y). The purpose of feature extraction is to reduce the number of features per sample, so that a sample with fewer than the original number of features can represent the original sample.
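As a minimal illustrative sketch (not the patent's method; the matrix W and all names here are hypothetical), the compression of D raw features into d features by a linear transform can look like this:

```python
import numpy as np

# Illustrative only: a hypothetical linear map W compresses a sample with
# D raw feature quantities into a feature sample with d < D quantities.
rng = np.random.default_rng(0)
D, d = 16, 4
x = rng.standard_normal(D)         # sample with D raw feature quantities
W = rng.standard_normal((D, d))    # some transform determined by feature extraction
y = x @ W                          # feature sample with d feature quantities
print(y.shape)                     # (4,)
```

The rest of the document is concerned with how such a transform is chosen so that the compressed samples remain discriminative.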
Among existing feature extraction methods, two-dimensional linear discriminant analysis (2DLDA, 2-Dimensional Linear Discriminant Analysis) is widely applied. A concrete procedure for feature extraction with the 2DLDA method is described below with reference to Fig. 1.
Suppose there are c classes of samples, N samples in total, each sample having D raw features, and each sample can be uniquely represented by a two-dimensional matrix. The i-th class contains N_i samples, the j-th sample of class i is I_j^(i), the mean of class i is Ī^(i) = (1/N_i) Σ_{j=1}^{N_i} I_j^(i), and the mean of all samples is Ī = (1/N) Σ_{i=1}^{c} Σ_{j=1}^{N_i} I_j^(i). Each I_j^(i) is represented by an m × n matrix, so each sample I_j^(i) has m × n features in total.
Fig. 1 is a flowchart of feature extraction with the 2DLDA method. As shown in Fig. 1, the concrete steps are as follows:
Step 101: obtain the within-class scatter matrix G_w of the samples.
The within-class scatter matrix is computed as G_w = (1/N) Σ_{i=1}^{c} Σ_{j=1}^{N_i} (I_j^(i) − Ī^(i))^T (I_j^(i) − Ī^(i)).
Step 102: obtain the between-class scatter matrix G_b of the samples.
The between-class scatter matrix is computed as G_b = (1/N) Σ_{i=1}^{c} N_i (Ī^(i) − Ī)^T (Ī^(i) − Ī).
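The two scatter matrices above can be sketched in NumPy as follows (an illustration under assumed array shapes — N samples of size m × n with integer class labels; the function name is not from the patent):

```python
import numpy as np

def scatter_matrices(samples, labels):
    """Within-class Gw and between-class Gb scatter matrices (steps 101-102).
    samples: array of shape (N, m, n); labels: length-N class labels."""
    N, m, n = samples.shape
    overall_mean = samples.mean(axis=0)                  # mean of all samples
    Gw = np.zeros((n, n))
    Gb = np.zeros((n, n))
    for c in np.unique(labels):
        cls = samples[labels == c]                       # the samples of class i
        class_mean = cls.mean(axis=0)                    # mean of class i
        for x in cls:
            Gw += (x - class_mean).T @ (x - class_mean)  # (I - Ibar_i)^T (I - Ibar_i)
        diff = class_mean - overall_mean
        Gb += len(cls) * diff.T @ diff                   # weighted by N_i
    return Gw / N, Gb / N
```

Both results are n × n symmetric matrices; the same formulas reappear unchanged in steps 201 and 202 of the embodiment.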
Step 103: obtain the unidirectional feature samples from G_w and G_b.
Perform an eigendecomposition of G_w^{-1} G_b to obtain a eigenvalues; from these a eigenvalues choose the e largest (e < a), ξ_1 ≥ ξ_2 ≥ ... ≥ ξ_e, and obtain the corresponding eigenvectors u_j, i.e. (G_w^{-1} G_b) u_j = ξ_j u_j, j = 1, 2, ..., e. Construct the projection matrix U_e = [u_1, u_2, ..., u_e] and project each sample onto U_e to obtain the unidirectional feature sample B_j^(i) = I_j^(i) U_e. Thus the original sample I_j^(i), represented by an m × n matrix, can be represented by the m × e matrix B_j^(i); for convenience, B_j^(i) is called the unidirectional feature sample.
Step 104: obtain the within-class scatter matrix H_w of the transposed unidirectional feature samples.
It is computed as H_w = (1/N) Σ_{i=1}^{c} Σ_{j=1}^{N_i} (B_j^(i) − B̄^(i)) (B_j^(i) − B̄^(i))^T.
Step 105: obtain the between-class scatter matrix H_b of the transposed unidirectional feature samples.
It is computed as H_b = (1/N) Σ_{i=1}^{c} N_i (B̄^(i) − B̄) (B̄^(i) − B̄)^T.
Step 106: obtain the bidirectional feature samples from H_w and H_b.
Perform an eigendecomposition of H_w^{-1} H_b to obtain b eigenvalues; from these b eigenvalues choose the k largest (k < b), η_1 ≥ η_2 ≥ ... ≥ η_k, and obtain the corresponding eigenvectors v_q, i.e. (H_w^{-1} H_b) v_q = η_q v_q, q = 1, 2, ..., k. Construct the projection matrix V_k = [v_1, v_2, ..., v_k] and project the transposed unidirectional feature samples onto V_k to obtain (C_j^(i))^T = (B_j^(i))^T V_k. Substituting B_j^(i) = I_j^(i) U_e into (C_j^(i))^T = (B_j^(i))^T V_k gives the bidirectional feature sample matrix C_j^(i) = V_k^T B_j^(i) = V_k^T I_j^(i) U_e. Thus the sample originally represented by the m × n matrix I_j^(i) can be represented by the e × k matrix C_j^(i); for convenience, C_j^(i) is called the bidirectional feature sample. After the above transformation, the sample I_j^(i) that originally had m × n features becomes, through this feature extraction, the bidirectional feature sample C_j^(i) with e × k features, and C_j^(i) can represent I_j^(i): whenever an operation on I_j^(i) is needed, performing the same operation on C_j^(i) instead achieves the same effect.
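Steps 103 and 106 both hinge on eigendecomposing a product that involves a matrix inverse. A sketch of step 103 (the function name is illustrative; `numpy.linalg.solve` is used instead of forming the inverse explicitly, which fails in exactly the singular case the Background criticizes):

```python
import numpy as np

def unidirectional_projection_2dlda(Gw, Gb, e):
    """Prior-art step 103: eigendecompose Gw^{-1} Gb and keep the eigenvectors
    of the e largest eigenvalues as the projection matrix U_e.
    Raises numpy.linalg.LinAlgError when Gw is singular -- the instability
    the Background section points out."""
    M = np.linalg.solve(Gw, Gb)            # Gw^{-1} Gb without an explicit inverse
    vals, vecs = np.linalg.eig(M)          # the product is generally non-symmetric
    order = np.argsort(vals.real)[::-1]    # eigenvalues in descending order
    return vecs[:, order[:e]].real         # U_e = [u_1, ..., u_e]
```

A sample `I` (m × n) would then be projected as `B = I @ Ue`, giving the m × e unidirectional feature sample.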
In the above steps, the execution order of steps 101 and 102 may be exchanged, as may that of steps 104 and 105.
As can be seen from the above, when the 2DLDA method is used for feature extraction, steps 103 and 106 each involve a matrix inversion. Since not every matrix has an inverse, feature extraction with the 2DLDA method has poor stability.
Summary of the invention
In view of this, an object of the embodiments of the invention is to provide a feature extraction method and device and a pattern recognition method and device that improve the stability of feature extraction.
To achieve the above object, the technical scheme of the embodiments of the invention is realized as follows:
A feature extraction method, comprising the steps of: obtaining a first between-class scatter matrix and a first within-class scatter matrix of a sample matrix that contains samples of more than one class; performing an eigendecomposition of the difference matrix of the first between-class scatter matrix and the first within-class scatter matrix, taking the eigenvectors corresponding to a first predetermined number of largest eigenvalues to form a unidirectional projection matrix, and projecting the sample matrix onto the unidirectional projection matrix to obtain a unidirectional feature sample matrix; obtaining a second between-class scatter matrix and a second within-class scatter matrix of the transpose of the unidirectional feature sample matrix; and performing an eigendecomposition of the difference matrix of the second between-class scatter matrix and the second within-class scatter matrix, taking the eigenvectors corresponding to a second predetermined number of largest eigenvalues to form a bidirectional projection matrix, and projecting the transpose of the unidirectional feature sample matrix onto the bidirectional projection matrix to obtain the bidirectional feature samples.
A pattern recognition method, comprising the steps of: obtaining a first between-class scatter matrix and a first within-class scatter matrix of a sample matrix that contains samples of more than one class; performing an eigendecomposition of their difference matrix, taking the eigenvectors corresponding to a first predetermined number of largest eigenvalues to form a unidirectional projection matrix, and projecting the mean sample of each class in the sample matrix onto the unidirectional projection matrix to obtain a unidirectional mean-feature sample matrix; obtaining a second between-class scatter matrix and a second within-class scatter matrix of the transpose of the unidirectional mean-feature sample matrix; performing an eigendecomposition of their difference matrix, taking the eigenvectors corresponding to a second predetermined number of largest eigenvalues to form a bidirectional projection matrix, and projecting the transpose of the unidirectional mean-feature sample matrix onto the bidirectional projection matrix to obtain bidirectional mean-feature samples; projecting the two-dimensional matrix representing the sample to be identified onto the unidirectional projection matrix to obtain a unidirectional feature sample matrix to be identified, and projecting the transpose of that matrix onto the bidirectional projection matrix to obtain a bidirectional feature sample to be identified; and comparing the distances between the bidirectional feature sample matrix to be identified and the bidirectional mean-feature sample matrices, and taking the class to which the bidirectional mean-feature sample matrix at minimum distance belongs as the class of the sample to be identified.
A feature extraction device, comprising a within-class scatter extraction unit, a between-class scatter extraction unit and a feature transformation unit. The within-class scatter extraction unit obtains the first within-class scatter matrix of a sample matrix containing samples of more than one class, receives the transpose of the unidirectional feature sample matrix sent by the feature transformation unit, obtains the second within-class scatter matrix of that transpose, and outputs the first and second within-class scatter matrices. The between-class scatter extraction unit obtains the first between-class scatter matrix of the sample matrix, receives the transpose of the unidirectional feature sample matrix sent by the feature transformation unit, obtains the second between-class scatter matrix of that transpose, and outputs the first and second between-class scatter matrices. The feature transformation unit performs an eigendecomposition of the difference matrix of the first between-class scatter matrix output by the between-class scatter extraction unit and the first within-class scatter matrix output by the within-class scatter extraction unit, selects the eigenvectors corresponding to a first predetermined number of largest eigenvalues to form the unidirectional projection matrix, projects the samples onto it to obtain the unidirectional feature sample matrix, transposes that matrix and outputs the transpose; it then performs an eigendecomposition of the difference matrix of the second between-class scatter matrix and the second within-class scatter matrix, selects the eigenvectors corresponding to a second predetermined number of largest eigenvalues to form the bidirectional projection matrix, and projects the transpose onto the bidirectional projection matrix to obtain the bidirectional feature sample matrix.
A pattern recognition device, comprising a feature extraction unit and a pattern recognition unit. The feature extraction unit obtains the first between-class scatter matrix and the first within-class scatter matrix of a sample matrix containing samples of more than one class, performs an eigendecomposition of their difference matrix, takes the eigenvectors corresponding to a first predetermined number of largest eigenvalues to form a unidirectional projection matrix, and projects the sample matrix onto it to obtain a unidirectional feature sample matrix; it obtains the second between-class scatter matrix and the second within-class scatter matrix of the transpose of that matrix, performs an eigendecomposition of their difference matrix, takes the eigenvectors corresponding to a second predetermined number of largest eigenvalues to form a bidirectional projection matrix, and projects the transpose onto it to obtain a bidirectional feature sample matrix; it projects the two-dimensional matrix representing the sample to be identified onto the unidirectional projection matrix to obtain a unidirectional feature sample matrix to be identified, projects the transpose of that matrix onto the bidirectional projection matrix to obtain a bidirectional feature sample matrix to be identified, and outputs the bidirectional mean-feature sample matrices and the bidirectional feature sample matrix to be identified. The pattern recognition unit receives the bidirectional mean-feature sample matrices and the bidirectional feature sample matrix to be identified from the feature extraction unit, compares the distances between the bidirectional feature sample matrix to be identified and the bidirectional mean-feature sample matrices, and takes the class to which the bidirectional mean-feature sample matrix at minimum distance belongs as the class of the sample to be identified.
As can be seen from the above technical scheme, because the method and device of the embodiments of the invention need no matrix inversion in the process of obtaining the projection matrices, feature extraction and pattern recognition performed with the method or device of the embodiments of the invention have high stability.
Description of drawings
Fig. 1 is a flowchart of feature extraction with the prior-art 2DLDA method;
Fig. 2 is a flowchart of the feature extraction method provided by an embodiment of the invention;
Fig. 3 is a flowchart of the face recognition method provided by an embodiment of the invention;
Fig. 4 is a structural diagram of the feature extraction device provided by an embodiment of the invention;
Fig. 5 is a structural diagram of the pattern recognition device provided by an embodiment of the invention.
Embodiment
To make the object, technical scheme and advantages of the embodiments of the invention clearer, the embodiments are further described below with reference to the accompanying drawings and preferred embodiments.
Fig. 2 is a flowchart of the feature extraction method provided by an embodiment of the invention.
Suppose there are c classes of samples, N samples in total, each sample having D raw features, and each sample can be uniquely represented by a two-dimensional matrix. The i-th class contains N_i samples, the j-th sample of class i is I_j^(i), the mean of class i is Ī^(i) = (1/N_i) Σ_{j=1}^{N_i} I_j^(i), and the mean of all samples is Ī = (1/N) Σ_{i=1}^{c} Σ_{j=1}^{N_i} I_j^(i). Each I_j^(i) is represented by an m × n matrix, so each sample I_j^(i) has m × n features in total.
Step 201: obtain the within-class scatter matrix G_w of the samples.
The within-class scatter matrix is computed as G_w = (1/N) Σ_{i=1}^{c} Σ_{j=1}^{N_i} (I_j^(i) − Ī^(i))^T (I_j^(i) − Ī^(i)).
Step 202: obtain the between-class scatter matrix G_b of the samples.
The between-class scatter matrix is computed as G_b = (1/N) Σ_{i=1}^{c} N_i (Ī^(i) − Ī)^T (Ī^(i) − Ī).
Step 203: obtain the unidirectional feature samples from G_w and G_b.
Perform an eigendecomposition of the difference matrix G_b − G_w to obtain r eigenvalues; from these r eigenvalues choose the s largest (s < r), δ_1 ≥ δ_2 ≥ ... ≥ δ_s, and obtain the corresponding eigenvectors d_g, i.e. (G_b − G_w) d_g = δ_g d_g, g = 1, 2, ..., s. Construct the projection matrix D_s = [d_1, d_2, ..., d_s] and project each sample onto D_s to obtain the unidirectional feature sample B_j^(i) = I_j^(i) D_s. Thus the original sample I_j^(i), represented by an m × n matrix, can be represented by the m × s matrix B_j^(i); B_j^(i) is the unidirectional feature sample.
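Because the difference matrix G_b − G_w is symmetric, its eigendecomposition needs no inversion. A NumPy sketch of step 203 (illustrative names; `eigh` returns eigenvalues in ascending order, so the columns are reversed to get the s largest):

```python
import numpy as np

def unidirectional_projection(Gw, Gb, s):
    """Step 203: eigendecompose the difference matrix Gb - Gw (no inverse
    required) and keep the eigenvectors of the s largest eigenvalues."""
    vals, vecs = np.linalg.eigh(Gb - Gw)   # symmetric matrix: real spectrum
    return vecs[:, ::-1][:, :s]            # D_s = [d_1, ..., d_s], descending order
```

A sample `I` (m × n) is then projected as `B = I @ Ds`, an m × s matrix.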
Step 204: obtain the within-class scatter matrix H_w of the transposed unidirectional feature samples.
It is computed as H_w = (1/N) Σ_{i=1}^{c} Σ_{j=1}^{N_i} (B_j^(i) − B̄^(i)) (B_j^(i) − B̄^(i))^T.
Step 205: obtain the between-class scatter matrix H_b of the transposed unidirectional feature samples.
It is computed as H_b = (1/N) Σ_{i=1}^{c} N_i (B̄^(i) − B̄) (B̄^(i) − B̄)^T.
Step 206: obtain the bidirectional feature samples from H_w and H_b.
Perform an eigendecomposition of the difference matrix H_b − H_w to obtain λ eigenvalues; from these λ eigenvalues choose the θ largest (θ < λ), β_1 ≥ β_2 ≥ ... ≥ β_θ, and obtain the corresponding eigenvectors φ_t, i.e. (H_b − H_w) φ_t = β_t φ_t, t = 1, 2, ..., θ. Construct the projection matrix Φ_θ = [φ_1, φ_2, ..., φ_θ] and project the transposed unidirectional feature samples onto Φ_θ to obtain (C_j^(i))^T = (B_j^(i))^T Φ_θ. Substituting B_j^(i) = I_j^(i) D_s into (C_j^(i))^T = (B_j^(i))^T Φ_θ gives the bidirectional feature sample matrix C_j^(i) = Φ_θ^T B_j^(i) = Φ_θ^T I_j^(i) D_s. Thus the sample originally represented by the m × n matrix I_j^(i) can be represented by the s × θ matrix C_j^(i); C_j^(i) is the bidirectional feature sample. The values of s and θ can be chosen according to the image feature extraction ratio, the data storage capacity of the application, and similar factors; for example, to transform an image represented by a 1024 × 1024 matrix into a characteristic image represented by a 128 × 256 matrix through feature extraction, set s to 128 and θ to 256.
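Putting steps 201 through 206 together, a compact end-to-end sketch (assumed shapes: N samples of m × n with integer class labels; all function names are illustrative):

```python
import numpy as np

def scatter(samples, labels):
    """(within, between) scatter matrices over the last axis of samples."""
    N = samples.shape[0]
    dim = samples.shape[2]
    mean_all = samples.mean(axis=0)
    Sw, Sb = np.zeros((dim, dim)), np.zeros((dim, dim))
    for c in np.unique(labels):
        cls = samples[labels == c]
        mc = cls.mean(axis=0)
        for x in cls:
            Sw += (x - mc).T @ (x - mc)
        Sb += len(cls) * (mc - mean_all).T @ (mc - mean_all)
    return Sw / N, Sb / N

def top_eigvecs(M, k):
    """Eigenvectors of the k largest eigenvalues of a symmetric matrix M."""
    _, vecs = np.linalg.eigh(M)                     # ascending order
    return vecs[:, ::-1][:, :k]

def bidirectional_features(samples, labels, s, theta):
    """Steps 201-206: D_s from Gb - Gw, then Phi_theta from Hb - Hw of the
    transposed unidirectional features; per sample, C = Phi^T I D."""
    Gw, Gb = scatter(samples, labels)               # n x n
    Ds = top_eigvecs(Gb - Gw, s)                    # n x s
    B = samples @ Ds                                # unidirectional, N x m x s
    Hw, Hb = scatter(np.swapaxes(B, 1, 2), labels)  # scatter of B^T, m x m
    Phi = top_eigvecs(Hb - Hw, theta)               # m x theta
    return Phi.T @ B, Ds, Phi                       # C: N x theta x s
```

Each sample thus shrinks from m × n raw features to s · θ bidirectional features, with no matrix inversion anywhere in the pipeline.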
In the above steps, the execution order of steps 201 and 202 may be exchanged, as may that of steps 204 and 205.
Taking face recognition as an example, the use of the feature extraction method provided by this embodiment is now described in detail.
Fig. 3 is a flowchart of face recognition using the feature extraction method provided by this embodiment.
Step 301: build a face database containing face images of N people, with M face images per person; each person's face images constitute one face class.
Step 302: using all the face images in the database as samples, obtain the projection matrices D_s and Φ_θ described in this embodiment.
Step 303: compute the bidirectional characteristic image of each person's mean face image in the database.
First compute each person's mean face image: supposing a person's M face images are K_1, K_2, K_3, ..., K_M, the mean face image is K̄ = (1/M) Σ_{i=1}^{M} K_i. The face class to which a person's mean face image belongs is the class constituted by that person's M face images.
The mean face image is then transformed into a bidirectional feature face image according to the feature extraction method of this embodiment. The concrete transform is: project the mean face image K̄ onto the projection matrices D_s and Φ_θ according to the feature extraction method of this embodiment, i.e. evaluate f(K̄) = Φ_θ^T K̄ D_s, where Φ_θ^T is the transpose of Φ_θ; the result is the bidirectional feature face image of the mean face image.
Applying this transformation to each of the N people's mean face images yields the N bidirectional feature face images of the database.
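Step 303 can be sketched as follows (illustrative NumPy; the function name is an assumption, while `Ds` and `Phi` stand for the projection matrices D_s and Φ_θ above):

```python
import numpy as np

def mean_face_feature(face_images, Ds, Phi):
    """Step 303: average a person's M face images, then apply
    f(J) = Phi^T J Ds to the mean image."""
    K_bar = np.mean(face_images, axis=0)   # the person's mean face image
    return Phi.T @ K_bar @ Ds              # its bidirectional feature face image
```

Running this once per person gives the N bidirectional feature face images stored for step 305.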
Step 304: input the sample to be identified, namely an image J, and obtain its bidirectional characteristic image f(J) = Φ_θ^T J D_s.
Step 305: compare the distances between the bidirectional characteristic image f(J) of the sample to be identified and the N bidirectional feature face images, and take the face class to which the mean face image corresponding to the nearest bidirectional feature face image belongs as the class of the sample to be identified. The distance is computed as in the prior art, for example via the norm of the difference between f(J) and each bidirectional feature face image.
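The nearest-mean decision of step 305 can be sketched with the Frobenius norm as the distance (one common choice; the text leaves the exact norm to the prior art, and the function name is illustrative):

```python
import numpy as np

def classify(feat, mean_feats):
    """Step 305: index of the class whose bidirectional mean feature image
    is nearest to feat, measured by the norm of the difference."""
    dists = [np.linalg.norm(feat - mf) for mf in mean_feats]
    return int(np.argmin(dists))
```

Here `feat` would be f(J) and `mean_feats` the list of N bidirectional feature face images from step 303.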
The feature extraction method provided by this embodiment can also be applied to fingerprint recognition, palm-print recognition, and other pattern recognition scenarios in which the object to be identified can be represented by a two-dimensional matrix.
Fig. 4 is a structural diagram of the feature extraction device provided by an embodiment of the invention. As shown in Fig. 4, the device comprises a within-class scatter extraction unit 401, a between-class scatter extraction unit 402 and a feature transformation unit 403.
The within-class scatter extraction unit 401 obtains the within-class scatter matrix G_w of the c classes of samples (N samples in total), receives the transposes of B_j^(i) output by the feature transformation unit 403, obtains the within-class scatter matrix H_w of the transposed unidirectional feature samples B_j^(i), and outputs G_w and H_w to the feature transformation unit 403.
The between-class scatter extraction unit 402 obtains the between-class scatter matrix G_b of the c classes of samples (N samples in total), receives the transposes of B_j^(i) output by the feature transformation unit 403, obtains the between-class scatter matrix H_b of the transposed unidirectional feature samples B_j^(i), and outputs G_b and H_b to the feature transformation unit 403.
The feature transformation unit 403 receives G_w from the within-class scatter extraction unit 401 and G_b from the between-class scatter extraction unit 402, performs an eigendecomposition of G_b − G_w to obtain its eigenvalues and eigenvectors, selects the eigenvectors corresponding to the s largest eigenvalues to form the projection matrix D_s, projects the samples onto D_s to obtain the unidirectional feature samples B_j^(i), transposes B_j^(i) and outputs the transposes to units 401 and 402; it then receives H_w from unit 401 and H_b from unit 402, performs an eigendecomposition of H_b − H_w to obtain its eigenvalues and eigenvectors, selects the eigenvectors corresponding to the θ largest eigenvalues to form the projection matrix Φ_θ, and projects the transposes onto Φ_θ to obtain the bidirectional feature samples C_j^(i).
Fig. 5 is a structural diagram of the pattern recognition device provided by an embodiment of the invention. As shown in Fig. 5, the device comprises a feature extraction unit 501 and a pattern recognition unit 502.
The feature extraction unit 501 comprises a within-class scatter extraction unit 401, a between-class scatter extraction unit 402 and a feature transformation unit 403.
The within-class scatter extraction unit 401 and the between-class scatter extraction unit 402 are identical to the units 401 and 402 shown in Fig. 4.
In this device, the feature transformation unit 403 receives G_w from unit 401 and G_b from unit 402, performs an eigendecomposition of G_b − G_w to obtain its eigenvalues and eigenvectors, selects the eigenvectors corresponding to the s largest eigenvalues to form the projection matrix D_s, projects the mean sample of each class onto D_s to obtain the unidirectional mean-feature samples, transposes them and outputs the transposes to units 401 and 402; it receives H_w from unit 401 and H_b from unit 402, performs an eigendecomposition of H_b − H_w, selects the eigenvectors corresponding to the θ largest eigenvalues to form the projection matrix Φ_θ, and projects the transposes onto Φ_θ to obtain the bidirectional mean-feature samples; it further projects the two-dimensional matrix J representing the sample to be identified onto D_s to obtain the unidirectional feature sample to be identified, projects its transpose onto Φ_θ to obtain the bidirectional feature sample to be identified, and outputs the bidirectional mean-feature samples and the bidirectional feature sample to be identified to the pattern recognition unit 502.
The pattern recognition unit 502 receives the bidirectional mean-feature samples and the bidirectional feature sample to be identified from the feature extraction unit 501, compares the distances between the bidirectional feature sample to be identified and the bidirectional mean-feature samples, and takes the class to which the nearest bidirectional mean-feature sample belongs as the class of the sample to be identified.
As can be seen from the above, because the method and device of the embodiments of the invention need no matrix inversion in the process of obtaining the projection matrices, feature extraction and pattern recognition performed with them have high stability: the feature samples obtained with the feature extraction method or device can represent the original samples, and the pattern recognition method or device can assign a sample to be identified to the class to which it belongs.
In addition, because the utilization technical scheme that the embodiment of the invention provided does not need matrix is carried out inversion operation, therefore reduced the operand that carries out feature extraction and pattern-recognition, thereby improved the speed of feature extraction and pattern-recognition, reduced hardware configuration requirement feature deriving means and pattern recognition device.
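The stability and speed advantages claimed above come from replacing the classical Fisher criterion, which requires inverting the within-class scatter matrix, with an eigendecomposition of the difference matrix. The contrast can be sketched in NumPy (an illustrative comparison, not code from the patent; the random stand-in matrices are assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)
A = rng.standard_normal((6, 6))
S_b = A @ A.T                      # stand-in between-class scatter (symmetric PSD)
B = rng.standard_normal((6, 6))
S_w = B @ B.T + 1e-3 * np.eye(6)   # stand-in within-class scatter

# Classical Fisher criterion: needs inv(S_w), which is unstable or
# undefined when S_w is (near-)singular.
vals_f, vecs_f = np.linalg.eig(np.linalg.inv(S_w) @ S_b)

# Difference criterion used in this patent: the matrix S_b - S_w is
# symmetric, so eigh applies and no inversion is ever performed.
vals_d, vecs_d = np.linalg.eigh(S_b - S_w)
```

Because `S_b - S_w` is always symmetric and always exists, the eigendecomposition step succeeds even when the within-class scatter matrix is singular, which is the source of the stability claim.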
The above is preferred embodiment of the present invention only, is not to be used to limit protection scope of the present invention, all any modifications of being made within the spirit and principles in the present invention, is equal to replacement, improvement etc., all should be included within protection scope of the present invention.

Claims (5)

1. A feature extraction method, characterized in that the method comprises the steps of:
obtaining a first between-class scatter matrix and a first within-class scatter matrix of a sample matrix comprising samples of more than one class;
performing an eigendecomposition of the difference matrix of the first between-class scatter matrix and the first within-class scatter matrix, forming a unidirectional projection matrix from the eigenvectors corresponding to a first predetermined number of largest eigenvalues, and projecting the sample matrix onto the unidirectional projection matrix to obtain a unidirectional feature sample matrix;
obtaining a second between-class scatter matrix and a second within-class scatter matrix of the transpose of the unidirectional feature sample matrix;
performing an eigendecomposition of the difference matrix of the second between-class scatter matrix and the second within-class scatter matrix, forming a bidirectional projection matrix from the eigenvectors corresponding to a second predetermined number of largest eigenvalues, and projecting the transpose of the unidirectional feature sample matrix onto the bidirectional projection matrix to obtain bidirectional feature samples.
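The two projection steps of claim 1 can be sketched in NumPy (an illustrative reading of the claim, not code from the patent; the helper names `projection_matrix` and `bidirectional_features` are hypothetical):

```python
import numpy as np

def projection_matrix(G_b, G_w, k):
    """Eigenvectors of G_b - G_w for its k largest eigenvalues.

    G_b - G_w is symmetric, so eigh applies and no matrix
    inversion is needed, unlike classical Fisher LDA."""
    vals, vecs = np.linalg.eigh(G_b - G_w)
    order = np.argsort(vals)[::-1]        # sort eigenvalues descending
    return vecs[:, order[:k]]             # shape (n, k)

def bidirectional_features(sample, D_s, Phi_theta):
    """Project a 2-D sample both ways: onto D_s for the
    unidirectional feature, then transpose and project onto
    Phi_theta for the bidirectional feature."""
    uni = sample @ D_s                    # unidirectional feature, (h, k)
    return uni.T @ Phi_theta              # bidirectional feature, (k, t)
```

In practice `Phi_theta` would be built by calling `projection_matrix` again on the scatter matrices of the transposed unidirectional features, mirroring the second half of the claim.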
2. the method for claim 1 is characterized in that,
The described acquisition first between class scatter matrix is: G b = 1 N Σ i = 1 c N i ( I ‾ ( i ) - I ‾ ) T ( I ‾ ( i ) - I ‾ ) ;
The divergence matrix is in the described acquisition first kind: G w = 1 N Σ i = 1 c Σ j = 1 N i ( I j ( i ) - I ‾ ( i ) ) T ( I j ( i ) - I ‾ ( i ) ) ;
Wherein, G bBe the first between class scatter matrix, G wBe divergence matrix in the first kind, N is the total number of sample, N iBe the number of i class sample, I j (i)Be j sample in the i class sample, Be the average of i class sample,
Figure S2007101784105C00014
Be the average of all samples, C is the classification total number of sample.
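The formulas of claim 2 transcribe directly into NumPy. The sketch below assumes the samples are 2-D arrays of equal shape; the function name `scatter_matrices` is illustrative, not from the patent:

```python
import numpy as np

def scatter_matrices(samples, labels):
    """Between-class (G_b) and within-class (G_w) scatter matrices
    per claim 2, for 2-D samples I_j^(i) of shape (h, w).

    samples: array of shape (N, h, w); labels: N class labels.
    Returns (G_b, G_w), each of shape (w, w)."""
    samples = np.asarray(samples, dtype=float)
    labels = np.asarray(labels)
    N = len(samples)
    overall_mean = samples.mean(axis=0)          # Ī
    w = samples.shape[2]
    G_b = np.zeros((w, w))
    G_w = np.zeros((w, w))
    for c in np.unique(labels):
        cls = samples[labels == c]               # samples of class i
        class_mean = cls.mean(axis=0)            # Ī^(i)
        d = class_mean - overall_mean
        G_b += len(cls) * d.T @ d                # N_i (Ī^(i)-Ī)^T (Ī^(i)-Ī)
        for x in cls:
            e = x - class_mean
            G_w += e.T @ e                       # (I_j^(i)-Ī^(i))^T (I_j^(i)-Ī^(i))
    return G_b / N, G_w / N
```

Note that placing the transpose on the left factor yields w×w matrices, matching the right-side projection of the sample matrix onto D_s.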
3. A pattern recognition method, characterized in that the method comprises the steps of:
obtaining a first between-class scatter matrix and a first within-class scatter matrix of a sample matrix comprising samples of more than one class;
performing an eigendecomposition of the difference matrix of the first between-class scatter matrix and the first within-class scatter matrix, forming a unidirectional projection matrix from the eigenvectors corresponding to a first predetermined number of largest eigenvalues, and projecting the mean sample of each class in the sample matrix onto the unidirectional projection matrix to obtain a unidirectional mean feature sample matrix;
obtaining a second between-class scatter matrix and a second within-class scatter matrix of the transpose of the unidirectional mean feature sample matrix;
performing an eigendecomposition of the difference matrix of the second between-class scatter matrix and the second within-class scatter matrix, forming a bidirectional projection matrix from the eigenvectors corresponding to a second predetermined number of largest eigenvalues, and projecting the transpose of the unidirectional mean feature sample matrix onto the bidirectional projection matrix to obtain bidirectional mean feature samples;
projecting the two-dimensional matrix representing a sample to be identified onto the unidirectional projection matrix to obtain a unidirectional feature sample matrix to be identified, and projecting the transpose of the unidirectional feature sample matrix to be identified onto the bidirectional projection matrix to obtain a bidirectional feature sample to be identified;
comparing the distances between the bidirectional feature sample matrix to be identified and the bidirectional mean feature sample matrices, and selecting the class of the bidirectional mean feature sample matrix at the minimum distance as the class of the sample to be identified.
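The minimum-distance rule in the final step of claim 3 can be sketched as follows. The claim leaves the distance unspecified, so this example assumes the Frobenius norm between feature matrices; the function name `classify` is illustrative:

```python
import numpy as np

def classify(test_feature, class_mean_features):
    """Nearest-mean rule: return the label of the class whose
    bidirectional mean feature matrix is closest to the test
    feature under the Frobenius norm (an assumed distance)."""
    dists = {label: np.linalg.norm(test_feature - mean)  # Frobenius by default
             for label, mean in class_mean_features.items()}
    return min(dists, key=dists.get)
```

`np.linalg.norm` on a 2-D array defaults to the Frobenius norm, so no extra arguments are needed for matrix-valued features.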
4. A feature extraction device, characterized in that the device comprises a within-class scatter extraction unit, a between-class scatter extraction unit and an eigentransformation unit;
the within-class scatter extraction unit is configured to obtain the first within-class scatter matrix of a sample matrix comprising samples of more than one class, receive the transpose of the unidirectional feature sample matrix sent by the eigentransformation unit, obtain the second within-class scatter matrix of this transposed matrix, and output the first and second within-class scatter matrices;
the between-class scatter extraction unit is configured to obtain the first between-class scatter matrix of the sample matrix comprising samples of more than one class, receive the transpose of the unidirectional feature sample matrix sent by the eigentransformation unit, obtain the second between-class scatter matrix of this transposed matrix, and output the first and second between-class scatter matrices;
the eigentransformation unit is configured to perform an eigendecomposition of the difference matrix of the first between-class scatter matrix output by the between-class scatter extraction unit and the first within-class scatter matrix output by the within-class scatter extraction unit, select the eigenvectors corresponding to a first predetermined number of largest eigenvalues to form a unidirectional projection matrix, project the sample matrix onto the unidirectional projection matrix to obtain a unidirectional feature sample matrix, transpose the unidirectional feature sample matrix and output the resulting transposed matrix, perform an eigendecomposition of the difference matrix of the second between-class scatter matrix output by the between-class scatter extraction unit and the second within-class scatter matrix output by the within-class scatter extraction unit, select the eigenvectors corresponding to a second predetermined number of largest eigenvalues to form a bidirectional projection matrix, and project the transposed matrix onto the bidirectional projection matrix to obtain a bidirectional feature sample matrix.
5. A pattern recognition device, characterized in that the device comprises a feature extraction unit and a pattern recognition unit;
the feature extraction unit is configured to obtain a first between-class scatter matrix and a first within-class scatter matrix of a sample matrix comprising samples of more than one class, perform an eigendecomposition of the difference matrix of the first between-class scatter matrix and the first within-class scatter matrix, select the eigenvectors corresponding to a first predetermined number of largest eigenvalues to form a unidirectional projection matrix, project the mean sample of each class in the sample matrix onto the unidirectional projection matrix to obtain a unidirectional mean feature sample matrix, obtain a second between-class scatter matrix and a second within-class scatter matrix of the transpose of the unidirectional mean feature sample matrix, perform an eigendecomposition of the difference matrix of the second between-class scatter matrix and the second within-class scatter matrix, select the eigenvectors corresponding to a second predetermined number of largest eigenvalues to form a bidirectional projection matrix, project the transpose of the unidirectional mean feature sample matrix onto the bidirectional projection matrix to obtain a bidirectional mean feature sample matrix, project the two-dimensional matrix representing a sample to be identified onto the unidirectional projection matrix to obtain a unidirectional feature sample matrix to be identified, project the transpose of the unidirectional feature sample matrix to be identified onto the bidirectional projection matrix to obtain a bidirectional feature sample matrix to be identified, and output the bidirectional mean feature sample matrix and the bidirectional feature sample matrix to be identified;
the pattern recognition unit is configured to receive the bidirectional mean feature sample matrix and the bidirectional feature sample matrix to be identified output by the feature extraction unit, compare the distances between the bidirectional feature sample matrix to be identified and each bidirectional mean feature sample matrix, and select the class of the bidirectional mean feature sample matrix at the minimum distance as the class of the sample to be identified.
CN200710178410A 2007-11-29 2007-11-29 Feature extracting method, device and pattern recognition method and device Active CN100589118C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200710178410A CN100589118C (en) 2007-11-29 2007-11-29 Feature extracting method, device and pattern recognition method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200710178410A CN100589118C (en) 2007-11-29 2007-11-29 Feature extracting method, device and pattern recognition method and device

Publications (2)

Publication Number Publication Date
CN101159021A true CN101159021A (en) 2008-04-09
CN100589118C CN100589118C (en) 2010-02-10

Family

ID=39307110

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200710178410A Active CN100589118C (en) 2007-11-29 2007-11-29 Feature extracting method, device and pattern recognition method and device

Country Status (1)

Country Link
CN (1) CN100589118C (en)


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102254166A (en) * 2011-08-15 2011-11-23 无锡中星微电子有限公司 Face recognition method
CN102693422A (en) * 2012-06-05 2012-09-26 江苏物联网研究发展中心 Designing method of filter capable of enhancing local-binary-pattern-like characteristic face identification performance
CN102693422B (en) * 2012-06-05 2014-02-19 江苏物联网研究发展中心 Designing method of filter capable of enhancing local-binary-pattern-like characteristic face identification performance
CN103646244A (en) * 2013-12-16 2014-03-19 北京天诚盛业科技有限公司 Methods and devices for face characteristic extraction and authentication
CN103646244B (en) * 2013-12-16 2018-01-09 北京天诚盛业科技有限公司 Extraction, authentication method and the device of face characteristic
WO2019090879A1 (en) * 2017-11-09 2019-05-16 合肥工业大学 Analog circuit fault diagnosis method based on cross wavelet features
CN110673042A (en) * 2019-10-31 2020-01-10 安徽优旦科技有限公司 Data-based battery PACK thermal field change evaluation method
CN110673042B (en) * 2019-10-31 2021-07-20 安徽优旦科技有限公司 Data-based battery PACK thermal field change evaluation method
CN112395986A (en) * 2020-11-17 2021-02-23 广州像素数据技术股份有限公司 Face recognition method for quickly migrating new scene and preventing forgetting
CN112395986B (en) * 2020-11-17 2024-04-26 广州像素数据技术股份有限公司 Face recognition method capable of quickly migrating new scene and preventing forgetting

Also Published As

Publication number Publication date
CN100589118C (en) 2010-02-10

Similar Documents

Publication Publication Date Title
CN100589118C (en) Feature extracting method, device and pattern recognition method and device
Ohn-Bar et al. Joint angles similarities and HOG2 for action recognition
Li et al. Local log-Euclidean multivariate Gaussian descriptor and its application to image classification
Hoang et al. Invariant pattern recognition using the RFM descriptor
CN104318219A (en) Face recognition method based on combination of local features and global features
CN103646244A (en) Methods and devices for face characteristic extraction and authentication
CN104616280B (en) Method for registering images based on maximum stable extremal region and phase equalization
CN103839042A (en) Human face recognition method and human face recognition system
CN111209974A (en) Tensor decomposition-based heterogeneous big data core feature extraction method and system
CN104239859A (en) Face recognition method based on structuralized factor analysis
Sun et al. An efficient algorithm for Kernel two-dimensional principal component analysis
Liu et al. A novel local texture feature extraction method called multi-direction local binary pattern
CN107194314A (en) The fuzzy 2DPCA and fuzzy 2DLDA of fusion face identification method
Cui et al. Rotation and scaling invariant texture classification based on Radon transform and multiscale analysis
CN103714340A (en) Self-adaptation feature extracting method based on image partitioning
Yin et al. Nonnegative matrix factorization with bounded total variational regularization for face recognition
CN106778522A (en) A kind of single sample face recognition method extracted based on Gabor characteristic with spatial alternation
CN102289679B (en) Method for identifying super-resolution of face in fixed visual angle based on related characteristics and nonlinear mapping
Wenjing et al. Face recognition based on the fusion of wavelet packet sub-images and fisher linear discriminant
Nassara et al. Linear discriminant analysis for large-scale data: Application on text and image data
CN100533478C (en) Chinese character composition and realization method based on optimum affine conversion
CN101482917B (en) Human face recognition system and method based on second-order two-dimension principal component analysis
Lu et al. SCUT-MMSIG: a multimodal online signature database
Bychkov et al. Software application for biometrical person’s identification by portrait photograph based on wavelet transform
Yang et al. Robust affine invariant descriptors

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20210219

Address after: No. 602, 6th floor, shining building, 35 Xueyuan Road, Haidian District, Beijing 100083

Patentee after: BEIJING ZHONGXINGTIANSHI TECHNOLOGY Co.,Ltd.

Address before: 100083, Haidian District, Xueyuan Road, Beijing No. 35, Nanjing Ning building, 15 Floor

Patentee before: Vimicro Corp.