CN110874576A - Pedestrian re-identification method based on canonical correlation analysis fusion features - Google Patents

Pedestrian re-identification method based on canonical correlation analysis fusion features

Info

Publication number
CN110874576A
Authority
CN
China
Prior art keywords
pedestrian
features
identification
matrix
mapping
Prior art date
Legal status
Granted
Application number
CN201911114451.7A
Other languages
Chinese (zh)
Other versions
CN110874576B (en)
Inventor
张凯兵
李春茂
李敏奇
景军锋
刘薇
卢健
陈小改
刘钟燕
Current Assignee
Pengbopuhua Technology Co ltd
Shenzhen Wanzhida Technology Co ltd
Original Assignee
Xian Polytechnic University
Priority date
Filing date
Publication date
Application filed by Xian Polytechnic University filed Critical Xian Polytechnic University
Priority to CN201911114451.7A priority Critical patent/CN110874576B/en
Publication of CN110874576A publication Critical patent/CN110874576A/en
Application granted granted Critical
Publication of CN110874576B publication Critical patent/CN110874576B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06V40/103 Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G06F18/253 Fusion techniques of extracted features


Abstract

The invention discloses a pedestrian re-identification method based on canonical correlation analysis fusion features. The method comprises three stages: a feature extraction stage, a mapping-matrix solving stage and a pedestrian re-identification stage using the fused features. In the feature extraction stage, two different features X and Y are extracted from the pedestrian images. In the mapping-matrix solving stage, canonical correlation analysis is applied to X and Y to obtain a pair of mapping matrices α and β, and the new features are X' = α^T X and Y' = β^T Y, where α^T is the transpose of the mapping matrix α and β^T is the transpose of the mapping matrix β. In the re-identification stage, the fused feature is expressed either as the concatenation Z1 = [X'; Y'] = [α^T X; β^T Y] or as the sum Z2 = X' + Y' = α^T X + β^T Y. The fused feature Z1 or Z2 is divided into a training set and a test set; the pedestrian re-identification model is trained with the training set and the trained model is tested with the test set. By fusing the features in this way, redundant information is effectively eliminated and both the amount and the difficulty of computation are reduced.

Description

Pedestrian re-identification method based on canonical correlation analysis fusion features
Technical Field
The invention belongs to the technical field of computer vision, and relates to a pedestrian re-identification method based on canonical correlation analysis fusion features.
Background
Pedestrian re-identification is a very active research topic in computer vision; its goal is to find a pedestrian of interest across non-overlapping surveillance cameras by means of computer vision techniques. When addressing pedestrian re-identification, most existing methods start from two aspects: (1) developing a discriminative feature representation; (2) seeking a discriminative distance metric. Feature-representation methods aim to extract robust features to describe pedestrians; the features used for pedestrian re-identification can be divided into three categories: visual features, filter features and attribute features. Metric-learning methods learn the similarity between two images. In application, pedestrian re-identification judges the similarity between pedestrian images from the similarity between their features on the basis of the feature representation, and learns a highly discriminative distance metric so that the distance between images of the same pedestrian is as small as possible while the distance between different pedestrians is as large as possible.
Features are the basis of pedestrian re-identification, and their discriminative power directly affects the final result. Colour features are the most widely used; they describe the colour distribution of a pedestrian image and are robust to changes in pose and viewpoint, but they are easily affected by illumination and occlusion and have difficulty distinguishing similarly dressed pedestrians. Texture features are robust to illumination, and combining colour and texture features can effectively improve recognition accuracy. In general, hand-crafted features combine several simple features; such combinations exploit the complementary strengths of different pedestrian descriptors and achieve good recognition results, but as the number of combined features increases, the dimensionality of the combined feature grows rapidly. Most existing fusion methods simply combine different features through a serial or parallel strategy. This is simple and effective, but it ignores the intrinsic relationship between the different features: the features are merely stacked, all feature information is retained, and a large amount of redundant information is kept. The resulting high dimensionality increases computational complexity and adversely affects both recognition accuracy and real-time performance.
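As a minimal illustration of the serial strategy mentioned above (the feature dimensions below are arbitrary values chosen for the sketch, not figures from the invention), concatenating two descriptors simply adds their dimensionalities:

```python
import numpy as np

# Synthetic stand-ins for two hand-crafted descriptors of the same N images.
p, q, N = 500, 800, 1264
rng = np.random.default_rng(0)
X = rng.standard_normal((p, N))   # feature 1: one column per image
Y = rng.standard_normal((q, N))   # feature 2: one column per image

# Serial (concatenation) fusion keeps every entry of both descriptors,
# so the fused dimension is p + q and all redundancy is retained.
Z_serial = np.vstack([X, Y])
print(Z_serial.shape)             # (1300, 1264)
```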
Disclosure of Invention
The invention aims to provide a pedestrian re-identification method based on canonical correlation analysis fusion features, and to solve the problems of high fused-feature dimensionality, large amounts of redundant information and complex computation in the prior art.
The technical scheme of the invention is a pedestrian re-identification method based on canonical correlation analysis fusion features, comprising three stages: a feature extraction stage, a mapping-matrix solving stage and a pedestrian re-identification stage using the fused features. In the feature extraction stage, two different features X and Y are extracted from the pedestrian images. In the mapping-matrix solving stage, canonical correlation analysis is applied to the two features X and Y to obtain a pair of mapping matrices α and β, and the new features are expressed as X' = α^T X and Y' = β^T Y, where α^T is the transpose of the mapping matrix α and β^T is the transpose of the mapping matrix β. In the pedestrian re-identification stage, the fused feature is expressed as
Z1 = [X'; Y'] = [α^T X; β^T Y]
or Z2 = X' + Y' = α^T X + β^T Y. The fused feature Z1 or Z2 is divided into a training set and a test set; the pedestrian re-identification model is trained with the training set and the trained model is tested with the test set.
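A minimal sketch of the two fusion strategies, assuming the mapping matrices α and β have already been obtained (variable names, shapes and the option of using more than one canonical direction are assumptions for illustration, not requirements of the patent):

```python
import numpy as np

def fuse_features(X, Y, alpha, beta, mode="concat"):
    """Fuse two feature sets in the canonical correlation subspace.

    X: (p, N) feature matrix, Y: (q, N) feature matrix,
    alpha: (p, d) mapping for X, beta: (q, d) mapping for Y.
    mode="concat" returns Z1 = [X'; Y']; mode="sum" returns Z2 = X' + Y'.
    """
    X_new = alpha.T @ X                     # X' = alpha^T X, shape (d, N)
    Y_new = beta.T @ Y                      # Y' = beta^T Y, shape (d, N)
    if mode == "concat":
        return np.vstack([X_new, Y_new])    # Z1, shape (2d, N)
    return X_new + Y_new                    # Z2, shape (d, N)
```

Either Z1 or Z2 is then split into training and test sets exactly as described above.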
The invention is also characterized in that:
the method comprises the following specific steps:
Step 1, extracting two features from the pedestrian re-identification data set:
Features are extracted from the pedestrian image data set with two different feature extraction algorithms and denoted respectively as
X ∈ R^{p×N}, Y ∈ R^{q×N},
where p and q are the dimensionalities of the two features and N is the number of images in the data set;
Step 2, canonical correlation analysis is performed on the two features X and Y extracted in step 1 and solved by singular value decomposition, yielding a pair of mapping matrices α and β; the new features are expressed as X' = α^T X and Y' = β^T Y, where α^T is the transpose of the mapping matrix α and β^T is the transpose of the mapping matrix β;
Step 3, pedestrian re-identification using the fused features:
Step 3.1, with the mapping matrices α and β obtained in step 2, the fused representation of the canonical correlation features is obtained through one of the following fusion strategies:
Z1 = [X'; Y'] = [α^T X; β^T Y]
or Z2 = X' + Y' = α^T X + β^T Y. According to the partition rules of the different pedestrian re-identification data sets, the fused feature Z1 or Z2 is divided into training set I and test set I for the first view and training set II and test set II for the second view; the pedestrian re-identification model is trained with training sets I and II, and the trained model is tested with test sets I and II;
Step 3.2, the test results of step 3.1 are evaluated with the cumulative matching characteristic (CMC) curve, and the rank-1 identification rate is taken as the most important evaluation index: the larger the rank-1 value, the better the identification performance.
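A minimal sketch of how the CMC evaluation in step 3.2 could be computed from a probe-to-gallery distance matrix (the function, its arguments and the identity labels are assumptions added for illustration; the patent does not prescribe an implementation):

```python
import numpy as np

def cmc_curve(dist, probe_ids, gallery_ids, max_rank=20):
    """Cumulative matching characteristic for a (n_probe, n_gallery) distance matrix."""
    probe_ids = np.asarray(probe_ids)
    gallery_ids = np.asarray(gallery_ids)
    hits = np.zeros(max_rank)
    for i in range(dist.shape[0]):
        order = np.argsort(dist[i])                       # gallery sorted by distance
        first_hit = np.flatnonzero(gallery_ids[order] == probe_ids[i])[0]
        if first_hit < max_rank:
            hits[first_hit:] += 1                         # a match at rank r counts for every rank >= r
    return hits / dist.shape[0]                           # cmc[k-1] is the rank-k matching rate

# rank-1 rate = cmc_curve(...)[0]; the larger this value, the better the identification.
```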
In step 2, the projection matrices are solved by singular value decomposition as follows:
1) standardize the two features to obtain data with zero mean and unit variance;
2) compute the variance matrix S_XX of X, the variance matrix S_YY of Y and the covariance matrix S_XY of X and Y;
3) compute the matrix M = S_XX^{-1/2} S_XY S_YY^{-1/2};
4) perform singular value decomposition on M to obtain the maximum singular value σ and the corresponding left and right singular vectors u and v;
5) compute the mapping matrices α and β of X and Y as α = S_XX^{-1/2} u, β = S_YY^{-1/2} v;
6) the representations of the two features in the correlated subspace are X' = α^T X and Y' = β^T Y.
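A compact numerical sketch of steps 1) to 6) in numpy (the regularization term eps is an assumption added for numerical stability on high-dimensional features; it is not part of the procedure above):

```python
import numpy as np

def _inv_sqrt(S):
    """Inverse square root of a symmetric positive-definite matrix via eigen-decomposition."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

def cca_svd(X, Y, eps=1e-6):
    """Leading CCA mapping vectors of X (p, N) and Y (q, N), solved by SVD.

    Returns alpha (p,), beta (q,) and the maximal canonical correlation sigma.
    """
    # 1) standardize each feature dimension to zero mean and unit variance
    X = (X - X.mean(axis=1, keepdims=True)) / (X.std(axis=1, keepdims=True) + eps)
    Y = (Y - Y.mean(axis=1, keepdims=True)) / (Y.std(axis=1, keepdims=True) + eps)
    N = X.shape[1]
    # 2) covariance blocks S_XX, S_YY and S_XY
    Sxx = X @ X.T / N + eps * np.eye(X.shape[0])
    Syy = Y @ Y.T / N + eps * np.eye(Y.shape[0])
    Sxy = X @ Y.T / N
    # 3) M = S_XX^{-1/2} S_XY S_YY^{-1/2}
    Sxx_is, Syy_is = _inv_sqrt(Sxx), _inv_sqrt(Syy)
    M = Sxx_is @ Sxy @ Syy_is
    # 4) SVD of M; the largest singular value is the maximal correlation coefficient
    U, s, Vt = np.linalg.svd(M)
    u, v, sigma = U[:, 0], Vt[0], s[0]
    # 5) mapping vectors alpha = S_XX^{-1/2} u and beta = S_YY^{-1/2} v
    alpha, beta = Sxx_is @ u, Syy_is @ v
    # 6) X' = alpha^T X and Y' = beta^T Y are the maximally correlated projections
    return alpha, beta, sigma
```

Keeping the top d singular-vector pairs instead of only the first would give d-dimensional mappings; the procedure above describes the leading pair.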
The specific process of solving the projection matrices by singular value decomposition in step 2 is as follows:
(1) Let the mapping matrices of X and Y be α and β respectively; in the subspace the features are denoted X' = α^T X and Y' = β^T Y, and their correlation coefficient can be expressed as
ρ(X', Y') = Cov(α^T X, β^T Y) / √(Var(α^T X) · Var(β^T Y)).
The objective function is
(α, β) = arg max Cov(α^T X, β^T Y) / √(Var(α^T X) · Var(β^T Y)),
that is, the mapping matrices α and β corresponding to the maximum correlation coefficient are solved for;
(2) Before projection, the raw data are first standardized to zero mean and unit variance, so that
Cov(α^T X, β^T Y) = E(<α^T X, β^T Y>) = E((α^T X)(β^T Y)^T) = α^T E(XY^T) β,
Var(α^T X) = E((α^T(X - μ_X))(α^T(X - μ_X))^T) = α^T E(XX^T) α,
and similarly Var(β^T Y) = β^T E(YY^T) β, where μ_X is the mean of X;
(3) Since the means of X and Y are both 0, then
Var(X) = Cov(X, X) = E(XX^T),
Var(Y) = Cov(Y, Y) = E(YY^T),
Cov(X, Y) = E(XY^T),
Cov(Y, X) = E(YX^T);
(4) Let S_XX = Var(X, X), S_YY = Var(Y, Y), S_XY = Cov(X, Y); the objective function is converted to
(α, β) = arg max α^T S_XY β / √(α^T S_XX α · β^T S_YY β);
(5) Since scaling α and β multiplies numerator and denominator by the same factor, the value of the optimization objective does not change; the denominator is therefore fixed and the numerator is maximized, i.e.
(α, β) = arg max α^T S_XY β
s.t. α^T S_XX α = 1, β^T S_YY β = 1;
(6) The objective function in (5) is solved by singular value decomposition. Let u and v be two unit vectors with
u = S_XX^{1/2} α, v = S_YY^{1/2} β,
so that the constraints α^T S_XX α = 1 and β^T S_YY β = 1 become constraints on u and v, and the objective becomes
α^T S_XY β = u^T S_XX^{-1/2} S_XY S_YY^{-1/2} v.
From α^T S_XX α = 1 it follows that
u^T u = 1,
and from β^T S_YY β = 1 it follows that
v^T v = 1.
The objective function is then
(u, v) = arg max u^T S_XX^{-1/2} S_XY S_YY^{-1/2} v
s.t. u^T u = 1, v^T v = 1;
(7) For the objective function in (6), let the matrix
M = S_XX^{-1/2} S_XY S_YY^{-1/2}.
Then u and v are the left and right singular vectors corresponding to one singular value of M, and singular value decomposition gives M = U Σ V^T, where U and V are the matrices formed by the left and right singular vectors of M respectively and Σ is the diagonal matrix of the singular values of M. Since the columns of U and V are orthonormal bases, u^T U and V^T v each give a vector with a single entry equal to 1 and all other entries equal to 0. Therefore
u^T M v = u^T U Σ V^T v = σ,
the singular value associated with the chosen pair of singular vectors. Maximizing
u^T S_XX^{-1/2} S_XY S_YY^{-1/2} v
therefore amounts to selecting the pair of left and right singular vectors with the largest singular value; that is, after the singular value decomposition of M, the maximum singular value is the maximum of the optimization objective, namely the maximum correlation coefficient between X and Y;
(8) the original mapping matrices of X and Y are obtained from the corresponding left and right singular vectors u and v as
α = S_XX^{-1/2} u, β = S_YY^{-1/2} v.
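The conclusion of steps (7) and (8), namely that the largest singular value of M equals the maximum correlation between α^T X and β^T Y, can be checked numerically. The sketch below is illustrative only: the synthetic data and the cca_svd helper from the earlier sketch are assumptions, not part of the patent.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500
shared = rng.standard_normal((3, N))                       # latent structure common to both views
X = rng.standard_normal((10, 3)) @ shared + 0.1 * rng.standard_normal((10, N))
Y = rng.standard_normal((8, 3)) @ shared + 0.1 * rng.standard_normal((8, N))

alpha, beta, sigma = cca_svd(X, Y)                         # helper from the earlier sketch

# Re-standardize exactly as inside cca_svd, then compare the empirical
# correlation of the projections with the top singular value of M.
Xs = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
Ys = (Y - Y.mean(axis=1, keepdims=True)) / Y.std(axis=1, keepdims=True)
rho = np.corrcoef(alpha @ Xs, beta @ Ys)[0, 1]
print(sigma, rho)                                          # the two values agree closely
```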
In step 3.1, the XQDA algorithm is used to train the pedestrian re-identification model. The training set and the training sample labels are taken as input, and the outputs are the subspace mapping matrix W and the kernel matrix
M(W) = Σ'_I^{-1} - Σ'_E^{-1},
where Σ'_I is the intra-class covariance matrix and Σ'_E is the inter-class covariance matrix;
during testing, the Mahalanobis distance is used to measure the similarity between two pedestrian images: M(W) and the projections of the features onto the subspace W are taken as input, and the Mahalanobis distance between the original features in the subspace is obtained.
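Learning the XQDA subspace W is beyond the scope of this description, but the distance computation it describes (project the features with W, then score pairs with the kernel M(W)) could be sketched as follows; the function and variable names are assumptions, and M is assumed symmetric:

```python
import numpy as np

def subspace_mahalanobis(probe, gallery, W, M):
    """Mahalanobis-style distances d(x, z) = (W^T x - W^T z)^T M (W^T x - W^T z).

    probe: (p, n_probe) original features, gallery: (p, n_gallery) original features,
    W: (p, r) subspace mapping, M: (r, r) symmetric kernel, e.g. inv(S_I') - inv(S_E').
    Returns an (n_probe, n_gallery) distance matrix.
    """
    P = W.T @ probe                         # probe features in the learned subspace
    G = W.T @ gallery                       # gallery features in the learned subspace
    PM = P.T @ M                            # rows are p_i^T M
    GM = G.T @ M                            # rows are g_j^T M
    # expand (p_i - g_j)^T M (p_i - g_j) without an explicit pairwise loop
    return (np.sum(PM * P.T, axis=1)[:, None]
            + np.sum(GM * G.T, axis=1)[None, :]
            - 2.0 * PM @ G)
```

The resulting distance matrix is what the CMC evaluation described above is computed from.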
The invention has the following beneficial effects: the invention studies a pedestrian re-identification method based on feature fusion with canonical correlation analysis. Aiming at the high dimensionality, large amount of redundant information and heavy computation of current feature-fusion methods, a canonical correlation analysis algorithm is used to analyse the intrinsic relationship between different features of the same target and to seek a linear combination of each feature, so that each new feature retains most of the information of the original feature while having maximum correlation with the other new feature. The two new features are then fused according to one of the above strategies, which achieves feature fusion while eliminating redundant information between the features.
Drawings
FIG. 1 is a diagram of the feature fusion process of the pedestrian re-identification method based on canonical correlation analysis fusion features according to the present invention;
FIG. 2 is a graph of the results of the two individual features and of the fused features of the method on the VIPeR data set;
FIG. 3 shows the specific results of FIG. 2 with rank 1, rank 5, rank 10 and rank 20 as evaluation indexes.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
The invention relates to a pedestrian re-identification method based on canonical correlation analysis fusion features. As shown in FIG. 1, the method comprises three stages: a feature extraction stage, a mapping-matrix solving stage and a pedestrian re-identification stage using the fused features. In the feature extraction stage, two different features X and Y are extracted from the pedestrian images. In the mapping-matrix solving stage, canonical correlation analysis is applied to the two features X and Y to obtain a pair of mapping matrices α and β, and the new features are expressed as X' = α^T X and Y' = β^T Y, where α^T is the transpose of the mapping matrix α and β^T is the transpose of the mapping matrix β. In the pedestrian re-identification stage, the fused feature is expressed as
Z1 = [X'; Y'] = [α^T X; β^T Y]
or Z2 = X' + Y' = α^T X + β^T Y. The fused feature Z1 or Z2 is divided into a training set and a test set; the pedestrian re-identification model is trained with the training set and the trained model is tested with the test set.
The pedestrian re-identification method based on canonical correlation analysis fusion features is carried out through the following specific steps:
Step 1, extracting two features from the pedestrian re-identification data set:
Features are extracted from the pedestrian image data set with two different feature extraction algorithms and denoted respectively as
X ∈ R^{p×N}, Y ∈ R^{q×N},
where p and q are the dimensionalities of the two features and N is the number of images in the data set;
Step 2, canonical correlation analysis is performed on the two features X and Y extracted in step 1 and solved by singular value decomposition, yielding a pair of mapping matrices α and β; the new features are expressed as X' = α^T X and Y' = β^T Y, where α^T is the transpose of the mapping matrix α and β^T is the transpose of the mapping matrix β;
Step 3, pedestrian re-identification using the fused features:
Step 3.1, with the mapping matrices α and β obtained in step 2, the fused representation of the canonical correlation features is obtained through one of the following fusion strategies:
Z1 = [X'; Y'] = [α^T X; β^T Y]
or Z2 = X' + Y' = α^T X + β^T Y. According to the partition rules of the different pedestrian re-identification data sets, the fused feature Z1 or Z2 is divided into training set I and test set I for the first view and training set II and test set II for the second view; the pedestrian re-identification model is trained with training sets I and II, and the trained model is tested with test sets I and II;
Step 3.2, the test results of step 3.1 are evaluated with the cumulative matching characteristic (CMC) curve, and the rank-1 identification rate is taken as the most important evaluation index: the larger the rank-1 value, the better the identification performance.
In step 2, the projection matrices are solved by singular value decomposition as follows:
1) standardize the two features to obtain data with zero mean and unit variance;
2) compute the variance matrix S_XX of X, the variance matrix S_YY of Y and the covariance matrix S_XY of X and Y;
3) compute the matrix M = S_XX^{-1/2} S_XY S_YY^{-1/2};
4) perform singular value decomposition on M to obtain the maximum singular value σ and the corresponding left and right singular vectors u and v;
5) compute the mapping matrices α and β of X and Y as α = S_XX^{-1/2} u, β = S_YY^{-1/2} v;
6) the representations of the two features in the correlated subspace are X' = α^T X and Y' = β^T Y.
The specific process of solving the projection matrices by singular value decomposition in step 2 is as follows:
(1) Let the mapping matrices of X and Y be α and β respectively; in the subspace the features are denoted X' = α^T X and Y' = β^T Y, and their correlation coefficient can be expressed as
ρ(X', Y') = Cov(α^T X, β^T Y) / √(Var(α^T X) · Var(β^T Y)).
The objective function is
(α, β) = arg max Cov(α^T X, β^T Y) / √(Var(α^T X) · Var(β^T Y)),
that is, the mapping matrices α and β corresponding to the maximum correlation coefficient are solved for;
(2) Before projection, the raw data are first standardized to zero mean and unit variance, so that
Cov(α^T X, β^T Y) = E(<α^T X, β^T Y>) = E((α^T X)(β^T Y)^T) = α^T E(XY^T) β,
Var(α^T X) = E((α^T(X - μ_X))(α^T(X - μ_X))^T) = α^T E(XX^T) α,
and similarly Var(β^T Y) = β^T E(YY^T) β, where μ_X is the mean of X;
(3) Since the means of X and Y are both 0, then
Var(X) = Cov(X, X) = E(XX^T),
Var(Y) = Cov(Y, Y) = E(YY^T),
Cov(X, Y) = E(XY^T),
Cov(Y, X) = E(YX^T);
(4) Let S_XX = Var(X, X), S_YY = Var(Y, Y), S_XY = Cov(X, Y); the objective function is converted to
(α, β) = arg max α^T S_XY β / √(α^T S_XX α · β^T S_YY β);
(5) Since scaling α and β multiplies numerator and denominator by the same factor, the value of the optimization objective does not change; the denominator is therefore fixed and the numerator is maximized, i.e.
(α, β) = arg max α^T S_XY β
s.t. α^T S_XX α = 1, β^T S_YY β = 1;
(6) The objective function in (5) is solved by singular value decomposition. Let u and v be two unit vectors with
u = S_XX^{1/2} α, v = S_YY^{1/2} β,
so that the constraints α^T S_XX α = 1 and β^T S_YY β = 1 become constraints on u and v, and the objective becomes
α^T S_XY β = u^T S_XX^{-1/2} S_XY S_YY^{-1/2} v.
From α^T S_XX α = 1 it follows that
u^T u = 1,
and from β^T S_YY β = 1 it follows that
v^T v = 1.
The objective function is then
(u, v) = arg max u^T S_XX^{-1/2} S_XY S_YY^{-1/2} v
s.t. u^T u = 1, v^T v = 1;
(7) For the objective function in (6), let the matrix
M = S_XX^{-1/2} S_XY S_YY^{-1/2}.
Then u and v are the left and right singular vectors corresponding to one singular value of M, and singular value decomposition gives M = U Σ V^T, where U and V are the matrices formed by the left and right singular vectors of M respectively and Σ is the diagonal matrix of the singular values of M. Since the columns of U and V are orthonormal bases, u^T U and V^T v each give a vector with a single entry equal to 1 and all other entries equal to 0. Therefore
u^T M v = u^T U Σ V^T v = σ,
the singular value associated with the chosen pair of singular vectors. Maximizing
u^T S_XX^{-1/2} S_XY S_YY^{-1/2} v
therefore amounts to selecting the pair of left and right singular vectors with the largest singular value; that is, after the singular value decomposition of M, the maximum singular value is the maximum of the optimization objective, namely the maximum correlation coefficient between X and Y;
(8) the original mapping matrices of X and Y are obtained from the corresponding left and right singular vectors u and v as
α = S_XX^{-1/2} u, β = S_YY^{-1/2} v.
In step 3.1, the XQDA algorithm is used to train the pedestrian re-identification model. The training set and the training sample labels are taken as input, and the outputs are the subspace mapping matrix W and the kernel matrix
M(W) = Σ'_I^{-1} - Σ'_E^{-1},
where Σ'_I is the intra-class covariance matrix and Σ'_E is the inter-class covariance matrix;
during testing, the Mahalanobis distance is used to measure the similarity between two pedestrian images: M(W) and the projections of the features onto the subspace W are taken as input, and the Mahalanobis distance between the original features in the subspace is obtained.
The pedestrian re-identification method based on canonical correlation analysis fusion features disclosed by the invention has the following advantages: a canonical correlation analysis and fusion strategy is adopted in the feature fusion stage; the maximum correlation of features from different spaces is analysed in a common subspace, and the maximally correlated components of the two features are taken as the discriminative information, which effectively eliminates redundant information while fusing the features and reduces the amount and difficulty of computation.
Example one
The pedestrian re-identification method based on canonical correlation analysis fusion features of the invention is implemented according to the following steps:
step 1: extraction of two features from a pedestrian re-identification dataset
The pedestrian re-identification dataset VIPeR is used; it contains 632 pairs of pedestrian images, 1264 in total, each pair consisting of two pictures of the same person taken from different viewpoints, with each image scaled to 128 × 48 pixels. WHOS (Weighted Histograms of Overlapping Stripes) and LOMO (Local Maximal Occurrence) features are extracted from the dataset using existing feature extraction methods.
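The WHOS and LOMO extractors themselves are not reproduced in the patent. For experimentation, a simple striped colour-histogram descriptor can serve as a stand-in; the sketch below is purely an assumption for illustration and is not the patent's feature extraction:

```python
import numpy as np

def stripe_color_histogram(img, n_stripes=6, bins=8):
    """Toy per-stripe RGB histogram descriptor for an (H, W, 3) uint8 image."""
    H = img.shape[0]
    feats = []
    for s in range(n_stripes):
        stripe = img[s * H // n_stripes:(s + 1) * H // n_stripes]
        for c in range(3):                                   # one histogram per colour channel
            hist, _ = np.histogram(stripe[..., c], bins=bins, range=(0, 255))
            feats.append(hist / max(hist.sum(), 1))          # L1-normalize each histogram
    return np.concatenate(feats)                             # length n_stripes * 3 * bins

# Stacking one descriptor per image as columns yields a feature matrix in R^{p x N}.
```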
Step 2: perform canonical correlation analysis on the features and solve the mapping matrices by singular value decomposition as follows:
1) standardize the two features to obtain data with zero mean and unit variance;
2) compute the variance matrix S_XX of X, the variance matrix S_YY of Y and the covariance matrix S_XY of X and Y;
3) compute the matrix M = S_XX^{-1/2} S_XY S_YY^{-1/2};
4) perform singular value decomposition on M to obtain the maximum singular value σ and the corresponding left and right singular vectors u and v;
5) compute the mapping matrices α and β of X and Y as α = S_XX^{-1/2} u, β = S_YY^{-1/2} v;
6) the representations of the two features in the correlated subspace are X' = α^T X and Y' = β^T Y.
Step 3: re-identify pedestrians with the fused features; the specific process is as follows:
1) the fused feature Z ∈ R^{d×N}, where d is the dimension of the fused feature and N is the number of images in the data set; the fused feature is expressed as
Z1 = [X'; Y'] = [α^T X; β^T Y]
or Z2 = X' + Y' = α^T X + β^T Y; for the VIPeR data set N = 1264, and feature columns 1-632 are used as the query set while columns 633-1264 are used as the candidate set;
2) for the query set and the candidate set, 316 feature columns are randomly selected from each as the two training sets, and the remaining 316 columns of each form the two test sets (a sketch of this split is given after this list);
3) the identification process uses the XQDA (Cross-view Quadratic Discriminant Analysis) algorithm, which takes the training set and the training sample labels as input and outputs the subspace mapping matrix W and the kernel matrix
M(W) = Σ'_I^{-1} - Σ'_E^{-1},
where Σ'_I is the intra-class covariance matrix and Σ'_E is the inter-class covariance matrix;
4) during testing, the Mahalanobis distance is used to measure the similarity between two pedestrian images; M(W) and the projections of the features onto the subspace W are taken as input, and the Mahalanobis distance between the original features in the subspace is obtained;
5) as shown in FIG. 2 and FIG. 3, the results are evaluated with CMC curves, using rank 1, rank 5, rank 10 and rank 20 as evaluation indexes, among which the rank-1 value is particularly important for evaluating pedestrian re-identification performance.
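A sketch of the split in items 1) and 2), assuming the fused feature matrix Z has one column per image with the query view in columns 1-632 and the candidate view in columns 633-1264 (indexing and variable names are assumptions for illustration):

```python
import numpy as np

def viper_split(Z, n_ids=632, n_train=316, seed=0):
    """Split fused VIPeR features into training/test query and candidate sets.

    Z: (d, 2 * n_ids) fused features; columns [0, n_ids) are the query view and
    columns [n_ids, 2 * n_ids) are the candidate view of the same identities.
    """
    query, candidate = Z[:, :n_ids], Z[:, n_ids:]
    rng = np.random.default_rng(seed)
    perm = rng.permutation(n_ids)
    train_ids, test_ids = perm[:n_train], perm[n_train:]
    # two training sets (one per view) and two test sets (one per view)
    return (query[:, train_ids], candidate[:, train_ids],
            query[:, test_ids], candidate[:, test_ids])
```

The two training sets feed the XQDA step in item 3), and the two test sets are scored with the Mahalanobis distance in item 4).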

Claims (5)

1. A pedestrian re-identification method based on canonical correlation analysis fusion features, characterized by comprising three stages: a feature extraction stage, a mapping-matrix solving stage and a pedestrian re-identification stage using the fused features; in the feature extraction stage, two different features X and Y are extracted from the pedestrian images; in the mapping-matrix solving stage, canonical correlation analysis is applied to the two features X and Y to obtain a pair of mapping matrices α and β, and the new features are expressed as X' = α^T X and Y' = β^T Y, where α^T is the transpose of the mapping matrix α and β^T is the transpose of the mapping matrix β; in the pedestrian re-identification stage, the fused feature is expressed as
Z1 = [X'; Y'] = [α^T X; β^T Y]
or Z2 = X' + Y' = α^T X + β^T Y; the fused feature Z1 or Z2 is divided into a training set and a test set, the pedestrian re-identification model is trained with the training set, and the trained model is tested with the test set.
2. The pedestrian re-identification method based on canonical correlation analysis fusion features according to claim 1, characterized by comprising the following specific steps:
step 1, extracting two features from the pedestrian re-identification data set:
features are extracted from the pedestrian image data set with two different feature extraction algorithms and denoted respectively as
X ∈ R^{p×N}, Y ∈ R^{q×N},
where p and q are the dimensionalities of the two features and N is the number of images in the data set;
step 2, canonical correlation analysis is performed on the two features X and Y extracted in step 1 and solved by singular value decomposition, yielding a pair of mapping matrices α and β; the new features are expressed as X' = α^T X and Y' = β^T Y, where α^T is the transpose of the mapping matrix α and β^T is the transpose of the mapping matrix β;
step 3, pedestrian re-identification using the fused features:
step 3.1, with the mapping matrices α and β obtained in step 2, the fused representation of the canonical correlation features is obtained through one of the following fusion strategies:
Z1 = [X'; Y'] = [α^T X; β^T Y]
or Z2 = X' + Y' = α^T X + β^T Y; according to the partition rules of the different pedestrian re-identification data sets, the fused feature Z1 or Z2 is divided into training set I and test set I for the first view and training set II and test set II for the second view; the pedestrian re-identification model is trained with training sets I and II, and the trained model is tested with test sets I and II;
step 3.2, the test results of step 3.1 are evaluated with the cumulative matching characteristic (CMC) curve, and the rank-1 identification rate is taken as the most important evaluation index: the larger the rank-1 value, the better the identification performance.
3. The pedestrian re-identification method based on canonical correlation analysis fusion features according to claim 2, wherein the projection matrices in step 2 are solved by singular value decomposition as follows:
1) standardize the two features to obtain data with zero mean and unit variance;
2) compute the variance matrix S_XX of X, the variance matrix S_YY of Y and the covariance matrix S_XY of X and Y;
3) compute the matrix M = S_XX^{-1/2} S_XY S_YY^{-1/2};
4) perform singular value decomposition on M to obtain the maximum singular value σ and the corresponding left and right singular vectors u and v;
5) compute the mapping matrices α and β of X and Y as α = S_XX^{-1/2} u, β = S_YY^{-1/2} v;
6) the representations of the two features in the correlated subspace are X' = α^T X and Y' = β^T Y.
4. The pedestrian re-identification method based on canonical correlation analysis fusion features according to claim 3, wherein the specific process of solving the projection matrices by singular value decomposition in step 2 is as follows:
(1) let the mapping matrices of X and Y be α and β respectively; in the subspace the features are denoted X' = α^T X and Y' = β^T Y, and their correlation coefficient can be expressed as
ρ(X', Y') = Cov(α^T X, β^T Y) / √(Var(α^T X) · Var(β^T Y));
the objective function is
(α, β) = arg max Cov(α^T X, β^T Y) / √(Var(α^T X) · Var(β^T Y)),
that is, the mapping matrices α and β corresponding to the maximum correlation coefficient are solved for;
(2) before projection, the raw data are first standardized to zero mean and unit variance, so that
Cov(α^T X, β^T Y) = E(<α^T X, β^T Y>) = E((α^T X)(β^T Y)^T) = α^T E(XY^T) β,
Var(α^T X) = E((α^T(X - μ_X))(α^T(X - μ_X))^T) = α^T E(XX^T) α,
and similarly Var(β^T Y) = β^T E(YY^T) β, where μ_X is the mean of X;
(3) since the means of X and Y are both 0,
Var(X) = Cov(X, X) = E(XX^T),
Var(Y) = Cov(Y, Y) = E(YY^T),
Cov(X, Y) = E(XY^T),
Cov(Y, X) = E(YX^T);
(4) let S_XX = Var(X, X), S_YY = Var(Y, Y), S_XY = Cov(X, Y); the objective function is converted to
(α, β) = arg max α^T S_XY β / √(α^T S_XX α · β^T S_YY β);
(5) since scaling α and β multiplies numerator and denominator by the same factor, the value of the optimization objective does not change; the denominator is therefore fixed and the numerator is maximized, i.e.
(α, β) = arg max α^T S_XY β
s.t. α^T S_XX α = 1, β^T S_YY β = 1;
(6) the objective function in (5) is solved by singular value decomposition; let u and v be two unit vectors with
u = S_XX^{1/2} α, v = S_YY^{1/2} β,
so that the constraints α^T S_XX α = 1 and β^T S_YY β = 1 become constraints on u and v, and the objective becomes
α^T S_XY β = u^T S_XX^{-1/2} S_XY S_YY^{-1/2} v;
from α^T S_XX α = 1 it follows that
u^T u = 1,
and from β^T S_YY β = 1 it follows that
v^T v = 1;
the objective function then becomes
(u, v) = arg max u^T S_XX^{-1/2} S_XY S_YY^{-1/2} v
s.t. u^T u = 1, v^T v = 1;
(7) for the objective function in (6), let the matrix
M = S_XX^{-1/2} S_XY S_YY^{-1/2};
u and v are then the left and right singular vectors corresponding to one singular value of M, and singular value decomposition gives M = U Σ V^T, where U and V are the matrices formed by the left and right singular vectors of M respectively and Σ is the diagonal matrix of the singular values of M; since the columns of U and V are orthonormal bases, u^T U and V^T v each give a vector with a single entry equal to 1 and all other entries equal to 0; therefore
u^T M v = u^T U Σ V^T v = σ,
the singular value associated with the chosen pair of singular vectors; maximizing
u^T S_XX^{-1/2} S_XY S_YY^{-1/2} v
therefore amounts to selecting the pair of left and right singular vectors with the largest singular value, i.e. after the singular value decomposition of M, the maximum singular value is the maximum of the optimization objective, namely the maximum correlation coefficient between X and Y;
(8) the original mapping matrices of X and Y are obtained from the corresponding left and right singular vectors u and v as
α = S_XX^{-1/2} u, β = S_YY^{-1/2} v.
5. The pedestrian re-identification method based on canonical correlation analysis fusion features according to claim 2, wherein the XQDA algorithm is used to train the pedestrian re-identification model in step 3.1, with the training set and the training sample labels as input, and the outputs are the subspace mapping matrix W and the kernel matrix
M(W) = Σ'_I^{-1} - Σ'_E^{-1},
where Σ'_I is the intra-class covariance matrix and Σ'_E is the inter-class covariance matrix;
during testing, the Mahalanobis distance is used to measure the similarity between two pedestrian images: M(W) and the projections of the features onto the subspace W are taken as input, and the Mahalanobis distance between the original features in the subspace is obtained.
CN201911114451.7A 2019-11-14 2019-11-14 Pedestrian re-identification method based on canonical correlation analysis fusion features Active CN110874576B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911114451.7A CN110874576B (en) 2019-11-14 2019-11-14 Pedestrian re-identification method based on canonical correlation analysis fusion features

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911114451.7A CN110874576B (en) 2019-11-14 2019-11-14 Pedestrian re-identification method based on canonical correlation analysis fusion features

Publications (2)

Publication Number Publication Date
CN110874576A true CN110874576A (en) 2020-03-10
CN110874576B CN110874576B (en) 2023-10-27

Family

ID=69718334

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911114451.7A Active CN110874576B (en) 2019-11-14 2019-11-14 Pedestrian re-identification method based on canonical correlation analysis fusion features

Country Status (1)

Country Link
CN (1) CN110874576B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107273825A (en) * 2017-05-25 2017-10-20 西安电子科技大学 Personal identification method is merged based on the physiological signal for improving canonical correlation analysis
CN107506700A (en) * 2017-08-07 2017-12-22 苏州经贸职业技术学院 Pedestrian's recognition methods again based on the study of broad sense similarity measurement
WO2019206265A1 (en) * 2018-04-26 2019-10-31 北京京东尚科信息技术有限公司 Pedestrian re-identification method and apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
袁立 (YUAN Li) et al.: "Pedestrian re-identification method based on fused features", Pattern Recognition and Artificial Intelligence (模式识别与人工智能) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112270228A (en) * 2020-10-16 2021-01-26 西安工程大学 Pedestrian re-identification method based on DCCA fusion characteristics
CN114139614A (en) * 2021-11-18 2022-03-04 南京工业大学 Fisher photovoltaic module hot spot diagnosis method and system based on typical correlation analysis feature extraction
CN115984193A (en) * 2022-12-15 2023-04-18 东北林业大学 PDL1 expression level detection method fusing histopathology image and CT image

Also Published As

Publication number Publication date
CN110874576B (en) 2023-10-27

Similar Documents

Publication Publication Date Title
CN107330397B (en) Pedestrian re-identification method based on large-interval relative distance measurement learning
CN110659665B (en) Model construction method of different-dimension characteristics and image recognition method and device
CN112507901B (en) Unsupervised pedestrian re-identification method based on pseudo tag self-correction
CN105005772B (en) A kind of video scene detection method
CN109255289B (en) Cross-aging face recognition method based on unified generation model
CN110717554B (en) Image recognition method, electronic device, and storage medium
CN111325115A (en) Countermeasures cross-modal pedestrian re-identification method and system with triple constraint loss
CN111126240B (en) Three-channel feature fusion face recognition method
CN110874576A (en) Pedestrian re-identification method based on canonical correlation analysis fusion features
CN113011357A (en) Depth fake face video positioning method based on space-time fusion
Lee et al. Face image retrieval using sparse representation classifier with gabor-lbp histogram
Puthenputhussery et al. A sparse representation model using the complete marginal fisher analysis framework and its applications to visual recognition
Li et al. Dating ancient paintings of Mogao Grottoes using deeply learnt visual codes
CN116612335B (en) Few-sample fine-granularity image classification method based on contrast learning
CN115131580B (en) Space target small sample identification method based on attention mechanism
CN108875448B (en) Pedestrian re-identification method and device
Najibi et al. Towards the success rate of one: Real-time unconstrained salient object detection
CN108960013B (en) Pedestrian re-identification method and device
CN106326927B (en) A kind of shoes print new category detection method
CN112329698A (en) Face recognition method and system based on intelligent blackboard
CN116935411A (en) Radical-level ancient character recognition method based on character decomposition and reconstruction
CN112307894A (en) Pedestrian age identification method based on wrinkle features and posture features in community monitoring scene
Yuan et al. Holistic learning-based high-order feature descriptor for smoke recognition
CN111353443B (en) Pedestrian re-identification method based on cross-view kernel collaborative representation
CN112270228A (en) Pedestrian re-identification method based on DCCA fusion characteristics

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230918

Address after: 150000 No. 113-3, Zhongshan Road, Nangang District, Harbin City, Heilongjiang Province

Applicant after: PENGBOPUHUA TECHNOLOGY Co.,Ltd.

Address before: 518000 1002, Building A, Zhiyun Industrial Park, No. 13, Huaxing Road, Henglang Community, Longhua District, Shenzhen, Guangdong Province

Applicant before: Shenzhen Wanzhida Technology Co.,Ltd.

Effective date of registration: 20230918

Address after: 518000 1002, Building A, Zhiyun Industrial Park, No. 13, Huaxing Road, Henglang Community, Longhua District, Shenzhen, Guangdong Province

Applicant after: Shenzhen Wanzhida Technology Co.,Ltd.

Address before: 710048 Shaanxi province Xi'an Beilin District Jinhua Road No. 19

Applicant before: XI'AN POLYTECHNIC University

GR01 Patent grant