CN101487892A - Hyperspectral data dimensionality reduction method based on a factor analysis model

Info

Publication number: CN101487892A (application); CN101487892B (granted publication)
Application number: CN200910078443A
Authority: CN (China)
Original language: Chinese (zh)
Inventors: Zhao Huijie (赵慧洁), Li Na (李娜), Cai Hui (蔡辉), Jia Guorui (贾国瑞), Xu Zhou (徐州)
Applicant and assignee: Beihang University
Priority and filing date: 2009-02-23
Publication date of application: 2009-07-22
Grant date: 2012-09-26
Legal status: Expired - Fee Related (granted, later terminated for non-payment of the annual fee)
Abstract

The invention provides a method for reducing the dimensionality of hyperspectral data based on a factor analysis model, comprising the following steps: (1) reading in the hyperspectral data; (2) establishing the factor analysis model for dimensionality reduction of the hyperspectral data; (3) computing the mean, covariance matrix and correlation matrix of the data; (4) computing the eigenvalues and standardized eigenvectors of the data correlation matrix; (5) solving for the factor loading matrix by a parameter estimation method; (6) computing the variances of the specific factors and the communalities of the data variables in the factor analysis model; (7) computing the varimax (maximum variance) factor loading rotation matrix; (8) computing the factor scores by a weighted least squares method; and (9) obtaining the intrinsic dimensions characterizing the hyperspectral data, thereby achieving dimensionality reduction of the hyperspectral data. The method is an automatic hyperspectral data dimensionality reduction method that effectively removes the correlation between the bands of the hyperspectral data and improves the separability of different classes of ground objects.

Description

Hyperspectral data dimensionality reduction method based on a factor analysis model
Technical field
The present invention relates to a hyperspectral data dimensionality reduction method based on a factor analysis model, belongs to the technical field of hyperspectral data processing methods and applications, and is applicable to theoretical research and applied studies on hyperspectral data dimensionality reduction.
Background art
A hyperspectral imager is a novel remote sensing payload whose spectral sampling is dense and continuous and which simultaneously records the spectral and spatial information of the observed ground objects; as a result, hyperspectral data are high-dimensional and voluminous. Because of the high spectral resolution, neighbouring bands are strongly correlated and contain a large amount of redundant information, and the amount of data processing grows exponentially as the number of bands increases. How to reduce the dimensionality of hyperspectral data and eliminate the correlation between bands has therefore become a key step in data processing.
Many methods for hyperspectral data dimensionality reduction have been proposed; they fall mainly into two classes, band selection and feature extraction. The purpose of band selection is to replace the n-dimensional hyperspectral data with a representative m-dimensional (m<n) subset of bands; although such methods preserve the characteristics of the original image data well, they lose part of the information content and spectral detail of the hyperspectral data. The purpose of feature extraction is to compress the n-dimensional hyperspectral data into an m-dimensional (m<n) model space so as to maximize the separability between classes; such methods emphasize the differences between image data during dimensionality reduction, but they change the original characteristics of the data. The main shortcoming of existing dimensionality reduction methods is therefore that they cannot both retain all the useful information provided by the data and represent the characteristics of the same type of ground object in the same bands.
Summary of the invention
The object of the present invention is to provide a hyperspectral data dimensionality reduction method based on a factor analysis model that addresses the shortcomings of existing hyperspectral dimensionality reduction methods, such as the loss of data information.
The technical solution of the present invention is as follows: the hyperspectral data dimensionality reduction method based on a factor analysis model of the present invention is a method that uses a factor analysis model to map high-dimensional hyperspectral data into a low-dimensional space that characterizes the intrinsic properties of the hyperspectral data. The method uses the factor analysis model to compute the factor loading matrix, applies a varimax (maximum variance) rotation to the factor loadings, and computes the factor scores to obtain the intrinsic dimensions characterizing the hyperspectral data, thereby achieving hyperspectral data dimensionality reduction.
The steps of the hyperspectral data dimensionality reduction method based on a factor analysis model of the present invention are as follows:
(1) reading in the hyperspectral remote sensing data;
(2) establishing the factor analysis model for hyperspectral data dimensionality reduction;
(3) computing the mean, covariance matrix and correlation matrix of the hyperspectral data;
(4) computing the eigenvalues and standardized eigenvectors of the data correlation matrix;
(5) solving for the factor loading matrix by the principal component method;
(6) computing the variances of the specific factors and the communalities of the data variables in the factor analysis model;
(7) computing the varimax (maximum variance) factor loading rotation matrix;
(8) computing the factor scores by the weighted least squares method;
(9) obtaining the intrinsic dimensions characterizing the hyperspectral data, thereby achieving hyperspectral data dimensionality reduction.
Wherein, " the setting up the Factor Analysis Model of high-spectral data dimensionality reduction " described in the step (2), its Factor Analysis Model is:
X=μ+AF+ε
In the formula, X=(x 1, x 2..., x pThe observable stochastic variable of) ' be, average μ=E (X), the common factor F=(F of X 1, F 2..., F m) ' (m<p) is a unobservable random variable, and satisfies average E (F)=0, variance D (F)=I m(being that each component variance of F is 1 and uncorrelated mutually), specific factor ε=(ε 1, ε 2..., ε p) ' uncorrelated mutually with F, and satisfy average E (ε)=0, variance D ( ϵ ) = diag ( σ 1 2 , σ 2 2 , . . . , σ p 2 ) = def D ; Common factor generally all works to each component of X, and specific factor only works to a certain component of X, and between the specific factor and uncorrelated mutually between specific factor and the common factor; Factor loading matrix A=(a Ij) P * mBe matrix of coefficients to be estimated, in factorial analysis, specific factor plays residual error or noise.
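For illustration, a minimal numerical sketch of this generative model in Python/numpy is given below; the dimensions p, m, n, the noise level and all numerical values are assumptions chosen for the example, not values from the patent.

```python
import numpy as np

# Illustrative sketch of the model X = mu + A F + eps.
# p bands, m common factors, n pixels; all values are toy assumptions.
rng = np.random.default_rng(0)
p, m, n = 8, 3, 1000

mu  = rng.uniform(0.1, 0.9, size=p)        # band means
A   = 0.3 * rng.normal(size=(p, m))        # factor loading matrix, p x m
F   = rng.normal(size=(m, n))              # common factors: E(F)=0, D(F)=I_m
eps = 0.05 * rng.normal(size=(p, n))       # specific factors (residual/noise)

X = mu[:, None] + A @ F + eps              # observed data, one column per pixel
print(X.shape)                             # (8, 1000)
```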
In step (3), "computing the mean, covariance matrix and correlation matrix of the hyperspectral data" is carried out as follows:
$$\bar{X} = \frac{1}{n}\sum_{t=1}^{n} X^{(t)} = (\bar{x}_1, \bar{x}_2, \ldots, \bar{x}_p)'$$
$$V = \frac{1}{n}\sum_{t=1}^{n} \left(X^{(t)}-\bar{X}\right)\left(X^{(t)}-\bar{X}\right)' \overset{\mathrm{def}}{=} (V_{ij})$$
$$R = (r_{ij}) = \left(\frac{V_{ij}}{\sqrt{V_{ii}V_{jj}}}\right), \qquad i,j = 1,2,\ldots,p$$
In the formulas, $\bar{X}$ is the mean of the data, V is the covariance matrix, R is the correlation matrix, n is the number of pixels, and $X^{(t)}$ denotes the t-th pixel.
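As a minimal sketch (not the patented implementation itself), these three statistics can be computed with numpy as follows, assuming the hyperspectral cube has already been reshaped into a p × n matrix X with one pixel per column; the function name mean_cov_corr is a hypothetical helper introduced here:

```python
import numpy as np

# Sketch: mean, covariance and correlation matrix of step (3),
# with X arranged as a (p bands) x (n pixels) matrix.
def mean_cov_corr(X):
    p, n = X.shape
    x_bar = X.mean(axis=1)                     # band means, length p
    Xc = X - x_bar[:, None]                    # centered data
    V = (Xc @ Xc.T) / n                        # covariance matrix (1/n, as in the text)
    d = np.sqrt(np.diag(V))
    R = V / np.outer(d, d)                     # r_ij = V_ij / sqrt(V_ii * V_jj)
    return x_bar, V, R
```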
Wherein, " eigenwert of computational data correlation matrix and the standardized feature vector " described in the step (4), its implication is described as follows: the eigenwert of calculating correlation matrix R is λ 1〉=λ 2〉=... 〉=λ p〉=0, and the proper vector l of the Schimidt orthogonalization of each eigenwert correspondence 1, l 2..., l p
Wherein, " the carrying out finding the solution of factor loading matrix " described in the step (5) by the major component method of separating, its computing method are as follows:
Basis at first λ 1 + λ 2 + · · · + λ m λ 1 + · · · + λ m + · · · λ p ≥ 0.9 Smallest positive integral, determine common factor number m;
The method of utilizing major component to separate is then found the solution the factor loading matrix: the eigenwert of utilizing the correlation matrix R that calculates in the step (4) is λ 1〉=λ 2〉=... 〉=λ p〉=0, the proper vector behind the corresponding Schimidt orthogonalization is l 1, l 2..., l p, then utilize the spectral factorization of correlation matrix R: R = Σ i = 1 p λ i l i l i ′ , Separate for one that obtains Factor Analysis Model: A = ( λ 1 l 1 , · · · , λ m l m ) = def ( a ij ) p × m .
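A sketch of this step under the same assumptions; the helper name factor_loadings and the threshold parameter (defaulting to the 0.9 criterion above) are illustrative:

```python
import numpy as np

# Sketch: principal-component solution of the factor loading matrix from the
# correlation matrix R, choosing m by the 90% cumulative-eigenvalue criterion.
def factor_loadings(R, threshold=0.9):
    lam, L = np.linalg.eigh(R)                      # ascending eigenvalues/eigenvectors
    order = np.argsort(lam)[::-1]
    lam, L = np.clip(lam[order], 0.0, None), L[:, order]
    ratio = np.cumsum(lam) / lam.sum()
    m = int(np.searchsorted(ratio, threshold) + 1)  # smallest m with ratio >= threshold
    A = L[:, :m] * np.sqrt(lam[:m])                 # A = (sqrt(lam_1) l_1, ..., sqrt(lam_m) l_m)
    return A, m
```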
Wherein, " the common degree of the variance matrix of specific factor and data variable in the calculated factor analytical model " described in the step (6), its implication is described as follows: the variance of specific factor is in the Factor Analysis Model: σ i 2 = 1 - Σ t = 1 m a it 2 , ( i = 1,2 , . . . , p ) , Variable x iCommon degree h i 2Be estimated as h i 2 = Σ t = 1 m a it 2 , ( i = 1,2 , . . . , p ) .
Wherein, " calculating factor loading rotation matrix " described in the step (7) based on the variance maximum, its implication is described as follows: common factor is done the quadrature rotation exactly loading matrix A is done orthogonal transformation, orthogonal matrix Г is taken advantage of on the right side, make A Γ have distincter meaning, postrotational common factor vector is F *=Г ' F, its each component It also is mutual incoherent common factor; Difference according to orthogonal matrix Г is chosen mode, will construct different quadrature spinning solutions, and the method that adopts among the present invention is the spinning solution of maximum variance; The specific implementation process is as follows:
Order A * = AΓ = ( a ij * ) p × m , d ij = a ij * / h i , d ‾ j = 1 p Σ i = 1 p d ij 2 , A then *The relative variance of j column element square can be defined as: V j = 1 p Σ i = 1 p ( d ij 2 - d ‾ j ) 2 , In the formula, get
Figure A200910078443D00096
Be in order to eliminate d IjThe influence that symbol is different; With
Figure A200910078443D00097
Divided by h iBe in order to eliminate each original variable X iThe influence different to the common factor degree of dependence; The maximum variance spinning solution is selected orthogonal matrix Г exactly, makes A *It is maximum that the relative variance V of all m column elements square reaches: V=V 1+ V 2+ ... + V mOne by one per two common factors are carried out above-mentioned rotation, to common factor F lAnd F kRotation, the 1st row and the k row to the A matrix carry out orthogonal transformation exactly, make the relative variance sum maximum of this two element that is listed as square, and all the other row are constant, rotation matrix Г LkCan be expressed as:
Figure A200910078443D00098
In the formula, θ is F lAnd F kThe anglec of rotation, all the other elements of matrix are 0; Ask θ to make V reach maximum, by the method for asking extreme value in the infinitesimal analysis, to the θ differentiate, and to make it be 0 with V:
V = V 1 + V 2 + · · · + V m = 1 p Σ i = 1 p ( d i 1 2 - d ‾ 1 ) 2 + · · · + 1 p Σ i = 1 p ( d im 2 - d ‾ m ) 2
So, by ∂ V ∂ θ = 0 , : tan 4 θ = M - 2 UV / p W - ( U 2 - V 2 ) / p , In the formula: U = Σ i = 1 p u i , V = Σ i = 1 p v i , W = Σ i = 1 p ( u i 2 - v i 2 ) , M = 2 Σ i = 1 p u i v i , u i = ( a i 1 h i ) 2 + ( a i 2 h i ) 2 , v i = 2 a i 1 a i 2 h i 2 , a I1, a T2Be the element in the factor loading matrix, h i 2Be variable x iCommon degree.
The first round, postrotational factor loading matrix was A (1), begin the rotation of next round then, obtain a series of factor loading matrix and be: A (1), A (2)..., A (s)..., then must have: V (1)≤ V (2)≤ ...≤V (s)≤ ..., V (s)Be A (s)The relative variance sum of each column element square in the practical application, is worked as V (s-1)-V (s)During≤ε, can stop the rotation.
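A minimal sketch of the pairwise varimax rotation described above; the helper name varimax, the max_rounds safety cap and the use of arctan2 to solve tan 4θ are assumptions of this sketch, and u_i is written in the difference form of the standard Kaiser formula:

```python
import numpy as np

# Sketch: pairwise varimax rotation of the p x m loading matrix A,
# h2 being the communalities and eps the stopping tolerance.
def varimax(A, h2, eps=1e-3, max_rounds=50):
    A = A.copy()
    p, m = A.shape
    h = np.sqrt(np.maximum(h2, 1e-12))             # guard against zero communalities

    def total_relative_variance(B):
        d2 = (B / h[:, None]) ** 2                 # d_ij^2 = (a_ij / h_i)^2
        return ((d2 - d2.mean(axis=0)) ** 2).mean(axis=0).sum()   # V = sum_j V_j

    V_prev = total_relative_variance(A)
    for _ in range(max_rounds):
        for l in range(m - 1):
            for k in range(l + 1, m):
                a1, a2 = A[:, l] / h, A[:, k] / h
                u = a1 ** 2 - a2 ** 2              # difference form of Kaiser's u_i
                v = 2.0 * a1 * a2
                U, Vs = u.sum(), v.sum()
                W = (u ** 2 - v ** 2).sum()
                M = 2.0 * (u * v).sum()
                theta = 0.25 * np.arctan2(M - 2.0 * U * Vs / p,
                                          W - (U ** 2 - Vs ** 2) / p)
                c, s = np.cos(theta), np.sin(theta)
                A[:, [l, k]] = A[:, [l, k]] @ np.array([[c, -s], [s, c]])
        V_new = total_relative_variance(A)
        if V_new - V_prev <= eps:                  # improvement between rounds is small
            break
        V_prev = V_new
    return A
```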
Wherein, " utilization " described in the step (8) based on weighted least require method calculated factor score, its implication is described as follows: on the basis that obtains A and D, adopt weighted least-squares that F is found the solution: F ^ = ( A ^ ′ D ^ - 1 A ^ ) - 1 A ′ D - 1 X , Promptly obtained the weighted least-squares estimation of F, in the formula,
Figure A200910078443D00106
Be the factor loading matrix that estimation obtains, Be the specific factor variance matrix that estimation obtains, X reads in data.
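A sketch of this estimator; standardizing X with the band means and standard deviations (consistent with the use of the correlation matrix in the earlier steps) is an assumption of this sketch rather than something stated explicitly in the patent:

```python
import numpy as np

# Sketch: weighted least squares factor scores F = (A' D^-1 A)^-1 A' D^-1 X.
def factor_scores(A_rot, sigma2, X, x_bar, x_std):
    Xs = (X - x_bar[:, None]) / x_std[:, None]        # assumed standardization
    D_inv = np.diag(1.0 / np.maximum(sigma2, 1e-8))   # inverse of the diagonal matrix D
    B = A_rot.T @ D_inv                               # A' D^-1
    return np.linalg.solve(B @ A_rot, B @ Xs)         # m x n matrix of factor scores
```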
Wherein, " obtaining characterizing the intrinsic dimension of high-spectral data; realize the high-spectral data dimensionality reduction " described in the step (9), its implication is described as follows: utilize factor score to estimate to obtain characterizing the intrinsic dimension of high-spectral data, thereby realized the high-spectral data dimensionality reduction.
Compared with the prior art, the advantage of the present invention is that it overcomes the limitations of existing hyperspectral dimensionality reduction methods, such as the loss of data information and the inability to retain the original characteristics of the data. The method employs a factor analysis model and an orthogonal varimax rotation of the factor matrix to obtain the intrinsic dimensions characterizing the hyperspectral data, thereby achieving hyperspectral data dimensionality reduction. It offers the following advantages: (1) modelling the hyperspectral data with a factor analysis model yields intrinsic dimensions that reflect the structure and features of the hyperspectral data, simplifies the data structure while eliminating the correlation between bands, and preserves the original characteristics of the data well; (2) the varimax rotation of the factor loadings makes the meaning represented by each factor easy to interpret, and each type of ground object plays a dominant role in only one factor, which maximizes the separability between different types of ground objects.
Description of drawings
Fig. 1 is the implementation flowchart of the hyperspectral data dimensionality reduction method based on a factor analysis model according to the present invention.
Embodiment
To better illustrate the hyperspectral data dimensionality reduction method based on a factor analysis model of the present invention, PHI airborne hyperspectral data are used for the fine classification of crops in the Fanglu tea plantation area, Jiangsu Province. The implementation flow of the method is shown in Fig. 1, and the specific implementation steps are as follows:
(1) Reading in the hyperspectral data: the PHI hyperspectral data of the Fanglu tea plantation are read in, and bands with a low signal-to-noise ratio or affected by atmospheric absorption are removed; the data size is 210 × 150 × 64.
(2) Establishing the factor analysis model for hyperspectral data dimensionality reduction.
The factor analysis model for hyperspectral data dimensionality reduction is:
X = μ + AF + ε
In the formula, $X=(x_1, x_2, \ldots, x_{64})'$ is the observable random vector, with mean $\mu = E(X)$; the common factor vector $F=(F_1, F_2, \ldots, F_m)'$ ($m<64$) is unobservable and satisfies $E(F)=0$ and $D(F)=I_m$ (that is, every component of F has unit variance and the components are mutually uncorrelated); the specific factors $\varepsilon=(\varepsilon_1, \varepsilon_2, \ldots, \varepsilon_{64})'$ are uncorrelated with F and satisfy $E(\varepsilon)=0$ and $D(\varepsilon)=\mathrm{diag}(\sigma_1^2, \sigma_2^2, \ldots, \sigma_{64}^2) \overset{\mathrm{def}}{=} D$. A common factor generally acts on every component of X, whereas a specific factor acts only on a single component of X; the specific factors are mutually uncorrelated and are uncorrelated with the common factors. The matrix $A=(a_{ij})_{64\times m}$ is the coefficient matrix to be estimated, called the factor loading matrix; in factor analysis the specific factors play the role of residuals or noise.
(3) Computing the mean, covariance matrix and correlation matrix of the hyperspectral data.
For the 210 × 150 × 64 hyperspectral data, the mean $\bar{X}$, the covariance matrix V and the correlation matrix R are computed as follows:
$$\bar{X} = \frac{1}{n}\sum_{t=1}^{n} X^{(t)} = (\bar{x}_1, \bar{x}_2, \ldots, \bar{x}_{64})'$$
$$V = \frac{1}{n}\sum_{t=1}^{n}\left(X^{(t)}-\bar{X}\right)\left(X^{(t)}-\bar{X}\right)' \overset{\mathrm{def}}{=} (V_{ij})$$
$$R = (r_{ij}) = \left(\frac{V_{ij}}{\sqrt{V_{ii}V_{jj}}}\right), \qquad i,j = 1,2,\ldots,64$$
In the formulas, $\bar{X}$ is the mean of the data, V is the covariance matrix, R is the correlation matrix, n = 210 × 150 is the number of pixels, and $X^{(t)}$ denotes the t-th pixel.
(4) Computing the eigenvalues and standardized eigenvectors of the data correlation matrix.
The eigenvalues of the correlation matrix R are computed as $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_{64} \ge 0$, with corresponding Gram-Schmidt orthogonalized eigenvectors $l_1, l_2, \ldots, l_{64}$.
(5) Solving for the factor loading matrix by the principal component method.
First, the number of common factors is determined as the smallest positive integer m satisfying
$$\frac{\lambda_1 + \lambda_2 + \cdots + \lambda_m}{\lambda_1 + \lambda_2 + \cdots + \lambda_{64}} \ge 0.9,$$
which gives m = 6.
Then the factor loading matrix is solved by the principal component method: using the eigenvalues $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_{64} \ge 0$ of the correlation matrix R computed in step (4) and the corresponding Gram-Schmidt orthogonalized eigenvectors $l_1, l_2, \ldots, l_{64}$, the spectral decomposition of R is $R = \sum_{i=1}^{64} \lambda_i l_i l_i'$, and a solution of the factor analysis model is
$$A = \left(\sqrt{\lambda_1}\, l_1, \ldots, \sqrt{\lambda_6}\, l_6\right) \overset{\mathrm{def}}{=} (a_{ij})_{64\times 6}.$$
(6) Computing the variances of the specific factors and the communalities of the data variables in the factor analysis model.
In the factor analysis model, the variance of the i-th specific factor is $\sigma_i^2 = 1 - \sum_{t=1}^{m} a_{it}^2$, $i = 1, 2, \ldots, 64$, and the communality of the variable $x_i$ is estimated as $h_i^2 = \sum_{t=1}^{m} a_{it}^2$, $i = 1, 2, \ldots, 64$.
(7) Computing the varimax (maximum variance) factor loading rotation matrix.
Performing an orthogonal rotation of the common factors amounts to applying an orthogonal transformation to the loading matrix A: A is multiplied on the right by an orthogonal matrix Γ so that AΓ has a clearer interpretation. The rotated common factor vector is $F^{*} = \Gamma' F$, whose components are again mutually uncorrelated common factors. Different ways of choosing the orthogonal matrix Γ give different orthogonal rotation methods; the method adopted in the present invention is the varimax (maximum variance) rotation, implemented as follows.
Let $A^{*} = A\Gamma = (a_{ij}^{*})_{64\times 6}$, $d_{ij} = a_{ij}^{*}/h_i$ and $\bar{d}_j = \frac{1}{64}\sum_{i=1}^{64} d_{ij}^2$. The relative variance of the squared elements of the j-th column of $A^{*}$ can then be defined as
$$V_j = \frac{1}{64}\sum_{i=1}^{64}\left(d_{ij}^2 - \bar{d}_j\right)^2.$$
The square $d_{ij}^2$ is used to eliminate the influence of the sign of $d_{ij}$, and the division by $h_i$ eliminates the influence of the different degrees to which the original variables $x_i$ depend on the common factors. Dividing by $h_i$ is justified because the sum of squares of the i-th row of $A^{*}$ is invariant under the rotation:
$$h_i^{*2} = \sum_{j=1}^{6} a_{ij}^{*2} = (a_{i1}, a_{i2}, \ldots, a_{i6})\,\Gamma\Gamma'\,(a_{i1}, a_{i2}, \ldots, a_{i6})' = \sum_{j=1}^{6} a_{ij}^2 = h_i^2.$$
The varimax rotation selects the orthogonal matrix Γ that maximizes the sum of the relative variances of the squared elements of all 6 columns of $A^{*}$: $V = V_1 + V_2 + \cdots + V_6$. The rotation is applied to pairs of common factors in turn: rotating the common factors $F_l$ and $F_k$ amounts to an orthogonal transformation of the l-th and k-th columns of A that maximizes the sum of the relative variances of the squared elements of these two columns while leaving the remaining columns unchanged. The rotation matrix $\Gamma_{lk}$ equals the identity matrix except in the l-th and k-th rows and columns, where it acts as the planar rotation
$$\begin{pmatrix}\cos\theta & -\sin\theta\\ \sin\theta & \cos\theta\end{pmatrix},$$
θ being the rotation angle between $F_l$ and $F_k$; all remaining off-diagonal elements of $\Gamma_{lk}$ are 0. The angle θ is chosen to maximize V. Using the extremum method of calculus,
$$V = V_1 + V_2 + \cdots + V_6 = \frac{1}{64}\sum_{i=1}^{64}\left(d_{i1}^2-\bar{d}_1\right)^2 + \cdots + \frac{1}{64}\sum_{i=1}^{64}\left(d_{i6}^2-\bar{d}_6\right)^2$$
is differentiated with respect to θ and the derivative is set to 0. From $\partial V/\partial\theta = 0$ one obtains
$$\tan 4\theta = \frac{M - 2UV/64}{W - (U^2 - V^2)/64},$$
where (with V here denoting the sum $\sum_i v_i$, not the total relative variance above) $U = \sum_{i=1}^{64} u_i$, $V = \sum_{i=1}^{64} v_i$, $W = \sum_{i=1}^{64}\left(u_i^2 - v_i^2\right)$, $M = 2\sum_{i=1}^{64} u_i v_i$, $u_i = \left(\frac{a_{i1}}{h_i}\right)^2 - \left(\frac{a_{i2}}{h_i}\right)^2$, $v_i = \frac{2 a_{i1} a_{i2}}{h_i^2}$; here $a_{i1}$ and $a_{i2}$ denote the elements of the two columns of the factor loading matrix being rotated and $h_i^2$ is the communality of the variable $x_i$.
The factor loading matrix after the first round of rotations is $A^{(1)}$; the next round of rotations then begins, producing a sequence of factor loading matrices $A^{(1)}, A^{(2)}, \ldots, A^{(s)}, \ldots$ that necessarily satisfies $V^{(1)} \le V^{(2)} \le \cdots \le V^{(s)} \le \cdots$, where $V^{(s)}$ is the sum of the relative variances of the squared column elements of $A^{(s)}$. In practical applications, the rotation is stopped when the improvement between successive rounds satisfies $V^{(s)} - V^{(s-1)} \le \varepsilon$ (in the present invention, ε ≤ 0.001).
(8) Computing the factor scores by the weighted least squares method.
Once the factor loading matrix A and the specific-factor variance matrix D have been obtained, F is solved for by weighted least squares:
$$\hat{F} = \left(\hat{A}'\hat{D}^{-1}\hat{A}\right)^{-1}\hat{A}'\hat{D}^{-1}X,$$
which is the weighted least squares estimate of F.
(9) Obtaining the intrinsic dimensions characterizing the hyperspectral data and achieving hyperspectral data dimensionality reduction.
The factor score estimates are used as the intrinsic dimensions characterizing the hyperspectral data, thereby achieving the dimensionality reduction of the hyperspectral data; 6 intrinsic bands characterizing all the information of the hyperspectral data in use are obtained.
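Tying the sketches together, an end-to-end run on a synthetic cube of the embodiment's size might look as follows; it assumes the hypothetical helpers mean_cov_corr, factor_loadings, communalities, varimax and factor_scores from the earlier sketches are available, and the simulated 6-factor data are purely illustrative:

```python
import numpy as np

# End-to-end sketch on a synthetic cube with the embodiment's shape (210 x 150 x 64).
# The cube is simulated with a 6-factor structure purely for illustration.
rng = np.random.default_rng(1)
rows, cols, p, m_true = 210, 150, 64, 6
n = rows * cols
A_true = rng.normal(size=(p, m_true))
F_true = rng.normal(size=(m_true, n))
X = A_true @ F_true + 0.3 * rng.normal(size=(p, n))   # p x n: 64 bands, 31500 pixels

x_bar, V, R = mean_cov_corr(X)                        # step (3)
A, m = factor_loadings(R, threshold=0.9)              # steps (4)-(5)
h2, sigma2, D = communalities(A)                      # step (6)
A_rot = varimax(A, h2, eps=1e-3)                      # step (7)
F_hat = factor_scores(A_rot, sigma2, X, x_bar,
                      np.sqrt(np.diag(V)))            # step (8)

reduced = F_hat.T.reshape(rows, cols, m)              # step (9): m intrinsic "bands"
print(m, reduced.shape)                               # m recovered by the 90% criterion
```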

Claims (8)

1. A hyperspectral data dimensionality reduction method based on a factor analysis model, characterized in that it comprises the following steps:
(1) reading in the hyperspectral remote sensing data;
(2) establishing the factor analysis model for hyperspectral data dimensionality reduction;
(3) computing the mean, covariance matrix and correlation matrix of the hyperspectral data;
(4) computing the eigenvalues and standardized eigenvectors of the data correlation matrix;
(5) solving for the factor loading matrix by the principal component method;
(6) computing the variances of the specific factors and the communalities of the data variables in the factor analysis model;
(7) computing the varimax (maximum variance) factor loading rotation matrix;
(8) computing the factor scores by the weighted least squares method;
(9) obtaining the intrinsic dimensions characterizing the hyperspectral data, thereby achieving hyperspectral data dimensionality reduction.
2. The hyperspectral data dimensionality reduction method based on a factor analysis model according to claim 1, characterized in that the factor analysis model for hyperspectral data dimensionality reduction described in step (2) is:
X = μ + AF + ε
In the formula, $X=(x_1, x_2, \ldots, x_p)'$ is the observable random vector, with mean $\mu = E(X)$; the common factor vector $F=(F_1, F_2, \ldots, F_m)'$ ($m<p$) is unobservable and satisfies $E(F)=0$ and $D(F)=I_m$; the specific factors $\varepsilon=(\varepsilon_1, \varepsilon_2, \ldots, \varepsilon_p)'$ are uncorrelated with F and satisfy $E(\varepsilon)=0$ and $D(\varepsilon)=\mathrm{diag}(\sigma_1^2, \sigma_2^2, \ldots, \sigma_p^2) \overset{\mathrm{def}}{=} D$; the factor loading matrix $A=(a_{ij})_{p\times m}$ is the coefficient matrix to be estimated.
3. The hyperspectral data dimensionality reduction method based on a factor analysis model according to claim 1, characterized in that the mean, covariance matrix and correlation matrix of the hyperspectral data described in step (3) are computed as follows:
$$\bar{X} = \frac{1}{n}\sum_{t=1}^{n} X^{(t)} = (\bar{x}_1, \bar{x}_2, \ldots, \bar{x}_p)'$$
$$V = \frac{1}{n}\sum_{t=1}^{n}\left(X^{(t)}-\bar{X}\right)\left(X^{(t)}-\bar{X}\right)' \overset{\mathrm{def}}{=} (V_{ij})$$
$$R = (r_{ij}) = \left(\frac{V_{ij}}{\sqrt{V_{ii}V_{jj}}}\right), \qquad i,j = 1,2,\ldots,p$$
In the formulas, $\bar{X}$ is the mean of the data, V is the covariance matrix, R is the correlation matrix, n is the number of pixels, and $X^{(t)}$ denotes the t-th pixel.
4. The hyperspectral data dimensionality reduction method based on a factor analysis model according to claim 1, characterized in that the eigenvalues and standardized eigenvectors of the data correlation matrix described in step (4) are the eigenvalues of the correlation matrix R, $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p \ge 0$, and the corresponding Gram-Schmidt orthogonalized eigenvectors $l_1, l_2, \ldots, l_p$.
5. The hyperspectral data dimensionality reduction method based on a factor analysis model according to claim 1, characterized in that the factor loading matrix is solved by the principal component method described in step (5) as follows:
First, the number of common factors m is determined as the smallest positive integer satisfying
$$\frac{\lambda_1 + \lambda_2 + \cdots + \lambda_m}{\lambda_1 + \lambda_2 + \cdots + \lambda_p} \ge 0.9;$$
then the factor loading matrix is solved by the principal component method:
$$A = \left(\sqrt{\lambda_1}\, l_1, \ldots, \sqrt{\lambda_m}\, l_m\right) \overset{\mathrm{def}}{=} (a_{ij})_{p\times m}$$
In the formula, A is the factor loading matrix, $\lambda_i$ are the eigenvalues of the correlation matrix, and $l_i$ are the eigenvectors of the correlation matrix.
6. The hyperspectral data dimensionality reduction method based on a factor analysis model according to claim 1, characterized in that the variances of the specific factors and the communalities of the data variables described in step (6) are, respectively: the variance of the i-th specific factor, $\sigma_i^2 = 1 - \sum_{t=1}^{m} a_{it}^2$, $i = 1, 2, \ldots, p$, and the communality of the variable $x_i$, $h_i^2 = \sum_{t=1}^{m} a_{it}^2$, $i = 1, 2, \ldots, p$.
7. The hyperspectral data dimensionality reduction method based on a factor analysis model according to claim 1, characterized in that the varimax (maximum variance) factor loading rotation matrix described in step (7) is computed as follows:
Let $A^{*} = A\Gamma = (a_{ij}^{*})_{p\times m}$, $d_{ij} = a_{ij}^{*}/h_i$ and $\bar{d}_j = \frac{1}{p}\sum_{i=1}^{p} d_{ij}^2$; the relative variance of the squared elements of the j-th column of $A^{*}$ is then
$$V_j = \frac{1}{p}\sum_{i=1}^{p}\left(d_{ij}^2 - \bar{d}_j\right)^2.$$
The orthogonal matrix Γ is selected so that the sum of the relative variances of the squared elements of all m columns of $A^{*}$, $V = V_1 + V_2 + \cdots + V_m$, is maximized. The rotation is applied to pairs of common factors in turn, maximizing the sum of the relative variances of the squared elements of the two rotated columns while leaving the remaining columns unchanged; the rotation matrix $\Gamma_{lk}$ equals the identity matrix except in the l-th and k-th rows and columns, where it acts as the planar rotation
$$\begin{pmatrix}\cos\theta & -\sin\theta\\ \sin\theta & \cos\theta\end{pmatrix},$$
θ being the rotation angle between the common factors $F_l$ and $F_k$, with all remaining off-diagonal elements equal to 0. The angle θ that maximizes V satisfies
$$\tan 4\theta = \frac{M - 2UV/p}{W - (U^2 - V^2)/p},$$
where (with V here denoting the sum $\sum_i v_i$) $U = \sum_{i=1}^{p} u_i$, $V = \sum_{i=1}^{p} v_i$, $W = \sum_{i=1}^{p}\left(u_i^2 - v_i^2\right)$, $M = 2\sum_{i=1}^{p} u_i v_i$, $u_i = \left(\frac{a_{i1}}{h_i}\right)^2 - \left(\frac{a_{i2}}{h_i}\right)^2$, $v_i = \frac{2 a_{i1} a_{i2}}{h_i^2}$; $a_{i1}$ and $a_{i2}$ are the elements of the two rotated columns of the factor loading matrix and $h_i^2$ is the communality of the variable $x_i$.
The factor loading matrix after the first round of rotations is $A^{(1)}$; the next round of rotations then begins, producing a sequence of factor loading matrices $A^{(1)}, A^{(2)}, \ldots, A^{(s)}, \ldots$ with $V^{(1)} \le V^{(2)} \le \cdots \le V^{(s)} \le \cdots$, where $V^{(s)}$ is the sum of the relative variances of the squared column elements of $A^{(s)}$; the rotation is stopped when $V^{(s)} - V^{(s-1)} \le \varepsilon$.
8. The hyperspectral data dimensionality reduction method based on a factor analysis model according to claim 1, characterized in that the factor scores based on the weighted least squares method described in step (8) are computed as follows: F is solved for by weighted least squares, giving the weighted least squares estimate
$$\hat{F} = \left(\hat{A}'\hat{D}^{-1}\hat{A}\right)^{-1}\hat{A}'\hat{D}^{-1}X,$$
where $\hat{A}$ is the estimated factor loading matrix, $\hat{D}$ is the estimated specific-factor variance matrix, and X is the read-in data.

Legal Events

C06 / PB01: Publication
C10 / SE01: Entry into force of request for substantive examination
C53 / CB03 / COR: Correction of patent for invention or patent application - change of inventor information; inventors corrected from Zhao Huijie, Li Na, Cai Hui, Jia Guorui and Xu Zhou to Zhao Huijie, Li Na, Cai Hui, Jia Guorui, Xu Zhou and Bai Xiao
C14 / GR01: Grant of patent or utility model
C17 / CF01: Cessation / termination of patent right due to non-payment of the annual fee (granted publication date: 2012-09-26; termination date: 2014-02-23)